I'm developing an Android application that uses Bluetooth LE, and I need to get the RR interval. I started from this example on the Android developer site:
private void broadcastUpdate(final String action,
final BluetoothGattCharacteristic characteristic) {
final Intent intent = new Intent(action);
// This is special handling for the Heart Rate Measurement profile. Data
// parsing is carried out as per profile specifications.
if (UUID_HEART_RATE_MEASUREMENT.equals(characteristic.getUuid())) {
int flag = characteristic.getProperties();
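// NOTE: getProperties() returns the characteristic's GATT properties, not the
// Heart Rate Measurement flags byte; the flags byte is the first byte of the
// value, i.e. characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT8, 0),
// as the later answers in this thread do.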
int format = -1;
if ((flag & 0x01) != 0) {
format = BluetoothGattCharacteristic.FORMAT_UINT16;
Log.d(TAG, "Heart rate format UINT16.");
} else {
format = BluetoothGattCharacteristic.FORMAT_UINT8;
Log.d(TAG, "Heart rate format UINT8.");
}
final int heartRate = characteristic.getIntValue(format, 1);
Log.d(TAG, String.format("Received heart rate: %d", heartRate));
intent.putExtra(EXTRA_DATA, String.valueOf(heartRate));
} else {
// For all other profiles, writes the data formatted in HEX.
final byte[] data = characteristic.getValue();
if (data != null && data.length > 0) {
final StringBuilder stringBuilder = new StringBuilder(data.length);
for(byte byteChar : data)
stringBuilder.append(String.format("%02X ", byteChar));
intent.putExtra(EXTRA_DATA, new String(data) + "\n" +
stringBuilder.toString());
}
}
sendBroadcast(intent);
}
How can I get the RR interval? I already did it on iOS, but in Java I don't know how.
This is my code on iOS, and it works perfectly:
- (void) updateWithHRMData:(NSData *)datas {
const uint8_t *reportData = [datas bytes];
uint16_t bpm = 0;
uint16_t bpm2 = 0;
if ((reportData[0] & 0x04) == 0)
{
NSLog(@"%@", @"Data are not present");
}
else
{
bpm = CFSwapInt16LittleToHost(*(uint16_t *)(&reportData[2]));
bpm2 = CFSwapInt16LittleToHost(*(uint16_t *)(&reportData[4]));
if (bpm != 0 || bpm2 != 0) {
self.deviceReady = true;
[lblNofascia setAlpha:0.0];
[btnMonitor setAlpha:1.0];
if (isRunning) {
[self.elencoBattiti addObject:[NSString stringWithFormat:@"%u", bpm]];
NSLog(@"%u", bpm);
if (bpm2 != 0) {
[self.elencoBattiti addObject:[NSString stringWithFormat:@"%u", bpm2]];
NSLog(@"%u", bpm2);
}
}
} else {
if (isRunning) {
totErrori++;
NSLog(@"Dato non trasmesso"); // "Data not transmitted"
if (totErrori > 5) {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Attenzione" message:@"Ho perso la connettività con la fascia. Ripetere la misurazione" delegate:self cancelButtonTitle:nil otherButtonTitles:@"Continua", nil]; // "Warning" / "I lost connectivity with the strap. Repeat the measurement." / "Continue"
[alert show];
[self stopRunning];
}
}
}
}
}
Thank you.
I have the following:
if (UUID_HEART_RATE_MEASUREMENT.equals(characteristic.getUuid())) {
int flag = characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT8, 0);
int format = -1;
int energy = -1;
int offset = 1;
int rr_count = 0;
if ((flag & 0x01) != 0) {
format = BluetoothGattCharacteristic.FORMAT_UINT16;
Logger.trace("Heart rate format UINT16.");
offset = 3;
} else {
format = BluetoothGattCharacteristic.FORMAT_UINT8;
Logger.trace("Heart rate format UINT8.");
offset = 2;
}
final int heartRate = characteristic.getIntValue(format, 1);
Logger.trace("Received heart rate: {}", heartRate);
if ((flag & 0x08) != 0) {
// energy expended present
energy = characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT16, offset);
offset += 2;
Logger.trace("Received energy: {}", energy);
}
if ((flag & 0x10) != 0){
// RR stuff.
rr_count = ((characteristic.getValue()).length - offset) / 2;
for (int i = 0; i < rr_count; i++){
mRr_values[i] = characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT16, offset);
offset += 2;
Logger.trace("Received RR: {}", mRr_values[i]);
}
}
}
Thanks, Ifor. With some fixes the Android code works. The bit mask for the RR-interval flag should be 0x10 (bit 4, i.e. decimal 16) according to the spec, and the array needs to be declared:
if ((flag & 0x10) != 0){
// RR stuff.
rr_count = ((characteristic.getValue()).length - offset) / 2;
Integer[] mRr_values = new Integer[rr_count];
for (int i = 0; i < rr_count; i++){
mRr_values[i] = characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT16, offset);
offset += 2;
Logger.trace("Received RR: {}", mRr_values[i]);
}
}
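For reference, here is a minimal, self-contained sketch of the whole Heart Rate Measurement parse, following the flags-byte layout from the Bluetooth Heart Rate Service spec (bit 0: heart rate format, bits 1-2: sensor contact, bit 3: energy expended present, bit 4: RR intervals present). The class and method names are illustrative, not from the code above:

import android.bluetooth.BluetoothGattCharacteristic;
import android.util.Log;
import java.util.ArrayList;
import java.util.List;

public final class HrmParser {
    private static final String TAG = "HrmParser";

    // Returns the RR intervals of one Heart Rate Measurement value, in ms.
    public static List<Double> parseRrIntervals(BluetoothGattCharacteristic c) {
        byte[] value = c.getValue();
        int flags = c.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT8, 0);
        int offset = 1;
        boolean hrIsUint16 = (flags & 0x01) != 0;
        int heartRate = c.getIntValue(hrIsUint16
                ? BluetoothGattCharacteristic.FORMAT_UINT16
                : BluetoothGattCharacteristic.FORMAT_UINT8, offset);
        offset += hrIsUint16 ? 2 : 1;
        Log.d(TAG, "Heart rate: " + heartRate + " bpm");
        if ((flags & 0x08) != 0) {
            offset += 2; // skip the uint16 Energy Expended field
        }
        List<Double> rrMs = new ArrayList<>();
        if ((flags & 0x10) != 0) { // RR-Interval fields present
            while (offset + 1 < value.length) {
                int rrRaw = c.getIntValue(
                        BluetoothGattCharacteristic.FORMAT_UINT16, offset);
                rrMs.add(rrRaw * 1000.0 / 1024.0); // spec unit: 1/1024 s
                offset += 2;
            }
        }
        return rrMs;
    }
}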
This code worked on Android 11, but on Android 12 the video size cannot be set: it is always full screen, whatever size is set. How should I use it correctly? The problem was discovered while upgrading the Android version. Please help me, thanks. Also, why did Android change the usage of this underlying interface? Is there any benefit to it?
void createSubSurface(int winSize[]){
int z_order = INT_MAX - 10;
int x_pos=0,y_pos=0 ,pip_width=960,pip_height =540;
const int numFds = 0;
const int numInts = 3;
x_pos = winSize[0];
y_pos = winSize[1];
pip_width = winSize[2];
pip_height = winSize[3];
printf(" createSurface pip start\n");
m_composerClient_pip = SurfaceComposerClient::getDefault();
m_composerClient_pip->initCheck();
#if 0
const sp<IBinder> display2 = SurfaceComposerClient::getInternalDisplayToken();
DisplayInfo info2;
if(display2 != NULL){
SurfaceComposerClient::getDisplayInfo(display2, &info2);
}
#endif
//surface 2
m_surfaceControl_pip = m_composerClient_pip->createSurface(String8("V1"),pip_width, pip_height, PIXEL_FORMAT_RGB_888, 0);
printf(" createSurface2 m_surfaceControl_pip !=NULL ? %d\n",m_surfaceControl_pip!=NULL);
if(m_surfaceControl_pip !=NULL && m_surfaceControl_pip->isValid()){
printf(" createSurface2 isValid == true\n");
SurfaceComposerClient::Transaction{}
.setLayer(m_surfaceControl_pip, z_order)
.setAlpha(m_surfaceControl_pip, 1.0f)
.setPosition(m_surfaceControl_pip,x_pos,y_pos)
.setSize(m_surfaceControl_pip, pip_width,pip_height)
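// NOTE (assumption): on Android 12 a layer takes its size from the attached
// buffer, so this setSize() call may be ignored; the scale would instead be
// expressed with setMatrix() relative to the buffer's native size.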
.show(m_surfaceControl_pip)
.apply();
m_surface_pip = m_surfaceControl_pip->getSurface();
}
if(m_surface_pip !=NULL){
for (int k = 0; k < 10; k++)
{
int status =native_window_api_connect(m_surface_pip.get(),NATIVE_WINDOW_API_MEDIA);
printf(" createSurface2 status = %d\n",status );
handle2= native_handle_create(numFds, numInts);
handle2->data[numFds] = 1;
handle2->data[numFds + 1] = 1;
handle2->data[numFds + 2] = SidebandClient::DVDPLAYER_V2;
native_window_set_sideband_stream(m_surface_pip.get(), handle2);
if(status == 0){
break;
}else{
printf(" createSurface2 native_window_api_connect != OK\n");
}
usleep(200000);
}
}
printf("\n createSurface2 OK \n");
}
I used AudioRecord to record audio; the following is my code:
#include<stdio.h>
#include<stdlib.h>
#include <media/AudioRecord.h>
using namespace android;
//==============================================
// Audio Record Definition
//==============================================
static pthread_t g_AudioRecordThread;
static pthread_t * g_AudioRecordThreadPtr = NULL;
volatile bool g_bQuitAudioRecordThread = false;
volatile int g_iInSampleTime = 0;
int g_iNotificationPeriodInFrames = 8000/10;
// g_iNotificationPeriodInFrames should be changed when the sample rate changes.
static void * AudioRecordThread( void *inArg );
void AudioRecordCallback(int event, void* user, void *info)
{
if(event == android::AudioRecord::EVENT_NEW_POS)
{
g_iInSampleTime += g_iNotificationPeriodInFrames;
//if(g_iInSampleTime > g_iNotificationPeriodInFrames*100)
// g_bQuitAudioRecordThread = true;
}
else if (event == android::AudioRecord::EVENT_MORE_DATA)
{
android::AudioRecord::Buffer* pBuff = (android::AudioRecord::Buffer*)info;
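// NOTE (assumption): setting size = 0 tells AudioRecord that the callback
// consumed no data; in callback transfer mode this likely discards the
// captured audio instead of leaving it for read().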
pBuff->size = 0;
}
else if (event == android::AudioRecord::EVENT_OVERRUN)
{
//LOGE(" EVENT_OVERRUN \n");
printf("[%d%s]EVENT_OVERRUN \n",__LINE__,__FUNCTION__);
}
}
static void * AudioRecordThread( void *inArg )
{
uint64_t inHostTime = 0;
void * inBuffer = NULL;
audio_source_t inputSource = AUDIO_SOURCE_MIC;
audio_format_t audioFormat = AUDIO_FORMAT_PCM_16_BIT;
audio_channel_mask_t channelConfig = AUDIO_CHANNEL_IN_MONO; //AUDIO_CHANNEL_IN_STEREO;
int bufferSizeInBytes = 1600;
int sampleRateInHz = 8000;
android::AudioRecord * pAudioRecord = NULL;
FILE * g_pAudioRecordFile = NULL;
//char strAudioFile[] = "/mnt/sdcard/external_sd/AudioRecordFile.pcm";
char strAudioFile[] = "./AudioRecordFile.pcm";
int iNbChannels = 1; // 1 channel for mono, 2 channels for stereo
int iBytesPerSample = 2; // 16-bit PCM, 2 bytes
int frameSize = 0; // frameSize = iNbChannels * iBytesPerSample
//int minFrameCount = 0; // get from AudioRecord object
size_t minFrameCount = 0; // get from AudioRecord object
int iWriteDataCount = 0; // how much data has been written to the file
printf("[%d%s]thread enter ok!\n",__LINE__,__FUNCTION__);
String16 pack_name("wang_test");
printf("[%d%s]thread enter ok!\n",__LINE__,__FUNCTION__);
#if 1
// log the thread id for debug info
//LOGD("%s Thread ID = %d \n", __FUNCTION__, pthread_self());
printf("[%d%s] Thread ID = %d \n", __LINE__,__FUNCTION__, pthread_self());
g_iInSampleTime = 0;
g_pAudioRecordFile = fopen(strAudioFile, "wb+");
if(g_pAudioRecordFile == NULL)
{
printf("open file erro !\n");
}
iNbChannels = (channelConfig == AUDIO_CHANNEL_IN_STEREO) ? 2 : 1;
frameSize = iNbChannels * iBytesPerSample;
android::status_t status = android::AudioRecord::getMinFrameCount(
&minFrameCount, sampleRateInHz, audioFormat, channelConfig);
if(status != android::NO_ERROR)
{
//LOGE("%s AudioRecord.getMinFrameCount fail \n", __FUNCTION__);
printf("[%d%s]AudioRecord.getMinFrameCount fail \n",__LINE__,__FUNCTION__);
goto exit ;
}
//LOGE("sampleRateInHz = %d minFrameCount = %d iNbChannels = %d frameSize = %d ",
// sampleRateInHz, minFrameCount, iNbChannels, frameSize);
printf("[%d%s]sampleRateInHz = %d minFrameCount = %d iNbChannels = %d frameSize = %d \n",
__LINE__,__FUNCTION__,sampleRateInHz, minFrameCount, iNbChannels, frameSize);
bufferSizeInBytes = minFrameCount * frameSize;
inBuffer = malloc(bufferSizeInBytes);
if(inBuffer == NULL)
{
//LOGE("%s alloc mem failed \n", __FUNCTION__);
printf("[%d%s] alloc mem failed \n",__LINE__, __FUNCTION__);
goto exit ;
}
g_iNotificationPeriodInFrames = sampleRateInHz/10;
pAudioRecord = new android::AudioRecord(pack_name);
if(NULL == pAudioRecord)
{
//LOGE(" create native AudioRecord failed! ");
printf(" [%d%s] create native AudioRecord failed! \n",__LINE__,__FUNCTION__);
goto exit;
}
pAudioRecord->set( inputSource,
sampleRateInHz,
audioFormat,
channelConfig,
0,
AudioRecordCallback,
NULL,
0,
true);
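// NOTE (assumption): passing AudioRecordCallback to set() selects callback
// transfer mode, in which read() is not the data path (EVENT_MORE_DATA is);
// if the captured file stays silent, try a NULL callback or consume the
// buffers inside EVENT_MORE_DATA instead of setting pBuff->size = 0.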
if(pAudioRecord->initCheck() != android::NO_ERROR)
{
//LOGE("AudioTrack initCheck error!");
printf("[%d%s]AudioTrack initCheck error!\n",__LINE__,__FUNCTION__);
goto exit;
}
if(pAudioRecord->setPositionUpdatePeriod(g_iNotificationPeriodInFrames) != android::NO_ERROR)
{
//LOGE("AudioTrack setPositionUpdatePeriod error!");
printf("[%d%s]AudioTrack setPositionUpdatePeriod error!\n",__LINE__,__FUNCTION__);
goto exit;
}
if(pAudioRecord->start()!= android::NO_ERROR)
{
//LOGE("AudioTrack start error!");
printf("[%d%s]AudioTrack start error!\n",__LINE__,__FUNCTION__);
goto exit;
}
while (!g_bQuitAudioRecordThread)
{
//inHostTime = UpTicks();
int readLen = pAudioRecord->read(inBuffer, bufferSizeInBytes);
int writeResult = -1;
if(readLen > 0)
{
iWriteDataCount += readLen;
if(NULL != g_pAudioRecordFile)
{
writeResult = fwrite(inBuffer, 1, readLen, g_pAudioRecordFile);
if(writeResult < readLen)
{
//LOGE("Write Audio Record Stream error");
printf("[%d%s]Write Audio Record Stream error\n",__LINE__,__FUNCTION__);
}
}
// write PCM data to file or other stream,implement it yourself
//writeResult = WriteAudioData(
// g_iInSampleTime,
// inHostTime,
// inBuffer,
// readLen);
//LOGD("readLen = %d writeResult = %d iWriteDataCount = %d", readLen, writeResult, iWriteDataCount);
}
else
{
//LOGE("pAudioRecord->read readLen = 0");
printf("[%d%s]pAudioRecord->read readLen = 0\n",__LINE__,__FUNCTION__);
}
}
exit:
if(NULL != g_pAudioRecordFile)
{
fflush(g_pAudioRecordFile);
fclose(g_pAudioRecordFile);
g_pAudioRecordFile = NULL;
}
if(pAudioRecord)
{
pAudioRecord->stop();
//delete pAudioRecord;
pAudioRecord = NULL;
}
if(inBuffer)
{
free(inBuffer);
inBuffer = NULL;
}
#endif
//LOGD("%s Thread ID = %d quit\n", __FUNCTION__, pthread_self());
printf("[%d%s] Thread ID = %d quit\n", __LINE__,__FUNCTION__, pthread_self());
return NULL;
}
int main()
{
printf("hello world! \n");
pthread_t record_pid ;
if(pthread_create(&record_pid,NULL,AudioRecordThread,NULL)<0)
{
printf("%d%s pthread create erro !\n",__LINE__,__FUNCTION__);
}
while(1)
{
}
return 0;
}
I pushed the executable to the /data directory. After running it, I get the PCM file. I used Cool Edit Pro to play it, but I couldn't hear anything.
I need to create videos with data hidden in them. I managed to extract video frames as NV21 buffers using the MediaCodec decoder and save them; then I create an MP4 file from the frames using the MediaCodec encoder.
The class below is responsible for saving the frame files during the encoding process, or for checking the hidden value when we want to extract data from the stego video.
public class ExtractMpegFramesBufferDecoder {
private static final String TAG = "ExtractMpegFramesDec";
private static final boolean VERBOSE = true; // lots of logging
// where to find files (note: requires WRITE_EXTERNAL_STORAGE permission)
private File STORE_FRAME_DIRECTORY;
private String INPUT_FILE;
private int frameRate;
private int saveWidth;
private int saveHeight;
private int decodeCount;
private Handler _progressBarHandler;
private int duration;
//
private int MAX_FRAMES;
private boolean fromDecode;
//
public ExtractMpegFramesBufferDecoder(File storeFrameDirectory, String inputVideoPath, int frameRate
, int saveWidth, int saveHeight
, double duration, int rotation
, Handler _progressBarHandler) {
this.STORE_FRAME_DIRECTORY = storeFrameDirectory;
this.INPUT_FILE = inputVideoPath;
this.frameRate = frameRate;
this.saveWidth = saveWidth;
this.saveHeight = saveHeight;
this._progressBarHandler = _progressBarHandler;
this.duration = (int) duration;
}
/**
* Extracts frames from an MP4 using the MediaCodec ByteBuffer path.
* <p>
* (Adapted from the ExtractMpegFrames example; here the decoded frames are read
* back as raw YUV buffers rather than rendered to a Surface and saved as PNG.)
*/
public void extractMpegFrames(int maxFrame, boolean fromDecode) throws IOException {
MediaCodec decoder = null;
MediaExtractor extractor = null;
MAX_FRAMES = maxFrame;
this.fromDecode = fromDecode;
try {
File inputFile = new File(INPUT_FILE); // must be an absolute path
// The MediaExtractor error messages aren't very useful. Check to see if the input
// file exists so we can throw a better one if it's not there.
if (!inputFile.canRead()) {
throw new FileNotFoundException("Unable to read " + inputFile);
}
extractor = new MediaExtractor();
extractor.setDataSource(inputFile.toString());
int trackIndex = selectTrack(extractor);
if (trackIndex < 0) {
throw new RuntimeException("No video track found in " + inputFile);
}
extractor.selectTrack(trackIndex);
MediaFormat format = extractor.getTrackFormat(trackIndex);
if (VERBOSE) {
Log.d(TAG, "Video size is " + format.getInteger(MediaFormat.KEY_WIDTH) + "x" +
format.getInteger(MediaFormat.KEY_HEIGHT));
}
// Create a MediaCodec decoder, and configure it with the MediaFormat from the
// extractor. It's very important to use the format from the extractor because
// it contains a copy of the CSD-0/CSD-1 codec-specific data chunks.
String mime = format.getString(MediaFormat.KEY_MIME);
decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format, null, null, 0);
decoder.start();
doExtract(extractor, trackIndex, decoder);
} finally {
if (decoder != null) {
decoder.stop();
decoder.release();
decoder = null;
}
if (extractor != null) {
extractor.release();
extractor = null;
}
}
}
/**
* Selects the video track, if any.
*
* @return the track index, or -1 if no video track is found.
*/
private int selectTrack(MediaExtractor extractor) {
// Select the first video track we find, ignore the rest.
int numTracks = extractor.getTrackCount();
for (int i = 0; i < numTracks; i++) {
MediaFormat format = extractor.getTrackFormat(i);
String mime = format.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
if (VERBOSE) {
Log.d(TAG, "Extractor selected track " + i + " (" + mime + "): " + format);
}
return i;
}
}
return -1;
}
/**
* Work loop.
*/
public void doExtract(MediaExtractor extractor, int trackIndex, MediaCodec decoder) throws IOException {
final int TIMEOUT_USEC = 10000;
ByteBuffer[] decoderInputBuffers = decoder.getInputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int inputChunk = 0;
decodeCount = 0;
long frameSaveTime = 0;
boolean outputDone = false;
boolean inputDone = false;
ByteBuffer[] decoderOutputBuffers = decoder.getOutputBuffers();
MediaFormat decoderOutputFormat = null;
long rawSize = 0;
while (!outputDone) {
if (VERBOSE) Log.d(TAG, "loop");
// Feed more data to the decoder.
if (!inputDone) {
int inputBufIndex = decoder.dequeueInputBuffer(TIMEOUT_USEC);
if (inputBufIndex >= 0) {
ByteBuffer inputBuf = decoderInputBuffers[inputBufIndex];
// Read the sample data into the ByteBuffer. This neither respects nor
// updates inputBuf's position, limit, etc.
int chunkSize = extractor.readSampleData(inputBuf, 0);
if (chunkSize < 0) {
// End of stream -- send empty frame with EOS flag set.
decoder.queueInputBuffer(inputBufIndex, 0, 0, 0L,
MediaCodec.BUFFER_FLAG_END_OF_STREAM);
inputDone = true;
if (VERBOSE) Log.d(TAG, "sent input EOS");
} else {
if (extractor.getSampleTrackIndex() != trackIndex) {
Log.w(TAG, "WEIRD: got sample from track " +
extractor.getSampleTrackIndex() + ", expected " + trackIndex);
}
long presentationTimeUs = extractor.getSampleTime();
decoder.queueInputBuffer(inputBufIndex, 0, chunkSize,
presentationTimeUs, 0 /*flags*/);
if (VERBOSE) {
Log.d(TAG, "submitted frame " + inputChunk + " to dec, size=" +
chunkSize);
}
inputChunk++;
extractor.advance();
}
} else {
if (VERBOSE) Log.d(TAG, "input buffer not available");
}
}
if (!outputDone) {
int decoderStatus = decoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
if (decoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
// no output available yet
if (VERBOSE) Log.d(TAG, "no output from decoder available");
} else if (decoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
// the output buffers changed; refresh our reference to them
if (VERBOSE) Log.d(TAG, "decoder output buffers changed");
decoderOutputBuffers = decoder.getOutputBuffers();
} else if (decoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat newFormat = decoder.getOutputFormat();
decoderOutputFormat = newFormat;
if (VERBOSE) Log.d(TAG, "decoder output format changed: " + newFormat);
} else if (decoderStatus < 0) {
Log.e(TAG, "unexpected result from decoder.dequeueOutputBuffer: " + decoderStatus);
} else { // decoderStatus >= 0
if (VERBOSE) Log.d(TAG, "surface decoder given buffer " + decoderStatus +
" (size=" + info.size + ")");
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
if (VERBOSE) Log.d(TAG, "output EOS");
outputDone = true;
}
ByteBuffer outputFrame = decoderOutputBuffers[decoderStatus];
outputFrame.position(info.offset);
outputFrame.limit(info.offset + info.size);
rawSize += info.size;
if (info.size == 0) {
if (VERBOSE) Log.d(TAG, "got empty frame");
} else {
// if it's decode then check the altered value
// else save the frames
if (fromDecode) {
outputFrame.rewind();
byte[] data = new byte[outputFrame.remaining()];
outputFrame.get(data);
int size = saveWidth * saveHeight;
int offset = size;
int[] pixels = new int[size];
int u, v, y1, y2, y3, y4;
int uvIndex = 0;
if (decodeCount == 1) {
// i walks the Y samples (and the final pixel index)
// k walks the interleaved U and V samples
for (int i = 0, k = 0; i < size; i += 2, k += 2) {
y1 = data[i] & 0xff;
y2 = data[i + 1] & 0xff;
y3 = data[saveWidth + i] & 0xff;
y4 = data[saveWidth + i + 1] & 0xff;
u = data[offset + k] & 0xff;
v = data[offset + k + 1] & 0xff;
// read back the marker byte hidden in the first U/V pair
if (uvIndex == 0) {
int specialByte1P1 = u & 15;
int specialByte1P2 = v & 15;
int specialCharacter1 = (specialByte1P1 << 4) | specialByte1P2;
if (specialCharacter1 != 17) {
throw new IllegalArgumentException("value has changed");
}
}
uvIndex++;
if (i != 0 && (i + 2) % saveWidth == 0)
i += saveWidth;
}
}
} else {
outputFrame.rewind();
byte[] data = new byte[outputFrame.remaining()];
outputFrame.get(data);
try {
File outputFile = new File(STORE_FRAME_DIRECTORY,
String.format(Locale.US, "frame_%d.frame", decodeCount));
FileOutputStream stream = new FileOutputStream(outputFile.getAbsoluteFile());
stream.write(data);
stream.close(); // flush and release the file descriptor for each frame
} catch (FileNotFoundException e1) {
e1.printStackTrace();
}
}
decodeCount++;
}
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
if (VERBOSE) Log.d(TAG, "output EOS");
outputDone = true;
}
decoder.releaseOutputBuffer(decoderStatus, false);
}
}
}
int numSaved = (frameRate < decodeCount) ? frameRate : decodeCount;
if (numSaved > 0) {
Log.d(TAG, "Saving " + numSaved + " frames took " +
(frameSaveTime / numSaved / 1000) + " us per frame");
}
}
public int getDecodeCount() {
return decodeCount;
}
}
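One caveat with this buffer-path decoder, by the way: the output is not guaranteed to be NV21 or even semi-planar; the actual layout is reported in the decoder's output format. A minimal check using documented MediaFormat keys might look like this (intended for the INFO_OUTPUT_FORMAT_CHANGED branch above; `decoder` and `TAG` are the locals of that class):

// Query the decoder's real output layout before interpreting the bytes as
// semi-planar YUV; devices may return COLOR_FormatYUV420Planar or a
// vendor-specific format instead.
MediaFormat outFormat = decoder.getOutputFormat();
int colorFormat = outFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT);
int stride = outFormat.containsKey(MediaFormat.KEY_STRIDE)
        ? outFormat.getInteger(MediaFormat.KEY_STRIDE)
        : outFormat.getInteger(MediaFormat.KEY_WIDTH);
if (colorFormat != MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
    Log.w(TAG, "Decoder output color format is " + colorFormat
            + " (stride " + stride + "), not the semi-planar layout assumed here");
}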
In the class below I encode the frames, alter one U/V pair of frame 1 (storing the marker value 17 = 0x11 across the low nibbles of the first U and V bytes), and build the MP4 using the MediaCodec encoder.
public class YUVFrameBufferToVideoEncoder {
private static final String TAG = YUVFrameBufferToVideoEncoder.class.getSimpleName();
private static final int ERROR_IN_PROCESS = 0;
private IBitmapToVideoEncoderCallback mCallback;
private File mOutputFile;
private Queue<File> mEncodeQueue = new ConcurrentLinkedQueue();
private MediaCodec mediaCodec;
private MediaMuxer mediaMuxer;
private Object mFrameSync = new Object();
private CountDownLatch mNewFrameLatch;
private static final String MIME_TYPE = "video/avc"; // H.264 Advanced Video Coding
private static int mWidth;
private static int mHeight;
private static int BIT_RATE;
private static int FRAME_RATE; // Frames per second
private int frameCount;
private Handler _progressBarHandler;
private Handler _processHandler;
private static final int I_FRAME_INTERVAL = 1;
private int mGenerateIndex = 0;
private int mTrackIndex;
private boolean mNoMoreFrames = false;
private boolean mAbort = false;
//
private byte[] dataToHide;
public interface IBitmapToVideoEncoderCallback {
void onEncodingComplete(File outputFile);
}
public YUVFrameBufferToVideoEncoder(IBitmapToVideoEncoderCallback callback) {
mCallback = callback;
}
public boolean isEncodingStarted() {
return (mediaCodec != null) && (mediaMuxer != null) && !mNoMoreFrames && !mAbort;
}
public int getActiveBitmaps() {
return mEncodeQueue.size();
}
public boolean startEncoding(int width, int height, int fps, int bitrate, int frameCount
, byte[] dataToHide, Handler _progressBarHandler, Handler _processHandler
, File outputFile) {
mWidth = width;
mHeight = height;
FRAME_RATE = fps;
BIT_RATE = bitrate;
this.frameCount = frameCount;
this._progressBarHandler = _progressBarHandler;
this._processHandler = _processHandler;
mOutputFile = outputFile;
this.dataToHide = dataToHide;
String outputFileString;
try {
outputFileString = outputFile.getCanonicalPath();
} catch (IOException e) {
Log.e(TAG, "Unable to get path for " + outputFile);
ErrorManager.getInstance().addErrorMessage("Unable to get path for " + outputFile);
return false;
}
MediaCodecInfo codecInfo = selectCodec(MIME_TYPE);
if (codecInfo == null) {
Log.e(TAG, "Unable to find an appropriate codec for " + MIME_TYPE);
ErrorManager.getInstance().addErrorMessage("Unable to find an appropriate codec for " + MIME_TYPE);
return false;
}
Log.d(TAG, "found codec: " + codecInfo.getName());
int colorFormat = MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar;
try {
mediaCodec = MediaCodec.createByCodecName(codecInfo.getName());
} catch (IOException e) {
Log.e(TAG, "Unable to create MediaCodec " + e.getMessage());
ErrorManager.getInstance().addErrorMessage("Unable to create MediaCodec " + e.getMessage());
return false;
}
MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, I_FRAME_INTERVAL);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();
try {
mediaMuxer = new MediaMuxer(outputFileString, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
} catch (IOException e) {
Log.e(TAG, "MediaMuxer creation failed. " + e.getMessage());
ErrorManager.getInstance().addErrorMessage("MediaMuxer creation failed. " + e.getMessage());
return false;
}
Log.d(TAG, "Initialization complete. Starting encoder...");
Completable.fromAction(this::encode)
.subscribeOn(Schedulers.io())
.observeOn(AndroidSchedulers.mainThread())
.subscribe();
return true;
}
public void stopEncoding() {
if (mediaCodec == null || mediaMuxer == null) {
Log.d(TAG, "Failed to stop encoding since it never started");
return;
}
Log.d(TAG, "Stopping encoding");
mNoMoreFrames = true;
synchronized (mFrameSync) {
if ((mNewFrameLatch != null) && (mNewFrameLatch.getCount() > 0)) {
mNewFrameLatch.countDown();
}
}
}
public void abortEncoding() {
if (mediaCodec == null || mediaMuxer == null) {
Log.d(TAG, "Failed to abort encoding since it never started");
return;
}
Log.d(TAG, "Aborting encoding");
mNoMoreFrames = true;
mAbort = true;
mEncodeQueue = new ConcurrentLinkedQueue(); // Drop all frames
synchronized (mFrameSync) {
if ((mNewFrameLatch != null) && (mNewFrameLatch.getCount() > 0)) {
mNewFrameLatch.countDown();
}
}
}
public void queueFrame(File frame) {
if (mediaCodec == null || mediaMuxer == null) {
Log.d(TAG, "Failed to queue frame. Encoding not started");
return;
}
Log.d(TAG, "Queueing frame");
mEncodeQueue.add(frame);
synchronized (mFrameSync) {
if ((mNewFrameLatch != null) && (mNewFrameLatch.getCount() > 0)) {
mNewFrameLatch.countDown();
}
}
}
private void encode() {
Log.d(TAG, "Encoder started");
while (true) {
if (mNoMoreFrames && (mEncodeQueue.size() == 0)) break;
File frame = mEncodeQueue.poll();
if (frame == null) {
synchronized (mFrameSync) {
mNewFrameLatch = new CountDownLatch(1);
}
try {
mNewFrameLatch.await();
} catch (InterruptedException e) {
}
frame = mEncodeQueue.poll();
}
if (frame == null) continue;
int size = (int) frame.length();
byte[] bytesNV21 = new byte[size];
try {
BufferedInputStream buf = new BufferedInputStream(new FileInputStream(frame));
buf.read(bytesNV21, 0, bytesNV21.length);
buf.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
int offsetSize = mWidth * mHeight;
int byteNV21Offset = offsetSize;
int u, v, y1, y2, y3, y4;
//
int dataToHideLength = 0;
if (dataToHide != null)
dataToHideLength = dataToHide.length;
boolean isLastIndexInserted1 = false;
boolean isLastIndexInserted2 = false;
boolean isLastIndexInserted3 = false;
int uvIndex = 0;
int frameByteCapacity = ((mWidth * mHeight) / 4) / 20;
Log.e(TAG, "encode: dataToHideLength: " + dataToHideLength);
Log.e(TAG, "encode: frameByteCapacity: " + dataToHideLength);
//
// i walks the Y samples (and the final pixel index)
// k walks the interleaved U and V samples
for (int i = 0, k = 0; i < offsetSize; i += 2, k += 2) {
y1 = bytesNV21[i] & 0xff;
y2 = bytesNV21[i + 1] & 0xff;
y3 = bytesNV21[mWidth + i] & 0xff;
y4 = bytesNV21[mWidth + i + 1] & 0xff;
u = bytesNV21[byteNV21Offset + k] & 0xff;
v = bytesNV21[byteNV21Offset + k + 1] & 0xff;
// frame 1
// altering u and v for test
if (mGenerateIndex == 1) {
int Unew = u & 240;
int Vnew = v & 240;
if (uvIndex == 0) {
// used in start and end of stego bytes
int specialByte1Integer = 17;
int specialByte1P1 = specialByte1Integer & 240;
int specialByte1P2 = specialByte1Integer & 15;
// shift p1 right 4 position
specialByte1P1 = specialByte1P1 >> 4;
u = Unew | specialByte1P1;
v = Vnew | specialByte1P2;
}
bytesNV21[byteNV21Offset + k] = (byte) u;
bytesNV21[byteNV21Offset + k + 1] = (byte) v;
}
uvIndex++;
if (i != 0 && (i + 2) % mWidth == 0)
i += mWidth;
}
long TIMEOUT_USEC = 500000;
int inputBufIndex = mediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
long ptsUsec = computePresentationTime(mGenerateIndex, FRAME_RATE);
if (inputBufIndex >= 0) {
final ByteBuffer inputBuffer = mediaCodec.getInputBuffers()[inputBufIndex];
inputBuffer.clear();
inputBuffer.put(bytesNV21);
mediaCodec.queueInputBuffer(inputBufIndex, 0, bytesNV21.length, ptsUsec, 0);
mGenerateIndex++;
int percentComplete = 70 + (int) ((((double) mGenerateIndex) / (frameCount)) * 30);
if (_progressBarHandler != null) {
_progressBarHandler.sendMessage(_progressBarHandler.obtainMessage(percentComplete));
}
Log.w("creatingVideo: ", "is:" + percentComplete);
}
MediaCodec.BufferInfo mBufferInfo = new MediaCodec.BufferInfo();
int encoderStatus = mediaCodec.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
// no output available yet
Log.e(TAG, "No output from encoder available");
} else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
// not expected for an encoder
MediaFormat newFormat = mediaCodec.getOutputFormat();
mTrackIndex = mediaMuxer.addTrack(newFormat);
mediaMuxer.start();
} else if (encoderStatus < 0) {
Log.e(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
} else if (mBufferInfo.size != 0) {
ByteBuffer encodedData = mediaCodec.getOutputBuffers()[encoderStatus];
if (encodedData == null) {
Log.e(TAG, "encoderOutputBuffer " + encoderStatus + " was null");
} else {
encodedData.position(mBufferInfo.offset);
encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
mediaMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
mediaCodec.releaseOutputBuffer(encoderStatus, false);
}
}
}
release();
if (mAbort) {
mOutputFile.delete();
} else {
mCallback.onEncodingComplete(mOutputFile);
}
}
private void release() {
try {
if (mediaCodec != null) {
mediaCodec.stop();
mediaCodec.release();
mediaCodec = null;
Log.d(TAG, "RELEASE CODEC");
}
if (mediaMuxer != null) {
mediaMuxer.stop();
mediaMuxer.release();
mediaMuxer = null;
Log.d(TAG, "RELEASE MUXER");
}
} catch (Exception ignored) {
ErrorManager.getInstance().addErrorMessage("unsupported video file");
Message res = _processHandler.obtainMessage(ERROR_IN_PROCESS);
_processHandler.sendMessage(res);
}
}
private static MediaCodecInfo selectCodec(String mimeType) {
int numCodecs = MediaCodecList.getCodecCount();
for (int i = 0; i < numCodecs; i++) {
MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
if (!codecInfo.isEncoder()) {
continue;
}
String[] types = codecInfo.getSupportedTypes();
for (int j = 0; j < types.length; j++) {
if (types[j].equalsIgnoreCase(mimeType)) {
return codecInfo;
}
}
}
return null;
}
private static int selectColorFormat(MediaCodecInfo codecInfo,
String mimeType) {
MediaCodecInfo.CodecCapabilities capabilities = codecInfo
.getCapabilitiesForType(mimeType);
for (int i = 0; i < capabilities.colorFormats.length; i++) {
int colorFormat = capabilities.colorFormats[i];
if (isRecognizedFormat(colorFormat)) {
return colorFormat;
}
}
return 0; // not reached
}
private static boolean isRecognizedFormat(int colorFormat) {
switch (colorFormat) {
// these are the formats we know how to handle for
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
return true;
default:
return false;
}
}
private long computePresentationTime(long frameIndex, int framerate) {
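// pts in microseconds: a 132 µs initial offset, then frames evenly spaced
// 1,000,000 / framerate µs apart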
return 132 + frameIndex * 1000000 / framerate;
}}
The output video is created successfully without any problem, but MediaCodec has changed the altered test value and I cannot retrieve it.
Here is my question: is this the right approach for doing video steganography on Android? If it is not the right way, can you please make a suggestion?
Steganography comes with a prerequisite: lossless encoding.
None of the codecs available on Android support lossless video encoding, as of now.
So I'm afraid your LSBs will never remain the same after the encode/decode round trip.
Suggestion: if you don't have too many frames, use a lossless format instead; you may encode your frames into a sequence of PNG images.
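For example, here is a minimal sketch of that suggestion (assuming NV21 input; the class name and conversion constants are illustrative, and note that the YUV-to-RGB conversion is itself lossy, so the payload would have to be embedded after conversion, in the RGB domain):

import android.graphics.Bitmap;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public final class PngFrameWriter {
    // Converts one NV21 frame to RGB and stores it as a lossless PNG.
    public static void saveFrameAsPng(byte[] nv21, int width, int height, File out)
            throws IOException {
        int frameSize = width * height;
        int[] argb = new int[frameSize];
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int y = nv21[j * width + i] & 0xff;
                int uvIndex = frameSize + (j >> 1) * width + (i & ~1);
                int v = (nv21[uvIndex] & 0xff) - 128;     // NV21 interleaves V first
                int u = (nv21[uvIndex + 1] & 0xff) - 128;
                int r = clamp(y + (int) (1.402f * v));
                int g = clamp(y - (int) (0.344f * u + 0.714f * v));
                int b = clamp(y + (int) (1.772f * u));
                argb[j * width + i] = 0xff000000 | (r << 16) | (g << 8) | b;
            }
        }
        Bitmap bmp = Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888);
        try (FileOutputStream fos = new FileOutputStream(out)) {
            bmp.compress(Bitmap.CompressFormat.PNG, 100, fos); // PNG is lossless
        }
    }

    private static int clamp(int c) { return c < 0 ? 0 : Math.min(c, 255); }
}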
I am new to BLE development. I am creating a demo that shows the battery status and percentage. It works correctly, but sometimes it shows 0%, which is not correct.
Here is my code :
final byte[] data = characteristic.getValue();
if (data != null && data.length > 0) {
final StringBuilder stringBuilder = new StringBuilder(data.length);
for(byte byteChar : data)
stringBuilder.append(String.format("%02X ", byteChar));
final int flag = characteristic.getProperties();
int format = -1;
if ((flag & 0x01) != 0) {
format = BluetoothGattCharacteristic.FORMAT_UINT16;
Log.d(TAG, " format UINT16.");
} else {
format = BluetoothGattCharacteristic.FORMAT_UINT8;
Log.d(TAG, " UINT8.");
}
int batterylevel = characteristic.getIntValue(format, 0);
Intent in = new Intent(getApplicationContext(), HomeActivity.class);
in.putExtra("battery_status", String.valueOf(batterylevel));
in.setFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
startActivity(in);
finish();
}
Try to get the battery level like below:
characteristic.getValue()[0]
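A likely source of the bogus readings is that the heart-rate-style flag check does not apply here: getProperties() is not a flags byte, and the Battery Level characteristic (0x2A19) has no flags byte at all; it is a single uint8 in the range 0-100. A minimal sketch of the read:

// Battery Level (0x2A19) is one uint8 (0-100); read it directly instead of
// choosing between UINT8/UINT16 via a flag check.
Integer batteryLevel = characteristic.getIntValue(
        BluetoothGattCharacteristic.FORMAT_UINT8, 0);
if (batteryLevel != null) {
    Log.d(TAG, "Battery level: " + batteryLevel + "%");
}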
I own a Polar H10 device and I'm interested in the heart rate as well as the RR interval, which I read out with the official Bluetooth Low Energy API of Android. The Polar device sends a package with the heart rate and the RR interval every second. Now I have noticed that every such package contains a heart rate value, but some packages contain no RR-interval values (the value of the RR interval is -1).
Why does this happen? Is my device broken, did I make a mistake in the implementation, or does somebody else also face this issue?
Edit: Here is the code. In the method public void onCharacteristicChanged(BluetoothGatt gatt, BluetoothGattCharacteristic characteristic) I'm receiving changed values from the Polar device. This method is triggered approximately every second. Then I parse the characteristic as follows:
public int[] parse(BluetoothGattCharacteristic characteristic) {
double heartRate = extractHeartRate(characteristic);
Integer[] interval = extractBeatToBeatInterval(characteristic);
int[] result = null;
if (interval != null) {
result = new int[interval.length + 1];
} else {
result = new int[2];
result[1] = -1;
}
result[0] = (int) heartRate;
if (interval != null) {
for (int i = 0; i < interval.length; i++) {
result[i+1] = interval[i];
}
}
return result;
}
private static double extractHeartRate(
BluetoothGattCharacteristic characteristic) {
int flag = characteristic.getProperties();
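// NOTE: getProperties() returns the GATT properties, not the HRM flags byte;
// strictly, the flags should come from getIntValue(FORMAT_UINT8, 0) as in
// extractBeatToBeatInterval() below.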
Log.d(TAG, "Heart rate flag: " + flag);
int format = -1;
// Heart rate bit number format
if ((flag & 0x01) != 0) {
format = BluetoothGattCharacteristic.FORMAT_UINT16;
Log.d(TAG, "Heart rate format UINT16.");
} else {
format = BluetoothGattCharacteristic.FORMAT_UINT8;
Log.d(TAG, "Heart rate format UINT8.");
}
final int heartRate = characteristic.getIntValue(format, 1);
Log.d(TAG, String.format("Received heart rate: %d", heartRate));
return heartRate;
}
private static Integer[] extractBeatToBeatInterval(
BluetoothGattCharacteristic characteristic) {
int flag = characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT8, 0);
int format = -1;
int energy = -1;
int offset = 1; // This depends on the heart rate value format and on whether energy data is present
int rr_count = 0;
if ((flag & 0x01) != 0) {
format = BluetoothGattCharacteristic.FORMAT_UINT16;
Log.d(TAG, "Heart rate format UINT16.");
offset = 3;
} else {
format = BluetoothGattCharacteristic.FORMAT_UINT8;
Log.d(TAG, "Heart rate format UINT8.");
offset = 2;
}
if ((flag & 0x08) != 0) {
// energy expended present
energy = characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT16, offset);
offset += 2;
Log.d(TAG, "Received energy: {}"+ energy);
}
if ((flag & 0x10) != 0){
// RR stuff.
Log.d(TAG, "RR stuff found at offset: "+ offset);
Log.d(TAG, "RR length: "+ (characteristic.getValue()).length);
rr_count = ((characteristic.getValue()).length - offset) / 2;
Log.d(TAG, "RR length: "+ (characteristic.getValue()).length);
Log.d(TAG, "rr_count: "+ rr_count);
if (rr_count > 0) {
Integer[] mRr_values = new Integer[rr_count];
for (int i = 0; i < rr_count; i++) {
mRr_values[i] = characteristic.getIntValue(
BluetoothGattCharacteristic.FORMAT_UINT16, offset);
offset += 2;
Log.d(TAG, "Received RR: " + mRr_values[i]);
}
return mRr_values;
}
}
Log.d(TAG, "No RR data on this update: ");
return null;
}
The first element returned by the parse method is the heart rate and the second element is the RR interval. Sometimes the second element is -1 (i.e. no RR interval detected).
There is nothing wrong with your Polar device or the software you posted.
The RR-interval measure may be missing from some transmitted packets, and the if ((flag & 0x10) != 0) check accounts for exactly this case.
Suppose, for example, that your device sends a heart rate measure every second and you have 50 beats/min: the mean RR interval is then 60000/50 = 1200 ms, longer than the one-second reporting window, so some packets contain no completed beat-to-beat interval and carry no RR value (a simplified explanation, just to get the point across).
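One more detail worth checking when the numbers look odd: per the Heart Rate Measurement characteristic spec, each raw RR value is expressed in units of 1/1024 second, so a conversion is needed before interpreting the intervals as milliseconds:

// Raw RR values are in 1/1024-second units per the HRM spec,
// e.g. a raw value of 1024 corresponds to 1000 ms.
double rrMs = rrRaw * 1000.0 / 1024.0;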