Google Nearby video stream: client's SurfaceView blank - Android

Hello, I'm trying to stream video from the camera using Google Nearby Connections.
The connection is established and messaging works, but when I send the stream, the server device's SurfaceView shows the camera preview while the client receives nothing, with no error. This is how I stream:
server:
SurfaceView surface;

public void sendCam() {
    try {
        surface = findViewById(R.id.surface);

        // Pipe: the MediaRecorder writes into writeFD, Nearby streams from readFD.
        ParcelFileDescriptor[] payloadPipe = ParcelFileDescriptor.createPipe();
        ParcelFileDescriptor readFD = payloadPipe[0];
        ParcelFileDescriptor writeFD = payloadPipe[1];

        mCamera = Camera.open();
        MediaRecorder rec = new MediaRecorder();
        mCamera.lock();
        mCamera.unlock(); // the camera must be unlocked before MediaRecorder can use it
        rec.setCamera(mCamera);
        rec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        rec.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        rec.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        rec.setPreviewDisplay(surface.getHolder().getSurface());
        rec.setOutputFile(writeFD.getFileDescriptor());

        send(Payload.fromStream(readFD));

        rec.prepare();
        rec.start();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
client receive:
@Override
protected void onReceive(Endpoint endpoint, Payload payload) {
    if (payload.getType() == Payload.Type.STREAM) {
        try {
            MediaPlayer mMediaPlayer = new MediaPlayer();
            // mMediaPlayer.setDataSource(payload.asStream().asParcelFileDescriptor().getFileDescriptor()); // did not work either
            FileInputStream inputStream = new FileInputStream(
                    payload.asStream().asParcelFileDescriptor().getFileDescriptor());
            mMediaPlayer.setDataSource(inputStream.getFD());
            mMediaPlayer.setDisplay(surface.getHolder());
            mMediaPlayer.prepare();
            mMediaPlayer.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
I'm not even sure whether I can stream video using Nearby; the sample projects only show how to stream audio. Is this API unable to stream video, or is there a problem with my code, and if so, what could it be?
UPDATE:
The catch (IOException e) block gives me: setDataSource failed.: status=0x80000000

There's nothing in Nearby Connections that precludes you from sending video streams. If yours is a 1:1 scenario (one client device connected to one server device), it would be better to use the P2P_STAR Strategy (if you aren't already), since it provides a higher-bandwidth connection (which should help with video).
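For reference, switching strategy is just a matter of the options you pass when advertising/discovering. Here's a minimal sketch of the advertising side, assuming the standard ConnectionsClient API (SERVICE_ID and mConnectionLifecycleCallback stand in for whatever you already have):

import android.util.Log;
import com.google.android.gms.nearby.Nearby;
import com.google.android.gms.nearby.connection.AdvertisingOptions;
import com.google.android.gms.nearby.connection.Strategy;

// Minimal sketch: advertise with the higher-bandwidth star topology.
// SERVICE_ID and mConnectionLifecycleCallback are assumed to exist in your code.
private void startAdvertising() {
    AdvertisingOptions options =
            new AdvertisingOptions.Builder().setStrategy(Strategy.P2P_STAR).build();
    Nearby.getConnectionsClient(this)
            .startAdvertising("video-server", SERVICE_ID, mConnectionLifecycleCallback, options)
            .addOnSuccessListener(unused -> Log.d("Nearby", "Advertising with P2P_STAR"))
            .addOnFailureListener(e -> Log.e("Nearby", "Advertising failed", e));
}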
To make sure that you're at least receiving the stream's raw bytes on the client, you can check whether the onPayloadTransferUpdate() callback is firing.
If not, then you should check that same callback on the server, to see whether or not the bytes are being sent out.
If no bytes are being sent out from the server, then it might be a problem with your application-level video capture code.
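As a rough sketch of what that check can look like on either device (this uses the standard PayloadCallback from com.google.android.gms.nearby.connection; the logging is just illustrative):

import android.util.Log;
import com.google.android.gms.nearby.connection.Payload;
import com.google.android.gms.nearby.connection.PayloadCallback;
import com.google.android.gms.nearby.connection.PayloadTransferUpdate;

private final PayloadCallback mPayloadCallback = new PayloadCallback() {
    @Override
    public void onPayloadReceived(String endpointId, Payload payload) {
        // Hand the STREAM payload to your player/decoder here.
    }

    @Override
    public void onPayloadTransferUpdate(String endpointId, PayloadTransferUpdate update) {
        // If this never fires on the client, no bytes are arriving;
        // if it never fires on the server either, nothing is being sent out.
        Log.d("Nearby", "payload " + update.getPayloadId()
                + " status=" + update.getStatus()
                + " bytes=" + update.getBytesTransferred());
    }
};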
Good luck!

Related

Screen Sharing between Devices using Media Projection API

I am developing an app which has the functionality of sharing screens with other apps.
I used the Media projection API for this. I also used MediaMuxer to combine the audio and video outputs for screen sharing.
I know that the Media Projection APIs are used for screen recording, but all I want is to share the screen while recording.
For this, I have modified the writeSampleData method of the MediaMuxer class to send bytes via a socket to the other device over the network.
Below is the code for that:
OutputStream outStream;
outStream = ScreenRecordingActivity.getInstance().socket.getOutputStream();

void writeSampleData(final int trackIndex, final ByteBuffer byteBuf, final MediaCodec.BufferInfo bufferInfo) {
    if (mStatredCount > 0) {
        mMediaMuxer.writeSampleData(trackIndex, byteBuf, bufferInfo);
        if (bufferInfo.size != 0) {
            byteBuf.position(bufferInfo.offset);
            byteBuf.limit(bufferInfo.offset + bufferInfo.size);
            if (outStream != null) {
                try {
                    byte[] bytes = new byte[byteBuf.remaining()];
                    byteBuf.get(bytes);
                    // Send the data
                    outStream.write(bytes);
                    outStream.flush();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
The bytes are successfully transferred via socket and I am also able to receive these bytes at the receiver's end.
Below is the code for receiving bytes at the receiver's end:
private class SocketThread implements Runnable {
    @Override
    public void run() {
        Socket socket;
        try {
            serverSocket = new ServerSocket(SERVER_PORT);
        } catch (IOException e) {
            e.printStackTrace();
        }
        if (null != serverSocket) {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    socket = serverSocket.accept();
                    CommunicationThread commThread = new CommunicationThread(socket);
                    new Thread(commThread).start();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    class CommunicationThread implements Runnable {
        InputStream in;
        DataInputStream dis;

        public CommunicationThread(Socket clientSocket) {
            updateMessage("Server Started...");
        }

        public void run() {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    byte[] data = new byte[512];
                    // TODO: read from the socket's InputStream into `data` and feed it to a decoder
                } catch (Exception e) {
                    e.printStackTrace();
                    try {
                        fos.close(); // fos is defined elsewhere in the original code
                    } catch (Exception e1) {
                        e1.printStackTrace();
                    }
                }
            }
        }
    }
}
I followed these links for screen sharing:
Screen capture
screenrecorder
Screen recording with mediaProjection
I used some code from the above examples to make an app.
All I want to know is how to handle the bytes at the receiver. How do I format these bytes to play a live stream from the sender's side?
Am I following the correct approach for sending and receiving byte data?
Does MediaProjection allow one to stream the screen between applications while recording?
Any help will be deeply appreciated.
Generally for streaming, including screen sharing, the audio and video tracks are not muxed. Instead, each video frame and audio sample is sent using a protocol like RTP/RTSP, in which each data chunk is wrapped with other things like timestamps.
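As a simplified illustration of that wrapping (not tied to any particular library), a bare-bones 12-byte RTP header in front of each chunk can be built like this; payload type 96 and a 90 kHz timestamp clock are just the usual conventions for dynamic video payloads:

import java.nio.ByteBuffer;

// Minimal sketch: prepend a basic RTP header (no extensions, no CSRCs) to one payload chunk.
static byte[] wrapInRtp(byte[] payload, int sequenceNumber, long timestamp90kHz, int ssrc) {
    ByteBuffer packet = ByteBuffer.allocate(12 + payload.length);
    packet.put((byte) 0x80);                  // V=2, P=0, X=0, CC=0
    packet.put((byte) 96);                    // M=0, PT=96 (dynamic)
    packet.putShort((short) sequenceNumber);  // incremented per packet
    packet.putInt((int) timestamp90kHz);      // sampling instant of the frame
    packet.putInt(ssrc);                      // stream identifier
    packet.put(payload);                      // e.g. one H.264 NAL unit (or a fragment of one)
    return packet.array();
}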
You can take a look at spyadroid which is a good starting point for streaming audio and video over RTSP to a browser or VLC. It streams the camera and microphone but you can adapt it for your own use case.
If you want to go with sockets for the moment, you have to get rid of the MediaMuxer and send frames/samples directly from the encoder output, appending at least timestamps so playback can be synchronized on the receiver side. Before that, send the CSD data - assuming you encode in H.264 format - (the SPS and PPS, aka csd-0 and csd-1, which you can get when the encoder's output format changes) to the receiver's decoder, which you can configure with an output Surface to render your stream.
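A rough sketch of both ends under those assumptions: the MediaCodec calls are standard, but sendChunk() is a hypothetical placeholder for your transport, and the SPS/PPS buffers plus width/height on the receiver are assumed to have arrived over that same transport.

// Encoder side (fragment of a drain loop): publish csd-0/csd-1 once, then the frames.
void drainEncoder(MediaCodec encoder, MediaCodec.BufferInfo bufferInfo) {
    int outIndex = encoder.dequeueOutputBuffer(bufferInfo, 10000);
    if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        MediaFormat format = encoder.getOutputFormat();
        sendChunk(format.getByteBuffer("csd-0"), 0);  // SPS
        sendChunk(format.getByteBuffer("csd-1"), 0);  // PPS
    } else if (outIndex >= 0) {
        sendChunk(encoder.getOutputBuffer(outIndex), bufferInfo.presentationTimeUs);
        encoder.releaseOutputBuffer(outIndex, false);
    }
}

// Hypothetical transport hook: replace with your socket write / RTP packetizer.
void sendChunk(ByteBuffer chunk, long presentationTimeUs) { /* ... */ }

// Receiver side: configure a decoder with the Surface you want to render to,
// using the SPS/PPS received from the sender.
MediaCodec buildDecoder(ByteBuffer sps, ByteBuffer pps, int width, int height, Surface surface)
        throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
    format.setByteBuffer("csd-0", sps);
    format.setByteBuffer("csd-1", pps);
    MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
    decoder.configure(format, surface, null, 0);
    decoder.start();
    return decoder;
}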
Some extra links:
android-h264-stream-demo
RTMP Java Muxer for Android
RTSP
RTP
WebRTC

LibStreaming gives black screen when changing resolution

I have been using spydroid from https://github.com/fyhertz/spydroid-ipcamera.
The requirement is that streaming should be sent and received on the device, and we should be able to show the RTSP stream on the local network, e.g. in VLC Media Player.
The issue I am facing is that when I change the resolution, e.g. to 640*480, I get a black screen while streaming live. The default demo supports 320*240, which works fine. I have also changed the bitrate and framerate to match the 640*480 resolution, but that didn't help.
Any help would be appreciated.
You might be using the old library that the SpyDroid demo uses.
I had the same issue and tried the following:
Steps:
1.) Include the LibStreaming library.
It is the latest library and supports versions above Lollipop.
2.) Find the H263Stream class and change the following method:
From
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
    createCamera();
    updateCamera();
    try {
        if (mQuality.resX >= 640) {
            // Using the MediaCodec API with the buffer method for high resolutions is too slow
            mMode = MODE_MEDIARECORDER_API;
        }
        EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
        return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
    } catch (Exception e) {
        // Fallback on the old streaming method using the MediaRecorder API
        Log.e(TAG,"Resolution not supported with the MediaCodec API, we fallback on the old streamign method.");
        mMode = MODE_MEDIARECORDER_API;
        return testH264();
    }
}
To
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
    createCamera();
    updateCamera();
    try {
        if (mQuality.resX >= 1080) {
            // Using the MediaCodec API with the buffer method for high resolutions is too slow
            mMode = MODE_MEDIARECORDER_API;
        }
        EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
        return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
    } catch (Exception e) {
        // Fallback on the old streaming method using the MediaRecorder API
        Log.e(TAG,"Resolution not supported with the MediaCodec API, we fallback on the old streamign method.");
        mMode = MODE_MEDIARECORDER_API;
        return testH264();
    }
}
- The only difference is that the resolution threshold changes from "640" to "1080".
- I don't know the exact reason, but the above solution worked for me.
- Let me know if anything goes wrong.
In my case the problem was in MediaRecorder:
file: VideoStream.java
method: encodeWithMediaRecorder
MediaRecorder doesn't support ParcelFileDescriptor correctly. I created a local file to save the stream from MediaRecorder:
mMediaRecorder.setOutputFile(this.tmpFileToStream);
//mMediaRecorder.setOutputFile(fd); //disable..
and then I run a new thread to copy bytes from tmpFileToStream to mParcelWrite:
public void run()
{
    FileDescriptor fd = mParcelWrite.getFileDescriptor();
    try {
        InputStream isS = new FileInputStream(tmpFileToStream); // read from the local file
        FileOutputStream outputStream = new FileOutputStream(fd);
        while (!Thread.interrupted()) {
            int content;
            while ((content = isS.read()) != -1) {
                outputStream.write(content);
            }
            Thread.sleep(10); // wait for MediaRecorder to append more data
        }
    } catch (Exception e)
    {
        Log.e(TAG, "E.. " + e.getMessage());
    }
}
/**
 * Video encoding is done by a MediaRecorder.
 */
protected void encodeWithMediaRecorder() throws IOException, ConfNotSupportedException {
    Log.d(TAG, "Video encoded using the MediaRecorder API");
    Log.d(TAG, "Resolution " + mRequestedQuality.resX + " x " + mRequestedQuality.resY + " framerate " + mRequestedQuality.framerate);

    // We need a local socket to forward data output by the camera to the packetizer
    createSockets();

    // Reopens the camera if needed
    destroyCamera();
    createCamera();

    // The camera must be unlocked before the MediaRecorder can use it
    unlockCamera();

    this.tmpFileToStream = this.getOutputMediaFile(MEDIA_TYPE_VIDEO);
    Log.d(TAG, "Recording video to " + this.tmpFileToStream.getAbsolutePath());

    try {
        mMediaRecorder = new MediaRecorder();
        mMediaRecorder.setCamera(mCamera);
        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        // mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        mMediaRecorder.setVideoEncoder(mVideoEncoder);
        mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
        mMediaRecorder.setVideoSize(mRequestedQuality.resX, mRequestedQuality.resY);
        mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);

        // The bandwidth actually consumed is often above what was requested
        mMediaRecorder.setVideoEncodingBitRate((int) (mRequestedQuality.bitrate * 0.8));

        // We write the output of the camera in a local socket instead of a file !
        // This one little trick makes streaming feasible quite simply: data from the camera
        // can then be manipulated at the other end of the socket
        FileDescriptor fd = null;
        if (sPipeApi == PIPE_API_PFD) {
            fd = mParcelWrite.getFileDescriptor();
        } else {
            fd = mSender.getFileDescriptor();
        }

        mMediaRecorder.setOutputFile(this.tmpFileToStream); // save to the local file, then read and stream that
        // mMediaRecorder.setOutputFile(fd); // disabled..

        // copy bytes from the local file to mParcelWrite.. :)
        if (tInput == null) {
            this.tInput = new Thread(this);
            this.tInput.start();
        }

        mMediaRecorder.prepare();
        mMediaRecorder.start();
    } catch (Exception e) {
        StringWriter sw = new StringWriter();
        PrintWriter pw = new PrintWriter(sw);
        e.printStackTrace(pw);
        Log.d(TAG, "Error stack " + sw.toString());
        throw new ConfNotSupportedException(e.getMessage());
    }

    InputStream is = null;
    if (sPipeApi == PIPE_API_PFD) {
        is = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
    } else {
        is = mReceiver.getInputStream();
    }

    // This will skip the MPEG4 header; if this step fails we can't stream anything :(
    try {
        byte[] buffer = new byte[4];
        // Skip all atoms preceding the mdat atom
        while (!Thread.interrupted()) {
            while (is.read() != 'm');
            is.read(buffer, 0, 3);
            if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
        }
    } catch (IOException e) {
        Log.e(TAG, "Couldn't skip mp4 header :/");
        stop();
        throw e;
    }

    // The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
    mPacketizer.setInputStream(is);
    mPacketizer.start();

    mStreaming = true;
}
That's an unusual solution, but it works for other resolutions.

MediaRecorder issue on Android Lollipop

I'm testing libstreaming on the new Android Lollipop, and this code, which worked on previous releases, seems to throw an exception.
try {
    mMediaRecorder = new MediaRecorder();
    mMediaRecorder.setCamera(mCamera);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mMediaRecorder.setVideoEncoder(mVideoEncoder);
    mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
    mMediaRecorder.setVideoSize(mRequestedQuality.resX, mRequestedQuality.resY);
    mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);

    // The bandwidth actually consumed is often above what was requested
    mMediaRecorder.setVideoEncodingBitRate((int) (mRequestedQuality.bitrate * 0.8));

    // We write the output of the camera in a local socket instead of a file !
    // This one little trick makes streaming feasible quite simply: data from the camera
    // can then be manipulated at the other end of the socket
    mMediaRecorder.setOutputFile(mSender.getFileDescriptor());

    mMediaRecorder.prepare();
    mMediaRecorder.start();
} catch (Exception e) {
    throw new ConfNotSupportedException(e.getMessage());
}
The exception thrown is:
MediaRecorder: start failed -38
11-18 09:50:21.028: W/System.err(15783): net.majorkernelpanic.streaming.exceptions.ConfNotSupportedException
11-18 09:50:21.028: W/System.err(15783): at net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaRecorder(VideoStream.java:442)
11-18 09:50:21.028: W/System.err(15783): at net.majorkernelpanic.streaming.MediaStream.start(MediaStream.java:250)
I've tried commenting out:
mMediaRecorder.setOutputFile(mSender.getFileDescriptor());
No exception is thrown then, but when I start streaming, a dialog tells me that an output file is needed.
Help appreciated.
I filed a bug report on AOSP.
https://code.google.com/p/android/issues/detail?id=80715
"The current SELinux policies don't allow for mediaserver to handle app generated abstract unix domain sockets.
Instead, I'd recommend you create a pipe-pair ( http://developer.android.com/reference/android/os/ParcelFileDescriptor.html#createPipe() ) which is allowed by the Android 5.0 policy.
"
I don't know why they did this or how we were supposed to know.
I'm using a very old/modified (can't tell) version of libstreaming where MediaStream still extends MediaRecorder, but looking at the current version, in MediaStream you'll probably want to change createSockets to something including the following:
ParcelFileDescriptor[] parcelFileDescriptors = ParcelFileDescriptor.createPipe();
parcelRead = new ParcelFileDescriptor(parcelFileDescriptors[0]);
parcelWrite = new ParcelFileDescriptor(parcelFileDescriptors[1]);
then in your video/audio stream
setOutputFile(parcelWrite.getFileDescriptor());
and in that same file
change
// The packetizer encapsulates the bit stream in an RTP stream and send it over the network
mPacketizer.setInputStream(mReceiver.getInputStream());
mPacketizer.start();
to
InputStream is = null;
try {
    is = new ParcelFileDescriptor.AutoCloseInputStream(parcelRead);
} catch (Exception e) {
    // ignore
}
mPacketizer.setInputStream(is);
As andreasperelli pointed out in the comment, make sure to close the ParcelFileDescriptors in closeSockets(), or depending on your implementation and version, before closeSockets() and before you call MediaRecorder.stop().
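For example, a minimal sketch of that cleanup, reusing the parcelRead/parcelWrite fields from the snippet above:

// Close both ends of the pipe when tearing the stream down
// (e.g. from closeSockets(), or before MediaRecorder.stop() depending on your version).
private void closePipe() {
    try {
        if (parcelRead != null) parcelRead.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    try {
        if (parcelWrite != null) parcelWrite.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}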
On Android 6.0 I resolved this problem with the following code:
new Thread(new Runnable() {
    @Override
    public void run() {
        FileInputStream inputStream = null;
        try {
            inputStream = new FileInputStream(path);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        while (true) {
            byte[] buffer = new byte[0];
            try {
                buffer = new byte[inputStream.available()];
            } catch (IOException e) {
                e.printStackTrace();
            }
            try {
                inputStream.read(buffer);
            } catch (IOException e) {
                e.printStackTrace();
            }
            try {
                mSender.getOutputStream().write(buffer);
                mSender.getOutputStream().flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}).start();
I use a file as a buffer and write the bytes on another thread; the MediaRecorder outputs to that file.

IMediaDeathNotifier: media server died

Recently I came across this problem and have no clue. I want to use MediaRecorder to record a 720p video stream and send it to a server in real time. Here is my code:
try {
    soc = new Socket(InetAddress.getByName(hostname), port);
} catch (UnknownHostException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}

CamcorderProfile pProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
//pProfile.videoFrameWidth = 1280;
//pProfile.videoFrameHeight = 720;
recorder.setProfile(pProfile);
//recorder.setOutputFile(myRecAudioFile.getAbsolutePath()); // save path
pfd = ParcelFileDescriptor.fromSocket(soc);
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
The Android API level is 17. When I start the MediaRecorder, the program dies after just one or two seconds. The error log looks like this:
IMediaDeathNotifier- media server died
Camera - Camera server died
Can anybody who has come across a similar problem give me some advice?
I think you need to set a preview display for your MediaRecorder:
SurfaceView mySurfaceView = (SurfaceView) findViewById(R.id.my_surface_view);
Surface mySurface = mySurfaceView.getHolder().getSurface();
recorder.setPreviewDisplay(mySurface);
Apparently Android does not allow you to record video unless you have set a preview display surface (ref).

Android MediaRecorder in streaming

Is it possible to "stream" the result of MediaRecorder?
The only method I can see is mediaRecorder.setOutputFile, which receives a FileDescriptor. So I can write the result to a file or send it via socket to the receiver.
I tried the second solution, but the resulting video is corrupted because the stream is not "seekable".
The idea is to use the camera of an Android device to publish the result to Red5.
Yes, it's possible; there are many examples of that.
You can check out the sipdroid example.
Or even Android IP camera, which is much simpler.
Good Luck
Yes, it is possible.
Here is the sample code with FileDescriptor and socket:
socket = new Socket("192.168.1.234", 8888);
ParcelFileDescriptor fileDescriptor = ParcelFileDescriptor.fromSocket(socket);

mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setOutputFile(fileDescriptor.getFileDescriptor());
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

try {
    mRecorder.prepare();
} catch (IOException e) {
    Log.e(LOG_TAG, "prepare() failed");
}

mRecorder.start();
