Is it possible to "stream" the result of MediaRecorder?
The only method I can see is mediaRecorder.setOutputFile, which receives a FileDescriptor. So I can write the result to a file or send it via socket to a receiver.
I tried the second solution, but the resulting video is corrupted because it is not "seekable" in a stream.
The idea is to use the camera of an Android device to publish the result to Red5.
Yes, it is possible; there are many examples of that.
You can check out the sipdroid example.
Or even Android IP camera, which is much simpler.
Good Luck
Yes it is possible.
Here is sample code using a FileDescriptor obtained from a socket (note the missing parentheses on getFileDescriptor() would not compile, so they are added here):
socket = new Socket("192.168.1.234", 8888);
ParcelFileDescriptor fileDescriptor = ParcelFileDescriptor.fromSocket(socket);

mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setOutputFile(fileDescriptor.getFileDescriptor());
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

try {
    mRecorder.prepare();
} catch (IOException e) {
    Log.e(LOG_TAG, "prepare() failed");
}

mRecorder.start();
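The answer above only shows the sending side. A minimal sketch of a hypothetical receiving side (not part of the original answer) is a plain ServerSocket that accepts one connection and copies the incoming 3GP bytes to a file. The caveat from the question still applies: the resulting container may not be seekable or playable, because MediaRecorder writes header data via seeks that a socket cannot honor.

```java
import java.io.*;
import java.net.ServerSocket;
import java.net.Socket;

public class RecorderSink {
    // Copy everything from in to out in fixed-size chunks; returns total bytes copied.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Accept a single connection on the port used in the answer above
        // and dump whatever MediaRecorder sends into a local file.
        try (ServerSocket server = new ServerSocket(8888);
             Socket client = server.accept();
             OutputStream file = new FileOutputStream("received.3gp")) {
            System.out.println("received " + copy(client.getInputStream(), file) + " bytes");
        }
    }
}
```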
Hello, I'm trying to stream video from the camera via Google Nearby Connections.
The connection is established and messaging works, but when I send the stream, the server device's SurfaceView shows the camera preview while the client receives nothing, with no error. This is how I stream:
server:
SurfaceView surface;

public void sendCam() {
    try {
        surface = findViewById(R.id.surface);
        ParcelFileDescriptor[] payloadPipe = ParcelFileDescriptor.createPipe();
        ParcelFileDescriptor readFD = payloadPipe[0];
        ParcelFileDescriptor writeFD = payloadPipe[1];
        mCamera = Camera.open();
        MediaRecorder rec = new MediaRecorder();
        mCamera.lock();
        mCamera.unlock();
        rec.setCamera(mCamera);
        rec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        rec.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        rec.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        rec.setPreviewDisplay(surface.getHolder().getSurface());
        rec.setOutputFile(writeFD.getFileDescriptor());
        send(Payload.fromStream(readFD));
        rec.prepare();
        rec.start();
    } catch (Exception e) {
    }
}
client (receiving side):
@Override
protected void onReceive(Endpoint endpoint, Payload payload) {
    if (payload.getType() == Payload.Type.STREAM) {
        try {
            MediaPlayer mMediaPlayer = new MediaPlayer();
            // mMediaPlayer.setDataSource(payload.asStream().asParcelFileDescriptor().getFileDescriptor()); // did not work either
            FileInputStream inputStream = new FileInputStream(payload.asStream().asParcelFileDescriptor().getFileDescriptor());
            mMediaPlayer.setDataSource(inputStream.getFD());
            mMediaPlayer.setDisplay(surface.getHolder());
            mMediaPlayer.prepare();
            mMediaPlayer.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
I'm not even sure whether I can stream video using Nearby; the sample projects only show how to stream audio. Is this API unable to stream video, or is there a problem with my code, and if so, what could it be?
UPDATE:
The catch (IOException e) block gives me: setDataSource failed.: status=0x80000000
There's nothing in Nearby Connections that precludes you from sending video streams. If yours is a 1:1 scenario (one client device connected to one server device), it would be better to use the P2P_STAR Strategy (if you aren't already), since it provides a higher-bandwidth connection (which should help with video).
To make sure that you're at least receiving the stream's raw bytes on the client, you can check whether the onPayloadTransferUpdate() callback is firing.
If not, then you should check that same callback on the server, to see whether or not the bytes are being sent out.
If no bytes are being sent out from the server, then it might be a problem with your application-level video capture code.
Good luck!
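The progress check described above can be sketched in plain Java. Note that PayloadTransferUpdate and the callback below are simplified stand-ins for the Nearby Connections classes and methods of the same names, so this compiles without the Play Services SDK; it only illustrates the kind of logging you would do inside the real onPayloadTransferUpdate().

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for com.google.android.gms.nearby.connection.PayloadTransferUpdate.
class PayloadTransferUpdate {
    final long payloadId;
    final long bytesTransferred;

    PayloadTransferUpdate(long payloadId, long bytesTransferred) {
        this.payloadId = payloadId;
        this.bytesTransferred = bytesTransferred;
    }
}

class TransferLogger {
    final List<String> log = new ArrayList<>();

    // Analogous to the real callback: record how many bytes have arrived so far,
    // so you can tell whether the stream is flowing at all on either side.
    void onPayloadTransferUpdate(String endpointId, PayloadTransferUpdate update) {
        log.add("endpoint=" + endpointId
                + " payload=" + update.payloadId
                + " bytes=" + update.bytesTransferred);
    }
}
```

If this fires on the server but not the client, the problem is in the transport; if it fires on neither, the capture side is likely at fault, as noted above.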
I've been struggling with this for quite a long time and decided I need help.
The basic idea is described here: Link
I want to stream the output of the MediaRecorder directly to my server over sockets while recording. (In this specific case I want to see whether this is possible without using any streaming protocol, just via an HTTP POST.)
The error of my current approach is: E/MediaRecorder: start failed: -2147483648 - RuntimeException
Before that I had an error pointing me to this topic: mediarecorder-issue-on-lollipop
I use an AsyncTask to call my recordData() function:
new sendAsync().execute("test");
private void recordData(Socket socket) {
    try {
        OutputStream out = socket.getOutputStream();
        InputStream is = null;
        ParcelFileDescriptor[] parcelFileDescriptors = ParcelFileDescriptor.createPipe();
        parcelRead = new ParcelFileDescriptor(parcelFileDescriptors[0]);
        parcelWrite = new ParcelFileDescriptor(parcelFileDescriptors[1]);
        try {
            is = new ParcelFileDescriptor.AutoCloseInputStream(parcelRead);
        } catch (Exception e) {
            e.printStackTrace();
        }
        //pfd = ParcelFileDescriptor.fromSocket(socket);
        sft = new SurfaceTexture(0);
        sf = new Surface(sft);
        if (recorder == null) {
            recorder = new MediaRecorder();
            recorder.reset();
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
            recorder.setVideoFrameRate(30);
            recorder.setPreviewDisplay(sf);
            recorder.setOutputFile(parcelWrite.getFileDescriptor());
            recorder.setMaxDuration(10000); // 10 seconds
            try {
                recorder.prepare();
                recorder.start();
                System.out.println("This is Recorder running");
            } catch (IOException e) {
                e.printStackTrace();
            }
            byte[] buffer = new byte[16384];
            int byteread = 0;
            while ((byteread = is.read(buffer, 0, buffer.length)) != -1) {
                out.write(buffer, 0, byteread);
            }
            out.flush();
        } else {
            stopRecording();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Sorry for the bad coding style.
I tried changing the OutputFormat and VideoEncoder following other solutions to this topic, but I'm still not sure whether I'm going in the right direction.
Even after getting around this bug, I think I somehow need to read out and send the stream to the server in its own thread.
Any hint could help.
I'm not sure if this is an error due to recent Android versions (tested on Nougat and Oreo), but as much as I tried to find a workaround, I still ended up with the same error message when using MediaRecorder.
Thanks to libstreaming, I now use the MediaCodec API as explained in their example of how to use the library: Example 2.
Then I take the InputStream of the MediaCodec API, read it into a byte array until a certain size is reached, and send that to my HTTP server.
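The "read until a certain size is reached, then send" step could be sketched like this. This is a plain-Java sketch, not the author's actual code: the send callback is a hypothetical stand-in for whatever HTTP POST client is used, and the chunk size is arbitrary.

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.function.Consumer;

public class ChunkedUploader {
    // Read the stream in fixed-size chunks and hand each full (or final partial)
    // chunk to `send`, which would do the actual HTTP POST in a real app.
    static void pump(InputStream in, int chunkSize, Consumer<byte[]> send) throws IOException {
        byte[] buf = new byte[chunkSize];
        int filled = 0;
        int n;
        while ((n = in.read(buf, filled, chunkSize - filled)) != -1) {
            filled += n;
            if (filled == chunkSize) {
                send.accept(buf.clone()); // full chunk: ship it
                filled = 0;
            }
        }
        if (filled > 0) { // final partial chunk after the stream ends
            byte[] last = new byte[filled];
            System.arraycopy(buf, 0, last, 0, filled);
            send.accept(last);
        }
    }
}
```

Running this in its own thread, with the encoder's InputStream as input, matches the threading point raised in the question above.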
I'm testing libstreaming on the new Android Lollipop, and this code, which worked on the previous release, seems to throw an exception.
try {
    mMediaRecorder = new MediaRecorder();
    mMediaRecorder.setCamera(mCamera);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    mMediaRecorder.setVideoEncoder(mVideoEncoder);
    mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
    mMediaRecorder.setVideoSize(mRequestedQuality.resX, mRequestedQuality.resY);
    mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
    // The bandwidth actually consumed is often above what was requested
    mMediaRecorder.setVideoEncodingBitRate((int)(mRequestedQuality.bitrate*0.8));
    // We write the output of the camera to a local socket instead of a file!
    // This one little trick makes streaming feasible quite simply: data from the camera
    // can then be manipulated at the other end of the socket
    mMediaRecorder.setOutputFile(mSender.getFileDescriptor());
    mMediaRecorder.prepare();
    mMediaRecorder.start();
} catch (Exception e) {
    throw new ConfNotSupportedException(e.getMessage());
}
The exception thrown is:
MediaRecorder: start failed -38
11-18 09:50:21.028: W/System.err(15783): net.majorkernelpanic.streaming.exceptions.ConfNotSupportedException
11-18 09:50:21.028: W/System.err(15783): at net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaRecorder(VideoStream.java:442)
11-18 09:50:21.028: W/System.err(15783): at net.majorkernelpanic.streaming.MediaStream.start(MediaStream.java:250)
I've tried commenting out:
mMediaRecorder.setOutputFile(mSender.getFileDescriptor());
No exception is thrown then, but when I start streaming, a dialog tells me that an output file is needed.
Help appreciated.
I filed a bug report on AOSP.
https://code.google.com/p/android/issues/detail?id=80715
"The current SELinux policies don't allow for mediaserver to handle app generated abstract unix domain sockets.
Instead, I'd recommend you create a pipe-pair ( http://developer.android.com/reference/android/os/ParcelFileDescriptor.html#createPipe() ) which is allowed by the Android 5.0 policy.
"
I don't know why they did this or how we were supposed to know.
I'm using a very old/modified (can't tell which) version of libstreaming where MediaStream still extends MediaRecorder, but looking at the current version, in MediaStream you'll probably want to change createSockets() to something including the following:
ParcelFileDescriptor[] parcelFileDescriptors = ParcelFileDescriptor.createPipe();
parcelRead = new ParcelFileDescriptor(parcelFileDescriptors[0]);
parcelWrite = new ParcelFileDescriptor(parcelFileDescriptors[1]);
then in your video/audio stream
setOutputFile(parcelWrite.getFileDescriptor());
and in that same file
change
// The packetizer encapsulates the bit stream in an RTP stream and send it over the network
mPacketizer.setInputStream(mReceiver.getInputStream());
mPacketizer.start();
to
InputStream is = null;
try {
    is = new ParcelFileDescriptor.AutoCloseInputStream(parcelRead);
} catch (Exception e) {
    e.printStackTrace();
}
mPacketizer.setInputStream(is);
As andreasperelli pointed out in the comment, make sure to close the ParcelFileDescriptors in closeSockets(), or depending on your implementation and version, before closeSockets() and before you call MediaRecorder.stop().
On Android 6.0 I resolved this problem with the following code:
new Thread(new Runnable() {
    @Override
    public void run() {
        FileInputStream inputStream = null;
        try {
            inputStream = new FileInputStream(path);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        // Busy-polls the growing file and forwards whatever bytes are currently available.
        while (true) {
            byte[] buffer = new byte[0];
            try {
                buffer = new byte[inputStream.available()];
            } catch (IOException e) {
                e.printStackTrace();
            }
            try {
                inputStream.read(buffer);
            } catch (IOException e) {
                e.printStackTrace();
            }
            try {
                mSender.getOutputStream().write(buffer);
                mSender.getOutputStream().flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}).start();
I use a file as a buffer: the MediaRecorder writes its output to the file, and another thread reads the bytes and sends them.
I have an application which starts recording from the mic when there is an outgoing or incoming call on my phone...
mRecorder = new MediaRecorder();
mRecorder.reset();
// mRecorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mRecorder.setOutputFile(mFileName);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
    mRecorder.prepare();
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
mRecorder.start();
It works on Android 2.x. Now I have tried it on an HTC with Android 4.0 and was surprised: everything seems to work fine, but the recordings that were made are simply empty. At the beginning there is a sound for a second, and then silence. In the log I see something like:
09-19 14:10:36.462: I/NvAudioALSADevice(143): getMicMute state=F
09-19 14:10:36.462: I/NvAudioALSADevice(143): setMicMute state=F
I have found info about the Android audio interface at:
http://www.kandroid.org/online-pdk/guide/classandroid_1_1AudioHardwareInterface.html#_details
How can I work around this bug?
I'm new to Android development and I have the following question/problem.
I'm playing around with the MediaRecorder class to record audio from the microphone, following the steps described on the official site: http://developer.android.com/reference/android/media/MediaRecorder.html
So I have a method that initializes and configures the MediaRecorder object in order to start recording. Here is the code:
// initialize the audio recorder
MediaRecorder mrecorder = new MediaRecorder();
// configure the input source
mrecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
// set the output format
mrecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
// set the audio encoding
mrecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
// specify the output file
mrecorder.setOutputFile("/sdcard/test.3gp");
// prepare for recording
try {
    mrecorder.prepare();
} catch (IllegalStateException e) {
    e.printStackTrace();
    Log.d("Syso", e.toString());
} catch (IOException e) {
    e.printStackTrace();
    Log.d("Syso", e.toString());
}
When I execute this code in the emulator, I can see in logcat that the prepare() call throws an exception:
java.io.FileNotFoundException: /sdcard/test.3gp (Permission denied)
And I have no idea why this is happening. Because of the exception message, I've tried granting storage access in the manifest by adding the following line:
<uses-permission android:name="android.permission.STORAGE" />
But this doesn't fix anything and I still get the same exception every time. The SD card is mounted according to the emulator, so I have no clue.
Add the WRITE_EXTERNAL_STORAGE permission to AndroidManifest.xml; android.permission.STORAGE is not a valid permission name.
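The corrected manifest entry looks like this:

```xml
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
```

It goes inside the <manifest> element, alongside the invalid line shown in the question.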