IMediaDeathNotifier: media server died - android

Recently I came across this problem and have no clue. I want to use MediaRecorder to record a 720p video stream and send it to a server in real time. Here is my code:
try {
    soc = new Socket(InetAddress.getByName(hostname), port);
} catch (UnknownHostException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
CamcorderProfile pProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
//pProfile.videoFrameWidth = 1280;
//pProfile.videoFrameHeight = 720;
recorder.setProfile(pProfile);
//recorder.setOutputFile(myRecAudioFile.getAbsolutePath()); // save path
pfd = ParcelFileDescriptor.fromSocket(soc);
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.prepare();
recorder.start();
The Android API level is 17. When I start the media recorder, the program dies after just one or two seconds. The error log looks like this:
IMediaDeathNotifier - media server died
Camera - Camera server died
Can anybody who has come across a similar problem give me some advice?

I think you need to set a preview display for your MediaRecorder:
SurfaceView mySurfaceView = (SurfaceView) findViewById(R.id.my_surface_view);
Surface mySurface = mySurfaceView.getHolder().getSurface();
recorder.setPreviewDisplay(mySurface);
Apparently Android does not allow you to record video unless you have set a preview display surface (ref).
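Putting the answer together with the question's setup, here is a minimal sketch of the call order; the surface id and variable names are illustrative assumptions, and the key point is that the preview surface is set before prepare():
SurfaceView preview = (SurfaceView) findViewById(R.id.my_surface_view); // assumed layout id
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_720P));
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(soc); // soc: the connected Socket from the question
recorder.setOutputFile(pfd.getFileDescriptor());
recorder.setPreviewDisplay(preview.getHolder().getSurface()); // must happen before prepare()
try {
    recorder.prepare();
    recorder.start();
} catch (IOException e) {
    e.printStackTrace();
}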

Related

Is there a service which provides audio recording within an app?

I'm developing an app that uses the phone's microphone to record and store audio. However, the quality is horrible.
There are several voice recording apps that use the same mic, but their quality is exceptional.
Are there any services that would allow me to achieve this? I recall Twilio offering something like this before, but it seems to have been discontinued. Basically, users would be able to record audio clips and then store them for playback later. If a service can do either or both, it would be perfect.
Are you aware of any such service?
You can start MediaRecorder this way. The key is setAudioEncodingBitRate and setAudioSamplingRate:
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
// set the encoder rates after the format/encoder and before prepare(); these two are the key to quality
recorder.setAudioEncodingBitRate(384000);
recorder.setAudioSamplingRate(48000);
recorder.setOutputFile(outputFile); // path of your recording in phone storage
try {
    recorder.prepare();
    recorder.start();
} catch (IOException e) {
    Log.d(TAG, "onCreate: " + e);
}
And you can stop the MediaRecorder this way:
private void stopRecording() {
    File file = new File(outputFile); // the finished recording; you can play this file via MediaPlayer
    try {
        if (recorder != null) {
            recorder.stop();
            recorder.release();
            recorder = null;
        }
    } catch (Exception e) {
        Log.d(TAG, "stopMediaRecording: " + e);
    }
}
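Since the comment above says the result can be played via MediaPlayer, a minimal playback sketch (assuming outputFile is the same path passed to setOutputFile()) might look like this:
MediaPlayer player = new MediaPlayer();
try {
    player.setDataSource(outputFile); // same path the recorder wrote to
    player.prepare();
    player.start();
} catch (IOException e) {
    Log.d(TAG, "playRecording: " + e);
}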

Google-NearBy video stream client's SurfaceView blank

Hello, I'm trying to stream video from the camera over Google Nearby Connections.
The connection is established and messaging works, but when I send the stream, the server device's SurfaceView shows the camera preview while the client receives nothing, with no error. This is how I stream:
server:
SurfaceView surface;

public void sendCam() {
    try {
        surface = findViewById(R.id.surface);
        ParcelFileDescriptor[] payloadPipe = ParcelFileDescriptor.createPipe();
        ParcelFileDescriptor readFD = payloadPipe[0];
        ParcelFileDescriptor writeFD = payloadPipe[1];
        mCamera = Camera.open();
        MediaRecorder rec = new MediaRecorder();
        mCamera.unlock(); // release the camera lock so MediaRecorder can use it
        rec.setCamera(mCamera);
        rec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        rec.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        rec.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        rec.setPreviewDisplay(surface.getHolder().getSurface());
        rec.setOutputFile(writeFD.getFileDescriptor());
        send(Payload.fromStream(readFD));
        rec.prepare();
        rec.start();
    } catch (Exception e) {
        e.printStackTrace(); // don't swallow setup failures silently
    }
}
client receive:
@Override
protected void onReceive(Endpoint endpoint, Payload payload) {
    if (payload.getType() == Payload.Type.STREAM) {
        try {
            MediaPlayer mMediaPlayer = new MediaPlayer();
            // mMediaPlayer.setDataSource(payload.asStream().asParcelFileDescriptor().getFileDescriptor()); // did not work either
            FileInputStream inputStream = new FileInputStream(payload.asStream().asParcelFileDescriptor().getFileDescriptor());
            mMediaPlayer.setDataSource(inputStream.getFD());
            mMediaPlayer.setDisplay(surface.getHolder());
            mMediaPlayer.prepare();
            mMediaPlayer.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
I'm not even sure whether I can stream video using Nearby; the sample projects only show how to stream audio. Is this API unable to stream video, or is there a problem with my code, and what could it be?
UPDATE:
The } catch (IOException e) { block gives me: setDataSource failed.: status=0x80000000
There's nothing in Nearby Connections that precludes you from sending video streams. If yours is a 1:1 scenario (one client device connected to one server device), it is better to use the P2P_STAR Strategy (if you aren't already), since it provides a higher-bandwidth connection, which should help with video.
To make sure that you're at least receiving the stream's raw bytes on the client, check whether the onPayloadTransferUpdate() callback is firing.
If not, check that same callback on the server to see whether the bytes are being sent out at all.
If no bytes are being sent out from the server, the problem is likely in your application-level video capture code.
Good luck!
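As a rough sketch of that diagnostic, a client-side PayloadCallback could log every transfer update; the callback methods are from the Nearby Connections API, while the tag and log format are illustrative:
private final PayloadCallback payloadCallback = new PayloadCallback() {
    @Override
    public void onPayloadReceived(String endpointId, Payload payload) {
        // hand the STREAM payload to the playback code here
    }

    @Override
    public void onPayloadTransferUpdate(String endpointId, PayloadTransferUpdate update) {
        // if this never fires on the client, no bytes are arriving at all
        Log.d("Nearby", "payload " + update.getPayloadId()
                + ": " + update.getBytesTransferred()
                + "/" + update.getTotalBytes()
                + " bytes, status " + update.getStatus());
    }
};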

Media Player Show video

Why doesn't the MediaPlayer show the video as soon as it is available? What I mean is: on the iPhone, when a video is played, the video shows up right away, even when returning from pause. But on Android the screen stays black for anywhere from a couple of milliseconds to a second, depending on the device used and how many processes are running in the background.
I'm asking this because I want to use one of the beginning frames of my video as a kind of screenshot, and currently I'm using a handler to wait 1 second before pausing the video.
Can someone tell me a quick way to make the video show up as soon as it is started, or even prepared, instead of my workaround?
EDIT:
Here is how I prepare my video player, so it should be prepared correctly.
private void initVideo()
{
    Log.i("VideoPlayer", "Initialize Video File" + videoFileName);
    AssetFileDescriptor afd;
    try {
        if (videoFileName != null) // note: no stray ';' after the condition, or the check is a no-op
        {
            afd = getAssets().openFd(videoFileName);
            vidplayer = new MediaPlayer();
            vidplayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getDeclaredLength());
            vidplayer.setDisplay(holder);
            vidplayer.prepare();
            vidplayer.setOnCompletionListener(this);
            vidplayer.setOnPreparedListener(this);
            //Log.i("INITVIDEO", Integer.toString(videoPausedAt));
            vidplayer.seekTo(videoPausedAt);
            //Log.i("VideoPlayer", "video Prepared");
            videoDuration = vidplayer.getDuration() / 1000;
            isVideoReady = true;
        }
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (Exception e) {
        //Log.i("InitPlayer", e.getClass().toString());
        e.printStackTrace();
    }
}
For the background, you can get a thumbnail of the video:
private Bitmap getThumbnail(String path) {
    try {
        return ThumbnailUtils.createVideoThumbnail(path, MediaStore.Images.Thumbnails.MINI_KIND);
    } catch (Exception e) {
        return null;
    }
}
When the video starts, you'll need to set the background back to null, or you won't be able to see the video.
As for it not playing right away: it should play as soon as start() is called if you prepared it correctly, but it can be delayed if it has to load data, for example from a stream over the internet.
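On API 17+, one way to put that advice together is to show the thumbnail in an ImageView layered over the video surface and clear it once the first frame renders; the placeholder view and videoPath variable are assumptions for illustration:
placeholder.setImageBitmap(getThumbnail(videoPath)); // placeholder: an ImageView over the SurfaceView
vidplayer.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_INFO_VIDEO_RENDERING_START) {
            placeholder.setImageBitmap(null); // first frame is on screen; reveal the video
        }
        return true;
    }
});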
I have found that it is (mostly) the phone's fault. Videos show up right away unless the phone is bogged down with apps, so loading the video takes longer (noticed after having a VoIP service running).

Error opening android camera for streaming video

I'm trying to write a video stream from my Galaxy Tab to a server.
According to this manual, I should do something like this:
frontCamera = getFrontCamera();
if ((socket != null) && (frontCamera != null)) {
    try {
        frontCamera.setPreviewDisplay(cameraPreview.getHolder());
    } catch (IOException e1) {
        Log.e("", "", e1);
    }
    frontCamera.startPreview();
    recorder = new MediaRecorder();
    frontCamera.unlock();
    recorder.setCamera(frontCamera);
    recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    pfd = ParcelFileDescriptor.fromSocket(socket);
    recorder.setOutputFile(pfd.getFileDescriptor());
    recorder.setPreviewDisplay(cameraPreview.getHolder().getSurface());
    try {
        recorder.prepare();
        recorder.start();
    } catch (IllegalStateException e) {
        Log.e("", "", e);
    } catch (IOException e) {
        Log.e("", "", e);
    }
}
But it all fails at recorder.start(); with a strange error:
02-01 19:03:39.265: E/MediaRecorder(11922): start failed: -19
What does that mean, and what should I do to start the recorder?
UPD:
The trouble happens because of my getFrontCamera() method; when I replace it with Camera.open(), everything works correctly.
protected Camera getFrontCamera()
{
    Camera.CameraInfo inf = new Camera.CameraInfo();
    for (int i = 0; i < Camera.getNumberOfCameras(); i++) {
        Camera.getCameraInfo(i, inf);
        if (inf.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            return Camera.open(i);
        }
    }
    return null;
}
UPD 2: yes, explicitly setting the format and encoders solved the trouble:
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
Maybe because the pre-built profiles are for the back camera... But strange anyway.
I don't see an output format setup, so try adding this to the recorder setup:
recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
Have a look.
And since it is streaming video, set:
recorder.setOutputFormat(8); // 8 corresponds to MediaRecorder.OutputFormat.MPEG_2_TS, a streamable container
recorder.setOutputFile(socketFd);
Have fun.
I have a hack here: extending the MediaRecorder class and removing the super.setVideoFrameRate(rate) call solves the problem for me.
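For reference, that hack would look roughly like this; whether skipping the frame-rate call is safe depends on the device, so treat it as a workaround, not a fix:
public class NoFrameRateRecorder extends MediaRecorder {
    @Override
    public void setVideoFrameRate(int rate) {
        // intentionally empty: do not call super.setVideoFrameRate(rate)
    }
}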
If you still want to use CamcorderProfile.QUALITY_HIGH with the front camera, you can use the following:
CamcorderProfile camcorderProfile = CamcorderProfile.get(currentCameraId, CamcorderProfile.QUALITY_HIGH);
recorder.setProfile(camcorderProfile);
where int currentCameraId is Camera.CameraInfo.CAMERA_FACING_BACK or ...FRONT
So the profile is indeed dependent on the camera (high-end phones appear to work fine without the distinction, since they all support 1080p by now, but low-end phones may crash otherwise).
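If you want to guard against profiles a given camera does not report, a small sketch with CamcorderProfile.hasProfile() (same currentCameraId as above) could fall back to a quality that every camera must provide:
int quality = CamcorderProfile.hasProfile(currentCameraId, CamcorderProfile.QUALITY_HIGH)
        ? CamcorderProfile.QUALITY_HIGH
        : CamcorderProfile.QUALITY_LOW; // fall back to the lowest quality if HIGH is not reported
recorder.setProfile(CamcorderProfile.get(currentCameraId, quality));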

android: record audio with MediaPlayer on emulator

I am trying to record audio from the microphone on the Android emulator with this code:
recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(Environment.getExternalStorageDirectory() + "/test/test.3gp");
try {
    recorder.prepare();
    recorder.start(); // only start if prepare() succeeded
}
catch (IOException io) {
    Log.v(LOG_TAG, "Could not prepare the audio " + io.getMessage());
}
For stopping the audio, this is the code:
recorder.stop();
recorder.reset();
recorder.release();
The recording process works fine, but the resulting audio is distorted. When I record audio for 60 seconds and play it back, its duration is shown as 120 seconds. The measurement is not exact, but this is just to give you an idea.
Only the AMR_NB encoder works on my emulator. I have tried different output formats, but the result is always the same.
Is it a limitation of the emulator, or am I doing something wrong here?
Edit 1:
I have tried the AudioRecord class too, and the result is the same dragged-out audio.
Thanks.
I have been working on the same thing and found the solution. Try using the following code:
private void startRecording()
{
    this.recorder = new MediaRecorder();
    this.recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    this.recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    this.recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    this.recorder.setOutputFile(this.getFilename());
    this.recorder.setOnErrorListener(this.errorListener);
    this.recorder.setOnInfoListener(this.infoListener);
    try {
        this.recorder.prepare();
        this.recorder.start();
    } catch (final IllegalStateException e) {
        e.printStackTrace();
    } catch (final IOException e) {
        e.printStackTrace();
    }
}
This is working perfectly. Hope it helps you :)
