It runs and records video for 3 to 4 minutes, then stops recording.
The log file shows: Media server died, Camera died, Error 100.
The problem seems to be with the MediaRecorder / CamcorderProfile settings.
I am using the Android Emulator with a Nexus 7 AVD.
Video settings are:
private void StartVideoRecording(Camera videoCamera) {
    videoRecorder = new MediaRecorder();
    videoRecorder.setCamera(videoCamera);
    videoRecorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
    videoRecorder.setOutputFormat(cameraProfile.fileFormat); // look at ARTPWriter.cpp
    // videoRecorder.setOutputFormat(1);
    videoRecorder.setVideoEncoder(cameraProfile.videoCodec);
    videoRecorder.setVideoFrameRate(cameraProfile.videoFrameRate);
    videoRecorder.setVideoSize(cameraProfile.videoFrameWidth, cameraProfile.videoFrameHeight);
    videoRecorder.setVideoEncodingBitRate(cameraProfile.videoBitRate);
    videoRecorder.setOutputFile("/mnt/sdcard/.Y3KVideo"); // file not used, but needed for prepare() to succeed
    videoRecorder.setPreviewDisplay(cameraViewSurfaceHolder.getSurface());
    videoRecorder.setOrientationHint(90);
    try {
        videoRecorder.prepare();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    videoRecorder.start();
}
CamcorderProfile settings:
private void configureCameraProfile() {
    cameraProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_LOW);
    cameraProfile.videoCodec = MediaRecorder.VideoEncoder.H264;
    // cameraProfile.fileFormat = MediaRecorder.OutputFormat.THREE_GPP;
    cameraProfile.fileFormat = 7; // RTP (the hidden OUTPUT_FORMAT_RTP_AVP constant)
    cameraProfile.videoFrameWidth = 176; // QCIF
    cameraProfile.videoFrameHeight = 144;
    cameraProfile.videoBitRate = 128000;
    cameraProfile.videoFrameRate = 15;
    Log.i(TAG, " bitrate=" + cameraProfile.videoBitRate + ",width=" + cameraProfile.videoFrameWidth + ",height=" + cameraProfile.videoFrameHeight);
}
MediaRecorder may work in the emulator, but the settings you provided are wrong.
Please check the MediaRecorder documentation and configure the settings accordingly:
http://developer.android.com/reference/android/media/MediaRecorder.html
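For illustration, a minimal known-good configuration might look like the sketch below. The output path and the 176x144 / H263 choices are placeholders rather than recommendations, and videoCamera / holder stand in for the question's objects; the point is the documented call order and a public output format instead of the hidden RTP value 7:

MediaRecorder recorder = new MediaRecorder();
videoCamera.unlock(); // the camera must be unlocked before MediaRecorder can use it
recorder.setCamera(videoCamera);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); // public constant, unlike 7 (RTP)
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263); // pairs safely with 3GP
recorder.setVideoSize(176, 144);
recorder.setVideoFrameRate(15);
recorder.setOutputFile("/mnt/sdcard/test.3gp"); // placeholder path
recorder.setPreviewDisplay(holder.getSurface());
try {
    recorder.prepare();
    recorder.start();
} catch (IllegalStateException e) {
    Log.e(TAG, "recorder setup failed", e);
} catch (IOException e) {
    Log.e(TAG, "recorder setup failed", e);
}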
I am using spydroid from https://github.com/fyhertz/spydroid-ipcamera.
The requirement is that the device should both send and receive the stream, and the RTSP stream should be viewable from the local network, e.g. in VLC Media Player.
The issue I am facing is that when I change the resolution, e.g. to 640*480, I get a black screen while the stream is live. The default demo supports 320*240, which works fine. I have also changed the bitrate and framerate to match the 640*480 resolution, but that didn't help.
Any help would be appreciated.
You might be using the old library that the spydroid demo uses.
I had the same issue; I got it working the following way:
Steps:
1.) Include the libstreaming library, as it is the latest library and supports versions above Lollipop.
2.) Find the H263Stream class and change the following method:
From
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
    createCamera();
    updateCamera();
    try {
        if (mQuality.resX >= 640) {
            // Using the MediaCodec API with the buffer method for high resolutions is too slow
            mMode = MODE_MEDIARECORDER_API;
        }
        EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
        return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
    } catch (Exception e) {
        // Fall back on the old streaming method using the MediaRecorder API
        Log.e(TAG, "Resolution not supported with the MediaCodec API, we fall back on the old streaming method.");
        mMode = MODE_MEDIARECORDER_API;
        return testH264();
    }
}
To
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
    createCamera();
    updateCamera();
    try {
        if (mQuality.resX >= 1080) {
            // Using the MediaCodec API with the buffer method for high resolutions is too slow
            mMode = MODE_MEDIARECORDER_API;
        }
        EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
        return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
    } catch (Exception e) {
        // Fall back on the old streaming method using the MediaRecorder API
        Log.e(TAG, "Resolution not supported with the MediaCodec API, we fall back on the old streaming method.");
        mMode = MODE_MEDIARECORDER_API;
        return testH264();
    }
}
- You can see that the only difference is the resolution threshold, changed from "640" to "1080".
- I don't know the exact reason, but the change above worked for me.
- Let me know if anything still goes wrong.
In my case the problem was in MediaRecorder:
file: VideoStream.java
method: encodeWithMediaRecorder
MediaRecorder doesn't handle a ParcelFileDescriptor correctly, so I created a local file to save the stream from MediaRecorder:
mMediaRecorder.setOutputFile(this.tmpFileToStream);
//mMediaRecorder.setOutputFile(fd); // disabled
and then I run a new thread to copy bytes from tmpFileToStream to mParcelWrite:
public void run() {
    FileDescriptor fd = mParcelWrite.getFileDescriptor();
    try {
        // Tail the growing temp file and forward its bytes into the pipe.
        InputStream isS = new FileInputStream(tmpFileToStream); // read from the local file
        FileOutputStream outputStream = new FileOutputStream(fd);
        while (!Thread.interrupted()) {
            int content;
            while ((content = isS.read()) != -1) {
                outputStream.write(content);
            }
            // Reached the current end of file; wait for MediaRecorder to write more.
            Thread.sleep(10);
        }
    } catch (Exception e) {
        Log.e(TAG, "E.. " + e.getMessage());
    }
}
/**
 * Video encoding is done by a MediaRecorder.
 */
protected void encodeWithMediaRecorder() throws IOException, ConfNotSupportedException {
    Log.d(TAG, "Video encoded using the MediaRecorder API");
    Log.d(TAG, "Roz" + mRequestedQuality.resX + " x " + mRequestedQuality.resY + " frame " + mRequestedQuality.framerate);
    // We need a local socket to forward data output by the camera to the packetizer
    createSockets();
    // Reopens the camera if needed
    destroyCamera();
    createCamera();
    // The camera must be unlocked before the MediaRecorder can use it
    unlockCamera();
    this.tmpFileToStream = this.getOutputMediaFile(MEDIA_TYPE_VIDEO);
    Log.d(TAG, "Video record to " + this.tmpFileToStream.getAbsolutePath());
    try {
        mMediaRecorder = new MediaRecorder();
        mMediaRecorder.setCamera(mCamera);
        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        // mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        mMediaRecorder.setVideoEncoder(mVideoEncoder);
        mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
        mMediaRecorder.setVideoSize(mRequestedQuality.resX, mRequestedQuality.resY);
        mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
        // The bandwidth actually consumed is often above what was requested
        mMediaRecorder.setVideoEncodingBitRate((int) (mRequestedQuality.bitrate * 0.8));
        // We write the output of the camera to a local socket instead of a file!
        // This one little trick makes streaming feasible quite simply: data from the camera
        // can then be manipulated at the other end of the socket
        FileDescriptor fd = null;
        if (sPipeApi == PIPE_API_PFD) {
            fd = mParcelWrite.getFileDescriptor();
        } else {
            fd = mSender.getFileDescriptor();
        }
        mMediaRecorder.setOutputFile(this.tmpFileToStream); // save locally, then read and stream from there
        // mMediaRecorder.setOutputFile(fd); // disabled
        // copy bytes from the local file to mParcelWrite
        if (tInput == null) {
            this.tInput = new Thread(this);
            this.tInput.start();
        }
        mMediaRecorder.prepare();
        mMediaRecorder.start();
    } catch (Exception e) {
        StringWriter sw = new StringWriter();
        PrintWriter pw = new PrintWriter(sw);
        e.printStackTrace(pw);
        Log.d(TAG, "Error stack " + sw.toString());
        throw new ConfNotSupportedException(e.getMessage());
    }
    InputStream is = null;
    if (sPipeApi == PIPE_API_PFD) {
        is = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
    } else {
        is = mReceiver.getInputStream();
    }
    // This will skip the MPEG4 header; if this step fails we can't stream anything :(
    try {
        byte buffer[] = new byte[4];
        // Skip all atoms preceding the mdat atom
        while (!Thread.interrupted()) {
            while (is.read() != 'm');
            is.read(buffer, 0, 3);
            if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
        }
    } catch (IOException e) {
        Log.e(TAG, "Couldn't skip mp4 header :/");
        stop();
        throw e;
    }
    // The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
    mPacketizer.setInputStream(is);
    mPacketizer.start();
    mStreaming = true;
}
That's an unusual solution, but it works for other resolutions as well.
I want to be able to take a video recorded with an Android device and encode it to a new resolution and frame rate using my app. The purpose is to upload a much smaller version of the original video, since these will be videos 30 minutes long or more.
So far, I've read people saying FFmpeg is the way to go. However, the documentation seems to be lacking.
I have also considered using OpenCV: http://opencv.org/platforms/android.html
Considering I need to manipulate the video resolution and frame rate, which tool do you think handles this better? Are there any other technologies to consider?
An important question: since these will be long videos, is it reasonable to do the encoding on an Android device (considering power, time, etc.)?
Thanks in advance!
I decided to use ffmpeg to tackle this project. After much research and many trials, I was not able to build the ffmpeg library myself (using Ubuntu 14.04 LTS).
However, I used this excellent library https://github.com/guardianproject/android-ffmpeg-java
I just created a project and added that library and it works like a charm. No need to build your own files or mess with the Android NDK. Of course you would still need to build the library yourself if you want to customize it. But it has everything I need.
Here is an example of how I used it to lower a video's resolution and change its frame rate:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // input source
    final Clip clip_in = new Clip("/storage/emulated/0/Developer/test.mp4");
    Activity activity = (Activity) MainActivity.this;
    File fileTmp = activity.getCacheDir();
    File fileAppRoot = new File(activity.getApplicationInfo().dataDir);
    final Clip clip_out = new Clip("/storage/emulated/0/Developer/result2.mp4");
    // put flags in clip
    clip_out.videoFps = "30";
    clip_out.width = 480;
    clip_out.height = 320;
    clip_out.videoCodec = "libx264";
    clip_out.audioCodec = "copy";
    try {
        FfmpegController fc = new FfmpegController(fileTmp, fileAppRoot);
        fc.processVideo(clip_in, clip_out, false, new ShellUtils.ShellCallback() {
            @Override
            public void shellOut(String shellLine) {
                System.out.println("MIX> " + shellLine);
            }

            @Override
            public void processComplete(int exitValue) {
                if (exitValue != 0) {
                    System.err.println("concat non-zero exit: " + exitValue);
                    Log.d("ffmpeg", "Compilation error. FFmpeg failed");
                    Toast.makeText(MainActivity.this, "result: ffmpeg failed", Toast.LENGTH_LONG).show();
                } else {
                    if (new File("/storage/emulated/0/Developer/result2.mp4").exists()) {
                        Log.d("ffmpeg", "Success file:" + "/storage/emulated/0/Developer/result2.mp4");
                    }
                }
            }
        });
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
    }
    setContentView(R.layout.activity_main);
}
}
The processVideo function produces a command similar to: ffmpeg -i input -s 480x320 -r 30 -vcodec libx264 -acodec copy output
This is a very simple example, but it produced the same kind of conversion as desktop ffmpeg. This code needs lots of work! I hope it helps someone.
I'm trying to add media controls to my app but I can't get the RemoteMediaPlayer to send commands.
The video starts playing but then I can't control it.
This is the code I use:
RemoteMediaPlayer mRemoteMediaPlayer = new RemoteMediaPlayer();
try {
    Cast.CastApi.setMessageReceivedCallbacks(apiClient, mRemoteMediaPlayer.getNamespace(), mRemoteMediaPlayer);
    MediaMetadata mediaMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
    JSONObject jsonExtra = new JSONObject();
    mediaMetadata.putString(MediaMetadata.KEY_TITLE, "My video");
    if (mediaType != null) {
        jsonExtra.put("type", mediaType);
    }
    if ("audio".equals(mediaType)) {
        mimeType = "audio/mp3";
    }
    com.google.android.gms.cast.MediaInfo.Builder builder = new MediaInfo.Builder(getUrl())
            .setContentType(mimeType)
            .setMetadata(mediaMetadata)
            .setCustomData(jsonExtra);
    builder.setStreamType(MediaInfo.STREAM_TYPE_BUFFERED);
    MediaInfo mediaInfo = builder.build();
    mRemoteMediaPlayer.load(apiClient, mediaInfo, true, inititalTime).setResultCallback(new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
        @Override
        public void onResult(MediaChannelResult result) {
            if (result.getStatus().isSuccess()) {
                log(context, "Media loaded successfully");
            }
        }
    });
    Thread.sleep(5000);
    mRemoteMediaPlayer.pause(apiClient);
} catch (IllegalStateException e) {
    log(context, "Problem occurred with media during loading " + e);
} catch (IOException e) {
    log(context, "Problem occurred with media during loading " + e);
} catch (JSONException e) {
    log(context, "Problem occurred with media during loading " + e);
} catch (InterruptedException e) {
    e.printStackTrace();
}
You may notice that at the end I wait 5 seconds to make sure that the video is playing and then try to pause it.
This always results in an IllegalStateException with the message:
No current media session
Am I missing something?
I also notice that the ResultCallback is never called after the video starts playing. Maybe this is also related to the same issue I'm experiencing.
Thanks in advance.
It is not clear when you are running this piece of code. Do not sleep the way you did; instead, register callbacks on mRemoteMediaPlayer so you are notified when the status or metadata of the remote player changes. When the status-changed listener fires, get the updated status by calling mRemoteMediaPlayer.getMediaStatus().getPlayerState() and, based on that state (playing, buffering, idle, paused), make the appropriate decision. In asynchronous systems, never use sleep(); always hook into the callbacks.
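A rough sketch of that approach, reusing mRemoteMediaPlayer and apiClient from the question (illustrative only, not tested against your setup):

// Register for remote status updates instead of sleeping.
mRemoteMediaPlayer.setOnStatusUpdatedListener(new RemoteMediaPlayer.OnStatusUpdatedListener() {
    @Override
    public void onStatusUpdated() {
        MediaStatus status = mRemoteMediaPlayer.getMediaStatus();
        if (status != null && status.getPlayerState() == MediaStatus.PLAYER_STATE_PLAYING) {
            // A media session exists now, so control calls like pause() are legal.
            mRemoteMediaPlayer.pause(apiClient);
        }
    }
});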
I am working on a streaming radio application. Everything works fine except that changing the equalizer effect does not affect the sound.
Changing the equalizer effect by calling usePreset(preset) does not make any change to the sound.
Even though there is no error, why does usePreset not change the sound?
I have tested on a Samsung Galaxy S II with Android 4.0.3.
public void startPlayer() {
    // Check whether we can acquire the audio focus to start the player
    if (!requestAudioFocus()) {
        return;
    }
    if (null != mAudioPlayer) {
        if (mAudioPlayer.isPlaying()) {
            mAudioPlayer.stop();
        }
        mAudioPlayer.reset();
    } else {
        mAudioPlayer = new MediaPlayer();
        mAudioPlayer.reset();
    }
    try {
        notifyProgressUpdate(PLAYER_INITIALIZING);
        try {
            mEqualizer = new Equalizer(0, mAudioPlayer.getAudioSessionId());
            mEqualizer.setEnabled(true);
            Log.d(TAG,
                    "Audio Session ID " + mAudioPlayer.getAudioSessionId()
                            + " Equalizer " + mEqualizer + " Preset "
                            + mEqualizer.getCurrentPreset());
        } catch (Exception ex) {
            mEqualizer = null;
        }
        mAudioPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mAudioPlayer.setDataSource(mCurrentTrack.getStreamURL());
        // Add the listeners to track the player status
        mAudioPlayer.setOnCompletionListener(this);
        mAudioPlayer.setOnBufferingUpdateListener(this);
        mAudioPlayer.setOnPreparedListener(this);
        mAudioPlayer.setOnInfoListener(this);
        mAudioPlayer.setOnErrorListener(this);
        notifyProgressUpdate(PLAYER_BUFFERING);
        mAudioPlayer.prepareAsync();
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (SecurityException e) {
        e.printStackTrace();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

// Get the available presets from the equalizer
public String[] getEqualizerPresets() {
    String[] presets = null;
    short noOfPresets = -1;
    if (null != mEqualizer) {
        noOfPresets = mEqualizer.getNumberOfPresets();
        presets = new String[noOfPresets];
        for (short index = 0; index < noOfPresets; index++) {
            presets[index] = mEqualizer.getPresetName(index);
        }
    }
    return presets;
}

// Set the user-preferred preset
public void setEqualizerPreset(int position) {
    if (null != mEqualizer) {
        Log.d(TAG, "setting equalizer preset " + position);
        Log.d(TAG, "Equalizer " + mEqualizer + " set Preset " + position);
        mEqualizer.usePreset((short) position);
        Log.d(TAG, "Equalizer " + mEqualizer + " current Preset "
                + mEqualizer.getCurrentPreset());
    }
}
I would appreciate your help in identifying the issue.
EDIT
This issue is not resolved yet. I did not find any sample code that explains Equalizer preset usage.
Any reference to a code sample that uses presets is welcome.
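For reference, the basic preset flow looks like the minimal sketch below (this only illustrates the API calls; it is not a confirmed fix for the device-specific issue above):

// Attach the Equalizer to the player's audio session, enable it,
// then apply a preset by index.
MediaPlayer player = new MediaPlayer();
// ... setDataSource() / prepareAsync() as usual ...
Equalizer eq = new Equalizer(0, player.getAudioSessionId());
eq.setEnabled(true);
short count = eq.getNumberOfPresets();
if (count > 0) {
    eq.usePreset((short) 0); // e.g. the first preset
    Log.d(TAG, "Preset in use: " + eq.getPresetName(eq.getCurrentPreset()));
}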
This is a full source code for an equalizer; I hope this will help you.
I have the same problem. When I load it on the emulator it produces an error that I don't really understand; it always mentions ...audiofx.Equalizer and audiofx.AudioEffect or something similar. But I have discovered that if you have another media player, n7player in my case, closing it and trying your media player again can help. In my case that works, but I think there must be a proper way to get whichever equalizer is active.
I'm trying to write a video stream from my Galaxy Tab to a server.
According to this manual, I should do something like this:
frontCamera = getFrontCamera();
if ((socket != null) && (frontCamera != null)) {
    try {
        frontCamera.setPreviewDisplay(cameraPreview.getHolder());
    } catch (IOException e1) {
        Log.e("", "", e1);
    }
    frontCamera.startPreview();
    recorder = new MediaRecorder();
    frontCamera.unlock();
    recorder.setCamera(frontCamera);
    recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    pfd = ParcelFileDescriptor.fromSocket(socket);
    recorder.setOutputFile(pfd.getFileDescriptor());
    recorder.setPreviewDisplay(cameraPreview.getHolder().getSurface());
    try {
        recorder.prepare();
        recorder.start();
    } catch (IllegalStateException e) {
        Log.e("", "", e);
    } catch (IOException e) {
        Log.e("", "", e);
    }
}
but it all fails at the recorder.start() step with a strange error:
02-01 19:03:39.265: E/MediaRecorder(11922): start failed: -19
What does that mean, and what should I do to start the recorder?
UPD:
The trouble happens because of my getFrontCamera method. When I replace it with Camera.open(), everything works correctly.
protected Camera getFrontCamera() {
    Camera.CameraInfo inf = new Camera.CameraInfo();
    for (int i = 0; i < Camera.getNumberOfCameras(); i++) {
        Camera.getCameraInfo(i, inf);
        if (inf.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            return Camera.open(i);
        }
    }
    return null;
}
UPD2: Yes, explicitly setting the format and encoders solved the trouble:
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
Maybe the pre-built profiles are meant for the back camera... Strange anyway.
I don't see an output format set up, so try adding this to the recorder setup:
recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
Have a look.
And since it is streaming video, set:
recorder.setOutputFormat(8); // 8 = MPEG-2 TS
recorder.setOutputFile(socketFd);
Have fun.
I have a hack here: extending the MediaRecorder class and removing the super.setVideoFrameRate(rate) call solves the problem for me.
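A sketch of that hack (the class name is made up for illustration; the assumption is that the frame-rate call can simply be dropped so the encoder keeps its default):

// Hypothetical subclass: swallow setVideoFrameRate() so the value
// never reaches the native recorder.
public class NoFrameRateMediaRecorder extends MediaRecorder {
    @Override
    public void setVideoFrameRate(int rate) throws IllegalStateException {
        // Intentionally empty: skipping super.setVideoFrameRate(rate) avoids the failure.
    }
}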
If you still want to use CamcorderProfile.QUALITY_HIGH with the front camera, you can use the following:
CamcorderProfile camcorderProfile = CamcorderProfile.get(currentCameraId, CamcorderProfile.QUALITY_HIGH);
recorder.setProfile(camcorderProfile);
where int currentCameraId is Camera.CameraInfo.CAMERA_FACING_BACK or ...FRONT
So the profile is indeed dependent on the camera. High-end phones appear to work fine without the distinction, since they all support 1080p by now, but low-end phones may crash otherwise.
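If you want to be defensive about the low-end case, you can check availability first. A small sketch; the QUALITY_LOW fallback is my assumption, not part of the answer above:

int quality = CamcorderProfile.hasProfile(currentCameraId, CamcorderProfile.QUALITY_HIGH)
        ? CamcorderProfile.QUALITY_HIGH
        : CamcorderProfile.QUALITY_LOW; // fallback assumption for devices without a high profile
recorder.setProfile(CamcorderProfile.get(currentCameraId, quality));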