I'm trying to display the video stream from my Wi-Fi camera (SJ6 Legend) on an Android device.
After turning on the camera's Wi-Fi and connecting to its network from my Mac, I can see the video stream in VLC simply by going to File -> Open Network and connecting to rtsp://MY_CAM_IP.
I then connect to the same Wi-Fi network from my Android device, but neither MediaPlayer nor VideoView works.
VLC for Android doesn't display the video either.
Just to make sure there is no general problem playing RTSP, I tried this file:
rtsp://184.72.239.149/vod/mp4:BigBuckBunny_175k.mov
and it plays fine both in VLC for Android and with MediaPlayer.
I also tried a VLC library for Android; that didn't work either.
Relevant code:
In onCreate:
SurfaceView surfaceView = (SurfaceView) findViewById(R.id.am_surface_view);
mSurfaceHolder = surfaceView.getHolder();
mSurfaceHolder.addCallback(this);
mSurfaceHolder.setFixedSize(320, 240);
and:
/**
 * {@link MediaPlayer.OnPreparedListener} interface methods
 */
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
    mMediaPlayer.start();
}

/**
 * {@link SurfaceHolder.Callback} interface methods
 */
@Override
public void surfaceChanged(final SurfaceHolder holder, final int format, final int width, final int height) {}

@Override
public void surfaceCreated(SurfaceHolder sh) {
    mMediaPlayer = new MediaPlayer();
    mMediaPlayer.setDisplay(sh);
    // Context context = getApplicationContext();
    // Map<String, String> headers = getRtspHeaders();
    // Uri source = Uri.parse(RTSP_URL);
    try {
        // Specify the IP camera's URL and auth headers.
        // mMediaPlayer.setDataSource(context, source, headers);
        // mMediaPlayer.setDataSource(context, source);
        mMediaPlayer.setDataSource(RTSP_URL); // RTSP_URL = "rtsp://MY_CAM_IP"
        // Begin the process of setting up the video stream.
        mMediaPlayer.setOnPreparedListener(this);
        mMediaPlayer.prepareAsync();
    } catch (Exception e) {
        e.printStackTrace(); // don't swallow setup errors silently
    }
}

@Override
public void surfaceDestroyed(SurfaceHolder sh) {
    mMediaPlayer.release();
}
Can anyone point me to a solution?
Thanks!
It finally worked when I turned off the phone's cellular data.
Unfortunately I need to receive the camera stream while keeping a network connection to send the received frames, but I guess that's another question...
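For anyone hitting the same issue: turning off cellular data works because Android prefers the mobile connection when the camera's Wi-Fi network has no internet access. A sketch of keeping both connections alive by binding the app's traffic to Wi-Fi (assumes API 23+ and a ConnectivityManager obtained from a Context; treat this as illustrative, not the asker's solution):
// Request the (internet-less) Wi-Fi network and bind the whole app to it,
// so RTSP sockets use Wi-Fi while the system keeps cellular for other traffic.
final ConnectivityManager cm =
        (ConnectivityManager) getSystemService(Context.CONNECTIVITY_SERVICE);
NetworkRequest request = new NetworkRequest.Builder()
        .addTransportType(NetworkCapabilities.TRANSPORT_WIFI)
        .build();
cm.requestNetwork(request, new ConnectivityManager.NetworkCallback() {
    @Override
    public void onAvailable(Network network) {
        cm.bindProcessToNetwork(network); // API 23+
    }
});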
Related
I have written the Android service shown below for recording the front cam in the background. This works very well. But now I would like to also take a picture every 5 seconds while recording. Is this somehow possible? When I try to open a second camera (in another service) I'm getting an error.
public class RecorderService extends Service implements SurfaceHolder.Callback {

    private WindowManager windowManager;
    private SurfaceView surfaceView;
    private Camera camera = null;
    private MediaRecorder mediaRecorder = null;

    @Override
    public void onCreate() {
        // Create a new 1x1 SurfaceView in the top-left corner and register
        // this service as its callback.
        windowManager = (WindowManager) this.getSystemService(Context.WINDOW_SERVICE);
        surfaceView = new SurfaceView(this);
        WindowManager.LayoutParams layoutParams = new WindowManager.LayoutParams(
                1, 1,
                WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
                WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                PixelFormat.TRANSLUCENT
        );
        layoutParams.gravity = Gravity.LEFT | Gravity.TOP;
        windowManager.addView(surfaceView, layoutParams);
        surfaceView.getHolder().addCallback(this);
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        Intent notificationIntent = new Intent(this, MainActivity.class);
        PendingIntent pendingIntent = PendingIntent.getActivity(this, 0,
                notificationIntent, 0);
        Notification notification = new NotificationCompat.Builder(this)
                //.setSmallIcon(R.mipmap.app_icon)
                .setContentTitle("Background Video Recorder")
                .setContentText("")
                .setContentIntent(pendingIntent).build();
        startForeground(MainActivity.NOTIFICATION_ID_RECORDER_SERVICE, notification);
        return Service.START_NOT_STICKY;
    }

    // Called right after the Surface is created (initializes and starts the MediaRecorder)
    @Override
    public void surfaceCreated(SurfaceHolder surfaceHolder) {
        camera = Camera.open(1); // front camera
        mediaRecorder = new MediaRecorder();
        camera.unlock();
        mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
        mediaRecorder.setCamera(camera);
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
        mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_720P));
        // Hard-coded external-storage path; adjust for your device.
        FileUtil.createDir("/storage/emulated/0/Study/Camera");
        mediaRecorder.setOutputFile("/storage/emulated/0/Study/Camera/" + Long.toString(System.currentTimeMillis()) + ".mp4");
        try {
            mediaRecorder.prepare();
        } catch (Exception e) {
            e.printStackTrace(); // prepare() failures would otherwise be silent
        }
        mediaRecorder.start();
        try {
            camera.setPreviewDisplay(surfaceHolder);
        } catch (IOException e) {
            e.printStackTrace();
        }
        Runnable runnable = new PictureThread(camera);
        Thread thread = new Thread(runnable);
        thread.start();
    }

    // Stop recording and remove the SurfaceView
    @Override
    public void onDestroy() {
        mediaRecorder.stop();
        mediaRecorder.reset();
        mediaRecorder.release();
        camera.lock();
        camera.release();
        windowManager.removeView(surfaceView);
    }

    @Override
    public void surfaceChanged(SurfaceHolder surfaceHolder, int format, int width, int height) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder surfaceHolder) {}

    @Override
    public IBinder onBind(Intent intent) { return null; }
}
Edit: I have now written a thread, PictureThread. This thread is started from RecorderService and tries to take a picture while the video is recording.
public class PictureThread implements Runnable {

    private final static String TAG = PictureThread.class.getSimpleName();
    private Camera camera;

    PictureThread(Camera camera) {
        this.camera = camera;
    }

    @Override
    public void run() {
        camera.startPreview();
        camera.takePicture(shutterCallback, rawCallback, jpegCallback);
    }

    Camera.ShutterCallback shutterCallback = new Camera.ShutterCallback() {
        public void onShutter() {
        }
    };

    Camera.PictureCallback rawCallback = new Camera.PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
        }
    };

    Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
            Log.i(TAG, "onPictureTaken - jpeg");
        }
    };
}
Unfortunately, jpegCallback never gets called (i.e. the log message is never printed). When I open my tablet's camera app, I can take pictures while recording video, so this should be possible.
I have also tried the Camera2 API example suggested by Alex Cohn (https://github.com/mobapptuts/android_camera2_api_video_app). Recording a video works, and taking a picture works, but when I try to take a picture while recording, no picture is created (and no error is raised either). In any case, I have not found this example app to work very reliably (perhaps there is another example app).
Edit 2: The shutterCallback and rawCallback of takePicture do get called, but the data of the rawCallback is null, and jpegCallback never gets called. Any idea why, and how this can be solved? I have tried waiting in the thread for a while to give the callback time to fire, and I have tried making the callbacks static in my main activity (so they don't get garbage collected). Nothing worked.
Edit:
With the clarification:
The old camera API supports calling takePicture() while video is being recorded, if Camera.Parameters.isVideoSnapshotSupported() reports true on the device in question.
Just hold on to the same camera instance you're passing into the MediaRecorder, and call Camera.takePicture() on it.
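A minimal sketch of that path (assumes API 14+, where MediaRecorder.start() re-locks the camera for the application, and that camera is the same instance handed to the recorder):
Camera.Parameters params = camera.getParameters();
if (params.isVideoSnapshotSupported()) {
    // Safe to call while the MediaRecorder is running; the preview and
    // recording continue afterwards.
    camera.takePicture(null, null, new Camera.PictureCallback() {
        @Override
        public void onPictureTaken(byte[] jpegData, Camera cam) {
            // Save jpegData to disk here; recording is not interrupted.
        }
    });
}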
Camera2 also supports this with more flexibility, by creating a session with preview, recording, and JPEG outputs at the same time.
Original answer:
If you mean taking pictures with the back camera, while recording with the front camera - that's device-dependent. Some devices have enough hardware resources to run multiple cameras at once, but most won't (they share processing hardware between the two cameras).
The only way to tell if multiple cameras can be used at once is to try opening a second camera when one is already open. If it works, you should be good to go; if not, that device doesn't support multiple cameras at once.
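A quick probe along those lines, as a sketch (not production code):
// Try to open the back camera (id 0) while the front camera is recording.
Camera secondCamera = null;
try {
    secondCamera = Camera.open(0);
    // Success: this device can run both cameras at once.
} catch (RuntimeException e) {
    // Camera.open() throws RuntimeException when the hardware is unavailable.
} finally {
    if (secondCamera != null) secondCamera.release();
}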
No, you cannot open separate camera instances for video recording and still capture. The deprecated Camera API is not reliable for such tasks (see e.g. "Android camera parameter IsVideoSnapshotSupported incorrectly set to false" about the Samsung S4).
You can use the camera2 API (on devices that support this mode) to capture different formats and resolutions from the same camera instance. Here is a video tutorial: https://www.nigeapptuts.com/android-video-app-still-capture-recording/
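For illustration, a rough camera2 sketch of such a three-output session (cameraDevice, previewSurface, a prepared mediaRecorder and backgroundHandler are assumed to exist; check the device's supported stream combinations first):
ImageReader jpegReader = ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, 2);
List<Surface> outputs = Arrays.asList(
        previewSurface,
        mediaRecorder.getSurface(), // requires MediaRecorder.setVideoSource(SURFACE)
        jpegReader.getSurface());
cameraDevice.createCaptureSession(outputs, new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(CameraCaptureSession session) {
        // Set a repeating request targeting the preview + recorder surfaces;
        // submit a single capture targeting jpegReader.getSurface() to grab
        // a still while recording continues.
    }
    @Override
    public void onConfigureFailed(CameraCaptureSession session) { }
}, backgroundHandler);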
I am creating an Android activity that will have many media players inside. The activity will hold several MediaPlayer objects; if I could describe it, it's like a 4 x 1 grid of media players. I created the 4x1 grid by using the TextureView class in Android.
So the XML layout for the activity looks more or less like this:
<TextureView
    android:id="@+id/surface_one"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content" />
<TextureView
    android:id="@+id/surface_two"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content" />
<TextureView
    android:id="@+id/surface_three"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content" />
<TextureView
    android:id="@+id/surface_four"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content" />
</LinearLayout>
So each TextureView will be the surface where a MediaPlayer plays its video.
Programmatically, I've set up a SurfaceTextureListener for each one, so that I know when the surface is available:
TextureView tv1 = (TextureView)findViewById(R.id.surface_one);
tv1.setSurfaceTextureListener(new SurfaceListener_1());
and likewise for the rest. Here is how I would set up the second TextureView:
TextureView tv2 = (TextureView)findViewById(R.id.surface_two);
tv2.setSurfaceTextureListener(new SurfaceListener_2());
Then, in code, I create 4 MediaPlayer objects and set their surfaces like this for MediaPlayer #1:
private class SurfaceListener_1 implements TextureView.SurfaceTextureListener {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        Surface sfc = new Surface(surface);
        try {
            m_mp1 = new MediaPlayer();
            m_mp1.setSurface(sfc); // critical: set the surface the media player renders to
            m_mp1.setDataSource(m_context, m_videoUri_one);
            m_mp1.prepareAsync();
            m_mp1.setLooping(true);
            m_mp1.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    video1Prepared = true;
                }
            });
            m_mp1.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                @Override
                public boolean onError(MediaPlayer mp, int what, int extra) {
                    return true;
                }
            });
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            // Release the surface; the media player now holds its own reference.
            if (sfc != null)
                sfc.release();
        }
    }

    // ... the rest of the callbacks, not important ...
and I repeat the same thing for the other media players so that they all have surfaces to play on. Here is how I would do MediaPlayer #2:
private class SurfaceListener_2 implements TextureView.SurfaceTextureListener {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        Surface sfc = new Surface(surface);
        try {
            m_mp2 = new MediaPlayer();
            m_mp2.setSurface(sfc); // critical: set the surface the media player renders to
            m_mp2.setDataSource(m_context, m_videoUri_two);
            m_mp2.prepareAsync();
            m_mp2.setLooping(true);
            m_mp2.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    video2Prepared = true;
                }
            });
            m_mp2.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                @Override
                public boolean onError(MediaPlayer mp, int what, int extra) {
                    return true;
                }
            });
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            // Release the surface; the media player now holds its own reference.
            if (sfc != null)
                sfc.release();
        }
    }

    // ... the rest of the callbacks, not important ...
So in the end, what I have created is 4 surfaces where I can play videos, and all 4 media players play different video files.
Once all the media players are prepared, I call start() on all of them and they play at the same time, as sketched below.
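For completeness, the simultaneous start is just a gate over the prepared flags, roughly like this (a sketch; startAllIfReady is a hypothetical helper, and video3Prepared/video4Prepared mirror the flags shown above):
// Called from each onPrepared() callback.
private synchronized void startAllIfReady() {
    if (video1Prepared && video2Prepared && video3Prepared && video4Prepared) {
        m_mp1.start();
        m_mp2.start();
        m_mp3.start();
        m_mp4.start();
    }
}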
The issue I am facing is that this can be CPU-intensive. On some devices I get an ANR, or the app just becomes slow to respond.
The media files I am playing are MP4s, each about 9 MB in size.
Is there anything you can recommend to make this already heavy task more CPU-efficient? For example, can I get the GPU to help?
Or would changing the media type make it more efficient? The videos have no sound, only visuals, if that helps.
I'm writing an app that consumes media (audio/video) and that allows users to reply and/or post new media.
My question relates to the SurfaceView used to display the videos. This SurfaceView object is shared between the MediaRecorder (recording a video) and the MediaPlayer (consuming/playing the video).
The MediaPlayer lives in its own Service, which runs on its own thread, as per the NPR example: PlaybackService.java
Since the NPR example doesn't involve video, I was not sure how to make the MediaPlayer in the Service aware of the SurfaceView in the UI. I ended up using a static variable to solve this:
// MyFragmentClass.java
// CameraPreview is the sample class shown in the media section
// of the Android Developer site:
// http://developer.android.com/guide/topics/media/camera.html#camera-preview
private static CameraPreview sCameraPreview;

@Override
public void onStart() {
    super.onStart();
    CameraPreview cameraPreview = new CameraPreview(getActivity());
    FrameLayout preview = (FrameLayout) getView().findViewById(R.id.video_preview);
    preview.addView(cameraPreview);
    setCameraPreview(cameraPreview); // a static setter
    // Other classes are initialized afterwards; not relevant to this question...
}

public static CameraPreview getCameraPreview() {
    return sCameraPreview;
}

public static void setCameraPreview(CameraPreview cameraPreview) {
    sCameraPreview = cameraPreview;
}
Here's the method on the PlaybackService that takes care of preparing a video for playback:
// Params url and isVideo were extras on the Intent that triggers this method.
synchronized private void prepareMediaPlayer(String url, boolean isVideo) {
    if (mMediaPlayer == null) {
        mMediaPlayer = new MediaPlayer();
        mMediaPlayer.setOnCompletionListener(this);
        mMediaPlayer.setOnErrorListener(this);
        mMediaPlayer.setOnInfoListener(this);
        mMediaPlayer.setOnPreparedListener(this);
    } else {
        mMediaPlayer.reset();
    }
    mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    try {
        mMediaPlayer.setDataSource(url);
    } catch (IOException e) {
        e.printStackTrace(); // setDataSource(String) throws IOException
        return;
    }
    if (isVideo) {
        // This is the important line.
        mMediaPlayer.setDisplay(MyFragmentClass.getCameraPreview().getHolder());
        mMediaPlayer.setScreenOnWhilePlaying(true);
    }
    mMediaPlayer.prepareAsync();
    sendIntent(STATUS_PLAYBACK_PREPARED);
}
It gets the job done, but I'm wondering if there's a better way to do it, especially since I'm debugging an intermittent bug where the shared surface is not released, and it got me thinking:
Is there a better way to make my Service class aware of the SurfaceView?
Is it a good approach to have a single SurfaceView for both recording and playback?
Thanks!
Hi, I am creating an app which will play livestream.com's RTSP live channels.
I am launching the player using an intent within my app, as follows:
iPlayer = new Intent(Intent.ACTION_VIEW);
//iPlayer.setType("video/*");
iPlayer.setData(Uri.parse(videoUrl));
startActivity(iPlayer);
When the media player is launched through my application, the video performance is very poor: it stops to buffer every few seconds, plays for a few seconds, and pauses to buffer again.
On the other hand, if I open the URL in the Android browser (e.g. http://m.livestream.com/abcalbania), the page has a video tag which triggers the video player, and this time the video runs very smoothly.
Any idea why this might happen, and how it can be fixed?
I do not want to launch a browser URL as an intent.
This is on an Atmel Cortex-A9 chipset with Android 2.3.4.
The problem is probably caused by codecs that are not supported by your player.
For example, I have a video created with the MPEG audio codec along with the H.264 video codec.
If I launch that video through my own application it runs smoothly, but if I launch it in the Ooyala Hook Player the performance is very poor, stuttering every few seconds, because the stream uses the MPEG audio codec instead of the supported AAC audio codec.
You will find the answer by checking which codecs were used to create the video, and which of them are supported by your player.
Use this code for a smooth stream:
String videoUrl = "rtmp://mystream";
Intent i = new Intent(android.content.Intent.ACTION_VIEW);
i.setDataAndType(Uri.parse(videoUrl), "video/*");
startActivity(i);
Why not play this in your own activity? Create an activity and render the video in a VideoView, like this:
private String path2 = "rtsp://...";
Uri video = Uri.parse(path2);
mVideoView.setVideoURI(video);
mVideoView.setMediaController(new MediaController(this));
mVideoView.requestFocus();
mVideoView.postInvalidateDelayed(100);
mVideoView.start();
You can also buffer for maybe 5 seconds before starting playback; successive buffering will then be faster, and you can control much more yourself.
Android's MediaPlayer handles RTSP very well - don't open an external app; it's not necessary and product-wise wrong.
About your question - the browser might pass extra parameters to the video player that help the video play smoothly. I didn't check it, but it sounds like the only plausible explanation for what you're describing. Such extra parameters might be the video's resolution / encoding / size; you can read all of them easily using MediaMetadataRetriever.
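For example, a sketch of reading those properties (MediaMetadataRetriever may not handle every RTSP source, so treat this as illustrative):
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
try {
    // The two-argument overload accepts a network URL plus request headers.
    retriever.setDataSource(videoUrl, new HashMap<String, String>());
    String mime   = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_MIMETYPE);
    String width  = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH);
    String height = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT);
    Log.d("StreamInfo", mime + " " + width + "x" + height);
} finally {
    retriever.release();
}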
If you don't want to use the native VideoView or MediaPlayer, you can always add an external player to your app, like libVLC or Vitamio.
I recommend Vitamio; it is really easy to use and integrate. libVLC is native code, which means you'll have to build it with the NDK and add its libs to your project.
You can find out how to do that here.
Android's VideoView supports RTSP URLs well; there is no need to send an intent to another application. Try this code: declare a VideoView in your XML layout and find it inside the activity.
public class VideoPlayer extends Activity {

    private VideoView mVideoView;
    private ProgressDialog progressDialog;
    String videoURL = "";
    static Context context;
    //MediaController mediaController;
    //int iCurrentpostion = 0;
    int counter = 0;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.tab_video_player);
        setupViews();
    }

    private void setupViews() {
        context = VideoPlayer.this;
        showProgressDialog("Please wait", "Loading video..");
        //videoURL = getIntent().getExtras().getString("url");
        mVideoView = (VideoView) findViewById(R.id.xvdvwTab);
        // mediaController = new MediaController(context);
        // mVideoView.setMediaController(mediaController);
        mVideoView.setOnPreparedListener(new OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                hideProgressDialog();
                mVideoView.start();
                mVideoView.requestFocus();
            }
        });
        mVideoView.setOnCompletionListener(new OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer mp) {
                finish();
            }
        });
        mVideoView.setOnErrorListener(new OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                hideProgressDialog();
                return false;
            }
        });
        playVideoFile();
    }

    private void playVideoFile() {
        try {
            mVideoView.setVideoURI(Uri.parse("your url"));
        } catch (Exception e) {
            hideProgressDialog();
            if (mVideoView != null) {
                mVideoView.stopPlayback();
            }
        }
    }

    @Override
    protected void onResume() {
        super.onResume(); // must be called even while the restore logic below is disabled
        /*if (mVideoView != null) {
            //setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
            mVideoView.requestFocus();
            if (iCurrentpostion != 0)
                mVideoView.seekTo(iCurrentpostion);
            mVideoView.start();
        }*/
    }

    @Override
    protected void onDestroy() {
        try {
            if (mVideoView != null) {
                mVideoView.stopPlayback();
                mVideoView = null;
            }
            super.onDestroy();
        } catch (Exception e) {
        }
    }

    public void showProgressDialog(String title, String message) {
        hideProgressDialog();
        progressDialog = new ProgressDialog(this);
        progressDialog.setTitle(title);
        progressDialog.setMessage(message);
        progressDialog.show();
    }

    public void hideProgressDialog() {
        if (progressDialog != null && progressDialog.isShowing()) {
            progressDialog.dismiss();
            progressDialog = null;
        }
    }
}
I think playing the video asynchronously gives better performance. My code is:
// videoView, progressDialog, check, current and sync are fields of the enclosing activity.
private class myAsync extends AsyncTask<Void, Integer, Void> {

    int duration = 0;
    //int current = 0;

    @Override
    protected Void doInBackground(Void... params) {
        videoView.setOnPreparedListener(new OnPreparedListener() {
            public void onPrepared(MediaPlayer mp) {
                progressDialog.dismiss();
                videoView.seekTo(check);
                videoView.start();
                duration = videoView.getDuration();
            }
        });

        // Poll the playback position until the clip finishes or the task is cancelled.
        do {
            current = videoView.getCurrentPosition();
            System.out.println("duration - " + duration + " current- " + current);
            if (sync.isCancelled())
                break;
        } while (current != duration || current == 0);
        return null;
    }
}
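For context, the task might be started from onCreate() roughly like this (a sketch; the field names follow the snippet above):
// Show buffering feedback, then kick off the polling task.
progressDialog = ProgressDialog.show(this, "", "Buffering video...", true);
sync = new myAsync();
sync.execute();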
Is there any way to programmatically reference a very small video file and include it in the package - i.e. I don't want to keep it separately on the SD card? I am thinking of putting it in the 'raw' package directory.
E.g. an MPEG-4 called 'video' in 'raw'.
I am trying to work out the correct format for Uri.parse(), but it has beaten me. I thought it should be something like R.raw (as used when setting up a media player for audio: myMediaPlayer = MediaPlayer.create(this, R.raw.audiocameralive1)) - but it doesn't seem to be.
Any suggestions?
Oliver
I see there have been a number of views, so in case anyone is looking for a solution, this is what I eventually did - and it seems to work fine. There is probably a cleaner way of doing the same thing, but this one makes sense to me...
Oliver
public class ShowVideoActivity extends Activity
        implements SurfaceHolder.Callback,
                   OnErrorListener,
                   OnPreparedListener,
                   OnCompletionListener {

    /** Called when the activity is first created. */
    private MediaPlayer myMediaPlayer;
    boolean bolMediaPlayerIsReleased = false;

    // The SurfaceHolder and SurfaceView are used to display the video.
    // Implementing the SurfaceHolder.Callback interface means we have to
    // implement surfaceChanged(), surfaceCreated() and surfaceDestroyed().
    private SurfaceView videoSurface;
    private SurfaceHolder videoHolder;
    Display currentDisplay;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.showvideo); // Inflate ShowVideo

        // Identify the Surface that will be used to hold the video image.
        videoSurface = (SurfaceView) findViewById(R.id.videosurface);

        // The SurfaceHolder 'monitors' activity on the Surface.
        videoHolder = videoSurface.getHolder();
        videoHolder.setKeepScreenOn(true);

        // Data will be pushed onto buffers external to the surface.
        videoHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

        // Let the monitor know that 'this' activity is responsible for
        // all the callback functions.
        videoHolder.addCallback(this);

        // It is now up to the 'callbacks' to do any further processing.
        myMediaPlayer = MediaPlayer.create(this, R.raw.filename);
        myMediaPlayer.setOnCompletionListener(this);
        myMediaPlayer.setOnErrorListener(this);
        myMediaPlayer.setOnPreparedListener(this);

        currentDisplay = getWindowManager().getDefaultDisplay();
    }

    // Listener for MediaPlayer end (the 'PlaybackCompleted' state)
    public void onCompletion(MediaPlayer mp) {
        Wrapup(mp);
    }

    public void surfaceCreated(SurfaceHolder CreatedHolder) {
        // Surface created; now it is possible to set the display.
        myMediaPlayer.setDisplay(CreatedHolder);
    }

    public void surfaceDestroyed(SurfaceHolder DestroyedHolder) {
        if (myMediaPlayer != null) {
            if (myMediaPlayer.isPlaying())
                myMediaPlayer.stop();
            myMediaPlayer.release();
            bolMediaPlayerIsReleased = true;
        }
    }

    public void surfaceChanged(SurfaceHolder ChangedHolder, int intFormat, int intWidth, int intHeight) {
        if (myMediaPlayer.isPlaying())
            return;
        setVideoSurfaceSize(myMediaPlayer); // sizing helper, defined elsewhere in the class
        myMediaPlayer.start();
    }

    public boolean onError(MediaPlayer mPlayer, int intError, int intExtra) {
        return false;
    }

    public void onPrepared(MediaPlayer mPlayer) {
        setVideoSurfaceSize(mPlayer);
        mPlayer.start();
        // From 'Started', the player can go to 'Stopped', 'Paused' or 'PlaybackCompleted'.
    }

    public void Wrapup(MediaPlayer mp) {
        if (mp != null) {
            if (myMediaPlayer.isPlaying())
                mp.stop();
            mp.release();
            bolMediaPlayerIsReleased = true;
        }
        // Now clean up before terminating. This is ESSENTIAL:
        // if cleanup is NOT done, surfaceDestroyed will get called
        // and mess everything up.
        // First remove the callback, to prevent callbacks when the surface is destroyed.
        videoHolder.removeCallback(this);
        ShowVideoActivity.this.finish();
    }
}
Use Activity.getAssets() to get an AssetManager, then load the file with open().
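As a sketch of that route (video.mp4 is a hypothetical file placed under assets/; MediaPlayer.create(this, R.raw.video) remains the simpler option for res/raw, as the answer above shows):
// Play a clip bundled in the APK's assets directory.
private void playFromAssets(SurfaceHolder holder) throws IOException {
    AssetFileDescriptor afd = getAssets().openFd("video.mp4");
    MediaPlayer player = new MediaPlayer();
    player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
    afd.close();
    player.setDisplay(holder);
    player.prepare();
    player.start();
}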