Downloading an RTSP stream to play locally - Android

I manage to play a live stream from a URL such as this:
rtsp://192.168.0.18:554/user=admin&password=&channel=1&stream=0.sdp?
But I want to download this stream into a temporary file and then play it locally, so that the buffering time seems short (around a 2-4 second delay, maybe).
Is it possible to do this with RTSP, or do I have to use HTTP? This URL only works over the RTSP protocol.
If so, a bit of an example would help me a lot.
Example of my code:
cA.mPlayer1 = new MediaPlayer();
try {
cA.mPlayer1.setDataSource("rtsp://192.168.0.18:554/user=admin&password=&channel=1&stream=0.sdp?");
cA.mPlayer1.prepareAsync();
cA.mPlayer1.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
cA.mPlayer1.start();
Toast.makeText(getBaseContext(), "Connecting...", Toast.LENGTH_LONG).show();
}
});
} catch (IOException e) {
e.printStackTrace();
}
cA.mCallback1 = new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
cA.mPlayer1.setDisplay(surfaceHolder);
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i2, int i3) {
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
}
};
final SurfaceView surfaceView1 =
(SurfaceView) findViewById(R.id.surfaceView1);
// Configure the Surface View.
surfaceView1.setKeepScreenOn(true);
// Configure the Surface Holder and register the callback.
SurfaceHolder holder1 = surfaceView1.getHolder();
holder1.addCallback(cA.mCallback1);
holder1.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

You cannot use MediaPlayer to save a raw stream to a file. You can use one of these approaches instead:
Use VLC to save the stream, as described here:
https://wiki.videolan.org/Documentation:Streaming_HowTo/Receive_and_Save_a_Stream/
or use the VLC library for Android, which can be downloaded from:
https://github.com/mrmaffen/vlc-android-sdk
Or use the FFmpeg library to record the RTSP stream locally to the SD card (a sketch follows below the link):
1. Capture or decode the raw frames from the live stream, pass them to FFmpeg, and save them to the SD card in .h264 format.
2. Then pick up the raw .h264 file again, decode it with FFmpeg, and save the result to the SD card with an .mp4 extension.
3. Delete the .h264 file programmatically and keep only the .mp4 (or whichever format you want).
Try .mp4 playback.
https://stackoverflow.com/a/24586256/6502368
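If the camera already sends H.264, it may even be possible to skip the intermediate .h264 step and remux the RTSP stream straight into an MP4. The following is only a rough sketch using the same FFmpeg Android wrapper shown in a later answer; the output path, the 60-second cap, and the exact flags are assumptions you would need to adapt:
try {
    FFmpeg ffmpeg = FFmpeg.getInstance(this);
    final String output = getExternalFilesDir(null) + "/capture.mp4"; // assumed output location
    // -rtsp_transport tcp tends to be more reliable on lossy Wi-Fi,
    // -c copy remuxes without re-encoding, -t 60 stops after 60 seconds.
    String cmd = "-rtsp_transport tcp -i rtsp://192.168.0.18:554/user=admin&password=&channel=1&stream=0.sdp?"
            + " -c copy -t 60 " + output;
    ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
        @Override
        public void onFinish() {
            // The file at 'output' can now be played locally with MediaPlayer/VideoView.
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    e.printStackTrace();
}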

Related

Android - Add mp3 audio to mp4 video

I'm trying to mix an MP3 audio file into an MP4 video.
After hours of searching, I've concluded that I need to convert the MP3 files to AAC format, which would fit in an MP4 container.
Add audio to video in android
But I can't find any documentation on how to convert MP3 files into AAC format.
So do you have any advice on how I could convert MP3 audio to AAC audio?
Also, I will need to insert several audio clips at specific times in the final video.
You can try to use mp4parser, which does some muxing to MP4; there are some people (as seen in the "issues" part of the GitHub repo) who are using it to mux MP3 and MP4.
Another option is to use FFmpeg, which you can either compile yourself or use something premade; I've used this.
The library itself is a bit bulky, plus you might need to play around a little to get the FFmpeg commands right in order to get optimum quality and mux speed.
It looks something like this:
try {
FFmpeg ffmpeg = FFmpeg.getInstance(this);
String cmd = "-i " + videoFilePath + " -i " + audioFilePath + " -shortest -threads 0 -preset ultrafast -strict -2 " + outputFilePath;
ffmpeg.execute(cmd, mergeListener);
} catch (FFmpegCommandAlreadyRunningException e) {
e.printStackTrace();
}
And a listener:
ExecuteBinaryResponseHandler mergeListener = new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
//started
}
@Override
public void onFailure(String message) {
//failed
}
@Override
public void onFinish() {
File output = new File(outputFilePath);
//Do whatever with your muxed file
}
};
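If you specifically want the MP3 track re-encoded to AAC rather than relying on the mux step alone, a command along these lines should work with the same wrapper. This is only a sketch: mp3FilePath and aacFilePath are placeholder variables, and the 192k bitrate is just an example:
try {
    FFmpeg ffmpeg = FFmpeg.getInstance(this);
    // Re-encode the MP3 to AAC in an .m4a container; -strict -2 enables
    // FFmpeg's built-in AAC encoder on older builds, as in the command above.
    String convertCmd = "-i " + mp3FilePath + " -c:a aac -b:a 192k -strict -2 " + aacFilePath;
    ffmpeg.execute(convertCmd, new ExecuteBinaryResponseHandler() {
        @Override
        public void onFinish() {
            // aacFilePath now points to the AAC audio, ready to be muxed into the MP4.
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    e.printStackTrace();
}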

Android Media Recorder not recording long videos on Google Glass

I have written an open-source camera app for Google Glass, but some of the people who have used it have reported that long videos don't get saved properly.
I couldn't find info regarding any such limitation in the Android documentation.
Upon checking it out, I found that for videos longer than about 26 minutes, the file did get saved on the Glass and its size was around 2.7 GB, but its duration was 0:00 and it couldn't be played with any video player.
So I am wondering why that is. Why does the video get recorded properly for durations under 26 minutes and get corrupted for longer videos?
The code to start video recording is:
/**
* Initialize video recorder to record video
*/
private void initRecorder() {
try {
File dir = new File(Environment.getExternalStorageDirectory()
+ File.separator + Environment.DIRECTORY_PICTURES
+ File.separator + "My Videos");
if (!dir.exists()) {
dir.mkdirs();
}
videofile = new File(dir, "video.mp4");
recorder.setCamera(mCamera);
// Step 2: Set sources
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
recorder.setProfile(CamcorderProfile
.get(CamcorderProfile.QUALITY_HIGH));
// Step 4: Set output file
recorder.setOutputFile(videofile.getAbsolutePath());
// Step 5: Set the preview output
recorder.setPreviewDisplay(mPreview.getHolder().getSurface());
// Step 6: Prepare configured MediaRecorder
recorder.setMaxDuration(3600 * 1000); // 1 hour
recorder.setMaxFileSize(-1);
recorder.setOnErrorListener(new OnErrorListener() {
@Override
public void onError(MediaRecorder mr, int what, int extra) {
Log.e("Error Recording", what+" Extra "+extra);
}
});
recorder.setOnInfoListener(new OnInfoListener() {
@Override
public void onInfo(MediaRecorder mr, int what, int extra) {
if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
endVideoRecording();
}
}
});
recorder.prepare();
recorder.start();
mOverlay.setMode(Mode.RECORDING);
} catch (Exception e) {
if (e != null && e.getMessage() != null)
Log.e("Error Starting CuXtom Camera for video recording",
e.getMessage());
}
}
According to my research, it seems that Google Glass is only capable of recording video files under 2 GB, so if you want to record a video that might end up larger than that, I would advise you to split the recording into smaller parts and merge them at the end with mp4parser.
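A rough sketch of that splitting idea, assuming a hypothetical startNewSegment() helper and a cap of roughly 1.9 GB per segment (both are assumptions, not part of the original answer):
// Cap each segment safely below the ~2 GB limit and roll over to a new file
// when MediaRecorder reports that the size limit was reached.
recorder.setMaxFileSize(1900L * 1024 * 1024);
recorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
    @Override
    public void onInfo(MediaRecorder mr, int what, int extra) {
        if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED) {
            mr.stop();
            mr.reset();
            startNewSegment(); // hypothetical helper: reconfigure the recorder
                               // with a new output file and call start() again
        }
    }
});
// When the user finally stops recording, concatenate the segment files,
// e.g. with mp4parser's Movie/AppendTrack classes.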

How to differentiate between 3gp audio and 3gp video

I am accessing the videos directly from the content provider without storing them in my database, but it gives me 3gp audio files along with the other videos and 3gp videos. How can I filter out only the video files? I am working with API 8.
Try this, since you are working with API 8; otherwise METADATA_KEY_HAS_VIDEO could have done the job on API level >= 10 (see the sketch after the code below).
It is a workaround using MediaPlayer: if the media has a height, it is a video.
public boolean isVideoFile(File file) {
int height = 0;
try {
MediaPlayer mp = new MediaPlayer();
FileInputStream fs;
FileDescriptor fd;
fs = new FileInputStream(file);
fd = fs.getFD();
mp.setDataSource(fd);
mp.prepare();
height = mp.getVideoHeight();
mp.release();
fs.close(); // close the stream once the height has been read
} catch (Exception e) {
Log.e(TAG, "Exception trying to determine if 3gp file is video.", e);
}
return height > 0;
}
Source
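For completeness, a minimal sketch of the METADATA_KEY_HAS_VIDEO route mentioned above, usable on API level 10 and higher:
public boolean isVideoFile(File file) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setDataSource(file.getAbsolutePath());
        // Returns "yes" when the file contains a video track, null otherwise.
        String hasVideo = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_HAS_VIDEO);
        return "yes".equals(hasVideo);
    } catch (Exception e) {
        return false;
    } finally {
        retriever.release();
    }
}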

Live audio streaming with Android 2.x

I need to play a live stream on devices with Android 2.x and greater. This states that it's impossible to play live streams on devices with Android 2.x.
What are my options here? I'm especially interested in streaming audio: what format should I pick, and in conjunction with which protocol?
P.S. I've tried Vitamio; I don't want to make customers download third-party libraries.
UPD
How come I can play this stream: "http://188.138.112.71:9018/"?
Try this example of RTSP streaming (the URL must support RTSP). It plays video; change the code to support just audio.
public class MultimediaActivity extends Activity {
private static final String RTSP = "rtsp://url here";
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.multimedia);
//***VideoView to video element inside Multimedia.xml file
VideoView videoView = (VideoView) findViewById(R.id.video);
Log.v("Video", "***Video to Play:: " + RTSP);
MediaController mc = new MediaController(this);
mc.setAnchorView(videoView);
Uri video = Uri.parse(RTSP);
videoView.setMediaController(mc);
videoView.setVideoURI(video);
videoView.start();
}
}
EDIT:
Live Audio streaming using MediaPlayer in Android
Live audio streaming in Android has become easy from SDK 1.6 onwards: pass the URL directly to setDataSource() and the audio will play without any issues.
The complete code snippet is:
public class AudioStream extends Activity {
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
String url = "http://www.songblasts.com/songs/hindi/t/three-idiots/01-Aal_Izz_Well-(SongsBlasts.Com).mp3";
MediaPlayer mp = new MediaPlayer();
try {
mp.setDataSource(url);
mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
mp.prepare();
mp.start();
} catch (Exception e) {
Log.i("Exception", "Exception in streaming mediaplayer e = " + e);
}
}
}
You can use the RTSP protocol, which is supported by the Android native media player.
player = new MediaPlayer();
player.reset();
player.setDataSource(intent.getStringExtra("Path"));
player.prepare();
player.setOnPreparedListener(new OnPreparedListener() {
public void onPrepared(MediaPlayer mp) {
player.start();
}
});
Here, the path would be your RTSP audio streaming URL.
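A minimal non-blocking variant, assuming an RTSP audio URL (the URL below is a placeholder); prepareAsync() keeps the UI thread free while the stream buffers:
final MediaPlayer player = new MediaPlayer();
try {
    player.setAudioStreamType(AudioManager.STREAM_MUSIC);
    player.setDataSource("rtsp://your-stream-url"); // placeholder: replace with your RTSP URL
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // start playback once buffering is done
        }
    });
    player.prepareAsync(); // prepare in the background instead of blocking the UI thread
} catch (IOException e) {
    e.printStackTrace();
}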

Can a VideoView play a video stored on internal storage?

I'm trying to provide my users with the ability to use either external or internal storage. I'm displaying both images and videos (of a scientific nature). When storing the media on the SD card, all is fine. But when I store the media internally, only the images will display. No matter what I try, I get various errors when loading and displaying the media stored under getApplicationContext().getFilesDir().
Is there a trick to setting a VideoView's content to such a file?
Can a ContentResolver help me?
On a related note, is it considered bad form to assume that external storage exists?
Thanks in advance,
Sid
Below is one version that fails with "Cannot play video. Sorry, this video cannot be played", but I have many other modes of failure. I can copy the internal video to temporary (external) storage and play it from there, so the copy to internal storage does create a valid movie; it only fails when I try to play it directly from internal storage.
videoFile = new File(this.getFilesDir() + File.separator + "test.mp4");
InputStream data = res.openRawResource(R.raw.moviegood);
try {
OutputStream myOutputStream = new FileOutputStream(videoFile);
byte[] buffer = new byte[8192];
int length;
while ( (length = data.read(buffer)) > 0 ) {
myOutputStream.write(buffer, 0, length); // write only the bytes actually read
}
//Close the streams
myOutputStream.flush();
myOutputStream.close();
data.close();
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
vview.setKeepScreenOn(true);
vview.setVideoPath(videoFile.getAbsolutePath());
vview.start();
MediaPlayer requires that the file being played has world-readable permissions. You can view the permissions of the file with the following command in adb shell:
ls -al /data/data/com.mypackage/myfile
You will probably see "-rw-------", which means that only the owner (your app, not MediaPlayer) has read/write permissions.
Note: your phone must be rooted in order to run ls on files in the app's internal storage.
If your phone is rooted, you can add world-read permissions in adb shell with the following command:
chmod o+r /data/data/com.mypackage/myfile
If you need to modify these permissions programmatically (requires rooted phone!), you can use the following command in your app code:
Runtime.getRuntime().exec("chmod o+r /data/data/com.mypackage/myfile");
Which is basically a linux command. See https://help.ubuntu.com/community/FilePermissions for more on chmod.
EDIT: Found another simple approach here (useful for those without rooted phones). Since the application owns the file, it can create a file descriptor and pass that to mediaPlayer.setDataSource():
FileInputStream fileInputStream = new FileInputStream("/data/data/com.mypackage/myfile");
mediaPlayer.setDataSource(fileInputStream.getFD());
This approach avoids the permission issue completely.
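A self-contained sketch of that file-descriptor approach, with the stream closed afterwards; the path is the example one from above, and the surface/display setup is omitted:
MediaPlayer mediaPlayer = new MediaPlayer();
FileInputStream fileInputStream = null;
try {
    fileInputStream = new FileInputStream("/data/data/com.mypackage/myfile");
    mediaPlayer.setDataSource(fileInputStream.getFD()); // no world-readable permission needed
    mediaPlayer.prepare();
    mediaPlayer.start();
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (fileInputStream != null) {
        try { fileInputStream.close(); } catch (IOException ignored) { }
    }
}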
You can use:
videoView.setVideoURI(Uri.parse(file.getAbsolutePath()));
if the file is world readable.
Or you can use a content provider.
For details, check this tutorial:
public class AndroidVideoViewExample extends Activity {
private VideoView myVideoView;
private int position = 0;
private ProgressDialog progressDialog;
private MediaController mediaControls;
@Override
protected void onCreate(final Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// set the main layout of the activity
setContentView(R.layout.activity_main);
//set the media controller buttons
if (mediaControls == null) {
mediaControls = new MediaController(AndroidVideoViewExample.this);
}
//initialize the VideoView
myVideoView = (VideoView) findViewById(R.id.video_view);
// create a progress bar while the video file is loading
progressDialog = new ProgressDialog(AndroidVideoViewExample.this);
// set a title for the progress bar
progressDialog.setTitle("JavaCodeGeeks Android Video View Example");
// set a message for the progress bar
progressDialog.setMessage("Loading...");
//set the progress bar not cancelable on users' touch
progressDialog.setCancelable(false);
// show the progress bar
progressDialog.show();
try {
//set the media controller in the VideoView
myVideoView.setMediaController(mediaControls);
//set the uri of the video to be played
myVideoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.kitkat));
} catch (Exception e) {
Log.e("Error", e.getMessage());
e.printStackTrace();
}
myVideoView.requestFocus();
//we also set an setOnPreparedListener in order to know when the video file is ready for playback
myVideoView.setOnPreparedListener(new OnPreparedListener() {
public void onPrepared(MediaPlayer mediaPlayer) {
// close the progress bar and play the video
progressDialog.dismiss();
//if we have a position on savedInstanceState, the video playback should start from here
myVideoView.seekTo(position);
if (position == 0) {
myVideoView.start();
} else {
//if we come from a resumed activity, video playback will be paused
myVideoView.pause();
}
}
});
}
@Override
public void onSaveInstanceState(Bundle savedInstanceState) {
super.onSaveInstanceState(savedInstanceState);
//we use onSaveInstanceState in order to store the video playback position for orientation change
savedInstanceState.putInt("Position", myVideoView.getCurrentPosition());
myVideoView.pause();
}
@Override
public void onRestoreInstanceState(Bundle savedInstanceState) {
super.onRestoreInstanceState(savedInstanceState);
//we use onRestoreInstanceState in order to play the video playback from the stored position
position = savedInstanceState.getInt("Position");
myVideoView.seekTo(position);
}
}
I posted a custom VideoView implementation there.
The VideoView implementation has the setVideoFD(FileDescriptor fd) method and solves this issue.
I came across this thread with the same problem. I'm downloading my videos from the web to internal storage, and it turns out that when saving you can specify the read/write mode, i.e. change it from MODE_PRIVATE to MODE_WORLD_READABLE:
URL url = new URL(_url);
InputStream input = null;
FileOutputStream output = null;
try {
String outputName = "video.mp4";
input = url.openConnection().getInputStream();
output = c.openFileOutput(outputName, Context.MODE_WORLD_READABLE);
int read;
byte[] data = new byte[5120]; // 5 KB buffer
while ((read = input.read(data)) != -1)
output.write(data, 0, read);
return true;
} finally {
if (output != null)
output.close();
if (input != null)
input.close();
}
}
You can't just play it directly.
You need to implement a ContentProvider, then pass the provider's Uri to the setVideoURI(uri) method. A minimal sketch follows below.
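This is only a sketch, assuming a hypothetical provider authority (com.mypackage.videoprovider) and the file name test.mp4; openFile() hands the media framework a readable descriptor for the file in internal storage:
public class VideoProvider extends ContentProvider {
    // Hypothetical authority; it must match the <provider> entry in the manifest.
    public static final Uri CONTENT_URI =
            Uri.parse("content://com.mypackage.videoprovider/test.mp4");

    @Override
    public boolean onCreate() {
        return true;
    }

    @Override
    public ParcelFileDescriptor openFile(Uri uri, String mode) throws FileNotFoundException {
        // Serve the requested file from internal storage in read-only mode.
        File file = new File(getContext().getFilesDir(), uri.getLastPathSegment());
        return ParcelFileDescriptor.open(file, ParcelFileDescriptor.MODE_READ_ONLY);
    }

    // The remaining ContentProvider methods are not needed for simple playback.
    @Override public Cursor query(Uri uri, String[] projection, String selection, String[] selectionArgs, String sortOrder) { return null; }
    @Override public String getType(Uri uri) { return "video/mp4"; }
    @Override public Uri insert(Uri uri, ContentValues values) { return null; }
    @Override public int delete(Uri uri, String selection, String[] selectionArgs) { return 0; }
    @Override public int update(Uri uri, ContentValues values, String selection, String[] selectionArgs) { return 0; }
}
With that in place, videoView.setVideoURI(VideoProvider.CONTENT_URI); should let the VideoView play the internal file through the provider.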
