I'm trying to send h264/AAC video from Android's MediaRecorder through a local socket. The goal is to send the video to a WOWZA server through RTMP or RTSP, but it's giving me a lot of trouble, so for now I'm just trying to write the data to a file from the LocalServerSocket.
Here is some code. Sorry it's not really clean, but I spent hours testing many things and my project is a mess right now.
In the Camera activity, the output file setup:
LocalSocket outSocket = new LocalSocket();
try {
    outSocket.connect(new LocalSocketAddress(LOCAL_SOCKET));
} catch (Exception e) {
    Log.i(LOG_TAG, "Error connecting socket: " + e);
}
mMediaRecorder.setOutputFile(outSocket.getFileDescriptor());
The LocalServerSocket implementation:
try {
    mLocalServerSocket = new LocalServerSocket(mName);
} catch (Exception e) {
    Log.e(LOG_TAG, "Error creating server socket: " + e);
    return;
}
while (true) {
    File out = null;
    FileOutputStream fop = null;
    try {
        mLocalClientSocket = mLocalServerSocket.accept();
        InputStream in = mLocalClientSocket.getInputStream();
        out = new File(mContext.getExternalFilesDir(null), "testfile.mp4");
        fop = new FileOutputStream(out);
        int len = 0;
        byte[] buffer = new byte[1024];
        // copy everything MediaRecorder writes to the socket into the file
        while ((len = in.read(buffer)) >= 0) {
            Log.i(LOG_TAG, "Writing " + len + " bytes");
            fop.write(buffer, 0, len);
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            if (fop != null) fop.close();
            if (mLocalClientSocket != null) mLocalClientSocket.close();
        } catch (Exception e2) {}
    }
}
The problem is that the resulting file is not readable by any media player. Do you think this is an encoding issue? If I understand correctly, this code should just write the raw binary data to the file?
Thanks in advance, cheers.
Ok, I've found why the files couldn't play. In MP4 and 3GPP files, there is a header containing the bytes:
ftyp3gp4 3gp43gp6 wide mdat
in HEX
0000001866747970336770340000030033677034336770360000000877696465000392D86D6461740000
The 4 bytes just before the 'mdat' tag are the size of the mdat box, which effectively gives the position of the 'moov' tag sitting at the end of the file. That value is normally written once the recording is over, but since MediaRecorder can't seek on a socket, it can't go back and set these bytes to the correct value in our case.
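For reference, here is a minimal sketch in plain Java that walks the top-level box headers of such a file, so you can see the mdat size field and where the moov box ends up (the file name is just an example):

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class BoxWalker {
    public static void main(String[] args) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream("testfile.mp4"));
        long offset = 0;
        while (true) {
            long size;
            byte[] type = new byte[4];
            try {
                size = in.readInt() & 0xFFFFFFFFL; // 4-byte big-endian box size
                in.readFully(type);                // 4-byte box type, e.g. "ftyp", "mdat", "moov"
            } catch (IOException eof) {
                break; // end of file
            }
            System.out.println(offset + ": " + new String(type, "US-ASCII") + " (" + size + " bytes)");
            long skip = size - 8;
            while (skip > 0) {
                long n = in.skip(skip);
                if (n <= 0) break;
                skip -= n;
            }
            offset += size;
            if (size < 8) break; // size 0 or the 64-bit size form is not handled in this sketch
        }
        in.close();
    }
}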
My problem now is to find a way to make such a file streamable, since that requires it to be playable before the recording is over.
You could try using mp4box to restructure your file. The moov box gives the indexes for each audio and video sample. If that is at the end of the file, it makes it difficult to stream.
This might help:
http://boliston.wordpress.com/tag/moov-box/
Or this:
mp4box -inter 0.5 some_file.mp4
(I don't have a chance to try it at the moment.)
If you need this to work inside your app: I am not aware of any efforts to port mp4box to Android.
I tried to do the same today, but MP4 is not very easy to stream (as said before, some parts are only written at the end). I'm not saying it's impossible, but it seems quite hard at the least.
So a workaround for newer Android APIs (4.3+) could be the following:
Set the camera preview to a SurfaceTexture: camera.setPreviewTexture
Record this texture using OpenGL and MediaCodec + MediaMuxer (see the sketch below)
The drawback of this solution is that the preview size of a camera might be smaller than the video size. This means that, depending on your device, you can't record at the highest resolution. Hint: some cameras say they don't support higher preview sizes but they actually do, so you can try to configure the camera to set the preview size to the video size. If you do so, catch the RuntimeException of camera.setParameters, and if it fails, fall back to the supported preview sizes.
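To make the second step more concrete, here is a minimal sketch of the encoder/muxer side (resolution, bitrate and the output path are assumptions, buffer access uses the API 21 getOutputBuffer call, and the OpenGL code that draws the camera texture into the encoder's input surface is omitted):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;

void startEncoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = encoder.createInputSurface(); // render the camera SurfaceTexture into this with OpenGL
    encoder.start();

    MediaMuxer muxer = new MediaMuxer("/sdcard/recording.mp4", MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int trackIndex = -1;
    boolean muxerStarted = false;

    // Drain loop: copy encoded buffers from the codec into the muxer.
    // The end-of-stream flag arrives after encoder.signalEndOfInputStream() is called.
    while (true) {
        int index = encoder.dequeueOutputBuffer(info, 10000);
        if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            trackIndex = muxer.addTrack(encoder.getOutputFormat());
            muxer.start();
            muxerStarted = true;
        } else if (index >= 0) {
            ByteBuffer encoded = encoder.getOutputBuffer(index);
            if (muxerStarted && info.size > 0) {
                muxer.writeSampleData(trackIndex, encoded, info);
            }
            encoder.releaseOutputBuffer(index, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
        }
    }
    encoder.stop();
    encoder.release();
    muxer.stop();
    muxer.release();
}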
Some links how to record from a SurfaceTexture:
Bigflake: great examples for MediaCodec stuff.
The VideoRecorder class from Lablet.
May also be useful: spydroid-ipcamera streams the data from the MediaRecorder socket as RTP streams but I have found no way to feed that to the MediaCodec. (I already got stuck reading the correct NAL unit sizes as they do...)
Related
I am now trying to play an encrypted video (mp4) entirely with my own logic. Playing back a fully decrypted copy takes too much time, because the file is too large to create and then play. What I have found is that it should be possible to play the video while decrypting it through an InputStream using ExoPlayer, but applying that is too difficult at my level. I have been stuck on this for two days, working through the night, and still don't see any results, so I'm asking for help here.
What I am looking for is a helpful reference. I have to read and decrypt the data in 4096-byte buffers, and I don't know where this code should go.
The flow I have in mind is as follows:
1. Complete the ExoPlayer UI.
2. Encrypt the downloaded file using my encryption logic. (buffer size is 4096)
3. Read the file through an InputStream, decrypt it at the same time, and play it. (streaming)
I can manage steps 1 and 2 somehow, but step 3 is very difficult for me. Do you have any specific code or explanation? Any help would be appreciated. Thank you.
try {
    // read the encrypted source file and write a decrypted copy
    ios = new FileInputStream(params[0]);
    fos = context.openFileOutput(params[1] + ".mp4", MODE_PRIVATE);
    ScatteringByteChannel sbc = ios.getChannel();
    GatheringByteChannel gbc = fos.getChannel();
    File file = new File(params[0]);
    fileLength = file.length();
    startTime = System.currentTimeMillis();
    int read = 0;
    readb = 0;
    ByteBuffer bb = ByteBuffer.allocate(4096);
    while ((read = sbc.read(bb)) != -1) {
        bb.flip();
        // decrypt one 4096-byte block and write it out
        // (note: bb.array() always hands the full 4096-byte backing array
        // to the cipher, even for a shorter final block)
        gbc.write(ByteBuffer.wrap(enDecryptVideo.combineByteArray(bb.array())));
        bb.clear();
        readb += read;
        if (readb % (4096 * 1024 * 3) == 0) {
            publishProgress((int) (readb * 100 / fileLength));
        } else if (readb == fileLength) {
            publishProgress(101);
        }
    }
    ios.close();
    fos.close();
} catch (Exception e) {
    e.getMessage();
} finally {
    Log.d(TAG, "doInBackground: " + (System.currentTimeMillis() - startTime));
}
The code above is what I used when I created a fully decrypted file and then played it. Now I need to decrypt and play at the same time, without creating a file. I am very eager to get this working: I only started this job a month ago and have been given a task above my level, but I really want to hit this target... please teach me.
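For illustration, here is a minimal sketch of the "decrypt while reading" idea, assuming the file were encrypted with a streamable cipher such as AES/CTR (an assumption; your own combineByteArray() logic may differ):

import java.io.FileInputStream;
import java.io.InputStream;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// Returns a stream that decrypts transparently as bytes are pulled from it,
// so no decoded copy of the file is ever written to disk.
InputStream openDecryptingStream(String path, byte[] key, byte[] iv) throws Exception {
    Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
    cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
    // A custom ExoPlayer DataSource can delegate its read(buffer, offset, length)
    // calls to this stream.
    return new CipherInputStream(new FileInputStream(path), cipher);
}

A custom DataSource also has to handle seeking; with a counter mode cipher that means re-initializing the cipher counter at the requested offset, which CTR makes possible.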
You can actually leverage the platform's built-in encryption functionality for streamed video, either using a commercial DRM or using 'clear key' encryption.
If these meet your needs, they should be much easier to work with, as you won't have to implement the encryption and decryption yourself.
This answer provides an example for creating both an HLS / AES stream and a DASH clearkey stream:
https://stackoverflow.com/a/45103073/334402
This does not provide the same security as DRM, as the keys themselves are not encrypted, but it may be sufficient for your needs.
These streams can then be played with the standard iOS, Android or HTML5 players.
I have a multicast UDP video stream and an Android box.
I wrote an application in Xamarin for the Android box
that plays this video stream.
Now I want to add code to the application that records the video stream to a file on the Android box and plays the video from that file whenever the user wants. This will be the "TimeShift" and PVR (Personal Video Recorder) function of my application.
I have looked at a lot of examples, but they only show how to record a video stream from the camera.
If anyone has example code for my case, please share with me.
I have tried various classes to connect and write the data (UdpClient, HttpClient, and also UdpSocket from the rda.SocketsForPCL plugin), but I get connection errors and "protocol type or protocol is not supported" errors.
Is there a way to connect and save the data from a UDP (video) stream?
string host;   // address of the multicast stream (placeholder)
string port;   // port of the multicast stream (placeholder)
IPAddress ip_multi = IPAddress.Parse(host);
IPEndPoint ipEndpoint = new IPEndPoint(ip_multi, Convert.ToInt32(port));
Socket clientSocket;
try
{
    clientSocket = new Socket(SocketType.Dgram, ProtocolType.Udp);
    int length = 65535;               // buffer for one UDP datagram
    byte[] b = new byte[length];
    clientSocket.Connect(ipEndpoint); // this is where the protocol error occurs
    clientSocket.Receive(b);
    //clientSocket.BeginReceive(b, 0, length, SocketFlags.Multicast, receiveCallback, clientSocket);
    clientSocket.Close();
}
catch (Exception ex) // SocketException does not derive from IOException, so catch more broadly
{
    Log.Debug("exception", "error!" + ex.ToString());
}
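For comparison, here is what the receive-and-save loop looks like in plain Java (the Xamarin Socket/UdpClient calls are analogous). The group address, port and output path are placeholders, and on Android you usually also need to hold a WifiManager multicast lock for multicast packets to be delivered:

import java.io.FileOutputStream;
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

void recordMulticast(String group, int port, String outPath) throws Exception {
    MulticastSocket socket = new MulticastSocket(port);
    socket.joinGroup(InetAddress.getByName(group)); // without joining the group, multicast packets never arrive
    FileOutputStream out = new FileOutputStream(outPath);
    byte[] buffer = new byte[65535]; // max UDP datagram size
    try {
        while (true) {
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            socket.receive(packet);
            // dump the raw UDP payload to the file
            out.write(packet.getData(), packet.getOffset(), packet.getLength());
        }
    } finally {
        out.close();
        socket.leaveGroup(InetAddress.getByName(group));
        socket.close();
    }
}

Whether the saved file plays back directly depends on what the stream carries (often MPEG-TS); the timeshift/PVR playback side would read from that same file.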
I can already record my stream on my client phone using Spydroid from another phone. Now I want to stream a locally recorded .mp4 file over RTSP using Spydroid. For example, I have /mount/sdcard/vid.mp4... and I want to play it over RTSP from another phone.
My questions are:
* Is this possible? If not, which class in Spydroid will I need to modify?
* Can I just use a .mp4 file as the stream source rather than the camera stream, or do I need to convert it to another format?
* Does the video stream class create a video file while streaming, or does it just pass the data through in packets directly?
Any opinions and solutions will be appreciated, especially sample code related to my questions. Thank you.
Edited:
I tried to modify the code of VideoStream.java and feed mPacketizer.setInputStream() from my local file, and the error occurs here:
01-23 17:49:06.960: E/H263Packetizer(646): Couldn't skip mp4 header :/
here is my code:
try {
    File file = new File("/sdcard/DCIM/Camera/samp.3gp");
    InputStream stream = new FileInputStream(file);
    // mMediaRecorder.prepare();
    // mMediaRecorder.start();

    // mReceiver.getInputStream() contains the data from the camera;
    // the mPacketizer encapsulates that stream in an RTP stream and sends it over the network
    mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
    // mPacketizer.setInputStream(mReceiver.getInputStream());
    mPacketizer.setInputStream(stream); // feed the packetizer from the local file instead
    mPacketizer.start();
    mStreaming = true;
} catch (IOException e) {
    stop();
    throw new IOException("Something happened with the local sockets :/ Start failed !");
}
}
I have a set of videos stored in a folder on the android file system.
I would like to read each video frame by frame so that I can perform some OpenCV functions on them and then display the result in a Bitmap.
I'm not sure how to do this correctly, any help would be appreciated.
You can take a look at Javacv.
"JavaCV first provides wrappers to commonly used libraries by researchers in the field of computer vision: OpenCV, FFmpeg, libdc1394, PGR FlyCapture, OpenKinect, videoInput, and ARToolKitPlus"
To read each frame by frame you'd have to do something like below
FrameGrabber videoGrabber = new FFmpegFrameGrabber(videoFilePath);
try {
    videoGrabber.setFormat("video format goes here"); // mp4 for example
    videoGrabber.start();
} catch (com.googlecode.javacv.FrameGrabber.Exception e) {
    Log.e("javacv", "Failed to start grabber" + e);
    return -1;
}

Frame vFrame = null;
do {
    try {
        vFrame = videoGrabber.grabFrame();
        if (vFrame != null) {
            // do your magic here
        }
    } catch (com.googlecode.javacv.FrameGrabber.Exception e) {
        Log.e("javacv", "video grabFrame failed: " + e);
    }
} while (vFrame != null);

try {
    videoGrabber.stop();
} catch (com.googlecode.javacv.FrameGrabber.Exception e) {
    Log.e("javacv", "failed to stop video grabber", e);
    return -1;
}
Hope that helps. Good luck!
I know it's too late, but anyone can use this if they need it.
You can use @Pawan Kumar's code, but you also need to add the read permission to your manifest file: <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Then you will get it working.
I don't know about Android, but generally you would have to use VideoCapture::open to open your video and then VideoCapture::grab to get the next frame. See the OpenCV documentation for more information on this.
Update:
It seems like camera access is not officially supported for Android at the moment, see this issue on the OpenCV Github: https://github.com/opencv/opencv/issues/11952
You can either try the unofficial branch linked in the issue: https://github.com/komakai/opencv/tree/android-ndk-camera
or use another library to read in the frames and then create an OpenCV image from the data buffer like in this question.
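As one sketch of that route, you could pull frames with Android's MediaMetadataRetriever and hand each Bitmap to OpenCV (this assumes the OpenCV Android SDK is set up; the 33 ms step and the file path are placeholders):

import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;
import org.opencv.android.Utils;
import org.opencv.core.Mat;

void processFrames(String videoPath) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    retriever.setDataSource(videoPath);
    String dur = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
    long durationMs = dur == null ? 0 : Long.parseLong(dur);
    for (long t = 0; t < durationMs; t += 33) { // roughly 30 fps
        // getFrameAtTime takes microseconds and returns the closest frame
        Bitmap frame = retriever.getFrameAtTime(t * 1000, MediaMetadataRetriever.OPTION_CLOSEST);
        if (frame == null) continue;
        Mat mat = new Mat();
        Utils.bitmapToMat(frame, mat); // run your OpenCV processing on 'mat'
        // convert back with Utils.matToBitmap(mat, frame) and display it if needed
    }
    retriever.release();
}

Note that seeking per frame this way is slow; for long videos a decoder-based approach like the JavaCV answer above will be faster.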
My aim is to pause while recording to a file.
I see on the Android developer site that MediaRecorder has no pause option.
Java supports merging two audio files programmatically, but on Android it does not work:
Join two WAV files from Java?
I also tried the default device audio recorder app, which is available on all devices, but a few Samsung devices do not return the recording path.
Intent intent = new Intent(MediaStore.Audio.Media.RECORD_SOUND_ACTION);
startActivityForResult(intent,REQUESTCODE_RECORDING);
Can anyone help with voice recording with pause functionality?
http://developer.android.com/reference/android/media/MediaRecorder.html
MediaRecorder does not have pause and resume methods. You need to use stop and start methods instead.
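For illustration, a rough sketch of that stop/start approach (the output format, encoder and file-naming scheme are assumptions): each "pause" finalizes a segment file, each "resume" starts a new one, and the collected segments can be merged afterwards.

import android.media.MediaRecorder;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

class SegmentedRecorder {
    private final List<String> segments = new ArrayList<>();
    private MediaRecorder recorder;

    // "resume": start a fresh MediaRecorder writing into a new segment file
    void resumeRecording(String dir) throws IOException {
        String path = dir + "/segment_" + segments.size() + ".mp4";
        segments.add(path);
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setOutputFile(path);
        recorder.prepare();
        recorder.start();
    }

    // "pause": stop() finalizes the current segment; a stopped MediaRecorder
    // must be set up again before reuse, so we simply release it
    void pauseRecording() {
        recorder.stop();
        recorder.release();
        recorder = null;
    }

    List<String> getSegments() {
        return segments;
    }
}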
I had such a requirement in one of my projects. What we did was create a raw file for saving the recorded data at the start of recording, using AudioRecord, and then for each resume we append the data to the same file,
like
FileOutputStream fos= new FileOutputStream(filename, true);
Here the filename is the name of the raw file, and the new recording data gets appended to it.
And when the user stops the recording, we convert the entire raw file to .wav (or another) format. Sorry that I can't post the entire code. Hope this gives you a direction to work in.
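A rough sketch of that approach (the sample rate, mono/16-bit format and the pause flag are assumptions, and the WAV header still has to be written at the end):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.concurrent.atomic.AtomicBoolean;

void recordSegment(String rawFile, AtomicBoolean paused) throws IOException {
    int sampleRate = 44100;
    int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

    // 'true' = append, so a resumed recording continues the same raw file
    FileOutputStream fos = new FileOutputStream(rawFile, true);
    byte[] buffer = new byte[bufferSize];
    audioRecord.startRecording();
    while (!paused.get()) { // 'paused' is your own pause/stop flag
        int read = audioRecord.read(buffer, 0, buffer.length);
        if (read > 0) fos.write(buffer, 0, read);
    }
    audioRecord.stop();
    audioRecord.release();
    fos.close();
    // when the user finally stops, prepend a WAV header to the raw PCM data
}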
You can refer to my answer here if you still have this issue. For API level >= 24, pause/resume methods are available in the Android MediaRecorder class.
For API level < 24
Add below dependency in your gradle file:
compile 'com.googlecode.mp4parser:isoparser:1.0.2'
The solution is to stop the recorder when the user pauses and start it again on resume, as already mentioned in many other answers on Stack Overflow. Store all the generated audio/video files in an array and use the method below to merge all the media files. The example is taken from the mp4parser library and modified a little as per my needs.
public static boolean mergeMediaFiles(boolean isAudio, String[] sourceFiles, String targetFile) {
    try {
        String mediaKey = isAudio ? "soun" : "vide";
        List<Movie> listMovies = new ArrayList<>();
        for (String filename : sourceFiles) {
            listMovies.add(MovieCreator.build(filename));
        }
        List<Track> listTracks = new LinkedList<>();
        for (Movie movie : listMovies) {
            for (Track track : movie.getTracks()) {
                if (track.getHandler().equals(mediaKey)) {
                    listTracks.add(track);
                }
            }
        }
        Movie outputMovie = new Movie();
        if (!listTracks.isEmpty()) {
            outputMovie.addTrack(new AppendTrack(listTracks.toArray(new Track[listTracks.size()])));
        }
        Container container = new DefaultMp4Builder().build(outputMovie);
        FileChannel fileChannel = new RandomAccessFile(String.format(targetFile), "rw").getChannel();
        container.writeContainer(fileChannel);
        fileChannel.close();
        return true;
    } catch (IOException e) {
        Log.e(LOG_TAG, "Error merging media files. exception: " + e.getMessage());
        return false;
    }
}
Use flag isAudio as true for Audio files and false for Video files.
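For example, merging the audio segments recorded across pauses with the method above (the paths are placeholders):

String[] parts = { "/sdcard/rec_part1.mp4", "/sdcard/rec_part2.mp4" };
boolean merged = mergeMediaFiles(true, parts, "/sdcard/rec_full.mp4");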
You can't do it using the Android API, but you can save multiple mp4 files and merge them using mp4parser, a powerful library written in Java. Also see my simple recorder with a "pause": https://github.com/lassana/continuous-audiorecorder.