I can record my stream on my client phone using Spydroid and watch it from another phone. Now I want to stream a locally recorded .mp4 file over RTSP using Spydroid. For example, I have /mount/sdcard/vid.mp4 and I want to play it over RTSP from another phone.
My questions are:
*Is this possible? If it isn't, which class in Spydroid will I need to modify?
*Can I just use a .mp4 file for the stream rather than the camera stream, or do I need to convert it to another format?
*Does the video stream class create a video file while streaming, or does it just pass the data along directly in packets?
Any opinions and solutions will be appreciated, especially sample code similar to my questions. Thank you.
Edited:
I tried to modify the code in VideoStream.java, replacing the mPacketizer input stream with my local file, and I get this error:
01-23 17:49:06.960: E/H263Packetizer(646): Couldn't skip mp4 header :/
here is my code:
File file = new File("/sdcard/DCIM/Camera/samp.3gp");
// mMediaRecorder.prepare();
// mMediaRecorder.start();
try {
    // FileNotFoundException is an IOException, so the constructor belongs inside the try
    InputStream stream = new FileInputStream(file);
    // mReceiver.getInputStream() contains the data from the camera;
    // the mPacketizer encapsulates this stream in an RTP stream and sends it over the network
    mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
    // mPacketizer.setInputStream(mReceiver.getInputStream());
    mPacketizer.setInputStream(stream);
    mPacketizer.start();
    mStreaming = true;
} catch (IOException e) {
    stop();
    throw new IOException("Something happened with the local sockets :/ Start failed !");
}
}
I am trying to get the audio input stream from a file in the local file system on an Android device, so that I can use this library to show a waveform for the audio file:
https://github.com/newventuresoftware/WaveformControl/blob/master/app/src/main/java/com/newventuresoftware/waveformdemo/MainActivity.java#L125
The example in the project uses a raw resource, like so:
InputStream is = getResources().openRawResource(R.raw.jinglebells);
This input stream is later converted into a byte array and passed to code that paints the waveform and plays the sound.
However, when I tried
InputStream is = new FileInputStream(new File(filePath));
it did not work properly: the generated image is wrong, and the sound played is nothing like what the file actually contains.
This is the body of the function in that library that reads the input stream and converts it into samples:
private short[] getAudioSample() throws IOException {
    // If I replace this line with new FileInputStream(new File(filePath)),
    // the generated samples do not work properly with the library.
    InputStream is = getResources().openRawResource(R.raw.jinglebells);
    byte[] data;
    try {
        data = IOUtils.toByteArray(is);
    } finally {
        if (is != null) {
            is.close();
        }
    }
    ShortBuffer sb = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
    short[] samples = new short[sb.limit()];
    sb.get(samples);
    return samples;
}
The sound file that I would like to have processed and passed to that library is created by a MediaRecorder with the following configuration:
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
Basically that library requires raw PCM samples. Feeding it the 3gp FileInputStream produced by MediaRecorder directly is not going to cut it, because 3gp is a container holding AMR-encoded audio, not raw PCM.
What I did was use MediaExtractor and MediaCodec to decode the 3gp audio into PCM, sample by sample, and then feed that into the library. Then everything worked =)
The logic for decoding the audio data can be taken almost directly from this awesome GitHub repo
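As a rough sketch of that decode step (the method name decodeToPcm and the track-0 assumption are mine; it uses the standard android.media synchronous MediaCodec API, so it needs API level 21+ and cannot be shown as a complete runnable program here):

```java
// Decodes the first audio track of a 3gp/AMR file into 16-bit PCM samples.
// Uses android.media.MediaExtractor / MediaCodec / MediaFormat.
private short[] decodeToPcm(String filePath) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(filePath);
    MediaFormat format = extractor.getTrackFormat(0); // assuming track 0 is the audio track
    extractor.selectTrack(0);

    MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
    codec.configure(format, null, null, 0);
    codec.start();

    ByteArrayOutputStream pcm = new ByteArrayOutputStream();
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    boolean inputDone = false, outputDone = false;
    while (!outputDone) {
        if (!inputDone) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                int size = extractor.readSampleData(codec.getInputBuffer(inIndex), 0);
                if (size < 0) {
                    // No more compressed samples: signal end of stream to the decoder.
                    codec.queueInputBuffer(inIndex, 0, 0, 0,
                            MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    inputDone = true;
                } else {
                    codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                    extractor.advance();
                }
            }
        }
        int outIndex = codec.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            ByteBuffer outBuf = codec.getOutputBuffer(outIndex);
            byte[] chunk = new byte[info.size];
            outBuf.get(chunk);
            pcm.write(chunk, 0, chunk.length);
            codec.releaseOutputBuffer(outIndex, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                outputDone = true;
            }
        }
    }
    codec.stop();
    codec.release();
    extractor.release();

    // Same little-endian byte -> short conversion as getAudioSample() above.
    ShortBuffer sb = ByteBuffer.wrap(pcm.toByteArray())
            .order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
    short[] samples = new short[sb.limit()];
    sb.get(samples);
    return samples;
}
```

The returned short[] can then replace the samples produced by getAudioSample() when feeding the waveform library.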
I have a multicast UDP video stream and an Android box. I wrote a Xamarin application for the Android box that plays this video stream.
Now I want to add code so the application can record the video stream to a file on the Android box and play it back from the file when the user wants. This would be a "TimeShift"/PVR (Personal Video Recorder) function for my application.
I have looked at a lot of examples, but they only show recording a video stream from the camera.
If anyone has example code for my case, please share it with me.
I tried various classes to connect and receive the data (UdpClient, HttpClient, also UdpSocket from the rda.SocketsForPCL plugin) and got connection errors and "protocol type not supported" errors.
Is there a way to connect to a UDP (video) stream and save its data?
string host = "239.0.0.1";  // placeholder: your multicast group address
string port = "1234";       // placeholder: your port
IPAddress ip_multi = IPAddress.Parse(host);
IPEndPoint ipEndpoint = new IPEndPoint(IPAddress.Any, Convert.ToInt32(port));
Socket clientSocket = null;
try
{
    clientSocket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
    // To receive multicast you must Bind() to the local port and join
    // the group, rather than Connect() to the multicast address.
    clientSocket.Bind(ipEndpoint);
    clientSocket.SetSocketOption(SocketOptionLevel.IP,
        SocketOptionName.AddMembership, new MulticastOption(ip_multi));
    byte[] b = new byte[65507];            // maximum UDP payload size
    int received = clientSocket.Receive(b);
    // append the received bytes to a file here to build the TimeShift buffer
}
catch (SocketException ex)
{
    Log.Debug("exception", "error!" + ex.ToString());
}
finally
{
    clientSocket?.Close();
}
I have followed this example to convert raw audio data coming from AudioRecord to MP3, and it worked: if I store the data in a file, the MP3 plays fine in a music player.
Now, instead of storing the MP3 data in a file, I need to play it with AudioTrack. The data comes from a Red5 media server as a live stream, but AudioTrack can only play PCM data, so all I hear from my data is noise.
I am now using JLayer for this task.
My code is as follows:
int readresult = recorder.read(audioData, 0, recorderBufSize);
int encResult = SimpleLame.encode(audioData,audioData, readresult, mp3buffer);
and this mp3buffer data is sent to the other user over a Red5 stream.
The data received by the other user arrives as a stream, so the playback code is:
Bitstream bitstream = new Bitstream(data.read());
Decoder decoder = new Decoder();
Header frameHeader = bitstream.readFrame();
SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
short[] pcm = output.getBuffer();
player.write(pcm, 0, pcm.length);
But my code freezes at bitstream.readFrame() after 2-3 seconds, and no sound is produced before that.
Any guess what the problem might be? Any suggestion is appreciated.
Note: I don't need to store the MP3 data, so I can't use MediaPlayer, which requires a file or FileDescriptor.
Just a tip, but try calling
output.close();
bitstream.closeFrame();
after your write code. I process MP3 the same way you do, but I close the buffers after use and I have no problem.
Second tip: do it in a thread or another background process. As for those silent 2 seconds: the player may be waiting until you have processed the whole stream, because you are loading it on the same thread.
Try both tips (you should anyway). In the first case, the problem could be in the internal buffers; in the second, you have probably filled the media input buffer and locked the app (on the same thread, a full buffer cannot receive your input, and the code that plays it and frees that buffer is never invoked because the write blocks).
Also, if you aren't doing it already, check for frameHeader == null to detect the end of the file.
Good luck.
You need to loop through the frames, like this:
Header frameHeader;
while ((frameHeader = bitstream.readFrame()) != null) {
    SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
    short[] pcm = output.getBuffer();
    player.write(pcm, 0, pcm.length);
    bitstream.closeFrame();
}
And make sure you are not running this on the main thread (that is probably the reason for the freezing).
I am developing a low-data-rate VoIP kind of project. I need to capture audio at low data rates and store it in an internal buffer or FIFO (NOT in a file).
I would like to use low-data-rate AMR encoders, which means AudioRecord is out. MediaRecorder looks like it does exactly what I want, except that it writes to a file.
MediaRecorder takes a FileDescriptor... is there any way I can write a class that implements the FileDescriptor interface, acting as a sink for bytes, but storing them in a buffer instead of sending them to a file? The documentation on FileDescriptor specifically says that applications shouldn't write their own, but why not, and is it possible anyway?
http://docs.oracle.com/javase/1.4.2/docs/api/java/io/FileDescriptor.html
In short, I'd like to develop my own stream and trick MediaRecorder into sending data to it. Perhaps by doing something tricky like opening both ends of a socket within the same APK, giving MediaRecorder the socket to write to, and using the socket as my FIFO? I'm somewhat new to this, so any help/suggestions are greatly appreciated.
I have a related question on the RX side: I'd like to have a buffer/FIFO that feeds MediaPlayer. Can I trick MediaPlayer into accepting data from a buffer fed by my own proprietary stream?
I know it's a bit late to answer this question now, but if it helps, here's the solution.
Android MediaRecorder's setOutputFile() method accepts a FileDescriptor as a parameter.
For your need, a Unix data pipe can be created and its FD passed as an argument, in the following manner:
mediaRecorder.setOutputFile(getPipeFD());
FileDescriptor getPipeFD()
{
    final String FUNCTION = "getPipeFD";
    FileDescriptor outputPipe = null;
    try
    {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        // pipe[0] is the read end; keep a reference to it so the
        // recorded data can be drained later.
        outputPipe = pipe[1].getFileDescriptor();
    }
    catch (Exception e)
    {
        Log.e(TAG, FUNCTION + " : " + e.getMessage());
    }
    return outputPipe;
}
ParcelFileDescriptor.createPipe() creates a Unix data pipe and returns an array of two ParcelFileDescriptors: the first refers to the read (source) channel of the pipe and the second to the write (sink) channel. Use the MediaRecorder object to write the recorded data to the write channel.
As far as MediaPlayer is concerned, the same technique can be used by passing the FileDescriptor of the pipe's read channel to the setDataSource() method.
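To complete the picture, here is a hedged sketch of draining the read channel on a background thread. It assumes you keep the ParcelFileDescriptor[] array returned by createPipe() (the getPipeFD() above drops pipe[0]), and myFifo is a hypothetical name for your own thread-safe buffer class:

```java
// Assumes the ParcelFileDescriptor[] pipe array from createPipe() was kept.
final ParcelFileDescriptor readSide = pipe[0];
new Thread(new Runnable() {
    @Override
    public void run() {
        InputStream in = new ParcelFileDescriptor.AutoCloseInputStream(readSide);
        byte[] buffer = new byte[4096];
        try {
            int len;
            while ((len = in.read(buffer)) != -1) {
                // Each chunk is encoded AMR data straight from MediaRecorder.
                myFifo.write(buffer, 0, len);   // myFifo: your own FIFO class
            }
        } catch (IOException e) {
            Log.e(TAG, "pipe read failed", e);
        } finally {
            try { in.close(); } catch (IOException ignored) {}
        }
    }
}).start();
```

The read must run on its own thread: a pipe has a small kernel buffer, and MediaRecorder will block (and eventually fail) if nobody drains the read end.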
I'm trying to send H.264/AAC video from Android's MediaRecorder through a local socket. The goal is to send the video to a Wowza server through RTMP or RTSP, but that is giving me a lot of trouble, so for now I'm just trying to write the data to a file from the LocalServerSocket.
Here is some code. Sorry it's not really clean, but I've spent hours testing many things and my project is a mess right now.
In the Camera activity, the output file setup:
LocalSocket outSocket = new LocalSocket();
try {
outSocket.connect(new LocalSocketAddress(LOCAL_SOCKET));
} catch (Exception e) {
Log.i(LOG_TAG, "Error connecting socket: "+e);
}
mMediaRecorder.setOutputFile(outSocket.getFileDescriptor());
The LocalServerSocket implementation:
try {
    mLocalServerSocket = new LocalServerSocket(mName);
} catch (Exception e) {
    Log.e(LOG_TAG, "Error creating server socket: " + e);
    return;
}
while (true) {
    File out = null;
    FileOutputStream fop = null;
    try {
        mLocalClientSocket = mLocalServerSocket.accept();
        InputStream in = mLocalClientSocket.getInputStream();
        out = new File(mContext.getExternalFilesDir(null), "testfile.mp4");
        fop = new FileOutputStream(out);
        int len = 0;
        byte[] buffer = new byte[1024];
        while ((len = in.read(buffer)) >= 0) {
            Log.i(LOG_TAG, "Writing " + len + " bytes");
            fop.write(buffer, 0, len);
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            fop.close();
            mLocalClientSocket.close();
        } catch (Exception e2) {}
    }
}
The problem is that the resulting file is not readable by any media player. Do you think this is an encoding issue? If I understand correctly, this code should just copy the raw bytes into the file.
Thanks in advance, cheers.
OK, I've found why the files couldn't play. MP4 and 3GPP files contain a header with the bytes:
ftyp3gp4 3gp43gp6 wide mdat
in hex:
0000001866747970336770340000030033677034336770360000000877696465000392D86D6461740000
The 4 bytes before the 'mdat' tag hold the size of the mdat box, which is what tells a player where the 'moov' box (the sample index needed for playback) begins at the end of the file. That size is normally patched in once the recording is over, but since MediaRecorder can't seek on a socket, it can't set these bytes to the correct value in our case.
My problem now is to find a way to make such a file streamable, since that means it must be playable before the recording is over.
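To check where the 'moov' box actually ended up in a file, you can walk the top-level boxes with plain java.io. The layout assumed here (a 32-bit big-endian size including the 8-byte header, then a 4-character type) follows the ISO base media file format; the class name is my own, and 64-bit extended sizes are deliberately not handled:

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Walks top-level MP4/3GP boxes and returns the byte offset of the
// first 'moov' box, or -1 if the stream ends before one is found.
public class Mp4BoxScanner {
    public static long findMoov(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        long offset = 0;
        while (true) {
            int size;
            byte[] type = new byte[4];
            try {
                size = din.readInt();   // 32-bit big-endian box size (includes the 8-byte header)
                din.readFully(type);    // 4-character box type, e.g. "ftyp", "mdat", "moov"
            } catch (EOFException eof) {
                return -1;              // ran out of data without finding moov
            }
            if ("moov".equals(new String(type, StandardCharsets.US_ASCII))) {
                return offset;
            }
            // size == 1 (64-bit extended size) and size == 0 (box runs to EOF)
            // are legal in the spec but not handled in this sketch.
            if (size < 8) {
                return -1;
            }
            long toSkip = size - 8;     // skip the box payload
            while (toSkip > 0) {
                long skipped = din.skip(toSkip);
                if (skipped <= 0) return -1;
                toSkip -= skipped;
            }
            offset += size;
        }
    }
}
```

Running this on a MediaRecorder file shows moov near the end; on a streamable file it appears right after ftyp.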
You could try using MP4Box to restructure your file. The moov box holds the indexes for each audio and video sample; if it sits at the end of the file, streaming is difficult.
This might help:
http://boliston.wordpress.com/tag/moov-box/
Or this:
mp4box -inter 0.5 some_file.mp4
(I don't have the chance to try it at the moment.)
If you need this to work inside your app, I am not aware of any effort to port MP4Box to Android.
I tried to do the same today, but MP4 is not very easy to stream (as said before, some parts are written at the end). I won't say it's impossible, but it seems at least quite hard.
So a workaround for newer Android APIs (4.3+) could be this one:
Set the camera preview to a SurfaceTexture: camera.setPreviewTexture
Record this texture using OpenGL and MediaCodec + MediaMuxer
The drawback of this solution is that the camera's preview size might be smaller than the video size, which means that, depending on your device, you can't record at the highest resolution. Hint: some cameras claim not to support higher preview sizes but actually do, so you can try configuring the camera to set the preview size to the video size. If you do, catch the RuntimeException from camera.setParameters and, if it fails, fall back to the supported preview sizes.
Some links on how to record from a SurfaceTexture:
Bigflake: great examples for MediaCodec stuff.
The VideoRecorder class from Lablet.
May also be useful: spydroid-ipcamera streams the data from the MediaRecorder socket as RTP, but I have found no way to feed that into MediaCodec. (I got stuck reading the correct NAL unit sizes the way they do...)