I am developing an Android application that plays a Shoutcast stream using MediaPlayer. In parallel, I need to record the played stream to the SD card in MP3 format.
Is there a way to do this in Android? Is there any sample code available to achieve it?
I used to do this with the Last.FM player (when it actually worked). It was, however, not a simple means of recording.
Step 1: Write proxy with stream recording function
Step 2: root your phone
Step 3: run on phone:
iptables -t nat -N proxy
iptables -t nat -A OUTPUT -m owner --uid-owner (uid of streaming app) -p tcp -j proxy
iptables -t nat -A proxy -p tcp -j DNAT --to proxyip:port
My 'step 1' was written in Perl and was rather messy. For Shoutcast, there may already be a recording proxy available.
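For what it's worth, a minimal sketch of such a recording proxy in Java might look like this (host, port and file path are placeholders; a real Shoutcast proxy would also need to handle ICY metadata and reconnects):
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class RecordingProxy {

    public static void main(String[] args) throws IOException {
        ServerSocket server = new ServerSocket(8123);                  // local port the player is redirected to
        final Socket client = server.accept();                         // connection from the streaming app
        final Socket upstream = new Socket("stream.example.com", 80);  // real Shoutcast server (placeholder)

        // Forward the player's HTTP request upstream unchanged.
        new Thread(new Runnable() {
            public void run() {
                try {
                    copy(client.getInputStream(), upstream.getOutputStream(), null);
                } catch (IOException ignored) {
                }
            }
        }).start();

        // Forward the audio response back to the player and tee a copy to disk.
        try (OutputStream dump = new FileOutputStream("/sdcard/stream-dump.mp3")) {
            copy(upstream.getInputStream(), client.getOutputStream(), dump);
        }
    }

    // Copies in -> out; if tee is non-null, every byte is also written there.
    private static void copy(InputStream in, OutputStream out, OutputStream tee) throws IOException {
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            if (tee != null) {
                tee.write(buf, 0, n);
            }
        }
        out.flush();
    }
}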
I am working on something similar and facing the same problem.
I am able to play the stream, but I am having trouble with the recording part. I assume I have to read the stream bit by bit and save it to a file. So far, this has been my approach for the recording.
P.S. Don't forget the permissions needed while recording (see the permissions note at the end of this answer).
private void startRecording() {
String fileName = Environment.getExternalStorageDirectory().getAbsolutePath();
fileName += "/FM-Recording-"+recordFile;
mRecorder = new MediaRecorder();
//mRecorder.setAudioSource(mediaPlayer.getAudioSessionId());
// Note: AudioSource.DEFAULT records from the microphone, not from the MediaPlayer output.
mRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mRecorder.setOutputFile(fileName);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
mRecorder.prepare();
} catch (IOException e) {
Log.e(LOG_TAG, "prepare() failed "+e.toString());
}
mRecorder.start();
}
private class RecorderThread extends Thread {
    public void go() {
        // No need to wrap the Runnable in a Thread: runOnUiThread() just needs a Runnable.
        runOnUiThread(new Runnable() {
            public void run() {
                isRecording = true;
                startRecording();
            }
        });
    }
}
Here is the layout: [screenshot of the application layout]
Or implement something like this:
private void startRecording() {
BufferedOutputStream writer = null;
try {
URL url = new URL(RADIO_STATION_URL);
URLConnection connection = url.openConnection();
writer = new BufferedOutputStream(new FileOutputStream(new File(fileName)));
recordingStream = connection.getInputStream();
final int BUFFER_SIZE = 100;
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead;
// Write only the number of bytes actually read, not the whole buffer.
while (isRecording && (bytesRead = recordingStream.read(buffer, 0, BUFFER_SIZE)) != -1) {
writer.write(buffer, 0, bytesRead);
writer.flush();
}
} catch (MalformedURLException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
if (recordingStream != null) {
recordingStream.close();
}
if (writer != null) {
writer.flush();
writer.close();
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
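Regarding the permissions P.S. above, this is roughly what is needed (a sketch; RECORD_AUDIO only applies if you record from the microphone as in the first snippet, and on Android 6.0+ the dangerous permissions also have to be requested at runtime from an Activity):
// In AndroidManifest.xml:
//   <uses-permission android:name="android.permission.INTERNET" />
//   <uses-permission android:name="android.permission.RECORD_AUDIO" />
//   <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

// Runtime check on Android 6.0+ (inside an Activity, using the support/androidx ContextCompat and ActivityCompat):
String[] permissions = {
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.WRITE_EXTERNAL_STORAGE
};
if (ContextCompat.checkSelfPermission(this, permissions[0]) != PackageManager.PERMISSION_GRANTED
        || ContextCompat.checkSelfPermission(this, permissions[1]) != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this, permissions, 1); // 1 = arbitrary request code
}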
Related
I am developing an app which has the functionality of sharing screens with other apps.
I used the MediaProjection API for this. I also used MediaMuxer to combine the audio and video outputs for screen sharing.
I know that the MediaProjection API is used for screen recording, but all I want is to share the screen while recording.
For this, I have modified the writeSampleData method of the MediaMuxer class to send bytes via a socket to the other device over the network.
Below is the code for that:
OutputStream outStream;
outStream = ScreenRecordingActivity.getInstance().socket.getOutputStream();
void writeSampleData(final int trackIndex, final ByteBuffer byteBuf, final MediaCodec.BufferInfo bufferInfo) {
if (mStatredCount > 0) {
mMediaMuxer.writeSampleData(trackIndex, byteBuf, bufferInfo);
if (bufferInfo.size != 0) {
byteBuf.position(bufferInfo.offset);
byteBuf.limit(bufferInfo.offset + bufferInfo.size);
if (outStream != null) {
try {
byte[] bytes = new byte[byteBuf.remaining()];
byteBuf.get(bytes);
//Send the data
outStream.write(bytes);
outStream.flush();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}
The bytes are successfully transferred via socket and I am also able to receive these bytes at the receiver's end.
Below is the code for receiving bytes at the receiver's end:
private class SocketThread implements Runnable {
@Override
public void run() {
Socket socket;
try {
serverSocket = new ServerSocket(SERVER_PORT);
} catch (IOException e) {
e.printStackTrace();
}
if (null != serverSocket) {
while (!Thread.currentThread().isInterrupted()) {
try {
socket = serverSocket.accept();
CommunicationThread commThread = new CommunicationThread(socket);
new Thread(commThread).start();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
class CommunicationThread implements Runnable {
    InputStream in;
    DataInputStream dis;
    FileOutputStream fos; // optionally used to dump the received bytes to a file

    public CommunicationThread(Socket clientSocket) {
        updateMessage("Server Started...");
        try {
            in = clientSocket.getInputStream();
            dis = new DataInputStream(in);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                // Read a chunk of the incoming stream; how to decode/play it is the question below.
                byte[] data = new byte[512];
                dis.readFully(data);
            } catch (Exception e) {
                e.printStackTrace();
                try {
                    if (fos != null) {
                        fos.close();
                    }
                } catch (Exception e1) {
                    e1.printStackTrace();
                }
            }
        }
    }
}
}
I followed these links for screen sharing:
Screen capture
screenrecorder
Screen recording with mediaProjection
I used some code from the above examples to make an app.
All I want to know is how to handle the bytes at the receiver. How do I format these bytes to play a live stream from the sender's side?
Am I following the correct approach for sending and receiving byte data?
Does MediaProjection allow one to stream the Screen while recording between applications?
Any help will be deeply appreciated.
Generally for streaming, including screen sharing, the audio and video tracks are not muxed. Instead, each video frame and audio sample is sent using a protocol like RTP/RTSP, in which each data chunk is wrapped with other things like timestamps.
You can take a look at spyadroid, which is a good starting point for streaming audio and video over RTSP to a browser or VLC. It streams the camera and microphone, but you can adapt it to your own use case.
If you want to go with sockets for the moment, you have to get rid of the MediaMuxer and send frames/samples directly from the encoder output, appending at least timestamps so that playback can be synchronized on the receiver side. Before any media data, send the CSDs (assuming you encode in H.264, these are the SPS and PPS, aka csd-0 and csd-1, which you can grab when the encoder's output format changes) to the receiver's decoder, which you can configure with an output Surface to render your stream. A rough sketch of both ends follows.
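For illustration only (not the poster's code; sendConfig/sendFrame and the transport are placeholders), the two ends might look roughly like this:
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;
import java.nio.ByteBuffer;

public class StreamSketch {

    private static final long TIMEOUT_US = 10_000;

    // Sender side: drain the encoder, sending csd-0/csd-1 once and then frames with timestamps.
    static void drainEncoder(MediaCodec encoder, MediaCodec.BufferInfo info) {
        int outIndex = encoder.dequeueOutputBuffer(info, TIMEOUT_US);
        if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            MediaFormat format = encoder.getOutputFormat();
            ByteBuffer sps = format.getByteBuffer("csd-0");
            ByteBuffer pps = format.getByteBuffer("csd-1");
            sendConfig(sps, pps);                              // send these before any frame
        } else if (outIndex >= 0) {
            ByteBuffer frame = encoder.getOutputBuffer(outIndex);
            sendFrame(frame, info.presentationTimeUs);         // frame bytes + timestamp
            encoder.releaseOutputBuffer(outIndex, false);
        }
    }

    // Receiver side: configure a decoder with the received CSDs and a Surface to render to.
    static MediaCodec createDecoder(ByteBuffer sps, ByteBuffer pps, int width, int height,
                                    Surface outputSurface) throws IOException {
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setByteBuffer("csd-0", sps);
        format.setByteBuffer("csd-1", pps);
        decoder.configure(format, outputSurface, null, 0);
        decoder.start();
        return decoder;
    }

    // Placeholders for whatever transport you use (socket, RTP, ...).
    static void sendConfig(ByteBuffer sps, ByteBuffer pps) { /* ... */ }
    static void sendFrame(ByteBuffer frame, long presentationTimeUs) { /* ... */ }
}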
Some extra links:
android-h264-stream-demo
RTMP Java Muxer for Android
RTSP
RTP
WebRTC
I have been using spydroid from https://github.com/fyhertz/spydroid-ipcamera.
The requirement is that the device should both send and receive the stream, and on the local network we should be able to show the RTSP stream, e.g. in VLC Media Player.
The issue I am facing is that when I change the resolution, e.g. to 640*480, I get a black screen although the stream is live. The default demo supports 320*240, which works fine. I have also changed the bitrate and framerate to match the 640*480 resolution, but couldn't get a result.
Any help would be appreciated.
You might be using the old library that the SpyDroid demo uses.
I had the same issue; I worked around it in the following way:
Steps:
1.) Include the LibStreaming library, as it is the latest library and supports versions above Lollipop.
2.) Find the H263Stream class and change the following method:
From
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
createCamera();
updateCamera();
try {
if (mQuality.resX>=640) {
// Using the MediaCodec API with the buffer method for high resolutions is too slow
mMode = MODE_MEDIARECORDER_API;
}
EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
} catch (Exception e) {
// Fallback on the old streaming method using the MediaRecorder API
Log.e(TAG,"Resolution not supported with the MediaCodec API, we fallback on the old streamign method.");
mMode = MODE_MEDIARECORDER_API;
return testH264();
}
}
To
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
createCamera();
updateCamera();
try {
if (mQuality.resX>=1080) {
// Using the MediaCodec API with the buffer method for high resolutions is too slow
mMode = MODE_MEDIARECORDER_API;
}
EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
} catch (Exception e) {
// Fallback on the old streaming method using the MediaRecorder API
Log.e(TAG,"Resolution not supported with the MediaCodec API, we fallback on the old streamign method.");
mMode = MODE_MEDIARECORDER_API;
return testH264();
}
}
- The only difference is the resolution threshold, changed from "640" to "1080".
- I don't know the exact reason, but the above solution worked for me.
- Get back to me if anything goes wrong.
In my case the problem was in MediaRecorder:
file: VideoStream.java
method: encodeWithMediaRecorder
MediaRecorder doesn't support ParcelFileDescriptor correctly. I created a local file to save the stream from MediaRecorder:
mMediaRecorder.setOutputFile(this.tmpFileToStream);
//mMediaRecorder.setOutputFile(fd); //disable..
and then I run a new thread to copy bytes from tmpFileToStream to mParcelWrite:
public void run()
{
FileDescriptor fd = mParcelWrite.getFileDescriptor();
try {
InputStream isS = new FileInputStream(tmpFileToStream); //read from local file..
FileOutputStream outputStream = new FileOutputStream(fd);
// Poll the growing file: copy whatever has been written so far, then wait for more data.
while (!Thread.interrupted()) {
int content;
while ((content = isS.read()) != -1) {
outputStream.write(content);
}
Thread.sleep(10);
}
} catch (Exception e)
{
Log.e(TAG, "E.. " + e.getMessage());
}
}
/**
* Video encoding is done by a MediaRecorder.
*/
protected void encodeWithMediaRecorder() throws IOException, ConfNotSupportedException {
Log.d(TAG,"Video encoded using the MediaRecorder API");
Log.d(TAG,"Roz" + mRequestedQuality.resX + " x " + mRequestedQuality.resY + " frame " + mRequestedQuality.framerate );
// We need a local socket to forward data output by the camera to the packetizer
createSockets();
// Reopens the camera if needed
destroyCamera();
createCamera();
// The camera must be unlocked before the MediaRecorder can use it
unlockCamera();
this.tmpFileToStream = this.getOutputMediaFile(MEDIA_TYPE_VIDEO);
Log.d(TAG,"Video record to " + this.tmpFileToStream.getAbsolutePath() );
try {
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
// mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setVideoEncoder(mVideoEncoder);
mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
mMediaRecorder.setVideoSize(mRequestedQuality.resX,mRequestedQuality.resY);
mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
// The bandwidth actually consumed is often above what was requested
mMediaRecorder.setVideoEncodingBitRate((int)(mRequestedQuality.bitrate*0.8));
// We write the output of the camera in a local socket instead of a file !
// This one little trick makes streaming feasible quite simply: data from the camera
// can then be manipulated at the other end of the socket
FileDescriptor fd = null;
if (sPipeApi == PIPE_API_PFD) {
fd = mParcelWrite.getFileDescriptor();
} else {
fd = mSender.getFileDescriptor();
}
mMediaRecorder.setOutputFile(this.tmpFileToStream); // save to a local file, then read and stream it
// mMediaRecorder.setOutputFile(fd); //disable..
// copy bytes from local file to mParcelWrite.. :)
if(tInput == null) {
this.tInput = new Thread(this);
this.tInput.start();
}
mMediaRecorder.prepare();
mMediaRecorder.start();
} catch (Exception e) {
StringWriter sw = new StringWriter();
PrintWriter pw = new PrintWriter(sw);
e.printStackTrace(pw);
Log.d(TAG, "Error stack " + sw.toString());
throw new ConfNotSupportedException(e.getMessage());
}
InputStream is = null;
if (sPipeApi == PIPE_API_PFD) {
is = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
} else {
is = mReceiver.getInputStream();
}
// This will skip the MPEG4 header; if this step fails we can't stream anything :(
try {
byte buffer[] = new byte[4];
// Skip all atoms preceding mdat atom
while (!Thread.interrupted()) {
while (is.read() != 'm');
is.read(buffer,0,3);
if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
}
} catch (IOException e) {
Log.e(TAG,"Couldn't skip mp4 header :/");
stop();
throw e;
}
// The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
mPacketizer.setInputStream(is);
mPacketizer.start();
mStreaming = true;
}
That's an unusual solution, but it works for other resolutions.
I've been struggling with this for quite a long time and decided I need help.
The basic idea is described here: Link
I want to stream the output of the MediaRecorder directly to my server over sockets while recording. (In this specific case I want to see whether this is possible without using any streaming protocol, just via an HTTP POST.)
The error of my current approach is: E/MediaRecorder: start failed: -2147483648 - RuntimeException
Before that, I had an error pointing me to this topic: mediarecorder-issue-on-lollipop
I use an AsyncTask class to call my recordData() function:
new sendAsync().execute("test");
private void recordData(Socket socket){
try{
OutputStream out = socket.getOutputStream();
InputStream is = null;
ParcelFileDescriptor[] parcelFileDescriptors = ParcelFileDescriptor.createPipe();
parcelRead = new ParcelFileDescriptor(parcelFileDescriptors[0]);
parcelWrite = new ParcelFileDescriptor(parcelFileDescriptors[1]);
try{
is = new ParcelFileDescriptor.AutoCloseInputStream(parcelRead);
} catch (Exception e){
e.printStackTrace();
}
//pfd = ParcelFileDescriptor.fromSocket(socket);
sft = new SurfaceTexture(0);
sf = new Surface(sft);
if(recorder == null) {
recorder = new MediaRecorder();
recorder.reset();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
recorder.setVideoFrameRate(30);
recorder.setPreviewDisplay(sf);
recorder.setOutputFile(parcelWrite.getFileDescriptor());
recorder.setMaxDuration(10000); // 10 seconds
try {
recorder.prepare();
recorder.start();
System.out.println("This is Recorder running");
} catch (IOException e) {
e.printStackTrace();
}
byte[] buffer = new byte[16384];
int byteread = 0;
while ((byteread = is.read(buffer, 0, buffer.length)) != -1) {
out.write(buffer, 0, byteread);
}
out.flush();
}else{
stopRecording();
}
} catch (Exception e) {
e.printStackTrace();
}
}
Sorry for the bad coding style.
I tried changing the OutputFormat and the VideoEncoder following other solutions to this topic, but I'm still not sure whether I'm going in the right direction.
Even after getting around this bug, I think I somehow need to read out and send the stream to the server in its own thread.
Every hint could help.
I'm not sure whether this is an error caused by the latest versions of Android (tested on Nougat and Oreo), but however hard I tried to find a workaround, I always ended up with the same error message when trying to use the MediaRecorder.
Thanks to libstreaming, I now use the MediaCodec API as explained in their example of how to use the library: Example 2.
I then take the InputStream from the MediaCodec API, read it into a byte array until a certain size is reached, and send that to my HTTP server.
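For reference, a minimal sketch of that read-and-POST loop (the server URL and chunk size are placeholders, and error handling is reduced to the bare minimum):
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkUploader {

    private static final int CHUNK_SIZE = 64 * 1024; // placeholder threshold

    // Reads the encoder's InputStream and POSTs it to the server in CHUNK_SIZE pieces.
    static void streamToServer(InputStream encoderStream, String serverUrl) throws IOException {
        ByteArrayOutputStream chunk = new ByteArrayOutputStream();
        byte[] buffer = new byte[8192];
        int read;
        while ((read = encoderStream.read(buffer)) != -1) {
            chunk.write(buffer, 0, read);
            if (chunk.size() >= CHUNK_SIZE) {
                post(serverUrl, chunk.toByteArray());
                chunk.reset();
            }
        }
        if (chunk.size() > 0) {
            post(serverUrl, chunk.toByteArray()); // flush the last partial chunk
        }
    }

    private static void post(String serverUrl, byte[] body) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(serverUrl).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setFixedLengthStreamingMode(body.length);
        OutputStream out = conn.getOutputStream();
        out.write(body);
        out.flush();
        out.close();
        conn.getResponseCode(); // force the request to complete
        conn.disconnect();
    }
}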
I'm testing libstreaming on the new Android Lollipop, and this code, which worked on previous releases, seems to throw an exception.
try {
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
mMediaRecorder.setVideoEncoder(mVideoEncoder);
mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
mMediaRecorder.setVideoSize(mRequestedQuality.resX,mRequestedQuality.resY);
mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
// The bandwidth actually consumed is often above what was requested
mMediaRecorder.setVideoEncodingBitRate((int)(mRequestedQuality.bitrate*0.8));
// We write the output of the camera in a local socket instead of a file !
// This one little trick makes streaming feasible quite simply: data from the camera
// can then be manipulated at the other end of the socket
mMediaRecorder.setOutputFile(mSender.getFileDescriptor());
mMediaRecorder.prepare();
mMediaRecorder.start();
} catch (Exception e) {
throw new ConfNotSupportedException(e.getMessage());
}
The exception thrown is:
MediaRecorder: start failed -38
11-18 09:50:21.028: W/System.err(15783): net.majorkernelpanic.streaming.exceptions.ConfNotSupportedException
11-18 09:50:21.028: W/System.err(15783): at net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaRecorder(VideoStream.java:442)
11-18 09:50:21.028: W/System.err(15783): at net.majorkernelpanic.streaming.MediaStream.start(MediaStream.java:250)
I've tried to comment out:
mMediaRecorder.setOutputFile(mSender.getFileDescriptor());
No exception is thrown then, but when I start streaming, a dialog tells me that an output file is needed.
Help appreciated.
I filed a bug report on AOSP.
https://code.google.com/p/android/issues/detail?id=80715
"The current SELinux policies don't allow for mediaserver to handle app generated abstract unix domain sockets.
Instead, I'd recommend you create a pipe-pair ( http://developer.android.com/reference/android/os/ParcelFileDescriptor.html#createPipe() ) which is allowed by the Android 5.0 policy.
"
I don't know why they did this or how we were supposed to know.
I'm using a very old/modified (I can't tell which) version of libstreaming where MediaStream still extends MediaRecorder, but looking at the current version, in MediaStream you'll probably want to change createSockets to something that includes the following:
ParcelFileDescriptor[] parcelFileDescriptors =ParcelFileDescriptor.createPipe();
parcelRead = new ParcelFileDescriptor(parcelFileDescriptors[0]);
parcelWrite = new ParcelFileDescriptor(parcelFileDescriptors[1]);
Then, in your video/audio stream:
setOutputFile(parcelWrite.getFileDescriptor());
and in that same file, change
// The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
mPacketizer.setInputStream(mReceiver.getInputStream());
mPacketizer.start();
to
InputStream is = null;
try {
    is = new ParcelFileDescriptor.AutoCloseInputStream(parcelRead);
} catch (Exception e) {
    e.printStackTrace();
}
mPacketizer.setInputStream(is);
As andreasperelli pointed out in the comment, make sure to close the ParcelFileDescriptors in closeSockets(), or depending on your implementation and version, before closeSockets() and before you call MediaRecorder.stop().
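For example, closeSockets() could simply do something like this (a sketch; the field names follow the snippet above):
protected void closeSockets() {
    try {
        if (parcelRead != null) {
            parcelRead.close();   // closes the read end of the pipe
        }
        if (parcelWrite != null) {
            parcelWrite.close();  // closes the write end handed to MediaRecorder
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}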
On Android 6.0 I resolved this problem with the following code:
new Thread(new Runnable() {
@Override public void run() {
FileInputStream inputStream = null;
try {
inputStream = new FileInputStream(path);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
// Busy-wait loop: repeatedly read whatever bytes are currently available in the growing file and forward them.
while (true) {
byte[] buffer = new byte[0];
try {
buffer = new byte[inputStream.available()];
} catch (IOException e) {
e.printStackTrace();
}
try {
inputStream.read(buffer);
} catch (IOException e) {
e.printStackTrace();
}
try {
mSender.getOutputStream().write(buffer);
mSender.getOutputStream().flush();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}).start();
I use a file as a buffer: the MediaRecorder writes its output to the file, and another thread reads the bytes and writes them to the socket.
I am trying to build ffmpeg for Android. I want to achieve two things with it:
1. Rotate a video
2. Join two or more videos
There are two approaches for having ffmpeg in my application:
1. Having the ffmpeg executable, copying it to /data/package/ and executing ffmpeg commands.
2. Building the ffmpeg library as a .so file with the NDK and writing the JNI code, etc.
Which approach is best for my needs? And can I have some code snippets that follow those approaches?
You can achieve it in two ways; I would do it with the first one:
Place your ffmpeg file into your raw folder.
You need to use the ffmpeg executable via commands, but you'll need to copy the file into a filesystem folder and change its permissions, so use this code:
public static void installBinaryFromRaw(Context context, int resId, File file) {
final InputStream rawStream = context.getResources().openRawResource(resId);
final OutputStream binStream = getFileOutputStream(file);
if (rawStream != null && binStream != null) {
pipeStreams(rawStream, binStream);
try {
rawStream.close();
binStream.close();
} catch (IOException e) {
Log.e(TAG, "Failed to close streams!", e);
}
doChmod(file, 777);
}
}
public static OutputStream getFileOutputStream(File file) {
try {
return new FileOutputStream(file);
} catch (FileNotFoundException e) {
Log.e(TAG, "File not found attempting to stream file.", e);
}
return null;
}
public static void pipeStreams(InputStream is, OutputStream os) {
byte[] buffer = new byte[IO_BUFFER_SIZE];
int count;
try {
while ((count = is.read(buffer)) > 0) {
os.write(buffer, 0, count);
}
} catch (IOException e) {
Log.e(TAG, "Error writing stream.", e);
}
}
public static void doChmod(File file, int chmodValue) {
final StringBuilder sb = new StringBuilder();
sb.append("chmod");
sb.append(' ');
sb.append(chmodValue);
sb.append(' ');
sb.append(file.getAbsolutePath());
try {
Runtime.getRuntime().exec(sb.toString());
} catch (IOException e) {
Log.e(TAG, "Error performing chmod", e);
}
}
Call this method:
private void installFfmpeg() {
File ffmpegFile = new File(getCacheDir(), "ffmpeg");
String mFfmpegInstallPath = ffmpegFile.toString();
Log.d(TAG, "ffmpeg install path: " + mFfmpegInstallPath);
if (!ffmpegFile.exists()) {
try {
ffmpegFile.createNewFile();
} catch (IOException e) {
Log.e(TAG, "Failed to create new file!", e);
}
Utils.installBinaryFromRaw(this, R.raw.ffmpeg, ffmpegFile);
}else{
Log.d(TAG, "It was installed");
}
ffmpegFile.setExecutable(true);
}
Then you will have your ffmpeg file ready to use via commands. (This way works for me, but some people say it doesn't work; I don't know why, I hope that isn't your case.) Then we use ffmpeg with this code:
String command = "data/data/YOUR_PACKAGE/cache/ffmpeg" + THE_REST_OF_YOUR_COMMAND;
try {
Process process = Runtime.getRuntime().exec(command);
process.waitFor();
Log.d(TAG, "Process finished");
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
As I said, you have to use the ffmpeg file via commands, so you should search the Internet and choose the command you want to use, then add it to the command string. If the command fails, you won't be alerted by any log, so you should first try your command in a terminal emulator and make sure it works; if it doesn't work, you simply won't see any result. A couple of example commands for the tasks in the question are sketched below.
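For instance, something like the following (a sketch only; the paths and the list file are placeholders, and the exact options depend on your ffmpeg build):
// Rotate a video 90 degrees clockwise (the transpose filter).
String rotateCommand = "data/data/YOUR_PACKAGE/cache/ffmpeg -i /sdcard/in.mp4 -vf transpose=1 /sdcard/rotated.mp4";

// Join videos listed in list.txt (one line per clip: file '/sdcard/part1.mp4').
// -safe 0 allows absolute paths; -c copy joins without re-encoding (the clips must share codecs/parameters).
String joinCommand = "data/data/YOUR_PACKAGE/cache/ffmpeg -f concat -safe 0 -i /sdcard/list.txt -c copy /sdcard/joined.mp4";

try {
    Process process = Runtime.getRuntime().exec(rotateCommand);
    process.waitFor();
} catch (IOException | InterruptedException e) {
    e.printStackTrace();
}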
Hope it's useful!!
The advantage of the library approach is that you have better control over the progress of your conversion and can tune it along the way. On the other hand, operating the executable is a bit easier. Finally, you can simply install the ffmpeg4android app and work with its API.
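If you go the library route instead, the Java side of the JNI glue is roughly the following (a sketch; the library name "ffmpegjni" and the native method are hypothetical, and the actual work happens in the C code you write against the ffmpeg libraries):
public class FFmpegBridge {

    static {
        // Loads libffmpegjni.so built with the NDK (hypothetical name).
        System.loadLibrary("ffmpegjni");
    }

    // Implemented in C/C++ via JNI; takes argv-style ffmpeg arguments and returns an exit code.
    public static native int run(String[] args);
}
You would then call something like FFmpegBridge.run(new String[]{"-i", "in.mp4", "-vf", "transpose=1", "out.mp4"}); from your app.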