Phonegap Media record mp3 file corrupt - android

I am using phonegap media to record audio as mp3. After recording, it plays fine on my Android and plays fine on Windows Media Player. However, when I try it in the browser it says that the file is corrupt.
Exact errors:
Chrome: "We cannot play this audio file right now."
Firefox: "Video can't be played because the file is corrupt."
IE: Opens the file in WMP and it plays.
I used the code from the example. http://docs.phonegap.com/en/2.6.0/cordova_media_media.md.html#media.startRecord
// Record audio
//
function recordAudio() {
    var src = "myrecording.mp3";
    var mediaRec = new Media(src,
        // success callback
        function() {
            console.log("recordAudio():Audio Success");
        },
        // error callback
        function(err) {
            console.log("recordAudio():Audio Error: " + err.code);
        });

    // Record audio
    mediaRec.startRecord();
}
Thanks in advance.
Edit:
Here is an example. http://blrbrdev.azurewebsites.net/voice/blrbr_130419951008830874.mp3
This plays in WMP but not the browser.

The file you provided as an mp3 does not appear to be an mp3 file.
I have attached the details of the file below. As you can see, it is the AMR codec packed in an MPEG-4/3GPP container. I would say no current browser can decode that natively (though software like VLC can play it back).
If you are attempting to play an audio file in a browser - be it via HTML5 audio, for example - you need to provide a compatible format. Have a look here for a compatibility table.
This is expected behavior as stated here:
Android devices record audio in Adaptive Multi-Rate format. The specified file should end with a .amr extension.
If you want to play it in a browser/HTML5 audio tag you will need to post-process the file to convert it to a valid mp3 file (add an ogg version for full browser coverage). Server side this can be done with a program called ffmpeg, for example (a rough sketch follows after the file specs below). I am no specialist in Phonegap development, so I cannot point you to a valid lib that does this client side, but maybe that has already been asked on SO.
File specs:
General
Complete name : C:\wamp\www\stack\sample\thisTest.mp3
Format : MPEG-4
Format profile : 3GPP Media Release 4
Codec ID : 3gp4
File size : 10.5 KiB
Duration : 4s 780ms
Overall bit rate mode : Constant
Overall bit rate : 18.0 Kbps
Performer : LGE
Encoded date : UTC 2014-04-15 00:24:57
Tagged date : UTC 2014-04-15 00:24:57
Audio
ID : 1
Format : AMR
Format/Info : Adaptive Multi-Rate
Format profile : Narrow band
Codec ID : samr
Duration : 4s 780ms
Bit rate mode : Constant
Bit rate : 12.8 Kbps
Channel(s) : 1 channel
Sampling rate : 8 000 Hz
Bit depth : 13 bits
Stream size : 7.47 KiB (71%)
Title : SoundHandle
Writing library :
Language : English
Encoded date : UTC 2014-04-15 00:24:57
Tagged date : UTC 2014-04-15 00:24:57
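As mentioned above, the conversion can be done server side with ffmpeg. Here is a minimal Node.js sketch as an illustration only (not part of the original setup); the helper name and output paths are hypothetical, and it assumes ffmpeg with mp3 and ogg encoders is installed and on the PATH:
const { execFileSync } = require('child_process');

// Convert the AMR-in-3GPP "mp3" produced by the Android recorder into real
// mp3 and ogg files that HTML5 <audio> can actually decode.
function convertRecording(inputPath) {
    const base = inputPath.replace(/\.[^.]+$/, '');
    execFileSync('ffmpeg', ['-y', '-i', inputPath, base + '.converted.mp3']);
    execFileSync('ffmpeg', ['-y', '-i', inputPath, base + '.converted.ogg']);
}

convertRecording('blrbr_130419951008830874.mp3');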

Firstly, thanks to @Forestan06 for pointing me in the right direction.
For those of you recording on Android devices in .amr format and needing said recording in .mp3 format on your server using .NET C#, this is how I did it.
Install-Package MediaToolkit --> http://www.nuget.org/packages/MediaToolkit/
Write this code:
var fileName = "myVoice.mp3";
string fileNameWithPath = Server.MapPath("~/Voice/" + fileName);

if (request.FileName.EndsWith(".amr"))
{
    var amrFileName = "myVoice.amr";
    string amrFileNameWithPath = Server.MapPath("~/Voice/Amr/" + amrFileName);
    request.SaveAs(amrFileNameWithPath);

    var inputFile = new MediaFile { Filename = amrFileNameWithPath };
    var outputFile = new MediaFile { Filename = fileNameWithPath };

    using (var engine = new Engine())
    {
        engine.Convert(inputFile, outputFile);
    }
}
else
{
    request.SaveAs(fileNameWithPath);
}

Related

Does RTMP support the Display Orientation SEI Message in h264 streams?

I'm streaming h264 video and AAC audio over RTMP on Android using the native MediaCodec APIs. Video and audio look great; however, while the video is shot in portrait mode, playback on the web or with VLC is always in landscape.
Having read through the h264 spec, I see that this sort of extra metadata can be specified in Supplemental Enhancement Information (SEI), and I've gone about adding it to the raw h264 bitstream. My SEI NAL unit follows this rudimentary format (I plan to optimize later):
val displayOrientationSEI = {
    val prefix = byteArrayOf(0, 0, 0, 1)
    val nalHeader = byteArrayOf(6) // forbidden_zero_bit:0; nal_ref_idc:0; nal_unit_type:6
    val display = byteArrayOf(47 /* Display orientation type */, 3 /* payload size */)
    val displayOrientationCancelFlag = "0" // u(1); Rotation information follows
    val horFlip = "1" // hor_flip; u(1); Flip horizontally
    val verFlip = "1" // ver_flip; u(1); Flip vertically
    val anticlockwiseRotation = "0100000000000000" // u(16); value / 2^16 -> 90 degrees
    val displayOrientationRepetitionPeriod = "010" // ue(v); Persistent till next video sequence
    val displayOrientationExtensionFlag = "0" // u(1); No other value is permitted by the spec atm
    val byteAlignment = "1"
    val bitString = displayOrientationCancelFlag +
            horFlip +
            verFlip +
            anticlockwiseRotation +
            displayOrientationRepetitionPeriod +
            displayOrientationExtensionFlag +
            byteAlignment
    prefix + nalHeader + display + BigInteger(bitString, 2).toByteArray()
}()
Using Jcodec's SEI class, I can see that my SEI message is parsed properly. I write out these packets to the RTMP stream using an Android JNI wrapper for LibRtmp.
Despite this, ffprobe does not show the orientation metadata, and the video when played remains in landscape.
At this point I think I'm missing a very small detail about how FLV headers work when the raw h264 units are written out by LibRtmp. I have tried appending this displayOrientationSEI NAL unit:
To the initial SPS and PPS configuration only.
To each raw h264 NAL unit straight from the encoder.
To both.
What am I doing wrong? Going through the source of some RTMP libraries, like rtmp-rtsp-stream-client-java, it seems the SEI message is dropped when creating FLV tags.
Help is much, much appreciated.
Does RTMP support the Display Orientation SEI Message in h264 streams?
RTMP is unaware of the very concept. From RTMP's perspective, the SEI is just a series of bytes that it copies. It never looks at them, and it never parses them.
The thing that needs to support it is the h.264 decoder (which RTMP is also unaware of) and the player software. If it is not working for you, you must check the player, or the validity of the encoded SEI, not the transport.

How to live stream from android device camera to store video chunks on Amazon S3 bucket?

I'm working on an Android project around video recording and analysis. Since the video would be huge in size, I have to design a solution in which, say, every 30 minutes (the interval can vary from 1 minute to 30 minutes), my code picks up a curtailed video file and uploads it to the cloud, then deletes that part from the device, while the camera continues to monitor and capture the video feed.
For example, if I start streaming or recording at 8:00 AM and use an interval of 1 minute,
the video files stored in the S3 bucket should contain data like:
1st video file - from 8:00 AM to 8:01 AM
2nd video file - from 8:01 AM to 8:02 AM
....
and so on until the stream stops.
For now, I'm using the AmazonKinesisVideoDemoApp from https://github.com/awslabs/aws-sdk-android-samples.
I'm using Kinesis Video Streams as a medium: I start streaming from my Android device to Kinesis and use the GET_HLS_STREAMING_SESSION_URL API in Python to get video from the stream and store it in an S3 bucket.
import time
from datetime import datetime, timedelta

import boto3
import cv2

# STREAM_NAME, REGION_NAME, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are defined elsewhere.

def save_chunks_to_s3(init_time, time_range):
    timestamp = datetime.strptime(init_time, "%Y%m%d_%H%M%S")
    while True:
        playback_url = kvam.get_hls_streaming_session_url(
            StreamName=STREAM_NAME, PlaybackMode='ON_DEMAND',
            HLSFragmentSelector={
                'FragmentSelectorType': 'SERVER_TIMESTAMP',
                'TimestampRange': {
                    'StartTimestamp': timestamp,
                    'EndTimestamp': timestamp + timedelta(seconds=time_range)
                }
            }
        )['HLSStreamingSessionURL']
        vcap = cv2.VideoCapture(playback_url)
        fwidth = int(vcap.get(cv2.CAP_PROP_FRAME_WIDTH))
        fheight = int(vcap.get(cv2.CAP_PROP_FRAME_HEIGHT))
        length = int(vcap.get(cv2.CAP_PROP_FRAME_COUNT))
        current_time = timestamp.strftime("%Y%m%d_%H%M%S")
        filename = 'output_' + current_time + '.avi'
        print("Saving to file " + filename)
        fourcc = cv2.VideoWriter_fourcc(*'MJPG')
        out = cv2.VideoWriter(filename, fourcc, 20.0, (fwidth, fheight))
        (grabbed, frame) = vcap.read()
        if grabbed:
            # Capturing all the frames
            while length > 0:
                if frame is not None:
                    out.write(frame)
                length -= 1
                (grabbed, frame) = vcap.read()
            out.release()
            vcap.release()
            # Storing to s3 bucket
            with open(filename, 'rb') as data:
                s3.upload_fileobj(data, 'android-kinesis-video-chunks', 'android_stream_' + filename)
            print("Saved to file " + filename)
            timestamp = timestamp + timedelta(seconds=time_range)
        else:
            out.release()
            vcap.release()
            break

s3 = boto3.client("s3")
kvs = boto3.client("kinesisvideo")
response = kvs.get_data_endpoint(APIName="GET_HLS_STREAMING_SESSION_URL", StreamName=STREAM_NAME)
# Grab the endpoint from GetDataEndpoint
endpoint = response['DataEndpoint']
# Grab the HLS Stream URL from the endpoint
kvam = boto3.client("kinesis-video-archived-media",
                    endpoint_url=endpoint,
                    region_name=REGION_NAME,
                    aws_access_key_id=AWS_ACCESS_KEY_ID,
                    aws_secret_access_key=AWS_SECRET_ACCESS_KEY)

while True:
    try:
        # This will throw ResourceNotFoundException if not streaming
        live_url = kvam.get_hls_streaming_session_url(StreamName=STREAM_NAME, PlaybackMode="LIVE")['HLSStreamingSessionURL']
        init_utc_time = datetime.utcnow().strftime("%Y%m%d_%H%M%S")
        start_time = time.time()
        # Storing will start after a delay of 90s after the live stream starts
        while time.time() - start_time < 90:
            print(time.time() - start_time)
        # Using an interval of 1 minute
        save_chunks_to_s3(init_utc_time, 60)
    except:
        print("Waiting for live stream......")
I'm running the above code on my local machine before starting the video stream on the Android device.
But the requirement is to either run this using some serverless solution or find some other solution where the video is streamed directly to S3.

MediaPlayer: Error (1,-2147483648) in Cordova 6.1.1 with Phaser 2.4.6

I am trying to play audio files in an Android game built with Cordova 6.1.1 and Phaser.io 2.4.6. The media will not play on Android versions below about API 21, and gives the error:
04-21 21:48:57.546 9659-9671/com.jtronlabs.birdu E/MediaPlayer: error (1, -2147483648)
04-21 21:48:57.546 9659-9672/com.jtronlabs.birdu E/MediaPlayer: Error (1,-2147483648)
I have read some SO answers, and nothing has helped. I load in audio using Phaser's Loader class:
this.load.audio('background-music', this.arrayOfCompatibleMusicFileNames('the_plucked_bird') );
...
//Phaser has support to load in multiple types of audio formats if the first supplied in the array is not compatible with the browser.
arrayOfCompatibleMusicFileNames: function(key){
    //old versions of android don't play music, they require an absolute pathname (instead of relative). This is a generic solution
    //https://stackoverflow.com/questions/4438822/playing-local-sound-in-phonegap?lq=1
    var path = window.location.pathname;
    path = path.substr( 0, path.lastIndexOf("/")+1 ); //need to remove 'index.html' from the end of pathname
    var aud = path+'assets/audio/';
    //aud = '/android_res/raw/'
    var wav = aud + 'wav/';
    var ogg = aud + 'ogg/';
    var mp3 = aud + 'mp3/';
    console.log(mp3+key+".mp3");
    return [mp3+key+".mp3", ogg+key+".ogg", wav+key+".wav"];
},
This works in the browser and on newer versions of Android. On older versions I have tried adding multiple formats, using the absolute path, adding external write permissions to $PROJECT_ROOT/platforms/android/AndroidManifest.xml, and moving the files from /www to $PROJECT_ROOT/platforms/android/res/raw.
All for naught. Any ideas on what could be going wrong?
Edit: When the audio files are in the 'res' folder, I reference them as such:
arrayOfCompatibleMusicFileNames: function(key){
    return ['/android_res/raw/'+key+".ogg"];
}
Which works on API 21 but not 19 or below (just like the first function).
I have gotten the Audio working by using the cordova-plugin-media.
var path = window.location.pathname;
path = path.substr( 0, path.lastIndexOf("/")+1 ); //need to remove 'index.html' from the end of pathname
var aud = path+'assets/audio/';
var ogg = aud + 'ogg/' + key + ".ogg";
//works!
var snd = new Media(ogg);
snd.play();
However, I discovered the 'normal' way of doing this is what causes the bad behaviour.
//does not work... Phaser uses this internally
var snd = new Audio(ogg);
snd.play();
It seems I will have to write code to test whether it is running in the browser or in Cordova, and use 'Audio' or 'Media' respectively.
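Something along these lines (a rough sketch only; the playSound helper is illustrative, and it assumes the Media constructor from cordova-plugin-media is only defined inside the Cordova webview):
function playSound(url) {
    var snd;
    if (window.cordova && typeof Media !== 'undefined') {
        // Running inside Cordova on an old Android WebView: use the media plugin
        snd = new Media(url);
    } else {
        // Plain browser (or a newer WebView where HTML5 Audio works)
        snd = new Audio(url);
    }
    snd.play();
}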
Update: I wrote that code and it makes things messy, but works. Nowadays I use Crosswalk, and WebAudio works on all devices. No need for the media plugin and extra case-checking in my code.

Can not play recorded audio file from android in iOS 5+

I am working on an Android app which also supports iOS. I want to record audio and play it on Android as well as on iOS devices. I am recording audio on Android using the following settings:
MediaRecorder audioRecorder = new MediaRecorder();
audioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
audioRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
audioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
audioRecorder.setAudioSamplingRate(44100);
audioRecorder.setAudioChannels(1);
audioRecorder.setAudioEncodingBitRate(12800);
audioRecorder.setOutputFile(<recordedSoundFilePath>);
audioRecorder.prepare();
audioRecorder.start();
On the iOS side, the settings are as follows:
//audioRecorder is object of AVAudioRecorder
NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] initWithCapacity:10];
NSNumber *formatObject;
formatObject = [NSNumber numberWithInt: kAudioFormatMPEG4AAC ];
[recordSettings setObject:formatObject forKey: AVFormatIDKey];
[recordSettings setObject:[NSNumber numberWithFloat:44100.0] forKey: AVSampleRateKey];
[recordSettings setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
[recordSettings setObject:[NSNumber numberWithInt:12800] forKey:AVEncoderBitRateKey];
[recordSettings setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
[recordSettings setObject:[NSNumber numberWithInt: AVAudioQualityHigh] forKey: AVEncoderAudioQualityKey];
NSURL *soundFileURL = [NSURL fileURLWithPath:[self soundFilePath]];
NSError *error = nil;
audioRecorder = [[ AVAudioRecorder alloc] initWithURL:soundFileURL settings:recordSettings error:&error];
if ([audioRecorder prepareToRecord] == YES) {
    [audioRecorder record];
} else {
    int errorCode = CFSwapInt32HostToBig([error code]);
    NSLog(@"Error: %@ [%4.4s]", [error localizedDescription], (char*)&errorCode);
}
I can record the audio and it plays correctly on Android devices.
Now the problem is that I can play audio recorded on iOS on an Android device, but an iOS device can't play the audio recorded on Android. It returns OSStatus error 1937337955. I searched for this error, but I couldn't find anything.
Can anybody tell me what's going wrong in my code ?
Any kind of help is highly appreciated. Thanks in advance.
try this one:
audioRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
audioRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
audioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
I faced the same issue, where audio recorded on a Samsung device would not play on any iOS device, nor in the Safari browser, while it worked fine on all Android devices. I fixed it by adding the lines below to my AudioRecordingUtil class:
recorder?.let {
    it.setAudioSource(MediaRecorder.AudioSource.MIC)
    it.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS)
    it.setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
}
Hope this can help!
It is a codec issue: the format recorded by iOS cannot be decoded by the Android media player. To make it work, just convert the audio file on the iOS side into an Android-supported format like mp3, mp4, or 3GP.
I also had issues with MediaRecorder. At recording time, the MIME types differ by platform:
Mac Chrome - MIME type: audio/webm;codecs=opus
Mac Safari - MIME type: audio/mp4
Windows/Android - MIME type: audio/webm;codecs=opus
iPhone Chrome - MIME type: audio/mp4
I was saving the file as M4A, but the audio would not play on iOS. After some analysis and testing, I decided to convert the file on the server after upload, using ffmpeg, and it worked like a charm.
<!-- https://mvnrepository.com/artifact/org.bytedeco/ffmpeg-platform -->
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>ffmpeg-platform</artifactId>
    <version>4.3.2-1.5.5</version>
</dependency>
/**
 * Convert the file into MP4 using the H264 codec in order to make it work on iOS mobile devices
 * @param file
 * @param outputFile
 */
private void convertToM4A(File file, File outputFile) {
    try {
        String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
        ProcessBuilder pb = new ProcessBuilder(ffmpeg, "-i", file.getPath(), "-vcodec", "h264", outputFile.getPath());
        pb.inheritIO().start().waitFor();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

Android -- Can't play any videos (mp4/mov/3gp/etc.)?

I'm having great difficulty getting my Android application to play videos from the SD card. No matter what size, bitrate, video format, or any other setting I try, neither the emulator nor my G1 will play anything I encode. I've also tried a number of videos from the web (various video formats, bitrates, with and without audio tracks, etc.), and none of those work either.
All I keep getting is a dialog box that says:
"Cannot play video"
"Sorry, this video cannot be played."
There are errors reported in LogCat, but I don't understand them and I've tried searching the Internet for further explanations without any luck. See below:
03-30 05:34:26.807: ERROR/QCOmxcore(51): OMXCORE API : Free Handle 390d4
03-30 05:34:26.817: ERROR/QCOmxcore(51): Unloading the dynamic library for OMX.qcom.video.decoder.avc
03-30 05:34:26.817: ERROR/PlayerDriver(51): Command PLAYER_PREPARE completed with an error or info PVMFErrNoResources
03-30 05:34:26.857: ERROR/MediaPlayer(14744): error (1, -15)
03-30 05:34:26.867: ERROR/MediaPlayer(14744): Error (1,-15)
Sometimes I also get this:
03-30 05:49:49.267: ERROR/PlayerDriver(51): Command PLAYER_INIT completed with an error or info PVMFErrResource
03-30 05:49:49.267: ERROR/MediaPlayer(19049): error (1, -17)
03-30 05:49:49.347: ERROR/MediaPlayer(19049): Error (1,-17)
Here is the code I'm using (in my onCreate() method):
this.setContentView(R.layout.main);
//just a simple VideoView loading files from the SD card
VideoView myIntroView = (VideoView) this.findViewById(R.id.VideoView01);
MediaController mc = new MediaController(this);
myIntroView.setMediaController(mc);
myIntroView.setVideoPath("/sdcard/test.mp4");
myIntroView.requestFocus();
myIntroView.start();
Please help!
Okay, here goes. The video I've been working on in Adobe Premiere is supposed to be 480x800 (WxH), but I have the Adobe Media Encoder output the sequence as an "Uncompressed Microsoft AVI" using the "UYVY" video codec, 24fps frame rate, progressive, square pixels, and dimensions: 720x800 (WxH). This outputs a rather large file with 120px black borders on either side of the video content. I then take the video into Handbrake 0.9.4 and use the following settings (I started with the Regular->Normal preset):
Container: MP4 File
Large file size: [un-Checked]
Web-optimized: [un-Checked]
iPod 5G support: [un-Checked]
Width: 320 (this is key, any higher than 320 and it won’t work)
Height: 528
Keep Aspect Ratio: [Checked]
Anamorphic: None
Crop Left: 120
Crop Right: 120
Everything under the "Video Filter" tab set to "Off"
Video Codec: H.264(x264)
Framerate: same as source
2-Pass Encoding: [Checked]
Turbo first pass: [un-Checked]
Avg Bitrate: 384
Create chapter markers: [un-Checked]
Reference Frames: 2
Mixed References: [un-Checked]
B-Frames: 0
Motion Estimation Method: Uneven Multi-Hexagon
Sub-pixel Motion Estimation: 9
Motion Estimation Range: 16
Analysis: Default
8x8 DCT: [un-Checked]
CABAC Entropy Coding: [un-Checked]
No Fast-P-Skip: [un-Checked]
No DCT-Decimate: [un-Checked]
Deblocking: Default, Default
Psychovisual Rate Distortion: [MAX]
My main problem was that I was trying to output an mp4 file with 480x800 (WxH) dimensions. After changing the width to 320 (higher values didn't work), yet keeping the proper aspect ratio, the output video now plays without errors. I hope this helps someone else with a similar problem.
Side note: I wish the Android video restrictions were better documented.
I have had quite a bit of trouble getting many different videos to play on my phone (HTC Hero). Standard 512K mp4s play (example: http://www.archive.org/details/more_animation); check with those first to make sure it's not your code.
Here's my code, from onCreate() in a sub-activity which only plays the video file:
protected VideoView mine;
protected boolean done = false;

/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.videoshow);
    mine = (VideoView) findViewById(R.id.video); // Save the VideoView for touch event processing
    try {
        String myURI = "/sdcard/" + path + "/v/"
                + currentItem.getFile()
                + "." + currentItem.getFileType();
        Uri video = Uri.parse(myURI);
        mine.setVideoURI(video);
        mine.start();
        mine.setOnCompletionListener(new OnCompletionListener() {
            public void onCompletion(MediaPlayer mp) {
                result.putExtra("com.ejf.convincer01.Finished", true);
                done = true;
            }
        });
    } catch (Exception ex) {
        Log.d(DEBUG_TAG, "Video failed: '" + ex + "'");
    }
}
I was facing this problem until I figured out that the issue was the directory my videos were in. I was saving my videos to a directory that was unreachable by the VideoView, so every time I tried to play a video it gave me an error message saying "Can't open video" or something like that.
Try saving your video to the following directory, which will also be visible in the phone gallery:
String path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES) + "/" + "your app name ";
