I am trying to play encrypted content in a DASH file (.mpd), packaged with Shaka Packager and encrypted with the CENC method. The media plays absolutely fine in Shaka Player, but I cannot get it to play in ExoPlayer on Android. On playback, Logcat shows the following error:
Caused by: android.media.MediaCodec$CryptoException: Crypto key not available
at android.media.MediaCodec.native_queueSecureInputBuffer(Native Method)
at android.media.MediaCodec.queueSecureInputBuffer(MediaCodec.java:2699)
at com.google.android.exoplayer2.mediacodec.MediaCodecRenderer.feedInputBuffer(MediaCodecRenderer.java:1188)
at com.google.android.exoplayer2.mediacodec.MediaCodecRenderer.render(MediaCodecRenderer.java:719)
at com.google.android.exoplayer2.ExoPlayerImplInternal.doSomeWork(ExoPlayerImplInternal.java:599)
at com.google.android.exoplayer2.ExoPlayerImplInternal.handleMessage(ExoPlayerImplInternal.java:329)
at android.os.Handler.dispatchMessage(Handler.java:103)
at android.os.Looper.loop(Looper.java:237)
at android.os.HandlerThread.run(HandlerThread.java:67)
The build.gradle has the minimum SDK set to API 21, so that checks out, and the code used is:
player = new SimpleExoPlayer.Builder(this).build();
ep.setPlayer(player);
DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(this, Util.getUserAgent(this, "AppName"));
Uri uri = Uri.parse(Tools.baseAddress+"VIDEO/"+i.getStringExtra("ModuleID")+"/index.php").buildUpon().appendQueryParameter("token", Tools.token).build();
String keyString = "{\"keys\":[{\"kty\":\"oct\",\"k\":\"76a6c65c5ea762046bd749a2e632ccbb\",\"kid\":\"a7e61c373e219033c21091fa607bf3b8\"}],'type':\"temporary\"}";
LocalMediaDrmCallback drmCallback = new LocalMediaDrmCallback(keyString.getBytes());
DrmSessionManager manager = new DefaultDrmSessionManager.Builder()
        .setPlayClearSamplesWithoutKeys(true)
        .setMultiSession(false)
        .setUuidAndExoMediaDrmProvider(C.CLEARKEY_UUID, FrameworkMediaDrm.DEFAULT_PROVIDER)
        .build(drmCallback);
MediaSource dashMediaSource = new DashMediaSource.Factory(dataSourceFactory).setDrmSessionManager(manager).createMediaSource(uri);
player = new SimpleExoPlayer.Builder(this).build();
ep.setPlayer(player);
player.prepare(dashMediaSource);
And the command used to package the MP4 video with Shaka Packager was:
.\packager input=videoplayback.mp4,stream=video,output=video.mp4 input=videoplayback.mp4,stream=audio,output=audio.mp4 --enable_raw_key_encryption --keys key_id=a7e61c373e219033c21091fa607bf3b8:key=76a6c65c5ea762046bd749a2e632ccbb --clear_lead 0 --mpd_output dash.mpd
I am not sure whether the key is formed correctly or whether the DRM session manager is initialized properly.
I would be really grateful for any help.
Thanks in advance.
The most likely cause is that LocalMediaDrmCallback expects the key and key_id to be base64url encoded.
You can convert your key and key_id to this encoding using an online tool such as:
https://base64.guru/standards/base64url/encode
You can also see a programmatic example in this GitHub issue discussion: https://github.com/google/ExoPlayer/issues/3856#issuecomment-366197586
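If you would rather do the conversion in code, here is a minimal sketch (my own helper, assuming android.util.Base64 is available) that turns the hex key/key_id passed to Shaka Packager into the unpadded base64url form the ClearKey JSON response expects:

import android.util.Base64;

public final class ClearKeyUtil {

    // Converts a hex string (e.g. "a7e61c373e219033c21091fa607bf3b8") into the
    // unpadded, URL-safe base64 form required by the ClearKey "k"/"kid" fields.
    public static String hexToBase64Url(String hex) {
        byte[] bytes = new byte[hex.length() / 2];
        for (int i = 0; i < bytes.length; i++) {
            bytes[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return Base64.encodeToString(bytes, Base64.URL_SAFE | Base64.NO_PADDING | Base64.NO_WRAP);
    }
}

With the kid from the question, hexToBase64Url("a7e61c373e219033c21091fa607bf3b8") should yield "p-YcNz4hkDPCEJH6YHvzuA", which is the form to put into the ClearKey response instead of the raw hex string.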
I spent a long time researching the "Crypto key not available" exception.
I found that it can be caused by several different problems:
The MediaDrmCallback is wrong. When using the ClearKey system, use LocalMediaDrmCallback (or a subclass); it must not go over the network.
The key response (kid and k) is not properly base64url encoded: it must not contain '/', '=', '\n' or '+'.
The media file was encrypted with a clear lead. My video was packaged with a 30 s clear lead and the exception always occurred; I think the Android DRM session or its keys time out in memory.
The DrmSessionManager is misconfigured. Pay particular attention to setMultiSession, which can break your setup.
I fixed the exception by:
Calling setMultiSession(true); true means one request returns one key. Don't use false.
Replacing the callback with my own MediaDrmCallback that returns one key per request, backed by a Map<String, String> (a sketch follows at the end of this answer).
Reasons:
Some devices or DRM session implementations have a timeout, I think. If your video has a clear lead, the player plays clear content and then encrypted content; when playback starts and the first encrypted samples are reached, the DRM session fetches keys twice, so the key obtained first may no longer be usable when the video actually needs it.
If your DASH manifest contains several streams with different keys (audio, HD, SD and so on), then as the network speed goes up or down the played track changes and the DRM session has to decrypt with another key, but after the switch it may not find the right one.
When I used setMultiSession(false), which means one request returns all keys, together with a simple LocalMediaDrmCallback(responseJson), playback would sometimes fail and sometimes work; I think the device sometimes loads the JSON and picks the first key, and other times ends up with the wrong key.
I have written some code and notes about this issue on my website: https://blackfire.mobi (in Chinese); see it for the fix.
The "Crypto key not available" exception is nasty to debug, which is why I wrote this answer for you.
I have an encrypted (AES-128) .m3u8 stream in my Android app and I want to play it in ExoPlayer. I have two variables for this:
val secretKey = "a4cd9995a1aa91e1"
val initVector = "0000000000000000"
As I read in the docs, I need to add URI and IV parameters to the source file. After adding them, I have the following:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-DISCONTINUITY
#EXT-X-KEY:METHOD=AES-128,URI="data:text/plain;charset=utf-8,a4cd9995a1aa91e1",IV=0x30303030303030303030303030303030
#EXTINF:6.0,
media_b525000_0.ts
#EXTINF:6.0,
media_b525000_1.ts
#EXTINF:6.0,
media_b525000_2.ts
*other .ts segments...*
where I added two lines: #EXT-X-DISCONTINUITY and #EXT-X-KEY. But the player doesn't play the stream, and I get the following exception:
com.google.android.exoplayer2.upstream.HttpDataSource$HttpDataSourceException: Malformed URL
What did I do wrong? And how can I decrypt the stream when I have the secretKey and initVector?
ExoPlayer assumes that whatever follows URI in the m3u8 file is an actual URL; when it tries to load the encryption key from the URL below, which is invalid, the above exception is thrown by the OkHttpDataSource class.
URI="data:text/plain;charset=utf-8,a4cd9995a1aa91e1"
This problem can be solved in two ways.
1. Place the encryption key on a server and use a URL to fetch it.
2. Implement custom data source classes: intercept the calls and modify the request/response objects as needed.
I faced a similar issue, but in my case I had a custom scheme instead of http. The default OkHttpDataSource does not handle custom URL schemes, so I had to write my own data source and intercept the requests; a sketch of that approach follows below.
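To illustrate the second approach, here is a rough sketch of a DataSource that serves the 16-byte AES key from memory whenever the key URI does not use an http scheme (such as the data: URI from the question) and delegates everything else to an upstream HTTP source. The class and field names are mine, and the DataSource interface has changed slightly across ExoPlayer releases, so treat it as a starting point rather than a drop-in implementation:

import android.net.Uri;
import com.google.android.exoplayer2.upstream.ByteArrayDataSource;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DataSpec;
import com.google.android.exoplayer2.upstream.TransferListener;
import java.io.IOException;

public final class KeyInterceptingDataSource implements DataSource {

    private final DataSource upstream; // e.g. a DefaultHttpDataSource
    private final byte[] keyBytes;     // the 16-byte AES-128 key
    private DataSource active;

    public KeyInterceptingDataSource(DataSource upstream, byte[] keyBytes) {
        this.upstream = upstream;
        this.keyBytes = keyBytes;
    }

    @Override
    public void addTransferListener(TransferListener transferListener) {
        // Forwarded only to the upstream source to keep the sketch simple.
        upstream.addTransferListener(transferListener);
    }

    @Override
    public long open(DataSpec dataSpec) throws IOException {
        // Serve the key ourselves when the playlist's URI attribute points at it.
        String scheme = dataSpec.uri.getScheme();
        if (!"http".equals(scheme) && !"https".equals(scheme)) {
            active = new ByteArrayDataSource(keyBytes);
        } else {
            active = upstream;
        }
        return active.open(dataSpec);
    }

    @Override
    public int read(byte[] buffer, int offset, int readLength) throws IOException {
        return active.read(buffer, offset, readLength);
    }

    @Override
    public Uri getUri() {
        return active == null ? null : active.getUri();
    }

    @Override
    public void close() throws IOException {
        if (active != null) {
            active.close();
        }
    }
}

Wrapping this in a DataSource.Factory and handing that factory to HlsMediaSource.Factory routes the playlist, the key and the segments through it.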
I do not know if this is a bug in Android or if I am missing something here, but I can never get the actual language code of a given audio track with this MediaPlayer method:
MediaPlayer.TrackInfo[] mediaplayerTrackInfoArray = myMediaPlayer.getTrackInfo();
String a_string = mediaplayerTrackInfoArray[index].getLanguage(); // index must indicate the position of an actual audio track
All I ever get is the string "und", which supposedly means undetermined.
I know that the video file being analyzed does contain the audio tracks with their respective language codes, for example: "eng" or "rus", because they show up correctly in an external media player like VLC.
Is there a fix to this bug? Or a substitute method or object?
I'm trying to implement Google Speech API in Android by following this demo: https://github.com/GoogleCloudPlatform/android-docs-samples
I was able to successfully reproduce the example in my app by using the given "audio.raw" file located in R.raw, and everything works perfectly. However, when I try to use my own audio files, it returns "API successful" without any transcription text. I'm not sure if it has to do with the files' path or the encoding, so I'll include information on both just in case.
Encoding
My audio files are obtained by recording a voice through MediaRecorder. These are the settings:
myAudioRecorder = new MediaRecorder();
myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myAudioRecorder.setAudioEncoder(MediaRecorder.OutputFormat.AMR_WB);
myAudioRecorder.setAudioSamplingRate(16000);
myAudioRecorder.setAudioEncodingBitRate(16000);
myAudioRecorder.setAudioChannels(1);
myAudioRecorder.setOutputFile(outputFile);
SpeechService's recognizeInputStream() function in the API:
mApi.recognize(
RecognizeRequest.newBuilder()
.setConfig(RecognitionConfig.newBuilder()
.setEncoding(RecognitionConfig.AudioEncoding.AMR_WB) //originally it was LINEAR16
.setLanguageCode("en-US")
.setSampleRateHertz(16000)
.build())
.setAudio(RecognitionAudio.newBuilder()
.setContent(ByteString.readFrom(stream))
.build())
.build(),
mFileResponseObserver);
Encoding guidelines by Google: https://cloud.google.com/speech/docs/best-practices
From what I understand, I can use AMR_WB and 16kHz instead of the default LINEAR16, I'm just not sure if I'm doing it right.
Path
This is the example that is fully working (with the audio file from the repo):
mSpeechService.recognizeInputStream(getResources().openRawResource(R.raw.audio));
However, none of the following options work, even with the exact same file:
InputStream inputStream = new URL("[website]/test/audio.raw").openStream();
mSpeechService.recognizeInputStream(inputStream);
Neither:
Uri uri = Uri.parse("android.resource://[package]/raw/audio");
InputStream inputStream = getActivity().getContentResolver().openInputStream(uri); //"getActivity()" because this is in a Fragment
mSpeechService.recognizeInputStream(inputStream);
To be clear, the result on the above paths is the same as on my custom audio files: "API successful" with no transcription. One of the options I have tried for my custom audio files, with the same thing happening, is this:
FileInputStream fis = new FileInputStream(filePath);
mSpeechService.recognizeInputStream(fis);
The only reason I'm not 100% sure the problem is in the path is because if the API is returning with success, then the file was found in the specified path. The problem should be the encoding, but then it's weird that the same file ("audio.raw") sent in different ways produces different results.
Anyway, thank you in advance! :)
EDIT:
To be clear, it's not that it returns an empty string in the transcription. It just never enters the "onSpeechRecognized" function that also exists in the demo, so no transcription is given.
I'm using mp4parser to mux an H.264 and an AAC file that were re-encoded from the original video file. How can I write the metadata of the original video to the new MP4 file? Or is there a common method for writing metadata to an MP4 file?
Metadata and MP4 is a real problem. There is no generally supported specification, but that is only one part of the issue.
Problem (1): when to write the metadata
Problem (2): what to write
Problem (1) is relatively easy to solve: just extend DefaultMp4Builder or FragmentedMp4Builder yourself and override
protected ParsableBox createUdta(Movie movie) {
    return null;
}
with something meaningful. E.g.:
protected ParsableBox createUdta(Movie movie) {
    UserDataBox udta = new UserDataBox();
    CopyrightBox copyrightBox = new CopyrightBox();
    copyrightBox.setCopyright("All Rights Reserved, me, myself and I, 2015");
    copyrightBox.setLanguage("eng");
    udta.addBox(copyrightBox);
    return udta;
}
Some people have used that to write Apple-compatible metadata, but even though there are some classes for it in my code, I never really figured out what works and what doesn't. You might want to have a look at Apple's specification here.
And yes: I'm posting this a year too late.
It seems that the 'mp4parser' library (https://code.google.com/p/mp4parser/) supports writing metadata to mp4 files in Android. However, I've found there's little to no documentation on how to do this, beyond a few examples in their codebase. I've had some luck with the following example, which writes XML metadata into the 'moov/udta/meta' box:
https://github.com/copiousfreetime/mp4parser/blob/master/examples/src/main/java/com/googlecode/mp4parser/stuff/ChangeMetaData.java
If you are considering alternatives, you might want to look at JCodec for this purpose. It now has the org.jcodec.movtool.MetadataEditor API (and a matching CLI, org.jcodec.movtool.MetadataEditorMain).
Their documentation contains many samples: http://jcodec.org/docs/working_with_mp4_metadata.html
So basically, when you want to add some metadata you need to know what key(s) it corresponds to. One way to find out is to inspect a sample file that already has the metadata you need. For this you can run JCodec's CLI tool, which will print out all the existing metadata fields (keys with values):
./metaedit <file.mp4>
Then when you know the key you want to work with you can either use the same CLI tool:
# Changes the author of the movie
./metaedit -f -si ©ART=New\ value file.mov
or the same thing via the Java API:
MetadataEditor mediaMeta = MetadataEditor.createFrom(new File("file.mp4"));
Map<Integer, MetaValue> meta = mediaMeta.getItunesMeta();
meta.put(0xa9415254, MetaValue.createString("New value")); // fourcc for '©ART'
mediaMeta.save(false); // fast mode is off
To delete a metadata field from a file:
MetadataEditor mediaMeta = MetadataEditor.createFrom(new File("file.mp4"));
Map<Integer, MetaValue> meta = mediaMeta.getItunesMeta();
meta.remove(0xa9415254); // removes the '©ART'
mediaMeta.save(false); // fast mode is off
To convert string to integer fourcc you can use something like:
byte[] bytes = "©ART".getBytes("iso8859-1");
int fourcc = ByteBuffer.wrap(bytes).order(ByteOrder.BIG_ENDIAN).getInt();
If you want to edit/delete the Android metadata, you'll need to use a different set of functions (because it's stored differently from iTunes metadata):
./metaedit -sk com.android.capture.fps,float=25.0 file.mp4
Or, alternatively, the same thing through the API:
MetadataEditor mediaMeta = MetadataEditor.createFrom(new File("file.mp4"));
Map<String, MetaValue> meta = mediaMeta.getKeyedMeta();
meta.put("com.android.capture.fps", MetaValue.createFloat(25.));
mediaMeta.save(false); // fast mode is off
I have the URL of a remote audio file. I need to build data for an adapter list with track details. Here is that part of the code:
Log.d("audioURL", audio.getUrl());
MediaPlayer tmpMedia;
tmpMedia = MediaPlayer.create(getContext(), Uri.parse(audio.getUrl()));
holder.txtDuration.setDuration(tmpMedia.getDuration()/1000);
tmpMedia.release();
But it works too slowly. LogCat writes something like this:
15:05:51.783: D/audioURL(776): http://cs4859.vk.me/u14195999/audios/0cbd695ddf50.mp3
15:05:51.783: D/MediaPlayer(776): Couldn't open file on client side, trying server side
15:05:53.813: D/audioURL(776): http://cs4859.vk.me/u14195999/audios/0cbd695ddf50.mp3
15:05:53.823: D/MediaPlayer(776): Couldn't open file on client side, trying server side
15:05:55.373: D/audioURL(776): http://cs4859.vk.me/u14195999/audios/0cbd695ddf50.mp3
15:05:55.383: D/MediaPlayer(776): Couldn't open file on client side, trying server side
15:05:58.143: D/audioURL(776): http://cs1626.vk.me/u149968/audios/04298447cd3c.mp3
15:05:58.153: D/MediaPlayer(776): Couldn't open file on client side, trying server side
...and so on. So my playlist of about 30 tracks takes about 7 minutes to initialize.
I guess the MediaPlayer method getDuration() sequentially downloads these tracks (or some parts of them) to get their durations.
Is there a way to get these durations quickly, without downloading tracks?
Halim Qarroum, that seems to be the correct way, but I am having some trouble with the MediaMetadataRetriever class.
Here is my code:
if (android.os.Build.VERSION.SDK_INT < 10) {
    holder.txtDuration.setDuration(audio.getTrackDuration());
} else {
    MediaMetadataRetriever mRetriever = new MediaMetadataRetriever();
    Log.d("URI", Uri.parse(audio.getUrl()).toString());
    mRetriever.setDataSource(getContext(), Uri.parse(audio.getUrl()));
    String s = mRetriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
    holder.txtDuration.setDuration(Long.parseLong(s));
    mRetriever.release();
}
The application terminates in mRetriever.setDataSource(getContext(), Uri.parse(audio.getUrl())) because of an IllegalArgumentException. The audio.getUrl() string is http://cs4859.vk.me/u14195999/audios/134dfe90d1ec.mp3.
Why does the exception occur?
Dheeb posted a well-detailed answer. However, ID3 tags are not always present in an mp3 file. Instead of looking for these tags, which would force you to limit this method to mp3 files, you could use the MediaMetadataRetriever class that comes with the Android framework.
This class can give you several pieces of metadata from certain types of audio/video files, one of which is the duration. This approach has the advantage of being standard, as it comes with the Android SDK and is not limited to one audio format.
From the related Android developers page:
The MediaMetadataRetriever class provides a unified interface for retrieving frame and meta data from an input media file.
A trivial example of code using this class:
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(your_data_source);
String time = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
long timeInmillisec = Long.parseLong( time );
long duration = timeInmillisec / 1000;
long hours = duration / 3600;
long minutes = (duration - hours * 3600) / 60;
long seconds = duration - (hours * 3600 + minutes * 60);
There was a bug related to MediaMetadataRetriever.
You could try:
metaRetriever.setDataSource("<remoteUrl>", new HashMap<String, String>());
I'll assume mp3 since "Audio File" is a blanket phrase.
Method 1: fetch ID3 tag
Variant 1: 3rd party library
You will need to look at the ID3 tags in the mp3 file.
Unless you keep track of the metadata you want somewhere else.
To get the track length of the file specifically, you will need to look into the ID3 metadata tag, in particular the 'TLEN' (length) frame of the tag.
To download only the ID3 tag part, you must first download the ID3 header of the file.
This website contains very specific information about the ID3 tag format. You will need to look at the version number of the ID3 tag and then, based on that, find out how long the ID3 tag is. Then you must download the WHOLE tag, because the frames are not in any specific order.
Then you should be able to use a third-party library to find the TLEN frame and its data.
Variant 2: HTTP Hack
For ID3v2 tags, grab the start of the file. (It's possible for ID3v2 frames to be elsewhere, but in practice they're always there.) You can't tell how long the tag is going to be in advance. For text-only tags you're likely to find the information you want in the first 512-1024 bytes. Unfortunately more and more MP3s have embedded ‘album art’ pictures, which can be much longer; try to pick an ID3 library that will gracefully ignore truncated ID3 information.
ID3v1 tags are located at the end of the file. Again you can't tell how long they're going to be. And of course you don't know in advance whether the file has ID3v1 tags, ID3v2 tags, both or neither. Generally these days ID3v2 is a better bet though.
To read part of a file through HTTP you need the Range header. This too is not supported everywhere.
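As a rough sketch of that hack (plain HttpURLConnection; the helper name is mine, and it assumes the server honours Range), you can fetch just the 10-byte ID3v2 header and read the declared tag size from it:

import java.io.DataInputStream;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public final class Id3HeaderFetcher {

    // Returns the declared ID3v2 tag size in bytes, or -1 if no ID3v2 tag is present.
    public static int fetchId3v2TagSize(String mp3Url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(mp3Url).openConnection();
        conn.setRequestProperty("Range", "bytes=0-9"); // just the 10-byte ID3v2 header
        byte[] header = new byte[10];
        try (DataInputStream in = new DataInputStream(conn.getInputStream())) {
            in.readFully(header);
        } finally {
            conn.disconnect();
        }
        if (header[0] != 'I' || header[1] != 'D' || header[2] != '3') {
            return -1;
        }
        // Header layout: "ID3", 2 version bytes, 1 flags byte, then a 4-byte
        // "synchsafe" size where only the low 7 bits of each byte count.
        return ((header[6] & 0x7F) << 21) | ((header[7] & 0x7F) << 14)
                | ((header[8] & 0x7F) << 7) | (header[9] & 0x7F);
    }
}

A second request with Range: bytes=10-(9 + tagSize) would then fetch the whole tag for an ID3 library to parse.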
Method 2: Estimation
File size you can get with an HTTP HEAD request. Duration, meaning playing time in seconds, cannot be obtained without fetching the entire file. You can guess by fetching the first few MP3 frames, looking at their bitrate, and assuming that the rest of the file has the same bitrate, but given the popularity of variable bit-rate encoding, the likelihood that this will be close to accurate is quite low (a rough sketch follows below).
ID3 tags can in theory contain information that might allow you to guess the length better, in the ASPI and ETCO tags. But in practice these are very rarely present.
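A back-of-the-envelope sketch of that estimate (the method name and the bitrateBps parameter are mine; the bitrate would come from parsing the first MPEG frame header, e.g. 128000 for a 128 kbit/s file):

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public final class DurationEstimator {

    // Estimates playing time in seconds from the file size (HTTP HEAD request)
    // and an assumed constant bitrate; with VBR files this can be far off.
    public static long estimateDurationSeconds(String mp3Url, int bitrateBps) throws IOException {
        HttpURLConnection head = (HttpURLConnection) new URL(mp3Url).openConnection();
        head.setRequestMethod("HEAD");
        long fileSizeBytes = head.getContentLength();
        head.disconnect();
        return fileSizeBytes * 8 / bitrateBps;
    }
}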
Credits
Credits go to various authors on SO and the interwebs, and of course the guy on the first floor in my head.