I have an encrypted (AES-128) .m3u8 stream in my Android app and I want to play it in ExoPlayer. I have two variables for this:
val secretKey = "a4cd9995a1aa91e1"
val initVector = "0000000000000000"
As I read in the docs, I need to add the URI and IV parameters to the source file. After adding them, I have the following:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-DISCONTINUITY
#EXT-X-KEY:METHOD=AES-128,URI="data:text/plain;charset=utf-8,a4cd9995a1aa91e1",IV=0x30303030303030303030303030303030
#EXTINF:6.0,
media_b525000_0.ts
#EXTINF:6.0,
media_b525000_1.ts
#EXTINF:6.0,
media_b525000_2.ts
*other .ts segments...*
where I added the two lines #EXT-X-DISCONTINUITY and #EXT-X-KEY. But the player doesn't play the stream, and I get the following exception:
com.google.android.exoplayer2.upstream.HttpDataSource$HttpDataSourceException: Malformed URL
What did I do wrong? And how can I decrypt the stream when I have secretKey and initVector?
ExoPlayer assumes that anything following URI in an m3u8 file is an actual URL. When it tries to load the encryption key using the URL below, which is invalid, the above exception is thrown by the OkHttpDataSource class.
URI="data:text/plain;charset=utf-8,a4cd9995a1aa91e1"
This problem can be solved in two ways.
1. Place the encryption key on a server and use a URL to fetch it.
2. Implement custom data source classes: intercept the calls and modify the request/response objects as needed.
I faced a similar issue, but in my case I had a custom scheme instead of http. The default OkHttpDataSource does not handle custom URL schemes, so I had to write my own data source and intercept the calls.
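To address the second part of the question (decrypting with secretKey and initVector): HLS AES-128 is AES in CBC mode with PKCS7 padding, so once a custom data source has the raw segment bytes, standard javax.crypto classes can decrypt them. A minimal sketch (SegmentDecryptor is a made-up name, not an ExoPlayer API):

```java
import java.io.InputStream;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class SegmentDecryptor {

    // HLS uses AES-128 in CBC mode with PKCS7 padding ("PKCS5Padding" in JCA naming).
    private static final String TRANSFORMATION = "AES/CBC/PKCS5Padding";

    // Wraps an encrypted segment stream in a stream that decrypts on the fly.
    public static InputStream decrypting(InputStream encrypted, byte[] key, byte[] iv)
            throws Exception {
        Cipher cipher = Cipher.getInstance(TRANSFORMATION);
        cipher.init(Cipher.DECRYPT_MODE,
                new SecretKeySpec(key, "AES"),
                new IvParameterSpec(iv));
        return new CipherInputStream(encrypted, cipher);
    }
}
```

In the question both strings are 16 ASCII characters, so secretKey.getBytes() and initVector.getBytes() yield the 16-byte key and IV; note that the IV "0000000000000000" as ASCII bytes is exactly the 0x3030…30 value written into the playlist.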
Related
So I am trying to play encrypted content in a DASH file (.mpd), packaged with Shaka Packager and encrypted with the CENC method. The media plays absolutely fine in Shaka Player, but I am unable to make it play in ExoPlayer on Android. On playing, Logcat shows the following error:
Caused by: android.media.MediaCodec$CryptoException: Crypto key not available
at android.media.MediaCodec.native_queueSecureInputBuffer(Native Method)
at android.media.MediaCodec.queueSecureInputBuffer(MediaCodec.java:2699)
at com.google.android.exoplayer2.mediacodec.MediaCodecRenderer.feedInputBuffer(MediaCodecRenderer.java:1188)
at com.google.android.exoplayer2.mediacodec.MediaCodecRenderer.render(MediaCodecRenderer.java:719)
at com.google.android.exoplayer2.ExoPlayerImplInternal.doSomeWork(ExoPlayerImplInternal.java:599)
at com.google.android.exoplayer2.ExoPlayerImplInternal.handleMessage(ExoPlayerImplInternal.java:329)
at android.os.Handler.dispatchMessage(Handler.java:103)
at android.os.Looper.loop(Looper.java:237)
at android.os.HandlerThread.run(HandlerThread.java:67)
The build.gradle sets the minimum SDK to API 21, so that checks out, and the code used is:
player = new SimpleExoPlayer.Builder(this).build();
ep.setPlayer(player);

DataSource.Factory dataSourceFactory =
        new DefaultDataSourceFactory(this, Util.getUserAgent(this, "AppName"));
Uri uri = Uri.parse(Tools.baseAddress + "VIDEO/" + i.getStringExtra("ModuleID") + "/index.php")
        .buildUpon()
        .appendQueryParameter("token", Tools.token)
        .build();

String keyString = "{\"keys\":[{\"kty\":\"oct\",\"k\":\"76a6c65c5ea762046bd749a2e632ccbb\",\"kid\":\"a7e61c373e219033c21091fa607bf3b8\"}],\"type\":\"temporary\"}";
LocalMediaDrmCallback drmCallback = new LocalMediaDrmCallback(keyString.getBytes());

DrmSessionManager manager = new DefaultDrmSessionManager.Builder()
        .setPlayClearSamplesWithoutKeys(true)
        .setMultiSession(false)
        .setUuidAndExoMediaDrmProvider(C.CLEARKEY_UUID, FrameworkMediaDrm.DEFAULT_PROVIDER)
        .build(drmCallback);

MediaSource dashMediaSource = new DashMediaSource.Factory(dataSourceFactory)
        .setDrmSessionManager(manager)
        .createMediaSource(uri);

player.prepare(dashMediaSource);
And the command used to package the MP4 video with Shaka Packager was:
.\packager input=videoplayback.mp4,stream=video,output=video.mp4 input=videoplayback.mp4,stream=audio,output=audio.mp4 --enable_raw_key_encryption --keys key_id=a7e61c373e219033c21091fa607bf3b8:key=76a6c65c5ea762046bd749a2e632ccbb --clear_lead 0 --mpd_output dash.mpd
I am not sure if the key formation is correct or the DRM Session Manager is properly initialized.
I would be really grateful for any help.
Thanks in advance.
The most likely cause is that LocalMediaDrmCallback expects the key and key_id to be base64url-encoded.
You can convert your key and key_id using an online tool such as:
https://base64.guru/standards/base64url/encode
You can also see a programmatic example in this GitHub issue discussion: https://github.com/google/ExoPlayer/issues/3856#issuecomment-366197586
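To sketch the conversion in plain Java (ClearKeyUtil.hexToBase64Url is a hypothetical helper): the hex key/key_id must first be decoded to its 16 raw bytes, and the result base64url-encoded without padding.

```java
import java.util.Base64;

public class ClearKeyUtil {

    // Converts a 32-char hex string (16 bytes) to base64url without padding,
    // the format LocalMediaDrmCallback expects for the "k" and "kid" values.
    public static String hexToBase64Url(String hex) {
        byte[] bytes = new byte[hex.length() / 2];
        for (int i = 0; i < bytes.length; i++) {
            bytes[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }
}
```

For the question's key, hexToBase64Url("76a6c65c5ea762046bd749a2e632ccbb") produces a 22-character base64url string to put in the keys JSON instead of the hex.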
I spent a long time researching the "Crypto key not available" exception.
I found that it happens when some of the following things are wrong:
1. The MediaDrmCallback is bad. When using the ClearKey system, use LocalMediaDrmCallback or a subclass; it cannot go over the network.
2. The key response (kid & k) is base64url-encoded incorrectly; it must not include '/', '=', '\n', or '+'.
3. The media file is encrypted with a clear-lead time. My video's parameter was set to 30 s, so the exception always happened; I think the Android DRM session or its keys time out in memory.
4. The DrmSessionManager is configured badly. Pay attention to setMultiSession; it can break your setup.
I fixed this exception by:
1. Calling setMultiSession(true); true means one request returns one key. Don't use false.
2. Replacing the MediaDrmCallback with my own implementation that returns one key per request, backed by a Map<String, String>.
Reasons:
1. Some devices or DRM session implementations seem to have a timeout. If your video has a clear-lead time, the player first plays clear content and then encrypted content. When playback starts and the first encrypted content is reached, the DRM session fetches the key twice, so the first key obtained can no longer be used.
2. When your DASH manifest contains several streams with different keys (audio, HD, SD, or others) and the network speed drops or rises, the stream being played changes and the DRM session has to decrypt with another key; I think it sometimes cannot find the right one.
3. When I used setMultiSession(false), which means "one request returns all keys", with a simple LocalMediaDrmCallback(responseJson), my video would sometimes play fine and sometimes fail; I think the device sometimes loads the JSON and finds the first key, and other times a bad key.
I have written some code and more information about this issue on my website: https://blackfire.mobi (Chinese); see it to fix this.
The "Crypto key not available" exception is really nasty, so I wrote this reply for you.
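As a sketch of the "one request returns one key" callback described above, the response-building part could look like this (ClearKeyResponse and buildResponse are made-up names; the kid and k values must already be base64url-encoded):

```java
import java.util.Map;

public class ClearKeyResponse {

    // Builds a ClearKey license response containing exactly one key,
    // as a custom MediaDrmCallback.executeKeyRequest() could return it.
    public static String buildResponse(String kid, Map<String, String> keysByKid) {
        String k = keysByKid.get(kid);
        if (k == null) {
            throw new IllegalArgumentException("No key for kid " + kid);
        }
        return "{\"keys\":[{\"kty\":\"oct\",\"kid\":\"" + kid + "\",\"k\":\"" + k
                + "\"}],\"type\":\"temporary\"}";
    }
}
```

The custom callback would parse the requested kid out of the key request, look it up in the map, and return buildResponse(kid, keysByKid).getBytes(), so the session always receives exactly one matching key.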
I'm trying to implement Google Speech API in Android by following this demo: https://github.com/GoogleCloudPlatform/android-docs-samples
I was able to successfully reproduce the example in my app by using the given "audio.raw" file located in R.raw, and everything works perfectly. However, when I try to use my own audio files, it returns "API successful" without any transcription text. I'm not sure if it has to do with the files' path or the encoding, so I'll include information on both just in case.
Encoding
My audio files are obtained by recording a voice through MediaRecorder. These are the settings:
myAudioRecorder = new MediaRecorder();
myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myAudioRecorder.setAudioEncoder(MediaRecorder.OutputFormat.AMR_WB);
myAudioRecorder.setAudioSamplingRate(16000);
myAudioRecorder.setAudioEncodingBitRate(16000);
myAudioRecorder.setAudioChannels(1);
myAudioRecorder.setOutputFile(outputFile);
SpeechService's recognizeInputStream() function in the API:
mApi.recognize(
        RecognizeRequest.newBuilder()
                .setConfig(RecognitionConfig.newBuilder()
                        .setEncoding(RecognitionConfig.AudioEncoding.AMR_WB) // originally LINEAR16
                        .setLanguageCode("en-US")
                        .setSampleRateHertz(16000)
                        .build())
                .setAudio(RecognitionAudio.newBuilder()
                        .setContent(ByteString.readFrom(stream))
                        .build())
                .build(),
        mFileResponseObserver);
Encoding guidelines by Google: https://cloud.google.com/speech/docs/best-practices
From what I understand, I can use AMR_WB at 16 kHz instead of the default LINEAR16; I'm just not sure if I'm doing it right.
Path
This is the example that is fully working (with the audio file from the repo):
mSpeechService.recognizeInputStream(getResources().openRawResource(R.raw.audio));
However, none of the following options work, even with the exact same file:
InputStream inputStream = new URL("[website]/test/audio.raw").openStream();
mSpeechService.recognizeInputStream(inputStream);
Nor does this:
Uri uri = Uri.parse("android.resource://[package]/raw/audio");
InputStream inputStream = getActivity().getContentResolver().openInputStream(uri); //"getActivity()" because this is in a Fragment
mSpeechService.recognizeInputStream(inputStream);
To be clear, the result on the above paths is the same as on my custom audio files: "API successful" with no transcription. One of the options I have tried for my custom audio files, with the same thing happening, is this:
FileInputStream fis = new FileInputStream(filePath);
mSpeechService.recognizeInputStream(fis);
The only reason I'm not 100% sure the problem is in the path is that if the API returns success, the file was found at the specified path. The problem should then be the encoding, but it's weird that the same file ("audio.raw") sent in different ways produces different results.
Anyway, thank you in advance! :)
EDIT:
To be clear, it's not that it returns an empty string in the transcription. It just never enters the "onSpeechRecognized" function that also exists in the demo, so no transcription is given.
If I were to stream some sort of media to a MediaPlayer, is there any way I could copy it before/as/after it is played? For instance, if I were to stream a YouTube clip, is it possible to save that clip as it is being played?
Edit:
(Ocelot's answer made me realise how localised this question is).
What I am looking to do is copy the stream of a MediaPlayer already in progress (be it a YouTube or music stream). I want to be notified when a new stream starts and ends. So far the only thing I found (for the latter) that is even remotely close is the broadcast string ACTION_AUDIO_BECOMING_NOISY, but that doesn't really do anything for what I need. Is there any way to do this?
I haven't tested this, and it looks like quite a bit of work, but here is what I would try:
Create a subclass of Socket. In this class, you can handle all byte reads, and save the stream locally or do whatever you want with it
Create your own content provider, which you can use to pass URIs to your media player, in your own format. Example: mystream://youtube.com/watch?v=3Rhy37u
In your content provider, override the openFile method and in it, open your own socket, and create a ParcelFileDescriptor with it.
Now, simply passing the new format url to your mediaplayer should make all streams go through your Socket, where you can save your data.
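As a plain-Java sketch of the "save the bytes as they stream through" part (TeeInputStream is a hypothetical class, not an Android API): wrap the stream your socket or content provider hands to the player so that every byte read is also written to a side stream, e.g. a FileOutputStream.

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Copies everything read from the wrapped stream into a side channel,
// so the media data can be saved while it is being consumed.
public class TeeInputStream extends FilterInputStream {
    private final OutputStream copy;

    public TeeInputStream(InputStream in, OutputStream copy) {
        super(in);
        this.copy = copy;
    }

    @Override
    public int read() throws IOException {
        int b = in.read();
        if (b != -1) copy.write(b);
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = in.read(buf, off, len);
        if (n > 0) copy.write(buf, off, n);
        return n;
    }
}
```

The stream your subclassed Socket exposes (or the pipe behind the ParcelFileDescriptor) would be wrapped in this before the bytes reach the player.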
One way is to first find out where the video is on the server. For example, on YouTube, with a simple regex like this:
Regex("(?<=&t=)[^&]*").Match(file).Value;
you can retrieve the URL of the video and then download it like this:
public static void Download(string videoID, string newFilePath)
{
    WebClient wc = new WebClient();
    string file = wc.DownloadString(string.Format("http://www.youtube.com/watch?v={0}", videoID));
    string t = new Regex("(?<=&t=)[^&]*").Match(file).Value;
    wc.DownloadFile(string.Format("http://www.youtube.com/get_video?t={0}=&video_id={1}", t, videoID), newFilePath);
}
It's C# code, but you could easily convert it to Java.
@zrgiu
I tried to go with this solution, but the MediaPlayer retrieves a FileDescriptor from the URI, so sadly no HTTP URL can be passed like this.
I also found another solution: it suggests creating a local proxy server on the device to serve files from the internet; it should be possible to also save the files streamed through the proxy.
I'm attempting to use the gdata project from an Android app. I'm trying to upload a new CSV file to Google Docs, but I keep encountering a 411 (Content-Length) error.
My code looks something like:
GoogleService ss = new SpreadsheetService("testApp");
ss.setUserCredentials("<username>", "<password>");
DocumentListEntry newEntry = new DocumentListEntry();
newEntry.setTitle(new PlainTextConstruct("my.csv"));
TextConstruct tc = TextConstruct.plainText("1,2,3,4,5,6,7,8,9,10");
newEntry.setContent(new TextContent(tc));
DocumentListEntry res = ss.insert(new URL("https://docs.google.com/feeds/default/private/full/"), newEntry);
Since the GData lib abstracts the network calls away from me, I assume I don't need to set Content-Length myself, which leads me to believe I'm simply not using the lib correctly.
What am I missing? Thanks.
The content of the file should not be set in the metadata, but sent along with the metadata using an HTTP multipart request.
The client library takes care of doing that and you can set the content using:
newEntry.setFile(/* java.io.File instance */ file, /* MIME type */ "text/csv");
This requires loading the data from a file, but you can use streams to load it from memory.
A more detailed example can be found in the Java client library project page.
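Since setFile expects a java.io.File, in-memory content such as the CSV row from the question has to be bridged through a file first. A small sketch of that step only (CsvTempFile is a made-up helper; the setFile/insert calls stay as described above):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.nio.charset.StandardCharsets;

public class CsvTempFile {

    // Writes an in-memory CSV string to a temp file so it can be passed
    // to DocumentListEntry.setFile(file, "text/csv").
    public static File write(String csv) throws Exception {
        File file = File.createTempFile("upload", ".csv");
        file.deleteOnExit();
        try (FileOutputStream out = new FileOutputStream(file)) {
            out.write(csv.getBytes(StandardCharsets.UTF_8));
        }
        return file;
    }
}
```

For the question's data, newEntry.setFile(CsvTempFile.write("1,2,3,4,5,6,7,8,9,10"), "text/csv") would then replace the setContent call.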
Scenario:
Have encrypted mp3 files in my .apk. Need to decrypt and send to MediaPlayer object.
Problem:
After I read the files and decrypt them, how do I get MediaPlayer to play them ?
Now, MediaPlayer has four versions of setDataSource():
setDataSource(String path)
setDataSource(FileDescriptor fd)
setDataSource(FileDescriptor fd, long offset, long length)
setDataSource(Context context, Uri uri)
None of these is ideal for the situation. I guess the ideal would be to give MediaPlayer an InputStream?
Possible solutions:
1. Write the decrypted data to a file and play that file. A lot of I/O overhead.
2. Create a dummy HTTP server (ServerSocket?) and pass the URL to MediaPlayer. Again, messy. Am I even allowed to create a socket?
Does anyone have a better solution ?
byte[] callData = ...;
String base64EncodedString = Base64.encodeToString(callData, Base64.DEFAULT);
try {
    // Pass the audio as a data: URI; MediaPlayer.setDataSource(String) accepts it.
    String url = "data:audio/amr;base64," + base64EncodedString;
    MediaPlayer mediaPlayer = new MediaPlayer();
    mediaPlayer.setDataSource(url);
    mediaPlayer.prepare();
    mediaPlayer.start();
} catch (Exception ex) {
    ...
}
If you don't need all the functionality of MediaPlayer, I recommend trying AudioTrack. It's meant for basically what you describe. Unfortunately, MediaPlayer doesn't take an AudioTrack in its constructor, so in that case the best solution is to include a dummy HTTP server that sends your data back from a URL (which is what the Android 1.0 release notes recommend).
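A bare-bones sketch of such a dummy HTTP server (LoopbackAudioServer is a made-up name; a real one would need HTTP range support for seeking and handling of multiple connections): it binds to the loopback interface and answers one GET with the already-decrypted bytes, so the player can be pointed at http://127.0.0.1:<port>/.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class LoopbackAudioServer extends Thread {
    private final ServerSocket server;
    private final byte[] audio; // already-decrypted media bytes

    public LoopbackAudioServer(byte[] audio) throws Exception {
        // Port 0 lets the OS pick a free port; bind to loopback only.
        this.server = new ServerSocket(0, 1, InetAddress.getByName("127.0.0.1"));
        this.audio = audio;
        setDaemon(true);
    }

    public String url() {
        return "http://127.0.0.1:" + server.getLocalPort() + "/";
    }

    @Override
    public void run() {
        try (Socket client = server.accept()) {
            // Drain the request line and headers (up to the blank line).
            InputStream in = client.getInputStream();
            int state = 0;
            int b;
            while (state < 4 && (b = in.read()) != -1) {
                state = (b == '\r' || b == '\n') ? state + 1 : 0;
            }
            // Send one fixed 200 response carrying the whole payload.
            OutputStream out = client.getOutputStream();
            out.write(("HTTP/1.1 200 OK\r\nContent-Length: " + audio.length
                    + "\r\nContent-Type: audio/mpeg\r\nConnection: close\r\n\r\n")
                    .getBytes(StandardCharsets.UTF_8));
            out.write(audio);
            out.flush();
        } catch (Exception ignored) {
        }
    }
}
```

Usage would be along the lines of: start the thread, then mediaPlayer.setDataSource(server.url()).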
I'm not 100% sure, but I don't think you have any other option than to save the decrypted file temporarily before playing it.
This question is kind of similar, but I don't think you can use the easy solution suggested there, since you have an encrypted file. It also links to a tutorial on custom audio streaming with MediaPlayer, but it seems their solution also uses a temporary file.