Asynchronous MediaCodec not receiving BUFFER_FLAG_END_OF_STREAM in callback - android

I've been implementing a video encoder which takes raw RGB frame data and encodes/muxes it into an H.264 video.
Initially I was using a synchronous implementation with a while loop, based on the examples found at https://bigflake.com/mediacodec/, which worked fine.
To improve performance and readability I wanted to switch over to an asynchronous implementation; however, I ran into an issue:
calling signalEndOfInputStream() often does not set the MediaCodec.BUFFER_FLAG_END_OF_STREAM flag on the MediaCodec.BufferInfo delivered to the callback.
I'm not sure when I should be sending that signal (ideally it would be in the finalize function, but when I tried that I never received the BUFFER_FLAG_END_OF_STREAM flag at all).
The encoder API looks as follows:
package com.app.encoder

import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat
import android.media.MediaMuxer
import android.os.Environment
import android.util.Log
import java.io.File
import java.io.IOException
import java.nio.ByteBuffer
import java.util.*

class VideoEncoder(private val width: Int, private val height: Int, private val frameRate: Int, bitRate: Int, private val fileName: String) : MediaCodec.Callback() {
    private val format = MediaFormat.createVideoFormat(MIME_TYPE, width, height)
    private var encoder = MediaCodec.createEncoderByType(MIME_TYPE)
    private var surface: InputSurface
    private lateinit var muxer: MediaMuxer
    private var trackIndex: Int = -1
    private var muxerStarted = false
    private val sync = Object()
    private var encoderDone = false
    private val pendingBuffers: Queue<Pair<Int, MediaCodec.BufferInfo>> = LinkedList()

    companion object {
        const val MIME_TYPE = "video/avc"
        const val IFRAME_INTERVAL = 10
        const val TAG = "VideoEncoder"
    }

    init {
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface)
        format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate)
        format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate)
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL)
        encoder.setCallback(this)
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        surface = InputSurface(encoder.createInputSurface())
        encoder.start()
    }

    /**
     * Prepares the media muxer
     */
    fun init() {
        val path = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES)
        val file = File(path, fileName)
        try {
            muxer = MediaMuxer(file.path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
        } catch (ioe: IOException) {
            throw RuntimeException("Unable to create MediaMuxer", ioe)
        }
    }

    override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
        return // Unused, frames are fed through the input surface
    }

    /**
     * Starts the MediaMuxer and processes the queue (if any)
     */
    override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
        Log.d(TAG, "onOutputFormatChanged")
        trackIndex = muxer.addTrack(format)
        muxer.start()
        muxerStarted = true
        Log.d(TAG, "MediaMuxer started")
        val queueIterator = pendingBuffers.iterator()
        while (queueIterator.hasNext()) {
            val p = queueIterator.next()
            mux(p.first, p.second)
            queueIterator.remove()
        }
    }

    override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
        mux(index, info)
    }

    /**
     * Pushes encoded data into the muxer, queues it if the muxer was not yet started
     */
    private fun mux(index: Int, info: MediaCodec.BufferInfo) {
        if (!muxerStarted) {
            pendingBuffers.add(Pair(index, info))
            return
        }
        if (info.flags and MediaCodec.BUFFER_FLAG_CODEC_CONFIG != 0) {
            encoder.releaseOutputBuffer(index, false)
            return
        }
        val outputBuffer = encoder.getOutputBuffer(index)!!
        if (info.size != 0) {
            muxer.writeSampleData(trackIndex, outputBuffer, info)
        }
        encoder.releaseOutputBuffer(index, false)
        // This flag is often not set after signalEndOfInputStream(), causing a timeout in finalize()
        if ((info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            synchronized(sync) {
                encoderDone = true
                sync.notifyAll()
            }
        }
    }

    override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {
        // TODO
        Log.d(TAG, "onError")
    }

    /**
     * Pushes a frame into the encoder using a GLES20 texture
     */
    fun addFrame(frameIndex: Int, data: ByteArray, endOfStream: Boolean) {
        if (endOfStream) {
            encoder.signalEndOfInputStream()
        }
        surface.makeCurrent()
        surface.generateSurfaceFrame(width, height, ByteBuffer.wrap(data))
        surface.setPresentationTime(frameIndex, frameRate)
        surface.swapBuffers()
        surface.releaseEGLContext()
    }

    /**
     * Waits for the encoder to finish
     */
    fun finalize() {
        // encoder.signalEndOfInputStream() <- I would prefer to send the signal here, but that does not work at all
        Log.d(TAG, "Finalizing")
        val waitUntil = System.currentTimeMillis() + 10000
        var timedOut = false
        synchronized(sync) {
            while (!encoderDone) {
                try {
                    sync.wait(1000)
                } catch (_: InterruptedException) {
                }
                if (System.currentTimeMillis() > waitUntil) {
                    timedOut = true
                    break
                }
            }
        }
        Log.d(TAG, "Finalized")
        release()
        if (timedOut) {
            throw RuntimeException("Timeout waiting for encoder to complete")
        }
    }

    /**
     * Releases any related objects
     */
    private fun release() {
        encoder.stop()
        encoder.release()
        surface.release()
        if (muxerStarted) {
            muxer.stop()
        }
        muxer.release()
    }
}
I instantiate the encoder, call init(), call addFrame() for all the images, and finally wait for the encoder to finish using finalize().
In the above implementation there is roughly a 50/50 chance that the BUFFER_FLAG_END_OF_STREAM flag gets set, so I'm not sure what I'm doing wrong here.
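For reference, signalEndOfInputStream() is documented as equivalent to queueing an empty input buffer with BUFFER_FLAG_END_OF_STREAM and may only be used with Surface input. In the addFrame() above, the signal is sent before the final frame is rendered and swapped to the input surface, so it races with that frame's submission, which could explain the intermittent flag. A minimal sketch of the reordering (an assumption, not a verified fix):

// Sketch: render the last frame first, then signal end-of-stream
fun addFrame(frameIndex: Int, data: ByteArray, endOfStream: Boolean) {
    surface.makeCurrent()
    surface.generateSurfaceFrame(width, height, ByteBuffer.wrap(data))
    surface.setPresentationTime(frameIndex, frameRate)
    surface.swapBuffers()
    surface.releaseEGLContext()
    if (endOfStream) {
        // EOS is now signaled only after the final frame has been queued
        encoder.signalEndOfInputStream()
    }
}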

Related

Streaming H.264 video with Android MediaCodec when the user must be able to save a frame at original size. Optimized solution?

Firstly, I know using OpenGL ES would be more optimized, but it is not an option here.
So, given that the user must be able to save frames at their original size from an H.264 live stream, there are two scenarios; which would perform better?
Using MediaCodec in asynchronous mode, getting the YUV image and showing that image on an ImageView. (Does it have overhead compared to the second option?)
Using MediaCodec in synchronous mode, setting a TextureView's surface as the MediaCodec's surface, and whenever the user wants a screenshot, using textureView.getBitmap().
A SurfaceView cannot return the (frame) bitmap after rendering because it's an output-only element, so it fails; no argument there.
Code for option 1:
val frame = ...//ByteArray from server
mediaCodec.setCallback(object : MediaCodec.Callback() {
override fun onInputBufferAvailable(
_codec: MediaCodec,
index: Int
) {
try {
val buffer = _codec.getInputBuffer(index)
buffer?.put(frame)
mediaCodec.queueInputBuffer(
index,
0,
frame.size,
0,
0
)
} catch (e: Exception) {
try {
_codec.flush()
} catch (e: Exception) {
}
}
}
override fun onOutputBufferAvailable(
_codec: MediaCodec,
index: Int,
info: MediaCodec.BufferInfo
) {
try {
val outputIndex = index
val image: Image? = _codec.getOutputImage(outputIndex)
if (image == null) {
return
}
val rect = image.cropRect
val yuvImage = YuvImage(
YUV_420_888toNV21(image),
NV21,
rect.width(),
rect.height(),
null
)
val stream = ByteArrayOutputStream()
yuvImage.compressToJpeg(
Rect(0, 0, rect.width(), rect.height()),
100,
stream
)
frameBitmap =
BitmapFactory.decodeByteArray(
stream.toByteArray(),
0,
stream.size()
)
imageView.setImageBitmap(frameBitmap)
_codec.stop()
stream.close()
image.close()
if (outputIndex >= 0) {
_codec.releaseOutputBuffer(outputIndex, false)
}
} catch (e: Exception) {
}
}
override fun onError(
_codec: MediaCodec,
e: MediaCodec.CodecException
) {
}
override fun onOutputFormatChanged(
_codec: MediaCodec,
format: MediaFormat
) {
}
})
try {
mediaCodec.start()
} catch (e: Exception) {
mediaCodec.flush()
}
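The code above calls a YUV_420_888toNV21(image) helper that isn't shown. A common simplified implementation (an assumption: it presumes the interleaved chroma layout with pixelStride == 2 that most devices use, and no row padding; a fully general version must honor each plane's rowStride/pixelStride):

import android.media.Image

// Hypothetical stand-in for the helper referenced above
fun YUV_420_888toNV21(image: Image): ByteArray {
    val yBuffer = image.planes[0].buffer
    val uBuffer = image.planes[1].buffer
    val vBuffer = image.planes[2].buffer
    val ySize = yBuffer.remaining()
    val uSize = uBuffer.remaining()
    val vSize = vBuffer.remaining()
    val nv21 = ByteArray(ySize + uSize + vSize)
    yBuffer.get(nv21, 0, ySize)             // Y plane
    vBuffer.get(nv21, ySize, vSize)         // V first: NV21 is Y + interleaved VU
    uBuffer.get(nv21, ySize + vSize, uSize) // trailing U byte(s)
    return nv21
}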
Code for option 2:
val frame = ...//ByteArray from server
try {
    val index = mediaCodec.dequeueInputBuffer(-1)
    if (index >= 0) {
        val buffer = mediaCodec.getInputBuffer(index)
        buffer?.put(frame)
        mediaCodec.queueInputBuffer(index, 0, frame.size, 0, 0)
        val info = MediaCodec.BufferInfo()
        val outputIndex = mediaCodec.dequeueOutputBuffer(info, 0)
        if (outputIndex >= 0) {
            mediaCodec.releaseOutputBuffer(outputIndex, true)
            lastRenderTime = System.currentTimeMillis()
        }
    }
} catch (e: Exception) {
    // mediaCodec.flush()
}
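For option 2, the screenshot itself would then be taken on demand from the TextureView; a minimal sketch (an assumption: textureView and imageView exist in scope):

import android.graphics.Bitmap

// TextureView.getBitmap() reads back the most recently rendered frame
// at the surface's buffer size.
fun saveCurrentFrame() {
    val bitmap: Bitmap? = textureView.bitmap
    bitmap?.let { imageView.setImageBitmap(it) }
}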

ImageReader's onImageAvailable method isn't called and the preview shows only 8 frames in slow motion and freezes (Camera2)

I noticed strange behavior on a Xiaomi Redmi Note 9 Pro. I have tested the application on hundreds of phones, but this problem appears only on this device, and only when using an ImageReader with the YUV_420_888 format and a 176x144 preview resolution (for example, with 320x240, with JPEG, or without the ImageReader as a capture surface, everything works well). The onImageAvailable method is never called, the preview shows only 8 frames in slow motion and then freezes, and the app slows down. onCaptureCompleted() in CameraCurrentParamsReceiver is also called only 8 times.
I get the smallest resolution by using getMinPreviewSize (176x144 for this Xiaomi phone).
const val PREVIEW_IMAGE_FORMAT = ImageFormat.YUV_420_888
const val IMAGE_READER_MAX_SIMULTANEOUS_IMAGES = 4
val previewCaptureCallback = CameraCurrentParamsReceiver(this)
private fun startPreview(cameraDevice: CameraDevice, cameraProperties: CameraProperties)
{
val imageReader = ImageReader.newInstance(cameraProperties.previewSize.width,
cameraProperties.previewSize.height,
PREVIEW_IMAGE_FORMAT,
IMAGE_READER_MAX_SIMULTANEOUS_IMAGES)
this.imageReader = imageReader
bufferedImageConverter = BufferedImageConverter(cameraProperties.previewSize.width, cameraProperties.previewSize.height)
val previewSurface = previewSurface
val previewSurfaceForCamera =
if (previewSurface != null)
{
if (previewSurface.isValid)
{
previewSurface
}
else
{
Log.w(TAG, "Invalid preview surface - camera preview display is not available")
null
}
}
else
{
null
}
val captureSurfaces = listOfNotNull(imageReader.surface, previewSurfaceForCamera)
cameraDevice.createCaptureSession(
captureSurfaces,
object : CameraCaptureSession.StateCallback()
{
override fun onConfigureFailed(cameraCaptureSession: CameraCaptureSession)
{
Log.e(TAG, "onConfigureFailed() cannot configure camera")
if (isCameraOpened(cameraDevice))
{
shutDown("onConfigureFailed")
}
}
override fun onConfigured(cameraCaptureSession: CameraCaptureSession)
{
Log.d(TAG, "onConfigured()")
if (!isCameraOpened(cameraDevice))
{
cameraCaptureSession.close()
shutDown("onConfigured.isCameraOpened")
return
}
captureSession = cameraCaptureSession
try
{
val request = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
captureSurfaces.forEach { request.addTarget(it) }
CameraPreviewRequestInitializer.initializePreviewRequest(request, cameraProperties, controlParams, isControlParamsStrict)
captureRequestBuilder = request
val previewCallback = PreviewFrameHandler(this@Camera2)
this@Camera2.previewFrameHandler = previewCallback
imageReader.setOnImageAvailableListener(previewCallback, previewCallback.backgroundHandler)
cameraCaptureSession.setRepeatingRequest(request.build(), previewCaptureCallback, null)
}
catch (ex: CameraAccessException)
{
Log.e(TAG, "onConfigured() failed with exception", ex)
shutDown("onConfigured.CameraAccessException")
}
}
},
null)
}
private fun chooseCamera(manager: CameraManager): CameraProperties?
{
val cameraIdList = manager.cameraIdList
if (cameraIdList.isEmpty())
{
return null
}
for (cameraId in cameraIdList)
{
val characteristics = manager.getCameraCharacteristics(cameraId)
val facing = characteristics.get(CameraCharacteristics.LENS_FACING)
if (facing != null && facing == CameraCharacteristics.LENS_FACING_BACK)
{
val minPreviewSize = getMinPreviewSize(characteristics)
if (minPreviewSize == null)
{
Log.e(TAG, "chooseCamera() Cannot determine the preview size")
return null
}
Log.d(TAG, "chooseCamera() chosen camera id: $cameraId, preview size: $minPreviewSize")
return CameraProperties(cameraId,
minPreviewSize,
characteristics)
}
}
return null
}
private fun getMinPreviewSize(characteristics: CameraCharacteristics): Size?
{
val map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
if (map == null)
{
Log.e(TAG, "getMinPreviewSize() Map is empty")
return null
}
return map.getOutputSizes(Constants.Camera.PREVIEW_IMAGE_FORMAT)?.minBy { it.width * it.height }
}
PreviewFrameHandler and CameraCurrentParamsReceiver (previewCaptureCallback variable)
private class PreviewFrameHandler(private val parent: Camera2) : ImageReader.OnImageAvailableListener, Handler.Callback
{
val backgroundHandler: Handler
private val backgroundHandlerThread: HandlerThread = HandlerThread("Camera2.PreviewFrame.HandlerThread")
private val mainHandler: Handler = Handler(Looper.getMainLooper(), this)
/**
* Main thread.
*/
init
{
backgroundHandlerThread.start()
backgroundHandler = Handler(backgroundHandlerThread.looper)
}
fun shutDown()
{
backgroundHandlerThread.quit()
mainHandler.removeMessages(0)
}
override fun handleMessage(msg: Message?): Boolean
{
msg ?: return false
parent.cameraFrameListener.onFrame(msg.obj as RGBImage)
return true
}
/**
* Background thread.
*/
private val relativeTimestamp = RelativeTimestamp()
override fun onImageAvailable(reader: ImageReader)
{
var image: Image? = null
try
{
image = reader.acquireNextImage()
image ?: return
val rgbImage = parent.bufferedImageConverter?.convertYUV420spToRGB(image, relativeTimestamp.updateAndGetSeconds(image.timestamp))
rgbImage ?: return
mainHandler.sendMessage(mainHandler.obtainMessage(0, rgbImage))
}
catch (ex: Exception)
{
Log.e(TAG, "onImageAvailable()", ex)
}
finally
{
image?.close()
}
}
private class RelativeTimestamp
{
private var initialNanos = 0L
fun updateAndGetSeconds(currentNanos: Long): Double
{
if (initialNanos == 0L)
{
initialNanos = currentNanos
}
return nanosToSeconds(currentNanos - initialNanos)
}
}
}
/**
* Class used to read current camera params.
*/
private class CameraCurrentParamsReceiver(private val parent: Camera2) : CameraCaptureSession.CaptureCallback()
{
private var isExposureTimeExceptionLogged = false
private var isIsoExceptionLogged = false
override fun onCaptureSequenceAborted(session: CameraCaptureSession, sequenceId: Int)
{
}
override fun onCaptureCompleted(session: CameraCaptureSession, request: CaptureRequest, result: TotalCaptureResult)
{
try
{
val exposureTimeNanos = result.get(CaptureResult.SENSOR_EXPOSURE_TIME)
if (exposureTimeNanos != null)
{
parent.currentExposureTimeNanos = exposureTimeNanos
}
}
catch (ex: IllegalArgumentException)
{
if (!isExposureTimeExceptionLogged)
{
isExposureTimeExceptionLogged = true
}
}
try
{
val iso = result.get(CaptureResult.SENSOR_SENSITIVITY)
if (iso != null)
{
parent.currentIso = iso
}
}
catch (ex: IllegalArgumentException)
{
if (!isIsoExceptionLogged)
{
Log.i(TAG, "Cannot get current SENSOR_SENSITIVITY, exception: " + ex.message)
isIsoExceptionLogged = true
}
}
}
override fun onCaptureFailed(session: CameraCaptureSession, request: CaptureRequest, failure: CaptureFailure)
{
}
override fun onCaptureSequenceCompleted(session: CameraCaptureSession, sequenceId: Int, frameNumber: Long)
{
}
override fun onCaptureStarted(session: CameraCaptureSession, request: CaptureRequest, timestamp: Long, frameNumber: Long)
{
}
override fun onCaptureProgressed(session: CameraCaptureSession, request: CaptureRequest, partialResult: CaptureResult)
{
}
override fun onCaptureBufferLost(session: CameraCaptureSession, request: CaptureRequest, target: Surface, frameNumber: Long)
{
}
}
As I understand it, something is wrong with the preview size, but I cannot find the correct way to get this value, and the strangest thing is that this problem appears only on this Xiaomi device. Any thoughts?
176x144 is sometimes a problematic resolution for devices. It's really only listed by camera devices because it's sometimes required for recording videos for MMS (multimedia text message) messages. These videos, frankly, look awful, but it's still frequently a requirement by cellular carriers that they work.
But on modern devices with 12 - 50 MP cameras, the camera hardware actually struggles to scale images down to 176x144 from the sensor full resolution (> 20x downscale!), so sometimes certain combinations of sizes can cause problems.
I'd generally recommend not using preview resolutions below 320x240, to minimize issues, and definitely not mix a 176x144 preview with a high-resolution still capture.
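Following that recommendation, a variation of the question's getMinPreviewSize() could skip sizes below 320x240 (a sketch, untested on the device in question):

// Picks the smallest output size that is still at least 320x240
private fun getMinPreviewSize(characteristics: CameraCharacteristics): Size?
{
    val map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP) ?: return null
    return map.getOutputSizes(ImageFormat.YUV_420_888)
        ?.filter { it.width * it.height >= 320 * 240 }
        ?.minBy { it.width * it.height }
}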

Using MediaCodec in async mode, the issue is I am not getting MediaCodec.BUFFER_FLAG_END_OF_STREAM

I am building a streaming app and I am facing a problem; here is the code.
I want to live-stream the camera feed to the server, and I expect to get a ByteBuffer in onOutputBufferAvailable(). I am getting output buffers, but I never get MediaCodec.BUFFER_FLAG_END_OF_STREAM when I call stopVideoCapture().
Here are the code segments.
Creating Media Codec
private val recorderStreamSurface by lazy {
val format = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, width, height)
val frameRate = 30 // 30 fps
var recorderStreamSurface: Surface? = null
// Set some required properties. The media codec may fail if these aren't defined.
format.setInteger(
MediaFormat.KEY_COLOR_FORMAT,
MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface
)
format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000) // 6Mbps
format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate)
format.setInteger(MediaFormat.KEY_CAPTURE_RATE, frameRate)
format.setInteger(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 1000000 / frameRate)
format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1)
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1) // 1 seconds between I-frames
videoEncoder = MediaCodec.createEncoderByType(VIDEO_MIME_TYPE)
// Create a MediaCodec encoder and configure it. Get a Surface we can use for recording into.
try {
videoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
recorderStreamSurface = videoEncoder.createInputSurface()
videoEncoder.setCallback(object : MediaCodec.Callback() {
override fun onError(codec: MediaCodec, exception: MediaCodec.CodecException) {
Log.d(TAG, "==onError $codec $exception")
serverChannel.onError(exception)
}
override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
Log.d(TAG, "video encoder: output format changed")
}
override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
Log.d(TAG, "video encoder: returned input buffer: $index")
val frameData: ByteArray
frameData = queue.take().array()
val inputData = codec.getInputBuffer(index)
inputData!!.clear()
inputData.put(frameData)
codec.queueInputBuffer(index, 0, frameData.size, 0, 0)
}
override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
Log.d(TAG, "video encoder: returned output buffer: $index flag : ${info.flags}")
Log.d(TAG, "video encoder: returned buffer of size " + info.size)
if ((info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
Log.i(TAG,"serverChannel.onCompleted()1")
}
videoEncoder.releaseOutputBuffer(index, false)
}
})
videoEncoder.start()
} catch (e: IOException) {
videoEncoder.stop()
videoEncoder.release()
serverChannel.onError(e)
}
recorderStreamSurface
}
local variables
lateinit var videoEncoder: MediaCodec
val queue: ArrayBlockingQueue<ByteBuffer> = ArrayBlockingQueue<ByteBuffer>(10)
val targets by lazy { listOf(viewFinder.holder.surface, recorderStreamSurface!!) }
private const val VIDEO_MIME_TYPE = "video/avc"
val cameraId = "1"
val fps = 30
val width = 1080
val height = 1920
Record Request
private val recordRequest: CaptureRequest by lazy {
// Capture request holds references to target surfaces
session.device.createCaptureRequest(CameraDevice.TEMPLATE_RECORD).apply {
// Add the preview and recording surface targets
for (target: Surface in targets) {
addTarget(target)
}
// Sets user requested FPS for all targets
set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range(fps, fps))
}.build()
}
and finally start and stop recording
private fun startVideoCapture() {
// Prevents screen rotation during the video recording
requireActivity().requestedOrientation =
ActivityInfo.SCREEN_ORIENTATION_LOCKED
session.setRepeatingRequest(previewRequest, null, cameraHandler)
// Start recording repeating requests, which will stop the ongoing preview
// repeating requests without having to explicitly call `session.stopRepeating`
session.setRepeatingRequest(recordRequest, null, cameraHandler)
recordingStartMillis = System.currentTimeMillis()
Log.d(TAG, "Recording started")
}
private fun stopVideoCapture() {
// Unlocks screen rotation after recording finished
requireActivity().requestedOrientation =
ActivityInfo.SCREEN_ORIENTATION_UNSPECIFIED
videoEncoder.stop()
videoEncoder.release()
Log.d(TAG, "Recording stopped")
session.setRepeatingRequest(previewRequest, null, cameraHandler)
}
You must pass the BUFFER_FLAG_END_OF_STREAM flag with the last data to encode:
codec.queueInputBuffer(index, 0, frameData.size, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM)
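A sketch of how that could fit the callback above (an assumption: an eosRequested flag is introduced, and stop()/release() are deferred until the EOS flag actually arrives in onOutputBufferAvailable(); note that stopVideoCapture() currently calls videoEncoder.stop() immediately, which tears the codec down before it can ever emit the flag). If the encoder were fed through an input Surface instead, signalEndOfInputStream() would be the equivalent call:

@Volatile
private var eosRequested = false

override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
    val frameData = queue.take().array()
    val inputData = codec.getInputBuffer(index)!!
    inputData.clear()
    inputData.put(frameData)
    // Tag the last queued buffer with EOS instead of stopping the codec directly
    val flags = if (eosRequested) MediaCodec.BUFFER_FLAG_END_OF_STREAM else 0
    codec.queueInputBuffer(index, 0, frameData.size, 0, flags)
}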

How to Integrate Alexa Voice Service into a Custom Android App

I am planning to develop an Android app with Alexa Voice Service integration, similar to Reverb. Below is what I have tried:
Checked the AVS Device SDK, but couldn't find a proper guide to implement it on Android.
Checked https://github.com/willblaschko/AlexaAndroid, but wasn't able to get it to work.
Planned to implement it myself; below is what I have done:
a. Integrated the login framework; was able to successfully log in and get the token.
b. Created a sound recorder; was able to record and play back locally.
c. Created a request to send audio to https://avs-alexa-eu.amazon.com/v20160207/events
[UPDATED]
After changing the ShortArray to a ByteArray I am getting the response, but now the problem is that MediaPlayer is unable to play the response mp3; it gives an error at prepare().
package com.example.anoopmohanan.alexaandroid
import android.content.Context
import android.media.*
import android.media.AudioFormat.ENCODING_PCM_16BIT
import android.media.AudioFormat.CHANNEL_CONFIGURATION_MONO
import android.os.Environment
import android.os.Environment.getExternalStorageDirectory
import java.io.*
import com.example.anoopmohanan.alexaandroid.ResponseParser.getBoundary
import okhttp3.*
import org.jetbrains.anko.doAsync
import org.json.JSONObject
import java.nio.file.Files.exists
import okhttp3.OkHttpClient
import java.net.HttpURLConnection
import android.os.Looper
import android.os.PowerManager
import android.util.Log
import okio.BufferedSink
import okhttp3.RequestBody
import org.apache.commons.io.FileUtils
import org.jetbrains.anko.Android
import org.jetbrains.anko.runOnUiThread
import org.jetbrains.anko.toast
import java.util.*
import okhttp3.ResponseBody
import okio.Buffer
import com.example.anoopmohanan.alexaandroid.SoundRecorder.LoggingInterceptor
import android.media.MediaDataSource
class SoundRecorder(context: Context) {
private var appcontext: Context? = null
private var recording = false
val MEDIA_JSON = MediaType.parse("application/json; charset=utf-8")
val MEDIA_TYPE_AUDIO = MediaType.parse("application/octet-stream")
var accessToken = ""
private var mediaPlayer: MediaPlayer? = null
private var streamToSend:ByteArray? = null
private val client = OkHttpClient.Builder()
.addInterceptor(LoggingInterceptor())
.build()
init {
this.appcontext = context
}
fun startRecording(accessToken: String){
this.accessToken = accessToken
doAsync {
startRecord()
}
}
fun stopRecording(){
doAsync {
stopRecord()
}
}
fun playRecording(){
doAsync {
playRecord()
}
}
private fun stopRecord(){
recording = false
//val file = File(Environment.getExternalStorageDirectory(), "test.pcm")
//sendAuio(file)
}
fun playRecord() {
val file = File(Environment.getExternalStorageDirectory(), "speech2.mp3")
val mplayer = MediaPlayer()
mplayer.setDataSource(file.path)
mplayer.prepare()
mplayer.start()
// val file = File(Environment.getExternalStorageDirectory(), "test.pcm")
//
// val shortSizeInBytes = java.lang.Short.SIZE / java.lang.Byte.SIZE
//
// val bufferSizeInBytes = (file.length() / shortSizeInBytes).toInt()
// val audioData = ByteArray(bufferSizeInBytes)
//
// try {
// val inputStream = FileInputStream(file)
// val bufferedInputStream = BufferedInputStream(inputStream)
// val dataInputStream = DataInputStream(bufferedInputStream)
//
// var i = 0
// while (dataInputStream.available() > 0) {
// audioData[i] = dataInputStream.readByte()
// i++
// }
//
// dataInputStream.close()
//
// val audioTrack = AudioTrack.Builder()
// .setAudioAttributes(AudioAttributes.Builder()
// .setUsage(AudioAttributes.USAGE_MEDIA)
// .setContentType(AudioAttributes.CONTENT_TYPE_SPEECH)
// .build())
// .setAudioFormat(AudioFormat.Builder()
// .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
// .setSampleRate(16000)
// .setChannelMask(AudioFormat.CHANNEL_OUT_MONO).build())
// .setBufferSizeInBytes(bufferSizeInBytes)
// .setTransferMode(AudioTrack.MODE_STREAM)
// .build()
//
// audioTrack.play()
// audioTrack.write(audioData, 0, bufferSizeInBytes)
//
//
// } catch (e: FileNotFoundException) {
// e.printStackTrace()
// } catch (e: IOException) {
// e.printStackTrace()
// }
}
private fun startRecord() {
val file = File(Environment.getExternalStorageDirectory(), "test.pcm")
try {
file.createNewFile()
val outputStream = FileOutputStream(file)
val bufferedOutputStream = BufferedOutputStream(outputStream)
val dataOutputStream = DataOutputStream(bufferedOutputStream)
val minBufferSize = AudioRecord.getMinBufferSize(16000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT)
val audioData = ByteArray(minBufferSize)
val audioRecord = AudioRecord(MediaRecorder.AudioSource.MIC,
16000,
AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT,
800)
if (audioRecord.recordingState != AudioRecord.RECORDSTATE_STOPPED){
this.appcontext?.runOnUiThread {
toast("No recording source available")
}
return
}
recording = true
audioRecord.startRecording()
if (audioRecord.recordingState != AudioRecord.RECORDSTATE_RECORDING){
this.appcontext?.runOnUiThread {
toast("Someone is still recording")
}
recording = false
audioRecord.stop()
audioRecord.release()
return
}
this.appcontext?.runOnUiThread {
toast("Recording started hurray")
}
while (recording) {
audioRecord.read(audioData, 0, minBufferSize)
dataOutputStream.write(audioData,0,minBufferSize)
// for (i in 0 until numberOfShort) {
// dataOutputStream.writeShort(audioData[i].toInt())
// }
}
audioRecord.stop()
audioRecord.release()
dataOutputStream.close()
sendAudio(file)
} catch (e: IOException) {
e.printStackTrace()
}
}
fun sendAudio(audio: File){
streamToSend = audio.readBytes()
val requestBody = MultipartBody.Builder()
.setType(MultipartBody.FORM)
.addFormDataPart("metadata","metadata",RequestBody.create(MEDIA_JSON, generateSpeechMetadata()))
.addFormDataPart("audio", "test.pcm",
RequestBody.create(MEDIA_TYPE_AUDIO, streamToSend))
.build()
val request = Request.Builder()
.url("https://avs-alexa-eu.amazon.com/v20160207/events")
.addHeader("Authorization","Bearer $accessToken")
.post(requestBody)
.build()
print (request.body().toString())
client.newCall(request).execute().use({ response ->
if (!response.isSuccessful()) throw IOException("Unexpected code $response")
val items = if (response.code() == HttpURLConnection.HTTP_NO_CONTENT)
AvsResponse()
else
ResponseParser.parseResponse(response.body()!!.byteStream(), getBoundary(response))
if (items.size > 0){
handle(items)
}
System.out.println("[TRACE]"+response.body()!!.string())
})
}
@Throws(IOException::class)
fun toByteArray(`in`: InputStream): ByteArray {
val out = ByteArrayOutputStream()
var read = 0
val buffer = ByteArray(1024)
while (read != -1) {
read = `in`.read(buffer)
if (read != -1)
out.write(buffer, 0, read)
}
out.close()
return out.toByteArray()
}
private fun generateSpeechMetadata(): String {
val messageId = UUID.randomUUID().toString();
val dialogId = UUID.randomUUID().toString();
return "{\"event\": {\"header\": {\"namespace\": \"SpeechRecognizer\",\"name\": \"Recognize\",\"messageId\": \"$messageId\",\"dialogRequestId\": \"$dialogId\"},\"payload\": {\"profile\": \"CLOSE_TALK\", \"format\": \"AUDIO_L16_RATE_16000_CHANNELS_1\"}},\"context\": [{\"header\": {\"namespace\": \"AudioPlayer\",\"name\": \"PlaybackState\"},\"payload\": {\"token\": \"\",\"offsetInMilliseconds\": 0,\"playerActivity\": \"FINISHED\"}}, {\"header\": {\"namespace\": \"SpeechSynthesizer\",\"name\": \"SpeechState\"},\"payload\": {\"token\": \"\",\"offsetInMilliseconds\": 0,\"playerActivity\": \"FINISHED\"}}, { \"header\" : { \"namespace\" : \"Alerts\", \"name\" : \"AlertsState\" }, \"payload\" : { \"allAlerts\" : [ ], \"activeAlerts\" : [ ] } }, {\"header\": {\"namespace\": \"Speaker\",\"name\": \"VolumeState\"},\"payload\": {\"volume\": 25,\"muted\": false}}]}"
// return "{\n" +
// "\"messageHeader\": {\n" +
// "\"deviceContext\": [\n" +
// "{\n" +
// "\"name\": \"playbackState\",\n" +
// "\"namespace\": \"AudioPlayer\",\n" +
// "\"payload\": {\n" +
// "\"streamId\": \"\",\n" +
// "\"offsetInMilliseconds\": \"\",\n" +
// "\"playerActivity\": \"IDLE\"\n" +
// "}\n" +
// "}\n" +
// "]\n" +
// "},\n" +
// "\"messageBody\": {\n" +
// "\"profile\": \"doppler-scone\",\n" +
// "\"locale\": \"en-us\",\n" +
// "\"format\": \"audio/L16; rate=16000; channels=1\"\n" +
// "}\n" +
// "}"
}
private val audioRequestBody = object : RequestBody() {
override fun contentType(): MediaType? {
return MediaType.parse("application/octet-stream")
}
@Throws(IOException::class)
override fun writeTo(sink: BufferedSink?) {
//while our recorder is not null and it is still recording, keep writing to POST data
sink!!.write(streamToSend)
}
}
inner class LoggingInterceptor : Interceptor {
@Throws(IOException::class)
override fun intercept(chain: Interceptor.Chain): Response {
val request = chain.request()
val t1 = System.nanoTime()
Log.d("OkHttp", String.format("--> Sending request %s on %s%n%s", request.url(), chain.connection(), request.headers()))
val requestBuffer = Buffer()
request.body()?.writeTo(requestBuffer)
Log.d("OkHttp", requestBuffer.readUtf8())
val response = chain.proceed(request)
val t2 = System.nanoTime()
Log.d("OkHttp", String.format("<-- Received response for %s in %.1fms%n%s", response.request().url(), (t2 - t1) / 1e6, response.headers()))
val contentType = response.body()?.contentType()
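// NOTE: string() consumes the response body and decodes it as text; rebuilding
// the body from that string corrupts binary payloads (see the resolution below)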
val content = response.body()?.string()
Log.d("OkHttp", content)
val wrappedBody = ResponseBody.create(contentType, content)
return response.newBuilder().body(wrappedBody).build()
}
}
private fun getMediaPlayer(): MediaPlayer? {
if (mediaPlayer == null) {
mediaPlayer = MediaPlayer()
}
return mediaPlayer
}
fun handle(items:AvsResponse){
for (item in items){
handle(item)
}
}
fun handle(item: AvsItem){
if (item is AvsSpeakItem){
}else{
return
}
//cast our item for easy access
//write out our raw audio data to a file
val path = File(Environment.getExternalStorageDirectory(), "speech.mp3")
//path.deleteOnExit()
//val path = File(appcontext!!.getCacheDir(), System.currentTimeMillis().toString() + ".mp3")
//var fos: FileOutputStream? = null
try {
// fos = FileOutputStream(path)
// fos!!.write(item.audio)
// fos.close()
path.createNewFile()
path.writeBytes(item.audio)
// var ds = ByteArrayMediaDataSource(item.audio)
// val fis = FileInputStream(path)
// //play our newly-written file
// val mplayer = MediaPlayer()
// mplayer.setDataSource(path.path)
// mplayer.prepare()
// mplayer.start()
// getMediaPlayer()?.setDataSource(fis.fd)
// getMediaPlayer()?.prepare()
// getMediaPlayer()?.start()
} catch (e: IOException) {
e.printStackTrace()
} catch (e: IllegalStateException) {
e.printStackTrace()
}
}
inner class ByteArrayMediaDataSource(private val data: ByteArray?) : MediaDataSource() {
init {
assert(data != null)
}
@Throws(IOException::class)
override fun readAt(position: Long, buffer: ByteArray, offset: Int, size: Int): Int {
System.arraycopy(data, position.toInt(), buffer, offset, size)
return size
}
@Throws(IOException::class)
override fun getSize(): Long {
return data?.size?.toLong()!!
}
@Throws(IOException::class)
override fun close() {
// Nothing to do here
}
}
}
RESPONSE FROM ALEXA
--------abcde123
Content-Type: application/json; charset=UTF-8
{"directive":{"header":{"namespace":"SpeechSynthesizer","name":"Speak","messageId":"d58c83fe-377f-4d1d-851b-a68cf5686280","dialogRequestId":"d83b8496-e6a5-4fc5-b07b-32f70acd1f15"},"payload":{"url":"cid:2aed5305-081d-4624-b2ba-ef51eba6aa32_1353236331","format":"AUDIO_MPEG","token":"amzn1.as-ct.v1.Domain:Application:Knowledge#ACRI#2aed5305-081d-4624-b2ba-ef51eba6aa32"}}}
--------abcde123
Content-ID: <2aed5305-081d-4624-b2ba-ef51eba6aa32_1353236331>
Content-Type: application/octet-stream
[binary MP3 audio data omitted: an ID3-tagged, LAME/Lavf-encoded payload followed the JSON directive]
--------abcde123--
I figured it out. I was using a LoggingInterceptor for OkHttp which converted the response bytes to a string in order to log them; for some characters that conversion produced junk values, and those junk values were written back into the byte array as well, which resulted in a malformed mp3 file. After removing the interceptor it started working. So step 1 is done; more to go.
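For the record, a non-destructive way to log (a sketch, not from the original post) is to peek at the body instead of consuming and rebuilding it; OkHttp's Response.peekBody() copies up to a byte limit and leaves the actual stream untouched:

import android.util.Log
import okhttp3.Interceptor
import okhttp3.Response

class SafeLoggingInterceptor : Interceptor {
    override fun intercept(chain: Interceptor.Chain): Response {
        val request = chain.request()
        Log.d("OkHttp", "--> ${request.url()}\n${request.headers()}")
        val response = chain.proceed(request)
        // peekBody() copies at most 1 KiB for logging and does not
        // consume or re-encode the real (possibly binary) body
        Log.d("OkHttp", "<-- ${response.code()} ${response.peekBody(1024L).string()}")
        return response
    }
}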

Playing encrypted video using ExoPlayer

I'm using ExoPlayer on Android, and I'm trying to play an encrypted video stored locally.
The modularity of ExoPlayer allows creating custom components that can be injected into the player, and this seems to be such a case. Indeed, after some research I realized that to achieve this task I could create a custom DataSource and override open(), read() and close().
I have also found this solution, but there the entire file is decrypted in one step and stored in a clear InputStream. This can be fine in many situations, but what if I need to play a big file?
So the question is: how can I play encrypted video in ExoPlayer, decrypting the content on the fly (without decrypting the entire file)? Is this possible?
I tried creating a custom DataSource that has the open() method:
@Override
public long open(DataSpec dataSpec) throws FileDataSourceException {
try {
File file = new File(dataSpec.uri.getPath());
clearInputStream = new CipherInputStream(new FileInputStream(file), mCipher);
long skipped = clearInputStream.skip(dataSpec.position);
if (skipped < dataSpec.position) {
throw new EOFException();
}
if (dataSpec.length != C.LENGTH_UNBOUNDED) {
bytesRemaining = dataSpec.length;
} else {
bytesRemaining = clearInputStream.available();
if (bytesRemaining == 0) {
bytesRemaining = C.LENGTH_UNBOUNDED;
}
}
} catch (EOFException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
opened = true;
if (listener != null) {
listener.onTransferStart();
}
return bytesRemaining;
}
And this is the read() method:
@Override
public int read(byte[] buffer, int offset, int readLength) throws FileDataSourceException {
if (bytesRemaining == 0) {
return -1;
} else {
int bytesRead = 0;
int bytesToRead = bytesRemaining == C.LENGTH_UNBOUNDED ? readLength
: (int) Math.min(bytesRemaining, readLength);
try {
bytesRead = clearInputStream.read(buffer, offset, bytesToRead);
} catch (IOException e) {
e.printStackTrace();
}
if (bytesRead > 0) {
if (bytesRemaining != C.LENGTH_UNBOUNDED) {
bytesRemaining -= bytesRead;
}
if (listener != null) {
listener.onBytesTransferred(bytesRead);
}
}
return bytesRead;
}
}
If I pass a clear file instead of an encrypted one, and just remove the CipherInputStream part, then it works fine; with the encrypted file I get this error:
Unexpected exception loading stream
java.lang.IllegalStateException: Top bit not zero: -1195853062
at com.google.android.exoplayer.util.ParsableByteArray.readUnsignedIntToInt(ParsableByteArray.java:240)
at com.google.android.exoplayer.extractor.mp4.Mp4Extractor.readSample(Mp4Extractor.java:331)
at com.google.android.exoplayer.extractor.mp4.Mp4Extractor.read(Mp4Extractor.java:122)
at com.google.android.exoplayer.extractor.ExtractorSampleSource$ExtractingLoadable.load(ExtractorSampleSource.java:745)
at com.google.android.exoplayer.upstream.Loader$LoadTask.run(Loader.java:209)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:423)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
EDIT:
the encrypted video is generated in this way:
Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
SecretKeySpec keySpec = new SecretKeySpec("0123456789012345".getBytes(), "AES");
IvParameterSpec ivSpec = new IvParameterSpec("0123459876543210".getBytes());
cipher.init(Cipher.ENCRYPT_MODE, keySpec, ivSpec);
outputStream = new CipherOutputStream(output_stream, cipher);
Then the outputStream is saved into a File.
Here is an example of how to play an encrypted audio file; I hope this helps someone.
I'm using Kotlin here.
import android.net.Uri
import com.google.android.exoplayer2.C
import com.google.android.exoplayer2.upstream.DataSource
import com.google.android.exoplayer2.upstream.DataSourceInputStream
import com.google.android.exoplayer2.upstream.DataSpec
import com.google.android.exoplayer2.util.Assertions
import java.io.IOException
import javax.crypto.CipherInputStream
class EncryptedDataSource(upstream: DataSource) : DataSource {
private var upstream: DataSource? = upstream
private var cipherInputStream: CipherInputStream? = null
override fun open(dataSpec: DataSpec?): Long {
val cipher = getCipherInitDecrypt()
val inputStream = DataSourceInputStream(upstream, dataSpec)
cipherInputStream = CipherInputStream(inputStream, cipher)
inputStream.open()
return C.LENGTH_UNSET.toLong()
}
override fun read(buffer: ByteArray?, offset: Int, readLength: Int): Int {
Assertions.checkNotNull<Any>(cipherInputStream)
val bytesRead = cipherInputStream!!.read(buffer, offset, readLength)
return if (bytesRead < 0) {
C.RESULT_END_OF_INPUT
} else bytesRead
}
override fun getUri(): Uri {
return upstream!!.uri
}
@Throws(IOException::class)
override fun close() {
if (cipherInputStream != null) {
cipherInputStream = null
upstream!!.close()
}
}
}
In the function above you need to get the Cipher that was used for encryption and initialize it, something like this:
fun getCipherInitDecrypt(): Cipher {
val cipher = Cipher.getInstance("AES/CTR/NoPadding", "BC");
val iv = IvParameterSpec(initVector.toByteArray(charset("UTF-8")))
val skeySpec = SecretKeySpec(key, TYPE_RSA)
cipher.init(Cipher.DECRYPT_MODE, skeySpec, iv)
return cipher
}
The next step is creating a DataSource.Factory for the DataSource we implemented earlier:
import com.google.android.exoplayer2.upstream.DataSource
class EncryptedFileDataSourceFactory(var dataSource: DataSource) : DataSource.Factory {
override fun createDataSource(): DataSource {
return EncryptedDataSource(dataSource)
}
}
And the last step is the player initialization:
private fun prepareExoPlayerFromFileUri(uri: Uri) {
val player = ExoPlayerFactory.newSimpleInstance(
DefaultRenderersFactory(this),
DefaultTrackSelector(),
DefaultLoadControl())
val playerView = findViewById<PlayerView>(R.id.player_view)
playerView.player = player
val dsf = DefaultDataSourceFactory(this, Util.getUserAgent(this, "ExoPlayerInfo"))
// This line does the trick
val mediaSource = ExtractorMediaSource.Factory(EncryptedFileDataSourceFactory(dsf.createDataSource())).createMediaSource(uri)
player.prepare(mediaSource)
}
Eventually I found the solution.
I used a no-padding transformation for the encryption algorithm:
cipher = Cipher.getInstance("AES/CTR/NoPadding", "BC");
so that the encrypted file and the clear file remain the same size. Then I created the stream:
cipherInputStream = new CipherInputStream(inputStream, cipher) {
@Override
public int available() throws IOException {
return in.available();
}
};
This is because the Java documentation for CipherInputStream.available() says that
This method should be overriden
and in practice I think it is more like a MUST, because the values returned by that method are often really strange.
And that is it! Now it works perfectly.
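For completeness, the writing side has to use the same transformation; a minimal sketch of encrypting a file with AES/CTR/NoPadding (an assumption modeled on the question's CipherOutputStream snippet; the key, IV and outputFile are placeholders):

import java.io.File
import java.io.FileOutputStream
import javax.crypto.Cipher
import javax.crypto.CipherOutputStream
import javax.crypto.spec.IvParameterSpec
import javax.crypto.spec.SecretKeySpec

fun openEncryptingStream(outputFile: File): CipherOutputStream {
    val cipher = Cipher.getInstance("AES/CTR/NoPadding", "BC")
    val keySpec = SecretKeySpec("0123456789012345".toByteArray(), "AES") // placeholder key
    val ivSpec = IvParameterSpec("0123459876543210".toByteArray())       // placeholder IV
    cipher.init(Cipher.ENCRYPT_MODE, keySpec, ivSpec)
    // With CTR and no padding, ciphertext length equals plaintext length,
    // which is what makes available()/skip() behave sanely during playback
    return CipherOutputStream(FileOutputStream(outputFile), cipher)
}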
This problem had me tearing my hair out, so I finally caved and implemented a streaming cipher for AES/CBC that lets you skip ahead. CBC theoretically allows random reads: you initialize the cipher with the previous block's ciphertext as the initialization vector and then read ahead until the spot you need. Sample project with full implementation here. Here are the key classes:
import android.net.Uri
import android.util.Log
import com.google.android.exoplayer2.C
import com.google.android.exoplayer2.upstream.DataSource
import com.google.android.exoplayer2.upstream.DataSpec
import com.google.android.exoplayer2.upstream.TransferListener
import ar.cryptotest.exoplayer2.MainActivity.Companion.AES_TRANSFORMATION
import java.io.EOFException
import java.io.File
import java.io.IOException
import java.io.InputStream
import java.lang.RuntimeException
import javax.crypto.Cipher
import javax.crypto.CipherInputStream
import javax.crypto.spec.IvParameterSpec
import javax.crypto.spec.SecretKeySpec
const val TAG = "ENCRYPTING PROCESS"
class BlockCipherEncryptedDataSource(
private val secretKeySpec: SecretKeySpec,
private val uri: Uri,
cipherTransformation: String = "AES/CBC/PKCS7Padding"
) : DataSource {
private val cipher: Cipher = Cipher.getInstance(cipherTransformation)
private lateinit var streamingCipherInputStream: StreamingCipherInputStream
private var bytesRemaining: Long = 0
private var isOpen = false
private val transferListeners = mutableListOf<TransferListener>()
private var dataSpec: DataSpec? = null
@Throws(EncryptedFileDataSourceException::class)
override fun open(dataSpec: DataSpec): Long {
this.dataSpec = dataSpec
if (isOpen) return bytesRemaining
try {
setupInputStream()
streamingCipherInputStream.forceSkip(dataSpec.position)
computeBytesRemaining(dataSpec)
} catch (e: IOException) {
throw EncryptedFileDataSourceException(e)
}
isOpen = true
transferListeners.forEach { it.onTransferStart(this, dataSpec, false) }
return C.LENGTH_UNSET.toLong()
}
private fun setupInputStream() {
val path = uri.path ?: throw RuntimeException("Tried decrypting uri with no path: $uri")
val encryptedFileStream = File(path).inputStream()
val initializationVector = ByteArray(cipher.blockSize)
encryptedFileStream.read(initializationVector)
streamingCipherInputStream =
StreamingCipherInputStream(
encryptedFileStream,
cipher,
IvParameterSpec(initializationVector),
secretKeySpec
)
}
@Throws(IOException::class)
private fun computeBytesRemaining(dataSpec: DataSpec) {
if (dataSpec.length != C.LENGTH_UNSET.toLong()) {
bytesRemaining = dataSpec.length
return
}
if (bytesRemaining == Int.MAX_VALUE.toLong()) {
bytesRemaining = C.LENGTH_UNSET.toLong()
return
}
bytesRemaining = streamingCipherInputStream.available().toLong()
}
@Throws(EncryptedFileDataSourceException::class)
override fun read(buffer: ByteArray, offset: Int, readLength: Int): Int {
if (bytesRemaining == 0L) {
Log.e(TAG, "End - No bytes remaining")
return C.RESULT_END_OF_INPUT
}
val bytesRead = try {
streamingCipherInputStream.read(buffer, offset, readLength)
} catch (e: IOException) {
throw EncryptedFileDataSourceException(e)
}
// Reading -1 means an error occurred
if (bytesRead < 0) {
if (bytesRemaining != C.LENGTH_UNSET.toLong())
throw EncryptedFileDataSourceException(EOFException())
return C.RESULT_END_OF_INPUT
}
// Bytes remaining will be unset if file is too large for an int
if (bytesRemaining != C.LENGTH_UNSET.toLong())
bytesRemaining -= bytesRead.toLong()
dataSpec?.let { nonNullDataSpec ->
transferListeners.forEach {
it.onBytesTransferred(this, nonNullDataSpec, false, bytesRead)
}
}
return bytesRead
}
override fun addTransferListener(transferListener: TransferListener) {
transferListeners.add(transferListener)
}
override fun getUri(): Uri = uri
@Throws(EncryptedFileDataSourceException::class)
override fun close() {
Log.e(TAG, "Closing stream")
try {
streamingCipherInputStream.close()
} catch (e: IOException) {
throw EncryptedFileDataSourceException(e)
} finally {
if (isOpen) {
isOpen = false
dataSpec?.let { nonNullDataSpec ->
transferListeners.forEach { it.onTransferEnd(this, nonNullDataSpec, false) }
}
}
}
}
class EncryptedFileDataSourceException(cause: IOException?) : IOException(cause)
class StreamingCipherInputStream(
private val sourceStream: InputStream,
private var cipher: Cipher,
private val initialIvParameterSpec: IvParameterSpec,
private val secretKeySpec: SecretKeySpec
) : CipherInputStream(
sourceStream, cipher
) {
private val cipherBlockSize: Int = cipher.blockSize
@Throws(IOException::class)
override fun read(b: ByteArray, off: Int, len: Int): Int = super.read(b, off, len)
fun forceSkip(bytesToSkip: Long) {
val bytesSinceStartOfCurrentBlock = bytesToSkip % cipherBlockSize
val bytesUntilPreviousBlockStart =
bytesToSkip - bytesSinceStartOfCurrentBlock - cipherBlockSize
try {
if (bytesUntilPreviousBlockStart <= 0) {
cipher.init(
Cipher.DECRYPT_MODE,
secretKeySpec,
initialIvParameterSpec
)
return
}
var skipped = sourceStream.skip(bytesUntilPreviousBlockStart)
while (skipped < bytesUntilPreviousBlockStart) {
sourceStream.read()
skipped++
}
val previousEncryptedBlock = ByteArray(cipherBlockSize)
sourceStream.read(previousEncryptedBlock)
cipher.init(
Cipher.DECRYPT_MODE,
secretKeySpec,
IvParameterSpec(previousEncryptedBlock)
)
skip(bytesUntilPreviousBlockStart + cipherBlockSize)
val discardableByteArray = ByteArray(bytesSinceStartOfCurrentBlock.toInt())
read(discardableByteArray)
} catch (e: Exception) {
Log.e(TAG, "Encrypted video skipping error", e)
throw e
}
}
// We need to return the available bytes from the upstream.
// In this implementation we're front loading it, but it's possible the value might change during the lifetime
// of this instance, and reference to the stream should be retained and queried for available bytes instead
@Throws(IOException::class)
override fun available(): Int {
return sourceStream.available()
}
}
}
class BlockCipherEncryptedDataSourceFactory(
private val secretKeySpec: SecretKeySpec,
private val uri: Uri,
private val cipherTransformation: String = "AES/CBC/PKCS7Padding"
) : DataSource.Factory {
override fun createDataSource(): BlockCipherEncryptedDataSource {
return BlockCipherEncryptedDataSource(secretKeySpec, uri, cipherTransformation)
}
}
Check your proxy, given the following configuration.
ALLOWED_TRACK_TYPES = "SD_HD"
content_key_specs = [{ "track_type": "HD",
"security_level": 1,
"required_output_protection": {"hdcp": "HDCP_NONE" }
},
{ "track_type": "SD",
"security_level": 1,
"required_output_protection": {"cgms_flags": "COPY_FREE" }
},
{ "track_type": "AUDIO"}]
request = json.dumps({"payload": payload,
"content_id": content_id,
"provider": self.provider,
"allowed_track_types": ALLOWED_TRACK_TYPES,
"use_policy_overrides_exclusively": True,
"policy_overrides": policy_overrides,
"content_key_specs": content_key_specs
})
In the ExoPlayer demo app, DashRendererBuilder.java has a method filterHdContent which always returns true if the device is not Widevine security level 1 (assuming here it's L3). This causes the player to disregard the HD AdaptationSet in the mpd whilst parsing it.
You can set filterHdContent to always return false if you want to play HD; however, it is typical of content owners to require an L1 Widevine implementation for HD content.
check this link for more https://github.com/google/ExoPlayer/issues/1116
https://github.com/google/ExoPlayer/issues/1523
I don't believe a custom DataSource with open/read/close is a solution to your need. For 'on-the-fly' decryption (valuable for big files, but not only), you must design a streaming architecture.
There are already posts similar to yours. To find them, don't look for 'exoplayer', but 'videoview' or 'mediaplayer' instead. The answers should be compatible.
For instance, Playing encrypted video files using VideoView
