PJSIP Android: How to use the PjCamera class to display the user's own preview

I have implemented video calling but am currently facing an issue displaying the user's own preview (currently I use a SurfaceView for it; the code is below).
The PJSIP library itself uses the Camera API to send frames to the other user.
There is a PjCamera class in the Android bindings. Does anyone know how to use that class to display your own preview?
======EDIT======
if (SipManager.currentCall != null &&
        SipManager.currentCall?.mVideoPreview != null) {
    if (videoPreviewActive) {
        Log.d(TAG, "$TAG = if")
        // Attach the SurfaceView's surface to a pjsua2 window handle
        // and start the preview on it.
        val vidWH = VideoWindowHandle()
        vidWH.handle?.setWindow(holder.surface)
        val vidPrevParam = VideoPreviewOpParam()
        vidPrevParam.window = vidWH
        try {
            SipManager.currentCall?.mVideoPreview?.start(vidPrevParam)
        } catch (e: Exception) {
            println(e)
        }
    } else {
        Log.d(TAG, "$TAG = else")
        try {
            SipManager.currentCall?.mVideoPreview?.stop()
        } catch (e: Exception) {
            println(e)
        }
    }
}
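For reference, a minimal sketch of where such code is typically driven from: a SurfaceHolder.Callback registered on the SurfaceView, so the preview starts once the surface exists and stops before it is destroyed. SipManager and mVideoPreview are the names from the EDIT above; the callback wiring itself is an assumption, not something PJSIP mandates:

class PreviewSurfaceCallback : SurfaceHolder.Callback {
    override fun surfaceCreated(holder: SurfaceHolder) {
        // Surface is ready: hand it to pjsua2 and start the preview.
        val vidWH = VideoWindowHandle()
        vidWH.handle?.setWindow(holder.surface)
        try {
            SipManager.currentCall?.mVideoPreview?.start(VideoPreviewOpParam().apply { window = vidWH })
        } catch (e: Exception) {
            println(e)
        }
    }

    override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {
        // Nothing to do: pjsua2 keeps rendering into the same surface.
    }

    override fun surfaceDestroyed(holder: SurfaceHolder) {
        // Stop the preview before the surface goes away.
        try {
            SipManager.currentCall?.mVideoPreview?.stop()
        } catch (e: Exception) {
            println(e)
        }
    }
}

// Registered once on the SurfaceView, e.g. in onCreate:
// surfaceView.holder.addCallback(PreviewSurfaceCallback())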

Related

Huawei ML Kit Text to speech duration of audio

I have implemented on-device TTS from the Huawei ML Kit in my app and it works well.
Now I would like to find out the duration of the synthesized audio, for example to display the remaining time while the audio is playing.
I tried writing the generated audio fragments from the callback to a .pcm file:
override fun onAudioAvailable(
    taskId: String?,
    audioFragment: MLTtsAudioFragment?,
    offset: Int,
    range: android.util.Pair<Int, Int>?,
    bundle: Bundle?
) {
    if (taskId != null) {
        if (audioFragment != null) {
            val fileName = "audio.pcm"
            writeToFile(audioFragment.audioData, fileName, true)
        }
    }
}
fun writeToFile(buffer: ByteArray?, strFileName: String?, append: Boolean) {
    if (speechFile == null) {
        val pcmFileDir: File = view.getExternalFilesDir("/PCM")!!
        speechFile = File(pcmFileDir, strFileName)
    }
    var raf: RandomAccessFile? = null
    var out: FileOutputStream? = null
    try {
        if (append) {
            raf = RandomAccessFile(speechFile, "rw")
            speechFile?.length()?.let { raf.seek(it) }
            raf.write(buffer)
        } else {
            out = FileOutputStream(speechFile)
            out.write(buffer)
            out.flush()
        }
    } catch (e: IOException) {
        e.printStackTrace()
    } finally {
        try {
            raf?.close()
            out?.close()
        } catch (e: IOException) {
            e.printStackTrace()
        }
    }
}
and getting the duration that way, but Android's MediaPlayer.getDuration() doesn't seem to work with a .pcm file.
Is there a better way to get the duration of the audio? If not, is it possible to calculate the duration of the .pcm file somehow?
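Raw PCM has a fixed byte rate, so the duration follows directly from the file size: duration = bytes / (sampleRate × channels × bytesPerSample). A minimal sketch under the assumption that the engine emits 16-bit mono PCM at 16 kHz; verify the actual output format against the ML Kit documentation before relying on these numbers:

import java.io.File

// Duration of raw PCM from the byte rate:
//   durationSec = totalBytes / (sampleRate * channels * bytesPerSample)
// The default parameter values are assumptions about the TTS output format.
fun pcmDurationMillis(
    pcmFile: File,
    sampleRate: Int = 16000,   // assumed sample rate
    channels: Int = 1,         // assumed mono
    bytesPerSample: Int = 2    // assumed 16-bit samples
): Long {
    val byteRate = sampleRate * channels * bytesPerSample
    return pcmFile.length() * 1000L / byteRate
}

The same arithmetic works on a running total of audioFragment.audioData.size inside onAudioAvailable, which gives the duration before the file is even fully written.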

Play sound in background thread issue

I have an app with an activity that runs heavy animations and plays sound. I used the code below to play the sound:
var musicThread: Thread? = null

fun playSound(sound: Uri) {
    musicThread = Thread(Runnable {
        try {
            sharedPlayer.reset()
            sharedPlayer = MediaPlayer.create(MyApplication.appContext, sound)
            sharedPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC)
            // sharedPlayer.prepare()
            sharedPlayer.setVolume(1f, 1f)
            sharedPlayer.setLooping(false)
            sharedPlayer.start()
            sharedPlayer.setOnCompletionListener {
                it.reset()
            }
        } catch (e: Exception) {
            e.printStackTrace()
        }
        runOnUiThread {
        }
    })
    musicThread?.start()
}

fun stopSound() {
    try {
        if (sharedPlayer.isPlaying()) {
            sharedPlayer.stop()
        }
        musicThread?.stop()
    } catch (e: Exception) {
        e.printStackTrace()
    }
}
Is there any problem with it? By the way, I call playSound many times with different sound files, one after another.
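For comparison, a sketch of the same flow without the extra thread: MediaPlayer.create() returns an already-prepared player and start() is non-blocking, so no background thread is needed just to begin playback, and Thread.stop() has long been deprecated because it can kill a thread at an arbitrary point. This assumes sharedPlayer is redeclared as a nullable property; it is one possible cleanup, not the only fix:

var sharedPlayer: MediaPlayer? = null

fun playSound(sound: Uri) {
    stopSound() // release any previous player before starting a new one
    // create() prepares the player; start() just kicks off async playback.
    sharedPlayer = MediaPlayer.create(MyApplication.appContext, sound)?.apply {
        setVolume(1f, 1f)
        isLooping = false
        setOnCompletionListener {
            it.release()
            sharedPlayer = null
        }
        start()
    }
}

fun stopSound() {
    sharedPlayer?.let {
        if (it.isPlaying) it.stop()
        it.release()
    }
    sharedPlayer = null
}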

Twitter Streaming API HTTP 420

I want to consume the Twitter streaming API in Android.
I've used Kotlin coroutines and Retrofit.
Somehow, on the third request I get an HTTP 420 error ("Enhance your calm").
I cannot understand why this happens.
Here's my code:
fun getStreamData(str: String) {
    Log.d("debug", "Fetching data..")
    coroutineScope.launch {
        withContext(Dispatchers.Main) {
            // Display loading animation in UI
            _status.value = DataApiStatus.LOADING
        }
        try {
            val listResult = ApiService().api!!.getTweetList(str).await()
            while (!listResult.source().exhausted()) {
                val reader = JsonReader(InputStreamReader(listResult.byteStream()))
                // https://stackoverflow.com/questions/11484353/gson-throws-malformedjsonexception
                reader.setLenient(true)
                val gson = GsonBuilder().create()
                val j = gson.fromJson<JsonObject>(reader, JsonObject::class.java)
                Log.d("debug", "JSON: " + j.toString())
                if (j.get("text") != null &&
                    j.getAsJsonObject("user").get("profile_image_url_https") != null &&
                    j.getAsJsonObject("user").get("name") != null) {
                    val t = gson.fromJson<Tweet>(j, Tweet::class.java)
                    withContext(Dispatchers.Main) {
                        _status.value = DataApiStatus.DONE
                        // https://stackoverflow.com/questions/47941537/notify-observer-when-item-is-added-to-list-of-livedata
                        tweetsList.add(t)
                        _tweetsList.value = tweetsList
                    }
                }
            }
        } catch (e: JsonSyntaxException) {
            Log.e("error", "JsonSyntaxException ${e.message}")
        } catch (e: Exception) {
            Log.e("error", "ERROR ${e.message}")
        }
    }
}
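As an aside on the parsing loop above: constructing a fresh JsonReader (with its own buffered InputStreamReader) on every iteration over the same byte stream can drop bytes the previous reader had already buffered. A sketch that hoists the reader and the Gson instance out of the loop, with everything else as in getStreamData():

val gson = GsonBuilder().create()
val reader = JsonReader(InputStreamReader(listResult.byteStream()))
reader.setLenient(true) // the stream is a sequence of JSON objects, not one document
while (!listResult.source().exhausted()) {
    val j = gson.fromJson<JsonObject>(reader, JsonObject::class.java)
    // ... same filtering and LiveData updates as in getStreamData() ...
}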
getStreamData() is responsible for searching the stream according to the str parameter.
When the search parameter changes, I cancel the current job and launch a new one with the new search parameter:
fun cancelJob() {
    Log.d("debug", "Cancelling current Job!")
    coroutineScope.coroutineContext.cancelChildren()
}
What am I doing wrong? On the third request I get an HTTP 420 error.
Here's the full code:
https://github.com/maiamiguel/RHO-Challenge
The 420 Enhance Your Calm status code is an unofficial extension used by Twitter to tell HTTP clients that they are being rate limited, i.e. that they have exceeded the number of requests or connections allowed within a time period. The streaming API is particularly strict about reconnects: every time the search parameter changes, this code tears down the stream and opens a fresh connection, and a few rapid reconnects in a row are enough to trip the limit.
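A common mitigation is to wait with exponential backoff before reconnecting instead of reopening the stream immediately on every search change. A minimal sketch around the existing getStreamData/cancelJob pair; the restartStream name and the delay schedule are illustrative, not Twitter's documented values:

// Illustrative exponential backoff between stream reconnects.
private var reconnectAttempts = 0

fun restartStream(str: String) {
    cancelJob()
    coroutineScope.launch {
        // 0s, 5s, 10s, 20s, ... capped at 60s between consecutive reconnects.
        val backoffMillis =
            if (reconnectAttempts == 0) 0L
            else (5_000L shl (reconnectAttempts - 1)).coerceAtMost(60_000L)
        delay(backoffMillis)
        reconnectAttempts++ // reset this once a connection has stayed up for a while
        getStreamData(str)
    }
}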

How to fix 'pink' image capture in the Android Camera2 API?

I am trying to capture an image using two different lenses (wide and normal). The preview works fine for both cameras simultaneously using the new multi-camera support in the Camera2 API. I am using a Huawei Mate 20 Pro.
However, when I capture a picture, it only saves a pink-tinted JPEG, unless the object is close enough, in which case the picture comes out fine.
Here is the main activity code:
button.setOnClickListener {
    if (isRunning) {
        handler.removeCallbacksAndMessages(null)
        restartActivity()
    } else {
        button.text = "Stop"
        handler.postDelayed(object : Runnable {
            override fun run() {
                twoLens.reset()
                twoLens.isTwoLensShot = true
                MainActivity.cameraParams.get(dualCamLogicalId).let {
                    if (it?.isOpen == true) {
                        Logd("In onClick. Taking Dual Cam Photo on logical camera: $dualCamLogicalId")
                        takePicture(this@MainActivity, it)
                        Toast.makeText(applicationContext, "Captured!", Toast.LENGTH_SHORT).show()
                    }
                }
                handler.postDelayed(this, 1000)
            }
        }, 2000)
    }
    isRunning = !isRunning
}
Here is the picture capture code.
fun captureStillPicture(activity: MainActivity, params: CameraParams) {
    if (!params.isOpen) {
        return
    }
    try {
        Logd("In captureStillPicture.")
        val camera = params.captureSession?.getDevice()
        if (null != camera) {
            params.captureBuilder = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
            params.captureBuilder?.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO)
            if (params.id.equals(dualCamLogicalId) && twoLens.isTwoLensShot) {
                val normalParams: CameraParams? = MainActivity.cameraParams.get(normalLensId)
                val wideParams: CameraParams? = MainActivity.cameraParams.get(wideAngleId)
                if (null == normalParams || null == wideParams)
                    return
                Logd("In captureStillPicture. This is a Dual Cam shot.")
                params.captureBuilder?.addTarget(normalParams.imageReader?.surface!!)
                params.captureBuilder?.addTarget(wideParams.imageReader?.surface!!)
                params.captureBuilder?.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 4)
                params.captureBuilder?.set(CaptureRequest.JPEG_QUALITY, 100)
                if (Build.VERSION.SDK_INT >= 28) {
                    params.captureBuilder?.set(CaptureRequest.DISTORTION_CORRECTION_MODE, CameraMetadata.DISTORTION_CORRECTION_MODE_OFF)
                    // This is REQUIRED to disable HDR+ on Pixel 3 - even though Pixel 3 doesn't have sepia
                    params.captureBuilder?.set(CaptureRequest.CONTROL_EFFECT_MODE, CameraMetadata.CONTROL_EFFECT_MODE_SEPIA)
                } else {
                    // This is REQUIRED to disable HDR+ on Pixel 3 - even though Pixel 3 doesn't have sepia
                    params.captureBuilder?.set(CaptureRequest.CONTROL_EFFECT_MODE, CameraMetadata.CONTROL_EFFECT_MODE_SEPIA)
                    Logd("DUAL CAM DEBUG: I am setting sepia mode.")
                    // Logd("DUAL CAM DEBUG: I am NOT setting sepia mode.")
                }
                val rotation = activity.getWindowManager().getDefaultDisplay().getRotation()
                val capturedImageRotation = getOrientation(params, rotation)
                params.captureBuilder?.set(CaptureRequest.JPEG_ORIENTATION, capturedImageRotation)
                try {
                    params.captureSession?.stopRepeating()
                    // params.captureSession?.abortCaptures()
                } catch (e: CameraAccessException) {
                    e.printStackTrace()
                }
                // Do the capture
                // TODO: Capture BURST HERE
                if (28 <= Build.VERSION.SDK_INT)
                    params.captureSession?.captureSingleRequest(params.captureBuilder?.build(),
                        params.backgroundExecutor, StillCaptureSessionCallback(activity, params))
                else
                    params.captureSession?.capture(params.captureBuilder?.build(),
                        StillCaptureSessionCallback(activity, params), params.backgroundHandler)
            }
        }
    } catch (e: CameraAccessException) {
        e.printStackTrace()
    } catch (e: IllegalStateException) {
        Logd("captureStillPicture IllegalStateException, aborting: " + e)
    }
}
This is how I am grabbing the captured pictures.
fun getImagesCaptured(activity: MainActivity, twoLens: TwoLensCoordinator) {
    Logd("Normal image timestamp: " + twoLens.normalImage?.timestamp)
    Logd("Wide image timestamp: " + twoLens.wideImage?.timestamp)
    val wideBuffer: ByteBuffer? = twoLens.wideImage!!.planes[0].buffer
    val wideBytes = ByteArray(wideBuffer!!.remaining())
    wideBuffer.get(wideBytes)
    val normalBuffer: ByteBuffer? = twoLens.normalImage!!.planes[0].buffer
    val normalBytes = ByteArray(normalBuffer!!.remaining())
    normalBuffer.get(normalBytes)
    val options = BitmapFactory.Options()
    val wideMat: Mat = Mat(twoLens.wideImage!!.height, twoLens.wideImage!!.width, CvType.CV_8UC1)
    val tempWideBitmap = BitmapFactory.decodeByteArray(wideBytes, 0, wideBytes.size, options)
    val normalMat: Mat = Mat(twoLens.normalImage!!.height, twoLens.normalImage!!.width, CvType.CV_8UC1)
    val tempNormalBitmap = BitmapFactory.decodeByteArray(normalBytes, 0, normalBytes.size, options)
    save(tempNormalBitmap, "NormalShot")
    save(tempWideBitmap, "WideShot")
}
The save function is here.
fun save(bitmap: Bitmap, tempName: String) {
    val timeStamp = SimpleDateFormat("yyyyMMdd_HHmmss").format(Date())
    val dataDir = File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS), "TwoCameraImages")
    if (!dataDir.exists()) {
        dataDir.mkdir()
    }
    val fileName = tempName + "_IMG_$timeStamp.jpg"
    val fileDir = File(dataDir.path + File.separator + fileName)
    try {
        val fileOutputStream = FileOutputStream(fileDir)
        bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fileOutputStream)
        fileOutputStream.close()
    } catch (e: FileNotFoundException) {
        e.printStackTrace()
    } catch (e: IOException) {
        e.printStackTrace()
    }
}
I built on top of the code given here: https://github.com/google/basicbokeh, switched to the rear cameras, and removed the face calculations. But this pink bitmap is not going away. Any help?
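One frequent cause of washed-out or pink stills in Camera2 is firing the still capture before auto-exposure has converged: the code above calls stopRepeating() and captures immediately, with no precapture metering step. Below is a sketch of triggering an AE precapture sequence first and only taking the still once AE reports convergence. params.previewBuilder is an assumed field holding the builder of the repeating preview request, and whether this is the actual cause on the Mate 20 Pro is a guess:

// Trigger AE precapture metering, then capture the still once AE settles.
fun triggerPrecaptureThenCapture(activity: MainActivity, params: CameraParams) {
    val builder = params.previewBuilder ?: return // assumed: repeating preview request builder
    builder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER,
        CameraMetadata.CONTROL_AE_PRECAPTURE_TRIGGER_START)
    params.captureSession?.capture(builder.build(),
        object : CameraCaptureSession.CaptureCallback() {
            override fun onCaptureCompleted(session: CameraCaptureSession,
                                            request: CaptureRequest,
                                            result: TotalCaptureResult) {
                val aeState = result.get(CaptureResult.CONTROL_AE_STATE)
                // CONVERGED or FLASH_REQUIRED means metering is done.
                if (aeState == null ||
                    aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED ||
                    aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
                    captureStillPicture(activity, params)
                }
            }
        }, params.backgroundHandler)
}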

How to print an image on a thermal printer using Android?

I need to print an image using a thermal printer. I am currently able to print text using the code below:
private fun printBill() {
    Thread {
        try {
            val sock = Socket("192.168.100.99", 9100)
            val oStream = PrintWriter(sock.getOutputStream())
            oStream.printf("%-35s %-5s %5s\n", "Jihin Raju", 10.00, 100.00)
            oStream.println(charArrayOf(0x1D.toChar(), 0x56.toChar(), 0x41.toChar(), 0x10.toChar()))
            oStream.close()
            sock.close()
        } catch (e: UnknownHostException) {
            e.printStackTrace()
        } catch (e: IOException) {
            e.printStackTrace()
        }
        runOnUiThread {
            Toast.makeText(this@MainActivity, "Printed ", Toast.LENGTH_SHORT).show()
        }
    }.start()
}
Is there any way to print an image?
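Most receipt printers that accept raw bytes on port 9100 speak ESC/POS, whose GS v 0 command prints a raster bitmap: a 0x1D 0x76 0x30 header carrying the row width in bytes and the row count, followed by one bit per dot. Here is a sketch that converts a Bitmap to that format and sends it over the same kind of socket as printBill(); command support varies by printer model, so treat this as an ESC/POS example rather than a guaranteed fit:

// Prints a Bitmap via the ESC/POS "GS v 0" raster command.
// Assumes the printer understands ESC/POS; host/port mirror printBill().
fun printImage(bitmap: Bitmap, host: String = "192.168.100.99", port: Int = 9100) {
    Thread {
        try {
            val sock = Socket(host, port)
            val out = sock.getOutputStream()
            val widthBytes = (bitmap.width + 7) / 8 // 8 dots per byte, rounded up
            // Header: GS v 0, mode 0, then width-in-bytes and height, little-endian.
            out.write(byteArrayOf(
                0x1D, 0x76, 0x30, 0x00,
                (widthBytes and 0xFF).toByte(), (widthBytes shr 8).toByte(),
                (bitmap.height and 0xFF).toByte(), (bitmap.height shr 8).toByte()))
            val row = ByteArray(widthBytes)
            for (y in 0 until bitmap.height) {
                row.fill(0)
                for (x in 0 until bitmap.width) {
                    val p = bitmap.getPixel(x, y)
                    // Threshold on luminance: dark pixels become printed dots (bit = 1).
                    val lum = (Color.red(p) + Color.green(p) + Color.blue(p)) / 3
                    if (lum < 128) {
                        row[x / 8] = (row[x / 8].toInt() or (0x80 ushr (x % 8))).toByte()
                    }
                }
                out.write(row)
            }
            out.write(byteArrayOf(0x0A, 0x0A)) // feed a couple of lines past the image
            out.flush()
            sock.close()
        } catch (e: IOException) {
            e.printStackTrace()
        }
    }.start()
}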
