I have been trying to figure out how to stream mic data from Android to Flutter. I found some example code that reads the mic in chunks, but I do not know how to get that data over to Flutter.
https://github.com/bitplane/Microphone/blob/master/src/net/bitplane/android/microphone/MicrophoneService.java
I am not sure which Flutter classes to look at.
https://docs.flutter.io/flutter/services/EventChannel/receiveBroadcastStream.html
I wonder if anyone can point me in the right direction.
I have been digging through the example plugins in the Flutter GitHub repo and found this:
https://github.com/flutter/plugins/tree/master/packages/sensors
Unfortunately for me, the EventChannel documentation is sparse. I do not believe we can pass streams from Flutter to the platform, so I cannot replay whatever I record in a feedback loop.
Event callback. Supports dual use: Producers of events to be sent to Flutter act as clients of this interface for sending events. Consumers of events sent from Flutter implement this interface for handling received events (the latter facility has not been implemented yet).
https://docs.flutter.io/javadoc/io/flutter/plugin/common/EventChannel.EventSink.html
The process is similar to creating a MethodChannel, but on the Android side I need to implement onListen and onCancel. Within onListen, I must also create a listener that can receive events. For audio events, that means using AudioRecord.setPositionNotificationPeriod(int) together with setRecordPositionUpdateListener(OnRecordPositionUpdateListener); see the sketch after the links below.
https://developer.android.com/reference/android/media/AudioRecord.html
https://github.com/flutter/plugins/blob/master/packages/sensors/android/src/main/java/io/flutter/plugins/sensors/SensorsPlugin.java
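Putting those pieces together, the Android side looks roughly like this (a minimal sketch in the style of the SensorsPlugin above; the channel name "mic_stream/events", the 16 kHz mono PCM setup, and boxing the samples into a List<Integer> are placeholder choices of mine, not anything the docs prescribe):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.util.ArrayList;
import java.util.List;
import io.flutter.plugin.common.EventChannel;
import io.flutter.plugin.common.PluginRegistry.Registrar;

// Sketch only: channel name, sample rate and the List<Integer> payload are
// placeholder choices, not something the Flutter docs prescribe.
public class MicStreamPlugin implements EventChannel.StreamHandler {
    private static final int SAMPLE_RATE = 16000;
    private AudioRecord recorder;

    public static void registerWith(Registrar registrar) {
        new EventChannel(registrar.messenger(), "mic_stream/events")
                .setStreamHandler(new MicStreamPlugin());
    }

    @Override
    public void onListen(Object arguments, final EventChannel.EventSink events) {
        final int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf * 4);

        final int chunkFrames = minBuf / 2;  // mono 16-bit PCM: 1 frame = 2 bytes
        recorder.setPositionNotificationPeriod(chunkFrames);
        recorder.setRecordPositionUpdateListener(new AudioRecord.OnRecordPositionUpdateListener() {
            @Override
            public void onPeriodicNotification(AudioRecord record) {
                short[] chunk = new short[chunkFrames];
                int read = record.read(chunk, 0, chunk.length);
                if (read > 0) {
                    // The standard message codec has no short[] type, so box the samples.
                    List<Integer> samples = new ArrayList<>(read);
                    for (int i = 0; i < read; i++) samples.add((int) chunk[i]);
                    events.success(samples);
                }
            }

            @Override
            public void onMarkerReached(AudioRecord record) { }
        });
        recorder.startRecording();
    }

    @Override
    public void onCancel(Object arguments) {
        if (recorder != null) {
            recorder.stop();
            recorder.release();
            recorder = null;
        }
    }
}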
On the Flutter side, I must create a broadcast stream with EventChannel.receiveBroadcastStream.
https://github.com/flutter/plugins/blob/master/packages/sensors/lib/sensors.dart
I managed to write my own terrible plugin, which is Android-only at the moment. If anyone needs it, I wonder what permissive license I should add.
https://github.com/hungrymonkey/mic_stream
Related
Is there an API that lets me record all incoming and outgoing calls on Android Pie? It seems nearly impossible to record a call. Please suggest what can be done in this scenario.
Android disabled that API as part of its security policy. Looking at the permissions list, the closest you can find is MANAGE_OWN_CALLS, which means the best solution for you is to implement a standalone calling application where you can interact with the microphone directly (a sketch of that direction follows).
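To illustrate the "standalone calling app" direction, here is a minimal sketch assuming the android.telecom self-managed ConnectionService API; registering the self-managed PhoneAccount and declaring MANAGE_OWN_CALLS in the manifest are omitted:

import android.telecom.Connection;
import android.telecom.ConnectionRequest;
import android.telecom.ConnectionService;
import android.telecom.DisconnectCause;
import android.telecom.PhoneAccountHandle;
import android.telecom.TelecomManager;

// Sketch of the "standalone calling app" approach: a self-managed
// ConnectionService (needs MANAGE_OWN_CALLS in the manifest plus a registered
// self-managed PhoneAccount, both omitted here). Because the call is your
// own, your app controls the audio path and can record it itself.
public class OwnCallService extends ConnectionService {
    @Override
    public Connection onCreateOutgoingConnection(PhoneAccountHandle account,
                                                 ConnectionRequest request) {
        Connection connection = new Connection() {
            @Override
            public void onDisconnect() {
                setDisconnected(new DisconnectCause(DisconnectCause.LOCAL));
                destroy();
            }
        };
        connection.setAddress(request.getAddress(), TelecomManager.PRESENTATION_ALLOWED);
        connection.setActive();
        return connection;
    }
}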
I am using Kurento Media Server for one-to-one calls.
In the browser the call works with no issues.
But I want to implement this on Android.
I am using this: https://github.com/MaxMaes/WebRTCTest
Since the project is stated to be incomplete, I am trying to complete it.
I am following all the WebRTC steps.
I am able to connect to the Kurento one-to-one call app running on the server. The flow goes like this (a rough code sketch of these steps follows the list):
Party A is created in the browser.
Party B is created in the Android app.
A makes a call to B.
Now, on receiving the "incoming call" message, the call is accepted.
pc.createOffer is called.
In onCreateSuccess, pc.setLocalDescription is done and the sdpOffer is sent over the signaling channel.
In parallel, on receiving "onIceCandidate", pc.addIceCandidate is called.
ICEGatheringState = Complete.
A receives the callResponse and sends "startCommunication" with the sdpAnswer.
pc.setRemoteDescription is called with the sdpAnswer.
onAddStream is called, and a remote renderer is added to the UI.
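For reference, in code those steps correspond roughly to the following on the callee side (a minimal sketch assuming the libjingle org.webrtc Java API; sendSdpOffer() is a hypothetical placeholder for the app's own Kurento signaling code):

import org.webrtc.IceCandidate;
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnection;
import org.webrtc.SdpObserver;
import org.webrtc.SessionDescription;

// Rough sketch of the callee-side steps listed above, using the libjingle
// org.webrtc Java API. sendSdpOffer() stands in for the app's own signaling.
class CalleeFlowSketch {

    // Accept the call: create the offer, set it locally, send it over signaling.
    void acceptIncomingCall(final PeerConnection pc) {
        pc.createOffer(new SdpObserver() {
            @Override public void onCreateSuccess(SessionDescription sdpOffer) {
                pc.setLocalDescription(this, sdpOffer);
                sendSdpOffer(sdpOffer.description);
            }
            @Override public void onSetSuccess() { }
            @Override public void onCreateFailure(String error) { }
            @Override public void onSetFailure(String error) { }
        }, new MediaConstraints());
    }

    // On every "iceCandidate" message from the signaling channel.
    void onRemoteIceCandidate(PeerConnection pc, String sdpMid, int sdpMLineIndex, String sdp) {
        pc.addIceCandidate(new IceCandidate(sdpMid, sdpMLineIndex, sdp));
    }

    // On "startCommunication": apply the remote sdpAnswer.
    void onStartCommunication(PeerConnection pc, SdpObserver observer, String sdpAnswer) {
        pc.setRemoteDescription(observer,
                new SessionDescription(SessionDescription.Type.ANSWER, sdpAnswer));
    }

    private void sendSdpOffer(String sdp) { /* app-specific signaling */ }
}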
But no stream is coming through.
B is receiving "iceCandidate" messages from A. Does anything need to be done with these?
On both sides I don't get the remote video, but I am able to see the local video.
And I don't get any errors.
In onAddRemoteStream, change
VideoRendererGui.update(remoteRender, REMOTE_X, REMOTE_Y, REMOTE_WIDTH, REMOTE_HEIGHT, scalingType)
by adding one more parameter after scalingType:
VideoRendererGui.update(remoteRender, REMOTE_X, REMOTE_Y, REMOTE_WIDTH, REMOTE_HEIGHT, scalingType, true)
Hope this helps.
Do you have it like this?
onAddRemoteStream(MediaStream remoteStream) {
    remoteStream.videoTracks.get(0).addRenderer(new VideoRenderer(remoteRender));
}
In some cases you cannot receive video if you are not sending video.
I worked around it with this: https://stackoverflow.com/a/51883182/571410
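I can't say exactly what the linked answer does, but one common form of that workaround is to ask explicitly for incoming media in the offer constraints even when no local track is attached. A minimal sketch, assuming the libjingle org.webrtc Java API:

import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnection;
import org.webrtc.SdpObserver;

// One common form of the "can't receive without sending" workaround (the
// linked answer may use a different fix): explicitly request incoming
// audio/video in the offer constraints even if no local track is attached.
class ReceiveOnlyOfferSketch {
    static void createReceiveOnlyOffer(PeerConnection pc, SdpObserver observer) {
        MediaConstraints sdpConstraints = new MediaConstraints();
        sdpConstraints.mandatory.add(
                new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
        sdpConstraints.mandatory.add(
                new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "true"));
        pc.createOffer(observer, sdpConstraints);
    }
}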
In Chromecast,
I want to send different kinds of URLs (mp4/mp3/png, etc.) to the receiver, but how does the receiver show them dynamically?
That is: how does the receiver recognize what kind of RemoteMedia it has received?
In the current version of the SDK, there is nothing on the framework side to help you with that directly. You can include the MIME type in the metadata, retrieve it on your receiver, and handle it as you see fit. That said, if your media is only audio or video, things are simpler, since the video element can handle both and you can treat them the same; for images, you have to do some extra work. Another approach is to look at the file extension and try to guess the type, but that is not fully reliable. A sender-side sketch of the metadata approach follows.
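On the sender (Android) side, a minimal sketch of that approach, assuming the Google Cast Android sender SDK; the "mimeType" custom-data key is just a name made up here, and the receiver has to look up the same key:

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaMetadata;
import org.json.JSONException;
import org.json.JSONObject;

// Sender-side sketch: tag each item with its MIME type so the custom receiver
// can decide how to render it. The "mimeType" custom-data key is an arbitrary
// name invented here; the receiver just has to read the same key.
public class CastItemBuilder {
    public static MediaInfo build(String url, String mimeType, String title) throws JSONException {
        MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_GENERIC);
        metadata.putString(MediaMetadata.KEY_TITLE, title);

        JSONObject customData = new JSONObject();
        customData.put("mimeType", mimeType);   // e.g. "video/mp4", "audio/mp3", "image/png"

        return new MediaInfo.Builder(url)
                .setContentType(mimeType)
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setMetadata(metadata)
                .setCustomData(customData)
                .build();
    }
}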
How is it possible to modify and extend the Sencha Touch 2 native packaging for Android (see http://docs.sencha.com/touch/2-0/#!/guide/native_android)? I want to add push functionality to the Android project.
IMO, you should use Cordova (PhoneGap). With it, you get plugins to add push to your app. As far as I know, there is no way to use push with Sencha Touch itself. To work with push you must have a device ID, and Sencha cannot give you that (unless you manage to call functions inside your activity from JavaScript). Moreover, you need a broadcast receiver set up to listen for incoming push messages (a sketch of such a receiver is below). Basically, there are a bunch of to-dos involved in using push, so you cannot simply automate that job and let the native packaging do it.
Modifying/extending the Sencha tools' native packaging is simply not worth the time and effort it would take (if it is possible at all).
You should check out http://www.pubnub.com/; they have a JavaScript client that listens to a channel, and with that you can execute whatever you want once a message is pushed.
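For the broadcast receiver part, here is a minimal sketch of what such a receiver can look like, assuming the GCM-era "com.google.android.c2dm.intent.RECEIVE" broadcast; the "message" extra is whatever key your push payload actually uses, not a fixed API name:

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.util.Log;

// Sketch of the push receiver mentioned above, assuming the GCM-era
// "com.google.android.c2dm.intent.RECEIVE" broadcast declared in the manifest.
// The "message" extra is an app-defined payload key, not a fixed API name.
public class PushReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if ("com.google.android.c2dm.intent.RECEIVE".equals(intent.getAction())) {
            String message = intent.getStringExtra("message");
            Log.d("PushReceiver", "Push message: " + message);
            // Hand the payload to the Sencha/Cordova layer, e.g. via a JavaScript bridge.
        }
    }
}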
Is it possible to create an Android application that automatically answers incoming calls on an Android phone? If so, which APIs can be used to achieve this (a code snippet would be highly appreciated)?
If a programmatic auto-attendant is not possible, why does the Android OS impose this restriction?
Does iOS behave the same as Android in this scenario? Please explain.
While googling I found something that may be useful. I haven't tried it yet, but I think it will help: have a look at Call Control in Android.
You can listen for the incoming-call intent by implementing a broadcast receiver for TelephonyManager.ACTION_PHONE_STATE_CHANGED (a sketch follows), but answering an incoming call automatically does not seem feasible, because an Android application does not have access to the incoming call's audio stream.
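For the "listen for the incoming call" part, a minimal sketch of such a receiver (it needs the READ_PHONE_STATE permission and a manifest entry for the android.intent.action.PHONE_STATE action; on newer Android versions reading the number also requires READ_CALL_LOG):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.telephony.TelephonyManager;
import android.util.Log;

// Minimal sketch: detects the ringing state via the PHONE_STATE broadcast.
// Detecting the call is possible; auto-answering is not, as noted above.
public class IncomingCallReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (!TelephonyManager.ACTION_PHONE_STATE_CHANGED.equals(intent.getAction())) {
            return;
        }
        String state = intent.getStringExtra(TelephonyManager.EXTRA_STATE);
        if (TelephonyManager.EXTRA_STATE_RINGING.equals(state)) {
            String number = intent.getStringExtra(TelephonyManager.EXTRA_INCOMING_NUMBER);
            Log.d("IncomingCallReceiver", "Incoming call from " + number);
        }
    }
}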