My end goal is to visualize the remote audio in a one-on-one call using the Agora API. The Agora API and its available examples are quite vast, but I did not find an example that lets me access the audio as it is streamed in, so that I can take each sample buffer's max amplitude and send it to a visualizer. The byte array would do just fine.
I have looked through the examples provided at https://github.com/AgoraIO/API-Examples, which seemed promising, but I have not been able to solve this. Any help is appreciated.
(Within the API-Examples project on GitHub, I have attempted to implement ProcessRawData and AudioRecordService.)
Update: The APIExample allows me to grab the raw data as it flows through, and that is what I am looking for. The issue arises when I try to duplicate the "ProcessRawData" class in a new project: the callback for the audio observer is never called. I have gone through my code and it matches everything in the example. The only thing I can think of is that the way I imported the "lib-raw-data" folder was incorrect. I simply copied the entire 'lib-raw-data' folder from the example project into my own, then added the library module to the settings.gradle file as well as the build.gradle (app) file. Outside of that, I simply made sure the code matches the example provided.
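Concretely, the wiring I mean is just this (a sketch; it assumes the copied folder keeps the name lib-raw-data and sits in the project root):
// settings.gradle
include ':app', ':lib-raw-data'
// build.gradle (app), inside the dependencies block
dependencies {
    implementation project(':lib-raw-data')
}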
Below is the most basic form of my application, with the library "lib-raw-data" imported as described. I have no errors within Android Studio, so I don't know where to look. The example in the GitHub repo above works, but the same code below does not.
MainActivity.java
public class MainActivity extends AppCompatActivity {
private SessionVideoCall videoCall;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initVideoCall();
}
private void initVideoCall(){
videoCall = new SessionVideoCall(this);
videoCall.setChannelName("#CHANNEL_NAME");
videoCall.attachView();
videoCall.startCall();
}
}
SessionVideoCall.java
public class SessionVideoCall implements MediaDataAudioObserver {
private Handler handler;
private final String TAG = this.getClass().getSimpleName();
private static final int PERMISSION_REQ_ID = 44;
private static final String[] REQUESTED_PERMISSIONS = {
Manifest.permission.RECORD_AUDIO,
Manifest.permission.CAMERA,
Manifest.permission.READ_EXTERNAL_STORAGE,
Manifest.permission.WRITE_EXTERNAL_STORAGE
};
private final Activity ACTIVITY;
private MediaDataObserverPlugin mediaDataObserverPlugin;
private String channelName;
private FrameLayout mLocalContainer;
private FrameLayout mRemoteContainer;
private VideoCanvas mLocalVideo;
private VideoCanvas mRemoteVideo;
private Timer timer;
boolean isVisualizerAttached = true;
public static RtcEngine engine;
// set up engine
public SessionVideoCall(Activity activity){
this.ACTIVITY = activity;
handler = new Handler(Looper.getMainLooper());
}
// set channel name
public void setChannelName(String channelName){
this.channelName = channelName;
}
public void attachView(){
mLocalContainer = ACTIVITY.findViewById(R.id.local_video_view_container);
mRemoteContainer = ACTIVITY.findViewById(R.id.remote_video_view_container);
}
// start call
public void startCall(){
if(hasPermissions())
initEngineAndJoinChannel();
}
// end call
public void endCall(){
removeFromParent(mLocalVideo);
mLocalVideo = null;
removeFromParent(mRemoteVideo);
mRemoteVideo = null;
if (mediaDataObserverPlugin != null) {
mediaDataObserverPlugin.removeAudioObserver(this);
mediaDataObserverPlugin.removeAllBuffer();
}
MediaPreProcessing.releasePoint();
leaveChannel();
}
private void initEngineAndJoinChannel() {
initializeEngine();
setupObserver();
setupAudioConfig();
setupVideoConfig();
setupLocalVideo();
joinChannel();
}
private void initializeEngine() {
try {
engine = RtcEngine.create(ACTIVITY.getApplicationContext(), ACTIVITY.getString(R.string.agora_app_id), mRtcEventHandler);
} catch (Exception e) {
Log.e(TAG, Log.getStackTraceString(e));
throw new RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e));
}
}
private void setupObserver(){
mediaDataObserverPlugin = MediaDataObserverPlugin.the();
MediaPreProcessing.setCallback(mediaDataObserverPlugin);
MediaPreProcessing.setAudioPlayByteBuffer(mediaDataObserverPlugin.byteBufferAudioPlay);
mediaDataObserverPlugin.addAudioObserver(this);
}
private void setupAudioConfig(){
engine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
engine.setClientRole(IRtcEngineEventHandler.ClientRole.CLIENT_ROLE_BROADCASTER);
engine.setDefaultAudioRoutetoSpeakerphone(false);
engine.setEnableSpeakerphone(false);
engine.setPlaybackAudioFrameParameters(4000, 1, RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, 1024);
}
private void setupVideoConfig() {
engine.enableVideo();
engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration(
VideoEncoderConfiguration.VD_640x360,
VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_15,
VideoEncoderConfiguration.STANDARD_BITRATE,
VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT));
}
private void setupLocalVideo() {
SurfaceView view = RtcEngine.CreateRendererView(ACTIVITY);
view.setZOrderMediaOverlay(true);
mLocalContainer.addView(view);
mLocalVideo = new VideoCanvas(view, VideoCanvas.RENDER_MODE_HIDDEN, 0);
engine.setupLocalVideo(mLocalVideo);
}
private void joinChannel() {
String token = ACTIVITY.getString(R.string.agora_access_token);
if (TextUtils.isEmpty(token) || TextUtils.equals(token, "#YOUR ACCESS TOKEN#")) {
token = null; // default, no token
}
engine.joinChannel(token, channelName, "Extra Optional Data", 0);
}
private void leaveChannel(){
if (mediaDataObserverPlugin != null) {
mediaDataObserverPlugin.removeAudioObserver(this);
mediaDataObserverPlugin.removeAllBuffer();
}
MediaPreProcessing.releasePoint();
engine.leaveChannel();
if(timer != null)
timer.cancel();
}
private void removeFromParent(VideoCanvas canvas) {
if (canvas != null) {
ViewParent parent = canvas.view.getParent();
if (parent != null) {
ViewGroup group = (ViewGroup) parent;
group.removeView(canvas.view);
//return group;
}
}
//return null;
}
private void setupRemoteVideo(int uid) {
ViewGroup parent = mRemoteContainer;
if (parent.indexOfChild(mLocalVideo.view) > -1) {
parent = mLocalContainer;
}
if (mRemoteVideo != null) {
return;
}
SurfaceView view = RtcEngine.CreateRendererView(ACTIVITY);
view.setZOrderMediaOverlay(parent == mLocalContainer);
parent.addView(view);
mRemoteVideo = new VideoCanvas(view, VideoCanvas.RENDER_MODE_HIDDEN, uid);
// Initializes the video view of a remote user.
engine.setupRemoteVideo(mRemoteVideo);
}
private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
@Override
public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
super.onJoinChannelSuccess(channel, uid, elapsed);
setupTimer();
Log.d(TAG,"onJoinChannelSuccess: ");
}
@Override
public void onFirstRemoteVideoDecoded(final int uid, int width, int height, int elapsed) {
ACTIVITY.runOnUiThread(new Runnable() {
@Override
public void run() {
Log.d(TAG,"First remote video decoded, uid: " + (uid & 0xFFFFFFFFL));
setupRemoteVideo(uid);
}
});
}
@Override
public void onUserOffline(final int uid, int reason) {
super.onUserOffline(uid, reason);
// when remote user logs off
}
@Override
public void onUserJoined(int uid, int elapsed) {
super.onUserJoined(uid, elapsed);
Log.i(TAG, "onUserJoined->" + uid);
Log.d(TAG, "user has joined call: " + uid);
handler.post(() ->
{
if (mediaDataObserverPlugin != null) {
mediaDataObserverPlugin.addDecodeBuffer(uid);
}
});
}
@Override
public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapsed)
{
super.onRemoteAudioStateChanged(uid, state, reason, elapsed);
Log.i(TAG, "onRemoteAudioStateChanged->" + uid + ", state->" + state + ", reason->" + reason);
}
};
private boolean hasPermissions(){
return (checkSelfPermission(REQUESTED_PERMISSIONS[0]) &&
checkSelfPermission(REQUESTED_PERMISSIONS[1]) &&
checkSelfPermission(REQUESTED_PERMISSIONS[2]) &&
checkSelfPermission(REQUESTED_PERMISSIONS[3]));
}
private boolean checkSelfPermission(String permission) {
if (ContextCompat.checkSelfPermission(ACTIVITY, permission) !=
PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(ACTIVITY, REQUESTED_PERMISSIONS, PERMISSION_REQ_ID);
return false;
}
return true;
}
private void setupTimer(){
timer = new Timer();
timer.schedule(new TimerTask() {
@Override
public void run() {
if(maxAmplitude > 0)
Log.e(TAG, "Amplitude Greater than 0: " + maxAmplitude);
}
},0,50);
}
@Override
public void onRecordAudioFrame(byte[] data, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, long renderTimeMs, int bufferLength) {
Log.e(TAG, "onRecordAudioFrame: ");
}
private int maxAmplitude = 0;
@Override
public void onPlaybackAudioFrame(byte[] data, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, long renderTimeMs, int bufferLength) {
if (isVisualizerAttached) {
short[] rawAudio = new short[data.length / 2];
// PCM16 samples on Android are little-endian
ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(rawAudio);
int amplitude = 0;
for (short num : rawAudio) {
amplitude = Math.max(amplitude, Math.abs((int) num));
}
maxAmplitude = amplitude; // feed the field polled by setupTimer()
Log.e(TAG, "onPlaybackAudioFrame: Supposedly we have data -> max: " + amplitude);
}
Log.e(TAG, "onPlaybackAudioFrame:");
}
@Override
public void onPlaybackAudioFrameBeforeMixing(int uid, byte[] data, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, long renderTimeMs, int bufferLength) {
Log.e(TAG, "onPlaybackAudioFrameBeforeMixing: ");
}
@Override
public void onMixedAudioFrame(byte[] data, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, long renderTimeMs, int bufferLength) {
Log.e(TAG, "onMixedAudioFrame: ");
}
}
Did you add the permissions in the manifest? Agora does not show any runtime error when you don't add permissions; instead, your voice call goes blank, and the microphone and speaker buttons, if added, won't do anything. This is what I went through one day dealing with some code on GitHub by Agora (it didn't declare them in the manifest file).
According to the Agora documentation, the following permissions are required:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.BLUETOOTH" />
I have discovered where in the API Example (the GitHub repo linked above) to interact with the remote audio data: in the CustomAudioSource class, within the AsyncTask object at the bottom of the file. I had to run it on a physical device to take advantage of it.
Now I need to find the max amplitude of the audio byte array. The formula given within this example is "lengthInByte = sampleRate / 1000 × 2 × channels × audio duration (ms)".
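With that formula in mind, here is a minimal sketch of the peak extraction (assuming, as is standard on Android, that the callback delivers little-endian 16-bit PCM). For example, at 48000 Hz, mono, 10 ms per callback, the buffer should be 48 × 2 × 1 × 10 = 960 bytes, i.e. 480 samples:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

// Scans one PCM16 buffer and returns its peak amplitude (0..32768),
// ready to hand to a visualizer.
private static int maxAmplitude(byte[] data) {
    ShortBuffer samples = ByteBuffer.wrap(data)
            .order(ByteOrder.LITTLE_ENDIAN) // PCM16 on Android is little-endian
            .asShortBuffer();
    int max = 0;
    while (samples.hasRemaining()) {
        // cast before abs() so Short.MIN_VALUE cannot overflow
        max = Math.max(max, Math.abs((int) samples.get()));
    }
    return max;
}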
Related
Currently, I am implementing Google Speech-to-Text in my project. The sample code I referred to is this: Click Here.
I have used the SpeechService and VoiceRecorder classes from this project.
public class SpeechService extends Service {
public static final List<String> SCOPE =
Collections.singletonList("https://www.googleapis.com/auth/cloud-platform");
private static final String TAG = "SpeechService";
private static final String PREFS = "SpeechService";
private static final String PREF_ACCESS_TOKEN_VALUE = "access_token_value";
private static final String PREF_ACCESS_TOKEN_EXPIRATION_TIME = "access_token_expiration_time";
/**
* We reuse an access token if its expiration time is longer than this.
*/
private static final int ACCESS_TOKEN_EXPIRATION_TOLERANCE = 30 * 60 * 1000; // thirty minutes
/**
* We refresh the current access token before it expires.
*/
private static final int ACCESS_TOKEN_FETCH_MARGIN = 60 * 1000; // one minute
private static final String HOSTNAME = "speech.googleapis.com";
private static final int PORT = 443;
private static Handler mHandler;
private final SpeechBinder mBinder = new SpeechBinder();
private final ArrayList<Listener> mListeners = new ArrayList<>();
private final StreamObserver<StreamingRecognizeResponse> mResponseObserver
= new StreamObserver<StreamingRecognizeResponse>() {
@Override
public void onNext(StreamingRecognizeResponse response) {
Log.e("Speech", "Recognized");
String text = null;
boolean isFinal = false;
if (response.getResultsCount() > 0) {
System.out.println("result count....."+String.valueOf(response.getResultsCount()));
final StreamingRecognitionResult result = response.getResults(0);
isFinal = result.getIsFinal();
if (result.getAlternativesCount() > 0) {
final SpeechRecognitionAlternative alternative = result.getAlternatives(0);
text = alternative.getTranscript();
}
}
if (text != null && isFinal) {
for (Listener listener : mListeners) {
listener.onSpeechRecognized(text, isFinal);
}
} else {
for (Listener listener : mListeners) {
listener.onRandomStupidity();
}
}
}
@Override
public void onError(Throwable t) {
Log.e(TAG, "Error calling the API.", t);
for(Listener listener : mListeners){
listener.onErrorRecognizing();
}
}
@Override
public void onCompleted() {
Log.i(TAG, "API completed.");
}
};
private volatile AccessTokenTask mAccessTokenTask;
private final Runnable mFetchAccessTokenRunnable = new Runnable() {
@Override
public void run() {
fetchAccessToken();
}
};
private SpeechGrpc.SpeechStub mApi;
private StreamObserver<StreamingRecognizeRequest> mRequestObserver;
public static SpeechService from(IBinder binder) {
return ((SpeechBinder) binder).getService();
}
@Override
public void onCreate() {
super.onCreate();
mHandler = new Handler();
fetchAccessToken();
}
@Override
public void onDestroy() {
super.onDestroy();
mHandler.removeCallbacks(mFetchAccessTokenRunnable);
mHandler = null;
// Release the gRPC channel.
if (mApi != null) {
final ManagedChannel channel = (ManagedChannel) mApi.getChannel();
if (channel != null && !channel.isShutdown()) {
try {
channel.shutdown().awaitTermination(5, TimeUnit.SECONDS);
} catch (InterruptedException e) {
Log.e(TAG, "Error shutting down the gRPC channel.", e);
}
}
mApi = null;
}
}
private void fetchAccessToken() {
if (mAccessTokenTask != null) {
return;
}
mAccessTokenTask = new AccessTokenTask();
mAccessTokenTask.execute();
}
private String getDefaultLanguageCode() {
final LangInnerResponse languageToLearn = MemoryCache.getLanguageToLearn();
if(languageToLearn != null) {
Log.e("Test Lang", languageToLearn.getCode());
return languageToLearn.getCode();
} else {
final Locale locale = Locale.getDefault();
final StringBuilder language = new StringBuilder(locale.getLanguage());
final String country = locale.getCountry();
if (!TextUtils.isEmpty(country)) {
language.append("-");
language.append(country);
}
return language.toString();
}
}
@Nullable
@Override
public IBinder onBind(Intent intent) {
return mBinder;
}
public void addListener(@NonNull Listener listener) {
mListeners.add(listener);
}
public void removeListener(@NonNull Listener listener) {
mListeners.remove(listener);
}
/**
* Starts recognizing speech audio.
*
* @param sampleRate The sample rate of the audio.
*/
public void startRecognizing(int sampleRate) {
if (mApi == null) {
Log.w(TAG, "API not ready. Ignoring the request.");
return;
}
System.out.println("calling api....");
// Configure the API
mRequestObserver = mApi.streamingRecognize(mResponseObserver);
mRequestObserver.onNext(StreamingRecognizeRequest.newBuilder()
.setStreamingConfig(StreamingRecognitionConfig.newBuilder()
.setConfig(RecognitionConfig.newBuilder()
.setLanguageCode(getDefaultLanguageCode())
.setEncoding(RecognitionConfig.AudioEncoding.LINEAR16)
.setSampleRateHertz(sampleRate)
.build())
.setInterimResults(true)
.setSingleUtterance(true)
.build())
.build());
}
/**
* Recognizes the speech audio. This method should be called every time a chunk of byte buffer
* is ready.
*
* @param data The audio data.
* @param size The number of elements that are actually relevant in the {@code data}.
*/
public void recognize(byte[] data, int size) {
if (mRequestObserver == null) {
return;
}
// Call the streaming recognition API
mRequestObserver.onNext(StreamingRecognizeRequest.newBuilder()
.setAudioContent(ByteString.copyFrom(data, 0, size))
.build());
}
/**
* Finishes recognizing speech audio.
*/
public void finishRecognizing() {
if (mRequestObserver == null) {
return;
}
mRequestObserver.onCompleted();
mRequestObserver = null;
}
public interface Listener {
/**
* Called when a new piece of text was recognized by the Speech API.
*
* @param text The text.
* @param isFinal {@code true} when the API finished processing audio.
*/
void onSpeechRecognized(String text, boolean isFinal);
void onErrorRecognizing();
void onRandomStupidity();
}
/**
* Authenticates the gRPC channel using the specified {@link GoogleCredentials}.
*/
private static class GoogleCredentialsInterceptor implements ClientInterceptor {
private final Credentials mCredentials;
private Metadata mCached;
private Map<String, List<String>> mLastMetadata;
GoogleCredentialsInterceptor(Credentials credentials) {
mCredentials = credentials;
}
private static Metadata toHeaders(Map<String, List<String>> metadata) {
Metadata headers = new Metadata();
if (metadata != null) {
for (String key : metadata.keySet()) {
Metadata.Key<String> headerKey = Metadata.Key.of(
key, Metadata.ASCII_STRING_MARSHALLER);
for (String value : metadata.get(key)) {
headers.put(headerKey, value);
}
}
}
return headers;
}
@Override
public <ReqT, RespT> ClientCall<ReqT, RespT> interceptCall(
final MethodDescriptor<ReqT, RespT> method, CallOptions callOptions,
final Channel next) {
return new ClientInterceptors.CheckedForwardingClientCall<ReqT, RespT>(
next.newCall(method, callOptions)) {
@Override
protected void checkedStart(Listener<RespT> responseListener, Metadata headers)
throws StatusException {
Metadata cachedSaved;
URI uri = serviceUri(next, method);
synchronized (this) {
Map<String, List<String>> latestMetadata = getRequestMetadata(uri);
if (mLastMetadata == null || mLastMetadata != latestMetadata) {
mLastMetadata = latestMetadata;
mCached = toHeaders(mLastMetadata);
}
cachedSaved = mCached;
}
headers.merge(cachedSaved);
delegate().start(responseListener, headers);
}
};
}
/**
* Generate a JWT-specific service URI. The URI is simply an identifier with enough
* information for a service to know that the JWT was intended for it. The URI will
* commonly be verified with a simple string equality check.
*/
private URI serviceUri(Channel channel, MethodDescriptor<?, ?> method)
throws StatusException {
String authority = channel.authority();
if (authority == null) {
throw Status.UNAUTHENTICATED
.withDescription("Channel has no authority")
.asException();
}
// Always use HTTPS, by definition.
final String scheme = "https";
final int defaultPort = 443;
String path = "/" + MethodDescriptor.extractFullServiceName(method.getFullMethodName());
URI uri;
try {
uri = new URI(scheme, authority, path, null, null);
} catch (URISyntaxException e) {
throw Status.UNAUTHENTICATED
.withDescription("Unable to construct service URI for auth")
.withCause(e).asException();
}
// The default port must not be present. Alternative ports should be present.
if (uri.getPort() == defaultPort) {
uri = removePort(uri);
}
return uri;
}
private URI removePort(URI uri) throws StatusException {
try {
return new URI(uri.getScheme(), uri.getUserInfo(), uri.getHost(), -1 /* port */,
uri.getPath(), uri.getQuery(), uri.getFragment());
} catch (URISyntaxException e) {
throw Status.UNAUTHENTICATED
.withDescription("Unable to construct service URI after removing port")
.withCause(e).asException();
}
}
private Map<String, List<String>> getRequestMetadata(URI uri) throws StatusException {
try {
return mCredentials.getRequestMetadata(uri);
} catch (IOException e) {
throw Status.UNAUTHENTICATED.withCause(e).asException();
}
}
}
private class SpeechBinder extends Binder {
SpeechService getService() {
return SpeechService.this;
}
}
private class CreateApiSingle implements SingleOnSubscribe<SpeechGrpc.SpeechStub> {
@Override
public void subscribe(SingleEmitter<SpeechGrpc.SpeechStub> emitter) throws Exception {
final AccessToken accessToken = generateCredentials();
final SpeechGrpc.SpeechStub api = generateApi(accessToken);
emitter.onSuccess(api);
}
private AccessToken generateCredentials() throws IOException {
final SharedPreferences prefs =
getSharedPreferences(PREFS, Context.MODE_PRIVATE);
String tokenValue = prefs.getString(PREF_ACCESS_TOKEN_VALUE, null);
long expirationTime = prefs.getLong(PREF_ACCESS_TOKEN_EXPIRATION_TIME, -1);
// Check if the current token is still valid for a while
if (tokenValue != null && expirationTime > 0) {
if (expirationTime
> System.currentTimeMillis() + ACCESS_TOKEN_EXPIRATION_TOLERANCE) {
return new AccessToken(tokenValue, new Date(expirationTime));
}
}
// ***** WARNING *****
// In this sample, we load the credential from a JSON file stored in a raw resource
// folder of this client app. You should never do this in your app. Instead, store
// the file in your server and obtain an access token from there.
// *******************
final InputStream stream = getResources().openRawResource(R.raw.credential);
final GoogleCredentials credentials = GoogleCredentials.fromStream(stream)
.createScoped(SCOPE);
final AccessToken token = credentials.refreshAccessToken();
prefs.edit()
.putString(PREF_ACCESS_TOKEN_VALUE, token.getTokenValue())
.putLong(PREF_ACCESS_TOKEN_EXPIRATION_TIME,
token.getExpirationTime().getTime())
.apply();
stream.close();
return token;
}
private SpeechGrpc.SpeechStub generateApi(AccessToken accessToken) {
final ManagedChannel channel = new OkHttpChannelProvider()
.builderForAddress(HOSTNAME, PORT)
.nameResolverFactory(new DnsNameResolverProvider())
.intercept(new GoogleCredentialsInterceptor(new GoogleCredentials(accessToken)
.createScoped(SCOPE)))
.build();
return SpeechGrpc.newStub(channel);
}
}
private class AccessTokenTask extends AsyncTask<Void, Void, AccessToken> {
@Override
protected AccessToken doInBackground(Void... voids) {
final SharedPreferences prefs =
getSharedPreferences(PREFS, Context.MODE_PRIVATE);
String tokenValue = prefs.getString(PREF_ACCESS_TOKEN_VALUE, null);
long expirationTime = prefs.getLong(PREF_ACCESS_TOKEN_EXPIRATION_TIME, -1);
// Check if the current token is still valid for a while
if (tokenValue != null && expirationTime > 0) {
if (expirationTime
> System.currentTimeMillis() + ACCESS_TOKEN_EXPIRATION_TOLERANCE) {
return new AccessToken(tokenValue, new Date(expirationTime));
}
}
// ***** WARNING *****
// In this sample, we load the credential from a JSON file stored in a raw resource
// folder of this client app. You should never do this in your app. Instead, store
// the file in your server and obtain an access token from there.
// *******************
final InputStream stream = getResources().openRawResource(R.raw.credential);
try {
final GoogleCredentials credentials = GoogleCredentials.fromStream(stream)
.createScoped(SCOPE);
final AccessToken token = credentials.refreshAccessToken();
prefs.edit()
.putString(PREF_ACCESS_TOKEN_VALUE, token.getTokenValue())
.putLong(PREF_ACCESS_TOKEN_EXPIRATION_TIME,
token.getExpirationTime().getTime())
.apply();
return token;
} catch (IOException e) {
Log.e(TAG, "Failed to obtain access token.", e);
}
return null;
}
@Override
protected void onPostExecute(AccessToken accessToken) {
mAccessTokenTask = null;
final ManagedChannel channel = new OkHttpChannelProvider()
.builderForAddress(HOSTNAME, PORT)
.nameResolverFactory(new DnsNameResolverProvider())
.intercept(new GoogleCredentialsInterceptor(new GoogleCredentials(accessToken)
.createScoped(SCOPE)))
.build();
mApi = SpeechGrpc.newStub(channel);
// Schedule access token refresh before it expires
if (mHandler != null) {
mHandler.postDelayed(mFetchAccessTokenRunnable,
Math.max(accessToken.getExpirationTime().getTime()
- System.currentTimeMillis()
- ACCESS_TOKEN_FETCH_MARGIN, ACCESS_TOKEN_EXPIRATION_TOLERANCE));
}
}
}}
public class VoiceRecorder {
private static final int[] SAMPLE_RATE_CANDIDATES = new int[]{48000, 44100};
private static final int CHANNEL = AudioFormat.CHANNEL_IN_MONO;
private static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT;
private static final int AMPLITUDE_THRESHOLD = 1500;
private static final int SPEECH_TIMEOUT_MILLIS = 2000;
private static final int MAX_SPEECH_LENGTH_MILLIS = 30 * 1000;
public static abstract class Callback {
/**
* Called when the recorder starts hearing voice.
*/
public void onVoiceStart() {
}
/**
* Called when the recorder is hearing voice.
*
* @param data The audio data in {@link AudioFormat#ENCODING_PCM_16BIT}.
* @param size The size of the actual data in {@code data}.
*/
public void onVoice(byte[] data, int size) {
}
/**
* Called when the recorder stops hearing voice.
*/
public void onVoiceEnd() {
}
}
private final Callback mCallback;
private AudioRecord mAudioRecord;
private Thread mThread;
private byte[] mBuffer;
private final Object mLock = new Object();
/** The timestamp of the last time that voice is heard. */
private long mLastVoiceHeardMillis = Long.MAX_VALUE;
/** The timestamp when the current voice is started. */
private long mVoiceStartedMillis;
public VoiceRecorder(@NonNull Callback callback) {
mCallback = callback;
}
/**
* Starts recording audio.
*
* <p>The caller is responsible for calling {@link #stop()} later.</p>
*/
public void start() {
// Stop recording if it is currently ongoing.
stop();
// Try to create a new recording session.
mAudioRecord = createAudioRecord();
if (mAudioRecord == null) {
throw new RuntimeException("Cannot instantiate VoiceRecorder");
}
// Start recording.
mAudioRecord.startRecording();
// Start processing the captured audio.
mThread = new Thread(new ProcessVoice());
mThread.start();
}
/**
* Stops recording audio.
*/
public void stop() {
synchronized (mLock) {
System.out.println("stop audio record....");
dismiss();
if (mThread != null) {
mThread.interrupt();
mThread = null;
}
if (mAudioRecord != null) {
mAudioRecord.stop();
mAudioRecord.release();
mAudioRecord = null;
}
mBuffer = null;
System.out.println("stop audio record....2");
}
}
/**
* Dismisses the currently ongoing utterance.
*/
public void dismiss() {
if (mLastVoiceHeardMillis != Long.MAX_VALUE) {
mLastVoiceHeardMillis = Long.MAX_VALUE;
mCallback.onVoiceEnd();
}
}
/**
* Retrieves the sample rate currently used to record audio.
*
* @return The sample rate of recorded audio.
*/
public int getSampleRate() {
if (mAudioRecord != null) {
return mAudioRecord.getSampleRate();
}
return 0;
}
/**
* Creates a new {@link AudioRecord}.
*
* @return A newly created {@link AudioRecord}, or null if it cannot be created (missing
* permissions?).
*/
private AudioRecord createAudioRecord() {
for (int sampleRate : SAMPLE_RATE_CANDIDATES) {
final int sizeInBytes = AudioRecord.getMinBufferSize(sampleRate, CHANNEL, ENCODING);
if (sizeInBytes == AudioRecord.ERROR_BAD_VALUE) {
continue;
}
final AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
sampleRate, CHANNEL, ENCODING, sizeInBytes);
if (audioRecord.getState() == AudioRecord.STATE_INITIALIZED) {
mBuffer = new byte[sizeInBytes];
return audioRecord;
} else {
audioRecord.release();
}
}
return null;
}
/**
* Continuously processes the captured audio and notifies {@link #mCallback} of corresponding
* events.
*/
private class ProcessVoice implements Runnable {
@Override
public void run() {
while (true) {
synchronized (mLock) {
if (Thread.currentThread().isInterrupted()) {
break;
}
final int size = mAudioRecord.read(mBuffer, 0, mBuffer.length);
final long now = System.currentTimeMillis();
if (isHearingVoice(mBuffer, size)) {
if (mLastVoiceHeardMillis == Long.MAX_VALUE) {
mVoiceStartedMillis = now;
mCallback.onVoiceStart();
}
mCallback.onVoice(mBuffer, size);
mLastVoiceHeardMillis = now;
if (now - mVoiceStartedMillis > MAX_SPEECH_LENGTH_MILLIS) {
end();
}
} else if (mLastVoiceHeardMillis != Long.MAX_VALUE) {
mCallback.onVoice(mBuffer, size);
if (now - mLastVoiceHeardMillis > SPEECH_TIMEOUT_MILLIS) {
end();
}
}
}
}
}
private void end() {
mLastVoiceHeardMillis = Long.MAX_VALUE;
mCallback.onVoiceEnd();
System.out.println("end...");
}
private boolean isHearingVoice(byte[] buffer, int size) {
for (int i = 0; i < size - 1; i += 2) {
// The buffer has LINEAR16 in little endian.
int s = buffer[i + 1];
if (s < 0) s *= -1;
s <<= 8;
s += Math.abs(buffer[i]);
if (s > AMPLITUDE_THRESHOLD) {
return true;
}
}
return false;
}
}}
Then I implemented the SpeechService & VoiceRecorder callbacks as follows:
private VoiceRecorder voiceRecorder;
private final SpeechService.Listener speechServiceListener = new SpeechService.Listener() {
@Override
public void onSpeechRecognized(final String text, final boolean isFinal) {
if (isFinal) {
System.out.println("ui thread...");
if (!TextUtils.isEmpty(text)) {
runOnUiThread(() -> {
showMessage(text);
flingAnswer(text);
});
}
}
}
@Override
public void onErrorRecognizing() {
showMessage("Please try again. Could not detect.");
}
@Override
public void onRandomStupidity() {
}
};
private SpeechService speechService;
private final VoiceRecorder.Callback voiceCallback = new VoiceRecorder.Callback() {
@Override
public void onVoiceStart() {
if (speechService != null) {
System.out.println("voice start....");
speechService.startRecognizing(voiceRecorder.getSampleRate());
}
}
@Override
public void onVoice(byte[] data, int size) {
if (speechService != null) {
speechService.recognize(data, size);
}
}
@Override
public void onVoiceEnd() {
if (speechService != null) {
speechService.finishRecognizing();
}
}
};
private final ServiceConnection serviceConnection = new ServiceConnection() {
@Override
public void onServiceConnected(ComponentName componentName, IBinder binder) {
speechService = SpeechService.from(binder);
speechService.addListener(speechServiceListener);
}
@Override
public void onServiceDisconnected(ComponentName componentName) {
speechService = null;
}
};
For voice input this is the code:
@Override
public void stopRecognizing() {
stopVoiceRecorder();
Log.e("Recording", "Stopped");
}
@Override
public void startRecognizing() {
if (permissionManager != null && permissionManager.askForPermissions()) {
startVoiceRecorder();
vibrate.vibrate(50);//Providing haptic feedback to user on press.
}
Log.e("Recording", "Started");
}
binding.imgVoice.setOnTouchListener((v, event) -> {
switch (event.getAction()) {
case MotionEvent.ACTION_UP:
System.out.println("up...");
mCallback.stopRecognizing();
binding.imgVoice
.animate()
.scaleX(1.0f)
.scaleY(1.0f);
binding.imgVoice.setVisibility(View.GONE);
binding.progressBar.setVisibility(View.VISIBLE);
break;
case MotionEvent.ACTION_DOWN:
System.out.println("down...");
binding.imgVoice
.animate()
.scaleX(1.8f)
.scaleY(1.8f);
mCallback.startRecognizing();
break;
}
return true;
});
}
When I press the mic, the event registers as ACTION_DOWN and I start the voice recorder; on releasing the mic, the voice recorder is stopped. Also, with ACTION_DOWN I scale up the mic icon, which needs to be scaled back down on ACTION_UP. But the UI freezes as a whole most of the time. I find that the onNext() callback for the StreamObserver is continuously being invoked before isFinal becomes true.
private void startVoiceRecorder() {
if (voiceRecorder != null) {
voiceRecorder.stop();
}
voiceRecorder = new VoiceRecorder(voiceCallback);
voiceRecorder.start();
}
private void stopVoiceRecorder() {
if (voiceRecorder != null) {
voiceRecorder.stop();
voiceRecorder = null;
}
}
But I want the mic to scale down as soon as I release it (on the ACTION_UP event), which is not happening.
Can anyone help me with this?
Thanks in advance.
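One thing stands out in the VoiceRecorder code you posted: stop() synchronizes on mLock, and the ProcessVoice loop holds that same lock around the blocking AudioRecord.read() call, so calling stopVoiceRecorder() directly from the ACTION_UP branch can stall the main thread until the current read returns, which would explain both the freeze and the delayed scale-down. A sketch of one way around it, moving the stop onto a worker thread so the animation runs immediately (the executor and the onMicReleased() helper are my names, not from the sample):
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

private final ExecutorService recorderExecutor = Executors.newSingleThreadExecutor();

// Call this from the ACTION_UP branch instead of invoking stopRecognizing() inline.
private void onMicReleased() {
    // Animate first, on the main thread, so the icon shrinks right away.
    binding.imgVoice.animate().scaleX(1.0f).scaleY(1.0f);
    binding.imgVoice.setVisibility(View.GONE);
    binding.progressBar.setVisibility(View.VISIBLE);
    // Stop the recorder off the main thread; VoiceRecorder.stop() can block
    // on mLock while ProcessVoice sits in AudioRecord.read().
    recorderExecutor.execute(() -> mCallback.stopRecognizing());
}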
My current Android application is using LiveData&lt;PagedList&lt;ModelUI&gt;&gt; with a Realm local database.
I'm basing my approach on this @EpicPanda article.
I'm employing the following Architecture Components:
api "android.arch.lifecycle:extensions:1.1.1"
api "android.arch.paging:runtime:1.0.1"
My datasource classes resemble this:-
public abstract class BaseDataSource<T> extends PositionalDataSource<ModelDO> {
@WorkerThread
public abstract int countItems();
@WorkerThread
public abstract List<ModelDO> loadRange(final int startPosition, final int count);
@Override
public final void loadInitial(@NonNull final LoadInitialParams params, @NonNull final LoadInitialCallback<ModelDO> callback) {
final int totalCount = countItems();
if (totalCount == 0) {
callback.onResult(Collections.emptyList(), 0, 0);
return;
}
final int firstLoadPosition = computeInitialLoadPosition(params, totalCount);
final int firstLoadSize = computeInitialLoadSize(params, firstLoadPosition, totalCount);
final List<ModelDO> list = loadRange(firstLoadPosition, firstLoadSize);
if (list != null && list.size() == firstLoadSize) {
callback.onResult(list, firstLoadPosition, totalCount);
} else {
invalidate();
}
}
@Override
public final void loadRange(@NonNull final LoadRangeParams params, @NonNull final LoadRangeCallback<ModelDO> callback) {
final List<ModelDO> list = loadRange(params.startPosition, params.loadSize);
if (list == null) {
invalidate();
} else {
callback.onResult(list);
}
}
}
and this:-
public class ModelDataSource<T> extends BaseDataSource<ModelDO> {
private static final String TAG = "ModelDataSource";
private final Realm workerRealm;
private final RealmResults<ModelDO> liveResults;
private final RealmChangeListener<RealmResults<ModelDO>> realmChangeListener = results -> {
if (results.isLoaded()) {
Log.i(TAG, "REALM DATA CHANGE DETECTED");
invalidate();
}
};
// WORKER THREAD
ModelDataSource() {
Log.i(TAG, "CREATED" + this);
this.workerRealm = DatabaseController.getRealm();
this.liveResults = DatabaseController.fetchShortListedResults(this.workerRealm);
if (!liveResults.isLoaded()) {
liveResults.load();
}
this.liveResults.addChangeListener(realmChangeListener);
}
@WorkerThread
public int countItems() {
Log.i(TAG, "COUNTING ITEMS" + this);
if (workerRealm.isClosed() || !liveResults.isValid()) {
Log.i("REALM TILED DATA SOURCE", "RESULTS ARE NOT VALID, OR REALM IS CLOSED.");
return 0;
}
Log.i(TAG, "ITEM SIZE [" + liveResults.size() + "]");
return liveResults.size();
}
@Override
public boolean isInvalid() {
Log.i(TAG, "REFRESHING REALM." + this);
workerRealm.refresh();
return super.isInvalid();
}
@WorkerThread
@Override
public List<ModelDO> loadRange(int startPosition, int count) {
Log.i(TAG, "LOAD: " + startPosition + " , " + count + " " + this);
int countItems = countItems();
if (countItems == 0) {
return Collections.emptyList();
}
final List<ModelDO> list = new ArrayList<>(count);
for (int i = startPosition; i < startPosition + count && i < countItems; i++) {
list.add(workerRealm.copyFromRealm(liveResults.get(i)));
}
return Collections.unmodifiableList(list);
}
}
My data factory class resembles:-
public class ModelDataSourceFactory extends DataSource.Factory<Integer, ModelUI> {
private static final String TAG = "ModelDataSourceFactory";
@Override
public DataSource<Integer, ModelUI> create() {
Log.d(TAG, "create() called");
// create the DataSource itself (not another factory) and map its DO pages to UI models
return new ModelDataSource().mapByPage(new Function<List<ModelDO>, List<ModelUI>>() {
@Override
public List<ModelUI> apply(final List<ModelDO> input) {
Log.d(TAG, "apply() called with: input = [" + input.size() + "]");
final List<ModelUI> output = new ArrayList<>();
for (final ModelDO articleSourceDO : input) {
final ModelUI articleSourceUI = constructModelUI(articleSourceDO);
output.add(articleSourceUI);
}
return output;
}
});
}
}
My ViewModel repository is a singleton; I configure my PagedList as shown here:
realmPaginationManager = new PagingManager();
realmPaginationManager.open();
pagedListConfig = (new PagedList.Config.Builder())
.setEnablePlaceholders(true)
.setPrefetchDistance(17)
.setPageSize(20)
.build();
MY_PAGED_MODELS = new LivePagedListBuilder(new ModelDataSourceFactory(), pagedListConfig)
.setFetchExecutor(realmPaginationManager.getFetchExecutor())
.setInitialLoadKey(0)
.build();
When the list is first displayed, it loads 60 items starting from 0 by calling the DataSource's public final void loadInitial(@NonNull final LoadInitialParams params, @NonNull final LoadInitialCallback<ModelDO> callback) {} method.
The issue I have is that my list must support swipe to dismiss.
When I swipe to dismiss, I delete the "swiped" record from the Realm database, and my RealmChangeListener invalidates my DataSource, as shown here:
private final RealmChangeListener<RealmResults<ModelDO>> realmChangeListener = results -> {
if (results.isLoaded()) {
Log.i(TAG, "REALM DATA CHANGE DETECTED");
invalidate();
}
};
What I would expect to happen is that invalidating the DataSource within my RealmChangeListener would re-trigger the loadInitial() method, and I would now see a new list displayed with the swiped item missing.
What actually happens is the following:
a). loadInitial() is called and loads firstLoadPosition = 0, firstLoadSize = 60.
b). loadRange() is called and loads startPosition = 60, count = 20
c). loadRange() is called and loads startPosition = 80, count = 20
I now have to page backwards through the list to where I swiped to dismiss to see that the deleted item has been removed.
Why doesn't the DataSource simply call loadInitial() and redisplay the top of the list?
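As your trace (a) shows, invalidation does re-trigger loadInitial() on a freshly created DataSource; that part works. What it does not do is reset the UI: the new PagedList is diffed against the old one by the adapter, and the RecyclerView keeps its scroll position, so the visible window stays where the user was and the prefetcher immediately issues the loadRange() calls you see in (b) and (c). If you want the list to jump back to the top after a swipe-to-dismiss, one option is to scroll explicitly when the new list is delivered; a sketch, assuming you submit to a PagedListAdapter (adapter and recyclerView are my placeholder names):
// Wherever you observe the LiveData<PagedList<ModelUI>> built above:
MY_PAGED_MODELS.observe(this, pagedList -> {
    adapter.submitList(pagedList);
    // With placeholders enabled the full item count is known, so jumping
    // to the top makes the library load pages near position 0 again.
    recyclerView.scrollToPosition(0);
});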
I'm working with Bluetooth LE devices, and I was thinking about my current approach and best practices. Currently I have an activity which handles the connection and the GattCallbacks, and I decided to restructure the code for a better overview and maintainability, because it's quite messy at the moment.
I found the BleManager from NordicSemiconductor https://github.com/NordicSemiconductor/Android-BLE-Library/
It's an abstraction of the basic steps for connecting to a BLE device; it handles the GattCallbacks and provides an appropriate interface for use from a service or a ViewModel.
I'd like to use the ViewModel approach, but I'm not so familiar with the MVC, MVP, and MVVM patterns, and there are some questions that I still can't answer.
This class extends the BleManager (BlinkyManager.java).
It shows how to make use of the BleManager, so I adopted the class and called it ECountBleManager.
EDIT:
For the last 6 days I have been doing research, especially on the MVVM pattern and the Architecture Components. Unfortunately, there are still a lot of questions that I can't answer myself. But I really want to get better, so I made a draft of my current concept. I hope you can help me answer my questions and improve my project.
I'm especially interested in best practices.
Here is my draft:
And here are my class implementations:
ECountActivity.java
public class ECountActivity extends AppCompatActivity {
private ECountViewModel viewModel;
@Override
protected void onCreate(final Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.detail_view);
// hide magnifier icon
GifImageView customLoader = findViewById(R.id.progressBar);
customLoader.setVisibility(View.GONE);
// Get additional data from previous activity
final BluetoothDevice device = getIntent().getParcelableExtra("device");
initViewModel();
viewModel.connect(device);
}
private void initViewModel() {
viewModel = ViewModelProviders.of(this).get(ECountViewModel.class);
subscribeDataStreams(viewModel);
}
private void subscribeDataStreams(ECountViewModel viewModel) {
viewModel.isDeviceReady().observe(this, deviceReady -> openOptionsFragment());
viewModel.isConnected().observe(this, status -> {
// Todo: ...
});
}
private void openOptionsFragment() {
// load options fragment
FragmentTransaction ft = getSupportFragmentManager().beginTransaction();
ft.replace(R.id.contentFragment, new OptionsFragment());
ft.commitNow();
}
}
OtaFragment.java
public class OtaFragment extends Fragment implements FolderChooserDialog.FolderCallback,
FileChooserDialog.FileCallback {
private Button partialOtaButton;
private Button fullOtaButton;
private Button submitButton;
private SeekBar mtuSeekBar;
private EditText mtuInput;
private LinearLayout stacklayout;
private Button browseAppButton;
private TextView folderPathText;
private TextView appFileNameText;
private MaterialDialog otaPrepareDialog;
private MaterialDialog otaProgressDialog;
private ECountViewModel viewModel;
private OtaViewModel otaViewModel;
@Override
public void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
initViewModel();
}
@Override
public View onCreateView(@NonNull LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
// inflate the layout for this fragment
View view = inflater.inflate(R.layout.ota_fragment, container, false);
initViews(view);
return view;
}
@Override
public void onFolderSelection(@NonNull FolderChooserDialog dialog, @NonNull File folder) {
final String otaFolderPath = folder.toString();
otaViewModel.setOtaFolderPath(otaFolderPath);
folderPathText.setText(otaFolderPath.substring(otaFolderPath.lastIndexOf("/")));
// enable app browse
browseAppButton.setClickable(true);
browseAppButton.setEnabled(true);
}
@Override
public void onFolderChooserDismissed(@NonNull FolderChooserDialog dialog) {}
@Override
public void onFileSelection(@NonNull FileChooserDialog dialog, @NonNull File file) {
final String otaAppFilePath = file.toString();
otaViewModel.setOtaAppFilePath(otaAppFilePath);
appFileNameText.setText(otaAppFilePath.substring(otaAppFilePath.lastIndexOf("/")));
// enable submitButton button
submitButton.setClickable(true);
submitButton.setEnabled(true);
}
@Override
public void onFileChooserDismissed(@NonNull FileChooserDialog dialog) {}
private void subscribeDataStreams(ECountViewModel viewModel) {
viewModel.isOtaMode().observe(this, otaMode -> {
otaPrepareDialog.dismiss();
initOtaProgressDialog();
otaProgressDialog.show();
// Todo: how can i get mtu?
viewModel.requestMtu(512);
});
}
private void initViewModel() {
viewModel = ViewModelProviders.of(getActivity()).get(ECountViewModel.class);
otaViewModel = ViewModelProviders.of(getActivity()).get(OtaViewModel.class);
subscribeDataStreams(viewModel);
}
private void initViews(View view) {
// get resources
final Button browseFolderButton = view.findViewById(R.id.browseFolder);
final Button cancelButton = view.findViewById(R.id.ota_cancel);
final SeekBar prioritySeekBar = view.findViewById(R.id.connection_seekBar);
partialOtaButton = view.findViewById(R.id.radio_ota);
fullOtaButton = view.findViewById(R.id.radio_ota_full);
browseAppButton = view.findViewById(R.id.browseApp);
folderPathText = view.findViewById(R.id.folderPathText);
appFileNameText = view.findViewById(R.id.appFileNameText);
stacklayout = view.findViewById(R.id.stacklayout);
submitButton = view.findViewById(R.id.ota_proceed);
mtuSeekBar = view.findViewById(R.id.mtu_seekBar);
mtuInput = view.findViewById(R.id.mtu_value);
// set initial states
mtuSeekBar.setMax(512-23);
mtuSeekBar.setProgress(244);
prioritySeekBar.setMax(2);
prioritySeekBar.setProgress(1);
browseAppButton.setClickable(false);
browseAppButton.setEnabled(false);
submitButton.setClickable(false);
submitButton.setEnabled(false);
mtuInput.setOnEditorActionListener((v, actionId, event) -> {
final Editable mtuText = mtuInput.getText();
if (mtuText != null) {
int mtu = Integer.valueOf(mtuText.toString());
if (mtu < 23) mtu = 23;
if (mtu > 512) mtu = 512;
mtuSeekBar.setProgress(mtu);
viewModel.setMtu(mtu);
}
return false;
});
mtuSeekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
@Override
public void onStartTrackingTouch(SeekBar seekBar) {}
@Override
public void onStopTrackingTouch(SeekBar seekBar) {}
@Override
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
mtuInput.setText(String.valueOf(progress));
viewModel.setMtu(progress);
}
});
prioritySeekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener(){
@Override
public void onStartTrackingTouch(SeekBar seekBar) {}
@Override
public void onStopTrackingTouch(SeekBar seekBar) {}
@Override
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
viewModel.setPriority(progress);
}
});
browseFolderButton.setOnClickListener(v -> new FolderChooserDialog.Builder(getActivity())
.chooseButton(R.string.positiveTextChoose)
.tag("#folder")
.show(getChildFragmentManager()));
browseAppButton.setOnClickListener(v -> new FileChooserDialog.Builder(getActivity())
.initialPath(otaViewModel.getOtaFolderPath())
.extensionsFilter(".ebl")
.tag("#app")
.show(getChildFragmentManager()));
cancelButton.setOnClickListener(v -> Log.i("ota", "cancelButton"));
submitButton.setOnClickListener(v -> {
// disable OTA submitButton button
submitButton.setClickable(false);
submitButton.setEnabled(false);
// init OTA process
viewModel.initOtaMode();
// show OTA preparing dialog
otaPrepareDialog.show();
});
fullOtaButton.setOnClickListener(v -> {
stacklayout.setVisibility(View.VISIBLE);
partialOtaButton.setBackgroundColor(getResources().getColor(R.color.colorPrimary));
fullOtaButton.setBackgroundColor(getResources().getColor(R.color.colorPrimaryDark));
});
partialOtaButton.setOnClickListener(v -> {
stacklayout.setVisibility(View.GONE);
partialOtaButton.setBackgroundColor(getResources().getColor(R.color.colorPrimaryDark));
fullOtaButton.setBackgroundColor(getResources().getColor(R.color.colorPrimary));
});
otaPrepareDialog = new MaterialDialog.Builder(getActivity())
.title(R.string.otaDialogHeaderText)
.content(R.string.waiting)
.progress(true, 0)
.progressIndeterminateStyle(true)
.build();
otaProgressDialog = new MaterialDialog.Builder(getActivity())
.title("test")
.customView(R.layout.ota_progress2, false)
.build();
}
private void initOtaProgressDialog() {
// Todo: ...
}
}
ECountViewModel.java
public class ECountViewModel extends AndroidViewModel implements ECountBleManagerCallbacks {
private final ECountBleManager eCountBleManager;
// Connection states Connecting, Connected, Disconnecting, Disconnected etc.
private final MutableLiveData<String> connectionState = new MutableLiveData<>();
// Flag to determine if the device is connected
private final MutableLiveData<Boolean> isConnected = new MutableLiveData<>();
// Flag to determine if the device is ready
private final MutableLiveData<Void> onDeviceReady = new MutableLiveData<>();
// Flag to determine if the device is in OTA mode
private final MutableLiveData<Void> onOtaMode = new MutableLiveData<>();
public LiveData<Void> isDeviceReady() {
return onDeviceReady;
}
public LiveData<Void> isOtaMode() {
return onOtaMode;
}
public LiveData<String> getConnectionState() {
return connectionState;
}
public LiveData<Boolean> isConnected() {
return isConnected;
}
public ECountViewModel(@NonNull final Application application) {
super(application);
// Initialize the manager
eCountBleManager = new ECountBleManager(getApplication());
eCountBleManager.setGattCallbacks(this);
}
/**
* Connect to peripheral
*/
public void connect(final BluetoothDevice device) {
eCountBleManager.connect(device);
}
/**
* Disconnect from peripheral
*/
private void disconnect() {
eCountBleManager.disconnect();
}
@Override
protected void onCleared() {
super.onCleared();
if (eCountBleManager.isConnected()) {
disconnect();
}
}
@Override
public void onDeviceConnecting(BluetoothDevice device) {
}
@Override
public void onDeviceConnected(BluetoothDevice device) {
isConnected.postValue(true);
}
@Override
public void onDeviceDisconnecting(BluetoothDevice device) {
isConnected.postValue(false);
}
@Override
public void onDeviceDisconnected(BluetoothDevice device) {
isConnected.postValue(false);
}
@Override
public void onLinklossOccur(BluetoothDevice device) {
isConnected.postValue(false);
}
@Override
public void onServicesDiscovered(BluetoothDevice device, boolean optionalServicesFound) {
}
@Override
public void onDeviceReady(BluetoothDevice device) {
onDeviceReady.postValue(null);
}
@Override
public void onOptionalServiceSupported(BluetoothDevice device) {
onOtaMode.postValue(null);
}
@Override
public void onBondingRequired(BluetoothDevice device) {
}
@Override
public void onBonded(BluetoothDevice device) {
}
@Override
public void onError(BluetoothDevice device, String message, int errorCode) {
}
@Override
public void onDeviceNotSupported(BluetoothDevice device) {
disconnect();
}
// delegate call from options fragment to ECountBleManager
public String getDeviceId() {
return BinaryUtils.byteArrayToHexString(eCountBleManager.getDeviceId());
}
// delegate call from ota fragment to ECountBleManager
public void setMtu(final int value) {
eCountBleManager.setMtu(value);
}
public void setPriority(final int value) {
eCountBleManager.setPriority(value);
}
}
ECountBleManager.java
public class ECountBleManager extends BleManager<BleManagerCallbacks> {
private static final String TAG = ECountBleManager.class.getSimpleName();
private final Handler handler;
private BluetoothGattCharacteristic authCharacteristic;
private BluetoothGattCharacteristic deviceIdCharacteristic;
private BluetoothGattCharacteristic deviceVersionCharacteristic;
private BluetoothGattCharacteristic configIdCharacteristic;
private BluetoothGattCharacteristic configTransmissionIntervalCharacteristic;
private BluetoothGattCharacteristic configKeepAliveIntervalCharacteristic;
private BluetoothGattCharacteristic configRadioModeCharacteristic;
private BluetoothGattCharacteristic configGpsCharacteristic;
private BluetoothGattCharacteristic configRadarCharacteristic;
private BluetoothGattCharacteristic configOperationModeCharacteristic;
private BluetoothGattCharacteristic configLoRaAppEuiCharacteristic;
private BluetoothGattCharacteristic configLoRaAppKeyCharacteristic;
private BluetoothGattCharacteristic configLoRaDeviceEuiCharacteristic;
private BluetoothGattCharacteristic operationCmdCharacteristic;
private BluetoothGattCharacteristic nemeusStatusCharacteristic;
private BluetoothGattCharacteristic gmrStatusCharacteristic;
private BluetoothGattCharacteristic radarStatusCharacteristic;
private BluetoothGattCharacteristic otaControlCharacteristic;
private BluetoothGattCharacteristic otaDataCharacteristic;
private byte[] configTransmissionInterval;
private byte[] configKeepAliveInterval;
private byte[] configRadioMode;
private byte[] configOperationMode;
private byte[] configId;
private byte[] deviceId;
private byte[] deviceVersion;
private byte[] configGps;
private byte[] configRadar;
private byte[] configLoRaAppEui;
private byte[] configLoRaAppKey;
private byte[] configLoRaDeviceEui;
private byte[] operationCmd;
private byte[] nemeusStatus;
private byte[] gmrStatus;
private byte[] radarStatus;
// OTA flags
private boolean isOtaProcessing = false;
private boolean isReconnectRequired = false;
private MutableLiveData<Boolean> isOtaMode = new MutableLiveData<>();
// OTA variables
private int mtu = 512;
private int priority = BluetoothGatt.CONNECTION_PRIORITY_HIGH;
private byte[] otaAppFileStream;
////////////////////////////
public ECountBleManager(Context context) {
super(context);
handler = new Handler();
}
@Override
protected BleManagerGattCallback getGattCallback() {
return gattCallback;
}
@Override
protected boolean shouldAutoConnect() {
return true;
}
/**
* BluetoothGatt callbacks for connection/disconnection, service discovery, receiving indication, etc
*/
private final BleManagerGattCallback gattCallback = new BleManagerGattCallback() {
@Override
protected void onDeviceReady() {
super.onDeviceReady();
}
@Override
protected void onOptionalServiceSupported() {
super.onOptionalServiceSupported();
isOtaMode.postValue(true);
}
@Override
protected boolean isOptionalServiceSupported(BluetoothGatt gatt) {
final BluetoothGattService otaService = gatt.getService(DC_UUID.otaService);
otaDataCharacteristic = otaService.getCharacteristic(DC_UUID.otaData);
return otaDataCharacteristic != null;
}
@Override
protected boolean isRequiredServiceSupported(BluetoothGatt gatt) {
final BluetoothGattService dcService = gatt.getService(DC_UUID.dcService);
final BluetoothGattService otaService = gatt.getService(DC_UUID.otaService);
if (dcService == null || otaService == null) return false;
authCharacteristic = dcService.getCharacteristic(DC_UUID.authentication);
deviceIdCharacteristic = dcService.getCharacteristic(DC_UUID.deviceId);
deviceVersionCharacteristic = dcService.getCharacteristic(DC_UUID.deviceVersion);
configIdCharacteristic = dcService.getCharacteristic(DC_UUID.configId);
configTransmissionIntervalCharacteristic = dcService.getCharacteristic(DC_UUID.configTransmissionInterval);
configKeepAliveIntervalCharacteristic = dcService.getCharacteristic(DC_UUID.configKeepAliveInterval);
configRadioModeCharacteristic = dcService.getCharacteristic(DC_UUID.configRadioMode);
configGpsCharacteristic = dcService.getCharacteristic(DC_UUID.configGps);
configRadarCharacteristic = dcService.getCharacteristic(DC_UUID.configRadar);
configOperationModeCharacteristic = dcService.getCharacteristic(DC_UUID.configOperationMode);
configLoRaAppEuiCharacteristic = dcService.getCharacteristic(DC_UUID.configLoRaAppEui);
configLoRaAppKeyCharacteristic = dcService.getCharacteristic(DC_UUID.configLoRaAppKey);
configLoRaDeviceEuiCharacteristic = dcService.getCharacteristic(DC_UUID.configLoRaDeviceEui);
operationCmdCharacteristic = dcService.getCharacteristic(DC_UUID.operationCmd);
nemeusStatusCharacteristic = dcService.getCharacteristic(DC_UUID.nemeusStatus);
gmrStatusCharacteristic = dcService.getCharacteristic(DC_UUID.gmrStatus);
radarStatusCharacteristic = dcService.getCharacteristic(DC_UUID.radarStatus);
otaControlCharacteristic = otaService.getCharacteristic(DC_UUID.otaControl);
return authCharacteristic != null &&
deviceIdCharacteristic != null &&
deviceVersionCharacteristic != null&&
configIdCharacteristic != null &&
configTransmissionIntervalCharacteristic != null &&
configKeepAliveIntervalCharacteristic != null &&
configRadioModeCharacteristic != null &&
configGpsCharacteristic != null &&
configRadarCharacteristic != null &&
configOperationModeCharacteristic != null &&
configLoRaAppEuiCharacteristic != null &&
configLoRaAppKeyCharacteristic != null &&
configLoRaDeviceEuiCharacteristic != null &&
operationCmdCharacteristic != null &&
nemeusStatusCharacteristic != null &&
gmrStatusCharacteristic != null &&
radarStatusCharacteristic != null &&
otaControlCharacteristic != null;
}
@Override
protected Deque<Request> initGatt(BluetoothGatt gatt) {
final LinkedList<Request> requests = new LinkedList<>();
requests.push(Request.readRequest(deviceIdCharacteristic));
requests.push(Request.readRequest(deviceVersionCharacteristic));
requests.push(Request.readRequest(configIdCharacteristic));
requests.push(Request.readRequest(configTransmissionIntervalCharacteristic));
requests.push(Request.readRequest(configKeepAliveIntervalCharacteristic));
requests.push(Request.readRequest(configRadioModeCharacteristic));
requests.push(Request.readRequest(configGpsCharacteristic));
requests.push(Request.readRequest(configRadarCharacteristic));
requests.push(Request.readRequest(configOperationModeCharacteristic));
requests.push(Request.readRequest(configLoRaAppEuiCharacteristic));
requests.push(Request.readRequest(configLoRaAppKeyCharacteristic));
requests.push(Request.readRequest(operationCmdCharacteristic));
requests.push(Request.readRequest(configLoRaDeviceEuiCharacteristic));
requests.push(Request.readRequest(nemeusStatusCharacteristic));
requests.push(Request.readRequest(gmrStatusCharacteristic));
requests.push(Request.readRequest(radarStatusCharacteristic));
// write authentication key to characteristic
requests.push(Request.writeRequest(authCharacteristic));
// perform server authentication
requests.push(Request.readRequest(authCharacteristic));
return requests;
}
@Override
protected void onDeviceDisconnected() {
authCharacteristic = null;
deviceIdCharacteristic = null;
deviceVersionCharacteristic = null;
configIdCharacteristic = null;
configTransmissionIntervalCharacteristic = null;
configKeepAliveIntervalCharacteristic = null;
configRadioModeCharacteristic = null;
configGpsCharacteristic = null;
configRadarCharacteristic = null;
configOperationModeCharacteristic = null;
configLoRaAppEuiCharacteristic = null;
configLoRaAppKeyCharacteristic = null;
configLoRaDeviceEuiCharacteristic = null;
nemeusStatusCharacteristic = null;
gmrStatusCharacteristic = null;
radarStatusCharacteristic = null;
otaDataCharacteristic = null;
}
@Override
protected void onMtuChanged(int mtu) {
super.onMtuChanged(mtu);
ECountBleManager.this.mtu = mtu;
}
@Override
protected void onCharacteristicRead(BluetoothGatt gatt, BluetoothGattCharacteristic characteristic) {
super.onCharacteristicRead(gatt, characteristic);
if (characteristic.getUuid().equals(DC_UUID.authentication)) {
byte[] encryptedData;
try {
encryptedData = BinaryUtils.encryptByteArray(characteristic.getValue());
} catch (Exception e) {
e.printStackTrace();
return;
}
// store the encrypted challenge; the auth write request queued in initGatt() sends it back
characteristic.setValue(encryptedData);
} else if (characteristic.getUuid().equals(DC_UUID.deviceId)) {
deviceId = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.deviceVersion)) {
deviceVersion = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configId)) {
configId = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configTransmissionInterval)) {
configTransmissionInterval = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configKeepAliveInterval)) {
configKeepAliveInterval = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configRadioMode)) {
configRadioMode = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configGps)) {
configGps = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configRadar)) {
configRadar = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configOperationMode)) {
configOperationMode = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configLoRaAppEui)) {
configLoRaAppEui = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configLoRaAppKey)) {
configLoRaAppKey = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.configLoRaDeviceEui)) {
configLoRaDeviceEui = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.nemeusStatus)) {
nemeusStatus = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.gmrStatus)) {
gmrStatus = characteristic.getValue();
} else if (characteristic.getUuid().equals(DC_UUID.radarStatus)) {
radarStatus = characteristic.getValue();
}
}
@Override
protected void onCharacteristicWrite(BluetoothGatt gatt,
BluetoothGattCharacteristic characteristic) {
super.onCharacteristicWrite(gatt, characteristic);
if (characteristic.getUuid().equals(DC_UUID.otaControl)) {
final byte[] otaControl = characteristic.getValue();
if (otaControl.length == 1) {
// OTA client initiates the update process
if (otaControl[0] == (byte) 0x00) {
// set OTA process flag
isOtaProcessing = true;
// check whether device is in OTA mode
if (isOtaMode.getValue()) {
// request MTU size
requestMtu(mtu);
// start update process, but ensure the MTU request has had time to complete first
handler.postDelayed(() -> uploadOta(), 2000);
} else {
// reconnect to establish OTA mode
isReconnectRequired = true;
// enforces device to reconnect
gatt.disconnect();
}
}
// OTA client finishes the update process
if (otaControl[0] == (byte) 0x03) {
if (isOtaProcessing) { // if device is in OTA mode and update process was successful
isOtaProcessing = false;
disconnect();
} else { // if device is in OTA mode, but update process was not established
// enforces device to reconnect
gatt.disconnect();
}
}
}
}
}
@Override
public void onCharacteristicChanged(BluetoothGatt gatt, BluetoothGattCharacteristic characteristic) {
super.onCharacteristicChanged(gatt, characteristic);
}
};
public byte[] getDeviceId() {
return deviceId;
}
public void setDeviceId(final byte[] value) {
writeCharacteristic(deviceIdCharacteristic,
BluetoothGattCharacteristic.WRITE_TYPE_DEFAULT,
value);
}
// Todo: implement other getters and setters
// Here I have to get the otaAppFilePath which I discovered in OtaFragment
public void uploadOta(final String otaAppFilePath) {
if (otaDataCharacteristic != null) {
otaDataCharacteristic.setWriteType(BluetoothGattCharacteristic.WRITE_TYPE_NO_RESPONSE);
byte[] ebl = null;
// read the whole file; a single read() call is not guaranteed to fill the buffer
try (FileInputStream fileInputStream = new FileInputStream(otaAppFilePath)) {
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
byte[] chunk = new byte[4096];
int read;
while ((read = fileInputStream.read(chunk)) != -1) {
buffer.write(chunk, 0, read);
}
ebl = buffer.toByteArray();
} catch (Exception e) {
Logger.e(TAG, "Couldn't open file " + e);
}
otaAppFileStream = ebl;
pack = 0;
// start update process in another thread
Thread otaUploadThread = new Thread(() -> otaWriteDataReliable());
otaUploadThread.start();
}
}
private void writeCharacteristic(final BluetoothGattCharacteristic c,
final int writeType,
final byte[] value) {
if (c == null)
return;
c.setWriteType(writeType);
c.setValue(value);
writeCharacteristic(c); // will call the underlying API of BleManager
}
}
The code covers the basic use cases, but I'm still not sure how to link the individual components with each other.
While reading about MVVM I noticed that there is always more than one possible solution/approach. The following questions came up:
1. Is the ECountBleManager the right place to store the values that I get by calling characteristic.getValue(), and if so, should I also store the values that I discover in OtaFragment there (which would mean forwarding values such as mtu to the ECountBleManager)? Keep in mind that I need to access the values discovered in OtaFragment from other Fragments as well.
2. Where do I store the variables from OtaFragment? In ECountViewModel, in ECountBleManager, or do I create an OtaViewModel (and if so, how could I access the ECountBleManager instance that I already created in ECountViewModel from within the OtaViewModel)?
3. How can I access mtu, priority and otaAppFile, which were discovered in OtaFragment, from within the ECountBleManager?
4. Do I have one ViewModel for every Activity and Fragment? And if so, how do I solve the problem with the ECountBleManager instance (see question 2)?
5. How does the ECountBleManager fit into the MVVM pattern? I would guess it is part of the Model, but which part: Repository, Interactor, Controller, Mediator?
I'm sorry for the amount of code, but as you can see I'm really trying hard and I want to get better. I hope someone can help me with my questions and with improving my code. Thanks in advance!
My 5 cents about BLE and architecture:
you don't need to mix up the app architecture and the BLE layer.
just think about BLE as a data source/repository (based on Rx or Coroutines).
split your functionality into several parts: BLE scanner, BLE connector, BLE command executor. Cover these parts with a BLE facade (see the sketch below).
the BLE layer is mostly async stuff (and it's too complicated to make it 100% testable).
at a deeper level, the BLE layer is the thread-safe handling of bytes. Don't handle it on the main thread.
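To make the facade idea concrete, here is a minimal sketch; BleFacade, ConnectionListener, and the method set are illustrative assumptions, not part of any particular library:
import java.util.function.Consumer;

// illustrative facade: ViewModels depend on this, never on BluetoothGatt
public interface BleFacade {

    interface ConnectionListener {
        void onConnected();
        void onDisconnected();
    }

    // scanner/connector part
    void connect(String deviceAddress, ConnectionListener listener);
    void disconnect();

    // command-executor part: async reads/writes with result callbacks
    void readDeviceId(Consumer<byte[]> onResult);
    void writeTransmissionInterval(byte[] value, Consumer<Boolean> onSuccess);
}
Read this way, ECountBleManager becomes the implementation detail behind such a facade on the Model/repository side, and both ECountViewModel and a possible OtaViewModel would share one instance through the facade rather than creating their own.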
I'm building a plugin in Java for Unity. I set up the device camera myself in Java, which is working great. However, passing the camera preview data to Unity is proving difficult.
I have tested that everything works using the ARToolkit library, which has a function to pass camera preview data to Unity.
However, Unity itself also has such a function as part of its Android camera support, which I would rather use. This function is called
private final native void nativeVideoFrameCallback(int var1, byte[] var2, int var3, int var4);
in the UnityPlayer class, in classes.jar.
You can download the classes.jar for inspection from here: https://github.com/PlayFab/Unity3d_Login_Example_Project/blob/master/Assets/Facebook/Editor/android/android-libs/unity-classes.jar (press the 'Raw' button).
As you can see, it is declared private, so I have no way of calling it directly.
Original use by UnityPlayer
nativeVideoFrameCallback is originally called by Unity in:
public void onCameraFrame(final com.unity3d.player.a var1, final byte[] var2) {
final int var3 = var1.a();
final Size var4 = var1.b();
this.a(new UnityPlayer.c((byte)0) {
public final void a() {
UnityPlayer.this.nativeVideoFrameCallback(var3, var2, var4.width, var4.height);
var1.a(var2);
}
});
}
which is public, but takes a parameter of the non-public type com.unity3d.player.a, which I can't instantiate.
A possible solution
My solution was to declare a new native function link for nativeVideoFrameCallback in my own class, but calling it leads to a fatal exception. I do not get this exception when I don't call my own nativeVideoFrameCallback link, so Unity's own registration does succeed:
UnsatisfiedLinkError: No implementation found for ...package...UnityPlayer.nativeVideoFrameCallback(int, byte[], int, int).
My UnityPlayer class:
public class UnityPlayer extends com.unity3d.player.UnityPlayer {
private final ConcurrentLinkedQueue<Runnable> jobs = new ConcurrentLinkedQueue<Runnable>();
public UnityPlayer(ContextWrapper contextWrapper) {
super(contextWrapper);
}
public void addJob(final Camera camera, final int cam, final byte[] data, final int width, final int height) {
jobs.add(new Runnable() {
@Override
public void run() {
nativeVideoFrameCallback(cam, data, width, height);
camera.addCallbackBuffer(data);
}
});
}
private final native void nativeVideoFrameCallback(int var1, byte[] var2, int var3, int var4);
static {
try {
System.loadLibrary("main"); // Is this still required? I would think not, as Unity already loads it
} catch (UnsatisfiedLinkError var1) {
Log.d(Constants.TAG, "Unable to find " + "main");
} catch (Exception var2) {
Log.d(Constants.TAG, "Unknown error " + var2);
}
}
@Override
protected void executeGLThreadJobs() {
super.executeGLThreadJobs();
Runnable job = jobs.poll();
if (job != null) {
job.run();
}
}
}
This requires a copy of UnityNativeActivity that instantiates the UnityPlayer subclass above instead of com.unity3d.player.UnityPlayer.
I got it to work using reflection. It's not optimal to use reflection, so if someone knows a solution without reflection, I will accept that as the answer.
For anyone else looking to manage their own camera on Android with Unity:
public class UnityPlayer extends com.unity3d.player.UnityPlayer {
private final ConcurrentLinkedQueue<Runnable> jobs = new ConcurrentLinkedQueue<Runnable>();
public UnityPlayer(ContextWrapper contextWrapper) {
super(contextWrapper);
}
public void addJob(final Camera camera, final int cam, final byte[] data, final int width, final int height) { // execute on opengl thread using jobs
jobs.add(new Runnable() {
@Override
public void run() {
videoFrameCallback(cam, data, width, height);
camera.addCallbackBuffer(data);
}
});
}
@Override
protected int[] initCamera(int var1, int var2, int var3, int var4) {
return new int[]{640, 480}; // return width and height of camera
}
// private final native void nativeVideoFrameCallback(int var1, byte[] var2, int var3, int var4); ==> camera id (0 back, 1 front), imagedata, width, height
private void videoFrameCallback(int var1, byte[] var2, int var3, int var4) {
try {
Method m = com.unity3d.player.UnityPlayer.class.getDeclaredMethod("nativeVideoFrameCallback", Integer.TYPE, byte[].class, Integer.TYPE, Integer.TYPE);
m.setAccessible(true);
m.invoke(this, var1, var2, var3, var4);
} catch (Exception e) {
Log.d(Constants.TAG, e.toString());
}
}
@Override
protected void executeGLThreadJobs() {
super.executeGLThreadJobs();
Runnable job = jobs.poll();
if (job != null) {
job.run();
}
}
}
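To round this off, here is a rough sketch of the camera side; unityPlayer (an instance of the subclass above), CAMERA_ID, and the buffer setup are my assumptions, not part of the original answer, and it uses the legacy android.hardware.Camera API that addJob() expects:
// Assumed wiring for the legacy android.hardware.Camera API: each preview
// frame is queued via addJob() and the buffer is recycled afterwards.
final Camera.Size previewSize = camera.getParameters().getPreviewSize();
// NV21 (the default preview format) needs width * height * 3/2 bytes
camera.addCallbackBuffer(new byte[previewSize.width * previewSize.height * 3 / 2]);
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // CAMERA_ID: 0 = back, 1 = front; must match the camera you opened
        unityPlayer.addJob(cam, CAMERA_ID, data, previewSize.width, previewSize.height);
    }
});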
I want to send composing events in a group (multi-user) chat over XMPP. I am using the aSmack library, and I have already implemented the same functionality for one-to-one chat.
I am using the code below:
mMessageEventManager = new MessageEventManager(XMPPConnectApplication.getInstance().getXmppConnection());
mMessageEventManager.addMessageEventNotificationListener(new MessageEventNotificationListener() {
@Override
public void offlineNotification(String arg0, String arg1) {
}
@Override
public void displayedNotification(String arg0, String arg1) {
}
@Override
public void deliveredNotification(String arg0, String arg1) {
}
@Override
public void composingNotification(String from, String to) {
Log.e("Receiver-composingNotification",from + " is started typing......"+to);
}
@Override
public void cancelledNotification(String from, String to) {
Log.e("Receiver-cancelledNotification",from + " is stopped typing......"+to);
}
});
Please let me know if you have any ideas on how to do the same for group chat.
Any help will be appreciated.
Yes, I have an idea about it; I implemented the same thing just a week ago.
I used MessageEventManager to manage chat states.
private MessageEventManager mMessageEventManager;
Add this method to set up the chat state receiving listener:
private void chatStateRecognizer(){
Thread thread = new Thread(new Runnable() {
@Override
public void run() {
mMessageEventManager = new MessageEventManager(mXmppConnection);
mMessageEventManager.addMessageEventNotificationListener(new MessageEventNotificationListener() {
@Override
public void offlineNotification(String arg0, String arg1) {
}
@Override
public void displayedNotification(String arg0, String arg1) {
}
@Override
public void deliveredNotification(String from, String arg1) {
}
@Override
public void composingNotification(String from, String to) {
Log.i("Receiver:Compose state",from + " is started typing......"+to);
}
@Override
public void cancelledNotification(String from, String to) {
Log.i("Receiver:Stop state",from + " is stopped typing......"+to);
}
});
}
});
thread.start();
}
Create a model class named GroupInfoModel.java:
public class GroupInfoModel implements Comparable<GroupInfoModel>, Serializable{
private static final long serialVersionUID = 1L;
private String memberId = "", memberName = "";
private boolean isAdmin;
public String getMemberId() {
return memberId;
}
public void setMemberId(String memberId) {
this.memberId = memberId;
}
public String getMemberName() {
return memberName;
}
public void setMemberName(String memberName) {
this.memberName = memberName;
}
public boolean isAdmin() {
return isAdmin;
}
public void setAdmin(boolean isAdmin) {
this.isAdmin = isAdmin;
}
@Override
public int compareTo(GroupInfoModel another) {
return getMemberName().compareTo(another.getMemberName());
}
}
Now declare an ArrayList of the GroupInfoModel class, plus a composing flag:
private ArrayList<GroupInfoModel> groupDetailsList = new ArrayList<GroupInfoModel>();
private boolean isComposingStarted;
In onCreate() of your Activity / Fragment:
groupDetailsList.clear();
ServiceDiscoveryManager discoManager = ServiceDiscoveryManager.getInstanceFor(mXmppConnection);
DiscoverItems items = discoManager.discoverItems(mRoomId);
for (Iterator<Item> it = items.getItems(); it.hasNext();) {
DiscoverItems.Item item = (DiscoverItems.Item) it.next();
String occupant = item.getEntityID();
occupant = occupant.split("/")[1];
GroupInfoModel groupInfoModel = new GroupInfoModel();
groupInfoModel.setAdmin(false);
groupInfoModel.setMemberId(occupant+"@"+mServiceNameHere);
groupInfoModel.setMemberName(occupant);
groupDetailsList.add(groupInfoModel);
}
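Since GroupInfoModel implements Comparable (it compares by member name), you can optionally sort the occupant list once it has been filled:
// optional: sort occupants alphabetically by name (uses compareTo() above)
Collections.sort(groupDetailsList);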
Now add a TextWatcher to the EditText on your compose message (chat) screen:
@Override
public void onTextChanged(CharSequence s, int start, int before, int count) {
if (s.toString().length() == 1 && !isComposingStarted) {
isComposingStarted = true;
if (chatType.equals("OneToOneChat")) {
mMessageEventManager.sendComposingNotification(myJabberId, friendJabberId);
} else if (chatType.equals("GroupChat")) {
for (int i = 0; i < groupDetailsList.size(); i++) {
if (!groupDetailsList.get(i).getMemberId().contains(myJabberId)) {
mMessageEventManager.sendComposingNotification(groupDetailsList.get(i).getMemberId(), roomId);
}
}
}
} else if (s.toString().length() == 0) {
isComposingStarted = false;
if (chatType.equals("OneToOneChat")) {
mMessageEventManager.sendCancelledNotification(myJabberId, friendJabberId);
} else if (chatType.equals("GroupChat")) {
for (int i = 0; i < groupDetailsList.size(); i++) {
if (!groupDetailsList.get(i).getMemberId().contains(myJabberId)) {
mMessageEventManager.sendCancelledNotification(groupDetailsList.get(i).getMemberId(), roomId);
}
}
}
}
}
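Note that the snippet above is only the onTextChanged() body; for completeness, a sketch of attaching the full watcher (messageEditText is an assumed name for your compose field):
// Assumed wiring: messageEditText is the compose field of the chat screen
messageEditText.addTextChangedListener(new TextWatcher() {
    @Override
    public void beforeTextChanged(CharSequence s, int start, int count, int after) { }

    @Override
    public void onTextChanged(CharSequence s, int start, int before, int count) {
        // the composing/cancelled logic shown above goes here
    }

    @Override
    public void afterTextChanged(Editable s) { }
});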
I strongly recommend using the above code in your Application class; you can modify the methods to fit your requirements.
Done.
// send multi user chat typing status
public static void sendMUCTypingStatus(ChatState state)
{
// check if you are connected to group
if(multiUserChat != null)
{
try{
// create packet
Message statusPacket = new Message();
// set body to null
statusPacket.setBody(null);
// set packet type to group chat
statusPacket.setType(Message.Type.groupchat);
// set subject to null
statusPacket.setSubject(null);
// set to the group name
statusPacket.setTo(multiUserChat.getRoom());
// set from to my current JID, for example: me@domain.com
statusPacket.setFrom(new MyPrefrence(XmppBase.context).getUsername());
// get the chat state extension and pass our state
ChatStateExtension extension = new ChatStateExtension(state);
// add the extention to our packet
statusPacket.addExtension(extension);
// get the connection and send the packet
Utils.getConnection().sendStanza(statusPacket);
} catch (SmackException.NotConnectedException e) {
e.printStackTrace();
}
}
}
Usage:
sendMUCTypingStatus(ChatState.composing);
Also watch this: a quick overview of using chat states.
With RxJava and Jake Wharton's RxBinding, it's quite simple to do:
RxTextView.afterTextChangeEvents(editText)
.observeOn(Schedulers.io())
.skip(1)
.map({ input ->
// FIRE ChatState.composing EVENT HERE
input // just returning the argument here
})
.debounce(2, TimeUnit.SECONDS)
.observeOn(Schedulers.io())
.subscribe {
// FIRE ChatState.active EVENT HERE
}
Remember that we will have to write code to catch these events via a Smack StanzaListener and display them in the UI accordingly; a rough sketch of that receiving side follows below.
The code is written in Kotlin, but it is fairly straightforward.
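For the receiving side, here is a minimal sketch in Java, assuming the Smack 4.x API (with the older aSmack library you would register via connection.addPacketListener instead); connection is assumed to be your established XMPPConnection:
// listen for group-chat stanzas carrying a chat state extension
connection.addAsyncStanzaListener(stanza -> {
    Message message = (Message) stanza;
    ChatStateExtension extension = (ChatStateExtension)
            message.getExtension("http://jabber.org/protocol/chatstates");
    if (extension != null) {
        // message.getFrom() is room@service/nickname in a group chat
        ChatState state = extension.getChatState();
        // post to the main thread and update the "is typing..." label here
    }
}, stanza -> stanza instanceof Message
        && ((Message) stanza).getType() == Message.Type.groupchat);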