Android WebRTC - create 1-N calls

I have a project, https://github.com/njovy/AppRTCDemo, which works for 1-to-1 calls. I modified the PeerConnectionClient class:
private static final PeerConnectionClient instance = new PeerConnectionClient();
// private final PCObserver pcObserver = new PCObserver();
private final PCObserver[] pcObservers = new PCObserver[MAX_CONNECTIONS];
// private final SDPObserver sdpObserver = new SDPObserver();
private final SDPObserver[] sdpObservers = new SDPObserver[MAX_CONNECTIONS];
private final LooperExecutor executor;
private static final int MAX_CONNECTIONS = 3;
private PeerConnectionFactory factory;
private PeerConnection[] peerConnections = new PeerConnection[MAX_CONNECTIONS];
PeerConnectionFactory.Options options = null;
private VideoSource videoSource;
private boolean videoCallEnabled;
private boolean audioCallEnabled;
private boolean preferIsac;
private boolean preferH264;
private boolean videoSourceStopped;
private boolean isError;
private Timer statsTimer;
private VideoRenderer.Callbacks localRender;
private VideoRenderer.Callbacks[] remoteRenders;
private SignalingParameters signalingParameters;
private MediaConstraints pcConstraints;
private MediaConstraints videoConstraints;
private MediaConstraints audioConstraints;
private MediaConstraints sdpMediaConstraints;
private PeerConnectionParameters peerConnectionParameters;
// Queued remote ICE candidates are consumed only after both local and
// remote descriptions are set. Similarly local ICE candidates are sent to
// remote peer after both local and remote description are set.
private LinkedList<IceCandidate>[] queuedRemoteCandidateLists = new LinkedList[MAX_CONNECTIONS];
private PeerConnectionEvents events;
private boolean[] isConnectionInitiator = new boolean[MAX_CONNECTIONS];
private SessionDescription[] localSdps = new SessionDescription[MAX_CONNECTIONS]; // either offer or answer SDP
private MediaStream mediaStream;
private int numberOfCameras;
private VideoCapturerAndroid videoCapturer;
// enableVideo is set to true if video should be rendered and sent.
private boolean renderVideo;
private VideoTrack localVideoTrack;
private VideoTrack[] remoteVideoTracks = new VideoTrack[MAX_CONNECTIONS];
The full modified class is here: https://pastebin.com/c0YCHS6g, and my Activity for the call is here: https://pastebin.com/8RVwVZRq. How can I extend this to 1-N calls?

"Android WebRTC - create 1-N calls": what is the value of N here?
If N > 4, you need to use a media relay server (an SFU or MCU); otherwise the mobile device will not cope.
For N remote participants in a mesh call, WebRTC encodes the stream N times (eating CPU and battery N times over) and consumes N times the upload bandwidth to relay the stream to N participants (which is not realistic on 3G/4G).
Choose a media server based on your use case from Janus, Jitsi, Kurento, Licode, Red5, SwitchRTC, Wowza, and many more.
If N <= 4:
You need to refactor PeerConnectionClient into two parts (see the sketch after this list):
1. Singleton factory: the main PeerConnectionFactory plus the MediaStream/MediaTracks, created once.
2. PC instances: create one PeerConnection per endpoint from the singleton factory and add the same stream to all of the instances. Each PC instance is responsible for the offer/answer/candidate exchange with its own endpoint.
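A minimal sketch of that split, assuming the older addStream-based Android WebRTC API that AppRTCDemo uses (PeerConnectionFactory.initializeAndroidGlobals already called); the class, track, and method names here are illustrative rather than the project's own:

import java.util.List;

import org.webrtc.*;

public class MultiPeerConnectionSketch {
    private static final int MAX_CONNECTIONS = 3;

    // Part 1 - singleton factory: one factory and one local stream for the whole call.
    private final PeerConnectionFactory factory = new PeerConnectionFactory();
    private final PeerConnection[] peerConnections = new PeerConnection[MAX_CONNECTIONS];
    private MediaStream mediaStream;

    void createLocalStream(VideoCapturerAndroid capturer, MediaConstraints videoConstraints) {
        mediaStream = factory.createLocalMediaStream("ARDAMS");
        VideoSource videoSource = factory.createVideoSource(capturer, videoConstraints);
        mediaStream.addTrack(factory.createVideoTrack("ARDAMSv0", videoSource));
        mediaStream.addTrack(factory.createAudioTrack("ARDAMSa0",
                factory.createAudioSource(new MediaConstraints())));
    }

    // Part 2 - PC instances: one PeerConnection per remote endpoint, all sharing the local stream.
    void createPeerConnection(int index, List<PeerConnection.IceServer> iceServers,
                              MediaConstraints pcConstraints, PeerConnection.Observer observer) {
        PeerConnection pc = factory.createPeerConnection(iceServers, pcConstraints, observer);
        pc.addStream(mediaStream);   // the same local stream is attached to every connection
        peerConnections[index] = pc; // offer/answer/candidate exchange is handled per index
    }
}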


Connect Android to Nodejs

I am trying to connect a Node.js backend to an Android frontend. The Node.js API listens on port 3000, and the Android app calls the server-side API, but it always fails to connect to the server. What should I do?
This is the client (Android):
package com.hieunguyen.souq.utils;

public class Constant {
    public static String LOCALHOST = "http://localhost:3000/";
    public static String PRODUCT = "product";
    public static String ORDER = "order";
    public static final int PICK_IMAGE = 88;
    public static final int GALLERY_REQUEST = 100;
    public static final int READ_EXTERNAL_STORAGE_CODE = 200;
    public static final int CAMERA_REQUEST = 300;
    public static final int CAMERA_PERMISSION_CODE = 400;
    public static String PRODUCT_ID = "ProductId";
    public static String CATEGORY = "Category";
    public static String EMAIL = "email";
    public static String OTP = "otp";
    public static String PRODUCTID = "Product_id";
    public static String KEYWORD = "keyword";
}
This is the server (Node.js):
const express = require('express')
const bodyParser = require('body-parser')
require('dotenv').config();
const app = express()
app.use('/storage_user', express.static('storage_user'));
app.use('/storage_product', express.static('storage_product'));
app.use('/storage_poster', express.static('storage_poster'));
const userRouter = require('./api/routes/users')
const productRouter = require('./api/routes/products')
const favoriteRouter = require('./api/routes/favorites')
const cartRouter = require('./api/routes/carts')
const historyRouter = require('./api/routes/history')
const reviewRouter = require('./api/routes/review')
const posterRouter = require('./api/routes/posters')
const addressRouter = require('./api/routes/address')
const orderRouter = require('./api/routes/orders')
const port = 3000
app.use(bodyParser.urlencoded({ extended: false }))
app.use(bodyParser.json())
app.use('/users', userRouter)
app.use('/products', productRouter)
app.use('/favorites', favoriteRouter)
app.use('/carts', cartRouter)
app.use('/history', historyRouter)
app.use('/review', reviewRouter)
app.use('/posters', posterRouter)
app.use('/address', addressRouter)
app.use('/orders', orderRouter)
app.listen(port, () => console.log("Server started!"))
When I run the server and test it in Postman, it works (localhost:3000/).
But when I run the client, it always fails with "failed to connect to localhost:3000". I don't know how to fix that. Please help, and sorry for my English.
There are two things I think may be going wrong.
First, localhost refers to the machine the code is running on: inside the Android app it points at the phone or emulator itself, not at the computer hosting Node.js. The app needs to address the server by your development machine's LAN IP (or 10.0.2.2 from the Android emulator), and the server must be reachable on your local network.
Secondly, you may want to enable CORS, because your front end and back end are not hosted on the same domain.
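For the first point, a minimal sketch of what the Constant class could look like (192.168.1.10 is a placeholder for your development machine's actual LAN IP, not a value taken from the question):

public class Constant {
    // Physical device on the same Wi-Fi network as the development machine:
    public static String LOCALHOST = "http://192.168.1.10:3000/";
    // Android emulator (10.0.2.2 is the emulator's alias for the host machine):
    // public static String LOCALHOST = "http://10.0.2.2:3000/";
}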

What are some examples of MediaController commands?

Checking out the MediaController documentation, I noticed that there is a function called sendCommand(...), which requires three parameters:
command: String;
args: Bundle;
cb: ResultReceiver.
But examples of how to use that method are nowhere to be found.
What are the available MediaController#sendCommand(...) default commands, and what argument keys and value types do they accept?
For example, checking the PlaybackState documentation, we can find a constant called ACTION_PLAY_FROM_MEDIA_ID whose description is as follows:
Indicates this session supports the play from media id command
This leads us to think that MediaController#sendCommand(...) is able to change a MediaSession's current media by sending it the media ID. How can we do it?
It's known that the Google Play Music app's MediaController shares its media queue through the MediaController#getQueue function.
You can find the command constants in MediaControllerCompat.
They actually are:
public static final String COMMAND_GET_EXTRA_BINDER =
"android.support.v4.media.session.command.GET_EXTRA_BINDER";
public static final String COMMAND_ADD_QUEUE_ITEM =
"android.support.v4.media.session.command.ADD_QUEUE_ITEM";
public static final String COMMAND_ADD_QUEUE_ITEM_AT =
"android.support.v4.media.session.command.ADD_QUEUE_ITEM_AT";
public static final String COMMAND_REMOVE_QUEUE_ITEM =
"android.support.v4.media.session.command.REMOVE_QUEUE_ITEM";
public static final String COMMAND_REMOVE_QUEUE_ITEM_AT =
"android.support.v4.media.session.command.REMOVE_QUEUE_ITEM_AT";
public static final String COMMAND_ARGUMENT_MEDIA_DESCRIPTION =
"android.support.v4.media.session.command.ARGUMENT_MEDIA_DESCRIPTION";
public static final String COMMAND_ARGUMENT_INDEX =
"android.support.v4.media.session.command.ARGUMENT_INDEX";
For some basic usage samples you can check out its methods, for example:
@Override
public void removeQueueItem(MediaDescriptionCompat description) {
    long flags = getFlags();
    if ((flags & MediaSessionCompat.FLAG_HANDLES_QUEUE_COMMANDS) == 0) {
        throw new UnsupportedOperationException(
                "This session doesn't support queue management operations");
    }
    Bundle params = new Bundle();
    params.putParcelable(COMMAND_ARGUMENT_MEDIA_DESCRIPTION, description);
    sendCommand(COMMAND_REMOVE_QUEUE_ITEM, params, null);
}
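From the controller side you normally do not build the Bundle yourself. Here is a minimal usage sketch (my own, not from the documentation) using the public wrapper methods on MediaControllerCompat, which send these commands for you; the connected session must set FLAG_HANDLES_QUEUE_COMMANDS, otherwise they throw UnsupportedOperationException as shown above:

void editQueue(MediaControllerCompat controller, MediaDescriptionCompat description) {
    controller.addQueueItem(description);    // compat-layer equivalent of COMMAND_ADD_QUEUE_ITEM
    controller.removeQueueItem(description); // compat-layer equivalent of COMMAND_REMOVE_QUEUE_ITEM
}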

How to apply to a standard audio player?

When buttons are pressed in my app, I want to be able to control the standard Android audio player while music is playing, for example to switch tracks. How can I interact with the standard audio player on Android?
The music player is not a standard thing on Android, and vendors can ship a different app than others (Sony is a good example). Controlling a media app is doable, but the final effect depends on which app is actually installed (and used, as a user can have more than one). You may be able to control some, while others may remain immune to your attempts. You can try this code (it will work for Google Music, for example):
public static final String SERVICECMD = "com.android.music.musicservicecommand";
public static final String CMDNAME = "command";
public static final String CMDSTOP = "stop";
AudioManager mAudioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
if (mAudioManager.isMusicActive()) {
    Intent i = new Intent(SERVICECMD);
    i.putExtra(CMDNAME, CMDSTOP);
    YourApplicationClass.this.sendBroadcast(i);
}
Other commands are:
public static final String CMDTOGGLEPAUSE = "togglepause";
public static final String CMDPAUSE = "pause";
public static final String CMDPREVIOUS = "previous";
public static final String CMDNEXT = "next";
These commands are taken from android/music/MediaPlaybackService.java.
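The same broadcast pattern works for the other commands; for example, a short sketch for skipping to the next track (again, only players that listen for this broadcast, such as the stock/Google Music app, will react):

Intent next = new Intent(SERVICECMD);
next.putExtra(CMDNAME, CMDNEXT);
sendBroadcast(next);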

Declaring a MqttClientPersistence object causes connection failure

In the code below I am trying to create a folder for persisted data. As you can see, I declared private final String folder = "//temp"; and the persistence object. The problem is that when I run the app it says "Connection Failed" (this message comes from the client connection's synchronous listener), but when I connect without the persistence object, everything works fine.
Am I initializing the folder variable wrongly, or am I using MqttClientPersistence incorrectly?
Code:
private final String folder = "//temp";
private final int keepAliveInterval = 30;
private final String TAG = this.getClass().getSimpleName();
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.mqtt_proj_01_layout);
    final MqttClientPersistence persistence = new MqttDefaultFilePersistence(folder);
    final MqttAndroidClient client2 = new MqttAndroidClient(getApplicationContext(), serverURI, clientID, persistence);
Do you have filesystem read/write permissions enabled for your Android application?
Also, the path should probably be an application-specific directory. You can get an application-specific directory using something like this:
File outputDir = context.getCacheDir();
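Putting both points together, a sketch assuming the failure comes from the "//temp" directory not being writable (serverURI and clientID are the variables from the question's code):

String persistenceDir = getApplicationContext().getCacheDir().getAbsolutePath();
MqttClientPersistence persistence = new MqttDefaultFilePersistence(persistenceDir);
MqttAndroidClient client2 = new MqttAndroidClient(getApplicationContext(), serverURI, clientID, persistence);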

No implementation found for native Stitch

I am using OpenCV stitching in an Android project.
public class MainActivity extends Activity implements OnClickListener {
    private String mWarpType;
    private String mMatchConf;
    private String mConfThresh;
    private SharedPreferences mSettings;
    public static final String SETTINGS = "Pano_Settings";
    private final String SETTINGS_WARP_TYPE = "warp";
    private final String SETTINGS_MATCH_CONF = "match_conf";
    private final String SETTINGS_CONF_THRESH = "conf_thresh";
    private String mDefaultWarpType = "spherical";
    private String mDefaultMatchConf = "0.5";
    private String mDefaultConfThresh = "0.8";
    ...

    public native int Stitch(Object[] args);

    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.GoButton:
                List<String> s = new ArrayList<String>();
                s.add("Stitch");
                s.add("/sdcard/tesseract/images1.jpeg");
                s.add("/sdcard/tesseract/images2.jpeg");
                s.add("--warp");
                s.add(mWarpType);
                s.add("--conf_thresh");
                s.add(mConfThresh);
                s.add("--match_conf");
                s.add(mMatchConf);
                s.add("--work_megapix");
                s.add("0.2");
                s.add("--seam_megapix");
                s.add("0.2");
                s.add("--expos_comp");
                s.add("gain");
                s.add("--output");
                s.add("/sdcard/tesseract/");
                Integer i = Stitch(s.toArray());
                Log.d("1", i.toString());
                break;
            default:
                break;
        }
    }
}
The application starts, but when Stitch(s.toArray()) is called I get this error:
W/dalvikvm(15392): No implementation found for native Lcom/prototype/MainActivity;.Stitch ([Ljava/lang/Object;)I
OpenCV was successfully added to the workspace, and the OpenCV library project was added via my project -> Properties -> Android -> Library -> Add -> OpenCV lib project.
OpenCV version: 2.4.2.
The sample was taken from the android-opencv-panorama project.
You probably copied the native code "as is" from the sample, but your Java class has a different package and name. Look for the function named Java_<some more>_Stitch() in your cpp file, and rename it to become:
Java_com_prototype_MainActivity_Stitch()
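For reference, a sketch of the Java side that this symbol maps to; each part of the package and the class name becomes a segment of the native function name. The library name passed to loadLibrary is an assumption here, use whatever your NDK build actually produces:

package com.prototype;

import android.app.Activity;

public class MainActivity extends Activity {
    static {
        // Loads the native library that must define Java_com_prototype_MainActivity_Stitch.
        System.loadLibrary("pano"); // "pano" is a placeholder library name
    }

    // Resolved at runtime to the C/C++ symbol Java_com_prototype_MainActivity_Stitch.
    public native int Stitch(Object[] args);
}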
