How can I create a receive-only SDP offer in WebRTC? - android

I know how to create a send-only offer by adding "OfferToReceiveVideo: false" and "OfferToReceiveAudio: false" to the MediaConstraints parameter of this method:
public void createOffer(SdpObserver observer, MediaConstraints constraints)
But how can I create a receive-only SDP offer? I tried to create one by adding no media stream to the peer connection; however, the resulting SDP is very short, contains no "a=recvonly" line, and no ICE candidates are generated.
I want to create a WebRTC peer connection that receives a media stream but does not send one.

Solved.
Set "OfferToReceiveAudio" and "OfferToReceiveVideo" to "true" in MediaConstraints. And do not add stream.

Related

Casting audio with local server socket

I'm trying to fix an existing project which casts video and audio to the web.
I need to create a local socket:
socketId = "my.application.media." + suffix + "-" + new
Random().nextInt();
localServerSocket = new LocalServerSocket(socketId);
receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress(socketId));
receiver.setReceiveBufferSize(SOCKET_BUFFER_SIZE);
receiver.setSendBufferSize(SOCKET_BUFFER_SIZE);
sender = localServerSocket.accept();
sender.setReceiveBufferSize(SOCKET_BUFFER_SIZE);
sender.setSendBufferSize(SOCKET_BUFFER_SIZE);
and create the media recorder:
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.RAW_AMR);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mMediaRecorder.setAudioEncodingBitRate((int) 7.95 * 1024);
mMediaRecorder.setAudioSamplingRate(8000);
mMediaRecorder.setAudioChannels(1);
mMediaRecorder.setOutputFile(sender.getFileDescriptor());
mMediaRecorder.prepare();
But I'm getting java.lang.IllegalStateException after calling start() on mMediaRecorder. What am I missing? When I'm not using sender.getFileDescriptor() everything works correctly, so that is probably the problem. I know there are many libraries which provide this functionality, but I prefer to fix this one. Casting video only works correctly; the only problem is with the audio. Thanks a lot for the help.
Order of executed methods:
I added logs to check the order of methods and the thread they run on:
creating sockets: Socket opening thread
creating receiver: Socket opening thread
creating sender: Socket opening thread
setting audio source: Socket opening thread
setting properties: Socket opening thread
creating file descriptor: Socket opening thread
preparing media recorder: Socket opening thread
starting media recorder: Socket opening thread
I found that I'm also receiving errors:
2019-02-13 18:15:49.701 6176-13833/? E/StagefrightRecorder: Output file descriptor is invalid
2019-02-13 18:15:49.701 7851-9780/my.application E/MediaRecorder: start failed: -38
As stated here, the java.lang.IllegalStateException error occurs when
a method has been invoked at an illegal or inappropriate time.
With that in mind, and with this article on how to use sockets in mind, you should put your socket-related code inside an AsyncTask (a separate thread) and use try/catch.
See the AsyncTask documentation and the Socket documentation if you want to expand your knowledge.
It seems you are trying to use getFileDescriptor() before sender has the data to pull out (or after it has closed).
Try extracting the file descriptor earlier in the code into a variable and then use that variable instead.
Another possibility: the MediaRecorder documentation says
You must specify a file descriptor that represents an actual file
so be sure that what sender.getFileDescriptor() returns is the kind of descriptor that mMediaRecorder.setOutputFile() can accept.
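A minimal sketch of that suggestion, moving the socket and recorder setup from the question into an AsyncTask with try/catch. It assumes the same fields as in the question (suffix, SOCKET_BUFFER_SIZE) and sits inside the existing Activity; AudioCastTask is a hypothetical name:
private class AudioCastTask extends AsyncTask<Void, Void, Void> {
    @Override
    protected Void doInBackground(Void... params) {
        try {
            // Create the local socket pair off the main thread
            String socketId = "my.application.media." + suffix + "-" + new Random().nextInt();
            LocalServerSocket localServerSocket = new LocalServerSocket(socketId);

            LocalSocket receiver = new LocalSocket();
            receiver.connect(new LocalSocketAddress(socketId));
            receiver.setReceiveBufferSize(SOCKET_BUFFER_SIZE);
            receiver.setSendBufferSize(SOCKET_BUFFER_SIZE);

            // accept() blocks until the receiver has connected
            LocalSocket sender = localServerSocket.accept();
            sender.setReceiveBufferSize(SOCKET_BUFFER_SIZE);
            sender.setSendBufferSize(SOCKET_BUFFER_SIZE);

            // Grab the descriptor once, into a variable, before handing it to the recorder
            FileDescriptor fd = sender.getFileDescriptor();

            MediaRecorder recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.RAW_AMR);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            recorder.setOutputFile(fd);
            recorder.prepare();
            recorder.start();
        } catch (IOException | IllegalStateException e) {
            Log.e("AudioCastTask", "Failed to start audio casting", e);
        }
        return null;
    }
}
It would then be started with new AudioCastTask().execute();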

gRPC Android Client losing connection "too many pings"

My Android gRPC client is receiving GOAWAY from the server with a "too many pings" error. I realise that this is probably a server-side issue, but I think the problem is that the client channel settings do not match those of the server.
I have a C# gRPC server with the following settings:
List<ChannelOption> channelOptions = new List<ChannelOption>();
channelOptions.Add(new ChannelOption("GRPC_ARG_HTTP2_MIN_RECV_PING_INTERVAL_WITHOUT_DATA_MS", 1000));
channelOptions.Add(new ChannelOption("GRPC_ARG_HTTP2_MAX_PINGS_WITHOUT_DATA", 0));
channelOptions.Add(new ChannelOption("GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS", 1));
this.server = new Server(channelOptions) {
    Services = { TerminalService.BindService(this) },
    Ports = { new ServerPort("0.0.0.0", 5000, ServerCredentials.Insecure) }
};
On Android I have the following channel setup:
private val channel = ManagedChannelBuilder.forAddress(name, port)
.usePlaintext()
.keepAliveTime(10, TimeUnit.SECONDS)
.keepAliveWithoutCalls(true)
.build()
After a few minutes (though the time seems random) I get the GOAWAY error. I noticed that if I stream data on the call then the error never happens; it only occurs when there is no data on the stream. This leads me to believe that GRPC_ARG_HTTP2_MAX_PINGS_WITHOUT_DATA needs to be set on the Android client as well. The problem is that I cannot find where to set these channel settings in gRPC Java. Can someone point out where I can set them? There are no examples where these have been set.
The channel options being specified are using the wrong names. Names like GRPC_ARG_HTTP2_MAX_PINGS_WITHOUT_DATA are the C-defines for things like "grpc.http2.max_pings_without_data".
You can map from the C name to the key string by looking at grpc_types.h. You should prefer using one of the C# constants in ChannelOptions when it is available, but that doesn't seem to be an option in this case.
These options are not visible in the Java ManagedChannelBuilder API because they are server-specific settings; instead, they are exposed on the ServerBuilder. See A8 client-side keepalive for a reference to the Java keepalive API.
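For comparison, a minimal grpc-java sketch of where those enforcement settings live on the server side (the server in this question is C#, so this is only illustrative; TerminalServiceImpl is a hypothetical service implementation):
Server server = NettyServerBuilder.forPort(5000)
        .addService(new TerminalServiceImpl()) // hypothetical implementation of TerminalService
        .permitKeepAliveTime(1, TimeUnit.SECONDS) // counterpart of GRPC_ARG_HTTP2_MIN_RECV_PING_INTERVAL_WITHOUT_DATA_MS
        .permitKeepAliveWithoutCalls(true) // counterpart of GRPC_ARG_KEEPALIVE_PERMIT_WITHOUT_CALLS
        .build();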

Android (libstreaming) RTSP server can play video but no sound

I use libstreaming to create an RTSP server on an Android phone. Then I use another phone to connect to the server and play the live stream. I want the server to use its camera and microphone to record video and audio and have the client play them.
After connecting, the video plays properly, but there is no sound.
The following is part of my RTSP server's code:
mSession = SessionBuilder.getInstance()
.setSurfaceView(mSurfaceView)
.setPreviewOrientation(90)
.setContext(getApplicationContext())
.setAudioEncoder(SessionBuilder.AUDIO_AAC)
//.setAudioQuality(new AudioQuality(16000, 32000))
.setAudioQuality(new AudioQuality(8000, 16000))
.setVideoEncoder(SessionBuilder.VIDEO_H264)
//.setVideoQuality(new VideoQuality(320, 240, 20, 500000))
.build();
mSession.startPreview(); //camera preview on phone surface
mSession.start();
I searched for this problem; some people said I should modify the destination ports in SessionBuilder.java.
I tried to modify it as follows, but it still did not work:
if (session.getAudioTrack() != null) {
Log.e("SessionBuilder", "Audio track != null");
AudioStream audio = session.getAudioTrack();
audio.setAudioQuality(mAudioQuality);
audio.setDestinationPorts(5008);
}
Does somebody know the reason for this?
By the way, I used VLC player on another phone as the client.
I use the following URL to connect to the server:
rtsp://MY_IP:1234?h264=200-20-320-240
Thanks
I traced the source code and found that the server never received the request for the audio stream, only the request for the video stream.
After setting up the connection in RtspServer.java, the received trackId was 1.
(trackId=0 means AudioStream and trackId=1 means VideoStream.)
public Response processRequest(Request request) throws IllegalStateException, IOException {
....
else if (request.method.equalsIgnoreCase("SETUP")) {
....
boolean streaming = isStreaming();
Log.e(TAG, "trackId: " + trackId);
// received trackId=1, which represents the video stream
mSession.syncStart(trackId);
....
}
....
}
I solved this problem by using a different URL:
rtsp://MY_IP:1234?trackID=0
Thanks
I had the same problem. Setting the streaming method worked for me.
mSession.getVideoTrack().setStreamingMethod(MediaStream.MODE_MEDIACODEC_API_2);
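The answer above only touches the video track; if the silent audio persists, a variant worth trying (an assumption on my part, since in libstreaming AudioStream inherits setStreamingMethod() from MediaStream just as VideoStream does) is to pick the encoder mode on the audio track as well:
// Sketch only (untested assumption): apply the same workaround to the audio track,
// using the buffer-based MediaCodec mode that the AAC stream supports
if (mSession.getAudioTrack() != null) {
    mSession.getAudioTrack().setStreamingMethod(MediaStream.MODE_MEDIACODEC_API);
}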

Android Jwebsocket custom protocol

I am opening a connection and setting up a custom protocol like this:
WebSocketSubProtocol d = new WebSocketSubProtocol("MyCustomProto",WebSocketEncoding.TEXT);
mJWC.addSubProtocol(d);
mJWC.open(mURL);
But... server side, I receive this in the protocol string:
"org.jwebsocket.json MyCustomProto"
How can I remove "org.jwebsocket.json" from the string?
I don't want to do it server side...
Thanks!
I will answer my own question.
Calling addSubProtocol doesn't seem to be the right solution, for a couple of reasons:
if you call those 3 lines of code multiple times (for example, if the connection failed the first time), the protocol string ends up something like
"org.jwebsocket.json MyCustomProto MyCustomProto"
It just keeps adding the protocol.
So I found a workaround. Now I don't use addSubProtocol; instead I define the protocol directly when I create the socket:
mJWC = new BaseTokenClient("client||"+code+"||"+name,WebSocketEncoding.TEXT);
Voila. No more "org.jwebsocket.json".

Android streaming from icecast server get track information

I am downloading a stream from an Icecast server, and I can grab the information in the headers by doing the following:
URLConnection cn = new URL(mediaUrl).openConnection();
cn.connect();
int pos=1;
String x;
String y;
while (cn.getHeaderField(pos) != null)
{
x=cn.getHeaderFieldKey(pos);
y = cn.getHeaderField(x);
Log.e(":::::",""+x+" : "+y);
pos++;
}
When I do this all of the headers I receive are shown as:
content-type : audio/mpeg
icy-br : 64
ice-audio-info : ice-samplerate=22050;ice-bitrate=64;ice-channels=2
icy-br : 64
icy-description : RadioStation
icy-genre : Classical, New Age, Ambient
icy-name : RadioStation Example
icy-private : 0
icy-pub : 1
icy-url : http://exampleradio.com
server : Icecast 2.3.2
cache-control : no-cache
However if I open my stream in mplayer I get:
ICY Info: StreamTitle='artist - album - trackname'
and each time the song changes, the new track information is sent and appears the same way in mplayer.
In Android, when I attempt to read the ICY info, all I get back is null. Also, how would I go about retrieving new track information from the headers while I am buffering the stream? Even if I try to read a header I already know exists while buffering, such as:
Log.e(getClass().getName()," "+cn.getHeaderField("icy-br"));
all I get back is null.
I hope this makes sense, I can post more code on request.
I realize this question is old, but for others who are facing this challenge: I am using this project, http://code.google.com/p/streamscraper/, to get track information from an Icecast stream. I'm using it on Android and so far it works as expected.
All you need to do is call setDataSource() and pass the URL as a String, then call prepareAsync(); with mp.setOnPreparedListener(this) you will be notified when the MediaPlayer is done buffering, and then all you need to do is call mp.start(). P.S.: Don't forget to mp.stop(), mp.reset() and mp.release() when destroying the application. I'm still thinking of a way to read the ICY info... I must either make my own buffering mechanism and write a buffer file (initialize the MediaPlayer with a FileDescriptor), or make a separate connection from time to time to check for ICY info tags and then close the connection... Any better ideas, anyone?
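Neither answer spells it out, but a common way to get the StreamTitle that mplayer prints is to request in-band metadata. This is a sketch under the assumption that the server honours the Icy-MetaData request header (standard Icecast/SHOUTcast behaviour); mediaUrl is the same variable as in the question:
URLConnection cn = new URL(mediaUrl).openConnection();
cn.setRequestProperty("Icy-MetaData", "1"); // ask the server to interleave metadata into the stream
cn.connect();

// icy-metaint = number of audio bytes between metadata blocks (header is absent if the server refuses)
int metaInt = Integer.parseInt(cn.getHeaderField("icy-metaint"));
DataInputStream in = new DataInputStream(cn.getInputStream());

byte[] audio = new byte[metaInt];
in.readFully(audio); // one block of audio data (play it or skip it)

int metaLen = in.read() * 16; // first byte after the audio block = metadata length / 16
byte[] meta = new byte[metaLen];
in.readFully(meta);
// The metadata block looks like: StreamTitle='artist - album - trackname';
Log.e(":::::", new String(meta, "UTF-8").trim());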
