Casting audio with local server socket - android

I'm trying to fix an existing project which is casting video and audio to the web.
I need to create local socket:
socketId = "my.application.media." + suffix + "-" + new Random().nextInt();
localServerSocket = new LocalServerSocket(socketId);
receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress(socketId));
receiver.setReceiveBufferSize(SOCKET_BUFFER_SIZE);
receiver.setSendBufferSize(SOCKET_BUFFER_SIZE);
sender = localServerSocket.accept();
sender.setReceiveBufferSize(SOCKET_BUFFER_SIZE);
sender.setSendBufferSize(SOCKET_BUFFER_SIZE);
and creating media recorder:
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.RAW_AMR);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mMediaRecorder.setAudioEncodingBitRate((int) (7.95 * 1024)); // 7.95 kbps AMR-NB
mMediaRecorder.setAudioSamplingRate(8000);
mMediaRecorder.setAudioChannels(1);
mMediaRecorder.setOutputFile(sender.getFileDescriptor());
mMediaRecorder.prepare();
But I'm getting a java.lang.IllegalStateException after calling start() on mMediaRecorder. What am I missing? When I'm not using sender.getFileDescriptor() everything works correctly, so that is probably the problem. I know there are many libraries that provide this functionality, but I prefer to fix this one. Casting video only works correctly; the only problem is with the audio. Thanks a lot for any help.
Order of executed methods:
I added logs to check the order of the methods and the thread they run on:
creating sockets: Socket opening thread
creating receiver: Socket opening thread
creating sender: Socket opening thread
setting audio source: Socket opening thread
setting properties: Socket opening thread
creating file descriptor: Socket opening thread
preparing media recorder: Socket opening thread
starting media recorder: Socket opening thread
I found that I'm also receiving errors:
2019-02-13 18:15:49.701 6176-13833/? E/StagefrightRecorder: Output file descriptor is invalid
2019-02-13 18:15:49.701 7851-9780/my.application E/MediaRecorder: start failed: -38

As stated here, a java.lang.IllegalStateException occurs when
a method has been invoked at an illegal or inappropriate time.
So with that in mind, and with this article on how to use sockets in mind, you should put your socket-related stuff inside an AsyncTask (a separate thread) and wrap it in try/catch, as in the sketch below.
See the AsyncTask documentation and the Socket documentation if you want to expand your knowledge.
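A rough sketch of that idea, assuming the fields and constant from the question (socketId, localServerSocket, receiver, sender, mMediaRecorder, SOCKET_BUFFER_SIZE) are available as members; a plain background Thread is used here in place of an AsyncTask:
// Sketch only: run the whole socket + recorder setup off the main thread
// and catch the checked/runtime exceptions instead of letting them escape.
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            localServerSocket = new LocalServerSocket(socketId);

            receiver = new LocalSocket();
            receiver.connect(new LocalSocketAddress(socketId));
            receiver.setReceiveBufferSize(SOCKET_BUFFER_SIZE);
            receiver.setSendBufferSize(SOCKET_BUFFER_SIZE);

            sender = localServerSocket.accept();
            sender.setReceiveBufferSize(SOCKET_BUFFER_SIZE);
            sender.setSendBufferSize(SOCKET_BUFFER_SIZE);

            mMediaRecorder = new MediaRecorder();
            mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.RAW_AMR);
            mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            // ... the other setAudio* calls from the question ...
            mMediaRecorder.setOutputFile(sender.getFileDescriptor());
            mMediaRecorder.prepare();
            mMediaRecorder.start();
        } catch (IOException | IllegalStateException e) {
            e.printStackTrace();
        }
    }
}, "Socket opening thread").start();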

It seems you are trying to use getFileDescriptor() before sender has data to pull out (or after the socket has already closed).
Try extracting the descriptor earlier in the code into a variable and then use that variable instead, as in the sketch at the end of this answer.
Another possibility: the MediaRecorder documentation says
You must specify a file descriptor that represents an actual file
so be sure that whatever sender.getFileDescriptor() returns is something mMediaRecorder.setOutputFile() can actually accept.
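For that second point, a small sketch of how you might check it (an assumption on my part, not a guaranteed fix): pull the descriptor into a variable right after accept() and verify it before handing it to the recorder; valid() is a standard java.io.FileDescriptor method, and the log tag is a placeholder:
// Grab the descriptor once, right after localServerSocket.accept(),
// and make sure it is still open before passing it to MediaRecorder.
FileDescriptor fd = sender.getFileDescriptor();
if (fd != null && fd.valid()) {
    mMediaRecorder.setOutputFile(fd);
} else {
    Log.e("AudioCast", "socket file descriptor is invalid");
}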

Related

Add microphone in unity for Android platform

I want to take audio input in my unity application which I am building for Android platform. The code I have added in Start Function is as follows:
var audio = GetComponent<AudioSource>();
audio.clip = Microphone.Start("Built-in Microphone", true, 10, 44100);
audio.loop = true;
while (!(Microphone.GetPosition(null) > 0)) { }
audio.Play();
But it is showing the following error:
ArgumentException: Couldn't acquire device ID for device name Built-in Microphone
I'm referring to this post to add the microphone. How do I resolve this? Also, is there any blog available for doing this end to end?
The error message clearly indicates that it can't find a Microphone device named "Built-in Microphone". So you should probably see what devices it can find.
Try running the following code in the Start method and see what output you get:
foreach (var device in Microphone.devices)
{
Debug.Log("Name: " + device);
}
Once you have a list of the devices, then replace "Built-in Microphone" with the name of your desired device. If "Built-in Microphone" is in the list or you get the same issue with a different device, then you're probably dealing with a permissions issue.

How can I create a receive-only sdp offer of webrtc?

I know how to create a send-only offer by adding "OfferToReceiveVideo:false" and "OfferToReceiveAudio:false" to the MediaConstraints param of this method:
public void createOffer(SdpObserver observer, MediaConstraints constraints)
But how can I create a receive-only SDP offer? I tried to create one by adding no media stream to the peer connection; however, that makes the SDP very short, it contains no "a=recvonly" line, and no ICE candidates are generated.
I want to create a webrtc peer connection to receive media stream, but not send.
Solved.
Set "OfferToReceiveAudio" and "OfferToReceiveVideo" to "true" in MediaConstraints, and do not add a local stream.

basic4android Server Socket

I'm trying to do two-way communication between a PC running a .NET client/server and an Android device (the Android code is written with Basic4Android).
Sending from Android to PC works fine; the problem occurs when I try to send from the PC to Android.
I'm trying to use ServerSocket, but when the PC tries to connect to the Android device, a time-out is reached and an exception is raised. The code I'm using is the following:
PC .NET
Dim sock As New Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp)
sock.Connect(remoteip, 8565)
Dim buffer() As Byte = UTF8.GetBytes(string)
sock.Send(buffer)
sock.Close()
and the Android code:
Sub Process_Globals
'These global variables will be declared once when the application starts.
'These variables can be accessed from all modules.
Dim ss As ServerSocket
Dim IS1 As InputStream
Dim timerListener As Timer
End Sub
Sub Activity_Create(FirstTime As Boolean)
timerListener.Initialize(timerListener, 1)
ss.Initialize(8565, "ss")
ss.Listen
End Sub
Sub ss_NewConnection (Successful As Boolean, NewSocket As Socket)
If Successful Then
IS1 = NewSocket.InputStream
timerListener.Enabled = True
Else
ToastMessageShow("Error.", True)
End If
End Sub
Sub timerListener_Tick
Dim cv As ByteConverter
If IS1.BytesAvailable > 0 Then
Dim buffer() As Byte
IS1.ReadBytes(buffer, 0, IS1.BytesAvailable)
Dim result As String = cv.StringFromBytes(buffer, "UTF8")
ToastMessageShow(result, True)
timerListener.Enabled = False
End If
End Sub
What can be the issue?
Thank you in advance!!
You should use AsyncStreams to read the data.
On the PC, read a byte from the stream after you write the message. This will cause the thread to wait for the byte to become available instead of closing the socket before the device has read the message.
In Sub timerListener_Tick(), control the timer action:
timerListener.Enabled = False
...
timerListener.Enabled = True
The timer will then poll for messages periodically.
Closing the PC socket straight after sending will cause a send error, because the socket is closing while you are in fact still sending. It's better to use a global variable to hold the socket instance, but if you want to send synchronously (wait for the transfer to complete), use the example from MSDN:
Synchronous Client Socket Example
I wouldn't recommend it, though; personally, I would prefer to use async transfers, as one of the posts above recommends.
See: Asynchronous Client Socket Example

Android Jwebsocket custom protocol

I am opening a connection setting up a custom protocol like this:
WebSocketSubProtocol d = new WebSocketSubProtocol("MyCustomProto",WebSocketEncoding.TEXT);
mJWC.addSubProtocol(d);
mJWC.open(mURL);
But... server side, I receive this in the protocol string:
"org.jwebsocket.json MyCustomProto"
How can I remove the "org.jwebsocket.json" part from the string?
I don't want to do it server side...
Thanks!
I will answer my own question.
Calling "addSubProtocol" doesn't seem to be the right solution, for a couple of reasons:
if you call those 3 lines of code multiple times (if the connection failed the first time, for example), the protocol string becomes something like
"org.jwebsocket.json MyCustomProto MyCustomProto"
It just keeps appending the protocol.
So I found a workaround. Now I don't use "addSubProtocol"; instead I define the protocol directly when I create the socket:
mJWC = new BaseTokenClient("client||"+code+"||"+name,WebSocketEncoding.TEXT);
Voila... no more "org.jwebsocket.json".
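Put together, the workaround is roughly this (using only the calls already shown above; code, name and mURL are whatever values your client already uses):
// Encode the custom protocol information into the client id at construction
// time instead of calling addSubProtocol() before every open().
mJWC = new BaseTokenClient("client||" + code + "||" + name, WebSocketEncoding.TEXT);
mJWC.open(mURL);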

Android streaming from icecast server get track information

I have a stream downloading from an Icecast server, and I can grab the information in the headers by doing the following:
URLConnection cn = new URL(mediaUrl).openConnection();
cn.connect();
int pos=1;
String x;
String y;
while (cn.getHeaderField(pos) != null)
{
x=cn.getHeaderFieldKey(pos);
y = cn.getHeaderField(x);
Log.e(":::::",""+x+" : "+y);
pos++;
}
When I do this all of the headers I receive are shown as:
content-type : audio/mpeg
icy-br : 64
ice-audio-info : ice-samplerate=22050;ice-bitrate=64;ice-channels=2
icy-br : 64
icy-description : RadioStation
icy-genre : Classical, New Age, Ambient
icy-name : RadioStation Example
icy-private : 0
icy-pub : 1
icy-url : http://exampleradio.com
server : Icecast 2.3.2
cache-control : no-cache
However if I open my stream in mplayer I get:
ICY Info: StreamTitle='artist - album - trackname'
and each time the song changes, the new track information is sent and appears the same way in mplayer.
In Android, when I attempt to read the icy-info, all I get returned is null. Also, how would I go about retrieving the new information from the headers while I am buffering from the stream? Even if I try to read a header I already know exists while buffering, such as:
Log.e(getClass().getName()," "+cn.getHeaderField("icy-br"));
All I get returned is null.
I hope this makes sense, I can post more code on request.
I realize this question is old, but for others who are facing this challenge, I am using this project: http://code.google.com/p/streamscraper/ to get track information from an Icecast stream. I'm using it on Android and so far it works as expected.
All you need is to call setDataSource() and pass the URL as a String, then prepareAsync(), and with mp.setOnPreparedListener(this) (or similar) you will get notified when the MediaPlayer is done buffering; then all you need to do is call mp.start(). P.S.: Don't forget to mp.stop(), mp.reset() and mp.release() when destroying the application. ;) I'm still thinking of a way to read the ICY info... I must either make my own buffering mechanism and write a buffer file (and init the MediaPlayer with a FileDescriptor), or make a separate connection from time to time to check for ICY info tags and then close the connection... Any better ideas, anyone?
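A minimal sketch of that MediaPlayer flow, assuming a hypothetical streamUrl String pointing at the Icecast mount point (it does not solve the ICY metadata part):
// Prepare asynchronously and start playback once buffering is done.
MediaPlayer mp = new MediaPlayer();
mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer player) {
        player.start();   // called when the MediaPlayer is done buffering
    }
});
try {
    mp.setDataSource(streamUrl);   // streamUrl: hypothetical Icecast stream URL
    mp.prepareAsync();             // non-blocking prepare
} catch (IOException e) {
    e.printStackTrace();
}
// When the app is destroyed: mp.stop(); mp.reset(); mp.release();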
