Possibility of an RTSP server on an Android device?

I have seen lots of examples of Android's VideoView API being used to stream data from an external server into a device (VideoView internally uses an RTP and RTSP stack to receive data).
However, there is very little discussion of the possibility of using Android's internal RTSP and RTP stacks to achieve server capabilities, i.e. making an Android device act as a streaming server and stream media out.
Is it possible?
And where inside the Android native code can I start digging to achieve such functionality?
Would appreciate details.
Thanks
Amit

A bit late, but:
You can set the MediaRecorder output format to 7. This is defined in
/frameworks/base/media/java/android/media/MediaRecorder.java
Check that file for details, as:
/** @hide Stream over a socket, limited to a single stream */
public static final int OUTPUT_FORMAT_RTP_AVP = 7;
The destination is controllable via
setprop streaming.ip
and
setprop streaming.port
The AV data will then be streamed to the given destination address.
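Putting those pieces together, a very rough, untested sketch (it relies on hidden behaviour, so treat it as a starting point only) might look like this:
// Very rough sketch: OUTPUT_FORMAT_RTP_AVP is a hidden constant, so the raw value 7 is used.
// The destination must have been set beforehand, e.g.
//   adb shell setprop streaming.ip 192.168.1.10
//   adb shell setprop streaming.port 5006
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(7); // MediaRecorder.OUTPUT_FORMAT_RTP_AVP (hidden)
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
try {
    recorder.prepare();
    recorder.start(); // AV data is streamed to streaming.ip:streaming.port
} catch (IOException e) {
    e.printStackTrace();
}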
The RTP code (native) itself lives in the
/frameworks/base/media/libstagefright/rtsp directory.
Happy code digging

There is also the possibility of using the libstreaming library (https://github.com/fyhertz/libstreaming).
The documentation on GitHub shows how to set up the server, but basically you need to add net.majorkernelpanic.streaming.gl.SurfaceView to your layout:
<net.majorkernelpanic.streaming.gl.SurfaceView
    android:id="@+id/surface"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
Add this service to your manifest:
<service android:name="net.majorkernelpanic.streaming.rtsp.RtspServer"/>
Include the libstreaming library. If you are working with a newer version of Android Studio, you need to clone libstreaming as a separate project and import it as a module. Afterwards, run a Gradle build on the libstreaming module's build.gradle. Then you can work with the library.
The last step is to create an Activity. The simplest possible version looks like this:
public class RemoteStreamingActivity extends Activity {

    private SurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_remote_streaming);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        handleGestures(); // gesture handling helper defined elsewhere in the original example
        mSurfaceView = (SurfaceView) findViewById(R.id.surface);
        SessionBuilder.getInstance()
                .setSurfaceView(mSurfaceView)
                .setPreviewOrientation(90)
                .setContext(getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_NONE)
                .setVideoEncoder(SessionBuilder.VIDEO_H264);
        this.startService(new Intent(this, RtspServer.class));
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        this.stopService(new Intent(this, RtspServer.class));
    }
}
If you want to test whether the RTSP server is running, you can try connecting with VLC via the URL: rtsp://{ipAddressOfYourDevice}:8086?h264=200-20-320-240

Related

Open media output picker programmatically

In Android 9, media output controls were added to the volume selection dialog.
I would like to have a button in my app that opens these system media output controls so that the user can choose where the audio will be played.
I could find nothing about this in "Android 9 features and APIs", "Behavior changes: apps targeting API level 28+", or "Behavior changes: all apps".
You are looking for the MediaRouter API.
public class MediaRouterPlaybackActivity extends AppCompatActivity {

    private MediaRouteSelector mSelector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Create a route selector for the type of routes your app supports.
        mSelector = new MediaRouteSelector.Builder()
                // These are the framework-supported intents
                .addControlCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK)
                .addControlCategory(MediaControlIntent.CATEGORY_LIVE_AUDIO)
                .build();
    }
}
For further reference: https://developer.android.com/guide/topics/media/mediarouter#java
Also check the "Create a MediaRouteSelector" section.
I did some research about this as well, so I know that Android Pie is capable of this, but they haven't provided a way for us developers to do it yet (at least I haven't found one). The CATEGORY_LIVE_AUDIO in MediaRouteSelector is also only for secondary output; you can read about it on the developer website. So if you must do it, I think the only way right now is a programmatic workaround (see the sketch after this list):
1. Disconnect the current device (the importance of this step depends on the device; I have to do this on my Essential phone).
2. Use BluetoothAdapter.getBondedDevices() to get a list of paired devices.
3. Connect to the one you want.
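A minimal sketch of step 2, assuming the BLUETOOTH permission is declared (BLUETOOTH_CONNECT on Android 12+); note that actually routing audio to the chosen device in step 3 typically needs the hidden BluetoothA2dp connect call via reflection and is device-dependent:
// List paired devices so the user can pick an output target.
BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
if (adapter != null && adapter.isEnabled()) {
    for (BluetoothDevice device : adapter.getBondedDevices()) {
        Log.d("OutputPicker", device.getName() + " @ " + device.getAddress());
    }
}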
Try this - it works for me on Android 11:
Intent intent = new Intent("com.android.settings.panel.action.MEDIA_OUTPUT");
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
context.startActivity(intent);

Local Video Renderer in Android WebRTC

I am using this library: https://bintray.com/google/webrtc/google-webrtc
What I want to achieve (at least at the beginning of my project) is to render video locally. I am using this tutorial (which is the only one around the Internet): https://vivekc.xyz/getting-started-with-webrtc-for-android-daab1e268ff4. Unfortunately, the last line of code is no longer up to date. The constructor needs a callback which I have no idea how to implement:
localVideoTrack.addRenderer(new VideoRenderer(i420Frame -> {
    // no idea what to put here
}));
My code is exactly the same as in the posted tutorial. This is the very first step in getting familiar with WebRTC technology on Android, and I cannot figure it out. My camera is capturing the video because I can see it in my log:
I/org.webrtc.Logging: CameraStatistics: Camera fps: 28.
The main issue is that I have no idea how to pass it to my SurfaceViewRenderer through a callback. Has anyone run into this problem? I would really appreciate any help or suggestions.
Here is the official example app, which is the only other source, but it is done differently than in the tutorial and is much more complicated:
https://webrtc.googlesource.com/src/+/master/examples/androidapp/src/org/appspot/apprtc
You are right, the API no longer matches the tutorial, but it's close.
The VideoTrack used to have an addRenderer(VideoRenderer renderer) method that required you to create a VideoRenderer with the SurfaceViewRenderer as a parameter. That is no longer possible, so instead you should use the addSink(VideoSink sink) method of the VideoTrack. The SurfaceViewRenderer object implements the VideoSink onFrame(VideoFrame frame) method, so you can pass it directly.
VideoTrack videoTrack = utility.createVideoTrack();
videoTrack.addSink(this.localSurfaceViewRenderer);
I used the same official example app as a reference to reach this conclusion, and it works fine for me.
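For completeness, a minimal sketch of the renderer side, assuming an EglBase instance named rootEglBase (as in the official example) and a SurfaceViewRenderer with the hypothetical id local_view in the layout; the renderer must be initialised before frames arrive:
SurfaceViewRenderer localRenderer = (SurfaceViewRenderer) findViewById(R.id.local_view); // hypothetical view id
localRenderer.init(rootEglBase.getEglBaseContext(), null); // rootEglBase created elsewhere, as in the official example
localRenderer.setMirror(true); // mirror the local front-camera preview
videoTrack.addSink(localRenderer);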
private static class ProxyVideoSink implements VideoSink {

    private VideoSink target;

    @Override
    public synchronized void onFrame(VideoFrame frame) {
        if (target == null) {
            Logging.d("TAG", "Dropping frame in proxy because target is null.");
            return;
        }
        target.onFrame(frame);
    }

    public synchronized void setTarget(VideoSink target) {
        this.target = target;
    }
}

ProxyVideoSink localVideoSink = new ProxyVideoSink();
videoTrack.addSink(localVideoSink);
localVideoSink.setTarget(localSurfaceView);
Try this code, as directly assigning videoTrack.addSink(localSurfaceView) might crash on the next initialization.

Libgdx: change pitch in Music

I want to implement a cool effect: when there is an explosion, the music gets slightly slower for a moment.
The Music class of libgdx doesn't allow changing the pitch of the sound. I tried using the Sound class instead of Music to play my music, but it is very slow at loading the files (5 seconds for a 5 MB file, on desktop!).
So the question is whether there is a way to work around this, or whether there is an external library that works on both desktop and Android and can be used alongside libgdx.
After some searching I found a way, by editing the libgdx source code.
You need to use the AudioDevice object to play your music sample by sample; to do that you need to include the audio extension of libgdx.
We are going to edit the libgdx source code, so you need to download it and replace gdx.jar, gdx-backend-android.jar and gdx-backend-lwjgl.jar with the corresponding libgdx source projects (they have the same names without the .jar extension).
1) Edit AudioDevice.java in the com.badlogic.gdx.audio package of the gdx project and add the following method to the interface:
public void setSpeed(float val);
2) Edit AndroidAudioDevice.java in the com.badlogic.gdx.backends.android package of the gdx-backend-android project.
The Android side of the AudioDevice class relies on the AudioTrack class of the Android SDK; this class has a setPlaybackRate(..) method.
Add the following code inside the class:
@Override
public void setSpeed (float speed) {
    track.setPlaybackRate((int) (track.getSampleRate() * speed));
}
3) Edit OpenALAudioDevice.java in the com.badlogic.gdx.backends.lwjgl.audio package of the gdx-backend-lwjgl project.
The desktop side of the AudioDevice relies on OpenAL (the OpenGL of audio), which has a handy pitch setting.
Add the following inside the class:
@Override
public void setSpeed (float speed) {
    alSourcef(sourceID, AL_PITCH, speed);
}
4) Play the audio.
Here is the code for loading and playing the sound file:
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.AudioDevice;
import com.badlogic.gdx.audio.io.Mpg123Decoder;
import com.badlogic.gdx.files.FileHandle;

public class MusicBeat {

    static short[] samples = new short[2048];
    Mpg123Decoder decoder;
    AudioDevice device;
    static FileHandle externalFile;
    public boolean playing = true;

    public MusicBeat(String name) {
        FileHandle file = Gdx.files.internal(name);
        FileHandle external = Gdx.files.external("myappname/" + name);
        // Copy the file to external storage only if it doesn't exist there yet
        // (the decoder needs a real file on disk).
        if (!external.exists()) file.copyTo(external);
        decoder = new Mpg123Decoder(external);
        device = Gdx.audio.newAudioDevice(decoder.getRate(), decoder.getChannels() == 1);
        playing = false;
        externalFile = external; // keep the external copy so the decoder can be recreated later
    }

    void play() {
        playing = true;
        Thread playbackThread = new Thread(new Runnable() {
            @Override
            public synchronized void run() {
                int readSamples = 0;
                // Feed the AudioDevice sample by sample until the track ends or stop() is called.
                while (playing) {
                    if (decoder != null) {
                        if ((readSamples = decoder.readSamples(samples, 0, samples.length)) <= 0) {
                            decoder.dispose();
                            decoder = new Mpg123Decoder(externalFile);
                            playing = false;
                        }
                        device.writeSamples(samples, 0, readSamples);
                    }
                }
            }
        });
        playbackThread.setDaemon(true);
        playbackThread.start();
    }

    public void stop() {
        playing = false;
        decoder.dispose();
    }

    public void setVolume(float vol) {
        device.setVolume(vol);
    }

    public void setSpeed(float speed) {
        device.setSpeed(speed);
    }
}
Where to get the audio extension? (required for the AudioDevice)
The audio extension seems to have been deprecated and the jars can't be found easily. I have uploaded them here; it's an old version, but it should work just fine.
Easier way?
If your game is only intended to run on desktop, there is a shortcut: the Music class of libgdx on desktop again relies on OpenAL, which gives us the power to play with the pitch, so you just need to edit the Music implementation (OpenALMusic on desktop) instead of the AudioDevice and take the whole play-sample-by-sample approach out of the equation. Unfortunately, as dawez said, the Music class on Android relies on MediaPlayer, which does not offer a pitch-change option.
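For example, a rough, untested sketch of that desktop-only route, assuming you add a method directly to OpenALMusic in the libgdx sources (sourceID is its internal OpenAL source handle), analogous to the AudioDevice change above:
// Inside com.badlogic.gdx.backends.lwjgl.audio.OpenALMusic (sketch only):
public void setPitch (float pitch) {
    if (sourceID != -1) alSourcef(sourceID, AL_PITCH, pitch);
}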
Conclusion: This method doesn't seem nice to me. If your game really needs the pitch effect and doesn't make sense without it, then go with it; otherwise it's just too much effort for such a small detail.
It seems that you cannot change the pitch of Music on Android.
To play music in libgdx you use the Music interface:
public interface Music extends Disposable {
This is implemented by
package com.badlogic.gdx.backends.android;

public class AndroidMusic implements Music, MediaPlayer.OnCompletionListener {
    private MediaPlayer player;
You are interested in the player object, which is of type MediaPlayer:
package android.media;
public class MediaPlayer {
So now it boils down to the question of whether Android supports pitch changes on the MediaPlayer class. Short answer: no. Long answer: Speed Control of MediaPlayer in Android.
A workaround is to use a SoundPool, even if it is slow; the loading can be started while the user is on the loading screen. Otherwise you can try splitting the music files into chunks and loading them as you go, still using a SoundPool, so you would do something like lazy loading when the current part is coming to its end (a rough sketch follows below).
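A minimal sketch of that idea, using the libgdx Sound API (backed by SoundPool on Android); the chunk file name is hypothetical:
// Play one pre-split chunk of the track and momentarily lower its pitch for the explosion.
Sound chunk = Gdx.audio.newSound(Gdx.files.internal("music_chunk_01.ogg")); // hypothetical chunk file
long id = chunk.play(1.0f);
chunk.setPitch(id, 0.8f); // slightly slower/lower for a moment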
If you manage to find a suitable solution, please post the relevant code!
With the release of Android 6.0 (API level 23), PlaybackParams can be set on the MediaPlayer. These include playback speed and pitch, among others.
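A minimal sketch (API 23+), assuming an already-created MediaPlayer; the resource name is hypothetical:
MediaPlayer player = MediaPlayer.create(context, R.raw.music); // hypothetical raw resource
player.start();
PlaybackParams params = player.getPlaybackParams();
params.setPitch(0.9f); // slightly lower pitch
params.setSpeed(0.9f); // slightly slower playback
player.setPlaybackParams(params);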

Did anybody use the ACR1222L Android library?

I am writing an app which will run on tablets. The tablet will be connected to an ACR1222L NFC reader.
I am using their Android library to interact with the reader. I can detect the USB reader and also read the reader's name.
But I am struggling to read data from the NFC tag. In fact, I have no clue where to start or which classes/methods to use.
Is there anyone who has already worked with the ACR1222L and its Android library?
Some guidelines, sample code, or a tutorial would save my life.
EDIT:
Well, I got a little smarter now; I can read the UID. This is how to do it:
@Override
protected void onCreate(Bundle savedInstanceState) {
    // ............... your code
    mReader = new Reader(mManager);
    mReader.setOnStateChangeListener(new OnStateChangeListener() {
        @Override
        public void onStateChange(int slotNum, int prevState, int currState) {
            // This command is for the card UID
            byte[] command = {(byte) 0xFF, (byte) 0xCA, 0x00, 0x00, 0x00};
            byte[] response = new byte[300];
            int responseLength;
            if (currState == Reader.CARD_PRESENT) {
                try {
                    mReader.power(slotNum, Reader.CARD_WARM_RESET);
                    mReader.setProtocol(slotNum, Reader.PROTOCOL_T0 | Reader.PROTOCOL_T1);
                    responseLength = mReader.transmit(slotNum, command, command.length,
                            response, response.length);
                    // Here I have the card UID if I send the proper command
                    responsedata = NfcUtils.convertBinToASCII(response);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    });
}
But I am still struggling to read the payload from the tag. I have also looked into the nfctools library, but I don't know where to start. It would be great if anyone could guide me through the library.
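A possible starting point (an untested sketch, assuming a MIFARE Ultralight/NTAG-style tag and the same onStateChange context as above; the page number and length are guesses) is to send the PC/SC Read Binary pseudo-APDU through the same transmit() call and then parse the NDEF TLV out of the returned bytes yourself, e.g. with nfctools:
// FF B0 00 <page> <length>: read 16 bytes starting at page 4, where NDEF data usually begins.
byte[] readCommand = {(byte) 0xFF, (byte) 0xB0, 0x00, 0x04, 0x10};
byte[] readResponse = new byte[300];
int readLength = mReader.transmit(slotNum, readCommand, readCommand.length,
        readResponse, readResponse.length);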
Yes, this is possible, and it is much the same as working with the ACR122, as the API is almost identical.
I've developed a (commercially available) library which probably does most of what you're looking for, or can serve as a starting point for your own implementation.

Accessing the front camera of a mobile device using Flash?

Recently I went through the code for accessing the camera using Flash ActionScript 3, and I have tested the code on an iMac, an iPhone, and Android. Based on this, I am now developing an application for Android which requires access to the front camera. My problem is that I don't know how to access the front camera. Should we use some other code, or should we specify which camera should be accessed? First of all, can we access the front camera through Flash?
I made a simple Android app. Here is the code for the camera-selection window:
public class SelectCameraAlertAndroid extends StartAlertAndroid_design {

    public function SelectCameraAlertAndroid() {
        frontCameraButton.addEventListener(MouseEvent.CLICK, onFrontCamera);
        backCameraButton.addEventListener(MouseEvent.CLICK, onBackCamera);
    }

    private function onFrontCamera(event:MouseEvent):void {
        Model.model.camera = Camera.getCamera("1");
        Model.model.cameraSelectedSignal.dispatch();
        dispatchEvent(new Event("closeMe"));
    }

    private function onBackCamera(event:MouseEvent):void {
        Model.model.camera = Camera.getCamera("0");
        Model.model.cameraSelectedSignal.dispatch();
        dispatchEvent(new Event("closeMe"));
    }
}
Not true. You can access the front camera on Android.
The only problem is that you don't get to use CameraUI (pretty sure).
var camera = Camera.getCamera("1");
camera.setMode(stage.stageWidth, stage.stageHeight, 30, true);
var video:Video = new Video(stage.stageWidth, stage.stageHeight);
video.attachCamera(camera);
addChild(video);
Note: This answer is outdated. Please refer to the other answers for updated information.
Currently, AIR only supports access to the primary camera on an Android device.
http://forums.adobe.com/thread/849983
Official documentation: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/media/Camera.html#getCamera()
"On Android devices, you can only access the rear-facing camera."
