I am currently doing an AndAR project in a group of 3. I'm the person in charge of streaming video to the Android phone.
We got ourselves a D-Link DCS-920 IP camera, and I found out that it uses the MJPEG codec for the live video stream, while its web server uses Jview to display it. As far as I know MJPEG is not a supported format on Android OS, so I came up with an idea: instead of using an ImageView, I use a WebView to stream the video.
I've implemented a very simple version of this concept and it works! But the refresh rate is terrible.
I load the camera's still image (e.g. http://192.168.1.10/image.jpg) into the WebView and use a Timer to control the refresh rate. I wanted to set it to 30 fps (a refresh every 33 ms), but it can only go down to a 500 ms interval; at any lower interval it gets no smoother, sometimes the image won't load, and the connection becomes unstable (e.g. dropped). Could it be that I'm refreshing at a rate faster than the camera can deliver?
Yet the camera's own Jview page has no such problem! I was trying to find the source code for that Jview viewer, but with no luck.
Anyway, here's the code I've written:
package org.example.test;

import java.util.Timer;
import java.util.TimerTask;

import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.webkit.WebView;
import android.widget.Button;
import android.widget.EditText;

public class Webview extends Activity {

    public WebView webView;
    public Timer autoUpdate;
    public String url;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        webView = (WebView) findViewById(R.id.webview);
        webView.getSettings(); // note: this call alone doesn't change any settings

        final EditText urlText = (EditText) findViewById(R.id.urlText);

        // Buttons
        final Button connectB = (Button) findViewById(R.id.connectButton);
        connectB.setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                // load the camera snapshot and start the refresh timer
                url = urlText.getText().toString();
                webView.loadUrl(url);
                timerSetup();
            }
        });

        final Button exitB = (Button) findViewById(R.id.exitButton);
        exitB.setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                finish();
            }
        });
    }

    // refresh timer
    public void timerSetup() {
        autoUpdate = new Timer();
        autoUpdate.schedule(new TimerTask() {
            @Override
            public void run() {
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        // reload the snapshot URL on the UI thread
                        webView.loadUrl(url);
                    }
                });
            }
        }, 0, 500); // refresh interval (ms)
    }
}
Is there any way I can bring the video stream in at 15 fps or better / get a faster refresh rate?
Is there any such thing as an MJPEG viewer or source code that I can use to display these images?
Here's a screenshot of the app:
http://s945.photobucket.com/albums/ad295/kevinybh/?action=view&current=video.jpg
(not enough points to post pictures) :(
I just need to get the video stream to around 15-30 fps. Any suggestions/help would be very deeply appreciated :) Thanks!
Instead of an Arduino you could use a Raspberry Pi; it should have enough CPU power to control the vehicle and convert the video stream at the same time. Of course, you'd need to port all of your Arduino software to the Raspberry Pi...
MJPEG is a terribly inefficient way to deliver motion video to a mobile device, because each frame is compressed as its own independent picture. For an application which doesn't need motion video (someone was asking about a camera watching waiting lines last week), your solution of pushing a static frame every second or so sounds good.
If you need motion video, I would recommend transcoding on your web server from MJPEG to a supported video format that uses frame-to-frame compression. This results in far less data to push, both over the user's 3G connection and from your server to all of its clients. You should only need to run one transcoding engine to support all clients, and you'll be able to use the same one for Android and iPhone devices, though you may also want a higher-resolution output for tablets and PCs if your camera output is good enough to justify it.
On Android, decoding a JPEG on the CPU costs 40-100 ms per frame. If we want to play MJPEG at 15-30 fps, we need a hardware JPEG decoder.
There was a useful previous SO discussion and this great one with code. Would you try them and let us know if they work for you?
You can use the MjpegView class to display an MJPEG stream directly:
https://code.google.com/p/android-camera-axis/source/browse/trunk/serealisation/src/de/mjpegsample/MjpegView/MjpegView.java?r=33
You'll have to implement some AsyncTasks on top of this class to make it work properly.
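For what it's worth, wiring it up usually looks something like the sketch below. This assumes the companion MjpegInputStream class from the same de.mjpegsample package and that mjpegView is the MjpegView in your layout (names taken from the widely copied sample, not verified against this exact revision); the AsyncTask keeps the blocking network read off the UI thread:
private class ConnectTask extends AsyncTask<String, Void, MjpegInputStream> {
    @Override
    protected MjpegInputStream doInBackground(String... urls) {
        // blocking network call: open the MJPEG stream
        return MjpegInputStream.read(urls[0]);
    }

    @Override
    protected void onPostExecute(MjpegInputStream stream) {
        // hand the stream to the view on the UI thread
        mjpegView.setSource(stream);
        mjpegView.setDisplayMode(MjpegView.SIZE_BEST_FIT);
        mjpegView.showFps(true);
    }
}

// usage, e.g. from onCreate():
// new ConnectTask().execute("http://192.168.1.10/video.mjpg");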
Good luck
After reading the documentation on Spotify's Android Media Notifications API, https://beta.developer.spotify.com/documentation/android-sdk/guides/android-media-notifications/, I successfully managed to receive the notification metadata, and it is displayed properly in my app.
However, the notification metadata is only updated when the queue changes, when the track changes, and when playback changes, so unless one of these three actions happens, the "positionInMs" intent extra isn't sent.
Right now, as a workaround, I simply start a timer using the time the intent was sent, the last known playback position, and the track duration to track the current playback position.
This seemed to work at first, but after further testing I've realized that the timer can go out of sync if the track the user is listening to freezes because of a slow internet connection.
Any ideas to properly track the playback position, while accounting for a slow internet connection? Or are there any alternatives I should look into?
I understand that this question is rather old, but I am going to answer anyway, in case anyone else comes across it.
I recommend constantly querying Spotify to get the playback position. One way to do this is to use a timer and query Spotify on every tick. The example below queries Spotify every 100 ms. If you want to reduce or increase the number of queries, you can simply use stopwatch.setClockDelay() and provide your required interval.
For instance, you can use this timer library:
implementation 'com.yashovardhan99.timeit:timeit:1.2.0'
Then use the following code:
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.spotify);

    // ticks every 100 ms by default; adjust with stopwatch.setClockDelay()
    Stopwatch stopwatch = new Stopwatch();
    stopwatch.setOnTickListener(this);
    stopwatch.start();
}

@Override
public void onTick(Stopwatch stopwatch) {
    Data.getAndroidSpotifyAppRemote().getPlayerApi().getPlayerState()
            .setResultCallback(new CallResult.ResultCallback<PlayerState>() {
                @Override
                public void onResult(PlayerState playerState) {
                    // playbackPosition is a long, so convert it for logging
                    Log.d("TAG", String.valueOf(playerState.playbackPosition));
                }
            });
}
Don't forget to add the following to your class declaration:
implements Stopwatch.OnTickListener
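Putting it together, the class declaration would look roughly like this (the class name here is just a placeholder):
public class PlaybackPositionActivity extends Activity implements Stopwatch.OnTickListener {
    // onCreate() starts the stopwatch, onTick() polls the player state (see above)
}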
For an engineering project I basically need to trick my phone into coming out of NFC searching mode and into a mode where it continuously puts out energy. Obviously I have activated NFC in the settings, but the only way I can trick it into leaving searching mode and emitting continuously is to leave it on top of a blank tag.
I was thinking of taking the NFC Beam function public boolean invokeBeam (Activity activity) and working it into the BeamLargeFiles sample code provided in Android Studio, posted below.
I'm new to app development (though I have a fair bit of coding experience), so I'm just not sure if this is feasible, or if I'm looking in the right places. Any thoughts, ideas and help are appreciated!
package com.example.android.beamlargefiles;

import android.graphics.Color;
import android.os.Bundle;
import android.support.v4.app.FragmentTransaction;
import android.text.Html;
import android.widget.TextView;
import android.view.Menu;

import com.example.android.common.activities.SampleActivityBase;
import com.example.android.common.logger.Log;
import com.example.android.common.logger.LogFragment;
import com.example.android.common.logger.LogWrapper;
import com.example.android.common.logger.MessageOnlyLogFilter;

/**
 * A simple launcher activity containing a summary sample description
 * and a few action bar buttons.
 */
public class MainActivity extends SampleActivityBase {

    public static final String TAG = "MainActivity";
    public static final String FRAGTAG = "BeamLargeFilesFragment";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        TextView sampleOutput = (TextView) findViewById(R.id.sample_output);
        sampleOutput.setText(Html.fromHtml(getString(R.string.intro_message)));

        FragmentTransaction transaction = getSupportFragmentManager().beginTransaction();
        BeamLargeFilesFragment fragment = new BeamLargeFilesFragment();
        transaction.add(fragment, FRAGTAG);
        transaction.commit();
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    /** Create a chain of targets that will receive log data */
    @Override
    public void initializeLogging() {
        // Wraps Android's native log framework.
        LogWrapper logWrapper = new LogWrapper();
        // Using Log, front-end to the logging chain, emulates android.util.log method signatures.
        Log.setLogNode(logWrapper);

        // Filter strips out everything except the message text.
        MessageOnlyLogFilter msgFilter = new MessageOnlyLogFilter();
        logWrapper.setNext(msgFilter);

        // On screen logging via a fragment with a TextView.
        LogFragment logFragment = (LogFragment) getSupportFragmentManager()
                .findFragmentById(R.id.log_fragment);
        msgFilter.setNext(logFragment.getLogView());
        logFragment.getLogView().setTextAppearance(this, R.style.Log);
        logFragment.getLogView().setBackgroundColor(Color.WHITE);

        Log.i(TAG, "Ready");
    }
}
(From the BeamLargeFiles sample code provided by Android Studio.)
When no suitable NFC device is in range, the NFC controller will constantly search all technologies for a tag or peer-to-peer device. It does this by sending out short bursts separated by no field activity (to save power).
If you have a newer device, it is also very likely that the NFC controller will only generate a very weak RF field to save power. This field is not strong enough to power an NFC tag, but is strong enough for the chip to detect whether something is resonating at 13.56 MHz.
With stock Android you cannot change this behaviour; there is no programmatic way in the API to enable the mode you're looking for.
However, if you can stretch your requirements a bit you can likely get something close to what you want.
Option 1:
Enable reader mode: call enableReaderMode, passing the EXTRA_READER_PRESENCE_CHECK_DELAY extra set to a very high value.
Now, if a tag enters the RF field the NFC controller won't check for presence that often anymore. You can activate your RF field by touching a tag, then removing it.
The RF field will be stable until the presence check delay expires.
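As a rough sketch (API 19+; the 60-second delay and the flag set are example values, not requirements):
// Raise the presence-check delay so the RF field stays up once a tag is seen.
Bundle options = new Bundle();
options.putInt(NfcAdapter.EXTRA_READER_PRESENCE_CHECK_DELAY, 60000); // 60 s, example value

NfcAdapter nfcAdapter = NfcAdapter.getDefaultAdapter(this);
nfcAdapter.enableReaderMode(this,
        new NfcAdapter.ReaderCallback() {
            @Override
            public void onTagDiscovered(Tag tag) {
                // touch a tag once to bring the field up, then remove it;
                // the field stays on until the presence check finally runs
            }
        },
        NfcAdapter.FLAG_READER_NFC_A | NfcAdapter.FLAG_READER_SKIP_NDEF_CHECK,
        options);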
Option 2:
If rooting the device is an option, you can hack yourself into the low level NFC stack. Each NFC controller that I've worked with so far has one or more test modes for antenna calibration. Just outputting an RF field is one of the very common test modes.
Reading the source code of the NFC stack will likely show you how to enable such a mode. That takes some digging in the source code and is not for the faint of heart, but it is doable.
I want to implement a cool effect: when there is an explosion, the music gets slightly slower for a moment.
The Music class of libgdx doesn't allow changing the pitch of the sound. I tried using the Sound class instead of Music to play my music, but it's very slow at loading files (5 s for a 5 MB file, on desktop!).
So the question is whether there is a way to work around this, or whether there is an external library that works on both desktop and Android and can be used alongside libgdx.
After some searching I found a way by editing the libgdx source code.
You need to use the AudioDevice object to play your music sample by sample; to do that, you need to include the audio extension of libgdx.
We're going to edit the libgdx source code, so you need to download it and replace gdx.jar, gdx-backend-android.jar and gdx-backend-lwjgl.jar with the corresponding libgdx projects (they have the same names, without the jar extension).
1) Edit AudioDevice.java inside the com.badlogic.gdx.audio package of the gdx project.
Add the following code inside the interface:
public void setSpeed(float val);
2) Edit AndroidAudioDevice.java inside the com.badlogic.gdx.backends.android package of the gdx-backend-android project.
The Android side of the AudioDevice class relies on the AudioTrack class of the Android SDK; this class has a setPlaybackRate(..) method.
Add the following code inside the class:
@Override
public void setSpeed(float speed) {
    track.setPlaybackRate((int) (track.getSampleRate() * speed));
}
3) Edit OpenALAudioDevice.java inside the com.badlogic.gdx.backends.lwjgl.audio package of the gdx-backend-lwjgl project.
The desktop side of AudioDevice relies on OpenAL (the OpenGL of audio), which has a handy pitch setter.
Add the following inside the class:
@Override
public void setSpeed(float speed) {
    alSourcef(sourceID, AL_PITCH, speed);
}
4) Play the audio.
Here is the code for loading and playing the sound file:
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.AudioDevice;
import com.badlogic.gdx.audio.io.Mpg123Decoder;
import com.badlogic.gdx.files.FileHandle;

public class MusicBeat {

    static short[] samples = new short[2048];
    Mpg123Decoder decoder;
    AudioDevice device;
    static FileHandle externalFile;
    public boolean playing = true;

    public MusicBeat(String name) {
        FileHandle file = Gdx.files.internal(name);
        FileHandle external = Gdx.files.external("myappname/" + name);
        // copy the file to external storage only if it isn't there yet;
        // Mpg123Decoder can only read external files
        if (!external.exists()) file.copyTo(external);
        decoder = new Mpg123Decoder(external);
        device = Gdx.audio.newAudioDevice(decoder.getRate(), decoder.getChannels() == 1);
        playing = false;
        externalFile = external; // keep the external copy so the decoder can be recreated
    }

    void play() {
        playing = true;
        Thread playbackThread = new Thread(new Runnable() {
            @Override
            public synchronized void run() {
                int readSamples = 0;
                while (playing) {
                    if (decoder != null) {
                        if ((readSamples = decoder.readSamples(samples, 0, samples.length)) <= 0) {
                            // end of stream: reset the decoder so play() can be called again
                            decoder.dispose();
                            decoder = new Mpg123Decoder(externalFile);
                            playing = false;
                            continue;
                        }
                        device.writeSamples(samples, 0, readSamples);
                    }
                }
            }
        });
        playbackThread.setDaemon(true);
        playbackThread.start();
    }

    public void stop() {
        playing = false;
        decoder.dispose();
    }

    public void setVolume(float vol) {
        device.setVolume(vol);
    }

    public void setSpeed(float speed) {
        device.setSpeed(speed);
    }
}
Where to get the audio extension (required for the AudioDevice)?
The audio extension seems to have been deprecated and the jars can't be found easily; I have uploaded them here. It's an old version, but it should work just fine.
Easier way?
If your game is only intended to run on desktop: the Music class of libgdx on desktop again relies on OpenAL, which gives us the power to play with the pitch. So you just need to edit the Music implementation (OpenALMusic on desktop) instead of the AudioDevice, and leave the whole sample-by-sample playback out of the equation (see the sketch below). Unfortunately, as dawez said, the Music class on Android relies on MediaPlayer, which does not give us the pitch-change option.
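A minimal desktop-only sketch of that idea; it assumes the lwjgl backend and that your libgdx version exposes the OpenAL source id on OpenALMusic via getSourceId() (older versions may need a small patch like the ones above):
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.Music;
import com.badlogic.gdx.backends.lwjgl.audio.OpenALMusic;
import org.lwjgl.openal.AL10;

Music music = Gdx.audio.newMusic(Gdx.files.internal("track.mp3"));
music.play();
if (music instanceof OpenALMusic) {
    // drop the pitch to 80% for the slow-motion effect
    AL10.alSourcef(((OpenALMusic) music).getSourceId(), AL10.AL_PITCH, 0.8f);
}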
Conclusion: this method doesn't seem nice to me. If your game really needs the pitch effect and doesn't make sense without it, then go with it; otherwise it's just too much effort for such a small detail.
It seems that you cannot change the pitch of Music in Android.
To play music in libgdx you go through the Music interface:
public interface Music extends Disposable {
This is implemented by
package com.badlogic.gdx.backends.android;
public class AndroidMusic implements Music, MediaPlayer.OnCompletionListener {
private MediaPlayer player;
You are interested in the player object, which is of type MediaPlayer:
package android.media;
public class MediaPlayer {
So now it boils down to the question of whether Android supports pitching on the MediaPlayer class. Short answer: no. Long answer: Speed Control of MediaPlayer in Android.
A workaround is to use a SoundPool, even though loading is slow; the loading can be started while the user is in the loading screen. Otherwise you can try to split the music files into chunks and load them as you go, still using a SoundPool. You would be doing something like lazy loading, fetching the next chunk when the current part is coming to its end.
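One point in favour of that workaround: SoundPool.play() takes a rate argument (0.5x to 2x), so it also gives you the pitch control the question asks about. A small sketch, with a hypothetical resource name:
// rate goes from 0.5 to 2.0; 1.0 is normal speed/pitch
SoundPool pool = new SoundPool(1, AudioManager.STREAM_MUSIC, 0);
pool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
    @Override
    public void onLoadComplete(SoundPool soundPool, int sampleId, int status) {
        if (status == 0) {
            soundPool.play(sampleId, 1f, 1f, 1, 0, 0.8f); // last argument is the rate
        }
    }
});
pool.load(context, R.raw.music_chunk, 1); // R.raw.music_chunk is a placeholder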
If you manage to find a suitable solution, please post the relevant code!
With the release of Android 6.0 (API level 23), PlaybackParams can be set on the MediaPlayer. These include playback speed and pitch, among others.
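For example (API 23+; R.raw.music is a placeholder):
MediaPlayer player = MediaPlayer.create(context, R.raw.music); // returns a prepared player
// note: setting params with a non-zero speed may start playback immediately
player.setPlaybackParams(new PlaybackParams().setSpeed(0.8f).setPitch(0.8f));
player.start();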
I am developing an Android app in AIR for Android using Flash Pro CC, and I am tired of pushing updates all the time just to change the spawn location of an image that needs to move every few days to a specific spot. I won't know the location until minutes before the update needs to be pushed, so it would be much faster to have the app load the spawn coordinates for the image on launch from a .txt file on my website. I would need something where I just type the X and Y coordinates into a file, and AS3 then spawns the image at those coordinates. If no coordinates are available in the text file (as will be the case five days of the week), I need a different image to be displayed wherever I place it. I will probably just use a separate frame for that, though.
Any help is greatly appreciated, and I'd prefer it if the image can be used in a motion tween, but if not I will work something out.
NOTE: I am new to AS3 coding but I have Flash itself figured out for animating with the timeline.
Have a look at URLRequest and URLLoader for retrieving the data. For spawning the image at a specific location, consider just moving it instead; any object on stage is a DisplayObject, and DisplayObjects have the properties x and y. For swapping out the images, look at DisplayObjectContainer, specifically the functions DisplayObjectContainer.addChild(child:DisplayObject) and DisplayObjectContainer.removeChild(child:DisplayObject). I have provided links to the documentation for each of the relevant functions.
If the update is on a daily basis, have a look at the Date class too; it will allow you to find out what date it is and whether you need to make a URL request to load the text file to display the image.
If you have any specific questions regarding the use of these classes, I think it's best if you make a new question with a link back to this one for context. You're good with English, not so good with AS3 (as you say), so I could explain the relevant bits where needed, but it would be a long and complex story if I were to explain this entire functionality in one go. ... I think you'll find that these class names will make googling easier too.
I expect that you'll have to use a URLLoader with a URLRequest to load the text file, then, depending on the results, display the image by adding it to the stage via addChild if it's not there yet, and then set its x and y values. You'll have to use the Date class to check whether you need to make a new request every time the user starts the application or performs some specific action.
I have the finished code here. loadURL is the Document Class loaded by Flash. Everything works great!
package {

    // imports
    import flash.display.MovieClip;
    import flash.net.URLRequest;
    import flash.net.URLLoader;
    import flash.events.UncaughtErrorEvent;
    import flash.events.ErrorEvent;
    import flash.events.Event;

    public class loadURL extends MovieClip {

        // declares variables
        public var Xurl:String = "URL GOES HERE";
        public var Yurl:String = "URL GOES HERE";
        public var URLloaderX:URLLoader = new URLLoader();
        public var URLloaderY:URLLoader = new URLLoader();
        public var marker:Marker = new Marker();
        public var gone:Gone = new Gone();
        public var connectionerr:ConnectionErr = new ConnectionErr();

        // code executed upon launch
        public function loadURL() {
            trace("Loaded");
            URLloaderX.addEventListener(Event.COMPLETE, completeHandlerX);
            URLloaderX.load(new URLRequest(Xurl));
            URLloaderY.addEventListener(Event.COMPLETE, completeHandlerY);
            URLloaderY.load(new URLRequest(Yurl));
            loaderInfo.uncaughtErrorEvents.addEventListener(UncaughtErrorEvent.UNCAUGHT_ERROR, onUncaughtError);
        }

        function completeHandlerX(event:Event):void {
            if (URLloaderX.data == null || URLloaderX.data == "") {
                // no coordinates available: show the fallback image
                addChild(gone);
            } else {
                addChild(marker);
                marker.x = Number(URLloaderX.data); // loaded data is a String, so convert it
            }
        }

        function completeHandlerY(event:Event):void {
            // only position the marker if a Y coordinate was actually delivered
            if (URLloaderY.data == null || URLloaderY.data == "") return;
            marker.y = Number(URLloaderY.data);
        }

        // fires when there is no internet connection
        private function onUncaughtError(e:UncaughtErrorEvent):void {
            e.preventDefault(); // leave this
            addChild(connectionerr);
        }
    }
}
I have seen lots of examples of Android's VideoView API being used to stream data from an external server to a device (VideoView internally uses an RTP/RTSP stack to receive data).
However, there is very little discussion of the possibility of using Android's internal RTSP and RTP stacks for server capabilities, i.e. making an Android device act as a streaming server and stream media out.
Is it possible?
And where in the Android native code should I start digging to achieve such functionality?
I would appreciate details.
Thanks
Amit
A bit late, but:
You can set the MediaRecorder output format to 7. This is defined in /frameworks/base/media/java/android/media/MediaRecorder.java (check that file for details) as:
/** @hide Stream over a socket, limited to a single stream */
public static final int OUTPUT_FORMAT_RTP_AVP = 7;
The destination is controllable via
setprop streaming.ip
and
setprop streaming.port
The AV data will then be streamed to the given destination address.
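A hedged sketch of what that looks like in code. OUTPUT_FORMAT_RTP_AVP is @hide, so the raw constant 7 is used; this leans on AOSP internals and the streaming.ip/streaming.port properties, so expect it to require an engineering/rooted build and to vary between releases:
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(7); // MediaRecorder.OUTPUT_FORMAT_RTP_AVP, hidden constant
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
// a camera preview surface may also be required, as for any camera recording
recorder.prepare();
recorder.start(); // AV data goes to the address set in streaming.ip / streaming.port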
The RTP code (native) itself lives in the
/frameworks/base/media/libstagefright/rtsp directory.
Happy code digging
There is also the possibility of using the libstreaming library (https://github.com/fyhertz/libstreaming).
The documentation on GitHub gives you an example of how to set up the server, but basically you need to add a net.majorkernelpanic.streaming.gl.SurfaceView to your layout:
<net.majorkernelpanic.streaming.gl.SurfaceView
    android:id="@+id/surface"
    android:layout_width="match_parent"
    android:layout_height="match_parent"/>
Add this to your manifest
<service android:name="net.majorkernelpanic.streaming.rtsp.RtspServer"/>
Include the libstreaming library. If you are working with a newer version of Android Studio, you need to clone libstreaming as a separate project and import the module. Afterwards it is necessary to run a build on the build.gradle in libstreaming. Then you can work with the library.
The last step is to create an Activity. The simplest possible one looks like this:
// Assumed imports for this snippet:
import android.app.Activity;
import android.content.Intent;
import android.content.pm.ActivityInfo;
import android.os.Bundle;

import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspServer;

public class RemoteStreamingActivity extends Activity {

    private SurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_remote_streaming);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        // handleGestures(); // gesture helper from the original project, omitted here

        mSurfaceView = (SurfaceView) findViewById(R.id.surface);
        SessionBuilder.getInstance()
                .setSurfaceView(mSurfaceView)
                .setPreviewOrientation(90)
                .setContext(getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_NONE)
                .setVideoEncoder(SessionBuilder.VIDEO_H264);

        this.startService(new Intent(this, RtspServer.class));
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        this.stopService(new Intent(this, RtspServer.class));
    }
}
If you want to test whether the RTSP server is running, you can try using VLC and connecting via the URL: rtsp://{ipAddressOfYourDevice}:8086?h264=200-20-320-240