I want to implement a cool effect: when there is an explosion, the music gets slightly slower for a moment.
The Music class of libgdx doesn't allow changing the pitch of the sound. I tried using the Sound class instead of Music to play my music, but it's very slow at loading the files (5 seconds for a 5 MB file, on desktop!).
So, the question is whether there is a way to work around this, or whether there is an external library that works on both desktop and Android and can work alongside libgdx.
After some searching I found a way, by editing the libgdx source code.
You need to use the AudioDevice object to play your music sample by sample; to do that you need to include the audio extension of libgdx.
We are going to edit the libgdx source code, so you need to download it and replace gdx.jar, gdx-backend-android.jar and gdx-backend-lwjgl.jar with the corresponding libgdx projects (they have the same names, without the .jar extension).
1) Edit AudioDevice.java inside the com.badlogic.gdx.audio package of the gdx project.
Add the following method to the interface:
public void setSpeed(float val);
2) Edit AndroidAudioDevice.java inside the com.badlogic.gdx.backends.android package of the gdx-backend-android project.
The Android side of the AudioDevice class relies on the AudioTrack class of the Android SDK, which has a setPlaybackRate(..) method.
Add the following code inside the class:
@Override
public void setSpeed (float speed) {
    track.setPlaybackRate((int)(track.getSampleRate() * speed));
}
3) Edit OpenALAudioDevice.java inside the com.badlogic.gdx.backends.lwjgl.audio package of the gdx-backend-lwjgl project.
The desktop side of the AudioDevice relies on OpenAL (the OpenGL of audio), which has a handy pitch setter.
Add the following inside the class:
@Override
public void setSpeed (float speed) {
    alSourcef(sourceID, AL_PITCH, speed);
}
4) Play the audio.
Here is the code for loading and playing the sound file:
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.audio.AudioDevice;
import com.badlogic.gdx.audio.io.Mpg123Decoder;
import com.badlogic.gdx.files.FileHandle;

public class MusicBeat {
    static short[] samples = new short[2048];
    Mpg123Decoder decoder;
    AudioDevice device;
    static FileHandle externalFile;
    public boolean playing = true;

    public MusicBeat(String name) {
        FileHandle file = Gdx.files.internal(name);
        FileHandle external = Gdx.files.external("myappname/" + name);
        // the decoder needs a real file on disk, so copy the internal
        // asset to external storage, but only if it isn't there yet
        if (!external.exists()) file.copyTo(external);
        decoder = new Mpg123Decoder(external);
        device = Gdx.audio.newAudioDevice(decoder.getRate(), decoder.getChannels() == 1);
        playing = false;
        externalFile = external; // keep the external copy: the decoder is re-created from it
    }

    void play() {
        playing = true;
        Thread playbackThread = new Thread(new Runnable() {
            @Override
            public synchronized void run() {
                int readSamples = 0;
                while (playing) {
                    if (decoder != null) {
                        if ((readSamples = decoder.readSamples(samples, 0, samples.length)) <= 0) {
                            // end of stream: rewind by re-creating the decoder, then stop
                            decoder.dispose();
                            decoder = new Mpg123Decoder(externalFile);
                            playing = false;
                            continue;
                        }
                        device.writeSamples(samples, 0, readSamples);
                    }
                }
            }
        });
        playbackThread.setDaemon(true);
        playbackThread.start();
    }

    public void stop() {
        playing = false;
        decoder.dispose();
    }

    public void setVolume(float vol) {
        device.setVolume(vol);
    }

    public void setSpeed(float speed) {
        device.setSpeed(speed);
    }
}
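For the explosion effect from the question, a minimal usage sketch (the file name, slowdown factor, and timing are arbitrary choices, not from the original answer):
MusicBeat music = new MusicBeat("music/level1.mp3"); // hypothetical asset path
music.play();
// on explosion: briefly slow the music down...
music.setSpeed(0.8f);
// ...then restore normal speed a moment later (e.g. from your game loop or a Timer)
music.setSpeed(1.0f);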
Where to get the audio extension? (required for the AudioDevice)
The audio extension seems to have been deprecated and the jars can't be found easily. I have uploaded them here; it's an old version, but it should work just fine.
Easier way?
If your game is only intended to run on desktop: the Music class of libgdx on desktop relies again on OpenAL, which gives us the power to play with the pitch, so you just need to edit the Music implementation (OpenALMusic on desktop) instead of the AudioDevice and leave the whole play-sample-by-sample thing out of the equation. Unfortunately, as dawez said, the Music class on Android relies on MediaPlayer, which does not give us the pitch-change option.
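For that desktop-only route, a sketch of what the edit might look like, assuming OpenALMusic exposes its OpenAL source id as sourceID (the field name is an assumption; check the actual source):
// assumed patch inside OpenALMusic.java (gdx-backend-lwjgl)
public void setPitch (float pitch) {
    if (sourceID != -1) alSourcef(sourceID, AL_PITCH, pitch); // -1 means not playing
}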
Conclusion: this method doesn't seem nice to me. If your game really needs the pitch thing and doesn't make sense without it, then go with it; otherwise it's just too much effort for such a small detail.
It seems that you cannot change the pitch of Music in Android.
To play music in libgdx you refer to the Music interface:
public interface Music extends Disposable {
This is implemented by
package com.badlogic.gdx.backends.android;
public class AndroidMusic implements Music, MediaPlayer.OnCompletionListener {
private MediaPlayer player;
You are interested in the player object, which is of type MediaPlayer:
package android.media;
public class MediaPlayer {
So now it boils down to the question of whether Android supports pitching on the MediaPlayer class. Short answer: no. Long answer: Speed Control of MediaPlayer in Android
A workaround is to use a SoundPool; even though it is slow, the loading can be started while the user is in the loading screen. Otherwise, you can try to split the music files into chunks and load them as you go, still using a SoundPool. You would then do something like lazy loading when the current part is coming to its end.
If you manage to find a suitable solution, please post the relevant code!
With the release of Android 6.0 (API level 23), PlaybackParams can be set on the MediaPlayer. These include playback speed and pitch, among others.
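For example, on API 23+ something along these lines should slow playback down (a minimal sketch; the player must already be initialized, and state checks/error handling are omitted):
// requires API level 23 (Android 6.0)
PlaybackParams params = mediaPlayer.getPlaybackParams();
params.setSpeed(0.8f); // 80% playback speed
// params.setPitch(...) is available as well
mediaPlayer.setPlaybackParams(params);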
Related
I am using this library: https://bintray.com/google/webrtc/google-webrtc
What I want to achieve (at least, at the beginning of my project) is to render video locally. I am using this tutorial (which is the only one around on the Internet): https://vivekc.xyz/getting-started-with-webrtc-for-android-daab1e268ff4. Unfortunately, the last line of code is no longer up to date. The constructor needs a callback which I have no idea how to implement:
localVideoTrack.addRenderer(new VideoRenderer(i420Frame -> {
    // no idea what to put here
}));
My code is exactly the same as in the posted tutorial. This is the very first step in getting familiar with WebRTC technology on Android, and I cannot figure it out. My camera is capturing the video, because I can see it in my log:
I/org.webrtc.Logging: CameraStatistics: Camera fps: 28.
The main issue is that I have no idea how to pass it to my SurfaceViewRenderer through a callback. Has anyone met this problem? I'd really appreciate any help or suggestions.
Here is the official example app, which is the only other source, but it is done differently from the tutorial and is much more complicated:
https://webrtc.googlesource.com/src/+/master/examples/androidapp/src/org/appspot/apprtc
You are right, the API no longer matches that in the tutorial, but it's close.
The VideoTrack has an addRenderer(VideoRenderer renderer) method that requires you to create a VideoRenderer with the SurfaceViewRenderer as a parameter. That is not possible anymore, so instead you should use the addSink(VideoSink sink) method of the VideoTrack. The SurfaceViewRenderer object implements the VideoSink onFrame(VideoFrame frame) method to make this work.
VideoTrack videoTrack = utility.createVideoTrack();
videoTrack.addSink(this.localSurfaceViewRenderer);
I used the same official example app as a reference to reach this conclusion, and it works fine for me.
private static class ProxyVideoSink implements VideoSink {
    private VideoSink target;

    @Override
    synchronized public void onFrame(VideoFrame frame) {
        if (target == null) {
            Logging.d("TAG", "Dropping frame in proxy because target is null.");
            return;
        }
        target.onFrame(frame);
    }

    synchronized public void setTarget(VideoSink target) {
        this.target = target;
    }
}
ProxyVideoSink localVideoSink = new ProxyVideoSink();
videoTrack.addSink(localVideoSink);
localVideoSink.setTarget(localSurfaceView);
Try this code, as directly assigning videoTrack.addSink(localSurfaceView) might crash on the next initialization.
I have seen lots of examples of Android's VideoView API being used to stream data from an external server into a device (VideoView internally uses an RTP and RTSP stack to receive data).
However, there is very little discussion on the possibility of using Android's internal RTSP and RTP stacks for achieving server capabilities, i.e. making an Android device act as a streaming server and stream media out.
Is it possible?
And where inside the Android native code can I start digging to achieve such functionality?
Would appreciate details.
Thanks
Amit
A bit late, but:
You can set the MediaRecorder output format to "7". This is defined in
/frameworks/base/media/java/android/media/MediaRecorder.java
(check that file for details) as:
/** @hide Stream over a socket, limited to a single stream */
public static final int OUTPUT_FORMAT_RTP_AVP = 7;
The destination is controllable via
setprop streaming.ip
and
setprop streaming.port
The AV data will then be streamed to the given destination address.
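A rough sketch of how that might be wired up (the literal 7 stands in for the hidden OUTPUT_FORMAT_RTP_AVP constant; since it is hidden, this is unofficial behavior and may vary between builds):
// rough sketch: camera preview setup and permissions are omitted
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(7); // hidden OUTPUT_FORMAT_RTP_AVP, "limited to a single stream"
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.prepare(); // throws IOException
recorder.start();   // AV data goes to the setprop streaming.ip / streaming.port destination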
The RTP code (native) itself lives in the
/frameworks/base/media/libstagefright/rtsp directory.
Happy code digging
There is also the possibility of using the libstreaming library (https://github.com/fyhertz/libstreaming).
The documentation on GitHub gives you an example of how to set up the server, but basically you need to add net.majorkernelpanic.streaming.gl.SurfaceView to your layout:
<net.majorkernelpanic.streaming.gl.SurfaceView
    android:id="@+id/surface"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
Add this to your manifest
<service android:name="net.majorkernelpanic.streaming.rtsp.RtspServer"/>
Include the libstreaming library. If you are working with a newer version of Android Studio, you need to clone libstreaming as a separate project and import it as a module. Afterwards, it is necessary to run a build on the build.gradle in libstreaming. Then you can work with the library.
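For a Gradle-based setup, the module wiring might look like this (module names assumed):
// settings.gradle
include ':app', ':libstreaming'

// app/build.gradle
dependencies {
    implementation project(':libstreaming')
}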
The last step is to create an Activity. The simplest possible one looks like this:
import android.app.Activity;
import android.content.Intent;
import android.content.pm.ActivityInfo;
import android.os.Bundle;

import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspServer;

public class RemoteStreamingActivity extends Activity {

    private SurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_remote_streaming);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        handleGestures(); // defined elsewhere in the project
        mSurfaceView = (SurfaceView) findViewById(R.id.surface);
        SessionBuilder.getInstance()
                .setSurfaceView(mSurfaceView)
                .setPreviewOrientation(90)
                .setContext(getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_NONE)
                .setVideoEncoder(SessionBuilder.VIDEO_H264);
        this.startService(new Intent(this, RtspServer.class));
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        this.stopService(new Intent(this, RtspServer.class));
    }
}
If you want to test whether the RTSP server is running, you can try using VLC and connecting via the URL: rtsp://{ipAddressOfYourDevice}:8086?h264=200-20-320-240
I want to know how to run a song in a service. I also want to use an .aidl file to expose the interface to clients.
Sounds basic for Android...
I recommend you go through this tutorial: MusicDroid - Audio Player Part.
There are three parts to this tutorial. It nicely describes how to implement an audio player for Android using a service and AIDL.
Also look at this Android developer tutorial: Media Playback.
I think this will help you a lot!
I assume you know how to create a service; I did something similar before:
import android.media.MediaPlayer;
import java.io.IOException;

private MediaPlayer mMediaPlayer;

private void play() {
    mMediaPlayer = new MediaPlayer();
    try {
        mMediaPlayer.setDataSource(getSongUrl());
        mMediaPlayer.prepare(); // blocks until the player is ready
        mMediaPlayer.start();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
When interacting with the UI, send an Intent from the UI to the service, so that you can, for example, pause:
mMediaPlayer.pause();
or seek to a certain time:
mMediaPlayer.seekTo((int) (to * mMediaPlayer.getDuration()));
and remember to make sure release() gets called.
Please check the class here: http://developer.android.com/reference/android/media/MediaPlayer.html
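To connect this to the service part of the question, a minimal sketch of a Service hosting the player (the action strings, the URL, and the overall shape are illustrative assumptions, not from the tutorial):
import android.app.Service;
import android.content.Intent;
import android.media.MediaPlayer;
import android.os.IBinder;
import java.io.IOException;

public class MusicService extends Service {

    private MediaPlayer mMediaPlayer;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        if (intent != null && "PLAY".equals(intent.getAction())) {
            play();
        } else if (intent != null && "PAUSE".equals(intent.getAction()) && mMediaPlayer != null) {
            mMediaPlayer.pause();
        }
        return START_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        // return your generated AIDL Stub implementation here if you expose an .aidl interface
        return null;
    }

    @Override
    public void onDestroy() {
        if (mMediaPlayer != null) mMediaPlayer.release();
        super.onDestroy();
    }

    private void play() {
        mMediaPlayer = new MediaPlayer();
        try {
            mMediaPlayer.setDataSource("http://example.com/song.mp3"); // hypothetical URL
            mMediaPlayer.prepare();
            mMediaPlayer.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}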
We're using MediaRecorder to record video to a file on the external storage using setOutputFile() before doing the actual recording.
Everything works fine, but the main issue is that as soon as the recording is done, we want to start playing the recorded video back in a VideoView.
How to know when the file is ready to be read and played back?
The FileObserver class suits your needs perfectly. Here is the documentation. It's easy to use. When an observed file is closed after writing, the onEvent callback is called with CLOSE_WRITE as the parameter.
MyFileObserver fb = new MyFileObserver(mediaFile_path, FileObserver.CLOSE_WRITE);
fb.startWatching();

class MyFileObserver extends FileObserver {
    public MyFileObserver(String path, int mask) {
        super(path, mask);
    }

    @Override
    public void onEvent(int event, String path) {
        // start playing
    }
}
Don't forget to call stopWatching().
We solved a similar problem with the following algorithm:
while (file not complete)
    sleep for 1 sec
    read the fourth byte of the file
    if it is not 0 (contains 'f' of the 'ftyp' header) then
        file is complete, break
The key point is that MediaRecorder writes the ftyp box at the very last moment. If it is in place, then the file is complete.
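A rough Java translation of that loop (a sketch; the 1-second poll interval comes from the algorithm above):
// block until MediaRecorder has written the 'ftyp' box, i.e. the file is complete
void waitUntilComplete(java.io.File file) throws java.io.IOException, InterruptedException {
    while (true) {
        Thread.sleep(1000);
        java.io.RandomAccessFile raf = new java.io.RandomAccessFile(file, "r");
        try {
            if (raf.length() > 4) {
                raf.seek(4);
                if (raf.read() != 0) return; // the 'f' of 'ftyp' is in place
            }
        } finally {
            raf.close();
        }
    }
}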
In my tests, irrespective of the size of the recording, mediaRecorder.stop() is a blocking method that only returns after the file has been completely written and closed by the media recorder.
So JPM's answer is actually correct.
You can verify this by calling File.length() immediately after stop(). You will find that the output file length is the final length of the file at this point. In other words, the media recorder does not write anything further to the file after stop() has returned.
I haven't tried this myself, but this might work:
public void release() (Since: API Level 1)
Releases resources associated with this MediaRecorder object. It is good practice to call this method when you're done using the MediaRecorder.
If it does what it says, then I guess once this method has returned you know the file is ready.
Apparently there is no way to detect when the recording has stopped in MediaRecorder itself, but there is a stop() method that you can override if you create a custom class that extends MediaRecorder. Here I would do something like this:
public class MyRecorder extends MediaRecorder {
    public boolean stopped;

    // all other MediaRecorder methods are inherited as-is

    @Override
    public void stop() {
        this.stopped = true;
        super.stop();
    }
}
Then you can check the boolean to see if it has stopped recording.
A dirty way would be to check the lastModified() value of the File and open the VideoView once the File hasn't been modified for 2 seconds.
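A sketch of that check (the 2-second threshold is from the answer above; the poll interval is arbitrary):
// consider the recording ready once the file has been untouched for ~2 seconds
void waitUntilStable(java.io.File f) throws InterruptedException {
    while (System.currentTimeMillis() - f.lastModified() < 2000) {
        Thread.sleep(500);
    }
}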
I am currently doing an AndAR project in a group of 3. I'm the person in charge of streaming video to the Android phone.
We got ourselves a D-Link DCS-920 IP camera and I found out that it uses the MJPEG codec for the live video stream, and the web server uses Jview to view the live stream. As far as I know MJPG is not a supported file type for the Android OS, so I've come up with an idea: instead of using ImageView, I use a WebView to stream the video.
I've implemented a very simple concept and it works! But the problem is, the refresh rate is terrible.
I get the video image (e.g. http://192.168.1.10/image.jpg) to view in the WebView and implement a Timer to control the refresh rate (it was supposed to be 30 fps, i.e. a refresh every 33 ms), but it can only go down to a 500 ms interval; at any lower interval I notice it is not any smoother, sometimes the image won't load, and the connection is unstable (e.g. dropped). Could this be because I'm refreshing at a rate faster than it can receive?
But the Jview over on the web server has no problem! I was trying to find the source code for the Jview, but I have no hope.
Anyway, here's the code I've written:
package org.example.test;

import java.util.Timer;
import java.util.TimerTask;

import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.webkit.WebView;
import android.widget.Button;
import android.widget.EditText;

public class Webview extends Activity {

    public WebView webView;
    public Timer autoUpdate;
    public String url;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        webView = (WebView) findViewById(R.id.webview);
        webView.getSettings();

        final EditText urlText = (EditText) findViewById(R.id.urlText);

        // Buttons
        final Button connectB = (Button) findViewById(R.id.connectButton);
        connectB.setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                url = urlText.getText().toString();
                webView.loadUrl(url);
                timerSetup();
            }
        });

        final Button exitB = (Button) findViewById(R.id.exitButton);
        exitB.setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                finish();
            }
        });
    }

    // refresh timer
    public void timerSetup() {
        autoUpdate = new Timer();
        autoUpdate.schedule(new TimerTask() {
            @Override
            public void run() {
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        webView.loadUrl(url);
                    }
                });
            }
        }, 0, 500); // refresh rate time interval (ms)
    }
}
Is there any way I can get the video stream in at 15 fps or better / have a faster refresh rate?
Is there any such thing as an MJPEG viewer/source code that I can use to display these images?
Here's a screenshot of the app:
http://s945.photobucket.com/albums/ad295/kevinybh/?action=view&current=video.jpg
(not enough points to post pictures) :(
I just need to make the video stream around 15-30 fps. Any suggestions/help would be very deeply appreciated :) Thanks!
Instead of an Arduino you could use a Raspberry Pi; it should have enough CPU power to control the vehicle and convert the video stream at the same time. Sure, you'll need to port all of your Arduino software to the Raspberry...
MJPEG is a terribly inefficient way to deliver motion video to a mobile device, because each frame is compressed as its own independent picture. For an application which doesn't need motion video (someone was asking about a camera watching waiting lines last week), your solution of pushing a static frame every second or so sounds good.
If you need motion video, I would recommend you do transcoding on your web server from MJPEG to a supported video format which utilizes frame-to-frame compression. This will result in far less data to push, both over the user's 3G connection and from your server to all of its clients. You should only need to run one transcoding engine to support all clients, and you'll be able to use the same one for Android and iPhone devices, though you may also want a higher-resolution output for tablets and PCs if your camera output is good enough to justify it.
On Android, if we decode a JPEG on the CPU, it costs 40-100 ms per frame. If we want to play MJPEG at 15-30 fps, we need a hardware JPEG decoder.
There was a useful previous SO discussion and this great one with code. Please try them and let us know if they work for you.
You can use the MjpegView class to display an MJPEG stream directly:
https://code.google.com/p/android-camera-axis/source/browse/trunk/serealisation/src/de/mjpegsample/MjpegView/MjpegView.java?r=33
You'll have to implement some AsyncTasks on top of this class to make it work well.
Good luck