I want to understand the working principles of the HTC Evo 3D's 3D display; however, the code and HTCDev's tutorial do not help with this. It is said that the SEI FPA bit in the header overrides the choice made by hand, for example:
public void surfaceChanged(SurfaceHolder surfaceholder, int i, int j, int k) {
    holder = surfaceholder;
    enableS3D(true, holder.getSurface()); // note: the SEI FPA flag in the content overrides this
}
The play video code:
private void playVideo() {
    release();
    fileName = "HTCDemo.mp4";
    try {
        mediaPlayer = new MediaPlayer();
        final AssetFileDescriptor afd = getAssets().openFd(fileName);
        mediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
        mediaPlayer.setDisplay(holder);
        mediaPlayer.prepare();
        mediaPlayer.setOnPreparedListener(this);
        mediaPlayer.setOnVideoSizeChangedListener(this);
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    } catch (Exception e) {
        Log.e(TAG, Log.getStackTraceString(e));
    }
}
At this point, I could not track down where it looks for the SEI FPA bit in the header. I need help finding the relevant code. Thanks in advance.
Do you need to parse the SEI FPA bit in the header yourself? That's out of scope for this API. As noted, the codec parses it to enable (and override) the S3D setting.
In the overview, it's mentioned how you can use 3rd-party tools like x264 to add this bit to existing content.
I'd recommend looking at the x264 source code for help in parsing the file header, if that's what you need to do at runtime.
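If you really do need to detect the flag yourself, here is a minimal, purely illustrative sketch (this is not the HTC/codec code path, which is internal to the platform). It assumes a raw Annex-B H.264 byte stream and simply scans for an SEI NAL unit (nal_unit_type 6) whose first payload-type byte is 45, i.e. frame_packing_arrangement; a real parser would have to demux the MP4 container first and also handle multi-byte payload types/sizes and emulation-prevention bytes:
// Naive scan of a raw Annex-B H.264 stream for a frame_packing_arrangement SEI.
static boolean containsFramePackingSei(byte[] stream) {
    for (int i = 0; i + 4 < stream.length; i++) {
        // 3-byte start code 0x000001 marks the beginning of a NAL unit
        if (stream[i] != 0 || stream[i + 1] != 0 || stream[i + 2] != 1) continue;
        int nalType = stream[i + 3] & 0x1F;           // low 5 bits = nal_unit_type
        if (nalType == 6 && (stream[i + 4] & 0xFF) == 45) {
            return true;                              // SEI FPA message present
        }
    }
    return false;
}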
I'm developing an Android app to change all recorded audio files from a male voice to a female voice.
I found a solution to change the pitch of an audio file via PlaybackParams in MediaPlayer.
Here's my code for changing the pitch value:
mediaPlayer = new MediaPlayer();
mediaPlayer.setDataSource(ur);
mediaPlayer.prepare();
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.M) {
    PlaybackParams params = new PlaybackParams();
    params.setPitch(1.6f); // raise the pitch
    mediaPlayer.setPlaybackParams(params);
}
It works well, but the problem is that PlaybackParams is only available on Android 6.0 (API 23) and above.
Does anyone know another solution for that?
I've been working on an Android application that shows live streaming video via RTSP.
Assume I have a well-functioning RTSP server that serves H.264 packets, and that to view the stream we connect to rtsp://1.2.3.4:5555/stream.
I tried to use the native MediaPlayer/VideoView, but no luck (the video got stuck after 2-3 seconds of playback), so I loaded mrmaffen's vlc-android-sdk (which can be found here) and used the following code:
ArrayList<String> options = new ArrayList<String>();
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("-vvv");
videoVlc = new LibVLC(options);
newVideoMediaPlayer = new org.videolan.libvlc.MediaPlayer(videoVlc);
final IVLCVout vOut = newVideoMediaPlayer.getVLCVout();
vOut.addCallback(this);
vOut.setVideoView(videoView); //videoView is a pre-defined view which is part of the layout
vOut.attachViews();
newVideoMediaPlayer.setEventListener(this);
Media videoMedia = new Media (videoVlc, Uri.parse(mVideoPath));
newVideoMediaPlayer.setMedia(videoMedia);
newVideoMediaPlayer.play();
The problem is that I see a blank screen.
Keep in mind that when I use an RTSP link with an audio stream only, it works fine.
Is anyone familiar with this SDK and does anyone have an idea about this issue?
Thanks in advance
Try adding this option:
--rtsp-tcp
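For example, with the initialization from the question (same option list, just one extra entry), it could look like this:
ArrayList<String> options = new ArrayList<String>();
options.add("--no-drop-late-frames");
options.add("--no-skip-frames");
options.add("--rtsp-tcp"); // force RTSP over TCP instead of UDP
options.add("-vvv");
videoVlc = new LibVLC(options);
Forcing TCP avoids the UDP packet loss and firewall issues that often break the video track while a low-bitrate audio-only stream still plays fine.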
I play RTSP streams with the following code:
try {
    Uri rtspUri = Uri.parse("rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov");
    final MediaWrapper mw = new MediaWrapper(rtspUri);
    mw.removeFlags(MediaWrapper.MEDIA_FORCE_AUDIO);
    mw.addFlags(MediaWrapper.MEDIA_VIDEO);
    MediaWrapperListPlayer.getInstance().getMediaList().add(mw);
    VLCInstance.getMainMediaPlayer().setEventListener(this);
    VLCInstance.get().setOnHardwareAccelerationError(this);
    final IVLCVout vlcVout = VLCInstance.getMainMediaPlayer().getVLCVout();
    vlcVout.addCallback(this);
    vlcVout.setVideoView(mSurfaceView);
    vlcVout.attachViews();
    final SharedPreferences pref = PreferenceManager.getDefaultSharedPreferences(this);
    final String aout = VLCOptions.getAout(pref);
    VLCInstance.getMainMediaPlayer().setAudioOutput(aout);
    MediaWrapperListPlayer.getInstance().playIndex(this, 0);
} catch (Exception e) {
    Log.e(TAG, e.toString());
}
When you get the playing event, you need to enable the video track.
private void onPlaying() {
    stopLoadingAnimation();
    VLCInstance.getMainMediaPlayer().setVideoTrackEnabled(true);
}
This may be helpful for you
I want to use ExoPlayer in my app. Could you please tell me what the simplest example is? I have tried to follow https://github.com/google/ExoPlayer/ but it's not easy for me. When I tried to import the library as a module, I got a bintray-release error.
As stated in the main Readme.md, you can import ExoPlayer as you would any other dependency:
In your app's build.gradle, under dependencies, add:
compile 'com.google.android.exoplayer:exoplayer:rX.X.X'
The current version is r1.5.1 as of October 27, 2015; see here.
Old question but since there are too few simple ExoPlayer tutorials out there, I wrote this up. I recently converted an app I have from using Android's default media player to ExoPlayer. The performance gains are amazing and it works on a wider range of devices. It is a bit more complicated, however.
This example is tailored specifically to playing an http audio stream but by experimenting you can probably adapt it easily to anything else. This example uses the latest v1.xx of ExoPlayer, currently v1.5.11:
First, put this in your build.gradle (Module: app) file, under "dependencies":
compile 'com.google.android.exoplayer:exoplayer:r1.5.11'
Also your class should implement ExoPlayer.Listener:
...implements ExoPlayer.Listener
Now here's the relevant code to play an http audio stream:
private static final int RENDERER_COUNT = 1; //since we want to render simple audio
private static final int BUFFER_SEGMENT_SIZE = 64 * 1024; // for http mp3 audio stream use these values
private static final int BUFFER_SEGMENT_COUNT = 256; // for http mp3 audio stream use these values
private ExoPlayer exoPlayer;
// for http mp3 audio stream, use these values
int minBufferMs = 1000;
int minRebufferMs = 5000;
// Prepare ExoPlayer
exoPlayer = ExoPlayer.Factory.newInstance(RENDERER_COUNT, minBufferMs, minRebufferMs);
// String with the url of the stream to play
String stream_location = "http://audio_stream_url";
// Convert String URL to Uri
Uri streamUri = Uri.parse(stream_location);
// Settings for ExoPlayer
Allocator allocator = new DefaultAllocator(BUFFER_SEGMENT_SIZE);
String userAgent = Util.getUserAgent(ChicagoPoliceRadioService.this, "ExoPlayer_Test");
DataSource dataSource = new DefaultUriDataSource(ChicagoPoliceRadioService.this, null, userAgent);
ExtractorSampleSource sampleSource = new ExtractorSampleSource(
streamUri, dataSource, allocator, BUFFER_SEGMENT_SIZE * BUFFER_SEGMENT_COUNT);
MediaCodecAudioTrackRenderer audioRenderer = new MediaCodecAudioTrackRenderer(sampleSource, MediaCodecSelector.DEFAULT);
// Attach listener we implemented in this class to this ExoPlayer instance
exoPlayer.addListener(this);
// Prepare ExoPlayer
exoPlayer.prepare(audioRenderer);
// Set full volume
exoPlayer.sendMessage(audioRenderer, MediaCodecAudioTrackRenderer.MSG_SET_VOLUME, 1f);
// Play!
exoPlayer.setPlayWhenReady(true);
There are three callback methods:
@Override
public void onPlayWhenReadyCommitted() {
    // No idea what would go here, I left it empty
}
// Called when ExoPlayer state changes
@Override
public void onPlayerStateChanged(boolean playWhenReady, int playbackState) {
    // If playbackState equals STATE_READY (4), that means ExoPlayer is set to
    // play and there are no errors
    if (playbackState == ExoPlayer.STATE_READY) {
        // ExoPlayer prepared and ready, no error
        // Put code here, same as "onPrepared()"
    }
}
// Called on ExoPlayer error
@Override
public void onPlayerError(ExoPlaybackException error) {
    // ExoPlayer error occurred
    // Put your error code here
}
And when you're done playing do the usual:
if (exoPlayer != null) {
    exoPlayer.stop();
    exoPlayer.release();
}
NOTE: I'm still not 100% sure about the details of all of the ExoPlayer settings, and I've never tried playing video. Note that this is for version 1.5.x of ExoPlayer; 2.0 changed a lot and I still haven't figured it out. I do highly recommend this code to anyone who has an app that streams audio from the web, as the performance gains are incredible, and for my app it fixed an issue with Samsung phones that would only play about 30 seconds of audio before stopping.
I have a link to a video on an S3 server and I am playing this video in a VideoView. The video plays properly, but the problem is that it first downloads the entire video and then plays it.
I want it to play from a buffer: if 20% of the video is downloaded, it should start playing that part while the rest keeps downloading (like on YouTube). Here is my code; what I have done is:
FFmpegMediaMetadataRetriever mediaMetadataRetriever = new FFmpegMediaMetadataRetriever();
AWSCredentials myCredentials = new BasicAWSCredentials(
"AKIAIGOIY4LLB7EMACGQ",
"7wNQeY1JC0uyMaGYhKBKc9V7QC7X4ecBtyLimt2l");
AmazonS3 s3client = new AmazonS3Client(myCredentials);
GeneratePresignedUrlRequest request = new GeneratePresignedUrlRequest(
"mgvtest", videoUrl);
URL objectURL = s3client.generatePresignedUrl(request);
try {
    mediaMetadataRetriever.setDataSource(videoUrl);
} catch (Exception e) {
    utilDialog.showDialog("Unable to load this video", utilDialog.ALERT_DIALOG);
    pb.setVisibility(View.INVISIBLE);
}
videoView.setVideoURI(Uri.parse(videoUrl));
MediaController myMediaController = new MediaController(this);
// myMediaController.setMediaPlayer(videoView);
videoView.setMediaController(myMediaController);
videoView.setOnCompletionListener(myVideoViewCompletionListener);
videoView.setOnPreparedListener(MyVideoViewPreparedListener);
videoView.setOnErrorListener(myVideoViewErrorListener);
videoView.requestFocus();
videoView.start();
Listeners
MediaPlayer.OnCompletionListener myVideoViewCompletionListener = new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer arg0) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "End of Video",
        // Toast.LENGTH_LONG).show();
    }
};
MediaPlayer.OnPreparedListener MyVideoViewPreparedListener = new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        pb.setVisibility(View.INVISIBLE);
        imgScreenshot.setVisibility(View.VISIBLE);
        tvScreenshot.setVisibility(View.VISIBLE);
        // final Animation in = new AlphaAnimation(0.0f, 1.0f);
        // in.setDuration(3000);
        // tvScreenshot.startAnimation(in);
        Animation animation = AnimationUtils.loadAnimation(
                getApplicationContext(), R.anim.zoom_in);
        tvScreenshot.startAnimation(animation);
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                tvScreenshot.setVisibility(View.INVISIBLE);
            }
        }, 3000);
    }
};
MediaPlayer.OnErrorListener myVideoViewErrorListener = new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "Error!!!",
        // Toast.LENGTH_LONG).show();
        return true;
    }
};
To be able to start playing an mp4 video before it has fully downloaded, the video has to have the metadata at the start of the file rather than the end; unfortunately, with standard mp4 the default is usually to have it at the end.
The metadata is in an 'atom' or 'box' (basically a data structure within the mp4 file) and can be moved to the start. This is usually referred to as faststart, and tools such as ffmpeg will allow you to do this. The following is an extract from the ffmpeg documentation:
The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4 file has all the metadata about all packets stored in one location (written at the end of the file, it can be moved to the start for better playback by adding faststart to the movflags, or using the qt-faststart tool).
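For example, an ffmpeg invocation that remuxes an existing file and moves the moov atom to the front would look roughly like this (file names here are placeholders; -c copy means no re-encoding takes place):
ffmpeg -i input.mp4 -c copy -movflags +faststart output_faststart.mp4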
There are other tools and software that will allow you to do this as well, e.g. the one mentioned in the ffmpeg extract above:
http://multimedia.cx/eggs/improving-qt-faststart/
If you actually want full streaming, where the server breaks the file into chunks and these are downloaded one by one by the client, then you probably want to use one of the adaptive bit rate protocols (Apple's HLS, Microsoft's Smooth Streaming, Adobe's HTTP Dynamic Streaming, or the new open standard DASH). This also allows you to offer different bit rates for different network conditions. You will need a server that supports this functionality, and it may be overkill if you just want a simple site with a single video and not much traffic. A minimal client-side sketch follows below.
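If you go the HLS route, the client side from the question can stay almost the same; Android's built-in player has handled HLS playlists since roughly Android 4.0, so (with a placeholder URL, assuming the server generates the playlist and segments) something like this is enough:
// Hypothetical HLS playlist URL; the server must produce the .m3u8 and segments
videoView.setVideoURI(Uri.parse("https://example.com/video/master.m3u8"));
videoView.start();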
Actually, you have to set up CloudFront in front of S3 so you can stream S3 videos.
Check out this link for more information:
http://www.miracletutorials.com/s3-streaming-video-with-cloudfront/
I'm using libGDX and face the problem that background music does not loop flawlessly on various Android devices (a Nexus 7 running Lollipop, for example). Whenever the track loops (i.e. jumps from the end to the start), a clearly noticeable gap is audible. Now I wonder how the background music can be played in a loop without the disturbing gap.
I've already tried various approaches like:
Ensuring the number of samples in the track is an exact multiple of the track's sample rate (as mentioned somewhere here on SO).
Various audio formats like .ogg, .m4a, .mp3 and .wav (.ogg seems to be the solution of choice here at SO, but unfortunately it does not work in my case).
Used Android's MediaPlayer with setLooping(true) instead of libGDX's Music class.
Used Android's MediaPlayer.setNextMediaPlayer(). The code looks like the following, and it plays the two tracks without a gap in between, but unfortunately, as soon as the second MediaPlayer finishes, the first does not start again!
/* initialization */
afd = context.getAssets().openFd(filename);
firstBackgroundMusic.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
firstBackgroundMusic.prepare();
firstBackgroundMusic.setOnCompletionListener(this);
secondBackgroundMusic.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
secondBackgroundMusic.prepare();
secondBackgroundMusic.setOnCompletionListener(this);
firstBackgroundMusic.setNextMediaPlayer(secondBackgroundMusic);
secondBackgroundMusic.setNextMediaPlayer(firstBackgroundMusic);
firstBackgroundMusic.start();
@Override
public void onCompletion(MediaPlayer mp) {
    mp.stop();
    try {
        mp.prepare();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Any ideas what's wrong with the code snippet?
Just for the record:
It turned out to be unsolvable. In the end, we looped the background music several times inside the file itself, so the gap appears less frequently. It's no real solution to the problem, but the best workaround we could find.
This is an old question, but I will give my solution in case anyone has the same problem.
The solution requires the use of the audio extension (deprecated, but it works just fine); if you can't find the link online, here are the jars that I am using. It also requires some external storage space.
The outline is the following:
Extract the raw music data with a decoder (the VorbisDecoder class for ogg or Mpg123Decoder for mp3) and save it to external storage (you can check whether it already exists so that it only needs to be extracted once, because it takes some time).
Create a RandomAccessFile using the file you just saved to external storage.
While playing, set the RandomAccessFile pointer to the correct spot in the file and read a data segment.
Play that data segment with the AudioDevice class.
Here is some code
Extract the music file and save it to external storage. file is the FileHandle of the internal music file; here it is an ogg, which is why we use VorbisDecoder:
FileHandle external = Gdx.files.external("data/com.package.name/music/" + file.name());
file.copyTo(external);
VorbisDecoder decoder = new VorbisDecoder(external);
FileHandle extreactedDataFile = Gdx.files.external("data/com.package.name/music/" + file.nameWithoutExtension() + ".mdata");
if (extreactedDataFile.exists()) extreactedDataFile.delete();
short[] samples = new short[2048];                // decode buffer (the size here is an arbitrary choice)
byte[] shortBytes = new byte[samples.length * 2]; // the same buffer viewed as raw bytes
ShortBuffer sbuffer = ByteBuffer.wrap(shortBytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
while (true) {
    if (LogoScreen.shouldBreakMusicLoad) break;
    int num = decoder.readSamples(samples, 0, samples.length);
    sbuffer.put(samples, 0, num);
    sbuffer.position(0);
    extreactedDataFile.writeBytes(shortBytes, 0, num * 2, true);
    if (num <= 0) break;
}
external.delete();
Create a RandomAccessFile pointing to the file we just created:
if (extreactedDataFile.exists()) {
    try {
        raf = new RandomAccessFile(Gdx.files.external(extreactedDataFile.path()).file(), "r");
        raf.seek(0);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Create a buffer so we can translate the bytes read from the file into a short array that gets fed to the AudioDevice:
public byte[] rafbufferBytes = new byte[length * 2]; // length = number of samples per segment
public short[] rafbuffer = new short[length];
public ShortBuffer sBuffer = ByteBuffer.wrap(rafbufferBytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
When we want to play the file, we create an AudioDevice and a new thread where we constantly read from the raf file and feed it to the AudioDevice:
device = Gdx.audio.newAudioDevice((int) rate /* the Hz of the music, e.g. 44100 */, MusicPlayer.mono /* whether the track is mono */);
currentBytes = 0; // set the file to the beginning
playbackThread = new Thread(new Runnable() {
    @Override
    public synchronized void run() {
        while (playing) {
            if (raf != null) {
                int length = raf.read(rafbufferBytes);
                if (length <= 0) {
                    ocl.onCompletion(DecodedMusic.this);
                    length = raf.read(rafbufferBytes);
                }
                sBuffer.get(rafbuffer);
                sBuffer.position(0);
                if (length > 20) {
                    try {
                        device.writeSamples(rafbuffer, 0, length / 2);
                        fft.spectrum(rafbuffer, spectrum);
                        currentBytes += length;
                    } catch (com.badlogic.gdx.utils.GdxRuntimeException ex) {
                        ex.printStackTrace();
                        device = Gdx.audio.newAudioDevice((int) (rate), MusicPlayer.mono);
                    }
                }
            }
        }
    }
});
playbackThread.setDaemon(true);
playbackThread.start();
And when we want to seek to a position:
public void seek(float pos) {
    currentBytes = (int) (rate * pos);
    try {
        raf.seek(currentBytes * 4); // 4 bytes per frame, assuming 16-bit stereo data
    } catch (IOException e) {
        e.printStackTrace();
    }
}