Exoplayer - How to check if MP4 video has audio? - android

I'm using URLs from an API. Some of them point to MP4s without sound (the video plays, just with no audio). How do I check whether a given video has sound or not? I've been searching through the SimpleExoPlayer docs
(https://exoplayer.dev/doc/reference/com/google/android/exoplayer2/SimpleExoPlayer.html) and testing the methods on my URLs for the past couple of hours,
but I can't figure out how to detect whether the playing video has sound or not.
I tried all the methods in getAudioAttributes(), getAudioComponents(), and now getAudioFormat(), but they all return null.
try {
    Log.d(TAG, "onCreateView: " + player.getAudioFormat().channelCount);
} catch (Exception e) {
    Log.d(TAG, "onCreateView: " + e);
}
And yes, I've made sure the links actually have audio.

You can track the current tracks with Player.EventListener#onTracksChanged and get the current ones with Player#getCurrentTrackGroups(). If you go through the track groups you can look for the track type; if you find the AUDIO type there, it means your video file contains an audio track.
If you additionally want to check whether any of the audio tracks was selected, then Player#getCurrentTrackSelections() is the place to look.
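Distilled to plain Java, the check on each track group amounts to scanning every track's sampleMimeType for an audio type. Here is a minimal sketch; the helper class name is hypothetical and not part of the ExoPlayer API:

```java
// Hypothetical helper: the essence of the track-group scan is a MIME-type
// check on each track's Format.sampleMimeType.
public class AudioTrackCheck {
    // Returns true if any of the given sample MIME types describes an
    // audio track, e.g. "audio/mp4a-latm" for AAC audio inside an MP4.
    public static boolean containsAudio(String[] sampleMimeTypes) {
        for (String mimeType : sampleMimeTypes) {
            if (mimeType != null && mimeType.startsWith("audio/")) {
                return true;
            }
        }
        return false;
    }
}
```

In ExoPlayer itself the MIME types would come from `trackGroups.get(i).getFormat(g).sampleMimeType`, as the answers in this thread show.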

To complete @Hamza Khan's answer, here is my code to check whether the loaded video has any audio:
override fun onTracksChanged(
    trackGroups: TrackGroupArray?,
    trackSelections: TrackSelectionArray?
) {
    if (trackGroups != null && !trackGroups.isEmpty) {
        for (arrayIndex in 0 until trackGroups.length) {
            for (groupIndex in 0 until trackGroups[arrayIndex].length) {
                val sampleMimeType = trackGroups[arrayIndex].getFormat(groupIndex).sampleMimeType
                if (sampleMimeType != null && sampleMimeType.contains("audio")) {
                    // video contains audio
                }
            }
        }
    }
}

player.addListener(new Player.EventListener() {
    @Override
    public void onTracksChanged(TrackGroupArray trackGroups, TrackSelectionArray trackSelections) {
        if (trackGroups != null && !trackGroups.isEmpty()) {
            for (int i = 0; i < trackGroups.length; i++) {
                for (int g = 0; g < trackGroups.get(i).length; g++) {
                    String sampleMimeType = trackGroups.get(i).getFormat(g).sampleMimeType;
                    if (sampleMimeType != null && sampleMimeType.contains("audio")) {
                        // video contains audio
                    }
                }
            }
        }
    }
});
Java version of mrj's answer. I came across this thread, and it helped me a lot!

Related

Xamarin: recording audio over a video and exporting it to YouTube

I'm in the final stages of creating my Foley app! (It's my first ever app, created during my internship, and I taught myself how to make it, so sorry if any questions are stupid!)
My game is finished and working (also thanks to many of you!) and now I am trying to make the second part.
The idea is that it will show a video without sound. People will then have the opportunity to use our Foley room to record sounds over the video.
So far so good, but I still have a few issues.
First, I can't seem to find a way to stop the video/audio and replay it. When you take all the correct steps there is no issue, but as most of you will know, you want to make your code as waterproof as possible, so it would be nice to find a solution for this.
Second, I'd like to know whether it's possible to export the combination of the video and the recorded audio to YouTube. (I found code for uploading, but I haven't tried whether my idea is possible; does anyone have any experience with this?)
//<----------------------------------- Declare variables --------------->
MediaRecorder _recorder;
MediaPlayer _player;
Button _start;
Button _stop;
public static bool recorderplaying = false;

protected override void OnCreate(Bundle savedInstanceState)
{
    base.OnCreate(savedInstanceState);
    SetContentView(Resource.Layout.VideoActie);

    // Video variables
    var videoView = FindViewById<VideoView>(Resource.Id.PlaceHolder);
    Button play = FindViewById<Button>(Resource.Id.Play);
    Button stop = FindViewById<Button>(Resource.Id.stop);

    // Recording variables
    _start = FindViewById<Button>(Resource.Id.record);
    _stop = FindViewById<Button>(Resource.Id.stop);

    //<----------------------------------- Record audio ---------------->
    // Where the recording is saved
    string path = $"{Android.OS.Environment.ExternalStorageDirectory.AbsolutePath}/test.3gpp";

    // Ask for storage and microphone permissions on the Android phone
    if (Build.VERSION.SdkInt > BuildVersionCodes.M)
    {
        if (CheckSelfPermission(Manifest.Permission.WriteExternalStorage) != Android.Content.PM.Permission.Granted
            || CheckSelfPermission(Manifest.Permission.RecordAudio) != Android.Content.PM.Permission.Granted)
        {
            RequestPermissions(new[] { Manifest.Permission.WriteExternalStorage, Manifest.Permission.RecordAudio }, 0);
        }
    }

    //<----------------------------------- Play video -------------->
    // Video source
    videoView.SetMediaController(new MediaController(this));
    videoView.SetVideoPath($"android.resource://{PackageName}/{Resource.Raw.puppy}");
    videoView.RequestFocus();

    //<----------------------------------- Buttons --------------------->
    // Start recording
    _start.Click += delegate
    {
        _stop.Enabled = !_stop.Enabled;
        _start.Enabled = !_start.Enabled;
        _recorder.SetAudioSource(AudioSource.Mic);
        _recorder.SetOutputFormat(OutputFormat.ThreeGpp);
        _recorder.SetAudioEncoder(AudioEncoder.AmrNb);
        _recorder.SetOutputFile(path);
        _recorder.Prepare();
        _recorder.Start();
        videoView.Start();
        recorderplaying = true;
    };

    // Stop recording
    stop.Click += delegate
    {
        _stop.Enabled = !_stop.Enabled;
        videoView.Pause();
        _player.Stop();
        if (recorderplaying == true)
        {
            _recorder.Stop();
            _recorder.Reset();
            recorderplaying = false;
        }
        else
        {
            int pass = 0;
        }
    };

    play.Click += delegate
    {
        _stop.Enabled = !_stop.Enabled;
        _player.SetDataSource(path);
        _player.Prepare();
        _player.Start();
        videoView.Start();
    };
}

//<----------------------------------- OnResume, OnPause ---------------->
protected override void OnResume()
{
    base.OnResume();
    _recorder = new MediaRecorder();
    _player = new MediaPlayer();
    _player.Completion += (sender, e) =>
    {
        _player.Reset();
        _start.Enabled = !_start.Enabled;
        _stop.Enabled = !_stop.Enabled;
    };
}

protected override void OnPause()
{
    base.OnPause();
    _player.Release();
    _recorder.Release();
    _player.Dispose();
    _recorder.Dispose();
    _player = null;
    _recorder = null;
}
}
}
This is what I have so far.
I also tried to make a global variable for stopping the video, since that worked fine for the audio in the game, but sadly that didn't work.
note: don't mind the puppy video, it's a placeholder haha!
If anyone has any idea if and how this is possible, that would be amazing!!

Using ZXing in Xamarin for Android, how do I stop continuous scanning right after I get my result?

I'm using ZXing in an Android app, developed in Xamarin, to scan a QR code and automatically start playing the corresponding audio file.
My problem is that when I get a result from scanning, the audio player activity takes some time to load, so it gets called twice or more due to subsequent successful scans.
Is there a way to stop continuous scanning as soon as I get a correct result?
Here's the code:
// Start scanning
scanner.ScanContinuously(opt, HandleScanResult);
}

private void HandleScanResult(ZXing.Result result)
{
    string msg = "";
    if (result != null && !string.IsNullOrEmpty(result.Text))
    {
        msg = result.Text;
        var playerActivity = new Intent(myContext, typeof(AudioActivity));
        //-------------------------------------------------------------
        // Prerequisite: load all tracks into the "Assets/tracks" folder.
        // You can put the QR-code-to-track assignments here below.
        // msg: decoded QR code
        // playerActivity.PutExtra's second parameter is a relative path
        // under the "Assets" directory.
        //--------------------------------------------------------------
        // Iterate through the tracks stored in assets and load their titles into an array
        string[] trackArray = Application.Context.Assets.List("tracks");
        bool trackFound = false;
        foreach (string track in trackArray)
        {
            if (track.Equals(msg + ".mp3"))
            {
                playerActivity.PutExtra("Track", "tracks/" + msg + ".mp3");
                for (int i = 0; i < PostList.postList.Count; i++)
                {
                    if (PostList.postList.ElementAt(i).code.Equals(msg))
                        playerActivity.PutExtra("TrackTitle", PostList.postList.ElementAt(i).title);
                }
                myContext.StartActivity(playerActivity);
                trackFound = true;
            }
        }
    }
}
Thank you!
Old question, but I'll post this anyway for anyone still looking for this information.
You need your scanner to be a class variable. This is my code:
public MobileBarcodeScanner scanner = new MobileBarcodeScanner();

private void ArrivalsClick(object sender, EventArgs e)
{
    try
    {
        if (Arrivals.IsEnabled)
        {
            MobileBarcodeScanningOptions optionsCustom = new MobileBarcodeScanningOptions();
            scanner.TopText = "Scan Barcode";
            optionsCustom.DelayBetweenContinuousScans = 3000;
            scanner.ScanContinuously(optionsCustom, ArrivalResult);
        }
    }
    catch (Exception)
    {
        throw;
    }
}

private async void ArrivalResult(ZXing.Result result)
{
    if (result != null && result.Text != "")
    {
        // Making a call to a REST API
        if (resp.StatusCode == System.Net.HttpStatusCode.OK)
        {
            int? res = JsonConvert.DeserializeObject<int>(resp.Content);
            if (res == 0)
            {
                scanner.Cancel(); // <----- Stops the scanner (something went wrong)
                Device.BeginInvokeOnMainThread(async () =>
                {
                    await DisplayAlert("..", "..", "ΟΚ");
                });
            }
            else
            {
                Plugin.SimpleAudioPlayer.ISimpleAudioPlayer player = Plugin.SimpleAudioPlayer.CrossSimpleAudioPlayer.Current;
                player.Load("beep.wav");
                player.Play(); // Scan successful
            }
        }
        else
        {
            scanner.Cancel();
            Device.BeginInvokeOnMainThread(async () =>
            {
                await DisplayAlert("..", "..", "ΟΚ");
            });
        }
    }
}

Android native webrtc: add video after already connected

I have successfully been running WebRTC in my Android app for a while, using libjingle.so and PeerConnectionClient.java, etc., from Google's code library. However, I am now running into a problem where a user starts a connection as audio only (i.e., an audio call), but then toggles video on. I augmented the existing setVideoEnabled() in PeerConnectionClient as such:
public void setVideoEnabled(final boolean enable) {
    executor.execute(new Runnable() {
        @Override
        public void run() {
            renderVideo = enable;
            if (localVideoTrack != null) {
                localVideoTrack.setEnabled(renderVideo);
            } else {
                if (renderVideo) {
                    // AC: create a video track
                    String cameraDeviceName = VideoCapturerAndroid.getDeviceName(0);
                    String frontCameraDeviceName =
                            VideoCapturerAndroid.getNameOfFrontFacingDevice();
                    if (numberOfCameras > 1 && frontCameraDeviceName != null) {
                        cameraDeviceName = frontCameraDeviceName;
                    }
                    Log.i(TAG, "Opening camera: " + cameraDeviceName);
                    videoCapturer = VideoCapturerAndroid.create(cameraDeviceName);
                    if (createVideoTrack(videoCapturer) != null) {
                        mediaStream.addTrack(localVideoTrack);
                        localVideoTrack.setEnabled(renderVideo);
                        peerConnection.addStream(mediaStream);
                    } else {
                        Log.d(TAG, "Local video track is still null");
                    }
                } else {
                    Log.d(TAG, "Local video track is null");
                }
            }
            if (remoteVideoTrack != null) {
                remoteVideoTrack.setEnabled(renderVideo);
            } else {
                Log.d(TAG, "Remote video track is null");
            }
        }
    });
}
This allows me to successfully see a local inset of the device's video camera, but it doesn't send the video to the remote client. I thought the peerConnection.addStream() call would do that, but perhaps I am missing something else?
To avoid building an external communication mechanism between peers (which would require an answer from the second peer before the new stream can be added), you can always start with an existing (but initially empty) video stream. Then it is just a matter of filling this stream with content when (and if) necessary.

Double open brackets on if-statement? What is this?

I am using some example code from the internet that manages music. One section of the code pauses music and prepares it to play again. I don't understand the double-brace notation. I thought it might be a fancy way of doing an if-else, but my equivalent snippet of code does not work, while the code with the double braces works perfectly. What exactly does it mean when you add two opening braces to an if statement?
Here is the snippet of code:
// previousMusic should always be something valid
if (currentMusic != -1) {{
previousMusic = currentMusic;
Log.d(TAG, "Music is paused");
}
currentMusic = -1;
Log.d(TAG, "Paused music is prepared to play again (if necessary)");
}
Here is what I thought it could have meant. It does not work as intended, so it is evidently not the same.
// previousMusic should always be something valid
// previousMusic should always be something valid
if (currentMusic != -1) {
    previousMusic = currentMusic;
    Log.d(TAG, "Music is paused");
} else {
    currentMusic = -1;
    Log.d(TAG, "Paused music is prepared to play again");
}
Thank you in advance for the explanation.
What exactly does it mean when you add two opening braces to an if statement?
Nothing special, really: it is just a block of code that puts your previousMusic = currentMusic in another scope within the method.
It is like saying:
if (currentMusic != -1) {
    previousMusic = currentMusic;
    Log.d(TAG, "Music is paused");
    currentMusic = -1;
    Log.d(TAG, "Paused music is prepared to play again (if necessary)");
}
But if you declare a variable inside that block, you can't access it outside the block, because the variable's scope is limited to the block.
if (1 == 1)
{
    int i2;
    {
        int i;
        i2 = 1; // can access a variable from the enclosing scope
    }
    i = 0; // compile-time error: can't access a variable declared inside the inner block
}
I hope you are aware of how variable scope works.
Example 1:
if (some condition) {
    { // x is born here
        int x = 32;
    } // x is dead here
    // not allowed!
    Log.d(TAG, "Value is: " + x);
}
Example 2:
if (some condition) {
    int x = 32;
    // totally legit!
    Log.d(TAG, "Value is: " + x);
}
See the subtle difference between the two?
In Example 1, the nested {} limit the scope of x: the variable x is available for use only until the closing brace } that matches its opening brace {.
The correct way to understand that code is:
if (currentMusic != -1) {
    {
        previousMusic = currentMusic;
        Log.d(TAG, "Music is paused");
    }
    currentMusic = -1;
    Log.d(TAG, "Paused music is prepared to play again (if necessary)");
}
When you put braces in your code, you just create a new scope for your variables.
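As a runnable sanity check (plain Java; the class and method names are hypothetical), the two methods below implement the pause logic from the question, with and without the extra braces, and always produce the same result, since the inner braces only affect variable scope, not control flow:

```java
// Demonstrates that the extra braces change nothing but scope:
// both methods return the same {currentMusic, previousMusic} pair.
public class ScopeDemo {
    public static int[] pauseWithExtraBraces(int currentMusic, int previousMusic) {
        if (currentMusic != -1) {{
                previousMusic = currentMusic; // inner block: same effect, narrower scope
            }
            currentMusic = -1;
        }
        return new int[]{currentMusic, previousMusic};
    }

    public static int[] pauseWithoutExtraBraces(int currentMusic, int previousMusic) {
        if (currentMusic != -1) {
            previousMusic = currentMusic;
            currentMusic = -1;
        }
        return new int[]{currentMusic, previousMusic};
    }
}
```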

Android Player Error (-38,0)

I made a streaming program to play an ad + audio + ad. I play the first ad fine, then I switch to the audio, which is also fine, but I fail at playing the last ad and get Error (-38, 0). I checked that I have set the data source and the onPreparedListener, and I have tried everything I could find so far, but I am still getting this error on Android 4.1.1.
I get the error after my method MPStarting; for the final ad I never even reach the onPrepared method. If there is any info you need, please let me know. Thanks.
Here is the related part of the code:
MPStarting(Track)
{
    try
    {
        if (_playlist != null && _playlist.GetCurrent() != null)
        {
            Episode ep = (Episode) _playlist.GetCurrent();
            _player = new MediaPlayer();
            AdsInfo startAd = ep.getAdWithType(PlayTime.start_ad);
            AdsInfo endAd = ep.getAdWithType(PlayTime.end_ad);
            if (currAudio == null && startAd != null)
                currAudio = startAd;
            else if (currAudio == startAd)
                currAudio = ep;
            else if (currAudio instanceof Episode && endAd != null)
                currAudio = ep.getAdWithType(PlayTime.end_ad);
        }
        if (_player != null)
        {
            _player.setDataSource(dataSource);
            _player.setOnPreparedListener(this);
            _player.setOnCompletionListener(this);
            _player.setOnBufferingUpdateListener(this);
            _player.setOnSeekCompleteListener(this);
            _player.setOnErrorListener(this);
            _player.prepareAsync();
        }
    }
    catch (Exception e)
    {
        Log.i("mpcPlayer", "MPStarting " + e.getLocalizedMessage());
    }
}

@Override
public void onCompletion(MediaPlayer mp)
{
    // here I check what is currently playing
    // I always stop the player if it is playing, reset, release and set player = null
    // then I call MPStarting, pass it the current audio, and return
}
I think I found my problem: I was sometimes calling getCurrentPosition() when the player was not ready. I guess this error comes from calling a method while the player is not in the right state.
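One way to guard against that kind of state error is to track readiness with an explicit flag (set in onPrepared, cleared on reset/release) and only query the player while the flag is true. Below is a plain-Java sketch of that idea; the class name is hypothetical, and a supplier stands in for the real player.getCurrentPosition() call:

```java
import java.util.function.IntSupplier;

// Sketch: forward position queries only while the player is "prepared",
// otherwise return the last known position instead of hitting an
// invalid-state error.
public class SafePositionTracker {
    private final IntSupplier rawPosition; // stands in for player.getCurrentPosition()
    private boolean prepared = false;      // set true in onPrepared, false on reset/release
    private int lastKnownPosition = 0;

    public SafePositionTracker(IntSupplier rawPosition) {
        this.rawPosition = rawPosition;
    }

    public void setPrepared(boolean prepared) {
        this.prepared = prepared;
    }

    // Only query the underlying player while it is in a valid state.
    public int currentPosition() {
        if (prepared) {
            lastKnownPosition = rawPosition.getAsInt();
        }
        return lastKnownPosition;
    }
}
```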
