[AS3][Air for Android] Get streaming mic input?

Is there a way to get true streaming mic input?
The example code I have at the moment appears to grab the mic input data, save it to a Sound object, and play it back right away.
Is there a way to stream it properly?
If not, is there a way in my example to get the mic input data but mute the audio? It is causing a feedback loop (despite setLoopBack being set to false).
Code below:
import flash.display.*;
import flash.events.*;
import flash.media.*;
import flash.media.SoundTransform;
import flash.utils.*;
var _soundBytes:ByteArray = new ByteArray();
var _micBytes:ByteArray;
var son:Sound;
var sc:SoundChannel;
var pow:int = 0;
var myBar:Sprite;
stage.quality = "LOW";
// this code ended up muting the mic input oddly?
//SoundMixer.soundTransform = new SoundTransform(0);
init();
function init():void
{
    myBar = new Sprite();
    addChild(myBar); // add the bar once, not on every frame
    micInit();
    soundInit();
    // the visualiser defined below is drawLines
    addEventListener(Event.ENTER_FRAME, drawLines);
}
function micInit():void
{
    var mic:Microphone = Microphone.getMicrophone();
    if (mic != null)
    {
        //mic.setUseEchoSuppression(true);
        mic.setLoopBack(false);
        mic.setSilenceLevel(0);
        mic.rate = 44;
        mic.gain = 60;
        mic.addEventListener(SampleDataEvent.SAMPLE_DATA, micSampleDataHandler);
    }
}
function micSampleDataHandler(event:SampleDataEvent):void
{
    _micBytes = event.data;
    // note: this starts a new playback on every sample-data event,
    // which is what feeds the captured audio straight back out
    sc = son.play();
}
function soundInit():void
{
    son = new Sound();
    son.addEventListener(SampleDataEvent.SAMPLE_DATA, soundSampleDataHandler);
}
function soundSampleDataHandler(event:SampleDataEvent):void
{
    for (var i:int = 0; i < 8192 && _micBytes.bytesAvailable > 0; i++)
    {
        var sample:Number = _micBytes.readFloat();
        event.data.writeFloat(sample); // left channel
        event.data.writeFloat(sample); // right channel
    }
}
function drawLines(e:Event):void
{
    SoundMixer.computeSpectrum(_soundBytes, true);
    myBar.graphics.clear();
    myBar.graphics.lineStyle(2, 0xabc241);
    for (var i:int = 0; i < 256; i++)
    {
        pow = Math.abs(_soundBytes.readFloat() * 200);
        myBar.graphics.drawRect(i * 2, 0, 2, pow);
    }
}
Hope you can help!

To use acoustic echo cancellation, call Microphone.getEnhancedMicrophone() to get a reference to an enhanced Microphone object, and set the Microphone.enhancedOptions property to an instance of the MicrophoneEnhancedOptions class. Here is an article that discusses it all: Article about enhanced microphone options at Adobe.
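For reference, a minimal sketch of what that looks like (this assumes your runtime and device support the enhanced microphone; the mode and echoPath values here are only illustrative):
import flash.media.Microphone;
import flash.media.MicrophoneEnhancedMode;
import flash.media.MicrophoneEnhancedOptions;

var enhancedMic:Microphone = Microphone.getEnhancedMicrophone();
if (enhancedMic != null)
{
    var options:MicrophoneEnhancedOptions = new MicrophoneEnhancedOptions();
    // FULL_DUPLEX enables acoustic echo cancellation while talking and listening at the same time
    options.mode = MicrophoneEnhancedMode.FULL_DUPLEX;
    options.echoPath = 128; // echo path length in milliseconds (128 is the default; 256 is the other option)
    options.nonLinearProcessing = true; // suppress residual echo
    enhancedMic.enhancedOptions = options;
}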
EDIT: I spoke too soon. I have used the enhanced mic many times before, but I decided to read the article myself to see if there was anything new I could learn from it... and I found this near the end:
"AEC is computationally expensive. Currently, only desktop platforms are supported for Flash Player and AIR."
Although I just looked at the date: it is from last year, so maybe give it a try, it may be supported by now.

Related

How to set sound playback from external speaker?

There is a kind of weird issue. I am using the oboe library (https://github.com/google/oboe) for sound playback. Of course, you can choose the sound playback output according to the Android settings:
https://developer.android.com/reference/android/media/AudioDeviceInfo
So, if I need to set an exact output channel, I need to set it on the oboe library.
By the way, the output channel I need is TYPE_BUILTIN_SPEAKER, but on some devices (sometimes, not consistently) I hear the sound from TYPE_BUILTIN_EARPIECE.
Here is how I am doing this. I have the following method to get the needed channel id:
fun findAudioDevice(app: Application,
                    deviceFlag: Int,
                    deviceType: Int): AudioDeviceInfo?
{
    var result: AudioDeviceInfo? = null
    val manager = app.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    val adis = manager.getDevices(deviceFlag)
    for (adi in adis)
    {
        if (adi.type == deviceType)
        {
            result = adi
            break
        }
    }
    return result
}
Here is how I use it:
val id = getAudioDeviceInfoId(getBuildInSpeakerInfo())

private fun getBuildInSpeakerInfo(): AudioDeviceInfo?
{
    return com.tetavi.ar.basedomain.utils.Utils.findAudioDevice( //
            getApplication<Application>(), //
            AudioManager.GET_DEVICES_OUTPUTS, //
            AudioDeviceInfo.TYPE_BUILTIN_SPEAKER //
    )
}

private fun getAudioDeviceInfoId(info: AudioDeviceInfo?): Int
{
    var result = -1
    if (info != null)
    {
        result = info.id
    }
    return result
}
Eventually I need to pass this id to the oboe library. Oboe is a native library, so I pass the id over JNI and set it:
oboe::Result oboe_engine::createPlaybackStream()
{
    oboe::AudioStreamBuilder builder;
    const oboe::SharingMode sharingMode = oboe::SharingMode::Exclusive;
    const int32_t sampleRate = mBackingTrack->getSampleRate();
    const oboe::AudioFormat audioFormat = oboe::AudioFormat::Float;
    const oboe::PerformanceMode performanceMode = oboe::PerformanceMode::PowerSaving;

    builder.setSharingMode(sharingMode)
            ->setPerformanceMode(performanceMode)
            ->setFormat(audioFormat)
            ->setCallback(this)
            ->setSampleRate(sampleRate);

    if (m_output_playback_chanel_id != EMPTY_NUM)
    {
        // set the output playback channel (internal or external speaker)
        builder.setDeviceId(m_output_playback_chanel_id); // <------------- THIS LINE
    }

    return builder.openStream(&mAudioStream);
}
So the actual issue is that on some devices (sometimes, not consistently) I still hear the sound playback from the internal speaker (TYPE_BUILTIN_EARPIECE), in spite of setting directly that I need TYPE_BUILTIN_SPEAKER.
I checked the flow a few times, from the moment I get this id (it is actually 3) up to the moment I pass it as a parameter to the oboe library, but I still sometimes hear sound from the internal speaker.
So the question is: am I missing something here? Is there some trick that should be implemented, or something else?

Change Exoplayer HLS Quality

I'm trying to set the video bitrate in ExoPlayer. I have already set it like this:
trackSelector = new DefaultTrackSelector(factory);
DefaultTrackSelector.Parameters parameters = trackSelector.getParameters();
parameters.withMaxVideoBitrate(maxBitrate);
parameters.withExceedRendererCapabilitiesIfNecessary(false);
parameters.withExceedVideoConstraintsIfNecessary(false);
trackSelector.setParameters(parameters);
but it doesn't work. Everywhere I've found something about this, people were using HlsChunkSource, which is private in ExoPlayer 2.6. Can anyone help me, please?
For those who need to set the HLS quality according to their needs, this is how it can be done; there are several posts on SO about this topic, but none of them is very clear.
As I write this in 2019, I assume everyone is using ExoPlayer 2.
This is the solution that gave us the best result:
DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(Objects.requireNonNull(getContext()),
        Util.getUserAgent(this.getContext(), getResources().getString(R.string.app_name)));

trackSelector = new CustomTrackSelector();
videoSource = new HlsMediaSource.Factory(dataSourceFactory).createMediaSource(mp4VideoUri);
player = ExoPlayerFactory.newSimpleInstance(this.getContext(), trackSelector);
So what you should do is just override the behavior of the default track selector by overriding the selectVideoTrack method:
public class CustomTrackSelector extends DefaultTrackSelector
{
    public CustomTrackSelector()
    {
        super();
    }

    @Nullable
    protected TrackSelection selectVideoTrack(
            TrackGroupArray groups,
            int[][] formatSupports,
            int mixedMimeTypeAdaptationSupports,
            Parameters params,
            @Nullable TrackSelection.Factory adaptiveTrackSelectionFactory)
            throws ExoPlaybackException
    {
        AdaptiveTrackSelection adaptiveTrackSelection = null;
        if (groups.length > 0)
        {
            for (int groupIndex = 0; groupIndex < groups.length; groupIndex++)
            {
                TrackGroup trackGroup = groups.get(groupIndex);
                int[] tracks = new int[trackGroup.length];
                // creation of the indexes array
                for (int i = 0; i < trackGroup.length; i++)
                {
                    tracks[i] = i;
                }
                adaptiveTrackSelection = new AdaptiveTrackSelection(
                        trackGroup,
                        tracks,
                        new DefaultBandwidthMeter(),
                        AdaptiveTrackSelection.DEFAULT_MIN_DURATION_FOR_QUALITY_INCREASE_MS,
                        AdaptiveTrackSelection.DEFAULT_MAX_DURATION_FOR_QUALITY_DECREASE_MS,
                        AdaptiveTrackSelection.DEFAULT_MIN_DURATION_TO_RETAIN_AFTER_DISCARD_MS,
                        AdaptiveTrackSelection.DEFAULT_BANDWIDTH_FRACTION,
                        AdaptiveTrackSelection.DEFAULT_BUFFERED_FRACTION_TO_LIVE_EDGE_FOR_QUALITY_INCREASE,
                        AdaptiveTrackSelection.DEFAULT_MIN_TIME_BETWEEN_BUFFER_REEVALUTATION_MS,
                        Clock.DEFAULT);
                for (int i = 0; i < tracks.length; i++)
                {
                    Format format = trackGroup.getFormat(tracks[i]);
                    if (format.width < MIN_WIDTH)
                    {
                        Logger.log(this, "Video track blacklisted with width = " + format.width);
                        adaptiveTrackSelection.blacklist(tracks[i], BLACKLIST_DURATION);
                    }
                    else
                    {
                        Logger.log(this, "Video track NOT blacklisted with width = " + format.width);
                    }
                }
            }
        }
        return adaptiveTrackSelection;
    }
}
The above method just blacklists the tracks that you don't want selected, allowing the player to choose only among those that are not blacklisted.
We blacklisted tracks according to the width parameter, but obviously you can filter them by bitrate instead.
With this behavior the player will start with a track you allow it to use, and after a period of time (the blacklist duration) it can switch back to using all the tracks if needed.
If you want to exclude a track permanently, just use Integer.MAX_VALUE as the blacklist duration.
I hope this helps whoever is searching for this feature.

Output file using FFmpeg in Xamarin Android

I'm building an Android app using Xamarin. The requirement of the app is to capture video from the camera and encode it to send across to a server.
Initially, I was using an encoder library on the server side to encode the recorded video, but it was proving to be extremely unreliable and inefficient, especially for large video files. I have posted about my issues on another thread here.
I then decided to encode the video on the client side and then send it to the server. I've found encoding to be a bit complicated, and there isn't much information available on how it can be done, so I searched for the only way I knew to encode a video: using the FFmpeg codec. I've found some solutions. There's a project on GitHub that demonstrates how FFmpeg is used inside a Xamarin Android project. However, running the solution doesn't give any output. The project has an FFmpeg binary which is installed to the phone directory using the code below:
_ffmpegBin = InstallBinary(XamarinAndroidFFmpeg.Resource.Raw.ffmpeg, "ffmpeg", false);
Below is the example code for encoding video into a different set of outputs:
_workingDirectory = Android.OS.Environment.ExternalStorageDirectory.AbsolutePath;
var sourceMp4 = "cat1.mp4";
var destinationPathAndFilename = System.IO.Path.Combine (_workingDirectory, "cat1_out.mp4");
var destinationPathAndFilename2 = System.IO.Path.Combine (_workingDirectory, "cat1_out2.mp4");
var destinationPathAndFilename4 = System.IO.Path.Combine (_workingDirectory, "cat1_out4.wav");
if (File.Exists (destinationPathAndFilename))
    File.Delete (destinationPathAndFilename);
CreateSampleFile(Resource.Raw.cat1, _workingDirectory, sourceMp4);

var ffmpeg = new FFMpeg (this, _workingDirectory);
var sourceClip = new Clip (System.IO.Path.Combine(_workingDirectory, sourceMp4));
var result = ffmpeg.GetInfo (sourceClip);
var br = System.Environment.NewLine;

// There are callbacks based on standard output and standard error while the ffmpeg binary is running as a process:
var onComplete = new MyCommand ((_) => {
    RunOnUiThread(() => _logView.Append("DONE!" + br + br));
});
var onMessage = new MyCommand ((message) => {
    RunOnUiThread(() => _logView.Append(message + br + br));
});
var callbacks = new FFMpegCallbacks (onComplete, onMessage);

// 1. The idea of this first test is to show that video editing is possible via FFmpeg:
// It results in a 150x150 movie that eventually zooms in on a cat ear. This is desaturated, and there's a fade-in.
var filters = new List<VideoFilter> ();
filters.Add (new FadeVideoFilter ("in", 0, 100));
filters.Add (new CropVideoFilter ("150", "150", "0", "0"));
filters.Add (new ColorVideoFilter (1.0m, 1.0m, 0.0m, 0.5m, 1.0m, 1.0m, 1.0m, 1.0m));
var outputClip = new Clip (destinationPathAndFilename) { videoFilter = VideoFilter.Build (filters) };
outputClip.H264_CRF = "18"; // the quality coefficient for H264. Default is 28; 18 is pretty good.
ffmpeg.ProcessVideo(sourceClip, outputClip, true, new FFMpegCallbacks(onComplete, onMessage));

// 2. This is a similar version done via the command line only:
string[] cmds = new string[] {
    "-y",
    "-i",
    sourceClip.path,
    "-strict",
    "-2",
    "-vf",
    "mp=eq2=1:1.68:0.3:1.25:1:0.96:1",
    destinationPathAndFilename2,
    "-acodec",
    "copy",
};
ffmpeg.Execute (cmds, callbacks);

// 3. This lists codecs:
string[] cmds3 = new string[] {
    "-codecs",
};
ffmpeg.Execute (cmds3, callbacks); // note: cmds3, not cmds, so the codecs listing actually runs

// 4. This converts to WAV.
// Note that the cat movie just has some silent house noise.
ffmpeg.ConvertToWaveAudio(sourceClip, destinationPathAndFilename4, 44100, 2, callbacks, true);
I have tried different commands, but no output file is generated. I have also tried another project found here, but it has the same issue: I don't get any errors, yet no output file is generated. I'm really hoping someone can help me find a way to use FFmpeg in my project, or some other way to compress video before transporting it to the server.
I would really appreciate it if someone could point me in the right direction.
I finally figured out how to get the output: add the missing permission (a uses-permission element) to the AndroidManifest file.
android.permission.WRITE_EXTERNAL_STORAGE
Please read the update on the repository; it says that there is a second package, Xamarin.Android.MP4Transcoder, for Android 6.0 onwards.
Install NuGet https://www.nuget.org/packages/Xamarin.Android.MP4Transcoder/
await Xamarin.MP4Transcoder.Transcoder
    .For720pFormat()
    .ConvertAsync(inputFile, outputFile, f => {
        onProgress?.Invoke((int)(f * (double)100), 100);
    });
return outputFile;
For previous Android versions:
Source code: https://github.com/neurospeech/xamarin-android-ffmpeg
Install-Package Xamarin.Android.FFmpeg
Use the following as a template; it lets you log the output and calculates progress.
You can take a look at the source; it downloads ffmpeg and verifies the SHA1 hash on first use.
// requires: using System.Threading.Tasks; (ConvertFile must be async for the await below)
public class VideoConverter
{
    public VideoConverter()
    {
    }

    public async Task<File> ConvertFile(Context context,
        File inputFile,
        Action<string> logger = null,
        Action<int, int> onProgress = null)
    {
        File outputFile = new File(inputFile.CanonicalPath + ".mpg");
        outputFile.DeleteOnExit();

        List<string> cmd = new List<string>();
        cmd.Add("-y");
        cmd.Add("-i");
        cmd.Add(inputFile.CanonicalPath);

        MediaMetadataRetriever m = new MediaMetadataRetriever();
        m.SetDataSource(inputFile.CanonicalPath);

        string rotate = m.ExtractMetadata(Android.Media.MetadataKey.VideoRotation);
        int r = 0;
        if (!string.IsNullOrWhiteSpace(rotate))
        {
            r = int.Parse(rotate);
        }

        cmd.Add("-b:v");
        cmd.Add("1M");
        cmd.Add("-b:a");
        cmd.Add("128k");

        switch (r)
        {
            case 270:
                cmd.Add("-vf scale=-1:480,transpose=cclock");
                break;
            case 180:
                cmd.Add("-vf scale=-1:480,transpose=cclock,transpose=cclock");
                break;
            case 90:
                cmd.Add("-vf scale=480:-1,transpose=clock");
                break;
            case 0:
                cmd.Add("-vf scale=-1:480");
                break;
            default:
                break;
        }

        cmd.Add("-f");
        cmd.Add("mpeg");
        cmd.Add(outputFile.CanonicalPath);

        string cmdParams = string.Join(" ", cmd);

        int total = 0;
        int current = 0;

        await FFMpeg.Xamarin.FFMpegLibrary.Run(
            context,
            cmdParams,
            (s) => {
                logger?.Invoke(s);

                int n = Extract(s, "Duration:", ",");
                if (n != -1)
                {
                    total = n;
                }

                n = Extract(s, "time=", " bitrate=");
                if (n != -1)
                {
                    current = n;
                    onProgress?.Invoke(current, total);
                }
            });

        return outputFile;
    }

    int Extract(String text, String start, String end)
    {
        int i = text.IndexOf(start);
        if (i != -1)
        {
            text = text.Substring(i + start.Length);
            i = text.IndexOf(end);
            if (i != -1)
            {
                text = text.Substring(0, i);
                return parseTime(text);
            }
        }
        return -1;
    }

    // parses "HH:MM:SS.xx" into centiseconds (1/100 s)
    public static int parseTime(String time)
    {
        time = time.Trim();
        String[] tokens = time.Split(':');
        int hours = int.Parse(tokens[0]);
        int minutes = int.Parse(tokens[1]);
        float seconds = float.Parse(tokens[2]);
        // original had "minutes * 60100", which looks like a typo for 60 * 100
        return hours * 360000 + minutes * 6000 + (int)(seconds * 100);
    }
}

How to add Music in AS3 Android App

I am a noob in Adobe Flash ActionScript 3.
I want a music player (with start and pause buttons) in AS3. I have imported the music into the library and added the following code:
var qMySound:Sound = new mySound1();
qMySound.play(0, 9999);
mySound1 is the AS linkage name, and it works.
I made a stop button using the Code Snippets panel:
button_7.addEventListener(MouseEvent.CLICK, fl_ClickToStopAllSounds_6);
function fl_ClickToStopAllSounds_6(event:MouseEvent):void
{
    SoundMixer.stopAll();
}
And that works too, but now I want to start it again. I tried using this code:
button_1.addEventListener(MouseEvent.CLICK, fl_ClickToGoToAndStopAtFrame_28);
function fl_ClickToGoToAndStopAtFrame_28(event:MouseEvent):void
{
    gotoAndStop(1);
}
The music AS3 code is in frame 1, by the way.
And that doesn't work... so any ideas?
EDIT:
This is my code:
var mySound:Sound = new Sound();
var myChannel:SoundChannel = new SoundChannel();
var myTransform:SoundTransform = new SoundTransform();
var lastPosition:Number = 0;
var mySound:Sound = new mySound1(0, 999); // second declaration of mySound: this is what triggers the 1151 error
myChannel = mySound.play();

// here is to learn how to deal with volume as well (between 0 and 1)
myTransform.volume = .5;
// setting the SoundTransform instance to the myChannel object
myChannel.soundTransform = myTransform;

pause_btn.addEventListener(MouseEvent.CLICK, onClickPauseHandler, false, 0, true);
play_btn.addEventListener(MouseEvent.CLICK, onClickPlayHandler, false, 0, true);

function onClickPauseHandler(event:MouseEvent):void
{
    // getting the current position of the sound
    lastPosition = myChannel.position;
    // stopping the current sound
    myChannel.stop();
}

function onClickPlayHandler(event:MouseEvent):void
{
    // playing from the saved position
    myChannel = mySound.play(0, 999);
}
And this error pops up:
Line 8 | 1151: A conflict exists with definition of mySound in namespace internal.
You need to use the SoundChannel/SoundTransform classes to have more functionality. When you stop all sounds using SoundMixer.stopAll(), you are not stopping only that particular sound. (The 1151 error itself comes from declaring the mySound variable twice, once as new Sound() and once as new mySound1().)
Here is some example code to guide you toward what you are expecting, plus give you some extra knowledge about how to deal with sound:
var mySound:Sound = new Sound();
var myChannel:SoundChannel = new SoundChannel();
var myTransform:SoundTransform = new SoundTransform();
var lastPosition:Number = 0;

mySound.load(new URLRequest('yourMp3FileName.mp3'));
myChannel = mySound.play();

// here is to learn how to deal with volume as well (between 0 and 1)
myTransform.volume = .5;
// setting the SoundTransform instance to the myChannel object
myChannel.soundTransform = myTransform;

pause_btn.addEventListener(MouseEvent.CLICK, onClickPauseHandler, false, 0, true);
play_btn.addEventListener(MouseEvent.CLICK, onClickPlayHandler, false, 0, true);

function onClickPauseHandler(event:MouseEvent):void
{
    // getting the current position of the sound
    lastPosition = myChannel.position;
    // stopping the current sound
    myChannel.stop();
}

function onClickPlayHandler(event:MouseEvent):void
{
    // playing from the saved position
    myChannel = mySound.play(lastPosition);
}

BitmapData lock and unlock not working on android

The following code erases a bitmap (the brush, aka droplet) from another bitmap (the water pattern).
The code works great on PC, and performance-wise it is pretty OK.
When I test it on several Android devices, it doesn't work, no matter whether it is a high-end device or a slower one. I've run some tests and found out the problem is the lock() and unlock() functions of BitmapData. The image simply doesn't update on the device, except once.
I've tried removing them, but then it lags a lot. The performance drop is noticeable on PC too.
Does anyone know a solution? Where am I going wrong?
import flash.display.BitmapData;
import flash.display.Bitmap;
import flash.events.Event; // needed for Event.EXIT_FRAME below
import flash.geom.Point;
import flash.events.MouseEvent;
import flash.geom.ColorTransform;
import flash.geom.Rectangle;

var m:BitmapData = new water_pattern();
var b:BitmapData = new droplet();
var bm:Bitmap = new Bitmap(m);
var bla = new blabla();
addChild(bla);
bla.addChild(bm);

function p($x, $y)
{
    var refPoint = new Point($x - b.width / 2, $y - b.height / 2);
    for (var i = 0; i < b.width; i++)
        for (var j = 0; j < b.height; j++)
        {
            var a:uint = (b.getPixel32(i, j) >> 24) & 0xFF;
            a = 0xFF - a;
            var tp:uint = m.getPixel32(refPoint.x + i, refPoint.y + j);
            var tp_trans:uint = (tp >> 24) & 0xFF;
            if (tp_trans > a)
            {
                tp = (tp & 0x00FFFFFF) | (a << 24);
                m.setPixel32(refPoint.x + i, refPoint.y + j, tp);
            }
        }
    //for(var k=0;k<10000000;k++){};
}

var k = 1;

var md = function(e)
{
    m.lock();
    p(bm.mouseX, bm.mouseY);
    m.unlock();
};

bla.addEventListener(MouseEvent.MOUSE_DOWN, function(e)
{
    bla.addEventListener(Event.EXIT_FRAME, md);
});

bla.addEventListener(MouseEvent.MOUSE_UP, function(e)
{
    bla.removeEventListener(Event.EXIT_FRAME, md);
});
I've reworked the code :
public function draw($x, $y)
{
    var refPoint = new Point($x - brush.width / 2, $y - brush.height / 2);
    var r:Rectangle = new Rectangle(refPoint.x, refPoint.y, brush.width, brush.height);
    var pv:Vector.<uint> = pattern.getVector(r);
    var bv:Vector.<uint> = brush.getVector(brush.rect);

    for (var i = 0; i < bv.length; i++)
    {
        var a:uint = (bv[i] >> 24) & 0xFF;
        a = 0xFF - a;
        var tp:uint = pv[i];
        var tp_trans:uint = (tp >> 24) & 0xFF;
        // trace(a.toString(16) + " vs " + tp_trans.toString(16));
        if (tp_trans > a)
        {
            tp = (tp & 0x00FFFFFF) | (a << 24);
            // trace("??>" + tp);
            pv[i] = tp;
        }
    }

    pattern.setVector(r, pv);
}
Now it works, but it is still pretty slow on the device. That was before I saw Jeff Ward's comment, so I changed the render mode to CPU, and it works fast.
The big problem is that in CPU mode the game is very slow compared to GPU mode. Yet this script is fast on CPU but unusably slow on GPU.
So I tried the first code again and, surprise, it works. Jeff Ward, thank you, you're a genius.
Now the question that remains is: why? Can someone please explain?
For your original question: sometimes GPU mode doesn't pick up changes to the underlying BitmapData. Try any one of these operations after your unlock() to 'hint' that it should re-upload the bitmap data:
bm.filters = [];
bm.bitmapData = m;
bm.alpha = 0.98+Math.random()*0.02;
But as you found, uploading bitmap data can be slow. To clarify the GPU/direct render modes:
In GPU mode, changing any pixel in a Bitmap requires a re-upload of the full bitmap, so the size of the Bitmap is the limiting factor. In direct mode, the runtime blits only the portions of the screen that have been updated. So I'd guess some parts of the game change a lot of the screen at once (slow in direct mode), whereas this effect changes a large bitmap, but only a little bit at a time (slow in GPU mode).
You have to get creative to maximize your performance with respect to the GPU:
In GPU mode, split the effect into many bitmaps, and change as few of them as possible in any given frame (medium effort); see the sketch below.
Or use the Starling GPU-accelerated framework and Starling filters (GPU shaders) to achieve your effect (effort depends on how much you have already invested in your game); see a couple of examples.
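To illustrate the first idea, here is a rough, hypothetical sketch (the TILE_SIZE constant, tiles vector, and buildTiles function are mine, not from the original post): the pattern is split into a grid of small Bitmaps, so touching one region only forces a re-upload of the tiles it intersects rather than the whole image.
import flash.display.Bitmap;
import flash.display.BitmapData;
import flash.display.Sprite;
import flash.geom.Point;
import flash.geom.Rectangle;

const TILE_SIZE:int = 128;
var tiles:Vector.<Bitmap> = new Vector.<Bitmap>();

function buildTiles(source:BitmapData, container:Sprite):void
{
    for (var ty:int = 0; ty < source.height; ty += TILE_SIZE)
    {
        for (var tx:int = 0; tx < source.width; tx += TILE_SIZE)
        {
            // each tile owns its own small BitmapData, so editing it
            // only re-uploads a TILE_SIZE x TILE_SIZE texture to the GPU
            var tileData:BitmapData = new BitmapData(TILE_SIZE, TILE_SIZE, true, 0);
            tileData.copyPixels(source, new Rectangle(tx, ty, TILE_SIZE, TILE_SIZE), new Point());
            var tile:Bitmap = new Bitmap(tileData);
            tile.x = tx;
            tile.y = ty;
            container.addChild(tile);
            tiles.push(tile);
        }
    }
}
// When the brush touches the pattern, apply the per-pixel erase only to the
// tiles whose bounds intersect the brush rectangle, and leave the rest untouched.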
