Record Sound in Android using ActionScript 3

I built an application that records sound on the desktop using ActionScript 3. I have now converted it to an Android application, but there is a problem: the SampleDataEvent.SAMPLE_DATA event doesn't receive any data to record.
Here is the code:
private var _microphone:Microphone;
private var _buffer:ByteArray = new ByteArray();
private var _difference:uint;

public function record():void
{
    if ( _microphone == null )
        _microphone = Microphone.getMicrophone();
    _difference = getTimer();
    // _silenceLevel, _timeOut, _gain, _rate, _recordingEvent and soundBytes are class members defined elsewhere
    _microphone.setSilenceLevel(_silenceLevel, _timeOut);
    _microphone.gain = _gain;
    _microphone.rate = _rate;
    _buffer.length = 0;
    _microphone.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
    _microphone.addEventListener(StatusEvent.STATUS, onStatus);
}

private function onSampleData(event:SampleDataEvent):void
{
    _recordingEvent.time = getTimer() - _difference;
    dispatchEvent( _recordingEvent );
    var byteData:Number;
    while ( event.data.bytesAvailable > 0 )
    {
        byteData = event.data.readFloat();
        _buffer.writeFloat( byteData );
        soundBytes.writeFloat( byteData );
    }
}
Can anyone help here?
Thanks.

I think you may not have checked the AIR for Android settings. If the RECORD_AUDIO permission is not enabled, you should enable it; without it the microphone will not deliver sample data on Android.
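A minimal sketch of the corresponding entry in the AIR application descriptor (standard AIR-for-Android descriptor markup; the rest of the descriptor is omitted):

<android>
    <manifestAdditions><![CDATA[
        <manifest>
            <uses-permission android:name="android.permission.RECORD_AUDIO"/>
        </manifest>
    ]]></manifestAdditions>
</android>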

Related

Run game local (file:///) Construct 2

I have a question like the one asked in this link: Solution: Run game local (file:///) Construct 2
I did the third step of the solution. I added
self.loadProject(FULL_CONTENT_INSIDE_MY_DATA.JS); return;
just after
xhr.open("GET", datajs_filename, true);
var supportsJsonResponse = false;
in c2runtime.js. But when I test the game in my Android app, I only see the first page of Construct; the game does not load, and the first Construct page remains as a static image.
A piece of my code:
Runtime.prototype.requestProjectData = function ()
{
    var self = this;
    var xhr;
    if (this.isWindowsPhone8)
        xhr = new ActiveXObject("Microsoft.XMLHTTP");
    else
        xhr = new XMLHttpRequest();
    var datajs_filename = "data.js";
    if (this.isWindows8App || this.isWindowsPhone8 || this.isWindowsPhone81 || this.isWindows10)
        datajs_filename = "data.json";
    //xhr.open("GET", datajs_filename, true); <-- I commented this line
    var supportsJsonResponse = false;
    self.loadProject(FULL_CONTENT_INSIDE_MY_DATA.JS); return;
Just run it through a localhost server; use WAMP or XAMPP.
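Any static file server will do; as a quick sketch (assuming Python 3 is installed), run the following from the exported project folder and then open http://localhost:8000 in the browser:

python -m http.server 8000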

Web Audio Api biquadFilter in Android needs extra configuration?

Here it says the Web Audio API works in Chrome for Android, and here I have tested CM Browser, Chrome and the default CyanogenMod browser on Android 5.1.1; all of them pass the tests (especially the BiquadFilterNode one).
But when I open this CodePen with an EQ (BiquadFilterNode), I can hear the music but the EQ has no effect.
Does BiquadFilterNode work on Android? Is any special implementation needed?
The CodePen code (required to post):
var context = new AudioContext();
var mediaElement = document.getElementById('player');
var sourceNode = context.createMediaElementSource(mediaElement);
// EQ Properties
//
var gainDb = -40.0;
var bandSplit = [360,3600];
var hBand = context.createBiquadFilter();
hBand.type = "lowshelf";
hBand.frequency.value = bandSplit[0];
hBand.gain.value = gainDb;
var hInvert = context.createGain();
hInvert.gain.value = -1.0;
var mBand = context.createGain();
var lBand = context.createBiquadFilter();
lBand.type = "highshelf";
lBand.frequency.value = bandSplit[1];
lBand.gain.value = gainDb;
var lInvert = context.createGain();
lInvert.gain.value = -1.0;
sourceNode.connect(lBand);
sourceNode.connect(mBand);
sourceNode.connect(hBand);
hBand.connect(hInvert);
lBand.connect(lInvert);
hInvert.connect(mBand);
lInvert.connect(mBand);
var lGain = context.createGain();
var mGain = context.createGain();
var hGain = context.createGain();
lBand.connect(lGain);
mBand.connect(mGain);
hBand.connect(hGain);
var sum = context.createGain();
lGain.connect(sum);
mGain.connect(sum);
hGain.connect(sum);
sum.connect(context.destination);
// Input
//
function changeGain(string, type)
{
    var value = parseFloat(string) / 100.0;
    switch (type)
    {
        case 'lowGain': lGain.gain.value = value; break;
        case 'midGain': mGain.gain.value = value; break;
        case 'highGain': hGain.gain.value = value; break;
    }
}
createMediaElementSource in Chrome on Android doesn't work in general. But if you have a recent build of Chrome (49 and later?), you can go to chrome://flags and enable the unified media pipeline option. That will make createMediaElementSource work like on desktop.
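If enabling that flag is not an option, one common workaround (a sketch, not part of the original answer) is to bypass the media element entirely and feed the same EQ graph from an AudioBufferSourceNode; the file name 'music.mp3' below is a hypothetical placeholder, and the URL must be same-origin or CORS-enabled:

var request = new XMLHttpRequest();
request.open('GET', 'music.mp3', true); // hypothetical URL of the track
request.responseType = 'arraybuffer';
request.onload = function() {
    context.decodeAudioData(request.response, function(buffer) {
        var bufferSource = context.createBufferSource();
        bufferSource.buffer = buffer;
        // connect to the same EQ inputs used by sourceNode above
        bufferSource.connect(lBand);
        bufferSource.connect(mBand);
        bufferSource.connect(hBand);
        bufferSource.start(0);
    });
};
request.send();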

Display bug with Front Camera, Video and VideoDisplay in Adobe AIR (Flex 4.6.0) for Android

I'm currently working with AS3 and Flex 4.6 to create an Android application.
I'm using the front camera and attaching it to a local Video object that I add as a child to a VideoDisplay object.
When I debug on my computer everything works perfectly, but when I build the project and run it on my Android device, my local video display becomes a gray grid.
As an example I took a picture of the device.
I wrote this method, based on a post here on Stack Overflow, to initialize the front and back cameras.
private function InitCamera():void {
    var CamCount:int = ( Camera.isSupported ) ? Camera.names.length : 0;
    for ( var i:int = 0; i < CamCount; i++ ) {
        var cam:Camera = Camera.getCamera( String( i ) );
        if ( cam ) {
            if ( cam.position == CameraPosition.FRONT ) {
                CamFront = cam;
                continue;
            }
            if ( cam.position == CameraPosition.BACK ) {
                CamBack = cam;
                continue;
            }
            if ( cam.position == CameraPosition.UNKNOWN ) {
                CamFront = cam;
                continue;
            }
        }
    }
}
And I wrote this method to create a Video object, attach the front camera to it, and add the Video as a child to a VideoDisplay:
private function SetUpLocalVideo():void {
    Debug( "Setting up local video" );
    LocalVideo = new Video( this.LVideo.width, this.LVideo.height );
    LocalVideo.attachCamera( CamFront );
    LVideo.addChild( LocalVideo ); // <-- LVideo is the VideoDisplay
}
I've been searching the internet for a solution, but so far I have failed to find any.
Has anyone else had this problem before? Can you share your solution with me?
I appreciate the help.
Thanks.
Set the render mode to direct in your application.xml:
<renderMode>direct</renderMode>
If it still doesn't work, change the DPI setting of your main Flex application to 240.
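For reference, a minimal sketch of where the render mode goes in the AIR application descriptor (inside initialWindow; the rest of the descriptor is omitted), and of the DPI setting on a Spark Application root (assuming applicationDPI is the setting the answer refers to):

<initialWindow>
    <renderMode>direct</renderMode>
</initialWindow>

<s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
               xmlns:s="library://ns.adobe.com/flex/spark"
               applicationDPI="240">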

Pull to refresh in Titanium for Android

How do I implement pull to refresh?
In Titanium Appcelerator I need to show a list of content in a TableView. When I pull the view down, it needs to update. On iPhone this works, but on Android it doesn't. Please, can anyone help solve this problem on Android?
My Android code:
tableView.addEventListener('scroll', function(e)
{
    var offset = e.contentOffset.y;
    if (offset < -65.0 && !pulling && !reloading)
    {
        var t = Ti.UI.create2DMatrix();
        t = t.rotate(-180);
        pulling = true;
        arrow.animate({transform:t, duration:180});
        statusLabel.text = "Release to refresh...";
    }
    else if ((offset > -65.0 && offset < 0) && pulling && !reloading)
    {
        pulling = false;
        var t = Ti.UI.create2DMatrix();
        arrow.animate({transform:t, duration:180});
        statusLabel.text = "Pull down to refresh...";
    }
});
tableView.addEventListener('dragEnd', function(e)
{
    if (pulling && !reloading)
    {
        reloading = true;
        pulling = false;
        arrow.hide();
        actInd.show();
        statusLabel.text = "Reloading...";
        tableView.setContentInsets({top:60}, {animated:true});
        tableView.scrollToTop(-60, true);
        arrow.transform = Ti.UI.create2DMatrix();
        beginReloading();
    }
});
Titanium now supports pull to refresh for BOTH Android (> v6.2.0) and iOS (>3.2.0) with a Titanium.UI.TableView, Titanium.UI.ListView or Titanium.UI.ScrollView object.
See the docs:
https://docs.appcelerator.com/platform/latest/#!/api/Titanium.UI.ListView
https://docs.appcelerator.com/platform/latest/#!/api/Titanium.UI.RefreshControl
Sample code taken from the docs:
var win = Ti.UI.createWindow({
    fullscreen: true
});
var counter = 0;
function genData() {
    var data = [];
    for (var i = 1; i <= 3; i++) {
        data.push({properties: {title: 'ROW ' + (counter + i)}});
    }
    counter += 3;
    return data;
}
var section = Ti.UI.createListSection();
section.setItems(genData());
var control = Ti.UI.createRefreshControl({
    tintColor: 'red'
});
var listView = Ti.UI.createListView({
    sections: [section],
    refreshControl: control
});
control.addEventListener('refreshstart', function(e) {
    Ti.API.info('refreshstart');
    setTimeout(function() {
        Ti.API.debug('Timeout');
        section.appendItems(genData());
        control.endRefreshing();
    }, 2000);
});
win.add(listView);
win.open();
Is this just the iOS code from the Kitchen Sink example?
There are a couple of attempts at getting this working on Android, though I haven't confirmed that any of them work as expected. From what I understand, the problem is that you can't get the offset the same way on Android as on iOS.
A quick Google search turned up this link, which was referenced from the official Appcelerator forums.
https://gist.github.com/903895

how to use front camera using flex4

I have developed a simple camera application for Android using Flex 4. The problem is that when I run the application it uses the rear camera, not the front camera. How can I change the camera? I need to use the front camera for this application; kindly help me.
var camera:Camera = Camera.getCamera(cameraIndex.toString());
if (camera)
{
    var ui:UIComponent = new UIComponent();
    var localVideoDisplay:Video = new Video(180, 135);
    localVideoDisplay.attachCamera(camera);
    ui.addChild(localVideoDisplay);
    cameraGroup.addChild(ui);
}
This is the code I have used in my application.
Try:
function getCamera( position:String ):Camera
{
    var camera:Camera;
    var cameraCount:uint = Camera.names.length;
    for ( var i:uint = 0; i < cameraCount; ++i )
    {
        camera = Camera.getCamera( String(i) );
        if ( camera.position == position )
            return camera;
    }
    return Camera.getCamera();
}
Use getCamera(CameraPosition.FRONT)
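A quick usage sketch of the helper above (assuming flash.media.CameraPosition is imported and cameraGroup is the container from the question):

var frontCamera:Camera = getCamera( CameraPosition.FRONT );
if ( frontCamera )
{
    var ui:UIComponent = new UIComponent();
    var video:Video = new Video( 180, 135 );
    video.attachCamera( frontCamera );
    ui.addChild( video );
    cameraGroup.addChild( ui );
}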
