I want to take audio input in my Unity application, which I am building for the Android platform. The code I have added in the Start function is as follows:
var audio = GetComponent<AudioSource>();
audio.clip = Microphone.Start("Built-in Microphone", true, 10, 44100);
audio.loop = true;
while (!(Microphone.GetPosition(null) > 0)) { } // wait until the microphone has started recording
audio.Play();
But it is showing the following error:
ArgumentException: Couldn't acquire device ID for device name Built-in Microphone
I'm referring to this post to add the microphone. How do I resolve this? Also, is there a blog available that covers doing this end to end?
The error message clearly indicates that it can't find a Microphone device named "Built-in Microphone". So you should probably see what devices it can find.
Try running the following code in the Start method and see what output you get:
foreach (var device in Microphone.devices)
{
    Debug.Log("Name: " + device);
}
Once you have the list of devices, replace "Built-in Microphone" with the name of your desired device. If "Built-in Microphone" is in the list, or you get the same error with a different device name, then you're probably dealing with a permissions issue.
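If it does turn out to be permissions, here is a minimal sketch of requesting microphone access at runtime. It assumes Unity 2018.3+ (which ships the UnityEngine.Android.Permission API); note the permission dialog is asynchronous, so a real app should wait for the grant before starting the microphone:

using UnityEngine;
using UnityEngine.Android;

public class MicCapture : MonoBehaviour
{
    void Start()
    {
        // RECORD_AUDIO must be granted at runtime on Android 6.0+.
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
            Permission.RequestUserPermission(Permission.Microphone);

        // Passing null selects the default microphone instead of a hard-coded name.
        var audio = GetComponent<AudioSource>();
        audio.clip = Microphone.Start(null, true, 10, 44100);
        audio.loop = true;
        while (!(Microphone.GetPosition(null) > 0)) { } // wait for recording to start
        audio.Play();
    }
}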
I am working with AMediaCodec and AMediaExtractor. Everything works fine on all devices (I hope), but when I run the same code on Android Q (in my case, a Pixel 2 XL) I get the error AMEDIA_ERROR_UNSUPPORTED.
Here is what I do:
bool NativeCodec::createStreamingMediaPlayer(const std::string &filename)
{
    AMediaExtractor *ex = AMediaExtractor_new();
    media_status_t err = AMediaExtractor_setDataSource(ex, filename.c_str()); // <-- here the media status I get is AMEDIA_ERROR_UNSUPPORTED
    if (err != AMEDIA_OK)
    {
        __android_log_print(ANDROID_LOG_ERROR, "ERROR", "ERROR ::: %s", std::to_string(err).c_str());
        return false;
    }
    .....
}
Maybe this is somehow connected with the privacy changes that were introduced in Android Q, but I didn't find any info about it.
How can I investigate this issue?
This seems to me to be a bug in Android 10. It appears that android:requestLegacyExternalStorage="true" does not change the situation. You may have to request <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/> in the manifest and ask for the same permission at runtime. The AMediaExtractor_setDataSource function must also be called on a thread that is attached to Java. Doing all of that correctly will allow you to make it work on other versions of Android, but not on Android 10. I've reported the issue on the Android Bug Tracker here: https://issuetracker.google.com/144837266
As per Google's answer, it seems that any app using native libraries that require file access through a path can be affected, and they are aware of the issue: https://www.youtube.com/watch?v=UnJ3amzJM94
A workaround in my case was to use AMediaExtractor_setDataSourceFd, obtaining the file descriptor at the Java level through a ContentResolver and its openFileDescriptor method.
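For illustration, a minimal sketch of the native side of that workaround. The fd and fileLength parameters are hypothetical names: the descriptor comes from contentResolver.openFileDescriptor(uri, "r") on the Java side and is passed down through JNI along with the file size:

#include <media/NdkMediaExtractor.h>

// fd: file descriptor obtained at the Java level and passed through JNI
// fileLength: size of the file in bytes
bool setSourceFromFd(AMediaExtractor *ex, int fd, off64_t fileLength)
{
    // Read the whole file: offset 0, length = file size.
    media_status_t err = AMediaExtractor_setDataSourceFd(ex, fd, 0, fileLength);
    return err == AMEDIA_OK;
}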
This is due to you not attaching the calling thread to Java. Create the thread and then attach it via AttachCurrentThread.
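As a sketch, attaching a native thread looks roughly like this (g_vm is a hypothetical global holding the JavaVM pointer cached in JNI_OnLoad):

#include <jni.h>

JavaVM *g_vm; // cached in JNI_OnLoad

void runExtractorWork()
{
    JNIEnv *env = nullptr;
    // Attach this native thread to the JVM before calling the media APIs.
    if (g_vm->AttachCurrentThread(&env, nullptr) != JNI_OK)
        return;

    // ... call AMediaExtractor_setDataSource(...) here ...

    // Detach once the thread no longer needs Java.
    g_vm->DetachCurrentThread();
}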
I'm using this NuGet package to stream an mp3 URL in a Xamarin Android project:
https://github.com/martijn00/XamarinMediaManager
I followed the instructions in the link above... it shows the music playing in the notification bar, but it is not working (no sound, and the song never even starts).
Code snippet:
clickButton.Click += (sender, args) =>
{
    ClickButtonEvent();
};

private static async void ClickButtonEvent()
{
    await CrossMediaManager.Current.Play("http://www.montemagno.com/sample.mp3");
}
I built the sample included in the link and got the same result from their sample. I also deployed to a real device: same result!
Am I missing something? Or is the library broken?
I ran into this using the Android Emulator on Hyper-V. It turns out that the network was set to internal, so http://www.montemagno.com/sample.mp3 could not be found. My workaround:
Hyper-V -> Virtual Switch Manager: add an external network.
Hyper-V -> Virtual Machines -> Settings: add new hardware -> Network Adapter, and set it to the external network.
In the "Visual Studio Emulator for Android" desktop app, launch the phone VM.
In Visual Studio, deploy and run the app.
Sound should now work from the external source.
Permissions maybe? The project site states that for Android:
You must request AccessWifiState, Internet, MediaContentControl and
WakeLock permissions
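If you prefer to declare them in C#, here is a sketch using Xamarin.Android's assembly-level attributes (equivalent to the corresponding <uses-permission> entries in AndroidManifest.xml, typically placed in Properties/AssemblyInfo.cs):

using Android.App;

// Each attribute emits a <uses-permission> entry into the generated manifest.
[assembly: UsesPermission(Android.Manifest.Permission.AccessWifiState)]
[assembly: UsesPermission(Android.Manifest.Permission.Internet)]
[assembly: UsesPermission(Android.Manifest.Permission.MediaContentControl)]
[assembly: UsesPermission(Android.Manifest.Permission.WakeLock)]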
By default, the example uses ExoPlayerAudioService. There is an issue with URL escaping in the ExoPlayerAudioService.GetSource method:
private IMediaSource GetSource(string url)
{
    string escapedUrl = Uri.EscapeDataString(url);
    var uri = Android.Net.Uri.Parse(escapedUrl);
    var factory = URLUtil.IsHttpUrl(escapedUrl) || URLUtil.IsHttpsUrl(escapedUrl) ? GetHttpFactory() : new FileDataSourceFactory();
    var extractorFactory = new DefaultExtractorsFactory();
    return new ExtractorMediaSource(uri, factory, extractorFactory, null, this);
}
Because of the line string escapedUrl = Uri.EscapeDataString(url);, a URL like http://example.com/path_to_audio.mp3 is escaped to "http%3A%2F%2Fexample.com%2Fpath_to_audio.mp3", resulting in an HTTP error. To fix it, just skip the URL escaping.
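A sketch of the patched method, simply dropping the escaping and using the raw URL everywhere escapedUrl was used (otherwise identical to the library source above):

private IMediaSource GetSource(string url)
{
    // Parse the URL as-is; Uri.EscapeDataString would mangle "http://"
    // into "http%3A%2F%2F" and break the request.
    var uri = Android.Net.Uri.Parse(url);
    var factory = URLUtil.IsHttpUrl(url) || URLUtil.IsHttpsUrl(url) ? GetHttpFactory() : new FileDataSourceFactory();
    var extractorFactory = new DefaultExtractorsFactory();
    return new ExtractorMediaSource(uri, factory, extractorFactory, null, this);
}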
Has anybody had success using the $cordovaFile and $cordovaFileTransfer plugins?
I have failed to understand them, and failed miserably at running them. My use case is to build upload and download controllers. Whenever I test via the browser, Firebug always shows File/FileTransfer is not defined. When I log them to the console, like this:
console.log($cordovaFile); or
console.log($cordovaFileTransfer); or
console.log($cordovaFileTransfer.download); or
console.log($cordovaFileTransfer.upload);
Each returns a truthy value, in the form of an {object}.
But when I call their methods with parameters, for example:
$cordovaFileTransfer.download(urlServer, fileTarget, {}, true);
the error FileTransfer is not defined appears immediately.
I tried moving the download function into a service and calling it from the controller (following the umpteenth Google search result). The result is just the same: the error above.
Because users on some forums said this should/could only be tested on a device, I finally uploaded to ionic.io and synced it via the Ionic View app on my smartphone. But the result was NOTHING.
I tried to improvise a little with the checkDir/checkFile methods, as follows:
.controller('PhotoCtrl', function($scope, $cordovaFile) {
    $scope.downpic = function() {
        $cordovaFile.checkDir("/sdcard/storage/emulated/0/").then(function(result) {
            alert("wow");
        }, function(err) {
            alert("error");
        });
    }
})
It turns out the "error" alert appears every time. I tried swapping in different directory values, as follows:
file///sdcard/storage/emulated/0/
file///storage/emulated/0/
/storage/emulated/0/
Just the same error alerts; a chain of problems. My questions:
Can an Ionic/Cordova application access internal storage? (I only have internal phone storage, no external storage.)
I was looking for information about uses-permission in AndroidManifest.xml, and the permission only covers external storage. Is there any other explanation?
Please help, I'm a real newbie.
Finally, I got a clear solution from the link below:
https://www.thepolyglotdeveloper.com/2014/09/manage-files-in-android-and-ios-using-ionicframework/
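For anyone who hits the same wall, a minimal sketch of the pattern that article describes. Two caveats: Cordova plugins such as File and FileTransfer only exist on a device or emulator, never in a desktop browser (which is exactly why Firebug reports FileTransfer is not defined), and the URL and file name below are hypothetical placeholders:

.controller('PhotoCtrl', function($scope, $ionicPlatform, $cordovaFileTransfer) {
    $scope.downpic = function() {
        // Wait until the platform (and therefore the Cordova plugins) is ready.
        $ionicPlatform.ready(function() {
            var url = "http://example.com/photo.jpg";              // hypothetical source URL
            var target = cordova.file.dataDirectory + "photo.jpg"; // app-private internal storage

            $cordovaFileTransfer.download(url, target, {}, true)
                .then(function(result) {
                    alert("saved to " + result.toURL()); // result is a FileEntry
                }, function(err) {
                    alert("error: " + JSON.stringify(err));
                });
        });
    };
})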
I've been trying to access the rear camera on an LG G4 Android phone running Chrome. I'm able to filter out the video sources from MediaStreamTrack.getSources(), but when I try to set a constraint to prefer the rear camera, I get the error TypeError: Failed to execute 'webkitGetUserMedia' on 'Navigator': Malformed constraints object. Below I have the code I'm using to filter the video sources:
if (navigator.getUserMedia) {
    if (MediaStreamTrack.getSources) {
        MediaStreamTrack.getSources(function(sourceInfos) {
            var sources = [];
            _.forEach(sourceInfos, function(info) {
                if (info.kind === 'video') {
                    sources.push(info);
                }
            });
            handleSources(sources);
        });
    }
}
Then I'm trying to select a source in the handleSources function mentioned above:
function handleSources(sources) {
    var constraints = {
        video: {
            facingMode: 'environment' // Yeah, this definitely doesn't work.
        }
    };
    getMedia(constraints); // This calls getUserMedia with the selected constraints
}
I've tried tons of different formats for the constraints object, but none of them seem to work. I know I'd be able to loop through all the sources and select the environmental camera from there, but I'd love to know how the actual syntax for this works. Googling for the answer only brings up https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia#Parameters, the syntax of which doesn't work.
It would appear as though earlier/different versions of the Android browser implement a different API for camera discovery. I have found that each phone I have access to (emulator and physical) seems to respect a different set of options. To make matters worse this seems to be an area where various documentation repositories insist on ignoring or removing the previously implemented APIs (even though we still need to know how to use them if we are going to be able to implement on anything but the newest phones).
The two major flavors of the API that I have found are the one that's currently documented (you refer to that API above) and one that was documented in a version of the WebRTC specification from October 2013. That flavor has a significantly different constraints specification that includes mandatory and optional properties. Your call above to getMedia would look like this under the older specification:
var constraints = {
    video: {
        mandatory: {
            facingMode: 'environment'
        }
    }
};
getMedia(constraints);
Alternatively, you can use optional settings, which are provided as an array instead so you can have multiple choices (these are evaluated in order):
var constraints = {
    video: {
        optional: [{
            facingMode: 'environment'
        }]
    }
};
getMedia(constraints);
That having been said, your mileage may vary when it comes to finding filters that work. For example, the facingMode filter above does not function in my Android 5.0 emulator (it doesn't throw an error but it also doesn't present the environment-facing camera); however, using a device ID does work (which looks like this when mapped to your example):
var constraints = {
    video: {
        mandatory: {
            sourceId: '<your source ID here>'
        }
    }
};
getMedia(constraints);
In the case of the Android 5.0 emulated device that I have done some testing with I am able to use MediaStreamTrack.getSources() to find the device I want (it returns the facing property with each camera). Note that the "recommended" replacement method navigator.mediaDevices.enumerateDevices() method is not present in this emulated device.
There are numerous other issues that you will see when using different emulated and physical devices, each of which has been quite problematic for me when implementing these APIs in the real world. I highly recommend using a combination of multiple physical devices (if you are in a work environment where you can get access to them), BrowserStack (to give you lots of real and emulated devices to test on), console.log(), and Vorlon.js (to view the console.log() output in real-time from all those emulated devices so you can see what's really going on).
I am working on this exact problem right now; if I find anything additional with respect to different API flavors that need supporting, I will post an update here.
If you look at the Browser Compatibility section of the MDN page you linked to you'll see:
Chrome uses an outdated constraint syntax, but the syntax described here is available through the adapter.js polyfill.
You'll be happy to know that adapter.js now supports the facingMode constraint on Chrome for Android (use https fiddle for Chrome):
var gum = mode =>
navigator.mediaDevices.getUserMedia({video: {facingMode: {exact: mode}}})
.then(stream => (video.srcObject = stream))
.catch(e => log(e));
var stop = () => video.srcObject && video.srcObject.getTracks().map(t => t.stop());
var log = msg => div.innerHTML += msg + "<br>";
<button onclick="stop();gum('user')">Front</button>
<button onclick="stop();gum('environment')">Back</button>
<div id="div"></div><br>
<video id="video" height="320" autoplay></video>
<script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>
The { exact: } syntax means the constraint is required, and things fail if the user doesn't have the right camera. If you leave it out then the constraint is optional (though Firefox for Android will let users override the choice in the camera chooser in the permission prompt in that case).
adapter.js also supports navigator.mediaDevices.enumerateDevices(), which has replaced MediaStreamTrack.getSources.
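For completeness, a short sketch of that replacement (reusing the log helper from the snippet above):

navigator.mediaDevices.enumerateDevices()
  .then(devices => {
    // Keep only cameras; each entry exposes deviceId, kind and label.
    devices.filter(d => d.kind === 'videoinput')
           .forEach(d => log(d.label || d.deviceId));
  })
  .catch(e => log(e));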
I am trying to put a movie texture onto a cube screen in Unity. This is my code to make it play:
#pragma strict

var movTex : MovieTexture;

function Start () {
    renderer.material.mainTexture = movTex;
    movTex.Play();
}

function Update () {
}
When I try to build it for my Android device, I get a build error:
Assets/Scripts/Movie.js(3,14): BCE0018: The name 'MovieTexture' does not denote a valid type ('not found').
Does anyone know what may be wrong?
You are trying to use something that doesn't work on Android. If you look up MovieTexture in the Unity docs, you find:
http://docs.unity3d.com/Manual/class-MovieTexture.html
Movie Textures are not supported on Android. Instead, full-screen streaming playback is provided using Handheld.PlayFullScreenMovie.
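As a minimal sketch of that alternative (assuming a video file, hypothetically named movie.mp4, has been copied into the project's Assets/StreamingAssets folder):

#pragma strict

function OnMouseDown () {
    // Streams the movie full-screen; returns when playback finishes
    // or the user cancels.
    Handheld.PlayFullScreenMovie("movie.mp4", Color.black, FullScreenMovieControlMode.Full);
}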