I'm building an app that requires Spotify on Android, but when I call libspotify's "sp_session_create" I get a SIGSEGV. The version of libspotify I'm using is v12.1.51 BETA for Android ARM.
sp_error initialise(const char * asKey,
                    const char * asFolder,
                    const char * asUserAgent,
                    bool isPlaylistCompressed,
                    bool isMetadataOnPlaylist,
                    bool isPlaylistInitialUnload)
{
    sp_session_callbacks asCallbacks;
    memset(&asCallbacks, 0, sizeof(asCallbacks));
    asCallbacks.logged_in = Callback::onSessionLogin;
    asCallbacks.log_message = Callback::onSessionLog;

    sp_session_config asConfiguration;
    memset(&asConfiguration, 0, sizeof(asConfiguration));

    std::string asDirectory = asFolder;
    auto asDirectoryCache = asDirectory + "/cache";
    auto asDirectorySetting = asDirectory + "/setting";

    const auto asApplicationKey = "Key goes here...";

    asConfiguration.api_version = SPOTIFY_API_VERSION;
    asConfiguration.application_key = asApplicationKey;
    asConfiguration.application_key_size = sizeof(asApplicationKey);
    asConfiguration.cache_location = asDirectoryCache.c_str();
    asConfiguration.settings_location = asDirectorySetting.c_str();
    asConfiguration.user_agent = asUserAgent;
    asConfiguration.compress_playlists = isPlaylistCompressed;
    asConfiguration.dont_save_metadata_for_playlists = isMetadataOnPlaylist;
    asConfiguration.initially_unload_playlists = isPlaylistInitialUnload;
    asConfiguration.userdata = this;
    asConfiguration.callbacks = &asCallbacks;

    sp_error asError = sp_session_create(&asConfiguration, &_asSession);
    __android_log_print(ANDROID_LOG_VERBOSE, LIBRARY_NAME, "PASSED");
    return asError;
}
The crash was due to the cache folder.
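If the directory passed as cache_location does not exist yet, one fix is to create it (and the settings directory) before calling sp_session_create. A minimal sketch, assuming a POSIX mkdir is available in the NDK build; this helper is not part of the original code:

#include <cerrno>
#include <string>
#include <sys/stat.h>
#include <sys/types.h>

// Hypothetical helper: create a directory if it is missing,
// treating "already exists" as success.
static bool ensureDirectory(const std::string &path)
{
    return mkdir(path.c_str(), 0700) == 0 || errno == EEXIST;
}

// Before sp_session_create:
//   ensureDirectory(asDirectoryCache);
//   ensureDirectory(asDirectorySetting);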
Edit: Spotify's new Android SDK is released! You should strongly consider moving your project to the new SDK, since libspotify is now deprecated for that platform.
I am using the OpenAI client with Android Kotlin (implementation com.aallam.openai:openai-client:2.1.3).
Is the path wrong, or is a library missing?
val imgURL = Uri.parse("android.resource://" + packageName + "/" + R.drawable.face3)

try {
    val images = openAI.image(
        edit = ImageEditURL( // or 'ImageEditJSON'
            image = FilePath(imgURL.toString()), // <-
            mask = FilePath(imgURL.toString()), // <-
            prompt = "a sunlit indoor lounge area with a pool containing a flamingo",
            n = 1,
            size = ImageSize.is1024x1024
        )
    )
} catch (e: Exception) {
    println("error is here: " + e)
}
As you can see, it wants a path from me, but it does not succeed even though I give it the path.
I would suggest updating to version 3 of openai-kotlin and using Okio's Source to provide the files.
Assuming the images are in the res/raw folder, your example would be something like this:
val request = ImageEdit(
    image = FileSource(
        name = "image.png",
        source = resources.openRawResource(R.raw.image).source()
    ),
    mask = FileSource(
        name = "mask.png",
        source = resources.openRawResource(R.raw.mask).source()
    ),
    prompt = "a sunlit indoor lounge area with a pool containing a flamingo",
    n = 1,
    size = ImageSize.is1024x1024,
)
val response = client.imageURL(request)
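Since the openai-kotlin calls are suspend functions, the request has to run inside a coroutine. A minimal sketch of a call site in an Activity (lifecycleScope comes from the androidx.lifecycle:lifecycle-runtime-ktx artifact; the url field is what I recall the returned ImageURL objects exposing):

lifecycleScope.launch {
    val images = client.imageURL(request)  // suspend call
    images.forEach { println(it.url) }     // each result carries the edited image URL
}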
I'm building an Android app using Xamarin. The app needs to capture video from the camera and encode it so it can be sent to a server.
Initially, I was using an encoder library on the server side to encode the recorded video, but it proved extremely unreliable and inefficient, especially for large video files. I have posted my issues in another thread here.
I then decided to encode the video on the client side and then send it to the server. I've found encoding to be a bit complicated, and there isn't much information available on how it can be done, so I searched for the only way I knew to encode video: the FFmpeg codec. I've found some solutions. There's a project on GitHub that demonstrates how FFmpeg is used inside a Xamarin Android project. However, running the solution doesn't give any output. The project has an FFmpeg binary which is installed to the phone directory using the code below:
_ffmpegBin = InstallBinary(XamarinAndroidFFmpeg.Resource.Raw.ffmpeg, "ffmpeg", false);
Below is the example code for encoding video into a different set of outputs:
_workingDirectory = Android.OS.Environment.ExternalStorageDirectory.AbsolutePath;
var sourceMp4 = "cat1.mp4";
var destinationPathAndFilename = System.IO.Path.Combine (_workingDirectory, "cat1_out.mp4");
var destinationPathAndFilename2 = System.IO.Path.Combine (_workingDirectory, "cat1_out2.mp4");
var destinationPathAndFilename4 = System.IO.Path.Combine (_workingDirectory, "cat1_out4.wav");
if (File.Exists (destinationPathAndFilename))
File.Delete (destinationPathAndFilename);
CreateSampleFile(Resource.Raw.cat1, _workingDirectory, sourceMp4);
var ffmpeg = new FFMpeg (this, _workingDirectory);
var sourceClip = new Clip (System.IO.Path.Combine(_workingDirectory, sourceMp4));
var result = ffmpeg.GetInfo (sourceClip);
var br = System.Environment.NewLine;
// There are callbacks based on Standard Output and Standard Error when ffmpeg binary is running as a process:
var onComplete = new MyCommand ((_) => {
RunOnUiThread(() =>_logView.Append("DONE!" + br + br));
});
var onMessage = new MyCommand ((message) => {
RunOnUiThread(() =>_logView.Append(message + br + br));
});
var callbacks = new FFMpegCallbacks (onComplete, onMessage);
// 1. The idea of this first test is to show that video editing is possible via FFmpeg:
// It results in a 150x150 movie that eventually zooms on a cat ear. This is desaturated, and there's a fade-in.
var filters = new List<VideoFilter> ();
filters.Add (new FadeVideoFilter ("in", 0, 100));
filters.Add(new CropVideoFilter("150","150","0","0"));
filters.Add(new ColorVideoFilter(1.0m, 1.0m, 0.0m, 0.5m, 1.0m, 1.0m, 1.0m, 1.0m));
var outputClip = new Clip (destinationPathAndFilename) { videoFilter = VideoFilter.Build (filters) };
outputClip.H264_CRF = "18"; // It's the quality coefficient for H264 - Default is 28. I think 18 is pretty good.
ffmpeg.ProcessVideo(sourceClip, outputClip, true, new FFMpegCallbacks(onComplete, onMessage));
//2. This is a similar version in command line only:
string[] cmds = new string[] {
"-y",
"-i",
sourceClip.path,
"-strict",
"-2",
"-vf",
"mp=eq2=1:1.68:0.3:1.25:1:0.96:1",
destinationPathAndFilename2,
"-acodec",
"copy",
};
ffmpeg.Execute (cmds, callbacks);
// 3. This lists codecs:
string[] cmds3 = new string[] {
"-codecs",
};
ffmpeg.Execute (cmds3, callbacks);
// 4. This converts to WAV.
// Note that the cat movie just has some silent house noise.
ffmpeg.ConvertToWaveAudio(sourceClip, destinationPathAndFilename4, 44100, 2, callbacks, true);
I have tried different commands, but no output file is generated. I have also tried another project found here, but it has the same issue: I don't get any errors, yet no output file is generated. I'm really hoping someone can help me find a way to use FFmpeg in my project, or some other way to compress video before transporting it to the server.
I would really appreciate it if someone could point me in the right direction.
Just figured out how to get the output: add the permission to the AndroidManifest file.
android.permission.WRITE_EXTERNAL_STORAGE
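In AndroidManifest.xml that is the usual uses-permission entry; note that on Android 6.0+ the storage permission also has to be requested at runtime:

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />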
Please read the update on the repository; it says that there is a second package, Xamarin.Android.MP4Transcoder, for Android 6.0 onwards.
Install NuGet https://www.nuget.org/packages/Xamarin.Android.MP4Transcoder/
await Xamarin.MP4Transcoder.Transcoder
    .For720pFormat()
    .ConvertAsync(inputFile, outputFile, f => {
        onProgress?.Invoke((int)(f * (double)100), 100);
    });
return outputFile;
For previous Android versions
Source code: https://github.com/neurospeech/xamarin-android-ffmpeg
Install-Package Xamarin.Android.FFmpeg
Use this as a template; it lets you log the output as well as calculate progress.
You can take a look at the source; it downloads ffmpeg and verifies its SHA1 hash on first use.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Android.Content;
using Android.Media;
using Java.IO;

public class VideoConverter
{
    public VideoConverter()
    {
    }

    public async Task<File> ConvertFile(Context context,
        File inputFile,
        Action<string> logger = null,
        Action<int, int> onProgress = null)
    {
        File outputFile = new File(inputFile.CanonicalPath + ".mpg");
        outputFile.DeleteOnExit();

        List<string> cmd = new List<string>();
        cmd.Add("-y");
        cmd.Add("-i");
        cmd.Add(inputFile.CanonicalPath);

        // Read the rotation flag so the output can be scaled/rotated accordingly.
        MediaMetadataRetriever m = new MediaMetadataRetriever();
        m.SetDataSource(inputFile.CanonicalPath);
        string rotate = m.ExtractMetadata(Android.Media.MetadataKey.VideoRotation);
        int r = 0;
        if (!string.IsNullOrWhiteSpace(rotate))
        {
            r = int.Parse(rotate);
        }

        cmd.Add("-b:v");
        cmd.Add("1M");
        cmd.Add("-b:a");
        cmd.Add("128k");

        switch (r)
        {
            case 270:
                cmd.Add("-vf scale=-1:480,transpose=cclock");
                break;
            case 180:
                cmd.Add("-vf scale=-1:480,transpose=cclock,transpose=cclock");
                break;
            case 90:
                cmd.Add("-vf scale=480:-1,transpose=clock");
                break;
            case 0:
                cmd.Add("-vf scale=-1:480");
                break;
            default:
                break;
        }

        cmd.Add("-f");
        cmd.Add("mpeg");
        cmd.Add(outputFile.CanonicalPath);

        string cmdParams = string.Join(" ", cmd);

        int total = 0;
        int current = 0;

        await FFMpeg.Xamarin.FFMpegLibrary.Run(
            context,
            cmdParams,
            (s) =>
            {
                logger?.Invoke(s);

                // "Duration:" lines give the total length, "time=" lines the current position.
                int n = Extract(s, "Duration:", ",");
                if (n != -1)
                {
                    total = n;
                }
                n = Extract(s, "time=", " bitrate=");
                if (n != -1)
                {
                    current = n;
                    onProgress?.Invoke(current, total);
                }
            });

        return outputFile;
    }

    int Extract(String text, String start, String end)
    {
        int i = text.IndexOf(start);
        if (i != -1)
        {
            text = text.Substring(i + start.Length);
            i = text.IndexOf(end);
            if (i != -1)
            {
                text = text.Substring(0, i);
                return parseTime(text);
            }
        }
        return -1;
    }

    public static int parseTime(String time)
    {
        // Convert "hh:mm:ss.xx" into hundredths of a second.
        time = time.Trim();
        String[] tokens = time.Split(':');
        int hours = int.Parse(tokens[0]);
        int minutes = int.Parse(tokens[1]);
        float seconds = float.Parse(tokens[2]);
        int s = (int)(seconds * 100);
        return hours * 360000 + minutes * 6000 + s;
    }
}
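A hypothetical call site for the converter above (not from the original answer); it must itself run inside an async method, and videoPath is a placeholder for your recorded file:

var converter = new VideoConverter();
var input = new Java.IO.File(videoPath);   // videoPath: path to the recorded video
var output = await converter.ConvertFile(
    this,                                   // the current Activity/Context
    input,
    logger: line => System.Diagnostics.Debug.WriteLine(line),
    onProgress: (current, total) => System.Diagnostics.Debug.WriteLine($"{current}/{total}"));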
Here it says the Web Audio API works in Chrome for Android, and here I have tested CM Browser, Chrome, and the default CyanogenMod Android 5.1.1 browser, and all pass the tests (especially the BiquadFilterNode one).
But when I open this CodePen with an EQ (BiquadFilterNode), I can hear the music but the EQ has no effect.
Does BiquadFilterNode work on Android? Is any special implementation needed?
Code from the pen (required to post):
var context = new AudioContext();
var mediaElement = document.getElementById('player');
var sourceNode = context.createMediaElementSource(mediaElement);
// EQ Properties
//
var gainDb = -40.0;
var bandSplit = [360,3600];
var hBand = context.createBiquadFilter();
hBand.type = "lowshelf";
hBand.frequency.value = bandSplit[0];
hBand.gain.value = gainDb;
var hInvert = context.createGain();
hInvert.gain.value = -1.0;
var mBand = context.createGain();
var lBand = context.createBiquadFilter();
lBand.type = "highshelf";
lBand.frequency.value = bandSplit[1];
lBand.gain.value = gainDb;
var lInvert = context.createGain();
lInvert.gain.value = -1.0;
sourceNode.connect(lBand);
sourceNode.connect(mBand);
sourceNode.connect(hBand);
hBand.connect(hInvert);
lBand.connect(lInvert);
hInvert.connect(mBand);
lInvert.connect(mBand);
var lGain = context.createGain();
var mGain = context.createGain();
var hGain = context.createGain();
lBand.connect(lGain);
mBand.connect(mGain);
hBand.connect(hGain);
var sum = context.createGain();
lGain.connect(sum);
mGain.connect(sum);
hGain.connect(sum);
sum.connect(context.destination);
// Input
//
function changeGain(string, type)
{
  var value = parseFloat(string) / 100.0;

  switch (type)
  {
    case 'lowGain': lGain.gain.value = value; break;
    case 'midGain': mGain.gain.value = value; break;
    case 'highGain': hGain.gain.value = value; break;
  }
}
createMediaElementSource in Chrome on Android doesn't work in general. But if you have a recent build of Chrome (49 or later?), you can go to chrome://flags and enable the unified media pipeline option. That will make createMediaElementSource work like on desktop.
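As a workaround (not part of the answer above), you can avoid createMediaElementSource entirely by fetching and decoding the audio yourself and feeding an AudioBufferSourceNode into the same EQ graph. A rough sketch reusing the lBand/mBand/hBand nodes from the pen ('audio.mp3' is a placeholder URL):

fetch('audio.mp3')
  .then(function (res) { return res.arrayBuffer(); })
  .then(function (buf) { return context.decodeAudioData(buf); })
  .then(function (decoded) {
    var bufferSource = context.createBufferSource();
    bufferSource.buffer = decoded;
    // Feed the decoded audio into the same three EQ branches.
    bufferSource.connect(lBand);
    bufferSource.connect(mBand);
    bufferSource.connect(hBand);
    bufferSource.start();
  });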
I am wondering how to adjust the standard user agent in my HTTP requests. I am using the Volley library, and I KNOW how to:
set a new user agent
retrieve the default user agent as a string (e.g. "Dalvik/1.6.0 (Linux; U; Android 4.0.2; sdk Build/ICS_MR0") => System.getProperty("http.agent")
What I DON'T know is:
how to get the single elements this user agent is built from, so I can replace only the string "Dalvik/1.6.0" with a custom string.
Is that possible, or do I have to do a string replacement?
Thx
In order to set the user agent globally for all requests sent via Volley, here is my solution:
When you are initializing the Volley request queue, instead of using the convenience method Volley.newRequestQueue(Context), use the following snippet:
private RequestQueue makeRequestQueue(Context context) {
    DiskBasedCache cache = new DiskBasedCache(new File(context.getCacheDir(), DEFAULT_CACHE_DIR), DISK_CACHE_SIZE);
    BasicNetwork network = new BasicNetwork(new MyHurlStack());
    RequestQueue queue = new RequestQueue(cache, network);
    queue.start();
    return queue;
}

public static class MyHurlStack extends HurlStack {
    @Override
    public HttpResponse executeRequest(Request<?> request, Map<String, String> additionalHeaders) throws IOException, AuthFailureError {
        if (additionalHeaders == null || Collections.emptyMap().equals(additionalHeaders)) {
            additionalHeaders = new HashMap<>();
        }
        additionalHeaders.put("User-Agent", "test_user_agent_in_volley");
        return super.executeRequest(request, additionalHeaders);
    }
}
This solution assumes you are targeting API level >= 9, so we use the HurlStack.
The reason this works is that in the HurlStack.executeRequest(Request<?> request, Map<String, String> additionalHeaders) method, whatever you add to additionalHeaders is later added as a request property on the HttpUrlConnection, as in connection.addRequestProperty(headerName, map.get(headerName));
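If you only need the custom user agent on individual requests, an alternative (not part of the solution above) is to override getHeaders() on the request itself; url, listener, and errorListener are placeholders here:

StringRequest request = new StringRequest(Request.Method.GET, url, listener, errorListener) {
    @Override
    public Map<String, String> getHeaders() throws AuthFailureError {
        Map<String, String> headers = new HashMap<>();
        headers.put("User-Agent", "test_user_agent_in_volley");
        return headers;
    }
};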
Yes,
Build.FINGERPRINT contains all the information you need:
https://developer.android.com/reference/android/os/Build.html
To get the individual parts, use the individual constant strings.
For detailed OS version information, use Build.VERSION.
import android.app.Activity;
import android.os.Build;
import android.os.Bundle;
import android.util.Log;

public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        Log.i("Build", "BOARD = " + Build.BOARD);
        Log.i("Build", "BOOTLOADER = " + Build.BOOTLOADER);
        Log.i("Build", "BRAND = " + Build.BRAND);
        Log.i("Build", "CPU_ABI = " + Build.CPU_ABI);
        Log.i("Build", "CPU_ABI2 = " + Build.CPU_ABI2);
        Log.i("Build", "DEVICE = " + Build.DEVICE);
        Log.i("Build", "DISPLAY = " + Build.DISPLAY);
        Log.i("Build", "FINGERPRINT = " + Build.FINGERPRINT);
        Log.i("Build", "HARDWARE = " + Build.HARDWARE);
        Log.i("Build", "HOST = " + Build.HOST);
        Log.i("Build", "ID = " + Build.ID);
        Log.i("Build", "MANUFACTURER = " + Build.MANUFACTURER);
        Log.i("Build", "MODEL = " + Build.MODEL);
        Log.i("Build", "PRODUCT = " + Build.PRODUCT);
        Log.i("Build", "RADIO = " + Build.RADIO);
        Log.i("Build", "SERIAL = " + Build.SERIAL);
        Log.i("Build", "TAGS = " + Build.TAGS);
        Log.i("Build", "TYPE = " + Build.TYPE);
        Log.i("Build", "USER = " + Build.USER);
        Log.i("Build", "BASE_OS = " + Build.VERSION.BASE_OS);
        Log.i("Build", "CODENAME = " + Build.VERSION.CODENAME);
        Log.i("Build", "INCREMENTAL = " + Build.VERSION.INCREMENTAL);
        Log.i("Build", "RELEASE = " + Build.VERSION.RELEASE);
        Log.i("Build", "SDK = " + Build.VERSION.SDK);
        Log.i("Build", "SECURITY_PATCH = " + Build.VERSION.SECURITY_PATCH);
        Log.i("Build", Build.FINGERPRINT);
    }
}
System.getProperty("http.agent") returns something like:
Dalvik/2.1.0 (Linux; U; Android 9; Android SDK built for x86 Build/PSR1.180720.075)
It's possible to build all the parts of this using a combination of android.os.Build and java.lang.System.getProperty().
This is an example of what's in android.os.Build running on an emulator:
Build.BOARD = "goldfish_x86"
Build.BOOTLOADER = "unknown"
Build.BRAND = "google"
Build.DEVICE = "generic_x86"
Build.DISPLAY = "sdk_gphone_x86-userdebug 9 PSR1.180720.075 5124027 dev-keys"
Build.FINGERPRINT = "google/sdk_gphone_x86/generic_x86:9/PSR1.180720.075/5124027:userdebug/dev-keys"
Build.HARDWARE = "ranchu"
Build.HOST = "abfarm904"
Build.ID = "PSR1.180720.075"
Build.MANUFACTURER = "Google"
Build.MODEL = "Android SDK built for x86"
Build.PRODUCT = "sdk_gphone_x86"
Build.SUPPORTED_32_BIT_ABIS = {"x86"}
Build.SUPPORTED_64_BIT_ABIS = {}
Build.SUPPORTED_ABIS = {"x86"}
Build.TAGS = "dev-keys"
Build.TIME = 1541887073000
Build.TYPE = "userdebug"
Build.USER = "android-build"
Build.UNKNOWN = "unknown"
Build.VERSION.BASE_OS = ""
Build.VERSION.CODENAME = "REL"
Build.VERSION.INCREMENTAL = "5124027"
Build.VERSION.PREVIEW_SDK_INT = 0
Build.VERSION.RELEASE = "9"
Build.VERSION.SDK_INT = 28
Build.VERSION.SECURITY_PATCH = "2018-09-05"
These properties are always provided by the Dalvik VM, according to Google's documentation:
file.separator = /
java.class.path = .
java.class.version = 50.0
java.compiler = Empty
java.ext.dirs = Empty
java.home = /system
java.io.tmpdir = /sdcard
java.library.path = /vendor/lib:/system/lib
java.vendor = The Android Project
java.vendor.url = http://www.android.com/
java.version = 0
java.specification.version = 0.9
java.specification.vendor = The Android Project
java.specification.name = Dalvik Core Library
java.vm.version = 1.2.0
java.vm.vendor = The Android Project
java.vm.name = Dalvik
java.vm.specification.version = 0.9
java.vm.specification.vendor = The Android Project
java.vm.specification.name = Dalvik Virtual Machine Specification
line.separator = \n
os.arch = armv7l
os.name = Linux
os.version = 2.6.32.9-g103d848
path.separator = :
user.dir = /
user.home = Empty
user.name = Empty
So, the default user agent appears to be composed of the following parts (a sketch putting them back together follows the list):
System.getProperty("java.vm.name") // Dalvik
System.getProperty("java.vm.version") // 2.1.0
System.getProperty("os.name") // Linux
"U" // not sure where to get this
"Android" // or this, probably safe to hard code though
Build.VERSION.RELEASE // 9
Build.MODEL // Android SDK built for x86
Build.ID // PSR1.180720.075
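For example, a rough reconstruction in Java (the "U" and "Android" tokens are hard-coded, as noted above):

String userAgent = String.format("%s/%s (%s; U; Android %s; %s Build/%s)",
        System.getProperty("java.vm.name"),      // Dalvik
        System.getProperty("java.vm.version"),   // 2.1.0
        System.getProperty("os.name"),           // Linux
        Build.VERSION.RELEASE,                   // 9
        Build.MODEL,                             // Android SDK built for x86
        Build.ID);                               // PSR1.180720.075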
I have a question about running the Android emulator; the details are below. Thanks a lot!
I need to launch the Android emulator automatically, so I intend to use execve on Linux. The source code is shown below:
/* initialize the passed command line */
char *binary = (char*)malloc(8*sizeof(char));
char **newargv = (char **)malloc(16*sizeof(char *));
newargv[0] = "/media/career/android/source/out/host/linux-x86/sdk/android-sdk_eng.corey_linux-x86/tools/emulator";
newargv[1] = "-avd";
newargv[2] = "new1";
newargv[3] = "-system";
newargv[4] = "/media/career/android/source/out/target/product/generic/system.img";
newargv[5] = "-ramdisk";
newargv[6] = "/media/career/android/source/out/target/product/generic/ramdisk.img";
newargv[7] = "-data";
newargv[8] = "/media/career/android/source/out/target/product/generic/userdata.img";
newargv[9] = NULL;
/* initialize the env values of the new process (emulator) */
const char *temp = getenv("ANDROID_AVD_HOME");
envp[0] = temp;
envp[1] = getenv("PATH");
envp[2] = NULL;
/*main function*/
if (execve (binary, (char **)newargv, (char **)envp) < 0 )
Environment variables:
declare -x ANDROID_AVD_HOME="/home/corey/.android/avd"
declare -x PATH="/media/career/android/source/out/host/linux-x86/sdk/android-sdk_eng.corey_linux-x86/platform-tools:/media/career/android/source/out/host/linux-x86/sdk/android-sdk_eng.corey_linux-x86/tools:
The console shows: PANIC: Could not open: /tmp/.android/avd/new1.ini
My AVDs are stored under ~/.android/avd by default.
But the new process only looks under the root filesystem's path for the AVD files, which is not portable. The emulator process does actually start. I focused on setting the environment variables, but failed.
May I ask how I should set my environment variables?
You have passed the wrong envp:
const char *temp = getenv("ANDROID_AVD_HOME");
envp[0] = "ANDROID_AVD_HOME=" + temp;
envp[1] = "PATH=" + getenv("PATH");
envp[2] = NULL;
Replace the + with a safe string concatenation (you cannot concatenate C strings with +); the point is that each envp entry must be a single "NAME=value" string.
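For illustration, a minimal standalone sketch of the same idea (the emulator path and AVD name below are placeholders):

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void) {
    const char *avd_home = getenv("ANDROID_AVD_HOME");
    const char *path = getenv("PATH");

    /* Each envp entry must be a single "NAME=value" string. */
    char env_avd[512];
    char env_path[4096];
    snprintf(env_avd, sizeof(env_avd), "ANDROID_AVD_HOME=%s", avd_home ? avd_home : "");
    snprintf(env_path, sizeof(env_path), "PATH=%s", path ? path : "");

    char *envp[] = { env_avd, env_path, NULL };
    char *newargv[] = { "/path/to/emulator", "-avd", "new1", NULL };  /* placeholders */

    execve(newargv[0], newargv, envp);
    perror("execve");  /* reached only if execve fails */
    return 1;
}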