Using the LibGDX Music API from within an Android application - android

I have an Android application.
One of its features is playing Ogg files.
I tried using MediaPlayer for this task, but it caused a huge slowdown of the app on the first play.
I'm guessing this is because the music file is loaded into memory whole, which leaves less memory for the app.
The question is: is it possible to use only the Music API of LibGDX for this task?
Can I use it without all the LibGDX application scaffolding, since I have my own activities written in pure Android code?

Well, I would recommend taking a look at the Music backend of libGDX. There is code there for the regular Music API which you can study. Everything in it is pure Android, and I am sure you can take snippets of it to create your own music player.
I would recommend putting that player into its own thread so it does not affect your other code. Use some kind of event system to start and stop songs, for example.
I don't think you can simply integrate just a part of libGDX into your project. Take a look at the wiki about interfacing with platform-specific code: create an interface, implement a player behind it in your Android code, and call that interface from the libGDX core so the calls get forwarded to your "music player" implementation.
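The interface-plus-background-thread idea above can be sketched in plain Java. All names here are hypothetical (`MusicPlayer` and `MusicService` are not libGDX classes): on Android the interface could be implemented with `android.media.MediaPlayer`, or with snippets borrowed from libGDX's Android Music backend, while the service keeps every playback call on a dedicated thread fed by an event queue so the UI thread never blocks on file loading.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical platform-neutral interface; an Android implementation
// could wrap android.media.MediaPlayer or code taken from libGDX's
// Android Music backend.
interface MusicPlayer {
    void play(String file);
    void stop();
}

// Runs every player call on its own thread via a command queue, so the
// caller (e.g. the UI thread) never blocks on loading or decoding.
class MusicService {
    // Sentinel commands; anything else on the queue is a file name.
    private static final String STOP = "\0stop";
    private static final String QUIT = "\0quit";

    private final BlockingQueue<String> commands = new LinkedBlockingQueue<>();
    private final Thread worker;

    MusicService(final MusicPlayer player) {
        worker = new Thread(() -> {
            try {
                while (true) {
                    String cmd = commands.take();
                    if (QUIT.equals(cmd)) break;
                    if (STOP.equals(cmd)) player.stop();
                    else player.play(cmd);
                }
            } catch (InterruptedException ignored) {
                // Thread shutting down.
            }
        }, "music-service");
        worker.start();
    }

    void play(String file) { commands.add(file); }
    void stop()            { commands.add(STOP); }

    void shutdown() throws InterruptedException {
        commands.add(QUIT);
        worker.join();
    }
}
```

Because the queue serializes commands, the `MusicPlayer` implementation is only ever touched from one thread and needs no internal locking.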

Related

Use Godot game as an Activity or Fragment in a larger Android Studio application

I have been making a game for a while in Android Studio, using just plain Android features like Drawables and such for the visuals.
I am planning to change/update some things, and I think that would be easier to do with Godot, but I want to keep the rest of the application in Android Studio.
Is it possible to make something in Godot and export/integrate it into an Android Studio project, maybe as a separate Activity or Fragment?
If it's not possible in Godot, is it possible in some other similar game engine?
It seems it is indeed possible to do this in Godot (github discussion), but there is no documentation for it yet (github discussion).
However, I discovered it is possible to run LibGDX code in a fragment of an activity (wiki page). I ended up going with LibGDX, both because I found it to have less of a learning curve in my experience, and because the game would still be part of the Android application, so I could pass callbacks and other Java objects back and forth between the Android-specific features and the LibGDX game logic.
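For reference, the fragment approach from the libGDX wiki looks roughly like the sketch below. `MyGdxGame` is a placeholder for your own `ApplicationListener` from the core project, and the libGDX Android backend must be on the classpath; treat this as an outline rather than a drop-in file.

```java
import android.os.Bundle;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;

import com.badlogic.gdx.backends.android.AndroidFragmentApplication;

// A fragment hosting the libGDX game; MyGdxGame is a hypothetical
// name for your ApplicationListener from the core project.
public class GameFragment extends AndroidFragmentApplication {
    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        // initializeForView returns a View you can place in a normal
        // Android layout alongside other fragments and widgets.
        return initializeForView(new MyGdxGame());
    }
}

// The host activity must implement AndroidFragmentApplication.Callbacks
// so the fragment can request an exit, e.g.:
//
// public class MainActivity extends AppCompatActivity
//         implements AndroidFragmentApplication.Callbacks {
//     @Override public void exit() { finish(); }
// }
```

Since the game is just a `View` inside a fragment, you can hand the `MyGdxGame` constructor any Java objects (listeners, callbacks) it needs to talk back to the rest of the app.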

QMediaPlayer on Android

I know that QtMultimediaWidgets is not supported for C++ on Android. I am developing a native application for Android, and since I don't use QML I need a way to play my videos in the application. I want to use QMediaPlayer since I rely on its signals and slots. Is there any manually developed backend that works on Android, or a way I can render the video myself while still using QMediaPlayer?
Is there a way I could develop such a backend myself using ffmpeg or any program available on Android? Will there be an update for this in Qt soon?
QtMultimediaWidgets is not supported on Android, so you need to use the QML elements. What you can theoretically try is to embed a QML scene using the MediaPlayer and VideoOutput elements in your QWidget-based app via QWidget::createWindowContainer. Once that works, you can get your QMediaPlayer object from QML through the mediaObject property of the MediaPlayer QML element. I have never actually tried this myself.
You may also try to use another plugin like QtAV, but you may lose acceleration.

Can you tell how or what was used to create an Android game?

Could you look at an app (simply as a user) and tell how it was made or what library (if any) was used? I'm thinking about making a game like Devil's Attorney, but I'm really going back and forth on something: whether to use a 2D library like libgdx, HTML5 (PhoneGap), or just the standard Android library. The game I'm looking at as an example almost looks like a really juiced-up HTML5 app, but I can't tell. So, is there a way to know what platform and/or library was used to make an Android game? This relates strictly to popular/successful Android games that aren't open source.
You can find out which libraries are used in an Android app with the AppBrain Ad Detector (its main purpose is to detect ad networks, but it also detects popular libraries such as libgdx, cocos2d, Unity, etc.):
https://play.google.com/store/apps/details?id=com.appspot.swisscodemonkeys.detector

Using Unity player/engine to load 3d in existing Android application

I have created an Augmented Reality app using Vuforia that plays a video on target recognition. Now I need the ability to show 3D objects as well. I followed the Vuforia video tutorial on this and was able to get the 3D models showing on targets using the Unity engine. But when I exported the project as an Android (Eclipse) project, I could locate only 3 activity classes. The challenge is how to integrate these two projects together.
The component that loads the 3D model using the Unity player is abstracted and not exposed. I was thinking I could simply use the Unity player API to load the 3D models within my existing project.
Is there a way the 3D model loading code can be exposed to the Java (Android) side? I need to load the models dynamically and might need to download them from a server first.
Since we have done so much work on the UI part of our existing application, it would be really helpful if we could just plug in the Unity player and manage the model inside our app.
Thanks in advance.
You may find some tips here: https://developer.vuforia.com/forum/faq/unity-how-can-i-extend-unitys-android-activity
It was helpful for me to override the QCARPlayerActivity class, and it seems to work fine (actually, I'm stuck on another problem getting finish() to work on the activity...).

ffmpeg decode multiple streams at same time

I am using ffmpeg to decode a file and play it back on an Android device. I have this working and would now like to decode two streams at the same time. I have read some comments about needing to call av_lockmgr_register() with ffmpeg; unfortunately, I am not sure how to use it or how the flow works when using these locks.
Currently I have separate threads on the Java side making requests through JNI to native code that communicates with ffmpeg.
Do the threads need to be on the native (NDK) side, or can I manage them on the Java side? Do I need to do any locking, and if so, how does that work with ffmpeg?
***UPDATE
I have this working now. It appears that setting up the threads at the Java SDK level carries over into separate threads at the native level. With that, I was able to create a struct holding my per-stream variables and then pass an identifier to the native layer to specify which struct to use for each video. So far I have not needed any mutexes or locks at the native level and haven't had any issues.
Does anyone know of potential gotchas I may encounter by not doing so with ffmpeg?
I'll answer this myself: the approach from my latest update appears to be working. By controlling the threads from the Java layer and making my native calls on separate threads, everything works and I have not encountered any issues.
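The per-stream-context approach described in the update can be sketched in plain Java. All names here are hypothetical, and `FakeNative` stands in for the real JNI class: in the actual app its methods would be `native` methods that use the stream id to look up one FFmpeg context struct per stream, so no state is shared between decoder threads.

```java
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;

// Stand-in for the JNI layer: one context per stream id, so each
// decode thread only ever touches its own state and no locks are
// needed on the Java side.
class FakeNative {
    private static final ConcurrentHashMap<Integer, StringBuilder> contexts =
            new ConcurrentHashMap<>();

    static void open(int streamId, String file) {
        // In the real app, this would allocate an FFmpeg context struct
        // keyed by streamId on the native side.
        contexts.put(streamId, new StringBuilder(file));
    }

    static String decodeFrame(int streamId) {
        // Each call touches only this stream's context.
        return contexts.get(streamId).toString() + ":frame";
    }
}

// One Runnable per stream, started from the Java layer; the stream id
// travels with every call so the native side knows which context to use.
class Decoder implements Runnable {
    final int streamId;
    final String file;
    final List<String> out;

    Decoder(int streamId, String file, List<String> out) {
        this.streamId = streamId;
        this.file = file;
        this.out = out;
    }

    @Override public void run() {
        FakeNative.open(streamId, file);
        out.add(FakeNative.decodeFrame(streamId));
    }
}
```

Running two `Decoder` threads with distinct ids then decodes both streams concurrently without any explicit synchronization, which mirrors why the approach in the update works without mutexes.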
