I have an already working audio app for Android that uses Oboe. I found some nice plugins https://github.com/juandagilc/Audio-Effects that I'd like to add to my project.
I got JUCE to compile successfully in my Android Studio project, but now I'm wondering what I should do next. The mobile tutorial only talks about the Projucer, but what if I just want to make music without any of the Projucer machinery? I just want to use JUCE as an audio processing library.
All the JUCE tutorials are geared toward the GUI classes JUCE provides, but how can I use it for audio processing only, that is, as a plain library?
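For what it's worth, here is a minimal sketch of what "JUCE as a library" can look like once the relevant modules (juce_audio_basics, juce_audio_processors) are compiled into the project: an Oboe callback wraps the raw buffer in a juce::AudioBuffer and runs it through an AudioProcessor, which is the base class the Audio-Effects processors are built on. The EffectCallback class and the assumption of a mono float stream are my own illustration, not something from that repo:

```cpp
// Sketch only: driving a JUCE AudioProcessor from an Oboe audio callback,
// assuming a mono float stream whose dry signal is already in audioData
// (e.g. a recording stream, or your existing render step ran first).
#include <memory>
#include <oboe/Oboe.h>
#include <juce_audio_processors/juce_audio_processors.h>

class EffectCallback : public oboe::AudioStreamCallback {
public:
    explicit EffectCallback(std::unique_ptr<juce::AudioProcessor> fx)
        : effect(std::move(fx)) {}

    // Call once before starting the stream, with the stream's rate and burst size.
    void prepare(double sampleRate, int maxFramesPerCallback) {
        effect->prepareToPlay(sampleRate, maxFramesPerCallback);
    }

    oboe::DataCallbackResult onAudioReady(oboe::AudioStream*, void* audioData,
                                          int32_t numFrames) override {
        auto* samples = static_cast<float*>(audioData);
        float* channels[] = { samples };                           // mono: one channel pointer
        juce::AudioBuffer<float> buffer(channels, 1, numFrames);   // wraps the Oboe buffer, no copy
        juce::MidiBuffer midi;                                     // empty; the effects ignore MIDI
        effect->processBlock(buffer, midi);                        // process in place
        return oboe::DataCallbackResult::Continue;
    }

private:
    std::unique_ptr<juce::AudioProcessor> effect;
};
```

In a real project you would also need to initialise JUCE once (e.g. with juce::ScopedJuceInitialiser_GUI) and configure the processor's channel layout before processing, but the point stands: the effects in that repository appear to be ordinary AudioProcessor subclasses, so using them does not require the Projucer or any JUCE GUI code.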
Related
I am trying to understand how playing MIDI files works on Android. I know that Android uses the Sonivox EAS and OpenSL ES libraries, but I want to get into more detail: how are these libraries called, and what does each of them do?
I also want to generate my own MIDI notes that could be streamed or read so that the output sound is produced. How can I do that?
I am thinking of making an Android application for playing a MIDI file. How can I start building the application? What other libraries, apart from the ones mentioned above, would be needed to play a MIDI file?
I am planning to use native (C/C++) Android development to build this application.
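On the "generate my own MIDI notes" part: the note data itself is independent of any Android library and is easy to build by hand. Below is a minimal sketch (my own illustration, not tied to the Sonivox/EAS or OpenSL ES APIs) that constructs standard three-byte MIDI 1.0 note-on/note-off messages; those bytes are what you would eventually hand to whichever synthesizer path you choose:

```cpp
// Sketch only: building raw MIDI 1.0 note events in plain C++.
#include <array>
#include <cstdint>
#include <vector>

struct MidiEvent {
    uint32_t tick;                 // when to send it, in ticks of your own choosing
    std::array<uint8_t, 3> bytes;  // status byte, data1, data2
};

// channel: 0-15, note: 0-127 (60 = middle C), velocity: 1-127
MidiEvent noteOn(uint32_t tick, uint8_t channel, uint8_t note, uint8_t velocity) {
    return { tick, { static_cast<uint8_t>(0x90 | (channel & 0x0F)), note, velocity } };
}

MidiEvent noteOff(uint32_t tick, uint8_t channel, uint8_t note) {
    return { tick, { static_cast<uint8_t>(0x80 | (channel & 0x0F)), note, uint8_t{0} } };
}

int main() {
    // A C major arpeggio on channel 0, 480 ticks per quarter note.
    std::vector<MidiEvent> track;
    const uint8_t notes[] = { 60, 64, 67, 72 };
    for (int i = 0; i < 4; ++i) {
        track.push_back(noteOn(i * 480u, 0, notes[i], 100));
        track.push_back(noteOff(i * 480u + 400u, 0, notes[i]));
    }
    // track now holds the events; each 3-byte message would be written to the
    // MIDI stream of whatever synth you use (EAS, a soft synth, an external device, ...).
}
```

Playing an existing .mid file is more work, because the Standard MIDI File format adds headers, variable-length delta times and meta events on top of these raw messages, but the wire-level note messages are exactly the three bytes shown here.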
I'm developing a mobile app to record audio, save it to a file and then send it to a server. I'm currently using SmartFace.io, a cross-platform tool for creating both Android and iOS apps. I've been researching but can't find the platform's audio capabilities; the online API documentation doesn't include specifics, and the media items have no detailed info.
I'm not a beginner. SmartFace looks good, but I can't find any info about what I need to do, and I'm not sure many people are using it.
What I've done:
- Tried PhoneGap but couldn't make it work; a coworker with more PhoneGap experience struggled until one project worked, and we discovered that some libraries and versions collide
- Tried samples posted here, but as some other users reported, they didn't work
- Also tried the now-dead MoSync, but the C code provided on its (also dead) forum doesn't work (it says the platform is not supported).
- I know Appcelerator has working samples, but it's my last choice
- Found working Android Studio projects, but we are still trying to avoid platform-specific work for each platform/app/IDE/framework
Thanks in advance
Smartface App Studio offers lots of ready-to-use components and libraries.
However, the current version does not support recording audio.
For more details about the features and the roadmap, please check the links below:
http://www.smartface.io/developer/guides/
http://docs.smartface.io/
http://www.smartface.io/roadmap/
I have developed a custom decoder as part of my final project. My decoder accepts .steve files, and now I want to integrate it with the Android framework.
I have researched this a lot, and one of the useful links I found is here:
Android: How to integrate a decoder to multimedia framework
The problem I am facing is that I don't want to compile the whole Android source tree just to integrate my decoder.
I want to create an app, packaged as an APK, that recognizes .steve files and plays the video, so that anyone who wants to test my app can simply install the APK on their phone rather than compile source code; that way the APK can easily be distributed around my grad school to test my decoder.
You cannot integrate a custom codec into the system-wide multimedia layer just by installing an APK. To do that you would have to build and flash your own modified firmware, which is complicated and not practical for distribution (you cannot force everybody to install your Android OS build).
But you can develop your own media player app that bundles the custom codec (the way MX Player and similar apps do). Then, in your app's manifest, you declare support for the custom format (.steve) so that Android knows your app can handle those files; a sketch of such a manifest entry follows.
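For illustration only, a manifest entry along these lines (the activity name StevePlayerActivity is a placeholder) associates the player with file URIs ending in .steve; depending on how the files are opened you may also need to match a MIME type or the content scheme:

```xml
<!-- Sketch only: lets Android offer this activity for file:// URIs ending in .steve -->
<activity android:name=".StevePlayerActivity">
    <intent-filter>
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
        <data android:scheme="file" android:host="*" android:pathPattern=".*\\.steve" />
    </intent-filter>
</activity>
```

Inside that activity you would read the incoming intent's data URI and hand the file to your decoder.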
I am developing a mobile application using the Titanium SDK. This is my first mobile app. Most of the app is done. The only module that remains is video conference support. The company I work for has chosen the OpenTok SDK. I finished the web application and it works fine, but now I've hit a wall in the mobile app and can't move further: OpenTok provides a module for Titanium, but it only builds for iOS, not Android. The reason we chose Titanium is its cross-platform support.
Is there any module available, or any other way to implement OpenTok with Titanium, that builds for both Android and iOS?
I have already tried using a WebView to open the conference module of the web application, but no luck: on mobile, OpenTok only works in the Chrome browser, while a WebView uses the stock browser engine, which does not support WebRTC. So OpenTok doesn't work in a WebView either.
Please help me. This is my first app and I am stuck at this point.
To Create a Module for OpenTok Android:
These are the steps I would follow.
Create a new Android module: titanium.py create --type=module --id=com.tokbox.ti.opentok --platform=android --name=opentok-titanium
Follow the installation instructions from OpenTok for Android. (Hint: I added a separate section down below to help you get through their instructions.)
Make sure the module runs: ant run.emulator or ant install.
Try running their sample, fully in Java, completely separate from Titanium. Make sure it works and that you know what it should look like. Then figure out exactly what you need from their API, or, if you're feeling ambitious, decide you want everything. Work to strip the example down to just the surface area you need, and simplify it to the smallest number of files that makes sense.
Write an example/app.js that demonstrates how you want the module to be used. For example, maybe you'd start off by requiring the module, then setting some API + Session keys, then calling some API, etc.
Based on the documentation, port what you need in to your module. Reference the Appcelerator Android module dev guide and open source Android modules for inspiration.
Write documentation for the module to specify what the various properties, methods, etc are, so that other developers can figure out how to use the module.
When you're done, submit a PR to OpenTok and revel in your creation and contribution.
Some Hints for Step 2:
.jar files go in lib/.
.so files go in platform/android/libs/armeabi/
Permissions go in timodule.xml, and you can see an example in the open source PayPal module for Android
OpenTok does not work in a WebView. OpenTok support for Titanium on Android does not exist yet: the Android SDK is still in beta and we don't have the engineering bandwidth to build a Titanium Android integration right now. However, if you are familiar with Titanium, you are more than welcome to add the integration yourself and send a pull request. You can get the Android beta here, and the Titanium source code here.
If you are in a hurry and open to trying other frameworks, our PhoneGap plugin currently supports both Android and iOS.
I'm starting to write an application that needs QR code reading capabilities embedded in it, meaning I basically want an option inside my application to scan a QR code and then perform some logic related to my application.
I got some explanations of how to do it on Android (though I haven't tried them yet), but now that I've decided to use PhoneGap I want to know whether that changes anything.
Does using PhoneGap mean I will have a generic way to include a QR scanner inside my own application, or do I still need to take care of including the QR scanner separately for every platform?
Advanced features like this require a PhoneGap plugin, with a native implementation for each platform. See http://wiki.phonegap.com/w/page/36752779/PhoneGap%20Plugins
The good news is that the BarcodeScanner plugin is already implemented for Android, BlackBerry and iOS; see https://github.com/phonegap/phonegap-plugins
This plugin supports 1D barcodes as well as QR codes and other 2D codes by integrating ZXing (http://code.google.com/p/zxing/).
Note that for now, integrating a plugin in a PhoneGap application needs a different procedure for each platform. See the PhoneGap Wiki referenced above as well as build & install instructions in the README file for each plugin.