There is a problem with HoloEverywhere and AACDecoder (com.spoledge.aacdecoder).
When I try to play music from a stream, I get this:
12-13 09:50:33.044 24134-24225/com.example.aapxxx E/AACPlayer﹕ playAsync():
java.lang.NullPointerException
at com.spoledge.aacdecoder.AACPlayer.dumpHeaders(AACPlayer.java:510)
at com.spoledge.aacdecoder.AACPlayer.processHeaders(AACPlayer.java:497)
at com.spoledge.aacdecoder.MultiPlayer.processHeaders(MultiPlayer.java:108)
at com.spoledge.aacdecoder.AACPlayer.play(AACPlayer.java:280)
at com.spoledge.aacdecoder.AACPlayer$1.run(AACPlayer.java:248)
at java.lang.Thread.run(Thread.java:841)
I've put all four files into the lib folder and selected the aacdecoder JAR as the source.
HoloEverywhere is using the Android support-v4 library, version 19.0.1.
I'm stuck and don't know how to fix it.
I'm using Android Studio. I followed the tutorial here.
Please help; what should I do?
Thanks in advance.
I'm trying to run it on Android 4.4.
It works on lower versions.
I could use the unpackaged project with sources instead of the ready-made aacplayer JAR, but I don't know how to manage that. Maybe there is just some small bug in their code...
Related problem discussion:
project page issue
Android Live Radio comments (second or third comment)
EDIT 2014-01-09 - PROBABLE FIX
On the main aacplayer project page, a fix for KitKat has been uploaded on the issue ticket page.
The problem is related to changes in KitKat, which were also mentioned on Stack Overflow.
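For context only, here is a purely hypothetical sketch of the kind of defensive check such a KitKat fix could add around the header dump. It assumes (and this is only an assumption, not the actual patch) that the NullPointerException comes from the URLConnection returning no usable header map for SHOUTcast ("ICY 200 OK") responses on Android 4.4; the names conn and LOG are placeholders for what a method like dumpHeaders() would already have in scope.
// Hypothetical illustration only -- not the actual aacdecoder fix.
Map<String, List<String>> headerFields = conn.getHeaderFields();
if (headerFields == null) {
    Log.w(LOG, "dumpHeaders(): no headers available, skipping dump");
    return;
}
for (Map.Entry<String, List<String>> entry : headerFields.entrySet()) {
    // The status line is reported under a null key, so guard before using it.
    if (entry.getKey() == null) continue;
    Log.d(LOG, entry.getKey() + ": " + entry.getValue());
}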
A shot in the dark... sometimes the connection to a stream fails if there is no User-Agent set. But I'm not sure whether your problem is caused by your headers, so please provide more (code) information. Just for the record, this is how you could set headers with the Android MediaPlayer. Check whether AACDecoder has something similar too...
// Set HTTP headers to work around some issues with the plain setDataSource() call.
// The User-Agent "iTunes" was set to force a redirect to the SHOUTcast streaming URL.
Map<String, String> headers = new HashMap<String, String>();
headers.put("User-Agent", "iTunes");
mediaPlayer.setDataSource(context, uri, headers);
mediaPlayer.prepareAsync();
(full answer here)
I want to create live-stream functionality in my Android app. I got an example from this link:
https://github.com/youtube/yt-watchme.
While running this code I got an error that "libffmpeg" can't load. To solve this issue I downloaded a pre-built 'libffmpeg' and added it to my project. After that I get this issue:
java.lang.UnsatisfiedLinkError: dlopen failed: file offset
for the library "/data/app/com.google.android.apps.watchme-2/
lib/arm/libffmpeg.so" >= file size: 0 >= 0.
How can I solve this issue?
Have you checked this documentation? The YouTube Live Streaming API lets you create, update, and manage live events on YouTube. Using the API, you can schedule events (broadcasts) and associate them with video streams, which represent the actual broadcast content.
For your error, dlopen failed: file offset for the library, you may check this related SO thread. Make sure that you have downloaded and copied the files properly.
You can also check these Java code samples, which use the Google APIs Client Library for Java and are available for the YouTube Live Streaming API.
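To make the client-library route more concrete, here is a rough sketch of scheduling a broadcast with it. It assumes you already have an authorized YouTube service object (the youtube parameter below); the field values are illustrative, so check them against the official samples.
import com.google.api.client.util.DateTime;
import com.google.api.services.youtube.YouTube;
import com.google.api.services.youtube.model.LiveBroadcast;
import com.google.api.services.youtube.model.LiveBroadcastSnippet;
import com.google.api.services.youtube.model.LiveBroadcastStatus;
import java.io.IOException;

public final class BroadcastSketch {
    // "youtube" is assumed to be an already-authorized YouTube service instance.
    static LiveBroadcast scheduleTestBroadcast(YouTube youtube) throws IOException {
        // Describe the event: a title and a scheduled start time one hour from now.
        LiveBroadcastSnippet snippet = new LiveBroadcastSnippet();
        snippet.setTitle("Test broadcast");
        snippet.setScheduledStartTime(new DateTime(System.currentTimeMillis() + 60L * 60 * 1000));

        // Keep the test event private while experimenting.
        LiveBroadcastStatus status = new LiveBroadcastStatus();
        status.setPrivacyStatus("private");

        LiveBroadcast broadcast = new LiveBroadcast();
        broadcast.setKind("youtube#liveBroadcast");
        broadcast.setSnippet(snippet);
        broadcast.setStatus(status);

        // Insert the broadcast; the returned object carries the id YouTube assigned.
        return youtube.liveBroadcasts().insert("snippet,status", broadcast).execute();
    }
}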
I'll try to make this simple:
If I create an AIR app from the Flash IDE, I can choose to embed a folder in my package. Then I can load the files using 'app:/' + filename. Everything is OK.
I have to move to Flash Builder because I can't test Workers in the IDE (thanks, Adobe). My issue is that if I test/debug from Flash Builder, it throws a stream error when calling 'app:/' + filename. If I launch the test in the IDE from FB, it works, but the Workers don't. I should mention that the reason I'm using this method is that I have so many graphical assets, it's just easier to maintain/update this way instead of using [Embed...] for all my items, and it just works in the IDE...
I've added my folder to the source locations in Flash Builder, but it still seems I cannot use the 'app:/' approach.
How can I make this work so I don't have to change my code and can still use 'app:/'? FB is such a confusing program...
EDIT: I tested the Workers again in the IDE build launched by FB (the "test in Flash IDE" icon); I can trace their state with:
worker.start();
worker.addEventListener(Event.WORKER_STATE, this._handleWorkerState);

private function _handleWorkerState(__e:Event):void {
    trace(__e.currentTarget.state);
}
This traces 'new' and then 'running'. But for some reason it doesn't send or receive any data on any message channel, which, again, works in FB 4.7 when I run a debug, except that it then doesn't find my files...
Error #2044: Unhandled ioError:. text=Error #2032: Stream Error. URL: app:/foldername..
So basically, I'm looking for a solution to at least one of my problems :)
EDIT:
OK, here it is: one issue was due to the wrong debugger version being installed (for the Workers part), so I can now work and compile in the IDE again. I haven't found an answer to why 'app:/' doesn't work from FB 4.7, so that is the remaining question.
One option, since you have the Flash IDE, is to create a library with all of your images. Drop all your images into the library in Flash and export them for ActionScript. Then publish and create a SWC. Then you can use the SWC, which is kind of like a zip file for display objects, in Flash Builder and access them like:
var mc:MovieClip = new imageExportedForAS3_1();
Create a top-level folder in your Flex project called, for example, images, and copy all of your images into that folder. Then every time you need to load an image, just use the source attribute with the route to it, for example:
<mx:Image source="@Embed(source='../images/pic.png')"/>
I have never used the app:/ syntax before! Good luck!
I want to implement keyword spotting based on PocketSphinx for an Android app.
PocketSphinx is new to me. I started with the PocketsphinxAndroidDemo from their repo.
Then I imported the project into Eclipse and built and deployed the demo app on my phone. The demo recognized the commands deposit and withdraw and numbers fine. I have not installed any other lib or tool.
Now I want to recognize my own keywords and followed the CMUSphinx tutorial. Therefore I created my own DIC and LM files using the "Sphinx knowledge base generator" and included them in the assets subfolder of the project. The corpus I use:
open browser
new e-mail
forward
backward
next window
last window
open music player
I have modified SpeechRecognizer as follows:
config.setString("-jsgf", joinPath(dataDir, "dialog.gram")); // unmodified
config.setString("-dict", joinPath(dataDir, "lm/2914.dic"));
config.setString("-lm", joinPath(dataDir, "lm/2914.lm"));
config.setString("-hmm", joinPath(dataDir, "hmm/hub4wsj_sc_8k")); // unmodified
Then I started the app again and got the following errors:
11-21 12:48:18.758: E/cmusphinx(15521): "fsg_search.c", line 334: The word 'withdraw' is missing in the dictionary
and
11-21 12:48:26.375: A/libc(15521): Fatal signal 11 (SIGSEGV) at 0x0000001c (code=1), thread 15557 (SpeechRecognize)
I know "withdraw" is a word of the former dictionary.
What do I have to modify in the grammar file?
I also read this tutorial http://www.aiaioo.com/cms/index.php?id=28 and used the acoustic model and the phonetic dictionary from there. I modified the SpeechRecognizer again. While the app is starting, I can see the resources are loaded correctly. But then I got the same error.
Can someone please tell me what the problem is?
What steps do I have to do to get my spotting running?
Try changing the entries in your dictionary file to ALL CAPS. Try using this for your dictionary file:
OPEN BROWSER
NEW E-MAIL
FORWARD
BACKWARD
NEXT WINDOW
LAST WINDOW
OPEN MUSIC PLAYER
This worked for me.
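A side note, based only on the configuration lines shown in the question: the error is raised in fsg_search.c, i.e. in the grammar search, and dialog.gram was left unmodified while the dictionary changed. Below is a hedged sketch of a configuration that stays consistent with the new dictionary, under the assumption that the old grammar is simply dropped rather than rewritten; the paths are the ones from the question.
// Sketch only -- assumes the old dialog.gram (which still contains words such as
// "withdraw") is not loaded, so the decoder no longer looks them up in the new dictionary.
config.setString("-dict", joinPath(dataDir, "lm/2914.dic"));
config.setString("-lm", joinPath(dataDir, "lm/2914.lm"));
config.setString("-hmm", joinPath(dataDir, "hmm/hub4wsj_sc_8k"));
// No "-jsgf" entry here; add one back only with a grammar whose words all exist in 2914.dic.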
I have to implement a GStreamer pipeline on Android which will receive a live MPEG-TS stream from an MPEG-TS server on a Linux machine (also implemented with GStreamer).
I have a Samsung Galaxy Tab 2 (5113), which runs Android 4.1.2 Jelly Bean, API level 16.
My receiver pipeline is as follows:
data->pipeline = gst_parse_launch("udpsrc caps=\"video/mpegts, systemstream=true, packet-size=188\" ! tsdemux ! queue ! h264parse ! amcviddec-omxgoogleh264decoder ! eglglessink", &error);
This is as per Android Tutorial 3 of the GStreamer SDK.
When I press the play button,
I get this error:
06-26 00:04:56.927: D/GStreamer+tutorial-3(7201): 0:00:05.920807000 0x5a65c320 jni/tutorial-3.c:88:set_ui_message Setting message to: Error received from element amcvideodec-omxgoogleh264decoder0: GStreamer encountered a general supporting library error.
A more detailed log of the application, as shown in the logcat of the Eclipse IDE: http://pastebin.com/EX8sgcEp
So it seems that the amcviddec-omxgoogleh264decoder element cannot dequeue the input data, and GStreamer encounters a library error.
I would appreciate any help or suggestions.
We solved the problem some time back.
Just putting it here for anybody else's reference.
The problem was that, in order to use amcviddec-omxgoogleh264decoder, there are some dependent files which need to be installed besides the GStreamer application. We don't know exactly what they were.
Anyway, if you look at the /etc/media_codecs.xml file in the Android root, you will see all the multimedia codecs supported by your Android device. This includes the codecs supported by hardware codec chips as well; a small runtime check is sketched below.
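For completeness, the same list can also be queried at runtime. This is just a generic sketch using the standard MediaCodecList API (present since API 16), not part of the GStreamer fix itself.
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public final class CodecDump {
    // Logs every decoder the device exposes, so you can see which H.264
    // decoders (OMX.google..., OMX.TI..., and so on) are actually available.
    public static void dumpDecoders() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (info.isEncoder()) {
                continue; // only interested in decoders here
            }
            for (String type : info.getSupportedTypes()) {
                Log.d("CodecList", info.getName() + " supports " + type);
            }
        }
    }
}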
For us, we tried the amcviddec-omxtiducati1videodecoder, and it worked like a charm.
Regards,
Yusuf Husainy.
I have a task which involves integrating a video decoder into Stagefright (Android's multimedia framework). I searched and found the following about creating a new plugin for Stagefright:
To add support for a new format, you need to:
Develop a new Extractor class, if the container is not supported yet.
Develop a new Decoder class that implements the interface needed by the Stagefright core to read the data.
Associate the MIME type of the files to read with your new Decoder in the OMXCodec.cpp file, in the kDecoderInfo array.
static const CodecInfo kDecoderInfo[] = {
    { MEDIA_MIMETYPE_AUDIO_AAC, "OMX.TI.AAC.decode" },
    { MEDIA_MIMETYPE_AUDIO_AAC, "AACDecoder" },
};
The above is all I could find on the net. Right now I have a simple app that takes a file as input and renders it on the screen using native APIs in Android. Can anyone please tell me how to proceed further? Where do OMXCodec.cpp and the other files come into the picture, and in which directory of my project should I have them? Please provide solutions regarding the same. Thanks in advance.
From your question, it appears that you are looking at a recommendation which is specific to Ice Cream Sandwich or earlier versions of Android. The first thing you should be clear about is the version of Android, i.e. Ice Cream Sandwich and before, or Jelly Bean and after. The integration of codecs is different across different releases of Android.
I have already commented on your other question, which is specific to Jelly Bean and later (Reference: Android: How to integrate a decoder to multimedia framework).
If you would like to integrate your codec in Ice Cream Sandwich or before, the steps are already available in your question. In addition to adding the decoder to the kDecoderInfo list, you may want to set up certain quirks as shown here.
For the question on OMXCodec.cpp, you can find this file at
frameworks/base/media/libstagefright/ in case of Ice-Cream Sandwich and frameworks/av/media/libstagefright/ in case of JellyBean.
If you have followed all the steps to integrate the video decoder into the Stagefright framework, then the easiest test would be to perform the following:
Copy a media file into SD-Card
In OMXCodec.cpp, enable logs by removing the comment in this statement //#define LOG_NDEBUG 0 and run a mm in the directory. Copy the rebuilt libstagefright.so to /system/lib on your device.
Enable logcat and start capturing logs.
Go to the Gallery, select your file and allow the standard player to play it.
Check your log file to see whether the player has selected your OMX component by searching for your component name. If found, your integration of the codec into Stagefright is successful. Otherwise, you will have to debug and find out what the problem is.
Postscript:
Based on your queries, I presume you aren't familiar with the Android sources. Please refer to the androidxref site to become familiar with the AOSP distributions.
Unless you are planning to support a new media file format, you will not need to implement an Extractor class. MediaExtractor abstracts a file-format parser and helps de-multiplex the different tracks in a media file.
I hope that with this information you will be able to get your codec integrated and functional in Android.