I have to implement a GStreamer pipeline on Android which will receive a live MPEG-TS stream from an MPEG-TS server on a Linux machine (also implemented with GStreamer).
Now, I have a Samsung Galaxy Tab 2 (5113), which runs Android 4.1.2 Jelly Bean, API level 16.
My receiver pipeline is as follows:
data->pipeline = gst_parse_launch("udpsrc caps=\"video/mpegts, systemstream=true, packet-size=188\" ! tsdemux ! queue ! h264parse ! amcviddec-omxgoogleh264decoder ! eglglessink", &error);
This is as per Android Tutorial 3 of the GStreamer SDK.
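For reference, the sender on the Linux machine presumably looks something like the sketch below. Only the MPEG-TS-over-UDP framing is implied by the receiver caps; the source and encoder elements here are assumptions, and the port must be the udpsrc default (5004) since the receiver sets none:

gst-launch-0.10 videotestsrc ! x264enc ! mpegtsmux ! udpsink host=<tablet-ip> port=5004

Substitute your real source for videotestsrc and the tablet's address for <tablet-ip>.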
When I press the play button, I get this error:
06-26 00:04:56.927: D/GStreamer+tutorial-3(7201): 0:00:05.920807000 0x5a65c320 jni/tutorial-3.c:88:set_ui_message Setting message to: Error received from element amcvideodec-omxgoogleh264decoder0: GStreamer encountered a general supporting library error.
A more detailed log of the application, as shown in logcat in the Eclipse IDE: http://pastebin.com/EX8sgcEp
So it seems that the amcviddec-omxgoogleh264decoder element cannot dequeue the input data, and GStreamer encounters a supporting library error.
I would appreciate any help or suggestions.
We solved the problem some time back.
I'm just putting it here for anybody else's reference.
The problem was that, to use amcviddec-omxgoogleh264decoder, some dependent files need to be installed besides the GStreamer application. I don't know exactly what they were.
Anyway, if you look at the /etc/media_codecs.xml file on the Android device, you will see all the multimedia codecs your device supports. This includes the codecs provided by hardware codec chips as well.
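On API level 16 and newer you can also enumerate the same list at runtime through android.media.MediaCodecList, which may be easier than pulling the XML off the device. A minimal sketch:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

// Logs every decoder the device exposes, with the MIME types it handles.
// The amcviddec-* GStreamer elements map onto these codec names.
static void dumpDecoders() {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (info.isEncoder()) continue; // decoders only
        for (String mime : info.getSupportedTypes()) {
            Log.i("Codecs", info.getName() + " -> " + mime);
        }
    }
}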
For us, we tried the amcviddec-omxtiducati1videodecoder, and it worked like a charm.
Regards,
Yusuf Husainy.
I am trying to play MPEG-DASH, HLS, and HSS (HTTP Smooth Streaming) streams using modified sample applications provided by Google, but for now only DASH is working.
I use:
https://github.com/googlecast/cast-custom-receiver
and
https://github.com/googlecast/CastVideos-android
As I see here https://developers.google.com/cast/docs/player?hl=en, all protocols (HSS, HLS, DASH) should work without problems.
When I try to stream HSS and HLS, I get this in the Sample Media Receiver HUD:
Media Element State: "Error" (or "Abort")
...
Host State: "Fatal Error: code = 1"
Does anyone know what that error represents?
Also, sometimes I get "Fatal Error: code = 0".
//UPDATE
I get this error when I try to cast HSS:
[2648.568s] [cast.receiver.MediaManager] Load metadata error cast_receiver.js:19
and the link is:
http://video3.smoothhd.com/ondemand/Turner_Sports_PGA.ism/Manifest
//UPDATE
HLS is working now. The problem was solved by setting CORS headers.
I don't know what HSS is, and we don't mention HSS as a supported protocol either, nor do we claim to support "all" protocols. We have listed the supported protocols/variations in the link that you mentioned.
The problem was with the codecs... I used streams encoded with the VP1 video and WMAP audio codecs, so there were many errors. That is the reason for all those LOAD METADATA ERRORs.
And for CORS, you can use this: https://github.com/TOMODOcom/TOMODOkorz
Works like a charm :)
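For anyone serving the segments themselves, the fix amounts to adding CORS response headers on the media server. Here is a minimal sketch of a Java servlet filter that does this; the class name is made up, and the wildcard origin is only for testing:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

// Adds the CORS headers the Cast receiver needs in order to fetch
// manifests and media segments from a different origin.
public class CorsFilter implements Filter {
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletResponse response = (HttpServletResponse) res;
        response.setHeader("Access-Control-Allow-Origin", "*"); // testing only; tighten in production
        response.setHeader("Access-Control-Allow-Headers", "Content-Type, Range");
        response.setHeader("Access-Control-Expose-Headers", "Content-Length, Content-Range");
        chain.doFilter(req, res);
    }

    public void init(FilterConfig config) {}
    public void destroy() {}
}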
There is a problem with HoloEverywhere and AACDecoder (com.spoledge.aacdecoder).
When I click play to start music from a stream, I get this:
12-13 09:50:33.044 24134-24225/com.example.aapxxx E/AACPlayer﹕ playAsync():
java.lang.NullPointerException
at com.spoledge.aacdecoder.AACPlayer.dumpHeaders(AACPlayer.java:510)
at com.spoledge.aacdecoder.AACPlayer.processHeaders(AACPlayer.java:497)
at com.spoledge.aacdecoder.MultiPlayer.processHeaders(MultiPlayer.java:108)
at com.spoledge.aacdecoder.AACPlayer.play(AACPlayer.java:280)
at com.spoledge.aacdecoder.AACPlayer$1.run(AACPlayer.java:248)
at java.lang.Thread.run(Thread.java:841)
I've added all four library files to the libs folder and included the aacdecoder JAR as a dependency.
HoloEverywhere uses android-support-v4 19.0.1.
I'm stuck and don't know how to fix it.
I'm using Android Studio. I followed the tutorial here.
Please tell me what to do.
Thanks in advance.
I'm trying to run it on Android 4.4; it works on lower versions.
I could use the unpackaged project with sources instead of the ready-made AACPlayer JAR, but I don't know how to manage that. Maybe there is just a small bug in their code...
Related problem discussion:
project page issue
Android Live Radio comments (second or third comment)
EDIT 2014-01-09 - PROBABLE FIX
A fix for KitKat has been uploaded to the issue ticket page of the main AACPlayer project.
The problem is related to changes in KitKat, which were also mentioned on StackOverflow.
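For context on why KitKat breaks it: the NPE is thrown while AACPlayer iterates the response headers, and the usual culprit is a Shoutcast server answering with the non-standard status line "ICY 200 OK", which KitKat's OkHttp-backed HttpURLConnection does not parse cleanly, so header lookups can come back null. A null-safe iteration might look like this (an illustration, not the actual AACPlayer patch):

import java.net.URLConnection;
import java.util.List;
import java.util.Map;

import android.util.Log;

// Defensive header dump: guards against a null header map and against
// the status line, which getHeaderFields() keys by null.
static void dumpHeadersSafely(URLConnection conn) {
    Map<String, List<String>> headers = conn.getHeaderFields();
    if (headers == null) {
        Log.w("AACPlayer", "no parsable headers (ICY stream on KitKat?)");
        return;
    }
    for (Map.Entry<String, List<String>> e : headers.entrySet()) {
        if (e.getKey() == null) continue; // status line entry
        Log.d("AACPlayer", e.getKey() + ": " + e.getValue());
    }
}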
A shot in the dark... sometimes the connection to the stream fails if there's no User-Agent set. But I'm not sure if your problem is caused by your headers, so please provide more (code) information. Just for the record, this is how you could set headers with the Android MediaPlayer; check if AACDecoder has something similar:
import java.util.HashMap;
import java.util.Map;

// Setting HTTP headers to work around issues with the plain setDataSource().
// The User-Agent "iTunes" is set to force a redirect to the Shoutcast streaming URL.
Map<String, String> headers = new HashMap<String, String>();
headers.put("User-Agent", "iTunes");
mediaPlayer.setDataSource(context, uri, headers); // MediaPlayer#setDataSource(Context, Uri, Map), API 14+
mediaPlayer.prepareAsync();
(full answer here)
I have a task which involves integrating a video decoder into Stagefright (Android's multimedia framework). I searched and found the following about creating a new plugin for Stagefright:
To add support for a new format, you need to:
Develop a new Extractor class, if the container is not supported yet.
Develop a new Decoder class that implements the interface needed by the Stagefright core to read the data.
Associate the MIME type of the files to be read with your new Decoder in the OMXCodec.cpp file, in the kDecoderInfo array:
static const CodecInfo kDecoderInfo[] = {
    {MEDIA_MIMETYPE_AUDIO_AAC, "OMX.TI.AAC.decode"}, // hardware (OMX) decoder, preferred
    {MEDIA_MIMETYPE_AUDIO_AAC, "AACDecoder"},        // software fallback
};
The above is all I could find on the net. Right now I have a simple app that takes a file as input and renders it on the screen using native APIs in Android. Can anyone please tell me how to proceed further? Where do OMXCodec.cpp and the other files come into the picture, and in which directory of my project should I have them? Thanks in advance.
From your question, it appears that you are looking at a recommendation specific to Ice Cream Sandwich or earlier versions of Android. The first thing you should be clear about is the Android version, i.e., Ice Cream Sandwich and before, or Jelly Bean and after: codec integration differs across Android releases.
I have already commented on your other question, which is specific to Jelly Bean and later (Reference: Android: How to integrate a decoder to multimedia framework).
If you would like to integrate your codec in Ice Cream Sandwich or before, the steps are already available in your question. In addition to adding the decoder to the kDecoderInfo list, you may want to set up certain quirks, as shown here.
For the question on OMXCodec.cpp, you can find this file at
frameworks/base/media/libstagefright/ in case of Ice-Cream Sandwich and frameworks/av/media/libstagefright/ in case of JellyBean.
If you have followed all the steps to integrate the video decoder into the Stagefright framework, then the easiest test would be to perform the following:
Copy a media file to the SD card.
In OMXCodec.cpp, enable verbose logging by uncommenting the statement //#define LOG_NDEBUG 0 and run mm in that directory. Copy the rebuilt libstagefright.so to /system/lib on your device.
Enable logcat and start capturing logs.
Go to the Gallery, select your file, and let the standard player play it.
Search the captured log for your component name to check whether the player selected your OMX component. If it did, your codec integration into Stagefright is successful; otherwise, you will have to debug and find out what the problem is.
Postscript:
Based on your queries, I presume you aren't familiar with the Android sources. Please refer to the androidxref site to become familiar with the AOSP distributions.
Unless you are planning to support a new media file format, you will not need to implement an Extractor class. MediaExtractor abstracts a file-format parser and helps to demultiplex the different tracks in a media file.
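As an aside, from API level 16 applications can observe what the platform's extractors do through android.media.MediaExtractor. A small illustrative sketch that lists the tracks the framework's parser finds in a file:

import java.io.IOException;

import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.util.Log;

// Lists the tracks (and their MIME types) that the platform's
// extractor finds in a media file.
static void dumpTracks(String path) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(path); // parses the container
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        Log.i("Extractor", "track " + i + ": " + format.getString(MediaFormat.KEY_MIME));
    }
    extractor.release();
}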
I hope with this information, you should be able to get your codec integrated and functional in Android.
I've spent almost a week on this now, trying to get FFmpeg "Angel"/"Happiness" to build for Android.
I've tried build scripts from all over the internet, to no avail. The closest I got was using this. As the author himself says, the script doesn't work for newer versions of FFmpeg due to this bug, which was dismissed on that ticket with "I found a Makefile that does it." This was disheartening, as it was the only post in all of the vast Google world that was anywhere close to my problem.
So, question time:
Is there a way to get around the above bug? I'm trying to use the newest FFmpeg API, and "Love" just gives me "undefined reference" errors when I try to use avcodec_encode_video2() and avcodec_free_frame(). The code I was working along the lines of is in the FFmpeg git repo, under /doc/examples/decoding_encoding.c (the function starting on line 338).
Update: So they've done away with codec_names.sh in "Angel". Sorry, I didn't notice that before, but the problem persists in a different avatar now. Every build attempt ends with:
start ndk-building...
/home/<user>/android-ndk/build/core/build-binary.mk:41: *** target file `clean' has both : and :: entries. Stop.
Say whatnow!?
Given the lack of any response at all, I'm assuming the people who know their shit in this topic are really busy putting their skills to use with whatever they managed to compile. For the ones like me who scraped every corner of the web for an answer that makes any bit of sense, I have a more than decent workaround.
The Guardian Project, an awesome resource on GitHub, has the perfect project set up for building an ffmpeg binary with all the settings of your choice, except for the one big problem of getting it to actually build without the "Unable to create executables" error.
So there's a way out there too. It's less flexible, but it saves you from losing any more hair than I'm sure you (like me) already have. Head over here and profit.
Running the file command showed that this binary is dynamically linked; that seemed weird, but it works.
Also, you'll have to run chmod on it before using it on the device (it being a binary file and all). So pop it into your res/raw/ folder, load it when needed, and edit those videos like there's no tomorrow!
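A minimal sketch of that load-and-run step; the resource name R.raw.ffmpeg, the output file name, and the argument handling are assumptions:

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

import android.content.Context;

// Copies the ffmpeg binary from res/raw into private app storage,
// marks it executable (the chmod step), and runs it once.
static void runFfmpeg(Context context, String... args) throws IOException, InterruptedException {
    File bin = new File(context.getFilesDir(), "ffmpeg");
    if (!bin.exists()) {
        InputStream in = context.getResources().openRawResource(R.raw.ffmpeg); // assumed resource
        FileOutputStream out = new FileOutputStream(bin);
        byte[] buf = new byte[8192];
        for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
        out.close();
        in.close();
        bin.setExecutable(true); // equivalent of chmod +x
    }
    String[] cmd = new String[args.length + 1];
    cmd[0] = bin.getAbsolutePath();
    System.arraycopy(args, 0, cmd, 1, args.length);
    Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
    p.waitFor(); // in real code, also consume p.getInputStream()
}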
I am working on an Android application which is supposed to play videos over HTTP on Android devices. Before we set up a server to host the video files, I just wanted a few things clarified:
As per the developer documentation, Android supports the .mp4 and .3gp container formats for video. If we use the H.263 (video) and AAC-LC (audio) codecs for our media files, will we be able to play the video by passing the URL to the MediaPlayer class?
I did a little experiment and passed the URL of one of the video files (.mp4) to the MediaPlayer class, and got the following error:
Command PLAYER_INIT completed with an error or info PVMFErrContentInvalidForProgressivePlayback
From the docs, I learned that for progressive playback, the video's index (e.g., the moov atom) should be at the start of the file.
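A quick way to check a file is to walk its top-level boxes and see whether moov comes before mdat. A rough sketch, assuming 32-bit box sizes (so no 64-bit boxes, and files under 2 GB):

import java.io.DataInputStream;
import java.io.FileInputStream;

// Prints the order of the top-level MP4 boxes; for progressive
// playback, 'moov' must appear before 'mdat'.
public class AtomOrder {
    public static void main(String[] args) throws Exception {
        DataInputStream in = new DataInputStream(new FileInputStream(args[0]));
        while (in.available() >= 8) {
            int size = in.readInt();  // box size, big-endian
            byte[] type = new byte[4];
            in.readFully(type);       // four-character box type
            System.out.println(new String(type, "US-ASCII"));
            in.skipBytes(size - 8);   // jump to the next box
        }
        in.close();
    }
}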
Questions:
1. How do we make our videos Android-ready?
2. What are the different considerations that we need to make?
Please help.
Thanks.
You can actually achieve this using JCodec ( http://jcodec.org ), a pure Java implementation of the ISO BMF (MP4) container. For this, use the following code:
MovieBox movie = MP4Util.createRefMovie(new File("bad.mp4"));
new Flattern().flattern(movie, new File("good.mp4"));
The side effect of 'Flattern' is that it creates a web-optimized movie file that has its header BEFORE the data.
You can also use similar functionality from command line:
java -cp jcodec-0.1.3-uberjar.jar org.jcodec.movtool.WebOptimize <movie>
The JCodec library can be downloaded from a project website.
I cross-posted this question on the android-developers Google group, and Mark answered it there. Thanks, Mark!
See this thread