Is android.net.rtp only for VoIP applications?

Since API level 12, Android has had its own RTP package, android.net.rtp.
There are 4 classes in this package: RtpStream, AudioStream, AudioCodec and AudioGroup.
However, it seems that all of these classes are intended only for VoIP applications. I'm trying to stream a 'static' audio file (such as an MP3 file on the SD card) over RTP to a VLC player, and it appears impossible to do with android.net.rtp.
I tried this example: http://androidsourcecode.blogspot.com/2013/10/android-rtp-sample-receiving-via-vlc.html
The result: the sound from my Android phone's microphone was streamed to the VLC player on my Mac without any problems.
So, is it possible to stream a 'static' file through Android's native media decoder and push the decoded data into AudioRecord, and from there into an AudioStream?
To put it simply, is it possible to build a fake-microphone pipeline (MP3 file -> MediaPlayer -> AudioRecord -> AudioStream)?
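For reference, the android.net.rtp flow used in the linked example boils down to the sketch below (Java; the IP addresses and port are placeholders, and exception handling is omitted). The limitation is that AudioGroup is an audio hub for the device speaker, microphone and AudioStreams, so the API exposes no public way to substitute a decoded file for the microphone as the source:

// Minimal android.net.rtp sender sketch; addresses and port are placeholders.
void startVoipStyleStream() throws Exception {
    AudioStream stream = new AudioStream(InetAddress.getByName("192.168.1.10")); // local interface
    stream.setCodec(AudioCodec.PCMU);                  // G.711 u-law, 8 kHz
    stream.setMode(RtpStream.MODE_SEND_ONLY);
    stream.associate(InetAddress.getByName("192.168.1.20"), 5004); // receiver, e.g. VLC

    AudioGroup group = new AudioGroup();
    group.setMode(AudioGroup.MODE_NORMAL);
    stream.join(group); // audio now flows between the device mic/speaker and the RTP stream
}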

Related

How can we achieve audio passthrough in a Chromecast Styled Media Receiver

We're trying to play 5.1 audio on a Chromecast player via audio passthrough from an Android app, but we only get 2.1 as audio output.
Even when we ingested only the 5.1 audio track, we received the error below:
[cast.player.hls.PackedAudioParser] Neither ID3 nor ADTS header was found at 0
[cast.player.api.Host] error: cast.player.api.ErrorCode.NETWORK/315
Reference:
https://developers.google.com/cast/docs/media#audio_passthrough
Encoding tool used: ffmpeg
Input Params:
Url: http://cdn.example.com/video.m3u8
MimeType: application/x-mpegURL
Audio codec: ac-3
Sample m3u8 file
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio/6ch",LANGUAGE="tam",NAME="Tamil",CHANNELS="6",URI="https://cdn.example.com/videos/audio_video_seperation/audio_5.1/master.m3u8""
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio/mp4a/2ch",LANGUAGE="tam",NAME="Tamil",CHANNELS="2",URI="https://cdn.example.com/videos/video_audio_seperation/audio_2.1/master.m3u8"
#EXT-X-STREAM-INF:AUDIO="audio/6ch",BANDWIDTH=1045504,AUTOSELECT=YES,CODECS="avc1.4D401F,ac-3",RESOLUTION=853x480,FRAME-RATE=25.000
https://cdn.example.com/videos/video_audio_seperation/video_480/master.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,AUDIO="audio/mp4a/2ch",SUBTITLES="subs",BANDWIDTH=1044480,CODECS="avc1.4D401F,mp4a.40.2",RESOLUTION=853x480,FRAME-RATE=25.000
https://cdn.example.com/videos/video_audio_seperation/video_480/master.m3u8
Chromecast debug messages:
Device supports ac-3 encoding
Queries:
Is there any sample media file available for testing 5.1 audio on Chromecast?
What is the right way to separate the audio and video streams from a source MP4 file for HLS?
Any sample ffmpeg command would be helpful (a hedged example is sketched after this list).
How can the ADTS header issue on the audio file be checked or fixed?
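For the ffmpeg question, one possible starting point is sketched below (a hedged example, not a verified recipe): split the source MP4 into a video-only rendition and an AC-3 audio-only rendition, then reference both from a hand-written master playlist like the one above. The file names, segment duration and channel count are assumptions.

# Video-only HLS rendition (H.264 copied as-is):
ffmpeg -i source.mp4 -map 0:v -c:v copy -an \
  -f hls -hls_time 6 -hls_playlist_type vod \
  -hls_segment_filename video_%03d.ts video/master.m3u8

# Audio-only HLS rendition, encoded to 5.1 AC-3 for passthrough:
ffmpeg -i source.mp4 -map 0:a -vn -c:a ac3 -ac 6 \
  -f hls -hls_time 6 -hls_playlist_type vod \
  -hls_segment_filename audio_%03d.ts audio_5.1/master.m3u8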

Play RTP stream without RTSP in Android VideoView

I'd like to play an RTP stream in an Android VideoView. I create the stream with GStreamer on a Linux machine and send the RTP stream to a UDP sink.
Is it possible to consume this stream in an Android VideoView without an RTSP server?
I tried setting the video URI to "rtp://:#", but then I got the error "Video could not be played". I also thought about creating an SDP file and using that on the Android device, but I'm not sure whether this works or how to create such a file.
Thanks
http://developer.android.com/guide/appendix/media-formats.html
The link above is the first stop for what is and isn't supported.
As it says, RTP is supported only in conjunction with SDP, as per the RTSP spec.
So no, you can't play a naked RTP stream on stock Android.
Try porting a library like live555 if you must have naked RTP.
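If you still want to try the SDP route the question mentions, a minimal SDP description for an H.264 RTP stream might look like the sketch below; the addresses, port and payload type are assumptions and must match what the GStreamer pipeline (e.g. rtph264pay pt=96 and a udpsink on port 5000) actually sends:

v=0
o=- 0 0 IN IP4 192.168.1.50
s=GStreamer RTP stream
c=IN IP4 192.168.1.20
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000

Here 192.168.1.20 is the address on which the Android device receives the stream and 5000 is the UDP port the sender targets. Whether VideoView will accept such a local .sdp file is exactly the open question, so a third-party player or library may still be needed.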

Android custom audio player to replace MediaPlayer with libPd or OpenSL ES or AudioTrack

I have already developed a streaming audio application using the MediaPlayer API. All the features work fine, except that playback takes too long to start (the buffering time is high).
I want to add recording of the live audio stream (saving the live stream data to disk, not recording from the mic). As MediaPlayer does not provide any API to access the raw data stream, I am planning to build a custom audio player.
I want to control the buffering time, have access to the raw audio stream, and be able to play all the audio formats that Android supports natively. Which API (libpd, OpenSL ES, or AudioTrack) is best suited for building a custom audio player on Android?
In my experience OpenSL ES would be the choice; there is a link that explains how to do audio streaming that you may find useful. bufferframes determines how many samples you collect before playing, so smaller bufferframes means faster response time, but you have to balance that against your device's processing capabilities.
You can record with libpd (pd-for-android) too; the whole recording process is managed by libpd.
Check the ScenePlayer project: it uses libpd and lets you record audio into a folder on the SD card:
https://github.com/libpd/pd-for-android/tree/master/ScenePlayer
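If plain AudioTrack turns out to be enough, a minimal streaming-mode sketch looks like the following (Java; it assumes a decoder such as MediaCodec is already producing 16-bit PCM, and pcmSource / fileOutput are hypothetical InputStream / FileOutputStream objects standing in for the decoded stream and the on-disk copy):

int sampleRate = 44100;
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);

// MODE_STREAM lets you push data as it arrives; the buffer size is your latency knob.
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
        minBuf * 2, AudioTrack.MODE_STREAM);
track.play();

byte[] chunk = new byte[minBuf];
int n;
while ((n = pcmSource.read(chunk)) > 0) {
    fileOutput.write(chunk, 0, n); // save the raw stream to disk at the same time
    track.write(chunk, 0, n);      // and play it through the speaker
}
track.stop();
track.release();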

Streaming an audio file in Android via RTP

I am looking for a way to stream a prerecorded MP3 or WAV file over the internet using SIP and RTP. The main stumbling block has been how to get a stream from a file and synchronize it, so that it can be delivered via RTP just like a stream from a microphone or video camera.
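One way to approach the synchronization part is to pull encoded frames out of the file with MediaExtractor and pace the sends by each frame's presentation timestamp, so packets leave at roughly the rate a live capture would produce them. A rough Java sketch (the file path is a placeholder, sendRtpPacket is a hypothetical helper that builds the RTP header per RFC 3550, and error handling is omitted):

void streamFileInRealTime() throws Exception {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource("/sdcard/audio.mp3"); // placeholder path
    extractor.selectTrack(0);

    ByteBuffer frame = ByteBuffer.allocate(8192);
    long wallStartNs = System.nanoTime();
    long firstPtsUs = -1;

    int size;
    while ((size = extractor.readSampleData(frame, 0)) >= 0) {
        long ptsUs = extractor.getSampleTime();
        if (firstPtsUs < 0) firstPtsUs = ptsUs;

        // Sleep until this frame's presentation time is reached on the wall clock,
        // so the file is streamed in real time instead of as fast as it can be read.
        long waitNs = wallStartNs + (ptsUs - firstPtsUs) * 1000 - System.nanoTime();
        if (waitNs > 0) Thread.sleep(waitNs / 1_000_000, (int) (waitNs % 1_000_000));

        sendRtpPacket(frame, size, ptsUs); // hypothetical: adds RTP header, sends over UDP
        extractor.advance();
    }
    extractor.release();
}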

Play audio RTMP stream on Android

Has anyone had any success playing back an audio RTMP stream on Android using http://code.google.com/p/android-rtmp-client, or does anyone know of any other non-Flash solutions? The example that comes with the android-rtmp-client source records the audio to a file, but I'm looking for example code that plays back over the speakers (or Bluetooth).
The question "The easiest way to play an audio RTMP stream in Android" has a partial discussion of what's needed.
