Streaming live microphone audio out from Android

I would like to stream the microphone and run an HTTP server on Android, so that
a user can just go to http://xxxx.xxx.xx.xxx/xxx.wav and listen to what I say.
How can I do so?

I would try to develop a small HTTP server which serves an FLV stream.
You can take ipcamera-for-android as an example. That app serves an FLV video stream, but you could reuse the server and the FLV encoder parts.
As FLV supports PCM streams, you can simply copy the microphone buffer into your stream.
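A minimal sketch of the capture side, assuming you reuse the HTTP server and FLV muxer from ipcamera-for-android (the writeFlvAudioTag hook below is hypothetical, and the app needs the RECORD_AUDIO permission):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

public class MicCapture {
    private static final int SAMPLE_RATE = 44100;
    private volatile boolean streaming = true;

    public void capture() {
        int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
        byte[] buffer = new byte[minBuf];
        recorder.startRecording();
        while (streaming) {
            int read = recorder.read(buffer, 0, buffer.length);
            if (read > 0) {
                // Hand the raw PCM to the FLV muxer, which wraps it in an
                // FLV audio tag and pushes it to connected HTTP clients.
                writeFlvAudioTag(buffer, read); // hypothetical muxer hook
            }
        }
        recorder.stop();
        recorder.release();
    }

    private void writeFlvAudioTag(byte[] pcm, int len) {
        // Omitted: FLV tag framing, borrowed from ipcamera-for-android.
    }
}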
Another variant:
You can encode the microphone stream using the built-in MediaRecorder.AudioEncoder.AAC. Afterwards, you can simply serve the AAC as a stream to your client.
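A rough sketch of this variant, assuming OutputFormat.AAC_ADTS (API 16+) so the output is a raw ADTS stream; the default MP4 container cannot be written to a non-seekable pipe, because MediaRecorder seeks back to patch the file header:

import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;
import java.io.InputStream;

public class AacMicStream {
    public InputStream start() throws Exception {
        // One end of the pipe is handed to MediaRecorder; the other end is
        // read by the HTTP server and relayed to each connected client.
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setOutputFile(pipe[1].getFileDescriptor()); // write end
        recorder.prepare();
        recorder.start();
        return new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
    }
}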

Related

Distinguish HLS content during stream

I want to distinguish the content type of data incoming from an HLS stream using Google's ExoPlayer.
Specifically, I want to distinguish between Video and Audio-only content.
Thanks!
According to the ExoPlayer documentation, the player does not currently detect the media format automatically:
ExoPlayer does not (yet) automatically detect the format of the media being played. An application needs to know the format of the media it wishes to play in order to construct an ExoPlayer capable of playing it. Removing this limitation is tracked by Issue #438.
The issue link mentioned above is: https://github.com/google/ExoPlayer/issues/438
You could look at the raw stream yourself, if you can access it (i.e. it is not encrypted), and figure out from there whether it is audio or video, but this will need a reasonable amount of work: HLS is a 'streaming protocol' that streams MPEG-2 transport streams, chunked into segments. Each of these MPEG-2 transport streams can contain both audio and video.
The MPEG standards (MPEG-1 and MPEG-2) use stream ids to identify each individual audio or video stream within the transport stream. Audio streams are numbered 110x xxxx (0xC0-0xDF) and video streams 1110 xxxx (0xE0-0xEF), so you can check all the individual streams in an MPEG-2 transport stream and see whether it is audio-only, video-only, or a mix.
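A rough sketch of such a check in plain Java (this is not an ExoPlayer API; it assumes you have downloaded one decrypted transport-stream segment into a byte array):

public class TsProbe {
    // Transport-stream packets are 188 bytes, starting with sync byte 0x47.
    // When payload_unit_start_indicator is set, the payload begins with a
    // PES start code (0x00 0x00 0x01) followed by the stream id.
    public static void probe(byte[] segment) {
        boolean hasAudio = false, hasVideo = false;
        for (int i = 0; i + 188 <= segment.length; i += 188) {
            if ((segment[i] & 0xFF) != 0x47) continue;      // sync byte
            boolean pusi = (segment[i + 1] & 0x40) != 0;    // payload unit start
            if (!pusi) continue;
            int afc = (segment[i + 3] >> 4) & 0x3;          // adaptation field control
            int payload = i + 4;
            if ((afc & 0x2) != 0) {                         // skip adaptation field
                payload += 1 + (segment[i + 4] & 0xFF);
            }
            if (payload + 3 >= segment.length) continue;
            if (segment[payload] == 0 && segment[payload + 1] == 0
                    && segment[payload + 2] == 1) {
                int streamId = segment[payload + 3] & 0xFF;
                if (streamId >= 0xC0 && streamId <= 0xDF) hasAudio = true; // 110x xxxx
                if (streamId >= 0xE0 && streamId <= 0xEF) hasVideo = true; // 1110 xxxx
            }
        }
        System.out.println("audio=" + hasAudio + " video=" + hasVideo);
    }
}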

is MP4 a streaming protocol or file format?

I am currently using Wowza to stream videos, and I am trying to integrate Wowza, Android, and a ChromeCast Device (CCD). According to this document, https://developers.google.com/cast/docs/media, Google Cast supports the "MP4 protocol".
So, my question is this: is MP4 a streaming protocol, file format, or both?
In the ChromeCast Android demo applications, they simply pass a URL such as http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4 as metadata to the CCD.
To me, this implies that no server is required to stream the MP4 file. Meaning, I won't even need Wowza as an intermediary party to stream.
Is this understanding correct?
It seems that the client player will then be responsible for interacting with the MP4 file directly (e.g. seek, pause, stop, play, etc.).
While you've already accepted an answer, and gotten your app to work (which was likely your ultimate goal), I thought it might be helpful to answer your question as well about what MP4 really is.
MP4 is a video container format; inside the MP4 container is video stream data (generally encoded in the H.264 format) and audio stream data (often encoded in the AAC format). The client player can interact with it directly because the Chromecast's browser has HTML5 video support for interpreting the MP4 container format and playing back the H.264 video and AAC audio. But it isn't "streaming" in the way that term is often used; it's just downloading the file from your web server in chunks and playing it back.
There's nothing wrong with this if it's performing as you'd like (in fact, this is one of the big benefits of HTML5 video: it doesn't need a streaming server backend), but if you actually want true media streaming (to leverage things such as adaptive bitrate switching, licensing, and so forth), you would have the MP4 file served via Wowza rather than via your web server.
If you simply have an MP4 file, just pass its URL and it should work fine, just like the sample (CastVideos) projects that we have on GitHub.
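For illustration, loading a plain MP4 URL looks roughly like this with the Cast Android Sender SDK (class names reflect the current SDK and may differ from the older demo code):

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaLoadRequestData;
import com.google.android.gms.cast.framework.media.RemoteMediaClient;

public class CastMp4 {
    public static void load(RemoteMediaClient client) {
        // The URL is the Google sample asset from the question; any directly
        // reachable MP4 works, with no streaming server in between.
        MediaInfo media = new MediaInfo.Builder(
                "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED) // progressive, not live
                .setContentType("video/mp4")
                .build();
        client.load(new MediaLoadRequestData.Builder().setMediaInfo(media).build());
    }
}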

Is it possible to capture/save JPEGs from an MJPEG stream (and also the MJPEG itself)?

I am using MJPEG-Streamer to stream an MJPEG video file over HTTP, similar to how this guy does it. MJPEG-Streamer basically creates a streaming server on the streaming device, which hosts the MJPEG video being streamed. The stream can be accessed via http://<Server-IP-address>:8080/?action=stream; an example of this is http://sigsiu.homeip.net/?action=stream. Snapshots can be captured with http://<Server-IP-address>:8080/?action=snapshot.
The thing I'm curious about is the format the stream is received in via /?action=stream. Is this a standard protocol for streaming multimedia, or is it something specific to MJPEG-Streamer?
Most importantly, would it be possible for me to extract the JPEGs / the MJPEG while it is streaming via /?action=stream?
I'm very new to streaming live video over the internet, so I have no idea how to tell whether MJPEG-Streamer streams in some standard way, where it would be easy to extract the JPEGs/MJPEG, or in some esoteric way, where it would be difficult.
Thank you!
Some extra info: I plan on receiving/capturing the MJPEG stream on an Android device.
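From what I can tell, MJPEG over HTTP is normally served as the standard multipart/x-mixed-replace format, where each part is a complete JPEG, so frames should be extractable by scanning for the JPEG start-of-image (0xFF 0xD8) and end-of-image (0xFF 0xD9) markers. A naive sketch of what I have in mind (the server address is hypothetical):

import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class MjpegFrameGrabber {
    public static void main(String[] args) throws Exception {
        InputStream in = new BufferedInputStream(
                new URL("http://192.168.0.10:8080/?action=stream").openStream());
        ByteArrayOutputStream frame = new ByteArrayOutputStream();
        int prev = -1, cur, count = 0;
        boolean inJpeg = false;
        while ((cur = in.read()) != -1) {
            if (!inJpeg && prev == 0xFF && cur == 0xD8) { // SOI: frame begins
                inJpeg = true;
                frame.reset();
                frame.write(0xFF);
            }
            if (inJpeg) {
                frame.write(cur);
                if (prev == 0xFF && cur == 0xD9) {        // EOI: frame complete
                    Files.write(Paths.get("frame" + (count++) + ".jpg"),
                            frame.toByteArray());
                    inJpeg = false;
                }
            }
            prev = cur;
        }
    }
}

On Android, the same loop could feed BitmapFactory.decodeByteArray instead of writing files, and concatenating the raw frames back to back is essentially the MJPEG itself.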

Streaming audio/video from Android

I am writing an Android app to stream audio/video to a Wowza server in RTSP interleaved mode, using the AAC and H.264 encoders. I created packetizers for both audio and video. The problem I am facing is that when I send both streams simultaneously, I lose the video stream; I only get audio on Wowza and in VLC. When I do not stream audio, video works just fine, which suggests that my packetizers and RTP streaming code perform as expected. It looks as if I cannot send video fast enough to sustain the stream.
Similarly architected code on iOS provides a stable video and audio feed.
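For reference, interleaved mode multiplexes both RTP streams over the single RTSP TCP connection, each packet framed as '$', a one-byte channel id, and a two-byte length (RFC 2326, section 10.12). Since both packetizers share one socket, I serialize the writes roughly like this (the class below is a simplified illustration, not my actual code):

import java.io.IOException;
import java.io.OutputStream;

public class InterleavedRtpWriter {
    private final OutputStream out; // the RTSP connection's output stream
    private final Object lock = new Object();

    public InterleavedRtpWriter(OutputStream out) {
        this.out = out;
    }

    // Writes one interleaved frame; audio and video threads share this
    // method, so the whole frame is written under a single lock to keep
    // the two streams from corrupting each other on the wire.
    public void send(int channel, byte[] rtpPacket) throws IOException {
        synchronized (lock) {
            out.write('$');
            out.write(channel);                     // e.g. 0 = video, 2 = audio
            out.write((rtpPacket.length >> 8) & 0xFF);
            out.write(rtpPacket.length & 0xFF);
            out.write(rtpPacket);
            out.flush();
        }
    }
}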
Thank you

Streaming an audio file in android via RTP

I am looking for a way to stream a prerecorded MP3 or WAV file over the internet using SIP and RTP. The main stumbling block so far has been how to get a stream from a file and pace it in real time, so that it can be delivered via RTP just like a stream from a microphone or video camera.
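The approach I have been considering: read the file in packet-sized chunks and pace the sends against the wall clock, so the RTP timestamps and real time stay in step. A minimal sketch, assuming 8 kHz mono 16-bit PCM, a dynamic payload type, and a placeholder destination (SIP/SDP negotiation, proper WAV parsing, and details such as SSRC and byte order are omitted):

import java.io.FileInputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class WavRtpSender {
    public static void main(String[] args) throws Exception {
        final int SAMPLES_PER_PACKET = 160;              // 20 ms at 8 kHz
        FileInputStream wav = new FileInputStream("audio.wav");
        wav.skip(44);                                    // skip canonical WAV header
        DatagramSocket socket = new DatagramSocket();
        InetAddress dest = InetAddress.getByName("192.0.2.1"); // hypothetical peer
        byte[] pcm = new byte[SAMPLES_PER_PACKET * 2];   // 16-bit samples
        int seq = 0;
        long timestamp = 0;
        long next = System.nanoTime();
        while (wav.read(pcm) > 0) {
            byte[] packet = new byte[12 + pcm.length];
            packet[0] = (byte) 0x80;                     // RTP version 2
            packet[1] = 96;                              // dynamic payload type (via SDP)
            packet[2] = (byte) (seq >> 8);
            packet[3] = (byte) seq;
            packet[4] = (byte) (timestamp >> 24);
            packet[5] = (byte) (timestamp >> 16);
            packet[6] = (byte) (timestamp >> 8);
            packet[7] = (byte) timestamp;
            // bytes 8-11: SSRC, left as zero in this sketch
            System.arraycopy(pcm, 0, packet, 12, pcm.length);
            socket.send(new DatagramPacket(packet, packet.length, dest, 5004));
            seq++;
            timestamp += SAMPLES_PER_PACKET;             // RTP clock = sample clock
            next += 20_000_000L;                         // next packet due in 20 ms
            long sleepNs = next - System.nanoTime();
            if (sleepNs > 0) {
                Thread.sleep(sleepNs / 1_000_000, (int) (sleepNs % 1_000_000));
            }
        }
        socket.close();
    }
}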
