Send Android webcam video live to a server - android

I am trying to program an Android app that will be able to open the webcam and upload the recording live to another server.
Right now I have only found solutions where Android provides the stream on its own port instead of sending it. So to clarify: I would like to send the data to the server (upload it).
I don't want to use a closed-source program, but rather program it myself. I have some intermediate Android programming knowledge, but I'm missing the theoretical knowledge of how to accomplish this.
Could anybody please point me in the right direction?
Is this even possible?
Regards
Edit:
Maybe some sort of RTP/RTSP setup would be possible. I don't care about compatibility across Android versions, so everything in that direction is welcome too.
Edit2:
Sorry for having been so unclear in the first place. I do have to implement it myself, but I can use existing code; what I can't do is use existing closed-source implementations.

Using MediaRecorder, you can capture video to a file. Here's a post about it:
Android: Does anyone know how to capture video?
To "stream" it to a server, you could record a never-ending series of short videos, say 10 s each, and upload the chunks to the server. If you wanted to get fancy, you could have the server stitch them together.

Install Bambuser. Ask them what intents are available to launch it. Done.
If you really need the video stored on your own server, maybe you could make some sort of arrangement with Bambuser.
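If you go this route, launching the app from yours is an ordinary intent launch. A minimal sketch, assuming the app exposes a launchable activity; the package name below is a guess, so confirm the actual package and intents with Bambuser:

```java
import android.content.Context;
import android.content.Intent;
import android.net.Uri;

public class BambuserLauncher {
    // Hypothetical package name; verify with the vendor before relying on it.
    static final String BAMBUSER_PACKAGE = "com.bambuser.broadcaster";

    static void launchBroadcaster(Context context) {
        Intent intent = context.getPackageManager()
                .getLaunchIntentForPackage(BAMBUSER_PACKAGE);
        if (intent != null) {
            context.startActivity(intent);
        } else {
            // Not installed: send the user to the Market listing instead.
            context.startActivity(new Intent(Intent.ACTION_VIEW,
                    Uri.parse("market://details?id=" + BAMBUSER_PACKAGE)));
        }
    }
}
```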

Related

Best way to make a video merging mobile app

I am looking to make a mobile app that will allow users to take X number of videos and combine them into a single video. Users will also be able to choose what to put between each recording, plus background music.
I have more experience with Xamarin/C# than with native Java/Obj-C, but the only method I have found online that might accomplish this is going native with FFMPEG. Is this the case? Will FFMPEG even work for this? Is there a way to use Xamarin to accomplish what I need to do?
Thanks
Have a look at the AVMutableComposition and its related classes.
There's an example here, about halfway down the page: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios
It looks like it's covered by Xamarin: http://iosapi.xamarin.com/index.aspx?link=T%3AMonoTouch.AVFoundation.AVMutableComposition

Display two videos together then output as a merged video on a single screen

This question may sound a little complex or ambiguous, but I'll try to make it as clear as I can. I have done lots of Googling and spent lots of time, but didn't find anything relevant for Windows.
I want to play two videos on a single screen. One as full screen in background and one on top of it in a small window or small width/height in the right corner. Then I want an output which consists of both videos playing together on a single screen.
So basically one video overlays another and then I want that streamed as output so the user can play that stream later.
I am not asking you to write the whole code, just tell me what to do, how to do it, or which tool or third-party SDK I have to use to make it happen.
Update:
I've tried a lot of solutions:
1. Xuggler - doesn't support Android.
2. JavaCV or JJMPEG - couldn't find any tutorial that shows how to do this.
Now I'm looking at FFMPEG - I searched for a long time, but couldn't find any tutorial that shows how to do it in code; I only found the command-line way of doing it.
So can anyone point me to an FFMPEG tutorial, or suggest any other way to do this?
I would start with JavaCV. It's quite good and flexible, and it should allow you to grab frames, composite them, and write them back to a file. Use the FFmpegFrameGrabber and FFmpegFrameRecorder classes; the composition can be done manually (see the sketch after this list).
The rest of the answer depends on a few things:
do you want to read from a file/mem/url?
do you want to save to a file/mem/url?
do you need realtime processing?
do you need something more than simple picture-in-picture?
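To make the simple picture-in-picture case concrete, here is a minimal sketch using JavaCV's FFmpegFrameGrabber/FFmpegFrameRecorder. The file names are placeholders; audio and frame-rate syncing are ignored, and depending on your JavaCV version you may need to set the output codec explicitly:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.Java2DFrameConverter;

public class PictureInPicture {
    public static void main(String[] args) throws Exception {
        FFmpegFrameGrabber bg = new FFmpegFrameGrabber("background.mp4"); // placeholder
        FFmpegFrameGrabber pip = new FFmpegFrameGrabber("overlay.mp4");   // placeholder
        bg.start();
        pip.start();

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
                "merged.mp4", bg.getImageWidth(), bg.getImageHeight());
        recorder.setFormat("mp4");
        recorder.setFrameRate(bg.getFrameRate());
        recorder.start();

        // Java2DFrameConverter reuses its buffers, so use one per stream.
        Java2DFrameConverter bgConv = new Java2DFrameConverter();
        Java2DFrameConverter pipConv = new Java2DFrameConverter();

        Frame bgFrame, pipFrame;
        while ((bgFrame = bg.grabImage()) != null
                && (pipFrame = pip.grabImage()) != null) {
            BufferedImage canvas = bgConv.convert(bgFrame);
            BufferedImage small = pipConv.convert(pipFrame);
            // Draw the second video at quarter size in the top-right corner.
            Graphics2D g = canvas.createGraphics();
            int w = canvas.getWidth() / 4, h = canvas.getHeight() / 4;
            g.drawImage(small, canvas.getWidth() - w - 16, 16, w, h, null);
            g.dispose();
            recorder.record(bgConv.convert(canvas));
        }
        recorder.stop();
        pip.stop();
        bg.stop();
    }
}
```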
You could use OpenGL to do the trick. Please note, however, that you will need two render steps: one rendering the first video into an FBO, and a second rendering the second video on top of it, using the FBO as TEXTURE0 and the second video as an EXTERNAL_TEXTURE.
Blending, and all the other stuff you want, would be done by OpenGL.
You can check the source code here: Using SurfaceTexture in Android, and some important information here: Android OpenGL combination of SurfaceTexture (external image) and ordinary texture
The only thing I'm not sure about is what happens when two instances of MediaPlayer run in parallel. I guess it should not be a problem.
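As a rough illustration of the second render step, here is a hypothetical fragment shader held as a Java string. It assumes the first video was already rendered into a regular FBO texture (uBackground) and the second video's SurfaceTexture is bound as an external texture (uOverlay); the corner coordinates are arbitrary:

```java
// Sketch only: uniform/attribute names and the 0.7/0.3 corner region are
// assumptions, not from any particular codebase.
static final String PIP_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n"
        + "precision mediump float;\n"
        + "varying vec2 vTexCoord;\n"
        + "uniform sampler2D uBackground;\n"       // TEXTURE0: FBO with video 1
        + "uniform samplerExternalOES uOverlay;\n" // external texture with video 2
        + "void main() {\n"
        + "  if (vTexCoord.x > 0.7 && vTexCoord.y > 0.7) {\n"
        + "    vec2 pipCoord = (vTexCoord - vec2(0.7)) / 0.3;\n" // remap corner to 0..1
        + "    gl_FragColor = texture2D(uOverlay, pipCoord);\n"
        + "  } else {\n"
        + "    gl_FragColor = texture2D(uBackground, vTexCoord);\n"
        + "  }\n"
        + "}\n";
```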
ffmpeg is a very active project, with lots of changes and releases all the time.
You should look at the Xuggler project; it provides a Java API for what you want to do, and it has tight integration with ffmpeg.
http://www.xuggle.com/xuggler/
Should you choose to go down the Runtime.exec() path, this Red5 thread should be useful:
http://www.nabble.com/java-call-ffmpeg-ts15886850.html
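If you do take the Runtime.exec() route, the compositing itself can be left to ffmpeg's overlay filter. A minimal sketch, with placeholder file names; it assumes an ffmpeg binary is available on the PATH, which on Android means bundling and extracting one yourself:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class FfmpegOverlay {
    public static void main(String[] args) throws Exception {
        // The overlay filter pins the second input to the top-right corner,
        // 10 px from the edges.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-i", "background.mp4", "-i", "overlay.mp4",
                "-filter_complex", "overlay=W-w-10:10",
                "merged.mp4");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        // Drain ffmpeg's console output so the process doesn't block on a full pipe.
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.out.println("ffmpeg exited with code " + p.waitFor());
    }
}
```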

Accessing Metadata from currently playing audio. (Builtin or External App)

I am trying to develop an app/widget that needs to display the currently playing information (metadata) of an audio track.
This would be trivial if I were also writing the media player myself, as I could simply access the MediaStore and bring up the info. However, I do not wish to compete with the plethora of existing apps on this front; I want to be able to pull this information from the built-in audio player or other apps such as Songbird or PowerAMP.
I should be able to do this with PowerAMP using their [API][1], but I really want a solution that works for the stock Android player and others too.
I was hoping to be able to grab the information from the AudioManager, but that seems only to allow me to query the current state (music is playing, etc.) and set my intent to play music; there is no access to metadata from someone else's app.
So my thought is that this cannot be done easily. My thoughts are that I could maybe access this info from the notification bar at the top, as the now-playing info is printed up there. It might be an ugly hack, though...
For a moment I got excited about RemoteControlClient.MetadataEditor from 4.0, but then I figured out that it is for writing that information to a stream that can be sent to a physical remote, rather than allowing you to create a software remote. Damn!
Does anyone have any ideas?
[1]: http://forum.powerampapp.com/index.php?/topic/1034-updated-for-20-poweramp-api-lib-and-sample-applications/
I've written a guide for implementing this.
Basically, you need access to hidden classes of the android.jar library. Then you have to extend the IRemoteControlDisplay$Stub class and implement its methods.
After that, you register your RemoteControlDisplay with a hidden method: AudioManager#registerRemoteControlDisplay.
There is just way too much to explain in one answer, so read my guide on XDA-Developers.
Here is the link:
http://forum.xda-developers.com/showpost.php?p=44513199
Also, I'm currently working on a library which will simplify the process of implementing your own remote media controls.
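To give a flavor of the registration step, here is a rough sketch that looks up the hidden types reflectively so it compiles against the public SDK. These are internal APIs that can change between releases, and the real work (extending IRemoteControlDisplay.Stub and implementing its callbacks against a modified android.jar) is what the guide covers:

```java
import android.media.AudioManager;
import java.lang.reflect.Method;

// Sketch only: IRemoteControlDisplay and registerRemoteControlDisplay are
// hidden/internal APIs, not part of the public SDK.
public class RemoteDisplayRegistrar {
    static void register(AudioManager am, Object displayStub) throws Exception {
        // displayStub must implement the hidden IRemoteControlDisplay interface.
        Class<?> displayInterface =
                Class.forName("android.media.IRemoteControlDisplay");
        Method m = AudioManager.class.getMethod(
                "registerRemoteControlDisplay", displayInterface);
        m.invoke(am, displayStub);
    }
}
```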
I should be able to do this with PowerAMP using their [API][1], but I really want a solution that works for the stock Android player and others too.
There is no documented and supported API for the AOSP Music app or the Google Play Music app, AFAIK. They certainly are not in the Android SDK.
I am not aware of an Android ecosystem standard for media players exposing this information, let alone a roster of apps that support such a standard. You are welcome to work with the developers of such apps and encourage them to create and adopt a standard.
My thoughts are that I could maybe access this info from the notification bar at the top, as the now-playing info is printed up there.
It is not possible to spy on other applications' Notifications, for obvious privacy and security reasons.
For a moment I got excited about RemoteControlClient.MetadataEditor from 4.0, but then I figured out that it is for writing that information to a stream that can be sent to a physical remote, rather than allowing you to create a software remote. Damn!
Surely there's a way to access the RemoteControlClient metadata on Android 4.0, because the lock screen is able to access it when media is playing.
I'm not a developer at all, but I've done a bit of poking around in the AOKP sources, and this is my limited understanding of how it works. At least in AOKP (and presumably AOSP as well), it appears that the lockscreen uses core/java/com/android/internal/widget/TransportControlView.java to draw the music control widget, which in turn uses media/java/android/media/IRemoteControlDisplay.aidl for data retrieval. At the very least, it may be useful to poke around in TransportControlView.java to see if you can figure out how the lockscreen widget works.

Develop client-server app for android ... where do i start?

First, let me say that I know nothing. I am reasonably intelligent and I can learn, but what I need to know is what exactly it is that I need to learn. Consider me a hobbyist who just got started. I have a degree in math, so logic makes sense to me, but it was all abstract math, so I never even used Matlab once in school.
I want to develop an application for Android. I want this application to take input (text and camera images) from the user and store that input along with certain metadata (i.e., time of input, geolocation of the device at input time). I also want that data to be transferred to a server (I have complete admin access to a server, but haven't learned much about it yet either) and stored there in a manner that can be accessed by a desktop or web application, which I will also need to develop. The Android device may not always have an internet connection at the time of input (but will be taken to a Wi-Fi hotspot for uploading when completed), and after uploading the data to the server I will have no further need for the data on the device.
I have done a bit of research, and discovered the following gaps in my knowledge, and remedied them in the following ways:
a) I'm going to need to know how to program for Android - I have worked through the tutorials at developer.android.com, purchased and partly read this book(1), and just purchased this book(2).
b) I believe I am going to need to know something about JSON - I have just purchased this book(3), after reading a little bit about JSON on the web.
c) I will need to learn what I need to do with the server to prepare it for the data - No idea where to start.
d) I will need to decide how to access the data, and learn how to develop whatever that turns out to be - No idea where to start.
I am not able to post multiple links, so I have moved all the books down here ...
1 google.com/search?q=isbn+9780321741233
2 google.com/search?q=isbn+9780321749673
3 google.com/search?q=isbn+9780470526910
My question(s):
If this was your project, how would you go about doing this? What languages will I have to learn? Can you recommend any books, online tutorials, etc. for each of those languages in the way that they would apply to my project?
Thank you for taking the time to read my query, and thank you for any help you may provide.
Book: Pro Android 2. I have it, I've read it, and it shows you how to do just about everything. I bought it for the sole purpose of developing a client/server application in Android, and I completed the app in two weeks.
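For the upload step (your points c and d), a plain HTTP POST of a JSON body is the simplest starting point. A minimal client-side sketch with a hypothetical endpoint URL; a real app would build the JSON with JSONObject to get proper escaping, and the server side could be anything (PHP, a Java servlet, etc.) that parses the POST and writes it to a database:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Locale;

public class Uploader {
    // Hypothetical endpoint; replace with your own server's URL.
    private static final String ENDPOINT = "https://example.com/api/entries";

    public static int uploadEntry(String text, double lat, double lon, long timestamp)
            throws Exception {
        // Naive JSON construction; use JSONObject in production code.
        String json = String.format(Locale.US,
                "{\"text\":\"%s\",\"lat\":%f,\"lon\":%f,\"time\":%d}",
                text, lat, lon, timestamp);
        HttpURLConnection conn = (HttpURLConnection) new URL(ENDPOINT).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode(); // e.g. 200/201 on success
    }
}
```

Queue entries in a local SQLite database while offline, then drain the queue with calls like this once the device reaches a Wi-Fi hotspot.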

Is there something in the Android architecture or API that prevents people from creating MP3 players that read embedded lyrics?

When I play certain MP3 files (such as lessons from JapanesePod101.com) on my iPod Touch, lyrics or transcripts that are embedded in the MP3 files are displayed in the media player. The lyrics are, I believe, stored as ID3/ID4 tags in the MP3 metadata.
I find this to be an extremely useful feature, and I believe I'm not alone. Despite that, neither the stock Android media player nor any other media player I've downloaded from the Market seems to support this. I just have not been able to find any way to get this feature on my Nexus One.
This feature is important enough to me that I'm considering learning Android development just so I can write a simple media player that displays embedded lyrics or notes. However, the fact that nobody else seems to have done this makes me wonder - is it even possible? Is there something in the Android architecture or APIs that make it difficult or impossible to read and display lyrics information from MP3 files? I'd hate to get deep into the learning process and find out what I'm aiming for can't easily be done. (I mean, if all else fails I assume I could write my own MP3-decoder, but that's more trouble than I'm willing to go through right now).
I've already asked this question on the Android Enthusiasts Stack Exchange Beta Site, but in retrospect I decided it was more of a programming question and decided it was better to ask here.
Yeah, definitely more of a programming question. Just from my brief experience of reading through the ID3 spec, I think it's probably just that decoding ID3 tags is a complete PITA. I'm sure it can be done, as there are MP3 tag editing apps available for Android (whether any support lyrics or not, I do not know).
ID3v2.3 seems to have support for both synchronized and unsynchronized lyrics through the SYLT and USLT frames. I imagine it's just such an underused feature that it isn't worth the effort to most developers. Purchased MP3s don't carry this information (I've always wondered why not?), so it would have to be added manually (or automatically via a lyric-service API, but there's a lot more coding involved with that).
Here is the ID3v2.3 spec if you'd like to look into it further...(abandon hope all ye who enter here)
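To show it's a PITA but a tractable one, here is a minimal sketch of pulling the USLT frame out of an ID3v2.3 tag. It ignores extended headers, compression, unsynchronization, v2.4 differences, and non-Latin-1 text encodings, so treat it as a starting point rather than a full parser:

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class UsltReader {
    public static String readLyrics(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            byte[] header = new byte[10];
            in.readFully(header);
            if (header[0] != 'I' || header[1] != 'D' || header[2] != '3') {
                return null; // no ID3v2 tag
            }
            // Tag size is a 28-bit "synchsafe" integer (7 bits per byte).
            int tagSize = ((header[6] & 0x7F) << 21) | ((header[7] & 0x7F) << 14)
                        | ((header[8] & 0x7F) << 7)  |  (header[9] & 0x7F);
            int read = 0;
            while (read < tagSize) {
                byte[] fh = new byte[10];
                in.readFully(fh);
                read += 10;
                String id = new String(fh, 0, 4, StandardCharsets.ISO_8859_1);
                // In v2.3 the frame size is a plain 32-bit big-endian integer.
                int size = ((fh[4] & 0xFF) << 24) | ((fh[5] & 0xFF) << 16)
                         | ((fh[6] & 0xFF) << 8)  |  (fh[7] & 0xFF);
                if (size <= 0) break; // reached the padding
                byte[] body = new byte[size];
                in.readFully(body);
                read += size;
                if ("USLT".equals(id)) {
                    // body = encoding (1) + language (3)
                    //      + null-terminated descriptor + lyrics text
                    int ofs = 4;
                    while (ofs < body.length && body[ofs] != 0) ofs++;
                    ofs++; // skip terminator (assumes encoding byte 0 = Latin-1)
                    return new String(body, ofs, body.length - ofs,
                            StandardCharsets.ISO_8859_1);
                }
            }
        }
        return null; // no USLT frame found
    }
}
```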
The problem may be that most people would use the built-in MP3 playback mechanisms, which may neither support lyric display nor be very easy to keep synchronized with something else doing lyric display.
So it may be that something needs to be written which does its own MP3 decoding.
Most likely this would want to be done in native code. On the other hand, on Android, audio output (and, unless you use OpenGL, video display) pretty much has to be done from Java. So you are looking at a fair amount of work to decode data with a native library and then dispatch it for playback and display from Java.
So, to answer your question: is it possible? Definitely.
Is it made easy by the Android APIs? Not really.
I just added a new feature request that would give Android support for reading USLT frames in the ID3 tag. This would enable the native and third-party music players to display lyrics. If you want this feature, please star the request below and post your comments.
http://code.google.com/p/android/issues/detail?id=32547
