I am looking to make a mobile app that will allow users to take X number of videos and combine them into a single video. Users will also be able to choose what to put in between each recording, as well as background music.
I have more experience with Xamarin/C# than with native Java/Obj-C, but the only method I have found online that might accomplish this is using FFmpeg natively. Is this the case? Will FFmpeg even work for this? Is there a way to use Xamarin to accomplish what I need to do?
Thanks
Have a look at the AVMutableComposition and its related classes.
There's an example here, about halfway down the page: http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios
It looks like it's covered by Xamarin: http://iosapi.xamarin.com/index.aspx?link=T%3AMonoTouch.AVFoundation.AVMutableComposition
I'm currently fiddling around with an idea, and therefore I'm looking for a potential way to access the audio stream on a TV (regardless of whether it's a SmartTV, Android, webOS, ...), run some audio filters on it, and then output it.
I've briefly read through some of the APIs, but it seems I'm only able to control the volume, which is not really what I want. Am I missing something, or is this not possible at the moment?
For webOS at the moment you can only:
setMuted
volumeDown
volumeUp
The audio formats that are supported at the moment are listed in the documentation referenced below.
So you cannot change the audio from the native side; the only way I can imagine this working is to use a JS library (or something similar that is supported), apply the changes there, and play the result back to users.
Reference:
http://developer.lge.com/webOSTV/api/webos-service-api/audio/
This question may sound a little complex or ambiguous, but I'll try to make it as clear as I can. I have done lots of Googling and spent lots of time, but didn't find anything relevant for Windows.
I want to play two videos on a single screen. One as full screen in background and one on top of it in a small window or small width/height in the right corner. Then I want an output which consists of both videos playing together on a single screen.
So basically one video overlays another and then I want that streamed as output so the user can play that stream later.
I am not asking you to write the whole code, just tell me what to do, how to do it, or which tool or third-party SDK I have to use to make it happen.
Update:
Tried a lot of solutions:
1. Xuggler: doesn't support Android.
2. JavaCV or JJMPEG: not able to find any tutorial that shows how to do this.
Now looking at FFmpeg: searched for a long time, but not able to find any tutorial that shows how to do it in code; I only found the command-line way to do it.
So can anyone suggest or point me to an FFmpeg tutorial, or tell me any other way to do this?
I would start with JavaCV. It's quite good and flexible, and it should allow you to grab frames, composite them, and write them back to a file. Use the FFmpegFrameGrabber and FFmpegFrameRecorder classes; the composition can be done manually, as in the sketch below.
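Here's a minimal sketch of that pipeline, assuming JavaCV is on the classpath; the file names and the overlay size/position are placeholders, and audio handling is omitted:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.Java2DFrameConverter;

public class PictureInPicture {
    public static void main(String[] args) throws Exception {
        // Placeholder file names; substitute your own inputs.
        FFmpegFrameGrabber main = new FFmpegFrameGrabber("main.mp4");
        FFmpegFrameGrabber small = new FFmpegFrameGrabber("overlay.mp4");
        main.start();
        small.start();

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
                "out.mp4", main.getImageWidth(), main.getImageHeight());
        recorder.setFormat("mp4");
        recorder.setFrameRate(main.getFrameRate());
        recorder.start();

        // Two converters, because Java2DFrameConverter reuses its internal buffer.
        Java2DFrameConverter bgConv = new Java2DFrameConverter();
        Java2DFrameConverter fgConv = new Java2DFrameConverter();

        Frame bg, fg;
        while ((bg = main.grabImage()) != null && (fg = small.grabImage()) != null) {
            BufferedImage canvas = bgConv.convert(bg);
            BufferedImage pip = fgConv.convert(fg);
            // Manual composition: draw the overlay at 1/3 size in the bottom-right corner.
            int w = canvas.getWidth() / 3, h = canvas.getHeight() / 3;
            Graphics2D g = canvas.createGraphics();
            g.drawImage(pip, canvas.getWidth() - w - 16, canvas.getHeight() - h - 16, w, h, null);
            g.dispose();
            recorder.record(bgConv.convert(canvas));
        }

        recorder.stop();
        small.stop();
        main.stop();
    }
}
```

Note that grabImage() skips audio packets, so you'd need a second pass (or the ffmpeg CLI) to carry a soundtrack over.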
The rest of the answer depends on a few things:
do you want to read from a file/mem/url?
do you want to save to a file/mem/url?
do you need realtime processing?
do you need something more than simple picture-in-picture?
You could use OpenGL to do the trick. Please note, however, that you will need two render passes: one rendering the first video into an FBO, and a second rendering the second video on top, using the FBO as TEXTURE0 and the second video's SurfaceTexture as an EXTERNAL_TEXTURE.
Blending, and all the other effects you want, would be done by OpenGL.
You can check the source code here: Using SurfaceTexture in Android, and some important information here: Android OpenGL combination of SurfaceTexture (external image) and ordinary texture.
The only thing I'm not sure about is what happens when two instances of MediaPlayer run in parallel. I guess it should not be a problem.
ffmpeg is a very active project, with lots of changes and releases all the time.
You should look at the Xuggler project; it provides a Java API for what you want to do, and it has tight integration with ffmpeg.
http://www.xuggle.com/xuggler/
Should you choose to go down the Runtime.exec() path, this Red5 thread should be useful:
http://www.nabble.com/java-call-ffmpeg-ts15886850.html
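As a sketch of that path, here's one way to shell out to the ffmpeg CLI with its scale and overlay filters; the input/output file names are placeholders, and ffmpeg is assumed to be on the PATH:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class FfmpegOverlay {
    public static void main(String[] args) throws Exception {
        // Scale the second input to a third of its size and overlay it
        // in the bottom-right corner of the first input.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y",
                "-i", "main.mp4",
                "-i", "overlay.mp4",
                "-filter_complex",
                "[1:v]scale=iw/3:ih/3[pip];[0:v][pip]overlay=W-w-16:H-h-16",
                "out.mp4");
        pb.redirectErrorStream(true); // ffmpeg logs to stderr; merge it with stdout

        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line); // drain the output so the process doesn't block
            }
        }
        System.out.println("ffmpeg exited with code " + p.waitFor());
    }
}
```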
In my application, the customer wants me to embed a tutorial to help users learn it. This tutorial would be a screen with animations showing how to perform tasks.
Before implementing this, I want to know if there is any framework already out there that I can easily use.
Thanks
You could embed a video file using a VideoView.
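For example, a minimal sketch; the res/raw/tutorial.mp4 resource name is a placeholder for your bundled clip:

```java
import android.app.Activity;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class TutorialActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        VideoView videoView = new VideoView(this);
        setContentView(videoView);

        // Play a clip bundled as res/raw/tutorial.mp4 (hypothetical file name).
        Uri uri = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.tutorial);
        videoView.setVideoURI(uri);
        videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.setLooping(true); // loop the tutorial clip
            }
        });
        videoView.start();
    }
}
```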
No. You'll have to do this manually. Designing a general framework or library to do this would be very hard since the use cases will wildly vary from app to app.
I am trying to program an Android app that will be able to open the camera and upload the recording live to another server.
Right now I have only found solutions where Android provides the stream on one of its ports, instead of sending it. So to clarify, I would like to send the data to the server (upload it).
I don't want to use a closed-source program, but rather program it myself. I have some intermediate Android programming knowledge, but I am missing the theoretical knowledge of how to accomplish this.
Could anybody please point me in the right direction?
Is this even possible?
Regards
Edit:
Maybe some sort of RTP/RTSP setup would be possible. I do not care about compatibility across Android versions, so everything in that direction is welcome too.
Edit2:
Sorry to have been so unclear in the first place. I do have to implement it myself, but I can use existing code. What I cannot do is use closed-source implementations.
Using MediaRecorder, you can capture video to a file. Here's a post about it:
Android: Does anyone know how to capture video?
to "stream" it to a server, you could recorder a (never ending) series of short videos, say 10s each, and upload the chunks to the server. if you wanted to get fancy, you could have the server stitch them together.
Install Bambuser. Ask them what intents are available to launch it. Done.
If you really need the video stored on your own server, maybe you could make some sort of arrangement with Bambuser.
I understand that Flash is quite new to Android. Has anyone actually used Flash in an Android app? How did you do it, and what do you recommend? I'm trying to embed a Flash animation into an app, if that helps any.
You can always convert the Flash file into a video or GIF, and then place it into your app.
If you don't necessarily need to use Flash, other software, or animating with code in your app, might be a good idea too.
This link might be useful: http://www.ehow.com/way_6175136_android-animation-tutorial.html