We are developing plugins that take a media file as the source and convert it into another format that can be used by DLNA-compatible devices on Android.
Any help on how to deal with media file formats and how to convert them into a DLNA-compatible format would be appreciated.
Thank you
There are several containers and codecs supported by DLNA, most of which can be created with FFmpeg.
Specs: http://www.dlna.org/industry/why_dlna/key_components/media_format/
Now do you plan to do the conversion on the device or will a server be involved? There are several servers on this approved / supported list: http://www.rbgrn.net/content/21-how-to-choose-dlna-media-server-windows-mac-os-x-or-linux
If you plan to do conversion on the device, there are ports of FFmpeg that run on Android but your mileage will vary of course.
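If you go the on-device route, the command line you feed to the FFmpeg port is the main thing to get right. A minimal sketch of assembling such a command in Java follows; the H.264-video/AAC-audio-in-MPEG-TS combination is one commonly DLNA-friendly target, but the file names and bitrate here are placeholder assumptions, and you should check your renderer's supported profiles against the DLNA spec linked above:

```java
import java.util.Arrays;
import java.util.List;

public class DlnaTranscode {
    // Builds an ffmpeg command line for a commonly DLNA-compatible target:
    // H.264 video + AAC audio in an MPEG-TS container. Input/output names
    // are placeholders; the exact profile depends on your target device.
    static List<String> buildArgs(String input, String output) {
        return Arrays.asList(
            "ffmpeg",
            "-i", input,
            "-c:v", "libx264",   // H.264 video
            "-profile:v", "main",
            "-c:a", "aac",       // AAC audio
            "-b:a", "192k",
            "-f", "mpegts",      // MPEG transport stream container
            output
        );
    }

    public static void main(String[] args) {
        List<String> cmd = buildArgs("input.avi", "output.ts");
        // On Android you would hand this list to a bundled ffmpeg binary,
        // e.g. via new ProcessBuilder(cmd).start() -- not executed here.
        System.out.println(String.join(" ", cmd));
    }
}
```

Building the argument list separately from executing it also makes it easy to adjust the container/codec pair per device without touching the process-launching code.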
You may also be interested in using the Kik API for sharing rich content and media with other mobile app users. You can build your app on top of Kik Messenger and use Kik's own transport, infrastructure and user base to share content from your mobile app. The API is available for Android and iPhone, and in simpler scenarios it takes only about 5 lines of code to integrate. There is more info on the Kik API website: http://www.kik.com/dev and http://apiblog.kik.com
Disclaimer: I'm one of the developers behind Kik API :)
Related
I am new to mobile programming, and I am making an application which will play video files. I am trying Azure Media Services (AMS) with Xamarin.Forms. I would like to know how it works to have a file that will play with Widevine on Android and FairPlay on iOS.
I need explanations!
Thanks in advance!
The implementation on the client side for those different DRMs is about as different as you can imagine. Xamarin.Forms does not provide an out-of-the-box component that can handle the DRM-specific methods. You will likely need to create platform-specific plugins for that or find a ready-made component, although in my search I haven't found one. The closest is a cross-platform video player such as https://github.com/adamfisher/Xamarin.Forms.VideoPlayer
Also, you are not likely to get a single file to play; it will likely be a stream with different manifests for the two platforms. Azure Media Services, however, can generate these on the fly from a set of MP4s. So it might look like MP4s, but what is actually served to the client is a video manifest file.
There is no library/component in Xamarin that lets you handle multiple DRM schemes. Your best option is to wrap the platform libraries in order to consume DRM content. I would check out the Inside Secure DRM solution, which allows playback of the DRM schemes that you listed.
I'm currently planning to build an application using the Ionic framework, which wraps AngularJS and Cordova.
The app, once downloaded, must be able to play its own audio files that were bundled with the application, without streaming them from the internet.
Can anyone give guidance as to whether the phone's 'hard drive' can be accessed to store audio files on? Or, if not, whether the $localstorage facility would be suitable for storing audio files of up to perhaps 50 MB?
In the API docs you can find the info for it.
I haven't used Cordova yet, but that should work.
localStorage in the browser is limited to 2.5 MB, 5 MB, 10 MB or unlimited depending on the browser. I am not sure how it is handled in a native WebView, but probably with similar limits.
In this blog post there is a text-reading demo that could also help.
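A quick back-of-the-envelope check makes the limits above concrete: localStorage holds strings, so binary audio would typically be base64-encoded first, which inflates it by roughly a third. A small sketch of the arithmetic (the 10 MB quota is an assumption on the generous end of the figures above):

```java
public class QuotaCheck {
    // base64 encodes every 3 bytes as 4 characters, so size grows by ~4/3
    static double base64SizeMb(double rawMb) {
        return rawMb * 4.0 / 3.0;
    }

    public static void main(String[] args) {
        double audioMb = 50.0;   // audio bundle size from the question
        double quotaMb = 10.0;   // a generous localStorage quota
        System.out.printf("%.0f MB of audio -> ~%.1f MB as base64 (quota: %.0f MB)%n",
                audioMb, base64SizeMb(audioMb), quotaMb);
    }
}
```

So 50 MB of audio lands around 67 MB of string data, far beyond any of the quotas mentioned, which is why the device filesystem (via Cordova's file plugin) is the more realistic home for the bundled audio.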
I'm just learning mobile web development and thinking about a task:
Is there a way to make a video stream between iOS, Android, and a browser? What architecture and technology should it use? I already read this question on SO, Peer-to-Peer video from iOS to Android?, but there is nothing about browsers.
If it can't be P2P and cross-platform at the same time, I thought I should use a Red5 server or similar, or XMPP.
So I'm asking for your advice and opinion here. Any information would be valuable.
Yes, you can!
There is a new technology backed by Google: WebRTC.
It stands for "Web Real-Time Communication" and is an open-source project funded by Google.
It also supports Android/iPhone native applications.
I am working with it and have had about 60% success.
Video clarity is good, but audio is choppy.
You can find the source code here.
Discussion with the community is here.
You can see a live demo here.
NOTE:
It is an ongoing project and is not stable yet. The Google team is working on it. Currently it works on the latest Chrome, Firefox, and Opera. IE does not have support yet.
Yes, the open-source solution should be the WebRTC technology; please check it out on the official website: webrtc.org
I need to build a photo sharing application that is web based and also has mobile applications for Android, Windows Phone and iOS (I don't need to build all these apps, but they should be supported seamlessly). The photo sharing project will be my assignment in order to learn the various aspects involved in the photo sharing process (efficient storage, reliability and UI presentation will be my key aspects).
I was planning to use Windows Azure, IIS, SQL Server and maybe some Silverlight? Is this the most cost-effective and recommended way for me to start?
It would also help if anyone could share the current trending technologies used by some of the popular photo sharing sites.
Thank You
Ok, so if you want your application to support WP, iOS, Android, ... you'll have to look at a technology that allows you to build an API that can easily be used by any of these platforms. In that case you might want to look at the new ASP.NET Web API, which allows you to build an API that supports content negotiation.
Besides that, you'll also want to look at how you can store data in Windows Azure using Blob Storage. This allows you to give all your files a URI and have Blob Storage serve these files to your users instead of your application having to do it. This will drastically decrease the load on your front end. You can even combine this with the CDN to make your images available on multiple edge servers and serve the content from a location closest to your users.
Since you're working with images, I'm assuming you'll be processing them: creating thumbnails, applying filters, ... These tasks can use lots of resources, and you wouldn't want to put all that load on your front end (Web Role / Web Site). That's why it's a common practice to off-load this work to your back end (Worker Role) using queues. Here is a complete example from the training kit that creates thumbnails using a Worker Role and a queue: http://msdn.microsoft.com/en-us/vs2010trainingcourse_introtowindowsazurelabvs2010_topic3.aspx
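The queue-based off-loading pattern itself is simple and worth seeing in isolation. A minimal in-process sketch follows; note this deliberately uses a plain Java BlockingQueue to show the shape of the pattern, not the actual Azure Queue/Worker Role APIs, and the blob names are made up:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class ThumbnailWorker {
    // Back-end worker: drains the queue and "processes" each image.
    // Real code would download the blob, resize it, and upload the
    // thumbnail; here the processing step is just a name rewrite.
    static List<String> drain(BlockingQueue<String> queue) {
        List<String> done = new ArrayList<>();
        String blobName;
        while ((blobName = queue.poll()) != null) {
            done.add("thumbs/" + blobName);
        }
        return done;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        // Front end (Web Role): enqueue work instead of resizing inline,
        // so the upload request returns quickly.
        queue.put("cat.jpg");
        queue.put("dog.jpg");
        System.out.println(drain(queue)); // prints [thumbs/cat.jpg, thumbs/dog.jpg]
    }
}
```

The key design point is the same in Azure: the front end only writes a small message, and the expensive image work happens asynchronously on whatever back-end capacity you scale out.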
As for the Silverlight part: personally, I wouldn't use Silverlight, since you'd be limiting yourself to desktops only. Using ASP.NET MVC + jQuery (or any other JS framework) will allow mobile/tablet users to also work with your site even if there isn't a native application available.
You can host the Web application in Azure.
Store the photos in a private Azure blob. I am not sure about the Silverlight part; I think it is not supported on some mobile platforms.
I am the author of a MIDI-based musical application. In my application I generate a .midi file with a small lib that I wrote and play it with MediaPlayer, and that's enough for that app. However, in a future app I plan to have more interactivity, and that's where I would probably need a streaming API.
As far as I know, Android lacks APIs for a real-time MIDI synth (at least official ones). But I can still see some apps that use MIDI in quite advanced ways. The question is: how? Do they use the NDK to access Sonivox directly, or are there unofficial APIs for that after all? Any ideas?
Also, I'm very interested in whether Google is planning to improve MIDI support in future versions of Android (in case anybody from Google sees this :))
Thanks.
You should check out libpd, which is a native port of PureData for both Android and iOS. It will provide you with access to the MIDI drivers of the system while still being able to prototype your software with very high-level tools.
Java has very significant latency, so I think this should be done with the NDK. Check this question; it has a couple of hints. This was also reported as an Android issue (NDK support for low-latency audio); there might be some tips or info there too.
This is a simple but great sample application that successfully streams MIDI on Android: https://github.com/billthefarmer/mididriver
You will have to put your MIDI messages together manually, though (the example creates two MIDI messages, for note on and note off). You can refer to the MIDI specification to further control the MIDI channels. The problem is that the default sound fonts on Android sound quite bad.
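Assembling those messages by hand is mostly bit-packing. A minimal sketch, based on the MIDI 1.0 channel-voice message layout (status byte 0x9n for note on and 0x8n for note off on channel n, followed by two 7-bit data bytes); the resulting byte arrays are what you would then hand to the driver's write call:

```java
public class MidiMessages {
    // Builds a three-byte MIDI channel voice message.
    // status byte: high nibble = message type, low nibble = channel (0-15);
    // the two data bytes are masked to 7 bits as the MIDI spec requires.
    static byte[] message(int type, int channel, int data1, int data2) {
        return new byte[] {
            (byte) (type | (channel & 0x0F)),
            (byte) (data1 & 0x7F),
            (byte) (data2 & 0x7F)
        };
    }

    static byte[] noteOn(int channel, int note, int velocity) {
        return message(0x90, channel, note, velocity); // 0x9n = note on
    }

    static byte[] noteOff(int channel, int note) {
        return message(0x80, channel, note, 0);        // 0x8n = note off
    }

    public static void main(String[] args) {
        byte[] on = noteOn(0, 60, 100);   // middle C on channel 0
        byte[] off = noteOff(0, 60);
        // & 0xFF avoids sign extension when printing the status byte
        System.out.printf("on:  %02X %02X %02X%n", on[0] & 0xFF, on[1], on[2]);
        System.out.printf("off: %02X %02X %02X%n", off[0] & 0xFF, off[1], off[2]);
    }
}
```

Scheduling these messages over time (for playback rather than a single note) is then just a matter of sleeping or using a timer between writes.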