I am using a modified version of Android tutorial 3 *1 to stream video from another phone running the Android application "IP Webcam" *2.
The program works, but I have a problem: when I call nativePlay(), the application takes a long time to load the video, and once the video loads it plays accelerated until it catches up to real time after a few seconds or minutes, depending on the image resolution.
My guess is that everything received between the moment I launch the function and the moment the video is displayed is being buffered, which is why playback appears accelerated. I have read the code but cannot find where this buffer is, so that I can remove it and avoid the long delay before the video is displayed.
I hope you may be able to help me.
*1(http://docs.gstreamer.com/display/GstSDK/Android+tutorial+3%3A+Video)
Code:
https://drive.google.com/folderview?id=0BwB3O9F5yEPMazNyUy1OYlU1TW8&usp=sharing
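For reference, this is the kind of pipeline description I have been experimenting with on the Java side before handing it to the native code (a sketch only; the URL path, the element names and the is-live/sync properties are assumptions and would need checking against the GStreamer SDK version the tutorial uses):
Code:
// Sketch: pipeline string for the IP Webcam MJPEG stream, built in Java and
// passed down to the native code, which runs it through gst_parse_launch().
// <phone-ip> is a placeholder for the other phone's address.
private static final String PIPELINE_DESC =
        "souphttpsrc location=http://<phone-ip>:8080/video is-live=true"
        + " ! multipartdemux ! jpegdec"
        + " ! ffmpegcolorspace"
        + " ! autovideosink sync=false";  // sync=false: show frames as they arrive instead of buffering and catching up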
Related
I have an app that plays a lot of video files. It has been fine on every phone and tablet I've tested on. I just tested on an Acer Chromebook, and the first few frames (maybe up to 1/4 second) are being dropped on all video files, so the video appears to start slightly past 0. Some of the videos have audio that starts immediately, so it's obvious to the user that the video is clipped at the start. I'm wondering if anyone else has seen this issue, maybe on a Chromebook or another device, and whether there is some simple way to deal with it?
There is absolutely nothing fancy about my code. I'm using a VideoView: I preload the video with setVideoURI(), then call videoView.start() on a button press.
Thanks!
This problem only appears if I make a call to seekTo(0). So rather than doing that to replay the video, I have to reload the video each time it's requested. That workaround resolves the problem.
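For anyone who runs into the same thing, the workaround looks roughly like this (a sketch; videoView, playButton and videoUri are fields in my Activity, and error handling is omitted):
Code:
// Workaround sketch: reload the source to replay instead of calling seekTo(0),
// which is what triggered the clipped start on the Chromebook.
playButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        videoView.setVideoURI(videoUri);   // reload instead of videoView.seekTo(0)
        videoView.start();                 // starts from frame 0 without the clipping
    }
});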
I am writing a video player for Android. So far I have been able to capture frames with the help of av_read_frame and avcodec_decode_video2, and update them to SDL 2.0. I have followed dranger's tutorial02.c: http://dranger.com/ffmpeg/
Pseudocode:
while (1)
{
    1. Read a packet.
    2. Check whether it is a video frame; if not, go to step 3.
       2.1 If it is a video frame, update the texture with SDL_UpdateYUVTexture.
    3. Handle SDL events.
    4. Clear the renderer.
    5. Present the renderer.
}
I wonder: do I need to take care of video synchronization and DTS/PTS calculation when I only need to display video?
This scenario works well on a Samsung device, but not on other phones.
What would be your advice?
It depends. If you're OK with the fact that your video will a) play as fast as the device can decode it and b) play at a different speed on different devices, and even on the same device depending on other processes, then you don't need to synchronize and can just display the frames as soon as they're decoded.
Otherwise you still need to synchronize the video output to the PTS. Since you don't have audio, and therefore won't have an audio clock, your only option is to synchronize the video to the system clock, which makes things simpler.
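Roughly, the idea looks like this (a Java-style sketch of the timing logic only, since your decode loop is in C; renderFrame() is a stand-in for the SDL texture update/present calls, and it assumes the PTS has already been converted to milliseconds via the stream's time base):
Code:
// Sketch: synchronize video to the system clock. Each frame is shown when its
// PTS (in ms) has elapsed relative to the wall-clock time playback started.
long playbackStartMs = System.currentTimeMillis();

void presentFrame(long ptsMs) throws InterruptedException {
    long dueAtMs = playbackStartMs + ptsMs;          // when this frame should appear
    long waitMs = dueAtMs - System.currentTimeMillis();
    if (waitMs > 0) {
        Thread.sleep(waitMs);                        // early: wait until the frame is due
    }
    // late (waitMs <= 0): show it immediately, or drop it if you are far behind
    renderFrame();                                   // stand-in for SDL_UpdateYUVTexture + RenderPresent
}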
I am using an external camera with my application. The camera takes 9 pictures per second (9 fps). The pictures are 384x288 bitmaps. I need to create a video file from these pictures.
What I have tried:
Using Jcodec
The problem: JCodec is relatively slow, and to make it work properly I add the bitmaps to an ArrayList and convert the list to video once recording stops. That takes too much time: for a 30-second video there is about 1 minute of rendering time.
Using native MediaCodec
The problem: I could only generate AVI files (video/avc) that are not readable by the stock Android player. I cannot use what is written here: http://bigflake.com/mediacodec/ because I am developing for API 16. I have tried using video/mp4v-es, but the video is corrupted and not playable in any player. (A sketch of my encoder configuration is below.)
Using FFmpeg
The problem: it is very complicated to integrate on Android, and I am not sure it will give me the result I need after spending the time to implement it. The result I need is to record the video stream as I receive the bitmaps, without any delay.
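For reference, this is roughly how I configure the video/avc encoder on API 16 (a sketch; the bitrate and color format are assumptions that depend on the device, and the input/output buffer handling is only summarized in the comments):
Code:
// Sketch of the API 16 encoder setup. There is no MediaMuxer before API 18,
// which is why I end up writing the encoder output into my own container.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 384, 288);
format.setInteger(MediaFormat.KEY_BIT_RATE, 1000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 9);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();

// For each camera bitmap: convert ARGB to YUV, queue it on an input buffer with
// a presentation time of frameIndex * (1000000 / 9) microseconds, then drain the
// output buffers and write them to the file.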
What can you suggest?
Working on an Android app.
I need to record video continuously.
But the video files can only be N seconds long.
I have it working so far, and the files are being created properly. However, in the time it takes to stop the MediaRecorder and start it back up again with a new file handle, I lose about 2 seconds between segments.
Is there anything, procedure-wise, that I can do to mitigate this delay? I'm recording in MP4 format, so I suspect I can't just force a "move" on a file on the filesystem, since the video needs to be packaged properly in the MP4 container.
Anyone know, roughly, how I might go about this?
Thanks!
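For context, this is roughly the rollover I run between segments (a sketch; camera setup and full error handling are omitted, and configureRecorder() / nextOutputFile() are my own helpers, SEGMENT_MS my segment length constant):
Code:
// Sketch of the segment rollover: setMaxDuration() fires MAX_DURATION_REACHED,
// and the same MediaRecorder is restarted with a new file. The ~2 s gap happens
// across the stop() .. start() sequence below.
recorder.setMaxDuration(SEGMENT_MS);   // N seconds per file
recorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
    @Override
    public void onInfo(MediaRecorder mr, int what, int extra) {
        if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
            mr.stop();                           // finalizes the MP4 (writes the moov atom)
            mr.reset();                          // back to the idle state
            configureRecorder(mr);               // helper: re-set sources, output format, encoders, max duration and this listener
            mr.setOutputFile(nextOutputFile());  // helper: path for the next segment
            try {
                mr.prepare();
                mr.start();
            } catch (IOException e) {
                // handle the failure
            }
        }
    }
});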
When I run my application on an Android tablet and hit the pause button, it pauses the picture, but when I un-pause, the video jumps ahead by however many seconds the pause lasted. So if I start the video, click pause, and walk away, I can come back, un-pause the video, and it will jump right to the end.
The code I am using is ns.togglePause(); The same code works on the desktop and works with .flv and .f4v files, but it does not work when I am using an MP4 on the tablet.
Has anyone seen this before or know why it would do something like this?
I am using FlashDevelop to debug the application.
I found out that my problem is caused by the encoding. This helped me solve my problem; I found it on Adobe's forums.
Video encoding is very important.
For example, use the H.264 Baseline profile, level 3.1, for mobile
(and not the High profile, level 4.1, recommended for desktop).
For more information, see the MAX session of Fabio Sonnati:
"Encoding for Performance on Multiple Devices"
And if you have an Android phone or tablet, there is my AIR application
for watching the Adobe MAX 2011 videos:
https://market.android.com/details?id=air.fr.inway.maxVideos2011
Search for "Sonnati" to find this session's video.
(For info, I use a video player based on OSMF 1.6.)
The PDF presentation is available on Sonnati's blog:
http://sonnati.wordpress.com/