Hi guys, I have read many examples about playing a video in a VideoView, but none of them work for me. I get this error:
java.io.FileNotFoundException: /android.resource:/frt.com.maint/2130968576 (No such file or directory)
This is my code:
FileInputStream fi = new FileInputStream("android.resource://frt.com.maint/" + R.raw.videointro);
MediaPlayer pl = new MediaPlayer();
pl.setDataSource(fi.getFD());
pl.prepare();
pl.start();
MediaPlayer doesn't have a setVideoURI method. I used the first solution you gave me but I still got the same error; after that I used this code with a VideoView:
Uri video = Uri.parse("android.resource://frt.com.maint/videointro");
vidview_gdf.setVideoURI(video);
vidview_gdf.start();
but I get an error with the message "you can not play the video".
P.S. Additional info: introvideo.mp4, 7 MB.
You're trying to use the ID of the resource, which is just an int index.
Use the filename instead:
fi = new FileInputStream("android.resource://frt.com.maint/videointro");
Or better:
StringBuilder videoURIPath = new StringBuilder();
videoURIPath.append("android.resource://");
videoURIPath.append(getPackageName() + "/");
videoURIPath.append("raw/");
videoURIPath.append(videoFileName);
pl.setVideoURI(Uri.parse(videoURIPath.toString()));
where videoFileName is a String containing the name of your file.
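As the question edit notes, setVideoURI() only exists on VideoView, not on MediaPlayer. If you want to stick with a plain MediaPlayer, one alternative is to open the raw resource through an AssetFileDescriptor instead of a FileInputStream. A minimal sketch, assuming the clip is res/raw/videointro:
AssetFileDescriptor afd = getResources().openRawResourceFd(R.raw.videointro);
MediaPlayer pl = new MediaPlayer();
// Pass the descriptor plus offset/length, because the resource lives inside the APK
pl.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
afd.close();
pl.prepare();
pl.start();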
Are you doing this on the emulator or an actual device?
I had a bit of a bad experience with H.264-encoded video before. Basically, I tried to play it on the first Galaxy Tab, but it didn't work. It turned out that the Galaxy Tab I had didn't support H.264.
So, I would advise you to make sure that the default video player can play this file before proceeding further. If that's not the case for you, then I'm not sure what's wrong; your code looks fine to me.
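If you want to check in code whether the device has a suitable H.264 decoder, MediaCodecList can be queried. A rough sketch (API 21+, assuming a 1280x720 clip; adjust the size to your video):
MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
// Describe the stream we want to decode (H.264, a.k.a. AVC, at the clip's resolution)
MediaFormat h264Format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720);
String decoderName = codecList.findDecoderForFormat(h264Format);
if (decoderName == null) {
    // No suitable H.264 decoder on this device for that resolution
}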
Related
I'm trying to implement Google Speech API in Android by following this demo: https://github.com/GoogleCloudPlatform/android-docs-samples
I was able to successfully reproduce the example in my app by using the given "audio.raw" file located in R.raw, and everything works perfectly. However, when I try to use my own audio files, it returns "API successful" without any transcription text. I'm not sure if it has to do with the files' path or the encoding, so I'll include information on both just in case.
Encoding
My audio files are obtained by recording a voice through MediaRecorder. These are the settings:
myAudioRecorder = new MediaRecorder();
myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myAudioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_WB);
myAudioRecorder.setAudioSamplingRate(16000);
myAudioRecorder.setAudioEncodingBitRate(16000);
myAudioRecorder.setAudioChannels(1);
myAudioRecorder.setOutputFile(outputFile);
SpeechService's recognizeInputStream() function in the API:
mApi.recognize(
RecognizeRequest.newBuilder()
.setConfig(RecognitionConfig.newBuilder()
.setEncoding(RecognitionConfig.AudioEncoding.AMR_WB) //originally it was LINEAR16
.setLanguageCode("en-US")
.setSampleRateHertz(16000)
.build())
.setAudio(RecognitionAudio.newBuilder()
.setContent(ByteString.readFrom(stream))
.build())
.build(),
mFileResponseObserver);
Encoding guidelines by Google: https://cloud.google.com/speech/docs/best-practices
From what I understand, I can use AMR_WB at 16 kHz instead of the default LINEAR16; I'm just not sure if I'm doing it right.
Path
This is the example that is fully working (with the audio file from the repo):
mSpeechService.recognizeInputStream(getResources().openRawResource(R.raw.audio));
However, none of the following options work, even with the exact same file:
InputStream inputStream = new URL("[website]/test/audio.raw").openStream();
mSpeechService.recognizeInputStream(inputStream);
Neither:
Uri uri = Uri.parse("android.resource://[package]/raw/audio");
InputStream inputStream = getActivity().getContentResolver().openInputStream(uri); //"getActivity()" because this is in a Fragment
mSpeechService.recognizeInputStream(inputStream);
To be clear, the result on the above paths is the same as on my custom audio files: "API successful" with no transcription. One of the options I have tried for my custom audio files, with the same thing happening, is this:
FileInputStream fis = new FileInputStream(filePath);
mSpeechService.recognizeInputStream(fis);
The only reason I'm not 100% sure the problem is the path is that, if the API returns success, then the file must have been found at the specified path. The problem should then be the encoding, but it's weird that the same file ("audio.raw") sent in different ways produces different results.
Anyway, thank you in advance! :)
EDIT:
To be clear, it's not that it returns an empty string in the transcription. It just never enters the "onSpeechRecognized" function that also exists in the demo, so no transcription is given.
Investigating the Java code of the library, I found no way to save the playing video anywhere. However, the VLC core has such capabilities; according to this doc, you can duplicate the stream and save it, redirecting one copy straight to a file.
I thought we could supply the corresponding arguments while creating an instance of the library, so I tried to add an option when initializing the library in libvlcjni.c, like this:
"--sout=duplicate{dst=standard{access=file,mux=ts,dst=/storage/emulated/0/example.mp4},
dst=display}"
but it doesn't seem to work. Any other ideas?
You can concurrently save a playing video to a file using libvlc (at least the following worked for me):
final ArrayList<String> args = new ArrayList<>();
args.add("-vvv");
mLibVLC = new LibVLC(this, args);
mMediaPlayer = new MediaPlayer(mLibVLC);
<code associating surface for display...>
Media media = new Media(mLibVLC, Uri.parse(SAMPLE_URL));
media.addOption(":sout=#duplicate{dst=file{dst=" + <file name> + "},dst=display}");
mMediaPlayer.setMedia(media);
mMediaPlayer.play();
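As a usage note, the file destination in that option is just an absolute path. For example, writing into the app's external files directory could look like this (dump.ts is only an illustrative name):
File outFile = new File(getExternalFilesDir(null), "dump.ts");
media.addOption(":sout=#duplicate{dst=file{dst=" + outFile.getAbsolutePath() + "},dst=display}");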
I guess there is currently no way to do it by means of libvlc.
However, the developers have plans for this feature; see their milestones on GitLab.
I tried to add a video path to an Android sample project, MediaPlayerDemo.
I can play back the video when it is stored on the sdcard; the path is
"file:///sdcard/dcim/a.m4v"
But I can't play back the video when it is stored in res/drawable; the path is
"android.resource://" +this.getPackageName () + "/" + R.drawable.a
I can read the id of the video in debug mode, but I just can't play the video.
How can I solve this?
UPDATES
Thank you for the reply. So far I have tried:
putting the video in assets and setting the path to "file:///android_asset/a.m4v";
putting the video in raw and setting the path to
("android.resource://" + this.getPackageName() + "/" + R.raw.a) or ("android.resource://" + this.getPackageName() + "/raw/a");
but none of them can play back the video.
My video is 1.8 MB; does that matter?
Create a new folder named raw in the res folder (if it already exists, leave it as is). Copy your playable video file (e.g., myvideo.mp4) into the raw folder. Use the code below in your app.
String uriPath = "android.resource://"+getPackageName()+"/raw/myvideo";
Uri uri = Uri.parse(uriPath);
mVideoView.setVideoURI(uri);
I tested it and it's working for me. Only if the video is playable from the sdcard will it play from the raw folder; otherwise it will show a dialog box that says "Cannot play video".
Try it and let me know what happens.
FYI, drawable is for storing icons, images, and drawables for the application, so you can put the video either in assets or in the raw folder.
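Putting it together, a slightly fuller sketch (assuming the layout has a VideoView with id videoView and the file is res/raw/myvideo.mp4):
// Wire up the VideoView, attach a MediaController, then play the raw resource
VideoView mVideoView = (VideoView) findViewById(R.id.videoView);
mVideoView.setMediaController(new MediaController(this));
String uriPath = "android.resource://" + getPackageName() + "/raw/myvideo";
mVideoView.setVideoURI(Uri.parse(uriPath));
mVideoView.start();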
I have found a solution, but it is not a straightforward one.
First, the file-not-found problem is not a path problem; it is a permission problem.
To solve that, many people suggest opening the file with a FileInputStream, but I still got the file-not-found error.
The resource can, however, be opened as an InputStream. Since setDataSource() of the MediaPlayer class does not accept an InputStream, you need to write the InputStream to a temp file with a BufferedOutputStream.
Finally, setDataSource(tempfile_path) works without error.
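In code, that workaround looks roughly like this (a sketch only, assuming the clip is res/raw/a as in the question, the copy goes into the app's cache directory, and exception handling is omitted):
// Copy the raw resource to a temp file, then hand its path to MediaPlayer
InputStream in = getResources().openRawResource(R.raw.a);
File tempFile = new File(getCacheDir(), "temp_video.m4v");
BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(tempFile));
byte[] buffer = new byte[8192];
int read;
while ((read = in.read(buffer)) != -1) {
    out.write(buffer, 0, read);
}
out.close();
in.close();

MediaPlayer player = new MediaPlayer();
player.setDataSource(tempFile.getAbsolutePath());
player.prepare();
player.start();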
This question is quite old and still not answered well, so I will answer it here.
First of all, do not put videos in the assets folder; it is bad practice. Create another folder (preferably one named raw).
Second, please do not use the m4v format; use an mp4 video.
Here is the code to insert video:
// Here it is assumed that the file name of the video in the raw folder is demo
VideoView video = (VideoView) findViewById(R.id.videoView);
video.setVideoPath("android.resource://" + getPackageName() + "/" + R.raw.demo);
video.start();
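Equivalently, the same resource can be handed to the VideoView as a Uri (same assumed demo file):
Uri videoUri = Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.demo);
video.setVideoURI(videoUri);
video.start();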
Hope this clarifies your doubt!
I am working in Android, using a video player, and accessing videos from my sdcard.
When I access a video from myFolder it works fine, but there are some folders whose names contain Japanese characters. Whenever I want to access a video from those folders, a "Sorry, this video cannot be played" error occurs.
This is the code I am using:
VideoView video = (VideoView) findViewById(R.id.videoId);
MediaController controller = new MediaController(this);
controller.setPadding(0, 0, 0, 55);
video.setVideoPath(URLDecoder.decode(sdCardUrl));
video.setMediaController(controller);
video.requestFocus();
video.start();
and this is the path for which I am facing the problem:
url = /sdcard/.FileStorage/History/Myfilestorage/のフダ名/H.264(avc)_mp3_1000kbps_640x480_25fps.mp4
Please suggest what I should do.
Thank you in advance.
Try converting that Japanese name into UTF-8 format and then use it.
I have resolved my problem by tracing the output of each line. My mistake was that there is no need to decode the URL.
So the solution is to use the following line:
video.setVideoPath(sdCardUrl);
in place of video.setVideoPath(URLDecoder.decode(sdCardUrl));
I have prepared some code to simply play an mp4 file from my res folder. The code is something like this:
public class VideoPlayer extends Activity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.video);
VideoView video = (VideoView)findViewById(R.id.VideoView);
Uri uri = Uri.parse("android.resource://company.software.myapp/"
+ R.raw.myvideo);
MediaController mc = new MediaController(this);
video.setMediaController(mc);
video.setVideoURI(uri);
//video.requestFocus();
video.start();
}
}
Now, although there is no error during playback, the activity automatically shows a dialog saying "sorry, this video cannot be played", yet I can hear the audio and it plays till the end. What is the problem?
Thanks a lot, commonsware.com, but I found the solution to the problem. Astonishingly, it's the PC's processor that is the culprit. I checked on a higher-configuration machine and, guess what, it worked perfectly fine. Sometimes, if there is some processing in the background, the dialog box still comes up, but on clicking OK it starts playing the video after some time.
I can confirm that this technique of playing a file from a resource is fine as far as I know.
Sorry to waste your precious time on a mundane hardware problem, but I hope this will be useful for other people who run into it.
Android supports the 3gp and mp4 formats, but sometimes there are still problems playing mp4 content.
One thing I have found out from my research is that this might be a resolution problem with the video.
I think you should resize the resolution of your mp4 video. This might help.
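If you want to inspect the clip's resolution before deciding to re-encode it, MediaMetadataRetriever can report it. A small sketch, where videoPath is a hypothetical path to a copy of the file (e.g. on the sdcard):
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(videoPath);
// Both values come back as strings, e.g. "1920" and "1080"
String width = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_WIDTH);
String height = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_HEIGHT);
retriever.release();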
I have not attempted to play a video clip out of a resource, and I am not certain that it works.
As a test, put the video clip on the SD card and use that as the source of your video.
If you get the same symptoms, then either the MP4 file has issues or it is something with your test environment (e.g., you are using the emulator and don't have a quad-core CPU).
If the SD card test works, though, then I suspect the problem is packaging it as a resource.
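A quick way to run that SD card test, as a sketch (assuming the clip has been copied to the root of external storage as myvideo.mp4, and video is the VideoView from the question):
File sdClip = new File(Environment.getExternalStorageDirectory(), "myvideo.mp4");
video.setVideoPath(sdClip.getAbsolutePath());
video.start();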