I've tried the following tutorial from this blog ( http://android-er.blogspot.kr/2013/05/get-current-frame-in-videoview-using.html ), which shows how to capture a video frame using MediaMetadataRetriever from a video source. However, it only works if the video is located locally on the phone.
Is there a way to capture a video frame while the VideoView is streaming the video over IP?
VideoView is a subclass of SurfaceView, so PixelCopy (API 24+) can take a screenshot of a VideoView as well.
Put this method in some Util class:
/**
 * Uses PixelCopy to copy a SurfaceView/VideoView into a Bitmap.
 */
fun usePixelCopy(videoView: SurfaceView, callback: (Bitmap?) -> Unit) {
    val bitmap: Bitmap = Bitmap.createBitmap(
        videoView.width,
        videoView.height,
        Bitmap.Config.ARGB_8888
    )
    try {
        // Create a handler thread to offload the processing of the image.
        val handlerThread = HandlerThread("PixelCopier")
        handlerThread.start()
        PixelCopy.request(
            videoView, bitmap,
            PixelCopy.OnPixelCopyFinishedListener { copyResult ->
                if (copyResult == PixelCopy.SUCCESS) {
                    callback(bitmap)
                } else {
                    callback(null)
                }
                handlerThread.quitSafely()
            },
            Handler(handlerThread.looper)
        )
    } catch (e: IllegalArgumentException) {
        // PixelCopy may throw IllegalArgumentException (e.g. if the
        // surface is not valid yet), so make sure to handle it.
        callback(null)
        e.printStackTrace()
    }
}
Usage:
usePixelCopy(videoView) { bitmap: Bitmap? ->
    processBitmap(bitmap)
}
I have found a solution to this problem. It appears that the VideoView does not allow frame capture for low-level hardware (GPU) reasons when it is backed by a SurfaceView.
The solution is to use a TextureView and a MediaPlayer to play the video inside it. The Activity needs to implement TextureView.SurfaceTextureListener. One drawback of this approach is that the video freezes briefly when taking a screenshot, and the TextureView does not display a default playback UI (play, pause, FF/RW, play time, progress bar, etc.). If you have another solution, please let me know :)
Here is the solution:
public class TextureViewActivity extends Activity
implements TextureView.SurfaceTextureListener,
OnBufferingUpdateListener,
OnCompletionListener,
OnPreparedListener,
OnVideoSizeChangedListener
{
private MediaPlayer mp;
private TextureView tv;
public static String MY_VIDEO = "https://www.blahblahblah.com/myVideo.mp4";
public static String TAG = "TextureViewActivity";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_texture_view);
tv = (TextureView) findViewById(R.id.textureView1);
tv.setSurfaceTextureListener(this);
}
public void getBitmap(TextureView vv)
{
String mPath = Environment.getExternalStorageDirectory().toString()
+ "/Pictures/" + Utilities.getDayTimeString() + ".png";
Toast.makeText(getApplicationContext(), "Capturing Screenshot: " + mPath, Toast.LENGTH_SHORT).show();
Bitmap bm = vv.getBitmap();
if (bm == null) {
    Log.e(TAG, "bitmap is null");
    return;
}
OutputStream fout = null;
File imageFile = new File(mPath);
try {
fout = new FileOutputStream(imageFile);
bm.compress(Bitmap.CompressFormat.PNG, 90, fout);
fout.flush();
fout.close();
} catch (FileNotFoundException e) {
Log.e(TAG, "FileNotFoundException");
e.printStackTrace();
} catch (IOException e) {
Log.e(TAG, "IOException");
e.printStackTrace();
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.media_player_video, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height)
{
Surface s = new Surface(surface);
try
{
mp = new MediaPlayer();
mp.setDataSource(MY_VIDEO);
mp.setSurface(s);
mp.prepare();
mp.setOnBufferingUpdateListener(this);
mp.setOnCompletionListener(this);
mp.setOnPreparedListener(this);
mp.setOnVideoSizeChangedListener(this);
mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
mp.start();
Button b = (Button) findViewById(R.id.textureViewButton);
b.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v)
{
TextureViewActivity.this.getBitmap(tv);
}
});
}
catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
I want to make a dubbing app in Android.
The flow of the app is:
Get a video and an audio file from the gallery.
Reduce the original sound of the video file, and mix (dub) the selected audio onto the video file.
After mixing the audio onto the video file, save it to external memory.
I am using MediaMuxer for this, but I have not succeeded. Please help me with this.
Regards,
Prateek
Even I was looking for the same thing: dubbing my video with an audio track using MediaMuxer. MediaMuxer was a difficult concept for me to understand as a beginner, so I ended up referring to this GitHub code: https://github.com/tqnst/MP4ParserMergeAudioVideo
It was my saviour; really, thanks to that person.
I just picked the code I wanted from it, i.e. dubbing a video with the audio I specify.
Here is the code I used in my project:
private void mergeAudioVideo(String originalVideoPath, String audioPath, String outputVideoPath) {
    Movie video = null;
    try {
        video = MovieCreator.build(originalVideoPath);
    } catch (RuntimeException | IOException e) {
        e.printStackTrace();
    }
    Movie audio = null;
    try {
        audio = MovieCreator.build(audioPath);
    } catch (IOException | NullPointerException e) {
        e.printStackTrace();
    }
    if (video == null || audio == null) {
        return; // nothing to merge
    }
    List<Track> videoTracks = new LinkedList<Track>();
    for (Track t : video.getTracks()) {
        if (t.getHandler().equals("vide")) {
            videoTracks.add(t); // separate the video track from the original video
        }
    }
    Track audioTrack = audio.getTracks().get(0); // the audio track to dub over the video
    Movie result = new Movie();
    result.addTrack(videoTracks.get(0)); // add the video track separated from the original
    result.addTrack(audioTrack);         // add the audio track to put in the result video
    Container out = new DefaultMp4Builder().build(result);
    FileOutputStream fos = null;
    try {
        fos = new FileOutputStream(outputVideoPath);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
        return;
    }
    BufferedWritableFileByteChannel byteBufferByteChannel = new BufferedWritableFileByteChannel(fos);
    try {
        out.writeContainer(byteBufferByteChannel);
        byteBufferByteChannel.close();
        fos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
And here is the BufferedWritableFileByteChannel class that writes the output video data to the directory:
public class BufferedWritableFileByteChannel implements WritableByteChannel {
private static final int BUFFER_CAPACITY = 1000000;
private boolean isOpen = true;
private final OutputStream outputStream;
private final ByteBuffer byteBuffer;
private final byte[] rawBuffer = new byte[BUFFER_CAPACITY];
public BufferedWritableFileByteChannel(OutputStream outputStream) {
this.outputStream = outputStream;
this.byteBuffer = ByteBuffer.wrap(rawBuffer);
}
@Override
public int write(ByteBuffer inputBuffer) throws IOException {
int inputBytes = inputBuffer.remaining();
if (inputBytes > byteBuffer.remaining()) {
dumpToFile();
byteBuffer.clear();
if (inputBytes > byteBuffer.remaining()) {
throw new BufferOverflowException();
}
}
byteBuffer.put(inputBuffer);
return inputBytes;
}
@Override
public boolean isOpen() {
return isOpen;
}
@Override
public void close() throws IOException {
dumpToFile();
isOpen = false;
}
private void dumpToFile() {
try {
outputStream.write(rawBuffer, 0, byteBuffer.position());
} catch (IOException e) {
throw new RuntimeException(e);
}
}
}
And don't forget to add the mp4parser libraries to your project.
This may not be the exact answer to your question, but at least it should shed some light on a probable solution.
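As an aside, the buffering logic in BufferedWritableFileByteChannel can be sanity-checked off-device against an in-memory stream. The sketch below is mine, not part of the original project: it inlines a copy of the channel with a tiny configurable capacity so the flush path is exercised by two small writes.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.BufferOverflowException;
import java.nio.ByteBuffer;
import java.nio.channels.WritableByteChannel;
import java.nio.charset.StandardCharsets;

public class ChannelDemo {
    // Copy of the buffered channel above, with the capacity made a
    // constructor parameter so a small buffer forces an early flush.
    static class BufferedChannel implements WritableByteChannel {
        private final OutputStream outputStream;
        private final byte[] rawBuffer;
        private final ByteBuffer byteBuffer;
        private boolean isOpen = true;

        BufferedChannel(OutputStream out, int capacity) {
            this.outputStream = out;
            this.rawBuffer = new byte[capacity];
            this.byteBuffer = ByteBuffer.wrap(rawBuffer);
        }

        @Override
        public int write(ByteBuffer inputBuffer) throws IOException {
            int inputBytes = inputBuffer.remaining();
            if (inputBytes > byteBuffer.remaining()) {
                dumpToFile();           // flush what we have buffered so far
                byteBuffer.clear();
                if (inputBytes > byteBuffer.remaining()) {
                    throw new BufferOverflowException();
                }
            }
            byteBuffer.put(inputBuffer);
            return inputBytes;
        }

        @Override
        public boolean isOpen() { return isOpen; }

        @Override
        public void close() throws IOException {
            dumpToFile();               // flush the tail on close
            isOpen = false;
        }

        private void dumpToFile() throws IOException {
            outputStream.write(rawBuffer, 0, byteBuffer.position());
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        BufferedChannel channel = new BufferedChannel(sink, 8);
        // Two writes that together exceed the 8-byte buffer, forcing a flush.
        channel.write(ByteBuffer.wrap("hello ".getBytes(StandardCharsets.UTF_8)));
        channel.write(ByteBuffer.wrap("world".getBytes(StandardCharsets.UTF_8)));
        channel.close();
        System.out.println(sink.toString("UTF-8")); // prints "hello world"
    }
}
```

The second write does not fit in the remaining 2 bytes, so the first 6 bytes are flushed to the stream before "world" is buffered; close() flushes the rest, and the sink ends up with the full payload in order.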
In my app I have an audio player Activity which simply shows the current song title and album cover art together with a MediaController. I also have a Service which implements MediaController.MediaPlayerControl and which uses a MediaPlayer for playback.
In general everything works fine - the Activity binds to the Service and sets the MediaController to control the MediaPlayer.
The problem comes when starting a new song, but ONLY when the MediaController is shown on the screen. If it is hidden, or even if I leave the Activity using BACK and return to the home screen, the 'playlist' is processed without a problem. If the MediaController is visible, however, the player dies silently, the seek/progress bar shows a start and end time of 466:36:07, and logcat shows...
attempt to call getDuration without a valid mediaplayer
I don't get an unhandled exception stacktrace in logcat - just that message and MediaPlayer simply stops playing. I've Googled for the error and get nearly 3000 hits (most of which are for questions here on SO) but I can't find a situation which matches mine.
Relevant Activity code...
public class AudioPlayerActivity extends Activity
implements OnKeyListener {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.audio_player);
// Find views here
controller = new MyMediaController(this);
controller.setAnchorView(ll);
controller.setOnKeyListener(this);
}
@Override
protected void onResume() {
super.onResume();
mBound = bindService(new Intent(this, MusicPlayerService.class), mConnection, Context.BIND_AUTO_CREATE);
try {
getCurrentItem();
mediaFilename = currentItem.getTrackName();
coverArtPath = currentItem.getMediaArtUrl();
updateTrackDetails();
Intent playerServiceIntent = new Intent(this, MusicPlayerService.class);
startService(playerServiceIntent);
} catch (IllegalArgumentException e) {
e.printStackTrace();
} catch (SecurityException e) {
e.printStackTrace();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
}
}
public void showMediaController() {
// Called by onTouchEvent(...) when the screen is touched
if (mBound) {
controller.setMediaPlayer(mService);
controller.setEnabled(true);
controller.show(0);
}
}
}
The Service has the following code...
public class NDroidTEMusicPlayerService extends Service
implements MediaPlayer.OnPreparedListener,
MediaPlayer.OnCompletionListener,
MediaPlayer.OnErrorListener,
AudioManager.OnAudioFocusChangeListener,
MediaPlayerControl {
private void setupMediaPlayer() {
...
try {
mMediaPlayer.setDataSource(trackUrl);
mMediaPlayer.prepareAsync();
} catch (IllegalArgumentException e) {
e.printStackTrace();
} catch (SecurityException e) {
e.printStackTrace();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onCompletion(MediaPlayer mp) {
stop();
mp.reset();
currentItem = playList.getNextItem(isRepeating);
Intent i = new Intent(MYAPP.ACTION_MYAPP_UPDATE_PLAY_VIEW);
i.putExtra("current_item", currentItem);
sendBroadcast(i);
setupMediaPlayer();
}
@Override
public int getDuration() {
int duration = 0;
if (mMediaPlayer != null)
duration = mMediaPlayer.getDuration();
return duration;
}
}
This question already has answers here:
The application may be doing too much work on its main thread
import java.io.BufferedReader;
public class Main extends Activity implements SurfaceHolder.Callback,
MediaPlayer.OnCompletionListener, View.OnClickListener, OnInitListener {
String SrcPath = "";
MediaPlayer mp;
SurfaceView mSurfaceView;
private SurfaceHolder holderrrr;
Boolean play = false;
String t_alarm1 = "alarm.xml", t_alarm2 = "alarm2.xml", text;
// TextToSpeach
private TextToSpeech mText2Speech;
@Override
protected void onCreate(Bundle savedInstanceState) {
StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder()
.permitAll().build();
StrictMode.setThreadPolicy(policy);
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
addListenerOnButton();
mText2Speech = new TextToSpeech(Main.this, Main.this);
}
// menu
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()) {
case R.id.action_settings: {
Intent myIntent = new Intent(this, menu.class);
Main.this.startActivity(myIntent);
return true;
}
}
return true;
}
// double-press exit watcher
private static final long DOUBLE_PRESS_INTERVAL = 2000000000; // 2 seconds, in nanoseconds
private long lastPressTime;
@Override
public void onBackPressed() {
Toast.makeText(Main.this, getString(R.string.kilepes_dupla),
Toast.LENGTH_SHORT).show();
long pressTime = System.nanoTime();
if (pressTime - lastPressTime <= DOUBLE_PRESS_INTERVAL) {
// this is a double click event
System.exit(0);
}
lastPressTime = pressTime;
}
public void onInit(int status) {
if (status == TextToSpeech.SUCCESS) {
mText2Speech.setLanguage(Locale.getDefault()); // default language for TTS
}
}
private void addListenerOnButton() {
final Button button5 = (Button) findViewById(R.id.button5);
button5.setOnClickListener(new View.OnClickListener() {
public void onClick(View v) {
// Perform action on click
if (play) {
try{
mp.stop();
mp.release();
mp = null;
}catch(Exception e){
}
button5.setText("Start");
play = false;
} else {
try {
mp = new MediaPlayer();
mSurfaceView = (SurfaceView) findViewById(R.id.surface);
holderrrr = mSurfaceView.getHolder();
play = true;
button5.setText("Stop");
surfaceCreated(holderrrr);
} catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
});
}// end of addListenerOnButton
public void surfaceCreated(SurfaceHolder holder) {
mp.setDisplay(holder);
holder = mSurfaceView.getHolder();
holder.addCallback(this);
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
try {
mp.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mp) {
mp.stop();
mp.release();
Toast.makeText(Main.this,
"Video playback has finished!",
Toast.LENGTH_SHORT).show();
// button5.setText("Start");
//play = false;
}
});
mp.setDataSource(SrcPath);
mp.prepare();
} catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// Get the dimensions of the video
// int videoWidth = mp.getVideoWidth();
// int videoHeight = mp.getVideoHeight();
// Get the width of the screen
// int screenWidth = getWindowManager().getDefaultDisplay().getWidth();
// Get the SurfaceView layout parameters
android.view.ViewGroup.LayoutParams lp = mSurfaceView.getLayoutParams();
// Set the width of the SurfaceView to the width of the screen
// lp.width = screenWidth;
lp.width = 420;
// Set the height of the SurfaceView to match the aspect ratio of the
// video
// be sure to cast these as floats otherwise the calculation will likely
// be 0
// lp.height = (int) (((float) videoHeight / (float) videoWidth) *
// (float) screenWidth);
lp.height = 390;
// Commit the layout parameters
mSurfaceView.setLayoutParams(lp);
// Start video
mp.start();
}
}
It works fine on Android 4.1.2 on a Galaxy S3, but it gives me the error message in the title, and it doesn't show the first 3 seconds of the video...
Please give me some advice or a solution, because I have no idea how to get rid of this kind of error.
In simple terms this error means you are asking your Android system to do too much work on the main thread of this particular application. There are some good general answers on this error, one good example being:
https://stackoverflow.com/a/21126690/334402
For your specific example, you may simply be asking it to do too much video-related work, which is very processor hungry, on the main thread. It would be worth looking at your 'prepare' call, for example: if you are using a streamed source, consider using prepareAsync - see below from the Android documentation:
public void prepareAsync ()
Added in API level 1. Prepares the player for playback, asynchronously.
After setting the datasource and the display surface, you need to
either call prepare() or prepareAsync(). For streams, you should call
prepareAsync(), which returns immediately, rather than blocking until
enough data has been buffered.
One reason why you may be seeing problems on the Nexus 7 and not on the Galaxy 3 is that the Nexus has a significantly bigger screen and your video source may offer different video encodings for different sized devices - the larger ones quite likely requiring more processing to decode and render.
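To illustrate why the asynchronous variant matters, here is a plain-Java sketch of the same pattern with no Android APIs (the listener interface, sleep duration, and class name are made up for illustration): the blocking "prepare" work is moved to a background thread so the calling thread returns immediately and is notified via a callback, just as prepareAsync() notifies an OnPreparedListener instead of blocking the main thread.

```java
import java.util.concurrent.CountDownLatch;

public class AsyncPrepareDemo {
    interface OnPreparedListener {
        void onPrepared();
    }

    // Mimics MediaPlayer.prepareAsync(): returns immediately and invokes the
    // listener from a background thread once the (simulated) buffering is done.
    static void prepareAsync(OnPreparedListener listener) {
        Thread worker = new Thread(() -> {
            try {
                Thread.sleep(50); // stand-in for blocking I/O / buffering
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            listener.onPrepared();
        });
        worker.start();
    }

    public static void main(String[] args) throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);
        prepareAsync(done::countDown); // returns immediately instead of blocking
        // ... the caller could keep servicing the UI here ...
        done.await(); // equivalent of the onPrepared() callback firing
        System.out.println("prepared");
    }
}
```

The synchronous prepare() is the same work without the thread: the caller is stuck until buffering finishes, which on a slow stream is exactly the "too much work on its main thread" warning.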
I am trying to play a video in a splash Activity when my Android app starts, and I have the problem that the onSurfaceTextureAvailable interface method never gets called.
Here is the code:
public class HomeActivity extends Activity implements TextureView.SurfaceTextureListener,
MediaPlayer.OnBufferingUpdateListener, OnCompletionListener,
MediaPlayer.OnPreparedListener, MediaPlayer.OnVideoSizeChangedListener {
private MediaPlayer videoMediaPlayer;
private TextureView videoPreview;
private Bundle extras;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
videoPreview = new TextureView(this);
videoPreview.setSurfaceTextureListener(this);
extras = getIntent().getExtras();
setContentView(videoPreview);
}
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
Surface s = new Surface(surface);
Log.d("HOME", "onSurfaceTextureAvailable");
try {
videoMediaPlayer= new MediaPlayer();
Uri video = Uri.parse("android.resource://" + getPackageName() + "/"
+ R.raw.intro_video);
videoMediaPlayer.setDataSource(getApplicationContext(), video);
videoMediaPlayer.setSurface(s);
videoMediaPlayer.prepare();
videoMediaPlayer.setOnBufferingUpdateListener(this);
videoMediaPlayer.setOnCompletionListener(this);
videoMediaPlayer.setOnPreparedListener(this);
videoMediaPlayer.setOnVideoSizeChangedListener(this);
videoMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
videoMediaPlayer.start();
} catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
// Other stuff
The code is from a similar Stack Overflow question, where it seems the author got it working like this.
Any advice?
For onSurfaceTextureAvailable() to be called, hardware acceleration must be enabled. You can do this in the AndroidManifest.xml file:
<activity
android:name="com.example.HomeActivity"
android:hardwareAccelerated="true" >
</activity>
I've been working on an app where the user touches the screen to start a movie. The screen image is the first frame of the movie. Once the touch happens, the movie plays. I do this by putting a jpg of the first frame in front of the movie, and then removing the jpg once I think the movie is playing. (Figuring out when that happens is impossible, but that's another issue. And on older devices if you remove the image too soon, you get black.)
Tested this on probably six different devices. Today the seventh: Kindle Fire HD. On this device, the movies are all brighter than the corresponding jpgs. On all other devices, they match perfectly. Any ideas what could cause this or how to fix?
(Another issue with the HD is that movies take a REALLY long time to start playing. But that's another issue.)
EDIT: here is my main.xml:
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:orientation="vertical" >
<ImageView
android:id="@+id/iv"
android:layout_width="fill_parent"
android:layout_height="fill_parent"/>
<VideoView
android:id="@+id/vv"
android:layout_width="fill_parent"
android:layout_height="fill_parent"/>
</FrameLayout>
and here is the code:
public class VideoTestActivity extends Activity implements SurfaceHolder.Callback, OnPreparedListener, OnCompletionListener {
private VideoView vv;
private ImageView iv;
private Bitmap b;
private MediaPlayer mp = new MediaPlayer();
private static final String TAG = VideoTestActivity.class.getSimpleName();
private volatile boolean prepared = false;
private volatile boolean readytoplay = false;
private volatile boolean playing = false;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.main);
iv = (ImageView)findViewById(R.id.iv);
iv.bringToFront();
vv = (VideoView)findViewById(R.id.vv);
b = BitmapFactory.decodeResource(getResources(), R.drawable.babyblack);
iv.setBackgroundColor( 0xFFDFA679 );
vv.getHolder().addCallback(this);
mp.setOnPreparedListener( this );
mp.setOnCompletionListener( this );
try {
mp.setDataSource( this, Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.ape) );
} catch (IllegalArgumentException e) {
Log.d(TAG,"illegal argument exception on set data source");
e.printStackTrace();
} catch (SecurityException e) {
Log.d(TAG,"security exception on set data source");
e.printStackTrace();
} catch (IllegalStateException e) {
Log.d(TAG,"illegal state exception on set data source");
e.printStackTrace();
} catch (IOException e) {
Log.d(TAG,"IO exception on set data source");
e.printStackTrace();
}
}
@Override
public boolean onTouchEvent(MotionEvent event) {
float dx, dy;
Log.d(TAG,"touch event");
if ( !playing && event.getAction() == MotionEvent.ACTION_UP ) {
Log.d(TAG,"action up");
if ( prepared ) {
playing = true;
Log.d(TAG,"hardware accelerated: iv="+iv.isHardwareAccelerated()+", vv="+vv.isHardwareAccelerated());
mp.start();
Log.d(TAG, "playing video in onTouch callback");
Log.d(TAG,"hardware accelerated: iv="+iv.isHardwareAccelerated()+", vv="+vv.isHardwareAccelerated());
} else
readytoplay = true;
}
return true;
}
@Override
public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
// TODO Auto-generated method stub
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
// TODO Auto-generated method stub
Log.d(TAG,"surface is created");
mp.setDisplay( vv.getHolder() );
try {
mp.prepareAsync();
} catch (IllegalArgumentException e) {
Log.d(TAG,"illegal argument exception on prepare");
e.printStackTrace();
} catch (SecurityException e) {
Log.d(TAG,"security exception on prepare");
e.printStackTrace();
} catch (IllegalStateException e) {
Log.d(TAG,"illegal state exception on prepare");
e.printStackTrace();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
// TODO Auto-generated method stub
}
@Override
public void onPrepared(MediaPlayer mp) {
Log.d(TAG,"video is prepared");
prepared = true;
if ( readytoplay ) {
playing = true;
mp.start();
iv.setVisibility( View.GONE );
Log.d(TAG,"playing video from prepared callback");
}
}
@Override
public void onCompletion(MediaPlayer arg0) {
Log.d(TAG,"video is done");
playing = false;
iv.setVisibility( View.VISIBLE );
}}
I changed the ImageView to have no image, but just a solid-colored background. The only data file you need is an mp4 movie. When you touch the screen, the movie plays, hidden behind the ImageView. The screen immediately brightens when I touch it (mp.start() happens), then the movie starts playing, and it gradually dims a bit, then brightens again, and finally stabilizes when the movie is done.
I tried hardware acceleration, and no hardware acceleration; no difference. I tried plugging the Kindle Fire HD in, and not plugging it in; no difference.
I would post the 2-second mp4 file that I am using but don't know how.
Looks like it's by design, per this forum post - https://forums.developer.amazon.com/forums/thread.jspa?threadID=450#1780