YouTube video playback with the Android media player - android

How can we play a YouTube video with the Android media player?
If anybody has an idea, please explain it to me.
I have to play the URL "http://www.youtube.com/embed/bIPcobKMB94?autoplay=1" with the default Android media player.
When I played this URL with the Android media player, I got MediaPlayer error (1, -2147483648).
I can play the RTSP URL below in the media player on my Android phone, but not on my tablet. Can anybody help me? Why am I not able to play that video on the tablet?
rtsp://v6.cache3.c.youtube.com/CiILENy73wIaGQnokCRYfXXPsBMYDSANFEgGUgZ2aWRlb3MM/0/0/0/video.3gp
Thanks

Well, the Android part is quite easy. You just need a raw URI of the YouTube video and trigger an intent with it:
Uri uri = Uri.parse("http://<link-to-RAW-youtube-video>"); // i.e. mp4 version
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setDataAndType(uri, "video/*"); // important! otherwise you just download the video
startActivity(intent); // called directly from another activity, as you can see
If the user has multiple video players installed, they also get to choose which one to use for playback.
The tricky part here is not Android-related; it is how to get the raw URI of a YouTube video. Usually they are extracted from the source code of the original YouTube video page, but that's another topic. I tried the code above with a self-extracted URI to an mp4 version of the video and it worked just fine on my Android 4.0 phone.
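A slightly more defensive variant of the same snippet guards against the case where no video player is installed and always shows the app picker. Intent.resolveActivity and Intent.createChooser are standard Android APIs; the URI is the same placeholder as above:
Uri uri = Uri.parse("http://<link-to-RAW-youtube-video>"); // placeholder, see above
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setDataAndType(uri, "video/*");
// only start playback if at least one app can actually handle the intent
if (intent.resolveActivity(getPackageManager()) != null) {
// force the chooser instead of a remembered default player
startActivity(Intent.createChooser(intent, "Play video with"));
}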
EDIT:
You really have to use a raw URI to the YouTube video; for your video it would be:
http://o-o.preferred.ber01s04.v22.lscache2.c.youtube.com/videoplayback?sparams=cp%2Cid%2Cip%2Cipbits%2Citag%2Cratebypass%2Csource%2Cexpire&fexp=904550%2C919700%2C911614&itag=18&ip=77.0.0.0&signature=C721CE7543081FC0C805C86F5D3C4D9B34D77764.D4288106CF7A3153FF1574F2334161CBD1176535&sver=3&ratebypass=yes&source=youtube&expire=1332001847&key=yt1&ipbits=8&cp=U0hSR1BLT19JUUNOMl9IRVNJOmllRjJJYy1SSG92&id=6c83dca1b28c07de&title=AT%26T%20Samsung%20Captivate%20TV%20Commercial%20-%20Galaxy%20S%20Phone-360p
It's very long, but it will work :) (Note the signature and expire parameters: these raw URLs are signed and expire after a while.)

I had a very hard time implementing a player supporting many features and formats. Even VideoView didn't match all of my needs. Finally, I ended up writing my own solution based on SurfaceView and MediaPlayer. Here is the source code. It was validated on 4.0.3 and 4.1.2 for local 3gp, local mp4, HTTP 3gp and YouTube RTSP streams.
Remember that, presently, a URI for a valid YouTube video on Android has to look like this: rtsp://v5.cache1.c.youtube.com/CjYLENy73wIaLQnhycnrJQ8qmRMYESARFEIJbXYtZ29vZ2xlSARSBXdhdGNoYPj_hYjnq6uUTQw=/0/0/0/video.3gp. You get them via http://m.youtube.com.
The activity's arguments are set by Intent. There is quite a variety of options; see the last source code provided in my comment. You can remove the code related to the Sequencer, Scenario and Test classes; they are specific to my program (an automated system for platform testing).
package my.package;
import java.io.IOException;
import java.lang.ref.WeakReference;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.content.Intent;
import android.graphics.Point;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnBufferingUpdateListener;
import android.media.MediaPlayer.OnCompletionListener;
import android.media.MediaPlayer.OnErrorListener;
import android.media.MediaPlayer.OnPreparedListener;
import android.media.MediaPlayer.OnVideoSizeChangedListener;
import android.net.Uri;
import android.os.Build.VERSION;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;
import android.text.TextUtils;
import android.text.format.Time;
import android.view.Display;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import my.package.Log;
import my.package.R;
import my.package.Scenario;
import my.package.Sequencer;
import my.package.Test;
import my.package.TestUtil;
/**
* An activity to display local or remote video or play audio file.
*/
public class MediaPlayerActivity extends Activity implements
OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener,
OnVideoSizeChangedListener, OnErrorListener, SurfaceHolder.Callback {
private static final String LOG_TAG = MediaPlayerActivity.class
.getSimpleName();
// Mandatory creation arguments passed by extras in starting Intent
public static final String SEQUENCER = "sequencer"; //$NON-NLS-1$
public static final String SCENARIO = "scenario"; //$NON-NLS-1$
public static final String TEST = "test"; //$NON-NLS-1$
public static final String SOURCE_URI = "uri"; //$NON-NLS-1$
// Optional creation arguments passed by extras in starting Intent
public static final String PAUSE_ON_BACKGROUND = "auto_pause"; //$NON-NLS-1$
public static final String TIMEOUT = "timeout"; //$NON-NLS-1$
public static final String LOOP = "loop"; //$NON-NLS-1$
public static final String SCREEN_ON_WHILE_PLAYING = "screen_on_while_playing"; //$NON-NLS-1$
// data arguments returned by Intent on finish
public static final String REASON = "cause"; //$NON-NLS-1$
public static final String EXCEPTION = "exception"; //$NON-NLS-1$
// additional state bundle arguments.
private static final String START_POSITION = "start"; //$NON-NLS-1$
private static final String VIDEO_WIDTH = "video_width"; //$NON-NLS-1$
private static final String VIDEO_HEIGHT = "video_height"; //$NON-NLS-1$
private WeakReference<Sequencer> mSequencer = new WeakReference<Sequencer> (null);
private WeakReference<Test> mTest = new WeakReference<Test> (null);
/**
* URI of the video/audio source.
*
* This player supports a variety of videos and audio sources, either local
* or remote.
* <p>
* An HTTP live streaming URI would be:
* {@code httplive://xboodangx.api.channel.livestream.com/3.0/playlist.m3u8}
* </p>
* <p>
* A local video file URI would be {@code file:///sdcard/spiderman.mp4}
* </p>
* <p>
* A remote 3GPP format video URI would be
* {@code http://commonsware.com/misc/test2.3gp}
* </p>
* <p>
* And finally an RTP or RTSP video source URI would be
* {@code rtsp://v4.cache1.c.youtube.com/CjYLENy73wIaLQk4RDShYkdS1BMYDSANFEIJbXYtZ29vZ2xlSARSBXdhdGNoYK-Cn8qh8J6-Tgw=/0/0/0/video.3gp}
* </p>
*/
private Uri mMediaURI;
/**
* Input: this flag is set to true if the video must be paused when the
* activity is not visible any more. The video will resume when the activity
* is visible again.
*/
private boolean mPauseOnBackground;
/**
* Input: number of milliseconds until automatic shutdown, or -1 if no
* timeout.
*/
private long mTimeout;
/**
* Input: flag set to true to loop back to the beginning of the video when
* reaching its end
*/
private boolean mLoop;
/**
* The width of the video, obtained by
* {@link #onVideoSizeChanged(MediaPlayer, int, int)}
*/
private int mVideoWidth;
/**
* The height of the video, obtained by
* {@link #onVideoSizeChanged(MediaPlayer, int, int)}
*/
private int mVideoHeight;
private MediaPlayer mMediaPlayer;
private SurfaceView mPreview;
private boolean mIsVideoSizeKnown = false;
private boolean mMediaPrepared = false;
/**
* This member is set to position the video should start. It is set when
* pausing the video and used when restoring the instance.
*/
private int mStartPosition;
private boolean mTimeoutSet;
private boolean mScreenOnWhilePlaying;
private SurfaceHolder mHolder;
private static class ShutdownHandler extends Handler {
WeakReference<MediaPlayerActivity> mActivity;
public ShutdownHandler(MediaPlayerActivity activity) {
mActivity = new WeakReference<MediaPlayerActivity>(activity);
}
@Override
public void handleMessage(Message msg) {
MediaPlayerActivity activity = mActivity.get();
if (activity != null) {
activity.finishTest(Activity.RESULT_OK, null);
} else {
//Log.w(LOG_TAG, "no activity for shutdown");
}
}
}
public MediaPlayerActivity() {
super();
// These members are initialized in onCreate(Bundle) by the
// starting Intent, the first time the activity is created, and restored
// in the same method with the arguments in the saved instance bundle.
mSequencer = new WeakReference<Sequencer> (null);
mTest = new WeakReference<Test> (null);
setSourceURI(null);
setPauseOnBackground(false);
setPlayTimeout(-1);
setLooping(false);
setScreenOnWhilePlaying(true);
// Initialize internals.
mIsVideoSizeKnown = false;
mVideoWidth = mVideoHeight = 0; // unknown
mMediaPrepared = false;
mMediaPlayer = null;
mPreview = null; // set in onCreate(Bundle)
setStartPosition(0); // beginning of the video
mTimeoutSet = false;
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.video_player);
Intent intent = getIntent();
if (savedInstanceState != null) {
onRestoreInstanceState(savedInstanceState);
} else if (intent != null) {
Log.d(LOG_TAG, "Loading starting Intent extras...");
// read starting Intent extras.
_updateForeignReferences(intent);
setSourceURI((Uri) intent.getParcelableExtra(SOURCE_URI));
setPlayTimeout(intent.getLongExtra(TIMEOUT, -1L));
setPauseOnBackground(intent.getBooleanExtra(PAUSE_ON_BACKGROUND,
false));
setLooping(intent.getBooleanExtra(LOOP, false));
setScreenOnWhilePlaying(intent.getBooleanExtra(SCREEN_ON_WHILE_PLAYING, true));
}
mTimeoutSet = false;
_updateWidgets();
}
protected void onRestoreInstanceState(Bundle savedInstanceState) {
super.onRestoreInstanceState(savedInstanceState);
Log.d(LOG_TAG, "Restoring instance state...");
// restore saved references
_updateSavedForeignReferences(savedInstanceState);
// restore saved inputs
setSourceURI(Uri.parse(savedInstanceState.getString(SOURCE_URI)));
setPlayTimeout(savedInstanceState.getLong(TIMEOUT));
setPauseOnBackground(savedInstanceState.getBoolean(PAUSE_ON_BACKGROUND));
setLooping(savedInstanceState.getBoolean(LOOP));
setScreenOnWhilePlaying(savedInstanceState.getBoolean(SCREEN_ON_WHILE_PLAYING));
// restore internals
setStartPosition(savedInstanceState.getInt(START_POSITION, 0));
mVideoWidth = savedInstanceState.getInt(VIDEO_WIDTH);
mVideoHeight = savedInstanceState.getInt(VIDEO_HEIGHT); // unknown
mIsVideoSizeKnown = (mVideoWidth > 0) && (mVideoHeight > 0);
}
@Override
protected void onResume() {
super.onResume();
if (mMediaPlayer == null) {
try {
_playMedia();
} catch (Exception e) {
_fail(e);
}
}
}
@Override
protected void onSaveInstanceState(Bundle outState) {
super.onSaveInstanceState(outState);
Log.d(LOG_TAG, "Saving instance state");
Sequencer sequencer = mSequencer.get();
if (sequencer != null) {
outState.putInt(SEQUENCER, sequencer.getPosition());
Scenario scenario = sequencer.getScenario();
if (scenario != null) {
outState.putInt(SCENARIO, scenario.getScenarioId());
}
}
Test test = mTest.get();
if (test != null) {
outState.putString(TEST, TestUtil.getTestPath(test, TestUtil.Path.Static));
}
if (getSourceURI() != null) {
outState.putString(SOURCE_URI, getSourceURI().toString());
}
outState.putBoolean(LOOP, isLooping());
outState.putBoolean(PAUSE_ON_BACKGROUND, isPausingOnBackground());
outState.putLong(TIMEOUT, getPlayTimeout());
outState.putBoolean(SCREEN_ON_WHILE_PLAYING, isScreenOnWhilePlaying());
outState.putInt(START_POSITION, getStartPosition());
}
@Override
protected void onPause() {
super.onPause();
if (isPausingOnBackground()) {
_pausePlayback();
}
}
@Override
protected void onStop() {
super.onStop();
Log.d(LOG_TAG, "onStop");
if (isPausingOnBackground()) {
_releaseMediaPlayer();
}
}
@Override
protected void onDestroy() {
super.onDestroy();
Log.d(LOG_TAG, "onDestroy");
// TODO: It would be fine to fail the test if the activity was destroyed
// if we didn't finish the test yet, but there are far too many cases
// and I failed to implement one working with all of them!
/* if (!mInstanceSaved) {
finishTest(Activity.RESULT_FIRST_USER, new Intent().putExtra(REASON,
"Activity destroyed. Something certainly goes wrong there."));
}
*/
_releaseMediaPlayer();
}
// Restore sequencer, scenario and test references from saved state.
private void _updateSavedForeignReferences(Bundle savedInstanceState) {
int sequencer = savedInstanceState.getInt(SEQUENCER, -1);
int scenarioId = savedInstanceState.getInt(SCENARIO, -1);
mSequencer = new WeakReference<Sequencer>(null);
mTest = new WeakReference<Test>(null);
if (scenarioId >= 0 && sequencer >= 0) {
Scenario scenario = Controller.controller.getData().scenarios()
.getScenario(scenarioId);
mSequencer = new WeakReference<Sequencer>(Controller.controller
.engine().getSequencerAt(sequencer));
String testPath = savedInstanceState.getString(TEST);
if (!TextUtils.isEmpty(testPath)) {
mTest = new WeakReference<Test>(TestUtil.fromPath(
scenario.getRootTest(), testPath));
}
}
}
// Update sequencer, scenario and test references from starting Intent
protected void _updateForeignReferences(Intent intent) {
int scenarioId = intent.getIntExtra(MediaPlayerActivity.SCENARIO, -1);
Scenario scenario = Controller.controller.getData().scenarios()
.getScenario(scenarioId);
int sequencer = intent.getIntExtra(MediaPlayerActivity.SEQUENCER, -1);
mSequencer = new WeakReference<Sequencer>(null);
mTest = new WeakReference<Test>(null);
if (scenarioId >= 0 && sequencer >= 0) {
mSequencer = new WeakReference<Sequencer>(Controller.controller
.engine().getSequencerAt(sequencer));
String testPath = intent.getStringExtra(MediaPlayerActivity.TEST);
if (!TextUtils.isEmpty(testPath)) {
mTest = new WeakReference<Test>(TestUtil.fromPath(
scenario.getRootTest(), testPath));
}
}
}
/**
* Terminate the test case and finish the activity at the same time.
* <p>
* The result code and data are passed to both the parent activity and the
* {@link Test#terminate(int, Sequencer, Object)} method.
* </p>
*
* @param resultCode
* the result code. May be one of the Activity result codes, but
* also any other one having a meaning for the test and the caller
* activity.
* @param data
* extra result data. Can be null.
*/
public void finishTest(int resultCode, Intent data) {
Test test = mTest.get();
Sequencer sequencer = mSequencer.get();
if ((test != null) && (sequencer != null)) {
test.terminate(resultCode, sequencer, data);
// prevent any further call to finishTest.
mTest = new WeakReference<Test> (null);
}
if (!isFinishing()) {
setResult(Activity.RESULT_OK, data);
finish();
}
}
@SuppressWarnings("deprecation")
private void _updateWidgets() {
SurfaceView surface = (SurfaceView) findViewById(R.id.surface);
mPreview = surface;
SurfaceHolder holder = surface.getHolder();
holder.addCallback(this);
if (VERSION.SDK_INT < android.os.Build.VERSION_CODES.HONEYCOMB) {
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
}
public void onBufferingUpdate(MediaPlayer arg0, int percent) {
Log.d(LOG_TAG, "onBufferingUpdate percent:" + percent);
if (percent >= 100) {
mMediaPlayer.setOnBufferingUpdateListener(null);
}
// no display so don't bother.
}
public void onCompletion(MediaPlayer player) {
_releaseMediaPlayer();
finishTest(Activity.RESULT_OK, null);
}
private void _rearmPlayer() {
mVideoWidth = 0;
mVideoHeight = 0;
mMediaPrepared = false;
mIsVideoSizeKnown = false;
}
public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
if (mIsVideoSizeKnown) return; // further notifications are completely wrong!!
Log.i(LOG_TAG, "Video size: " + width + "x" + height);
mIsVideoSizeKnown = true;
mVideoWidth = width;
mVideoHeight = height;
if (width > 0 && height > 0) {
_fitSurfaceForVideo();
} else {
// audio only or video size unknown.
}
if (mMediaPrepared) {
_startPlayback();
}
}
@SuppressWarnings("deprecation")
@SuppressLint("NewApi")
private void _fitSurfaceForVideo() {
Display display = getWindowManager().getDefaultDisplay();
Point size = new Point();
if (VERSION.SDK_INT >= 13) {
display.getSize(size);
} else {
size.x = display.getWidth();
size.y = display.getHeight();
}
double ratioVideo = (double)mVideoWidth / mVideoHeight;
double ratioScreen = (double)size.x / size.y;
if (ratioScreen > ratioVideo) {
// fit in height and scale the width to match the video ratio
mVideoHeight = size.y;
mVideoWidth = (int) (size.y * ratioVideo); // width = height * aspect ratio
} else {
// fit in width and scale height to keep the video ratio
mVideoWidth = size.x;
mVideoHeight = (int) (size.x / ratioVideo);
}
}
public void onPrepared(MediaPlayer mediaplayer) {
mMediaPrepared = true;
if (mIsVideoSizeKnown) {
_startPlayback();
}
}
private void _startPlayback() {
SurfaceHolder holder = mPreview.getHolder();
if (mVideoWidth > 0 && mVideoHeight > 0) {
holder.setFixedSize(mVideoWidth, mVideoHeight);
}
if (mStartPosition > 0) {
Log.i(LOG_TAG, String.format("Resume at %f seconds", mStartPosition / 1000.f));
mMediaPlayer.seekTo(mStartPosition);
} else {
Log.i(LOG_TAG, "Start video");
}
if (!mTimeoutSet && (getPlayTimeout() > 0)) {
// this is a constant time reference: deduce the time already played.
long remaining = getPlayTimeout() - mStartPosition;
Time time = new Time();
time.setToNow();
time.second += remaining / 1000;
time.normalize(false);
Log.i(LOG_TAG, "Will end the video on " + time.format2445());
new ShutdownHandler(this).sendEmptyMessageDelayed(0, remaining);
mTimeoutSet = true;
}
mMediaPlayer.start();
}
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {
Log.d(LOG_TAG, String.format(
"surfaceChanged called: format=%d, width=%d, height=%d",
format, width, height));
}
public void surfaceDestroyed(SurfaceHolder surfaceholder) {
Log.d(LOG_TAG, "surfaceDestroyed called");
mHolder = null;
if (mMediaPlayer != null) {
mMediaPlayer.setDisplay(null);
}
}
public void surfaceCreated(SurfaceHolder holder) {
Log.d(LOG_TAG, "surfaceCreated called");
mHolder = holder;
if (mMediaPlayer != null) {
// surface created after the media player
try {
mMediaPlayer.setDisplay(holder);
mMediaPlayer.setScreenOnWhilePlaying(isScreenOnWhilePlaying());
mMediaPlayer.setOnCompletionListener(this);
if (!mMediaPrepared) {
mMediaPlayer.setDataSource(this, mMediaURI);
mMediaPlayer.prepareAsync();
}
} catch (IllegalStateException e) {
_fail(e);
} catch (IllegalArgumentException e) {
_fail(e);
} catch (SecurityException e) {
_fail(e);
} catch (IOException e) {
_fail(e);
}
}
}
private void _playMedia() throws IllegalArgumentException,
SecurityException, IllegalStateException, IOException {
Log.d(LOG_TAG, "_playMedia()");
_rearmPlayer();
/*
* The video should be in a streamable mp4 or 3gpp format. The URI
* scheme may be Http. Mediaplayer can only play
* "progressive streamable contents" which basically means: 1. the movie
* atom has to precede all the media data atoms. 2. The clip has to be
* reasonably interleaved.
*/
if (mMediaURI != null) {
Log.i(LOG_TAG, "Source: " + mMediaURI);
// Create a new media player and set the listeners
if (mMediaPlayer != null) {
mMediaPlayer.reset();
} else {
mMediaPlayer = new MediaPlayer();
}
// mMediaPlayer.setDataSource(this, mMediaURI);
mMediaPlayer.setOnBufferingUpdateListener(this);
mMediaPlayer.setOnPreparedListener(this);
mMediaPlayer.setLooping(isLooping());
mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mMediaPlayer.setOnVideoSizeChangedListener(this);
mMediaPlayer.setOnErrorListener(this); // register the error callback implemented below
if (mHolder != null) {
// surface created before the media player
mMediaPlayer.setDisplay(mHolder);
mMediaPlayer.setScreenOnWhilePlaying(isScreenOnWhilePlaying());
mMediaPlayer.setOnCompletionListener(this);
mMediaPlayer.setDataSource(this, mMediaURI);
mMediaPlayer.prepareAsync();
}
} else {
_fail("No video source defined");
}
}
// fail due to exception
private void _fail(Exception e) {
Log.e(LOG_TAG, e);
Intent data = new Intent();
data.putExtra(EXCEPTION, e);
finishTest(Activity.RESULT_FIRST_USER, data);
}
// fail due to generic error
private void _fail(String text) {
Log.e(LOG_TAG, text);
Intent data = new Intent();
data.putExtra(REASON, text);
finishTest(Activity.RESULT_FIRST_USER, data);
}
private void _pausePlayback() {
if (mMediaPlayer != null && mMediaPlayer.isPlaying()) {
Log.i(LOG_TAG, "Pause playback");
mMediaPlayer.pause();
// retain position for use by onSaveInstanceState(Bundle)
setStartPosition(mMediaPlayer.getCurrentPosition());
}
}
@Override
public void onBackPressed() {
super.onBackPressed();
Intent data = new Intent();
data.putExtra(REASON, "canceled by user");
finishTest(Activity.RESULT_CANCELED, data);
}
private void _releaseMediaPlayer() {
_stopPlayback();
if (mMediaPlayer != null) {
mMediaPlayer.release();
mMediaPlayer = null;
}
_rearmPlayer();
}
private void _stopPlayback() {
if (mMediaPlayer != null && mMediaPlayer.isPlaying()) {
_pausePlayback();
Log.i(LOG_TAG, "stop video");
mMediaPlayer.stop();
}
}
public boolean isPausingOnBackground() {
return mPauseOnBackground;
}
public void setPauseOnBackground(boolean pause) {
Log.d(LOG_TAG, "Pausing on background: " + pause);
this.mPauseOnBackground = pause;
}
public Uri getSourceURI() {
return mMediaURI;
}
public void setSourceURI(Uri uri) {
Log.d(LOG_TAG, "Media source: " + uri);
this.mMediaURI = uri;
}
public long getPlayTimeout() {
return mTimeout;
}
public void setPlayTimeout(long timeout) {
Log.d(LOG_TAG, "Play length (ms): " + timeout);
this.mTimeout = timeout;
}
public boolean isLooping() {
return mLoop;
}
public void setLooping(boolean loop) {
Log.d(LOG_TAG, "Is looping: " + loop);
this.mLoop = loop;
}
public int getStartPosition() {
int position = 0;
if (mMediaPlayer != null) {
position = mMediaPlayer.getCurrentPosition();
}
return position;
}
public void setStartPosition(int position) {
Log.d(LOG_TAG, String.format("Start at %fs", position / 1000.));
this.mStartPosition = position;
}
public boolean isScreenOnWhilePlaying() {
return mScreenOnWhilePlaying;
}
public void setScreenOnWhilePlaying(boolean screenOnWhilePlaying) {
Log.d(LOG_TAG, "Screen ON while playing: " + screenOnWhilePlaying);
this.mScreenOnWhilePlaying = screenOnWhilePlaying;
}
@Override
public boolean onError(MediaPlayer mp, int what, int extra) {
_fail(String.format("Media player error: what=%d, extra=%d", what, extra));
return true;
}
}
The resource for the layout containing the SurfaceView:
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent" >
<SurfaceView
android:id="@+id/surface"
android:layout_width="wrap_content"
android:layout_height="wrap_content" />
</FrameLayout>
An example of use of the activity:
Intent intent = new Intent(parent, MediaPlayerActivity.class);
intent.putExtra(MediaPlayerActivity.SOURCE_URI, this.getSourceURI());
intent.putExtra(MediaPlayerActivity.PAUSE_ON_BACKGROUND,
this.getPauseOnBackground());
Scenario scenario = sequencer.getScenario();
intent.putExtra(MediaPlayerActivity.SCENARIO, scenario.getScenarioId());
intent.putExtra(MediaPlayerActivity.SEQUENCER, sequencer.getPosition());
intent.putExtra(MediaPlayerActivity.TEST, TestUtil.getTestPath(
scenario.getCurrentTest(), TestUtil.Path.Static));
// timeout option
TimeSpan timeout = this.getTimeout();
if (!timeout.isNull()) {
timeout.renew();
intent.putExtra(MediaPlayerActivity.TIMEOUT, timeout.value());
}
intent.putExtra(MediaPlayerActivity.LOOP, this.isLooping());
startActivity(intent);

@Sock have you thought about using the YouTube API for Android? It has a great player view which gives you lots of control and event-listening ability on the video. I recently posted a tutorial on using the API if it might fit your needs; check it out here.

Related

OpenGL ES: Rotate the complete scene to match portrait mode

Note: I'm new to Android and OpenGL.
I'm building an augmented reality app based on ARToolKitX (GitHub: https://github.com/artoolkitx/artoolkitx/tree/8c6bd4e7be5e80c8439066b23473506aebbb496c/Source/ARXJ/ARXJProj/arxj/src/main/java/org/artoolkitx/arx/arxj).
The application shows the camera frame and displays objects with OpenGL on top.
My Problem:
ARToolKitX forces the app to be in landscape mode:
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
but when I change the screen orientation to SCREEN_ORIENTATION_PORTRAIT, the camera image and the OpenGL objects don't rotate to the correct orientation and stay in landscape mode.
Inside the ARRenderer I can use the drawVideoSettings method to rotate the camera image by itself, but that doesn't apply to the OpenGL objects.
ARToolKitX also provides a surfaceChanged method inside the CameraSurface class, with the comment: "This is where [...] to create transformation matrix to scale and then rotate surface view, if the app is going to handle orientation changes."
But I have no idea what the transformation matrix has to look like or how to apply it.
Any help is appreciated.
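For illustration only, a minimal sketch of such a transformation using android.opengl.Matrix could look like the code below. The 90-degree angle, the Z rotation axis, and applying the rotation to the projection matrix (rather than somewhere else in the ARToolKitX pipeline) are assumptions to verify, not something the library prescribes:
import android.opengl.Matrix;

// Hypothetical helper: premultiplies a Z-axis rotation onto the projection
// matrix so a scene rendered for landscape is turned for portrait display.
static float[] rotateForPortrait(float[] projectionMatrix) {
float[] rotation = new float[16];
Matrix.setRotateM(rotation, 0, 90f, 0f, 0f, 1f); // 90 degrees about the Z axis
float[] rotated = new float[16];
// rotated = rotation * projection, so the rotation is applied last, in clip space
Matrix.multiplyMM(rotated, 0, rotation, 0, projectionMatrix, 0);
return rotated;
}
The result would then be fed to the shader in place of the raw projection matrix (e.g. around the setProjectionMatrix call in draw() below).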
ARRenderer:
public abstract class ARRenderer implements GLSurfaceView.Renderer {
private MyShaderProgram shaderProgram;
private int width, height, cameraIndex;
private int[] viewport = new int[4];
private boolean firstRun = true;
private final static String TAG = ARRenderer.class.getName();
/**
* Allows subclasses to load markers and prepare the scene. This is called after
* initialisation is complete.
*/
public boolean configureARScene() {
return true;
}
public void onSurfaceCreated(GL10 unused, EGLConfig config) {
// Transparent background
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.f);
this.shaderProgram = new MyShaderProgram(new MyVertexShader(), new MyFragmentShader());
GLES20.glUseProgram(shaderProgram.getShaderProgramHandle());
}
public void onSurfaceChanged(GL10 unused, int w, int h) {
this.width = w;
this.height = h;
if(ARController.getInstance().isRunning()) {
//Update the frame settings for native rendering
ARController.getInstance().drawVideoSettings(cameraIndex, w, h, false, false, false, ARX_jni.ARW_H_ALIGN_CENTRE, ARX_jni.ARW_V_ALIGN_CENTRE, ARX_jni.ARW_SCALE_MODE_FILL, viewport);
}
}
public void onDrawFrame(GL10 unused) {
if (ARController.getInstance().isRunning()) {
// Initialize artoolkitX video background rendering.
if (firstRun) {
boolean isDisplayFrameInited = ARController.getInstance().drawVideoInit(cameraIndex);
if (!isDisplayFrameInited) {
Log.e(TAG, "Display Frame not inited");
}
if (!ARController.getInstance().drawVideoSettings(cameraIndex, this.width, this.height, false, false,
false, ARX_jni.ARW_H_ALIGN_CENTRE, ARX_jni.ARW_V_ALIGN_CENTRE,
ARX_jni.ARW_SCALE_MODE_FILL, viewport)) {
Log.e(TAG, "Error during call of displayFrameSettings.");
} else {
Log.i(TAG, "Viewport {" + viewport[0] + ", " + viewport[1] + ", " + viewport[2] + ", " + viewport[3] + "}.");
}
firstRun = false;
}
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
if (!ARController.getInstance().drawVideoSettings(cameraIndex)) {
Log.e(TAG, "Error during call of displayFrame.");
}
draw();
}
}
/**
* Should be overridden in subclasses and used to perform rendering.
*/
public void draw() {
GLES20.glViewport(viewport[0], viewport[1], viewport[2], viewport[3]);
//TODO: Check how to refactor near and far plane
shaderProgram.setProjectionMatrix(ARController.getInstance().getProjectionMatrix(10.0f, 10000.0f));
float[] camPosition = {1f, 1f, 1f};
shaderProgram.render(camPosition);
}
@SuppressWarnings("unused")
public ShaderProgram getShaderProgram() {
return shaderProgram;
}
public void setCameraIndex(int cameraIndex) {
this.cameraIndex = cameraIndex;
}
}
CameraSurface
class CameraSurfaceImpl implements CameraSurface {
/**
* Android logging tag for this class.
*/
private static final String TAG = CameraSurfaceImpl.class.getSimpleName();
private CameraDevice mCameraDevice;
private ImageReader mImageReader;
private Size mImageReaderVideoSize;
private final Context mAppContext;
private final CameraDevice.StateCallback mCamera2DeviceStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera2DeviceInstance) {
mCameraDevice = camera2DeviceInstance;
startCaptureAndForwardFramesSession();
}
@Override
public void onDisconnected(@NonNull CameraDevice camera2DeviceInstance) {
camera2DeviceInstance.close();
mCameraDevice = null;
}
@Override
public void onError(@NonNull CameraDevice camera2DeviceInstance, int error) {
camera2DeviceInstance.close();
mCameraDevice = null;
}
};
/**
* Listener to inform of camera related events: start, frame, and stop.
*/
private final CameraEventListener mCameraEventListener;
/**
* Tracks if SurfaceView instance was created.
*/
private boolean mImageReaderCreated;
public CameraSurfaceImpl(CameraEventListener cameraEventListener, Context appContext){
this.mCameraEventListener = cameraEventListener;
this.mAppContext = appContext;
}
private final ImageReader.OnImageAvailableListener mImageAvailableAndProcessHandler = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader)
{
Image imageInstance = reader.acquireLatestImage();
if (imageInstance == null) {
//Note: This seems to happen quite often.
Log.v(TAG, "onImageAvailable(): unable to acquire new image");
return;
}
// Get a ByteBuffer for each plane.
final Image.Plane[] imagePlanes = imageInstance.getPlanes();
final int imagePlaneCount = Math.min(4, imagePlanes.length); // We can handle up to 4 planes max.
final ByteBuffer[] imageBuffers = new ByteBuffer[imagePlaneCount];
final int[] imageBufferPixelStrides = new int[imagePlaneCount];
final int[] imageBufferRowStrides = new int[imagePlaneCount];
for (int i = 0; i < imagePlaneCount; i++) {
imageBuffers[i] = imagePlanes[i].getBuffer();
// For ImageFormat.YUV_420_888 the order of planes in the array returned by Image.getPlanes()
// is guaranteed such that plane #0 is always Y, plane #1 is always U (Cb), and plane #2 is always V (Cr).
// The Y-plane is guaranteed not to be interleaved with the U/V planes (in particular, pixel stride is
// always 1 in yPlane.getPixelStride()). The U/V planes are guaranteed to have the same row stride and
// pixel stride (in particular, uPlane.getRowStride() == vPlane.getRowStride() and uPlane.getPixelStride() == vPlane.getPixelStride(); ).
imageBufferPixelStrides[i] = imagePlanes[i].getPixelStride();
imageBufferRowStrides[i] = imagePlanes[i].getRowStride();
}
if (mCameraEventListener != null) {
mCameraEventListener.cameraStreamFrame(imageBuffers, imageBufferPixelStrides, imageBufferRowStrides);
}
imageInstance.close();
}
};
@Override
public void surfaceCreated() {
Log.i(TAG, "surfaceCreated(): called");
SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(mAppContext);
int defaultCameraIndexId = mAppContext.getResources().getIdentifier("pref_defaultValue_cameraIndex","string", mAppContext.getPackageName());
mCamera2DeviceID = Integer.parseInt(prefs.getString("pref_cameraIndex", mAppContext.getResources().getString(defaultCameraIndexId)));
Log.i(TAG, "surfaceCreated(): will attempt to open camera \"" + mCamera2DeviceID +
"\", set orientation, set preview surface");
/*
Set the resolution from the settings as size for the glView. Because the video stream capture
is requested based on this size.
WARNING: While coding the preferences are taken from the res/xml/preferences.xml!!!
When building for Unity the actual used preferences are taken from the UnityARPlayer project!!!
*/
int defaultCameraValueId = mAppContext.getResources().getIdentifier("pref_defaultValue_cameraResolution","string",mAppContext.getPackageName());
String camResolution = prefs.getString("pref_cameraResolution", mAppContext.getResources().getString(defaultCameraValueId));
String[] dims = camResolution.split("x", 2);
mImageReaderVideoSize = new Size(Integer.parseInt(dims[0]),Integer.parseInt(dims[1]));
// Note that maxImages should be at least 2 for acquireLatestImage() to be any different than acquireNextImage() -
// discarding all-but-the-newest Image requires temporarily acquiring two Images at once. Or more generally,
// calling acquireLatestImage() with less than two images of margin, that is (maxImages - currentAcquiredImages < 2)
// will not discard as expected.
mImageReader = ImageReader.newInstance(mImageReaderVideoSize.getWidth(),mImageReaderVideoSize.getHeight(), ImageFormat.YUV_420_888, /* The maximum number of images the user will want to access simultaneously:*/ 2 );
mImageReader.setOnImageAvailableListener(mImageAvailableAndProcessHandler, null);
mImageReaderCreated = true;
} // end: public void surfaceCreated(SurfaceHolder holder)
/* Interface implemented by this SurfaceView subclass
holder: SurfaceHolder instance associated with SurfaceView instance that changed
format: pixel format of the surface
width: of the SurfaceView instance
height: of the SurfaceView instance
*/
@Override
public void surfaceChanged() {
Log.i(TAG, "surfaceChanged(): called");
// This is where to calculate the optimal size of the display and set the aspect ratio
// of the surface view (probably the service holder). Also where to Create transformation
// matrix to scale and then rotate surface view, if the app is going to handle orientation
// changes.
if (!mImageReaderCreated) {
surfaceCreated();
}
if (!isCamera2DeviceOpen()) {
openCamera2(mCamera2DeviceID);
}
if (isCamera2DeviceOpen() && (null == mYUV_CaptureAndSendSession)) {
startCaptureAndForwardFramesSession();
}
}
private void openCamera2(int camera2DeviceID) {
Log.i(TAG, "openCamera2(): called");
CameraManager camera2DeviceMgr = (CameraManager)mAppContext.getSystemService(Context.CAMERA_SERVICE);
try {
if (PackageManager.PERMISSION_GRANTED == ContextCompat.checkSelfPermission(mAppContext, Manifest.permission.CAMERA)) {
camera2DeviceMgr.openCamera(Integer.toString(camera2DeviceID), mCamera2DeviceStateCallback, null);
return;
}
} catch (CameraAccessException ex) {
Log.e(TAG, "openCamera2(): CameraAccessException caught, " + ex.getMessage());
} catch (Exception ex) {
Log.e(TAG, "openCamera2(): exception caught, " + ex.getMessage());
}
if (null == camera2DeviceMgr) {
Log.e(TAG, "openCamera2(): Camera2 DeviceMgr not set");
}
Log.e(TAG, "openCamera2(): abnormal exit");
}
private int mCamera2DeviceID = -1;
private CaptureRequest.Builder mCaptureRequestBuilder;
private CameraCaptureSession mYUV_CaptureAndSendSession;
private void startCaptureAndForwardFramesSession() {
if ((null == mCameraDevice) || (!mImageReaderCreated) /*|| (null == mPreviewSize)*/) {
return;
}
closeYUV_CaptureAndForwardSession();
try {
mCaptureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
List<Surface> surfaces = new ArrayList<>();
Surface surfaceInstance;
surfaceInstance = mImageReader.getSurface();
surfaces.add(surfaceInstance);
mCaptureRequestBuilder.addTarget(surfaceInstance);
mCameraDevice.createCaptureSession(
surfaces, // Output surfaces
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
try {
if (mCameraEventListener != null) {
mCameraEventListener.cameraStreamStarted(mImageReaderVideoSize.getWidth(), mImageReaderVideoSize.getHeight(), "YUV_420_888", mCamera2DeviceID, false);
}
mYUV_CaptureAndSendSession = session;
// Session to repeat request to update passed in camSensorSurface
mYUV_CaptureAndSendSession.setRepeatingRequest(mCaptureRequestBuilder.build(), /* CameraCaptureSession.CaptureCallback cameraEventListener: */null, /* Background thread: */ null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
Toast.makeText(mAppContext, "Unable to setup camera sensor capture session", Toast.LENGTH_SHORT).show();
}
}, // Callback for capture session state updates
null); // Secondary thread message queue
} catch (CameraAccessException ex) {
ex.printStackTrace();
}
}
@Override
public void closeCameraDevice() {
closeYUV_CaptureAndForwardSession();
if (null != mCameraDevice) {
mCameraDevice.close();
mCameraDevice = null;
}
if (null != mImageReader) {
mImageReader.close();
mImageReader = null;
}
if (mCameraEventListener != null) {
mCameraEventListener.cameraStreamStopped();
}
mImageReaderCreated = false;
}
private void closeYUV_CaptureAndForwardSession() {
if (mYUV_CaptureAndSendSession != null) {
mYUV_CaptureAndSendSession.close();
mYUV_CaptureAndSendSession = null;
}
}
/**
* Indicates whether or not camera2 device instance is available, opened, enabled.
*/
@Override
public boolean isCamera2DeviceOpen() {
return (null != mCameraDevice);
}
@Override
public boolean isImageReaderCreated() {
return mImageReaderCreated;
}
}
Edit:
/**
* Override the draw function from ARRenderer.
*/
@Override
public void draw() {
super.draw();
fpsCounter.frame();
if(maxfps<fpsCounter.getFPS()){
maxfps= fpsCounter.getFPS();
}
logger.log(Level.INFO, "FPS: " + maxfps);
// Initialize GL
GLES20.glEnable(GLES20.GL_CULL_FACE);
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glFrontFace(GLES20.GL_CCW);
// Look for trackables, and draw on each found one.
for (int trackableUID : trackables.keySet()) {
// If the trackable is visible, apply its transformation, and render the object
float[] modelViewMatrix = new float[16];
if (ARController.getInstance().queryTrackableVisibilityAndTransformation(trackableUID, modelViewMatrix)) {
float[] projectionMatrix = ARController.getInstance().getProjectionMatrix(10.0f, 10000.0f);
trackables.get(trackableUID).draw(projectionMatrix, modelViewMatrix);
}
}
}

Unable to read the FNC1 character at the first position of a GS1-128 barcode using the ZBar library?

I have developed an application for barcode decoding on Android, using the Google Vision library for GS1 DataMatrix and the ZBar library for GS1-128 barcodes. I am unable to read the FNC1 character at the first position of a GS1-128 barcode using the ZBar library.
The ZBar library does not show any sign of the FNC1 character at the start of the barcode!
Any solutions? Quick help would be appreciated.
Below is my ZBar Scanner Activity
@SuppressWarnings("deprecation")
public class ZBarFirstScannerActivity extends AppCompatActivity{
//TextView tv;
ImageView iv;
LinearLayout ll;
private Camera mCamera;
private CameraPreview mPreview;
private Handler autoFocusHandler;
private ImageScanner scanner;
private boolean barcodeScanned = false;
private boolean previewing = true;
TextView tv;
static {
System.loadLibrary("iconv");
}
static {
System.loadLibrary("zbarjni");
}
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.barcode_capture1d);
tv = (TextView) findViewById(R.id.textVertical);
tv.setRotation(90);
initToolbar();
autoFocusHandler = new Handler();
mCamera = getCameraInstance();
// Instance barcode scanner
scanner = new ImageScanner();
scanner.setConfig(0, Config.X_DENSITY, 1);
scanner.setConfig(0, Config.Y_DENSITY, 1);
scanner.setConfig(Symbol.CODE128, Config.ENABLE,1);
scanner.setConfig(Symbol.EAN13, Config.ENABLE,1);
mPreview = new CameraPreview(this, mCamera, previewCb, autoFocusCB);
FrameLayout preview = (FrameLayout)findViewById(R.id.cameraPreview);
preview.addView(mPreview);
}
private void initToolbar() {
final Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
final ActionBar actionBar = getSupportActionBar();
if (actionBar != null) {
actionBar.setHomeButtonEnabled(true);
actionBar.setHomeAsUpIndicator(ContextCompat.getDrawable(this, R.drawable.abc_ic_ab_back_mtrl_am_alpha));
actionBar.setDisplayHomeAsUpEnabled(true);
}
}
/** A safe way to get an instance of the Camera object. */
public static Camera getCameraInstance()
{
Camera c = null;
try
{
c = Camera.open();
} catch (Exception e)
{
//nada
}
return c;
}
private void releaseCamera()
{
if (mCamera != null)
{
previewing = false;
mCamera.setPreviewCallback(null);
mCamera.release();
mCamera = null;
}
}
PreviewCallback previewCb = new PreviewCallback()
{
public void onPreviewFrame(byte[] data, Camera camera)
{
Camera.Parameters parameters = camera.getParameters();
Size size = parameters.getPreviewSize();
Image barcode = new Image(size.width, size.height, "Y800");
barcode.setData(data);
int result = scanner.scanImage(barcode);
if (result != 0)
{
previewing = false;
mCamera.setPreviewCallback(null);
mCamera.stopPreview();
SymbolSet syms = scanner.getResults();
for (Symbol sym : syms)
{
barcodeScanned = true;
Intent returnIntent = new Intent();
returnIntent.putExtra("BARCODE", sym.getData());
setResult(MainActivity.BAR_CODE_TYPE_128,returnIntent);
releaseCamera();
finish();
break;
}
}
}
};
// Mimic continuous auto-focusing
AutoFocusCallback autoFocusCB = new AutoFocusCallback()
{
public void onAutoFocus(boolean success, Camera camera)
{
autoFocusHandler.postDelayed(doAutoFocus, 3000);
}
};
private Runnable doAutoFocus = new Runnable()
{
public void run()
{
if (previewing)
mCamera.autoFocus(autoFocusCB);
}
};
public void onPause() {
super.onPause();
releaseCamera();
}
public void onResume(){
super.onResume();
// Reacquire the camera if needed; instantiating the activity with "new" here was a no-op.
if (mCamera == null) {
mCamera = getCameraInstance();
}
}
@Override
public void onBackPressed() {
releaseCamera();
finish();
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
int id = item.getItemId();
if (id == android.R.id.home) {
onBackPressed();
return true;
}
return super.onOptionsItemSelected(item);
}
}
Below is my Google Scanner Activity
public final class GoogleScannerActivity extends AppCompatActivity {
private static final String TAG = "Barcode-reader";
// intent request code to handle updating play services if needed.
private static final int RC_HANDLE_GMS = 9001;
// permission request codes need to be < 256
private static final int RC_HANDLE_CAMERA_PERM = 2;
// constants used to pass extra data in the intent
public static final String AutoFocus = "AutoFocus";
public static final String UseFlash = "UseFlash";
public static final String BarcodeObject = "Barcode";
Bitmap bmp;
FileOutputStream fos = null;
private Camera c;
Switch aSwitch;
private CameraSource mCameraSource;
private CameraSourcePreview mPreview;
private GraphicOverlay<BarcodeGraphic> mGraphicOverlay;
// helper objects for detecting taps and pinches.
private ScaleGestureDetector scaleGestureDetector;
private GestureDetector gestureDetector;
/**
* Initializes the UI and creates the detector pipeline.
*/
@Override
public void onCreate(Bundle icicle) {
super.onCreate(icicle);
setContentView(R.layout.barcode_capture2d);
initToolbar();
ActivitySource.caller = this;
mPreview = (CameraSourcePreview) findViewById(R.id.preview);
mGraphicOverlay = (GraphicOverlay<BarcodeGraphic>) findViewById(R.id.graphicOverlay);
boolean autoFocus = true;
boolean useFlash = false;
// Check for the camera permission before accessing the camera. If the
// permission is not granted yet, request permission.
int rc = ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA);
if (rc == PackageManager.PERMISSION_GRANTED) {
createCameraSource(autoFocus, useFlash);
} else {
requestCameraPermission();
}
gestureDetector = new GestureDetector(this, new CaptureGestureListener());
scaleGestureDetector = new ScaleGestureDetector(this, new ScaleListener());
/*Snackbar.make(mGraphicOverlay, "Tap to capture. Pinch/Stretch to zoom",
Snackbar.LENGTH_LONG)
.show();*/
}
private void initToolbar() {
final Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
final ActionBar actionBar = getSupportActionBar();
if (actionBar != null) {
actionBar.setHomeButtonEnabled(true);
actionBar.setHomeAsUpIndicator(ContextCompat.getDrawable(this, R.drawable.abc_ic_ab_back_mtrl_am_alpha));
actionBar.setDisplayHomeAsUpEnabled(true);
}
}
private Camera.Size getBestPreviewSize(int width, int height, Camera.Parameters parameters){
Camera.Size bestSize = null;
List<Camera.Size> sizeList = parameters.getSupportedPreviewSizes();
bestSize = sizeList.get(0);
for(int i = 1; i < sizeList.size(); i++){
if((sizeList.get(i).width * sizeList.get(i).height) >
(bestSize.width * bestSize.height)){
bestSize = sizeList.get(i);
}
}
return bestSize;
}
/**
* Handles the requesting of the camera permission. This includes
* showing a "Snackbar" message of why the permission is needed then
* sending the request.
*/
private void requestCameraPermission() {
Log.w(TAG, "Camera permission is not granted. Requesting permission");
final String[] permissions = new String[]{Manifest.permission.CAMERA};
if (!ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.CAMERA)) {
ActivityCompat.requestPermissions(this, permissions, RC_HANDLE_CAMERA_PERM);
return;
}
final Activity thisActivity = this;
View.OnClickListener listener = new View.OnClickListener() {
@Override
public void onClick(View view) {
ActivityCompat.requestPermissions(thisActivity, permissions,
RC_HANDLE_CAMERA_PERM);
}
};
Snackbar.make(mGraphicOverlay, R.string.permission_camera_rationale,
Snackbar.LENGTH_INDEFINITE)
.setAction(R.string.ok, listener)
.show();
}
@Override
public boolean onTouchEvent(MotionEvent e) {
boolean b = scaleGestureDetector.onTouchEvent(e);
boolean c = gestureDetector.onTouchEvent(e);
return b || c || super.onTouchEvent(e);
}
/**
* Creates and starts the camera. Note that this uses a higher resolution in comparison
* to other detection examples to enable the barcode detector to detect small barcodes
* at long distances.
*
* Suppressing InlinedApi since there is a check that the minimum version is met before using
* the constant.
*/
@SuppressLint("InlinedApi")
private void createCameraSource(boolean autoFocus, boolean useFlash) {
Context context = getApplicationContext();
// A barcode detector is created to track barcodes. An associated multi-processor instance
// is set to receive the barcode detection results, track the barcodes, and maintain
// graphics for each barcode on screen. The factory is used by the multi-processor to
// create a separate tracker instance for each barcode.
BarcodeDetector barcodeDetector = new BarcodeDetector.Builder(context).setBarcodeFormats(Barcode.CODE_128 | Barcode.DATA_MATRIX | Barcode.QR_CODE).build();
BarcodeTrackerFactory barcodeFactory = new BarcodeTrackerFactory(mGraphicOverlay);
barcodeDetector.setProcessor(
new MultiProcessor.Builder<>(barcodeFactory).build());
if (!barcodeDetector.isOperational()) {
// Note: The first time that an app using the barcode or face API is installed on a
// device, GMS will download native libraries to the device in order to do detection.
// Usually this completes before the app is run for the first time. But if that
// download has not yet completed, then the above call will not detect any barcodes
// and/or faces.
//
// isOperational() can be used to check if the required native libraries are currently
// available. The detectors will automatically become operational once the library
// downloads complete on device.
Log.w(TAG, "Detector dependencies are not yet available.");
// Check for low storage. If there is low storage, the native library will not be
// downloaded, so detection will not become operational.
IntentFilter lowstorageFilter = new IntentFilter(Intent.ACTION_DEVICE_STORAGE_LOW);
boolean hasLowStorage = registerReceiver(null, lowstorageFilter) != null;
if (hasLowStorage) {
Toast.makeText(this, R.string.low_storage_error, Toast.LENGTH_LONG).show();
Log.w(TAG, getString(R.string.low_storage_error));
}
}
// Creates and starts the camera. Note that this uses a higher resolution in comparison
// to other detection examples to enable the barcode detector to detect small barcodes
// at long distances.
CameraSource.Builder builder = new CameraSource.Builder(getApplicationContext(), barcodeDetector)
.setFacing(CameraSource.CAMERA_FACING_BACK)
.setRequestedPreviewSize(1100, 844)
.setRequestedFps(15.0f);
// make sure that auto focus is an available option
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
builder = builder.setFocusMode(
autoFocus ? Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE : null);
}
mCameraSource = builder
.setFlashMode(useFlash ? Camera.Parameters.FLASH_MODE_TORCH : null)
.build();
}
/**
* Restarts the camera.
*/
@Override
protected void onResume() {
super.onResume();
startCameraSource();
}
/**
* Stops the camera.
*/
@Override
protected void onPause() {
super.onPause();
if (mPreview != null) {
mPreview.stop();
}
}
/**
* Releases the resources associated with the camera source, the associated detectors, and the
* rest of the processing pipeline.
*/
@Override
protected void onDestroy() {
super.onDestroy();
if (mPreview != null) {
mPreview.release();
}
}
@Override
public void onRequestPermissionsResult(int requestCode,
@NonNull String[] permissions,
@NonNull int[] grantResults) {
if (requestCode != RC_HANDLE_CAMERA_PERM) {
Log.d(TAG, "Got unexpected permission result: " + requestCode);
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
return;
}
if (grantResults.length != 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
Log.d(TAG, "Camera permission granted - initialize the camera source");
// we have permission, so create the camerasource
boolean autoFocus = getIntent().getBooleanExtra(AutoFocus,false);
boolean useFlash = getIntent().getBooleanExtra(UseFlash, false);
createCameraSource(autoFocus, useFlash);
return;
}
Log.e(TAG, "Permission not granted: results len = " + grantResults.length +
" Result code = " + (grantResults.length > 0 ? grantResults[0] : "(empty)"));
DialogInterface.OnClickListener listener = new DialogInterface.OnClickListener() {
public void onClick(DialogInterface dialog, int id) {
finish();
}
};
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setTitle("Multitracker sample")
.setMessage(R.string.no_camera_permission)
.setPositiveButton(R.string.ok, listener)
.show();
}
/**
* Starts or restarts the camera source, if it exists. If the camera source doesn't exist yet
* (e.g., because onResume was called before the camera source was created), this will be called
* again when the camera source is created.
*/
private void startCameraSource() throws SecurityException {
// check that the device has play services available.
int code = GoogleApiAvailability.getInstance().isGooglePlayServicesAvailable(
getApplicationContext());
if (code != ConnectionResult.SUCCESS) {
Dialog dlg =
GoogleApiAvailability.getInstance().getErrorDialog(this, code, RC_HANDLE_GMS);
dlg.show();
}
if (mCameraSource != null) {
try {
mPreview.start(mCameraSource, mGraphicOverlay);
} catch (IOException e) {
Log.e(TAG, "Unable to start camera source.", e);
mCameraSource.release();
mCameraSource = null;
}
}
}
/**
* onTap is called to capture the oldest barcode currently detected and
* return it to the caller.
*
* @param rawX - the raw position of the tap
* @param rawY - the raw position of the tap.
* @return true if the activity is ending.
*/
private boolean onTap(float rawX, float rawY) {
//TODO: use the tap position to select the barcode.
BarcodeGraphic graphic = mGraphicOverlay.getFirstGraphic();
Barcode barcode = null;
if (graphic != null) {
barcode = graphic.getBarcode();
if (barcode != null) {
Intent data = new Intent();
data.putExtra(BarcodeObject, barcode);
setResult(CommonStatusCodes.SUCCESS, data);
finish();
}
else {
Log.d(TAG, "barcode data is null");
}
}
else {
Log.d(TAG,"no barcode detected");
}
return barcode != null;
}
private class CaptureGestureListener extends GestureDetector.SimpleOnGestureListener {
@Override
public boolean onSingleTapConfirmed(MotionEvent e) {
return onTap(e.getRawX(), e.getRawY()) || super.onSingleTapConfirmed(e);
}
}
private class ScaleListener implements ScaleGestureDetector.OnScaleGestureListener {
/**
* Responds to scaling events for a gesture in progress.
* Reported by pointer motion.
*
* @param detector The detector reporting the event - use this to
* retrieve extended info about event state.
* @return Whether or not the detector should consider this event
* as handled. If an event was not handled, the detector
* will continue to accumulate movement until an event is
* handled. This can be useful if an application, for example,
* only wants to update scaling factors if the change is
* greater than 0.01.
*/
@Override
public boolean onScale(ScaleGestureDetector detector) {
return false;
}
/**
* Responds to the beginning of a scaling gesture. Reported by
* new pointers going down.
*
* @param detector The detector reporting the event - use this to
* retrieve extended info about event state.
* @return Whether or not the detector should continue recognizing
* this gesture. For example, if a gesture is beginning
* with a focal point outside of a region where it makes
* sense, onScaleBegin() may return false to ignore the
* rest of the gesture.
*/
@Override
public boolean onScaleBegin(ScaleGestureDetector detector) {
return true;
}
/**
* Responds to the end of a scale gesture. Reported by existing
* pointers going up.
* <p/>
* Once a scale has ended, {@link ScaleGestureDetector#getFocusX()}
* and {@link ScaleGestureDetector#getFocusY()} will return the focal point
* of the pointers remaining on the screen.
*
* @param detector The detector reporting the event - use this to
* retrieve extended info about event state.
*/
@Override
public void onScaleEnd(ScaleGestureDetector detector) {
mCameraSource.doZoom(detector.getScaleFactor());
}
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
int id = item.getItemId();
if (id == android.R.id.home) {
onBackPressed();
return true;
}
return super.onOptionsItemSelected(item);
}
}
When scanning a GS1-128 symbol, the FNC1 in first position acts as a flag character to indicate the presence of GS1 Application Identifier standard format data and is intentionally omitted from the scanned data, whilst any inter-field FNC1 formatting character is transmitted as GS (ASCII 29).
The implicit, leading FNC1 can be inferred if your reader is configured to emit symbology identifiers at the start of the scanned data. In this case your GS1-128 scanned data will begin with ]C1 rather than ]C0 for generic Code 128.
Unfortunately it does not appear as though either the ZBar library or the Google Vision library can be configured to return symbology identifiers, which is a disappointing limitation.
Additionally, the Google Vision library erroneously returns a leading GS representing the FNC1 in first position.
Reading of GS1 formatted data is described in detail by this answer.
Specifically the ISO/IEC 15417 - Code 128 bar code symbology specification says:
"Any application which utilizes Code
128 symbols with FNC1 in the first or second data position should
require the transmission of symbology identifiers to be enabled. When
FNC1 is used in the first or second position it shall not be
represented in the transmitted message, although its presence is
indicated by the use of modifier values 1 or 2 respectively in the
symbology identifier."
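To illustrate what consuming such data could look like with a reader that does emit symbology identifiers (which, per the above, neither ZBar nor Google Vision appears able to do), here is a rough sketch. The method name is made up, and the parsing is deliberately simplified to splitting variable-length fields on GS; real GS1 parsing must also handle fixed-length AIs from the GS1 Application Identifier table:
// Sketch: detect GS1-128 by its symbology identifier and split element
// strings on the GS separator (ASCII 29).
static void handleScan(String transmitted) {
if (transmitted.startsWith("]C1")) { // FNC1 in first position: GS1-128
String data = transmitted.substring(3);
for (String element : data.split("\u001D")) { // inter-field FNC1 arrives as GS
System.out.println("AI element string: " + element);
}
} else if (transmitted.startsWith("]C0")) { // generic Code 128
System.out.println("Plain Code 128 data: " + transmitted.substring(3));
}
}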

Issue with OCR Scan on Android

I'm trying to create an app with an OCR scanner using the tesseract/tess-two library. I have successfully accessed the phone camera and I can focus manually, but when I take a picture I get the following error:
07-18 19:07:06.335 2585-2585/com.fastnetserv.myapp D/DBG_com.fastnetserv.myapp.MainActivity: Picture taken
07-18 19:07:06.335 2585-2585/com.fastnetserv.myapp D/DBG_com.fastnetserv.myapp.MainActivity: Got null data
07-18 19:07:06.405 2585-2585/com.fastnetserv.myapp D/DBG_com.fastnetserv.myapp.MainActivity: Picture taken
07-18 19:07:06.426 2585-2585/com.fastnetserv.myapp D/DBG_com.fastnetserv.myapp.MainActivity: Got bitmap
07-18 19:07:06.427 2585-11599/com.fastnetserv.myapp E/DBG_com.fastnetserv.myapp.TessAsyncEngine: Error passing parameter to execute(context, bitmap)
07-18 19:07:14.111 2585-2585/com.fastnetserv.myapp D/DBG_com.fastnetserv.myapp.CameraUtils: CameraEngine Stopped
Here is the CameraFragment code:
package com.fastnetserv.myapp;
import android.content.Context;
import android.graphics.Bitmap;
import android.hardware.Camera;
import android.net.Uri;
import android.os.AsyncTask;
import android.os.Bundle;
import android.support.v4.app.Fragment;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Button;
import com.googlecode.tesseract.android.TessBaseAPI;
/**
* A simple {@link Fragment} subclass.
* Activities that contain this fragment must implement the
* {@link //CameraFragment.//OnFragmentInteractionListener} interface
* to handle interaction events.
* Use the {@link CameraFragment#newInstance} factory method to
* create an instance of this fragment.
*/
public class CameraFragment extends Fragment implements SurfaceHolder.Callback, View.OnClickListener,
Camera.PictureCallback, Camera.ShutterCallback {
static final String TAG = "DBG_" + MainActivity.class.getName();
Button shutterButton;
Button focusButton;
FocusBoxView focusBox;
SurfaceView cameraFrame;
CameraEngine cameraEngine;
// TODO: Rename parameter arguments, choose names that match
// the fragment initialization parameters, e.g. ARG_ITEM_NUMBER
private static final String ARG_PARAM1 = "param1";
private static final String ARG_PARAM2 = "param2";
// TODO: Rename and change types of parameters
private String mParam1;
private String mParam2;
private OnFragmentInteractionListener mListener;
public CameraFragment() {
// Required empty public constructor
}
/**
* Use this factory method to create a new instance of
* this fragment using the provided parameters.
*
* @param param1 Parameter 1.
* @param param2 Parameter 2.
* @return A new instance of fragment CameraFragment.
*/
// TODO: Rename and change types and number of parameters
public static CameraFragment newInstance(String param1, String param2) {
CameraFragment fragment = new CameraFragment();
Bundle args = new Bundle();
args.putString(ARG_PARAM1, param1);
args.putString(ARG_PARAM2, param2);
fragment.setArguments(args);
return fragment;
}
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (getArguments() != null) {
mParam1 = getArguments().getString(ARG_PARAM1);
mParam2 = getArguments().getString(ARG_PARAM2);
}
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
// Inflate the layout for this fragment
return inflater.inflate(R.layout.fragment_camera, container, false);
}
// TODO: Rename method, update argument and hook method into UI event
public void onButtonPressed(Uri uri) {
if (mListener != null) {
mListener.onFragmentInteraction(uri);
}
}
@Override
public void onAttach(Context context) {
super.onAttach(context);
try {
mListener = (OnFragmentInteractionListener) context;
} catch (ClassCastException e) {
throw new ClassCastException(context.toString()
+ " must implement OnFragmentInteractionListener");
}
}
@Override
public void onDetach() {
super.onDetach();
mListener = null;
}
// Camera Code
public String detectText(Bitmap bitmap) {
TessDataManager.initTessTrainedData(getActivity());
TessBaseAPI tessBaseAPI = new TessBaseAPI();
String path = "/mnt/sdcard/com.fastnetserv.myapp/tessdata/ita.traineddata";
Log.d(TAG, "Check data path: " + path);
tessBaseAPI.setDebug(true);
tessBaseAPI.init(path, "ita"); //Init the Tess with the trained data file, with english language
//For example if we want to only detect numbers
tessBaseAPI.setVariable(TessBaseAPI.VAR_CHAR_WHITELIST, "1234567890");
tessBaseAPI.setVariable(TessBaseAPI.VAR_CHAR_BLACKLIST, "!##$%^&*()_+=-qwertyuiop[]}{POIU" +
"YTREWQasdASDfghFGHjklJKLl;L:'\"\\|~`xcvXCVbnmBNM,./<>?");
tessBaseAPI.setImage(bitmap);
String text = tessBaseAPI.getUTF8Text();
//Log.d(TAG, "Got data: " + result);
tessBaseAPI.end();
return text;
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
Log.d(TAG, "Surface Created - starting camera");
if (cameraEngine != null && !cameraEngine.isOn()) {
cameraEngine.start();
}
if (cameraEngine != null && cameraEngine.isOn()) {
Log.d(TAG, "Camera engine already on");
return;
}
cameraEngine = CameraEngine.New(holder);
cameraEngine.start();
Log.d(TAG, "Camera engine started");
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
}
@Override
public void onResume() {
super.onResume();
cameraFrame = (SurfaceView) getActivity().findViewById(R.id.camera_frame);
shutterButton = (Button) getActivity().findViewById(R.id.shutter_button);
focusBox = (FocusBoxView) getActivity().findViewById(R.id.focus_box);
focusButton = (Button) getActivity().findViewById(R.id.focus_button);
shutterButton.setOnClickListener(this);
focusButton.setOnClickListener(this);
SurfaceHolder surfaceHolder = cameraFrame.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
cameraFrame.setOnClickListener(this);
}
@Override
public void onPause() {
super.onPause();
if (cameraEngine != null && cameraEngine.isOn()) {
cameraEngine.stop();
}
SurfaceHolder surfaceHolder = cameraFrame.getHolder();
surfaceHolder.removeCallback(this);
}
@Override
public void onClick(View v) {
if(v == shutterButton){
if(cameraEngine != null && cameraEngine.isOn()){
cameraEngine.takeShot(this, this, this);
}
}
if(v == focusButton){
if(cameraEngine!=null && cameraEngine.isOn()){
cameraEngine.requestFocus();
}
}
}
@Override
public void onPictureTaken(byte[] data, Camera camera) {
Log.d(TAG, "Picture taken");
if (data == null) {
Log.d(TAG, "Got null data");
return;
}
Bitmap bmp = Tools.getFocusedBitmap(getActivity(), camera, data, focusBox.getBox());
Log.d(TAG, "Got bitmap");
new TessAsyncEngine().executeOnExecutor(AsyncTask.SERIAL_EXECUTOR, this, bmp);
}
@Override
public void onShutter() {
}
}
And here is the TessAsyncEngine:
package com.fastnetserv.myapp;
import android.app.Activity;
import android.graphics.Bitmap;
import android.os.AsyncTask;
import android.util.Log;
import com.fastnetserv.myapp.ImageDialog;
import com.fastnetserv.myapp.Tools;
/**
* Created by Fadi on 6/11/2014.
*/
public class TessAsyncEngine extends AsyncTask<Object, Void, String> {
static final String TAG = "DBG_" + TessAsyncEngine.class.getName();
private Bitmap bmp;
private Activity context;
@Override
protected String doInBackground(Object... params) {
try {
if(params.length < 2) {
Log.e(TAG, "Error passing parameter to execute - missing params");
return null;
}
if(!(params[0] instanceof Activity) || !(params[1] instanceof Bitmap)) {
Log.e(TAG, "Error passing parameter to execute(context, bitmap)");
return null;
}
context = (Activity)params[0];
bmp = (Bitmap)params[1];
if(context == null || bmp == null) {
Log.e(TAG, "Error passed null parameter to execute(context, bitmap)");
return null;
}
int rotate = 0;
if(params.length == 3 && params[2]!= null && params[2] instanceof Integer){
rotate = (Integer) params[2];
}
if(rotate >= -180 && rotate <= 180 && rotate != 0)
{
bmp = Tools.preRotateBitmap(bmp, rotate);
Log.d(TAG, "Rotated OCR bitmap " + rotate + " degrees");
}
TessEngine tessEngine = TessEngine.Generate(context);
bmp = bmp.copy(Bitmap.Config.ARGB_8888, true);
String result = tessEngine.detectText(bmp);
Log.d(TAG, result);
return result;
} catch (Exception ex) {
Log.d(TAG, "Error: " + ex + "\n" + ex.getMessage());
}
return null;
}
@Override
protected void onPostExecute(String s) {
if(s == null || bmp == null || context == null)
return;
ImageDialog.New()
.addBitmap(bmp)
.addTitle(s)
.show(context.getFragmentManager(), TAG);
super.onPostExecute(s);
}
}
I have followed this tutorial (http://www.codeproject.com/Tips/840623/Android-Character-Recognition) but I probably forgot something, due to my limited knowledge of Android.
That if(context == null || bmp == null) check is not needed, as you already tested those values with instanceof.
But I'm guessing your main problem is passing this from the Fragment as the Activity parameter, which it is not: a Fragment is not an Activity, so the instanceof Activity check fails and you get exactly that log message.
To fix it: overall I would try not to toss Activity pointers around wildly, as those have quite a limited life cycle on Android. I have an app with tess-two and I don't recall ever needing an Activity to init it (although I usually init it from native C++, so YMMV).
Isn't just a Context needed for that call? If so, I would suggest moving to the getApplicationContext() value instead. I think this is directly or indirectly accessible from a Fragment too.
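Concretely, a minimal sketch of the immediate fix, based on the onPictureTaken() code above: pass the hosting Activity rather than the Fragment, so the instanceof check passes.
// In CameraFragment.onPictureTaken(): 'this' is the Fragment, not an Activity.
// Passing getActivity() satisfies the instanceof Activity test in TessAsyncEngine.
new TessAsyncEngine().executeOnExecutor(AsyncTask.SERIAL_EXECUTOR, getActivity(), bmp);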
Sorry for not trying your code, but this is something you can debug quite easily.
One more note on Android and Tesseract usage. What is Tools.getFocusedBitmap? Will it cut the picture down reasonably? If it keeps the full size, and your camera is set to full resolution, you are tossing around 5-10+ MP bitmaps, which on Android means hitting Out-Of-Memory (OOM) almost instantly. Either set the camera to a reasonably low resolution, or cut out the desired part of the photo ASAP and drop the whole image, ideally as the first step of processing.
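As an illustration, a sketch (assuming data is the JPEG byte[] delivered to onPictureTaken()) of cutting the pixel count before anything else touches the image:
// Decode at 1/4 of the width and height, i.e. 1/16 of the pixels.
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inSampleSize = 4;
Bitmap small = BitmapFactory.decodeByteArray(data, 0, data.length, opts);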
Also, you may want to reconsider the whole tess-two approach and try the official Google Text API from Google Play Services.
https://developers.google.com/android/reference/com/google/android/gms/vision/text/Text
It's a brand new addition; my guess is that inside it uses the second generation of the Tesseract engine with the latest improvements, so it is very likely to give better results and better speed than tess-two.
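Roughly, usage looks like this (a hedged sketch against the play-services-vision classes; bmp and context are assumed to exist, imports omitted):
// Minimal sketch of the Play Services text detector run on a Bitmap.
TextRecognizer recognizer = new TextRecognizer.Builder(context).build();
if (recognizer.isOperational()) {
    Frame frame = new Frame.Builder().setBitmap(bmp).build();
    SparseArray<TextBlock> blocks = recognizer.detect(frame);
    for (int i = 0; i < blocks.size(); i++) {
        Log.d(TAG, blocks.valueAt(i).getValue());
    }
}
recognizer.release(); // free the native detector when done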
I think it's only accessible from Android 4.4 on and only on devices with Google Play Services, and its cross-platform story is poor, so I'm staying with tess-two in my projects, as I also have to support iOS and Windows Phone.
And generally I don't trust software which doesn't come with source along: SW without source is a zombie, already dead while you are using it (it will take at most 30-50 years to die), and it's a waste of the time and skill of those programmers.

I need to turn the torch on with a toggle button during an ongoing camera activity, by adding some more lines of code without touching the existing code

I have a barcode decoding application: https://play.google.com/store/apps/details?id=com.barcodereader
I have used the ZBar library and the Google Vision API for scanning.
Now what I want is: while scanning a barcode, if the user taps a button in the app bar to turn on the torch (flash), it should turn on, and off again vice versa.
But the problem is that the camera is already running with all its parameters set, so when the user taps the button to turn on the torch we would have to interrupt the ongoing camera parameters, and I don't want to do that.
I am searching for another way to turn the torch on without changing the existing camera parameters.
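For reference, the usual approach is less disruptive than it sounds: on most devices the flash mode can be changed on a live preview by updating only that one parameter, without stopping the preview. A hedged sketch (mCamera is the already-open Camera from the ZBar activity below):
// Toggle the torch on a running preview by touching only the flash mode;
// on most devices the preview keeps running.
private void setTorch(boolean on) {
    Camera.Parameters params = mCamera.getParameters();
    params.setFlashMode(on ? Camera.Parameters.FLASH_MODE_TORCH
            : Camera.Parameters.FLASH_MODE_OFF);
    mCamera.setParameters(params);
}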
Below are the camera activities for ZBar and Google Vision; both use some other camera classes for the camera preview.
@SuppressWarnings("deprecation")
public class ZBarFirstScannerActivity extends AppCompatActivity{
//TextView tv;
ImageView iv;
LinearLayout ll;
private Camera mCamera;
private CameraPreview mPreview;
private Handler autoFocusHandler;
private ImageScanner scanner;
private boolean barcodeScanned = false;
private boolean previewing = true;
TextView tv;
static {
System.loadLibrary("iconv");
}
static {
System.loadLibrary("zbarjni");
}
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
setContentView(R.layout.barcode_capture1d);
tv = (TextView) findViewById(R.id.textVertical);
tv.setRotation(90);
initToolbar();
autoFocusHandler = new Handler();
mCamera = getCameraInstance();
// Instance barcode scanner
scanner = new ImageScanner();
scanner.setConfig(0, Config.X_DENSITY, 1);
scanner.setConfig(0, Config.Y_DENSITY, 1);
scanner.setConfig(Symbol.CODE128, Config.ENABLE,1);
scanner.setConfig(Symbol.EAN13, Config.ENABLE,1);
mPreview = new CameraPreview(this, mCamera, previewCb, autoFocusCB);
FrameLayout preview = (FrameLayout)findViewById(R.id.cameraPreview);
preview.addView(mPreview);
}
private void initToolbar() {
final Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
final ActionBar actionBar = getSupportActionBar();
if (actionBar != null) {
actionBar.setHomeButtonEnabled(true);
actionBar.setHomeAsUpIndicator(ContextCompat.getDrawable(this, R.drawable.abc_ic_ab_back_mtrl_am_alpha));
actionBar.setDisplayHomeAsUpEnabled(true);
}
}
/** A safe way to get an instance of the Camera object. */
public static Camera getCameraInstance()
{
Camera c = null;
try
{
c = Camera.open();
} catch (Exception e)
{
//nada
}
return c;
}
private void releaseCamera()
{
if (mCamera != null)
{
previewing = false;
mCamera.setPreviewCallback(null);
mCamera.release();
mCamera = null;
}
}
PreviewCallback previewCb = new PreviewCallback()
{
public void onPreviewFrame(byte[] data, Camera camera)
{
Camera.Parameters parameters = camera.getParameters();
Size size = parameters.getPreviewSize();
Image barcode = new Image(size.width, size.height, "Y800");
barcode.setData(data);
int result = scanner.scanImage(barcode);
if (result != 0)
{
previewing = false;
mCamera.setPreviewCallback(null);
mCamera.stopPreview();
SymbolSet syms = scanner.getResults();
for (Symbol sym : syms)
{
barcodeScanned = true;
Intent returnIntent = new Intent();
returnIntent.putExtra("BARCODE", sym.getData());
setResult(MainActivity.BAR_CODE_TYPE_128,returnIntent);
releaseCamera();
finish();
break;
}
}
}
};
// Mimic continuous auto-focusing
AutoFocusCallback autoFocusCB = new AutoFocusCallback()
{
public void onAutoFocus(boolean success, Camera camera)
{
autoFocusHandler.postDelayed(doAutoFocus, 3000);
}
};
private Runnable doAutoFocus = new Runnable()
{
public void run()
{
if (previewing)
mCamera.autoFocus(autoFocusCB);
}
};
public void onPause() {
super.onPause();
releaseCamera();
}
public void onResume(){
super.onResume();
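// NOTE: constructing a new ZBarFirstScannerActivity here does nothing useful;
// the camera released in onPause() is never reacquired.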
new ZBarFirstScannerActivity();
}
@Override
public void onBackPressed() {
releaseCamera();
finish();
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
int id = item.getItemId();
if (id == android.R.id.home) {
onBackPressed();
return true;
}
return super.onOptionsItemSelected(item);
}
}
And the GoogleScannerActivity:
public final class GoogleScannerActivity extends AppCompatActivity {
private static final String TAG = "Barcode-reader";
// intent request code to handle updating play services if needed.
private static final int RC_HANDLE_GMS = 9001;
// permission request codes need to be < 256
private static final int RC_HANDLE_CAMERA_PERM = 2;
// constants used to pass extra data in the intent
public static final String AutoFocus = "AutoFocus";
public static final String UseFlash = "UseFlash";
public static final String BarcodeObject = "Barcode";
Bitmap bmp;
FileOutputStream fos = null;
private Camera c;
Switch aSwitch;
private CameraSource mCameraSource;
private CameraSourcePreview mPreview;
private GraphicOverlay<BarcodeGraphic> mGraphicOverlay;
// helper objects for detecting taps and pinches.
private ScaleGestureDetector scaleGestureDetector;
private GestureDetector gestureDetector;
/**
* Initializes the UI and creates the detector pipeline.
*/
@Override
public void onCreate(Bundle icicle) {
super.onCreate(icicle);
setContentView(R.layout.barcode_capture2d);
initToolbar();
ActivitySource.caller = this;
mPreview = (CameraSourcePreview) findViewById(R.id.preview);
mGraphicOverlay = (GraphicOverlay<BarcodeGraphic>)findViewById(R.id.graphicOverlay);
boolean autoFocus = true;
boolean useFlash = false;
// Check for the camera permission before accessing the camera. If the
// permission is not granted yet, request permission.
int rc = ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA);
if (rc == PackageManager.PERMISSION_GRANTED) {
createCameraSource(autoFocus, useFlash);
} else {
requestCameraPermission();
}
gestureDetector = new GestureDetector(this, new CaptureGestureListener());
scaleGestureDetector = new ScaleGestureDetector(this, new ScaleListener());
/*Snackbar.make(mGraphicOverlay, "Tap to capture. Pinch/Stretch to zoom",
Snackbar.LENGTH_LONG)
.show();*/
}
private void initToolbar() {
final Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
final ActionBar actionBar = getSupportActionBar();
if (actionBar != null) {
actionBar.setHomeButtonEnabled(true);
actionBar.setHomeAsUpIndicator(ContextCompat.getDrawable(this, R.drawable.abc_ic_ab_back_mtrl_am_alpha));
actionBar.setDisplayHomeAsUpEnabled(true);
}
}
private Camera.Size getBestPreviewSize(int width, int height, Camera.Parameters parameters){
Camera.Size bestSize = null;
List<Camera.Size> sizeList = parameters.getSupportedPreviewSizes();
bestSize = sizeList.get(0);
for(int i = 1; i < sizeList.size(); i++){
if((sizeList.get(i).width * sizeList.get(i).height) >
(bestSize.width * bestSize.height)){
bestSize = sizeList.get(i);
}
}
return bestSize;
}
/**
* Handles the requesting of the camera permission. This includes
* showing a "Snackbar" message of why the permission is needed then
* sending the request.
*/
private void requestCameraPermission() {
Log.w(TAG, "Camera permission is not granted. Requesting permission");
final String[] permissions = new String[]{Manifest.permission.CAMERA};
if (!ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.CAMERA)) {
ActivityCompat.requestPermissions(this, permissions, RC_HANDLE_CAMERA_PERM);
return;
}
final Activity thisActivity = this;
View.OnClickListener listener = new View.OnClickListener() {
@Override
public void onClick(View view) {
ActivityCompat.requestPermissions(thisActivity, permissions,
RC_HANDLE_CAMERA_PERM);
}
};
Snackbar.make(mGraphicOverlay, R.string.permission_camera_rationale,
Snackbar.LENGTH_INDEFINITE)
.setAction(R.string.ok, listener)
.show();
}
#Override
public boolean onTouchEvent(MotionEvent e) {
boolean b = scaleGestureDetector.onTouchEvent(e);
boolean c = gestureDetector.onTouchEvent(e);
return b || c || super.onTouchEvent(e);
}
/**
* Creates and starts the camera. Note that this uses a higher resolution in comparison
* to other detection examples to enable the barcode detector to detect small barcodes
* at long distances.
*
* Suppressing InlinedApi since there is a check that the minimum version is met before using
* the constant.
*/
@SuppressLint("InlinedApi")
private void createCameraSource(boolean autoFocus, boolean useFlash) {
Context context = getApplicationContext();
// A barcode detector is created to track barcodes. An associated multi-processor instance
// is set to receive the barcode detection results, track the barcodes, and maintain
// graphics for each barcode on screen. The factory is used by the multi-processor to
// create a separate tracker instance for each barcode.
BarcodeDetector barcodeDetector = new BarcodeDetector.Builder(context).setBarcodeFormats(Barcode.CODE_128 | Barcode.DATA_MATRIX | Barcode.QR_CODE).build();
BarcodeTrackerFactory barcodeFactory = new BarcodeTrackerFactory(mGraphicOverlay);
barcodeDetector.setProcessor(
new MultiProcessor.Builder<>(barcodeFactory).build());
if (!barcodeDetector.isOperational()) {
// Note: The first time that an app using the barcode or face API is installed on a
// device, GMS will download the native libraries to the device in order to do detection.
// Usually this completes before the app is run for the first time. But if that
// download has not yet completed, then the above call will not detect any barcodes
// and/or faces.
//
// isOperational() can be used to check if the required native libraries are currently
// available. The detectors will automatically become operational once the library
// downloads complete on device.
Log.w(TAG, "Detector dependencies are not yet available.");
// Check for low storage. If there is low storage, the native library will not be
// downloaded, so detection will not become operational.
IntentFilter lowstorageFilter = new IntentFilter(Intent.ACTION_DEVICE_STORAGE_LOW);
boolean hasLowStorage = registerReceiver(null, lowstorageFilter) != null;
if (hasLowStorage) {
Toast.makeText(this, R.string.low_storage_error, Toast.LENGTH_LONG).show();
Log.w(TAG, getString(R.string.low_storage_error));
}
}
// Creates and starts the camera. Note that this uses a higher resolution in comparison
// to other detection examples to enable the barcode detector to detect small barcodes
// at long distances.
CameraSource.Builder builder = new CameraSource.Builder(getApplicationContext(), barcodeDetector)
.setFacing(CameraSource.CAMERA_FACING_BACK)
.setRequestedPreviewSize(1100, 844)
.setRequestedFps(15.0f);
// make sure that auto focus is an available option
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
builder = builder.setFocusMode(
autoFocus ? Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE : null);
}
mCameraSource = builder
.setFlashMode(useFlash ? Camera.Parameters.FLASH_MODE_TORCH : null)
.build();
}
/**
* Restarts the camera.
*/
@Override
protected void onResume() {
super.onResume();
startCameraSource();
}
/**
* Stops the camera.
*/
@Override
protected void onPause() {
super.onPause();
if (mPreview != null) {
mPreview.stop();
}
}
/**
* Releases the resources associated with the camera source, the associated detectors, and the
* rest of the processing pipeline.
*/
@Override
protected void onDestroy() {
super.onDestroy();
if (mPreview != null) {
mPreview.release();
}
}
@Override
public void onRequestPermissionsResult(int requestCode,
@NonNull String[] permissions,
@NonNull int[] grantResults) {
if (requestCode != RC_HANDLE_CAMERA_PERM) {
Log.d(TAG, "Got unexpected permission result: " + requestCode);
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
return;
}
if (grantResults.length != 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
Log.d(TAG, "Camera permission granted - initialize the camera source");
// we have permission, so create the camerasource
boolean autoFocus = getIntent().getBooleanExtra(AutoFocus,false);
boolean useFlash = getIntent().getBooleanExtra(UseFlash, false);
createCameraSource(autoFocus, useFlash);
return;
}
Log.e(TAG, "Permission not granted: results len = " + grantResults.length +
" Result code = " + (grantResults.length > 0 ? grantResults[0] : "(empty)"));
DialogInterface.OnClickListener listener = new DialogInterface.OnClickListener() {
public void onClick(DialogInterface dialog, int id) {
finish();
}
};
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setTitle("Multitracker sample")
.setMessage(R.string.no_camera_permission)
.setPositiveButton(R.string.ok, listener)
.show();
}
/**
* Starts or restarts the camera source, if it exists. If the camera source doesn't exist yet
* (e.g., because onResume was called before the camera source was created), this will be called
* again when the camera source is created.
*/
private void startCameraSource() throws SecurityException {
// check that the device has play services available.
int code = GoogleApiAvailability.getInstance().isGooglePlayServicesAvailable(
getApplicationContext());
if (code != ConnectionResult.SUCCESS) {
Dialog dlg =
GoogleApiAvailability.getInstance().getErrorDialog(this, code, RC_HANDLE_GMS);
dlg.show();
}
if (mCameraSource != null) {
try {
mPreview.start(mCameraSource, mGraphicOverlay);
} catch (IOException e) {
Log.e(TAG, "Unable to start camera source.", e);
mCameraSource.release();
mCameraSource = null;
}
}
}
/**
* onTap is called to capture the oldest barcode currently detected and
* return it to the caller.
*
* @param rawX - the raw position of the tap
* @param rawY - the raw position of the tap.
* @return true if the activity is ending.
*/
private boolean onTap(float rawX, float rawY) {
//TODO: use the tap position to select the barcode.
BarcodeGraphic graphic = mGraphicOverlay.getFirstGraphic();
Barcode barcode = null;
if (graphic != null) {
barcode = graphic.getBarcode();
if (barcode != null) {
Intent data = new Intent();
data.putExtra(BarcodeObject, barcode);
setResult(CommonStatusCodes.SUCCESS, data);
finish();
}
else {
Log.d(TAG, "barcode data is null");
}
}
else {
Log.d(TAG,"no barcode detected");
}
return barcode != null;
}
private class CaptureGestureListener extends GestureDetector.SimpleOnGestureListener {
@Override
public boolean onSingleTapConfirmed(MotionEvent e) {
return onTap(e.getRawX(), e.getRawY()) || super.onSingleTapConfirmed(e);
}
}
private class ScaleListener implements ScaleGestureDetector.OnScaleGestureListener {
/**
* Responds to scaling events for a gesture in progress.
* Reported by pointer motion.
*
* @param detector The detector reporting the event - use this to
* retrieve extended info about event state.
* @return Whether or not the detector should consider this event
* as handled. If an event was not handled, the detector
* will continue to accumulate movement until an event is
* handled. This can be useful if an application, for example,
* only wants to update scaling factors if the change is
* greater than 0.01.
*/
@Override
public boolean onScale(ScaleGestureDetector detector) {
return false;
}
/**
* Responds to the beginning of a scaling gesture. Reported by
* new pointers going down.
*
* @param detector The detector reporting the event - use this to
* retrieve extended info about event state.
* @return Whether or not the detector should continue recognizing
* this gesture. For example, if a gesture is beginning
* with a focal point outside of a region where it makes
* sense, onScaleBegin() may return false to ignore the
* rest of the gesture.
*/
@Override
public boolean onScaleBegin(ScaleGestureDetector detector) {
return true;
}
/**
* Responds to the end of a scale gesture. Reported by existing
* pointers going up.
* <p/>
* Once a scale has ended, {@link ScaleGestureDetector#getFocusX()}
* and {@link ScaleGestureDetector#getFocusY()} will return focal point
* of the pointers remaining on the screen.
*
* @param detector The detector reporting the event - use this to
* retrieve extended info about event state.
*/
@Override
public void onScaleEnd(ScaleGestureDetector detector) {
mCameraSource.doZoom(detector.getScaleFactor());
}
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
int id = item.getItemId();
if (id == android.R.id.home) {
onBackPressed();
return true;
}
return super.onOptionsItemSelected(item);
}
}
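For the Google Vision side, CameraSource never exposes its underlying Camera, and as far as I can tell setFlashMode() exists only on the Builder, so there is no supported way to toggle the torch once the source is running. A common but fragile workaround, purely an assumption on my part since the field is not public API, is to fish the Camera out by reflection and then set the flash mode as in the sketch above:
// Reflection-based sketch: pulls the private android.hardware.Camera out of
// a running CameraSource. Not a public API; may break with any library update.
private static Camera getCameraFromSource(CameraSource source) {
    for (java.lang.reflect.Field f : CameraSource.class.getDeclaredFields()) {
        if (f.getType() == Camera.class) {
            f.setAccessible(true);
            try {
                return (Camera) f.get(source);
            } catch (IllegalAccessException e) {
                return null;
            }
        }
    }
    return null;
}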

Expansion File (Android): "Download failed because the resources could not be found"

I am facing a problem with the expansion file process and I do not know what the problem is. I hope someone can help.
I have followed the steps from Google, yet I cannot be sure I have done everything 100% correctly.
/*
* Copyright (C) 2012 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
*/
import com.android.vending.expansion.zipfile.ZipResourceFile;
import com.android.vending.expansion.zipfile.ZipResourceFile.ZipEntryRO;
import com.google.android.vending.expansion.downloader.Constants;
import com.google.android.vending.expansion.downloader.DownloadProgressInfo;
import com.google.android.vending.expansion.downloader.DownloaderClientMarshaller;
import com.google.android.vending.expansion.downloader.DownloaderServiceMarshaller;
import com.google.android.vending.expansion.downloader.Helpers;
import com.google.android.vending.expansion.downloader.IDownloaderClient;
import com.google.android.vending.expansion.downloader.IDownloaderService;
import com.google.android.vending.expansion.downloader.IStub;
import android.app.Activity;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager.NameNotFoundException;
import android.os.AsyncTask;
import android.os.Bundle;
import android.os.Messenger;
import android.os.SystemClock;
import android.provider.Settings;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ProgressBar;
import android.widget.TextView;
import java.io.DataInputStream;
import java.io.IOException;
import java.util.zip.CRC32;
/**
* This is sample code for a project built against the downloader library. It
* implements the IDownloaderClient that the client marshaler will talk to as
* messages are delivered from the DownloaderService.
*/
public class ExpansionFileDownloaderActivity extends Activity implements IDownloaderClient {
private static final String LOG_TAG = "LVLDownloader";
private ProgressBar mPB;
private TextView mStatusText;
private TextView mProgressFraction;
private TextView mProgressPercent;
private TextView mAverageSpeed;
private TextView mTimeRemaining;
private View mDashboard;
private View mCellMessage;
private Button mPauseButton;
private Button mWiFiSettingsButton;
private boolean mStatePaused;
private int mState;
private IDownloaderService mRemoteService;
private IStub mDownloaderClientStub;
private void setState(int newState) {
if (mState != newState) {
mState = newState;
mStatusText.setText(Helpers.getDownloaderStringResourceIDFromState(newState));
}
}
private void setButtonPausedState(boolean paused) {
mStatePaused = paused;
int stringResourceID = paused ? R.string.text_button_resume :
R.string.text_button_pause;
mPauseButton.setText(stringResourceID);
}
/**
* This is a little helper class that demonstrates simple testing of an
* Expansion APK file delivered by Market. You may not wish to hard-code
* things such as file lengths into your executable... and you may wish to
* turn this code off during application development.
*/
private static class XAPKFile {
public final boolean mIsMain;
public final int mFileVersion;
public final long mFileSize;
XAPKFile(boolean isMain, int fileVersion, long fileSize) {
mIsMain = isMain;
mFileVersion = fileVersion;
mFileSize = fileSize;
}
}
/**
* Here is where you place the data that the validator will use to determine
* if the file was delivered correctly. This is encoded in the source code
* so the application can easily determine whether the file has been
* properly delivered without having to talk to the server. If the
* application is using LVL for licensing, it may make sense to eliminate
* these checks and to just rely on the server.
*/
private static final XAPKFile[] xAPKS = {
new XAPKFile(
true, // true signifies a main file
1, // the version of the APK that the file was uploaded
// against
102459993L // the length of the file in bytes
)
};
/**
* Go through each of the APK Expansion files defined in the structure above
* and determine if the files are present and match the required size. Free
* applications should definitely consider doing this, as this allows the
* application to be launched for the first time without having a network
* connection present. Paid applications that use LVL should probably do at
* least one LVL check that requires the network to be present, so this is
* not as necessary.
*
* @return true if they are present.
*/
boolean expansionFilesDelivered() {
for (XAPKFile xf : xAPKS) {
String fileName = Helpers.getExpansionAPKFileName(this, xf.mIsMain, xf.mFileVersion);
if (!Helpers.doesFileExist(this, fileName, xf.mFileSize, false))
return false;
}
return true;
}
public static boolean expansionFilesDelivered(Context ctx) {
for (XAPKFile xf : xAPKS) {
String fileName = Helpers.getExpansionAPKFileName(ctx, xf.mIsMain, xf.mFileVersion);
if (!Helpers.doesFileExist(ctx, fileName, xf.mFileSize, false))
return false;
}
return true;
}
/**
* Calculating a moving average for the validation speed so we don't get
* jumpy calculations for time etc.
*/
static private final float SMOOTHING_FACTOR = 0.005f;
/**
* Used by the async task
*/
private boolean mCancelValidation;
/**
* Go through each of the Expansion APK files and open each as a zip file.
* Calculate the CRC for each file and return false if any fail to match.
*
* @return true if XAPKZipFile is successful
*/
void validateXAPKZipFiles() {
AsyncTask<Object, DownloadProgressInfo, Boolean> validationTask = new AsyncTask<Object, DownloadProgressInfo, Boolean>() {
@Override
protected void onPreExecute() {
mDashboard.setVisibility(View.VISIBLE);
mCellMessage.setVisibility(View.GONE);
mStatusText.setText(R.string.text_verifying_download);
mPauseButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
mCancelValidation = true;
}
});
mPauseButton.setText(R.string.text_button_cancel_verify);
super.onPreExecute();
}
@Override
protected Boolean doInBackground(Object... params) {
for (XAPKFile xf : xAPKS) {
String fileName = Helpers.getExpansionAPKFileName(
ExpansionFileDownloaderActivity.this,
xf.mIsMain, xf.mFileVersion);
if (!Helpers.doesFileExist(ExpansionFileDownloaderActivity.this, fileName,
xf.mFileSize, false))
return false;
fileName = Helpers
.generateSaveFileName(ExpansionFileDownloaderActivity.this, fileName);
ZipResourceFile zrf;
byte[] buf = new byte[1024 * 256];
try {
zrf = new ZipResourceFile(fileName);
ZipEntryRO[] entries = zrf.getAllEntries();
/**
* First calculate the total compressed length
*/
long totalCompressedLength = 0;
for (ZipEntryRO entry : entries) {
totalCompressedLength += entry.mCompressedLength;
}
float averageVerifySpeed = 0;
long totalBytesRemaining = totalCompressedLength;
long timeRemaining;
/**
* Then calculate a CRC for every file in the Zip file,
* comparing it to what is stored in the Zip directory.
* Note that for compressed Zip files we must extract
* the contents to do this comparison.
*/
for (ZipEntryRO entry : entries) {
if (-1 != entry.mCRC32) {
long length = entry.mUncompressedLength;
CRC32 crc = new CRC32();
DataInputStream dis = null;
try {
dis = new DataInputStream(
zrf.getInputStream(entry.mFileName));
long startTime = SystemClock.uptimeMillis();
while (length > 0) {
int seek = (int) (length > buf.length ? buf.length
: length);
dis.readFully(buf, 0, seek);
crc.update(buf, 0, seek);
length -= seek;
long currentTime = SystemClock.uptimeMillis();
long timePassed = currentTime - startTime;
if (timePassed > 0) {
float currentSpeedSample = (float) seek
/ (float) timePassed;
if (0 != averageVerifySpeed) {
averageVerifySpeed = SMOOTHING_FACTOR
* currentSpeedSample
+ (1 - SMOOTHING_FACTOR)
* averageVerifySpeed;
} else {
averageVerifySpeed = currentSpeedSample;
}
totalBytesRemaining -= seek;
timeRemaining = (long) (totalBytesRemaining / averageVerifySpeed);
this.publishProgress(
new DownloadProgressInfo(
totalCompressedLength,
totalCompressedLength
- totalBytesRemaining,
timeRemaining,
averageVerifySpeed
)
);
}
startTime = currentTime;
if (mCancelValidation)
return true;
}
if (crc.getValue() != entry.mCRC32) {
Log.e(Constants.TAG,
"CRC does not match for entry: "
+ entry.mFileName
);
Log.e(Constants.TAG,
"In file: " + entry.getZipFileName());
return false;
}
} finally {
if (null != dis) {
dis.close();
}
}
}
}
} catch (IOException e) {
e.printStackTrace();
return false;
}
}
return true;
}
@Override
protected void onProgressUpdate(DownloadProgressInfo... values) {
onDownloadProgress(values[0]);
super.onProgressUpdate(values);
}
@Override
protected void onPostExecute(Boolean result) {
if (result) {
mDashboard.setVisibility(View.VISIBLE);
mCellMessage.setVisibility(View.GONE);
mStatusText.setText(R.string.text_validation_complete);
mPauseButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
finish();
}
});
mPauseButton.setText(android.R.string.ok);
} else {
mDashboard.setVisibility(View.VISIBLE);
mCellMessage.setVisibility(View.GONE);
mStatusText.setText(R.string.text_validation_failed);
mPauseButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
finish();
}
});
mPauseButton.setText(android.R.string.cancel);
}
super.onPostExecute(result);
}
};
validationTask.execute(new Object());
}
/**
* If the download isn't present, we initialize the download UI. This ties
* all of the controls into the remote service calls.
*/
private void initializeDownloadUI() {
mDownloaderClientStub = DownloaderClientMarshaller.CreateStub
(this, SampleDownloaderService.class);
setContentView(R.layout.downloader_ui);
mPB = (ProgressBar) findViewById(R.id.progressBar);
mStatusText = (TextView) findViewById(R.id.statusText);
mProgressFraction = (TextView) findViewById(R.id.progressAsFraction);
mProgressPercent = (TextView) findViewById(R.id.progressAsPercentage);
mAverageSpeed = (TextView) findViewById(R.id.progressAverageSpeed);
mTimeRemaining = (TextView) findViewById(R.id.progressTimeRemaining);
mDashboard = findViewById(R.id.downloaderDashboard);
mCellMessage = findViewById(R.id.approveCellular);
mPauseButton = (Button) findViewById(R.id.pauseButton);
mWiFiSettingsButton = (Button) findViewById(R.id.wifiSettingsButton);
mPauseButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mStatePaused) {
mRemoteService.requestContinueDownload();
} else {
mRemoteService.requestPauseDownload();
}
setButtonPausedState(!mStatePaused);
}
});
mWiFiSettingsButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startActivity(new Intent(Settings.ACTION_WIFI_SETTINGS));
}
});
Button resumeOnCell = (Button) findViewById(R.id.resumeOverCellular);
resumeOnCell.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
mRemoteService.setDownloadFlags(IDownloaderService.FLAGS_DOWNLOAD_OVER_CELLULAR);
mRemoteService.requestContinueDownload();
mCellMessage.setVisibility(View.GONE);
}
});
}
/**
* Called when the activity is first created; we wouldn't create a layout in
* the case where we have the file and are moving to another activity
* without downloading.
*/
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
/**
* Both downloading and validation make use of the "download" UI
*/
initializeDownloadUI();
/**
* Before we do anything, are the files we expect already here and
* delivered (presumably by Market) For free titles, this is probably
* worth doing. (so no Market request is necessary)
*/
if (!expansionFilesDelivered()) {
try {
Intent launchIntent = ExpansionFileDownloaderActivity.this
.getIntent();
Intent intentToLaunchThisActivityFromNotification = new Intent(
ExpansionFileDownloaderActivity
.this, ((Object) this).getClass() //..... this.getClass()
);
intentToLaunchThisActivityFromNotification.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK |
Intent.FLAG_ACTIVITY_CLEAR_TOP);
intentToLaunchThisActivityFromNotification.setAction(launchIntent.getAction());
if (launchIntent.getCategories() != null) {
for (String category : launchIntent.getCategories()) {
intentToLaunchThisActivityFromNotification.addCategory(category);
}
}
// Build PendingIntent used to open this activity from
// Notification
PendingIntent pendingIntent = PendingIntent.getActivity(
ExpansionFileDownloaderActivity.this,
0, intentToLaunchThisActivityFromNotification,
PendingIntent.FLAG_UPDATE_CURRENT);
// Request to start the download
int startResult = DownloaderClientMarshaller.startDownloadServiceIfRequired(this,
pendingIntent, SampleDownloaderService.class);
if (startResult != DownloaderClientMarshaller.NO_DOWNLOAD_REQUIRED) {
// The DownloaderService has started downloading the files,
// show progress
initializeDownloadUI();
return;
} // otherwise, download not needed so we fall through to
// starting the movie
} catch (NameNotFoundException e) {
Log.e(LOG_TAG, "Cannot find own package! MAYDAY!");
e.printStackTrace();
}
} else {
validateXAPKZipFiles();
}
}
/**
* Connect the stub to our service on start.
*/
@Override
protected void onStart() {
if (null != mDownloaderClientStub) {
mDownloaderClientStub.connect(this);
}
super.onStart();
}
/**
* Disconnect the stub from our service on stop
*/
@Override
protected void onStop() {
if (null != mDownloaderClientStub) {
mDownloaderClientStub.disconnect(this);
}
super.onStop();
}
/**
* Critical implementation detail. In onServiceConnected we create the
* remote service and marshaler. This is how we pass the client information
* back to the service so the client can be properly notified of changes. We
* must do this every time we reconnect to the service.
*/
@Override
public void onServiceConnected(Messenger m) {
mRemoteService = DownloaderServiceMarshaller.CreateProxy(m);
mRemoteService.onClientUpdated(mDownloaderClientStub.getMessenger());
}
/**
* The download state should trigger changes in the UI --- it may be useful
* to show the state as being indeterminate at times. This sample can be
* considered a guideline.
*/
@Override
public void onDownloadStateChanged(int newState) {
setState(newState);
boolean showDashboard = true;
boolean showCellMessage = false;
boolean paused;
boolean indeterminate;
switch (newState) {
case IDownloaderClient.STATE_IDLE:
// STATE_IDLE means the service is listening, so it's
// safe to start making calls via mRemoteService.
paused = false;
indeterminate = true;
break;
case IDownloaderClient.STATE_CONNECTING:
case IDownloaderClient.STATE_FETCHING_URL:
showDashboard = true;
paused = false;
indeterminate = true;
break;
case IDownloaderClient.STATE_DOWNLOADING:
paused = false;
showDashboard = true;
indeterminate = false;
break;
case IDownloaderClient.STATE_FAILED_CANCELED:
case IDownloaderClient.STATE_FAILED:
case IDownloaderClient.STATE_FAILED_FETCHING_URL:
case IDownloaderClient.STATE_FAILED_UNLICENSED:
paused = true;
showDashboard = false;
indeterminate = false;
break;
case IDownloaderClient.STATE_PAUSED_NEED_CELLULAR_PERMISSION:
case IDownloaderClient.STATE_PAUSED_WIFI_DISABLED_NEED_CELLULAR_PERMISSION:
showDashboard = false;
paused = true;
indeterminate = false;
showCellMessage = true;
break;
case IDownloaderClient.STATE_PAUSED_BY_REQUEST:
paused = true;
indeterminate = false;
break;
case IDownloaderClient.STATE_PAUSED_ROAMING:
case IDownloaderClient.STATE_PAUSED_SDCARD_UNAVAILABLE:
paused = true;
indeterminate = false;
break;
case IDownloaderClient.STATE_COMPLETED:
showDashboard = false;
paused = false;
indeterminate = false;
validateXAPKZipFiles();
return;
default:
paused = true;
indeterminate = true;
showDashboard = true;
}
int newDashboardVisibility = showDashboard ? View.VISIBLE : View.GONE;
if (mDashboard.getVisibility() != newDashboardVisibility) {
mDashboard.setVisibility(newDashboardVisibility);
}
int cellMessageVisibility = showCellMessage ? View.VISIBLE : View.GONE;
if (mCellMessage.getVisibility() != cellMessageVisibility) {
mCellMessage.setVisibility(cellMessageVisibility);
}
mPB.setIndeterminate(indeterminate);
setButtonPausedState(paused);
}
/**
* Sets the state of the various controls based on the progressinfo object
* sent from the downloader service.
*/
@Override
public void onDownloadProgress(DownloadProgressInfo progress) {
mAverageSpeed.setText(getString(R.string.kilobytes_per_second,
Helpers.getSpeedString(progress.mCurrentSpeed)));
mTimeRemaining.setText(getString(R.string.time_remaining,
Helpers.getTimeRemaining(progress.mTimeRemaining)));
mPB.setMax((int) (progress.mOverallTotal >> 8));
mPB.setProgress((int) (progress.mOverallProgress >> 8));
mProgressPercent.setText(Long.toString(progress.mOverallProgress
* 100 /
progress.mOverallTotal) + "%");
mProgressFraction.setText(Helpers.getDownloadProgressString
(progress.mOverallProgress,
progress.mOverallTotal));
}
@Override
protected void onDestroy() {
this.mCancelValidation = true;
super.onDestroy();
}
}
This is the SampleDownloaderService class:
import com.google.android.vending.expansion.downloader.impl.DownloaderService;
/**
* Created by Abdulkarim Kanaan on 22-Mar-14.
*/
public class SampleDownloaderService extends DownloaderService {
// You must use the public key belonging to your publisher account
public static final String BASE64_PUBLIC_KEY = "<Place Key provided by Google Play Publisher>";
// You should also modify this salt
public static final byte[] SALT = new byte[] { 1, 42, -12, -1, 54, 98,
-100, -12, 43, 2, -8, -4, 9, 5, -106, -107, -33, 45, -1, 84
};
@Override
public String getPublicKey() {
return BASE64_PUBLIC_KEY;
}
@Override
public byte[] getSALT() {
return SALT;
}
@Override
public String getAlarmReceiverClassName() {
return SampleAlarmReceiver.class.getName();
}
}
Finally, this is the Main Activity creation method, where the check is carried out to see whether the expansion file exists or not.
public class MainActivity extends Activity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Log.d("MainActivity", String.valueOf(ExpansionFileDownloaderActivity.expansionFilesDelivered(this))); // log whether the expansion files are already present
if (!ExpansionFileDownloaderActivity.expansionFilesDelivered(this)){
startActivity(new Intent(this, ExpansionFileDownloaderActivity.class));
}
//startApp(); // Expansion files are available, start the app
}
}
The problem is that after uploading the expansion file to Google Play, when the program starts it shows that the download is about to be carried out. After a short period it stops, saying "Download failed because the resources could not be found". When I place the obb file on the SD card manually, the application works as expected; however, I am now trying to get it downloaded from Google Play.
What is the problem?
Thank you in advance.
I was having the same problem, and no matter what I tried it still didn't work.
Finally I managed to get the APK expansion file downloaded. What I did was the following.
*Make sure your file size is correct (file size, not size on disk); see the sketch below for a quick in-app check.
*Make sure the versionCode matches your manifest's versionCode.
*Make sure you have changed the SALT bytes.
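The quick check mentioned above, as a hedged sketch run from inside an Activity, using the downloader library's own Helpers class (the version and size values are the ones from the XAPKFile entry earlier):
// Log the exact file name the library expects and whether a file of the
// declared size is already present; compare against what was uploaded.
String expected = Helpers.getExpansionAPKFileName(this, true /* main */, 1 /* versionCode */);
Log.d("ObbCheck", "expecting " + expected); // e.g. main.1.com.mydomain.myapp.obb
Log.d("ObbCheck", "present with right size: "
        + Helpers.doesFileExist(this, expected, 102459993L, false));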
Step by step:
1- Create a new app in the developer console.
2- Obtain the LVL licence key and replace it in your code.
3- Sign and export your application, and create a zip with your expansion file without compression (compression set to "store" in WinRAR).
4- Upload your app to Google Play, but upload it in the Beta Testing section (without a production version listed; this is very important, you must not have a production version saved in draft).
5- Create a Google group with the people you want to allow beta access, and under the Beta Testing tab add that group.
6- Now comes the scary part: you must click on publish. Your app will not show in Google Play; it will only be shown to your testing groups, TRUST ME.
*Note: Make sure you don't promote the beta to production or it will be shown; also notice I don't have a production version listed.
7- Under the Beta Testing tab there is a link that you must click in order to use the beta.
After that just download the app from Google Play; you can just type your package name, for example:
https://play.google.com/store/apps/details?id=com.mydomain.myapp
and the app will be installed along with the expansion package.
If you want to test the file download itself, install the app and then delete the downloaded obb file; this will restart the obb download process.
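For reference, the main expansion file lands in the shared-storage obb directory; a sketch of the path (assuming the version and package values used above):
// <shared storage>/Android/obb/<package>/main.<versionCode>.<package>.obb
String obbPath = Environment.getExternalStorageDirectory()
        + "/Android/obb/" + getPackageName()
        + "/main.1." + getPackageName() + ".obb";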
Hope this helps, as it did for me!
Cheers.
