How to migrate a jzy3d Java project to Android?

I followed jzy3d's demo and made a 3D plot model with AnalysisLauncher.open() in IntelliJ on Windows 10. It's working great. Now I need to migrate to Android so that the model can be displayed in a mobile application.
What kind of changes are needed in my code? A sample project with any jzy3d surface plot would be ideal.
I read the page "How to use jzy3d in Android using Eclipse?". The idea seems promising because jzy3d relies on JOGL 2, and OpenGL is a cross-platform API. But I couldn't understand all of the instructions by @Martin, namely how to replace AWT and 'derive CanvasNewtAwt'. I have no idea how to start the project in Android Studio.
The jzy3d website (http://www.jzy3d.org/) also says Android is supposed to work if the appropriate JOGL jars are enabled. How do I do this?
import java.awt.image.BufferedImage;
import java.util.List;

import MyColorMap.ColorMapGrayColor;
import org.jzy3d.analysis.AbstractAnalysis;
import org.jzy3d.analysis.AnalysisLauncher;
import org.jzy3d.chart.factories.AWTChartComponentFactory;
import org.jzy3d.colors.Color;
import org.jzy3d.colors.ColorMapper;
import org.jzy3d.colors.colormaps.ColorMapGrayscale;
import org.jzy3d.colors.colormaps.IColorMap;
import org.jzy3d.maths.Coord3d;
import org.jzy3d.plot3d.primitives.Point;
import org.jzy3d.plot3d.primitives.Polygon;
import org.jzy3d.plot3d.primitives.Shape;
import org.jzy3d.plot3d.rendering.canvas.Quality;

public class ChartPlot extends AbstractAnalysis {

    private int unitSize;
    private BufferedImage img;

    public static void main(String[] args) throws Exception {
        AnalysisLauncher.open(new ChartPlot("Forest.jpg", 10));
    }

    ChartPlot(String imgName, int unitSize) {
        img = getImg(imgName);
        this.unitSize = unitSize;
    }

    @Override
    public void init() {
        // The color index constants (COUNT_COLOR, INDEX_*) and the helper methods
        // (getImg, getXYCoords, getZCoords, getPolygons, getSurface_*) are defined
        // elsewhere in the class and omitted here.
        int[][][] xycoords = getXYCoords(img, unitSize);
        float[][][] zcoords = getZCoords(img, unitSize);

        List<Polygon>[] polygonsArray = (List<Polygon>[]) new List[COUNT_COLOR];
        for (int iColor = 0; iColor < polygonsArray.length; iColor++) {
            polygonsArray[iColor] = getPolygons(xycoords, zcoords, iColor);
        }

        IColorMap[] colorMaps = new IColorMap[] {
                new ColorMapGrayscale(), new ColorMapGrayColor(ColorMapGrayColor.RED),
                new ColorMapGrayColor(ColorMapGrayColor.YELLOW), new ColorMapGrayColor(ColorMapGrayColor.BLUE),
                new ColorMapGrayscale()};

        Shape[] surfaces = new Shape[COUNT_COLOR];
        for (int iColor = 0; iColor < surfaces.length; iColor++) {
            surfaces[iColor] = iColor == INDEX_WHITE || iColor == INDEX_BLACK
                    ? getSurface_Global(polygonsArray[iColor], colorMaps[iColor])
                    : getSurface_Local(polygonsArray[iColor], colorMaps[iColor]);
        }

        chart = AWTChartComponentFactory.chart(Quality.Advanced, getCanvasType());
        chart.getScene().getGraph().add(surfaces[INDEX_BLACK]);
        chart.getScene().getGraph().add(surfaces[INDEX_RED]);
        chart.getScene().getGraph().add(surfaces[INDEX_YELLOW]);
        chart.getScene().getGraph().add(surfaces[INDEX_BLUE]);
        chart.getScene().getGraph().add(surfaces[INDEX_WHITE]);
    }
    // ... remaining helper methods omitted ...
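Since Android has no AWT, the question about replacing the AWT canvas ultimately comes down to hosting the OpenGL rendering in an Android view. The sketch below is not jzy3d-specific: it only shows, as a starting point, where such rendering would live in an Android Studio project, assuming the chart's drawing code can be driven from a GLSurfaceView.Renderer; the ChartRenderer class here is hypothetical.

import android.app.Activity;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.os.Bundle;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class ChartActivity extends Activity {

    private GLSurfaceView surfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        surfaceView = new GLSurfaceView(this);
        surfaceView.setEGLContextClientVersion(2);    // request an OpenGL ES 2.0 context
        surfaceView.setRenderer(new ChartRenderer()); // hypothetical renderer, defined below
        setContentView(surfaceView);
    }

    @Override
    protected void onPause() {
        super.onPause();
        surfaceView.onPause();
    }

    @Override
    protected void onResume() {
        super.onResume();
        surfaceView.onResume();
    }

    // Hypothetical renderer: this is where the chart's OpenGL drawing would have
    // to be driven instead of the desktop AWT canvas.
    private static class ChartRenderer implements GLSurfaceView.Renderer {
        @Override
        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
            // build the scene here (the role init() plays in the desktop demo)
        }

        @Override
        public void onSurfaceChanged(GL10 gl, int width, int height) {
            GLES20.glViewport(0, 0, width, height);
        }

        @Override
        public void onDrawFrame(GL10 gl) {
            // render one frame of the chart
        }
    }
}

Wiring jzy3d's JOGL-based pipeline into that Android GL context is the part that still has to be solved, and is presumably what the "appropriate JOGL jars" mentioned on the jzy3d site are for.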

Related

Android N Developer Preview: Camera support takes a third value

I have a Nexus 6P. I'm investigating why OpenCamera has stopped working on Android N Developer Preview (I'm not a developer, just a user). I have found the following piece of code that might be causing the problem: CameraControllerManager2.java:62
I created a new Android project, and added the following function:
...
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
...
public class MainActivity extends AppCompatActivity {
    private String TAG = "MainActivity";
    ...
    public void test(int cameraId) {
        CameraManager manager = (CameraManager) this.getSystemService(Context.CAMERA_SERVICE);
        try {
            String cameraIdS = manager.getCameraIdList()[cameraId];
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraIdS);
            int support = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
            Log.d(TAG, "Camera support: " + support);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
    ...
}
Calling test(0), the console output on my device is:
04-22 15:16:54.263 11578-11578/test.myapplication D/MainActivity: Camera support: 3
When I look up the possible values of support (in the docs), they should be 0, 1 or 2, so how is support taking the value 3? Is it supposed to be a bitmask, or is something worse happening?
You are looking at the docs for the shipping version of Android. At the present time, Android N is in a developer preview, and the docs are elsewhere.
There is a new INFO_SUPPORTED_HARDWARE_LEVEL_3 value for that characteristic, described as:
...devices additionally support YUV reprocessing and RAW image capture, along with additional output stream configurations.
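A minimal sketch of how the returned value could be mapped to a readable name, assuming the project compiles against the N preview SDK (API 24), where INFO_SUPPORTED_HARDWARE_LEVEL_3 is defined alongside the existing LIMITED, FULL and LEGACY constants:

// Add to MainActivity; uses the CameraMetadata import already present above.
// Assumes compileSdkVersion is at least the N preview (API 24), where
// INFO_SUPPORTED_HARDWARE_LEVEL_3 was introduced.
private static String hardwareLevelName(int level) {
    switch (level) {
        case CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED:
            return "LIMITED";
        case CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_FULL:
            return "FULL";
        case CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY:
            return "LEGACY";
        case CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_3:
            return "LEVEL_3"; // new in the Android N preview
        default:
            return "UNKNOWN (" + level + ")"; // any future levels end up here
    }
}

With that, Log.d(TAG, "Camera support: " + hardwareLevelName(support)) prints LEVEL_3 instead of a bare 3.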

Libgdx localisation

I want to create a localisation file for my project in libGDX; however, my code is throwing an error.
My code:
package com.mygdx.mytest;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.files.FileHandle;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.utils.I18NBundle;
import java.util.Locale;

public class MyTest extends ApplicationAdapter {

    @Override
    public void create () {
        FileHandle baseFileHandle = Gdx.files.internal("i18n/MyBundle");
        Locale locale = new Locale("", "", "");
        I18NBundle MyBundle = I18NBundle.createBundle(baseFileHandle, locale);
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(1, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    }
}
The error is thrown at this line:
I18NBundle MyBundle = I18NBundle.createBundle(baseFileHandle, locale);
Where is my mistake? Please help.
Your code works, but apparently you don't have a bundle.
You need to have a file with path:
{project root}/android/assets/i18n/MyBundle.properties
Remember to include an error log.
I use this code in my project and it works fine for me.
FileHandle internal = Gdx.files.internal("i18n/lang");
I18NBundle local = I18NBundle.createBundle(internal, Locale.ROOT);
I18NBundle portuguese = I18NBundle.createBundle(internal, new Locale("pt"));
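Once the .properties files are in place under the assets folder, values are read from the bundle with I18NBundle.get() or I18NBundle.format(). A small sketch under the same assumptions as the answer above; the greeting key and the file contents in the comments are made up for illustration:

package com.mygdx.mytest;

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.files.FileHandle;
import com.badlogic.gdx.utils.I18NBundle;
import java.util.Locale;

public class BundleExample {

    // Expects, for example, android/assets/i18n/MyBundle.properties containing:
    //   greeting=Hello, {0}!
    // and android/assets/i18n/MyBundle_pt.properties containing:
    //   greeting=Ola, {0}!
    public static String greet(String name, Locale locale) {
        FileHandle baseFileHandle = Gdx.files.internal("i18n/MyBundle");
        I18NBundle bundle = I18NBundle.createBundle(baseFileHandle, locale);
        // format() fills in {0}; it falls back to the base bundle when no locale-specific file matches
        return bundle.format("greeting", name);
    }
}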

How to work with AIR 3.2 and HTMLLoader?

I use this code to view my web page. It works with AIR 3.2 for desktop. How do I make it work with AIR 3.2 for Android? It does not load anything, only a white screen!
package {
    import flash.display.Sprite;
    import flash.html.HTMLLoader;
    import flash.net.URLRequest;

    public class HTMLLoaderExample extends Sprite
    {
        public function HTMLLoaderExample()
        {
            var html:HTMLLoader = new HTMLLoader();
            var urlReq:URLRequest = new URLRequest("http://www.doomanco.com/");
            html.width = stage.stageWidth;
            html.height = stage.stageHeight;
            html.load(urlReq);
            addChild(html);
            html.x = 0;
            html.y = 0;
        }
    }
}
As Adobe says about HTMLLoader: "AIR profile support: This feature is supported on all desktop operating systems, but is not supported on mobile devices or on AIR for TV devices. You can test for support at run time using the HTMLLoader.isSupported property. See AIR Profile Support for more information regarding API support across multiple profiles." So I think it's not supported on your Android device; you can verify that using the HTMLLoader.isSupported property. For more details you can take a look here: Adobe.com : HTMLLoader, and here: Adobe.com : Device profiles for AIR.
Thanks @DodgerThud & @akmozo, it is done with this:
package {
    import flash.display.MovieClip;
    import flash.media.StageWebView;
    import flash.geom.Rectangle;
    import flash.events.KeyboardEvent;
    import flash.ui.Keyboard;
    import flash.desktop.NativeApplication;
    import flash.display.Stage;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.Event;

    public class StageWebViewExample extends MovieClip {

        private var webView:StageWebView = new StageWebView();

        public function StageWebViewExample()
        {
            stage.scaleMode = StageScaleMode.NO_SCALE;
            stage.align = StageAlign.TOP_LEFT;
            webView.stage = this.stage;
            webView.viewPort = new Rectangle( 0, 0, stage.fullScreenWidth, stage.fullScreenHeight );
            webView.loadURL( "http://www.google.com" );
            stage.addEventListener( KeyboardEvent.KEY_DOWN, onKey );
        }

        private function onKey( event:KeyboardEvent ):void
        {
            if( event.keyCode == Keyboard.BACK && webView.isHistoryBackEnabled )
            {
                trace("Back.");
                webView.historyBack();
                event.preventDefault();
            }
            if( event.keyCode == Keyboard.SEARCH && webView.isHistoryForwardEnabled )
            {
                trace("Forward.");
                webView.historyForward();
            }
        }
    }
}

Playing AAC streams with Adobe AIR in Android/iOS

It looks like a pretty common problem, but I still can't find any solution.
The goal is to play an AAC stream in an Adobe AIR mobile app.
I've used a transcoder from https://code.google.com/p/project-thunder-snow/. It adds FLV headers to AAC data so the stream can be played through a standard AS3 NetStream object (as I understand it). It works fine on Windows and Mac, but the same app launched on Android or iOS produces neither sound nor any error. I've figured out that the transcoder works fine; the cause seems to lie in the different behaviour of NetStream on each platform.
So, is there any solution or, at least, any documentation describing the difference between NetStream on PC and on mobile platforms?
Okay... I might have a clue for you. I tried running the NBAAC code you linked in the other question from inside Flash CS5 and got this error:
onChannelReady : MetaData
Error: Error #2067: The ExternalInterface is not available in this container. ExternalInterface requires Internet Explorer ActiveX, Firefox, Mozilla 1.7.5 and greater, or other browsers that support NPRuntime.
So it seems the player is designed to be used only inside an HTML page and uses ExternalInterface (JavaScript) for getting metadata from the browser into the SWF.
I removed the ExternalInterface stuff and it played OK in Flash Player and Device Central without a browser/HTML. Try this code in your AIR app (I don't have AIR installed anymore to confirm):
package
{
    import com.thebitstream.flv.CodecFactory;
    import com.thebitstream.flv.Transcoder;
    import com.thebitstream.flv.codec.*;
    import com.thebitstream.ice.*;
    import com.thebitstream.nsv.*;

    import flash.display.Sprite;
    import flash.events.Event;
    import flash.events.IOErrorEvent;
    import flash.events.ProgressEvent;
    import flash.events.SecurityErrorEvent;
    //import flash.external.externalInterface;
    import flash.media.SoundTransform;
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.net.NetStreamAppendBytesAction;
    import flash.net.URLRequest;
    import flash.net.URLRequestMethod;
    import flash.net.URLStream;
    import flash.utils.ByteArray;
    import flash.utils.setTimeout;

    [SWF (width="250",height="250")]
    public dynamic class NBAAC_AIR_v2 extends Sprite
    {
        public var request:URLRequest;
        public var transcoder:Transcoder;
        public var transport:NetConnection;
        public var transportStream:NetStream;
        public var serverConnection:URLStream;
        //public var host:String=" "; //now not used in line... request=new URLRequest(resource);
        public var resource:String="http://aacplus-ac-64.timlradio.co.uk/;"; //test station

        public function NBAAC_AIR_v2()
        {
            super();
            /*
            if(loaderInfo.parameters.host)
                host=loaderInfo.parameters.host;
            if(loaderInfo.parameters.resource)
                resource=loaderInfo.parameters.resource;
            */
            //CodecFactory.ImportCodec(MetaData);
            CodecFactory.ImportCodec(AAC);
            CodecFactory.ImportCodec(AACP);
            transcoder = new Transcoder();
            transcoder.addEventListener(CodecEvent.STREAM_READY, onChannelReady);
            transcoder.addEventListener(StreamDataEvent.DATA, onTag);
            transcoder.initiate();
            transcoder.loadCodec("AAC");
            transport = new NetConnection();
            transport.connect(null);
            flash.utils.setTimeout(boot, 500);
        }

        public function boot():void
        {
            //externalInterface.addCallback('nbaac.setVolume', onVolume);
            //externalInterface.addCallback('nbaac.togglePause', onTogglePause);
            //externalInterface.addCallback('nbaac.setBuffer', setBufferLength);
            //externalInterface.addCallback('nbaac.getBuffer', getBufferLength);
            //externalInterface.addCallback('nbaac.getTime', getTime);
            /*
            var meta:Object={};
            meta.uri="";
            meta.StreamTitle="";
            */
            //externalInterface.call('logit','start up');
            transportStream = new NetStream(transport);
            transportStream.bufferTime=2;
            transportStream.client = this;
            transportStream.soundTransform=new SoundTransform(.5,0);
            transportStream.play(null);
            transportStream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
            var headerTag:ByteArray=transcoder.createHeader(false,true);
            transportStream.appendBytes(headerTag);
            headerTag.clear();
            //transcoder.readMetaObject(meta,0);
            serverConnection=new URLStream();
            serverConnection.addEventListener(ProgressEvent.PROGRESS, loaded);
            serverConnection.addEventListener(IOErrorEvent.IO_ERROR, onIo);
            serverConnection.addEventListener(SecurityErrorEvent.SECURITY_ERROR, onNoPolicy);
            serverConnection.addEventListener(Event.CLOSE, onClose);
            request=new URLRequest(resource); //removed "host" from url
            //request.requestHeaders=[new URLRequestHeader("GET",resource+" HTTP/1.0")];
            //request.requestHeaders=[new URLRequestHeader("Icy-MetaData","1")];
            request.method=URLRequestMethod.GET;
            serverConnection.load(request);
        }

        private function getTime():Number
        { return transportStream.time; }

        private function getBufferLength():Number
        { return transportStream.bufferLength; }

        private function setBufferLength(val:Number):void
        { transportStream.bufferTime = val; }

        private function onTogglePause():void
        { transportStream.togglePause(); }

        private function onVolume(val:Number):void
        { transportStream.soundTransform = new SoundTransform(val, 0); }

        private function onIo(pe:IOErrorEvent):void
        { /* externalInterface.call('logit','IOErrorEvent') */ }

        private function onTag(sde:StreamDataEvent):void
        {
            sde.tag.position=0;
            transportStream.appendBytes(sde.tag);
        }

        private function onChannelReady(ce:CodecEvent):void
        { trace('onChannelReady :', ce.codec.type); }

        private function onClose(e:Event):void
        { /* externalInterface.call('logit','onClose') */ }

        public function onMetaData(e:Object):void
        { /* externalInterface.call('logit','onMetaData') */ }

        private function loaded(e:ProgressEvent):void
        {
            var chunk:ByteArray=new ByteArray();
            while(serverConnection.bytesAvailable)
            { chunk.writeByte( serverConnection.readByte() ); }
            chunk.position=0;
            transcoder.addRawData( chunk, 0, "AAC" );
        }

        private function onNoPolicy(se:SecurityErrorEvent):void
        { /* externalInterface.call('logit','SecurityErrorEvent'+host+resource+'<br />'+se.text); */ }
    }
}

PhoneGap custom plugin for Android: advice needed

Hi, I am working on a mobile app.
I have developed an app in PhoneGap (HTML5, jQuery, JS) and I want to develop a plugin to print to a Bluetooth (BT) printer.
I downloaded the printer manufacturer's SDK and imported the appropriate .jar file into my project in the following way:
To include this library into your project:
Drag the appropriate library file into the Project Explorer from the SDK package
Right click the project folder and choose Properties
Click Java Build Path
Click Libraries and the Add JARs button
At the top of your main code add:
import com.starmicronics.stario.StarIOPort;
import com.starmicronics.stario.StarIOPortException;
import com.starmicronics.stario.StarPrinterStatus;
Now you can access all of StarIO’s methods!
I created the following plugin:
js
var HelloPlugin = {
    callNativeFunction: function (success, fail, resultType) {
        return cordova.exec(success, fail, "com.tricedesigns.HelloPlugin", "nativeAction", [resultType]);
    }
};
java
package com.tricedesigns;

import com.starmicronics.stario.StarIOPort;
import com.starmicronics.stario.StarIOPortException;
import com.starmicronics.stario.StarPrinterStatus;
import org.apache.cordova.api.Plugin;
import org.apache.cordova.api.PluginResult;
import org.json.JSONArray;
import android.app.AlertDialog;
import android.app.AlertDialog.Builder;
import android.content.Context;
import android.util.Log;

public class HelloPlugin extends Plugin {

    public static final String NATIVE_ACTION_STRING = "nativeAction";
    public static final String SUCCESS_PARAMETER = "success";
    public static final String portName = "BT:";
    public static final String portSettings = "mini";

    @Override
    public PluginResult execute(String action, JSONArray data, String callbackId) {
        Log.d("HelloPlugin", "Hello, this is a native function called from PhoneGap/Cordova!");
        // only perform the action if it is the one that should be invoked
        if (NATIVE_ACTION_STRING.equals(action)) {
            String resultType = null;
            try {
                resultType = data.getString(0);
            } catch (Exception ex) {
                Log.d("HelloPlugin", ex.toString());
            }
            byte[] texttoprint = resultType.getBytes();
            if (resultType.equals(SUCCESS_PARAMETER)) {
                StarIOPort port = null;
                return new PluginResult(PluginResult.Status.OK, "Yay, Success!!!");
            } else {
                return new PluginResult(PluginResult.Status.ERROR, "Oops, Error :(");
            }
        }
        return null;
    }
}
which works with no problems.
When I try to include the call below to the printer .jar's method:
port = StarIOPort.getPort(portName, portSettings, 10000, context);
I get Error: Status=2 Message=Class not found.
Where am I wrong?
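For reference, a hedged sketch of how the getPort() call from the question might be wrapped once the StarIO library is actually on the Android build path. Only the getPort() signature shown above is taken from the question; writePort() and releasePort() are assumed StarIO SDK methods, and obtaining the Android Context inside the plugin differs across Cordova versions, so it is left as a parameter here:

// Add to HelloPlugin; uses the StarIOPort, StarIOPortException, Context and Log
// imports already present above. writePort()/releasePort() are assumed SDK methods.
private void printBytes(byte[] data, Context context) {
    StarIOPort port = null;
    try {
        // same getPort() call as in the question: port name "BT:", settings "mini", 10s timeout
        port = StarIOPort.getPort(portName, portSettings, 10000, context);
        port.writePort(data, 0, data.length);
    } catch (StarIOPortException e) {
        Log.d("HelloPlugin", "StarIO error: " + e.getMessage());
    } finally {
        if (port != null) {
            try {
                StarIOPort.releasePort(port);
            } catch (StarIOPortException e) {
                Log.d("HelloPlugin", "Failed to release port: " + e.getMessage());
            }
        }
    }
}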
