I use this code to view my web page. It works with AIR 3.2 for desktop. How can I make it work with AIR 3.2 for Android? It doesn't load anything, only a white screen!
package {
import flash.display.Sprite;
import flash.html.HTMLLoader;
import flash.net.URLRequest;
public class HTMLLoaderExample extends Sprite
{
public function HTMLLoaderExample()
{
var html:HTMLLoader = new HTMLLoader();
var urlReq:URLRequest = new URLRequest("http://www.doomanco.com/");
html.width = stage.stageWidth;
html.height = stage.stageHeight;
html.load(urlReq);
addChild(html);
html.x = 0;
html.y = 0;
}
}
}
As Adobe says about HTMLLoader: "AIR profile support: This feature is supported on all desktop operating systems, but is not supported on mobile devices or on AIR for TV devices. You can test for support at run time using the HTMLLoader.isSupported property. See AIR Profile Support for more information regarding API support across multiple profiles." So I think it is not supported on your Android device; you can verify that using the HTMLLoader.isSupported property. For more details you can take a look here: Adobe.com : HTMLLoader, and here: Adobe.com : Device profiles for AIR.
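To make the fallback concrete, here is a minimal sketch of that runtime check, using StageWebView (which is supported in the mobile profiles) when HTMLLoader is not available. Untested; the class name is just illustrative and the URL is the one from the question:
package {
    import flash.display.Sprite;
    import flash.geom.Rectangle;
    import flash.html.HTMLLoader;
    import flash.media.StageWebView;
    import flash.net.URLRequest;

    public class HtmlSupportCheck extends Sprite {
        public function HtmlSupportCheck() {
            if (HTMLLoader.isSupported) {
                // Desktop profiles: HTMLLoader is available.
                var html:HTMLLoader = new HTMLLoader();
                html.width = stage.stageWidth;
                html.height = stage.stageHeight;
                html.load(new URLRequest("http://www.doomanco.com/"));
                addChild(html);
            } else if (StageWebView.isSupported) {
                // Mobile profiles (Android/iOS): use StageWebView instead.
                var webView:StageWebView = new StageWebView();
                webView.stage = this.stage;
                webView.viewPort = new Rectangle(0, 0, stage.stageWidth, stage.stageHeight);
                webView.loadURL("http://www.doomanco.com/");
            } else {
                trace("No HTML display class is supported in this profile.");
            }
        }
    }
}
Note that StageWebView is not a DisplayObject; it renders in a native view on top of the stage, so it is positioned with viewPort instead of addChild().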
Thanks @DodgerThud and @akmozo, it is done with this:
package {
import flash.display.MovieClip;
import flash.media.StageWebView;
import flash.geom.Rectangle;
import flash.events.KeyboardEvent;
import flash.ui.Keyboard;
import flash.desktop.NativeApplication;
import flash.display.Stage;
import flash.display.StageAlign;
import flash.display.StageScaleMode;
import flash.events.Event;
public class StageWebViewExample extends MovieClip{
private var webView:StageWebView = new StageWebView();
public function StageWebViewExample()
{
stage.scaleMode = StageScaleMode.NO_SCALE;
stage.align = StageAlign.TOP_LEFT;
webView.stage = this.stage;
webView.viewPort = new Rectangle( 0, 0, stage.fullScreenWidth, stage.fullScreenHeight );
webView.loadURL( "http://www.google.com" );
stage.addEventListener( KeyboardEvent.KEY_DOWN, onKey );
}
private function onKey( event:KeyboardEvent ):void
{
if( event.keyCode == Keyboard.BACK && webView.isHistoryBackEnabled )
{
trace("Back.");
webView.historyBack();
event.preventDefault();
}
if( event.keyCode == Keyboard.SEARCH && webView.isHistoryForwardEnabled )
{
trace("Forward.");
webView.historyForward();
}
}
}
}
Related
How do I run the Appium code below in Sauce Labs? When I checked the Sauce Labs website, only the one line below is given:
driver = new WebDriver(
new URL("https://balajimscit09:a30f3417-cbe6-48ce-92b5-e9a6d0814879#ondemand.us-west-1.saucelabs.com:443")
);
Below is my code
package mobile_Appium;
import static io.appium.java_client.touch.TapOptions.tapOptions;
import static io.appium.java_client.touch.WaitOptions.waitOptions;
import static io.appium.java_client.touch.offset.ElementOption.element;
import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.time.Duration;
import java.util.concurrent.TimeUnit;
import org.openqa.selenium.By;
import org.openqa.selenium.Dimension;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.remote.DesiredCapabilities;
import io.appium.java_client.AppiumDriver;
import io.appium.java_client.FindsByAndroidUIAutomator;
import io.appium.java_client.MobileElement;
import io.appium.java_client.TouchAction;
import io.appium.java_client.android.AndroidDriver;
import io.appium.java_client.android.AndroidTouchAction;
import io.appium.java_client.remote.MobileCapabilityType;
import io.appium.java_client.touch.WaitOptions;
import io.appium.java_client.touch.offset.PointOption;
public class InstallTestAndroid10 {
static AppiumDriver driver;
public static void main(String[] args) throws MalformedURLException, InterruptedException {
File f = new File("src");
File fs = new File(f, "ApiDemos-debug.apk");
DesiredCapabilities cap = new DesiredCapabilities();
cap.setCapability(MobileCapabilityType.PLATFORM_NAME, "Android");
cap.setCapability(MobileCapabilityType.VERSION, "10.0");
cap.setCapability(MobileCapabilityType.DEVICE_NAME, "Android Device");
cap.setCapability(MobileCapabilityType.AUTOMATION_NAME, "Uiautomator2");
cap.setCapability("autoGrantPermissions", true);
cap.setCapability("noReset", "false");
cap.setCapability("fullReset", "true");
cap.setCapability(MobileCapabilityType.APP, fs.getAbsolutePath());
driver = new AndroidDriver<>(new URL("http://127.0.0.1:4723/wd/hub"), cap);
driver.manage().timeouts().implicitlyWait(60, TimeUnit.SECONDS);
/*driver.findElement(By.xpath("//android.widget.Button[@text='OK']")).click();
Thread.sleep(10000);
((FindsByAndroidUIAutomator<MobileElement>) driver).findElementByAndroidUIAutomator("new UiScrollable(new UiSelector().scrollable(true).instance(0)).scrollIntoView(new UiSelector().textContains(\"Views\").instance(0))");
driver.findElement(By.xpath("//android.widget.TextView[@text='Views']")).click(); */
}}
How do I integrate with a real device present in Sauce Labs?
Your app should be uploaded to Sauce Storage.
After that, the app capability should point to this file.
For example:
cap.setCapability(MobileCapabilityType.APP, "storage:filename=ApiDemos-debug.apk");
You can read more here:
https://wiki.saucelabs.com/display/DOCS/Application+Storage
Also, you should change your access key after publishing it here.
In those capabilities, it looks like you are still pointing to a local URL. You need to add a URL for Sauce Labs with your username and access key, and upload an app. See how it is done in this video: https://www.youtube.com/watch?v=hwp5YeF5Me4
There are 3 basic things that you need to do to run an Appium test:
1. Upload your app to Sauce Labs so your test can run against it in the Real Device Cloud.
2. Update your test code with your Sauce username and access key (set as environment variables), and use these to start a driver with the endpoint (or URL) to test against.
3. Update your capabilities for the real device you want to test, including app name, device, platform version and more.
System.out.println("Sauce iOS Native - BeforeMethod hook");
String username = System.getenv("SAUCE_USERNAME");
String accesskey = System.getenv("SAUCE_ACCESS_KEY");
String sauceUrl;
if (region.equalsIgnoreCase("eu")) {
sauceUrl = "#ondemand.eu-central-1.saucelabs.com:443";
} else {
sauceUrl = "#ondemand.us-west-1.saucelabs.com:443";
}
String SAUCE_REMOTE_URL = "https://" + username + ":" + accesskey + sauceUrl +"/wd/hub";
String appName = "iOS.RealDevice.SauceLabs.Mobile.Sample.app.2.7.1.ipa";
String methodName = method.getName();
URL url = new URL(SAUCE_REMOTE_URL);
DesiredCapabilities capabilities = new DesiredCapabilities();
capabilities.setCapability("deviceName", "iPhone 8.*");
capabilities.setCapability("platformName", "iOS");
capabilities.setCapability("automationName", "XCuiTest");
capabilities.setCapability("app", "storage:filename="+appName); // or "storage:"+appID
capabilities.setCapability("name", methodName);
iosDriver.set(new IOSDriver(url, capabilities));
I followed jzy3d's demo and made a 3D plot model with AnalysisLauncher.open() in IntelliJ on Windows 10. It's working great. Now I need to migrate to Android so that the model can be displayed in a mobile application.
What kind of change is needed in my code? A sample project with any jzy3d surface plot would be ideal.
I read the page How to use jzy3d in android using eclipse?. The idea seems promising because jzy3d relies on JOGL 2 and OpenGL is a cross-platform API. But I couldn't understand all the instructions by @Martin, namely how do I replace AWT and 'derive CanvasNewtAwt'? I have no idea how to start the project in Android Studio.
The jzy3d website (http://www.jzy3d.org/) also says 'Android is supposed to work if you enable the appropriate JOGL jars'. How do I do this?
import java.awt.image.BufferedImage; // used by the img field below
import java.util.List; // used for the polygon lists below
import MyColorMap.ColorMapGrayColor;
import org.jzy3d.analysis.AbstractAnalysis;
import org.jzy3d.analysis.AnalysisLauncher;
import org.jzy3d.chart.factories.AWTChartComponentFactory;
import org.jzy3d.colors.Color;
import org.jzy3d.colors.ColorMapper;
import org.jzy3d.colors.colormaps.ColorMapGrayscale;
import org.jzy3d.colors.colormaps.IColorMap;
import org.jzy3d.maths.Coord3d;
import org.jzy3d.plot3d.primitives.Point;
import org.jzy3d.plot3d.primitives.Polygon;
import org.jzy3d.plot3d.primitives.Shape;
import org.jzy3d.plot3d.rendering.canvas.Quality;
public class ChartPlot extends AbstractAnalysis {
private int unitSize;
private BufferedImage img;
public static void main(String[] args) throws Exception {
AnalysisLauncher.open(new ChartPlot("Forest.jpg", 10));
}
ChartPlot (String imgName, int unitSize) {
img= getImg(imgName);
this.unitSize= unitSize;
}
@Override
public void init() {
int[][][] xycoords= getXYCoords(img, unitSize);
float[][][] zcoords= getZCoords(img, unitSize);
List<Polygon>[] polygonsArray= (List<Polygon>[])new List[COUNT_COLOR];
for (int iColor=0; iColor < polygonsArray.length; iColor++) {
polygonsArray[iColor]= getPolygons(xycoords, zcoords, iColor);
}
IColorMap[] colorMaps= new IColorMap[] {new ColorMapGrayscale(), new ColorMapGrayColor(ColorMapGrayColor.RED),
new ColorMapGrayColor(ColorMapGrayColor.YELLOW), new ColorMapGrayColor(ColorMapGrayColor.BLUE),
new ColorMapGrayscale()};
Shape[] surfaces= new Shape[COUNT_COLOR];
for (int iColor= 0; iColor < surfaces.length; iColor++) {
surfaces[iColor]= iColor == INDEX_WHITE || iColor == INDEX_BLACK ?
getSurface_Global(polygonsArray[iColor], colorMaps[iColor]) :
getSurface_Local(polygonsArray[iColor], colorMaps[iColor]);
}
chart = AWTChartComponentFactory.chart(Quality.Advanced, getCanvasType());
chart.getScene().getGraph().add(surfaces[INDEX_BLACK]);
chart.getScene().getGraph().add(surfaces[INDEX_RED]);
chart.getScene().getGraph().add(surfaces[INDEX_YELLOW]);
chart.getScene().getGraph().add(surfaces[INDEX_BLUE]);
chart.getScene().getGraph().add(surfaces[INDEX_WHITE]);
}
I have Windows 7 connected to two Android devices, and I am using Selenium and Appium to automate an app, but I am not able to run the test simultaneously on both devices. Below is the code I am using, along with the contents of testng.xml; let me know where I am wrong. The code runs fine, but it installs the app on the first device and then on the second device. What I want to achieve is to install the app on both devices simultaneously. Any help appreciated.
package ca.automation.com;
import org.testng.annotations.Test;
import com.relevantcodes.extentreports.ExtentReports;
import com.relevantcodes.extentreports.ExtentTest;
import com.relevantcodes.extentreports.LogStatus;
import io.appium.java_client.android.AndroidDriver;
import org.testng.annotations.BeforeSuite;
import org.testng.annotations.BeforeTest;
import java.io.File;
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.List;
import java.util.concurrent.TimeUnit;
import org.junit.Assert;
import org.openqa.selenium.By;
import org.openqa.selenium.NoSuchElementException;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.ie.InternetExplorerDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;
public class StackOverflow {
WebDriver driver1;
WebDriver driver2;
// ExtentReports report;
// ExtentTest logger;
// Boolean present;
File app = new File("App\\app_US_IT_Ananta.apk");
@BeforeSuite
public void startReport(){
// report=new ExtentReports("C:\\Anuj\\MobileAppResults.html");
}
@Test (priority =0)
public void installapp() {
// logger=report.startTest("VerifyAppInstalltion");
DesiredCapabilities capabilities = new DesiredCapabilities();
capabilities.setCapability("udid", "1015fadb1a274005");
// capabilities.setCapability("udid", "ee92ba92");
capabilities.setCapability("deviceName","Android Emulator");
capabilities.setCapability("platformVersion", "4.4");
capabilities.setCapability("autoAcceptAlerts", true);
capabilities.setCapability("app", app.getAbsolutePath());
try {
driver1 = new AndroidDriver(new URL("http://127.0.0.1:4723/wd/hub"), capabilities);
} catch (MalformedURLException e) {
e.printStackTrace();
}
}
@Test (priority =0)
public void installapp1() {
DesiredCapabilities capabilities1 = new DesiredCapabilities();
capabilities1.setCapability("udid", "ee92ba92");
capabilities1.setCapability("deviceName","Android Emulator");
capabilities1.setCapability("platformVersion", "4.4");
capabilities1.setCapability("autoAcceptAlerts", true);
capabilities1.setCapability("app", app.getAbsolutePath());
try {
driver2 = new AndroidDriver(new URL("http://127.0.0.1:4730/wd/hub"), capabilities1);
} catch (MalformedURLException e) {
e.printStackTrace();
}
}
}
Testng.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<suite name="Suite" parallel="tests" thread-count="2">
<test name="Test">
<classes>
<class name="ca.automation.com.StackOverflow"/>
</classes>
</test> <!-- Test -->
</suite> <!-- Suite -->
Change parallel="tests" to parallel="methods", because in your case it is the test methods that have to execute in parallel.
Also, running tests in parallel won't be exactly 100% simultaneous execution; there will be some lag between execution on the two devices. Try out a complete script with a few more steps, that way you can easily see that the tests are running simultaneously.
Looks like it's a pretty common problem, but I still can't find any solution.
The goal is to play an AAC stream in an Adobe AIR mobile app.
I've used a transcoder from https://code.google.com/p/project-thunder-snow/. It adds FLV headers to the AAC data so the stream can be played through a standard AS3 NetStream object (as I understand it). It works fine on Windows and Mac, but the same app launched on Android or iOS produces neither sound nor any error. I've figured out that the transcoder works fine; the cause is in the different characteristics of the NetStreams.
So, is there any solution or, at least, any documentation describing the difference between NetStreams on PC and mobile platforms?
Okay... I might have a clue for you. I tried running the NBAAC code link you posted on the other question from inside Flash CS5 and got this error:
onChannelReady : MetaData
Error: Error #2067: The ExternalInterface is not available in this container. ExternalInterface requires Internet Explorer ActiveX, Firefox, Mozilla 1.7.5 and greater, or other browsers that support NPRuntime.
So it seems the player is designed to be used only inside an HTML page and uses ExternalInterface (JavaScript) for getting metadata from the browser into the SWF.
I removed the ExternalInterface stuff and it played OK in Flash Player and Device Central without the browser/HTML. Try this code in your AIR app (I don't have AIR installed anymore to confirm..):
package
{
import com.thebitstream.flv.CodecFactory;
import com.thebitstream.flv.Transcoder;
import com.thebitstream.flv.codec.*;
import com.thebitstream.ice.*;
import com.thebitstream.nsv.*;
import flash.display.Sprite;
import flash.events.Event;
import flash.events.IOErrorEvent;
import flash.events.ProgressEvent;
import flash.events.SecurityErrorEvent;
//import flash.external.externalInterface;
import flash.media.SoundTransform;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.NetStreamAppendBytesAction;
import flash.net.URLRequest;
import flash.net.URLRequestMethod;
import flash.net.URLStream;
import flash.utils.ByteArray;
import flash.utils.setTimeout;
[SWF (width="250",height="250")]
public dynamic class NBAAC_AIR_v2 extends Sprite
{
public var request:URLRequest;
public var transcoder :Transcoder;
public var transport :NetConnection;
public var transportStream :NetStream;
public var serverConnection:URLStream;
//public var host:String=" "; //now not used in line... request=new URLRequest(resource);
public var resource:String="http://aacplus-ac-64.timlradio.co.uk/;"; //test station
public function NBAAC_AIR_v2 ()
{
super();
/*
if(loaderInfo.parameters.host)
host=loaderInfo.parameters.host;
if(loaderInfo.parameters.resource)
resource=loaderInfo.parameters.resource;
*/
//CodecFactory.ImportCodec(MetaData);
CodecFactory.ImportCodec(AAC);
CodecFactory.ImportCodec(AACP);
transcoder = new Transcoder();
transcoder.addEventListener(CodecEvent.STREAM_READY,onChannelReady);
transcoder.addEventListener(StreamDataEvent.DATA, onTag);
transcoder.initiate();
transcoder.loadCodec("AAC");
transport = new NetConnection();
transport.connect(null);
flash.utils.setTimeout(boot,500);
}
public function boot():void
{
//externalInterface.addCallback('nbaac.setVolume', onVolume);
//externalInterface.addCallback('nbaac.togglePause', onTogglePause);
//externalInterface.addCallback('nbaac.setBuffer', setBufferLength);
//externalInterface.addCallback('nbaac.getBuffer', getBufferLength);
//externalInterface.addCallback('nbaac.getTime', getTime);
/*
var meta:Object={};
meta.uri="";
meta.StreamTitle="";
*/
//externalInterface.call('logit','start up');
transportStream = new NetStream(transport);
transportStream.bufferTime=2;
transportStream.client = this;
transportStream.soundTransform=new SoundTransform(.5,0);
transportStream.play(null);
transportStream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
var headerTag:ByteArray=transcoder.createHeader(false,true);
transportStream.appendBytes(headerTag);
headerTag.clear();
//transcoder.readMetaObject(meta,0);
serverConnection=new URLStream();
serverConnection.addEventListener(ProgressEvent.PROGRESS,loaded);
serverConnection.addEventListener(IOErrorEvent.IO_ERROR, onIo);
serverConnection.addEventListener(SecurityErrorEvent.SECURITY_ERROR, onNoPolicy);
serverConnection.addEventListener(Event.CLOSE, onClose);
request=new URLRequest(resource); //removed "host" from url
//request.requestHeaders=[new URLRequestHeader("GET",resource+" HTTP/1.0")];
//request.requestHeaders=[new URLRequestHeader("Icy-MetaData","1")];
request.method=URLRequestMethod.GET;
serverConnection.load(request);
}
private function getTime() :Number
{ return transportStream.time; }
private function getBufferLength() :Number
{ return transportStream.bufferLength; }
private function setBufferLength (val:Number) :void
{ transportStream.bufferTime = val; }
private function onTogglePause():void
{ transportStream.togglePause(); }
private function onVolume (val:Number) :void
{ transportStream.soundTransform = new SoundTransform(val, 0); }
private function onIo(pe:IOErrorEvent) :void
{ /* externalInterface.call('logit','IOErrorEvent') */ }
private function onTag(sde:StreamDataEvent) :void
{
sde.tag.position=0;
transportStream.appendBytes(sde.tag);
}
private function onChannelReady(ce:CodecEvent) :void
{ trace('onChannelReady :',ce.codec.type); }
private function onClose(e:Event):void
{ /* externalInterface.call('logit','onClose') */ }
public function onMetaData(e:Object):void
{ /* externalInterface.call('logit','onMetaData') */ }
private function loaded(e:ProgressEvent):void
{
var chunk:ByteArray=new ByteArray();
while(serverConnection.bytesAvailable)
{ chunk.writeByte( serverConnection.readByte() ); }
chunk.position=0;
transcoder.addRawData( chunk, 0, "AAC" );
}
private function onNoPolicy(se:SecurityErrorEvent):void
{ /* externalInterface.call('logit','SecurityErrorEvent'+host+resource+'<br />'+se.text); */ }
}
}
I am trying to play a video on an Android tablet using StageVideo, but any time I click play and add the video to the stage, the whole app flickers and then the video is added to the stage. The video then starts off all pixelated; then that goes away and it starts playing properly with only a few jumps. I am wondering what is causing this to happen? Is there a better way to load the video? This can also happen when just using the Video object in Flex.
The video is stored locally in file:///mnt/sdcard
The video type is H.264. Thanks for your help! If I missed something that you need to know, please comment and I will edit my question.
Here is the view for the video. (I am using a view-based mobile app.)
<?xml version="1.0" encoding="utf-8"?>
<s:View xmlns:fx="http://ns.adobe.com/mxml/2009" xmlns:s="library://ns.adobe.com/flex/spark" title="stageVidPage" backKeyPressed="0" xmlns:mx="library://ns.adobe.com/flex/mx" backgroundAlpha="0" alpha="1">
<fx:Script>
<![CDATA[
import ios.iOSStageVideo;
import mx.core.UIComponent;
import mx.events.FlexEvent;
protected function backClick(event:MouseEvent):void
{
navigator.pushView(SliderAppHomeView);
}
protected function playVideo(event:MouseEvent):void
{
var path:String = new String(new File("file:///mnt/sdcard/Movies/Video_test_11.mp4").url);
var vid:iOSStageVideo = new iOSStageVideo( path , 1280 , 720 );
vid.addEventListener('videoDone' , videoStop);
var container:UIComponent = new UIComponent();
container.width = stage.stageWidth;
container.height = stage.stageHeight;
addElement( container );
container.addChild( vid );
}
private function videoStop(e:Event):void {
//vid.stopVideo();
//container.removeChild( vid );
//removeElement( container );
}
]]>
</fx:Script>
<fx:Declarations>
<!-- Place non-visual elements (e.g., services, value objects) here -->
</fx:Declarations>
<s:actionContent>
<s:Button click="backClick(event)" label="Back"/>
</s:actionContent>
<s:Button left="10" bottom="10" label="Play" alpha="1" click="playVideo(event)"/>
</s:View>
Here is the AS class I found online to help play the video. (I really don't use much of it, and since it gives some errors when the video ends I will need to rewrite it; I commented out those parts.)
package ios
{
import flash.display.Sprite;
import flash.display.StageAlign;
import flash.display.StageQuality;
import flash.display.StageScaleMode;
import flash.events.Event;
import flash.events.NetStatusEvent;
import flash.events.StageVideoAvailabilityEvent;
import flash.events.StageVideoEvent;
import flash.geom.Rectangle;
import flash.media.StageVideo;
import flash.media.StageVideoAvailability;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
[Bindable]
public class iOSStageVideo extends Sprite
{
private var videoPath:String;
private var videoWidth:Number;
private var videoHeight:Number;
private var _sv:StageVideo;
private var _vd:Video;
private var _obj:Object;
private var _ns:NetStream;
public function iOSStageVideo( path:String , w:Number , h:Number ){
videoPath = path;
videoWidth = w;
videoHeight = h;
addEventListener(Event.ADDED_TO_STAGE, onAddedToStage);
}
//stage is ready
private function onAddedToStage(e:Event):void{
stage.scaleMode = StageScaleMode.NO_SCALE;
stage.align = StageAlign.TOP_LEFT;
var nc:NetConnection = new NetConnection();
nc.connect(null);
_ns = new NetStream(nc);
_obj = new Object();
_ns.client = _obj;
_ns.bufferTime = 2;
_obj.onMetaData = MetaData;
_sv = stage.stageVideos[0];
_sv.viewPort = new Rectangle(0, 0, videoWidth , videoHeight );
_sv.attachNetStream(_ns);
playVideo();
}
//video is ready, play it
//public, can be called externally
public function playVideo():void{
_ns.play( videoPath );
_ns.addEventListener(NetStatusEvent.NET_STATUS, videoStatus);
}
//required metadata for stagevideo, even if not used
private function MetaData(info:Object):void{ }
//get video status
private function videoStatus(e:NetStatusEvent):void{
switch(e.info.code){
case "NetStream.Play.StreamNotFound":
//do something
break;
case "NetStream.Play.Start":
//do something
break;
case "NetStream.Play.Stop":
stopVideo();
break;
case "NetStream.Buffer.Empty":
//do something
break;
case "NetStream.Buffer.Full":
//do something
break;
case "NetStream.Buffer.Flush":
//do something
break;
}
}
//stop and clear the video
//public, can be called externally
public function stopVideo():void{
_ns.close();
_ns.dispose();
dispatchEvent( new Event('videoDone', true ) );
}
}
}
The video is located on the tablet in the local file system. Thank you for any help!
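One thing worth noting about the class above: it imports StageVideoAvailability and StageVideoAvailabilityEvent but never uses them, and it grabs stage.stageVideos[0] directly in onAddedToStage. A rough sketch of the availability-check pattern, reusing the names from the class above (untested, and the NetConnection/NetStream setup is assumed to stay as it is):
// Instead of attaching immediately in onAddedToStage, wait for the availability event.
private function onAddedToStage(e:Event):void {
    stage.scaleMode = StageScaleMode.NO_SCALE;
    stage.align = StageAlign.TOP_LEFT;
    // ... NetConnection / NetStream setup exactly as in the class above ...
    stage.addEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY, onAvailability);
}

private function onAvailability(e:StageVideoAvailabilityEvent):void {
    stage.removeEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY, onAvailability);
    if (e.availability == StageVideoAvailability.AVAILABLE && stage.stageVideos.length > 0) {
        _sv = stage.stageVideos[0];
        _sv.viewPort = new Rectangle(0, 0, videoWidth, videoHeight);
        _sv.attachNetStream(_ns);
        playVideo();
    } else {
        // StageVideo is not available; fall back to a plain Video object here.
    }
}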