Building Chromium for Android with WebRTC H264 support

I'm trying to build Chromium for Android with H264 support in WebRTC. My understanding is that the following args.gn should do what I want:
target_os = "android"
target_cpu = "arm64"
proprietary_codecs = true
ffmpeg_branding = "Chrome"
However, when I install the APK on my Pixel 3, use chrome://inspect to debug from my desktop, and run new RTCPeerConnection().createOffer({offerToReceiveVideo: true}).then(s => console.log(s.sdp)), I only see the VP8 and VP9 codecs.
Is there anything else I'm missing?
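One way to see exactly which video codecs an offer contains is to pull the a=rtpmap: lines out of the SDP. A minimal sketch (the sample SDP below is illustrative, not taken from a real offer):

```typescript
// Extract the codec names advertised in an SDP blob by parsing its
// "a=rtpmap:<pt> <codec>/<clock>" attribute lines.
function extractVideoCodecs(sdp: string): string[] {
  const codecs: string[] = [];
  for (const line of sdp.split(/\r?\n/)) {
    const match = line.match(/^a=rtpmap:\d+ ([A-Za-z0-9._-]+)\/\d+/);
    if (match && !codecs.includes(match[1])) {
      codecs.push(match[1]);
    }
  }
  return codecs;
}

// Illustrative SDP fragment similar in shape to what createOffer() produces.
const sampleSdp = [
  "m=video 9 UDP/TLS/RTP/SAVPF 96 98",
  "a=rtpmap:96 VP8/90000",
  "a=rtpmap:98 VP9/90000",
].join("\r\n");

console.log(extractVideoCodecs(sampleSdp)); // → [ 'VP8', 'VP9' ]
```

Running this against the real offer SDP from the console makes it easy to diff codec lists between builds.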

I ended up having to change the code to get the behavior I wanted.
Setting these build flags causes the GPU process to answer "Yes, I support H264 video decoding" to any query: https://cs.chromium.org/chromium/src/media/gpu/android/media_codec_video_decoder.cc?q=proprietary_codecs&sq=package:chromium&dr=C&l=154
However, WebRTC's definition of supported codecs comes from this function, which simply polls the formats supported by the encoder: https://webrtc.googlesource.com/src/+/refs/heads/master/media/engine/webrtc_video_engine.cc#142. So although my Pixel 3 supports H264 decoding, it doesn't support encoding, and WebRTC therefore considers H264 an unsupported format. Interestingly, Chrome running on the exact same device does support WebRTC H264.
I'm only looking to receive H264 video, so I edited this function to add a webrtc::SdpVideoFormat for each H264 format that Chrome supports.
+static void AddH264Formats(std::vector<webrtc::SdpVideoFormat>& formats) {
+  // Add both packetization modes for each H264 profile Chrome advertises.
+  for (const char* profile : {"42001f", "42e01f", "4d0032"}) {
+    for (const char* mode : {"1", "0"}) {
+      webrtc::SdpVideoFormat h264Format(
+          kH264CodecName,
+          {{cricket::kH264FmtpLevelAsymmetryAllowed, "1"},
+           {cricket::kH264FmtpProfileLevelId, profile},
+           {cricket::kH264FmtpPacketizationMode, mode}});
+      if (std::find(formats.begin(), formats.end(), h264Format) ==
+          formats.end()) {
+        formats.push_back(h264Format);
+      }
+    }
+  }
+}
+
std::vector<VideoCodec> AssignPayloadTypesAndDefaultCodecs(
    const webrtc::VideoEncoderFactory* encoder_factory) {
- return encoder_factory ? AssignPayloadTypesAndDefaultCodecs(
- encoder_factory->GetSupportedFormats())
- : std::vector<VideoCodec>();
+ // Keep the old null check on encoder_factory before adding H264.
+ auto formats = encoder_factory ? encoder_factory->GetSupportedFormats()
+                                : std::vector<webrtc::SdpVideoFormat>();
+ AddH264Formats(formats);
+ return AssignPayloadTypesAndDefaultCodecs(formats);
}
Instead of editing the WebRTC code, I may instead edit GpuVideoAcceleratorFactoriesImpl::GetVideoEncodeAcceleratorSupportedProfiles. Editing GpuVideoAcceleratorFactoriesImpl this way may be less correct, but it would let me fork Chromium without having to touch third_party repositories.

Related

Invalid Request when debugging on Android device, Xamarin

I'm writing a project in Xamarin Forms, and today we tried our app on an Android device, where our request to the Google API stopped working. Here's the code:
HttpWebRequest webRequest = WebRequest.Create(@"https://maps.googleapis.com/maps/api/place/search/json?location=" + position.Latitude + "," + position.Longitude + "&radius=1000&types=bar&sensor=false&key=APIKEY") as HttpWebRequest;
webRequest.Method = "GET";
webRequest.BeginGetResponse(new AsyncCallback(RequestCompleted), webRequest);
and then :
var request = (HttpWebRequest)result.AsyncState;
var response = (HttpWebResponse)request.EndGetResponse(result);
using (var stream = response.GetResponseStream())
{
var r = new StreamReader(stream);
var resp = r.ReadToEnd();
}
We get the response and parse it as JSON. On the computer it works fine, but on Android it says INVALID REQUEST. Has anyone who has worked with Xamarin run into this and knows how to solve it?
If I had to guess, I'd say that the locale/language on your phone differs from the one on your PC.
HttpWebRequest webRequest = WebRequest.Create(@"https://maps.googleapis.com/maps/api/place/search/json?location=" + position.Latitude + "," + position.Longitude + "&radius=1000&types=bar&sensor=false&key=APIKEY") as HttpWebRequest;
There are implicit ToString() calls at ...n=" + position.Latitude + "," + position.Longitude + "&ra... that produce localized output, but the Google API needs the values with a . as the decimal separator.
Append .ToString("G", CultureInfo.InvariantCulture) to each coordinate:
HttpWebRequest webRequest = WebRequest.Create(@"https://maps.googleapis.com/maps/api/place/search/json?location=" +
position.Latitude.ToString("G", CultureInfo.InvariantCulture) +
"," + position.Longitude.ToString("G", CultureInfo.InvariantCulture) +
"&radius=1000&types=bar&sensor=false&key=APIKEY") as HttpWebRequest;
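The general rule is: format numbers with an invariant conversion before concatenating them into a URL. For illustration, the same idea sketched in TypeScript (the URL shape and APIKEY are placeholders from the question): in JavaScript, String(n) is locale-independent by specification, playing the role that ToString("G", CultureInfo.InvariantCulture) plays in C#.

```typescript
// Build the Places URL with locale-independent number formatting.
// String(n) always uses "." as the decimal separator, regardless of locale.
function buildPlacesUrl(lat: number, lng: number, apiKey: string): string {
  return "https://maps.googleapis.com/maps/api/place/search/json" +
    `?location=${String(lat)},${String(lng)}` +
    "&radius=1000&types=bar&sensor=false" +
    `&key=${apiKey}`;
}

console.log(buildPlacesUrl(48.8566, 2.3522, "APIKEY"));
// → ...?location=48.8566,2.3522&radius=1000...
```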

Selenium File Upload iOS or Android using BrowserStack

Is there any way to upload a file in a mobile browser (Safari on iOS, Chrome on Android)? Conventional methods don't seem to work. Maybe there are alternatives to BrowserStack where file upload can actually work?
BrowserStack uses Appium to drive your Selenium tests on Android and iOS.
As given here, since Appium currently does not support uploads, BrowserStack wouldn't be able to support file upload on mobile devices either.
This is a hacky file-upload solution using base64 and JS, but it works, so hey, I hope you find it useful:
public CorePage UploadHack(string fileInputId, string contentType, string fileContent, string fileName, string angularScopeVar)
{
    UploadFile(_filePath);
    var uploadHack =
        "(function(){" +
            "function convert(base64){" +
                "var raw = atob(base64);" +
                "var arr = new Uint8Array(new ArrayBuffer(raw.length));" +
                "for (var i = 0; i < raw.length; i++){" +
                    "arr[i] = raw.charCodeAt(i);" +
                "}" +
                "return arr;" +
            "}" +
            $"var file = new Blob([convert('{fileContent}')], {{'type':'{contentType}'}});" +
            $"file.name = '{fileName}';" +
            $"angular.element('#{fileInputId}').scope().{angularScopeVar} = file;" +
        "})()";
    Driver.ExecuteJavaScript(uploadHack);
    return this;
}
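For completeness, the fileContent argument is the file's raw bytes encoded as base64. In a Node-based harness that string could be produced like this (a sketch; the file path is a placeholder):

```typescript
import { readFileSync } from "fs";

// Encode a file's bytes as base64 so they can be embedded in the injected JS.
function fileToBase64(path: string): string {
  return readFileSync(path).toString("base64");
}

// The same encoding applied to an in-memory buffer, for demonstration:
const encoded = Buffer.from("hello upload").toString("base64");
console.log(encoded);
```

The injected convert() function in the hack above is just the inverse of this encoding, done in the browser with atob().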

Unable to merge videos using FFMPEG Commands

I am trying to merge two videos on Android using FFMPEG, and I have been following the Android War Zone blog, which gives great ideas and simple methods for integrating FFMPEG into a project. However, I am facing issues merging two videos.
Command :
vk.run(new String[]{
"ffmpeg",
"-f",
"concat",
"-i",
list,
"-s",
"hd720",
"-c",
"copy",
"-b",
br_from_db + "k",
path + "/" + "merged_video_3.mp4"
}, work_path, getActivity());
And the "list" in the above command is where I am facing an issue. I get the following error when I use this method:
Code :
private String generateList(String[] inputs) {
    File list;
    Writer writer = null;
    try {
        list = File.createTempFile("ffmpeg-list", ".txt");
        writer = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(list)));
        for (String input : inputs) {
            writer.write("file '" + input + "'\n");
            Log.d(TAG, "Writing to list file: file '" + input + "'");
        }
    } catch (IOException e) {
        e.printStackTrace();
        return "/";
    } finally {
        try {
            if (writer != null)
                writer.close();
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }
    Log.d(TAG, "Wrote list file to " + list.getAbsolutePath());
    return list.getAbsolutePath();
}
Error :
12-16 19:49:57.416 5437-5437/? E/ffmpeg4android﹕ Command validation failed.
12-16 19:49:57.416 5437-5437/? E/ffmpeg4android﹕ Check if input file exists: /data/data/com.family45.golive.family45v1/cache/ffmpeg-list-1803386407.txt/storage/emulated/0/DCIM/Camera/dec24.mp4 /storage/emulated/0/DCIM/Camera/vid2.mp4
12-16 19:49:57.416 5437-5437/? W/System.err﹕ com.netcompss.ffmpeg4android.CommandValidationException
12-16 19:49:57.416 5437-5437/? W/System.err﹕ at com.netcompss.loader.LoadJNI.run(LoadJNI.java:34)
12-16 19:49:57.416 5437-5437/? W/System.err﹕ at com.netcompss.loader.LoadJNI.run(LoadJNI.java:49)
I obtained the command from this Stack Overflow question. It's accepted, but I am facing the above issue. I am very sure that the videos are present in their respective locations and all the paths are right, but I can't seem to make it work.
Any insights on this is highly appreciated. Thanks in advance.
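For reference, the list file the concat demuxer reads is plain text with one file '<path>' line per input, in playback order. A minimal sketch of building that content (the paths below are the ones from my error log, used as placeholders):

```typescript
// Build the contents of an ffmpeg concat-demuxer list file:
// one "file '<path>'" line per input, in playback order.
function buildConcatList(paths: string[]): string {
  return paths.map((p) => `file '${p}'`).join("\n") + "\n";
}

const listText = buildConcatList([
  "/storage/emulated/0/DCIM/Camera/dec24.mp4",
  "/storage/emulated/0/DCIM/Camera/vid2.mp4",
]);
console.log(listText);
```

Dumping the generated text this way makes it easy to verify the list file itself is well-formed before blaming the command.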
Update :
Call to generateList:
ArrayList<String> paths_to_merge = new ArrayList<String>();
paths_to_merge.add(path + "/" + "dec24.mp4");
paths_to_merge.add(path + "/" + "vid2.mp4");
LoadJNI vk = new LoadJNI();
String[] v12 = new String[paths_to_merge.size()];
v12 = paths_to_merge.toArray(v12);
String list = generateList(v12);
I am not sure what went wrong in my code; I am still not able to come up with the right list. However, I found another command which seems to work well.
Command :
vk.run(new String[]{"ffmpeg","-y","-i",path + "/" + "num1.mp4","-i",path + "/" + "num2.mp4","-i",path + "/" + "num3.mp4","-i",path + "/" + "num4.mp4",
"-i",created_folder + "/" + "created_video2.mp4","-strict","experimental",
"-filter_complex",
"[0:v]scale=640x480,setsar=1:1[v0];[1:v]scale=640x480,setsar=1:1[v1];[2:v]scale=640x480,setsar=1:1[v2];[3:v]scale=640x480,setsar=1:1[v3];" +
"[4:v]scale=640x480,setsar=1:1[v4];[v0][0:a][v1][1:a][v2][2:a][v3][3:a][v4][4:a] concat=n=5:v=1:a=1",
"-ab","48000","-ac","2","-ar","22050","-s","640x480","-r","30","-vcodec","mpeg4","-b","2097k",path + "/" + "numbers_video_m.mp4"},path,getActivity());
As you can see in the command, I have appended 5 videos for testing, but more videos can be added dynamically, and this works without any issues for me.
Things to be noted :
"-i",path + "/" + "num1.mp4"
represents one input; you can append as many as you want.
[0:v]scale=640x480,setsar=1:1[v0];
and add one of these per input, as [0:v]..., [1:v]..., and so on.
[v0][0:a]
and this parameter also needs to be repeated according to the number of inputs.
concat=n=5:v=1:a=1
Set the value of n to the number of videos.
Those are the main things that need to be taken care of.

Cordova + Android : How to get path to sd card?

I'm trying to play some mp3 files (which the app downloaded from a remote server) from the SD card using Cordova.
I tried to use following:
if($ionicPlatform.is('android')){
src = 'yourPersistantAppFolder/main_expansion/' + src;
}
and
if($ionicPlatform.is('android')){
src = 'sdcard/yourPersistantAppFolder/main_expansion/' + src;
}
But without luck.
I tried to find an answer here:
https://github.com/apache/cordova-plugin-file
But again, no luck.
Many thanks for any advice.
Edit:
Path in Android is displayed like this: (screenshot omitted)
This may help:
(the jc-prefixed names are my support routines)
import { File } from "@ionic-native/file";
import { Diagnostic } from "@ionic-native/diagnostic";
constructor(
...
public file: File,
public diagnostic: Diagnostic
){
this.diagnostic.getExternalSdCardDetails()
.then( (data) => {
this.jcError += "\n" + "sd:" + JSON.stringify( data);
this.jcError += "\n" + "Number cards: " + data.length;
for( let ii = 0; ii < data.length; ii += 1){
let thisElem = data[ii];
if( thisElem.type.toLowerCase() === "application" && thisElem.canWrite){
this.ourSDentry = thisElem;
basePathSD = thisElem.filePath;
break;
}
}
if( !basePathSD){
this.jcError += "\n\n" + "no SD card found";
return;
}
}, (errData)=>{
tag = "getExternalSdCardDetails";
this.jcError += "\n\n" + tag + ":ERR:" + JSON.stringify( errData);
});
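The selection loop above can be isolated into a small helper. A sketch with mocked data (the field names follow the shape getExternalSdCardDetails() returned in my tests; treat them as assumptions, not an official type):

```typescript
// Shape of one entry returned by Diagnostic.getExternalSdCardDetails()
// (assumed from observed output; not an official interface).
interface SdCardEntry {
  type: string;      // e.g. "application" or "root"
  canWrite: boolean;
  filePath: string;
}

// Pick the first writable application-scoped entry, or null if none exists.
function pickWritableAppEntry(entries: SdCardEntry[]): SdCardEntry | null {
  for (const e of entries) {
    if (e.type.toLowerCase() === "application" && e.canWrite) {
      return e;
    }
  }
  return null;
}

const mock: SdCardEntry[] = [
  { type: "root", canWrite: false, filePath: "/storage/1234-5678" },
  { type: "application", canWrite: true, filePath: "/storage/1234-5678/Android/data/app/files" },
];
console.log(pickWritableAppEntry(mock)?.filePath);
```

Keeping this as a pure function makes the SD-card detection logic testable without a device.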

XMLHttpRequest and CrossWalk Android (edit: after I found a solution!)

Original post (before the fix):
I'm developing an application (transformer.html) with XDK (ThreeJS and WebGL) to control another application. The communication is established via a web service (VB ASP.NET). The first version of this transformer.html had NO WebGL and no Crosswalk. In debug and emulator mode all was fine. It also compiled as a legacy hybrid mobile APK and installed on my Samsung Galaxy Tab 3 without any problems. The problems popped up when I implemented Crosswalk AND XMLHttpRequest in my app.
The same-origin policy does not seem to be the problem. I placed this on the web server side (in web.config):
<webServices>
<protocols>
<add name="HttpGet"/>
<add name="HttpPost"/>
</protocols>
</webServices>
As said, the constellation Transformer app on tablet -- web service -- target web application runs perfectly in my local network!
The Problem:
The problem came up when implementing a simple WebGL graphic. Everything runs perfectly in simulation and debug, and also when I just run it in a Firefox browser. But when I build an APK, this combination (Crosswalk & XMLHttpRequest) fails!
*kristina/14.03.15: it's working now. I consequently used this XDK example: https://github.com/gomobile/sample-webgl-threejs
I took my HTTP request in and it worked fine!*
Before:
XMLHttpRequest (or even POST via jQuery!) worked fine under an Android legacy build. I could make the APK run, but no Crosswalk WebGL graphic was visible in my app. So HTTP POST was OK, but not Crosswalk. I wonder whether it is possible to build a legacy app with Crosswalk at all; even with the XDK's own Crosswalk demo I was not able to build a hybrid legacy APK.
Crosswalk was OK in my app when I built with Crosswalk for Android, but in this case XMLHttpRequest seemed not to work. My connection to the web service failed; I got a 404. But, as I said, all the communication should be there, as it works in the other modes (legacy, emulation, browser, whatever...).
kristina/14.03.15: XDK recommends building with Android/Crosswalk. This works NOW. The trick is to set specific host parameters in the build section:
<access origin="*"/>
(Handle with care, e.g. reduce this to specific hosts! My setup only works in my small local environment; this way it's OK for me.)
This info was very helpful during the error tracking:
https://software.intel.com/en-us/xdk/docs/adding-special-build-options-to-your-xdk-cordova-app-with-the-intelxdk-config-additions-xml-file
https://software.intel.com/en-us/xdk/docs/using-the-cordova-for-android-ios-etc-build-option
http://www.ilinsky.com/articles/XMLHttpRequest/#usage
This was NOT very helpful, as in XDK/Cordova/Crosswalk the manifest.json does not seem to affect anything(!?):
https://crosswalk-project.org/documentation/manifest/content_security_policy.html
As so often, it was the same-origin topic I struggled with.
Many Thanks to Paul (Intel) who gave me the final hint :-)
Now the construction CrossWalk - WebGL - XMLHttpRequest - web service is working perfectly.
My Setup:
XDK 1826 (latest release, received automatically!)
Samsung Galaxy Tab 3
If you need more info please let me know.
THIS code runs after the fixes:
main.js:
/*jslint browser:true, devel:true, white:true, vars:true, eqeq:true */
/*global THREE:false, requestAnimationFrame:false*/
/*
* Based on http://threejs.org/examples/canvas_geometry_cube.html
*/
document.addEventListener ('DOMContentLoaded', function () {
var camera, scene, renderer;
var cube, plane;
var targetRotation = 0;
var targetRotationOnMouseDown = 0;
var mouseX = 0;
var mouseXOnMouseDown = 0;
var windowHalfX = window.innerWidth / 2;
var windowHalfY = window.innerHeight / 2;
var auto_timer = 0;
init();
animate();
function init() {
renderer = new THREE.WebGLRenderer( { antialias: true, alpha: true, devicePixelRatio: 1 } );
renderer.setSize (window.innerWidth, window.innerHeight);
document.body.appendChild (renderer.domElement);
camera = new THREE.PerspectiveCamera (
70, window.innerWidth / window.innerHeight, 1, 1000);
camera.position.y = 150;
camera.position.z = 500;
scene = new THREE.Scene();
// Cube
var geometry_cube = new THREE.CubeGeometry (200, 200, 200);
var texture = THREE.ImageUtils.loadTexture ('textures/crosswalk.png');//Works on mobile Android NOT in Browser or Intel XDK
texture.anisotropy = renderer.getMaxAnisotropy ();
var material_cube = new THREE.MeshBasicMaterial ( { map: texture } );
cube = new THREE.Mesh (geometry_cube, material_cube);
cube.position.y = 150;
scene.add( cube );
// Plane
var geometry_plane = new THREE.PlaneGeometry (180, 180);
geometry_plane.applyMatrix (new THREE.Matrix4 ().makeRotationX (-Math.PI / 2));
var material_plane = new THREE.MeshBasicMaterial ( { color: 0xde613e } );
plane = new THREE.Mesh (geometry_plane, material_plane);
scene.add (plane);
document.addEventListener ('mousedown', onDocumentMouseDown, false);
document.addEventListener ('touchstart', onDocumentTouchStart, false);
document.addEventListener ('touchmove', onDocumentTouchMove, false);
// Generic setup
window.addEventListener ('resize', onWindowResize, false);
}
function onWindowResize () {
windowHalfX = window.innerWidth / 2;
windowHalfY = window.innerHeight / 2;
camera.aspect = window.innerWidth / window.innerHeight;
camera.updateProjectionMatrix ();
renderer.setSize (window.innerWidth, window.innerHeight);
}
function stopAutoRotate () {
if (auto_timer)
window.clearTimeout (auto_timer);
auto_timer = window.setTimeout (startAutoRotate, 1000);
}
function startAutoRotate () {
auto_timer = 0;
}
function animate () {
requestAnimationFrame (animate);
plane.rotation.y = cube.rotation.y += (targetRotation - cube.rotation.y) * 0.05;
if (auto_timer === 0) {
targetRotation += 0.025;
}
renderer.render (scene, camera);
}
function onDocumentMouseDown (e) {
e.preventDefault();
document.addEventListener ('mousemove', onDocumentMouseMove, false);
document.addEventListener ('mouseup', onDocumentMouseUp, false);
document.addEventListener ('mouseout', onDocumentMouseOut, false);
mouseXOnMouseDown = e.clientX - windowHalfX;
targetRotationOnMouseDown = targetRotation;
stopAutoRotate ();
}
function onDocumentMouseMove (e) {
mouseX = e.clientX - windowHalfX;
targetRotation = targetRotationOnMouseDown +
(mouseX - mouseXOnMouseDown) * 0.02;
stopAutoRotate ();
}
function onDocumentMouseUp (e) {
document.removeEventListener ('mousemove', onDocumentMouseMove, false);
document.removeEventListener ('mouseup', onDocumentMouseUp, false);
document.removeEventListener ( 'mouseout', onDocumentMouseOut, false);
stopAutoRotate ();
}
function onDocumentMouseOut (e) {
document.removeEventListener ('mousemove', onDocumentMouseMove, false);
document.removeEventListener ('mouseup', onDocumentMouseUp, false);
document.removeEventListener ('mouseout', onDocumentMouseOut, false);
stopAutoRotate ();
}
function onDocumentTouchStart (e) {
if (e.touches.length === 1) {
e.preventDefault ();
miniHttpTest();
getSphereParametersWSxhr();
mouseXOnMouseDown = e.touches[ 0 ].pageX - windowHalfX;
targetRotationOnMouseDown = targetRotation;
stopAutoRotate ();
}
}
function onDocumentTouchMove (e) {
if (e.touches.length === 1) {
e.preventDefault ();
mouseX = e.touches[0].pageX - windowHalfX;
targetRotation = targetRotationOnMouseDown +
(mouseX - mouseXOnMouseDown) * 0.05;
stopAutoRotate ();
}
}
});
function XHRObject() {
var xhr;
xhr = new XMLHttpRequest();
xhr.onerror = function () {};
xhr.onstart = function () {};
xhr.success = function () {};
return xhr;
}
function getSphereParametersWSxhr() {
var url = "http://192.444.2.444/transporter.asmx/getVideoCubeParameters";
var xhr = new XMLHttpRequest();
var params = "";
console.log(xhr);
alert("----------------------------- getSphereParametersWSxhr - before open POST : " + xhr);
xhr.open("POST", url, true);
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
// forbidden xhr.setRequestHeader("Content-length", params.length);
alert("after open POST : " + xhr);
try {
xhr.onreadystatechange = function () {
alert("xhr.readyState == " + xhr.readyState + " xhr.status == " + xhr.status + " xhr.statusText: " + xhr.statusText + " xhr.responseText" + xhr.responseText);
if (xhr.readyState == 2 && xhr.status == 404) {
console.log("404 page not found: " + xhr);
alert("404 page not found: " + xhr);
}
if (xhr.readyState == 3) {
console.log("ready state 3: " + xhr.statusText + " " + xhr.status);
alert("ready state 3: " + xhr.statusText + " " + xhr.status);
}
if (xhr.readyState == 4) { //&& xhr.status == 200
console.log("ready state 4: " + xhr.statusText + " " + xhr.responseText);
alert("ready state 4: " + xhr.statusText + " " + xhr.responseText);
var erg1 = xhr.responseXML.getElementsByTagName("videoCubeSizeX")[0].textContent;
var stringList = erg1.split(";");
console.log(erg1);
alert("videoCubeSizeX: " + erg1);
alert(xhr.responseText);
}
}
xhr.send(params);
} catch (e) {
console.log("XHR Post : " + e);
alert("XHR Post : " + e);
}
}
function miniHttpTest() {
alert("miniHttpTest: mit GET ");
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://crosswalk-project.org/", true);
xhr.onreadystatechange = function () {
if (xhr.readyState == 4) {
alert("ready state 4: " + xhr.statusText + " " + xhr.responseText);
}
}
xhr.send();
}
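Aside from the build fix itself, the responseXML handling in getSphereParametersWSxhr can be made testable by extracting the tag value from the raw response text instead. A regex-based sketch, suitable only for flat, non-nested tags such as ASMX scalar results (the sample response body and its schema are assumptions):

```typescript
// Extract the text content of the first occurrence of a simple XML tag.
// Only works for flat, non-nested tags; not a general XML parser.
function extractTagText(xml: string, tag: string): string | null {
  const match = xml.match(new RegExp(`<${tag}[^>]*>([^<]*)</${tag}>`));
  return match ? match[1] : null;
}

// Illustrative response body; the real service's schema is assumed here.
const sample = '<?xml version="1.0"?><videoCubeSizeX>640;480</videoCubeSizeX>';
console.log(extractTagText(sample, "videoCubeSizeX")); // → 640;480
```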
