I want to know how I can display video from JPEGs in Xamarin (all platforms).
My JPEGs are streamed from an HTTP client stream sent by a popular video surveillance management software.
The JPEGs come in as byte[] and I get about 10 JPEGs/second. This format is imposed.
I tried rapidly changing the Source on an Image (see the sketch below), but it results in severe flickering on Android. It seems to work on Windows Phone, but with poor performance.
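For reference, this is roughly what that per-frame Image swap looks like (a minimal sketch, assuming the usual Xamarin.Forms and System.IO usings; frameImage is a hypothetical Image element on the page):
// Building a brand-new ImageSource for every frame forces a full reload
// of the Image view each time, which is what flickers on Android.
void ShowFrame(byte[] jpegBytes)
{
    frameImage.Source = ImageSource.FromStream(() => new MemoryStream(jpegBytes));
}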
How can I build a video player out of these frames? Unless I am wrong, the existing components cannot do this.
Best,
Thank you Jason! Works great, very fluid rendering!!
Simply add the SkiaSharp.Views.Forms package to the project with NuGet and voila!
Here is what that would look like in code (shared project):
// Fields shared between InitUI, the paint handler and UpdateFrame
SKCanvasView videoCanvas;
byte[] lastFrame;

// Content page initialization
private void InitUI()
{
    Title = "Xamavideo";

    var button = new Button { Text = "Connect!" };
    var label = new Label { Text = "" };

    var scroll = new ScrollView();
    scroll.BackgroundColor = Color.Black;
    Content = scroll;

    var stack = new StackLayout
    {
        Padding = 40,
        Spacing = 10
    };

    // Add a SKCanvasView item to the stack
    videoCanvas = new SKCanvasView
    {
        HeightRequest = 400,
        WidthRequest = 600,
    };
    videoCanvas.PaintSurface += OnCanvasViewPaintSurface;

    stack.Children.Add(button);
    stack.Children.Add(label);
    stack.Children.Add(videoCanvas);
    scroll.Content = stack; // show the stack inside the scroll view
}
// Create the event handler
void OnCanvasViewPaintSurface(object sender, SKPaintSurfaceEventArgs args)
{
    SKImageInfo info = args.Info;
    SKSurface surface = args.Surface;
    var canvas = surface.Canvas; // owned by the surface, so not disposed here

    if (lastFrame == null)
        return;

    // Use SKBitmap.Decode to decode the byte[] in JPEG format
    using (var bitmap = SKBitmap.Decode(lastFrame))
    using (var paint = new SKPaint())
    {
        // Clear the canvas / fill with black, then draw the frame scaled to the canvas
        canvas.DrawColor(SKColors.Black);
        canvas.DrawBitmap(bitmap, SKRect.Create(info.Width, info.Height), paint);
    }
}
void UpdateFrame(VideoClient client)
{
    // Store the newest JPEG bytes and ask the canvas to repaint
    lastFrame = client.imageBytes;
    videoCanvas.InvalidateSurface();
}
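To tie it together, here is a minimal sketch of driving UpdateFrame at the stream's ~10 frames/second (VideoClient is the questioner's own type, assumed here to always hold the latest JPEG in imageBytes):
async Task RunVideoLoop(VideoClient client)
{
    while (true)
    {
        UpdateFrame(client);   // publish the newest frame and invalidate the canvas
        await Task.Delay(100); // roughly 10 frames per second
    }
}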
I am able to successfully take a screenshot of one of the pages of my application JainLibrary using the code below. I am using JUnit and Appium.
public String Screenshotpath = "Mention the folder Location";
File scrFile = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
FileUtils.copyFile(scrFile, new File(Screenshotpath + "Any name" + ".jpg"));
Now I want to compare the screenshot with a reference image so that I can move forward with the test case.
A simple solution would be to compare each pixel with the reference screenshot:
// save the baseline screenshot
driver.get("https://www.google.co.uk/intl/en/about/");
File scrFile = ((TakesScreenshot)driver).getScreenshotAs(OutputType.FILE);
FileUtils.copyFile(scrFile, new File("c:\\temp\\screenshot.png"));
// take another screenshot and compare it to the baseline
driver.get("https://www.google.co.uk/intl/en/about/");
byte[] pngBytes = ((TakesScreenshot)driver).getScreenshotAs(OutputType.BYTES);
if (IsPngEquals(new File("c:\\temp\\screenshot.png"), pngBytes)) {
System.out.println("equals");
} else {
System.out.println("not equals");
}
public static boolean IsPngEquals(File pngFile, byte[] pngBytes) throws IOException {
BufferedImage imageA = ImageIO.read(pngFile);
ByteArrayInputStream inStreamB = new ByteArrayInputStream(pngBytes);
BufferedImage imageB = ImageIO.read(inStreamB);
inStreamB.close();
DataBufferByte dataBufferA = (DataBufferByte)imageA.getRaster().getDataBuffer();
DataBufferByte dataBufferB = (DataBufferByte)imageB.getRaster().getDataBuffer();
if (dataBufferA.getNumBanks() != dataBufferB.getNumBanks()) {
return false;
}
for (int bank = 0; bank < dataBufferA.getNumBanks(); bank++) {
if (!Arrays.equals(dataBufferA.getData(bank), dataBufferB.getData(bank))) {
return false;
}
}
return true;
}
Note that you need to save the reference screenshot as a PNG. JPEG is a lossy format and will alter the pixels.
I'm building an Android app using Xamarin. The app needs to capture video from the camera and encode it to send across to a server.
Initially I used an encoder library on the server side to encode the recorded video, but it proved extremely unreliable and inefficient, especially for large video files. I have posted my issues on another thread here.
I then decided to encode the video on the client side and then send it to the server. I've found encoding to be a bit complicated, and there isn't much information available on how it can be done. So I searched for the only way I knew to encode a video: the FFmpeg codec. I've found some solutions. There's a project on GitHub that demonstrates how FFmpeg is used inside a Xamarin Android project. However, running the solution doesn't give any output. The project has a binary FFmpeg file which is installed to the phone directory using the code below:
_ffmpegBin = InstallBinary(XamarinAndroidFFmpeg.Resource.Raw.ffmpeg, "ffmpeg", false);
Below is the example code for encoding the video into a few different outputs:
_workingDirectory = Android.OS.Environment.ExternalStorageDirectory.AbsolutePath;
var sourceMp4 = "cat1.mp4";
var destinationPathAndFilename = System.IO.Path.Combine (_workingDirectory, "cat1_out.mp4");
var destinationPathAndFilename2 = System.IO.Path.Combine (_workingDirectory, "cat1_out2.mp4");
var destinationPathAndFilename4 = System.IO.Path.Combine (_workingDirectory, "cat1_out4.wav");
if (File.Exists (destinationPathAndFilename))
File.Delete (destinationPathAndFilename);
CreateSampleFile(Resource.Raw.cat1, _workingDirectory, sourceMp4);
var ffmpeg = new FFMpeg (this, _workingDirectory);
var sourceClip = new Clip (System.IO.Path.Combine(_workingDirectory, sourceMp4));
var result = ffmpeg.GetInfo (sourceClip);
var br = System.Environment.NewLine;
// There are callbacks based on Standard Output and Standard Error when ffmpeg binary is running as a process:
var onComplete = new MyCommand ((_) => {
RunOnUiThread(() =>_logView.Append("DONE!" + br + br));
});
var onMessage = new MyCommand ((message) => {
RunOnUiThread(() =>_logView.Append(message + br + br));
});
var callbacks = new FFMpegCallbacks (onComplete, onMessage);
// 1. The idea of this first test is to show that video editing is possible via FFmpeg:
// It results in a 150x150 movie that eventually zooms on a cat ear. This is desaturated, and there's a fade-in.
var filters = new List<VideoFilter> ();
filters.Add (new FadeVideoFilter ("in", 0, 100));
filters.Add(new CropVideoFilter("150","150","0","0"));
filters.Add(new ColorVideoFilter(1.0m, 1.0m, 0.0m, 0.5m, 1.0m, 1.0m, 1.0m, 1.0m));
var outputClip = new Clip (destinationPathAndFilename) { videoFilter = VideoFilter.Build (filters) };
outputClip.H264_CRF = "18"; // It's the quality coefficient for H264 - Default is 28. I think 18 is pretty good.
ffmpeg.ProcessVideo(sourceClip, outputClip, true, new FFMpegCallbacks(onComplete, onMessage));
//2. This is a similar version in command line only:
string[] cmds = new string[] {
"-y",
"-i",
sourceClip.path,
"-strict",
"-2",
"-vf",
"mp=eq2=1:1.68:0.3:1.25:1:0.96:1",
destinationPathAndFilename2,
"-acodec",
"copy",
};
ffmpeg.Execute (cmds, callbacks);
// 3. This lists codecs:
string[] cmds3 = new string[] {
"-codecs",
};
ffmpeg.Execute (cmds3, callbacks);
// 4. This converts to WAV
// Note that the cat movie just has some silent house noise.
ffmpeg.ConvertToWaveAudio(sourceClip, destinationPathAndFilename4, 44100, 2, callbacks, true);
I have tried different commands, but no output file is generated. I have also tried another project found here, but it has the same issue: I don't get any errors, yet no output file is generated. I'm really hoping someone can help me find a way to use FFmpeg in my project, or some other way to compress video for transport to the server.
I would really appreciate it if someone could point me in the right direction.
Just figured out how to get the output: add the permission to the AndroidManifest file.
android.permission.WRITE_EXTERNAL_STORAGE
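In a Xamarin.Android project the same permission can also be declared with an assembly-level attribute (typically in Properties/AssemblyInfo.cs), which gets merged into AndroidManifest.xml at build time:
using Android.App;

// Lets the ffmpeg process write its output file to external storage
[assembly: UsesPermission(Android.Manifest.Permission.WriteExternalStorage)]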
Please read the update on the repository: it says there is a second package, Xamarin.Android.MP4Transcoder, for Android 6.0 onwards.
Install NuGet https://www.nuget.org/packages/Xamarin.Android.MP4Transcoder/
await Xamarin.MP4Transcoder.Transcoder
    .For720pFormat()
    .ConvertAsync(inputFile, outputFile, f => {
        onProgress?.Invoke((int)(f * 100), 100); // f is the fraction completed (0..1)
    });
return outputFile;
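Wrapped up as a self-contained method, the fragment above might look like this (a sketch under assumptions: inputFile is a Java.IO.File, and the output is written next to the source file):
async Task<File> TranscodeTo720pAsync(File inputFile, Action<int, int> onProgress = null)
{
    var outputFile = new File(inputFile.CanonicalPath + ".720p.mp4");
    await Xamarin.MP4Transcoder.Transcoder
        .For720pFormat()
        .ConvertAsync(inputFile, outputFile, f => {
            onProgress?.Invoke((int)(f * 100), 100);
        });
    return outputFile;
}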
For previous Android versions:
Source code: https://github.com/neurospeech/xamarin-android-ffmpeg
Install-Package Xamarin.Android.FFmpeg
Use this as a template; it lets you log the output as well as calculate progress.
You can take a look at the source: on first use it downloads ffmpeg and verifies the SHA1 hash.
public class VideoConverter
{
    public VideoConverter()
    {
    }

    // Async because FFMpegLibrary.Run is awaited below
    public async Task<File> ConvertFile(Context context,
        File inputFile,
        Action<string> logger = null,
        Action<int, int> onProgress = null)
    {
        File outputFile = new File(inputFile.CanonicalPath + ".mpg");
        outputFile.DeleteOnExit();

        List<string> cmd = new List<string>();
        cmd.Add("-y");
        cmd.Add("-i");
        cmd.Add(inputFile.CanonicalPath);

        // Read the rotation flag so portrait clips stay upright after scaling
        MediaMetadataRetriever m = new MediaMetadataRetriever();
        m.SetDataSource(inputFile.CanonicalPath);
        string rotate = m.ExtractMetadata(Android.Media.MetadataKey.VideoRotation);
        int r = 0;
        if (!string.IsNullOrWhiteSpace(rotate))
        {
            r = int.Parse(rotate);
        }

        cmd.Add("-b:v");
        cmd.Add("1M");
        cmd.Add("-b:a");
        cmd.Add("128k");

        switch (r)
        {
            case 270:
                cmd.Add("-vf scale=-1:480,transpose=cclock");
                break;
            case 180:
                cmd.Add("-vf scale=-1:480,transpose=cclock,transpose=cclock");
                break;
            case 90:
                cmd.Add("-vf scale=480:-1,transpose=clock");
                break;
            case 0:
                cmd.Add("-vf scale=-1:480");
                break;
            default:
                break;
        }

        cmd.Add("-f");
        cmd.Add("mpeg");
        cmd.Add(outputFile.CanonicalPath);

        string cmdParams = string.Join(" ", cmd);

        int total = 0;
        int current = 0;

        await FFMpeg.Xamarin.FFMpegLibrary.Run(
            context,
            cmdParams,
            (s) =>
            {
                logger?.Invoke(s);

                // ffmpeg prints "Duration: hh:mm:ss.xx," once, then
                // "time=hh:mm:ss.xx bitrate=" lines while encoding
                int n = Extract(s, "Duration:", ",");
                if (n != -1)
                {
                    total = n;
                }
                n = Extract(s, "time=", " bitrate=");
                if (n != -1)
                {
                    current = n;
                    onProgress?.Invoke(current, total);
                }
            });

        return outputFile;
    }

    int Extract(String text, String start, String end)
    {
        int i = text.IndexOf(start);
        if (i != -1)
        {
            text = text.Substring(i + start.Length);
            i = text.IndexOf(end);
            if (i != -1)
            {
                text = text.Substring(0, i);
                return ParseTime(text);
            }
        }
        return -1;
    }

    // Converts "hh:mm:ss.xx" to hundredths of a second
    public static int ParseTime(String time)
    {
        time = time.Trim();
        String[] tokens = time.Split(':');
        int hours = int.Parse(tokens[0]);
        int minutes = int.Parse(tokens[1]);
        float seconds = float.Parse(tokens[2]);
        int s = (int)(seconds * 100);
        return hours * 360000 + minutes * 6000 + s;
    }
}
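A minimal usage sketch, assuming it is called from an async method of an Activity (which supplies the Context) and that recordedVideoPath is a hypothetical path to the captured clip:
var converter = new VideoConverter();
File compressed = await converter.ConvertFile(
    this,                        // the Activity serves as the Context
    new File(recordedVideoPath), // the clip captured from the camera
    logger: line => Android.Util.Log.Debug("ffmpeg", line),
    onProgress: (current, total) => Android.Util.Log.Debug("ffmpeg", current + "/" + total));
// 'compressed' now points at the 480p MPEG file, ready for upload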
I am trying to display the image with URL content://com.android.contacts/contacts/1/photo in an image element of an HTML page.
Although it's an old problem, can anyone please provide me a complete example of displaying a contact image in an img element using PhoneGap?
Thanks in advance,
prodeveloper.
I read a lot about this problem, and it seems to be solved as of Cordova 3.2.0.
Here is my code.
You do not need to create a temporary image or anything else.
var init = function () {
    var options = new ContactFindOptions();
    options.filter = "";     // empty search string returns all contacts
    options.multiple = true; // return multiple results
    var filter = ["displayName",
                  "phoneNumbers",
                  "photos"];
    navigator.contacts.find(filter, onSuccess, onError, options);
};
function onSuccess(contacts) {
    var contactPhoto;
    for (var i = 0; i < contacts.length; i++) {
        if (contacts[i].displayName && contacts[i].phoneNumbers) {
            contactPhoto = defaultvalue; // defaultvalue: your own placeholder image
            if (contacts[i].photos) {
                // take the first photo entry that has a value
                for (var j = 0; j < contacts[i].photos.length; j++) {
                    if (contacts[i].photos[j].value) {
                        contactPhoto = contacts[i].photos[j].value;
                        break;
                    }
                }
            }
            showContactsModel.Contacts.add({
                displayName: contacts[i].displayName,
                phoneNumbers: contacts[i].phoneNumbers,
                photo: contactPhoto
            });
        }
    }
}
And my binding:
<img data-bind="attr:{src: photo}" alt="something.png" />
You can directly use
<img src="content://com.android.contacts/contacts/1502/photo" alt="Oops!!">
once you have the URL of the image!