I am working on an iOS & Android application using Xamarin, where I am trying to implement silent printing. Currently I am using the following native code, which always shows a dialog during printing.
Android print code using a WebView:
public void Print(WebView viewToPrint)
{
var droidViewToPrint = Platform.CreateRenderer(viewToPrint).ViewGroup.GetChildAt(0) as Android.Webkit.WebView;
if (droidViewToPrint != null)
{
// Only valid for API 19+
var version = Android.OS.Build.VERSION.SdkInt;
if (version >= Android.OS.BuildVersionCodes.Kitkat)
{
var printMgr = (PrintManager)Forms.Context.GetSystemService(Context.PrintService);
printMgr.Print("Forms-EZ-Print", droidViewToPrint.CreatePrintDocumentAdapter(), null);
}
}
}
iOS print code for text:
public void Print()
{
var printInfo = UIPrintInfo.PrintInfo;
printInfo.JobName = "My first Print Job";
printInfo.OutputType = UIPrintInfoOutputType.General;
var textFormatter = new UISimpleTextPrintFormatter("Once upon a time...")
{
StartPage = 0,
MaximumContentWidth = 6 * 72,
PerPageContentInsets = new UIEdgeInsets(72, 72, 72, 72),
};
var printer = UIPrintInteractionController.SharedPrintController;
printer.PrintInfo = printInfo;
printer.PrintFormatter = textFormatter;
printer.ShowsPageRange = true;
printer.Present(true, (handler, completed, error) =>
{
if (!completed && error != null)
{
Console.WriteLine($"Error: {error.LocalizedDescription ?? ""}");
}
});
printInfo.Dispose();
textFormatter.Dispose();
}
Could someone please let me know how I can print silently from an iOS and Android application, without any print preview or dialog?
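One possible direction, sketched below: on iOS 8+ there is UIPrintInteractionController.PrintToPrinter, which sends a job to an already-known UIPrinter without presenting the print dialog; on Android, as far as I know, PrintManager always shows the system print UI, so silent printing there generally means a vendor SDK or talking to the printer directly. A rough Xamarin.iOS sketch (the printer URL is a made-up placeholder; in practice you would pick the printer once via UIPrinterPickerController and persist its Url):
// Sketch only (Xamarin.iOS, iOS 8+); requires the Foundation and UIKit namespaces.
public void PrintSilently(UIPrintFormatter formatter)
{
    var printInfo = UIPrintInfo.PrintInfo;
    printInfo.OutputType = UIPrintInfoOutputType.General;
    printInfo.JobName = "Silent print job";

    var controller = UIPrintInteractionController.SharedPrintController;
    controller.PrintInfo = printInfo;
    controller.PrintFormatter = formatter;

    // Placeholder URL: obtain a real UIPrinter once via UIPrinterPickerController and store its Url.
    var printer = UIPrinter.FromUrl(new NSUrl("ipp://192.168.0.10:631/printers/office"));
    printer.ContactPrinter(available =>
    {
        if (!available)
            return;

        // PrintToPrinter runs the job without showing the print interaction dialog.
        controller.PrintToPrinter(printer, (handler, completed, error) =>
        {
            if (!completed && error != null)
                Console.WriteLine($"Error: {error.LocalizedDescription}");
        });
    });
}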
I have written a Dart web app that retrieves .mp3 files from a server and plays them back; I am trying to write a mobile version using Flutter. I know dart:web_audio is the main option for a web app, but Flutter can't find it in my SDK. I know it's there, because I can compile the following to JavaScript:
import 'dart:html';
import 'dart:convert';
import 'dart:web_audio';
AudioContext audioContext;
main() async {
audioContext = new AudioContext();
var ul = (querySelector('#songs') as UListElement);
var signal = await HttpRequest.getString('http://10.0.0.6:8000/api/filelist');
// Map json = JSON.decode(signal);
// for (Map file in json['songs']) {
print("signal: $signal");
Map json = JSON.decode(signal);
for (Map file in json['songs']) {
var li = new LIElement()
..appendText(file['title']);
var button = new ButtonElement();
button.setAttribute("id", "#${file['file']}");
button.appendText("Play");
li.append(button);
new Song(button, file['file']);
ul.append(li);
}
}
class Song {
ButtonElement button;
bool _playing = false;
// AudioContext _audioContext;
AudioBufferSourceNode _source;
String title;
Song(this.button, this.title) {
button..onClick.listen((e) => _toggle());
}
_toggle() {
_playing = !_playing;
_playing ? _start() : _stop();
}
_start() {
return HttpRequest
.request("http://10.0.0.6:8000/music/$title", responseType: "arraybuffer")
.then((HttpRequest httpRequest) {
return audioContext
.decodeAudioData(httpRequest.response)
.then((AudioBuffer buffer) {
_source = audioContext.createBufferSource();
_source.buffer = buffer;
_source.connectNode(audioContext.destination);
_source.start(0);
button.text = "Stop";
_source.onEnded.listen((e){
_playing = false;
button.text = "Play";
});
});
});
}
_stop() {
_source.stop(0);
button.text = "Play";
}
}
How would I rewrite the dart:web_audio parts of my code for a Flutter app? Can Flutter access MediaPlayer? And if so, how would I refer to it in pubspec.yaml?
As raju-bitter noted above, Flutter used to provide some built-in audio wrappers in its core engine, but those have since been removed: https://github.com/flutter/flutter/issues/1364.
Apps built with Flutter are just iOS or Android apps, so it is possible to do anything the underlying iOS/Android platform can do by dropping down to some Java or Obj-C code, following the hello_services model (https://github.com/flutter/flutter/tree/master/examples/hello_services). This model is documented at https://flutter.io/platform-services. It's not nearly as easy as we'd like it to be yet; many improvements are coming soon.
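These days that native bridge is exposed as platform channels; a minimal Dart-side sketch (the channel and method names are invented, and the matching Java/Obj-C handler that actually drives MediaPlayer/AVAudioPlayer is not shown):
import 'package:flutter/services.dart';

// Invented channel/method names; a handler registered on the same channel in the
// host app's Java/Obj-C code performs the actual playback.
const MethodChannel _audioChannel = MethodChannel('samples.example/audio');

Future<void> playRemote(String url) async {
  try {
    await _audioChannel.invokeMethod('play', {'url': url});
  } on PlatformException catch (e) {
    print('Native playback failed: ${e.message}');
  }
}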
I know it's 4 years late, but I have found the audioplayers package, which can be used as follows:
import 'package:audioplayers/audio_cache.dart';
import 'package:audioplayers/audioplayers.dart';
//Call this function from an event
void playRemoteFile() {
AudioPlayer player = new AudioPlayer();
player.play("https://luan.xyz/files/audio/ambient_c_motion.mp3");
}
I built an application that records sound on the desktop using ActionScript 3. Now I am converting it to an Android application, but there is a problem: the SampleDataEvent.SAMPLE_DATA event doesn't receive any data to record.
Here is the code:
private var _microphone:Microphone;
private var _buffer:ByteArray = new ByteArray();
private var _difference:uint;
public function record():void
{
if ( _microphone == null )
_microphone = Microphone.getMicrophone();
_difference = getTimer();
_microphone.setSilenceLevel(_silenceLevel, _timeOut);
_microphone.gain = _gain;
_microphone.rate = _rate;
_buffer.length = 0;
_microphone.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
_microphone.addEventListener(StatusEvent.STATUS, onStatus);
}
private function onSampleData(event:SampleDataEvent):void
{
_recordingEvent.time = getTimer() - _difference;
dispatchEvent( _recordingEvent );
var buteData:Number;
while(event.data.bytesAvailable > 0)
{
buteData = event.data.readFloat();
_buffer.writeFloat(buteData);
soundBytes.writeFloat( buteData);
}
}
Can anyone help here?
Thanks!
I think you may not have checked the AIR for Android application settings. If the RECORD_AUDIO permission is not checked, you should enable it.
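If you prefer editing the descriptor by hand instead of relying on the IDE checkbox, the same permission goes into the <android>/<manifestAdditions> section of the -app.xml application descriptor, roughly like this:
<!-- AIR application descriptor (YourApp-app.xml) -->
<android>
    <manifestAdditions><![CDATA[
        <manifest>
            <uses-permission android:name="android.permission.RECORD_AUDIO"/>
        </manifest>
    ]]></manifestAdditions>
</android>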
How to pull to refresh?
In Titanium Appcelerator I need to show a list of content in a TableView, and it needs to refresh when I pull down on the view. I have this working on iPhone, but on Android it doesn't work. Can anyone please help me solve this problem on Android?
My Android code:
tableView.addEventListener('scroll',function(e)
{
var offset = e.contentOffset.y;
if (offset < -65.0 && !pulling && !reloading)
{
var t = Ti.UI.create2DMatrix();
t = t.rotate(-180);
pulling = true;
arrow.animate({transform:t,duration:180});
statusLabel.text = "Release to refresh...";
}
else if((offset > -65.0 && offset < 0 ) && pulling && !reloading)
{
pulling = false;
var t = Ti.UI.create2DMatrix();
arrow.animate({transform:t,duration:180});
statusLabel.text = "Pull down to refresh...";
}
});
tableView.addEventListener('dragEnd', function(e)
{
if(pulling && !reloading)
{
reloading = true;
pulling = false;
arrow.hide();
actInd.show();
statusLabel.text = "Reloading...";
tableView.setContentInsets({top:60},{animated:true});
tableView.scrollToTop(-60,true);
arrow.transform=Ti.UI.create2DMatrix();
beginReloading();
}
});
Titanium now supports pull to refresh for BOTH Android (> v6.2.0) and iOS (>3.2.0) with a Titanium.UI.TableView, Titanium.UI.ListView or Titanium.UI.ScrollView object.
See the docs:
https://docs.appcelerator.com/platform/latest/#!/api/Titanium.UI.ListView
https://docs.appcelerator.com/platform/latest/#!/api/Titanium.UI.RefreshControl
Sample code taken from the docs:
var win = Ti.UI.createWindow({
fullscreen:true
});
var counter = 0;
function genData() {
var data = [];
for (var i=1; i<=3; i++) {
data.push({properties:{title:'ROW '+(counter+i)}})
}
counter += 3;
return data;
}
var section = Ti.UI.createListSection();
section.setItems(genData());
var control = Ti.UI.createRefreshControl({
tintColor:'red'
})
var listView = Ti.UI.createListView({
sections:[section],
refreshControl:control
});
control.addEventListener('refreshstart',function(e){
Ti.API.info('refreshstart');
setTimeout(function(){
Ti.API.debug('Timeout');
section.appendItems(genData());
control.endRefreshing();
}, 2000);
})
win.add(listView);
win.open();
Is this just the iOS code from the Kitchen Sink example?
There are a couple of attempts at getting this working on Android, though I haven't confirmed that any of them work as expected. From what I understand, the problem is that you can't get the offset the same way on Android as on iOS.
A quick Google search turned up this link, which was referenced from the official Appcelerator forums.
https://gist.github.com/903895
var style1 = document.createElement("link");
style1.id = "rel";
style1.rel = "stylesheet";
style1.href = "http://www.mysite.com/css.css";
style1.onload = function(){document.body.innerHTML+="fffffff";};
document.getElementsByTagName("head")[0].appendChild(style1);
This code works in Chrome/Firefox, and yet the stock browsers on my Android 2.3 and 4.1 devices show nothing. What's the problem? I'd like to be able to execute some JS when a link element loads; anything else would, in my case, amount to a hack. :/
The problem isn't innerHTML. Try it with alerts if you want (at your own peril).
Another answer mentions checking for this functionality by doing
var huh = 'onload' in document.createElement('link');
..but this is true in both stock browsers! wtf guys?
The Android stock browser doesn't support the "onload" / "onreadystatechange" events for <link> elements: http://pieisgood.org/test/script-link-events/
But it still reports:
"onload" in link === true
So my solution is to detect the Android browser from the userAgent and then wait for some known CSS rule from your stylesheet to take effect (e.g., the reset of the "body" margins).
If it's not the Android browser and it supports the "onload" event, we use it:
var userAgent = navigator.userAgent,
iChromeBrowser = /CriOS|Chrome/.test(userAgent),
isAndroidBrowser = /Mozilla\/5.0/.test(userAgent) && /Android/.test(userAgent) && /AppleWebKit/.test(userAgent) && !iChromeBrowser;
addCssLink('PATH/NAME.css', function(){
console.log('css is loaded');
});
function addCssLink(href, onload) {
var css = document.createElement("link");
css.setAttribute("rel", "stylesheet");
css.setAttribute("type", "text/css");
css.setAttribute("href", href);
document.head.appendChild(css);
if (onload) {
if (isAndroidBrowser || !("onload" in css)) {
waitForCss({
success: onload
});
} else {
css.onload = onload;
}
}
}
// We check for the CSS reset on the "body" element; once it has been applied, the CSS is loaded
function waitForCss(params) {
var maxWaitTime = 1000,
stepTime = 50,
alreadyWaitedTime = 0;
function nextStep() {
var startTime = +new Date(),
endTime;
setTimeout(function () {
endTime = +new Date();
alreadyWaitedTime += (endTime - startTime);
if (alreadyWaitedTime >= maxWaitTime) {
params.fail && params.fail();
} else {
// check for the style; if it is not applied yet, schedule another check
if (window.getComputedStyle(document.body).marginTop === '0px') {
params.success();
} else {
nextStep();
}
}
}, stepTime);
}
nextStep();
}
Demo: http://codepen.io/malyw/pen/AuCtH
Can anyone point me to an example of taking a photo and storing it using MVVMCross?
I have been searching but have only found these:
Monodroid Take a picture with Camera (doesn't implement MVVMCross)
Video Recording (it's video and I can't make it work :S)
The Official Recipe Example (it works but does not implement MVVMCross)
Thanks!!!
Resolved! Thanks!
For future reference (using the master branch):
Credit to Stuart; I just changed the code to fit my situation.
using System;
using System.IO;
using System.Runtime.InteropServices;
using Cirrious.MvvmCross.ExtensionMethods;
using Cirrious.MvvmCross.Interfaces.Platform.Tasks;
using Cirrious.MvvmCross.Interfaces.ServiceProvider;
using SIGEP.DummyService;
using SIGEP.Mobile.Core.Interfaces;
namespace SIGEP.Mobile.Core.Models
{
public class PhotoService : IMvxServiceConsumer<IMvxPictureChooserTask>
{
private const int MaxPixelDimension = 1024;
private const int DefaultJpegQuality = 92;
public void GetNewPhoto()
{
this.GetService<IMvxPictureChooserTask>().TakePicture(
MaxPixelDimension,
DefaultJpegQuality,
HandlePhotoAvailable,
() => { /* cancel is ignored */ });
}
public event EventHandler<PhotoStreamEventArgs> PhotoStreamAvailable;
private void HandlePhotoAvailable(Stream pictureStream)
{
var handler = PhotoStreamAvailable;
if (handler != null)
{
handler(this, new PhotoStreamEventArgs() { PictureStream = pictureStream, OnSucessGettingPhotoFileName = OnSucessGettingPhotoFileName });
}
}
public static void TakePhoto(Action<string> successFileName, Action<Exception> error)
{
var service = new PhotoService();
service.OnSucessGettingPhotoFileName = successFileName;
service.OnError = error;
// Subscribe before starting the picture chooser so the callback cannot be missed.
service.PhotoStreamAvailable += new EventHandler<PhotoStreamEventArgs>(service_PhotoStreamAvailable);
service.GetNewPhoto();
}
static void service_PhotoStreamAvailable(object sender, PhotoStreamEventArgs e)
{
// write the picture stream out to a file
var directory = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
var filename = Path.Combine(directory, "photo.jpeg");
string saveTo = filename;
FileStream writeStream = new FileStream(saveTo, FileMode.Create, FileAccess.Write);
ReadWriteStream(e.PictureStream, writeStream);
e.OnSucessGettingPhotoFileName(filename);
}
private static void ReadWriteStream(Stream readStream, Stream writeStream)
{
int Length = 256;
Byte[] buffer = new Byte[Length];
int bytesRead = readStream.Read(buffer, 0, Length);
// write the required bytes
while (bytesRead > 0)
{
writeStream.Write(buffer, 0, bytesRead);
bytesRead = readStream.Read(buffer, 0, Length);
}
readStream.Close();
writeStream.Close();
}
public Action<string> OnSucessGettingPhotoFileName { get; set; }
public Action<Exception> OnError { get; set; }
}
[Serializable]
[ComVisible(true)]
public class PhotoStreamEventArgs : EventArgs
{
public Stream PictureStream { get; set; }
public Action<string> OnSucessGettingPhotoFileName { get; set; }
}
}
I generally implement a service using the built-in IMvxPictureChooserTask (this is in a Plugin if using vNext):
using Cirrious.MvvmCross.ExtensionMethods;
using Cirrious.MvvmCross.Interfaces.Platform.Tasks;
using Cirrious.MvvmCross.Interfaces.ServiceProvider;
public class PhotoService
: IMvxServiceConsumer<IMvxPictureChooserTask>
, IPhotoService
{
private const int MaxPixelDimension = 1024;
private const int DefaultJpegQuality = 92;
public void GetNewPhoto()
{
Trace.Info("Get a new photo started.");
this.GetService<IMvxPictureChooserTask>().TakePicture(
MaxPixelDimension,
DefaultJpegQuality,
HandlePhotoAvailable,
() => { /* cancel is ignored */ });
}
public event EventHandler<PhotoStreamEventArgs> PhotoStreamAvailable;
private void HandlePhotoAvailable(Stream pictureStream)
{
Trace.Info("Picture available");
var handler = PhotoStreamAvailable;
if (handler != null)
{
handler(this, new PhotoStreamEventArgs() { PictureStream = pictureStream });
}
}
}
I generally register this service as a singleton during startup, and then call it from a ViewModel ICommand handler.
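A rough sketch of that wiring (the names are illustrative, and the registration and command helpers differ between MvvmCross versions, so treat this as a shape rather than an exact API):
// Illustrative only: vNext used MvxRelayCommand and RegisterServiceInstance,
// while later versions use MvxCommand and Mvx.RegisterSingleton<IPhotoService>(...).
public class EditProductViewModel : MvxViewModel
{
    private readonly IPhotoService _photoService;

    public EditProductViewModel(IPhotoService photoService)
    {
        _photoService = photoService;
    }

    public System.Windows.Input.ICommand TakePhotoCommand
    {
        get { return new MvxCommand(_photoService.GetNewPhoto); }
    }
}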
One app which uses this service is the Blooor sample - see BaseEditProductViewModel.cs - this isn't a sample I had anything to do with, but I believe it brings in both Picture taking and ZXing - both using external services.
One warning: On MonoDroid, you can see some strange/unexpected Activity/ViewModel lifecycle behaviour - basically you can see that the Activity you take the photo from is unloaded/wiped from memory during the photo taking. If this happens to your app then you'll probably need to start looking at questions like: Saving Android Activity state using Save Instance State - this isn't automatically handled in MvvmCross (yet).
I believe the Blooor sample might suffer from this issue - but whether a user would ever see it in normal app use is debatable.
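If the lifecycle issue does bite, the usual Android-side mitigation is to persist whatever you need across the restart in OnSaveInstanceState; a bare sketch on the Activity (the field and key names are made up):
// Inside the Xamarin.Android Activity that triggers the photo task; names are illustrative.
private string _pendingPhotoFileName;

protected override void OnSaveInstanceState(Bundle outState)
{
    base.OnSaveInstanceState(outState);
    outState.PutString("pending_photo_file", _pendingPhotoFileName);
}

protected override void OnCreate(Bundle bundle)
{
    base.OnCreate(bundle);
    if (bundle != null)
        _pendingPhotoFileName = bundle.GetString("pending_photo_file");
}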
As an alternative to the IMvxPictureChooserTask service, you can also look at using some of the cross-platform APIs from Xamarin.Mobile - see MvvmCross vnext : monodroid use a VideoView inside a plugin for a possible starting place - or for Android only you can easily implement your own.
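For the "implement your own" route on Android, the usual building block is the camera capture intent; a rough Xamarin.Android sketch (the request code and result handling are illustrative, and without an EXTRA_OUTPUT file the camera only hands back a small thumbnail):
// Inside an Activity; uses Android.Content, Android.Provider and Android.Graphics.
private const int TakePhotoRequestCode = 1001; // arbitrary value

private void StartCamera()
{
    var intent = new Intent(MediaStore.ActionImageCapture);
    StartActivityForResult(intent, TakePhotoRequestCode);
}

protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
    base.OnActivityResult(requestCode, resultCode, data);
    if (requestCode == TakePhotoRequestCode && resultCode == Result.Ok && data != null)
    {
        // Small preview bitmap; pass EXTRA_OUTPUT with a file Uri to get the full-size image instead.
        var thumbnail = data.Extras.Get("data") as Android.Graphics.Bitmap;
        // Hand the bitmap (or the saved file) on to the ViewModel / photo service.
    }
}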