I'm trying to replicate this Raspberry Pi example on Android.
I created an Android app to advertise a URL using mDNS (the jmDNS library). The app works well, and I can receive the message on another phone using the ZeroConf Browser app.
But when I try to receive the same message using the Physical Web app, nothing happens: the app doesn't find the service.
I believe the problem is in the way I send hostname and txt-records.
This is my code:
serviceInfo = ServiceInfo.create(type,
        "www.google.github.io", 80,
        "path=/physical-web/");
/* A key/value map that can be advertised with the service */
serviceInfo.setText(getDeviceDetailsMap());
jmdns.registerService(serviceInfo);
Can you help me understand what is wrong?
See these discussions:
https://github.com/openhab/jmdns/issues/25
https://github.com/google/physical-web/issues/414
In short, I think the issue is that the URL is in a TXT record rather than in the service name. Note, though, that the Physical Web project may change the required format in the future; its mDNS support is still developing.
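Putting the URL into the service instance name would look roughly like the following with jmDNS. This is only a sketch based on my reading of those threads: the service type and the exact name format are my assumptions, not a format the Physical Web project documents.
import javax.jmdns.JmDNS;
import javax.jmdns.ServiceInfo;

JmDNS jmdns = JmDNS.create();
// Assumption: the URL goes into the service instance name, with no TXT payload
ServiceInfo info = ServiceInfo.create(
        "_http._tcp.local.",                      // service type - illustrative only
        "https://google.github.io/physical-web/", // URL as the instance name
        80,
        "");
jmdns.registerService(info);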
I'm using a local server on my Android device to cast images and MP4s to Chromecast.
I tried both NanoHttpD and AndroidAsyncHttpServer (GH repo: https://github.com/MikeMitterer/android-cromecastsample/tree/master/src/main/java/at/mikemitterer/mobile/chromecastsample/cast), but if I serve the files from one of these local servers I get Status{statusCode=unknown status code: 2100, resolution=null}.
Serving the files from an external server works, at least with an MP4 file.
This is the server I'm currently using: https://github.com/MikeMitterer/android-cromecastsample/blob/master/src/main/java/at/mikemitterer/mobile/chromecastsample/cast/CastAsyncFileServer.java
The Activity that initializes everything: https://github.com/MikeMitterer/android-cromecastsample/blob/master/src/main/java/at/mikemitterer/mobile/chromecastsample/ui/MainActivity.java
and my buildMediaInfo-function: https://github.com/MikeMitterer/android-cromecastsample/blob/master/src/main/java/at/mikemitterer/mobile/chromecastsample/ui/MainActivity.java#L317-L341
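A buildMediaInfo-style helper typically has this shape (a generic sketch, not a copy of the linked code; the URL and title are placeholders). One general Cast requirement worth noting: the Chromecast itself must be able to reach the content URL over the LAN, i.e. a local server has to be addressed by the phone's Wi-Fi IP, not by localhost/127.0.0.1.
import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaMetadata;

// Sketch only: url must be reachable by the Chromecast over the LAN,
// e.g. "http://192.168.0.10:8080/video.mp4" (placeholder address)
private MediaInfo buildMediaInfo(String url, String title) {
    MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
    metadata.putString(MediaMetadata.KEY_TITLE, title);
    return new MediaInfo.Builder(url)
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            .setContentType("video/mp4") // must match the actual file type
            .setMetadata(metadata)
            .build();
}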
This is what the sample app looks like:
Clicking on one of the images triggers loadRemoteMedia -> buildMediaInfo.
I'm using a Chromecast V1 (old) device on a 22" monitor (no audio, HDMI-to-DVI converter). Streaming with apps other than mine works. I'm using the sample app ID (4F8B3483) for testing.
BTW (my second problem): if I use my own registered ID, Chromecast connects fine but automatically disconnects after approximately 20 seconds...
I'm a bit lost - please help!
[Update]
Thanks to @Ali Naddaf I re-registered my application (not the device!). I changed from "Remote Display Receiver" to "Styled Media Receiver", waited ~15 minutes, and turned the Chromecast off and on. Now debugging works, and my second problem (Chromecast automatically disconnecting after ~20 seconds) is also solved. Cool!
If you check my screenshot you can see that my image is set as the background of the video tag. Is this the way it should work, or am I doing something wrong?
I'm creating an Android application that interfaces with the Texas Instruments SensorTag. One of the things the app needs to do is change the frequency at which the temperature is reported to the app. I am able to change it through the official TI app, which is great, but I cannot seem to get it working in my app.
When viewing the services with the official app (the iOS one; I can't run the Android one), it shows the temperature GATT service, which contains 3 characteristics. When I inspect the characteristics discovered by my app, however, it only seems to find two: the data and the notifications, but not the interval. I have attempted to construct this characteristic myself and write it, but it doesn't do anything - no error, no success, just nothing.
The steps I've taken are essentially:
bluetoothGatt.discoverServices();
// ... wait for the onServicesDiscovered() callback ...
List<BluetoothGattService> services = bluetoothGatt.getServices();
// ...
BluetoothGattService service = bluetoothGatt.getService(serviceUUID);
System.out.println("Characteristic = " + service.getCharacteristic(SensorTagGatt.UUID_IRT_PERI));
The output yields null. Is there something obvious I'm missing or that I should be doing that I might not be?
EDIT:
I've installed another app on the phone, written by another developer, and using it to inspect the available services and characteristics shows that it too is unable to find the characteristic, so I'm assuming there is something wrong with Android's service discovery. The official iOS app works as expected and shows all characteristics. Unfortunately, the official Android app seems to be incompatible with the version 1.5 firmware and crashes when trying to connect, but I assume it too would fail to find the characteristic.
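One workaround I'm planning to try, based on reports that Android caches GATT discovery results (so a firmware update on the peripheral can leave the cache stale and hide characteristics), is the hidden BluetoothGatt refresh() method. It is not part of the public API, so this reflection-based sketch may not work on every Android version:
import android.bluetooth.BluetoothGatt;
import java.lang.reflect.Method;

// Clears Android's cached GATT attribute table via the hidden refresh() method.
// Call this before discoverServices(); not public API, so it may fail silently.
private boolean refreshGattCache(BluetoothGatt gatt) {
    try {
        Method refresh = gatt.getClass().getMethod("refresh");
        return (Boolean) refresh.invoke(gatt);
    } catch (Exception e) {
        return false;
    }
}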
Has anyone else run into this issue and, if so, been able to get around it?
I am on a new project which requires me to use a USB-connected webcam.
The whole thing should run on Android 4.4.
My little story so far:
I tried multiple apps which do this - all work on both my test devices.
I tried adapting an NDK lib that directly uses /dev/video0. This didn't work because read permission was not granted in a new File("/dev/video0").canRead() check. Although my Unix permissions are correct, this seems to fail due to some new check in Android 4.4. (The whole thing was suggested here: Connect Android phone to a USB Web camera)
Next, I discovered the UsbAccessory API, which supposedly eases a lot of the above.
I found no documentation or anything about how to correctly handle a webcam with it.
I'm still trying, but don't get further than finding no device via
usbManager.getAccessoryList();
I've also tried to discover devices by filtering for a USB_ATTACHED broadcast, but nothing triggers.
So I'm starting to ask myself: how on earth do others find these devices and communicate with them to get the pictures?
Does anyone have sources I could learn from, or a tutorial or something?
A little update from my side:
- I've gained access by using the Android USB Host API, i.e. UsbDevice instead of UsbAccessory.
- I have the connection and everything set up fine, and can now send binary data to my webcam and supposedly receive it.
I can now send control commands via connection.controlTransfer(...) or use a UsbRequest in order to receive data.
However, I couldn't find any documentation on how to make the camera submit pictures to me. My endpoint is of type XFER_INT (interrupt).
I am continuing to try sending out various commands (i.e. binary values) but haven't had any success so far.
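For reference, a rough sketch of the enumeration and setup I described above (simplified; the interface index and the permission flow are assumptions about a typical setup):
import android.content.Context;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbDeviceConnection;
import android.hardware.usb.UsbManager;
import android.util.Log;

private void listUsbDevices(Context context) {
    UsbManager usbManager = (UsbManager) context.getSystemService(Context.USB_SERVICE);
    // Enumerate attached USB devices (host mode)
    for (UsbDevice device : usbManager.getDeviceList().values()) {
        Log.d("USB", "found " + device.getDeviceName()
                + " vid=" + device.getVendorId() + " pid=" + device.getProductId());
        // After the user has granted permission via usbManager.requestPermission(...):
        UsbDeviceConnection connection = usbManager.openDevice(device);
        if (connection != null) {
            // Interface index 0 is an assumption; a real webcam may expose several
            connection.claimInterface(device.getInterface(0), true);
            // From here, connection.controlTransfer(...) / UsbRequest can be used
        }
    }
}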
I am using the Tizen SDK for Wearable from the samsung-gear site in order to let a provider Android application communicate with a Samsung Gear 2 device. I am able to send notifications to the Gear, and once I run the consumer application on the Gear 2, I am able to transfer data between the watch and my Android phone as well.
What I am trying to do is check, from within the Android application, whether the phone is paired with a Gear 2 - something as simple as creating a communication object using the accessory service and calling a method like isPaired():
CommunicationObject commObject = new CommunicationObject(/* communication parameters */);
// I am assuming some connection call like commObject.connect() should be invoked first,
// where I can check for its result afterwards, such as:
if (commObject.isPaired()) {
    // do something
}
I think SDK examples such as the consumer/provider applications they provide on their site already assume that the devices are paired; hence they show how to transfer data between the phone and the Gear watch. Yet I am looking for something as simple as asking the phone whether it is paired with a Gear device, which should be the prerequisite for transferring data and which, I believe, is currently done automatically by Samsung Gear Manager.
Note: for the example provider/consumer applications, one can just check whether any connection is available using the code in them. But the data-transfer connection is enabled only when I manually start the consumer app from the Gear device; otherwise it acts as if the Gear device is not paired even though it is.
I believe this is not the most popular topic these days, so I will post what I came up with as an answer, although I doubt anyone will need it. Without being perfect, it's the closest I could get to my goal using the available documentation.
I should also mention that this slide helped me stay on track as well.
In my solution, there must be an active 2-way connection between the Gear widget (consumer/.wgt) and the host-side application (provider/.apk), as in the example application provided by Samsung (HelloAccessory), at all times - or at least while I want to check the pairing condition. The documentation refers to it as:
Hello Gear is a simple application that consists of:
Host-side application (provider): HelloAccessoryProvider.apk
Wearable-side application (consumer): HelloAccessoryConsumer.wgt (web app)
Note that both sides have some XML configuration, and Android requires specific permissions; both are explained in detail in the Hello Gear documentation.
This 2-way communication is provided by the Samsung Accessory Framework on the network layer (through the Samsung Accessory Protocol, SAP), given that both sides implement the same Accessory Service Profile, again configured via the XML files on both ends (service name, channel id, etc.).
The Android side implements the protocol as a service extending the SAAgent abstract class. The widget on the Gear side (.wgt) can then invoke the SAAgent callbacks, and provider/consumer communication is handled through SASocket objects claimed on both ends over the channel predefined in the XML configuration files.
Please note that this communication has to be initialized on both ends. In my case I had to open the widget application once on the Gear (I believe there should be a way to start the Gear widget via an intent or notification, but I could not find one yet) after the Android application had started. Here, "started" means that the SAAgent service is up and bound to an Activity, eligible to receive callbacks and able to send state messages to the rest of the application via broadcasts. The number of active connections, or any data transmission between the Gear socket and the Android application, can be communicated this way.
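As a small illustration of that last point, the SAAgent service can report its state to the rest of the app with an ordinary broadcast; the action string and extra name below are made up for the example:
import android.content.Intent;
import android.support.v4.content.LocalBroadcastManager;

// Inside the SAAgent subclass: report the connection state to interested
// Activities. activeConnectionCount is an int field the service maintains
// (illustrative); action and extra names are placeholders.
Intent stateIntent = new Intent("com.example.gear.CONNECTION_STATE");
stateIntent.putExtra("activeConnections", activeConnectionCount);
LocalBroadcastManager.getInstance(this).sendBroadcast(stateIntent);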
Note that if you don't have to transfer data between the Gear widget and the Android application, you may be fine with notifications alone. The only requirement for sending notifications to the Gear from an Android application seems to be that the Gear is paired with your phone and connected via Bluetooth. Then you can just send an intent, as explained in more detail here in Section 6. All you need should be the permission:
com.samsung.wmanager.ENABLE_NOTIFICATION
and some metadata definitions in your AndroidManifest.xml file, explained in the same section:
<meta-data
    android:name="master_app_packagename"
    android:value="com.example.gearMasterApp" />
<meta-data
    android:name="app_notification_maxbyte"
    android:value="300" />
And here is the sample code for the intent that sends a notification to the Gear:
public static final String ALERT_NOTIFICATION =
        "com.samsung.accessory.intent.action.ALERT_NOTIFICATION_ITEM";
public static final int NOTIFICATION_SOURCE_API_SECOND = 3;

Bitmap bitmapImg;

// Put the data into an Intent
Intent myIntent = new Intent(ALERT_NOTIFICATION);
myIntent.putExtra("NOTIFICATION_PACKAGE_NAME", "com.example.gearApp");
myIntent.putExtra("NOTIFICATION_VERSION", NOTIFICATION_SOURCE_API_SECOND);
myIntent.putExtra("NOTIFICATION_TIME", System.currentTimeMillis());
myIntent.putExtra("NOTIFICATION_MAIN_TEXT", "Title Text");
myIntent.putExtra("NOTIFICATION_TEXT_MESSAGE", "Body text");

byte[] byteArray = convertResizeBitmapToByte(bitmapImg);
myIntent.putExtra("NOTIFICATION_APP_ICON", byteArray);
myIntent.putExtra("NOTIFICATION_LAUNCH_INTENT", "com.example.gearMasterApp");
myIntent.putExtra("NOTIFICATION_LAUNCH_TOACC_INTENT", "com.example.gearSideApp");
sendBroadcast(myIntent);

// Scales the icon bitmap down and compresses it to PNG bytes
public byte[] convertResizeBitmapToByte(Bitmap bitmap) {
    Bitmap scBitmap = Bitmap.createScaledBitmap(bitmap, 75, 75, false);
    ByteArrayOutputStream byteArrayStream = new ByteArrayOutputStream();
    scBitmap.compress(Bitmap.CompressFormat.PNG, 50, byteArrayStream);
    return byteArrayStream.toByteArray();
}
Once the notification is read on the Gear side, you can receive the intent action along with some optional parameters:
Intent action:
"com.samsung.accessory.intent.action.UPDATE_NOTIFICATION_ITEM"
This could be another approach to checking for active communication between the Gear and your phone, but there is no guarantee that the notification will ever be read, and my use case required keeping the Gear communication optional so that the Android application can continue its tasks even when there is no active connection with the Gear.
Regarding the original question, where I asked for a way to detect whether the Gear is paired: I tried listing the paired Bluetooth devices using the getBondedDevices() method of Android's BluetoothAdapter, but it reports the Gear as paired even when the Gear is turned off, which was not enough for my needs and did not seem logical to me. It is accurate, though, once the device is turned back on.
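For completeness, that bonding check looks roughly like this (matching on "Gear" in the device name is an assumption about how Gear Manager names the device, and, as said above, a bonded device is not necessarily reachable):
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;

BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
for (BluetoothDevice device : adapter.getBondedDevices()) {
    // "Gear" in the device name is an assumption, not a documented contract
    if (device.getName() != null && device.getName().contains("Gear")) {
        // The Gear is bonded (paired), but may be off or out of range
    }
}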
I'm happy with the above solution since it was enough for my needs, so I will accept my own answer.
I want to build an Android application in which the Android device works as an AirPlay server (receiver) and an iOS device as the client (sender). I have followed this link. But there you first have to register the service on a port, so that it appears as an AirPlay option on the iOS device, from the command line using:
mDNS -R MyAirplayService _airplay._tcp local 22555
When I run this Java code I can see the AirPlay icon on my iOS device. But how can this be done on an Android device? Is there any open source code or library to do this?
That code basically registers an AirPlay TCP service on the local network, so that any other iOS device on the same network can discover the service and therefore display the AirPlay icon as an option.
In iOS, this can be done using Bonjour/NSNetService. Please refer to Apple's official tutorial.
NSNetService *service;
service = [[NSNetService alloc] initWithDomain:@"" // 1
                                          type:@"_airplay._tcp"
                                          // this will show up as the AirPlay name
                                          name:@"myiOSAirplayServer"
                                          port:port];
if (service)
{
    [service setDelegate:delegateObject]; // 2
    [service publish]; // 3
}
else
{
    NSLog(@"An error occurred initializing the NSNetService object.");
}
In Android, this can be done using Network Service Discovery (NSD), and the official example is here:
public void registerService(int port) {
    NsdServiceInfo serviceInfo = new NsdServiceInfo();
    // this will show up as the AirPlay name
    serviceInfo.setServiceName("myAndroidAirplayServer");
    serviceInfo.setServiceType("_airplay._tcp.");
    serviceInfo.setPort(port);

    mNsdManager = (NsdManager) getSystemService(Context.NSD_SERVICE);
    mNsdManager.registerService(
            serviceInfo, NsdManager.PROTOCOL_DNS_SD, mRegistrationListener);
}
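The snippet above assumes an mRegistrationListener field; a minimal one could look like this (the log tags and handling are placeholders):
NsdManager.RegistrationListener mRegistrationListener = new NsdManager.RegistrationListener() {
    @Override
    public void onServiceRegistered(NsdServiceInfo info) {
        // Android may rename the service if the name is already taken
        Log.d("NSD", "registered as " + info.getServiceName());
    }
    @Override
    public void onRegistrationFailed(NsdServiceInfo info, int errorCode) {
        Log.e("NSD", "registration failed: " + errorCode);
    }
    @Override
    public void onServiceUnregistered(NsdServiceInfo info) { }
    @Override
    public void onUnregistrationFailed(NsdServiceInfo info, int errorCode) { }
};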
However, doing so just registers the service on the local network and gives you an icon on the iOS device. To build a real AirPlay server/mirroring service, you need to do a lot more. If you want to see what that looks like, check my iOS app that works as an AirPlay mirroring server: https://www.youtube.com/watch?v=0d6ggJMypIk. There is also an open source project written in Python, called PyOpenAirMirror.
If I'm not mistaken, AirPlay is an Apple-only API. I have tried getting it to be recognizable on Android and was largely unsuccessful. You may want to consider another mode of transmission for the streaming audio.
I would look at Erica Sadun's utilities. I may be mistaken, but I think they are open source. She has written a server, player/transmitter, etc. for AirPlay.
http://ericasadun.com/category/airplayer/