I'm currently writing a project using a Raspberry Pi and an Android phone.
I have a problem sending data from the Pi camera to the Android app.
I'm using the picamera library in Python: https://picamera.readthedocs.io/en/release-1.13/recipes1.html#recording-to-a-network-stream .
My current code on the Pi looks like this:
import socket
import time
import picamera
camera = picamera.PiCamera()
camera.resolution = (640, 480)
camera.framerate = 24
server_socket = socket.socket()
server_socket.bind(('0.0.0.0', 8000))
server_socket.listen(0)
# Accept a single connection and make a file-like object out of it
connection = server_socket.accept()[0].makefile('wb')
try:
    camera.start_recording(connection, format='h264')
    camera.wait_recording(60)
    camera.stop_recording()
finally:
    connection.close()
    server_socket.close()
To receive the stream we can use tcp/h264://x.x.x.x:8000. It works on a PC when I use VLC.
On Android I tried VideoView and ExoPlayer, but the problem is the URI: Android can't parse the tcp/h264 protocol.
When I try streaming with VLC:
raspivid -o - -t 99999 |cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8000}' :demux=h264
It works on Android if I pass a URL with the http:// prefix, but not with the stream from my Python program.
It seems to me that I have two options:
Use a different way to stream the video output from Python.
Somehow handle the tcp/h264 protocol myself (probably open a socket and parse the stream bytes into video independently). It is possible: https://github.com/ShawnBaker/RPiCameraViewer but I am looking for a better (less low-level) solution.
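For what it's worth, the low-level route is less code than it sounds on the receiving side: it's just a TCP client draining raw H.264 bytes. A rough sketch for testing from a PC (names are illustrative) - dump the stream to a file you can then open with a player such as VLC:

```python
import socket

def receive_h264(host, port, out_path, chunk=4096):
    """Connect to the Pi's socket server and dump the raw H.264 stream to a file."""
    with socket.create_connection((host, port)) as sock, open(out_path, 'wb') as out:
        while True:
            data = sock.recv(chunk)
            if not data:  # server closed the connection
                break
            out.write(data)
```

Parsing those bytes into decoded frames on Android is the hard part, which is exactly what RPiCameraViewer does.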
You can stream it from Python easily, just use
import subprocess
subprocess.Popen("raspivid -o - -t 99999 |cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8000}' :demux=h264", shell=True)
This will launch it in a separate process, so it won't block your program.
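A minimal sketch of that, assuming you also want to stop the stream later - keep the Popen handle around (the command string is the one above):

```python
import subprocess

# Launch the raspivid | cvlc pipeline in a separate process (not a thread),
# so it does not block the rest of the program.
stream_cmd = ("raspivid -o - -t 99999 | cvlc -vvv stream:///dev/stdin "
              "--sout '#standard{access=http,mux=ts,dst=:8000}' :demux=h264")
proc = subprocess.Popen(stream_cmd, shell=True)

# ...later, when you want to stop streaming:
proc.terminate()
proc.wait()
```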
Related
I am using a Raspberry Pi Zero W. I have succeeded in connecting the Pi to my Android device on startup of the Pi. Then I turn on internet sharing to make sure the Pi has an internet connection. I want to make an application that can receive data from the Android device and run pre-existing scripts based on it, without using SSH if possible.
I normally use JuiceSSH on my Android phone to run scripts on the Pi, but that involves manual work like finding and executing the script, which I do not want my user to do.
The script I want to run is a Google Directions Python script. I have the script ready; it just takes Origin and Destination as input from the user. After that it fetches the directions response and starts showing instructions on a screen connected to the Pi.
TLDR: I would like to know a way to initiate a Python script on a Raspberry Pi from an Android device connected via Bluetooth. Do I need to make a server? Is it possible using Firebase?
I actually built something very similar not too long ago. You may be able to approach it in various ways, but I think some sort of server is going to be needed in any case.
Take a look at my public repository on GitHub!
git clone https://github.com/NanoSpicer/XpressShutdown
You could then modify my index.js file like so:
#!/usr/bin/node
const command = 'python yourscript.py';
const proc = require('child_process');
const express = require('express');
const app = express();
const router = express.Router();
router.get('/customComand', (request, response) => {
  // you could even catch parameters here, edit the command string and pass them into the script
  proc.exec(command, (err, stdout, stderr) => {
    response.json({output: stdout});
  });
});
app.use('/raspi', router);
app.listen(80);
console.log('Server is running');
Get that server up and running as a background process with:
chmod +x index.js
./index.js & # you can do this because of the shebang
Make the HTTP request like
http://{your-raspi-IP-address}/raspi/customComand
And now you can run your command from anywhere in the world, as long as you can perform an HTTP request to your Raspberry Pi!
I solved this problem by using the JSch library for Android. It is quite simple and well documented. It allows me to open an SSH connection with a set command that I want to execute on the server.
I made a simple motion detector program using Python 3.7 and OpenCV. Is there a way to access my phone's camera with Python and stream the video to my laptop over Bluetooth or a mobile hotspot, so I can process the data on the laptop? I'm basically just using my phone as a detachable camera.
You can do this using the IP Webcam Android application.
Steps -
Install the application on your Android phone.
Connect your laptop and phone to the same local network (you can use a mobile hotspot).
Start the application and select the Start Server option; the application will start capturing video and show you its IP addresses.
Use this IP address to read the video feed with the following Python code.
Process the video using OpenCV.
Python code -
import urllib.request
import cv2
import numpy as np
import ssl

# Ignore the self-signed certificate IP Webcam uses for https URLs
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

url = 'Your URL'

while True:
    # Fetch one JPEG frame per request and decode it
    imgResp = urllib.request.urlopen(url, context=ctx)
    imgNp = np.array(bytearray(imgResp.read()), dtype=np.uint8)
    img = cv2.imdecode(imgNp, -1)
    cv2.imshow('temp', cv2.resize(img, (600, 400)))
    q = cv2.waitKey(1)
    if q == ord("q"):
        break

cv2.destroyAllWindows()
You can find the Android application here - IP Webcam
And this video will explain it better - How to use with OpenCV
Use the IP Webcam Android application.
The URL is shown by IP Webcam; I added /video at the end for video streaming. Alternatively you can use
url = 'http://192.168.137.138:8080/shot.jpg'
and fetch single frames inside the loop instead of cap.read().
This works flawlessly for me at 1280 x 720 resolution.
NOTE: your URL's IP address will differ, but remember to add /video at the end.
import cv2
import numpy as np

url = 'http://192.168.137.138:8080/video'
cap = cv2.VideoCapture(url)

while True:
    ret, frame = cap.read()
    if frame is not None:
        cv2.imshow('frame', frame)
    q = cv2.waitKey(1)
    if q == ord("q"):
        break

cv2.destroyAllWindows()
I have a college assignment to build an Android app that communicates with Ubuntu (or any other Linux distribution) and streams audio via microphone and speakers, both on the PC and the phone. Switching the direction of communication should be done on Android, and the script listening on the Bluetooth port on the PC should be written in Python or some other lightweight language. It does not have to be full-duplex; single-duplex is fine.
Is the answer the BluetoothA2dp Android profile, or is there something else?
I'm comfortable with making simple Android apps.
Thanks a lot!
Not sure if you still need the answer, but I am working on something similar.
Basically I am working with Python on the Windows platform to record streaming audio from the laptop's microphone, process the sound for ANC [automatic noise cancellation], pass it through a band-pass filter, and then output the audio stream to a Bluetooth device.
I would like to ultimately port this to a smartphone, but for now I am prototyping in Python as that's a lot easier.
While I am still at an early stage of the project, here are two pieces that may be helpful:
1) Stream audio from the microphone to the speakers using sounddevice
Record external audio and play it back.
Refer to the sounddevice module's installation details here:
http://python-sounddevice.readthedocs.org/en/0.3.1/
import sounddevice as sd

fs = 44100     # sample rate in Hz
duration = 5   # seconds

myrecording = sd.rec(duration * fs, samplerate=fs, channels=2, dtype='float64')
print("Recording Audio")
sd.wait()
print("Audio recording complete, playing back")
sd.play(myrecording, fs)
sd.wait()
print("Playback complete")
2) Communicate over Bluetooth
Refer to the details here:
https://people.csail.mit.edu/albert/bluez-intro/c212.html
import bluetooth

target_name = "My Phone"
target_address = None

nearby_devices = bluetooth.discover_devices()

for bdaddr in nearby_devices:
    if target_name == bluetooth.lookup_name(bdaddr):
        target_address = bdaddr
        break

if target_address is not None:
    print("found target bluetooth device with address", target_address)
else:
    print("could not find target bluetooth device nearby")
I know I am simply quoting examples from these sites; you can refer to them to gain more insight.
Once I have a working prototype I will try to post it here.
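On the band-pass step mentioned above: before wiring in real audio, the filter itself can be prototyped with plain numpy. This is a crude FFT-based band-pass (the sample rate and cutoff frequencies here are arbitrary demo values):

```python
import numpy as np

def bandpass_fft(signal, fs, low, high):
    """Crude band-pass: zero every FFT bin outside [low, high] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0
    return np.fft.irfft(spectrum, n=len(signal))

fs = 8000                # sample rate for the demo
t = np.arange(fs) / fs   # one second of time axis
# Mix of a 100 Hz tone (to be removed) and a 1000 Hz tone (to be kept)
mix = np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 1000 * t)
filtered = bandpass_fft(mix, fs, low=500, high=1500)
```

For production you would likely use a proper scipy filter design instead, but this is enough to test the processing chain end to end.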
My new surveillance camera just arrived, so I'm trying to write an app to live stream the video from it.
Since it came with basically no documentation, I installed the 'onvifer' Android app, which allows you to browse the camera's capabilities. This app works fine - it gets the video and allows PTZ controls, etc. It reports the streaming URL as:
rtsp://192.1.0.193:554/mpeg4
I tested the stream in the VLC windows client, and it's able to stream video from that URL as well. This makes me comfortable that the network is working OK.
The camera states the feed will be 1920x1080; VLC confirms this.
The basic code in my activity:
VideoView videoView = (VideoView)this.findViewById(R.id.VideoView);
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
videoView.requestFocus();
videoView.start();
I've also given the app INTERNET permissions in AndroidManifest.xml, disabled authentication on the camera, and am running on a real device (not the emulator).
When I run the app, LogCat shows this immediately:
setDataSource IOException happend :
java.io.FileNotFoundException: No content provider: rtsp://192.1.0.193:554/mpeg4
at android.content.ContentResolver.openTypedAssetFileDescriptor (ContentResolver.java).
About 15 seconds later, the app shows a "Can't play this video" modal dialog box and this is added to LogCat:
MediaPlayer error (100, 0)
AudioSystem AudioFlinger server died!
MediaPlayer error (100, 0)
VideoView Error: 100,0
I've googled everything I can think of, but haven't found anything useful.
Any thoughts?
Wild-ass guess based on your logcat and the RC=100... No SDP file, or no RTSP equivalent of the 'moov atom' block required to negotiate details of the stream / container / codec / format... You can get the AOSP source for MediaPlayer/VideoView and grep for that RC value.
RTSP is gnarly to debug (note the tools links) and not assured to run inside a NAT'd network due to UDP issues. So, to get a better result, you may have to look into forcing your config to do the data channel over TCP and not UDP. Or it could be any of the many other possible issues.
If you really want to investigate, some possible tools below:
Use command line and CURL client to request your stream:
Android - Java RTSP Session Mgmt package on Git
Protocol dumps for CLI RTSP sessions to Youtube RTSP/SDP streams
To pursue the issue, you may need to get into the weeds with debug tools that track details of the protocol negotiation that precedes the MediaPlayer actually starting to play the stream. That would include learning the RFC and the protocol details.
videoView.setVideoURI(Uri.parse("rtsp://192.1.0.193:554/mpeg4"));
Try your app on another phone.
You may find the problem is about the mobile device.
Try this test path: "rtsp://218.204.223.237:554/mobile/1/4C024DFE77DC717D/onnuvesj43xj7t26.sdp" and see whether your code has something wrong.
With #rowntreerob's help, I have successfully built the FFMpeg for Android library / shared object for use in an Android application.
How do I capture the stdout or stderr messages issued as a result of running an FFmpeg command and display them in a TextView widget in the same Android application from which the FFmpeg command was issued?
Also, when FFmpeg connects to FMS (Wowza / Flash Media Server), FMS issues CONNECTING, CONNECTED and STREAMING messages. How does one capture these messages and display them using Toast in the application from which the FFmpeg command was issued?
Sample code will really help. TIA
/dev/null usually gets the console output of Dalvik.
You need to understand the file system, streams, and redirecting...
Try reading here.
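For prototyping the capture on a desktop in Python: FFmpeg writes its log output (including connection status messages) to stderr rather than stdout, so the trick is to read the stderr pipe line by line as the process runs; on Android the analogous stream comes from Process.getErrorStream(). A sketch with a stand-in command:

```python
import subprocess

def run_and_capture(cmd):
    """Run a command and yield its stderr output line by line as it appears."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, text=True)
    for line in proc.stderr:
        # In an app, append each line to the TextView (or raise a Toast) here
        yield line.rstrip('\n')
    proc.wait()
```

The same loop structure carries over to Java: wrap getErrorStream() in a BufferedReader and post each line to the UI thread.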