RTSP server in Xamarin Android

I am trying to set up an RTSP server in Xamarin.Android.
I already have an RTSP server set up, but this server expects
raw SPS/PPS data and H.264 NAL units (byte arrays) to stream.
So my question is: how do I collect byte data from the camera / SurfaceView?
I tried the old approach with Camera.PreviewCallback / onPreviewFrame, which works,
but those APIs are deprecated. The code below is what I run for each YUV frame delivered by the preview callback.
In short: how do I collect raw byte data from the camera in recent Android versions?
private void Video_source_ReceivedYUVFrame(uint timestamp_ms, int width, int height, byte[] yuv_data)
{
    // Encode the YUV frame into an H.264 NAL unit
    byte[] raw_video_nal = h264_encoder.offerEncoder(yuv_data);
    // SPS/PPS configuration data from the encoder (raw_sps and raw_pps below are the
    // individual NAL units extracted from this buffer elsewhere)
    byte[] spsPpsInfo = h264_encoder.GetRawSPSPPS();
    bool isKeyframe = true;

    List<byte[]> nal_array = new List<byte[]>();
    bool add_sps_pps_to_keyframe = true;
    if (add_sps_pps_to_keyframe && isKeyframe)
    {
        nal_array.Add(raw_sps);
        nal_array.Add(raw_pps);
    }
    nal_array.Add(raw_video_nal);

    rtspServer.FeedInRawSPSandPPS(raw_sps);
    rtspServer.FeedInRawNAL(timestamp_ms, nal_array);
}
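On recent Android versions the usual replacement for onPreviewFrame is camera2 with an ImageReader (or CameraX ImageAnalysis) delivering YUV_420_888 frames. Below is a minimal sketch of the ImageReader route, written in Java for clarity (the Xamarin.Android bindings expose the same classes); width, height and backgroundHandler are placeholders, and the naive plane copy ignores row/pixel stride, so treat it as a starting point rather than a drop-in replacement:
import android.graphics.ImageFormat;
import android.media.Image;
import android.media.ImageReader;
import java.nio.ByteBuffer;

ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2);
reader.setOnImageAvailableListener(r -> {
    Image image = r.acquireLatestImage();
    if (image == null) return;
    Image.Plane[] planes = image.getPlanes();
    ByteBuffer y = planes[0].getBuffer();
    ByteBuffer u = planes[1].getBuffer();
    ByteBuffer v = planes[2].getBuffer();
    int yLen = y.remaining(), uLen = u.remaining(), vLen = v.remaining();
    byte[] yuv = new byte[yLen + uLen + vLen];
    y.get(yuv, 0, yLen);            // copy Y plane
    u.get(yuv, yLen, uLen);         // copy U plane
    v.get(yuv, yLen + uLen, vLen);  // copy V plane
    // hand the yuv[] buffer to the H.264 encoder / RTSP feed here,
    // exactly like the old onPreviewFrame byte[] buffers
    image.close();
}, backgroundHandler);
// Add reader.getSurface() as an output target of the camera2 capture session
// so every preview frame is delivered to the listener above.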

Related

How to decode Midi data Android

I'm writing an Android application. A MIDI piano keyboard is connected physically by a cable to an Android device. I have been following the official Android MIDI documentation here https://developer.android.com/reference/android/media/midi/package-summary, but I am stuck with decoding the raw MIDI data I am receiving.
@RequiresApi(api = Build.VERSION_CODES.M)
class MidiFramer extends MidiReceiver {
    public void onSend(byte[] data, int offset,
                       int count, long timestamp) throws IOException {
        // parse MIDI or whatever
        // How to convert data to something readable? Below doesn't make any sense.
        Log.v(LOG_TAG, "onSend strData:" + data + " length:" + data.length);
        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < data.length; i++) {
            String hex = new String(data, StandardCharsets.UTF_8);
            sb.append(hex);
        }
        Log.v(LOG_TAG, "onSend sb:" + sb.toString());
    }
}
Essentially, from the raw MIDI data being received, I want to know which note is being played (e.g. D4 / C#5) on the physical piano keyboard. Any help would be appreciated.
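For reference, each MIDI channel message is a status byte followed by data bytes: a Note On is 0x9n (n = channel), then the note number (0-127), then the velocity. A minimal sketch (hypothetical helpers, assuming onSend delivers complete 3-byte channel messages without running status) that maps those bytes to note names such as C#5:
static final String[] NOTE_NAMES = {"C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"};

// MIDI note 60 is middle C (C4), so the octave is note/12 - 1.
static String noteName(int note) {
    return NOTE_NAMES[note % 12] + ((note / 12) - 1);
}

static void logNotes(byte[] data, int offset, int count) {
    for (int i = offset; i + 2 < offset + count; i += 3) {
        int status = data[i] & 0xFF;
        int note = data[i + 1] & 0x7F;
        int velocity = data[i + 2] & 0x7F;
        if ((status & 0xF0) == 0x90 && velocity > 0) {                 // Note On with non-zero velocity
            Log.v("MidiFramer", "Note On: " + noteName(note));
        } else if ((status & 0xF0) == 0x80
                || ((status & 0xF0) == 0x90 && velocity == 0)) {       // Note Off (or Note On, velocity 0)
            Log.v("MidiFramer", "Note Off: " + noteName(note));
        }
    }
}
Calling logNotes(data, offset, count) from onSend should then print a note name for every key pressed.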

Send image stream byte data flutter to native Android

I was trying to process the live camera feed from Flutter, so I needed to send the byte data to native Android for processing.
I concatenated the three image planes as suggested by flutterfire. I needed the data to create an InputImage for the Google ML Kit.
Concatenate Plane method
static Uint8List _concatenatePlanes(List<Plane> planes) {
  final WriteBuffer allBytes = WriteBuffer();
  for (Plane plane in planes) {
    allBytes.putUint8List(plane.bytes);
  }
  return allBytes.done().buffer.asUint8List();
}
Input Image created from the image data in native Android
public void fromByteBuffer(Map<String, Object> imageData, final MethodChannel.Result result) {
    byte[] bytes = (byte[]) imageData.get("bytes");
    int rotationCompensation = ((int) imageData.get("rotation")) % 360;
    // Create an input image
    InputImage inputImage = InputImage.fromByteArray(bytes,
            (int) imageData.get("width"),
            (int) imageData.get("height"),
            rotationCompensation,
            InputImage.IMAGE_FORMAT_NV21);
}
I am not getting any results after processing a frame from the camera stream, but if the same frame is captured, stored, and then processed by creating the InputImage from a file path, the image is processed properly.
Is there anything wrong in the way I am creating the input image?
Any help is appreciated.
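One thing worth checking: concatenating the planes produces Y, then U, then V (an I420-style layout), while IMAGE_FORMAT_NV21 expects the Y plane followed by interleaved V/U bytes, so ML Kit may be reading garbage chroma. A minimal sketch (hypothetical helper, assuming tightly packed planes with no row or pixel stride padding) that reorders the concatenated buffer into NV21 on the Android side before building the InputImage:
static byte[] i420ToNv21(byte[] i420, int width, int height) {
    int ySize = width * height;
    int uvSize = ySize / 4;                        // U and V are quarter-resolution planes
    byte[] nv21 = new byte[ySize + uvSize * 2];
    System.arraycopy(i420, 0, nv21, 0, ySize);     // Y plane is unchanged
    for (int i = 0; i < uvSize; i++) {
        nv21[ySize + 2 * i]     = i420[ySize + uvSize + i];  // V
        nv21[ySize + 2 * i + 1] = i420[ySize + i];           // U
    }
    return nv21;
}
// usage sketch: InputImage.fromByteArray(i420ToNv21(bytes, width, height),
//         width, height, rotationCompensation, InputImage.IMAGE_FORMAT_NV21);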

Encryption on open Source VoIP Android

This is with reference to sipdroid data encrypt failed.
I tried using an XOR operation instead of the reverse-byte approach for sent and received packets in SipdroidSocket.class.
I experienced the same issue (too much noise).
Please guide me in encrypting and decrypting packets in SipdroidSocket.class.
Sorry for the late reply. I am posting snippets of the code I tried; please refer to the original RtpSocket.java and SipdroidSocket.java classes for the complete picture.
In RtpSocket.java I added a static field that records the packet's header length. I then used this header length in SipdroidSocket.java to skip the header before tweaking the payload.
In SipdroidSocket.java, the following edits were made to the send and receive functions:
public void receive(DatagramPacket pack) throws IOException {
    if (loaded) {
        impl.receive(pack);
    } else {
        super.receive(pack);
    }
    byte[] b = pack.getData();          // fetch data from receiver
    int len = RtpSocket.header;         // RTP header length recorded in RtpSocket
    pack.setData(do_something(b, len)); // do the XORing to retrieve original data
}

public void send(DatagramPacket pack) throws IOException {
    byte[] b = pack.getData();          // fetch original data
    int len = RtpSocket.header;
    pack.setData(do_something(b, len)); // replace with tweaked data
    if (loaded)
        impl.send(pack);
    else
        super.send(pack);
}

private byte[] do_something(byte[] b, int len) {
    int new_buff_len = b.length - len;
    byte[] new_buff = new byte[new_buff_len];
    // Skip the header bytes and XOR the payload; XORing with the same key byte
    // before sending and after receiving restores the original data on both sides.
    for (int i = len; i < b.length; i++) {
        new_buff[i - len] = (byte) (b[i] ^ 0x43);
    }
    return new_buff;
}
Kindly try it and offer suggestions, please.
Finally it worked! I had to meddle with other parts of the code. The XOR operation now works fine and the objective has been attained.

issue in reading a serialized object

I am trying to create a client-server Android app in which I want to transfer a file using the UDP protocol. So far I am able to transfer the file and receive the acknowledgements for the packets.
Now I want to add a sequence number to the data in each packet. I have tried to do the following:
Create a ByteArrayOutputStream.
Wrap it in an ObjectOutputStream
Write data to the object using writeObject()
Serialized class includes:
public class Message implements Serializable {
    private int seqNo;
    private byte[] data;
    private boolean ack;

    public Message(int seqNo, byte[] data, boolean ack) {
        this.seqNo = seqNo;
        this.data = data;
        this.ack = ack;
    }
Client Side
byte[] fileBytes = new byte[500];
ByteArrayOutputStream outStream = new ByteArrayOutputStream();
ObjectOutputStream os = new ObjectOutputStream(outStream);
while ((numBytesRead = inputBuf.read(fileBytes)) != -1) {
    //DatagramPacket packet = new DatagramPacket(fileBytes, fileBytes.length);
    if (os == null) {
        os = new ObjectOutputStream(outStream);
    }
    Message msg = new Message(++seqNo, fileBytes, false);
    os.writeObject(msg);
    os.flush();
    os.reset();
    byte[] data = outStream.toByteArray();
    DatagramPacket packet = new DatagramPacket(data, data.length);
    clientSocket.send(packet);
}
Server Side
byte[] incomingData = new byte[1024];
while (true) {
    try {
        DatagramPacket incomingPacket = new DatagramPacket(incomingData, incomingData.length);
        serverSocket.receive(incomingPacket);
        byte[] data = incomingPacket.getData();
        ByteArrayInputStream in = new ByteArrayInputStream(data);
        ObjectInputStream is = new ObjectInputStream(in);
        if (is == null) {
            is = new ObjectInputStream(in);
        }
        Message msg = (Message) is.readObject();
        System.out.println(msg.getSeqNo());
        out.write(msg.getData(), 0, msg.getData().length);
    }
The problems I am facing are:
I am receiving the same sequence number for every packet (i.e. 1).
I am not sure about the buffer size for the incoming packet, as I am using 500 bytes on the client side and 1024 at the server. And if I use 500 bytes on both sides I get an EOFException.
I would really appreciate it if you could suggest better ways to implement this. Thanks :)
Message msg = new Message(++seqNo, fileBytes, false);
Here you are assuming that the prior read() filled the buffer. On the last read() before end of file it almost certainly won't, and it isn't guaranteed to fill it at any time, only to transfer at least one byte.
You should be passing the read count 'numBytesRead' to this constructor, and it should create a byte array of that size and copy only that many bytes into it.
Other issues:
It is impossible for 'os' to be null at the point where you test it.
Ditto 'is'.
You should be creating a new ObjectOutputStream and ByteArrayOutputStream per datagram.
Java DatagramPackets keep shrinking to the size of the shortest datagram payload received so far. You must either create a new one per receive, or at least reset its length before each receive.
You need a larger buffer at the receiver because of ObjectOutputStream overheads.
I don't believe this code presently works at all, let alone that you keep getting the same sequence number. More likely you keep getting the same message, because you're ignoring an exception somewhere.
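A minimal sketch of the changes suggested above (hypothetical; inputBuf, clientSocket, serverSocket, out, serverAddress and serverPort are assumed to exist as in the original code):
// Client side: copy only the bytes actually read and use a fresh stream per datagram
byte[] fileBytes = new byte[500];
int numBytesRead;
int seqNo = 0;
while ((numBytesRead = inputBuf.read(fileBytes)) != -1) {
    byte[] chunk = Arrays.copyOf(fileBytes, numBytesRead);       // only the bytes read
    ByteArrayOutputStream outStream = new ByteArrayOutputStream();
    ObjectOutputStream os = new ObjectOutputStream(outStream);   // new stream per packet
    os.writeObject(new Message(++seqNo, chunk, false));
    os.close();
    byte[] data = outStream.toByteArray();
    clientSocket.send(new DatagramPacket(data, data.length, serverAddress, serverPort));
}

// Server side: larger buffer for serialization overhead, and reset the packet length each time
byte[] incomingData = new byte[2048];
DatagramPacket incomingPacket = new DatagramPacket(incomingData, incomingData.length);
while (true) {
    incomingPacket.setLength(incomingData.length);                // undo shrinking from the last receive
    serverSocket.receive(incomingPacket);
    ObjectInputStream is = new ObjectInputStream(
            new ByteArrayInputStream(incomingPacket.getData(), 0, incomingPacket.getLength()));
    Message msg = (Message) is.readObject();
    out.write(msg.getData(), 0, msg.getData().length);
}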

How to encode non-camera video in Android

I am working on an Android application in which a video is dynamically generated by compositing a sequence of animation frames. I tried to use the Android MediaRecorder API for this but have not found a way to get it to accept a non-camera source as input. I have been attempting to use an FFmpeg port (based on the Rockplayer build) but am running into difficulties with missing functions, since I am using it as an encoder, not a decoder.
The iPhone version of this app uses AVAssetWriter from the AVFoundation framework.
Is there an easier way to do this, or am I stuck slugging it out with FFmpeg?
This may help (see the note on resolution though):
How to encode using the FFMpeg in Android (using H263)
I'm not sure whether they did a custom build of ffmpeg; if so, they may be able to offer advice on porting a more feature-complete version.
-Anthony
OpenCV has a ViewBase class which takes the input from the camera as a frame and represents the frame as a bitmap. You can extend the ViewBase class and adapt it for your own use, even though installing OpenCV on Android isn't very easy.
When you extend SampleCvViewBase you get the following function, which you can use. It's pretty much hard work, but it's the best I can think of.
@Override
protected Bitmap processFrame(VideoCapture capture) {
    capture.retrieve(picture, Highgui.CV_CAP_ANDROID_COLOR_FRAME_RGBA);
    if (Utils.matToBitmap(picture, bmp))
        return bmp;
    bmp.recycle();
    return null;
}
You can use a pure Java open-source library called JCodec (http://jcodec.org).
It contains a simple yet working H.264 encoder and MP4 muxer. The class below uses the JCodec low-level API and should be what you need (CORRECTED):
public class SequenceEncoder {
    private SeekableByteChannel ch;
    private Picture toEncode;
    private RgbToYuv420 transform;
    private H264Encoder encoder;
    private ArrayList<ByteBuffer> spsList;
    private ArrayList<ByteBuffer> ppsList;
    private CompressedTrack outTrack;
    private ByteBuffer _out;
    private int frameNo;
    private MP4Muxer muxer;

    public SequenceEncoder(File out) throws IOException {
        this.ch = NIOUtils.writableFileChannel(out);

        // Transform to convert between RGB and YUV
        transform = new RgbToYuv420(0, 0);

        // Muxer that will store the encoded frames
        muxer = new MP4Muxer(ch, Brand.MP4);

        // Add video track to muxer
        outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);

        // Allocate a buffer big enough to hold output frames
        _out = ByteBuffer.allocate(1920 * 1080 * 6);

        // Create an instance of encoder
        encoder = new H264Encoder();

        // Encoder extra data (SPS, PPS) to be stored in a special place of MP4
        spsList = new ArrayList<ByteBuffer>();
        ppsList = new ArrayList<ByteBuffer>();
    }

    public void encodeImage(BufferedImage bi) throws IOException {
        if (toEncode == null) {
            toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
        }

        // Perform conversion
        for (int i = 0; i < 3; i++)
            Arrays.fill(toEncode.getData()[i], 0);
        transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);

        // Encode image into H.264 frame, the result is stored in the '_out' buffer
        _out.clear();
        ByteBuffer result = encoder.encodeFrame(_out, toEncode);

        // Based on the frame above form a correct MP4 packet
        spsList.clear();
        ppsList.clear();
        H264Utils.encodeMOVPacket(result, spsList, ppsList);

        // Add packet to video track
        outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
        frameNo++;
    }

    public void finish() throws IOException {
        // Push saved SPS/PPS to a special storage in MP4
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        // Write MP4 header and finalize recording
        muxer.writeHeader();
        NIOUtils.closeQuietly(ch);
    }

    public static void main(String[] args) throws IOException {
        SequenceEncoder encoder = new SequenceEncoder(new File("video.mp4"));
        for (int i = 1; i < 100; i++) {
            BufferedImage bi = ImageIO.read(new File(String.format("folder/img%08d.png", i)));
            encoder.encodeImage(bi);
        }
        encoder.finish();
    }
}
You can get the JCodec jar from the project web-site.
