Upload live android webcam video to RTP/RTSP Server - android

I have already done some research, but I still lack information on what I would like to achieve.
So I would like to program an application where the user can record a video and instantly (live) upload the video to an RTP/RTSP server.
The server side will not be a problem. The thing I am unclear about is how to achieve this on the phone-side.
My research so far suggests that I have to write the video, while recording, to a local socket rather than to a file, because a 3gp file written to disk cannot be accessed until it is finalized (i.e. when recording stops and the header information about length etc. has been written).
When the socket receives the continuous data, I will need to wrap it into RTP packets and send them to the remote server. I will possibly also have to do basic encoding first (which is not so important yet).
Does anybody know whether this theory is correct so far?
I would also like to know if someone could point me to a few code snippets of similar approaches, especially for sending the video to the server on the fly. I am not sure yet how to do that.
Thank you very much and best regards

Your overall approach sounds correct, but there are a couple of things you need to consider.
So I would like to program an application where the user can record a video and instantly (live) upload the video to an RTP/RTSP server.
I'm assuming you want to upload to an RTSP server so that it can redistribute the content to multiple clients?
How will you handle the signaling/setup of the RTP session to the RTSP server? You need to notify the RTSP server somehow that a user is going to upload live media so that it can open the appropriate RTP/RTCP sockets, etc.
How will you handle authentication? Multiple client devices?
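For illustration, a record-style RTSP exchange (similar to what the code further down in this thread implements; the URL and track ids below are placeholders) looks roughly like this:

ANNOUNCE rtsp://server/live/stream RTSP/1.0        (body carries the SDP describing the tracks)
SETUP rtsp://server/live/stream/trackid=1 RTSP/1.0
Transport: RTP/AVP/TCP;unicast;mode=record;interleaved=0-1
RECORD rtsp://server/live/stream RTSP/1.0
Range: npt=0.000-

Once RECORD succeeds, the client starts pushing interleaved RTP/RTCP packets over the same TCP connection.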
My research so far suggests that I have to write the video, while recording, to a local socket rather than to a file, because a 3gp file written to disk cannot be accessed until it is finalized (i.e. when recording stops and the header information about length etc. has been written).
Sending frames in real time over RTP/RTCP is the correct approach. As the capture device captures each frame, you need to encode/compress it and send it over the socket. 3gp, like mp4, is a container format used for file storage. For live capture there is no need to write to a file. The only time that makes sense is e.g. in HTTP Live Streaming or DASH approaches, where media is written to a transport stream or mp4 file before being served over HTTP.
When the socket receives the continuous data, I will need to wrap it into RTP packets and send them to the remote server. I will possibly also have to do basic encoding first (which is not so important yet).
I would disagree: encoding is very important. You will likely never manage to send the video otherwise, and you will have to deal with issues such as cost (over mobile networks) and the sheer volume of media, depending on resolution and frame rate.
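To give an idea of the encoding step, here is a minimal sketch (my illustration, not the poster's code; it assumes an API 16+ MediaCodec, and the resolution, bitrate and frame-rate values are only examples) of configuring a hardware H.264 encoder so that captured frames are compressed before being packetised to RTP:

// Minimal sketch: configure a hardware H.264 encoder (values are illustrative).
private MediaCodec createVideoEncoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);      // ~500 kbit/s
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);   // keyframe every 2 seconds
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    encoder.start();
    // Feed camera preview frames into the input buffers, drain the output buffers,
    // and hand each encoded NAL unit to the RTP packetizer.
    return encoder;
}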
Does anybody know whether this theory is correct so far? I would also like to know if someone could point me to a few code snippets of similar approaches, especially for sending the video to the server on the fly. I am not sure yet how to do that.
Take a look at the spydroid open source project as a starting point.
It contains many of the necessary steps, including how to configure the encoder, packetise to RTP and send RTCP, as well as some RTSP server functionality. Spydroid sets up an RTSP server, so media is only encoded and sent once an RTSP client such as VLC sets up an RTSP session. Since your application is driven by the phone user wanting to send media to a server, you may need to consider another way to start the sending, even if it is just some kind of message to the server to set up an RTSP session, as spydroid does.

A year ago I created an Android app that could stream its camera/microphone using RTSP over TCP to a Wowza media server.
The general approach is to create a Unix socket, get its file descriptor and feed it to the Android MediaRecorder component. MediaRecorder is then instructed to record the camera video in MP4/H.264 format to that file descriptor. Your app then reads the client socket, parses the MP4 to strip the container header and pull the frames out of it, and wraps them into an RTP/RTSP stream on the fly.
Something similar can also be done for sound (normally AAC). Of course you have to handle timestamping yourself, and the trickiest part of the whole approach is audio/video synchronisation.
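A minimal sketch of that socket/MediaRecorder wiring (my illustration, loosely following what spydroid/libstreaming do; the socket name and camera handling are placeholders, and error handling is omitted):

// Feed MediaRecorder a socket file descriptor instead of a file; the app reads the
// other end of the socket pair and packetizes the stream on the fly.
private InputStream startRecorderToSocket(Camera camera) throws IOException {
    LocalServerSocket server = new LocalServerSocket("camera-stream");
    LocalSocket receiver = new LocalSocket();
    receiver.connect(new LocalSocketAddress("camera-stream"));
    LocalSocket sender = server.accept();

    MediaRecorder recorder = new MediaRecorder();
    recorder.setCamera(camera);                                  // an opened, unlocked Camera
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setOutputFile(sender.getFileDescriptor());          // write into the socket
    recorder.prepare();
    recorder.start();

    // Read from this stream, skip the MP4/3GP box headers, locate the H.264 NAL
    // units and wrap them into RTP packets.
    return receiver.getInputStream();
}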
So here is the first part of it, something that could be called an RTSP socket. It negotiates with the media server in its connect method, and after that you can write the stream itself into it; I will show that later.
package com.example.android.streaming.streaming.rtsp;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.UnsupportedEncodingException;
import java.math.BigInteger;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Locale;
import java.util.concurrent.ConcurrentHashMap;
import android.util.Base64;
import android.util.Log;
import com.example.android.streaming.StreamingApp;
import com.example.android.streaming.streaming.Session;
import com.example.android.streaming.BuildConfig;
public class RtspSocket extends Socket {
public static final int RTSP_HEADER_LENGTH = 4;
public static final int RTP_HEADER_LENGTH = 12;
public static final int MTU = 1400;
public static final int PAYLOAD_OFFSET = RTSP_HEADER_LENGTH + RTP_HEADER_LENGTH;
public static final int RTP_OFFSET = RTSP_HEADER_LENGTH;
private ConcurrentHashMap<String, String> headerMap = new ConcurrentHashMap<String, String>();
static private final String kCRLF = "\r\n";
// RTSP request format strings
static private final String kOptions = "OPTIONS %s RTSP/1.0\r\n";
static private final String kDescribe = "DESCRIBE %s RTSP/1.0\r\n";
static private final String kAnnounce = "ANNOUNCE %s RTSP/1.0\r\n";
static private final String kSetupPublish = "SETUP %s/trackid=%d RTSP/1.0\r\n";
@SuppressWarnings("unused")
static private final String kSetupPlay = "SETUP %s/trackid=%d RTSP/1.0\r\n";
static private final String kRecord = "RECORD %s RTSP/1.0\r\n";
static private final String kPlay = "PLAY %s RTSP/1.0\r\n";
static private final String kTeardown = "TEARDOWN %s RTSP/1.0\r\n";
// RTSP header format strings
static private final String kCseq = "Cseq: %d\r\n";
static private final String kContentLength = "Content-Length: %d\r\n";
static private final String kContentType = "Content-Type: %s\r\n";
static private final String kTransport = "Transport: RTP/AVP/%s;unicast;mode=%s;%s\r\n";
static private final String kSession = "Session: %s\r\n";
static private final String kRange = "range: %s\r\n";
static private final String kAccept = "Accept: %s\r\n";
static private final String kAuthBasic = "Authorization: Basic %s\r\n";
static private final String kAuthDigest = "Authorization: Digest username=\"%s\",realm=\"%s\",nonce=\"%s\",uri=\"%s\",response=\"%s\"\r\n";
// RTSP header keys
static private final String kSessionKey = "Session";
static private final String kWWWAuthKey = "WWW-Authenticate";
byte header[] = new byte[RTSP_MAX_HEADER + 1];
static private final int RTSP_MAX_HEADER = 4095;
static private final int RTSP_MAX_BODY = 4095;
static private final int RTSP_RESP_ERR = -6;
// static private final int RTSP_RESP_ERR_SESSION = -7;
static public final int RTSP_OK = 200;
static private final int RTSP_BAD_USER_PASS = 401;
static private final int SOCK_ERR_READ = -5;
/* Number of channels including control ones. */
private int channelCount = 0;
/* RTSP negotiation cmd seq counter */
private int seq = 0;
private String authentication = null;
private String session = null;
private String path = null;
private String url = null;
private String user = null;
private String pass = null;
private String sdp = null;
private byte[] buffer = new byte[MTU];
public RtspSocket() {
super();
try {
setTcpNoDelay(true);
setSoTimeout(60000);
} catch (SocketException e) {
Log.e(StreamingApp.TAG, "Failed to set socket params.");
}
buffer[RTSP_HEADER_LENGTH] = (byte) Integer.parseInt("10000000", 2);
}
public byte[] getBuffer() {
return buffer;
}
public static final void setLong(byte[] buffer, long n, int begin, int end) {
for (end--; end >= begin; end--) {
buffer[end] = (byte) (n % 256);
n >>= 8;
}
}
public void setSequence(int seq) {
setLong(buffer, seq, RTP_OFFSET + 2, RTP_OFFSET + 4);
}
public void setSSRC(int ssrc) {
setLong(buffer, ssrc, RTP_OFFSET + 8, RTP_OFFSET + 12);
}
public void setPayload(int payload) {
buffer[RTP_OFFSET + 1] = (byte) (payload & 0x7f);
}
public void setRtpTimestamp(long timestamp) {
setLong(buffer, timestamp, RTP_OFFSET + 4, RTP_OFFSET + 8);
}
/** Sends the RTP packet over the network */
private void send(int length, int stream) throws IOException {
buffer[0] = '$';
buffer[1] = (byte) stream;
setLong(buffer, length, 2, 4);
OutputStream s = getOutputStream();
s.write(buffer, 0, length + RTSP_HEADER_LENGTH);
s.flush();
}
public void sendReport(int length, int ssrc, int stream) throws IOException {
setPayload(200);
setLong(buffer, ssrc, RTP_OFFSET + 4, RTP_OFFSET + 8);
send(length + RTP_HEADER_LENGTH, stream);
}
public void sendData(int length, int ssrc, int seq, int payload, int stream, boolean last) throws IOException {
setSSRC(ssrc);
setSequence(seq);
setPayload(payload);
buffer[RTP_OFFSET + 1] |= (((last ? 1 : 0) & 0x01) << 7);
send(length + RTP_HEADER_LENGTH, stream);
}
public int getChannelCount() {
return channelCount;
}
private void write(String request) throws IOException {
try {
String asci = new String(request.getBytes(), "US-ASCII");
OutputStream out = getOutputStream();
out.write(asci.getBytes());
} catch (IOException e) {
throw new IOException("Error writing to socket.");
}
}
private String read() throws IOException {
String response = null;
try {
InputStream in = getInputStream();
int i = 0, len = 0, crlf_count = 0;
boolean parsedHeader = false;
for (; i < RTSP_MAX_BODY && !parsedHeader && len > -1; i++) {
len = in.read(header, i, 1);
if (header[i] == '\r' || header[i] == '\n') {
crlf_count++;
if (crlf_count == 4)
parsedHeader = true;
} else {
crlf_count = 0;
}
}
if (len != -1) {
len = i;
header[len] = '\0';
response = new String(header, 0, len, "US-ASCII");
}
} catch (IOException e) {
throw new IOException("Connection timed out. Check your network settings.");
}
return response;
}
private int parseResponse(String response) {
String[] lines = response.split(kCRLF);
String[] items = response.split(" ");
String tempString, key, value;
headerMap.clear();
if (items.length < 2)
return RTSP_RESP_ERR;
int responseCode = RTSP_RESP_ERR;
try {
responseCode = Integer.parseInt(items[1]);
} catch (Exception e) {
Log.w(StreamingApp.TAG, e.getMessage());
Log.w(StreamingApp.TAG, response);
}
if (responseCode == RTSP_RESP_ERR)
return responseCode;
// Parse response header into key value pairs.
for (int i = 1; i < lines.length; i++) {
tempString = lines[i];
if (tempString.length() == 0)
break;
int idx = tempString.indexOf(":");
if (idx == -1)
continue;
key = tempString.substring(0, idx);
value = tempString.substring(idx + 1);
headerMap.put(key, value);
}
tempString = headerMap.get(kSessionKey);
if (tempString != null) {
// Parse session
items = tempString.split(";");
tempString = items[0];
session = tempString.trim();
}
return responseCode;
}
private void generateBasicAuth() throws UnsupportedEncodingException {
String userpass = String.format("%s:%s", user, pass);
authentication = String.format(kAuthBasic, Base64.encodeToString(userpass.getBytes("US-ASCII"), Base64.DEFAULT));
}
public static String md5(String s) {
MessageDigest digest;
try {
digest = MessageDigest.getInstance("MD5");
digest.update(s.getBytes(), 0, s.length());
String hash = new BigInteger(1, digest.digest()).toString(16);
return hash;
} catch (NoSuchAlgorithmException e) {
e.printStackTrace();
}
return "";
}
private String md5HexDigest(String input) {
try {
// Hex-encode the MD5 digest of the input (needed for RTSP digest authentication).
MessageDigest md = MessageDigest.getInstance("MD5");
byte digest[] = md.digest(input.getBytes("US-ASCII"));
StringBuilder result = new StringBuilder();
for (byte b : digest)
result.append(String.format("%02x", b));
return result.toString();
} catch (Exception e) {
return "";
}
}
private void generateDigestAuth(String method) {
String nonce, realm;
String ha1, ha2, response;
// WWW-Authenticate: Digest realm="Streaming Server",
// nonce="206351b944cb28fe37a0794848c2e36f"
String wwwauth = headerMap.get(kWWWAuthKey);
int idx = wwwauth.indexOf("Digest");
String authReq = wwwauth.substring(idx + "Digest".length() + 1);
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, String.format("Auth Req: %s", authReq));
String[] split = authReq.split(",");
realm = split[0];
nonce = split[1];
split = realm.split("=");
realm = split[1];
realm = realm.substring(1, 1 + realm.length() - 2);
split = nonce.split("=");
nonce = split[1];
nonce = nonce.substring(1, 1 + nonce.length() - 2);
if (BuildConfig.DEBUG) {
Log.d(StreamingApp.TAG, String.format("realm=%s", realm));
Log.d(StreamingApp.TAG, String.format("nonce=%s", nonce));
}
ha1 = md5HexDigest(String.format("%s:%s:%s", user, realm, pass));
ha2 = md5HexDigest(String.format("%s:%s", method, url));
response = md5HexDigest(String.format("%s:%s:%s", ha1, nonce, ha2));
authentication = String.format(kAuthDigest, user, realm, nonce, url, response);
}
private int options() throws IOException {
seq++;
StringBuilder request = new StringBuilder();
request.append(String.format(kOptions, url));
request.append(String.format(kCseq, seq));
request.append(kCRLF);
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- OPTIONS Request ---\n\n" + request);
write(request.toString());
String response = read();
if (response == null)
return SOCK_ERR_READ;
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- OPTIONS Response ---\n\n" + response);
return parseResponse(response);
}
@SuppressWarnings("unused")
private int describe() throws IOException {
seq++;
StringBuilder request = new StringBuilder();
request.append(String.format(kDescribe, url));
request.append(String.format(kAccept, "application/sdp"));
request.append(String.format(kCseq, seq));
request.append(kCRLF);
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- DESCRIBE Request ---\n\n" + request);
write(request.toString());
String response = read();
if (response == null)
return SOCK_ERR_READ;
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- DESCRIBE Response ---\n\n" + response);
return parseResponse(response);
}
private int recurseDepth = 0;
private int announce() throws IOException {
seq++;
recurseDepth = 0;
StringBuilder request = new StringBuilder();
request.append(String.format(kAnnounce, url));
request.append(String.format(kCseq, seq));
request.append(String.format(kContentLength, sdp.length()));
request.append(String.format(kContentType, "application/sdp"));
request.append(kCRLF);
if (sdp.length() > 0)
request.append(sdp);
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- ANNOUNCE Request ---\n\n" + request);
write(request.toString());
String response = read();
if (response == null)
return SOCK_ERR_READ;
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- ANNOUNCE Response ---\n\n" + response);
int ret = parseResponse(response);
if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
String wwwauth = headerMap.get(kWWWAuthKey);
if (wwwauth != null) {
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
int idx = wwwauth.indexOf("Basic");
recurseDepth++;
if (idx != -1) {
generateBasicAuth();
} else {
// We are assuming Digest here.
generateDigestAuth("ANNOUNCE");
}
ret = announce();
recurseDepth--;
}
}
return ret;
}
private int setup(int trackId) throws IOException {
seq++;
recurseDepth = 0;
StringBuilder request = new StringBuilder();
request.append(String.format(kSetupPublish, url, trackId));
request.append(String.format(kCseq, seq));
/* One channel for rtp (data) and one for rtcp (control) */
String tempString = String.format(Locale.getDefault(), "interleaved=%d-%d", channelCount++, channelCount++);
request.append(String.format(kTransport, "TCP", "record", tempString));
request.append(kCRLF);
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- SETUP Request ---\n\n" + request);
write(request.toString());
String response = read();
if (response == null)
return SOCK_ERR_READ;
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- SETUP Response ---\n\n" + response);
int ret = parseResponse(response);
if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
String wwwauth = headerMap.get(kWWWAuthKey);
if (wwwauth != null) {
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
int idx = wwwauth.indexOf("Basic");
recurseDepth++;
if (idx != -1) {
generateBasicAuth();
} else {
// We are assuming Digest here.
generateDigestAuth("SETUP");
}
ret = setup(trackId);
authentication = null;
recurseDepth--;
}
}
return ret;
}
private int record() throws IOException {
seq++;
recurseDepth = 0;
StringBuilder request = new StringBuilder();
request.append(String.format(kRecord, url));
request.append(String.format(kCseq, seq));
request.append(String.format(kRange, "npt=0.000-"));
if (authentication != null)
request.append(authentication);
if (session != null)
request.append(String.format(kSession, session));
request.append(kCRLF);
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- RECORD Request ---\n\n" + request);
write(request.toString());
String response = read();
if (response == null)
return SOCK_ERR_READ;
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- RECORD Response ---\n\n" + response);
int ret = parseResponse(response);
if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
String wwwauth = headerMap.get(kWWWAuthKey);
if (wwwauth != null) {
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
int idx = wwwauth.indexOf("Basic");
recurseDepth++;
if (idx != -1) {
generateBasicAuth();
} else {
// We are assuming Digest here.
generateDigestAuth("RECORD");
}
ret = record();
authentication = null;
recurseDepth--;
}
}
return ret;
}
@SuppressWarnings("unused")
private int play() throws IOException {
seq++;
recurseDepth = 0;
StringBuilder request = new StringBuilder();
request.append(String.format(kPlay, url));
request.append(String.format(kCseq, seq));
request.append(String.format(kRange, "npt=0.000-"));
if (authentication != null)
request.append(authentication);
if (session != null)
request.append(String.format(kSession, session));
request.append(kCRLF);
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- PLAY Request ---\n\n" + request);
write(request.toString());
String response = read();
if (response == null)
return SOCK_ERR_READ;
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- PLAY Response ---\n\n" + response);
int ret = parseResponse(response);
if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
String wwwauth = headerMap.get(kWWWAuthKey);
if (wwwauth != null) {
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
int idx = wwwauth.indexOf("Basic");
recurseDepth++;
if (idx != -1) {
generateBasicAuth();
} else {
// We are assuming Digest here.
generateDigestAuth("PLAY");
}
ret = play();
authentication = null;
recurseDepth--;
}
}
return ret;
}
private int teardown() throws IOException {
seq++;
recurseDepth = 0;
StringBuilder request = new StringBuilder();
request.append(String.format(kTeardown, url));
request.append(String.format(kCseq, seq));
if (authentication != null)
request.append(authentication);
if (session != null)
request.append(String.format(kSession, session));
request.append(kCRLF);
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- TEARDOWN Request ---\n\n" + request);
write(request.toString());
String response = read();
if (response == null)
return SOCK_ERR_READ;
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "--- TEARDOWN Response ---\n\n" + response);
int ret = parseResponse(response);
if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
String wwwauth = headerMap.get(kWWWAuthKey);
if (wwwauth != null) {
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
int idx = wwwauth.indexOf("Basic");
recurseDepth++;
if (idx != -1) {
generateBasicAuth();
} else {
// We are assuming Digest here.
generateDigestAuth("TEARDOWN");
}
ret = teardown();
authentication = null;
recurseDepth--;
}
}
return ret;
}
public void connect(String dest, int port, Session session) throws IOException {
int trackId = 1;
int responseCode;
if (isConnected())
return;
if (!session.hasAudioTrack() && !session.hasVideoTrack())
throw new IOException("No tracks found in session.");
InetSocketAddress addr = null;
try {
addr = new InetSocketAddress(dest, port);
} catch (Exception e) {
throw new IOException("Failed to resolve rtsp server address.");
}
this.sdp = session.getSDP();
this.user = session.getUser();
this.pass = session.getPass();
this.path = session.getPath();
this.url = String.format("rtsp://%s:%d%s", dest, addr.getPort(), this.path);
try {
super.connect(addr);
} catch (IOException e) {
throw new IOException("Failed to connect rtsp server.");
}
responseCode = announce();
if (responseCode != RTSP_OK) {
close();
throw new IOException("RTSP announce failed: " + responseCode);
}
responseCode = options();
if (responseCode != RTSP_OK) {
close();
throw new IOException("RTSP options failed: " + responseCode);
}
/* Setup audio */
if (session.hasAudioTrack()) {
session.getAudioTrack().setStreamId(channelCount);
responseCode = setup(trackId++);
if (responseCode != RTSP_OK) {
close();
throw new IOException("RTSP video failed: " + responseCode);
}
}
/* Setup video */
if (session.hasVideoTrack()) {
session.getVideoTrack().setStreamId(channelCount);
responseCode = setup(trackId++);
if (responseCode != RTSP_OK) {
close();
throw new IOException("RTSP audio setup failed: " + responseCode);
}
}
responseCode = record();
if (responseCode != RTSP_OK) {
close();
throw new IOException("RTSP record failed: " + responseCode);
}
}
public void close() throws IOException {
if (!isConnected())
return;
teardown();
super.close();
}
}

I tried to achieve the same result (but abandoned it due to lack of experience). My approach was to use ffmpeg and/or libav, because it already has a working RTMP stack. So in theory all you need is to route the video stream to an ffmpeg process, which will stream it to the server.
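A rough sketch of that idea (my illustration; it assumes you bundle an ffmpeg binary with your app, and the binary path, flags and RTMP URL are placeholders):

// Spawn a bundled ffmpeg binary that reads an H.264 elementary stream from stdin
// and publishes it over RTMP without re-encoding. Assumes this runs inside a Context.
private Process startFfmpegRelay() throws IOException {
    ProcessBuilder pb = new ProcessBuilder(
            getFilesDir() + "/ffmpeg",
            "-f", "h264", "-i", "pipe:0",
            "-c", "copy",
            "-f", "flv", "rtmp://example.com/live/stream");
    pb.redirectErrorStream(true);
    Process ffmpeg = pb.start();
    // Write the encoded frames from the capture pipeline into ffmpeg.getOutputStream().
    return ffmpeg;
}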

Is there a reason for using 3gp on the client side? With mp4 (with the MOOV atom set in the header) you can read the temp file in chunks and send them over to the server; there will likely be a slight time delay, though, and it all depends on your connection speed as well. Your RTSP server should be able to re-encode the mp4 back to 3gp for low-bandwidth viewing.
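A rough sketch of that chunked reading (my illustration; the file, chunk size and the sendChunk() upload call are placeholders):

// Tail the still-growing recording file and ship whatever new bytes have appeared.
private void pumpFile(File recording, AtomicBoolean stillRecording) throws Exception {
    RandomAccessFile in = new RandomAccessFile(recording, "r");
    byte[] chunk = new byte[64 * 1024];
    long offset = 0;
    while (stillRecording.get() || offset < in.length()) {
        in.seek(offset);
        int n = in.read(chunk);
        if (n > 0) {
            sendChunk(chunk, n);     // hypothetical call that uploads the bytes
            offset += n;
        } else {
            Thread.sleep(200);       // wait for the recorder to append more data
        }
    }
    in.close();
}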

At this point, if I had to accept camera (raw stream) input and immediately make it available to a set of clients, I would go the Google Hangouts route and use WebRTC. See the Ondello 'platform' section for the toolset/SDK. During your evaluation, you should also look at the comparative merits of WebRTC vs RTSP.
IMO, with its statefulness, RTSP will be a nightmare behind firewalls and with NAT. AFAIK, on 3G/4G the use of RTP in third-party apps is a bit risky.
That said, I put up on git an old android/rtp/rtsp/sdp project using libs from netty and 'efflux'. I think that project was trying to retrieve and play just the audio track within the container (the video track was ignored and not pulled over the network) from YouTube videos, all of which were encoded for RTSP at the time. I think there were some packet and frame header issues, and I got fed up with RTSP and dropped it.
If you must pursue RTP/RTSP, some of the packet- and frame-level stuff that other posters have mentioned is right there in the android classes and in the test cases that come with efflux.

And here is the RTSP session class. It uses the RTSP socket to talk to the media server. Its purpose is also to hold session parameters, such as which streams it can send (video and/or audio), the packet queues, and some of the audio/video sync code.
The interface it uses:
package com.example.android.streaming.streaming.rtsp;
public interface PacketListener {
public void onPacketReceived(Packet p);
}
The Session class itself:
package com.example.android.streaming.streaming;
import static java.util.EnumSet.of;
import java.io.IOException;
import java.util.EnumSet;
import java.util.concurrent.BlockingDeque;
import java.util.concurrent.LinkedBlockingDeque;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;
import android.app.Activity;
import android.content.SharedPreferences;
import android.hardware.Camera;
import android.hardware.Camera.CameraInfo;
import android.os.SystemClock;
import android.preference.PreferenceManager;
import android.util.Log;
import android.view.SurfaceHolder;
import com.example.android.streaming.BuildConfig;
import com.example.android.streaming.StreamingApp;
import com.example.android.streaming.streaming.audio.AACStream;
import com.example.android.streaming.streaming.rtsp.Packet;
import com.example.android.streaming.streaming.rtsp.Packet.PacketType;
import com.example.android.streaming.streaming.rtsp.PacketListener;
import com.example.android.streaming.streaming.rtsp.RtspSocket;
import com.example.android.streaming.streaming.video.H264Stream;
import com.example.android.streaming.streaming.video.VideoConfig;
import com.example.android.streaming.streaming.video.VideoStream;
public class Session implements PacketListener, Runnable {
public final static int MESSAGE_START = 0x03;
public final static int MESSAGE_STOP = 0x04;
public final static int VIDEO_H264 = 0x01;
public final static int AUDIO_AAC = 0x05;
public final static int VIDEO_TRACK = 1;
public final static int AUDIO_TRACK = 0;
private static VideoConfig defaultVideoQuality = VideoConfig.defaultVideoQualiy.clone();
private static int defaultVideoEncoder = VIDEO_H264, defaultAudioEncoder = AUDIO_AAC;
private static Session sessionUsingTheCamera = null;
private static Session sessionUsingTheCamcorder = null;
private static int startedStreamCount = 0;
private int sessionTrackCount = 0;
private static SurfaceHolder surfaceHolder;
private Stream[] streamList = new Stream[2];
protected RtspSocket socket = null;
private Activity context = null;
private String host = null;
private String path = null;
private String user = null;
private String pass = null;
private int port;
public interface SessionListener {
public void startSession(Session session);
public void stopSession(Session session);
};
public Session(Activity context, String host, int port, String path, String user, String pass) {
this.context = context;
this.host = host;
this.port = port;
this.path = path;
this.user = user;
this.pass = pass;
}
public boolean isConnected() {
return socket != null && socket.isConnected();
}
/**
* Connect to rtsp server and start new session. This should be called when
* all the streams are added so that proper sdp can be generated.
*/
public void connect() throws IOException {
try {
socket = new RtspSocket();
socket.connect(host, port, this);
} catch (IOException e) {
socket = null;
throw e;
}
}
public void close() throws IOException {
if (socket != null) {
socket.close();
socket = null;
}
}
public static void setDefaultVideoQuality(VideoConfig quality) {
defaultVideoQuality = quality;
}
public static void setDefaultAudioEncoder(int encoder) {
defaultAudioEncoder = encoder;
}
public static void setDefaultVideoEncoder(int encoder) {
defaultVideoEncoder = encoder;
}
public static void setSurfaceHolder(SurfaceHolder sh) {
surfaceHolder = sh;
}
public boolean hasVideoTrack() {
return getVideoTrack() != null;
}
public MediaStream getVideoTrack() {
return (MediaStream) streamList[VIDEO_TRACK];
}
public void addVideoTrack(Camera camera, CameraInfo info) throws IllegalStateException, IOException {
addVideoTrack(camera, info, defaultVideoEncoder, defaultVideoQuality, false);
}
public synchronized void addVideoTrack(Camera camera, CameraInfo info, int encoder, VideoConfig quality,
boolean flash) throws IllegalStateException, IOException {
if (isCameraInUse())
throw new IllegalStateException("Camera already in use by another client.");
Stream stream = null;
VideoConfig.merge(quality, defaultVideoQuality);
switch (encoder) {
case VIDEO_H264:
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "Video streaming: H.264");
SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context.getApplicationContext());
stream = new H264Stream(camera, info, this, prefs);
break;
}
if (stream != null) {
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "Quality is: " + quality.resX + "x" + quality.resY + "px " + quality.framerate
+ "fps, " + quality.bitrate + "bps");
((VideoStream) stream).setVideoQuality(quality);
((VideoStream) stream).setPreviewDisplay(surfaceHolder.getSurface());
streamList[VIDEO_TRACK] = stream;
sessionUsingTheCamera = this;
sessionTrackCount++;
}
}
public boolean hasAudioTrack() {
return getAudioTrack() != null;
}
public MediaStream getAudioTrack() {
return (MediaStream) streamList[AUDIO_TRACK];
}
public void addAudioTrack() throws IOException {
addAudioTrack(defaultAudioEncoder);
}
public synchronized void addAudioTrack(int encoder) throws IOException {
if (sessionUsingTheCamcorder != null)
throw new IllegalStateException("Audio device is already in use by another client.");
Stream stream = null;
switch (encoder) {
case AUDIO_AAC:
if (android.os.Build.VERSION.SDK_INT < 14)
throw new IllegalStateException("This device does not support AAC.");
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "Audio streaming: AAC");
SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context.getApplicationContext());
stream = new AACStream(this, prefs);
break;
}
if (stream != null) {
streamList[AUDIO_TRACK] = stream;
sessionUsingTheCamcorder = this;
sessionTrackCount++;
}
}
public synchronized String getSDP() throws IllegalStateException, IOException {
StringBuilder sdp = new StringBuilder();
sdp.append("v=0\r\n");
/*
* The RFC 4566 (5.2) suggests to use an NTP timestamp here but we will
* simply use a UNIX timestamp.
*/
//sdp.append("o=- " + timestamp + " " + timestamp + " IN IP4 127.0.0.1\r\n");
sdp.append("o=- 0 0 IN IP4 127.0.0.1\r\n");
sdp.append("s=Vedroid\r\n");
sdp.append("c=IN IP4 " + host + "\r\n");
sdp.append("i=N/A\r\n");
sdp.append("t=0 0\r\n");
sdp.append("a=tool:Vedroid RTP\r\n");
int payload = 96;
int trackId = 1;
for (int i = 0; i < streamList.length; i++) {
if (streamList[i] != null) {
streamList[i].setPayloadType(payload++);
sdp.append(streamList[i].generateSDP());
sdp.append("a=control:trackid=" + trackId++ + "\r\n");
}
}
return sdp.toString();
}
public String getDest() {
return host;
}
public int getTrackCount() {
return sessionTrackCount;
}
public static boolean isCameraInUse() {
return sessionUsingTheCamera != null;
}
/** Indicates whether or not the microphone is being used in a session. **/
public static boolean isMicrophoneInUse() {
return sessionUsingTheCamcorder != null;
}
private SessionListener listener = null;
public synchronized void prepare(int trackId) throws IllegalStateException, IOException {
Stream stream = streamList[trackId];
if (stream != null && !stream.isStreaming())
stream.prepare();
}
public synchronized void start(int trackId) throws IllegalStateException, IOException {
Stream stream = streamList[trackId];
if (stream != null && !stream.isStreaming()) {
stream.start();
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "Started " + (trackId == VIDEO_TRACK ? "video" : "audio") + " channel.");
// if (++startedStreamCount == 1 && listener != null)
// listener.startSession(this);
}
}
public void startAll(SessionListener listener) throws IllegalStateException, IOException {
this.listener = listener;
startThread();
for (int i = 0; i < streamList.length; i++)
prepare(i);
/*
* Important to start video capture before audio capture. This makes
* audio/video de-sync smaller.
*/
for (int i = 0; i < streamList.length; i++)
start(streamList.length - i - 1);
}
public synchronized void stopAll() {
for (int i = 0; i < streamList.length; i++) {
if (streamList[i] != null && streamList[i].isStreaming()) {
streamList[i].stop();
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "Stopped " + (i == VIDEO_TRACK ? "video" : "audio") + " channel.");
if (--startedStreamCount == 0 && listener != null)
listener.stopSession(this);
}
}
stopThread();
this.listener = null;
if (BuildConfig.DEBUG)
Log.d(StreamingApp.TAG, "Session stopped.");
}
public synchronized void flush() {
for (int i = 0; i < streamList.length; i++) {
if (streamList[i] != null) {
streamList[i].release();
if (i == VIDEO_TRACK)
sessionUsingTheCamera = null;
else
sessionUsingTheCamcorder = null;
streamList[i] = null;
}
}
}
public String getPath() {
return path;
}
public String getUser() {
return user;
}
public String getPass() {
return pass;
}
private BlockingDeque<Packet> audioQueue = new LinkedBlockingDeque<Packet>(MAX_QUEUE_SIZE);
private BlockingDeque<Packet> videoQueue = new LinkedBlockingDeque<Packet>(MAX_QUEUE_SIZE);
private final static int MAX_QUEUE_SIZE = 1000;
private void sendPacket(Packet p) {
try {
MediaStream channel = (p.type == PacketType.AudioPacketType ? getAudioTrack() : getVideoTrack());
p.packetizer.send(p, socket, channel.getPayloadType(), channel.getStreamId());
getPacketQueue(p.type).remove(p);
} catch (IOException e) {
Log.e(StreamingApp.TAG, "Failed to send packet: " + e.getMessage());
}
}
private final ReentrantLock queueLock = new ReentrantLock();
private final Condition morePackets = queueLock.newCondition();
private AtomicBoolean stopped = new AtomicBoolean(true);
private Thread t = null;
private final void wakeupThread() {
queueLock.lock();
try {
morePackets.signalAll();
} finally {
queueLock.unlock();
}
}
public void startThread() {
if (t == null) {
t = new Thread(this);
stopped.set(false);
t.start();
}
}
public void stopThread() {
stopped.set(true);
if (t != null) {
t.interrupt();
try {
wakeupThread();
t.join();
} catch (InterruptedException e) {
}
t = null;
}
audioQueue.clear();
videoQueue.clear();
}
private long getStreamEndSampleTimestamp(BlockingDeque<Packet> queue) {
long sample = 0;
try {
sample = queue.getLast().getSampleTimestamp() + queue.getLast().getFrameLen();
} catch (Exception e) {
}
return sample;
}
private PacketType syncType = PacketType.AnyPacketType;
private boolean aligned = false;
private final BlockingDeque<Packet> getPacketQueue(PacketType type) {
return (type == PacketType.AudioPacketType ? audioQueue : videoQueue);
}
private void setPacketTimestamp(Packet p) {
/* Don't sync on SEI packet. */
if (!aligned && p.type != syncType) {
long shift = getStreamEndSampleTimestamp(getPacketQueue(syncType));
Log.w(StreamingApp.TAG, "Set shift +" + shift + "ms to "
+ (p.type == PacketType.VideoPacketType ? "video" : "audio") + " stream ("
+ (getPacketQueue(syncType).size() + 1) + ") packets.");
p.setTimestamp(p.getDuration(shift));
p.setSampleTimestamp(shift);
if (listener != null)
listener.startSession(this);
aligned = true;
} else {
p.setTimestamp(p.packetizer.getTimestamp());
p.setSampleTimestamp(p.packetizer.getSampleTimestamp());
}
p.packetizer.setSampleTimestamp(p.getSampleTimestamp() + p.getFrameLen());
p.packetizer.setTimestamp(p.getTimestamp() + p.getDuration());
// if (BuildConfig.DEBUG) {
// Log.d(StreamingApp.TAG, (p.type == PacketType.VideoPacketType ? "Video" : "Audio") + " packet timestamp: "
// + p.getTimestamp() + "; sampleTimestamp: " + p.getSampleTimestamp());
// }
}
/*
* Drop first frames if len is less than this. First sync frame will have
* frame len >= 10 ms.
*/
private final static int MinimalSyncFrameLength = 15;
@Override
public void onPacketReceived(Packet p) {
queueLock.lock();
try {
/*
* We always synchronize on video stream. Some devices have video
* coming faster than audio, this is ok. Audio stream time stamps
* will be adjusted. Other devices that have audio come first will
* see all audio packets dropped until first video packet comes.
* Then upon first video packet we again adjust the audio stream by
* time stamp of the last video packet in the queue.
*/
if (syncType == PacketType.AnyPacketType && p.type == PacketType.VideoPacketType
&& p.getFrameLen() >= MinimalSyncFrameLength)
syncType = p.type;
if (syncType == PacketType.VideoPacketType) {
setPacketTimestamp(p);
if (getPacketQueue(p.type).size() > MAX_QUEUE_SIZE - 1) {
Log.w(StreamingApp.TAG, "Queue (" + p.type + ") is full, dropping packet.");
} else {
/*
* Wakeup sending thread only if channels synchronization is
* already done.
*/
getPacketQueue(p.type).add(p);
if (aligned)
morePackets.signalAll();
}
}
} finally {
queueLock.unlock();
}
}
private boolean hasMorePackets(EnumSet<Packet.PacketType> mask) {
boolean gotPackets;
if (mask.contains(PacketType.AudioPacketType) && mask.contains(PacketType.VideoPacketType)) {
gotPackets = (audioQueue.size() > 0 && videoQueue.size() > 0) && aligned;
} else {
if (mask.contains(PacketType.AudioPacketType))
gotPackets = (audioQueue.size() > 0);
else if (mask.contains(PacketType.VideoPacketType))
gotPackets = (videoQueue.size() > 0);
else
gotPackets = (videoQueue.size() > 0 || audioQueue.size() > 0);
}
return gotPackets;
}
private void waitPackets(EnumSet<Packet.PacketType> mask) {
queueLock.lock();
try {
do {
if (!stopped.get() && !hasMorePackets(mask)) {
try {
morePackets.await();
} catch (InterruptedException e) {
}
}
} while (!stopped.get() && !hasMorePackets(mask));
} finally {
queueLock.unlock();
}
}
private void sendPackets() {
boolean send;
Packet a, v;
/*
* Wait for any type of packet and send asap. With time stamps correctly
* set, the real send moment is not important and may be quite
* different. Media server will only check for time stamps.
*/
waitPackets(of(PacketType.AnyPacketType));
v = videoQueue.peek();
if (v != null) {
sendPacket(v);
do {
a = audioQueue.peek();
if ((send = (a != null && a.getSampleTimestamp() <= v.getSampleTimestamp())))
sendPacket(a);
} while (!stopped.get() && send);
} else {
a = audioQueue.peek();
if (a != null)
sendPacket(a);
}
}
@Override
public void run() {
Log.w(StreamingApp.TAG, "Session thread started.");
/*
* Wait for both types of front packets to come and synchronize on each
* other.
*/
waitPackets(of(PacketType.AudioPacketType, PacketType.VideoPacketType));
while (!stopped.get())
sendPackets();
Log.w(StreamingApp.TAG, "Flushing session queues.");
Log.w(StreamingApp.TAG, " " + audioQueue.size() + " audio packets.");
Log.w(StreamingApp.TAG, " " + videoQueue.size() + " video packets.");
long start = SystemClock.elapsedRealtime();
while (audioQueue.size() > 0 || videoQueue.size() > 0)
sendPackets();
Log.w(StreamingApp.TAG, "Session thread stopped.");
Log.w(StreamingApp.TAG, "Queues flush took " + (SystemClock.elapsedRealtime() - start) + " ms.");
}
}
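For completeness, a rough usage sketch of the two classes above (names taken from the posted code; the host, port, path and credentials are placeholders, and error handling is omitted):

// Build a session, add the tracks, negotiate with the server, then start streaming.
Session.setSurfaceHolder(surfaceView.getHolder());    // preview surface for the camera
Session session = new Session(activity, "media.example.com", 554, "/live/stream", "user", "pass");
session.addVideoTrack(camera, cameraInfo);             // opened Camera + its CameraInfo
session.addAudioTrack();
session.connect();                                     // ANNOUNCE / SETUP / RECORD against the RTSP server
session.startAll(listener);                            // start capture and the packet-sending thread
// when the user stops recording:
session.stopAll();
session.close();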

Check this answer: Video streaming over WIFI?
Then, if you want to watch the live stream on an Android phone, include the VLC plugin in your application and connect through the Real Time Streaming Protocol (RTSP).
Intent i = new Intent("org.videolan.vlc.VLCApplication.gui.video.VideoPlayerActivity");
i.setAction(Intent.ACTION_VIEW);
i.setData(Uri.parse("rtsp://10.0.0.179:8086/"));
startActivity(i);
If you have installed VLC on your Android phone, then you can stream using an intent and pass the IP address and port number as shown above.

Related

decode h264 raw stream using mediacodec

I receive H.264 data from a server, and I want to decode this stream using MediaCodec and a TextureView on Android. I get the data from the server, parse it to extract the SPS, the PPS and the video frame data, then pass this data to the MediaCodec, but dequeueOutputBuffer(info, 100000) always returns -1 and I get "dequeueOutputBuffer timed out".
Any help please, I have been stuck on this issue for three weeks.
This is the code used to decode the video frames.
public class H264PlayerActivity extends AppCompatActivity implements TextureView.SurfaceTextureListener {
private TextureView m_surface;// View that contains the Surface Texture
private H264Provider provider;// Object that connects to our server and gets H264 frames
private MediaCodec m_codec;// Media decoder
// private DecodeFramesTask m_frameTask;// AsyncTask that takes H264 frames and uses the decoder to update the Surface Texture
// the channel used to receive the partner's video
private ZMQ.Socket subscriber = null;
private ZMQ.Context context;
// thread handling the video reception
// byte[] byte_SPSPPS = null;
//byte[] byte_Frame = null;
public static String stringSubscribe=null;
public static String myIpAcquisition=null;
public static byte[] byte_SPSPPS = null;
public static byte[] byte_Frame = null;
boolean isIframe = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.h264player_activity);
Bundle extras = getIntent().getExtras();
if(extras!=null)
{
stringSubscribe=extras.getString("stringSubscribe");
myIpAcquisition=(extras.getString("myIpAcquisition"));
}
// Get a referance to the TextureView in the UI
m_surface = (TextureView) findViewById(R.id.textureView);
// Add this class as a call back so we can catch the events from the Surface Texture
m_surface.setSurfaceTextureListener(this);
}
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
@Override
// Invoked when a TextureView's SurfaceTexture is ready for use
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
// when the surface is ready, we make a H264 provider Object. When its constructor runs it starts an AsyncTask to log into our server and start getting frames
provider = new H264Provider(stringSubscribe, myIpAcquisition,byte_SPSPPS,byte_Frame);
}
@Override
// Invoked when the SurfaceTexture's buffers size changed
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
}
@Override
// Invoked when the specified SurfaceTexture is about to be destroyed
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return false;
}
@Override
// Invoked when the specified SurfaceTexture is updated through updateTexImage()
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
private class H264Provider {
String stringSubscribe = "";
String myIpAcquisition = "";
byte[] byte_SPSPPS = null;
byte[] byte_PPS = null;
byte[] byte_Frame = null;
H264Provider(String stringSubscribe, String myIpAcquisition, byte[] byte_SPS, byte[] byte_Frame) {
this.stringSubscribe = stringSubscribe;
this.myIpAcquisition = myIpAcquisition;
this.byte_SPSPPS = byte_SPS;
this.byte_PPS = byte_PPS;
this.byte_Frame = byte_Frame;
System.out.println(" subscriber client started");
//SetUpConnection setup=new SetUpConnection();
// setup.execute();
PlayerThread mPlayer = new PlayerThread();
mPlayer.start();
}
void release(){
// close ØMQ socket
subscriber.close();
//terminate 0MQ context
context.term();
}
byte[] getCSD( ) {
return byte_SPSPPS;
}
byte[] nextFrame( ) {
return byte_Frame;
}
private class PlayerThread extends Thread
{
public PlayerThread()
{
System.out.println(" subscriber client started");
}
@TargetApi(Build.VERSION_CODES.LOLLIPOP)
@RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN)
@Override
public void run() {
/******************************************ZMQ****************************/
// Prepare our context and subscriber
ZMQ.Context context = ZMQ.context(1);
//create 0MQ socket
ZMQ.Socket subscriber = context.socket(ZMQ.SUB);
//create outgoing connection from socket
String address = "tcp://" + myIpAcquisition + ":xxxx";
Boolean bbbb = subscriber.connect(address);
subscriber.setHWM(20);// the number of messages to queue.
Log.e("zmq_tag", "connect connect " + bbbb);
//boolean bbbb = subscriber.setSocketOpt(zmq.ZMQ.ZMQ_SNDHWM, 1);
subscriber.subscribe(stringSubscribe.getBytes(ZMQ.CHARSET));
Log.e("zmq_tag", " zmq stringSubscribe " + stringSubscribe);
boolean bRun = true;
while (bRun) {
ZMsg msg = ZMsg.recvMsg(subscriber);
String string_SPS = null;
String string_PPS = null;
String SPSPPS = null;
String string_Frame = null;
if (msg != null) {
// create a video message out of the zmq message
VideoMessage oVideoMsg = VideoMessage.fromZMsg(msg);
// wait until get Iframe
String szInfoPublisher = new String(oVideoMsg.szInfoPublisher);
Log.e("zmq_tag", "szInfoPublisher " + szInfoPublisher);
if (szInfoPublisher.contains("0000000167")) {
isIframe = true;
String[] split_IFrame = szInfoPublisher.split("0000000165");
String SPS__PPS = split_IFrame[0];
String [] split_SPSPPS=SPS__PPS.split("0000000167");
SPSPPS="0000000167" + split_SPSPPS[1];
Log.e("zmq_tag", "SPS+PPS " + SPSPPS);
String iFrame = "0000000165" + split_IFrame[1];
Log.e("zmq_tag", "IFrame " + iFrame);
string_Frame = iFrame;
} else {
if ((szInfoPublisher.contains("0000000161")||szInfoPublisher.contains("0000000141")) && isIframe) {
if (szInfoPublisher.contains("0000000161"))
{
String[] split_IFrame = szInfoPublisher.split("0000000161");
String newMSG = "0000000161" + split_IFrame[1];
Log.e("zmq_tag", " P Frame " + newMSG);
string_Frame = newMSG;
} else
if (szInfoPublisher.contains("0000000141"))
{
String[] split_IFrame = szInfoPublisher.split("0000000141");
String newMSG = "0000000141" + split_IFrame[1];
Log.e("zmq_tag", " P Frame " + newMSG);
string_Frame = newMSG;
}
} else {
isIframe = false;
}
}
}
if (SPSPPS != null) {
byte_SPSPPS = SPSPPS.getBytes();
Log.e("zmq_tag", " byte_SPSPPS " + new String(byte_SPSPPS));
}
if (string_Frame != null) {
byte_Frame = string_Frame.getBytes();
Log.e("zmq_tag", " byte_Frame " + new String(byte_Frame));
}
if(SPSPPS != null) {
// Create the format settinsg for the MediaCodec
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);// MIMETYPE: a two-part identifier for file formats and format contents
// Set the PPS and SPS frame
format.setByteBuffer("csd-0", ByteBuffer.wrap(byte_SPSPPS));
// Set the buffer size
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 100000);
try {
// Get an instance of MediaCodec and give it its Mime type
m_codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
// Configure the Codec
m_codec.configure(format, new Surface(m_surface.getSurfaceTexture()), null, 0);
// Start the codec
m_codec.start();
// Create the AsyncTask to get the frames and decode them using the Codec
while (!Thread.interrupted()) {
// Get the next frame
byte[] frame = byte_Frame;
Log.e("zmq_tag", " frame " + new String(frame));
// Now we need to give it to the Codec to decode into the surface
// Get the input buffer from the decoder
int inputIndex = m_codec.dequeueInputBuffer(1);// Pass in -1 here as in this example we don't have a playback time reference
Log.e("zmq_tag", "inputIndex " + inputIndex);
// If the buffer number is valid use the buffer with that index
if (inputIndex >= 0) {
ByteBuffer buffer =m_codec.getInputBuffer(inputIndex);
buffer.put(frame);
// tell the decoder to process the frame
m_codec.queueInputBuffer(inputIndex, 0, frame.length, 0, 0);
}
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outputIndex = m_codec.dequeueOutputBuffer(info, 100000);
Log.e("zmq_tag", "outputIndex " + outputIndex);
if (outputIndex >= 0) {
m_codec.releaseOutputBuffer(outputIndex, true);
}
// wait for the next frame to be ready, our server makes a frame every 250ms
try {
Thread.sleep(250);
} catch (Exception e) {
e.printStackTrace();
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
// close ØMQ socket
subscriber.close();
//terminate 0MQ context
context.term();
}
}
Sorry, I can't comment, but I see some probable mistakes in your code:
Your KEY_MAX_INPUT_SIZE is wrong: it must be at least Height * Width, and in your case Height * Width = 1920 * 1080 = 2073600 > 100000. You feed your decoder input buffer with data that can be larger than 100000 bytes, and since the decoder wants whole NALUs it probably won't like that.
You do not clear the input buffer before pushing data (is that really needed?).
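For reference, a corrected configuration along those lines (a sketch reusing the variable names from the question, at the 1920x1080 resolution from the original code) might look like:

// Sketch of the corrected decoder setup for a 1920x1080 H.264 stream.
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
format.setByteBuffer("csd-0", ByteBuffer.wrap(byte_SPSPPS));       // SPS + PPS from the stream
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 1920 * 1080);    // at least width * height
m_codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
m_codec.configure(format, new Surface(m_surface.getSurfaceTexture()), null, 0);
m_codec.start();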

Asynctask fast on virtual device, slow on real device

I am using a service to download files and extract them if they are archived.
The extraction method is wrapped in an AsyncTask to improve the performance of the extraction process.
My problem is that when I run the app on the virtual device everything is fine and the extraction process is really fast, but as soon as I test it on a real device (Nexus 9 tablet, Android 6.x) the extraction process is really slow and takes minutes to complete.
Is there anything I can do to speed up the extraction process?
I execute the AsyncTask with: new UnRarTask(targetAppName).execute();
Below is the relevant piece of code:
public class DownloadTask implements Runnable {
private DownloadService service;
private DownloadManager downloadManager;
protected void init(DownloadService service, Intent intent) {
this.service = service;
downloadManager = (DownloadManager) MyApp_.getInstance().
getSystemService(Activity.DOWNLOAD_SERVICE);
DownloadRequest downloadRequest = intent.getParcelableExtra(DownloadService
.DOWNLOAD_REQUEST);
}
private class UnRarTask extends AsyncTask<Void, Integer, String> {
String rarPath = null;
int countRar = 0;
long copiedbytes = 0, totalbytes = 0;
Archive archive = null;
FileHeader fileHeader = null;
File archiveFile;
List<FileHeader> headers;
UnRarTask(String one) {
this.archiveFile = new File(one);
}
@Override
protected String doInBackground(Void... params) {
try {
archive = new Archive(new FileVolumeManager(archiveFile));
} catch (RarException | IOException e) {
e.printStackTrace();
}
String fileName = archiveFile.getName();
String absolutePath = archiveFile.getAbsolutePath();
String archiveDirectoryFileName = absolutePath.substring(0, absolutePath.indexOf(fileName));
if (archive != null) {
fileHeader = archive.nextFileHeader();
headers = archive.getFileHeaders();
for (FileHeader fh : headers) {
totalbytes = totalbytes + fh.getFullUnpackSize();
}
}
while (fileHeader != null) {
BufferedInputStream inputStream;
try {
inputStream = new BufferedInputStream(archive.getInputStream(fileHeader));
String extractedFileName = fileHeader.getFileNameString().trim();
String fullExtractedFileName = archiveDirectoryFileName + extractedFileName;
File extractedFile = new File(fullExtractedFileName);
FileOutputStream fileOutputStream = new FileOutputStream(extractedFile);
BufferedOutputStream flout = new BufferedOutputStream(fileOutputStream, BUFFER_SIZE);
if (extractedFile.getName().toLowerCase().endsWith(".mp3")
|| extractedFile.getName().toLowerCase().endsWith(".epub")
|| extractedFile.getName().toLowerCase().endsWith(".pdf")
|| extractedFile.getName().toLowerCase().endsWith(".mobi")
|| extractedFile.getName().toLowerCase().endsWith(".azw3")
|| extractedFile.getName().toLowerCase().endsWith(".m4b")
|| extractedFile.getName().toLowerCase().endsWith(".apk")) {
rarPath = extractedFile.getPath();
countRar++;
}
int len;
byte buf[] = new byte[BUFFER_SIZE];
while ((len = inputStream.read(buf)) > 0) {
//fileOutputStream.write(buf, 0, len);
copiedbytes = copiedbytes + len;
int progress = (int) ((copiedbytes / (float) totalbytes) * 100);
if (progress > lastProgress) {
lastProgress = progress;
service.showUpdateProgressNotification(downloadId, appName, progress,
"Extracting rar archive: " + lastProgress + " % completed", downloadStart);
}
}
archive.extractFile(fileHeader, flout);
flout.flush();
flout.close();
fileOutputStream.flush();
fileOutputStream.close();
inputStream.close();
fileHeader = archive.nextFileHeader();
} catch (RarException | IOException e) {
e.printStackTrace();
}
}
if (countRar == 0) {
filePath = "Error";
broadcastFailed();
}
if (copiedbytes == totalbytes) {
if (archive != null)
archive.close();
}
return null;
}
}
}

android image not appearing when getting bytes from socket and using bitmap creation

I have a program in which I'm trying to download the content of an image file from a server. I'm using java socket to download it. After downloading, I use BitmapFactory.decodeByteArray() to create a bitmap.
At the server side, the file is a .jpg file and it's only about 180 KBytes, so I don't need to try scaling it. I can see through logs that the exact number of bytes in the file is received by my image download code. I store all the bytes in a byte[] array and then convert it into a bitmap.
The ImageView is initially hidden and is supposed to be made visible after the image is populated. But BitmapFactory.decodeByteArray() always returns null. I did see some other posts about null bitmaps, but nothing seems to answer this problem.
I don't want to use any external library just for this, so please do not suggest other libraries. Can someone spot any problem with the code? The server-side program is also mine, and I know that part is correct because browsers are able to download the same image file using it. I have copy-pasted the code below.
public class ImageDownloader {
private Socket sockToSrvr;
private PrintWriter strmToSrvr;
private BufferedInputStream strmFromSrvr;
private String srvrAddr;
private int port;
private String remoteFile;
private Context ctxt;
private Bitmap imgBmap;
private View parkSpotImgVwHldr;
private View mngAndFndVwHldr;
private View parkSpotImgVw;
public ImageDownloader(Context c) {
srvrAddr = KloudSrvr.srvrIp();
port = KloudSrvr.port();
sockToSrvr = null;
strmFromSrvr = null;
strmToSrvr = null;
remoteFile = null;
ctxt = c;
imgBmap = null;
parkSpotImgVwHldr = null;
mngAndFndVwHldr = null;
parkSpotImgVw = null;
}
public void downloadFile(String remf, View parkSpotImgVwHldrVal,
View mngAndFndVwHldrVal, View parkSpotImgVwVal) {
remoteFile = remf;
parkSpotImgVwHldr = parkSpotImgVwHldrVal;
mngAndFndVwHldr = mngAndFndVwHldrVal;
parkSpotImgVw = parkSpotImgVwVal;
Thread dwnThrd = new Thread() {
@Override
public void run() {
imgBmap = null;
openServerConnection(); sendReq(); doDownload(); closeServerConnection();
((Activity)ctxt).runOnUiThread(new Runnable() {
public void run() {
((Activity)ctxt).runOnUiThread(new Runnable() {
public void run() {
mngAndFndVwHldr.setVisibility(View.GONE);
parkSpotImgVwHldr.setVisibility(View.VISIBLE);
Toast.makeText(ctxt, "completed", Toast.LENGTH_LONG).show();
}
});
}
});
}
};
dwnThrd.start();
}
private void sendReq() {
if(strmToSrvr == null) return;
String req = "GET /downloadFile " + remoteFile + " HTTP/1.1\r\n\r\n";
Log.d("IMG-DWNL-LOG: ", "writing req msg to socket " + req);
strmToSrvr.write(req); strmToSrvr.flush();
}
private void doDownload() {
boolean gotContLen = false;
int contLen = 0;
while(true) {
String inLine = getLine(strmFromSrvr); if(inLine == null) break;
if((gotContLen == true) &&
(inLine.replace("\r", "").replace("\n", "").isEmpty() == true)) break;
if(inLine.trim().startsWith("Content-Length:") == true) {
// an empty line after this signifies start of content
String s = inLine.replace("Content-Length:", "").trim();
try {contLen = Integer.valueOf(s); gotContLen = true; continue;}
catch(NumberFormatException nfe) {contLen = 0;}
}
}
if((gotContLen == false) || (contLen <= 0)) return;
byte[] imgByts = new byte[contLen];
int totRdByts = 0, rdByts, chnk = 1024, avlByts;
while(true) {
try {
avlByts = strmFromSrvr.available(); if(avlByts < 0) break;
if(avlByts == 0) {try {Thread.sleep(1000);} catch(InterruptedException ie) {} continue;}
rdByts = (avlByts < chnk) ? avlByts : chnk;
rdByts = strmFromSrvr.read(imgByts, totRdByts, rdByts); if(rdByts < 0) break;
if(rdByts == 0) {try {Thread.sleep(1000);} catch(InterruptedException ie) {} continue;}
totRdByts += rdByts;
if(totRdByts >= contLen) break;
} catch(IOException ioe) {return;}
}
if(totRdByts < contLen) {
Log.d("IMG-DWNL-LOG: ", "error - bytes read " + totRdByts
+ " less than content length " + contLen);
return;
}
if(totRdByts <= 0) return;
Log.d("IMG-DWNL-LOG: ", "read all image bytes successfully, setting image into view");
BitmapFactory.Options options = new BitmapFactory.Options();
Bitmap bitmap = BitmapFactory.decodeByteArray(imgByts, 0, contLen, options);
if(bitmap == null) {Log.d("IMG-DWNL-LOG: ", "got a null bitmap");}
((ImageView)parkSpotImgVw).setImageBitmap(bitmap);
}
private void closeServerConnection() {
if(sockToSrvr == null) return;
if(strmFromSrvr != null) {
try {strmFromSrvr.close();}
catch(IOException e) {Log.d("IMG-DWNL-LOG: ", "Inp strm close exception");}
}
if(strmToSrvr != null) strmToSrvr.close();
try {sockToSrvr.close();}
catch(IOException e) {Log.d("IMG-DWNL-LOG: ", "Conn close exception");}
strmFromSrvr = null; strmToSrvr = null; sockToSrvr = null;
}
private void openServerConnection() {
try {sockToSrvr = new Socket(InetAddress.getByName(srvrAddr), port);}
catch(UnknownHostException e) {
Log.d("IMG-DWNL-LOG: ", "Unknown host exception"); sockToSrvr = null; return;
} catch(IOException e) {
Log.d("IMG-DWNL-LOG: ", "Server connect exception"); sockToSrvr = null; return;
}
Log.d("IMG-DWNL-LOG: ", "Connected to server");
try {
strmFromSrvr = new BufferedInputStream(sockToSrvr.getInputStream());
strmToSrvr = new PrintWriter(new BufferedWriter(new OutputStreamWriter
(sockToSrvr.getOutputStream())), true);
} catch(IOException e) {
closeServerConnection();
Log.d("IMG-DWNL-LOG: ", "Failed to open reader / writer. Closed the connection."); return;
}
}
private String getLine(BufferedInputStream dis) {
String outLine = "";
while(true) {
try {
int c = dis.read();
// end of stream: return what has been read so far, or null if nothing was read
if(c == -1) return (outLine.length() <= 0) ? null : outLine;
outLine += Character.toString((char)c);
if(c == '\n') return(outLine);
} catch(IOException e) {if(outLine.length() <= 0) return(null); return(outLine);}
}
}
}
I was making a mistake in assuming that a .jpg file's bytes can be decoded directly with the Android bitmap decoder. Apparently this is not the case. So I wrote the received bytes into a temporary file in the phone storage and then called BitmapFactory.decodeFile(), which returns a good bitmap and ends up showing the image.
So I have a working solution now.
Still, if anyone has a better suggestion for how to decode directly from the received bytes (which come from a .jpg file), I would be very interested to try it out, since that would be more efficient. Thanks.
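For reference, a rough sketch of the temp-file workaround described above (not the original code; cacheFile is a placeholder name, and imgByts, contLen, ctxt and parkSpotImgVw are the fields already used in the class; setImageBitmap should still be called on the UI thread):
// Sketch: persist the received JPEG bytes to a cache file, then decode the file.
try {
    File cacheFile = new File(ctxt.getCacheDir(), "parkspot.jpg");
    FileOutputStream fos = new FileOutputStream(cacheFile);
    try { fos.write(imgByts, 0, contLen); } finally { fos.close(); }
    Bitmap bmp = BitmapFactory.decodeFile(cacheFile.getAbsolutePath());
    if (bmp != null) ((ImageView) parkSpotImgVw).setImageBitmap(bmp);
} catch (IOException ioe) {
    Log.d("IMG-DWNL-LOG: ", "failed to write temp image file", ioe);
}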

how to upload a video to YouTube using my android app

How can I upload a video to a particular YouTube account from my Android app? I have followed https://github.com/youtube/yt-direct-lite-android but it is not very well documented.
I have tried
public class YoutubeUploader {
private static final String TAG = "YoutubeUploader";
// After creating project at http://www.appspot.com DEFAULT_YTD_DOMAIN == <Developers Console Project ID>.appspot.com [ You can find from Project -> Administration -> Application settings]
//public static final String DEFAULT_YTD_DOMAIN = "developerconsolid.appspot.com";
// I used Google APIs Console Project Title as Domain name:
//public static final String DEFAULT_YTD_DOMAIN_NAME = "Domain Name";
//From Google Developer Console from same project (Created by SHA1; project package)
//Example https://console.developers.google.com/project/apps~gtl-android-youtube-test/apiui/credential
public static final String DEVELOPER_KEY = "<MY_DEVELOPER_KEY>";
// CLIENT_ID == Google APIs Console Project Number:
public static final String CLIENT_ID = "157613";
public static final String YOUTUBE_AUTH_TOKEN_TYPE = "youtube";
private static final String AUTH_URL = "https://www.google.com/accounts/ClientLogin";
// Uploader's user-name and password
private static final String USER_NAME = "my#gmail.com";
private static final String PASSWORD = "mypassword";
private static final String INITIAL_UPLOAD_URL = "https://uploads.gdata.youtube.com/resumable/feeds/api/users/default/uploads";
private static String getClientAuthToken() {
try {
URL url = new URL(AUTH_URL);
HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
urlConnection.setRequestMethod("POST");
urlConnection.setDoOutput(true);
urlConnection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
String template = "Email=%s&Passwd=%s&service=%s&source=%s";
String userName = USER_NAME; // TODO
String password = PASSWORD; // TODO
String service = YOUTUBE_AUTH_TOKEN_TYPE;
String source = CLIENT_ID;
userName = URLEncoder.encode(userName, "UTF-8");
password = URLEncoder.encode(password, "UTF-8");
String loginData = String.format(template, userName, password, service, source);
OutputStreamWriter outStreamWriter = new OutputStreamWriter(urlConnection.getOutputStream());
outStreamWriter.write(loginData);
outStreamWriter.close();
int responseCode = urlConnection.getResponseCode();
if (responseCode != 200) {
Log.d(TAG, "Got an error response : " + responseCode + " " + urlConnection.getResponseMessage());
throw new IOException(urlConnection.getResponseMessage());
} else {
InputStream is = urlConnection.getInputStream();
InputStreamReader isr = new InputStreamReader(is);
BufferedReader br = new BufferedReader(isr);
String line = null;
while ((line = br.readLine()) != null) {
if (line.startsWith("Auth=")) {
String split[] = line.split("=");
String token = split[1];
Log.d(TAG, "Auth Token : " + token);
return token;
}
}
}
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
public static String upload(YoutubeUploadRequest uploadRequest, ProgressListner listner, Activity activity) {
totalBytesUploaded = 0;
String authToken = getClientAuthToken();
if(authToken != null) {
String uploadUrl = uploadMetaData(uploadRequest, authToken, activity, true);
File file = getFileFromUri(uploadRequest.getUri(), activity);
long currentFileSize = file.length();
int uploadChunk = 1024 * 1024 * 3; // 3MB
int start = 0;
int end = -1;
String videoId = null;
double fileSize = currentFileSize;
while (fileSize > 0) {
if (fileSize - uploadChunk > 0) {
end = start + uploadChunk - 1;
} else {
end = start + (int) fileSize - 1;
}
Log.d(TAG, String.format("start=%s end=%s total=%s", start, end, file.length()));
try {
videoId = gdataUpload(file, uploadUrl, start, end, authToken, listner);
fileSize -= uploadChunk;
start = end + 1;
} catch (IOException e) {
Log.d(TAG,"Error during upload : " + e.getMessage());
}
}
if (videoId != null) {
return videoId;
}
}
return null;
}
public static int totalBytesUploaded = 0;
@SuppressLint("DefaultLocale")
@SuppressWarnings("resource")
private static String gdataUpload(File file, String uploadUrl, int start, int end, String clientLoginToken, ProgressListner listner) throws IOException {
int chunk = end - start + 1;
int bufferSize = 4096;
byte[] buffer = new byte[bufferSize];
FileInputStream fileStream = new FileInputStream(file);
URL url = new URL(uploadUrl);
HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
urlConnection.setRequestProperty("Authorization", String.format("GoogleLogin auth=\"%s\"", clientLoginToken));
urlConnection.setRequestProperty("GData-Version", "2");
urlConnection.setRequestProperty("X-GData-Client", CLIENT_ID);
urlConnection.setRequestProperty("X-GData-Key", String.format("key=%s", DEVELOPER_KEY));
// some mobile proxies do not support PUT, using X-HTTP-Method-Override to get around this problem
urlConnection.setRequestMethod("POST");
urlConnection.setRequestProperty("X-HTTP-Method-Override", "PUT");
urlConnection.setDoOutput(true);
urlConnection.setFixedLengthStreamingMode(chunk);
urlConnection.setRequestProperty("Content-Type", "video/3gpp");
urlConnection.setRequestProperty("Content-Range", String.format("bytes %d-%d/%d", start, end,
file.length()));
Log.d(TAG, urlConnection.getRequestProperty("Content-Range"));
OutputStream outStreamWriter = urlConnection.getOutputStream();
fileStream.skip(start);
double currentFileSize = file.length();
int bytesRead;
int totalRead = 0;
while ((bytesRead = fileStream.read(buffer, 0, bufferSize)) != -1) {
outStreamWriter.write(buffer, 0, bytesRead);
totalRead += bytesRead;
totalBytesUploaded += bytesRead;
double percent = (totalBytesUploaded / currentFileSize) * 100;
if(listner != null){
listner.onUploadProgressUpdate((int) percent);
}
System.out.println("GTL You tube upload progress: " + percent + "%");
/*
Log.d(LOG_TAG, String.format(
"fileSize=%f totalBytesUploaded=%f percent=%f", currentFileSize,
totalBytesUploaded, percent));
*/
//dialog.setProgress((int) percent);
// TODO My settings
if (totalRead == (end - start + 1)) {
break;
}
}
outStreamWriter.close();
int responseCode = urlConnection.getResponseCode();
Log.d(TAG, "responseCode=" + responseCode);
Log.d(TAG, "responseMessage=" + urlConnection.getResponseMessage());
try {
if (responseCode == 201) {
String videoId = parseVideoId(urlConnection.getInputStream());
return videoId;
} else if (responseCode == 200) {
Set<String> keySet = urlConnection.getHeaderFields().keySet();
String keys = urlConnection.getHeaderFields().keySet().toString();
Log.d(TAG, String.format("Headers keys %s.", keys));
for (String key : keySet) {
Log.d(TAG, String.format("Header key %s value %s.", key, urlConnection.getHeaderField(key)));
}
Log.w(TAG, "Received 200 response during resumable uploading");
throw new IOException(String.format("Unexpected response code : responseCode=%d responseMessage=%s", responseCode,
urlConnection.getResponseMessage()));
} else {
if ((responseCode + "").startsWith("5")) {
String error = String.format("responseCode=%d responseMessage=%s", responseCode,
urlConnection.getResponseMessage());
Log.w(TAG, error);
// TODO - this exception will trigger retry mechanism to kick in
// TODO - even though it should not, consider introducing a new type so
// TODO - resume does not kick in upon 5xx
throw new IOException(error);
} else if (responseCode == 308) {
// OK, the chunk completed successfully
Log.d(TAG, String.format("responseCode=%d responseMessage=%s", responseCode,
urlConnection.getResponseMessage()));
} else {
// TODO - this case is not handled properly yet
Log.w(TAG, String.format("Unexpected return code : %d %s while uploading :%s", responseCode,
urlConnection.getResponseMessage(), uploadUrl));
}
}
} catch (ParserConfigurationException e) {
e.printStackTrace();
} catch (SAXException e) {
e.printStackTrace();
}
return null;
}
private static String parseVideoId(InputStream atomDataStream) throws ParserConfigurationException,
SAXException, IOException {
DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
DocumentBuilder docBuilder = docBuilderFactory.newDocumentBuilder();
Document doc = docBuilder.parse(atomDataStream);
NodeList nodes = doc.getElementsByTagNameNS("*", "*");
for (int i = 0; i < nodes.getLength(); i++) {
Node node = nodes.item(i);
String nodeName = node.getNodeName();
if (nodeName != null && nodeName.equals("yt:videoid")) {
return node.getFirstChild().getNodeValue();
}
}
return null;
}
private static File getFileFromUri(Uri uri, Activity activity) {
try {
String filePath = null;
String[] proj = { Video.VideoColumns.DATA };
Cursor cursor = activity.getContentResolver().query(uri, proj, null, null, null);
if(cursor.moveToFirst()) {
int column_index = cursor.getColumnIndexOrThrow(Video.VideoColumns.DATA);
filePath = cursor.getString(column_index);
}
cursor.close();
//String filePath = cursor.getString(cursor.getColumnIndex(Video.VideoColumns.DATA));
File file = new File(filePath);
return file;
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
private static String uploadMetaData(YoutubeUploadRequest uploadRequest, String clientLoginToken, Activity activity, boolean retry) {
try {
File file = getFileFromUri(uploadRequest.getUri(), activity);
if(file != null) {
String uploadUrl = INITIAL_UPLOAD_URL;
URL url = new URL(uploadUrl);
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setRequestProperty("Authorization", String.format("GoogleLogin auth=\"%s\"", clientLoginToken));
connection.setRequestProperty("GData-Version", "2");
connection.setRequestProperty("X-GData-Client", CLIENT_ID);
connection.setRequestProperty("X-GData-Key", String.format("key=%s", DEVELOPER_KEY));
connection.setRequestMethod("POST");
connection.setDoOutput(true);
connection.setRequestProperty("Content-Type", "application/atom+xml");
connection.setRequestProperty("Slug", file.getAbsolutePath());
String title = uploadRequest.getTitle();
String description = uploadRequest.getDescription();
String category = uploadRequest.getCategory();
String tags = uploadRequest.getTags();
String template = readFile(activity, R.raw.gdata).toString();
String atomData = String.format(template, title, description, category, tags);
/*String template = readFile(activity, R.raw.gdata_geo).toString();
atomData = String.format(template, title, description, category, tags,
videoLocation.getLatitude(), videoLocation.getLongitude());*/
OutputStreamWriter outStreamWriter = new OutputStreamWriter(connection.getOutputStream());
outStreamWriter.write(atomData);
outStreamWriter.close();
int responseCode = connection.getResponseCode();
if (responseCode < 200 || responseCode >= 300) {
// The response code is 40X
if ((responseCode + "").startsWith("4") && retry) {
Log.d(TAG, "retrying to fetch auth token for ");
clientLoginToken = getClientAuthToken();
// Try again with fresh token
return uploadMetaData(uploadRequest, clientLoginToken, activity, false);
} else {
return null;
}
}
return connection.getHeaderField("Location");
}
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
public static CharSequence readFile(Activity activity, int id) {
BufferedReader in = null;
try {
in = new BufferedReader(new InputStreamReader(activity.getResources().openRawResource(id)));
String line;
StringBuilder buffer = new StringBuilder();
while ((line = in.readLine()) != null) {
buffer.append(line).append('\n');
}
// Chomp the last newline
buffer.deleteCharAt(buffer.length() - 1);
return buffer;
} catch (IOException e) {
return "";
} finally {
closeStream(in);
}
}
/**
* Closes the specified stream.
*
* @param stream The stream to close.
*/
private static void closeStream(Closeable stream) {
if (stream != null) {
try {
stream.close();
} catch (IOException e) {
// Ignore
}
}
}
public static interface ProgressListner {
void onUploadProgressUpdate(int progress);
}
I am getting a 404 error in the getClientAuthToken() method. I think "https://www.google.com/accounts/ClientLogin" has been deprecated by Google; is there any alternative to this?
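ClientLogin was indeed deprecated by Google, and the yt-direct-lite-android sample linked above authorizes with OAuth 2.0 via GoogleAuthUtil instead. A minimal, hedged sketch of fetching such a token (accountName is a placeholder for the chosen Google account, and this assumes the Google Play Services auth library is available):
// Sketch only (not a drop-in fix): OAuth 2.0 token fetch as in yt-direct-lite-android.
// Assumes com.google.android.gms.auth (Play Services) is on the classpath;
// must run off the UI thread.
String scope = "oauth2:https://www.googleapis.com/auth/youtube.upload";
try {
    String oauthToken = GoogleAuthUtil.getToken(activity, accountName, scope);
    // The token would then be sent as "Authorization: Bearer <token>"
    // instead of the old "GoogleLogin auth=..." header.
} catch (IOException e) {
    Log.d(TAG, "token fetch failed: " + e.getMessage());
} catch (GoogleAuthException e) {
    Log.d(TAG, "authorization failed: " + e.getMessage());
}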

Android TCP client receive messages and bmp

I use the common TCP client to receive string messages over TCP.
After the reception of a specific message, e.g. "XXX", I want my client to be ready to receive a bmp image.
My server in C++ sends the messages, but the client does not receive the image...
After some suggestions (see below) I updated the code...
Here is my code:
TCP client:
public class TCPClient {
private String serverMessage;
public static final String SERVERIP = "192.168.1.88"; //your computer IP
public static final int SERVERPORT = 80;
private OnMessageReceived mMessageListener = null;
private boolean mRun = false;
private PrintWriter out;
private BufferedReader input;
private DataInputStream dis;
/**
* Constructor of the class. OnMessageReceived listens for the messages received from the server
*/
public TCPClient(OnMessageReceived listener) {
mMessageListener = listener;
}
/**
* Sends the message entered by the client to the server address
* @param message text entered by client
*/
public void sendMessage(String message){
if (out != null && !out.checkError()) {
out.println(message);
out.flush();
}
}
public void stopClient(){
mRun = false;
if (out != null) {
out.flush();
out.close();
}
mMessageListener = null;
input = null;
serverMessage = null;
}
public void run() {
mRun = true;
try {
//here you must put your computer's IP address.
InetAddress serverAddr = InetAddress.getByName(SERVERIP);
Log.e("TCP Client", "C: Connecting...");
//create a socket to make the connection with the server
Socket socket = new Socket(serverAddr, SERVERPORT);
try {
//send the message to the server
out = new PrintWriter(new BufferedWriter(new OutputStreamWriter(socket.getOutputStream())), true);
Log.e("TCP Client", "C: Sent.");
Log.e("TCP Client", "C: Done.");
//receive the message which the server sends back
dis = new DataInputStream(socket.getInputStream());
// The BufferedReader can't wrap an InputStream directly; it wraps another Reader,
// so an InputStreamReader is used.
input = new BufferedReader(new InputStreamReader(dis, "UTF-8"));
Log.d("MyApp","We are here");
//this.input = new DataInputStream(in);
//in this while the client listens for the messages sent by the server
while (mRun) {
Log.d("MyApp", "We are here 2");
serverMessage = input.readLine();
if (serverMessage != null && mMessageListener != null) {
//call the method messageReceived from MyActivity class
mMessageListener.messageReceived(serverMessage);
Log.d("RESPONSE FROM SERVER", "S: Received Message: '" + serverMessage + "'");
}
if ("XXX".equals(serverMessage)) {
Log.d("MyApp", "We are here 3");
serverMessage = null;
while (mRun) {
WriteSDCard writeSDCard = new WriteSDCard();
writeSDCard.writeToSDFile(serverMessage);
}
}
}
} finally {
socket.close();
}
Log.e("RESPONSE FROM SERVER", "S: Received Message: '" + serverMessage + "'");
} catch (Exception e) {
Log.e("TCP", "S: Error", e);
} finally {
//the socket must be closed. It is not possible to reconnect to this socket
// after it is closed, which means a new socket instance has to be created.
}
}
//Declare the interface. The method messageReceived(String message) must be implemented in the MyActivity
//class, in the AsyncTask doInBackground
public interface OnMessageReceived {
public void messageReceived(String message);
}
}
public class WriteSDCard extends Activity {
private static final String TAG = "MEDIA";
private TextView tv;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
//(not needed) setContentView(R.layout.main);
//(not needed) tv = (TextView) findViewById(R.id.TextView01);
checkExternalMedia();
String message =null;
}
/** Method to check whether external media available and writable. This is adapted from
http://developer.android.com/guide/topics/data/data-storage.html#filesExternal */
private void checkExternalMedia(){
boolean mExternalStorageAvailable = false;
boolean mExternalStorageWriteable = false;
String state = Environment.getExternalStorageState();
if (Environment.MEDIA_MOUNTED.equals(state)) {
// Can read and write the media
mExternalStorageAvailable = mExternalStorageWriteable = true;
} else if (Environment.MEDIA_MOUNTED_READ_ONLY.equals(state)) {
// Can only read the media
mExternalStorageAvailable = true;
mExternalStorageWriteable = false;
} else {
// Can't read or write
mExternalStorageAvailable = mExternalStorageWriteable = false;
}
tv.append("\n\nExternal Media: readable="
+mExternalStorageAvailable+" writable="+mExternalStorageWriteable);
}
/** Method to write ascii text characters to file on SD card. Note that you must add a
WRITE_EXTERNAL_STORAGE permission to the manifest file or this method will throw
a FileNotFoundException because you won't have write permission. */
void writeToSDFile(String inputMsg){
// Find the root of the external storage.
// See http://developer.android.com/guide/topics/data/data-storage.html#filesExternal
File root = android.os.Environment.getExternalStorageDirectory();
tv.append("\nExternal file system root: "+root);
// See http://stackoverflow.com/questions/3551821/android-write-to-sd-card-folder
File dir = new File (root.getAbsolutePath() + "/download");
dir.mkdirs();
Log.d("WriteSDCard", "Start writing");
File file = new File(dir, "myData.txt");
try {
FileOutputStream f = new FileOutputStream(file);
PrintWriter pw = new PrintWriter(f);
pw.println(inputMsg);
pw.flush();
pw.close();
f.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
Log.i(TAG, "******* File not found. Did you" +
" add a WRITE_EXTERNAL_STORAGE permission to the manifest?");
} catch (IOException e) {
e.printStackTrace();
}
tv.append("\n\nFile written to "+file);
}
/** Method to read in a text file placed in the res/raw directory of the application. The
method reads in all lines of the file sequentially. */
}
And the server side:
Code:
void sendBMP( int cs, int xs, int ys)
{
int imgdataoffset = 14 + 40; // file header size + bitmap header size
int rowsz = ((xs) + 3) & -4; // size of one padded row of pixels
int imgdatasize = (((xs*3) + 3) & -4) * ys; // size of image data
int filesize = imgdataoffset + imgdatasize;
int i, y;
HTLM_bmp_H HTLM_bmp_h;
HTLM_bmp_h.bmfh.bfSize = filesize;
HTLM_bmp_h.bmfh.bfReserved1 = 0;
HTLM_bmp_h.bmfh.bfReserved2 = 0;
HTLM_bmp_h.bmfh.bfOffBits = imgdataoffset;
HTLM_bmp_h.bmih.biSize = 40;
HTLM_bmp_h.bmih.biWidth = xs;
HTLM_bmp_h.bmih.biHeight = ys;
HTLM_bmp_h.bmih.biPlanes = 1;
HTLM_bmp_h.bmih.biBitCount = 24;
HTLM_bmp_h.bmih.biCompression = 0;
HTLM_bmp_h.bmih.biSizeImage = imgdatasize;
HTLM_bmp_h.bmih.biXPelsPerMeter = 1000;
HTLM_bmp_h.bmih.biYPelsPerMeter = 1000;
HTLM_bmp_h.bmih.biClrUsed = 1 << 24;
HTLM_bmp_h.bmih.biClrImportant = 0;
printf("Start Sending BMP.\n");
send(cs,(unsigned char *)"BM",2,0);
send(cs,(unsigned char *)&HTLM_bmp_h,sizeof(HTLM_bmp_h),0);
printf("Sending...\n");
Buff_ptr = 0;
send(cs, (unsigned char *)Rbuffer, BUFF_SIZE,0 );
send(cs, (unsigned char *)Gbuffer, BUFF_SIZE,0 );
send(cs, (unsigned char *)Bbuffer, BUFF_SIZE,0 );
send(cs, (unsigned char *)"\n",1,0);
send(cs, (unsigned char *)"END\n",4,0);
printf("Done\n\n");
}
typedef struct {
// char bfType1;
// char bfType2;
int bfSize;
short bfReserved1;
short bfReserved2;
int bfOffBits;
} BMFH;
typedef struct {
unsigned int biSize;
int biWidth;
int biHeight;
short biPlanes;
short biBitCount;
unsigned int biCompression;
unsigned int biSizeImage;
int biXPelsPerMeter;
int biYPelsPerMeter;
unsigned int biClrUsed;
unsigned int biClrImportant;
} BMIH;
typedef struct {
BMFH bmfh;
BMIH bmih;
} HTLM_bmp_H;
main()
{
TSK_Handle tsk_cam;
tsk_cam=TSK_create( (Fxn)TSK_webview, NULL);
TSK_setpri(tsk_cam, 8);
}
char buffer[2048];
Void TSK_webview()
{
int s,cs;
struct sockaddr_in addr; /* generic socket name */
struct sockaddr client_addr;
int sock_len = sizeof(struct sockaddr);
int frame = 0;
LgUns i=0;
int len;
int x = DSKeye_SXGA_WIDTH, y = DSKeye_SXGA_HEIGHT;
DSKeye_params CAM_params = {
....
};
lwIP_NetStart();
/**************************************************************
* Main loop.
***************************************************************/
s = socket( AF_INET, SOCK_STREAM, 0 );
addr.sin_port = htons(80);
addr.sin_addr.s_addr = 0;
memset(&(addr.sin_zero), 0, sizeof(addr.sin_zero));
printf("start\n");
if( bind(s, (struct sockaddr*)&addr, sizeof(struct sockaddr)))
{
printf("error binding to port\n");
return ;
}
printf("xx1\n");
if(DSKeye_open(&CAM_params)) {
printf("xx2\n");
SYS_abort("DSKcam_CAMopen");
printf("xx3\n"); fflush(stdout);}
printf("xx4\n");
while(1==1) {
printf("Waiting for client to be connected ... \n");
listen(s, 10);
cs = accept(s, &client_addr, &sock_len);
printf("Client connected.\n");
send(cs,(unsigned char *)"Server connected\n",17,0);
recv(cs, (unsigned char*)buffer, 17, 0);
switch (*(buffer)){
case 'A' :
...
case 'B' :
...
}
REG32(0xA0000080)=REG32(0xA0000080) - 0x800000; ///Disable stepper controller vhdl Quartus Block
for(frame = 0; frame < 4; frame++){ // Allow AEC etc to settle
SrcFrame=DSKeye_getFrame();
}
printf("Demosaicing of %d x %d image is ongoing \n", x, y);
demosaic(SrcFrame, x, y);
break;
}
printf("Demosaicing completed ...\n");
send(cs,(unsigned char *)"Demosaicing completed\n",22,0);
send(cs,(unsigned char *)"XXX\n",4,0);
sendBMP(cs, x, y);
fflush(stdout);
lwip_close(cs);
}
the send : lwip_send
int lwip_send(int s, void *data, int size, unsigned int flags)
{
struct lwip_socket *sock;
struct netbuf *buf;
err_t err;
LWIP_DEBUGF(SOCKETS_DEBUG, ("lwip_send(%d, data=%p, size=%d, flags=0x%x)\n", s, data, size, flags));
sock = get_socket(s);
if (!sock) {
set_errno(EBADF);
return -1;
}
switch (netconn_type(sock->conn)) {
case NETCONN_RAW:
case NETCONN_UDP:
case NETCONN_UDPLITE:
case NETCONN_UDPNOCHKSUM:
/* create a buffer */
buf = netbuf_new();
if (!buf) {
LWIP_DEBUGF(SOCKETS_DEBUG, ("lwip_send(%d) ENOBUFS\n", s));
sock_set_errno(sock, ENOBUFS);
return -1;
}
/* make the buffer point to the data that should
be sent */
netbuf_ref(buf, data, size);
/* send the data */
err = netconn_send(sock->conn, buf);
/* deallocated the buffer */
netbuf_delete(buf);
break;
case NETCONN_TCP:
err = netconn_write(sock->conn, data, size, NETCONN_COPY);
break;
default:
err = ERR_ARG;
break;
}
if (err != ERR_OK) {
LWIP_DEBUGF(SOCKETS_DEBUG, ("lwip_send(%d) err=%d\n", s, err));
sock_set_errno(sock, err_to_errno(err));
return -1;
}
LWIP_DEBUGF(SOCKETS_DEBUG, ("lwip_send(%d) ok size=%d\n", s, size));
sock_set_errno(sock, 0);
return size;
}
You can't mix a BufferedReader and a DataInputStream on the same socket. The BufferedReader will read ahead and steal data that you expect to read via the DataInputStream. You will have to use the DataInputStream for everything, and do the corresponding thing at the sender.
You're also doing an incorrect comparison for string equality.
In Java, string comparison for equality is done using String.equals(Object anObject).
You're using if (serverMessage == "XXX") {....
You should use if ("XXX".equals(serverMessage)) {....
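A minimal sketch of that single-stream approach (not the poster's code): one DataInputStream handles both the '\n'-terminated text lines and the binary image, and imgLen is a hypothetical, pre-agreed image size (for instance computed from the BMP header's bfSize field):
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

class SingleStreamReceiver {
    // Read one '\n'-terminated line directly from the DataInputStream,
    // so no bytes get buffered away as they would with a BufferedReader.
    static String readLine(DataInputStream dis) throws IOException {
        StringBuilder sb = new StringBuilder();
        int c;
        while ((c = dis.read()) != -1 && c != '\n') sb.append((char) c);
        return (sb.length() == 0 && c == -1) ? null : sb.toString();
    }

    // Wait for the "XXX" marker, then read exactly imgLen bytes of image data.
    static byte[] receiveImageAfterMarker(InputStream in, int imgLen) throws IOException {
        DataInputStream dis = new DataInputStream(in);
        String line;
        while ((line = readLine(dis)) != null) {
            if ("XXX".equals(line.trim())) {      // equals(), not ==
                byte[] img = new byte[imgLen];
                dis.readFully(img);               // blocks until all bytes have arrived
                return img;
            }
        }
        return null;
    }
}
Usage would be receiveImageAfterMarker(socket.getInputStream(), imgLen), with the returned bytes written to a file or decoded as needed.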
