I created a server/client game in Visual Studio. The apps connect over TCP. After everything worked as intended, I created the client app in Unity. When I play the game inside the Unity editor everything is OK (the client communicates with the server, etc.). When I export the game for Windows it is also OK. The problem is that when I export it to Android (APK), the game on the device does not communicate with the server.
Using the debugger, I found that the first problem (not sure if there are more) occurs when I try to compress and split the package to be sent. I split each package into 1024-byte chunks and rejoin them at the recipient.
The pack class:
public class MessageBoxPack
{
private List<MessageBox> msgBoxContainer;
private byte[] msgByte { get; set; }
private MessageBox msgBox;
private int bufferSize;
public List<MessageBox> PrepareMsgBoxList(MessageSlip msg, int _bufferSize)
{
msgBoxContainer = new List<MessageBox>();
bufferSize = _bufferSize;
ConvertMessageToBytes(msg);
return msgBoxContainer;
}
private void ConvertMessageToBytes(MessageSlip msg)
{
BinaryFormatter f = new BinaryFormatter();
using (MemoryStream ms = new MemoryStream())
{
using (GZipStream gZip = new GZipStream(ms, CompressionMode.Compress))
{
f.Serialize(gZip, msg);
}
byte[] b = ms.ToArray();
int TotalLength = b.Length;
int currentPosition = 0;
int writeLength = bufferSize;
while (currentPosition < TotalLength)
{
msgBox = new MessageBox();
msgBox.CallBackID = msg.CallBackID;
if (currentPosition + bufferSize > TotalLength)
writeLength = TotalLength - currentPosition;
msgBox.MessageBytes = new byte[writeLength];
Array.Copy(b, currentPosition, msgBox.MessageBytes, 0, writeLength);
msgBox.currentPossition = currentPosition;
msgBox.byteLength = TotalLength;
currentPosition += writeLength;
msgBoxContainer.Add(msgBox);
}
}
}
}
And the unpack class:
public class MessageBoxUnpack
{
public MessageSlip UnpackMsgBoxList(List<MessageBox> msgBoxList, int _bufferSize)
{
MessageSlip msg = new MessageSlip();
using (MemoryStream ms = new MemoryStream())
{
foreach (MessageBox msgBox in msgBoxList)
{
//ms.Position = (int)msgBox.currentPossition;
ms.Write(msgBox.MessageBytes, 0, msgBox.MessageBytes.Length);
}
int i = (int)ms.Length;
BinaryFormatter f = new BinaryFormatter();
//ms.Flush();
ms.Position = 0;
//msg = f.Deserialize(ms) as MessageSlip;
using (GZipStream gZip = new GZipStream(ms, CompressionMode.Decompress))
{
msg = f.Deserialize(gZip) as MessageSlip;
}
}
return msg;
}
}
The error I receive is at:
using (GZipStream gZip = new GZipStream(ms, CompressionMode.Compress))
Can you explain to me where the problem is and how to correct it, and let me know whether the method I am using to pack, unpack, and split is OK.
Related
I am trying to receive streaming audio from my app.
Below is my code for receiving the audio stream:
public class ClientListen implements Runnable {
private Context context;
public ClientListen(Context context) {
this.context = context;
}
@Override
public void run() {
boolean run = true;
try {
DatagramSocket udpSocket = new DatagramSocket(8765);
InetAddress serverAddr = null;
try {
serverAddr = InetAddress.getByName("127.0.0.1");
} catch (UnknownHostException e) {
e.printStackTrace();
}
while (run) {
try {
byte[] message = new byte[8000];
DatagramPacket packet = new DatagramPacket(message,message.length);
Log.i("UDP client: ", "about to wait to receive");
udpSocket.setSoTimeout(10000);
udpSocket.receive(packet);
String text = new String(packet.getData(), 0, packet.getLength());
Log.d("Received text", text);
} catch (IOException e) {
Log.e(" UDP clien", "error: ", e);
run = false;
udpSocket.close();
}
}
} catch (SocketException e) {
Log.e("Socket Open:", "Error:", e);
} catch (IOException e) {
e.printStackTrace();
}
}
}
In the "Received text" logger I can see the data coming in as:
D/Received text: �������n�����������q�9�$�0�/�G�{�������s�����JiH&������d�����Z���������d�����E������C�+
��l��y�����������v���9����������u��f�j�������$�����K���������F��~R�2�����T��������������L�����!��G��8������s�;�"�,�R�����(��{�����*_��Z�������5������������\������x���j~������������/��=�����%�������
How can I store this data into a WAV file?
What you see is the string representation of a single UDP packet after it was received; the blocking receive has just returned.
It is a very small fraction of the sound you want to convert to a WAV file.
Soon the while loop will continue and you will receive another packet, and many more.
You need to collect all the packets in a buffer and then, when you think it is enough, convert them to a WAV file.
Remember that WAV is not just the sound bytes you get from UDP: there is also a 44-byte header you need to prepend to the file in order for it to be recognized by players.
Also, if the UDP payload is in another encoding format such as G711, you must decode these bytes to PCM; if not, you will hear heavy noise in the resulting WAV file or stream you play.
The buffer size must be accurate. If it is too big (many empty bytes at the end of the array) you will hear a helicopter-like sound. If you know exactly what the size of each packet is, you can just write it to an AudioTrack in order to play the stream, or accumulate it and convert it to a WAV file when you see fit. But if you are not sure about the size, you can use this answer to get a buffer and then write the buffer to an AudioTrack:
Android AudioRecord to Server over UDP Playback Issues.
That answer uses javax.sound because it is very old, but you just need to use AudioTrack instead in order to stream. That is out of scope here, so I will just present the AudioTrack streaming replacement for the javax SourceDataLine:
final int SAMPLE_RATE = 8000; // Hertz
final int STREAM_TYPE = AudioManager.STREAM_NOTIFICATION;
int channelConfig = AudioFormat.CHANNEL_OUT_MONO;
int encodingFormat = AudioFormat.ENCODING_PCM_16BIT;
AudioTrack track = new AudioTrack(STREAM_TYPE, SAMPLE_RATE, channelConfig,
encodingFormat, BUF_SIZE, AudioTrack.MODE_STREAM);
track.play();
//.. then after receive UDP packets and the buffer is full:
if(track != null && packet != null){
track.write(audioStreamBuffer, 0, audioStreamBuffer.length);
}
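BUF_SIZE and audioStreamBuffer are not defined in the snippet above; audioStreamBuffer is whatever complete chunk of PCM you accumulated from the UDP packets. A minimal way to pick BUF_SIZE (my assumption, not part of the original answer) is to ask AudioTrack for the device minimum:

// hypothetical sizing, reusing the constants declared above
int BUF_SIZE = AudioTrack.getMinBufferSize(SAMPLE_RATE, channelConfig, encodingFormat);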
You must not do this in the UI thread (I assume you know that).
In the code I will show you, I am receiving UDP audio logs from a PTT radio. The audio is encoded in G711 u-law, and each packet is exactly 172 bytes. The first 12 bytes are the RTP header, and I need to strip them in order to eliminate small noises; the remaining 160 bytes are 20 ms of sound.
I must decode the G711 u-law bytes into a PCM short array, then take the short array and make a WAV file out of it. I do that after I see there has been no packet received for more than one second (so I know the speech ended, the next blocking receive belongs to a new speech, and I can take the old speech and make a WAV file out of it). You can decide on a different buffering strategy depending on what you are doing.
It works fine; after decoding, the sound of the WAV file is very good. If your UDP payload is already PCM, you don't need to decode G711, so just skip that part.
Finally, I want to mention that I saw many old answers with code using javax.sound.sampled, which seems great because it can easily convert an audio file or stream to WAV format with AudioFileFormat, and also convert G711 to PCM with AudioFormat manipulations. Unfortunately, it is not part of current Java for Android. We must rely on Android's AudioTrack instead (and AudioRecord if we want to get sound from the mic), but AudioTrack plays only PCM and does not support the G711 format, so when streaming G711 through AudioTrack the noise is terrible. We must decode it in our own code before writing it to the track. Also, we cannot convert to a WAV file using AudioInputStream; I tried to do this with a javax.sound.sampled jar added to my app, but Android kept giving me errors such as "format not supported" for WAV, and mixer errors when trying to stream. So I understood that current Android cannot work with javax.sound.sampled, and I went looking for low-level decoding of G711 and low-level creation of a WAV file out of the byte-array buffer received from the UDP packets.
A. In the manifest, add:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.INTERNET"/>
B. In the worker thread:
@Override
public void run(){
Log.i(TAG, "ClientListen thread started. Thread id: " + Thread.currentThread().getId());
try{
udpSocket = new DatagramSocket(port);
}catch(SocketException e){
e.printStackTrace();
}
byte[] messageBuf = new byte[BUF_SIZE];
Log.i(TAG, "waiting to receive packet in port: " + port);
if(udpSocket != null){
// here you can create a new AudioTrack and call track.play()
byte pttSession[] = null;
while (running){
packet = new DatagramPacket(messageBuf, 0, messageBuf.length);
Log.d(TAG, "inside while running loop");
try{
Log.d(TAG, "receive block: waiting for user to press on
speaker(listening now inside udpSocket for DatagramPacket..)");
//get inside receive block until packet will arrive through this socket
long timeBeforeBlock = System.currentTimeMillis();
udpSocket.receive(packet);
Log.d(TAG, "client received a packet, receive block stopped)");
//this is for sending a msg to the UI thread via the handler (you may skip this)
sendState("getting UDP packets...");
/* if previous block release happened more than one second ago - so this
packet release is for a new speech. so let’s copy the previous speech
to a wave file and empty the speech */
if(System.currentTimeMillis() - timeBeforeBlock > 1000 && pttSession != null){
convertBytesToFile(pttSession);
pttSession = null;
}
/* let’s take the packet that was released and start new speech or add it to the ongoing speech. */
byte[] slice = Arrays.copyOfRange(packet.getData(), 12, packet.getLength());
if(null == pttSession){
pttSession = slice;
}else{
pttSession = concat(pttSession, slice);
Log.d(TAG, "pttSession:" + Arrays.toString(pttSession));
}
}catch(IOException e){
Log.e(TAG, "UDP client IOException - error: ", e);
running = false;
}
}
// let’s take the latest speech and make a last wave file out of it.
if(pttSession != null){
convertBytesToFile(pttSession);
pttSession = null;
}
// if running == false then stop listening.
udpSocket.close();
handler.sendEmptyMessage(MainActivity.UdpClientHandler.UPDATE_END);
}else{
sendState("cannot bind datagram socket to the specified port:" + port);
}
}
private void convertBytesToFile(byte[] byteArray){
//decode the bytes from G711U to PCM (outcome is a short array)
G711UCodec decoder = new G711UCodec();
int size = byteArray.length;
short[] shortArray = new short[size];
decoder.decode(shortArray, byteArray, size, 0);
String newFileName = "speech_" + System.currentTimeMillis() + ".wav";
//convert the short array to WAV (prepend the 44-byte header) and save it as a .wav file
Wave wave = new Wave(SAMPLE_RATE, (short) 1, shortArray, 0, shortArray.length - 1);
if(wave.writeToFile(Environment.getExternalStoragePublicDirectory
(Environment.DIRECTORY_DOWNLOADS),newFileName)){
Log.d(TAG, "wave.writeToFile successful!");
sendState("create file: "+ newFileName);
}else{
Log.w(TAG, "wave.writeToFile failed");
}
}
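concat() and sendState() above are helpers that are not shown: sendState() just posts a status message to the UI handler, and concat() joins two byte arrays. A minimal sketch of concat (my own, assuming plain byte arrays):

private byte[] concat(byte[] first, byte[] second) {
    // allocate a new array large enough for both parts and copy them in order
    byte[] joined = new byte[first.length + second.length];
    System.arraycopy(first, 0, joined, 0, first.length);
    System.arraycopy(second, 0, joined, first.length, second.length);
    return joined;
}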
C. encoding/decoding G711 U-Law class:
taken from: https://github.com/thinktube-kobe/airtube/blob/master/JavaLibrary/src/com/thinktube/audio/G711UCodec.java
/**
* G.711 codec. This class provides u-law conversion.
*/
public class G711UCodec {
// s00000001wxyz...s000wxyz
// s0000001wxyza...s001wxyz
// s000001wxyzab...s010wxyz
// s00001wxyzabc...s011wxyz
// s0001wxyzabcd...s100wxyz
// s001wxyzabcde...s101wxyz
// s01wxyzabcdef...s110wxyz
// s1wxyzabcdefg...s111wxyz
private static byte[] table13to8 = new byte[8192];
private static short[] table8to16 = new short[256];
static {
// b13 --> b8
for (int p = 1, q = 0; p <= 0x80; p <<= 1, q += 0x10) {
for (int i = 0, j = (p << 4) - 0x10; i < 16; i++, j += p) {
int v = (i + q) ^ 0x7F;
byte value1 = (byte) v;
byte value2 = (byte) (v + 128);
for (int m = j, e = j + p; m < e; m++) {
table13to8[m] = value1;
table13to8[8191 - m] = value2;
}
}
}
// b8 --> b16
for (int q = 0; q <= 7; q++) {
for (int i = 0, m = (q << 4); i < 16; i++, m++) {
int v = (((i + 0x10) << q) - 0x10) << 3;
table8to16[m ^ 0x7F] = (short) v;
table8to16[(m ^ 0x7F) + 128] = (short) (65536 - v);
}
}
}
public int decode(short[] b16, byte[] b8, int count, int offset) {
for (int i = 0, j = offset; i < count; i++, j++) {
b16[i] = table8to16[b8[j] & 0xFF];
}
return count;
}
public int encode(short[] b16, int count, byte[] b8, int offset) {
for (int i = 0, j = offset; i < count; i++, j++) {
b8[j] = table13to8[(b16[i] >> 4) & 0x1FFF];
}
return count;
}
public int getSampleCount(int frameSize) {
return frameSize;
}
}
D. Converting to wave file:
Taken from here:
https://github.com/google/oboe/issues/320
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
public class Wave
{
private final int LONGINT = 4;
private final int SMALLINT = 2;
private final int INTEGER = 4;
private final int ID_STRING_SIZE = 4;
private final int WAV_RIFF_SIZE = LONGINT+ID_STRING_SIZE;
private final int WAV_FMT_SIZE = (4*SMALLINT)+(INTEGER*2)+LONGINT+ID_STRING_SIZE;
private final int WAV_DATA_SIZE = ID_STRING_SIZE+LONGINT;
private final int WAV_HDR_SIZE = WAV_RIFF_SIZE+ID_STRING_SIZE+WAV_FMT_SIZE+WAV_DATA_SIZE;
private final short PCM = 1;
private final int SAMPLE_SIZE = 2;
int cursor, nSamples;
byte[] output;
public Wave(int sampleRate, short nChannels, short[] data, int start, int end)
{
nSamples=end-start+1;
cursor=0;
output=new byte[nSamples*SMALLINT+WAV_HDR_SIZE];
buildHeader(sampleRate,nChannels);
writeData(data,start,end);
}
/*
by Udi for using byteArray directly
*/
public Wave(int sampleRate, short nChannels, byte[] data, int start, int end)
{
int size = data.length;
short[] shortArray = new short[size];
for (int index = 0; index < size; index++){
shortArray[index] = (short) data[index];
}
nSamples=end-start+1;
cursor=0;
output=new byte[nSamples*SMALLINT+WAV_HDR_SIZE];
buildHeader(sampleRate,nChannels);
writeData(shortArray,start,end);
}
// ------------------------------------------------------------
private void buildHeader(int sampleRate, short nChannels)
{
write("RIFF");
write(output.length);
write("WAVE");
writeFormat(sampleRate, nChannels);
}
// ------------------------------------------------------------
public void writeFormat(int sampleRate, short nChannels)
{
write("fmt ");
write(WAV_FMT_SIZE-WAV_DATA_SIZE);
write(PCM);
write(nChannels);
write(sampleRate);
write(nChannels * sampleRate * SAMPLE_SIZE);
write((short)(nChannels * SAMPLE_SIZE));
write((short)16);
}
// ------------------------------------------------------------
public void writeData(short[] data, int start, int end)
{
write("data");
write(nSamples*SMALLINT);
for(int i=start; i<=end; write(data[i++]));
}
// ------------------------------------------------------------
private void write(byte b)
{
output[cursor++]=b;
}
// ------------------------------------------------------------
private void write(String id)
{
if(id.length()!=ID_STRING_SIZE){
}
else {
for(int i=0; i<ID_STRING_SIZE; ++i) write((byte)id.charAt(i));
}
}
// ------------------------------------------------------------
private void write(int i)
{
write((byte) (i&0xFF)); i>>=8;
write((byte) (i&0xFF)); i>>=8;
write((byte) (i&0xFF)); i>>=8;
write((byte) (i&0xFF));
}
// ------------------------------------------------------------
private void write(short i)
{
write((byte) (i&0xFF)); i>>=8;
write((byte) (i&0xFF));
}
// ------------------------------------------------------------
public boolean writeToFile(File fileParent , String filename)
{
boolean ok=false;
try {
File path=new File(fileParent, filename);
FileOutputStream outFile = new FileOutputStream(path);
outFile.write(output);
outFile.close();
ok=true;
} catch (FileNotFoundException e) {
e.printStackTrace();
ok=false;
} catch (IOException e) {
ok=false;
e.printStackTrace();
}
return ok;
}
/**
* by Udi for test: write file with temp name so if you write many packets each packet will be written to a new file instead of deleting
* the previous file. (this is mainly for debug)
* @param fileParent
* @param filename
* @return
*/
public boolean writeToTmpFile(File fileParent , String filename)
{
boolean ok=false;
try {
File outputFile = File.createTempFile(filename, ".wav",fileParent);
FileOutputStream fileoutputstream = new FileOutputStream(outputFile);
fileoutputstream.write(output);
fileoutputstream.close();
ok=true;
} catch (FileNotFoundException e) {
e.printStackTrace();
ok=false;
} catch (IOException e) {
ok=false;
e.printStackTrace();
}
return ok;
}
}
I'm developing an Android app with Xamarin that uploads files to the server. My server is ASP.NET based and is hosted on Azure. Xamarin throws this error when the code tries to write the file to the server directory. The code I'm using to post the file to the server is as follows:
public async Task<HttpResponseMessage> UploadFile(byte[] file)
{
videoThumbName = Guid.NewGuid().ToString();
var progress = new System.Net.Http.Handlers.ProgressMessageHandler();
progress.HttpSendProgress += progress_HttpSendProgress;
using (var client = HttpClientFactory.Create(progress))
{
client.BaseAddress = new Uri(GlobalVariables.host);
// Set the Accept header for BSON.
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Accept.Add(
new MediaTypeWithQualityHeaderValue("application/bson"));
var request = new uploadFileModel { data = file, dateCreated = DateTime.Now, fileName = fileName, username = loggedUser, VideoThumbName = videoThumbName};
// POST using the BSON formatter.
MediaTypeFormatter bsonFormatter = new BsonMediaTypeFormatter();
var m = client.MaxResponseContentBufferSize;
var result = await client.PostAsync("api/media/upload", request, bsonFormatter);
return result.EnsureSuccessStatusCode();
}
}
I've tried using a try-catch block, and the weird thing is that the exception is caught even after the file has been written to the path, and I get the 500 internal server error. So far every file I tried to upload was saved to the path, but I can't figure out why the error is occurring.
My server side code looks like the following:
[HttpPost]
[Route("upload")]
public async Task<HttpResponseMessage> Upload(uploadFileModel model)
{
var result = new HttpResponseMessage(HttpStatusCode.OK);
if (ModelState.IsValid)
{
string thumbname = "";
string resizedthumbname = Guid.NewGuid() + "_yt.jpg";
string FfmpegPath = Encoding_Settings.FFMPEGPATH;
string tempFilePath = Path.Combine(HttpContext.Current.Server.MapPath("~/tempuploads"), model.fileName);
string pathToFiles = HttpContext.Current.Server.MapPath("~/tempuploads");
string pathToThumbs = HttpContext.Current.Server.MapPath("~/contents/member/" + model.username + "/thumbs");
string finalPath = HttpContext.Current.Server.MapPath("~/contents/member/" + model.username + "/flv");
string resizedthumb = Path.Combine(pathToThumbs, resizedthumbname);
var outputPathVid = new MediaFile { Filename = Path.Combine(finalPath, model.fileName) };
var inputPathVid = new MediaFile { Filename = Path.Combine(pathToFiles, model.fileName) };
int maxWidth = 380;
int maxHeight = 360;
var namewithoutext = Path.GetFileNameWithoutExtension(Path.Combine(pathToFiles, model.fileName));
thumbname = model.VideoThumbName;
string oldthumbpath = Path.Combine(pathToThumbs, thumbname);
var fileName = model.fileName;
try
{
File.WriteAllBytes(tempFilePath, model.data);
}
catch (Exception e)
{
Console.WriteLine(e.Message);
}
if (model.fileName.Contains("audio"))
{
File.WriteAllBytes(Path.Combine(finalPath, model.fileName), model.data);
string audio_thumb = "mic_thumb.jpg";
string destination = Path.Combine(pathToThumbs, audio_thumb);
string source = Path.Combine(pathToFiles, audio_thumb);
if (!System.IO.File.Exists(destination))
{
System.IO.File.Copy(source, destination, true);
}
Video_Struct vd = new Video_Struct();
vd.CategoryID = 0; // store categoryname or term instead of category id
vd.Categories = "";
vd.UserName = model.username;
vd.Title = "";
vd.Description = "";
vd.Tags = "";
vd.OriginalVideoFileName = model.fileName;
vd.VideoFileName = model.fileName;
vd.ThumbFileName = "mic_thumb.jpg";
vd.isPrivate = 0;
vd.AuthKey = "";
vd.isEnabled = 1;
vd.Response_VideoID = 0; // video responses
vd.isResponse = 0;
vd.isPublished = 1;
vd.isReviewed = 1;
vd.Thumb_Url = "none";
//vd.FLV_Url = flv_url;
vd.Embed_Script = "";
vd.isExternal = 0; // website own video, 1: embed video
vd.Type = 0;
vd.YoutubeID = "";
vd.isTagsreViewed = 1;
vd.Mode = 0; // filter videos based on website sections
long videoid = VideoBLL.Process_Info(vd, false);
}
The exception doesn't reveal much to help me understand the issue. Strangely, I don't get any errors when I'm sending the request to the local server. Any idea what might be wrong here?
Edit:
The exception is no longer occurring; now I get the internal server error at this line of my code: using (var engine = new Engine()). This is a library I'm trying to use to encode a media file. If everything works on the local server, then why doesn't it work on the live server? :/
using (var engine = new Engine())
{
engine.GetMetadata(inputPathVid);
// Saves the frame located on the 15th second of the video.
var outputPathThumb = new MediaFile { Filename = Path.Combine(pathToThumbs, thumbname+".jpg") };
var options = new ConversionOptions { Seek = TimeSpan.FromSeconds(0), CustomHeight = 360, CustomWidth = 380 };
engine.GetThumbnail(inputPathVid, outputPathThumb, options);
}
Image image = Image.FromFile(Path.Combine(pathToThumbs, thumbname+".jpg"));
//var ratioX = (double)maxWidth / image.Width;
//var ratioY = (double)maxHeight / image.Height;
//var ratio = Math.Min(ratioX, ratioY);
var newWidth = (int)(maxWidth);
var newHeight = (int)(maxHeight);
var newImage = new Bitmap(newWidth, newHeight);
Graphics.FromImage(newImage).DrawImage(image, 0, 0, newWidth, newHeight);
Bitmap bmp = new Bitmap(newImage);
bmp.Save(Path.Combine(pathToThumbs, thumbname+"_resized.jpg"));
//File.Delete(Path.Combine(pathToThumbs, thumbname));
using (var engine = new Engine())
{
var conversionOptions = new ConversionOptions
{
VideoSize = VideoSize.Hd720,
AudioSampleRate = AudioSampleRate.Hz44100,
VideoAspectRatio = VideoAspectRatio.Default
};
engine.GetMetadata(inputPathVid);
engine.Convert(inputPathVid, outputPathVid, conversionOptions);
}
File.Delete(tempFilePath);
Video_Struct vd = new Video_Struct();
vd.CategoryID = 0; // store categoryname or term instead of category id
vd.Categories = "";
vd.UserName = model.username;
vd.Title = "";
vd.Description = "";
vd.Tags = "";
vd.Duration = inputPathVid.Metadata.Duration.ToString();
vd.Duration_Sec = Convert.ToInt32(inputPathVid.Metadata.Duration.Seconds.ToString());
vd.OriginalVideoFileName = model.fileName;
vd.VideoFileName = model.fileName;
vd.ThumbFileName = thumbname+"_resized.jpg";
vd.isPrivate = 0;
vd.AuthKey = "";
vd.isEnabled = 1;
vd.Response_VideoID = 0; // video responses
vd.isResponse = 0;
vd.isPublished = 1;
vd.isReviewed = 1;
vd.Thumb_Url = "none";
//vd.FLV_Url = flv_url;
vd.Embed_Script = "";
vd.isExternal = 0; // website own video, 1: embed video
vd.Type = 0;
vd.YoutubeID = "";
vd.isTagsreViewed = 1;
vd.Mode = 0; // filter videos based on website sections
//vd.ContentLength = f_contentlength;
vd.GalleryID = 0;
long videoid = VideoBLL.Process_Info(vd, false);
}
I'm uploading a file in my Android application. The code is pretty simple:
private boolean UploadFile(String fileLocation) {
try {
if (TextUtils.isEmpty(fileLocation)) {
return false;
}
File fSrc = new File(fileLocation);
if (!fSrc.exists()) {
return false;
}
boolean bReturn = AzureManager.init(this);
if (!bReturn) {
return false;
}
String blobName = fSrc.getName();
InputStream in = new BufferedInputStream(new FileInputStream(fSrc));
CloudBlobContainer container = AzureManager.getCloudBlobClient().getContainerReference(AzureManager.getContainerName());
CloudBlockBlob blob = container.getBlockBlobReference(blobName);
blob.upload(in, fSrc.length());
in.close();
return true;
} catch (Exception e) {
//handle exception
}
return false;
}
When I download from Azure, CloudBlockBlob has a download listener as:
blob.setDownloadListener(eventListener);
But how can I keep track of the progress when uploading?
I am also looking for a way to do it in Java or Android. But if you want to do it your own way, without changing anything on the server, you can make it similar to this answer. The answer is in C#, so you need to find the corresponding methods in the Java library and update it accordingly.
If you don't want to follow that answer, you can refer to the same code here as well.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace ConsoleApplication1
{
class Program
{
static CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials("accountname", "accountkey"), true);
static void Main(string[] args)
{
CloudBlobClient myBlobClient = storageAccount.CreateCloudBlobClient();
myBlobClient.SingleBlobUploadThresholdInBytes = 1024 * 1024;
CloudBlobContainer container = myBlobClient.GetContainerReference("adokontajnerneki");
//container.CreateIfNotExists();
CloudBlockBlob myBlob = container.GetBlockBlobReference("cfx.zip");
var blockSize = 256 * 1024;
myBlob.StreamWriteSizeInBytes = blockSize;
var fileName = #"D:\cfx.zip";
long bytesToUpload = (new FileInfo(fileName)).Length;
long fileSize = bytesToUpload;
if (bytesToUpload < blockSize)
{
CancellationToken ca = new CancellationToken();
var ado = myBlob.UploadFromFileAsync(fileName, FileMode.Open, ca);
Console.WriteLine(ado.Status); //Does Not Help Much
ado.ContinueWith(t =>
{
Console.WriteLine("Status = " + t.Status);
Console.WriteLine("It is over"); //this is working OK
});
}
else
{
List<string> blockIds = new List<string>();
int index = 1;
long startPosition = 0;
long bytesUploaded = 0;
do
{
var bytesToRead = Math.Min(blockSize, bytesToUpload);
var blobContents = new byte[bytesToRead];
using (FileStream fs = new FileStream(fileName, FileMode.Open))
{
fs.Position = startPosition;
fs.Read(blobContents, 0, (int)bytesToRead);
}
ManualResetEvent mre = new ManualResetEvent(false);
var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
Console.WriteLine("Now uploading block # " + index.ToString("d6"));
blockIds.Add(blockId);
var ado = myBlob.PutBlockAsync(blockId, new MemoryStream(blobContents), null);
ado.ContinueWith(t =>
{
bytesUploaded += bytesToRead;
bytesToUpload -= bytesToRead;
startPosition += bytesToRead;
index++;
double percentComplete = (double)bytesUploaded / (double)fileSize;
Console.WriteLine("Percent complete = " + percentComplete.ToString("P"));
mre.Set();
});
mre.WaitOne();
}
while (bytesToUpload > 0);
Console.WriteLine("Now committing block list");
var pbl = myBlob.PutBlockListAsync(blockIds);
pbl.ContinueWith(t =>
{
Console.WriteLine("Blob uploaded completely.");
});
}
Console.ReadKey();
}
}
}
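If you would rather keep the Java upload call from the question unchanged (blob.upload(in, length)), a lighter-weight alternative, sketched here as my own suggestion rather than an Azure SDK feature, is to wrap the InputStream you hand to upload() so it counts the bytes the SDK reads from it:

import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical progress-reporting stream; ProgressListener is an interface you define yourself.
public class ProgressInputStream extends FilterInputStream {
    public interface ProgressListener {
        void onProgress(long bytesRead, long totalBytes);
    }

    private final long totalBytes;
    private long bytesRead;
    private final ProgressListener listener;

    public ProgressInputStream(InputStream in, long totalBytes, ProgressListener listener) {
        super(in);
        this.totalBytes = totalBytes;
        this.listener = listener;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b >= 0) {
            bytesRead++;
            listener.onProgress(bytesRead, totalBytes);
        }
        return b;
    }

    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        int n = super.read(b, off, len);
        if (n > 0) {
            bytesRead += n;
            // called on the upload thread; post to a Handler if you need to touch the UI
            listener.onProgress(bytesRead, totalBytes);
        }
        return n;
    }
}

You would then pass new ProgressInputStream(in, fSrc.length(), listener) to blob.upload() instead of in; how often the callback fires depends on how the SDK chunks its reads.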
I am developing an Android application in which I need to play an AAC live audio stream coming from a Red5 server.
I have successfully decoded the audio stream by using javacv-ffmpeg.
But my problem is how to play the audio from the decoded samples.
I have tried the following way:
int len = avcodec.avcodec_decode_audio4( audio_c, samples_frame, got_frame, pkt2);
if (len <= 0){
this.pkt2.size(0);
} else {
if (this.got_frame[0] != 0) {
long pts = avutil.av_frame_get_best_effort_timestamp(samples_frame);
int sample_format = samples_frame.format();
int planes = avutil.av_sample_fmt_is_planar(sample_format) != 0 ? samples_frame.channels() : 1;
int data_size = avutil.av_samples_get_buffer_size((IntPointer)null, audio_c.channels(), samples_frame.nb_samples(), audio_c.sample_fmt(), 1) / planes;
if ((samples_buf == null) || (samples_buf.length != planes)) {
samples_ptr = new BytePointer[planes];
samples_buf = new Buffer[planes];
}
BytePointer ptemp = samples_frame.data(0);
BytePointer[] temp_ptr = new BytePointer[1];
temp_ptr[0] = ptemp.capacity(sample_size);
ByteBuffer btemp = ptemp.asBuffer();
byte[] buftemp = new byte[sample_size];
btemp.get(buftemp, 0, buftemp.length);
// play buftemp[] with AudioTrack ...
}
But only noise is heard from the speakers. Is there any processing that needs to be done on the AVFrame we get from decode_audio4(...)?
The incoming audio stream is correctly encoded with the AAC codec.
Any help or suggestions appreciated.
Thanks in advance.
You can use the FFmpegFrameGrabber class to capture the stream and extract the audio using a FloatBuffer. This is a Java example:
public class PlayVideoAndAudio extends Application
{
private static final Logger LOG = Logger.getLogger(PlayVideoAndAudio.class.getName());
private static final double SC16 = (double) 0x7FFF + 0.4999999999999999;
private static volatile Thread playThread;
public static void main(String[] args)
{
launch(args);
}
@Override
public void start(Stage primaryStage) throws Exception
{
String source = "rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov";
StackPane root = new StackPane();
ImageView imageView = new ImageView();
root.getChildren().add(imageView);
imageView.fitWidthProperty().bind(primaryStage.widthProperty());
imageView.fitHeightProperty().bind(primaryStage.heightProperty());
Scene scene = new Scene(root, 640, 480);
primaryStage.setTitle("Video + audio");
primaryStage.setScene(scene);
primaryStage.show();
playThread = new Thread(() -> {
try {
FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(source);
grabber.start();
primaryStage.setWidth(grabber.getImageWidth());
primaryStage.setHeight(grabber.getImageHeight());
AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);
DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
SourceDataLine soundLine = (SourceDataLine) AudioSystem.getLine(info);
soundLine.open(audioFormat);
soundLine.start();
Java2DFrameConverter converter = new Java2DFrameConverter();
ExecutorService executor = Executors.newSingleThreadExecutor();
while (!Thread.interrupted()) {
Frame frame = grabber.grab();
if (frame == null) {
break;
}
if (frame.image != null) {
Image image = SwingFXUtils.toFXImage(converter.convert(frame), null);
Platform.runLater(() -> {
imageView.setImage(image);
});
} else if (frame.samples != null) {
FloatBuffer channelSamplesFloatBuffer = (FloatBuffer) frame.samples[0];
channelSamplesFloatBuffer.rewind();
ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesFloatBuffer.capacity() * 2);
for (int i = 0; i < channelSamplesFloatBuffer.capacity(); i++) {
short val = (short)((double) channelSamplesFloatBuffer.get(i) * SC16);
outBuffer.putShort(val);
}
/**
* We need this because soundLine.write ignores
* interruptions during writing.
*/
try {
executor.submit(() -> {
soundLine.write(outBuffer.array(), 0, outBuffer.capacity());
outBuffer.clear();
}).get();
} catch (InterruptedException interruptedException) {
Thread.currentThread().interrupt();
}
}
}
executor.shutdownNow();
executor.awaitTermination(10, TimeUnit.SECONDS);
soundLine.stop();
grabber.stop();
grabber.release();
Platform.exit();
} catch (Exception exception) {
LOG.log(Level.SEVERE, null, exception);
System.exit(1);
}
});
playThread.start();
}
@Override
public void stop() throws Exception
{
playThread.interrupt();
}
}
Because the data you are getting in buftemp[] is in AV_SAMPLE_FMT_FLTP format, you have to convert it to AV_SAMPLE_FMT_S16 format using SwrContext, and then your problem will be solved.
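If you prefer not to set up SwrContext, the same conversion can also be done by hand, the way the grabber example above scales float samples to shorts. A rough sketch, assuming the decoded frame is mono FLTP and that you already created an AudioTrack (track) with ENCODING_PCM_16BIT; for stereo planar audio you would read data(1) as well and interleave the planes:

// read the first (and, for mono, only) float plane of the decoded frame
FloatPointer plane = new FloatPointer(samples_frame.data(0));
int n = samples_frame.nb_samples();
short[] pcm = new short[n];
for (int i = 0; i < n; i++) {
    float f = plane.get(i);
    // FLTP samples are normalized to [-1, 1]; clamp and scale to 16-bit
    if (f > 1f) f = 1f;
    if (f < -1f) f = -1f;
    pcm[i] = (short) (f * 32767f);
}
track.write(pcm, 0, pcm.length); // AudioTrack configured for ENCODING_PCM_16BIT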
I wrote the following code to read some data (specifically a file) received by an Android app through a socket:
DataInputStream inputStream = new DataInputStream(socket.getInputStream());
byte[] buffX = new byte[30054];
int k = inputStream.read(buffX,0,30054);
I know that the data I am sending from code written in C is a file of 30054 bytes.
The problem is that the variable k is less than 2000, i.e., it does not read the whole file that was sent, or some part of the file was thrown away. I already checked that the size of the receive buffer (in the Android app) is more than 80 kB.
I tested the same code with a file of 1662 bytes, and as I expected the variable k is equal to 1662.
What am I doing wrong? What am I missing?
Do I need to close the socket? That is something I prefer to do when I close the app, not during the code I showed.
ANDROID APP CODE:
@SuppressLint("HandlerLeak")
public class DisplayNewActivity extends Activity {
...
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.mainnewact);
mHandler = new Handler() { // used to show the number of bytes that were read
@Override
public void handleMessage(Message msg) {
int d2 = (Integer)msg.obj;
commentS.setText(Integer.toString(d2));
}
};
...
cThread = new Thread(new ClientThread()); // used to start socket connection
rThread = new Thread(new RcvThread()); // used to read incoming packages once the socket has been connected
cThread.start();
}
public class ClientThread implements Runnable {
public void run() {
try {
...
socket = new Socket(serverIpAddress, Integer.parseInt(serverPort));
rThread.start();
while (connected) { };
...
} catch (Exception e) { startActivity(intentback);}
}
}
@SuppressLint("HandlerLeak")
public class RcvThread implements Runnable {
public void run() {
while (connected) {
try {
DataInputStream inputStream = new DataInputStream(socket.getInputStream());
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] imBytes = new byte[31000];
int numRead = 0;
while ((numRead = inputStream.read(imBytes)) >= 0) {
baos.write(imBytes,0,numRead);
}
byte[] imageInBytes = baos.toByteArray();
int k = imageInBytes.length;
Message msg = new Message();
msg.obj = k;
mHandler.sendMessage(msg);
} catch (Exception e) {
Log.e("SocketConnectionv02Activity", "C: ErrorRCVD", e);
}
}
}
}
}
C CODE:
...
#include <sys/sendfile.h>
...
int main(int argc, char *argv[]){
int sockfd, newsockfd, portno;
socklen_t clilen;
struct sockaddr_in serv_addr, cli_addr;
int fdfile;
struct stat stat_buf;
off_t offset = 0;
int img2send = 1;
char buffer[256];
int closeSocket = 0;
ssize_t n; // bytes sent by sendfile() / received by recv()
sockfd = socket(AF_INET, SOCK_STREAM, 0);
if (sockfd < 0) {error("ERROR opening socket"); exit(1);}
bzero((char *) &serv_addr, sizeof(serv_addr));
portno = 55000;
serv_addr.sin_family = AF_INET;
serv_addr.sin_addr.s_addr = INADDR_ANY;
serv_addr.sin_port = htons(portno);
if (bind(sockfd, (struct sockaddr *) &serv_addr, sizeof(serv_addr)) < 0) {
error("ERROR on binding");
close(sockfd);
exit(1);
}
listen(sockfd,1);
clilen = sizeof(cli_addr);
newsockfd = accept(sockfd, (struct sockaddr *) &cli_addr, &clilen);
if (newsockfd < 0) {
error("ERROR on accept");
close(sockfd);
exit(1);
}
while (closeSocket == 0) {
if (img2send == 1) { // interchange the file that is sent through the socket
fdfile = open("/home/gachr/Desktop/CamaraTest/fig1bmp.bmp", O_RDONLY);
img2send = 2;
} else {
fdfile = open("/home/gachr/Desktop/CamaraTest/fig2bmp.bmp", O_RDONLY);
img2send = 1;
}
if (fdfile == -1) {
close(sockfd);
close(newsockfd);
exit(1);
} else {
fstat(fdfile, &stat_buf);
offset = 0;
n = sendfile(newsockfd, fdfile, &offset, stat_buf.st_size);
if (n == stat_buf.st_size) { printf("Complete transfering file\n"); }
close(fdfile);
}
sleep(5);
bzero(buffer,256);
n = recv(newsockfd,buffer,1,MSG_DONTWAIT); // to close the socket from the Android app, which is working
if (n > 0) {
if (buffer[0] == 48){ closeSocket = 1;}
}
}
close(newsockfd);
close(sockfd);
return 0;
}
It's hard to say when you are not there :)
But I would do the following:
- read progressively, fewer bytes at a time, and build the complete array of bytes from these smaller chunks
- debug these lines and see exactly when the 'bug' appears
The read method does not read the full stream; it only reads the bytes currently available in the stream buffer.
To read the full data from the stream, you can either use the readFully() method or use the following code:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] bytes = new byte[8192];
int numRead = 0;
while ((numRead = inputStream.read(bytes)) >= 0) {
baos.write(bytes,0,numRead);
}
byte[] fileData = baos.toByteArray();
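The readFully() alternative mentioned above is even shorter when you already know the exact size of the incoming file (30054 bytes in the question); it blocks until the whole buffer is filled, or throws EOFException if the stream ends early:

byte[] buffX = new byte[30054];  // known file size
inputStream.readFully(buffX);    // returns only after all 30054 bytes have arrived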