How to use the Android support library

In the 13th revision of the v4 support library, Google introduced the SlidingPaneLayout. I don't know how I should start implementing it, and the documentation doesn't really seem to help. Could someone please clarify this for me?

SlidingPaneLayout is a layout that provides a sliding facility between two different views.
Left side: the master part. It usually contains a list of values (e.g. contacts and so on).
Right side: the detail part. It contains the details of the value selected on the left side.
This component divides the available screen space into two panes that don't overlap and can be slid horizontally.
Visit this tutorial link for a step-by-step implementation.
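As a sketch, the two-pane setup described above is declared in a layout resource along these lines (the view ids and widths here are illustrative, not from the question):

```xml
<!-- res/layout/activity_main.xml: a minimal two-pane sketch -->
<android.support.v4.widget.SlidingPaneLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/sliding_pane_layout"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- Left pane: the master list -->
    <ListView
        android:id="@+id/master_list"
        android:layout_width="280dp"
        android:layout_height="match_parent"
        android:layout_gravity="start" />

    <!-- Right pane: the detail view; layout_weight lets it fill the remaining space -->
    <FrameLayout
        android:id="@+id/detail_container"
        android:layout_width="300dp"
        android:layout_height="match_parent"
        android:layout_weight="1" />
</android.support.v4.widget.SlidingPaneLayout>
```

When the screen is too narrow for both panes, the detail pane slides over the master pane; on wide screens they sit side by side.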

This is a very strange requirement. Are you sure you've understood it correctly?
if all the three client sends their request at exactly the same time
There's no such thing as 'exactly the same time' in a network: (a) it is undiscoverable, and (b) the network is a sequential medium. So the connect requests will arrive in a sequence, not simultaneously. Specifically, accept() will return one Socket at a time, in whatever order your local TCP stack decides is appropriate.
Do I need to create an array of socket to handle these three request independently
I don't see why. You just need to create Socket variables to store the result of each accept(), and handle each Socket in a separate thread.
If I use a single incomingLink socket
You can't. The suggestion doesn't make sense. Each accepted Socket is a separate object.
Will the underlying transport protocol (TCP/UDP) do the work for me of handling these simultaneous requests, buffering them, maintaining a proper order, and then supplying them one after another to my ServerSocket
Yes, see above.
so that my single incomingLink Socket will handle them properly?
I don't know what this means.
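The accept-then-thread pattern described above can be sketched in plain Java. This is a minimal, self-contained demo (class and message names are made up for illustration): accept() hands back one Socket per client, in arrival order, and each is served on its own thread.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

public class AcceptDemo {
    // Handle one accepted connection: each accept() yields its own Socket object.
    static void handle(Socket client) {
        try (Socket c = client;
             PrintWriter out = new PrintWriter(c.getOutputStream(), true)) {
            out.println("hello from " + Thread.currentThread().getName());
        } catch (IOException ignored) {
        }
    }

    // Accept three connections; accept() returns them one at a time, in the
    // order the local TCP stack queued them, and each is served on its own thread.
    static int serveThree(ServerSocket server) throws IOException {
        int served = 0;
        for (int i = 0; i < 3; i++) {
            Socket client = server.accept();          // one Socket per client
            new Thread(() -> handle(client)).start(); // independent handling
            served++;
        }
        return served;
    }

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // ephemeral port
            int port = server.getLocalPort();
            Thread acceptor = new Thread(() -> {
                try { serveThree(server); } catch (IOException ignored) {}
            });
            acceptor.start();

            // Three clients connecting "at the same time" still arrive in sequence.
            for (int i = 0; i < 3; i++) {
                try (Socket s = new Socket("127.0.0.1", port);
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(s.getInputStream()))) {
                    System.out.println(in.readLine());
                }
            }
            acceptor.join();
        }
    }
}
```

Note there is no "array of sockets" anywhere: the three Socket references only exist as locals handed to their handler threads.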

Dealing with parallel read operations at Firebase

I wonder what is the best way to deal with multiple parallel read operations at Firebase (Java-based/Android), that is, registering a single-value event listener and handling data in the onDataChange event. Let's say I have three different DatabaseReference locations and three listeners registered right away.
Is it possible to bundle the whole thing somehow so there would be just one request?
What is the best way to implement a 'wait for', so that code is executed as soon as the last of the three onDataChange events delivers data, without nesting them (where each subsequent listener would only be registered in the callback of the previous one)?
There is no way (nor a need) to combine the reads into a single request. The Firebase database client pipelines the requests over a single connection. See my answer here for how this works and why this addresses the concerns about round trip performance that most developers have.
In JavaScript this is handled by Promise.all(). On Android you can use Tasks to accomplish the same. Doug Stevenson wrote a great series of blog posts on this, which I recommend reading for details: part 1, part 2, part 3 (which covers chaining tasks using a Continuation), and part 4 (which finally covers running tasks in parallel using Tasks.whenAll(...)).
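The wait-for-all pattern behind Tasks.whenAll(...) and Promise.all() can be sketched in plain Java using CompletableFuture as a stand-in for Firebase's Task API (so the sketch stays runnable outside Android; the paths and return values are made up):

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;

public class WaitForAll {
    // Simulate an asynchronous read of one database location
    // (a stand-in for a Firebase single-value read).
    static CompletableFuture<String> readLocation(String path) {
        return CompletableFuture.supplyAsync(() -> "data@" + path);
    }

    static List<String> readAll() {
        CompletableFuture<String> a = readLocation("/users");
        CompletableFuture<String> b = readLocation("/posts");
        CompletableFuture<String> c = readLocation("/comments");

        // Like Tasks.whenAll / Promise.all: completes only when all three are done.
        return CompletableFuture.allOf(a, b, c)
                .thenApply(v -> List.of(a.join(), b.join(), c.join()))
                .join();
    }

    public static void main(String[] args) {
        System.out.println(readAll());
    }
}
```

The three reads run in parallel (pipelined over one connection in Firebase's case), and the continuation fires exactly once, when the last of them delivers data.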

More control over priority-job-queue

So, I have implemented priority-job-queue, which is perfectly documented and meets all my requirements. I have some difficulties, though. Per the client's request, I had to divide network calls into two parts: offline (queued server calls via priority-job-queue) and run-time (instant server calls). In short, what I'm trying to accomplish is to execute all queued server calls before an instant-run call (which is independent of priority-job-queue) is executed. Is there any way to handle this case? I would appreciate any help.
NOTE: I know about a method called JobManager.count(), but after reading this post (https://github.com/yigit/android-priority-jobqueue/issues/193) I'm a little confused about whether it really returns the count of pending jobs or not.
Why not just give those instant calls a higher priority and run them using priority-job-queue as well?
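The suggestion above boils down to ordering all work through one priority queue. A plain-Java sketch of that idea, with a hypothetical Job type (the names and priority values are illustrative, not from android-priority-jobqueue's API, though they mirror the priority you pass to its Params):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityDemo {
    // A hypothetical job with an explicit priority; higher values run first.
    static class Job {
        final String name;
        final int priority;
        Job(String name, int priority) { this.name = name; this.priority = priority; }
    }

    static List<String> executionOrder(List<Job> submitted) {
        // Comparator puts the highest priority at the head of the queue.
        PriorityBlockingQueue<Job> queue = new PriorityBlockingQueue<>(
                11, (a, b) -> Integer.compare(b.priority, a.priority));
        queue.addAll(submitted);

        List<String> order = new ArrayList<>();
        for (Job j = queue.poll(); j != null; j = queue.poll()) {
            order.add(j.name);
        }
        return order;
    }

    public static void main(String[] args) {
        List<Job> jobs = List.of(
                new Job("queued-sync", 1),
                new Job("instant-call", 10),  // "instant" calls get the top priority
                new Job("queued-upload", 1));
        System.out.println(executionOrder(jobs));
    }
}
```

With everything in one queue, an "instant" call simply jumps ahead of queued work instead of racing it from outside the queue.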

Screen mirroring among Android and iOS platform or same platforms

I am looking to build an app that supports screen mirroring.
I'm just not able to work out how to implement it. I have read multiple docs on Chromecast and Reflector 2. I need to build an app in which I can simply share my screen from Android to iPhone, or on the same platform.
Please help; any references or advice will be appreciated.
You may consider making a function which returns a UIImage of your screen, with something along the lines of:

```swift
UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0.5)
view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
```

with whatever quality parameters you set. The image context is thread-safe, don't worry: https://developer.apple.com/documentation/uikit/1623912-uigraphicsbeginimagecontextwitho. I don't know the quality settings you may want, but you can change that. Then you could make a function/class that sends whatever data you have in a serialized buffer. Since this is screen mirroring, a type of streaming, you may want to use UDP as your transport protocol: it doesn't care whether some packets are lost, because they just keep coming, i.e. there's no 3-way handshake. See the forum post "Swift: Receive UDP with GCDAsyncUdpSocket" for more on transferring data via UDP. In short, you'll need to serialize your data (convert it to bytes) to send it in a small format, and deserialize it on the other end, where your socket-connected device listens for new data and converts it back into an image. Finally, you need to ensure that your screen-capture function, which returns one image, is called several times per second. Good luck!
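The UDP send/receive mechanics described above look the same on any platform. A plain-Java sketch (chosen over Swift so it is self-contained; the "frame" here is just a stand-in for one serialized screen capture):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class UdpDemo {
    // Send one serialized frame over UDP and receive it on a local socket.
    static String roundTrip(byte[] frame) throws Exception {
        try (DatagramSocket receiver = new DatagramSocket(0); // ephemeral port
             DatagramSocket sender = new DatagramSocket()) {
            DatagramPacket out = new DatagramPacket(
                    frame, frame.length,
                    InetAddress.getLoopbackAddress(), receiver.getLocalPort());
            sender.send(out); // fire-and-forget: no handshake, no delivery guarantee

            byte[] buf = new byte[65507]; // maximum UDP payload
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            receiver.receive(in);
            return new String(in.getData(), 0, in.getLength(), StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("frame-1".getBytes(StandardCharsets.UTF_8)));
    }
}
```

In a real mirroring app the sender loop would call the screen-capture function several times per second and fire each serialized image at the receiver, accepting that some frames may be dropped.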

Choosing correct Chat Protocol implementation

I am trying to create a forum-based group chat application on Android. I need to be able to draw and send voice messages over chat.
I am confused between IRC and XMPP as the chat protocol to use. Can someone please advise me in this regard?
I feel IRC is better for my application, as it is mainly designed for group communication in discussion forums, but I am not sure whether IRC supports anything other than text messages.
You can send any kind of binary data (images, sound, etc.) as plain text using binary-to-text encoding schemes, such as Base64.
You must take care that the chosen encoding's character set does not collide with your protocol's method of delimiting messages. Another common issue is the maximum message size the protocol allows: you may need to implement some kind of chunked message in the protocol, plus a MIME type that describes the binary content.
Here you can find a list of common binary-to-text encoding standards.
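The Base64 round trip described above is a one-liner on each side with the standard library (the "voice" bytes here are made-up stand-ins for real audio data):

```java
import java.util.Base64;

public class Base64Demo {
    // Encode arbitrary binary data (e.g. a voice clip) as text that is safe
    // to embed in a text-based chat protocol, then decode it on the other end.
    static String encode(byte[] binary) {
        return Base64.getEncoder().encodeToString(binary);
    }

    static byte[] decode(String text) {
        return Base64.getDecoder().decode(text);
    }

    public static void main(String[] args) {
        byte[] voice = {(byte) 0xFF, 0x00, 0x7F, 0x10}; // pretend audio bytes
        String wire = encode(voice); // uses only A-Z, a-z, 0-9, '+', '/', '='
        System.out.println(wire);
    }
}
```

Because the output alphabet is fixed and printable, you can pick a message delimiter outside it (e.g. a newline) without the encoded payload ever colliding with it.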
For drawing in "real time", the simplest solution is to send clients a snapshot of the image currently being drawn on the drawing client. If you do that 10 times a second, you get a 10-frames-per-second animation of the drawing. To optimize this, there is a technique called delta encoding, sometimes called delta compression: a way of storing or transmitting data as differences between sequential data (in this case, images) rather than complete files. So the client receives just the difference between two "frames", and the only thing it needs to do is merge the current "frame" with the difference to show the next "frame".
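The diff-and-merge idea behind delta encoding can be sketched with a toy byte-wise XOR delta (real delta compression would also compress away the zero runs, but the merge step works the same way; all names here are illustrative):

```java
import java.util.Arrays;

public class DeltaDemo {
    // Toy delta encoder: the "delta" between two equal-length frames is the
    // byte-wise XOR, so it is mostly zeros when little has changed.
    static byte[] delta(byte[] prev, byte[] next) {
        byte[] d = new byte[next.length];
        for (int i = 0; i < next.length; i++) {
            d[i] = (byte) (prev[i] ^ next[i]);
        }
        return d;
    }

    // Merging the previous frame with the delta reproduces the next frame,
    // because XOR is its own inverse: prev ^ (prev ^ next) == next.
    static byte[] merge(byte[] prev, byte[] d) {
        return delta(prev, d);
    }

    public static void main(String[] args) {
        byte[] frame1 = {10, 20, 30, 40};
        byte[] frame2 = {10, 21, 30, 40};   // one "pixel" changed
        byte[] d = delta(frame1, frame2);   // mostly zeros: cheap to send
        byte[] rebuilt = merge(frame1, d);  // client reconstructs frame2
        System.out.println(Arrays.toString(rebuilt));
    }
}
```

The sender transmits only `d` for each new frame; the client keeps the last frame it showed and merges each incoming delta into it.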
