We are developing a synchronous multiplayer game. As it stands, one of the players is selected as the server instead of connecting the clients to a dedicated server.
With the restricted environment of mobile apps, should we still be worried about cheating (from the player running the server), or is this a non-issue in the mobile space? Are there any other major concerns we should look out for if we decide to stick with players hosting the game?
All of the below is about Android. iOS is more secure, but the server-load issue applies there too.
If you store game data on the SD card, any app can access that data. You could encrypt it, but it would still be a liability (see the WhatsApp hack: techcrunch.com/2014/03/12/hole-in-whatsapp-for-android-lets-hackers-steal-your-conversations/).
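To illustrate the point about encrypting stored data, here is a minimal JVM sketch using AES-GCM from the standard `javax.crypto` API. Note this only moves the problem: if the key is hardcoded or derivable from the APK, an attacker can still recover the data, which is why on Android you would normally keep the key in the platform keystore rather than alongside the file. The class and field names here are illustrative, not from any particular game.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.security.SecureRandom;
import java.util.Arrays;

public class SaveFileCrypto {
    // Encrypts the payload with AES-GCM; the 12-byte IV is prepended
    // to the ciphertext so decrypt() can recover it.
    static byte[] encrypt(SecretKey key, byte[] plain) throws Exception {
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ct = c.doFinal(plain);
        byte[] out = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return out;
    }

    static byte[] decrypt(SecretKey key, byte[] blob) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key,
               new GCMParameterSpec(128, Arrays.copyOfRange(blob, 0, 12)));
        return c.doFinal(Arrays.copyOfRange(blob, 12, blob.length));
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] save = "player_score=1200".getBytes("UTF-8");
        byte[] blob = encrypt(key, save);
        // Round-trip: decrypting the blob restores the original save data.
        System.out.println(new String(decrypt(key, blob), "UTF-8"));
    }
}
```

GCM also authenticates the data, so a tampered save file fails to decrypt instead of silently loading corrupted values.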
Low-level interception or modification of your game server's network traffic is also a concern (http://www.justbeck.com/modifying-data-in-transit-to-android-apps-using-burp-and-backtrack-5/).
If you are using a Service, make sure it's a local service so it's only accessible from your app.
Also, the "restricted" aspect of Android systems can be easily removed by rooting the device.
Another thing to consider is network and CPU load. Both can grow quickly, making the server laggy or even crashing it, given the relatively low capacity of Android devices compared to dedicated servers. How severe this is depends on the amount of work the server has to do per client.
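A back-of-the-envelope calculation makes the load concern concrete. The numbers below (client count, tick rate, packet size) are made up for illustration; the key point is that the hosting phone must upload every state update to every other client, so upload bandwidth scales with the product of all three:

```java
public class LoadEstimate {
    public static void main(String[] args) {
        // Hypothetical numbers: 7 remote clients, 20 state updates/sec,
        // 256-byte packets. Adjust for your own game's traffic.
        int clients = 7, tickRate = 20, packetBytes = 256;
        // The hosting phone uploads every update to every other client.
        int uploadBytesPerSec = clients * tickRate * packetBytes;
        System.out.println(uploadBytesPerSec);            // 35840 bytes/s
        System.out.println(uploadBytesPerSec * 8 / 1000); // 286 kbit/s
    }
}
```

Roughly 300 kbit/s of sustained upload is already demanding on a weak mobile uplink, and it grows linearly with both player count and tick rate.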
In general, I think dedicated servers are a good idea, even for Android games.
I'd look into this from two different points of view:
Cost/benefit: keep in mind that a dedicated server will impact your budget, so ask yourself whether cheating is really a concern. I'd treat the mobile space like any other space.
Game quality: while #1 is your point of view, this is your players'. Will they feel something is going wrong and suspect cheating? Maybe. You can address this with a reputation system for the player hosting the server.
I have read a lot of answers to similar questions (even if they are old, from around 2013-2014), and I understand that it is not possible to measure this exactly, since Android does not count hardware usage as usage by the app, plus other complications such as services.
At the moment I am trying to compare the performance of an app using one protocol to reach a goal against the same app using another protocol (not well known) to reach the same goal. The default Android battery analyzer is good enough for me, since both cases are about 90% the same and I know how the protocols work.
My problem is that I am not sure which tool is best for measuring the mAh consumed by my app. I know there are external apps that show this, but I would prefer to use the default one. I believe this matters not only to me but to other people who might have to compare different protocols.
I know I can measure it programmatically, and I have done so: I save the battery percentage when the app is opened and how much has been consumed by the time it is closed. But that is not an exact measure, since while the app is open other apps can do heavy work and add noise to what I am measuring, so I would prefer to use Android's battery analyzer.
Get a spare device. Charge it fully, then run the protocol until shutdown with no other interaction (no YouTube or anything), and note how long it lasted. Repeat with the other protocol. IMHO that is a fair way to compare. Note that every device behaves differently, and it may or may not be possible to transfer the result to other devices, e.g. with different network chips, processors or even firmware versions.
For an even fairer comparison, I think you should compare how the protocols work (i.e. number of interactions, payload size, etc.), because any power-consumption figure can only ever be an estimate.
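The protocol-level comparison above can be sketched with a crude energy model: each message wakes the radio (a fixed cost) and each byte adds a marginal cost. The constants below are placeholders, not measured values; the point is only that a chatty protocol with many small messages loses to a batched one even when total payload is identical:

```java
public class ProtocolEnergyEstimate {
    // Very rough model: cost = messages * (wakeup cost + bytes * per-byte cost).
    // Costs are in microjoules and are made-up placeholders; calibrate
    // them on a real device before drawing conclusions.
    static long estimateMicroJoules(int messages, int bytesPerMessage,
                                    long uJPerWakeup, long uJPerByte) {
        return (long) messages * (uJPerWakeup + bytesPerMessage * uJPerByte);
    }

    public static void main(String[] args) {
        // Protocol A: chatty, 1000 messages of 64 bytes.
        long chatty = estimateMicroJoules(1000, 64, 5000, 10);
        // Protocol B: batched, 100 messages of 640 bytes (same total payload).
        long batched = estimateMicroJoules(100, 640, 5000, 10);
        System.out.println(chatty);  // 5640000
        System.out.println(batched); // 1140000
    }
}
```

With the same 64 KB of payload, the chatty protocol costs roughly five times more in this model because the fixed per-wakeup cost dominates small messages.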
So I have been experimenting with multipeer networks. Ultimately I am going to try different frameworks to build one that can connect devices of the same OS through Bluetooth and WiFi, and devices of different types through WiFi.
My first attempt was Apple's Multipeer Connectivity. Unfortunately I got about 0.5 seconds of delay (I didn't actually measure this, it's just an estimate) before even one bit of information actually got to the other device. I suspect the framework is optimized for larger, encrypted payloads far more than for 1-32 bit jobs.
I was just wondering what you know about the latency of other frameworks out there, since it takes a decent chunk of time for me to learn each new framework. Is a latency of about 0.5 seconds the best the industry has?
Honestly, I would be happy if there were a library optimized to send 1 bit to each connected device every 1/60th of a second. But I think most of these networks package up the data as if it were much larger anyway.
I sort of wish mobile devices had a low-latency local wireless mode. Just look at systems like the 3DS, which can do multi-peer multiplayer (Smash Bros.) with really small latency and great accuracy.
Try changing the MCSessionSendDataMode to MCSessionSendDataUnreliable
MCSessionSendDataUnreliable
Messages to peers should be sent immediately without socket-level queueing. If a message cannot be sent immediately, it should be dropped. The order of messages is not guaranteed.
This message type should be used for data that ceases to be relevant if delayed, such as real-time gaming data.
It depends how reliable you really need the data to be, but on a closed local network it should be quite reliable anyway.
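Multipeer Connectivity itself is an Objective-C/Swift API, but the reliable-vs-unreliable trade it exposes is the same one as TCP vs UDP. To keep the examples in one language, here is the unreliable side sketched with plain UDP in Java: fire-and-forget, no delivery or ordering guarantee, which is exactly what `MCSessionSendDataUnreliable` buys you for real-time game state. This is an analogy, not the framework's actual transport.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UnreliableSendDemo {
    // Send one byte of "game state" over UDP on loopback and read it back.
    // UDP gives no delivery or ordering guarantee, the same trade
    // MCSessionSendDataUnreliable makes: stale packets are simply dropped.
    static int roundtrip() throws Exception {
        try (DatagramSocket receiver = new DatagramSocket();
             DatagramSocket sender = new DatagramSocket()) {
            byte[] state = new byte[] {42}; // one byte of game state
            sender.send(new DatagramPacket(state, state.length,
                    InetAddress.getLoopbackAddress(), receiver.getLocalPort()));
            byte[] buf = new byte[1];
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            receiver.setSoTimeout(1000);
            receiver.receive(in); // on loopback this almost always arrives
            return buf[0];
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundtrip());
    }
}
```

For 60 Hz game state you resend the latest value every tick, so a lost packet costs one frame rather than stalling the stream the way a reliable, ordered channel would.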
My app needs to store MP4 files on a server and be able to quickly grab and stream them via the MediaPlayer. I have been recording from my app and storing the files on AWS S3, but when I grab them to stream through the MediaPlayer it is sometimes fast but often slow. I was hoping for some guidance on the best approach. The videos need to stream back fast, but S3 seems a bit slow and can be costly. What are the alternatives or best solutions?
S3's key selling point is high availability storage, rather than speed of access, especially if you need to access the content in many different geographic locations.
For reliable low latency distribution of video (i.e. minimum stops for buffering) you want a Content Distribution Network solution (CDN). In very simple terms, this creates cached copies of your content at the edge of the network so it can be accessed quickly.
Amazon's CDN solution is CloudFront, and it is designed to integrate with content stored on S3. The link below gives a good walkthrough of setting up CloudFront for S3 content. Note that it does have a cost, so you will need to check it fits your budget (other CDNs are available; they are all similar in concept):
http://www.shootingbusiness.com/amazon-video-streaming-slow/
If your needs are small and localised, and you can test and confirm that performance is OK, you may be fine simply hosting the video on EC2/EBS, with a backup on S3 in case of issues. Again, you would probably need to run the different scenarios through the Amazon price calculator to decide the best approach for your needs.
I have never seen performance issues with this EC2/EBS approach for a small user base, certainly for users in the same general geographic area as the AWS availability zone, but it does not necessarily scale well, especially with a more distributed user base.
So I am working on a quiz game in Android where two players play against each other on different devices.
I am trying to figure out the correct way to set up the server communication with the devices. I want both devices to know when both players have given their answers to a question, so they can receive the game result.
My first thought was that both devices would repeatedly ask the server whether the other device is finished so they can fetch the game result. But I am starting to think this is a bad idea, as it will cause a lot of unnecessary traffic and some background performance cost.
So what is the correct way of doing this?
The Google Android way of doing this would be to use Google Cloud Messaging (GCM). This approach is battery- and processor-efficient, supports broadcasting to up to 1,000 users at once, and has built-in handling for outdated/expired messages.
http://developer.android.com/training/cloudsync/gcm.html
Of course there are other ways of communicating that may be correct/valid, but this approach is the best fit for your stated requirement.
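For context, the server side of a GCM push was a plain HTTPS POST to `https://android.googleapis.com/gcm/send` with an `Authorization: key=YOUR_API_KEY` header and a JSON body. A minimal sketch of building that body, with placeholder registration ID and data field names (the `data` keys are whatever your app defines, not part of the API):

```java
public class GcmPayload {
    // Builds the JSON body for a GCM send request. The registration ID
    // comes from the device at registration time; "event" and "match"
    // are hypothetical app-defined data fields.
    static String buildBody(String registrationId, String matchId) {
        return "{\"registration_ids\":[\"" + registrationId + "\"],"
             + "\"data\":{\"event\":\"opponent_answered\","
             + "\"match\":\"" + matchId + "\"}}";
    }

    public static void main(String[] args) {
        // In a real server you would POST this body with your API key;
        // here we only print it.
        System.out.println(buildBody("REG_ID_FROM_DEVICE", "match-17"));
    }
}
```

So when player A submits an answer, your server pushes an `opponent_answered` message to player B's device instead of B polling, and both devices fetch the result only when told it is ready.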
I'm currently working on an app with the end goal of being roughly an Android analogue of AirPlay for iDevices.
Streaming media and all that is easy enough, but I'd like to include games as well. The problem is that to do so I'd have to stream the screen.
I've looked around at various things about taking screenshots (this question and the derivatives from it in particular), but I'm concerned about the frequency/latency. When gaming, anything less than 15-20 fps simply isn't going to cut it, and I'm not certain such is possible with the methods I've seen so far.
Does anyone know if such a thing is plausible, and if so what it would take?
Edit: To make it more clear, I'm basically trying to create a more limited form of "remote desktop" for Android. Essentially, capture what the device is currently doing (movie, game, whatever) and replicate it on another device.
My initial thought is to simply grab the audio buffer and the frame buffer and pass them through a socket to the other device, but I'm concerned that the methods I've seen for capturing the frame buffer are too slow for the intended use. I've seen people mention limits of around 3 FPS for some of the more common ways of accessing the frame buffer.
What I'm looking for is a way to get at the buffer without those limitations.
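It is worth doing the bandwidth arithmetic before hunting for a faster capture path, because raw framebuffer streaming is enormous even at modest resolutions. The numbers below assume a 720p RGBA buffer at the 20 fps floor mentioned above:

```java
public class FramebufferBandwidth {
    public static void main(String[] args) {
        // A 720p RGBA framebuffer streamed raw at a game-friendly rate.
        int width = 1280, height = 720, bytesPerPixel = 4, fps = 20;
        long bytesPerSecond = (long) width * height * bytesPerPixel * fps;
        System.out.println(bytesPerSecond);                 // 73728000 bytes/s
        System.out.println(bytesPerSecond / (1024 * 1024)); // 70 MiB/s
    }
}
```

Around 70 MiB/s is far beyond WiFi at typical real-world rates, which is why screen-mirroring systems compress with a hardware video encoder rather than shipping raw pixels; the capture rate is only half the problem.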
I am not sure what you are trying to accomplish when you refer to "streaming" a video game.
If you are trying to mimic AirPlay, all you need to do is connect to a device via Bluetooth or an internet connection and allow sound, then save the results or handle them accordingly.
But video games do not "stream" a screen, because the mobile device cannot handle much of a workload. There are other problems too: how will you handle the game if the player loses their internet connection while playing? On top of that, this would require a lot of servers and bandwidth to support the game workload on the backend.
If you are trying to create an online game, essentially all you need to do is send and receive messages from a server. That is simple. If you want to "stream" to another device, simply connect the mobile device to speakers or a TV. Just about all mobile video games and applications send simple messages via JSON or something similar; this reduces overhead, keeps the syntax simple, and works across multiple platforms.
It sounds like you should take a look at this (repost):
https://stackoverflow.com/questions/2885533/where-to-start-game-programming-for-android
If not, this is more of an open question about how to implement a video game.