Best Way to Send Data to WebServer From Bmax app?

BlitzMax Forums/BlitzMax Programming/Best Way to Send Data to WebServer From Bmax app?

Rixarn(Posted 2010) [#1]
Hi!

This is my current scenario.

I'm using Brucey's SQLite mod for data storage in my Bmax app. What I want to do is send data from the SQLite database to my server's database. When I say data, I mean something on the order of hundreds of rows (worst case scenario, thousands of rows) from different tables of the SQLite database to the server's MySQL database.

So far I've used Brucey's LibCurl lib to send data via POST to a PHP script that inserts the data into the MySQL database. But... that's for small chunks of data. Since I have little experience with web servers and databases, I don't know whether that technique will saturate the server if multiple users do it at the same time.
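One way to keep the request count down is to batch many rows into each POST body (e.g. as JSON) and have the PHP script insert the whole batch inside one transaction. A rough Python sketch of the client-side batching (table and column names are made up for illustration; in BlitzMax you'd build the same string and hand it to LibCurl):

```python
import json
import sqlite3

def batch_payloads(conn, table, batch_size=500):
    """Yield JSON payloads of at most batch_size rows each, so the whole
    table goes over in a handful of POSTs instead of one request per row."""
    conn.row_factory = sqlite3.Row
    cur = conn.execute("SELECT * FROM %s" % table)
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        yield json.dumps({"table": table, "rows": [dict(r) for r in rows]})

# demo: an in-memory database standing in for the local SQLite file
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, points INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)",
                 [("player%d" % i, i * 10) for i in range(1200)])

payloads = list(batch_payloads(conn, "scores"))
print(len(payloads))  # 1200 rows / 500 per batch -> 3 POSTs
```

The server-side script then loops over the decoded rows and does one multi-row INSERT, which is far cheaper than 1200 separate requests.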

Any suggestions? Or is this the way to go? What would you do?

Thanks!


siread(Posted 2011) [#2]
I'm in a similar position. I don't have to send hundreds of rows, but I do anticipate hundreds of players submitting data at the same time. I'd also like to retrieve hundreds of rows every few minutes for every player (leaderboards).

I'm currently restructuring my website, moving all the images and download files to Amazon S3. This will take a load off my own server (a cheap VPS). I've also decided to use Amazon SimpleDB to store the leaderboard data, as it's highly scalable and automatically backed up. That just leaves my VPS with the task of hosting the HTML and PHP pages (and my forum). I'm hoping that an upgrade of RAM+CPU on the VPS will suffice, but I really feel I'm stepping into the unknown. What would be ideal is some kind of scalable, cloud-based PHP hosting, but Amazon EC2 is a bit beyond my expertise (I'm no server admin) and I haven't found a simpler solution yet.

Last edited 2011


ima747(Posted 2011) [#3]
PHP runs the server-side front end of millions of websites, and a LOT of BIG ones at that. It's fast. MySQL is basically the industry standard for databases (again, big ones at that); it's fast and can handle HUGE data sets with ease (that's the point of a commercial DB after all :0). Your problem is hardware/provider related. Do some localized tests to see where your limits are.

PHP is designed specifically to handle dynamic data. You could go a little faster with a different server front end (like Perl or even C), but those have their own headaches. Any reasonable server should be able to handle thousands of connections at once, all executing layered, complex PHP systems. If you have a problem, either your server hardware is pretty poor (look for a new host), or it's very poorly optimized for PHP (unacceptable, everyone uses it these days; look for a new host), or your PHP needs optimizing (a simple injector script should be trivially small, so this is unlikely...)

The other aspect is MySQL. MySQL is also designed to be scalable and handle thousands of requests at once. If you see lag introduced there, again, look for a new host. If you think your tables are going to get as big as they sound like they will (multiple users, thousands of entries per user), make sure you're optimizing your tables as best you can to speed things up, especially retrieval (bulk insertion is really fast no matter what; lookups are where the processing time goes, and you're likely to look up more than you store...)
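The usual way to optimize for retrieval is an index on the column you filter by. A quick Python/SQLite sketch of the effect (the CREATE INDEX syntax is essentially the same in MySQL; absolute times will vary by machine, and the table/column names are invented):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (player_id INTEGER, points INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)",
                 [(i % 1000, i) for i in range(50000)])

def lookup_time():
    """Time 200 best-score lookups, the kind a leaderboard page would do."""
    start = time.perf_counter()
    for pid in range(200):
        conn.execute("SELECT MAX(points) FROM scores WHERE player_id = ?",
                     (pid,)).fetchone()
    return time.perf_counter() - start

before = lookup_time()  # every query scans the whole table
conn.execute("CREATE INDEX idx_player ON scores (player_id)")
after = lookup_time()   # every query seeks straight to the matching rows
print("no index: %.3fs  with index: %.3fs" % (before, after))
```

Indexes cost a little on insertion and disk space, but for a read-heavy leaderboard table that trade is almost always worth it.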

If you're concerned things might not keep up, write a sample script, run it once, and time how long it takes to add, say, 1000 entries to a table. This will tell you what the server itself is going to do. If you're talking about that volume of raw data, however, chances are you're going to notice more lag in the transmission than in the storage (a broadband link is not as fast as a processor; you can usually handle data faster than you can transmit it), so you're likely going to want a progress bar, or at least some sort of activity indicator, to show that stuff is being sent.
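A rough sketch of such a benchmark in Python against SQLite (the same shape applies to a PHP script hitting MySQL; times are machine-dependent), comparing one-commit-per-row against a single batched transaction:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (id INTEGER, payload TEXT)")
rows = [(i, "data-%d" % i) for i in range(1000)]

# worst case: one statement and one commit per row
start = time.perf_counter()
for r in rows:
    conn.execute("INSERT INTO entries VALUES (?, ?)", r)
    conn.commit()
per_row = time.perf_counter() - start

# what a batch injector should do: all 1000 rows, one transaction
start = time.perf_counter()
conn.executemany("INSERT INTO entries VALUES (?, ?)", rows)
conn.commit()
batched = time.perf_counter() - start

print("per-row: %.4fs  batched: %.4fs" % (per_row, batched))
```

On a real disk-backed database the gap is much wider than in memory, because each commit forces a sync; that's exactly the difference between a naive injector and a batched one.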

Shot in the dark, but if you're moving that much data from the client to the server, are you just trying to back up the SQLite DB? If so, why not just send the DB file itself and be done with it? If you don't need to process the data on the server, just store it, you don't even need a MySQL DB, just a folder to store DB files in... or if you really wanted, you could even store the SQLite DB file as binary data IN a MySQL server (so there's not just a raw file sitting on the server's HDD, if you want to hide it a bit). You could also send the file to the server and then parse everything on the server side, which would save a LOT of command structure over the course of 1000s of entries, which would in turn make for a smaller/faster transfer...
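A sketch of that last idea in Python (a real server script would presumably be PHP, which can also open SQLite files through PDO; file and table names here are invented): accept the uploaded .db file, open it, and walk its tables server-side.

```python
import os
import sqlite3
import tempfile

def import_uploaded_db(path):
    """Open an uploaded SQLite file and pull every user table out of it;
    a real server script would then re-insert these rows into MySQL."""
    src = sqlite3.connect(path)
    tables = [r[0] for r in src.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    data = {}
    for t in tables:
        data[t] = src.execute("SELECT * FROM %s" % t).fetchall()
    src.close()
    return data

# demo: fake an "uploaded" file, then parse it as the server would
path = os.path.join(tempfile.mkdtemp(), "upload.db")
up = sqlite3.connect(path)
up.execute("CREATE TABLE scores (name TEXT, points INTEGER)")
up.executemany("INSERT INTO scores VALUES (?, ?)", [("a", 1), ("b", 2)])
up.commit()
up.close()

data = import_uploaded_db(path)
print(data["scores"])  # [('a', 1), ('b', 2)]
```

This way only the raw file crosses the wire, with no per-row SQL overhead in the transfer at all.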