July 30th, 2013, 01:17 AM
Too many connections to database.
How can I test the website under extremely heavy traffic?
I'm running a national TV ad soon and I want to make sure I don't get a 'too many connections' error.
July 30th, 2013, 01:53 AM
Our QA department likes Siege.
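For reference, a basic Siege run looks roughly like this (the URL and numbers here are placeholders, not from this thread - point it at a staging copy, never your live site):

```shell
# Hypothetical Siege invocation: 100 concurrent users for 30 seconds
# -c = concurrency, -t = duration
siege -c 100 -t 30S http://staging.example.com/
```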
July 30th, 2013, 02:51 AM
For a brute-force, naive approach I use ApacheBench (ab), a command-line tool installed with Apache.
I spin up a new cloud server, install Apache, and then hit my sites and apps until they break.
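As a sketch, an ab run might look like this (the hostname and counts are made up for illustration):

```shell
# Hypothetical ApacheBench run: 1000 requests total, 100 concurrent.
# Run it from a separate box so the load generator doesn't skew results.
ab -n 1000 -c 100 http://test-server.example.com/
```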
The important thing about load testing is to run it from a different server - this way the server is only processing requests and your results are not skewed by the software actually requesting the load.
If you've got any profiling running then you can see any bottlenecks in your code.
If you have mysql workbench you can watch your database performance.
Your most used (and most limited) resource is RAM, so configuring how it is divided between your applications is important: as soon as you run out, your site/app will grind to a halt as the server starts "swapping".
A site allowing 100 concurrent users with a correctly configured RAM distribution will still perform better under high load than a server allowing 1000 with misconfigured RAM. This is because connections are, in a way, queued, so the faster you can handle a request, the more requests you can deal with per second (i.e. a rate). If you can handle 100 requests in 0.1 seconds, that's 1000 per second; but if 150 requests eat all your RAM, then the response time for new requests will start to climb into seconds (or tens of seconds).
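The rate arithmetic can be checked with a quick shell calculation (same numbers as the paragraph above):

```shell
# 100 concurrent requests, each handled in 0.1 s (100 ms)
concurrent=100
request_time_ms=100
# requests per second = concurrent * (1000 ms / request time in ms)
rps=$((concurrent * 1000 / request_time_ms))
echo "$rps requests/second"
```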
Here's a rough guide I work to:
Total RAM - OS use = available RAM
Put the database into RAM (see other posts on how to do this)
available RAM - database size = RAM for PHP
Find out the average size of a PHP request. This could be up to 128 MB (I'd worry if it were). WordPress runs between 20 MB and 40 MB; I try to code for <2 MB
RAM for PHP / PHP request size = number of concurrent connections
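To make the guide concrete, here's the same arithmetic with assumed example figures (a 1 GB box and the <2 MB request size mentioned above - your numbers will differ):

```shell
total_ram=1024       # MB: total server RAM (assumed)
os_use=256           # MB: OS and services overhead (assumed)
db_size=256          # MB: database held in RAM (assumed)
php_request=2        # MB: average PHP request size (the <2 MB target)

available_ram=$((total_ram - os_use))     # Total RAM - OS use
php_ram=$((available_ram - db_size))      # available RAM - database size
concurrent=$((php_ram / php_request))     # number of concurrent connections
echo "$concurrent concurrent connections"
```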
Theoretically, at full load your server would always be operating in RAM, and so would stay fast. As soon as this overflows onto hard disk space (called swapping), the server slows down, because RAM performs several orders of magnitude faster than your HDD. Even if you choose solid state drives (SSDs), these are still one or two orders of magnitude slower than RAM.
July 30th, 2013, 08:28 PM
Northie, in your database class, what things can cause the "too many database connections" error, and how can I fix it?
July 30th, 2013, 11:11 PM
I have MySQL Workbench set up and I can see the real-time status, queries, etc.
I just wonder how I can make ~500 requests at the same time to simulate heavy traffic.
July 31st, 2013, 02:10 AM
Have a look at the MySQL docs here, then see what your connections are set to and alter it if need be.
We've already given you answers on how to simulate multiple requests (Siege and ab) - but don't DoS-attack your live site!
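For example, checking and raising the limit from the MySQL client looks roughly like this (500 is an arbitrary value; SET GLOBAL does not survive a restart, so put the final number in my.cnf as well):

```sql
-- Check the current limit
SHOW VARIABLES LIKE 'max_connections';
-- Raise it for the running server (requires the SUPER privilege)
SET GLOBAL max_connections = 500;
```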
July 31st, 2013, 02:40 AM
I've just turned off request caching for my API and hit my site with 100 concurrent connections repeatedly until I had completed a few thousand requests... MySQL reported no more than 31 connections. Basically, my API is so fast that the reporting tools can't keep up with the number of opens/closes on the database. I'm getting 130 - 150 requests per second for new, uncached requests and 260 - 315 requests per second for cached requests. I'm not using more than 30% of my 512 MB of RAM... so I'm wondering what to look at next in terms of optimisation (or whether to just move on to building the rest of the app!)
August 7th, 2013, 10:49 PM