How to handle traffic spikes on a Drupal website without a dedicated hosting plan
Everyone who starts building websites dreams of reaching as many people as possible. Not many sites actually sustain more than a few hundred visitors per month. But with the profusion of social bookmarking sites, even low-traffic sites have a chance of getting bursts of very high traffic once in a while, for example when they get slashdotted. The question is how to survive such traffic spikes without burning a large hole in your pocket.
The problem with burst traffic is that a hosting plan capable of surviving such spikes will be prohibitively expensive to maintain during regular-traffic days. And you can never predict when 'your day' is going to come. So you either pay the hefty charges and lie in wait for 'your day' to arrive, or stay unprepared and let the burst traffic take down your site and the hosting server with it.
There is, however, a middle ground between these two extremes: make the best use of what you have. Drupal gives you fairly good control over your site's performance through its caching and throttle mechanisms. You can tweak these two to ensure your site gets the most out of your hosting server's capabilities.
Caching Settings
1) Make sure that you have turned on page caching and block caching.
2) Set your cache lifetime to the average interval between your site updates.
3) Enable optimization (aggregation and compression) of your JavaScript and CSS files.
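The three settings above are normally toggled on Drupal's Performance page, but they can also be pinned in settings.php so they survive accidental changes. A minimal sketch, assuming Drupal 5/6-era variable names (the lifetime value is an illustrative placeholder):

```php
<?php
// In sites/default/settings.php — variable names assume Drupal 5/6.
$conf['cache'] = 1;              // enable page caching for anonymous users
$conf['block_cache'] = 1;        // enable block caching
$conf['cache_lifetime'] = 3600;  // e.g. one hour between site updates
$conf['preprocess_css'] = 1;     // aggregate and compress CSS files
$conf['preprocess_js'] = 1;      // aggregate JavaScript files
```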
Throttle Settings
Go to Admin > Site Building > Modules and enable throttling for modules that you think can be disabled without drastically reducing the usability of your site for anonymous users. When you set throttling for a module, throttle its dependent modules as well. Be extremely careful not to throttle essential modules whose absence would change site behavior or break essential functionality.
Once these settings are in place, you will have effectively increased your hosting plan's capacity to handle traffic to the same Drupal site.
You have now done your best to handle a spike as far as Drupal is concerned. A normal shared hosting server will still go down if your site suddenly gets 10K visitors in a day. There is, however, one more hacky trick you can use to stretch the capacity of your hosting account one more notch. It is not the Drupal way of doing things, but it will help you keep your site up longer and handle small spikes easily.
Hack to extend shared hosting capacity to survive spikes
Burst traffic normally enters your site through a single URL that was bookmarked on a social bookmarking site like Slashdot or Digg. This page is going to be your weak point. The assumption is that you will get a hint of the burst traffic as it starts; normally you can, if you monitor your site regularly. Read further only if you think you can get this hint somehow. One of our sites, Ubuntu Manual, got dugg one day and received 13,000 visits in a day. We had a person in charge of the site, and he was able to identify the spike (and its source) as it grew.
If you can catch it in time, the hack is as follows. Note down the URL that is the floodgate. Carefully recreate the complete path structure of this URL in your hosting account. Log out of your site, browse to the URL being besieged, use your browser to view the source of the page, and save it at the path mentioned above. If your path structure does not end with a file name, save the file as index.html and modify your .htaccess to list index.html along with index.php in the DirectoryIndex directive.
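The steps above can be sketched in a few shell commands, assuming shell access to a Unix-like hosting account; the domain, path, and docroot location below are hypothetical placeholders, not values from the article:

```shell
# Suppose the besieged URL is http://example.com/ubuntu/manual
# (domain and path are placeholders).
DOCROOT="$PWD/docroot"      # your hosting account's web root
HOTPATH="ubuntu/manual"     # the path portion of the flooded URL

# 1) Recreate the URL's directory structure under the web root.
mkdir -p "$DOCROOT/$HOTPATH"

# 2) Save the rendered page as index.html at that path. While logged
#    out, you could fetch it with, for example:
#      curl -o "$DOCROOT/$HOTPATH/index.html" "http://example.com/$HOTPATH"
#    Here we write a stand-in page so the sketch is self-contained.
echo '<html><body>static copy of the besieged page</body></html>' \
  > "$DOCROOT/$HOTPATH/index.html"

# 3) Have Apache try index.html before falling back to Drupal's
#    index.php. In a real Drupal docroot, edit the DirectoryIndex line
#    in the .htaccess Drupal ships rather than appending a new file.
printf 'DirectoryIndex index.html index.php\n' >> "$DOCROOT/.htaccess"
```

Once the spike subsides, remove the static index.html (and revert the DirectoryIndex change) so the page is served fresh by Drupal again.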
This hack takes a heavy load off your server, since it no longer has to go through the Drupal bootstrap process to serve that particular page. Visitors who click through from that page to other pages on your site will still be served by Drupal, but they are usually only a small fraction of the spike, and your server should be able to handle them just as on a normal day.
What we have here is only a way to stretch your resources to the maximum. There will still be a limit to the traffic any given server can handle for a given spec. If you want near-unlimited scalability, you will want to look at a cloud-based solution like Amazon Web Services.