Dragon’s Den Contestants And The Web

While I missed the first half of Dragon’s Den (Irish edition) this evening, I was following the chatter on Twitter.

It’s 2009, so most of the entrants have a web presence of some kind.

Unfortunately, from what I was able to see, a lot of them experienced issues with their websites both during and shortly after the programme aired.

I’ve no idea how many people watch TV with a laptop close by, but judging by the level of activity on Twitter (the #ddire tag becoming one of the most popular this evening) there was a lot of interest.

It doesn’t matter who you host your website with: if you don’t do some advance preparation, a sudden spike in traffic can take you offline.

So here’s a small bit of free advice for anyone going on Dragon’s Den.

Talk to your web developers.

Talk to your hosting provider.

Maybe you need to move your website to a beefier machine.

Maybe you need to tweak your website’s code to improve responsiveness.

Maybe you need to invest a little bit more in hosting.

Even if the Dragons don’t decide to invest in your business idea, those few minutes on primetime national TV are worth their weight in gold. If your website is slow or “dead” then you’re possibly losing your best chance at “getting your name out there”.


7 Responses to Dragon’s Den Contestants And The Web

  1. Forbairt February 20, 2009 at 00:10 #

    Have to agree… I tried hitting a few of the sites while it was on. I’ll admit I wasn’t watching it, but I was following Twitter.
    I regularly have my MBP in front of me while “something is on”, so it’s pretty important that your site doesn’t go down. I’ve practically forgotten about the websites that were mentioned now, but they had a five-minute-ish window when I’d have clicked through to their site. If other people are like me, and potentially their customers, then they’ve just lost a hell of a lot of business… whether it’s 5, 10 or 50 customers. For a website to go down, I presume it’s getting hit by 100–1000 people at the one time.

  2. Paul McClean February 20, 2009 at 00:13 #

    It was great fun watching people’s reactions to the show on Twitter, almost like the old days of TV/Radio simulcasts.

  3. Michele Neylon February 20, 2009 at 00:20 #

    Forbairt – While I managed to reach all the sites that I tried during the show, several of them were really, really slow to load.

  4. Leon Quinn February 20, 2009 at 00:31 #

    I wonder how one might measure the amount of traffic it takes to slow or kill a site, i.e. how many visitors at the same time. Is it measurable?

  5. Michele Neylon February 20, 2009 at 00:37 #

    There is no simple answer.
    I guess the key is to work on the basis of faster and simpler = better.
    If a page needs X number of SQL queries to render, how can you reduce that?
    Can you cache it?
    Does it even need to be dynamic?
    If you control the server, the obvious things need to be tweaked, i.e. the number of Apache / IIS processes etc.
    If you’re expecting the site to be popular, then doing some basic load testing would definitely help.
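    To illustrate the caching point above, here’s a minimal sketch of time-based page caching; the function names and the 60-second window are just assumptions for the example, not anyone’s actual setup:

    ```python
    import functools
    import time

    def cache_for(seconds):
        """Cache a zero-argument function's result for a fixed time window,
        so repeated page hits skip the expensive rendering/SQL work."""
        def decorator(fn):
            state = {"value": None, "expires": 0.0}

            @functools.wraps(fn)
            def wrapper():
                now = time.monotonic()
                if state["value"] is None or now >= state["expires"]:
                    state["value"] = fn()  # the expensive render happens here
                    state["expires"] = now + seconds
                return state["value"]
            return wrapper
        return decorator

    calls = {"count": 0}

    @cache_for(seconds=60)
    def render_homepage():
        # Stand-in for a page render that would run many SQL queries.
        calls["count"] += 1
        return "<html>...</html>"

    # Simulate a burst of 1000 hits: the "database" is only touched once.
    for _ in range(1000):
        render_homepage()
    print(calls["count"])  # 1
    ```

    Even a short cache window like this can absorb the kind of burst a TV mention produces, since most viewers hit the same handful of pages within a few minutes.
    
    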

  6. oisin February 20, 2009 at 15:32 #

    Hi there,
    My company created the site and animation for http://www.cleancash.com. We recently moved our hosting from Hosting365 to Blacknight and are pretty happy with the service.
    Anyway, as a precautionary measure against the traffic spike during/after the Dragon’s Den show, I moved the multimedia content from Blacknight hosting to Amazon S3. I monitored it throughout and was able to access it without a problem. Looking at the #ddire Twitter channel, I think most people were able to?
    From Blacknight’s perspective, would you recommend people in a similar situation do the same, or have you a way of making provisions for traffic spikes?
    Amazon S3 is an extremely cheap service, and the designer/developer doesn’t have to ring Amazon up and warn them about a future spike in traffic.

  7. Michele Neylon February 20, 2009 at 16:06 #

    The problem is when people don’t even bother checking how optimised their code is.
    For example, one developer I know was telling me about a CMS that generates 100 SQL queries per page. Without any optimisation or caching, that kind of overhead will cause even the beefiest of servers to have issues if there is a traffic spike.
    I wouldn’t advocate moving the multimedia to S3, as it will cause latency issues. If you check the load and make sure that the code etc. is as lean as possible, and that Apache / IIS is tweaked, it shouldn’t be a major issue.
    Obviously, if you’re doing a lot of heavy traffic all the time, you’d need to look at how the site is coded and the content separated.
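    To put that 100-queries-per-page figure in perspective, a quick back-of-envelope calculation; the request rate is a made-up example of a primetime-TV spike, not a measured number:

    ```python
    # Back-of-envelope: why 100 SQL queries per page hurts under a spike.
    queries_per_page = 100     # the CMS figure mentioned above
    requests_per_second = 50   # hypothetical spike from a TV mention
    db_load = queries_per_page * requests_per_second
    print(db_load)  # 5000 queries per second hitting the database
    ```

    Caching just the handful of most-visited pages would knock almost all of that load off the database, which is why the query count per page matters far more than raw server size.
    
    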