When Coldplay released the first single, Violet Hill, from its long-awaited new album Viva la Vida as a free download, an estimated 600,000 fans rushed to www.coldplay.com. Hitwise reported that US visits to coldplay.com increased 1,800 per cent following the free release of Violet Hill. Not surprisingly, the huge spike in traffic caused the site to crash.
And when Radiohead released its latest album, In Rainbows, online last October and let fans name their price, the band's website www.radiohead.com also crashed as fans logged on from different time zones.
Music fans buy about two million tracks a week online in the UK and, according to the British Phonographic Industry, sales of digital albums increased by 65 per cent in the first half of 2008.
While this is great news for the music industry, artists and their record labels need to ensure their sites are geared up to handle large spikes in traffic.
The answer isn't simply to cross your fingers and hope, but to prepare for the unexpected.
For less well-known artists or bands, it might be cost-effective to use an established platform for delivering new music, such as MySpace or Yahoo! Music, rather than paying to host and serve their own content. "Yahoo! has the bandwidth and capabilities to enable many users to watch a video at once," says Ventura Barba, general manager for Yahoo! Music Europe.
This option isn't just for small bands: Coldplay debuted its latest album on its MySpace page, for example. "Why try and reinvent the wheel?" asks Tony Burt, senior strategic planner at Sapient. "Sturdy platforms such as MySpace make a good substitute and so does BitTorrent, which Nine Inch Nails is known to use to release free content."
But for those bands that want to let fans download or stream tracks directly through their own sites, the secret is to prepare and plan. "A lot of it comes down to bandwidth," says Macsen Galvin, development director at WebFusion. A site often crashes simply from too many fans trying to access it at the same time. It can also be due to a hardware issue, not having enough servers to deal with large volumes of download requests, for example.
A web hosting provider can help by delivering extra bandwidth and processing power through dedicated server farms or data centres, and high-capacity networks. "Your web host should have multiple, redundant gigabit-plus internet connectivity with different providers to cope with any network connectivity provider outage and ensure optimal internet routing," says Mark Jeffries, chief technology officer at Fasthosts.
Fabio Torlini, marketing director at Rackspace, recommends site owners find out how much hardware capacity their provider has in stock. "We can upgrade memory and processing power in one hour, and get servers online extremely quickly because we have it all in our data centres ready to go," he claims.
In addition, technologies such as virtualisation and load balancing enable site owners, or their web host, to manage site traffic more efficiently. Load balancing, for example, ensures content is distributed evenly across servers so that no one server becomes overloaded. And virtualisation "gives you capacity to increase resources quickly without having to build or install anything", explains Galvin.
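The load-balancing idea Galvin describes can be sketched in a few lines. The snippet below is a toy round-robin balancer, with made-up server names, showing how requests are spread evenly so no single machine absorbs the whole spike; real deployments use dedicated balancer software or hardware rather than anything this simple.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Distribute incoming requests evenly across a pool of servers."""

    def __init__(self, servers):
        self._pool = cycle(servers)

    def route(self, request):
        # Each request goes to the next server in rotation,
        # so no single machine absorbs the whole spike.
        server = next(self._pool)
        return server, request

# Hypothetical three-server pool handling six download requests.
balancer = RoundRobinBalancer(["server-a", "server-b", "server-c"])
assignments = [balancer.route(f"download-{i}")[0] for i in range(6)]
# Each server ends up handling exactly two of the six requests.
```

In practice the balancer also needs health checks, so that a failed server is dropped from the rotation instead of silently swallowing a share of the traffic.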
In preparing for spikes in traffic, record companies need a hosting provider that will be flexible enough to increase capacity in the run-up to a campaign or launch, and downgrade it at other times. "We have a deal for a set amount of data transfer for a fixed fee. If we spike above that we pay a per-gigabyte rate," explains Chris Thompson, managing director of D A Recordings, which owns emusu.com, a platform delivering ready-made e-commerce sites to artists and labels. Its offering includes hosting through a partnership with Rackspace.
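The burstable pricing Thompson describes, a fixed fee covering a set transfer allowance plus a per-gigabyte rate for anything above it, is simple arithmetic. The figures below are entirely hypothetical, since the article does not disclose D A Recordings' actual rates.

```python
def hosting_bill(gb_used, included_gb, fixed_fee, overage_per_gb):
    """Fixed fee covers a set data-transfer allowance; usage above
    the allowance is billed per gigabyte (hypothetical rates)."""
    overage = max(0, gb_used - included_gb)
    return fixed_fee + overage * overage_per_gb

# Assumed deal: 500 GB included for 1,000, then 1.50 per extra GB.
quiet_month = hosting_bill(420, 500, 1000.0, 1.50)   # within allowance
launch_month = hosting_bill(800, 500, 1000.0, 1.50)  # 300 GB of overage
```

A quiet month costs just the fixed fee; a launch month with a 300 GB spike adds 450 on top, which is the trade-off against permanently paying for launch-day capacity.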
When a campaign or record launch is global, it pays to think big. Content delivery networks (CDNs) specialise in distributing content globally in the most efficient way possible, by caching content locally in different territories. This means that when users from a particular country download an album or piece of content, their requests are sent to the nearest server rather than one on the other side of the world.
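The core routing decision a CDN makes, sending each request to the nearest edge server rather than one on the other side of the world, can be illustrated with a toy latency table. The regions, server names and millisecond figures below are invented for illustration.

```python
def nearest_edge(user_region, latency_table):
    """Pick the edge server with the lowest measured latency from
    the user's region -- the routing decision a CDN makes."""
    latencies = latency_table[user_region]
    return min(latencies, key=latencies.get)

# Hypothetical round-trip latencies (ms) from two regions to three edges.
LATENCY_TABLE = {
    "uk":    {"london": 12, "frankfurt": 25, "tokyo": 240},
    "japan": {"london": 230, "frankfurt": 250, "tokyo": 8},
}

uk_edge = nearest_edge("uk", LATENCY_TABLE)        # served from London
japan_edge = nearest_edge("japan", LATENCY_TABLE)  # served from Tokyo
```

Real CDNs fold in server load, cache freshness and link cost as well as raw latency, but the principle is the same: the album download comes from a cache a few milliseconds away.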
Bandwidth supplier Interoute was responsible for the content delivery behind Coldplay's recent free single download (although not the hosting of the site). "We have the biggest transmission network in Europe, on top of which we have built a huge CDN, so it doesn't matter if there are two or two hundred million downloads," says Mark Lewis, Interoute's director of content and communications.
Akamai, a specialist in the delivery of fast web applications, also has a large CDN connecting 34,000 servers around the world. Video website The NewsMarket is using Akamai to increase the speed at which journalists around the world can preview and download news clips from the Beijing Olympics.
"In China, the broadband connection is still shaky, so we have partnered with Akamai to ensure we have an easy and fast way of getting Olympics content to journalists," explains Romina Rosado, global head of marketing at The NewsMarket.
While Akamai and most established web hosting providers are now experienced in delivering static content files, such as web pages or music singles, the increase in video and dynamic content creates more of a challenge. "Dynamic content means I have to go back to the database and request that piece of content, so, as sites get richer with technologies like Ajax, there is no way I can predict what someone is going to do and, therefore, host that content locally," says Suzanne Johnson, senior industry manager for media and entertainment at Akamai. "You can't avoid that trip back to get the content but you can get it faster," she says, "by optimising connections on the internet and doing things to predict what the content will be."
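Johnson's distinction between static and dynamic content comes down to what an edge node can safely cache. The toy edge cache below, with a stand-in origin function, shows why a static single can be served locally after the first request while a personalised page forces the round trip she describes every time.

```python
class EdgeCache:
    """Toy edge node: static files are cached after the first request;
    dynamic requests always go back to the origin."""

    def __init__(self, origin_fetch):
        self._origin_fetch = origin_fetch
        self._cache = {}
        self.origin_trips = 0

    def get(self, path, dynamic=False):
        if not dynamic and path in self._cache:
            return self._cache[path]      # served locally, no round trip
        self.origin_trips += 1            # the trip back Johnson describes
        content = self._origin_fetch(path)
        if not dynamic:
            self._cache[path] = content   # static content is cacheable
        return content

# Hypothetical origin and paths, for illustration only.
edge = EdgeCache(lambda path: f"content for {path}")
edge.get("/single.mp3")                   # first request: origin trip
edge.get("/single.mp3")                   # now answered from the edge
edge.get("/recommended", dynamic=True)    # personalised: always origin
edge.get("/recommended", dynamic=True)    # origin again
```

After these four requests the origin has been hit three times: once for the static file and once for each dynamic request, which is why "getting it faster" for dynamic content means optimising the trip itself rather than avoiding it.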
Lastly, sites need to be tested to check they will be able to handle high volumes. Reflective Solutions provides testing and monitoring tools. Chief executive Graham Parsons suggests that you don't just simulate the action, such as the downloading of a track, but also the chain of actions a real fan would undertake on the site. "You need to put your whole system under a realistic load," he insists.
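Parsons' advice, simulating the whole chain of actions a fan performs rather than a single download, can be sketched with concurrent simulated users. The journey steps and timings below are invented; a real test would replace the sleep with actual HTTP requests against the site.

```python
import random
import threading

def fan_journey(results, fan_id):
    """Simulate the chain of actions a real fan performs,
    not just the single download."""
    journey = ["load_homepage", "browse_tracklist",
               "preview_track", "download_track"]
    completed = []
    for action in journey:
        # Stand-in for a real HTTP request to the site under test.
        random.random()
        completed.append(action)
    results[fan_id] = completed

def run_load_test(concurrent_fans=50):
    """Put the whole system under a realistic concurrent load."""
    results = {}
    threads = [threading.Thread(target=fan_journey, args=(results, i))
               for i in range(concurrent_fans)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

results = run_load_test(50)
```

Running the full journey concurrently is what exposes bottlenecks a single-action test misses, such as session handling or database contention between the browse and download steps.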
If preparing a site to handle spikes in traffic already seems tough, the bad news is it will only get more challenging, as sites become richer and streaming video becomes as popular as downloading audio files. But get it right, and there's a wealth of opportunities for artists to cultivate a devoted fan base.
WE7 ANTICIPATES TROUBLE AS IT RELEASES 200,000 TRACKS
We7 (www.we7.com) is an ad-funded music download site that lets music fans stream or download music for free in return for receiving targeted advertising. The site was co-founded by Peter Gabriel and launched in May 2007 with a mere 30 tracks; there are now 750,000 tracks available for streaming on the site. The hosting challenge for We7 is that, apart from the paid-for music downloads it offers, the content on its site is dynamic by nature, since each free track that is streamed or downloaded is served with ads targeted at that subscriber.
We7 works with a web hosting company called Mythic Beasts, which co-locates at one of the big London Docklands data centres, to ensure it has enough bandwidth available for users accessing its site. But, more importantly, says Steve Purdham, chief executive of We7, the site was built with performance in mind. "The infrastructure and design of the site has been built to be scalable; the ability to load balance between one set of servers and another had already been put in," he explains.
Even so, when We7 announced a partnership with Sony BMG and the release of an additional 200,000 tracks on its site free to subscribers from 28 April this year, it began testing four weeks ahead of the launch. "The first thing we tried to anticipate was all the different bottlenecks," he recalls.
We7's preparations turned out to be worthwhile. "Our peaks on the launch day were around 40,000 streams, 16,000 previews and 12,500 downloads," he reveals. Despite the testing, We7 did hit a glitch, though it turned out to be very simple to fix. The operating system on one of the servers had an embedded limit that prevented it from opening more than 10,000 files at any one time, and the server went down. Purdham says it was back up within 15 to 20 seconds, and only 130 people experienced poor-quality streaming as a result.
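The kind of embedded open-file limit that briefly took We7's server down can be checked before launch day rather than discovered during it. The sketch below uses Python's standard `resource` module (Unix-only) to compare the process's file-descriptor limit against an expected load and raise the soft limit up to the hard cap if needed; the threshold figure is the illustrative 10,000 from We7's incident.

```python
import resource

def check_fd_headroom(expected_concurrent_files):
    """Check whether this process may open enough files at once,
    raising the soft limit towards the hard cap if it falls short.
    Unix-only: relies on the standard-library resource module."""
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    if soft < expected_concurrent_files:
        # Lift the soft limit before launch day, up to the hard cap.
        new_soft = min(expected_concurrent_files, hard)
        resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
        soft = new_soft
    return soft >= expected_concurrent_files

# e.g. check_fd_headroom(10_000) ahead of a 200,000-track launch.
```

A pre-launch check like this is exactly the sort of bottleneck anticipation Purdham describes; the limit that bit We7 was in the operating system, not the application, so no amount of application testing alone would have surfaced it.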
His response is pragmatic: "These things are going to happen; what you have to do is anticipate them and have processes in place that capture them."