How The Spectator went from one million to ten million page views per month

There's a direct link between web traffic and revenue when you're a publisher, though how strong that link is depends on other factors: the business model, commercial relationships and so on.


The Spectator is one of our great success stories. When they first came to us in summer 2012, they'd been working with a far bigger web firm than us, complete with its own custom CMS. But there was a big problem. In spite of very substantial costs, the platform they'd been given had huge concurrency problems: the website failed whenever more than about a hundred concurrent visitors arrived. As this was a fairly common occurrence for The Spectator back then, the site fell over several times a day, and the back-end was so slow that posting fresh content was almost impossible.

In short, their new platform was a major block on the business. But they'd spent all the money. So they needed an agile, capable and clear-thinking company to find a way out of the mess without spending too much more. Careers, and perhaps even the business, could be on the line if the project failed. No pressure, then!

And we delivered! After four successful years working together, The Spectator has just enjoyed its busiest month with over 10,000,000 page views!

So how does this last month compare with the same period in 2012?

It's so dramatic a difference in traffic that the 2012 data barely even shows on the chart. Sessions are up over 1,000% and users by over 1,700%: a massive growth in audience for The Spectator.

Take a look at the sales figures published in 2012, and then compare them with those published this summer. The numbers to compare are the audited ABC figures – today's 84,000 against 2012's 61,000. Quite a difference. You can see the correlation between the growth in web traffic and the growth in sales. It's hard, of course, to be entirely sure whether that correlation is causal without running a control experiment.

Which means we have to get into more analysis: the 'how?'. After all, it's very easy to bandy figures about, but what were the key things we had to do to support this traffic growth, and how was it achieved? And what did competing sites do that we deliberately avoided? Let's see…

The servers

Let’s start at the beginning… servers. WordPress is relatively resource-intensive in its own right – especially on a fairly complex site with lots going on.

On day one we ran with just two replicated servers, the second acting purely as a failover. Caching was simplistic, with Memcache storing both the page and object caches, and performance was actually not bad at all, with the ability to handle a couple of thousand concurrent users.
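As a rough illustration of what that object caching means on the application side (a sketch of the general WordPress pattern, not The Spectator's actual code – the function name and cache keys here are invented), the wp_cache_* API lets an expensive query be stored in Memcache via a persistent object-cache drop-in:

```php
<?php
// Sketch: caching an expensive query via WordPress's object cache API.
// With a Memcache object-cache drop-in installed, these calls hit the
// external cache; without one, they fall back to a per-request cache.
function get_popular_posts() {
	$posts = wp_cache_get( 'popular_posts', 'front_page' );
	if ( false === $posts ) {
		// Cache miss: run the expensive query once...
		$query = new WP_Query( array(
			'posts_per_page' => 10,
			'orderby'        => 'comment_count',
		) );
		$posts = $query->posts;
		// ...then keep the result for five minutes.
		wp_cache_set( 'popular_posts', $posts, 'front_page', 5 * MINUTE_IN_SECONDS );
	}
	return $posts;
}
```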

But what happens, over time, is that the software always grows and gets more complex, and efficiency drops.

An interesting thing about simple server setups is that they provide fantastic value – they’re cheap to set up and configure and there’s relatively little magic needed. But as the demands grow and you get to larger rigs, you’re entering a whole world of pain. There’s an initial hump that’s quite expensive to get over when you shift from single servers to clusters. But once you do, you now have a straightforward approach to scalability – you simply buy or instantiate more servers if you’re getting more traffic, and everything just works as you grow.

The current system relies on a small cluster of Percona servers, a set of load-balanced application servers with Varnish in front of them, and a Redis cache for WordPress's object caching. It works well. I'd rather the application was so fast that a lot of this server complexity could go away, but that's unlikely given the application's complexity. However, growing the capacity is, essentially, just a case of adding more servers.
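On the WordPress side, the Redis cache is just a drop-in plus a little configuration. Assuming something like the widely used Redis Object Cache drop-in (the constant names below are that drop-in's; the values are illustrative rather than our live configuration), the wp-config.php side is only a few lines:

```php
<?php
// wp-config.php excerpt: pointing a Redis object-cache drop-in at the
// Redis server. Host and salt values are illustrative only.
define( 'WP_REDIS_HOST', '10.0.0.5' );       // Redis on the private network
define( 'WP_REDIS_PORT', 6379 );
define( 'WP_CACHE_KEY_SALT', 'spectator_' ); // namespace keys if Redis is shared
```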

Simplification of requirements

One thing that can add a surprising amount of expense, both in development and in terms of performance and server requirements, is the sophistication of the software itself. A simple blog is far, far easier to serve and tune than a complex site.

Last year we undertook a redesign and simplification programme to tidy up three years' worth of rapid development. We'd noticed that a lot of our analytics had flatlined, and one of the key reasons was the site's relative lack of performance. Simplification brings two benefits: it reduces maintenance overheads, and it makes the site easier for visitors to use. Signposting is more obvious, and there's just less confusion.

If you look at the raw statistics, the site's page load times have gone up. But a lot of that is down to adverts: as The Spectator's success grew, the number and sophistication of adverts went up. Given the size and weight of some adverts, synchronous ad scripts can stall rendering while the web browser waits for them. Consequently, we now suggest that clients work only with providers who supply either asynchronous ads or native ads. This dramatically improves the perceived performance of a website and is a work in progress for this site.
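To make that concrete, here's a minimal sketch of loading an ad provider's script asynchronously in WordPress, so it stops blocking the render. The handle and URL are made up; this is the general technique, not any specific provider's integration:

```php
<?php
// Sketch: enqueue a (hypothetical) ad provider script and mark it async,
// so the browser doesn't block page rendering while it downloads.
add_action( 'wp_enqueue_scripts', function () {
	wp_enqueue_script( 'ad-provider', 'https://ads.example.com/loader.js', array(), null, false );
} );

// WordPress has no built-in async flag here, so filter the printed tag:
add_filter( 'script_loader_tag', function ( $tag, $handle ) {
	if ( 'ad-provider' === $handle ) {
		$tag = str_replace( ' src=', ' async src=', $tag );
	}
	return $tag;
}, 10, 2 );
```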

Faster = more traffic

So although we were stuck with the adverts, we could make sure that the article content itself was served quickly, and we did make the servers much faster than before. A combination of far less traffic hitting the PHP side of the website and the extra capacity means that today's server response times are much more consistent than before.
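Most of that reduction comes from full-page caching in front of PHP. As a simplified sketch of the idea (the TTL and rules here are illustrative, not our production configuration), pages can tell Varnish how long they may be cached using standard HTTP headers:

```php
<?php
// Sketch: let the front-end cache (Varnish) serve anonymous page views.
// Logged-in users bypass the cache; everyone else gets a short shared TTL.
add_action( 'send_headers', function () {
	if ( is_user_logged_in() ) {
		header( 'Cache-Control: private, no-cache' );
	} else {
		header( 'Cache-Control: public, s-maxage=120' ); // cacheable for 2 minutes
	}
} );
```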

Do faster sites really get better traffic? Well, there's not a lot of in-depth research that I can find on the subject, but we do know that search engines take site speed into consideration for ranking. So it's worth considering. And why? Because a slow site is a bad experience, and Google, Bing and the rest all want their users to have a good experience.

Keep the UI simple

Although speed has a massive impact on the success of a website, so does the general user experience.

The Spectator has simplified its user experience over time, even while adding sophistication to the site, and has almost always allowed us, as designers and developers, to concentrate on building a site that helps users get what they want. Compare that with sites developed largely to satisfy short-term commercial wins.

So let’s take a look at two examples, using very similar content – one at source, and one syndicated:

Motoring Research’s Vanishing cars on the original site

And on MSN

If you look at the Motoring Research site, it’s all very straightforward. You click, you see, and you scroll. You can parse the ten cars really quickly and scan the content. All good.

But on the MSN site, you have a slider… and interacting with that slider reloads the advertisements. It's all much clunkier. You also have to work out how to use the slider, which isn't necessarily obvious at first glance.

The former works well for readers, the latter for advert impressions.

I've always been very proud that The Spectator has keenly followed the line of looking after its readers first. There are no gimmicks chasing page impressions for short-term gains that always evaporate over time.

Respecting social norms

If you look at a post like this arts review, you have that simple UI and consequently feel respected. You can scan through, looking for tidbits, without any more interaction than a simple swipe. It's just a nice experience. You don't feel as if you're just clicking away trying to improve pageviews or ad revenue for somebody else, far away. And that respect improves trust in the brand and helps you feel that if you do something like sign up with your email, or even go the whole hog and subscribe, you'll be treated the right way. You won't be endlessly mithered.

People share content socially and want to be rewarded for that sharing (by friendships, likes, karma points on reddit and so on), so it's important that, in sharing your content, they don't feel their friends are going to be annoyed. Every time I click a social media link that goes off to some ad-infested, difficult-to-use site, my respect for the sharer drops just a teensy bit more.

And getting great content

Of course, people won't share your content unless it's actually worth sharing. The Spectator started a programme of investment in blogging and editorial direction for the digital side of the business back in 2012, which helped produce a regular stream of content for the website. This supplemented the periodical content they were traditionally known for.

If you're a periodical, this makes a lot of sense – subscription content on one side, with some free sampling, and stream publishing aimed at the hungry, frequent visitor. It covers all the bases, and the stream publishing neatly promotes the periodical.

And all the above led to where we are today – a combination of design, quick development cycles, careful cost management, good editorial policy and looking after visitors.

What next?

This is interesting because I don't know – the site's direction isn't set by us. We design and develop according to instruction, but we do get to advise. Lately, the process has been one of optimisation, bug fixing and expansion into subsidiary sites such as Health and Life. We've also carried out a major redesign on another property owned by the same group, the arts magazine Apollo. There are also projects aimed at improving the experience for people signing up online for subscriptions.

Reaching the limits of WordPress?

Although WordPress is a fairly traditional web publishing platform, I feel the answer is "probably not". I can be a critic of the system, but it still does an awful lot and makes many things easier. Any strategy to move away from WordPress would be years down the line.

WordPress can be made to sing and dance. Its main limitations revolve around multiple users trying to do the same things in the admin panels – if two people try to work on widgets simultaneously, they'll almost certainly run into problems. I don't think it suits a large and complex newsroom, for example. But for magazines and publications with a streamlined workflow, it's just about perfect and will continue to be so for a long time.

Better search

But where WordPress does fall down is on search. To that end, one of our recent R&D projects has looked at how to improve this with a multi-platform approach. Having a good WordPress search is no use if what you really want is to search across your WordPress website and your Magento e-commerce store together. So lately we've been building out a search system that lets you group multiple websites together and search across them all. It's about to hit its first production site, so watch this space for more information.
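Without giving too much away before launch, the rough shape of the idea, sketched here with Elasticsearch and with invented index and field names, is one index per site and a single query across them all:

```php
<?php
// Sketch: cross-site search by querying multiple per-site indices at once.
// Assumes the official elasticsearch/elasticsearch PHP client; index names
// and the 'title'/'content' fields are hypothetical.
require 'vendor/autoload.php';

$client = Elasticsearch\ClientBuilder::create()->build();

$results = $client->search( array(
	// One index per site, searched together in a single query.
	'index' => 'wordpress-site,magento-store',
	'body'  => array(
		'query' => array(
			'multi_match' => array(
				'query'  => 'cartoons',
				'fields' => array( 'title^2', 'content' ), // boost title matches
			),
		),
	),
) );

foreach ( $results['hits']['hits'] as $hit ) {
	echo $hit['_index'] . ': ' . $hit['_source']['title'] . PHP_EOL;
}
```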

The Spectator's own search is somewhat fragmented – on the magazine site, you can search the magazine; on Coffeehouse, you just get Coffeehouse, and so on. Wouldn't it be lovely if, on typing "cartoons" into the search box, you were offered both the cartoons section of the main website and the store pages where you can buy copies of said cartoons? Instead, you currently just get articles that happen to mention cartoons, which probably isn't what you want.

Search can be a sign of a failure in signposting, but on large sites, it’s still one of the best ways to help users dig out information.

Summary

Maybe you've simply scanned down the post, looking for key snippets? That's OK, and here's a brief summary to help you along.

The Spectator did well not because of any one individual thing. Not because of hiring us, necessarily, or because of using a particular technology or having a specific content strategy. It did well because a lot of smaller elements, often overlooked, were properly dealt with over time using an iterative approach. Site performance, content strategy and good visual design all played their part. Together they drove traffic from one million page views a month to over ten million last month, and it keeps growing almost remorselessly. And with each new blip in traffic due to a special event, such as the recent referendum on EU membership, the number of people returning to the site grows just that bit more.

The Spectator keeps on growing and we look forward to helping with that progress for many years to come.

3 responses to “How The Spectator went from one million to ten million page views per month”

  1. Excellent and in-depth article, thanks. I enjoyed reading it and picked up lots of insights as I struggle with, but love, the WordPress environment. I'd also like to learn more about the file limits imposed by many hosting services. What are the alternatives when you have a huge number of files? Can I overcome the problem with dedicated servers? I'd be extremely happy to know.

    1. You can always switch to NFS or even S3.

  2. You wrote something interesting: make it simple. That's an easy but too often missed tip. Not only in UX or development, but in life too. (#fengshuicoding?)
