The topic of site speed and revenue came up among my colleagues recently, and while I soooooo much want to believe that fractional-second site speed improvements drive revenue, I have yet to find a convincing set of data that proves this hypothesis. Marissa Mayer’s famous talk at Web 2.0 in November 2006 was cited in its support, but I believe the Google experiment was flawed (or at least not well-reported by Marissa) because they didn’t understand the why behind their numbers. She claims that “traffic and revenue went down 20%” when they showed 30 results instead of 10, which took 0.9 seconds to load instead of 0.4. However, if “traffic” is a function of “search page views” and “revenue” is a function of “clicks on Sponsored results,” you could explain the difference with two plausible scenarios:
- With 10 results per page, more users are clicking on the “Next” pagination link when they don’t find anything relevant on the first page. This results in more traffic, because you get more search results page views per user.
- Other users who don’t click the “Next” link might reach the bottom of the page after 10 results without finding anything relevant, so they might click on Sponsored results more often than users shown 30 results per page. This results in more revenue.
But are the 10-per-page users more satisfied than the 30-per-page users? Not necessarily. Those who got 30 results per page had to wait 0.5 seconds longer, but you could argue (or, better yet, observe directly) that more of them found a natural search result to click on the first page than those who got only 10. (This talk was one of those that did a lot more damage than it did good, IMO. Even AJAX got bashed.)
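To make the two scenarios concrete, here’s a back-of-the-envelope sketch. All the numbers are invented for illustration (they are not from the Google experiment): it just models “traffic” as search-results page views and “revenue” as sponsored clicks, and shows how the 10-per-page variant can win on both metrics without its users being any more satisfied.

```python
# Hypothetical illustration -- every number below is made up.
# "Traffic" = search-results page views; "revenue" = sponsored clicks.

def metrics(users, pages_per_user, sponsored_click_rate):
    """Return (page views, sponsored clicks) for a cohort of users."""
    page_views = users * pages_per_user
    sponsored_clicks = page_views * sponsored_click_rate
    return page_views, sponsored_clicks

# 10 results/page: more "Next" clicks, and more users who hit the bottom
# of the page empty-handed and try a Sponsored result instead.
views_10, clicks_10 = metrics(users=1000, pages_per_user=1.5,
                              sponsored_click_rate=0.05)

# 30 results/page: fewer page views per user, fewer sponsored clicks per view.
views_30, clicks_30 = metrics(users=1000, pages_per_user=1.0,
                              sponsored_click_rate=0.06)

print(views_10, clicks_10)  # 1500 page views, 75.0 sponsored clicks
print(views_30, clicks_30)  # 1000 page views, 60.0 sponsored clicks
```

With these (invented) rates, sponsored clicks drop exactly 20% for the 30-per-page cohort even though every user saw the same inventory — the drop comes from pagination behavior, not from slower pages making anyone less happy.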
When I was at eBay, we (User Research and Engineering) tried really hard to show a relationship between site speed and revenue. We just couldn’t do it. I’m not saying no relationship exists, but I found that it’s very hard to demonstrate. In contrast, it’s really easy to get duped into thinking something’s there when it’s not. That’s the problem with quantitative behavioral metrics taken in isolation, or worse, when the metric you’re using (revenue) is in conflict with other metrics you’re trying to improve (satisfaction or usability).
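The “easy to get duped” point can be demonstrated directly. The sketch below (my own illustration, not eBay data) correlates pure noise — stand-ins for “site speed” and “revenue” — across many small samples and counts how often the correlation looks impressively strong by accident.

```python
# Illustration: with small samples, unrelated data often "shows" a relationship.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
trials = 1000
strong = 0
for _ in range(trials):
    speed = [random.random() for _ in range(10)]    # 10 samples of noise
    revenue = [random.random() for _ in range(10)]  # unrelated by construction
    if abs(pearson(speed, revenue)) > 0.5:
        strong += 1

# A sizeable fraction of purely random pairings exceed |r| = 0.5 --
# an apparently "strong" speed/revenue relationship built from nothing.
print(f"{strong} of {trials} random trials had |r| > 0.5")
```

That’s with data that is unrelated *by construction*; real behavioral metrics, which share confounds like seasonality and user mix, fool you even more easily.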
Another eBay example is the metric of “time on site” as a measure of “engagement.” eBay blows everyone else out of the water on this one, but if you stop to think about it (or just study its users), it’s easy to see why: it offers a unique inventory that is interesting in and of itself (which is, admittedly, engaging), but it’s also the hardest e-commerce site to use, requiring more time to figure out. Furthermore, when you get outbid, you have to start all over again. It also has some really hard-core, loyal users, some of whom are literally addicted to the site.
It’s really great to have access to quantitative behavioral metrics and to know what kind of impact a change to your site will have on revenue. But it’s dangerous to rely on them as your only source of insight. It’s even riskier to exalt a particular metric when you don’t fully understand its relationship to the metrics that *really* drive your revenue. That’s a form of blindness that’s hard to see in yourself.
On a related note, UIE published a study on ‘perceived site speed’ and ‘actual site speed’ based on research done around 2000. It was an observational study of e-commerce sites, with a sample large enough to run some basic statistics. They found no correlation between ‘perceived site speed’ and ‘actual site speed.’ Interestingly, the only thing they found to be related to perceived site speed was ‘observed ease of use’ (i.e., usability): the site (in this case, Amazon) that was the easiest to use was perceived to be the fastest, when in fact it was the slowest of all the sites in the study.*
*Note: The UIE study criticized Jakob Nielsen’s call for faster site speed, saying that usability is more important. I think Jakob would agree, actually. But that doesn’t mean we should build slow sites.