Like many San Franciscans, I read SFGate every single day. A cup of coffee, my local news fix, and then I'm ready to face the world. But when I reached for the paper earlier this week, my routine was derailed: the home page had undergone a major redesign.
After a reluctant attempt to embrace the new look, I thought, "Gee, it seems like SFGate has been the same forever. Has it really changed that much?" So I visited the Wayback Machine, dug out old copies of SFGate, and made this timeline of the site's evolution:
Here's a close-up of last year versus current:
Conclusion: Until this latest redesign the SFGate home page had remained virtually unchanged since 2002.
To the design team's credit, they are doing a fantastic job of accepting feedback and addressing questions in the wake of the redesign. As of this writing there are 164 public comments about the redesign; some are snarky but amusing critiques ("Annoying, busy, fluffy and broken..."), while others make useful observations ("The new page crashes Internet Explorer on my smartphone..."). The comments are worth browsing both as a consumer and as a web analytics professional.
My relationship with SFGate is simply that of a consumer; I know nothing of their corporate approach to web measurement. And yet I can't help but wonder how they are measuring the effectiveness of the redesign. Or, for that matter, how any of us measure effectiveness whenever we undertake a redesign.
I am full of questions:
- Did the design team do any multivariate or A/B testing?
- If not, was it a conscious decision or just the path of least resistance?
- Do they have a plan in place for measuring the impact of the redesign on visitor behavior (and ultimately, on the bottom line)?
- Was measurement baked into the redesign project plan or was it more of an afterthought?
- Were web analysts involved early on, and if so, in what capacity?
I want to know!
I believe that, as an industry, we are now more inclined to consider measurement a valid and necessary part of a redesign than we were a few years ago. But I still think we've got some evolving to do.
Just the other day, by way of Future Now's Bryan Eisenberg, I read that 76.7% of internet retailers do not do multivariate or A/B testing. And that's just retail, arguably the most evolved of web verticals! What about the rest of us? Are we making the most of site redesign measurement, or is there still room for improvement?
As always I'd love your feedback.
[I have written about the site redesign measurement process before: here are 3 ways you can prepare for site redesign measurement.]
There are three metrics that can measure the success or failure of any site redesign: uniques, page views, and advertising $ before and after the redesign. Obviously the goal of any site redesign is to improve the user experience and navigation/search capabilities; however, the final result is always advertising $. When you redesign an online media site, the new site will look and feel completely different from the old one, so testing anything within the old site's environment won't make much sense.

In an ideal world, when you launch your new site, you first run both sites together (the old one and the new one), with only a small percentage of the traffic going to the new site. That way you can actually evaluate which one does better, get feedback on the new site, learn what's working and what's not, and then slowly increase the traffic as you get more comfortable with the results (this is what Yahoo did when it launched its redesigned home page a year ago). However, most site redesigns launch overnight, and the morning after, the company has to deal with a new site, bugs, issues, and everything else that hadn't been thought through.
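A minimal sketch of that kind of gradual traffic split, assuming each visitor carries a stable cookie ID (the function and names here are hypothetical, not anyone's actual implementation):

```python
import hashlib

def assign_version(visitor_id: str, new_site_fraction: float) -> str:
    """Deterministically route a visitor to the old or new site.

    Hashing a stable cookie ID (instead of randomizing per request)
    keeps each visitor on the same version across page views, which
    you need for clean comparisons of uniques, page views, and ad $.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    fraction = int(digest[:8], 16) / 0xFFFFFFFF  # map hash to [0, 1]
    return "new" if fraction < new_site_fraction else "old"

# Ramp schedule: start small, then increase as confidence grows.
for pct in (0.05, 0.10, 0.25, 0.50, 1.00):
    print(f"{pct:.0%} ramp:", assign_version("cookie-abc123", pct))
```

Because the assignment is deterministic per cookie, returning visitors see a consistent version, and the metrics for each bucket stay clean as the ramp percentage grows.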
SFGate's page views had been declining month over month since January 2007 (page views per person were declining as well), while uniques stayed in the same range for the last year. The decrease in page views was probably the main driver behind the redesign, since the site was losing ad sales $. It will be interesting to see whether the new site can increase page views while maintaining at least the same level of uniques.
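Put another way, the engagement number to watch is page views per unique before and after the launch. A toy comparison (the figures below are invented, purely for illustration):

```python
# Hypothetical monthly figures, not real SFGate numbers.
before = {"uniques": 4_000_000, "page_views": 48_000_000}
after  = {"uniques": 4_000_000, "page_views": 52_000_000}

def pv_per_unique(month: dict) -> float:
    return month["page_views"] / month["uniques"]

print(f"before: {pv_per_unique(before):.1f} page views per unique")
print(f"after:  {pv_per_unique(after):.1f} page views per unique")
# Success here means page views per unique rises while uniques
# hold steady (or grow); a drop in uniques could mask the gain.
```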
Posted by: Alex Beskin | February 27, 2008 at 07:36 PM
Well, hey there June.
I usually ask the same questions. It's a good thing nerdy is the new black.
The answer is likely no. Opinion tends to rule the day more than testing, though it shouldn't. If you ask me, the web analytics community needs to reach out to the graphic design and web development communities for a bit of co-education. We're all after the same thing: user experience for bottom-line results.
After I heard Eric's RAMP speech at eMetrics, I conducted an informal poll of people I ran into, and only 25% had done any testing. And that's a sophisticated audience!
Maybe you should write a book called "Don't Make Me Guess: A marketer's guide to data-driven decision making" as a corollary to Steve Krug's great book. :-)
Posted by: Alex L. Cohen | March 19, 2008 at 07:45 PM
@Alex: Like you, I feel like there's *something* we measurement people ought to be doing to reach out and convince the web dev people that testing is really the best way to roll out redesign changes. And, like you, I do have the sneaking suspicion that the SFGate changes were made with no testing whatsoever. It's a technical issue and a political issue and a comfort-with-numbers issue, and I intend to keep talking about it. I hope you do, too.
Posted by: June | March 21, 2008 at 05:02 PM