A/B/C/D/Evolving the Govt.nz homepage design

When Govt.nz launched in July this year we’d already completed several rounds of user research. The design of the site and the layout of the homepage had changed a lot since we launched the first beta version the previous year.

How users are getting to our homepage

After several months, we started to see some interesting patterns in our website analytics.

Most users start their visit to Govt.nz elsewhere on the site, but many will view the homepage as part of their visit. In a typical 2-week period the homepage gets 20,000 page views in total, and more than 5,000 visits start on the homepage. You can see other stats for Govt.nz on our analytics page.

For users that do start on the homepage:

  • 85% of visitors are “new” — but users can have more than 1 device, use different browsers, or clear tracking cookies off their machines — so this isn’t an accurate measure.
  • On average, users looked at 3 or 4 pages of content.
  • 6 out of 10 visits led to users exploring the site; the other 4 in 10 resulted in a “bounce” off the homepage. A bounce means the user looked at that page only, then left the site without any further interaction.

Are we meeting user needs?

From this data it looks like there’s a group of users we’re not helping. Are they looking for announcements published by the Government of the day (eg beehive.govt.nz content)? Are they looking for information about government services and not seeing what they want?

Although we’ve had a wide range of feedback on the site design, we wanted to get a better understanding of users’ needs when visiting the site. We decided to run some A/B tests. To be specific, it was an A/B/C/D/E test. We left the original homepage in place, but randomly showed users 1 of 4 other homepage variations with different blocks of homepage content and images. By mixing up different combinations of a few changes we could test whether the position of content, or the content itself, was a factor in reducing the homepage bounce rate.

The original and 4 variations of the Govt.nz homepage, the layout and content varying slightly with each version.

During A/B testing of the Govt.nz homepage we used the following versions:

  • Version A — the original homepage, with a decorative image at the top right of the page and links to the main categories on the site in 3 columns for desktop users
  • Version B — promo stories on the right side of the page, with a promo story featuring an image of the Beehive (the New Zealand Parliament buildings) at the top
  • Version C — similar to Version B, but with the Beehive promo story placed lower on the page
  • Version D — promo stories on the left side of the page and, like Version B, the Beehive promo story at the top of the list
  • Version E — similar to Version D, promo stories on the left side of the page, but in a different order

Version B was the most successful variation of the 4 designs tested.

If you’ve visited the Govt.nz homepage in the last few weeks you may have seen one of these variations. A big thanks from us because you’ve helped us get a better understanding of how people interact with our content. You’re also not alone: the experiment ran more than 16,500 times.

The tools

Govt.nz uses Google Analytics (GA) to track what users do on the site. GA includes content experiment tools that are pretty easy to set up:

  1. Decide what design variations you want to test and what your measure of success will be – we wanted to find out which version would give us the biggest improvement in the homepage bounce rate.
  2. Create your pages; each variation needs its own unique URL.
  3. Load all this information into the content experiment tool; Google will give you an “experiment key”.
  4. Get the tracking code installed on the original version of the page you want to run the experiment on (there’s a rough sketch of this client-side piece after these steps).
  5. Activate the experiment in Google Analytics — you should start to see early results within 24 hours.
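
For illustration, here’s roughly what the client-side piece can look like when each variation lives at its own URL. This is a minimal sketch using the Content Experiments JavaScript API (cxApi); the experiment ID and variation URLs are placeholders, not our real setup, and in practice GA generates an equivalent snippet for you.

```ts
// Minimal sketch of serving experiment variations client-side with the
// Content Experiments JavaScript API. The API is loaded on the original
// page via:
//   <script src="//www.google-analytics.com/cx/api.js?experiment=YOUR_EXPERIMENT_ID"></script>

declare const cxApi: { chooseVariation(): number }; // global provided by cx/api.js

// One URL per version; index 0 is the original page. These URLs are
// hypothetical, not Govt.nz's actual variation URLs.
const variationUrls = [
  '/home',   // Version A (original)
  '/home-b', // Version B
  '/home-c', // Version C
  '/home-d', // Version D
  '/home-e', // Version E
];

// Ask GA which variation this visitor should see. The choice is random
// on a first visit and sticky (cookied) on return visits, so each user
// keeps seeing the same version.
const chosen = cxApi.chooseVariation();

// Send anyone assigned a non-original variation to its URL.
if (chosen > 0 && chosen < variationUrls.length) {
  window.location.replace(variationUrls[chosen]);
}
```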

We use a common analytics tracking code across all our environments. Inside Google Analytics we use a single account, a single property and then different profiles (views) to filter out traffic for each environment into separate reports. This meant we could test both the new versions of the pages, and the updated tracking code before putting it on to the live site.
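
As a rough sketch of what that looks like in practice (the property ID below is a placeholder): the same tracking bootstrap ships to every environment, and the separation into per-environment reports happens in the GA admin through view filters, for example on hostname, rather than in the code.

```ts
// The same analytics.js bootstrap is deployed to dev, staging and live.
// 'UA-XXXXXXX-Y' is a placeholder property ID. Each GA view (profile)
// then filters incoming hits, eg by hostname, to build a separate
// report per environment.
declare const ga: (...args: unknown[]) => void; // analytics.js global

ga('create', 'UA-XXXXXXX-Y', 'auto');
ga('send', 'pageview'); // the hit carries the hostname, which view filters match on
```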

Interpreting the data

You need to make sure you let the experiment run for long enough to see if there are any patterns.

Collect data for a few weeks

I’m glad we ran the test over 4 weeks, because during the test:

  • the results for each design changed on a daily basis
  • the early results looked promising, but over time the differences between the variations became smaller and smaller
  • we could see whether user behaviour differed during the week or at weekends, during business hours, or in the evenings when people were sitting on the couch at home after dinner.

Data, data and MORE DATA!

Getting the basic results is straightforward. The experiment tool has a built-in report that gives you all the numbers for things like bounce rate and length of visit, and it even reports against any site goals you might have set up.

Besides the bounce rate, I wanted to see which variation of the homepage worked better using these metrics and goals (there’s a rough sketch of how they’re derived after this list):

  • average length of visit (session duration)
    • goal — 5 minutes or more on site
  • average number of pages viewed in a visit
    • goal — user views 3 or more pages during their visit
  • goal — how many people find content with a link to an agency website and then follow that link. We call that “agency referral” (and it’s one of the main reasons why Govt.nz exists).
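
To make those measures concrete, here’s a small sketch of how each one falls out of raw session data. The session records and field names below are invented for illustration; they’re not GA’s schema or our real numbers.

```ts
// Toy session records; the fields are illustrative, not GA's schema.
interface Session {
  pagesViewed: number;        // pages seen during the visit
  durationSeconds: number;    // session duration
  clickedAgencyLink: boolean; // followed a link to an agency site ("agency referral")
}

const sessions: Session[] = [
  { pagesViewed: 1, durationSeconds: 25, clickedAgencyLink: false }, // a bounce
  { pagesViewed: 4, durationSeconds: 380, clickedAgencyLink: true },
  { pagesViewed: 3, durationSeconds: 150, clickedAgencyLink: false },
];

const total = sessions.length;
const share = (match: (s: Session) => boolean) =>
  sessions.filter(match).length / total;

// Bounce rate: single-page visits as a share of all visits.
const bounceRate = share(s => s.pagesViewed === 1);

// The two averages reported per variation.
const avgPages = sessions.reduce((sum, s) => sum + s.pagesViewed, 0) / total;
const avgDurationSeconds =
  sessions.reduce((sum, s) => sum + s.durationSeconds, 0) / total;

// Goal completion rates matching the goals above.
const fiveMinutesOnSite = share(s => s.durationSeconds >= 5 * 60);
const threeOrMorePages = share(s => s.pagesViewed >= 3);
const agencyReferral = share(s => s.clickedAgencyLink);

console.log({ bounceRate, avgPages, avgDurationSeconds,
  fiveMinutesOnSite, threeOrMorePages, agencyReferral });
```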

The experiment tool also lets you apply any custom segments you use on your other Google Analytics reports. Govt.nz has a number of custom segments we use for our internal reporting. A few of the main segments I thought would be interesting to apply to the homepage experiment were:

  • desktop, tablet and mobile use
  • organic search (ie Google and other internet search engines), referral traffic, and direct traffic
  • NZ users vs overseas users.

Patterns, trends and bringing order to the universe

As an information architect I found myself swimming in data.

I mapped it all out in a spreadsheet. Then I used a simple ranking to surface patterns across the 5 versions of the homepage for each metric and segment we were exploring. The ranking made it easier to see patterns, particularly results that stood out as unexpected or unusual.

Adding colour coding to each rank made the comparison between versions even easier. I chose a range from green (meaning a high ranking) through to red (meaning a low ranking). Just because one of the homepage versions ranked “in the red” didn’t mean the result for that version was poor overall, just that it was lower than the others. You can use any colours you like, provided you apply them consistently.
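
Here’s a minimal sketch of that ranking approach, with metric values invented for illustration: every version-by-segment cell for a metric is pooled, ranked from highest to lowest, and the ranks are then banded into colours.

```ts
// Rank homepage versions across segments for one metric, the way the
// spreadsheet did: all (segment, version) cells are ranked together,
// with the highest value getting rank 1. The numbers are invented.
const versions = ['A', 'B', 'C', 'D', 'E'];

// Average pages per session for each segment, one value per version A-E.
const pagesPerSession: Record<string, number[]> = {
  desktop: [3.1, 3.4, 3.0, 2.9, 2.6],
  tablet:  [2.0, 2.1, 2.4, 3.8, 2.2],
  mobile:  [1.6, 1.7, 1.4, 2.3, 1.2],
};

// Flatten every cell, sort highest first, and assign ranks 1..N.
const cells = Object.entries(pagesPerSession).flatMap(([segment, values]) =>
  values.map((value, i) => ({ segment, version: versions[i], value }))
);
cells.sort((a, b) => b.value - a.value);

// Band the ranks into colours: top third green, middle amber, bottom red.
const colour = (rank: number, total: number) =>
  rank <= total / 3 ? 'green' : rank <= (2 * total) / 3 ? 'amber' : 'red';

cells.forEach((cell, i) =>
  console.log(`rank ${i + 1} (${colour(i + 1, cells.length)}): ` +
    `Version ${cell.version} / ${cell.segment} = ${cell.value}`)
);
```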

An example of the data analysed

Metric: Pages per session by device type
Homepage variations ranked using average pages viewed per session (visit) by device type; highest = 1, lowest = 20.

  User segment          Version A   Version B   Version C   Version D   Version E
  All users (average)       10           8           9           6          14
  Desktop                    3           2           4           5           7
  Tablet                    16          15          11           1          13
  Mobile                    18          17          19          12          20

Metric: Pages per session by traffic source
Homepage variations ranked using average pages viewed per session (visit) by source of traffic; highest = 1, lowest = 15.

  User segment          Version A   Version B   Version C   Version D   Version E
  Organic search             2           3           4           1           7
  Referral traffic          11           5          10          12           6
  Direct traffic            14          15          13           9           8

Metric: Pages per session by geographic location
Homepage variations ranked using average pages viewed per session (visit) by geographic location of user; highest = 1, lowest = 10.

  User segment          Version A   Version B   Version C   Version D   Version E
  NZ users                   9           6           4           7           8
  Overseas users             5           2           3           1          10

Analysing the “Pages per session” metric

The A/B test results showed that:

  • none of the options performed well for users with mobile devices
  • versions C, D, and E performed slightly better for tablet users
  • for users on desktop-style machines, all of the versions performed about the same
  • for users who came to the site from search engines like Google (as opposed to referrals or direct traffic), versions B, C and D performed in a similar way to the original homepage, and this group of search users produced the best results overall for this metric
  • all variations, including the current home page, performed more or less the same for both New Zealanders and visitors from overseas — if anything, a few of the variations worked better for overseas visitors.

Other observations

Here are some of the other things I saw in the complete set of data we collected.

  • Overseas users spend a lot longer exploring the site — is this due to not being familiar with government in NZ or not having English as a first language?
  • Variation D, which had promo stories on the left, didn’t work well on mobile devices — probably because the promo stories appeared above the main links on the homepage and people had to scroll too much to find anything they thought was useful
  • Users visiting the site on a tablet were more likely to click links on the main part of the homepage when the promo stories were on the left side of the page. I think this is due to the majority of people being right-handed and turning their tablets to look at the site in landscape view — that’s something we need more research on!
  • Variations B and C, which had the promo stories on the right, led more users to find and click links to agency sites; both of these versions worked better for mobile users.

What next?

I’ve presented the high-level results to the Govt.nz team and now we’re going to decide what changes we’ll make to the site. Product Manager Victoria Wray will talk about that in a coming post.

A, B, C is one thing, but what about U?

What’s your experience with A/B testing? Did you do an experiment and discover results that were both interesting and unexpected? Is there an experiment you’d like Govt.nz to do in the future? You’re welcome to leave your thoughts below.

6 comments

  1. Comment #1. Raena:

    I might have missed it, but I didn’t see any analysis on how moving the search box out of the top right and over to the left for variants B-E might have influenced people’s behaviour. That seems like an enormous change to me.

  2. Comment #2. Nathan Wall:

    Hi Raena

    I didn’t include the full spreadsheet of data we collected – so many numbers! I didn’t think it would be useful for most people. I really wanted readers to think about the technique and how to go about the analysis rather than the specific results.

    You’ve asked a great question though – and the results are interesting for that metric. For the original version of the homepage, 6.6% of people do a search. All of the variations B to E led more people to search from the homepage. That could be either a good or a bad thing, but if it helps more people find the content they’re looking for, then that’s a behaviour change I’m OK with. There was almost no difference between Version C and Version E – both increased searches from the homepage to 8.9%. This could still be considered ‘low’, but in the user research we’ve done, a lot of people are reluctant to use a local search engine as they don’t trust the results.

    There’s a lot more data we still need to go through. For example, what did people actually search for, and did it differ between the variations? As an information architect there are times when I love my job. This is one of them. 🙂

    What’s the search behaviour for your homepage? Are you able to share that?

  3. Comment #3. Raena:

    I’m still getting settled (it’s only day 1 of my third week) but perhaps another colleague could offer something about our search.

    I agree that the search thing sounds like a good improvement and I wouldn’t sneeze at a 2.3% increase either. I’d be reluctant to leave its influence out of my analysis. Of course, if you’re already committed to changing to something like B-E and you just want to know which of the four is best to go with maybe it doesn’t matter as much.

    It’s a shame people feel so reluctant about search since it has so much potential for helping people find what they want with little to no knowledge of government. Having only recently made my way over the Tasman I spend a lot of time searching for topics rather than program names, ’cause it’s not just eskies and thongs that have unfamiliar names over here. 😉

  4. Comment #4. Nathan Wall:

    In all the research we’ve done with users over the last 2 to 3 years there is a strong preference for search when it comes to finding government stuff – but users prefer Google Search, as there is a perception that Google is magic and will just find what you’re looking for.

    I know what you mean about struggling to understand the language of government in NZ – I jumped the Tasman 10 years ago and I still struggle sometimes. I love my esky! And I wear thongs all the time. [Point of clarification: I’m Australian – so I’m referring to jandals!] Not speaking the language of users is another common thread we’ve seen in a lot of our research. Government has a habit of coming up with terminology that people just don’t relate to.

    We’re still making adjustments to content as we discover patterns. eg we used to have content that listed school terms. Users actually search more frequently (on Google) for ‘school holidays’, so we updated the content to match. We need to be doing more of that on all our sites.

  5. Comment #5. Vanessa Roarty:

    Hi Nathan, thanks for sharing these details about your testing. It sounds like you got lots of great data. We are in the process of looking at how we can reconfigure the beta.australia.gov.au home page to elevate popular and promotional content, so we will use your evidence as a starting point and may try to duplicate your approach when it comes to our own testing. It is also interesting that your searches went up with the move of the search box. Our onsite search hovers around 5%, so we would consider 9% relatively high.

  6. Comment #6. Nathan Wall:

    Hi Vanessa

    Fantastic that we can help you out and be a site you can compare your own results against. Keep in touch as your testing progresses. We’ve still got some way to go before our popular and promo content is working as well as it could. The results from our testing were inconclusive in many respects — so the research goes on!
