Agile Announcements Digital Professional Development

Be part of World Vision’s digital revolution…

We’re very excited about our new Digital Innovations Team, part of the Digital Collective.

We’re even more excited that we can now share with you the first four great roles that will inspire people to join World Vision online.

Have a look at the roles and share them with anyone you think would find them inspiring. And if this sounds like something you want to be a part of, then we can’t wait to read your application.

Roles close 7th and 14th July – so be quick!

Agile Digital Digital Marketing Digital Transformation Professional Development Uncategorized

Agile Insights from Cancer Research UK

With Greg Franklin @gfranklin and Giulia Merlo @giuliavmerlo.

I recently had the opportunity to sit down for the Digital Collective Podcast with Greg Franklin and Giulia Merlo, Agile Lead and Service Design Lead respectively at Cancer Research UK.

The full episode is linked from the bottom of this page and is well worth a listen. Over the course of the conversation we talked about:

  • Investing in an in-house digital capability is a big deal, a significant skill shift and change to the way the organisation works. CRUK is a long way through and has a lot to teach the sector.
  • The benefits of Agile for charities include speed increases, better collaboration and closer working relationships with end users, but the structure is important.
  • Outputs or outcomes? Not just delivering the thing that we want to deliver but following through and understanding the impact that has is the key to transformation.
  • Service Design is the thinking around how the user’s individual interactions and experiences link together to create their reality of your organisation.
  • The charity sector is headed for disruption, digital transformation is only just getting started.

Don’t forget to subscribe to the Digital Collective Podcast using your favourite podcasting app. We’re available on Apple iTunes here, and in other podcast apps here.

Analytics Professional Development

CRO: Everything you need in 10 minutes

OK, I got a lot of feedback on last week’s blog post about the Conversion Rate Optimisation that we’ve been doing at World Vision.  So I wanted to give another take on that and maybe cover off some of the questions you might have about how to get your website performing waaaaay better.


Something is happening to the homepage…

If you visited the World Vision homepage over the last few weeks, you may have noticed some pretty significant changes.  That’s because we’re doing some Conversion Rate Optimisation.

Conversion Rate Optimisation, or CRO, is an essential element of digital marketing.  In fact, it’s probably the most important tool at your disposal for improving the performance of your digital channels.  But as we found out when we ran this first test in a new programme, it’s not as simple as it looks.  Specifically, we took away three lessons from this test which we’d love to share with you.

1: The step is not the journey.

The journey from finding out about your organisation to becoming an avid supporter is not made in a single step.  If you try to see the impact of a change to one part of that journey on the overall performance of your website, you’ll find it practically impossible.  After all, if you’re trying to improve the performance of a road, you don’t measure the speed of the cars across their whole journey, because there would be too many other variables; you just look at the performance on that stretch of the journey.  We chose to look at the second stage in our digital engagement model (which I’ll cover in another post), which covers the period between finding out that we exist and engaging in some sort of support.  It’s the single most important part of the journey, and one that many charities, World Vision included, pay scant attention to in their digital material.
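To make that concrete, here’s a minimal sketch (in Python, with made-up stage names and visitor counts) of measuring one step of the journey in isolation rather than the whole funnel:

```python
# Hypothetical funnel counts for one period; not real World Vision data.
funnel = {
    "aware": 10_000,     # visitors who found out we exist
    "engaged": 1_200,    # visitors who engaged with our content
    "supporting": 150,   # visitors who took a support action
}

def step_rate(funnel, from_stage, to_stage):
    """Conversion rate for a single step, ignoring the rest of the journey."""
    return funnel[to_stage] / funnel[from_stage]

print(f"aware -> engaged: {step_rate(funnel, 'aware', 'engaged'):.1%}")        # 12.0%
print(f"engaged -> supporting: {step_rate(funnel, 'engaged', 'supporting'):.1%}")  # 12.5%
```

A change to the homepage should be judged on the “aware → engaged” rate, not on the whole-funnel number, where its effect would be drowned out.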

2: Test your big assumptions.

CRO is often thought of as the mechanism for comparing two different wordings of text, or two different button colours, in an A/B test.  And it is useful for that, but to start there would make the big assumption that “we’ve got the thing basically right and we’re just looking for tweaks”.  In the charity sector, even those sites adhering to what we call “best practice” only achieve about half the performance that’s commonly seen in the commercial sector, so we can’t make that assumption.  Instead, we must cast the net far and wide to see what direction we should take to improve.  We started by testing the fundamental assumption that most of the charity sector makes: that once you’ve told someone you’re a charity, it’s OK to ask them for money right away.  This works in the US, but in other markets it really doesn’t, which goes some way to explaining why our digital channels perform poorly here in the UK compared to commercial marketers talking to the same consumers.  We wanted to try two “storytelling” variants which removed the up-front ask, and see whether they would improve engagement at that first stage.

3: Set up your experiment right.

Putting things on the website and seeing what happens to donations is not a testing strategy.  If you cast your mind back to GCSE science, you’ll remember that an experiment has a Hypothesis, a Method, Results and a Conclusion.  And you need those too, otherwise, the test won’t deliver for you.

Our hypothesis was that our new content-led approach would improve engagement by 10% without impacting significantly on conversion (the number of people joining us as supporters).  You’ll notice that we included a scale for the improvement (we’re looking for BIG changes, remember), and we also included a stop condition: if we saw the test having a big negative impact, we’d stop it.
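As a sketch, you can encode the hypothesis and the stop condition as explicit numbers before the test starts.  The 10% target comes from the hypothesis above; the 20% stop threshold is my illustration, not our actual figure:

```python
TARGET_ENGAGEMENT_LIFT = 0.10     # hypothesis: at least +10% engagement
STOP_IF_CONVERSION_DROPS = 0.20   # assumed threshold: pause if conversion falls >20%

def evaluate(engagement_lift, conversion_change):
    """Decide what to do with the test, given observed relative changes."""
    if conversion_change <= -STOP_IF_CONVERSION_DROPS:
        return "stop"                  # stop condition triggered
    if engagement_lift >= TARGET_ENGAGEMENT_LIFT:
        return "hypothesis supported"
    return "keep running"

print(evaluate(engagement_lift=0.12, conversion_change=-0.02))  # hypothesis supported
```

Writing the numbers down up front stops you from moving the goalposts once the results come in.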

Our method was a randomised control trial.  Now that’s a whole topic on its own, but suffice it to say for now that it’s the gold standard of testing.  You randomly assign subjects to the old homepage or the new one and see what changes.  Keeping the old homepage live throughout was really important, because it meant we could eliminate any external factors: we’re not comparing this week with last week, just page A with page B.  We used Google Optimize within the Google Analytics suite, but there are plenty of tools out there.
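Under the hood, tools like this do something along these lines: assign each visitor to a variant at random, but deterministically, so the same visitor sees the same page on every visit.  A toy sketch (the variant names and seeding scheme are illustrative, not how any particular tool works):

```python
import random

VARIANTS = ["classic_control", "child_variant", "aid_worker_variant"]

def assign_variant(visitor_id: str, seed: int = 42) -> str:
    """Deterministically assign a visitor to a variant.

    Seeding a fresh Random with the visitor ID means the choice is random
    across visitors but stable for any one visitor across visits.
    """
    rng = random.Random(f"{seed}:{visitor_id}")
    return rng.choice(VARIANTS)

print(assign_variant("visitor-001"))
print(assign_variant("visitor-001"))  # same visitor, same variant
```

Real tools typically pin the assignment with a cookie as well, so returning visitors stay in their group even across devices that share a browser.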

Homepage “Classic” control (Left), Child Variant (Centre), Aid Worker Variant (Right)

Then we had results – numbers – lots of them.  We were looking at the bounce rate (the proportion of people who arrived at the homepage and simply left) and trying to influence that, whilst also keeping an eye on the main donation and child sponsorship activity and making sure that wasn’t significantly worse on the test variants.
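Bounce rate itself is simple arithmetic.  A minimal sketch with made-up session records:

```python
# Hypothetical session records; a real tool would pull these from analytics.
sessions = [
    {"variant": "control", "pages_viewed": 1},  # one page then gone: a bounce
    {"variant": "control", "pages_viewed": 4},
    {"variant": "child", "pages_viewed": 1},
    {"variant": "child", "pages_viewed": 3},
    {"variant": "child", "pages_viewed": 2},
]

def bounce_rate(sessions, variant):
    """Share of a variant's sessions that left after a single page view."""
    group = [s for s in sessions if s["variant"] == variant]
    bounces = sum(1 for s in group if s["pages_viewed"] == 1)
    return bounces / len(group)

print(f"control: {bounce_rate(sessions, 'control'):.0%}")  # 50%
print(f"child:   {bounce_rate(sessions, 'child'):.0%}")    # 33%
```

The hard part isn’t the formula; it’s collecting clean session data and deciding what counts as an interaction.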

Then we had a conclusion, and before I tell you what that conclusion is, it’s worth noting that the conclusion is the most important part of the process.  Looking back at some work that had been done here over the last year, I saw lots of “tests” being run, but no clear conclusions, and hence no value derived from any of that testing.  If you don’t have something you’ll change if your hypothesis is proven, then don’t do the test – you’re wasting your time.

So we knew which step of the journey we were looking at, we chose a big assumption to test, and we set up our experiment method correctly.  What did we find?

Two things.  First, the hypothesis was proven.  We did see a significant improvement in engagement on one of the test homepages (can you guess which one?) and we didn’t see a significant change in conversion.  Job done.  That means we have some follow-through work to do to ensure that those improvements result in better overall performance, and that we move on to the next step in the journey.  But before we get to that, it’s worth noting what else we learned.  The second thing was that this process is hard.  It’s not rocket science, but it is advanced mathematics, and it’s really easy to get it wrong and end up with results you don’t trust, or which don’t tell you anything.  We encountered technical gremlins which caused us to pause the testing, and we had different people reading the results in different ways.  In short, we learned as much about the methodology as we did about the test subject itself.
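If you want a feel for the “advanced mathematics”, here’s a sketch of a two-proportion z-test, one standard way (not necessarily the one your tool uses) to check whether a variant’s engagement rate is significantly better than the control’s.  The visitor and engagement counts are illustrative:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """One-sided z-test: is variant B's rate significantly above control A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the standard normal CDF, via the error function
    p_value = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return z, p_value

# Hypothetical counts: 400/5000 engaged on control, 470/5000 on the variant
z, p = two_proportion_z(conv_a=400, n_a=5000, conv_b=470, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # conventionally significant if p < 0.05
```

Run it a few times with different made-up numbers and you’ll quickly see why small samples produce results you can’t trust.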

The good news is there are amazing tools out there on the web.  Whichever optimisation tool you use will have a ton of information, and will probably automate the process to a decent degree.  But check out online learning videos and courses, and of course, the old bookstore is good for this as well.

So two weeks in we’ve switched these new homepages off to run an appeal, but we’re now planning the next phase of the programme and I hope I’ve given you the enthusiasm, and some pointers, that will help you do the same.

And more about the engagement model another time.

Until then…

Digital Digital Transformation Professional Development

Stuart McSkimming, CIO of Shelter

On this week’s Digital Collective Podcast, Stuart McSkimming reflects on 20 years in the charity sector and picks out his highlights and current trends to watch:

  • How the move from on-premise to cloud systems is nothing like as important as the move from customisation to configuration.
  • How partners can really add value with managing fast-moving technology and integrations, but organisations themselves have to figure out what to do with the technology at the end of the day.
  • How the role of the CIO has changed from a technology role, focussed on efficiency, to a business leader who understands what the organisation needs to achieve and figures out how technology can enable that.
  • How clearing the decks from tactical issues can leave the space and opportunity to address strategic issues.
  • How charities can use their unique advantages to recruit the very best digital candidates.

Subscribe on iTunes:

And directly in your podcast tool: