If the great Bob Dylan had retired from the music business after his '66 motorbike crash and taken up a career in DM (it's a bizarre notion, but stay with us), he'd have made a great copywriter. He'd almost certainly have had a phrase for our industry's current predicament: something along the lines of "The times, they are uncertain ...".
And uncertain they most certainly are. Some even argue - and if you've read the latest IPA Bellwether report, you might be one of them - that direct marketing is finished, destined to be mortally wounded by sword-wielding "green" warriors and devoured by the fire-breathing dragons of recession.
While we agree times are tight, and that the industry has to tackle privacy and environmental concerns head-on, we're also optimistic. Why? Because tough times will force us to remember some of the things we've forgotten.
One of the problems of our industry is that in the headlong rush to embrace change, we have forgotten many of the fundamentals of DM, the most important of which was - and still is - testing.
When we first started out in direct marketing, all the best agencies had a systematic and rigorous approach to testing; everything (audience, offer, message, creative) was crash-tested.
That doesn't happen so much these days. The reasons why - budget cuts, over-caution, agencies not arguing the case for testing budgets strongly enough, clients feeling they don't have time to test - could be discussed at great length, but what's far more important for our purposes here is to establish why testing is important and how we can bring it back home.
Skilful testing is a bit of an art: you need clear objectives, proper controls, statistically robust sample sizes and the right number of variables, plus a rolling "test and learn" plan. But it can help creatives produce powerful and engaging work, because it allows you to make mistakes in safety.
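As a rough illustration of what "statistically robust sample sizes" means in practice, here is a minimal sketch, using the standard two-proportion sample-size formula and invented response rates, of how big each cell of an A/B split needs to be (the function name and figures are ours, not from the article):

```python
import math

def sample_size_per_cell(p_control, p_test, z_alpha=1.96, z_beta=0.84):
    """Approximate recipients needed per test cell to detect a lift in
    response rate from p_control to p_test at 95% confidence, 80% power."""
    variance = p_control * (1 - p_control) + p_test * (1 - p_test)
    effect = (p_test - p_control) ** 2
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect)

# Example: to detect a lift from a 2.0% to a 2.5% response rate,
# each cell needs roughly 14,000 names.
print(sample_size_per_cell(0.02, 0.025))  # → 13791
```

The point of the sketch is the trade-off it exposes: the smaller the lift you want to detect, the more names each cell needs, which is why the tips below say to test only the big stuff.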
Making mistakes, and learning from them, is how any skill develops. And learning is a key element of creativity. Masterpieces are the result of experimentation - Guernica and Bleak House did not pop into their respective creators' heads already finished, they were worked and reworked. And so it should be with our own dark arts.
The consultancy McKinsey has said that "marketers should spend 75 to 80 per cent of their budgets on proven messages and 20 to 25 per cent on well structured experiments. Marketers who skimp on experimentation may be overtaken by changing media patterns or forced to assume larger risks by rolling the dice on unproven programmes when markets shift." We agree wholeheartedly, and it's something we try to communicate to clients.
But we no longer have the time (or sometimes the money) to test: the days when we had the luxury of spending months refining, testing, re-refining and re-testing a DM programme to perfection are gone. We live in a world of "perpetual beta", which means we have to launch campaigns faster, learn quicker and adjust sooner. And older testing models are not always viable.
However, digital channels now make testing quicker, easier and more cost-effective. The great thing about digital is that not only can you target people more precisely, but you can also measure the results incredibly quickly. Real-time testing has become possible. Risk is mitigated because the costs are so low and wastage isn't a problem (expensive and time-consuming creative development research becomes almost redundant). Programmes can be actioned extremely quickly.
Another benefit of digital testing is that you can easily apply what you've learned to offline channels: for example, you could test four different price points online, understand which works best and roll it out offline.
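To make the price-point example concrete, here is a minimal sketch with invented figures (the cells, traffic and order counts are illustrative, not real campaign data). Because a higher price usually converts fewer visitors, the winner is judged on revenue per visitor rather than raw conversion rate:

```python
# Hypothetical results: four online price cells, equal traffic in each.
cells = [
    {"price": 19, "visitors": 5000, "orders": 400},
    {"price": 24, "visitors": 5000, "orders": 330},
    {"price": 29, "visitors": 5000, "orders": 250},
    {"price": 34, "visitors": 5000, "orders": 180},
]

# Compare cells on revenue per visitor, not conversion rate alone.
for cell in cells:
    cell["rev_per_visitor"] = cell["price"] * cell["orders"] / cell["visitors"]

winner = max(cells, key=lambda c: c["rev_per_visitor"])
print(winner["price"], round(winner["rev_per_visitor"], 3))  # → 24 1.584
```

In this invented data the cheapest price converts best but the £24 cell earns the most per visitor - exactly the kind of lesson that can then be rolled out offline.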
At OgilvyOne, a good deal of our success is thanks to digital having long been at the heart of what we do. We can easily tap the talents, tools and methodologies of our online colleagues, many of whom, because of our integrated offering, also work offline. But our growth and prosperity have also come from selling existing and potential clients on the notion of "safe adventure". It's essentially a way of combining innovation and creativity with stability and certainty of outcome.
Safe adventure is the very opposite of creative-stifling process. Look at it this way. Rather than spending two weeks on the beach at Benidorm as you did in 2007, this year you're going trekking in the Ecuadorean jungle. It's a real adventure - but you're fully prepared: you've researched your destination, you've had your jabs, you've got comprehensive travel insurance, you're travelling in a small group with a reputable specialist company and you have an accredited local guide. You're going to have the time of your life and it's unlikely you'll come to any harm. Your enjoyment of the trip, and your sense of pride in your adventurousness, is unlikely to be compromised by your research and the fact that you are prepared for most eventualities.
And so it is with clients. Improved testing, and detailed analysis of the lessons learned from it, has allowed us to offer them a "safe adventure". If we're asking for budget to innovate or improve our creative offer, it helps if we can demonstrate how testing has paid off.
Testing was one of the things that made what was then called "direct response advertising" great in the 60s and 70s. David Ogilvy knew better than anyone the importance of research and experimentation. In these trying times, it's ready for a comeback.
- Sam Williams-Thomas is the deputy managing director and Brian Sassoon is the planning partner of OgilvyOne London

TIPS FOR TESTING IN THE DIGITAL AGE
- Define your objectives: what is it that you want to test, and what do you want to learn?
- Don't worry about colour schemes, logo placements etc. Only test the big stuff: the things (audience, offer/proposition and creative - in that order) that could make the biggest difference to your campaign, and from which you can learn the most.
- Test early in a quarter so you can implement what you've learned in the following quarter.
- Test quickly, act quicker: if your testing cycle is longer than six to eight weeks, you're too slow.
- Create a "log of learning" from your testing. Always offer qualitative proof.
- Ask yourself: "What can I learn from my digital colleagues?"
- Battle hard for test budgets, and ideally ring-fence 20 per cent of your total budget. Set expectations about what that budget can achieve.
- Work with clients to measure quickly - you can usually predict the overall response to a test campaign after a few days using decay curves.
- Don't always fall back on tried and trusted test methods. Experiment constantly.
- Adopt a pragmatic and open-minded approach to test results.
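The decay-curve prediction mentioned in the tips above can be sketched as follows: a minimal example (with invented daily response figures) that fits an exponential decay to the first few days of a campaign and projects the total response by summing the fitted geometric series. The function name and numbers are ours; real campaigns would need more days of data and a sanity check on the fit.

```python
import math

def project_total(daily_responses):
    """Fit r_t = A * exp(-k * t) to observed daily response counts
    (log-linear least squares), then project the campaign's total
    response by summing the fitted geometric series to infinity."""
    t = list(range(len(daily_responses)))
    y = [math.log(r) for r in daily_responses]
    n = len(t)
    t_bar = sum(t) / n
    y_bar = sum(y) / n
    slope = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y)) \
        / sum((ti - t_bar) ** 2 for ti in t)
    ratio = math.exp(slope)               # day-on-day decay factor
    a = math.exp(y_bar - slope * t_bar)   # fitted day-0 response
    return a / (1 - ratio)                # a * (1 + ratio + ratio**2 + ...)

# First four days of an invented e-mail campaign: responses tail off by
# roughly a third each day, so the projected total is already visible.
print(round(project_total([120, 78, 51, 33])))
```

With only 282 responses in by day four, the fit projects a final total in the mid-340s, which is the sense in which overall response can be "predicted after a few days".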