There are definite parallels here with advertising research. Clients, like Hollywood's money men, need to take every possible step to ensure their investment will be profitable. Pre-testing is one of the tools available to them.
Yet every year, leading agency people have topped the bill at the Market Research Society conference, criticising everything the research industry believes in. Last year, the heart of this distrust was exposed in a joint presentation by Neil Coburn, the head of the quantitative advertising unit at The Research Business International, and Merry Baskin, the former planning head at J. Walter Thompson and founder of the planning consultancy Baskin Shark. After interviewing both researchers and planners, they concluded that what agencies hate is pre-testing. Specifically, quantitative pre-testing.
Yet Baskin is clear on this point. "The problem is not so much with pre-testing but with the way some clients use it, and are allowed to use it by the research companies. As an aid to judgment, it's fine. But don't use it as a decision-maker or a means of covering your back," she says.
Agencies particularly dislike the idea that a pre-test score can work like traffic lights, signalling green to go or red to stop - especially if the decision is in the hands of inexperienced junior marketers governed by corporate rulebooks.
Some methodologies are more helpful than others, too. A marketing director with one US-owned company says he was obliged to use a particular pre-test system that he felt provided limited information. Then, a couple of years ago, as the result of the kind of cock-up that can happen in any organisation, the media bookings were made too early and his new ads were on screen before the test results came through.
"The pre-test gave us a red light on the ads, when it was already clear that the campaign was having a direct and positive result on sales," he says. "We started to look around. We switched to the Ipsos ASI system, and persuaded our parent company to do the same. At least it gives us more diagnostic information on which to base a decision."
There's no shortage of client companies willing to explain their commitment to pre-testing and the benefits they get from it. Unilever Bestfoods relies on it heavily, Stephen Donaldson, its best practice manager, says, because it has proved its value.
But the company's approach is evolving. Pre-testing is being used less for straight evaluation and more as a development tool to improve the ads. It is not research's job to make decisions, Donaldson points out. Marketers have to decide which ads to go with and what to change.
"Having said that, pre-testing does work," Donaldson says. "When we have overridden test results in the past, nine times out of ten subsequent tracking has proved the tests were right."
For the past four or five years the drinks giant Diageo GB has standardised on Millward Brown's Link system. The consumer planning director, John Hosking, says the company is spending more on advertising than it has ever done, confident in its quality and effectiveness. Yet pre-testing as a cost is "insignificant" - less than 1 per cent of the total budget.
"I understand the suspicions agencies have of quantitative pre-testing," he says. "We use it to maximise our chances of making ads that are efficient and effective as well as creative."
For example, the famous Guinness surfer ad - voted the best ad ever by Channel 4 viewers in 2000 - initially failed to meet a number of pre-test benchmarks. Hosking says: "The diagnostics gave us and the agency the confidence to go ahead. The pre-testing showed that the strength of the message wasn't getting through because of some comprehension gaps, which we were able to address. The rest is history.
"But we're not in the business of making average ads," he continues.
"If the test scores are not good and the diagnostics don't provide the answers for making improvements, we won't go ahead."
According to Gavin Emsden, Nestle's head of consumer insight and planning, there's a major learning benefit in tracking how ads perform in real life, compared with the test results. But like Unilever and Diageo, Nestle also uses the research to improve creative work.
"Usually these are tweaks," Emsden says. "Research may show that the pace needs to be slowed, or that you need to linger on the pack shot to ensure the key points are absorbed. I realise there is a debate about whether pre-testing cramps creativity, but if it is part of the partnership between marketing and ad agencies and if objectives are agreed in advance, then it is useful as part of the learning process."
Shell also sees pre-testing as a vital part of the client/agency relationship. The JWT creative team participates fully in all debriefs.
Intriguingly, Shell has commissioned its own system from hand-picked international research specialists. "I am suspicious of proprietary research systems," Raoul Pinnell, the global head of brands, says.
Shell needed the new system, which incorporates both qualitative and quantitative research, because it has moved toward producing mainly global advertising. "We have used this for seven campaigns now, of which only one, for Optimax, has been screened in the UK," Pinnell says. "The results have been completely magic. In moving from local to global, we have both saved money and produced advertising which is more effective and better liked by the public. Research has helped us find the Holy Grail of advertising."