In 2004, an average of 24% of total marketing communications budgets in the UK was spent on event marketing. Some 16% of marketers plan to spend more on events in 2005, and the discipline was rated the marketing medium that delivers the best return on investment (ROI).
These were the findings of a survey conducted last year by event marketing agency The George P Johnson Company (GPJ). More than 150 marketers from a range of industries were quizzed, and their attitudes toward the medium make positive reading for all involved in event marketing.
Josh Robinson is a creative director at The Works London, a sponsorship and experiential events agency working with clients such as Coca-Cola, Tiger Beer and Canon. He says that, typically, after an event a minimum of 40% of the consumers that took part claim they are still using, eating or drinking the product. 'It is often as high as 60% three months later and brand recall is always above 90%,' he adds.
Event marketing agencies have seen great progress in developing bespoke methods of measuring events' effectiveness. 'Many of our clients are quite sophisticated users of events and have well-established norms against which they set objectives and measure results,' says Rob Allen, chief executive of promotional marketing agency The Russell Organisation (TRO).
'When planning a test-drive or traffic-generating campaign, for example, a mainstream automotive brand will tend to look at measures including footfall at venue, volume and quality of data capture as well as conversion to test-drive from initial enquiry. It will value the absolute cost of the exercise against these measures and compare it with previous experience.'
The GPJ survey found that measures such as these are being adopted by a high percentage of marketers - an impressive 81% of companies measure their events on some level. The greatest proportion (44%) has a measurement system in place for trade shows, with 48% of these companies measuring qualified leads. This was closely followed by measurement techniques that analyse sales figures.
However, monitoring sales derived directly from an event may not be as easy as it sounds. 'It is difficult to work out which bit of the client's marketing strategy has had an effect,' explains Robinson. 'It just takes a 20p-off sales promotion running at the same time as an event or an ad campaign for it to be difficult to pinpoint which had the impact.'
On-site surveys are a favoured measurement tool. These can be efficient, says Steven Philips, senior vice-president of creative at face-to-face communications agency MJM, provided companies ask the right questions and do not just monitor superficial elements such as the catering and registration process.
MJM worked with Microsoft last year to promote its XP digital entertainment package across the US. The software company had clear aims: to increase consumer knowledge of the product, to drive upgrades, purchasing and, ultimately, profits.
The agency suggested that Microsoft sponsor a roadshow being run by a big US cinema promoter. This provided extensive radio and press coverage worth millions of dollars and a built-in audience. Kiosks were set up for people to try the software, with trained personnel on hand to advise them.
In a survey conducted during the event, immediately after using the software, 30% of consumers said they intended to upgrade to XP, 58% said they would visit the website, 75% said they were more likely to use XP and 97% said they had a positive experience at the event. 'There were 151,000 one-to-one interactions at the kiosks, which was easy to measure because consumers logged on each time,' says Philips. 'The cost per interaction was very low and the average time spent engaged with the consumer was four minutes. Compared with the cost of a prime-time slot on US TV, it comes out favourably.'
Although Microsoft tailored its analysis of the event to assess whether it met its objectives, in general, roadshows are monitored by just 15% of the marketers surveyed. The report also showed that 58% of them were not sure how much they spend on measurement systems. This has led GPJ managing director Neil Jones to suggest that not all firms employ a well-thought-out measurement strategy.
Freelance event consultant Paula Hanford is also concerned that many marketers still do not appreciate the depth of planning needed to get the best out of an event programme. Hanford will be a guest speaker at industry exhibition International Confex, which takes place at Earls Court from 15-17 February. Her presentation will aim to help marketers understand how to identify business objectives and the types of event that deliver the best results. 'I am still amazed, 15 years since I started out in this industry, that so few companies have clear, measurable business objectives for their events,' she says. 'Take trade shows, for example. The objective for many just seems to be to see how many business cards they can collect.'
Tools of the trade
Even companies that have set clear objectives for their events may find the tools used to monitor them are not as sophisticated as they need to be. 'Events are increasingly required to promote brands as well as support sales, but measurement hasn't matched this evolution,' explains GPJ's Jones. 'Few companies are using the more sophisticated audits needed to measure message effectiveness.'
Tiger Beer, which employs The Works London to handle its consumer events, spends about 10% of its above-the-line budget on sponsorship. In July 2004 it hosted a kick-boxing contest at an NCP car park in London, to boost brand awareness among its core market of young male consumers.
The drinks brand gathered anecdotal feedback on the night and post-event data via its website. More than 250 people were turned away at the door and there have been high levels of interest in the next Tiger-owned event.
Tiger Beer UK marketing manager Ron Curbis admits that, for many marketers, assessing the effectiveness of an event comes down to gut instinct.
The Works' Robinson agrees. 'Assessing the ROI of events is founded in common sense - everyone can understand the impact of face-to-face communication - but it is not a robust way of measuring it.'
Cindy Yendell, managing director of brand experience agency Live, agrees that anecdotal evidence of a change in consumer attitude is no longer enough to justify spend on events. 'We are constantly dealing with procurement departments and are under pressure to have the same sophisticated tools as other media, such as direct mail,' she says.
Live is developing a dedicated measurement system, which it hopes to start testing in February. 'We have to measure the delivery of the event: the reach, frequency, scale, time and cost. It is hard to predict this and we find ourselves having to estimate how many people will attend. With other media, there is so much comparative data that it is easier to identify the expected return on investment,' says Yendell.
'We are devising a bespoke measurement tool with research agency Hall and Partners, purely for measuring the ROI of experiential marketing,' she adds. 'We will be able to use it in the planning phases to look at expected outcome and it will be a more sophisticated means of measuring the impact of a campaign,' she predicts.
TRO is also close to unveiling a measurement tool. 'What is missing is an adequately robust means with which to compare events with other media,' says Allen.
MJM's Philips supports the idea of being able to compare face-to-face marketing with other media, but not if clients will use the data to make an either/or decision. Yendell concludes: 'We must educate clients that we should not be going head-to-head with other media. If you start trying to compare them, and don't understand that event marketing's strength is in raising brand awareness, we might see disappointing results.'
Orange Code Camp
In September 2004, mobile phone operator Orange worked with The George P Johnson Company (GPJ) on an event called Code Camp. It was aimed at some of the 26,000 independent software developers who had signed up to the global Orange Partner Programme - an online community for writing mobile-user applications.
More than 100 delegates attended the camping-themed event, which took place at the science theme park Futuroscope in France. An environment was created in which code developers were encouraged to create innovative software applications. The event integrated 100 workshops, opportunities to network, and the technology and support to write and test code.
'The key measures of event success were aligned to Orange's objectives, and the metrics and tools for gauging delivery were put in place well before the event began,' says GPJ director of corporate communications Ingrid Brown. As well as asking participants standard questions about event satisfaction, Brown liaised with Orange's marketing department to create a more detailed evaluation survey.
The questions were designed to measure change in perception and the delegates' intention for closer engagement and direct action, such as writing more applications, as a result of attending Code Camp. This was compared with the responses of a control group of programmers who did not attend the event.
The results showed that 87% of participants felt the event helped them progress with the applications they were working on. Post-event, 48% of delegates said they would deliver their applications within the next three months, and 32% within the next six. Overall, 95% of delegates said they felt closer to the Orange Partner Developer community as a result of attending.
Steve Glagow, director of the Orange Partner Programme, says: 'Code Camp allowed us to work alongside the developers on their applications, test them and share our hopes for the next 18 months. We got great feedback and more than enough applications were created to justify the expense of the event.'
Orange was so pleased with Code Camp's impact that it is to run a second event in April in Florida.