Tuesday, April 28, 2009

Demand Generation Implementation Survey - Background Results

I've been having a dandy time analyzing the results of my Demand Generation Implementation Survey. Responses are still coming in but I thought I'd at least post some preliminary results to whet your appetite. Hopefully I'll be able to post a more substantive analysis tonight or tomorrow.

As of April 29, I've received 40 responses, of which I've discarded two as incomplete and two because they related to vendors I considered irrelevant (Zoho and Ad Giants PitchRocket). Obviously any survey based on 36 net responses (and self-selected at that) has little statistical value, but I still think the broad results are extremely interesting.

The survey was promoted on this blog and the Raab Guide site, but primarily via posts on Twitter. (Thanks to the many people who 'retweeted' the request). This introduces yet another source of sample bias. One measure of this is the distribution of vendors reported by the respondents, which clearly doesn't reflect the installed base of the industry. This distribution actually pleases me, since it means we have results from users of many different systems. (Obviously, however, the quantities are too small and sample bias too significant to break out results by vendor.)


nbr responses    vendor
8                Marketo
6                Eloqua
3                Genius.com
3                LoopFuse
3                Pardot
2                Market2Lead
2                Treehouse Interactive
1                eTrigue
1                Vtrenz (Silverpop)
7                No Response
36               Total


Another intriguing bit of contextual information is the deployment date of the systems. Two respondents reported future dates -- I'd guess those were typos but, since responses were anonymous, I couldn't ask. Another response was dated 6/01/2208, which I treated as 2008.

I was also curious to see the six responses for implementations during 3/09 and 4/09; obviously, these companies haven't gotten past their first or second month. Most of the answers for those entries reported features deployed within the first two months, or made the reasonable selection of 'later', so they could well be accurate. One respondent reported deployment on 4/24/09 (i.e., last week) but showed several features as deployed in month three. I assume this represents their plans rather than reality. Fair enough.

In any case, the ten deployments in the first four months of 2009 (or 12 if you count the two future dates) and 12 in 2008 highlight the newness and fast growth of the demand generation industry. There were just six earlier deployments, including one for 1990, which is almost surely an error.


nbr responses    deployment date
1                10/09
1                8/09
3                4/09
3                3/09
1                2/09
3                1/09
12               2008
2                2007
2                2006
1                2005
1                1990
6                No Response
36               Total



One final bit of data, this one more substantive: I asked respondents how well their experience with deployment, and with their systems as a whole, had met their expectations. The results strike me as extremely positive -- about two-thirds rated both experiences as better than expected, with just a bit more satisfaction with the systems than with the implementation. Only a couple of respondents felt things were worse than expected. Again, we have to consider sample bias. But even so, this seems to be a pretty happy set of campers.

I also looked to see if there was any relationship between deployment year and satisfaction, and it appears that newer customers may be a bit happier. But the numbers are very small, recency may also introduce some bias, and in any event even the earlier customers are highly satisfied. So I don't consider this more than a hint of what might be the case.


How would you rate your experience with... (% of responses)
                         better than expected   about as expected   worse than expected   total
system implementation    64%                    33%                 3%                    100%
the system itself        67%                    28%                 6%                    100%




How would you rate your experience with... (nbr responses)
                         better than expected   about as expected   worse than expected   total
system implementation    23                     12                  1                     36
the system itself        24                     10                  2                     36
