Tuesday, October 22, 2013

Marketing Automation User Satisfaction: Clearly, There's Room for Improvement (and maybe a little vodka)


Last week’s post on marketing automation and its discontents prompted several questions about whether the level of dissatisfaction is any higher with marketing automation than with other systems. To some extent, this is asking whether the glass is half empty or half full; and, as the illustration suggests, the answer matters less than the fact that there’s room for improvement. But I do have some data to share on the question of relative dissatisfaction.

The first insights come from G2 Crowd, a research firm that ranks software based on user ratings and social data. I have my doubts about comparing software this way* but users certainly know whether or not they're happy.  The folks at G2 were kind enough to reformat some of their data for me.**


According to the G2 figures, marketing automation users are in fact more enthusiastic about their choices than almost anyone else. CRM in particular has a vastly worse rating, but even email, Web analytics, and Web content management show more detractors and fewer promoters. I’m not sure how to interpret this – is the average marketing automation system really easier and better than those other types of software?  Or is something else going on: maybe satisfaction is lowest in the most mature categories, like human resources, enterprise resource management, and accounting, because experienced users are the most demanding?



A second set of insights comes from Ascend2 and its Research Partners, which asked their panel which inbound marketing tactics they considered most effective and which were most difficult to execute. Here we see a very different story: marketing automation and lead nurturing (listed separately) are clear outliers in a bad way, ranking among the least effective tactics and the hardest to execute. In fact, they are the only two tactics where the difficulty score was significantly higher than the effectiveness score (i.e., above the diagonal line in the chart below).***



The Ascend2 study also found that 18% of respondents used marketing automation extensively, while 43% made limited use of it, and 39% didn’t use it at all. This is similar to the BtoB study I cited last week, which found that just 26% of marketing automation users had fully adopted their system.  I believe those effectiveness vs. difficulty ratings hint at the reason for those results: most marketers don’t fully deploy marketing automation because they find it too much work compared with the benefit they’d gain. In other words, the hurdle to marketing automation adoption is not laziness, but a rational evaluation of the return from investments in marketing automation vs. other activities.

That rational judgment could still be wrong.  After all, marketers who haven’t fully deployed marketing automation don’t know how effective it really is. Ascend2 addressed this by asking marketers to rate their own performance and comparing the answers of the 12% who rated themselves “very successful” with those of the 20% who rated themselves “not successful”.

Those answers contain some positive news: of the very successful group, 45% were extensive users of marketing automation, compared with just 9% of the not successful.



But even the very successful marketers gave marketing automation only the fifth-highest effectiveness rating, which doesn’t differ much from the sixth-highest rating in the not successful group.


Similarly, the very successful marketers rated marketing automation as sixth most difficult (actually, tied for fifth) while the not successful marketers ranked it as fourth-hardest. In other words, marketing automation is indeed a bit easier than it seems before you start, but even the most experienced and most successful marketing automation users consider it pretty darn hard and just modestly effective.


So what we have here is a mixed message: marketing automation does correlate with success and its users might even be relatively satisfied, but it's still a lot of work for limited results.  You can read that as good news or bad, but, either way, it shows the need for more work before marketing automation can reach its full potential.


________________________________________________________________________

* My basic objection is that users have different needs, so a system that satisfies one user may not be good for another.

** G2’s explanation: “The data for this chart comes from the over 7,400 enterprise software surveys users have completed on G2 Crowd as of Friday 10/18/13. For every product review we ask "How likely is it that you would recommend this product to a friend or colleague?" on a 0-10 scale. We segment reviewers that rate a product 9-10 as Promoters, 7-8 as Passives, and 0-6 as Detractors. The product segmentation data is aggregated to determine Net Promoter Score at a category level.”
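
For anyone who wants to see the arithmetic behind those Net Promoter figures, here is a minimal sketch of the calculation G2 describes; the sample ratings are invented for illustration, not G2’s data.

    # Minimal sketch of the NPS calculation G2 describes above.
    # The sample ratings are invented for illustration, not G2's data.
    def net_promoter_score(ratings):
        promoters = sum(1 for r in ratings if r >= 9)    # ratings of 9-10
        detractors = sum(1 for r in ratings if r <= 6)   # ratings of 0-6
        return 100.0 * (promoters - detractors) / len(ratings)  # passives (7-8) only widen the denominator

    sample = [10, 9, 9, 8, 7, 6, 4, 10, 3, 8]
    print(net_promoter_score(sample))  # -> 10.0 (4 promoters, 3 detractors, 10 responses)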

***It's barely possible that the answers would be different if the Ascend2 study had asked about marketing in general rather than "inbound marketing purposes".  But I doubt it.

Tuesday, October 15, 2013

Marketing Automation's Unhappy Users: Trouble in Paradise?

As I mentioned in last week's post, I’m writing a paper on stages of marketing automation deployment. Key findings will be presented in a Webinar next Thursday, sponsored by TreeHouse Interactive; you can register here. The paper itself will be available to Webinar attendees.

The premise of the paper and Webinar is that marketing automation has a problem: clients who don’t move beyond basic email functions are unhappy. Last week’s post provided statistics that show how many marketers fail to make this transition, but it didn’t actually show why this matters. So let’s look at some more data that illustrates the trouble in marketing automation paradise.

First we’ll start with the paradise itself: B2B marketing automation has indeed been growing quickly, at about 50% per year over the past few years according to my estimates.  I do expect that to slow somewhat in 2014 as the core market of tech companies approaches saturation and adoption in other industries remains spotty. The great hope is that acquisitions by Oracle, Salesforce.com, Adobe, and other big software vendors will finally push the industry across the classic Geoffrey Moore chasm from the beachhead niche to mainstream users, but that’s by no means certain to happen.


If and when that growth does occur, it will be fueled by positive experiences of previous users. But the news on that front is mixed: a survey by one of the industry’s best analysts, Jim Lenskold, found 60% of marketing automation users reporting increases in the key value measures of lead quantity and quality. That’s a happy majority, but it also means that about 40% found no improvement or even a decline.


Questions about satisfaction give a similarly ambiguous result: just over two-thirds of users in a Winsper Group survey reported themselves satisfied with the business value of their system, again meaning that nearly one-third were neutral or actively dissatisfied.


Even more scary (and just in time for Halloween, if you're still looking for a costume): yet another survey, this one by Holger Schulze, found that 31% of current marketing automation users anticipate changing their system within the next two years, nearly always because they want better or different capabilities.



Although these figures come from different sources, they all point to the same conclusion: somewhere between 30% and 40% of marketing automation users are not happy with their systems. The Schulze survey suggests that most believe a different system will give them better results, so they’re not yet ready to give up on marketing automation entirely.

But will those users really do any better with a different product? I’d be the last person to say that all marketing automation systems are the same, but it's also true that the vast majority of systems purchased have all the functions needed to run a successful marketing program. Some fraction of users really did buy the wrong product, but I’ve no doubt that most have problems due to flawed deployment.

One final survey reinforces this point. This one, from BtoB Online, found that just 26% of users had fully deployed their system – and nearly 40% had only some or moderate adoption.

I’d guess that the dissatisfied users in the earlier surveys are concentrated in the low deployment groups in this survey.  But if that’s true, those marketers are abandoning their systems before giving them a real chance. The BtoB survey does show that strong and complete adoption have increased considerably from 2012 to 2013, which is good news.  It also shows respondents expecting full adoption to double next year, which would be even better news if it happened – but those figures probably reflect aspirations more than reality.


All of this brings us back to where we started: rather than blaming their tools, marketers need to work harder at ensuring full deployment of the systems they’ve already purchased. Join me at next week’s Webinar for a roadmap to making this happen.

Wednesday, October 09, 2013

Which B2B Marketing Automation Features Actually Get Used? Here's Some Data.

I’ve been writing a paper on the stages that marketers go through when deploying their marketing automation systems, the basic point being that it’s important not to stop with just one feature. That much is indisputable, but the next question seemed to call for some empirical data: Which features are used most often? Here’s where things got interesting.

Searching through my trove of published reports, I found four recent surveys that asked this question. Of course they differed in their precise categories and audiences, but they generally covered the major B2B marketing automation features: email, Web behavior tracking, landing pages, nurture campaigns, lead scoring, analytics, and social media marketing. Even so, they differed considerably in their findings.

The table below shows a summary of the results, with values all normalized so the highest ranked answer in each survey equals 100. (I’ve shown the original results at the bottom of this post.)
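
To be concrete about that normalization: each survey’s figures were scaled so that its top answer equals 100. Here is a minimal sketch of the calculation, using invented numbers rather than the actual survey results.

    # Sketch of the normalization used in the table: scale each survey's
    # figures so its highest-ranked answer equals 100. Numbers are invented.
    def normalize_to_100(survey):
        top = max(survey.values())
        return {feature: round(100.0 * value / top) for feature, value in survey.items()}

    example = {"email": 89, "nurture campaigns": 62, "lead scoring": 40}
    print(normalize_to_100(example))
    # -> {'email': 100, 'nurture campaigns': 70, 'lead scoring': 45}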




As you see, the only answer that’s truly consistent is that the most commonly-used feature is email – although even that wasn’t quite true in Holger Schulze’s report. This is exactly what you’d expect; indeed, my paper was inspired by the lament that many companies use marketing automation as nothing more than a glorified email engine.

The remaining rankings are nowhere near as consistent, either with each other or with my expectations. I’d guess that landing pages and Web tracking would be relatively common, since they’re basic features that yield clear value and are easy to deploy. Yet both ranked towards the bottom of the list. On the other hand, nurture campaigns are often considered the most complicated and least used feature of marketing automation but ranked closer to the top. (I'll rationalize that one by guessing that people included simple newsletters and drip sequences along with more complicated nurture programs.)  Lead scoring, another advanced application, was closer to its expected position near the bottom. Analytics ranked somewhere in the middle, but that hides broad variance between the surveys, which suggests the term meant different things to different people.

Social media, another very broad category, was only on two lists but did rank at the bottom of both. This also makes sense: it’s a relatively new application for marketing automation and many marketers don’t do it at all or use other tools.

The divergence of rankings leaves the results open to pretty much whatever interpretation you want.  Rather than sweating the details, it may be more useful to think of landing pages, Web tracking, nurture campaigns, and lead scoring as a single group of applications that are deployed after email but more-or-less simultaneously with each other. That’s how I do things in my own maturity model, which then adds two more layers: one for inbound marketing including social media and search marketing, and another for marketing management including planning, project management, and revenue attribution. Those don’t appear on my previous table because they’re not consistently included in the surveys, but you will find them in some of the individual surveys below.  They rank towards the bottom in frequency, as you’d expect.


The paper I mentioned goes into the maturity model in more detail.  (I'll let you know when it's published).  It shows that each level involves new skills and organizational changes, so moving from one to the next takes a lot more than just turning on more system features. This is presumably why so many organizations get stuck at the first or second levels.
Here are details and links for the surveys I’ve summarized above:


Holger Schulze, B2B Lead Generation Marketing Trends, 2013 Survey Results.
More than 800 responses from the B2B Technology Marketing Community on LinkedIn.  Note that not everyone is a marketing automation user.



Aberdeen Group, Marketing Lead Management: From the Top of the Funnel to the Top Line, July 2012.  More than 160 respondents; the table below shows responses for “industry average” companies. One anomaly worth noting is that while the chart below shows lead nurturing as more common than lead scoring, the order is reversed among best-in-class and laggards.



Gleanster, Marketing Automation: Disrupting the Status Quo, August 2013.  Research from 1,396 B2B marketers. The table below shows consolidated results from top performers and others, kindly provided by study author Ian Michiels. The second table shows types of campaigns run by the same group of respondents.




Winsper, 2013 Marketing Automation Study.  132 respondents who use a marketing automation system. Figures show “most utilized” features; total utilization is much higher – for example, 94% make some use of email automation.



Wednesday, October 02, 2013

idio Does Sophisticated Content Recommendation

Systems in our new Guide to Customer Data Platforms range from B2B data enhancement to campaign managers to audience platforms. This may lead you to wonder whether there’s anything we actually left out.  In fact, there was: although the final choices were admittedly a bit subjective, I tried to ensure the report only included systems that met specific criteria, including a persistent database, customer-level data, marketer control, and marketing-related outputs to external systems. In most cases, I could judge whether a system fit before doing a lot of detailed research. But a few systems were so close to the border that I only made the final call after I had evaluated them in depth.

idio was one of those. The company positions itself as a tool to deliver “personalized and relevant multi-channel communications”, which sure sounds like a CDP.  Indeed, it meets almost all the criteria listed above, including the most important one of building and maintaining a persistent customer database. But I ultimately excluded idio because it is tightly focused on identifying the content that customers are most likely to select, a function I felt was too narrow for a proper CDP. The folks at idio didn’t necessarily agree with this judgment, and pointed to planned developments that could indeed change the verdict (more about that later).  But, for now, let’s not worry about CDPs and take idio on its own terms.

The full description on idio's home page reads “idio understands your customer’s interests and intent through the content they consume and uses this to deliver personalized and relevant multi-channel communications”, and that pretty much says it all. What idio does is ingest content – typically from publishers such as ESPN, Virgin Media, Guardian Media, or eConsultancy (all clients) – but also from brands with large content stores such as Diageo, Unilever, and C Spire (also all clients). It uses advanced natural language processing to extract entities and concepts from this content, classifying it against the vendor’s own taxonomy of 23 million items.

The system then monitors the content selected by its clients’ customers in emails, Web pages, mobile platforms, and some social platforms and builds an interest profile for each customer.  This in turn lets the system recommend which existing content the customer is most likely to select next. The recommendations are typically fed back to execution systems, such as email generators or Web content managers, which insert links to the recommended content into Web pages, emails, or newsletters.  Reports show selection rates by content, segment, or campaign, and can also show the most common topics published and the most commonly selected. Pricing is based on recommendation volume and starts around $60,000 per year for ten million recommendations.
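
To make that integration pattern concrete, here is a minimal sketch of the round trip an execution system might make at send or render time. The endpoint, parameters, and response fields are my own invention for illustration; they are not idio's actual API.

    # Illustrative only: the URL, parameters, and JSON fields are invented,
    # not idio's actual API. The point is the integration pattern: the email
    # or Web system requests recommendations for a customer and inserts the
    # returned links into its own template.
    import requests

    def fetch_recommendations(customer_id, count=3):
        resp = requests.get(
            "https://recommendations.example.com/v1/recommend",  # hypothetical endpoint
            params={"customer": customer_id, "limit": count},
            timeout=1.0,  # keep render-time calls fast
        )
        resp.raise_for_status()
        return resp.json()["items"]  # e.g. [{"title": ..., "url": ...}, ...]

    def render_newsletter_block(customer_id):
        links = fetch_recommendations(customer_id)
        return "\n".join('<a href="{url}">{title}</a>'.format(**item) for item in links)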

Describing idio’s basic functions makes it sound similar to other recommendation systems, which doesn’t really do it justice. What sets idio apart are the details and technology.

• Content can include ads, offers and products as well as conventional articles.
• The natural language system classifies content without users tagging each item, a huge labor savings where massive volumes are involved, and can handle most European languages.
• idio's largest client ingests more than 1,000 items per day and stores more than one million items, a scale far beyond the reach of systems designed to choose among a couple hundred offers or products.
• Interest profiles take into account the recency of each selection and give different weights to different types of selections – e.g., more weight to sharing something than just reading it (see the sketch after this list).
• Users can apply rules that limit the set of content available in a particular situation.
• The system returns recommendations in under 50 milliseconds, which is fast enough to support online advertising selection.
• It stores customer data in a schema-less system that can make any type of input available for segmentation and reporting, although not to help with recommendations.
• It can build a master list of identifiers for each individual, allowing systems to submit any identifier and access a unified customer profile.
• It can return a content abstract, full text, images, or HTML, or simply a pointer to content stored elsewhere.
• It captures responses directly as the content is presented.
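
As a rough illustration of the profile-building idea in the list above, here is a sketch of a recency- and interaction-weighted interest score. The weights, half-life, and topic labels are my own assumptions for illustration, not idio's actual model.

    # Rough sketch of a recency- and type-weighted interest profile.
    # The weights, half-life, and topics are assumptions for illustration,
    # not idio's actual scoring model.
    import time
    from collections import defaultdict

    TYPE_WEIGHTS = {"read": 1.0, "comment": 2.0, "share": 3.0}  # assumed: sharing counts more than reading
    HALF_LIFE_DAYS = 30.0                                       # assumed decay rate

    def interest_profile(events, now=None):
        """events: list of (timestamp, interaction_type, topic) tuples."""
        now = now or time.time()
        profile = defaultdict(float)
        for ts, kind, topic in events:
            age_days = (now - ts) / 86400.0
            decay = 0.5 ** (age_days / HALF_LIFE_DAYS)          # older selections count for less
            profile[topic] += TYPE_WEIGHTS.get(kind, 1.0) * decay
        return dict(profile)

    def recommend(profile, candidates, limit=3):
        """Rank candidate content items (each tagged with topics) by affinity to the profile."""
        scored = [(sum(profile.get(t, 0.0) for t in item["topics"]), item) for item in candidates]
        return [item for score, item in sorted(scored, key=lambda x: -x[0])[:limit]]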

Most of these capabilities are exceptional and the combination is almost surely unique. The ultimate goal is to increase engagement by offering content people want, and idio reports it has doubled or even quadrupled selection rates vs. previous choices. All this explains why a small company whose product launched in 2011 has already landed so many large enterprises among its dozen or so clients.

Impressive as it is, I don’t see idio as a CDP because it is primarily limited to interest profiles and  content recommendations. What might yet change my mind is idio’s plan to go beyond recommending content based on likelihood of response, to recommending content based on its impact on reaching future goals such as making a purchase. The vendor promises such goal-driven recommendations in about six months.

Idio is also working on predicting future interests, based on behavior patterns of previous customers.  For example, someone buying a home might start by researching schools, then switch to real estate listings, then to mortgages, then moving companies, and so on. Those predictions could be useful in their own right and also feed predictions of future value, which could support conventional lead scoring applications. Once those features become available, idio may well be of interest to buyers well beyond its current customer base and would probably be flexible enough to serve as a Customer Data Platform.