Wednesday, April 21, 2010

Are Experts Obsolete? Not In My Informed Opinion

I recently tripped over an intriguing article, “Extinction of the Expert,” by Denise Gershbein, a creative director at frog design. To be honest, I couldn’t quite follow her argument, but the gist seemed to be that true experts in the future will be people who can integrate information from multiple domains by leading teams of people who are themselves experts in different fields. I sense an infinite recursion here – are the “experts in different fields” themselves people who integrate other experts, or are they domain experts in the conventional sense? But as someone who makes a living based on my own claims of expertise, I’m less interested in Gershbein’s answer than in her original question of whether experts will soon be obsolete.

My short answer, you won’t be surprised to learn, is no. Maybe I'm biased by self-interest, but it seems perfectly clear that there are many situations when the collective wisdom of the Internet won’t suffice. If I need a plumber or surgeon or marketing consultant, I need someone who can solve my individual problem, not provide generic advice or spend days researching the issue. In most cases, experts won't provide that kind of personal attention for free. (The exceptions, where experts will provide individual help as a hobby or public service or for glory or because someone else is funding them, are just that – exceptions.) Perhaps my personal expert will be able to call on a crowd of other experts for assistance. But each expert must start with a high level of personal knowledge to be effective. QED.

Even though I expect experts to survive in pretty much their current form, their surroundings are certainly changing. In particular, two major trends are well under way:

- information is much more accessible. I know you knew that, but have you considered the kind of information we’re talking about? What’s more accessible is basic information, such as “who are the major vendors in a given market?” Back in the day, just knowing the answer to that qualified you as an expert. Now, anyone can find it in an hour. But the critical point is that once you get beyond basic information, the important details – strengths and weaknesses, specific features, industry reputation – are not easily accessible, and you need to be an expert even to know which questions to ask. So even though experts need deeper knowledge than before to add real value, people with specific questions still need experts to get the answers.

- experts are much more accessible. This is true in several senses: it’s easier to find an expert; there are more experts to find; and it’s easier to be recognized as an expert. The ease of publishing in blogs and other online venues has removed the bottleneck previously created by traditional media, allowing many more people to display their expertise and making them easier to find. That the number of true experts has expanded may seem debatable, but I believe the greater availability of information means that more people can learn what an expert needs to know. In practical terms, greater accessibility also means that more people can sell their expertise: even if the total number of people with deep knowledge hasn’t expanded, the proportion of those people offering their services as experts has certainly grown. Either way, the net supply of expertise for hire has increased.

Of course, the loss of the filters provided by traditional media also means it’s easier for people to appear to be experts when they are not. This matters more in some fields than others: if a credentialing system is still in place, such as government-sanctioned licensing or industry certifications, then experts must still pass the traditional hurdles. But in fields like journalism and marketing, pretty much anybody can peddle their wares to whoever will buy them. This means that success as a professional expert now requires a new set of self-promotion skills, although it would be naive to believe that success didn’t always require some type of self-promotion. I’d guess it’s easier today for a less-than-fully-competent expert to make a living, if only because it’s easier to attract potential clients. In fields where performance is highly subjective, it’s probably even possible for someone who gives objectively bad advice to build a base of happy reference clients. Although I’m not quite ready to concede that there is no ultimate objective measure of expertise, I do think it’s harder than ever for clients to assess the true competence of experts they are considering for hire.

Back to the original question: if there’s more competition from both competent and less-competent experts, are “true” experts in danger of extinction? I still don’t think so, but I do think they’ll find it harder to make a living, which may ultimately reduce the level of expertise available in the market. Advanced expertise requires considerable investment in training and research, which only well-established, profitable experts can afford. Those experts will continue to prosper by charging premium rates to discriminating clients, but there will be fewer of them and they'll be less likely to share what they know for free over the Internet.

Less knowledgeable clients will settle for less knowledgeable experts, who will be both cheaper and more accessible. Maybe that’s still a net gain compared with a world of a few experts whose rates are so high that many companies can’t afford them. A health care analogy would be a system where more patients get care, but they see a nurse-practitioner instead of a doctor. Since some care is better than none, the average level of care rises, despite the occasional catastrophic error when a more skilled expert should have been consulted. (In actual health care, this doesn't happen because nurse-practitioners are trained to call a physician when appropriate. But in other fields, such safeguards don’t exist.)

The economics of being an expert are my problem, since that’s how I make my living. What you, Dear Reader, presumably must worry about is how to get the best value from the experts you employ while avoiding catastrophic results. At this point all I can advise is greater care than ever in selecting your experts – look beyond the persuasive blog posts for concrete experience and proven results. Perhaps community rating mechanisms will eventually make the selection easier, but at the moment you need to question whether the crowd truly knows best.

Tuesday, April 20, 2010

OneSource Survey: Salespeople Accept Value of Leads from Marketing

Summary: A survey of business-to-business salespeople finds they (still) consider themselves their best source of qualified leads. But marketing-generated leads are gaining increasing respect and salespeople are increasingly looking for help from outside data vendors. Marketers should work closely with salespeople to reinforce these trends, which promise to lower the overall cost per sale.

Most of my interactions are with marketers, so it was interesting to see the opinions of 136 salespeople reported in a recent survey from data vendor OneSource.

The most interesting information was what respondents saw as their largest source of qualified opportunities. By far the leader was “outbound prospecting”, which is a bit frightening given the high cost of such leads. For example, the State of Inbound Marketing 2010 survey from Hubspot found that outbound leads (from telemarketing, trade shows and direct mail) cost an average of $332, compared with $134 per inbound lead (from social media and Web sites).

I suspect that salespeople have always felt they must rely on their own outbound prospecting to be successful. What’s probably more significant is that the three next-ranking sources in the OneSource survey come from marketing: Website, inbound calls and email campaigns. Events and trade shows actually rank below all of these. Bringing up the rear are social networking and direct mail, which are rated equal – a pretty impressive showing for social media if you think about it – and Webinars. Altogether, I see this as a perhaps-grudging recognition by salespeople that marketing plays a critical and growing role in generating qualified leads.

Other survey answers were largely consistent with the theme of salesperson self-reliance. The most valuable types of information were targeted contact lists and new CRM contacts; the most useful external data elements were email addresses, direct phone numbers and segmentation data; and the most useful company information was the basics of location and size. These answers draw a picture of salespeople saying, “Hand me the leads and let me do the rest.” There’s no hint of a role for marketing in nurturing unqualified leads or building brand awareness, although the survey didn’t ask about those directly.

One anomaly in this data is sharply increasing reliance on external business information services. Twelve percent of respondents said they had recently started using these services and a whopping 37% said they were relying on them more heavily. Just 7% were relying on them less and only 24% were not using them at all. I see this as an acknowledgment by salespeople that outside resources can indeed make them more efficient, even if they still do the actual outbound prospecting themselves.

For what it’s worth, the survey (taken in December 2009) also found some optimism about future sales: 55% said their pipeline was significantly or somewhat better than last year, compared with 33% who said it was significantly or somewhat worse. But sales cycles are still growing: 59% said they were longer than last year vs. 16% saying they were shorter. Although I wouldn’t read too much into such a small survey, this is at least consistent with the hypothesis that there’s a long-term trend towards lengthier, more complicated sales cycles that will continue even once the economy recovers.

Altogether, the results reinforce the conventional wisdom that marketers need to work closely with sales departments to ensure they are delivering qualified leads, and that salespeople recognize this. Longer-term projects such as lead nurturing and branding are harder to tie to specific sales revenues, but marketers must trace this connection to justify their funding.

Wednesday, April 14, 2010

SAS, Unica and smartFocus Add Social Media Features

Summary: Major consumer-oriented marketing automation vendors have all added some type of social media capability. But some focus on monitoring conversations while others help marketers send more messages. Be sure you know which you're getting.

On Monday, marketing automation vendors SAS and Unica both announced new social media capabilities. SAS provided quite a bit of detail while Unica did not, so I can’t compare the two announcements in depth. [Unica provided additional detail after this post was written.] But combined with a social network marketing announcement in February from smartFocus and Alterian's acquisition of social media monitoring system Techrigy last July, all major consumer-oriented marketing automation vendors have now added some flavor of social media marketing to their systems.

What’s really interesting is how widely those flavors vary.

- Techrigy lets marketers and service departments monitor, analyze and respond to comments in social media. See my discussion last July for details.

- SAS's new solution has analytical functions similar to Techrigy's, including conversation monitoring and capture, content classification by topic and sentiment, drill-down to individual documents, influence measurement, and dashboards. But it doesn’t seem to include the case management features that a publicity or service department would use to interact with individuals.

- smartFocus aims not to monitor general social media activity but to measure the influence of individuals. Although the press release is short on details, the company told me that what it's really providing is a “share to social” option for system-generated emails. This lets smartFocus track how each recipient shares the item, identify Web visitors who clicked on the shared item, and collect the behaviors of those visitors. This data is linked back to the original recipient, so smartFocus can measure the activity each recipient has generated and profile that recipient's responders (see the sketch after this list for the general idea). Although the approach is far from comprehensive – it only tracks items that smartFocus itself sent and the recipient shared – it does tie those activities to hard metrics such as purchases. It is quite different from calculating influence by counting followers or content reuse, which are the more conventional approaches to social media measurement.

- Unica announced several enhancements to its flagship Enterprise system, of which three had no particular social media focus: adding data capture forms to emails; adding personalization to Web sites via page tags; and a new graphical interface for its event-detection system. A fourth item, incorporating data from social media Web sites in the Web analytics solution, is useful but not ground-breaking. The only substantial new social media addition is Unica’s own “share to social” option, which the company confirms does not link shared items back to the original sharer as does smartFocus.
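
To make that link-back mechanism concrete, here is a minimal sketch of how a “share to social” option can tie shared-item activity back to the original email recipient. Everything in it – the URL, parameter names, IDs and amounts – is invented for illustration; this is the general pattern the feature implies, not smartFocus’s or Unica’s actual implementation.

```python
# Hypothetical "share to social" link-back, in the spirit of the smartFocus
# feature described above. URL, parameter names and IDs are all invented.
from urllib.parse import urlencode

BASE_URL = "https://example.com/offer"  # assumed landing page for the emailed item

def make_share_link(recipient_id, channel):
    """Tag the shared URL with the original email recipient and the channel,
    so later clicks can be attributed back to the sharer."""
    return f"{BASE_URL}?{urlencode({'sharer': recipient_id, 'ch': channel})}"

# Rows a tracking endpoint might log after parsing those URL parameters:
# (original sharer, visitor cookie, purchase amount).
clicks = [
    ("r1001", "cookie_a", 0.0),
    ("r1001", "cookie_b", 49.95),
    ("r2002", "cookie_c", 0.0),
]

def influence_by_sharer(rows):
    """Roll clicks and revenue up to the original recipient – the hard,
    per-person influence metric described above."""
    stats = {}
    for sharer, _visitor, amount in rows:
        s = stats.setdefault(sharer, {"clicks": 0, "revenue": 0.0})
        s["clicks"] += 1
        s["revenue"] += amount
    return stats

print(make_share_link("r1001", "twitter"))
print(influence_by_sharer(clicks))
```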

If there’s a lesson in all this, it’s that “social media solution” is far from a simple check box on your requirements list. Vendor solutions differ widely and will continue to vary for quite some time. SAS and Alterian chose to start with monitoring and measurement, while Unica and smartFocus jumped right into messaging. This nicely illustrates a similar split among marketers in choosing which to do first. Many recognize the need for measurement but can't resist the lure of sending messages that will generate immediate response. But even if you start with a messaging solution, be sure to add monitoring as quickly as possible. I suspect that most firms will find that the information they gather from social media is ultimately more valuable than the relatively small amount of business they gain from social media directly.

Monday, April 05, 2010

VisualIQ Measures Marketing Impacts Across All Channels

Summary: VisualIQ combines customer-level transactions and contact history with traditional aggregate data to produce better marketing performance measurement. It hasn't solved the problem of identifying the same customer across channels, but it's trying.

I was going to start this post by writing that last-click attribution has recently come under fire, but the first Google hit on the topic brings up a study from 2007. So maybe the criticism isn’t particularly new. But the fact remains that, now more than ever, marketers are trying to measure the impact of all contacts on customer behavior.

Broadly speaking, the problem is attacked in two ways. One, most common among consumer goods manufacturers and others who do not sell directly to their customers, uses aggregated data in marketing mix models to find correlations between marketing efforts and total sales. The other, favored by banks, retailers, communications providers and others who do sell directly to known buyers, assesses the impact of each contact with specific individuals. Last-click attribution is a particular challenge for online marketers because they fall between these two situations: they can often identify their buyers but not trace their full contact history.
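
To make the contrast concrete, the sketch below shows the aggregate approach at its most bare-bones: a marketing mix model that regresses total sales on channel-level spend. The weekly data and channel names are invented, and real mix models add seasonality, diminishing returns, carry-over effects and much more.

```python
# A toy marketing mix model (not VisualIQ's method): ordinary least squares
# regression of total sales on per-channel spend. All numbers are invented.
import numpy as np

# Weekly spend per channel (columns: TV, search, email) and total sales.
spend = np.array([
    [100.0, 20.0, 5.0],
    [120.0, 25.0, 5.0],
    [ 80.0, 30.0, 8.0],
    [ 90.0, 35.0, 9.0],
    [110.0, 40.0, 7.0],
])
sales = np.array([520.0, 590.0, 515.0, 560.0, 600.0])

# Add an intercept column, then solve for the coefficients.
X = np.column_stack([np.ones(len(spend)), spend])
coefs, *_ = np.linalg.lstsq(X, sales, rcond=None)
print("intercept and per-channel effects:", coefs.round(2))
```

The customer-level approach replaces these aggregate correlations with credit assigned to specific contacts, which is where VisualIQ comes in.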

VisualIQ, founded in 2005 as Connexion.a, proposes to straddle these worlds by combining aggregate-level models with customer-specific contact history. They haven’t found a magic bullet: like everyone else, VisualIQ tracks online customers through cookies, with all the limits that implies. But VisualIQ strives to make the best use of what’s available by unifying data from as many online campaigns as possible, linking cookies with online transactions, and then linking online transactions to offline identities.
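
As a rough picture of that unification step, this sketch chains hypothetical lookup tables: cookie to online transaction, transaction to email address, email to offline customer record. The chain breaks wherever a link is missing, which is exactly the cookie-based limitation noted above; every identifier and field name here is made up.

```python
# Hypothetical identity stitching: follow cookie -> transaction -> email ->
# offline CRM record, stopping wherever the chain breaks. All IDs invented.

cookie_to_txn = {"cookie_a": "txn_1", "cookie_b": "txn_2"}   # from checkout pages
txn_to_email = {"txn_1": "pat@example.com"}                  # from order records
email_to_crm = {"pat@example.com": "crm_78231"}              # offline customer file

def resolve(cookie_id):
    """Resolve a cookie as far along the chain as the data allows."""
    txn = cookie_to_txn.get(cookie_id)
    email = txn_to_email.get(txn)
    return {"cookie": cookie_id, "txn": txn, "email": email,
            "crm_id": email_to_crm.get(email)}

print(resolve("cookie_a"))  # fully resolved to an offline identity
print(resolve("cookie_b"))  # reaches a transaction but no offline match
```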

This approach offers some general advantages and two specific capabilities. The general advantages come from assembling all advertising and customer transaction information in one database. This allows VisualIQ to analyze campaign results, do whatever identity matching is possible, and isolate the impact of source, contact frequency, demographics, location and other variables. VisualIQ, a hosted service, has invested heavily in technology to analyze massive data sets along such dimensions.

The first specific capability is relating pre-purchase contacts to actual purchases for individual customers, thus moving beyond last-click attribution. Although this is subject to the limits of cookie-based tracking, VisualIQ does what it can to build a unified identity by sharing the same cookie IDs across as many online channels as possible. The second capability is building mix models with data from actual customer contacts instead of market-level estimates or surveys. VisualIQ says it has found this yields more accurate results than traditional information.
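
To see what moving beyond last-click means in practice, compare two toy attribution rules applied to the same invented contact paths: last-click hands all conversion credit to the final touch, while a simple linear rule splits credit evenly across every pre-purchase contact. VisualIQ’s actual models are proprietary; this only illustrates the concept.

```python
# Last-click vs. linear (even-split) attribution over invented contact paths.
from collections import defaultdict

# Each converting customer's ordered list of pre-purchase contacts.
paths = [
    ["display", "search", "email"],
    ["search", "email"],
    ["display", "display", "search"],
]

def last_click(paths):
    credit = defaultdict(float)
    for path in paths:
        credit[path[-1]] += 1.0               # all credit to the final touch
    return dict(credit)

def linear(paths):
    credit = defaultdict(float)
    for path in paths:
        for touch in path:
            credit[touch] += 1.0 / len(path)  # even split across all touches
    return dict(credit)

print("last click:", last_click(paths))
print("linear:", {k: round(v, 2) for k, v in linear(paths).items()})
# Display earns credit under the linear rule that last-click never gives it.
```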

This is all good stuff and VisualIQ has packaged it nicely in a tiered set of offerings. These range from campaign-level reporting to customer-based insights to predictive modeling and simulation, with prices for the simplest system starting as low as $5,000 to $10,000 per month. The company has had considerable success, counting major banks, retailers, and communications firms as clients. Note that these are all industries that sell to their customers directly.

But VisualIQ’s specific offerings are just part of the story. What’s really important is setting explicit goals of linking identities across channels and measuring cross-channel marketing impacts. These are arguably the core challenges in marketing measurement today. This focus has led VisualIQ to look for alternatives to cookies and to use existing methods to combine online and offline information for the same person.

The company is also seeking to make it easier to apply its results. Today, it basically generates reports that suggest better media allocations and advertising contents. But it is working to automatically feed those findings as rules into execution systems such as ad servers and ad exchanges. This brings marketers closer to the ultimate goal of self-optimizing programs. Other vendors are also pursuing self-optimization, but VisualIQ promises the advantage of decisions based on data from all channels rather than a single channel or, heaven forbid, just the last click.