Showing posts with label customer experience management. Show all posts

Thursday, June 02, 2016

Usermind Makes Journey Orchestration Simple

Maybe you’ve been waiting with increasing impatience for me to finish reviewing the set of Journey Orchestration Engines (JOEs) I first mentioned in March.  More likely, it slipped your mind entirely. But I do worry about such things, so I’m especially pleased to finish out the set by telling you about Usermind.

Usermind Journey

I'm not saying that Usermind calls itself a JOE.  Its self-description is “the first unified platform for orchestrating business operations”.  But the company uses the language of journeys and customer data stores. So although they see themselves as enabling all kinds of business processes, I think it’s fair to view them largely in the context of customer management.

Usermind is all about simplicity.  Its main screen sets the tone by offering just three tabs: Analytics, Journeys, and Integration. Deploying the system actually starts with the last of these, Integration, which is where the user connects to external systems that are both data sources and execution engines. The company lists about a dozen standard integrations including major marketing automation, CRM, email, customer service, collaboration, and analytics systems. Another half-dozen are “coming soon.”

A key feature of Usermind is that it makes integration easy by reading the contents of the source systems automatically, so any custom data elements or objects are incorporated without user effort. This also means it adjusts to changes in those systems automatically. Users do build maps that show which fields to use to link customers (or other entities) across systems: for example, a map might use email address to link marketing automation to CRM, and customer ID to link CRM to customer service. The system can also map on combinations of fields and do fuzzy matching on inconsistent data. There can be separate maps for individuals, companies, products, customers, partners, or whatever other entities the user wants to work with. Usermind figures out relationships among tables or objects within each source system, so users simply see a list of available fields without having to worry about the underlying data structures.
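To make the mapping idea concrete, here is a minimal Python sketch of linking records across two systems on a normalized email key. The field names and the normalization step are my own illustration, not Usermind's actual implementation; its fuzzy matching is presumably far more sophisticated than lowercasing and trimming.

```python
# Sketch of a cross-system identity map in the spirit of Usermind's maps.
# System and field names here are hypothetical, not Usermind's schema.

def normalize_email(value: str) -> str:
    """Light normalization so trivially inconsistent data still matches."""
    return value.strip().lower()

def link_records(marketing, crm, key="email"):
    """Join records from two source systems on a normalized key field."""
    index = {normalize_email(r[key]): r for r in crm}
    linked = []
    for rec in marketing:
        match = index.get(normalize_email(rec[key]))
        if match:
            linked.append({**match, **rec})  # merged cross-system view
    return linked

marketing = [{"email": " Jane@Example.com ", "campaign": "spring-promo"}]
crm = [{"email": "jane@example.com", "account_id": "A-17"}]

print(link_records(marketing, crm))
```

A real deployment would link on combinations of fields and tolerate messier inconsistencies, but the shape of the map is the same: a key field per system pair, plus a matching function.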

Once the maps are in place, Usermind copies selected data elements into its own database, where they are available to use in journeys. Each journey is a sequence of milestones, which can each contain one or more rules. Each rule has selection conditions and one or more actions to take if the conditions are met. Actions can push data or tasks back to the source systems. Rules can be triggered by events or executed on a schedule.
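The journey-milestone-rule hierarchy can be pictured as a small data model. This is a toy reconstruction from the description above, with invented class and method names; it is not Usermind's API.

```python
# Toy data model for the journey -> milestone -> rule hierarchy described
# above. Names are illustrative, not Usermind's actual objects.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    condition: Callable[[dict], bool]  # selection conditions
    actions: list                      # actions pushed back to source systems

@dataclass
class Milestone:
    name: str
    rules: list = field(default_factory=list)

@dataclass
class Journey:
    milestones: list = field(default_factory=list)

    def process(self, customer: dict) -> list:
        """Return the actions fired for this customer across all milestones."""
        fired = []
        for milestone in self.milestones:
            for rule in milestone.rules:
                if rule.condition(customer):
                    fired.extend(rule.actions)
        return fired

journey = Journey([Milestone("trial signup",
                             [Rule(lambda c: c.get("plan") == "trial",
                                   ["send_welcome_email"])])])
print(journey.process({"plan": "trial"}))  # → ['send_welcome_email']
```

The appeal of the structure is how little there is to it: conditions, actions, and an ordered list of milestones are enough to express a journey without a branching flowchart.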

Usermind Rule

And that’s pretty much it. The Analytics tab reports on movement of customers through journeys, providing counts, conversion rates and drop-out rates for each milestone. It also analyzes the impact of actions on results.  The system can be connected to business intelligence tools for more advanced reporting. But there’s no predictive analytics, content creation, or message execution. True to its description, Usermind is designed to orchestrate actions in other systems, not take actions itself.

Don’t let that simplicity fool you. Usermind (and other JOEs) address the critical challenge of unifying customer data from different sources and coordinating customer treatments. Tools to make this easy are rare; tools to send emails and deliver other messages are not. So Usermind fills an important gap – which is why the company has attracted $22 million in venture funding since it was founded in 2013, and why its investors waited until this year for it to launch the actual product. (Whether they waited patiently is a question I didn’t ask.) As of March, the company reported 15 live customers and was actively looking for more.

You may be wondering whether Usermind can truly be called a JOE since I've defined the essence of JOE-ness as a system that discovers the customer journey for itself rather than relying on the user to define it. Usermind doesn’t pass that test. In fact, Usermind journeys are individual processes rather than an overview of the customer’s lifetime experience. But Usermind still looks JOE-ish because it’s capturing events that occur naturally, not creating its own events like messages in a nurture flow. And its ability to use the journey as a framework for managing customer treatments is exactly what JOEs are all about. So marketers looking for a JOE should put Usermind on their list.

Tuesday, October 06, 2015

Marketers Are Struggling to Keep Up With Customer Expectations: Here's Proof

How pitiful is this: My wife left me alone all last weekend and the most mischief I could get into was looking for research about cross-channel customer views. The only defense I can make is I did promise a client a paper on the topic, which I finished Sunday night. But then I decided it was way too wonky and wrote a new, data-free version that people might actually read.

But you, Dear Reader, get the benefit of my crazy little binge. Here’s a fact-filled blog post that uses some of my carefully assembled information. (Yes, there was actually much more. I’m so ashamed.)

Customer Expectations are Rising

Let's start with a truth universally acknowledged – that customers have rising expectations for personalized treatment. Unlike Jane Austen, I have facts for my assertion: e-tailing group's 7th Annual Consumer Personalization Survey found that 52% of consumers believe most online retailers can recognize them as the same person across devices and personalize their shopping experience accordingly. An even higher proportion (60%) want their past behaviors used to expedite the shopping experience, and more than one-third (37%) are frustrated when companies don’t take that data into account.

Switching to customer service, Microsoft’s 2015 Global State of Multichannel Customer Service Report found that 68% of U.S. consumers had stopped doing business with a brand due to a poor customer service experience and 56% had higher expectations for customer service than a year ago. So, yes, customer expectations are rising and failing to meet them has a price.


Marketers Know They Need Data

Marketers certainly see this as well. In a Harris Poll conducted for Lithium Technologies, 82% of 300 executives agreed that customer expectations have gotten higher in the past three years. Focusing more specifically on data, Experian's 2015 Data Quality Benchmark Report, which had 1,200 respondents, found that 99% agreed some type of customer data is essential for marketing success. Marketers are backing those opinions with money: when Winterberry Group asked a select set of senior marketers what was driving their investments in data-driven marketing and advertising, the most commonly cited reason was the need to deliver more relevant communications and be more customer-centric.


Few Have the Data They Need

But marketers also recognize that they have a long way to go. In Experian’s 2015 Digital Marketer study, 89% of marketers reported at least one challenge with creating a complete customer view.



Econsultancy’s 2015 The Multichannel Reality study for Adobe found that just 29% had succeeded in creating such a view, 15% could access the complete view in their campaign manager, 14% integrated all campaigns across all channels, and 8% were able to adapt the customer experience based on context in real time. In other words, the complete view is just the beginning, and marketers are nowhere near as good at personalizing experiences as consumers think.


Real-Time Isn't a Luxury

Given the challenges in building any complete view, is real-time experience coordination too much to ask? Customers don’t think so; in fact, as we've already seen, they assume it’s already happening. Marketers, of course, are more aware of the challenges, but they too see it as the goal. In a survey of their own clients, marketing data analysis and campaign software vendor Apteco Ltd found that 12% of respondents were already using real-time data, 31% were sure they needed it and 37% felt it might be useful. Just 17% felt daily updates were adequate.


Real-Time Must Also Be Cross-Channel

It’s important not to confuse real-time personalization with tracking customers across channels, or even with identifying customers at all. In a survey by personalization vendor Evergage, respondents who were already doing real-time personalization were most often basing it on immediately observable, potentially anonymous data including type of content viewed, location, time on site, and navigation behavior. Yet the marketers in that same study gave the highest importance ratings to identity-based information including customer value, buying/shopping patterns, and buyer persona. It’s clear that marketers recognize the need for a complete customer view even if they haven't built one.


Summary

What are we to make of all this, other than the fact that I need to get out more?  I'd summarize this in three points:

- customer expectations are truly rising and you'll be penalized if you don't meet them
- marketers know that meeting expectations requires a complete customer view but few have built one
- the complete view has to be part of an integrated, real-time system to deliver the necessary results

None of this should be news to anyone. But perhaps this data will help build your business case for investments to solve the problem.  If so, my lost weekend will not have been in vain.


Tuesday, March 24, 2015

Adobe Marketing Cloud Marches Towards Martech and Adtech Integration


At pretty much the same moment I was publishing my post on the merger of martech and adtech into madtech, Adobe was announcing its latest marketing products, including a press release on uniting “Data-driven Marketing and Ad Tech”. Naturally, this caught my attention.

As you might expect, Adobe’s reality is considerably more complicated than the simplicity of the “madtech” vision. Like the other enterprise software vendors who offer broad martech and adtech solutions, Adobe has built its marketing cloud by buying specialist systems. And, again like its competitors, it has only integrated them to a limited degree.

In Adobe’s case, the various products remain as distinct “solutions” served by a common set of “core services”. The current set of eight solutions includes Analytics (Web, video and mobile analytics, née Omniture Site Catalyst), Social (social publishing, based on Context Optional), Target (Web optimization and personalization, derived from Omniture Test & Target/Offermatica), Experience Manager (Web content management, originally Day Software), Media Optimizer (based on Efficient Frontier and Demdex), Campaign (formerly Neolane), Primetime (addressable TV), and Audience Manager (data management platform, formerly Demdex). Of course, the products have all been modified to some degree since their acquisitions.  But each still has its own data store, business logic and execution components.

Rather than replacing these components with common systems, Adobe has enabled a certain amount of sharing through its core services. In the case of customer data, the “profiles and audiences” core service maintains a common ID that is mapped to identities in the different solutions. This means that even though most customer data stays in the solutions’ own databases, the core service can use that data to build audience segments. There's also an option to load some attributes into the core services profiles themselves.   Audiences, which are lists of IDs, can either be defined in solutions and sent to the core service or built within the core service itself.  Either way, they can then be shared with other solutions. Data from external systems can also be imported to the core service in batch processes and used in segmentation.
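The shared-ID mechanism is worth a sketch: a common ID is mapped to each solution's native identity, so an audience built anywhere can be translated for use somewhere else. The solution names and ID formats below are invented for illustration, not Adobe's actual data structures.

```python
# Rough sketch of the "profiles and audiences" idea: a common core ID mapped
# to per-solution identities, so an audience (a list of core IDs) built in
# one solution can be translated into another solution's native IDs.

id_map = {
    "core-001": {"analytics": "an-9", "campaign": "cm-4"},
    "core-002": {"analytics": "an-3", "campaign": "cm-8"},
}

def translate_audience(audience, target_solution):
    """Turn a list of core IDs into the target solution's native IDs,
    skipping anyone with no identity in that solution."""
    return [id_map[cid][target_solution]
            for cid in audience
            if target_solution in id_map.get(cid, {})]

audience = ["core-001", "core-002"]              # built in one solution
print(translate_audience(audience, "campaign"))  # → ['cm-4', 'cm-8']
```

The point of the design is that most customer data never leaves the solutions' own databases; only the ID mappings and audience lists need to live centrally.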

Adobe says that data stored in the solutions can be accessed in real time.  I'm skeptical about performance of such queries, but the ability to store key attributes within the core service profiles should give marketers direct access when necessary.  There’s certainly a case to be made that digital volumes are so huge and change so quickly that it would be impractical to copy data from the solutions to a central database. Where external data is concerned, marketers will increasingly have no choice but to rely on distributed data access.

But here’s the catch: Adobe's approach only works if all your systems are actually tied into the central system. Adobe recognizes this and is working on it, but so far has only integrated five of its solutions with the profiles and audiences core service. These are Analytics, Target, Campaign, Audience Manager, and Media Optimizer. The rest will be added over time.

The second big limit to Adobe’s current approach is sharing with external systems. Only Adobe solutions can access other solutions’ data through core services. This makes it difficult to substitute an external product if you already have one in place for a particular function or don’t like Adobe’s solution.

Adobe does connect with non-Adobe systems through Audience Manager, its data management platform, which can exchange data with a company’s own CRM or operational databases, business partners, and external data pools and ad networks. Audience Manager can hold vast amounts of detailed data, but does not store personally identifiable information such as names or email addresses. Audience Manager can also copy Web behavior information directly from Analytics, the one instance (so far as I know) where detailed data is shared between Adobe solutions.


So far, I’ve only been discussing data integration. The various Adobe components also have their own tools for segmentation, decision logic, content creation, and other functions. These are also slowly converging across products: for example, there is an “assets” core service that provides a central asset library whose components can be uploaded to at least some of the individual solutions. The segmentation interface is also being standardized product-by-product. There’s no point in trying to list exactly what is and isn't standard today, since this will only change over time.

The lesson here is that suites are not simple. Marketers considering Adobe or any other Marketing Cloud need to examine the details of the architectures, integration, and consistency among the components they plan on using. The differences can be subtle and the vendors often don’t explain them very clearly. But it pays to dig in: the answers have a big impact on whether the system you choose will deliver the results you expect.

Friday, November 08, 2013

Gainsight Gives Customer Success Managers a Database of Their Own

I had a conversation last week with a vendor whose pitch was all about providing execution systems with a shared database that contains a unified view of customer information from all sources. Sadly, they were unfamiliar with the concept of a Customer Data Platform as I’ve been developing it over the past few months and didn’t realize that they fit the definition.

This post is not about that company.

Instead, it will be about another company I also spoke with last week, which I had originally considered a CDP but then decided wasn’t. After hearing their latest news, I still place them outside the border, but think they’re creeping closer and – for reasons I’ll explain later -- will some day reach the other side.


The company is Gainsight (formerly JBara), whose Web site positions it as “a complete customer success platform”. That could easily be pure fluff – doesn’t every company contribute to its customers’ success? – but Gainsight actually means something concrete: it helps customer success managers identify churn risks and sales opportunities among their clients. As Gainsight sees it, this makes them the post-sales analogue to marketing automation systems (which manage acquisition) and CRM systems (which manage sales)*. This trichotomy** ignores customer service systems, which I'd consider the major post-sales management tools. But Gainsight is genuinely different from customer service, and in fact uses those systems as data sources. So even though Gainsight may not have created the third great category of customer-facing systems, it does do something important.

Specifically, Gainsight gathers information from online products, CRM, customer service, accounting, and customer surveys to create a complete view of how existing customers are using the products they own, whether they’re renewing or expanding their usage, what they’re paying, what sorts of service issues they’re having, and what attitudes they’ve expressed.  It tracks this over time and uses the information in a variety of ways: to profile and summarize the health of each account; to send alerts about problems or opportunities; to display trends in usage, satisfaction, and other measures; and to analyze the customer base by relationship stage, revenue range, and other factors. The information is presented through Gainsight’s own interface, which runs on Salesforce.com’s Force.com platform, making it easy to integrate with Salesforce itself.

Gainsight originally stored all its data within Force.com, but it has recently started using MongoDB and Hadoop, which will allow it to store details such as clickstreams and product usage history. The company has also expanded its "big data science" resources to identify the attributes of customers likely to churn or to purchase new services. This will help users define the rules that drive alerts.  So far, there is no automated predictive modeling to build such rules, although that’s planned for the future. Data is typically loaded weekly, which Gainsight says is the most often that customers have requested.
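Rule-driven alerts of this kind are easy to picture in code. The rules, thresholds, and field names below are invented for illustration; they are not Gainsight's actual rule engine.

```python
# Illustrative rule-driven alerting over account health data: users define
# named rules, and each account generates whichever alerts its data matches.

ALERT_RULES = [
    ("churn risk", lambda a: a["logins_last_30d"] < 5 and a["open_tickets"] > 3),
    ("upsell", lambda a: a["seats_used"] / a["seats_licensed"] > 0.9),
]

def alerts_for(account):
    """Return the names of all rules this account triggers."""
    return [name for name, rule in ALERT_RULES if rule(account)]

acct = {"logins_last_30d": 2, "open_tickets": 5,
        "seats_used": 46, "seats_licensed": 50}
print(alerts_for(acct))  # → ['churn risk', 'upsell']
```

The planned predictive modeling would essentially learn these thresholds from historical churn data instead of asking users to guess them.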

Of course, once all that juicy customer data has been assembled in one place, companies could use it for more than customer success management. This is part of Gainsight’s master plan, which is to expand beyond customer success teams to account management, sales, and other departments.

This brings us (or me, at least) back to the question of whether Gainsight is a Customer Data Platform. It does build a multi-source customer database, which is the core CDP function. Although the data sources are largely limited to the client’s own systems, external sources are not essential for a CDP.  In any event, Gainsight could probably add external sources fairly easily if a client wanted – especially now that it isn’t bound by the limits of Force.com. Gainsight isn’t yet doing predictive modeling or decision management beyond rule-based alerts, but those common CDP features are also optional, and Gainsight is moving in those directions. Gainsight clearly meets the CDP requirement of building a database controlled by users outside of IT, even though in this case the users are not marketers.

Where Gainsight gets disqualified is that a CDP by definition makes its data available to other systems to guide customer treatments. The Gainsight database is technically exposed already: users could query the Force.com data via the Salesforce API or write direct SQL queries against the new Mongo / Hadoop back-end.  But so far Gainsight’s direction has been to use its data in its own applications and user interface. If Gainsight opened itself up as a data source, it would clearly be a Customer Data Platform.

Even as Gainsight stands today, it’s still more evidence supporting the CDP proposition that companies need a multi-source database – and a warning that multi-source databases themselves could proliferate into a new forest of single-purpose data silos if companies don't adopt a shared CDP instead. As this danger becomes clearer, Gainsight and other companies will need to either become general-purpose CDPs themselves or become applications that plug into a CDP built by someone else.

Gainsight was founded in 2009 and started taking paying customers in 2012. It now has about 20 clients, mostly large enterprises running an online service or Web site. Pricing is based on the number of modules used plus number of users, and averages around $50,000 to $60,000 per year for 20 to 50 users.



_____________________________________________________________________________________________
* Okay, CRM really is more than sales force automation, but that’s the term that Gainsight used and CRM is increasingly used in that narrower sense, mostly because that’s how Salesforce.com describes itself. Get over it.

** Yes, that’s a word.

Wednesday, October 02, 2013

idio Does Sophisticated Content Recommendation

Systems in our new Guide to Customer Data Platforms range from B2B data enhancement to campaign managers to audience platforms. This may lead you to wonder whether there’s anything we actually left out.  In fact, there was: although the final choices were admittedly a bit subjective, I tried to ensure the report only included systems that met specific criteria including a persistent database, customer-level data, marketer control, and marketing-related outputs to external systems. In most cases, I could judge whether a system fit before doing a lot of detailed research. But a few systems were so close to the border that I only made the final call after I had evaluated them in depth.

idio was one of those. The company positions itself as a tool to deliver “personalized and relevant multi-channel communications”, which sure sounds like a CDP.  Indeed, it meets almost all the criteria listed above, including the most important one of building and maintaining a persistent customer database. But I ultimately excluded idio because it is tightly focused on identifying the content that customers are most likely to select, a function I felt was too narrow for a proper CDP. The folks at idio didn’t necessarily agree with this judgment, and pointed to planned developments that could indeed change the verdict (more about that later).  But, for now, let’s not worry about CDPs and take idio on its own terms.

The full description on idio's home page reads “idio understands your customer’s interests and intent through the content they consume and uses this to deliver personalized and relevant multi-channel communications” and that pretty much says it all. What idio does is ingest content – typically from a publisher such as ESPN, Virgin Media, Guardian Media, or eConsultancy (all clients) – but also from brands with large content stores such as Diageo, Unilever, and C Spire (also all clients). It uses advanced natural language processing to extract entities and concepts from this content, classifying it with the vendor’s own 23 million item taxonomy.

The system then monitors the content selected by its clients’ customers in emails, Web pages, mobile platforms, and some social platforms and builds an interest profile for each customer.  This in turn lets the system recommend which existing content the customer is most likely to select next. The recommendations are typically fed back to execution systems, such as email generators or Web content managers, which insert links to the recommended content into Web pages, emails, or newsletters.  Reports show selection rates by content, segment, or campaign, and can also show the most common topics published and the most commonly selected. Pricing is based on recommendation volume and starts around $60,000 per year for ten million recommendations.

Describing idio’s basic functions makes it sound similar to other recommendation systems, which doesn’t really do it justice. What sets idio apart are the details and technology.

• Content can include ads, offers and products as well as conventional articles.
• The natural language system classifies content without users tagging each item, a huge labor savings where massive volumes are involved, and can handle most European languages.
• idio's largest client ingests more than 1,000 items per day and stores more than one million items, a scale far beyond the reach of systems designed to choose among a couple hundred offers or products.
• Interest profiles take into account the recency of each selection and give different weights to different types of selections – e.g., more weight to sharing something than just reading it.
• Users can apply rules that limit the set of contents available in a particular situation.
• The system returns recommendations in under 50 milliseconds, which is fast enough to support online advertising selection.
• It stores customer data in a schema-less system that can make any type of input available for segmentation and reporting, although not to help with recommendations.
• It can build a master list of identifiers for each individual, allowing systems to submit any identifier and access a unified customer profile.
• It can return a content abstract, full text, images, or HTML, or simply a pointer to content stored elsewhere.
• It captures responses directly as the content is presented.
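The recency-and-weight profile logic from that list might look something like the sketch below. The action weights and the 30-day half-life are made-up illustration values, not idio's actual model.

```python
# Hedged sketch of an interest profile: each content selection adds to a
# topic's score, weighted by action type and decayed exponentially by age.
import math

ACTION_WEIGHTS = {"read": 1.0, "share": 3.0}  # sharing counts more than reading
HALF_LIFE_DAYS = 30.0

def interest_profile(events, now_day):
    """events: list of (day, action, topic) tuples. Returns topic -> score."""
    scores = {}
    for day, action, topic in events:
        age = now_day - day
        decay = math.exp(-math.log(2) * age / HALF_LIFE_DAYS)
        scores[topic] = scores.get(topic, 0.0) + ACTION_WEIGHTS[action] * decay
    return scores

events = [(0, "read", "mortgages"), (29, "share", "schools")]
profile = interest_profile(events, now_day=30)
# A recent share outweighs a month-old read, so "schools" ranks first.
print(max(profile, key=profile.get))
```

Recommending content then reduces to matching each item's topics against the customer's highest-scoring interests, subject to whatever eligibility rules apply.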

Most of these capabilities are exceptional and the combination is almost surely unique. The ultimate goal is to increase engagement by offering content people want, and idio reports it has doubled or even quadrupled selection rates vs. previous choices. All this explains why a small company whose product launched in 2011 has already landed so many large enterprises among its dozen or so clients.

Impressive as it is, I don’t see idio as a CDP because it is primarily limited to interest profiles and  content recommendations. What might yet change my mind is idio’s plan to go beyond recommending content based on likelihood of response, to recommending content based on its impact on reaching future goals such as making a purchase. The vendor promises such goal-driven recommendations in about six months.

idio is also working on predicting future interests, based on behavior patterns of previous customers.  For example, someone buying a home might start by researching schools, then switch to real estate listings, then to mortgages, then moving companies, and so on. Those predictions could be useful in their own right and also feed predictions of future value, which could support conventional lead scoring applications. Once those features become available, idio may well be of interest to buyers well beyond its current customer base and would probably be flexible enough to serve as a Customer Data Platform.

Wednesday, August 07, 2013

NICE Buys Causata to Extend Its Customer Experience Management Position

So, there I was around 7:30 Eastern time this morning, sending out reminder notices to vendors I need to interview for an upcoming report on Customer Data Platforms. I received an immediate response from Kevin Nix of Causata, offering to talk that very morning. This seemed a bit odd – Causata is based in San Francisco, so it was 4:30 a.m. local time and most people need more notice to schedule a call. But I had Things To Do, so I didn't give it much thought. Then, at the end of another call, a participant casually mentioned that Causata had just been purchased by Israel-based NICE Systems.  At first I was struck by the coincidence, and then realized what had happened: Nix was up because he had been talking to the folks in Israel, and he replied because he wanted to discuss his acquisition, not my report. [Insert image of deflating self-importance].


Sure enough, when I did dial in, I was treated to a prepared briefing on why NICE had made the deal.

There’s really nothing wrong with that. NICE is little-known in marketing circles, although I had bumped into them previously when they bought decision management vendor eGlue in 2010. But NICE is a major player in contact center systems, with nearly $1 billion revenue and $2.5 billion stock market capitalization. So I was pleased to connect with them directly and learn a bit more.

The briefing itself was interesting too. It turns out that while NICE still sells primarily to contact center managers, it is working hard to expand to clients in marketing, sales, compliance (it bought Actimize in 2007) and other areas related to customer experience. Its interest in Causata related to all that, and in particular to the fact that Causata can capture Web interactions in real time and present them with related recommendations to contact center agents and other systems. This pumped me back up a bit, since it can be read as validation of the Customer Data Platform concept that I’ve been developing, which is about exactly this need to make customer data easily available across platforms. In fact, Causata was the original example I used to introduce the idea.


But enough about me, at least for the moment. The idea of NICE expanding to become an all-channel, all-department customer experience vendor immediately raises the question of how they’ll compete with all those other omni-everythings approaching from digital marketing (Adobe), B2B CRM (Salesforce.com), and general enterprise systems (Oracle, SAP, IBM). The contact center world has actually been a font of decision management systems, most notably Chordiant (now part of Pegasystems) and Infor Epiphany. So it’s certainly possible that they will be another source of competitors converging on the market for integrated customer experience management solutions. Like the CRM and Web content management vendors, the contact center firms start from a strong customer and financial base, making them formidable contenders in what will surely be a long battle for high stakes.

I haven’t formed a solid opinion yet on how NICE in particular or contact center vendors in general are likely to fare in this new arena. But they are definitely something to factor into future assessments.

Thursday, July 25, 2013

Marketo's Engagement Engine Simplifies Complex Marketing Automation Campaigns

I’ve long said that the best campaign design would be one circle: the system executes the best treatment for each customer, waits a day, and repeats. My point is that elaborate, branching flows are too complex for most marketers to build and maintain, and – because reality is infinitely messier than even the most sophisticated flow chart – will often give customers a sub-optimal treatment.


It’s probably just as well that no vendor has ever built a system based on my design. But the good folks at Marketo have taken a step in that direction with their latest enhancement, which they call an “engagement program”. It has more than one step but does get away from the idea of a rigid, branching campaign flow. Instead, it is organized in terms of “streams” that contain pools of content. Once a customer is added to a stream, the system will offer the next piece of content whenever a contact is due according to the campaign cadence. What’s next is set by the order of content within the stream: users just drag content into the container and put on top the ones they want sent first. The system will go through the content in this order during each execution (which Marketo calls a “cast”), and send each person the first item they have not already received. This avoids duplicate messages and lets the system deliver a defined series of messages without explicitly setting up a sequence. It also makes it almost effortless to insert a high priority message that goes to everyone or to swap out contents as new materials become available.
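The "send each person the first item they haven't already received" logic is simple enough to sketch. This is a reconstruction from the description above, not Marketo's actual code.

```python
# Sketch of the per-person selection inside a "cast": walk the stream's
# ordered content and return the first item this person hasn't received.

def next_content(stream, received):
    """stream: content names in priority order; received: set already sent."""
    for item in stream:
        if item not in received:
            return item
    return None  # the person has exhausted the stream

stream = ["intro ebook", "case study", "webinar invite"]
print(next_content(stream, {"intro ebook"}))  # → case study
print(next_content(stream, set(stream)))      # → None
```

Because selection is driven by order rather than an explicit sequence, dragging a high-priority item to the top of the list is all it takes to send it to everyone on the next cast.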



As you’ll immediately notice, sending the next thing isn’t quite the same as sending the best next thing.  In my ideal world, the content would be selected by calculating the value of each item for each individual and taking the highest. This would only take a small tweak in the current approach, which is one reason I like what Marketo has done. Marketo does in fact plan to apply predictive modeling to the system, although I think they're trying to find the most effective content sequence for all customers, rather than scoring content at the individual level.

There’s quite a bit more to the new Marketo feature than I’ve described so far. Content can actually be a multi-step program of its own, such as a sequence of messages to promote and manage a Webinar. Content can also have availability dates that are enforced automatically, so future messages can be added at any time and obsolete messages are automatically discontinued. One thing that’s missing is eligibility rules on content, to let users specify who is allowed to receive it. This is a key feature in traditional decision management systems, permitting customer-level customization within a fixed priority sequence. But Marketo users can achieve the same thing by embedding content within programs, which do have such rules, and adding the programs to the stream instead of the content itself. This is Marketo’s recommended approach because it also provides better data for reporting.

Users can further tailor treatments to customers by setting up multiple streams within one engagement program.  Each stream has “transition rules” that pull in qualified customers from other streams.  This is less rigid than having selection rules in one stream push customers to another stream.  It's not quite clear what happens if someone qualifies for more than one stream: Marketo's position is that would only happen if you make a mistake.  I think reality is not so tidy.  Marketo is considering letting marketers prioritize the transitions based on the position of the streams, just as they prioritize contents within a stream.

In any event, customers can only be in one stream at a time, so they won’t receive multiple messages from the same engagement program. Whether they receive messages from multiple programs is controlled by Marketo’s standard communication limit features, which can set a maximum number of messages per day and per week. Users can decide whether those limits apply to any particular program. The system also lets salespeople or other programs pause messages from an engagement program to an individual customer.
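A communication limit of this kind is easy to picture in code. This is a generic sketch of daily and weekly caps, with illustrative default values rather than anything Marketo documents:

```python
from datetime import date, timedelta

def can_send(send_log, today, daily_cap=1, weekly_cap=3):
    """Return True if sending another message today would stay within
    both the daily and the rolling seven-day caps. 'send_log' is a
    list of dates on which messages were sent to this person."""
    sent_today = sum(1 for d in send_log if d == today)
    week_start = today - timedelta(days=6)
    sent_this_week = sum(1 for d in send_log if week_start <= d <= today)
    return sent_today < daily_cap and sent_this_week < weekly_cap
```

A program exempt from the limits would simply skip this check, which mirrors the option of deciding whether the caps apply to any particular program.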


The new package includes a good set of reports that track content usage and results.  They also provide an “engagement score” that combines several success metrics into a single value. Other reports show how many people in the program have run out of content – a good way to ensure the company doesn’t lose touch with them. Surprisingly, there's no report on movement from one stream to the next. But Marketo says this can be set up using their revenue performance management module, which tracks movement of customers across other types of stages.

The engagement engine is included in all Marketo versions, although lower-level versions have some limits.  Adding a more powerful version to the Standard edition of Marketo starts at $295 per month.

Added Thought: Marketo engagement programs are a type of state-based system, an idea that has been tried from time to time in marketing systems and is currently the basis of Whatsnexx.  As the name implies, state-based systems assign customers to categories and then define treatment rules within each category.  Unlike the sequential flow of a traditional multi-step campaign, customers in a state-based system remain in the same category so long as they meet its membership conditions. State-based systems typically reclassify all members at the start of each cycle, which is different from Marketo's approach of relying on transition rules to pull customers from one stream to the next.  This means that someone could remain in a stream even after they no longer meet its entry criteria.  This is something Marketo might want to reconsider.
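The difference between the two approaches is easiest to see in a sketch of the state-based alternative (a generic illustration, not any vendor's code): because every member is re-tested against every state's conditions on each cycle, no one can linger in a state they no longer qualify for.

```python
def reclassify_all(customers, state_rules):
    """State-based reclassification: on each cycle, test every
    customer against an ordered list of (state_name, predicate)
    pairs; the first matching state wins."""
    assignments = {}
    for cust_id, attrs in customers.items():
        for state, rule in state_rules:
            if rule(attrs):
                assignments[cust_id] = state
                break
    return assignments
```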


Thursday, June 27, 2013

Adobe Buys B2C Marketing Automation Leader Neolane: One Gap Filled, But Where's CRM?

Adobe today announced plans to acquire Neolane, the largest remaining independent B2C marketing automation vendor (excluding email-focused providers like Responsys and Silverpop). The price was $600 million, roughly in line with the 8x revenue multiples recently paid for ExactTarget and Eloqua.  (Neolane announced $58 million revenue in 2012 and has been growing around 40% per year, which would yield about $80 million 2013 revenue.)

The deal is not particularly surprising. Adobe was on everyone’s list of potential buyers, and Neolane was ripe for acquisition or an initial public offering. It reinforces suspicions that Adobe was the mystery bidder for ExactTarget mentioned last month by Salesforce.com.  Indeed, my take on the ExactTarget deal explicitly mentioned an Adobe/Neolane possibility. That frankly didn’t take much insight, but I’ll brag a bit more about having pegged Adobe as needing to add marketing automation as far back as this post in 2009 and again in 2010.

Neolane is more of a mid-tier solution than an enterprise product, which may be a slight mismatch with Adobe.  I’d say that reflects a lack of enterprise systems available for Adobe to purchase, more than any particular desire to target the mid-market.

Predictable or not, this deal does fill a gaping hole in Adobe’s marketing cloud. It still doesn’t put Adobe on equal footing with Oracle, Salesforce, SAP or Microsoft, since they all have major CRM platforms, which Adobe does not. Adobe obviously has a leadership position in content creation, although I’ve never felt that does much good in selling customer management systems. (To be more precise, content creation COULD give Adobe an advantage if it very tightly coupled auto-personalized marketing treatments with content creation, but that doesn't seem to be happening.)

More important, Adobe also has an unmatched position in Web analytics, Web advertising, and Web content management. In fact, adding Neolane gives it a profile very similar to IBM, which also has strong Web and marketing automation products but not CRM (and which also shares Adobe’s digital-is-everything mono-vision).

Come to think of it, the contrast still comes down to the dueling strategies I described in 2011: Web-plus-marketing automation (Adobe and IBM) vs. CRM-plus-marketing automation (Oracle, Salesforce, SAP, Microsoft). Everything will eventually converge on Web-plus-CRM, with marketing automation baked in so deep you can't see it.  But that’s still some way off, except arguably for Oracle, which has all the pieces but hasn’t fully integrated them. In the meantime, we’ll see which approach is more popular – and what becomes of the stand-alone marketing automation vendors who are caught in between.

Thursday, June 06, 2013

Salesforce + ExactTarget vs. SAP + hybris: Two Paths to Customer Management

Fresh on the heels of Tuesday's blockbuster ExactTarget / Salesforce.com deal, SAP on Wednesday announced the acquisition of e-commerce vendor hybris software.  Since Salesforce said that other companies also wanted to buy ExactTarget, it seemed possible that SAP had lost the deal and purchased hybris as a second choice. After listening to the analyst conference call (available at (303) 590-3030 passcode 4623918), I still can't say.

The SAP and hybris managers unfairly implied during their call that ExactTarget does nothing but email (without mentioning Salesforce.com or ExactTarget by name).  But as Salesforce.com made clear in its own call yesterday, they were most attracted by ExactTarget's multi-channel marketing capabilities.  It's possible SAP wanted ExactTarget for the same reasons and would have described it differently had they been the winning bidder.

In any case, SAP did tell a good story: real-time interactions seamlessly presenting customers with consistent information, dialogues, and purchases across all channels, with a central role for the Web.  This is certainly the long term goal for most marketers, although few are close to delivering it.  As SAP pointed out, it's a customer-centric view of the world, quite different from the operational focus of traditional CRM.  SAP does have some unique assets to support this vision, including back-office systems with sales, inventory, costs, and other data needed to fully inform customer treatments, and the in-memory HANA database to make this data immediately available for real-time interactions.  I haven't done enough research to judge whether SAP can effectively combine these pieces, but they're making the right promises.

I still wouldn't be as dismissive of the Salesforce / ExactTarget combination as the SAP managers.  People integrate CRM with back-office systems all the time.  You can also build great customer experiences with little or no back office integration.  ExactTarget does have some Web personalization features (from its iGoDigital acquisition), although I don't know how well they're integrated with the rest of the system.  Similarly, it has claimed to support real-time interactions in its Interactive Marketing Hub, but I don't know how well that works.  What I do know is that Salesforce and ExactTarget have a reasonable idea of what's needed and the resources to build it.  How well and how quickly they execute remains to be seen -- but you can say the same for SAP.

Incidentally, the common thread for these acquisitions is that both vendors are moving into direct B2C marketing.  It's a big new market for each of them, and makes both much more interesting competitors to IBM, Oracle and Adobe.  Perhaps that's the most important news here.

It would be misleading to give the impression that SAP and Salesforce are equivalent.  The two deals highlight some very fundamental differences:

- SAP is a full enterprise system; Salesforce is about CRM. The SAP managers made this point most clearly when they said their pitch is aimed at the boardroom: they are selling to companies that want to build their entire infrastructure on SAP's system.  Salesforce is now, finally, adding serious marketing to its CRM system (although there are still some gaps such as media buying), but even so its vision is still limited to customer management, and it is selling at the level of sales, service, and marketing departments -- rarely in the boardroom.  Note that the original concept of CRM already encompassed those departments, so this is less an expansion than a filling of gaps.

- SAP is a suite; Salesforce is a platform.  Indeed, SAP is the ultimate suite: every enterprise function running on a single, tightly integrated system.  I've long argued that the fundamental rule of software marketing is that "suites win", meaning most companies will choose an integrated suite over multiple best-of-breed point solutions.  SAP's success is Exhibit A in my evidence for this, but you could argue it's actually so large that companies might be just as happy with several smaller suites instead (e.g., one for CRM and one for back-office).   This would still let them avoid doing most of the integration work, while not forcing them to commit totally to one vendor's system. 

Salesforce is also an integrated suite, although limited to CRM.  But it has also embraced (and I think invented) the idea of an open platform: a foundation system that can be supplemented by attaching other vendors' products.  This provides easy integration without limiting users to capabilities provided by the suite vendor.  The model has been tremendously successful for Salesforce, particularly at letting it offer advanced functions to its clients without having to pay for developing those functions.  ExactTarget has embraced a similar model, incidentally.

- SAP is largely on-premise software; Salesforce is Software as a Service (SaaS).  It's true that SAP now offers SaaS options, but it was built as on-premise software and its large enterprise clients still mostly run it that way.  hybris also offers both options but runs mostly on-premise (typical for Web content management).  Salesforce of course is the granddaddy of all SaaS companies.

- hybris runs Web sites; ExactTarget is still primarily about email.  The obvious point of this is that Salesforce still needs serious Web site management to provide comprehensive customer treatments.

But the difference goes deeper.  Web sites are inherently real-time systems, while email is inherently batch processing.  This was the essence of SAP's comments today, and while they may understate ExactTarget's abilities, there is a kernel of truth.  Web systems are engineered from the start for high-speed processing, and the e-commerce features of hybris also mean it was engineered from the start to interact with individual customers, not just serve generic Web pages.  Email systems were originally engineered for batch processing, not individual interactions.  Mobile and social messages, which ExactTarget also supports, can also be handled quite well in batch.  I don't know how far ExactTarget has evolved towards supporting real-time interactions, but its heritage lies elsewhere.

- hybris has 500 customers; ExactTarget has 6,000.  The revenue difference is much less: $100 million for hybris and nearly $400 million for ExactTarget.  What this reflects is that hybris' clients are mostly large enterprises, while ExactTarget has a broad mix of large and small companies.  Each is a good match for the core business of its new owner: SAP also focuses on large enterprises, while Salesforce sells to pretty much everyone. The broad reach of ExactTarget was certainly part of the reason that Salesforce wanted it, but Salesforce already has well over 100,000 clients, so the net increase isn't all that important.

What all this means, I think, is that SAP and Salesforce represent very different approaches to customer management: SAP proposes a single, tightly integrated, highly responsive real-time system where everything is connected and optimized.  Salesforce offers a looser set of connections with less control but more room for variety, change, and innovation.  SAP will sell more to the boardroom while Salesforce will sell to sales and marketing departments.  I frankly expect that both will succeed; it's a big market and each approach will appeal to different customers.  What I really hope is that both will show the market how to do integrated, cross-channel customer management: that way, everybody wins.

Circling back to the original question: I still don't know whether SAP tried to buy ExactTarget.  Based on what I wrote above, hybris was a better fit.  But the SAP managers spent so much time disparaging email in their call that I thought I smelled sour grapes. Or was it just competitive vitriol?



Monday, May 20, 2013

Silverpop Announces Universal Behaviors to Provide Better Cross Channel Customer Experience

At their annual Amplify conference last week, Silverpop unveiled the culmination of a two year project that conveniently matches the Customer Data Platform (CDP) concept I’ve been describing for the past month. While the timing is just coincidental, Silverpop’s Universal Behaviors provide more evidence that a new breed of system is emerging.

Silverpop’s new features load customer behaviors from all sources into a central database, match identities to create a unified customer view, and make the resulting information available for real-time, automated interactions across all channels. The central database and cross-channel treatments are two of the three capabilities I’ve defined for a Customer Data Platform. Silverpop falls short on the third CDP function, which is integrated predictive modeling.  But it has partners who fill that gap.

Many CDPs have been quietly maturing for several years.  Silverpop's two-year gestation cycle is a good example.  I can't say precisely why so many are emerging more or less simultaneously, but suspect a combination of business conditions and ever-more-urgent marketer needs.  The long-term drivers are clear: more marketing channels make customer attention harder to attract, spread behavior across different media, and require coordinated contacts across channels. As a result, marketers need a unified customer database, unified campaigns, and a way to deliver messages across whatever channels customers use now or in the future. This is what they get from a CDP.

It’s less surprising to see another CDP system than to see it coming from Silverpop.  After all, Silverpop’s twin heritages in B2C email and B2B marketing automation both use simple data models: flat lists for email and basic lead/contact/account tables for B2B marketing automation. Both types of systems traditionally merge customer data using only email address.  Neither builds a company's primary marketing database nor shares data with external systems. So it's quite unexpected to see Silverpop ingest data from any source, cross-reference any set of individual identifiers, offer access to the data, and send messages for delivery by other systems.

So how do Universal Behaviors work?  Each Behavior is first defined in Silverpop with a fixed set of attributes. Source systems then capture Behaviors and post them via an API to Silverpop.  They are stored in MongoDB, a “NoSQL” database that supports high input volumes and multiple record structures.  This is another departure for Silverpop, which uses the Oracle database in its core systems.

Behaviors include whatever customer identifiers the source system can provide: email address, cookie ID, phone number, account number, etc. Silverpop uses matches from external systems to link all identifiers associated with an individual: for example, a Web transaction might include cookie ID and email address, while an email could contain email address and account number.  Silverpop could later take a Behavior with any one of those identifiers and associate it with the same individual.  But there are limits to Silverpop's customer integration powers: it doesn’t do “fuzzy” matching to merge similar identifiers or import third-party reference databases that contain such links.  I'm beginning to see those capabilities as specialties rather than core CDP features, because they're best purchased from third-party vendors.
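This kind of exact-match identity linking is essentially a connected-components problem: any event that carries two identifiers together links them into the same profile. A rough sketch of the idea using a union-find structure (my illustration of the matching described above, not Silverpop's code):

```python
class IdentityGraph:
    """Link exact identifiers (email, cookie ID, account number...)
    into one profile whenever an event contains two or more of them
    together. No fuzzy matching: identifiers must match exactly."""

    def __init__(self):
        self.parent = {}

    def _find(self, x):
        """Union-find root lookup with path halving."""
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def observe(self, identifiers):
        """An incoming Behavior carrying several identifiers
        merges them all into one profile."""
        first = self._find(identifiers[0])
        for other in identifiers[1:]:
            self.parent[self._find(other)] = first

    def same_person(self, a, b):
        return self._find(a) == self._find(b)
```

So a Web transaction observed with (cookie ID, email) and a later email event with (email, account number) transitively tie the cookie to the account number, which is exactly the behavior the paragraph describes.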

The initial release of Universal Behaviors, set for July, will support predefined Behaviors from ArgyleSocial social listening, Webtrends Web site behaviors, Digby location-based marketing, Invodo video, and several as-yet unannounced vendors, as well as Silverpop’s own location-based and SMS offerings. It will later add more partners, provide a system development kit (SDK) for mobile apps, and eventually allow any company to build its own connectors.

Once Universal Behaviors are loaded into Silverpop, they become available within the system for queries, program triggers, rules within programs, dynamic content, personalization, scoring, and analysis – pretty much anything that could be done with standard Silverpop data.  Program outputs such as messages and lists can be pushed in real time to external systems to manage interactions.

Silverpop has also created native integrations with Adobe and Episerver Web content management systems.  These let those systems submit a visitor ID to Silverpop and receive Silverpop data to use in dynamic content and personalization. Connectors for other CMSs will be added as clients request them. Clients could also write their own integrations using a published Silverpop API or use tags to display Silverpop-generated content on any Web page. Currently, CMSs can access selected customer attributes but not the Universal Behavior database itself.  Silverpop plans to provide full data access in the future.

The mobile app SDK will go even further, allowing apps to execute Silverpop functions such as adding a customer to a program or sending an email.  This is in addition to the standard features of submitting Universal Behaviors, reading Silverpop data, and rendering Silverpop-generated content.

The critical point in all this is that Silverpop will integrate other customer-facing systems instead of only executing interactions itself.  The integration includes sending data to Silverpop, reading data within Silverpop, and receiving Silverpop marketing treatments.  In other words, the role of Silverpop shifts from delivering customer treatments to helping other systems find the best treatments to deliver. Of course, Silverpop still retains its original execution capabilities for email and some other channels.  But it’s perfectly conceivable that a client could hook the new Silverpop features to someone else's email delivery system (not that anyone at Silverpop mentioned the possibility).

This separation between a central data-and-decision platform and multiple independent execution systems is the core concept underlying the Customer Data Platform. I’m increasingly convinced it is the only way that marketers will be able to keep up with ever-expanding channels and customer expectations.  By developing a structure that fits the CDP model, Silverpop has responded to a pressing client need and established itself in an important new category.


Monday, February 04, 2013

The Marketing Funnel is Dead: Here's What Will Replace It

Okay, I freely admit that headlines like “the marketing funnel is dead” are a cheap trick to attract attention.
But I swear I came by this one honestly. Too tired to do any serious work on a recent plane flight, I scanned a random white paper that argued the traditional idea of a funnel didn’t capture the need to treat customers individually as they move towards a purchase. So far my head was nodding in agreement plus maybe a little drowsiness. But then came the punch line: instead of a funnel, marketers should think of managing each customer’s progress as a process, which is best represented – wait for it – as an escalator.

Now I was fully awake, and not in a good way. How is an escalator any less linear than a funnel? Have I missed some crazy new form of multi-path escalator networks? Maybe so: I don’t get out much, and who knows what these kids today are up to? But assuming that’s not the case – and I do after all read Twitter – the escalator analogy is no better than a funnel at illustrating today’s situation.

Nor, to be a bit more serious, is the concept of managing buyers as a process. It’s true that a process can have branches (although not the escalator-like linear process this paper described). But a process is still something the marketer controls. Whereas, the dominant fact of marketing today is precisely that marketers don’t have control: the buyer does. It’s the buyer who decides at every step what she’ll do next.

The picture that comes to my own mind is a tornado: a totally uncontrollable, unpredictable force that leaps across the landscape setting down wherever it wants. By this analogy, the best a marketer can do is to build storm-proof structures that will function successfully no matter what the buyer does. I don’t think that’s quite the right image – after all, buyers are not destructive – but it does convey the frightening powerlessness of marketers in today’s world.

The better analogy is probably a maze. Marketers can build an environment that defines the options available to buyers, even though the buyers still make their own decisions about the path they take. The maze also shows how buyers can follow different paths and still end up at the goal, can go in circles indefinitely, and can exit without reaching the goal. It also implies correctly that the marketer’s skill determines the quality and effectiveness of the buyer experience, just as the maze designer’s skill determines how much fun it is for visitors. If you really want to push the analogy, you can argue that we’re talking here about a corn maze – because ultimately customers can break out of the predefined paths if they want to.

I guess we’re all lucky that my flight didn’t last much longer, since I was beginning to think about how corn needs water (funding?) and if there’s a drought the ears of corn will have small kernels (customer value?). A more useful insight is that mazes have different regions, so the maze analogy replaces the notion of sequential lead stages with a more nuanced view of non-linear buyer states that can be the same distance to the goal yet differ in other significant ways. Buyers can also jump from state to state without necessarily moving through regions that are adjacent – like a tornado touching several spots within a maze, come to think of it. But enough with the metaphors.

Let’s just stick with the main point: the marketing funnel is really and sincerely dead. The purchase process is no longer linear and, even if it were, marketers couldn’t control how buyers move through it. The image of a maze may not be perfect but it does show that buyers can follow many routes, that they’ll make their own choices, and that marketers still play an important role by defining the buyers’ environment. At least it’s a start.

Of one thing I’m certain: the buying process is not an escalator.

Thursday, December 13, 2012

Sitecore Migrates from Web Content Management to Cross-Channel Customer Engagement

It’s more than three years since my original post about Sitecore’s plans to transform itself from a Web content management system to a platform for cross-channel customer experience management. It seems to be working out – since that time, revenue has grown 40% per year and the number of clients has nearly doubled from 1,600 to 3,000. The company has attracted additional funding, added support for producing dynamic print content, launched an “App Center” for pre-integrated third party products, built a cloud deployment on the Microsoft Azure platform, and extended to new channels via partnerships covering community management (Telligent), online video publishing (Brightcove), and social marketing campaigns (Komfo).

In other words, Sitecore has been steadily executing on the strategy they described in 2009. If there has been a change, it’s recognition that the company can’t build everything itself: hence the partnerships and app center, which use other developers’ products to extend Sitecore to other channels. Sitecore says this makes sense because giving customers a consistent experience across all channels requires only central data and central content management. Letting external systems deliver the actual interactions causes no particular harm.


This vision should sound familiar: it’s the idea behind the real time decision management systems I’ve been reviewing recently. The difference is architecture: the decision managers place decision logic at the center and see customer data and content as peripheral.

More specifically, the decision managers assemble a consolidated customer profile by pulling data on demand from external systems rather than storing it internally. This is actually a pretty minor difference, since in practice most companies will both maintain a central customer database with key profile information and make direct connections to other systems for details such as transactions. The Sitecore model is pretty much the same: it creates its own customer database and supports real time connections to other systems via Web services or database queries.

The approach to content is a more significant distinction.  Most decision managers assume content is best stored with the touchpoint systems, while Sitecore wants to store most content itself. I chalk this up to their heritage as a content management vendor. Both approaches have their merits: central storage makes coordination easier but requires continued extension of system features to handle new formats; touchpoint storage makes it harder to know what content is actually available and appropriate. So far, Sitecore has been making the investments to manage new formats centrally, at least to the extent of making the contents visible to functions like message selection, access control, and approval workflows. It doesn’t necessarily extend to actually creating or modifying the contents themselves. Maybe that’s a good compromise.

The other important difference is decision rules. These are obviously the main focus of decision management products. Sitecore doesn’t talk about them much, although it does deliver them in the form of campaign flows and dynamic content rules. The campaigns can manage activities across multiple channels – such as sending an email response to a Web visit – although they are not as sophisticated as the multi-branch, looping logic, advanced decision arbitration, and integrated predictive modeling available in the best decision management systems.


On the other hand, Sitecore makes very powerful use of its close control over content.  Each item can be assigned scores on several attributes, such as how much it relates to technology, product, or industry topics.  The system then tracks the content consumed by each individual and compares their behavior to “profile cards” of personas such as frequent site visitors with high interest in business. Each person is assigned to the profile card they match most closely; people can be switched to a different card if their behaviors change. This nearest-fit approach is more flexible than the rigid inclusion criteria of traditional segments or lists. Cards are used in decision rules to select contents and treatments for each person.
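Nearest-fit matching of this kind is straightforward to sketch: treat the visitor's accumulated content scores and each profile card as vectors of attribute scores, and pick the card at the smallest distance. A minimal Python illustration (attribute names and the use of Euclidean distance are my assumptions, not Sitecore's documented method):

```python
import math

def nearest_card(behavior, cards):
    """Assign a visitor to the profile card whose attribute scores
    are closest to the visitor's accumulated scores. 'behavior' and
    each card are dicts of attribute -> score."""
    def dist(a, b):
        keys = set(a) | set(b)  # missing attributes count as zero
        return math.sqrt(sum((a.get(k, 0) - b.get(k, 0)) ** 2 for k in keys))
    return min(cards, key=lambda name: dist(behavior, cards[name]))
```

Re-running this as scores accumulate is what lets people switch cards when their behavior changes, and it avoids the hard cutoffs of rule-based segments.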

Although I just compared Sitecore with decision management systems, its more immediate competitors are marketing automation vendors.  Like Sitecore, they aim to be a company’s core marketing platform. Sitecore’s campaign flow, email and decisioning features are roughly comparable to the same features in mid-tier marketing automation products, while its Web site management and content creation are generally stronger. Marketing automation systems still probably have advantages in analytics and other areas, although it’s hard to generalize. Sitecore does offer the key B2B marketing automation capability to synchronize with CRM products including Salesforce.com and Microsoft Dynamics CRM.

One clear difference is that most marketing automation systems today are software-as-a-service products, while Sitecore is sold as licensed software, running either on-premise or on the Microsoft Azure cloud. Pricing starts around $125,000 for an enterprise deployment.  Smaller companies would pay less, but Sitecore will never be a system you can get for $1,000 per month.

Wednesday, June 22, 2011

Dueling Strategies: Adobe and Oracle Take Opposite Paths to Customer Experience Management

Adobe on Monday announced a new “Digital Enterprise Platform for Customer Experience Management”. The platform fills the center of Adobe’s three-part corporate mission to “make, manage, and measure” digital content and experiences. The other two pieces were already in place: “make” is Adobe’s original content creation business, while “measure” is Omniture Web analytics.

The strategic significance of the announcement seems more important than the actual product enhancements. These include improved integration of the company’s Web content management system (formerly Day CQ5) with Scene 7 dynamic content and Omniture Survey and Test & Target; features for salespeople and customer service agents to customize standard documents in a controlled fashion; integrated content reviews and workflows; and a platform to build and share content in multiple formats. The announcement also included beta versions of tools for social engagement, online enrollment, and agent workspaces. Good stuff but nothing earth-shaking.

Adobe's strategy itself is a curious mixture of broad ambition and narrow execution. Adobe describes its scope as nothing less than optimizing customer experience and marketing spend across the entire customer journey, from first learning about a company through validation, purchase decision, product use, and commitment. But Adobe also explicitly limits its scope to digital channels, and implicitly limits its concern to content creation, delivery, and evaluation. In fact, the only customer-facing technology Adobe offers is Web site management. Otherwise, Adobe expects even digital content such as emails to be delivered by third party products. Offline interactions, such as telephone and retail, are definitely out of the picture. Nor does Adobe manage the underlying customer database, marketing campaigns, or deep analytics. The only exceptions are customer profiles, segmentation, and content to support Web personalization.

The company argues the tools it does provide, combined with the cross-channel content sharing, are enough to build a unified digital customer experience. I’m not so sure that’s correct, and even if it is, I question whether customers will be happy to have only their digital experiences be unified. Either way, marketers will certainly need other vendors' products to manage their full customer relationships.

On the other hand, I do agree with Adobe’s argument that its approach lets clients create a unified digital experience without replacing their entire enterprise infrastructure. This is certainly an advantage.

Adobe’s announcement was released on Monday, but I didn’t get around to writing about it until today. The delay is unfortunate, since the attention of the enterprise marketing automation world has already shifted to yesterday’s announcement that Oracle is acquiring Web “experience” management vendor FatWire Software. I’m not sure I accept “Web experience management” as a legitimate software category, but FatWire does combine conventional Web content management with unusually strong targeting, personalization, content analytics, digital asset management, mobile, and social features. Perhaps that justifies calling it more than plain old Web content management.

The strategic purpose of the FatWire acquisition is self-evident: to fill a gap in Oracle’s customer-facing technologies, which already had ATG ecommerce and general Enterprise Content Management for Web sites, as well as Oracle CRM and Oracle Loyalty. (Oracle isn’t very creative with product names.) FatWire will allow much richer, more personalized and targeted Web site interactions. It also provides some Web analytics, although I still think Oracle has a gap to fill there.

The Oracle and Adobe announcements do highlight a clear strategic contrast. Adobe has largely limited itself to digital interactions, and has largely avoided customer-facing systems except for Web sites. Oracle has embraced the full range of online and offline interactions, including customer-facing systems in every channel. Oracle has also hedged its bets a bit with Real Time Decisions, which can coordinate customer treatments delivered by non-Oracle systems and powered by non-Oracle data sources. Of the other enterprise-level marketing automation vendors, IBM, SAS and Teradata share Adobe's focus on digital channels and its avoidance of customer-facing systems, although they resemble Oracle in offering deep analytics and customer database management.

Based on my fundamental rule that “suites win”, I think Oracle’s strategy is more likely to succeed. But only time will tell.

Wednesday, May 26, 2010

Customer Worthy (The Book) Offers Methodology for Customer Experience Management

My friend and former business partner Michael Hoffman of ClientXClient recently sent a copy of his new book Customer Worthy, which explores use of his customer experience management tool, the CxC Matrix. I’ve long been a big fan of the Matrix*, which visualizes all the ways a customer can interact with a business. The new book provides a detailed explanation of Matrix concepts and applications.

The core concept is to “Think Like a Customer” (a favorite Hoffman catch phrase), meaning to understand each contact from the customer’s point of view. The book explains how to use the Matrix to document contacts throughout the customer life cycle, allowing companies to systematically visualize, analyze, monetize, prioritize and ultimately optimize each interaction. It shows how to extend the Matrix to the departmental and system view of each contact, giving companies a roadmap of the steps they must take to execute on Matrix concepts.

Other sections address privacy concerns and highlight the cost of poor service. A final section explains how each department throughout the company can use the Matrix to organize its internal work and coordinate with the rest of the organization.

Customer Worthy provides a good mix of inspiration, theory and practical examples. I’m pleased he’s taken the time to work through Matrix concepts at length, since it’s a rich topic that repays detailed examination. Even if you don’t deploy the Matrix in the forms that Hoffman describes, it’s worth reading to reinforce the broader points that (a) the customer comes first and (b) there are systematic ways to make that thought a reality.


_________________
*This blog, which started when Hoffman and I were partners, was named for it.

Monday, March 29, 2010

thinkAnalytics Helps Marketers Optimize Customer Treatments

Summary: thinkAnalytics provides a robust decision engine to help make optimal recommendations across channels. Too bad more people don't use it.

As I mentioned in my post on PegaSystems’ acquisition of Chordiant, I’ve been planning for months to write about the thinkAnalytics recommendation system. The delay had nothing to do with any reservations about the product, which I find extremely impressive. Rather, I’ve been giving the topic low priority because the market for such systems seems to be moving slowly despite the clear benefits they provide.

The history of thinkAnalytics itself illustrates my point nicely. The company was founded in 1996 to offer K.wiz data mining software and had reached pretty much its current form by the early 2000s. Indeed, the briefing slides the company showed me in mid-2009 were nearly identical to its slides from 2007. The company also reported about twenty installations in both sessions. This isn’t to say that the product itself has not evolved: it’s now up to version 8.0 and release notes on the company Web site show a steady stream of enhancements. But the fundamental approach has not changed.

This approach uses an “Intelligent Enterprise Server” to connect company touchpoints and data sources to thinkAnalytics’ data mining, recommendations and business rules engines. That is, thinkAnalytics sits outside of the individual touchpoint systems, allowing it to deliver consistent recommendations across all channels. These recommendations are in turn based on information from all data sources, not only those captured within a particular touchpoint system.
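To make the architecture concrete, here is a minimal sketch of a decision engine that sits outside the touchpoint systems: every channel reports events to, and requests recommendations from, the same shared engine. All class and offer names are my own illustrations, not thinkAnalytics’ actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    customer_id: str
    events: list = field(default_factory=list)  # events from every channel

class DecisionEngine:
    """Central engine shared by all touchpoints (web, call center, email)."""

    def __init__(self):
        self.profiles = {}  # cross-channel profile store

    def record(self, customer_id, channel, event):
        profile = self.profiles.setdefault(customer_id, Profile(customer_id))
        profile.events.append((channel, event))

    def recommend(self, customer_id):
        # Stand-in for the predictive models: the recommendation reflects
        # behavior from *all* channels, not just the one that is asking.
        profile = self.profiles.get(customer_id)
        if profile is None:
            return "default_offer"
        channels = {ch for ch, _ in profile.events}
        return "retention_offer" if "call_center" in channels else "upsell_offer"

engine = DecisionEngine()
engine.record("c1", "web", "viewed_pricing")
engine.record("c1", "call_center", "complaint")
# The web channel's recommendation is informed by call-center history:
print(engine.recommend("c1"))  # retention_offer
```

The point of the design is that consistency comes for free: because every channel queries the same profile store, no touchpoint can contradict what another has learned.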

The advantages of consistent treatment and access to all company data are self-evident. Of course, they do require identifying individuals across channels, so that, say, behavior during Web visits is linked to behavior at a call center. thinkAnalytics doesn’t directly solve this problem, but can make use of whatever linkages the company has built elsewhere. Its most common applications, churn reduction for telecommunications companies and content recommendations for video-on-demand services, are in situations where customers explicitly identify themselves, so this is not an issue.

The technical hub of thinkAnalytics is the enterprise server, which needs to handle traffic among touchpoints, data sources, and the analytical components. The main issues with such servers are flexibility and scalability. thinkAnalytics addresses these by deploying a component-based architecture that lets it connect with virtually any external systems and can easily be distributed across platforms and servers to scale as necessary. The company says existing installations have scaled to thousands of decisions per second. Its client list is weighted towards very large firms – Vodafone, Virgin Media, Sky, Orange, Lloyds TSB, and Alcatel-Lucent among them – who require this sort of volume.

But while the server may be the technical hub of the system, its heart is the analytic components: data mining, recommendations and rules engines. Data mining includes a wide variety of predictive modeling and data visualization capabilities, some fully automated, which feed into the recommendations themselves. The system can also import external predictive models from vendors such as SAS and SPSS. The system includes several specialized capabilities related to video content selection, including automated text analysis to create metadata and classify new content; capture of user preference ratings; handling of social recommendations; maintenance of personal profiles; and user-initiated search. The component-based architecture makes it relatively easy for thinkAnalytics to add specialized features in general, so the system could be adapted to other applications fairly easily.

The rules engine complements the recommendation rankings by letting managers apply constraints such as limiting the number of recommendations within any particular category. However, the system doesn’t provide sophisticated optimization tools, so it’s still up to marketers to manually discover the most effective rule sets.
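A constraint of the kind described, capping the number of recommendations per category, can be sketched as a simple post-processing pass over the model’s ranked list. The data and the cap are illustrative, not thinkAnalytics’ actual interface.

```python
def apply_category_cap(ranked_items, max_per_category=2):
    """Keep items in model-score order, but allow at most
    max_per_category recommendations from any one category."""
    counts, result = {}, []
    for item, category in ranked_items:  # already sorted by model score
        if counts.get(category, 0) < max_per_category:
            result.append(item)
            counts[category] = counts.get(category, 0) + 1
    return result

ranked = [("movie_a", "action"), ("movie_b", "action"),
          ("movie_c", "action"), ("movie_d", "comedy")]
print(apply_category_cap(ranked))  # ['movie_a', 'movie_b', 'movie_d']
```

Note that the constraint never re-scores anything; it only filters, which is exactly why discovering the *best* rule set is left to the marketer.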

Although the multi-channel capability of thinkAnalytics is highly impressive, the vendor says that most clients start using it in a single channel and add others a year or two later. This suggests that clients are primarily interested in the quality of the recommendations, and just secondarily in the cross-channel treatment coordination. thinkAnalytics reports that its telecommunications clients have seen churn rates drop from 20% to 12%, while video-on-demand clients have increased sales between 30% and 55%.

Pricing for thinkAnalytics real-time components depends on the nature of the application. Factors can include the channels and applications, number of data mining users, and customer volume. A minimum installation for the recommendation engine starts around $250,000. The system is licensed for on-premise operation by the client.

The four components of thinkAnalytics (predictive modeling, recommendations, rules and a server to connect with the outside world) make it the very model of what is sometimes called a “decision engine”. As I noted in the Chordiant post mentioned earlier, most companies use the decisioning capabilities built into their touchpoint systems rather than buying a stand-alone product. But it’s still worth keeping the model in mind when assessing whether your touchpoint systems’ capabilities are truly adequate.

Thursday, July 16, 2009

Alterian Pushes Into Social Media Management with Techrigy Acquisition

Summary: Alterian's purchase of Techrigy marks the first integration of serious social media management with marketing automation. Others are sure to follow.

Marketing automation vendor Alterian yesterday announced its acquisition of social media monitoring company Techrigy. Even though the Techrigy deal is the first direct acquisition I recall of a social media monitoring system by a marketing automation vendor, it strikes me as an obvious step. Marketers have been scratching their collective heads for years over how to integrate social media, and marketing automation vendors are very aware of their needs.

I’m not even totally surprised that Alterian was the first to jump into this pool. Although other marketing automation vendors like Unica and SAS are generally more expansive, Alterian has been particularly aggressive about integrated customer management. Previous acquisitions include Web content management (MediaSurface, 2008), contact optimisation (Campaign Calculus 2.0, 2007), email (Dynamics Direct, 2006), marketing resource management (Nvigorate, 2006), and hosted analytics services (MarkIT, 2005).

In fact, according to Alterian’s very interesting FAQ about the Techrigy acquisition, “Engagement marketing” is the core of their current corporate vision. Although I’m generally allergic to sweeping vision statements, I think Alterian has earned the right to use that one. Sparingly.

What really impresses me is that Techrigy is a serious social media monitoring solution. This isn’t about making it easy to react to comments on Twitter, add friends on Facebook, or research prospects on LinkedIn, which is how most marketing automation vendors are approaching social media. Instead (or in addition) Techrigy supports sophisticated searches, categorization, sentiment analysis, influence measurement, author tracking, and case management.

This set of features means that Techrigy is really built more for corporate PR departments and marketing agencies than one-on-one customer management. But that makes the acquisition still more intriguing. I expect Alterian to extend the product to monitor and manage individual relationships, thereby integrating social media with other aspects of customer management.

This would be a major step beyond using aggregate social media data as a way to measure marketing performance – although even such measurement would itself be a great leap forward for most companies today.

It’s by no means certain that Techrigy can actually scale up to manage this many individual relationships. Current users probably track just a small number of individuals and cases, such as key bloggers and specific complaints that must be resolved. Ramping from that to tracking millions of individuals is likely to uncover serious bottlenecks. But even if the system can’t do this today, Alterian should eventually be able to rework it to overcome any obstacles. This sort of processing is a good fit for Alterian’s columnar database engine, which is the core of its business.

I spent some time playing with Techrigy yesterday, using the free version available on their Web site. This allows only 1,000 search results, which is far too few for any real business purpose. But it did give a good flavor for the system.

On the whole, I liked Techrigy very much. Per my previous comment, I was particularly impressed with the scope of the functions.

These start with searching for articles to analyze. The search interface allows advanced logic, complete with Boolean statements, local and global exclusions, and rules to assign the articles to categories for later analysis.
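The search logic described above can be sketched as follows: Boolean matching with exclusions, plus rules that assign matching articles to categories for later analysis. The function names, rule format, and sample article are my own illustrations, not Techrigy’s actual query syntax.

```python
def matches(text, all_of=(), any_of=(), none_of=()):
    """Boolean search: require every all_of term, at least one
    any_of term (if given), and no none_of (exclusion) terms."""
    t = text.lower()
    return (all(w in t for w in all_of)
            and (not any_of or any(w in t for w in any_of))
            and not any(w in t for w in none_of))

def categorize(article, rules):
    """rules: {category_name: keyword list}; an article may land
    in several categories, or none."""
    t = article.lower()
    return [cat for cat, words in rules.items()
            if any(w in t for w in words)]

article = "Alterian acquires Techrigy for social media monitoring"
print(matches(article, all_of=["alterian"], none_of=["rumor"]))   # True
print(categorize(article, {"M&A": ["acquires"],
                           "Analytics": ["dashboard"]}))          # ['M&A']
```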

The searches run against articles assembled by Techrigy in a database that stretches back for two years and includes 1.5 billion entries. Sources include blogs, social networks (publicly-accessible sections of Facebook, MySpace, etc.), microblogs (Twitter, Friendfeed, etc.), message boards/forums (such as LinkedIn discussions), wikis (such as Wikipedia), video and photo sharing sites (Flickr, YouTube), and some mainstream media blogs (The New York Times, Wall Street Journal). Querying this database is where the Alterian database engine should shine – yesterday, running even my simple searches took longer than I’d like.

Users can also add their own feeds to search. This could not only capture specialized sources that are too small for Techrigy to monitor, but might include private sources such as a company’s user forums. This opens up a range of important applications beyond public social media tracking.

Searches can run on command, continuously, or on a regular schedule. Results can be streamed to an external viewer as an RSS feed or presented in standard reports. The most basic report shows daily volumes with trends over time. Results can be categorized and filtered based on author popularity (a 0-10 score based on audience), author demographics (age and gender, where known), domains, sources, and keywords. Additional reports show word clouds with themes, which can be derived from keywords or more advanced semantic analysis. There's even a Google Maps mash-up to show author locations. The semantic engine can also tag posts with positive or negative brand sentiment, content tone and emotions.
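The filtering described above can be pictured as rows carrying a 0-10 author popularity score and a sentiment tag, filterable on either. The records and thresholds below are invented for illustration; they are not Techrigy’s data model.

```python
results = [
    {"author": "blogger_a", "popularity": 8, "sentiment": "negative"},
    {"author": "user_b",    "popularity": 2, "sentiment": "positive"},
    {"author": "blogger_c", "popularity": 9, "sentiment": "positive"},
]

def filter_results(rows, min_popularity=0, sentiment=None):
    """Keep rows at or above the popularity floor, optionally
    restricted to a single sentiment tag."""
    return [r for r in rows
            if r["popularity"] >= min_popularity
            and (sentiment is None or r["sentiment"] == sentiment)]

# e.g. surface high-influence negative mentions for review:
print([r["author"] for r in filter_results(results, 5, "negative")])
# ['blogger_a']
```

This is the practical payoff of the scoring: a brand team can ignore the long tail and route only influential negative mentions into the workflow queue.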

Users can dig into these reports to view the underlying articles. The system starts with a list of article summaries, similar to a set of Google search results. Users can then select an article and drill into its details, including extracted Web site information and traffic rank, content analysis showing sources of the system-applied tags, the full article itself, and links to Alexa, Technorati, Compete and Quantcast information about the article source.

Users can also delete the entry, mark it as spam, adjust the system-assigned tags, and edit information about the author. This author tracking is what could ultimately be expanded into tracking of individual customers.

Finally, users can assign the article to a user for review or action. This engages the workflow system, which can notify the assigned user and keep track of the article’s status, notes and priority. Here the system moves from social media monitoring into actual relationship management.

Techrigy’s user interface is generally okay, although I sometimes had a hard time finding functions such as how to rerun a report. This would presumably go away after a bit of experience. Response time was a little slower than I’d like for tasks such as applying a filter or presenting an article list. However, this might not be typical: the Alterian acquisition has attracted a lot of attention and generated more than 400 new trial users (per a Twitter post). In any case, this is where the Alterian engine should help.

As for the semantic engine itself – I was underwhelmed by the accuracy of the results. I especially enjoyed the Twitter post “Twitter for B2B Marketing - Marketo: Sin Descripción http://tinyurl.com/mfuh3n” being tagged as negative, religious, and – wait for it -- written in Danish.

A more serious problem is articles like “Ten Mistakes Marketers Make” being tagged as a strongly negative brand reference. But this is probably an issue with most semantic engines (I had a similar problem with ScoutLabs). Presumably Techrigy’s accuracy will improve over time. But even if it doesn’t, users will adjust to what it can do. In practice, you’d expect a company to review all entries tagged as negative, which would then be reclassified or dealt with as appropriate.

Current pricing for Techrigy starts at $600 per month for 20,000 stored search results. Of course, pricing could change under the new ownership.

In sum, Techrigy is an interesting product on its own, but the real story here is the potential for merging social media with other marketing systems. Although this was clearly inevitable, it’s exciting to see Alterian start to make it happen. We can expect other marketing automation vendors to follow quite quickly.