
Wednesday, March 18, 2020

Balandra Orchestrates Customer Journey Without a CDP

Balandra Customer Flow Diagram
The need for a system that assembles unified, sharable customer profiles is now widely accepted. So is the label of “Customer Data Platform” to describe such systems. What people do still debate is whether a Customer Data Platform should only assemble those profiles or should also include features to “activate” them in the sense of selecting customer treatments. I personally find the discussion uninteresting since the plain reality is that some companies want activation features in their CDP and others do not.  Companies that don't want activation in their CDP may already have a separate activation system or prefer to purchase a separate one.  This means that activation is optional, and, thus, not a core CDP feature. QED.

Theory aside, it’s true that the majority of CDPs do include activation features.  This makes a stronger argument for the weaker claim that most buyers want activation features in their CDP.  But this has nothing to do with CDPs in particular: it’s just an instance of the general rule that buyers prefer integrated systems to separate components. This is known (to me, at least) as Raab's Law, stated most succinctly as "suites win".

A diehard advocate of “CDPs need activation” might question whether activation systems can truly be purchased separately. My response points to Journey Orchestration Engines (JOEs), a small but intriguing category that includes Thunderhead, Pointillist, and Kitewheel among others. These products select the best treatment for each customer in each situation and transmit their choice to delivery systems (email, Web CMS, mobile app, call center, etc.) for execution. All need customer profiles to function, but they don’t necessarily meet the RealCDP requirements for accepting data from all sources, retaining all details, storing the data internally, or sharing their profiles with others. This is because their designers’ focus is on the very different challenge of making it easy for users to define, manage, and optimize customer treatments across channels.

Meeting that challenge requires presenting customer data effectively, identifying events that might require an action, selecting the right action in the current situation, and sending that action to external systems for delivery. Some tasks, such as data presentation and delivery system integration, are also found in other types of systems. The unique challenge for Journey Orchestration Engines is finding the right action while taking into account the customer’s complete situation (not just the current interaction). This requires understanding all the factors that are relevant in the current situation and choosing the best among all possible actions.

Of course, "all" is an impossibly high standard.  A more realistic goal is to understand as many factors as possible and choose among the broadest range of available actions. It’s an important distinction because the scope of available data and actions will grow over time.  This means the key capability to look for is whether a system has the flexibility to accommodate new data and actions as these become available.

This brings us to Balandra, a Madrid-based journey orchestration engine.

Balandra is designed for complex service industries such as insurance, telecommunications, and healthcare, where companies have multiple, complex operational systems. Left to run independently, these systems will each send their own messages, creating a disconnected and often inappropriate experience for each customer. Balandra intercepts these messages and replaces them with a single stream governed by a common set of rules.

The rules themselves draw on a structure that organizes customer experience into major processes such as onboarding a new client, setting up a new service, or filing an insurance claim. Each process is assigned a combination of data, lifecycle stages, available actions, and decision rules. When an event occurs that involves the process, Balandra executes its rules to pick an action based on the customer’s data and lifecycle stage.

This may not sound especially exciting. But it’s important to contrast Balandra’s approach with conventional customer journey flows.  These follow a specified sequence of messages and events, at best with some branching to accommodate different customer behaviors as the journey progresses. But a conventional journey flow can only include a fairly low number of steps and branches before it becomes incomprehensibly complex. The rule-based approach avoids this problem by letting users  create different rules for different factors and apply them in sequence. So, you might have one rule that checks for recent customer service issues, another that checks for customer value, and another for previous purchases. Each rule would add or exclude particular messages from consideration. After all the rules had executed, a final rule would select from the pool of messages that remain available.

The advantage of this approach is that each rule executes independently, avoiding the need for a complex decision tree that specifies different treatments for different combinations of conditions. Rules can just be added or dropped into the mix knowing that they’ll apply themselves only when relevant conditions are met. For example, a rule might check for recent customer service problems and suppress new product offers within the following two weeks if one occurred. This happens (or doesn’t happen) across all interactions without explicitly building that check into each journey flow.
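To make the pattern concrete, here's a minimal Python sketch of the rule-pool idea (an illustration of the general approach, not Balandra's actual implementation; all names, fields, and data are hypothetical):

```python
from datetime import date, timedelta

# Hypothetical customer state; field names are illustrative only.
customer = {
    "last_service_issue": date.today() - timedelta(days=5),
    "value_tier": "gold",
}

# Candidate messages, each with a type and an assumed priority score.
candidates = [
    {"id": "new_product_offer", "type": "offer", "priority": 3},
    {"id": "loyalty_reward", "type": "retention", "priority": 2},
    {"id": "service_followup", "type": "service", "priority": 1},
]

def suppress_offers_after_service_issue(customer, pool):
    """Drop new product offers within two weeks of a service problem."""
    issue = customer.get("last_service_issue")
    if issue and (date.today() - issue) <= timedelta(days=14):
        return [m for m in pool if m["type"] != "offer"]
    return pool

def boost_retention_for_high_value(customer, pool):
    """Raise the priority of retention messages for high-value customers."""
    if customer.get("value_tier") == "gold":
        for m in pool:
            if m["type"] == "retention":
                m["priority"] += 5
    return pool

RULES = [suppress_offers_after_service_issue, boost_retention_for_high_value]

def select_message(customer, pool):
    for rule in RULES:  # each rule executes independently, in sequence
        pool = rule(customer, pool)
    # final rule: pick the highest-priority message that survived
    return max(pool, key=lambda m: m["priority"]) if pool else None

print(select_message(customer, candidates)["id"])  # loyalty_reward
```

Notice that neither rule knows about the other: dropping or adding a rule changes behavior across all interactions without touching any journey flow, which is exactly the point.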

To be clear, Balandra isn’t the only system to take this approach. In fact, its actual rule definition and execution is done using a standard business rules engine – IBM’s Operational Decision Manager (ODM), formerly ILOG. The system does have an interface that lets non-technical users define the data associated with each process and specify connections with delivery systems. It can ingest data in real time via APIs, through event streams such as Kafka, or through batch file updates. It can support both real time interactions and batch processing for outbound campaigns.

If you’re keeping score, Balandra doesn’t qualify as a CDP because it only uploads a fraction of the data related to a customer – the interactions between customer and company systems. While this means Balandra clients might still want a separate CDP system, it also enables Balandra to use many fewer resources than a CDP would.

Balandra launched its product in 2014. It currently has four clients in production, all in Spain, and is looking for distribution partners in other regions. Pricing starts around $50,000 per year and grows based on the number of customers.

Wednesday, December 11, 2019

Acquia Buys AgilOne CDP

Acquia, which is moving past its roots in Web content management to become a multi-channel “digital experience platform” (DXP), took a big step in that direction today with a deal to buy the AgilOne Customer Data Platform. The deal follows Acquia’s May 2019 purchase of open source marketing automation platform Mautic and September 2019 purchase of site building tool Cohesion. Acquia itself was purchased in September by Vista Equity Partners for $1 billion, which obviously supports its DXP strategy.

The logic behind this deal is so clear that there’s little need for comment. While the exact meaning of DXP is a bit fuzzy, it surely involves coordinating and personalizing customer experiences across channels. This certainly requires the unified customer data that a CDP provides.  Acquia’s heritage in Web content management doesn’t provide deep customer data unification experience, and neither does Mautic.  AgilOne is a particularly good fit because it’s better than many CDPs at identity matching, including offline as well as online data. It also provides lots of connectors to source and delivery systems, as well as advanced machine learning for segmentation and predictions. Acquia and Mautic lacked those, too.

AgilOne rebuilt its core technology fairly recently, giving it a highly flexible and scalable platform that should easily extend beyond the company’s current base in mid-tier retail. In particular, it will be able to serve Acquia’s clients, who tend to be very large companies with multiple Web sites around the world. At the same time, AgilOne gives Acquia a stronger story in retail and other B2C markets where it has been less active. AgilOne will also gain by integrating with some of Mautic’s features, notably email and SMS delivery and complex customer journey management. And the deal gives AgilOne much deeper resources to fund growth than it had as an independent company.

What, if anything, does the deal tell us about the larger CDP industry? I’d argue it mostly reinforces the trends I described in October, of independent CDPs being purchased by companies that are not primarily marketing software vendors but need to add customer data capabilities. Nearly all major CDP purchases to date meet this description: Mastercard buying SessionM, Dun & Bradstreet buying Lattice Engines, Arm buying Treasure Data, and Informatica buying Allsight. The only partial exception is Salesforce buying Datorama, but CDP wasn’t the focus of that deal. None of the other companies trying to follow Acquia’s approach of expanding from Web content management to DXP has yet purchased a CDP. But they’ll all need CDP functions so don’t be surprised to see more deals along those lines.

Put in a broader context, adding a CDP as a module inside a DXP is an example of CDP as a component within large marketing or even operational systems, something I refer to as “CDP Inside”. I expect that to be increasingly common and, thus, a potential home for independent CDP systems as the market matures, competition heats up, and the big marketing cloud vendors release their own products.  Selling or merging to become part of a larger system is one escape path for the independent CDPs. Another path is to focus on specific industries or cost-sensitive segments where the big marketing clouds are at a disadvantage.  I expect to see current CDP vendors take both approaches, even as new entrants continue to appear.  The CDP market won't get any simpler but buyers should have increasingly clear choices, so buying a CDP may become a bit less complicated.

Monday, September 24, 2018

Adobe, Microsoft and SAP Announce Open Data Initiative: It's CDP Turf But No Immediate Threat

One of the more jarring aspects of Adobe’s briefing last week about its Marketo acquisition* was a series of statements suggesting that Marketo and Adobe’s other products were going to access shared customer data. This would be the Experience Cloud Profile announced in March and based on an open source data model developed jointly with Microsoft and stored on Microsoft Azure.** When I tried to reconcile Adobe’s statements with reality, the best I could come up with was that Adobe systems and Marketo would push their data into the Experience Cloud Profiles and then synchronize whatever bits they found useful with each application’s data store. That’s not the same as replacing the separate data stores with direct access to the shared Azure files, but it is sharing of a sort. Whether even that level of integration is available today is unclear, but if we required every software vendor to describe only capabilities that are actually in place, the silence would be deafening.

The reason the shared Microsoft project was on Adobe managers’ minds became clear today when Adobe, Microsoft and SAP announced an “Open Data Initiative” that seemed pretty much the same news as before – open source data models (for customers and other objects) feeding a system hosted on Azure. The only thing that really seemed new was SAP’s involvement. And, as became clear during analyst questions after the announcement at Microsoft’s Ignite conference, this is all in very early stages of planning.

I’ll admit to some pleasure that these firms have finally admitted the need for unified customer data, a topic close to my heart. Their approach – creating a persistent, standardized repository – is very much the one I’ve been advocating under the Customer Data Platform label. I’ll also admit to some initial fear that a solution from these vendors will reduce the need for stand-alone CDP systems. After all, stand-alone CDP vendors exist because enterprise software companies including Microsoft, Adobe and SAP have left a major need unfilled.

But in reviewing the published materials and listening to the vendors, it’s clear that their project is in very early stages. What they said on the analyst call is that engineering teams have just started to work on reconciling their separate data models – which is the heart of the matter. They didn’t put a time frame on the task, but I suspect we’re talking more than a year to get anything even remotely complete. Nor, although the vendors indicated this is a high strategic priority, would I be surprised if they eventually fail to produce something workable. That could mean they produce something, but it’s so complicated and exception-riddled that it doesn’t meet the fundamental goal of creating truly standardized data.

Why I think this could happen is that enterprise-level customer data is very complicated.  Each of these vendors has multiple systems with data models that are highly tuned to specific purposes and are still typically customized or supplemented with custom objects during implementation. It’s easy to decide there’s an entity called “customer” but hard to agree on one definition that will apply across all channels and back-office processes. In practice, different systems have different definitions that suit their particular needs.

Reconciling these is the main challenge in any data integration project.  Within a single company, the solution involves detailed, technical discussions among the managers of different systems. Trying to find a general solution that applies across hundreds of enterprises may well be impossible. In practice, you’re likely to end up with data models that support different definitions in different circumstances with some mechanism to specify which definition is being used in each situation. That may be so confusing that it defeats the purpose of having shared data, which is for different people to easily make use of it.

Note that CDPs are deployed at the company level, so they don’t need to solve the multi-company problem.*** This is one reason I suspect the Adobe/Microsoft/SAP project doesn’t pose much of a threat to the current CDP vendors, at least so long as buyers actually look at the details rather than just assuming the big companies have solved the problem because they’ve announced they're working on it.

The other interesting aspect of the joint announcement was its IT- rather than marketing-centric focus. All three of the supporting quotes in the press release came from CIOs, which tells you who the vendors see as partners. Nothing wrong with that: one of the trends I see in the CDP market is a separation between CDPs that focus primarily on data management (and enterprise-wide use cases and IT departments as primary users) and those that incorporate marketing applications (and marketing use cases and marketers as users). As you may recall, we recently changed the CDP Institute definition of CDP from “marketer-controlled” to “packaged software” to reflect the use of customer data across the enterprise. But most growth in the CDP industry is coming from the marketing-oriented systems. The Open Data Initiative may eventually make life harder for the enterprise-oriented CDPs, although I’m sure they would argue it will help by bringing attention to a problem that it doesn’t really solve, opening the way to sales of their products. It’s even less likely to impact sales of the marketing-oriented CDPs, which are bought by marketing departments that want tightly integrated marketing applications.

Another indication of the mindset underlying the Open Data Initiative is this more detailed discussion of their approach, from Adobe’s VP of Platform Engineering. Here the discussion is mostly about making the data available for analysis. The exact quote “to give data scientists the speed and flexibility they need to deliver personalized experiences” will annoy marketers everywhere, who know that data scientists are not responsible for experience design, let alone delivery. Although the same post does mention supporting real-time customer experiences, it’s pretty clear from context that the core data repository is a data lake to be used for analysis, not a database to be accessed directly during real-time interactions. Again, nothing wrong with that and not all CDPs are designed for real-time interactions, either. But many are and the capability is essential for many marketing use cases.

In sum: today’s announcement is important as a sign that enterprise software vendors are (finally) recognizing that their clients need unified customer data. But it’s early days for the initiative, which may not deliver on its promises and may not promise what marketers actually want or need. It will no doubt add more confusion to an already confused customer data management landscape. But smart marketers and IT departments will emerge from the confusion with a sound understanding of their requirements and systems that meet them. So it's clearly a step in the right direction.




__________________________________
*I didn’t bother to comment on the Marketo acquisition in detail because, let’s face it, the world didn’t need one more analysis. But now that I’ve had a few days to reflect, I really think it was a bad idea. Not because Marketo is a bad product or because it doesn’t fill a big gap in the Adobe product line (B2B marketing automation). It’s because filling that gap won’t do Adobe much good. Their creative and Web analysis products already gave them a presence in every marketing department worth considering, so Marketo won’t open many new doors. And without a CRM product to sell against Salesforce, Adobe still won’t be able to position itself as a Salesforce replacement. So all they bought for $4.75 billion was the privilege of selling a marginally profitable product to their existing customers. Still worse, that product is in a highly competitive space where growth has slowed and the old marketing automation approach (complex, segment-based multi-step campaign flows) may soon be obsolete. If Adobe thinks they’ll use Marketo to penetrate small and mid-size accounts, they are ignoring how price-sensitive, quality-insensitive, support-intensive, and change-resistant those buyers are. And if they think they’ll sell a lot of add-on products to Marketo customers, I’d love to know what those would be.

** I wish Microsoft would just buy Adobe already. They’re like a couple that’s been together for years and had kids but refuses to get married.

*** Being packaged software, CDPs let users implement solutions via configuration rather than custom development. This is why they’re more efficient than custom-built data warehouses or data lakes for company-level projects.

Friday, April 13, 2018

Building Trust Requires Innovation

Trust has been chasing me like a hungry mosquito. It seems that everyone has suddenly decided that creating trust is the key to success, whether it’s in the context of data sharing, artificial intelligence, or customer retention. Of course, I reached that conclusion quite some time ago (see this blog from late 2015) so I’m pleased to have the company.   But I’m also trying to figure out where we all need to go next.

I picked up a book on trust the other day (haven’t gotten past the introduction, so can’t say yet whether I’d recommend it) that seems to argue the problem of trust is different today because trust was traditionally based on central authority but authority itself has largely collapsed. The author sees a new, distributed trust model built on transparent, peer-based reputation (think Uber and Airbnb)* that lets people confidently interact with strangers. The chapter headings suggest she ends up proposing blockchain as the ultimate solution. This seems like more weight than any technology can bear and may just be evidence of silver bullet syndrome.   But it does hint at why blockchain has such great appeal: it’s precisely in tune with the anti-authority tenor of our times.

From a marketer’s perspective, what’s important here is not that blockchain might provide trust but that conventional authority certainly cannot. This means that most trust-building methods marketers naturally rely on, which are based in traditional authority, probably won’t work. Things like celebrity endorsements, solemn personal promises from the CEO, and references to company size or history carry little weight in a hyper-skeptical environment. Even consumer reviews and peer recommendations are suspect in a world where people don’t trust that they’re genuine. What’s needed are methods that let people see things for themselves: a sort of radical transparency that doesn’t require relying on anyone else’s word, including (or perhaps especially) the word of “people just like me”.

One familiar example is comparison shopping engines that don’t recommend a particular product but  make it easy for users to compare alternatives and pick the option they like best. A less obvious instance would be a navigation app that shows traffic conditions and estimated times for alternate routes: it might present what it considers the best choice but also makes it easy for the user to see what’s happening and, implicitly, why the system’s recommendation makes sense. Other examples include package tracking apps that remove uncertainty (and thus reduce anxiety) by showing the movement of a shipment towards the customer, customer service apps that track the location of a repair person as he approaches for a service call, or phone queue systems that estimate waiting time and state how many customers are ahead of the caller.  A determined skeptic could argue that such solutions can't be trusted because the systems themselves could be dishonest.  But any falsehoods would quickly become apparent when a package or repair person didn’t arrive as expected, so they are ultimately self-validating.

Of course, many activities are not so easily verified. Claims related to data sharing are high on that list: it’s pretty much impossible for a customer to know how their data has been used or whether it has been shared without their permission. This is why the European Union’s approach to privacy in the General Data Protection Regulation (GDPR) makes so much sense: the rules include a requirement to track each use of personal data, documentation of authority for that use, and a right of individuals to see the history of use. That’s a very different attitude from the U.S. approach, which has much looser consent requirements and no individual rights to review companies’ actual behaviors. In other words, the EU approach creates a forced transparency that builds trust, especially since false information would be a legally punishable offense.
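In engineering terms, that GDPR requirement amounts to keeping an append-only ledger of data uses, each tied to a documented legal basis, that the individual can inspect. A minimal sketch (purely illustrative; field names and structure are my own assumptions, not any regulatory standard):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# Illustrative sketch of GDPR-style use tracking; names are assumptions.
@dataclass
class DataUse:
    purpose: str       # what the personal data was used for
    legal_basis: str   # documented authority, e.g. "consent", "contract"
    timestamp: datetime

@dataclass
class CustomerRecord:
    customer_id: str
    uses: List[DataUse] = field(default_factory=list)

    def record_use(self, purpose: str, legal_basis: str) -> None:
        """Log every use of this customer's data, with its authority."""
        self.uses.append(DataUse(purpose, legal_basis, datetime.now()))

    def history(self) -> List[str]:
        """The 'right to see the history of use' as a simple report."""
        return [f"{u.timestamp:%Y-%m-%d} {u.purpose} ({u.legal_basis})"
                for u in self.uses]

rec = CustomerRecord("cust-123")
rec.record_use("email campaign", "consent")
rec.record_use("claims processing", "contract")
print("\n".join(rec.history()))
```

The transparency comes from the last method: any falsified entry is exposed the moment the customer compares the report against their own experience.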

There’s a slender chance that the GDPR approach will be adopted in the U.S. in the wake of Facebook’s Cambridge Analytica scandal, although the odds are greatly against it. More likely, companies that are not Facebook will unite to oppose any legislation, even if Facebook itself sits on the sidelines. (That’s exactly what’s happening right now in California.)  The more intriguing possibility is that Facebook alone will adopt GDPR policies in the U.S. – as it has not very convincingly promised – and that this will pressure other companies to do the same.  Color me skeptical on that scenario: Facebook will probably renege once public attention turns elsewhere and few consumers will stop using services they enjoy due to privacy considerations.  In fact, if you look closely at studies of consumer attitudes, what you ultimately see is that consumers don’t really put a very high value on their personal data or privacy in general.

What does scare them is identity theft, so it’s just possible that regulations addressing that issue might provide privacy protections as a bonus. That’s especially true if consumers decide they don’t trust the government to enforce data protection standards but, following the distributed authority model, instead demand transparency so they can verify compliance to “self-enforce” the rules for themselves.  Yet this too is a long shot: few current political leaders or privacy activists are likely to adopt so subtle a strategy.

In short, the government won't solve the trust problem for marketers, so they'll need to find their own solutions.  This means they have to devise trust building measures, convince their companies to adopt them, and then educate customers about how they work.  This is an especially hard challenge because the traditional, authority-based methods of gaining trust are no longer effective. Finding effective new methods is an opportunity for innovation and competitive advantage, which are fun, and for long hours and failed experiments, which are less fun but part of the package. Either way, you really have no choice: as that mosquito keeps telling me, trust is essential for success in today’s environment.

________________________________________________________________________
* both firms with corporate trust issues of their own, ironically.

Sunday, October 29, 2017

Flytxt Offers Broad and Deep Customer Management

Some of the most impressive marketing systems I’ve seen have been developed for mobile phone marketing, especially for companies that sell prepaid phones.  I don’t know why: probably some combination of intense competition, easy switching when customers have no subscription, location as a clear indicator of varying needs, immediately measurable financial impact, and lack of legacy constraints in a new industry. Many of these systems have developed outside the United States, since  prepaid phones have a smaller market share here than elsewhere.

Flytxt is a good example. Founded in India in 2008, its original clients were South Asian and African companies whose primary product was text messaging. The company has since expanded in all directions: it has clients in 50+ countries including South America and Europe plus a beachhead in the U.S.; its phone clients sell many more products than text; it has a smattering of clients in financial services and manufacturing; and it has corporate offices in Dubai and headquarters in the Netherlands.

The product itself is equally sprawling. Its architecture spans what I usually call the data, decision, and delivery layers, although Flytxt uses different language. The foundation (data) layer includes data ingestion from batch and real-time sources with support for structured, semi-structured and unstructured data, data preparation including deterministic identity stitching, and a Hadoop-based data store. The intelligence (decision) layer provides rules, recommendations, visualization, packaged and custom analytics, and reporting. The application (delivery) layer supports inbound and outbound campaigns, a mobile app, and an ad server for clients who want to sell ads on their own Web sites.

To be a little more precise, Flytxt’s application layer uses API connectors to send messages to actual delivery systems such as Web sites and email engines.  Most enterprises prefer this approach because they have sophisticated delivery systems in place and use them for other purposes beyond marketing messaging.

And while we’re being precise: Flytxt isn’t a Customer Data Platform because it doesn’t give external systems direct access its unified customer data store.  But it does provide APIs to extract reports and selected data elements and can build custom connectors as needed. So it could probably pass as a CDP for most purposes.

Given the breadth of Flytxt’s features, you might expect the individual features to be relatively shallow. Not so. The system has advanced capabilities throughout. Examples include anonymizing personally identifiable information before sharing customer data; multiple language versions attached to a single offer; rewards linked to offers; contact frequency limits by channel across all campaigns; rule- and machine learning-based recommendations; six standard predictive models plus tools to create custom models; automated control groups in outbound campaigns; real-time event-based program triggers; and a mobile app with customer support, account management, chat, personalization, and transaction capabilities. The roadmap is also impressive, including automated segment discovery and autonomous agents to find next best actions.

What particularly caught my eye was Flytxt’s ability to integrate context with offer selection. Real-time programs are connected to touchpoints such as a Web site. When a customer appears, Flytxt identifies the customer, looks up her history and segment data, infers intent from the current behavior and context (such as location), and returns the appropriate offer for the current situation. The offer and message can be further personalized based on customer data.

This ability to tailor behaviors to the current context is critical for reacting to customer needs and taking advantage of the opportunities those needs create. It’s not unique to Flytxt but it's also not standard among customer interaction systems. Many systems could probably achieve similar outcomes by applying standard offer arbitration techniques, which generally define the available offers in a particular situation and pick the highest value offer for the current customer. But explicitly relating the choice to context strikes me as an improvement because it clarifies what marketers should consider in setting up their rules.
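A bare-bones sketch of that standard arbitration pattern (my illustration, not Flytxt's or anyone else's code; the offers, contexts, and scoring are invented): each offer declares the situations where it's eligible, and the engine scores the eligible set for the current customer.

```python
# Illustrative offer-arbitration sketch; all names and values are invented.
OFFERS = [
    {"id": "data_topup", "contexts": {"low_balance"}, "base_value": 5.0},
    {"id": "roaming_pack", "contexts": {"travel"}, "base_value": 8.0},
    {"id": "loyalty_bonus", "contexts": {"low_balance", "travel"}, "base_value": 3.0},
]

def score(offer, customer):
    # Stand-in for a predictive model: weight offer value by the
    # customer's propensity to accept it (default 0.1 if unknown).
    return offer["base_value"] * customer["propensity"].get(offer["id"], 0.1)

def arbitrate(context, customer):
    """Pick the highest-scoring offer eligible in the current context."""
    eligible = [o for o in OFFERS if context in o["contexts"]]
    return max(eligible, key=lambda o: score(o, customer), default=None)

customer = {"propensity": {"data_topup": 0.6, "loyalty_bonus": 0.9}}
best = arbitrate("low_balance", customer)
print(best["id"])  # data_topup (5.0 * 0.6 = 3.0 beats 3.0 * 0.9 = 2.7)
```

The point made in the paragraph above is visible in the code: eligibility (the `contexts` sets) and value (the `score` function) are separate decisions, so making context explicit tells marketers which of the two they're actually configuring.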

On the other hand, Flytxt doesn’t place its programs or offers into the larger context of the customer lifecycle. This means it’s up to marketers to manually ensure that messages reflect consistent treatment based on the customer’s lifecycle stage. Then again, few other products do this either...although I believe that will change fairly soon as the need for the lifecycle framework becomes more apparent.

Flytxt currently has more than 100 enterprise clients. Pricing is based on number of customers, revenue-gain sharing, or both. Starting price is around $300,000 per year and can reach several million dollars.

Saturday, October 07, 2017

Attribution Will Be Critical for AI-Based Marketing Success


I gave my presentation on Self-Driving Marketing Campaigns at the MarTech conference last week. Most of the content followed the arguments I made here a couple of weeks ago, about the challenges of coordinating multiple specialist AI systems. But prepping for the conference led me to refine my thoughts, so there are a couple of points I think are worth revisiting.

The first is the distinction between replacing human specialists with AI specialists, and replacing human managers with AI managers. Visually, the first progression looks like this as AI gradually takes over specialized tasks in the marketing department:



The insight here is that while each machine presumably does its job much better than the human it replaces,* the output of the team as a whole can’t fundamentally change because of the bottleneck created by the human manager overseeing the process. That is, work is still organized into campaigns that deal with customer segments because the human manager needs to think in those terms. It’s true that the segments will keep getting smaller, the content within each segment more personalized, and more tests will yield faster learning. But the human manager can only make a relatively small number of decisions about what the robots should do, and that puts severe limits on how complicated the marketing process can become.

The really big change happens when that human manager herself is replaced by a robot:



Now, the manager can also deal with more-or-less infinite complexity. This means we no longer need campaigns and segments and can truly orchestrate treatments for each customer as an individual. In theory, the robot manager could order her robot assistants to create custom messages and offers in each situation, based on the current context and past behaviors of the individual human involved. In essence, each customer has a personal robot following her around, figuring out what’s best for her alone, and then calling on the other robots to make it happen. Whether that's a paradise or nightmare is beyond the scope of this discussion.

In my post a few weeks ago, I was very skeptical that manager robots would be able to coordinate the specialist systems any time soon.  That now strikes me as less of a barrier.  Among other reasons, I’ve seen vendors including Jivox and RevJet introduce systems that integrate large portions of the content creation and delivery workflows, potentially or actually coordinating the efforts of multiple AI agents within the process. I also had an interesting chat with the folks at Albert.ai, who have addressed some of the knottier problems about coordinating the entire campaign process. These vendors are still working with campaigns, not individual-level journey orchestration. But they are definitely showing progress.

As I've become less concerned about the challenges of robot communication, I've grown more concerned about robots making the right decisions.  In other words, the manager robot needs a way to choose what the specialist robots will work on so they are doing the most productive tasks. The choices must be based on estimating the value of different options.  Creating such estimates is the job of revenue attribution.  So it turns out that accurate attribution is a critical requirement for AI-based orchestration.

That’s an important insight.  All marketers acknowledge that attribution is important but most have focused their attention on other tasks in recent years.  Even vendors that do attribution often limit themselves to assigning user-selected fractions of value to different channels or touches, replacing the obviously-incorrect first- and last-touch models with less-obviously-but-still-incorrect models such as “U-shaped”, “W-shaped”,  and “time decay”.  All these approaches are based on assumptions, not actual data.  This means they don’t adjust the weights assigned to different marketing messages based on experience. That means the AI can’t use them to improve its choices over time.

There are a handful of attribution vendors who do use data-driven approaches, usually referred to as “algorithmic”. These include VisualIQ (just bought by Nielsen), MarketShare Partners (owned by Neustar since 2015), Convertro (bought in 2014 by AOL, now Verizon), Adometry (bought in 2014 by Google and now part of Google Analytics), Conversion Logic, C3 Metrics, and (a relatively new entrant) Wizaly. Each has its own techniques but the general approach is to compare results for buyers who take similar paths, and attribute differences in results to the differences between their paths. For example: one group of customers might have interacted in three channels and another interacted in the same three channels plus a fourth. Any difference in results would be attributed to the fourth channel.
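To make the mechanics concrete, here is a toy sketch of that path-comparison logic. Everything in it (the channel names, the journeys, the averaging rule) is invented for illustration; it is not any vendor's actual algorithm.

```python
# Toy journey data: each record is (set of channels touched, converted?).
# All channel names and outcomes are invented for illustration.
journeys = [
    ({"email", "search", "display"}, True),
    ({"email", "search", "display"}, False),
    ({"email", "search", "display", "social"}, True),
    ({"email", "search", "display", "social"}, True),
    ({"email", "search"}, False),
    ({"email", "search", "social"}, True),
]

def conversion_rate(path):
    """Conversion rate among customers whose journeys touched exactly this channel set."""
    outcomes = [conv for channels, conv in journeys if channels == path]
    return sum(outcomes) / len(outcomes) if outcomes else None

def incremental_lift(channel):
    """Average difference in conversion rate between paths that differ only
    by the presence of `channel` -- the 'fourth channel' comparison above."""
    paths = {frozenset(ch) for ch, _ in journeys}
    lifts = []
    for path in paths:
        if channel in path:
            base = frozenset(path - {channel})
            with_rate, base_rate = conversion_rate(path), conversion_rate(base)
            if with_rate is not None and base_rate is not None:
                lifts.append(with_rate - base_rate)
    return sum(lifts) / len(lifts) if lifts else None

print(incremental_lift("social"))
```

With this toy data, adding "social" to a journey is credited with the entire difference in conversion rates, which is exactly the assumption the next paragraph questions.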

Truth be told, I don’t love this approach.  The different paths could themselves be the result of differences between customers, which means exposure to a particular path isn’t necessarily the reason for different results. (For example, if good buyers naturally visit your Web site while poor prospects do not, then the Web site isn’t really “causing” people to buy more.  This means driving more people to the Web site won’t improve results because the new visitors are poor prospects.) 

Moreover, this type of attribution applies primarily to near-term events such as purchases or some other easily measured conversion.  Guiding lifetime journey orchestration requires something more subtle.  This will almost surely be based on a simulation model or state-based framework describing influences on buyer behavior over time. 
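One simple state-based framework of the sort I have in mind is a Markov model over journey states, where each channel's credit is its "removal effect": how much the probability of eventually converting drops if that state is taken out of the chain. A minimal sketch, with invented states and transition probabilities:

```python
# Minimal state-based attribution sketch: a Markov chain over journey states.
# States and transition probabilities are invented for illustration.
TRANSITIONS = {
    "start":   {"display": 0.6, "search": 0.4},
    "display": {"search": 0.5, "null": 0.5},
    "search":  {"convert": 0.3, "null": 0.7},
}

def p_convert(removed=None):
    """Probability of reaching 'convert' from 'start', optionally with one
    state removed (journeys through the removed state count as lost)."""
    def walk(state):
        if state == "convert":
            return 1.0
        if state in ("null", removed):
            return 0.0
        return sum(p * walk(nxt) for nxt, p in TRANSITIONS.get(state, {}).items())
    return walk("start")

base = p_convert()
for channel in ("display", "search"):
    removal_effect = (base - p_convert(removed=channel)) / base
    print(channel, round(removal_effect, 3))
```

Because the model is built from observed transitions rather than assumed weights, it can be re-estimated as new behavior arrives, which is the property the next paragraph argues matters most.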

But whatever the weaknesses of current algorithmic attribution methods, they are at least based on actual behaviors and can be improved over time.  And even if they're not dead-on accurate, they should be directionally  correct. That’s good enough to give the AI manager something to work with as it tells the specialist AIs what to do next.  Indeed, an AI manager that's orchestrating contacts for each individual will have many opportunities to conduct rigorous attribution experiments, potentially improving attribution accuracy by a huge factor.

And that's exactly the point.  AI managers will rely on attribution to measure the success of their efforts and thus to drive future decisions.  This changes attribution from an esoteric specialty to a core enabling technology for AI-driven marketing.  Given the current state of attribution, there's an urgent need for marketers to pay more attention and for vendors to improve their techniques. So if you haven’t given attribution much thought recently, it’s a good time to start.

__________________________________________________________________________
* or augments, if you want to be optimistic.

Wednesday, June 07, 2017

Pega Does Vegas

I spent the first part of this week at Pegasystems’ PegaWorld conference in Las Vegas, a place which totally creeps me out.* Ironically or appropriately, Las Vegas’ skill at profit-optimized people-herding is exactly what Pega offers its own clients, if in a more genteel fashion.

Pega sells software that improves the efficiency of company operations such as claims processing and customer service. It places a strong emphasis on meeting customer needs, both through predictive analytics to anticipate what each person wants and through interfaces that make service agents’ jobs easier. The conference highlighted Pega and Pega clients’ achievements in both areas. Although Pega also offers some conventional marketing systems, they were not a major focus. In fact, while conference materials included a press release announcing a new Paid Media solution, I don’t recall it being mentioned on the main stage.**

What we did hear about was artificial intelligence. Pega founder and CEO Alan Trefler opened with a blast of criticism of other companies’ over-hyping of AI but wasn’t shy about promoting his own company’s “real” AI achievements. These include varying types of machine learning, recommendations, natural language processing, and, of course, chatbots. The key point was that Pega integrates its bots with all of a company’s systems, hiding much of the complexity in assembling and using information from both customers and workers. In Pega’s view, this distinguishes their approach from firms that deploy scores of disconnected bots to do individual tasks.

Pega Vice President for Decision Management and Analytics Rob Walker gave a separate keynote that addressed fears of AI hurting humans. He didn’t fully reject the possibility, but made clear that Pega’s official position is it’s adequate to let users understand what an AI is doing and then choose whether to accept its recommendations. Trefler reinforced the point in a subsequent press briefing, arguing that Pega has no reason to limit how clients can use AI or to warn them when something could be illegal, unethical, dangerous, or just plain stupid.

Apart from AI, there was an interesting stream of discussion at the conference about “robotic process automation”. This doesn’t come up much in the world of marketing technology, which is where I mostly live outside of Vegas. But apparently it’s a huge thing in customer service, where agents often have to toggle among many systems to get tasks done. RPA, as it’s known to its friends, is basically a stored series of keystrokes, which in simpler times was called a macro. But it’s managed centrally and runs across systems. We heard amazing tales of the effort saved by RPA, which doesn’t require changes to existing systems and is therefore very easy to deploy. But, as one roundtable participant pointed out, companies still need change management to ensure workers take advantage of it.

Beyond the keynotes, the conference featured several customer stories. Coca Cola and General Motors both presented visions of a connected future where soda machines and automobiles try to sell you things. Interesting but we’ve heard those stories before, if not necessarily from those firms. But Scotiabank gave an unusually detailed look at its in-process digital transformation project and Transavia airlines showed how it has connected customer, flight, and employee information to give everyone in the company a complete view of pretty much everything. This allows Transavia to be genuinely helpful to customers, for example by letting cabin crews see passenger information and resolve service issues inflight. Given the customer-hostile approach of most airlines, it was nice to glimpse an alternate reality.

The common thread of all the client stories (beyond using Pega) was a top-down, culture-deep commitment to customer-centricity. Of course, every company says it’s customer centric but most stop there.  The speakers’ organizations had really built or rebuilt themselves around it.  Come to think of it, Las Vegas has that same customer focus at its core. As in Las Vegas, the result can be a bit creepy but gives a lot of people what they want.  Maybe that's a good trade-off after all.

_________________________________________________________________________________________
* On the other hand, I had never seen the corrugated hot cup they had in the hotel food court. So maybe Vegas is really Wonderland after all.

** The solution calculates the value a company should bid to reach individual customers on Facebook, Google, or other ad networks.  Although the press release talks extensively about real time, Pega staff told me it's basically pushing lists of customers and bid values out to the networks.  It's real time in the sense that bid values can be recalculated within Pega as new information is received, and revised bids could be pushed to the networks. 



Wednesday, April 19, 2017

Here's Why Airlines Treat Customers Poorly

Last week’s passenger-dragging incident at United Airlines left many marketers (and other humans) aghast that any company could purposely assault its own customer. As it happens, airline technology vendor Sabre published a survey of airline executives just before the event. It confirms what you probably suspected: airline managers think differently from other business people.  And not in a good way.


The chief finding of the study is that the executives rated technology as by far their largest obstacle to improving customer experience. This is very unusual: as I wrote in a recent post, most surveys place organizational and measurement issues at the top of the list, with technology much less of an issue. By contrast, the airline executives in the survey – who were about 1/3 from operations, 1/3 from marketing, sales, and service, and 1/3 from other areas including IT and finance – placed human resources in the middle and organizational structure, consensus, and lack of vision at the bottom.  The chart below compares the two sets of answers, matching categories as best I can.



It would be a cheap shot to point out that the low weight given to “lack of vision” actually illustrates airline managers’ lack of vision. Then again, like everyone else who flies, I’ve been on the receiving end of many cheap shots from the airlines. So I’ll say it anyway.

But I’ll also argue that the answers reflect a more objective reality: airlines are immensely complicated machines whose managers are inevitably dominated by operational challenges. This is not an excuse for treating customers poorly but it does explain how easily airline leaders can focus on other concerns. Indeed, when the survey explicitly asked about priorities, 51% rated improving operations as the top priority, compared with just 39% for aligning operations, marketing and IT, and only 35% for building customer loyalty.

There’s a brutal utilitarian logic in this: after all, planes that don’t run on time inconvenience everyone. The study quotes Muhammad Ali Albakri, a former executive vice president at Saudi Arabian Airlines, as saying, “Two aspects generally take precedence when we recover irregular operations [such as bad weather]: namely crew schedules and legality and aircraft serviceability. Passengers’ conveniences and connecting passengers are also taken into consideration, depending on the situation.” In context, it’s clear that by “situation” he means whether the affected passengers are high-revenue customers.

But as you may remember from that college philosophy course, most people reject pure utilitarianism because it ignores the worth of humans as individuals. Even if you believe businesses have no ethical obligations beyond seeking maximum profit, it’s bad practice to be perceived as heartless beasts because customers won’t want to do business with you. So airlines do need to make customer dignity a priority, even at the occasional cost of operational efficiency. Otherwise, as the United incident so clearly illustrates, the brand (and stock price) will suffer.

If you’re a truly world-class cynic, you might argue that airlines are an oligopoly, so customers will fly them regardless of treatment. But it’s interesting to note that the Sabre paper makes several references to government regulations that penalize airlines for late arrivals and long tarmac waits. These factors clearly influence airline behavior. There's even a (pitifully slim) chance that Congress will respond to United's behavior. So the balance between operational efficiency and customer experience isn’t fixed. Airlines will react to political pressures, social media, and even passenger behaviors. The fierce loyalty of customers to airlines that have prioritized customer experience, such as Southwest and Virgin America, should be a lesson to the others about what’s possible. That those airlines have had very strong leaders who focused on creating customer-centric cultures highlights the critical importance of “vision” in producing these results.

In short, the operational challenges of the airline industry are extreme but they’re not an excuse for treating customers poorly. Visionary leaders have shown airlines can do better. Non-visionary leaders will follow only when consumers demand better service and citizens demand governments protect them.


Thursday, March 16, 2017

Is MarTech Too Important To Leave To The Marketers?

I’m still pondering the relationship between marketing and IT: what it is, will be, and should be. A few new ingredients have kept the pot boiling:

- a chat with Abhi Yadav, founder of Zylotech, an MIT-bred, artificial intelligence-driven Customer Data Platform and message selection engine.  Those roots made it seem a likely candidate for IT-driven purchases, but Yadav told me his primary buyers are marketing operations staff.  In fact, he hasn’t even run into those marketing technology managers everyone (including me) keeps talking about. On reflection, it makes sense that marketers would be the buyers since Zylotech includes analytical and message selection features only used in marketing.  A system that only did data unification would appeal more to IT as a shared resource. Still, Yadav's comments are one point for the marketer-control team.

- a survey from the Association of National Advertisers that found marketers who control their technology strategy, vendors, and enterprise standards are more likely to have a strong return on martech investment. (The study is only available to ANA members but they gave permission to publish the table below. You can see a public infographic here).  That’s two points for Team MarTech.


- a study by IT staffing and services provider TEKsystems that found senior marketers with more advanced strategy were more likely to control their own technology.  The difference wasn’t terribly pronounced but it’s still the same pattern. MarTech is now ahead 3-0.  (I was actually more impressed that 65% of departments with no strategy were in charge. Yikes!)


So far, the game’s a blow out. Marketing is usually in charge of its technology and does better when it is.  A doubter might question if marketers really make better choices or are just happier when they’re in control. I do suspect that IT people would be less confident that marketers are making optimal decisions. Still, there’s no real reason to doubt that marketers are the best judges of what they need.

But the game’s not over. Let's call in a recent Ad Week article about global tech consultancies buying marketing agencies. The article cites Accenture, Deloitte, IBM, KPMG, McKinsey and PricewaterhouseCoopers and notes that each already has huge agency operations.  To the extent that these firms are working with marketing departments, it’s still more evidence of marketing being in charge. But the real story, at least as I read it, is these firms are getting involved because they see a need to integrate marketing technology with over-all corporate technology, just as marketing strategy needs to support corporate strategy.

“The consultants’ bread and butter has traditionally been large IT and business-transformation projects,” says Julie Langley, a partner at fundraising, merger and acquisitions advisor Results International, in the article. “But, increasingly, these types of projects have ‘customer experience’ at their center.”

To me, this is the key. As every aspect of customer experience becomes technology-driven, technology must be integrated across the corporation to deliver a satisfactory experience. Marketing may be the captain, but it’s still part of a larger team. If marketing can be a true team player, it gets to call the plays. But if marketing is selfish, then a coach needs to step in for the good of the whole.

I’ll spare you the extended sports analogy. In concrete terms, if marketing picks systems that only meet marketing needs, then the integrated customer experience will suffer. Worse still, some new tech-driven offerings may be impossible. This could be fatal if other, nimbler competitors deliver them instead. Tech-based disruption is a real threat in many industries. Companies can’t just hope that each department working on its own will yield an optimal solution for the business as a whole.  In fact, they can be quite sure it won't.

That’s why I’m not convinced by surveys showing marketers are happier or get better return on investment when they control their own technology. It’s possible for that to be true and for the corporation to miss larger opportunities that require cooperation across departments. If marketing can take that broader perspective, there’s no problem. If it can’t, IT or another department with enterprise-wide perspective will need to enter the game.

Monday, March 13, 2017

Forecast: Self-Assembling Application Bundles Will Manage Customer Experience

I recently described a Deloitte paper on technology trends, focusing on their descriptions of IT management methods. The paper also covered broader trends including:
  • Unstructured data, which they saw as a potentially bottomless source of insight. What’s interesting is they didn’t suggest many operational uses for it.  By contrast, traditional corporate data management is almost exclusively about business operations.
  • Machine intelligence, which they described as broader than artificial intelligence. They saw deployment moving from offering insights, to interacting with people, to acting autonomously. They also described it as controlling internal business processes as well as customer interactions. That's not the way marketers tend to think but they're right: the bulk of company processes are not customer-facing.
  • Mixed reality, which is a combination of virtual reality, augmented reality, and Internet of Things. They focused less on game-like immersive experiences than on new types of interfaces, such as gesture- and voice-based, and on remote experiences such as collaborative work. They also listed some requirements that aren't usually part of this discussion, including machines that can understand human expressions and emotions, and security to ensure hackers don’t falsify identities or inject harmful elements into the remote experience (such as telling you to make a repair incorrectly).
  • Blockchain, which they presented mostly in terms of easing security issues by verifying identities and allowing for selective sharing of information.
Those are intriguing thoughts but don't present a specific vision of the future. A recent paper from Juniper Networks rushes in where Deloitte fears to tread.

Juniper's term is "digital cohesion", which they define as "an era in which multiple applications self-assemble to provide autonomous and predictive services that continually adapt to personal behaviors.”  It somewhat resembles the ideas I offered in this post about RoseColoredGlasses.me and further elaborated here.  I guess that’s why I like it.

Beyond having excellent taste to agree with me, Juniper fills in quite a few details about how this will happen. Key points include:
  • Disruptive competitors can use high speed networks, local sensor data, and centralized cloud processing to offer new services with compelling economics (e.g. Airbnb vs. Hilton).
  • Smartphones provide pre-built mass distribution, removing a traditional barrier to entry by disruptive competitors.
  • Consumers are increasingly open to trying new things, having been trained to do so and seen benefits from previous new things.
  • Natural interfaces will eliminate learning curves as systems adapt to users rather than the other way around, removing another barrier to adoption.
  • Autonomous services will self-initiate based on observing past behavior and current context.  Users won't need to purposely select them.  More barriers down.
  • Services will be bundled into mega-services, simplifying user choice.
  • Open APIs and interoperability will make it easy to add new services to the bundles. This is a key enabling technology.
  • Better security and trust are essential for users to grant device access and share information with new services.
  • Business relationships need to be worked out between the individual services and the mega-bundles.
I’m sure you see the overlap between the Deloitte and Juniper pieces. Machine intelligence and insights from unstructured data will be critical in building services smart enough to make the right choices. Machine intelligence will also create an underlying infrastructure that’s elastic and powerful enough to deliver services reliably regardless of user location or aggregate demand. Mixed reality will be key for gathering information as well as delivering interactive user experiences. Loosely coupled systems and disaggregated services will make it easy to inject new services into a bundle. Blockchain could play a critical role in solving the security and trust issues.

I’m also sure you see how this relates to ideas that neither vendor mentioned directly, such as the increasing value of rich customer data, importance of accurate identity resolution, role of brands in creating trust, and natural tendency of consumers to do everything through a single mega-service.

Of course, there’s no guarantee this vision will come to pass. But it’s an interesting working hypothesis to shape your thinking. More than anything else, it should help you look beyond optimizing your current stack to ensure that you’ll be able to adapt if radical changes are needed.

Saturday, March 11, 2017

Corporate IT Will Regain Control Over Marketing Technology (And That's Okay)

One of my favorite factoids comes from a Computerworld survey in which IT managers placed marketing technologies at 20th of 28 items on their priority list. This either shows that martech managers are less important than they think or explains why martech managers are needed in the first place. Maybe both.

But the disconnect between corporate IT and marketing technologists could be more than simply amusing. A recent Deloitte study described IT trends that seem incompatible with common martech strategies. If that’s really the case, martech may end up isolated from other company systems – or, more likely, huge investments in martech may ultimately be scrapped once it becomes essential to pull marketing back into the corporate IT fold.

What are these worrisome trends? One was positioning of IT as an innovation driver; another was a loosely-coupled technical architecture of standards-based components; a third was replacing individual applications with enterprise-wide services. Each is problematic for a different reason:
  • IT as an innovation driver suggests that IT should take a larger role in marketing technology than it plays when marketing runs its own systems. It probably also implies innovations that cross departmental boundaries, again reducing the independence of marketing technologists.
  • a loosely-coupled architecture implies that marketing systems will conform to corporate standards to ensure interoperability. This also reduces marketing autonomy. More concretely, it conflicts with the internally-integrated marketing clouds and customer engagement platforms that some marketing departments have purchased to reduce complexity. 
  • enterprise-wide services suggest that marketing would be expected to use the corporate solutions rather than its own systems. This goes beyond setting standards for marketing systems to actually dictating at least some components of the marketing stack.
These trends all pull towards reduced technical autonomy for marketing departments. Let's be clear that this doesn’t represent a power grab by IT departments seeking to regain lost turf. Indeed, the Deloitte paper mentions marketing just 12 times in 140 pages, confirming the Computerworld finding that marketing systems barely register as a concern for corporate IT.

Rather, what’s happening here is a good-faith effort by IT managers to help their company and its customers. (The word “customer” occurs 139 times in the paper.)  Indeed, the paper proposes that technical innovations can lead to entirely new business models. Many of those models will surely integrate marketing functions as part of the whole. The paper also lists opportunities such as using machine intelligence to run systems more smoothly and more flexibly than human operators.  Those would benefit marketing systems as much as any others, putting the burden on marketing to justify keeping its system separate.

There may be something oxymoronic about “tightly-integrated, loosely-coupled systems”. But that’s exactly what’s being proposed and it actually makes a great deal of sense. Perhaps the trend of marketing taking control of its own systems has run its course and the pendulum is now swinging back to centralization. If so, marketers, marketing technologists, and martech vendors need to ensure their systems meet the new requirements for loosely coupled integration. In fact, they should do this gladly, because the new integrations promise grand new benefits for companies and their customers.

Thursday, June 25, 2015

Campaign Management Is Dead. Here's What Next-Generation Marketing Automation Looks Like.

Scientists tell us that the attention span of the average human is now shorter than the attention span of a goldfish.(1)  In such a world, the chances of anyone reading this 2,000-word blog post are pretty much nil. But I think the topic is extraordinarily important, so here is a summary in sushi-sized bites:

- the conventional flow-chart model of marketing campaigns can’t capture the complexity of today’s  disjointed customer journeys.

- a new approach is emerging that identifies stages in the customer journey and picks “plays” (small, highly targeted sets of treatments) to execute in specific situations within each stage

- this approach is easier for marketers to manage because it lets them think in smaller, more comprehensible units

- it will eventually lend itself to greater automation as machines take over more of the marketer's job in a “madtech” world

You can stop here if you need to watch an important cat video.  But if you want to understand my thinking in more detail, please read on.

The first modern campaign manager, Third Wave Network’s MIND, was released in 1991. What made it modern was a standard relational database(2) and multi-step campaigns that sent users down different paths depending on their actions during the campaign. The system displayed each campaign as a set of boxes connected with lines to represent movement of customers from one step to the next. This flow chart interface has been fundamentally unchanged ever since and remains the gold standard of marketing automation(3).

The significance of the flow chart is what it replaced. The previous standard was a list of segments, each described on one row of a (paper) spreadsheet, with characteristics including selection criteria, description, key code, promotion materials, and quantity. Marketers filled out these sheets and handed them to programmers to execute. This was how direct mail marketers worked for decades, and often still do: it’s an efficient way to manage dozens or hundreds of cells within a large outbound campaign. It supports multiple contacts, such as mailing a second catalog to high-performing segments several weeks after the initial drop. But the list for the second mailing is pulled at the same time as the first, so it doesn’t allow changes in treatment based on subsequent customer behavior. This adjustment is what branching flow charts provide.

A quarter century after its introduction, the flow chart is now ripe for replacement. Flow charts assume that customers will follow a small number of predefined paths. This was realistic when interactions were limited to a few company-controlled touchpoints.  But it doesn’t describe today’s self-directed, random-walk journeys through an ever-shifting media landscape. In this environment, the best a company can do is react intelligently wherever a customer appears, taking into account both the current situation and whatever it knows about the customer’s past. Even company-initiated messages, while not wholly reactive, must be consistent with other treatments.


The basic features of a better approach have been obvious for years. I’ve long described it as “do the right thing, wait, and do the right thing again.” What’s changed is visibility: marketers now can see vastly more data about their customers and can interact with them through vastly more channels. In the “madtech” vision I’ve been articulating recently, this translates to assuming that all data about each customer’s demographics, interests, behaviors, locations, intentions, and other attributes is available to everyone. This includes both data a company gathers through direct interactions and data aggregated by third parties and offered for sale. Indeed, third party data is essential for building a complete picture of each customer’s experience.(4)

The vision similarly assumes that messages can be delivered through channels the company does not own directly. These extend beyond conventional advertising to private channels that other companies have opened to external messages. A concrete example would be third party offers delivered through a company’s Web site. My shorthand for this is “everything is biddable”: meaning that marketers can pay to embed a message within every interaction the customer has with anyone. Since all marketers have the same opportunity to bid on all impressions, a corollary is that buying the right messages at the right price is the key to success – or, in another catchphrase, “the smartest bidder wins”.

Sadly, neither you nor I can make a living by repeating clever phrases. Someone has to do the hard work of figuring out what treatments to deliver in each situation. A flow chart can’t come close because the situations are far too varied for any one chart to capture all the alternatives. What’s needed is a system that will assess each situation as it arises and come up with a custom-tailored solution.

If the only goal were maximizing immediate response, this would be pretty simple. Existing recommendation engines and predictive models can easily tell you which content a person is most likely to pick or which product they are most likely to buy. Today's products often do this with limited information about the individual being targeted, but that’s just a reflection of what data is currently available: at least some current systems could incorporate individual details and history without a major revision. There are practical details of speed, scope, and accuracy but the broad design of such a system is straightforward.  It connects with every touchpoint (including exchanges that deliver external opportunities up for bidding), receives information about each interaction, and returns an optimal message along with the value to bid for the right to deliver it.
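The decision loop just described can be sketched in a few lines. Everything here is illustrative: the `Candidate` fields, the margin rule, and the numbers are assumptions standing in for whatever a real recommendation engine and bidding policy would supply.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    message_id: str
    response_prob: float      # model-estimated probability of a response
    value_if_response: float  # expected value if the customer responds

def choose_message(candidates, margin=0.8):
    """Pick the highest expected-value message and a bid for the impression.

    The candidates would come from a recommendation engine scoring each
    piece of content against the individual's profile and the current
    interaction context. The bid is a fraction of expected value so the
    marketer keeps some margin; the margin rule is invented for
    illustration, not drawn from any real system.
    """
    best = max(candidates, key=lambda c: c.response_prob * c.value_if_response)
    expected_value = best.response_prob * best.value_if_response
    return best.message_id, expected_value * margin

# A low-probability, high-value offer can still beat a likelier cheap one.
msg, bid = choose_message([
    Candidate("demo_offer", 0.10, 500.0),
    Candidate("whitepaper", 0.30, 40.0),
])
```

The point of the sketch is only the shape of the loop: score every candidate against the individual, return one message plus the price worth paying to deliver it.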

But immediate response isn’t the only goal. Each interaction is embedded in the context of a customer’s relationship with the company. The best message is the one that maximizes the long-term value of that relationship. This won’t necessarily be the message with the highest response rate or greatest immediate financial value. Automated systems can incorporate long-term value in their recommendations but only if they are tracking long-term outcomes and analyzing what changes them. I believe – and this is the main point of this entire post – that automated analysis can only optimize for long-term outcomes if customer data is classified by stages in the customer journey. In other words, you can’t just randomly test all possible treatments in all situations and let the best approaches bubble to the top. There are simply too many variables for that to work. Rather, marketers must assign customers to journey stages and use those stages as inputs when selecting treatments and evaluating results. The stages themselves can be adjusted over time in the light of experience. But without a journey map as a starting point, marketers and their machines will flounder endlessly in a sea of big data.
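To make the point concrete, here is a toy version of stage-aware scoring. The stage values, transition estimates, and probabilities are all invented; the only claim is structural: a treatment is valued by the journey-stage movement it is expected to cause, not by its raw response rate.

```python
# Illustrative long-term values per journey stage; all numbers invented.
STAGE_VALUE = {"interested prospect": 10, "active buyer": 100,
               "satisfied customer": 400, "advocate": 700}

def long_term_score(treatment, current_stage):
    """Score a treatment by the expected journey-stage lift it produces.

    `treatment` holds a response probability and the stage customers tend
    to land in afterward (in practice, estimated from tracked long-term
    outcomes). A sketch of the idea in the text, not an uplift model.
    """
    p = treatment["response_prob"]
    stage_lift = (STAGE_VALUE[treatment["likely_next_stage"]]
                  - STAGE_VALUE[current_stage])
    return p * stage_lift

# A discount coupon wins on response rate but leaves buyers where they
# are; a demo converts fewer people but moves them further forward.
coupon = {"response_prob": 0.30, "likely_next_stage": "active buyer"}
demo = {"response_prob": 0.12, "likely_next_stage": "satisfied customer"}
```

For a customer already in the "active buyer" stage, the coupon scores zero while the demo scores positively, which is exactly the reversal that immediate-response optimization would miss.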

Maybe you're unimpressed.  Maybe that cat video beckons.  Maybe you're thinking, "All you've done is change the labels.  A map of journey stages looks a lot like a campaign flow, and for that matter an old-style campaign funnel."  I understand your doubts. But there are significant differences:

- journey stages are not associated with specific messages, while campaign steps are.  In fact, the whole purpose of campaign steps is to define which messages are delivered when. So a journey stage is a significantly higher level of abstraction. Put another way, journey stages are just one factor in deciding how to treat a customer, while campaign steps are the only factor.

- journey stages are inherently random, while campaign flows and funnel stages are relentlessly linear. The goal of a campaign or funnel is to push customers from one stage to the next as quickly as possible. Journey stages do track a funnel-type motion, but they’re intended more to capture a set of customer needs and interests. In conformance with the general notion that customers will control their own movement, journey stages are more descriptive than directional. It’s no problem for a journey map if customers move back to an earlier stage or if they stay in one stage indefinitely.  For stages such as “satisfied customer”, that’s actually a good thing.

I do recognize that “journey stage” implies motion, which means it's probably not the best term for what I have in mind.  A more neutral term like "state" would be better.  But I’ll stick with journey for now because it will intuitively make more sense to most people.

So let’s assume, at least for the sake of argument, that you’re convinced marketing systems should consider journey stage when they’re picking the best message for a specific customer in a specific situation. Does that mean campaign flows can be replaced by any recommendation engine that adds journey stage to its list of customer attributes?

I think not. The system needs an intermediate level between broad journey stages such as “interested prospect”, “active buyer”, “satisfied customer”, and “advocate”, and actual treatments during single interactions. That level is needed because marketers have to create the content that the machines will choose from during those treatments. Marketers will decide what content to create by envisioning experiences that span multiple interactions and then creating content for a complete experience. The best analogy I’ve found is “plays” in a sport like football: tightly choreographed sets of actions that serve a narrow purpose. Teams prepare plays in advance and pick the right play for the moment but they don’t try to set the sequence of plays before the game begins. They know that games, like customers, follow their own unpredictable course and all the team can do is react appropriately in each situation. It’s true that each play is directed towards a long-term goal, but execution of the play is really a self-contained project. What’s critical is that the marketing play can incorporate multiple interactions over time; this allows a more coherent, richer customer experience than treating each interaction as wholly independent.

Good marketers and sales people already think this way, although they usually describe it in terms of tactics to handle different types of buyers, personas, or situations. I’ve also recently heard several innovative marketing system vendors describe approaches that I believe are fundamentally similar to this concept, again in different terminology. I’ll tentatively adopt the term “plays” precisely because it’s vendor-neutral and because I think most people are familiar enough with sports plays for the analogy to be helpful.

To clarify, then, I believe that marketers of the future will think in terms of broad customer journey stages (i.e., states), such as “active buyer” or “satisfied customer”. Each stage has strategic objectives, which might be to move an active buyer closer to purchase or to strengthen the relationship with a satisfied customer. Marketers will pursue those objectives through “plays” that are appropriate in specific situations, such as “active buyer requests a demonstration” or “satisfied customer has a service problem” and take into account other context such as location, device, and recent behaviors. They will create content and process flows to execute those plays.  These plays will resemble current multi-step campaigns but on a smaller scale and with narrower goals. This limited scale is exactly what makes plays so useful, because marketers can easily visualize each play as a whole. This lets them construct coherent sets of messages, rules and metrics to execute each play from start to finish, measure its impact, and make changes to optimize its effectiveness. Today’s massive nurture campaigns are too large and complicated to do the same.
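The stage/play model lends itself to a simple data structure. This is a minimal sketch with hypothetical stages, situations, steps, and metrics; a real playbook would also carry the content, rules, and context filters described above.

```python
# A hypothetical play catalog: each journey stage maps to plays keyed by
# the triggering situation, each with a small content flow and a metric.
PLAYBOOK = {
    "active buyer": {
        "requests demo": {
            "steps": ["schedule demo", "send prep email", "follow-up call"],
            "metric": "demos completed",
        },
    },
    "satisfied customer": {
        "service problem": {
            "steps": ["acknowledge", "resolve", "satisfaction survey"],
            "metric": "retention after incident",
        },
    },
}

def select_play(stage, situation):
    """Return the play for this stage/situation, or None if no play
    applies (in which case a default treatment or a generic
    recommendation engine would take over)."""
    return PLAYBOOK.get(stage, {}).get(situation)
```

Because each play is a small, self-contained flow, it can be measured and tuned on its own, which is the advantage claimed for plays over monolithic nurture campaigns.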

Marketing systems of the future will also be designed to support this model. It may even happen that systems designed on this model come first and marketers adopt the model as they come to appreciate the systems. Or, because the stage/play model is especially well suited to automated campaign design and the general “madtech” world, perhaps it will be embedded in automated systems and grow as such systems are adopted.

Whatever the mechanism, the traditional campaign flow model has reached the end of its useful life.  I believe the stage/play model will be its successor.

______________________________________________________________________________

(1) The study actually measured the attention span of average Canadians.  It didn’t specify the nationality of the goldfish.

(2) The preceding generation of campaign managers, called MCIF systems, used proprietary columnar databases to wring adequate performance from the PC hardware available at the time. The first of these, MPI Max$ell, was introduced in 1984; other major products included OKRA Marketing (1986), Customer Insight (1987), and Harte-Hanks P/CIS (1988). Interestingly, the height of sophistication among these products was a feature called “matrix marketing”, which identified the best offer to make each customer after each (monthly) update. Sound familiar?

(3) Click here for my 1994 review of MIND and a couple of significant contemporaries. If you add a reference to digital media, I could be describing any of today’s cutting-edge marketing automation products:

“At the core is a very powerful campaign management function that allows a marketer to define sequences of marketing events–each including a mix of direct mail, telephone contacts and personal sales efforts–to be followed in different circumstances, and then to automatically execute these sequences.

“The system uses an efficient graphical interface to lay out the alternate sequences that can be followed within each campaign, the tasks associated with each step in each sequence, and even the specific promotional materials used with each task. As a result, the marketer gains extremely precise control over the marketing approach used with each customer–including the ability to switch the customer to a different sequence depending on actions during the campaign.”

(4) I'm perfectly aware that reality will be messier than the vision implies.  Coverage will be incomplete, people won’t always be recognized across devices, and some predictions will be wrong. But perfection isn’t necessary for success: systems using this data only have to be more effective on average than systems that don’t.


Thursday, May 28, 2015

suitecx Offers Industrial-Strength Customer Journey Maps and More


Customer journey mapping is now the buzziest of buzz words.  Every self-respecting marketing automation system offers something called a “customer journey map,” even if it’s exactly the same as last year’s campaign designer or does nothing more than connect functionless icons on a virtual whiteboard. Journey mapping is equally popular among agencies and consultants, although there, too, it is often little more than a new label for the old sales funnel.


None of this affects me personally, but if I were a real customer experience expert I'd be annoyed at cartoon versions being presented as the real thing.  Sophisticated journey mapping has been around for more than a decade*.  It involves not just listing interactions or displaying them in a diagram, but also analyzing their contents, results, and supporting systems. Most customer experience teams have struggled to do this with spreadsheets, graphics programs, or generic flow charts. I’ve done it that way myself and, trust me, it’s painful.

suitecx is to static customer journey diagrams as Google Maps is to the Rand McNally Road Atlas: an interactive alternative with almost boundless functionality. Built by a team of customer experience veterans at the east bay group,** it’s clearly the system its designers always wanted but could never find elsewhere. The resulting sophistication makes it a bit scary at times, with more data crammed onto some screens than casual users can digest. But it also means the system is hugely flexible and will make serious users vastly more productive.

Customer journey mapping is just one feature within suitecx, which is sold as three modules: diagnosticcx to gather and organize customer experience information; visualizecx to display maps, findings, and recommendations; and precisioncx, which defines contact strategies and campaign flows.



The tool is designed around its own intended user journey. This would start in diagnosticcx, which collects information about a company’s business, customers, and processes using customer and employee surveys, interviews, and direct experience. The findings are organized into a list of customer interactions, which are classified by journey stage, channel, department, emotional outcome, segment, and other properties. The interactions are then linked to recommendations, which are themselves classified, prioritized on four dimensions (customer impact, company impact, cost, and feasibility), mapped onto a matrix, set on a timeline, and ultimately converted into detailed project plans.
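The post doesn’t describe how suitecx actually combines the four rating dimensions, but a generic weighted version of that prioritization step might look like this (the 1-5 scale, equal default weights, and the sign conventions are all assumptions for illustration):

```python
def priority_score(rec, weights=None):
    """Collapse four rating dimensions into one ranking score.

    Ratings are assumed to run 1-5. Cost counts against a recommendation,
    so it is subtracted; impact and feasibility count for it. Neither the
    scale nor the weights reflect suitecx's actual (unpublished) method.
    """
    w = weights or {"customer_impact": 1.0, "company_impact": 1.0,
                    "cost": 1.0, "feasibility": 1.0}
    return (w["customer_impact"] * rec["customer_impact"]
            + w["company_impact"] * rec["company_impact"]
            - w["cost"] * rec["cost"]
            + w["feasibility"] * rec["feasibility"])
```

Scores like these are what would place each recommendation on the impact/cost matrix before it is set on a timeline and turned into a project plan.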


visualizecx supports the project by displaying the data in formats including story maps, process flows, interaction grids, a virtual wall with virtual sticky notes, and various summaries. The grids are automatically generated from the interaction list developed in diagnosticcx or imported via spreadsheet.  Grids can be filtered on different interaction attributes and users can drill into individual interactions to see the underlying details. The story maps and process flows are built manually, alas, but still use the same drillable interactions.

precisioncx completes the project by letting users design new customer experiences.  These include overall contact strategy and multi-step campaigns with segments, creative, triggers, metrics, and other attributes for each step. The system can’t execute the campaigns, but an API is available that could export the campaign designs to execution systems.

This is all industrial-strength stuff, aimed at corporate customer experience departments, agencies, and consultants. Pricing is similarly industrial, starting at $15,000 per module for up to three users. suitecx also offers single-function “primer” products for a much more affordable $699 each (and a seven day free trial). Current modules include grid diagrams, story flows, and the virtual wall with sticky notes. Process maps and campaign flows are under development.

Early versions of suitecx have been used by the east bay group in their own work for years. The commercial product was formally released in December 2014 and currently has about 20 paid clients for the major modules.

____________________________________________________________________________

* the earliest reference I could find on Google was in a business school course syllabus from 2001.  Apparently the term had already been around long enough by then to be picked up in academia.

** which seems to have an issue with capital letters.

Thursday, September 17, 2009

RightNow Adds Social Community Capabilities (But Don't Expect Support Costs to Fall as a Result)

Summary: RightNow has extended its social media footprint by purchasing HiveLive, which lets companies build public and private communities. It also released a benchmark survey showing that online channels (email, chat, Web self-service) don't do much to reduce customer service telephone calls.

In keeping with my recent posts about broader utilization of social media, I had a chat earlier this week with on-demand CRM vendor RightNow, which updated me on its recent purchase of HiveLive. HiveLive provides a social community platform, which means it lets companies build their own discussion groups, forums and such. HiveLive has many features to support business communities, both in terms of engaging with customers over issues such as product features and bugs, and in terms of building internal communities such as project teams.

HiveLive fits with RightNow’s vision of giving customers a seamless flow between community applications and a company’s traditional service systems. For example, if a question posted to a forum goes unanswered for a specified time, it can be escalated into a service system as a case to be handled by the company’s support group. If I understood correctly, RightNow and HiveLive can do this already.

We discussed deeper sharing, such as having answers developed in a public forum become part of a company’s internal customer service knowledgebase. That’s something RightNow may add in the future.

Our discussion veered onto other topics, and in particular how seriously companies really take the goal of improving the customer experience. RightNow shared a copy of its recent RightNow Multi-Channel Contact Center Benchmark Report, which was interesting in its own right.

One tidbit I found particularly intriguing was how few telephone service calls are “deflected” into email, chat and Web self-service channels. In the survey, most companies reported that fewer than 10% of customers in those channels would otherwise have made a phone call.

You could see this as bad news for the theory that having alternate channels available will reduce the need for call center agents. Or you could consider it good news that customers have more choices to pick the interaction method they find most congenial.

Another interesting item was that 55% of companies have some mechanism to gather feedback from customers, but just 10% of those use an IVR survey, which I personally consider the most effective way to gain broad participation. Again, you can treat this as good news (at least half the companies are trying) or bad news (just 5% of the total are doing it effectively). Either way, it’s food for thought.