Teradata announced on Friday that it had signed a deal to sell its marketing applications business for $90 million to private investment firm Marlin Equity Partners. The company had announced plans to sell the business last November. The sale includes the former Aprimo and eCircle products, the FLXone data management platform, the cloud version of Real Time Interaction Manager, and other cloud-based marketing products. Teradata will retain its on-premise Customer Interaction Manager (CIM) and an on-premise version of Real Time Interaction Manager (RTIM).
The price is much lower than typical valuations for cloud-based marketing systems. Teradata reported just under $200 million in revenue for marketing applications last year, including about $40 million for the pieces it is keeping. So the $90 million price works out to about 0.56x revenue ($90 million / $160 million). This compares with Marketo stock trading at roughly 5x revenue ($1 billion market cap on $200 million revenue). It’s true that Teradata lost $45 million on the marketing applications business last year, but that is still a smaller loss as a percentage of revenue than Marketo’s $71 million. The differential suggests that buyers saw little growth potential in the businesses Teradata is selling. The low price may also reflect the departure of much of the business’s talent in recent years. Teradata itself paid $540 million for Aprimo back in 2011, again roughly 5x revenue.
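For readers who want to check the arithmetic, here is a minimal sketch of the multiples cited above, using the rounded figures from this post rather than exact reported numbers.

```python
# Rough revenue-multiple math using the approximate figures cited in this post.
teradata_marketing_revenue = 200_000_000  # total marketing applications revenue last year (approx.)
retained_revenue = 40_000_000             # revenue from the pieces Teradata is keeping (approx.)
sale_price = 90_000_000

sold_revenue = teradata_marketing_revenue - retained_revenue   # ~$160 million
teradata_multiple = sale_price / sold_revenue                  # ~0.56x

marketo_market_cap = 1_000_000_000        # approx.
marketo_revenue = 200_000_000             # approx.
marketo_multiple = marketo_market_cap / marketo_revenue        # ~5x

print(f"Teradata sale: {teradata_multiple:.2f}x revenue")   # 0.56x
print(f"Marketo stock: {marketo_multiple:.1f}x revenue")    # 5.0x
```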
It’s not surprising that the buyer is a private equity firm; that was what had been rumored. Marlin hasn’t had much previous involvement in marketing applications, although it did buy email provider Blue Hornet in December. Presumably it will combine the two businesses, reduce the losses, and try to sell the result either to another company or on the stock market. I don’t understand why Marlin thinks the combined firms would be much more attractive than the separate businesses, but presumably it sees greater growth potential in a better-managed business. Alternatively, Marlin may plan to split up its acquisitions and sell individual components, such as marketing resource management and the data management platform, separately.
In a move that borders on surreal, Teradata's marketing applications division itself today announced the latest release of its integrated marketing cloud. I suppose this signals the marketing applications team's hope of remaining intact. Whether that's more than wishful thinking, only time will tell.
I’d like to say I was clever in predicting that Teradata would hold onto CIM and RTIM, but this was something the company announced just after it said it was selling the marketing applications group. CIM and RTIM both started as separate products from Aprimo and, for the most part, remained technically distinct. My understanding is that Teradata kept the on-premise pieces because they were important to major Teradata clients, whereas the businesses being sold served smaller companies that were not buying much else from Teradata.
The very low price certainly isn’t good news for other SaaS marketing vendors, but I think it’s more about the unique situation of the Teradata products than the industry in general. So I’d expect valuations of other cloud-based marketing firms to be largely unaffected.
Thursday, April 21, 2016
SAS by the Sip: SAS Viya Offers Open APIs to Individual Services in the Cloud
SAS held its annual Global Forum conference this week, which marked the company’s 40th anniversary. One key to its long-lived success was an early decision to sell software by annual subscription, rather than the one-time perpetual license standard in the industry when SAS started. This provided a steady income stream and focused attention on customer satisfaction to ensure renewals.
In recent years, much of the software industry has adopted a subscription model under the label of “Software as a Service” (SaaS). But the triumph of SAS’s pricing approach has been accompanied by new challenges to SAS’s business. Subscription pricing notwithstanding, SAS has largely sold its software for on-premise operation by its clients and required them to purchase a large stack of core technologies. This demanded a high initial investment but made expansion relatively easy – an approach that made sense when SAS's core analytical applications were pretty much essential to many clients. By contrast, the new SaaS vendors run software on their own servers and allow clients to access it remotely. This greatly reduces implementation effort and allows volume-based pricing, both of which lower entry costs to the client. The new SaaS software has also been relatively easy to integrate with other systems through open APIs and standard scripting languages such as Python. This also makes it easier to sell SaaS applications for narrow tasks rather than as part of a massive suite.
SAS’s growth and financial performance have been just fine despite the new competition, thanks to technical leadership in its core analytical products and pry-it-from-my-cold-dead-hands loyalty of its core customers. But the benefits of the new SaaS systems have made new sales harder, especially in peripheral markets such as marketing applications.
I’ve subjected you to this long-winded exposition because it provides context for SAS’s major announcement at its conference: a true SaaS version called SAS Viya.* This is a cloud-native system** that will reproduce existing SAS functionality and be compatible with the existing SAS 9 products. More exciting than the cloud deployment itself (which SAS had previously offered for SAS 9), Viya will be accessible through open APIs and scripting languages including Python, Java, and Lua, and – gasp – some components will be offered as on-demand services. In the SAS universe, this is truly revolutionary. It should open the door to new clients who were unlikely to invest in a conventional SAS implementation. Initial Viya apps will be available in the third quarter of 2016.
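To make the "open APIs plus scripting languages" point concrete, here is a minimal sketch of what calling an individual analytics service over REST from Python might look like. The endpoint, payload, and response format are illustrative assumptions, not documented Viya interfaces.

```python
import requests

# Hypothetical service endpoint and token -- illustration only, not a documented Viya API.
BASE_URL = "https://analytics.example.com/api"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

def score_customers(records):
    """Send customer records to a hypothetical scoring service and return its predictions."""
    response = requests.post(
        f"{BASE_URL}/models/churn/score",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"rows": records},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # assumed to return a list of scores

if __name__ == "__main__":
    print(score_customers([{"customer_id": "123", "recency_days": 12, "orders": 4}]))
```

The point is simply that a scripting language and an HTTP client are all a developer would need, rather than a full SAS installation.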
For marketers in particular, SAS also announced Customer Intelligence 360, a SaaS version of its primary marketing suite. Like Viya, this is a separate product from the existing Customer Intelligence 6 suite, which will continue to be offered. The initial release is not a function-for-function duplicate of CI 6 but a “digital marketing hub” that delivers real-time messages in digital channels (email, Web, and mobile apps). Key features include customer-level data collection via on-page scripts, and applications for marketing tasks such as sending an email, delivering in-app messages, or building Web A/B tests. These applications combine previously separate SAS functions such as model building, visual analytics, segmentation, and content creation. They include some nifty advanced features such as recommending when to run tests and automatically discovering which customer segments are most responsive to each test version. The initial CI 360 release includes two modules, Discover (mobile and Web reporting) and Engage (digital interactions including testing). These will eventually be followed by marketing resource management. CI 360 works on a very flexible customer data hub, although that’s a separate product owned by SAS’s Master Data Management group.
CI 360 uses much of the same technology as Viya, including REST APIs and an HTML5 interface. It will officially run on Viya once Viya is released. Like Viya, it does not require clients to purchase the full SAS stack and will be priced on volume rather than a flat subscription. In the case of CI 360, fees will be based on the number of “customer equivalent records” and marketing messages. A minimum installation might start around $10,000 per month, considerably less than the current CI 6 product and competitive with other mid-market digital marketing solutions. The initial CI 360 modules are available now.
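Because fees are volume-based, a buyer's first question is what a given volume would cost. The sketch below shows the shape of that estimate; the per-unit rates are invented placeholders, and only the roughly $10,000-per-month entry point comes from SAS.

```python
# Back-of-the-envelope volume pricing. The per-unit rates are invented placeholders;
# only the ~$10,000/month minimum reflects the entry point cited above.
MONTHLY_MINIMUM = 10_000       # approximate starting price per month
RATE_PER_1K_RECORDS = 5.00     # hypothetical rate per 1,000 "customer equivalent records"
RATE_PER_1K_MESSAGES = 2.00    # hypothetical rate per 1,000 marketing messages

def estimate_monthly_fee(customer_records: int, messages: int) -> float:
    usage = (customer_records / 1_000) * RATE_PER_1K_RECORDS \
          + (messages / 1_000) * RATE_PER_1K_MESSAGES
    return max(MONTHLY_MINIMUM, usage)

print(estimate_monthly_fee(1_500_000, 3_000_000))  # 13500.0 with these made-up rates
```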
_______________________________________________________________________________________________
* The name is a little odd but it could have been SASaaS, so I guess we can be thankful for small mercies.
** Viya can run on the Amazon Web Services public cloud, SAS’s own cloud, or a company’s own private cloud.
Labels: analytics as a service, customer intelligence, martech, open systems, SaaS, sas
Wednesday, April 13, 2016
Thunderhead ONE Provides Powerful Journey Orchestration
As I wrote a couple of posts back, I’ve recently noticed a new set of vendors offering “journey optimization engines”*. The key feature of these systems is that they select customer treatments based on movement through a journey map. The treatments are usually executed through external systems such as email service providers, CRM, or Web content management. The systems also assemble the unified customer database needed to track customer journeys. This, of course, is a function they share with Customer Data Platforms. But CDPs don’t necessarily have journey mapping or treatment selection functions. On the other hand, journey optimization engines don’t always expose their data to external systems, which is a core requirement for CDPs. Journey optimization engines also provide at least some tools to analyze customer journeys and choose the best customer treatments. These may include predictive models, machine learning, and automatic creation of journey maps, but don’t have to.
Thunderhead ONE Engagement Hub is a charter member of this new little club. UK-headquartered Thunderhead itself was founded way back in 2001 and launched its original customer engagement product (highly personalized customer communications such as account statements) in 2004. ONE was developed by a U.S.-based engineering team. It was released in Europe in 2015 and in the U.S. earlier this year.
Let’s look at how ONE handles the three core journey optimization functions:
- Data assembly. ONE provides its own JavaScript tag to capture Web and email interactions and an SDK to connect with mobile apps. Other systems can feed data into ONE using a REST API or batch file imports. There are prebuilt API connectors for Salesforce.com CRM, Microsoft Dynamics CRM, and SAP Cloud for Customer. The system will automatically replicate the structure of imported data, maintaining relationships between different data elements. This allows ONE to store nearly any kind of data, including not just customer attributes and identifiers but also interaction and purchase details, touchpoint configurations, and product information.
Data is time-stamped to allow trending and give access to previous values of individual elements. Users can define calculations to create derived values such as engagement score, customer type, preferences, or interests. In addition to storing the imported information in a persistent database, ONE lets users define in-memory profiles available for real-time access during interactions. These are updated immediately as new data is gathered, so the system is always working with the most current information.
ONE can link data using customer identifiers from different sources so long as there is a common element somewhere in the chain, such as an email address that is attached to a Web browser cookie through a form fill and to a mobile device through app registration. This allows the system to start tracking anonymous users when they first appear and later connect them to a personal profile when they identify themselves (a minimal identity-stitching sketch appears after this list). But ONE does not standardize customer attributes, such as name or address, or use “fuzzy” matching to infer likely relationships.
All told, this is an exceptionally broad set of data management features. Many systems that build profiles – both CDPs and journey optimization engines – lack ONE's ability to store information about entities such as touchpoints or products. Nor do they always provide both a persistent data store and in-memory access. And while most can stitch together identities using shared identifiers, some rely on external systems to provide a common ID.
- Journey mapping. ONE lets users assign journey stages to activities and then classify interactions by activity type. Interactions can also be tagged with other attributes such as channel, product, and marketing asset. The system uses this information to create many varieties of journey maps, including one that shows movement between stages broken out by channels, which is delightfully similar to the Customer Experience Matrix** I’ve been working with since 2006.*** Other versions filter the inputs to show maps for specific products, customer segments, or touchpoints within a channel (such as specific Web sites, retail stores, or phone agents). Maps can also compare attributes of different groups, such as customers who advanced towards purchase vs those who dropped out. Slicing the data in yet another way, maps can show the impact on engagement score of specific actions. Hours of fun, eh?
- Execution. Users can create “conversations” that send messages to customers who match a specified combination of journey stage, customer attributes, and channels. Eligibility and relevance rules can ensure the chosen messages are truly appropriate. One conversation can include several messages in different channels. Message contents can be drawn from a repository within ONE or from an external asset library.
The system uses machine learning to estimate how each customer will respond to each conversation and to calculate the value of the conversation. An arbitration function can then find the highest-value conversation in each situation. The system can deploy conversations in real time, presenting CRM agents with recommended actions (along with a detailed customer profile and history) or Web pages with personalized content (deployed in user-specified locations on the page). Personalized content and data can also be pushed to other execution systems such as email through API connections, either in batch or real time. External systems can access individual customer records through the ONE API. Data can also be extracted from ONE to standard SQL databases, which external systems can then query.
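To illustrate the arbitration step just described (choosing the conversation with the highest expected value for a given customer), here is a minimal sketch. The response probabilities and values are stand-ins for ONE's machine-learning output, not actual model results.

```python
# Minimal arbitration sketch: pick the conversation with the highest expected value.
# The values and response probabilities below are illustrative stand-ins.
conversations = [
    {"name": "upgrade_offer",    "value": 120.0, "response_prob": 0.04},
    {"name": "win_back_email",   "value": 45.0,  "response_prob": 0.15},
    {"name": "loyalty_reminder", "value": 20.0,  "response_prob": 0.30},
]

def arbitrate(candidates):
    """Return the eligible conversation with the highest expected value."""
    return max(candidates, key=lambda c: c["value"] * c["response_prob"])

print(arbitrate(conversations)["name"])  # win_back_email (45.0 * 0.15 = 6.75 beats 4.8 and 6.0)
```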
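And here is the identity-stitching sketch promised under the data assembly item: a generic union-find that merges identifiers observed together in the same event into one profile. It illustrates the general technique, not Thunderhead's actual matching code.

```python
# Generic identity-stitching sketch: identifiers observed together (for example, a cookie
# and an email captured on the same form fill) are merged into one profile via union-find.
class IdentityGraph:
    def __init__(self):
        self.parent = {}

    def find(self, identifier):
        self.parent.setdefault(identifier, identifier)
        while self.parent[identifier] != identifier:
            self.parent[identifier] = self.parent[self.parent[identifier]]  # path compression
            identifier = self.parent[identifier]
        return identifier

    def link(self, a, b):
        self.parent[self.find(a)] = self.find(b)

graph = IdentityGraph()
graph.link("cookie:abc123", "email:jane@example.com")   # form fill ties cookie to email
graph.link("device:ios-789", "email:jane@example.com")  # app registration ties device to email

# All three identifiers now resolve to the same profile.
print(graph.find("cookie:abc123") == graph.find("device:ios-789"))  # True
```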
Pricing for ONE starts at $30,000 per year and is based on the volume of interactions and personalization recommendations, with unit costs varying by channel. The system has 38 clients in Europe and about a dozen in North America.
______________________________________________________________________________
*I would love to call these JOEs but don’t have the heart to inflict another obscure acronym on the industry. You’re welcome.
** Originally developed by my colleague Michael Hoffman. Click here for his take on it.
*** I’m not suggesting that Thunderhead based their map on the Customer Experience Matrix. Many people have come up with similar ideas. I do like to think that Hoffman and I were ahead of our time.
Monday, April 11, 2016
Magisto Uses AI To Create Emotion-Inducing Videos. How Do You Feel About That?
My ongoing research into marketing applications of artificial intelligence led me today to Magisto, which promises not merely to create videos automatically, but to do it in a way that elicits a user-specified emotion. That was enough to pique my curiosity, especially combined with the claim that all you had to do was upload some video and photo clips and Magisto would do the rest. I mean, this is something for people who are both cool and lazy – sign me up!
Which is exactly what I did. I took a few video selfies (velfies?) around the office with some simple narration, uploaded them to Magisto, made the necessary choices, and waited a few minutes to see what came back. To see the available range, I made two different versions, one with a storyteller theme and the other – why not? – for a fiesta. The music choices were for songs I didn't recognize, but that only confirms how long ago I stopped listening to current music. You can see the results here: storyteller and fiesta.
Obviously Magisto wasn't smart enough to recognize that one video was recorded sideways or that one clip was a retake of another clip. But for something that took almost zero effort, it's not bad. If I'd wanted to pay $9.99 per month for a business subscription, I could have made a longer video, reordered the clips (within some limits), and even added captions. For more on creating a business video, see Magisto's own video on the subject.
What about that promise of making videos that elicit emotions? Well, I'm not so sure. The music sets a mood but that doesn't seem like enough to drive anyone to tears or cheers. On the other hand, the results were much better than plenty of home-grown videos I've seen, and Magisto certainly knows a cute pet when it sees one. So don't fire your agency quite yet. But for your own amusement, Magisto is worth a try.
Wednesday, April 06, 2016
Salesforce Purchases Deep Learning Artificial Intelligence Vendor MetaMind: Yeah, That's a MarTech Trend
It’s worth a brief note to record that Salesforce.com purchased artificial intelligence vendor MetaMind on Monday. There aren’t many details available: in the announcement, MetaMind founder Richard Socher said Salesforce will use its technology to "automate and personalize customer support, marketing automation, and many other business processes." In a way, that vagueness is exactly what’s most interesting about the deal: it supports the notion that AI will be embedded in many system features rather than limited to a handful of specific tasks.
This vision of pervasive AI is how I personally expect the industry to develop. The cumulative impact will be to make all aspects of marketing more effective as treatments are tailored more precisely to individual customers and contexts. You can also see this as making marketers more productive in the sense of letting them generate more individualized customer treatments per work hour. Those benefits are two sides of the same coin.
Regarding MetaMind itself: I never explored the system in detail but the Web site shows it could do both visual and text analysis. That’s intriguing because those tasks were traditionally handled by highly specialized systems using very different techniques. But general purpose "deep learning" systems that can be tuned for multiple uses are becoming more common, so MetaMind serves as an example of industry trends rather than a fabulous exception. This flexibility makes it a good choice for Salesforce to use as a foundation for all sorts of AI-based enhancements to its products. It’s safe to assume that other major platform vendors will follow a similar path.
One possible implication to consider is whether pervasive AI could serve as a catalyst for the long-expected martech industry consolidation. The argument would be that a general-purpose AI engine enables enhancements across many different marketing functions, so there are economies of scale for vendors who can apply a single AI tool across their suite. Presumably there would also be some marketing effectiveness/productivity benefit from having a single AI engine that could share its intelligence across different applications, rather than having each function develop insights independently. I’m by no means convinced this is truly what will happen, but it’s something to think about.