Tuesday, March 16, 2021

ActiveNav Automates Data Inventory Updates

Our ongoing tour of privacy systems has already included stops at BigID and Trust-Hub, both of which build inventories of customer data. Apparently I’m drawn to the topic, since I recently found myself looking at ActiveNav, which turns out to be yet another data inventory system. It’s different enough from the others to be worth a review of its own. So here goes.

Like other inventory systems, ActiveNav builds a map of data stored in company systems. In ActiveNav’s case, this can literally be a geographic map showing the location of data centers, starting from a global view and drilling down to regions, cities, and sites. Users can also select other views, including business units and repository types. The lowest level in each view is a single container, whose contents ActiveNav will automatically explore by reading metadata attributes such as field names and data formats.

The system applies rules and keywords to the metadata to determine the type of data stored in each field, without reading the actual file contents. (A supplementary module that allows content examination is due for release soon.)  It stores its findings in its own repository, again without copying any actual information – so there’s no worry about data breaches from ActiveNav itself.
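
To make that concrete, here’s a minimal sketch of what rule-and-keyword classification over metadata might look like. It’s my own illustration, not ActiveNav’s actual code or API, and every rule, field name, and data type in it is invented:

```python
import re

# Hypothetical rules: each data type maps to keyword patterns matched
# against field names -- the metadata, never the file contents.
RULES = {
    "email":       [r"e[-_]?mail"],
    "phone":       [r"phone", r"mobile", r"fax"],
    "national_id": [r"\bssn\b", r"social[-_]?security", r"passport"],
    "birth_date":  [r"\bdob\b", r"birth"],
}

def classify_field(field_name: str) -> str:
    """Guess the type of data in a field from its name alone."""
    name = field_name.lower()
    for data_type, patterns in RULES.items():
        if any(re.search(p, name) for p in patterns):
            return data_type
    return "unclassified"

# Scan one container's metadata; the inventory stores only the labels.
container_fields = ["cust_email", "home_phone", "SSN", "order_notes"]
print({f: classify_field(f) for f in container_fields})
# {'cust_email': 'email', 'home_phone': 'phone', 'SSN': 'national_id', 'order_notes': 'unclassified'}
```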

One disadvantage of ActiveNav’s approach is that relying only on metadata limits its chances of finding sensitive information that is not labeled accurately – something BigID does especially well. Similarly, ActiveNav doesn’t map relations between data stored in different containers, so it cannot build a company-wide data model. That is a strength of Trust-Hub.

Still, ActiveNav’s ability to explore and classify data repositories without human guidance is a major improvement over manually built data inventories. Its second big benefit is a “data health” score based on its findings. This is calculated for each container from scores for factors including: risk, including intellectual property and security issues; privacy compliance, based on presence of IDs and other data types; and data quality, including duplicate, obsolete, stale, and trivial contents. Scores for each container are combined to create scores for repositories, locations, business units, and other higher levels. This gives users a quick way to find problem areas and track data health over time.
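
Here’s a small sketch of how that kind of scoring and roll-up could work. The weights, factor scales, and averaging method are my own assumptions for illustration, not ActiveNav’s published formula:

```python
# Illustrative only: invented weights and scores, not ActiveNav's formula.
# Assume each factor is scored 0-100 per container (higher = healthier).
WEIGHTS = {"risk": 0.4, "privacy": 0.4, "quality": 0.2}

def container_health(scores: dict) -> float:
    """Weighted average of factor scores for one container."""
    return sum(WEIGHTS[factor] * value for factor, value in scores.items())

def rollup_health(containers: list) -> float:
    """Combine container scores into a repository/location/unit score."""
    return sum(container_health(c) for c in containers) / len(containers)

# Two containers in one repository:
crm_db     = {"risk": 80, "privacy": 60, "quality": 90}
file_share = {"risk": 40, "privacy": 30, "quality": 50}
print(container_health(crm_db))             # 74.0
print(rollup_health([crm_db, file_share]))  # 56.0
```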

ActiveNav addresses what may be the biggest data inventory pain of all: keeping information up-to-date. The system automates the update process by receiving continuous notifications of metadata changes from systems that are set up to send them. In other cases, ActiveNav can query repositories to look for metadata that has been updated since its last visit. Of course, this requires providing the system with credentials to access that information.
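
As a sketch of the second, polling-style pattern: an incremental scanner only needs to remember when it last visited each container and ask for metadata modified since then. All the names here are hypothetical; a real connector would use each repository’s own change-tracking interface and the credentials mentioned above:

```python
from datetime import datetime, timezone

last_scan = {}  # container id -> time of our last visit (UTC)

def incremental_scan(container_id, read_metadata):
    """Fetch only metadata records modified since our previous visit.

    `read_metadata` stands in for a repository-specific connector that
    returns dicts with a UTC 'modified' timestamp for each field or file.
    """
    since = last_scan.get(container_id,
                          datetime.min.replace(tzinfo=timezone.utc))
    changes = [m for m in read_metadata(container_id) if m["modified"] > since]
    last_scan[container_id] = datetime.now(timezone.utc)
    return changes  # feed these into re-classification and health scoring
```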

ActiveNav was founded in 2008. Until recently, it offered only a conventional on-premise software license, with one-time costs starting around $100,000, sold primarily through partners who work on data management projects for heavily regulated industries and governments. The company has recently introduced a SaaS version of its data inventory system that starts at $10,000 per year. It also offers data governance and compliance modules.

Wednesday, February 17, 2021

Is Peak Martech Approaching At Last?


Contrary to popular belief, forecasting is easy: tomorrow is nearly always like today. What’s hard is predicting when something will change: a snowstorm, stock market crash, or disruptive technology. Of course, predicting change is also where forecasting is most useful.

In marketing technology, we’ve seen a long succession of sunny days. Every year, the number of systems grows, fed by a proliferation of channels, declining development costs, and easily available funding. The safe bet is the number will grow next year, too. But I think one day soon the pendulum will reverse direction.

I can’t point to much data in support of my position. Surveys do show that marketers don’t use the full capabilities of their existing stacks, which might mean they’re inclined to take a break before making new purchases. But marketers have never used every feature in their old systems before buying new ones. The pandemic probably led to a temporary dip in martech purchases but budgets appear to be opening up again. So the appetite for new martech will likely reappear.

My prediction is based less on martech trends than on a general impression that many people feel the world is spinning out of control and want to rein it in. Tech in particular is having impacts that no one fully understands. Concerns about disinformation, social media-induced radicalization, lost privacy, and biased artificial intelligence are all part of this. Even in the narrower spheres of martech and adtech, many users feel their systems are too complicated to really understand. Worries about ad fraud, ads appearing in the wrong places, inappropriate personalization, and unintended campaign messages all come down to the same thing: people worry their systems are making unseen bad decisions.

Technologists tend to feel the cure is more technology: smarter AI, systems checking on other systems, and democratized development to let more people build systems for themselves. But there’s an air of hubris to this. Stories from Daedalus to Frankenstein to Jurassic Park warn us advanced technology will ultimately destroy its creators. Every data breach and wifi outage reminds us no technology is entirely reliable and fixing it is beyond most people’s control.

As a result, non-technologists increasingly doubt that technology can solve its own problems. Some people will bury their worries, accepting technology’s risk as the price for its benefits. Others will take the opposite extreme, rejecting technology altogether, or at least to the greatest degree possible.

But there’s a middle ground between blindly surfing the net and leaving the grid entirely. This is to consciously seek technology that’s simpler and more controllable than current extremes, even if it’s also less powerful as a result. The key is willingness to make that trade-off, which in turn implies willingness to invest the effort needed to assess the relative value vs. risk of different technical options.

Making that investment is probably the biggest change from the current default of accepting technical progress as inevitable and trusting the technologists to balance risk against reward when they decide which products to release. In many cases, the cost of assessment will probably be higher than the cost of using the diminished technology itself: that is, the difference in value between a more secure system and a less secure one may be less than the value of the time I spend comparing them. In other words, the main cost of this adjustment isn’t the lost value from using safer technology, but the cost of assessing that technology.

In theory, the assessment cost might be reduced by splitting it among many people who would share their results. But here’s where trust comes back into play: if you can’t trust someone else to do accurate research, you can’t decide based on their results. Since loss of trust is arguably the defining crisis of today’s society, you can’t just wave it away with an assumption that people will trust others’ assessments of technology tradeoffs. Rather, the need will be to build technology that is self-evidently understandable, so that each person can assess it for herself. This will reduce the assessment cost that blocks people from choosing simpler solutions.

So here’s where I think we’re headed: away from ever-increasing, and increasingly opaque, technical complexity, and towards technology that’s simpler and more transparent. Remember: simplicity is the goal, and transparency is what makes it affordable. I call this the “new pragmatism”, although I doubt the label will catch on. As the word “pragmatism” suggests, it’s rather boring and a lot of hard work. But compared with the chaos or authoritarianism that seem to be the main alternatives, it’s about the best way we can hope our current chapter will end. After we turn the page, people may learn to rebuild the presumption of trust that enables non-verifiable relationships.

If you’re still reading this, thanks for your indulgence; I know you don’t come to this blog for half-baked social theories. But these ideas do have direct implications for marketing and martech. If I’m right, both consumers and martech buyers will want simpler, more transparent products. For marketers, this means:

  • The time may finally have come when stripped-down versions replace feature-rich products, with a stress on ease of use rather than power. I know this idea has been tried before without success. But that was during the earlier age of techno-optimism.

  • Buyers may be more interested in products whose actual operation is transparent. This will usually mean status indicators, meters, and diagnostics to show what’s happening. In some cases it may literally mean see-through designs that let users watch, say, as the dishes are cleaned or the vacuum bag fills with dirt. Whatever it takes to create a feeling of control.

  • Privacy will continue to gain importance, with particular emphasis on systems that are private by design rather than user choice. Privacy policies and options are poorly understood and mistrusted, so many consumers would rather buy a system that makes them unnecessary because it can’t collect data or connect to the internet. Of course, they need to be confident the system behaves as promised.

  • Marketing messages should also switch from promoting advanced technology to promoting simplicity, reliability, and clarity. Explanations of what’s inside a product – its technology, design and manufacturing processes, materials, and people – may be more important to buyers looking for reasons to trust.

  • Marketing methods should match the claims, avoiding unnecessary personalization and staying away from mistrusted media. This is a tricky balance because few marketers will want to sacrifice the performance benefits that come from data-driven targeting. But they do need to weigh long-term brand value against short-term campaign results. For what it’s worth, relying more on basic branding and less on advanced technology is itself consistent with the return to simplicity.

  • Martech vendors will want to make all these adjustments in their own marketing. Other considerations include:

    • Artificial intelligence must be understandable. It’s tempting to suggest discarding AI altogether, since it may be the ultimate example of complicated, opaque, and ungovernable technology. But the apparent benefits of AI are too great to give up. The pragmatic approach is to demand proof that AI really delivers the expected benefits. Then, assuming the answer is yes, find ways to make AI more controllable. This means building AI systems that explain their results, let users modify their decisions, and make it easy to monitor their behaviors (for one toy example of an explainable result, see the sketch after this list). These are already goals of current AI development, so this is more a matter of adjusting priorities than taking AI in a fundamentally different direction.

    • Reconsider the platform/app model. This may be blasphemy in martech circles, since the martech explosion has been largely the result of platforms making it easier to sell specialized apps. But the platform/app model relies on trust that apps are effectively vetted by platform owners. If that trust isn’t present, assessment costs will pose a major barrier to new app adoption. At best, buyers with limited resources (which is everyone) would be able to afford fewer apps. At worst, people will stop using apps altogether. So the pragmatic approach for platforms and app developers alike is to work even harder at trust-building. We already see this, for example, in Apple’s new requirements for data privacy labels and tracking consent rules (https://developer.apple.com/app-store/user-privacy-and-data-use/). What Apple hasn’t done is aggressively audit compliance and publicize its audit programs. The dynamic here is that users will make more demands on platforms to prove they are trustworthy and will concentrate their purchases on platforms that succeed. Since selecting a platform carries its own assessment cost, we can expect users to deal with fewer platforms in total. This means the trend for every major vendor to develop its own platform ecosystem will reverse. Looking still further ahead: fewer platforms will give the remaining platforms more bargaining power with app developers, so we can expect higher acceptance standards (good) and higher fees (not so good). The ultimate result is fewer app developers as well.

    • Rebirth of suites. That’s not quite the right label, since suites never died. But the point is that buyers looking for simplicity and facing higher assessment costs will find suites more appealing than ever. Obviously, the suites themselves must meet the new standards for simplicity, value, and transparency, so integrated-in-name-only Frankensuites don’t get a free pass. But once a buyer has decided a suite vendor is trustworthy, it’s far more attractive to use a module from that suite than to assess and integrate a best-of-breed alternative. Less obvious but equally true: building a system in-house also becomes less attractive, since in-house developers will need to prove that their products are effective and reliable. This will necessarily increase development costs, so the build/buy balance will be tilted a little more towards buying – especially if the assessment costs of buying are minimal because the purchased option is part of a trusted suite. It’s true that this doesn’t apply if companies require users to accept whatever their in-house developers deliver. But that doesn’t sound like a viable long-term approach in a world where the gap between poor in-house systems and good commercial products will be larger than ever.

    • Limits on citizen developers. If blasphemy comes in degrees, this takes me to the professional-grade, eternal-damnation level. On one hand, nothing is more trusted than something a citizen developer creates for herself: she certainly knows how it works and can build in whatever transparency and monitoring she sees fit. So the new-pragmatic world is likely to see more, not fewer, user-built systems. But if we’ve learned anything from decades of using Excel, it’s that complex spreadsheets almost always contain hidden errors, are opaque to anyone except (maybe) the creator, and are exceedingly fragile when change is required. Other user-built solutions will inevitably have similar problems. So even if users trust whatever they’ve built for themselves, everyone else in the organization will be, and should be, exceedingly cautious in accepting them. In other words, the assessment cost will be almost insurmountably high for all but the simplest citizen-developed applications. This puts a natural, and probably shrinking, limit on the ability of citizen-developed systems to replace commercial software or in-house systems built by professional developers. In practice, citizen development will be largely limited to personal productivity hacks and maybe some prototyping of skunkworks projects. This doesn’t mean that no-code and low-code tools are useless: they will certainly be productivity-enhancers for professional developers. Don’t sell those Airtable options just yet.
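
To give one toy example of the “explain their results” idea mentioned in the first bullet above, here’s a scoring model that reports each feature’s contribution alongside its output. Everything in it is invented for illustration; real systems would attach similar explanations to far more complex models:

```python
# Invented example: a tiny linear lead-scoring model whose output can be
# inspected, because every prediction reports each feature's contribution.
WEIGHTS = {"pages_viewed": 0.5, "emails_opened": 0.25, "days_since_visit": -0.25}

def score_with_explanation(features: dict):
    """Return a score plus the per-feature contributions behind it."""
    contributions = {f: WEIGHTS[f] * v for f, v in features.items()}
    return sum(contributions.values()), contributions

score, why = score_with_explanation(
    {"pages_viewed": 10, "emails_opened": 4, "days_since_visit": 7}
)
print(score)  # 4.25
print(why)    # {'pages_viewed': 5.0, 'emails_opened': 1.0, 'days_since_visit': -1.75}
```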

I’ll caution again that the picture I’m drawing here is far from certain to develop. I could be wrong about the change in social direction – although the alternative of continued disintegration is ugly to contemplate. Even if I’m right about the big shift, I could be wrong about its exact impact on marketing and martech. Still, I do believe that current trends cannot continue indefinitely and it’s worth considering what might happen after their limits are reached. So what I’ll suggest is this: keep an eye out for developments that fit the pattern I’m suggesting and be ready with suitable marketing and martech strategies if things move in that direction.

*                *                 *

Addendum: The core argument of this post is “people feel the world is spinning out of control and trust will solve that problem”. That feels like a non sequitur, since it’s not obvious how trust creates control. It also feels uncomfortably hierarchical, and perhaps elitist, if “control” implies a central authority. (Note: you might read “control” as referring to people controlling their own personal technology and data. But fully self-sovereign individuals can still cause chaos if there’s not some larger control framework to constrain their actions.)

But it's not a non sequitur, because there is in fact a clear relationship between trust and control. Specifically:

  • Trust can be defined as the belief that someone will act in the way you want them to.
  • Control is a way to force someone to act in the way you want.
  • Thus, trust and control are complementary: the greater trust you have in someone, the less control you need over them (to still ensure they act the way you want).

Although power-hungry people might enjoy control for its own sake, most people will care only about achieving the desired result. So the solution to a world “spinning out of control” isn’t necessarily reinstating hierarchical, elite authority; it can also be generating trust. Both yield the same outcome of predictable desired behaviors.

This applies in particular to the discussion of citizen development and no-code software, which seems to imply that applications can only be used by more than one person if there’s a central authority to coordinate and approve them. This is where "governance" comes in. It's correct that self-built software needs to meet certain standards to be safely used and shared. But "governance" can be achieved either through control (a central authority enforces those standards) or through trust (convincing users to apply those standards by themselves). Either approach can work, but trust is clearly preferable.

Wednesday, January 27, 2021

Lego Blocks, Pickup Trucks, and Why Bloomreach Bought Exponea

Yesterday brought news that CDP Exponea had been purchased by ecommerce recommendation engine Bloomreach. The deal almost exactly parallels last year’s merger between RichRelevance and Manthan, as well as the smaller-scale combination of CrossEngage with Gpredictive. It also recalls other recent CDP acquisitions, including Acquia buying AgilOne, Chapsvision buying NP6, SAP buying Emarsys, and Wunderkind buying SmarterHQ.

It’s easy (and correct) to see these deals as efforts to assemble comprehensive marketing suites. But it’s not just that the buyers want to add a CDP to their collection. These deals all involve CDPs with marketing automation functions (that is, segmentation, message selection, campaigns, personalization, and cross-channel orchestration). CDP Institute labels these “campaign” or “delivery” CDPs; others sometimes refer to them as “activation” or “execution” CDPs. This type of CDP provides the biggest head start towards building a marketing suite. The drive to build suites clarifies why predictive analytics vendors Bloomreach, RichRelevance, and Gpredictive are such frequent partners in these deals: stand-alone predictive tools are missing nearly all the features needed for a full marketing platform, so they have the most to gain from buying a CDP that fills those gaps.

Of course, the biggest CDP acquisition of all, Twilio’s purchase of Segment, doesn’t fit this mold. Segment was more of a pure-play or “data” CDP, limited to assembling and sharing customer profiles. But Twilio isn’t looking to build a marketing suite; their core business is call centers and (after buying SendGrid) email messaging. They have their sights set on providing a communications layer to support all customer-facing operations, including sales and service. Still a suite, but a different kind.

The drive to construct comprehensive marketing suites is interesting because it conflicts with the current notion that marketers don’t want big, integrated products but instead want to create their own collections of components, building some parts with the latest self-service tools and connecting the rest through microservices, open APIs, and other technical wizardry. The pure vision is a distributed, non-hierarchical architecture, modeled roughly on the Internet itself, where any system can connect with any other system. The more practical vision is a platform-centric world where any system can plug into a central platform that provides basic services. In both visions, companies construct their own, highly customized collections of systems that are perfectly tailored to their needs.

Simply put, the vendors assembling these suites are betting that vision is wrong. They believe – based no doubt on what buyers are telling them – that companies still want to buy an integrated product that meets their needs without any assembly required. The purely practical reason is that companies don’t assemble systems for their own sake; they assemble them as tools to do what’s really important, which is to make money (usually by delivering goods and services to customers). Sure, you can build a pickup truck from Lego blocks and you might even do that for fun. But if you actually need to haul something, you’ll go to a dealer and buy one.

In other words, we still live in a world ruled by Raab's Law, which is “Suites win”. (More formally: In the long run, suites always win the competition between suites and best-of-breed systems.) Platforms don’t change this as much as you’d think, because customers always want the platforms themselves to add more features and make them tightly integrated. It's true that buyers want third-party applications that can extend platform capabilities. It's also in the platform vendors’ interest to be open to those applications since they add value to the platform at little cost to the platform owner. But there’s a time and effort cost to the user of selecting, connecting, and learning to use each new application, regardless of whether the app is “free” or how easily it connects. Users are very aware of these costs, which is why they want the core platform to offer as many features as possible. Put another way: the value of applications is they enable users to add features a platform lacks; but the more features a platform provides internally, the more value it provides from the start. This pushes platform vendors to add features that save users from the need to install apps. Of course, the art of platform management is knowing which features are popular and standardized enough to be incorporated.

As new features are added, platforms increasingly resemble integrated suites. The significant difference from suites is that platforms offer users the option to replace the platform’s default components with external alternatives. But if the platform builders do their jobs correctly, users will find less need to do that over time.

This is what makes campaign CDPs so attractive to companies attempting to construct a new marketing suite. The marketing features of the CDPs provide a core of functionality that marketers are looking for. In addition, and crucially, the core CDP features make it easier for the suite vendor to integrate components of its own system and also enable the vendor to offer platform-style flexibility by connecting with external systems.

What, then, do these acquisitions tell us about the future of the CDP industry? The first thing to realize is that most CDPs already fall into the campaign and delivery categories (70% of the industry, measured by company count or employment, according to our statistics). Most of these firms actually started as marketing automation, personalization, or delivery systems and added CDP capabilities later. Some already provide an integrated marketing suite; the others can expand in that direction on their own or through combinations with other products.

It will be increasingly difficult for this type of CDP to survive without a broad set of marketing functions. Competitive pressures will force them to improve those features while treating the core CDP capabilities of building and sharing unified profiles as just one talking point. We've already seen some of these vendors limit their investment in CDP features and instead partner with data-oriented CDPs to meet those needs. (We also expect these firms to increasingly specialize by industry and company size. Specialization makes it easier for them to build connectors to operational systems, such as point of sale in retail or reservations in travel, as well as to build industry-specific features, hire industry-expert staff, and fine-tune delivery and pricing models to meet target price points.)

The other 30% of the CDP industry are vendors specializing in data management and analytics. We uncreatively call these “data” and “analytics” CDPs. Many started life as tag managers, data collection, or predictive modeling systems; others were built as CDPs from the beginning. As the Twilio/Segment deal illustrated, data CDPs may also be acquisition targets, especially for companies that are aiming to build a corporate-level backbone rather than a marketing suite. Firms that aren't acquired will be able to remain independent by offering customer data unification services to companies that need, and can pay for, a best-of-breed solution. These will likely be large enterprises. This type of CDP will increasingly be purchased by IT and data departments, rather than marketing, and will come to look more like IT tools than end-user applications. As such, they’ll find themselves increasingly competing with general-purpose data management tools from other software providers and with the data management and analytics tools built into the big cloud platforms (Google Cloud, AWS, Azure). So far, the specialized features of the most sophisticated data CDPs are more advanced than what’s available elsewhere. Some of these vendors will continue to innovate and ultimately emerge as strong leaders in this segment. Others will probably withdraw into niches or sell themselves to other companies that want to jumpstart their own CDP offerings.

One happy byproduct of these developments may be a final end to the theological debate over the proper definition of “Customer Data Platform”. As the campaign and delivery CDPs position themselves as marketing suites or platforms, they’re likely to move away from CDP as their primary label. But they’ll still need the world to know that they offer the core CDP capabilities of unified profile creation and sharing. With any luck at all, they’ll handle this by labeling those features as “CDP” when they describe their system capabilities. This should eventually lead to a more consistent use of the CDP term throughout the market and, thus, less confusion over what it means. The data and analytics CDPs already define CDP in terms of the same core capabilities. Some of those firms are today pulling away from verbal confusion by labeling themselves as “infrastructure” or “pipeline” customer data platforms. If the narrower definition of CDP reasserts itself, they may come back to adopting the CDP label.


Sunday, January 03, 2021

Software Has Stopped Eating the World

This August will see the tenth anniversary of Marc Andreessen’s famous claim that software is eating the world. He may have been right at the time but things have now changed: the world is biting back.

I’m not referring to COVID-19, although it’s fitting that it took an all-too-physical virus to prove that a digital bubble of alternate facts could not permanently displace reality. Nor am I juxtaposing the SolarWinds hack with the unexpectedly secure U.S. election, which showed a simple paper trail succeeding where the world’s most elite computer security experts failed.

Rather, I’m looking at the most interesting frontiers of tech innovation: self-driving vehicles, green energy, and biosciences top my list. What they have in common is interaction with the physical world. By contrast, recent years haven’t seen radical change in software development. There have certainly been improvements in software, but they’re more about architectures (cloud, micro-services) and self-service interfaces than fundamentally new applications. And while most physical-world innovations are powered by software, the importance of those innovations is that they are changing physical experiences, not that they are replacing them with software-based virtual equivalents.

Even the most important software development of all – artificial intelligence – measures much of its progress by its ability to handle physical-world tasks such as image recognition, autonomous vehicle navigation, and recognizing human emotion. Let’s face it: it’s one thing for a computer to beat you at Go, but quite another for it to beat your dance moves. Really, what special talent is left for humans to claim as their own?

The shift is well under way in the world of marketing. One of the more surprising developments of the pandemic year was the boom in digital out-of-home advertising, which includes outdoor billboards and indoor signage. The growth seemed odd, given how much time people were forced to spend at home. But the industry marched ahead, spurred in good part by increased ability to track devices as they move through the physical world. It’s a safe bet that out-of-home ads will grow even faster once people can move about more freely.

Indeed, the industries hit hardest by the pandemic – travel and events – also show that virtual experiences are not enough. Whatever their complaints before the pandemic, almost everyone who formerly traveled for business or attended business events is now eager to return to seeing people and places in person. The amount of travel will surely be reduced but it’s now clear that some physical interaction is irreplaceable.

In a similarly ironic way, the pandemic-driven boost to ecommerce has been accompanied by a parallel lesson in the importance of physical delivery. Almost overnight, fulfillment has gone from a boring cost center to a realm of intensive competition, innovation, and even a bit of heroism. Software plays a critical role but it’s a supporting actor in a drama where the excitement is in the streets.

Still closer to home for marketers, we’ve seen a new appreciation for the importance of customer experience, which extends past advertising to include product, delivery, service, and support. If the obsession of the past decade has been targeted advertising, the obsession of the next decade will be superior service. This ties into other trends that were already under way, including the importance of trust (earned by delivering on promises through fulfillment, not making promises in advertising) and the shift from prospecting with third-party data to supporting customers with first-party data. Even at the cutting edge, advertising innovation has now shifted to augmented reality, which integrates real-world experiences with advertising, and away from virtual reality, which replaces the real world entirely.

This shift has substantial implications for martech.

- The endless proliferation of martech tools may well continue, especially if the definition of “tools” stretches to include self-built applications. But the importance of tools that only interact with other software will diminish. What will grow will be tools that interact with the real world, and it’s likely those tools will be harder to find and (at least initially) take more skills to use. It’s the difference between building a flight simulator game and an actual aircraft. The stakes are higher when real-world objects are involved and there’s an irreducible level of complexity needed to make things work right.

- As with all technology shifts, the leaders in the old world – the big software companies and audience aggregators like Facebook and Google – won’t necessarily lead in the new world. Reawakened anti-trust enforcement comes at exactly the worst moment for big tech companies needing to pivot. So we can expect more change in the industry landscape than we’ve seen in the past decade.

- New skills will be needed, both to manage martech and to do the marketing itself. The new martech skills will involve learning about new technologies and tighter integration with non-marketing systems, although fundamentals of system selection and management will be largely the same. The marketing skill shift may be more profound, as marketers must master entirely new modes of interaction. But, again, the marketer’s fundamental tasks – to understand customer motivations and build programs that satisfy them – will remain what they always were.

It’s been said that people overestimate short-term change and underestimate long-term change. The shift from software to physical innovation won’t happen overnight and will never be total. But the pendulum has reversed direction and the world is now starting to eat software. Keep an eye out for that future.