Friday, May 26, 2023

Chat-Based Development Changes the Build vs Buy Equation

The size of most markets is governed by a combination of supply and demand. Typically one or the other is limited: while there may be a bottomless appetite for chocolate, there is only so much cocoa in the world; conversely, while there is a near-infinite supply of Internet content, there are only so many hours available to consume it. The marketing technology industry has been a rare exception with few constraints on either factor. The supply of new products has grown as software-as-a-service, cloud platforms, low-code development tools, and other technical changes reduce development cost to something almost anyone with a business idea can afford, and widely available funding reduces barriers still further. Meanwhile, the non-stop expansion in marketing channels and techniques has created an endless demand for new systems. Further accelerating demand, the spread of digital marketing to nearly all industries has powered development of industry-specific versions of many product types.

Despite the visibility of these factors, the uninterrupted growth of martech has been accompanied almost from the start by predictions it will soon stop. This is based less on any specific analysis than on a fundamental sense that what goes up must come down, which is generally a safe bet. There’s also an almost esthetic judgement that a market so large is just too unruly, and that the growing number of systems used by each marketing department must indicate substantial waste and lost control.

One common metric cited as evidence for excess martech is utilization rate: Gartner, the high priests of rational tech management, reported last year that martech utilization rates fell from 58% in 2020 to 42% in 2022 and ranted in a recent press release that “The willingness to let the majority of their martech stack sit idle signifies a fundamental resource disconnect for CMOs. It’s difficult to imagine them leaving the same millions of dollars on the table for agencies or in-house resources. This trade-off of technology over people will not help marketing leaders accelerate out of the challenges a recession will bring.” They were especially incensed that their data showed CMOs are increasing martech’s share of the marketing budget, comparing them to “gamblers looking to write-off their losses with the next bet.” (They probably meant “recoup”, not “write-off” their losses.)

This isn’t just a Gartner obsession. Reports from Integrate, Wildfire, and Ascend2 also cite low utilization rates as evidence of martech overspending.

It ain’t necessarily so.

For one thing, underutilization is common in all departments, not just marketing.  Nexthink found half of all SaaS licenses are unused. Zylo put the average company-wide utilization at 56% and Productiv put it at 45%. (These studies measure app usage, not feature usage. But you can safely bet that feature usage rates are similarly low throughout the organization.)

More fundamentally, there’s no reason to expect people to use all the features of the products they buy. What fraction of Excel or PowerPoint features do you use? What’s important is finding a system with the features you need; if it has other features you don’t need, that’s really okay so long as you’re not paying extra or finding they get in your way.  Software vendors routinely add features required by a subset of their users. Since that helps them serve more needs for more clients, it’s a benefit, not a problem.

The real problem isn’t the presence of features you don’t need, but the absence of features you do. That’s what pushes companies to buy new systems to fill in the gaps. As mentioned earlier, the great and ongoing growth of the martech industry is due in good part to new technologies and channels creating new needs which existing systems don’t fill. That said, it's true that some purchases are unnecessary: buyers don’t always realize that a system they own offers a capability they need. And, since vendors add new features all the time, a specialist system may become redundant if the same features are added to a company’s primary system.

In both of those situations, avoiding unnecessary expense depends on marketers keeping themselves informed about what their current systems can do. This is certainly a problem: thanks to the miracle of SaaS, it’s often easier to buy a new system than to fully research the features of systems already in place. (Installing and integrating the new system will probably be harder, but that comes later.) So we do see reports of marketers trying to prune unnecessary systems from their stacks: for example, the previously-cited Integrate report found that 26% of marketers expected to shrink their stacks. Similarly, the CMO Council found 25% were planning to cut martech spend and Nielsen said 24% were planning martech reductions. (Before you sell all your martech stock, you should also know that each report found even more marketers were planning to increase their martech budgets: 32% for Integrate, 36% for CMO Council, and 56% for Nielsen.)

Let’s assume, if only for argument’s sake, that marketers are reasonably diligent about buying only products that close true gaps. New gaps will continue to appear so long as innovation continues, and there’s no reason to expect innovation will stop. So can we expect the growth in martech products will also continue indefinitely?

Until recently I would have said yes (unless budgets are severely crimped by recession). But the latest round of AI-based tools has me reconsidering. 

Specifically, I wonder whether marketers will close gaps by building their own applications with AI instead of buying applications from someone else. If so, industry expansion could halt.

Marketers have been using no-code technologies to build their own applications for some time. Some of these may have depressed demand for new martech products but I don’t think the impact has been substantial. That’s because no-code tools are usually constrained in some way: they’re an interface on top of an existing application (drag-and-drop journey builders), a personal productivity tool (Excel), or limited to a single function (Zapier for workflow automation). Building a complete application with such limited tools is either impossible or not worth the trouble.

The latest AI systems change this. Chat-based interfaces let users develop an application by describing what they want a system to do. This enables the resulting system to perform a much broader set of tasks than a drag-and-drop no-code interface. That said, the actual capabilities depend on what the model is trained to do. Today, it still takes considerable technical knowledge to include the right technical details in the instructions and to refine the result. But the AIs will quickly get better at working out those details for themselves, drawing on larger and more sophisticated information about how things should be done. Microsoft’s latest description of copilots and plug-ins points in this direction: “customers can use conversational language to create dataflows and data pipelines, generate code and entire functions, build machine learning models or visualize results.”
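
To make the idea concrete, here is a purely hypothetical sketch of what such an exchange might produce. The request, the function, and the field names are invented for illustration; they aren't taken from any particular tool.

    # Marketer's request, in plain language:
    #   "Whenever a new lead arrives, check whether we already have someone
    #    with the same email address; if we do, merge the records and keep
    #    the most recent phone number."
    #
    # The sort of function a chat-based builder might generate in response:

    def merge_lead(new_lead: dict, existing_leads: list) -> list:
        """Merge a new lead into the list, deduplicating by email address."""
        email = new_lead.get("email", "").strip().lower()
        for lead in existing_leads:
            if lead.get("email", "").strip().lower() == email:
                # Same person: keep the newer phone number, leave other fields alone.
                lead["phone"] = new_lead.get("phone") or lead.get("phone")
                return existing_leads
        # No match found: treat it as a genuinely new lead.
        existing_leads.append(new_lead)
        return existing_leads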

What’s important is that the conversational interface will drive a system that automatically employs professional-grade practices to ensure the resulting application is properly engineered, tested, deployed, and documented. In effect, it pairs the business user with a really smart developer who will properly execute what the user describes, let the user examine the results, and tweak the product until it meets her goals. Human developers who can do this are rare and expensive.  AI-based developers who can do this should soon be common and almost free.

This change overcomes the fundamental limitation of most user-built apps: they can only be deployed and maintained by the person who built them and are almost guaranteed to violate quality, security and privacy standards. This issue – let’s call it governance – has almost entirely blocked deployment of user-built systems as enterprise applications. Chat-built systems remove that barrier while fundamentally altering the economics of system development.  More concretely: building becomes cheaper so buying becomes less desirable.  This could significantly reduce the market for purchased software.

Anyone who has ever been involved in an enterprise development project will immediately recognize the flaw in this argument: there’s never just one user and most development time is actually spent getting the multiple stakeholders to agree on what the system should do. Agile methodologies mitigate these issues but don’t entirely eliminate them. 

Whether chat-driven development can overcome this barrier isn’t clear.  It will certainly speed some things up, which might enable teams to build and compare alternative versions when trying to reach a decision. But it might also enable independent users to create their own versions of an application, which would probably lead to even stiffer resistance to adopting someone else’s approach. 

One definite benefit should be that chat-based applications will learn to explain how they function in terms that humans can understand. Responsible tech managers will insist on this before they deploy systems those applications create.

In any event, I do believe that chat-built systems will make home-built software a viable alternative to packaged systems in a growing number of situations. This will especially apply to systems that fill small, specialized gaps created as new marketing technologies develop. Since filling these gaps has been a major factor behind the continuous growth of martech, industry growth may slow as a result.

Incidentally, we need a name for chat-based development. I’ll nominate “vo-code”, as shorthand for voice-based coding, since it fits in nicely with pro-code, low-code, and no-code. I could be talked into “robo-code” for the same reason.

Monday, May 15, 2023

Let's Stop Confusing LEGO Blocks with Computer Software

Can we retire Lego blocks as an analogy for API connections?  Apart from being a cliché that’s as old as the hills and tired as a worn-out shoe, it gives a false impression of how easy such connections are to make and manage.  

Very simply, all Lego block connectors are exactly the same, while API connections can vary greatly.  This means that while you can plug any Lego block into any other Lego block and know it will work correctly, you have to look very carefully at each API connector to see what data and functions it supports, and how these align with your particular requirements.   Often, getting the API to do what you need will involve configuration or even customization – a process that can be both painstaking and time consuming.  When was the last time you set parameters on a Lego block?
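
To illustrate, here is a deliberately simplified sketch; the systems, field names, and mapping rules are invented, not any real vendor's API. Even a routine connection means deciding how one system's vocabulary maps onto another's, and what to do when they don't line up:

    # Hypothetical: pushing a contact from System A into System B.
    # Unlike Lego studs, the two sides don't match by default. Someone has to
    # decide how A's fields map onto B's, what happens to fields B lacks, and
    # how to handle values the two systems format differently.

    FIELD_MAP = {
        "email_address": "email",      # A and B name the same thing differently
        "given_name": "firstName",
        "family_name": "lastName",
        # A's "lead_score" has no equivalent in B, so it is silently dropped
        # unless someone decides to store it in a custom field.
    }

    def translate_contact(record_a: dict) -> dict:
        """Reshape a System A record into the form System B expects."""
        record_b = {b_key: record_a[a_key]
                    for a_key, b_key in FIELD_MAP.items()
                    if a_key in record_a}
        # B requires an ISO country code; A stores the full country name.
        if record_a.get("country") == "United States":
            record_b["countryCode"] = "US"
        return record_b

Multiply that by every pair of connected systems, and by every change either vendor makes to its data model, and the distance from snapping blocks together becomes clear.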

There’s a second way the analogy is misleading.  Lego blocks are truly interchangeable: if you have two blocks that are the same size and shape, they will do exactly the same thing (which is to say, nothing; they’re just solid pieces of plastic).  But no two software applications are exactly the same.  Even if they used the same API, they would have different internal functions, performance characteristics, and user interfaces.  Anyone who has tried to pick a WordPress plug-in or smartphone app knows the choice is never easy because there are so many different products.  Some research is always required, and the more important the application, the more important it is to be sure you select a product that meets your needs.  

This is why companies (and individuals) don’t constantly switch apps that do the same thing: there’s a substantial cost to researching a new choice and then learning how to use it.  So people stick with their existing solution even if they know better options are available.  Or, more precisely, they stick with their existing solution until the value gained from changing to a new one is higher than the cost of making a switch.  It turns out that’s often a pretty high bar to meet, especially because the limiting factor is the time of the person or people who have to make the switch, and very often those people have other, higher priority tasks to complete first.

I’ve made these points before, although they do bear repeating at a time when “composability” is offered as a brilliant new concept rather than a new label for micro-services or plain old modular design.  But the real reason I’m repeating them now is I’ve seen the Lego block analogy applied to software that users build for themselves with no-code tools or artificial intelligence.  The general argument is those technologies make it vastly easier to build software applications, and those applications can easily be connected to create new business processes.

The problem is, that’s only half right.  Yes, the new tools make it vastly easier for users to build their own applications.  But easily connected?  

Think of the granddaddy of all no-code tools, the computer spreadsheet.  An elaborate spreadsheet is an application in any meaningful sense of the term, and many people build wonderfully elaborate spreadsheets.  But those spreadsheets are personal tools: while they can be shared and even connected to form larger processes, there’s a severe limit to how far they can move beyond their creator before they’re used incorrectly, errors creep in, and small changes break the connection of one application to another.  

In fact, those problems apply to every application, regardless of who built it or what tool they used.  They can only be avoided if strict processes are in place to ensure documentation, train users, and control changes.  The problem is actually worse if it’s an AI-based application where the internal operations are hidden in a way that spreadsheet formulas are not.

And don’t forget that moving data across all those connections has costs of its own.  While the data movement costs for any single event can be tiny, they add up when you have thousands of connections and millions of events.  This report from Amazon Prime Video shows how they reduced costs by 90% by replacing a distributed microservices approach with a tightly integrated monolithic application.  Look here for more analysis.  Along related lines, this study found that half of “citizen developer” programs are unsuccessful (at least according to CIOs), and that custom solutions built with low-code tools are cheaper, faster, better tailored to business needs, and easier to change than systems built from packaged components.  It can be so messy when sacred cows come home to roost.

In other words, what’s half right is that application building is now easier than ever.  What’s half wrong is the claim that applications can easily be connected to create reliable, economical, large-scale business processes.  Building a functional process is much harder than connecting a pile of Lego blocks.

There’s one more, still deeper problem with the Lego analogy.  It leads people to conceive of applications as distinct units with fixed boundaries.  This is problematic because it creates a hidden rigidity in how businesses work.  Imagine all the issues of selection cost, connection cost, and functional compatibility suddenly vanished, and you really could build a business process by snapping together standard modules.  Even imagine that the internal operations of those modules could be continuously improved without creating new compatibility issues.  You would still be stuck with a process that is divided into a fixed set of tasks and executes those tasks either independently or in a fixed sequence, where the output of one task is input to the next.  

That may not sound so bad: after all, it’s pretty much the way we think about processes.  But what if there’s an advantage to combining parts of two tasks so they interact?  If the task boundaries are fixed, that just can’t be done.  

For example, most marketing organizations create a piece of content by first writing the text, and then creating graphics to illustrate it.  You have a writer here, and a designer there.

The process works well enough, but what if the designer has a great idea that’s relevant but doesn’t illustrate the text she’s been given? Sure, she could talk to the writer, but that will slow things down, and it will look like “rework” in the project flow – and rework is the ultimate sin in process management.  More likely, the designer will just let go of her inspiration and illustrate what was originally requested.  The process wins.  The organization loses.

AI tools or apps by themselves don’t change this.  It doesn’t matter if the text is written by AI and then the graphics are created by AI.  You still have the same lost opportunity -- and, if anything, the AIs are even less likely than people to break the rules in service of a good idea.

What’s needed here is not orchestration, which manages how work moves from one box to the next, but collaboration, which manages how different boxes are combined so each can benefit from the other.

This is a critical distinction, so it’s worth elaborating a bit.  Orchestration implies central planning and control: one authority determines who will do what and when.  Any deviation is problematic.  It’s great for ensuring that a process is repeated consistently, but requires the central authority to make any changes.  The people running the individual tasks may have some autonomy to change what they do, but only if the inputs and outputs remain the same.  The image of one orchestra conductor telling many musicians what to do is exactly correct.  

Collaboration, on the other hand, assumes people work as a team to create an outcome, and are free to reorganize their work as they see fit.  The team can include people from many different specialties who consult with each other.  There’s no central authority and changes can happen quickly so long as all team members understand what they need to do.  There’s no penalty for doing work outside the standard sequence, such as the designer showing her idea to the copywriter.  In fact, that’s exactly what’s supposed to happen.  The musical analogy is a jazz ensemble although a clearer example might be a well-functioning hockey or soccer team: players have specific roles but they move together fluidly to reach a shared goal as conditions change.

If you want a different analogy: orchestration is actors following a script, while collaboration is an improv troupe reacting to each other and the audience.  Both can be effective but only one is adaptable.

Of course, there’s nothing new about collaboration.  It’s why we have cross-functional teams and meetings.  But the time those teams spend in meetings is expensive and the more people and tasks are handled in the same team, the more time those meetings take up.  Fairly soon, the cost of collaboration outweighs its benefits.  This is why well-managed companies limit team size and meeting length.

What makes this important today is that AI isn’t subject to the same constraints on collaboration as mere humans.  An AI can consider many more tasks simultaneously with collaboration costs that are close to zero.  In fact, there’s a good chance that collaboration costs within a team of specialist AIs will be less than communication costs of having separate specialist AIs each execute one task and then send the output to the next specialist AI.

If you want to get pseudo-mathy about it, write equations that compare the value and cost added by collaboration with the value and cost of doing each task separately.  The key relationship as you add tasks is that collaboration cost grows roughly with the square of the number of tasks while value grows only linearly.*  This means collaboration cost increases faster than value, until at some point it exceeds the value.  That point marks the maximum effective team size.
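
Spelling that out with the footnote's back-of-the-envelope assumptions (one added meeting and one added team member per task, an hour per meeting, and a constant value v gained per collaborating task; the symbols are purely illustrative):

    C_{\text{collab}}(n) = n \cdot n = n^2 \quad \text{meeting-hours}, \qquad V_{\text{collab}}(n) = v \, n

    \text{Collaboration pays only while } v\,n > n^2, \text{ i.e., while } n < v, \text{ so } n^* \approx v.

Under the same assumptions, sequential hand-offs cost only about 2n meeting-hours, so the sequential cost stays linear even though it forgoes the extra value of collaboration.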

We can do the same calculation where the work is being done by AIs rather than humans.  Let’s generously (to humans) assume that AI collaboration adds the same value as human collaboration.  This means the only difference is the cost of collaboration, which is vastly lower for the AIs.  Even if AI collaboration cost rises at the same rate, it won’t exceed the value added by collaboration until the number of tasks is very, very large.


Of course, finding the actual values for those cost and value curves would be a lot of work, and, hey, this is just a blog post.  My main point is that collaboration allows organizations to restructure work so that formerly separate tasks are performed together.  Building and integrating task-specific apps won’t do this, no matter how cheaply they’re created or connected.  My secondary point is that AI increases the amount of profitable collaboration that’s possible, which means it increases the opportunity cost of sticking with the old task structure.

As it happens, we don’t need to imagine how AI-based collaboration might work.  Machine learning systems today offer a real-world example of the difference between human work and AI-based collaboration.

Before machine learning, building a predictive model typically followed a process divided into sequential tasks: the data was first collected, then cleaned and prepped for analysis, then explored to select useful inputs, then run through modeling algorithms.  Results were then checked for accuracy and robustness and, when a satisfactory scoring formula was found, it was transferred into a production scoring system.  Each of those tasks was often done by a different person, but, even if one person handled everything, the tasks were sequential.  It was painful to backtrack if, say, an error was discovered in the original data preparation or a new data source became available late in the process.

With machine learning, this sequence no longer exists.  Techniques differ, but, in general, the system trains itself by testing a huge number of formulas, learning from past results to improve over time.  Data cleaning, preparation, and selection are part of this testing, and may be added or dropped based on the performance of formulas that include different versions.  In practice, the machine learning system will probably draw on fixed services such as address standardization or identity resolution.  But it at least has the possibility of testing methods that don’t use those services.  More important, it will automatically adjust its internal processes to produce the best results as conditions change.  This makes it economical to rebuild models on a regular basis, something that can be quite expensive using traditional methods.
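
For a rough sense of how the pieces fold together, here is a generic sketch in Python using scikit-learn; it is illustrative only, not a description of any particular production system. Cleaning, input selection, and the scoring formula all sit inside one pipeline that is searched and refit as a whole, so rebuilding the model is a single step rather than a chain of hand-offs:

    # A minimal sketch: preprocessing, feature selection, and modeling are
    # folded into a single pipeline, so "rebuilding the model" is one call
    # rather than a hand-off between separate specialists.
    from sklearn.pipeline import Pipeline
    from sklearn.impute import SimpleImputer
    from sklearn.preprocessing import StandardScaler
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV

    pipeline = Pipeline([
        ("impute", SimpleImputer(strategy="median")),   # data cleaning
        ("scale", StandardScaler()),                    # preparation
        ("select", SelectKBest(score_func=f_classif)),  # input selection
        ("model", LogisticRegression(max_iter=1000)),   # scoring formula
    ])

    # The search tests many combinations, including how many inputs to keep,
    # instead of a person locking those choices in up front.
    search = GridSearchCV(
        pipeline,
        param_grid={"select__k": [5, 10, 20], "model__C": [0.1, 1.0, 10.0]},
        cv=5,
    )
    # search.fit(X_train, y_train)  # rerun on fresh data to rebuild the model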

Note that it might be possible to take the machine learning approach by connecting separate specialist AI modules.  But this is where connection costs become an issue, because machine learning runs an enormous number of tests.  This would create a very high cumulative cost of moving data between specialist modules.  An integrated system will have fewer internal connections, keeping the coordination costs to a minimum.

I may have wandered a bit here, so let me summarize the case against the Lego analogy:

  • It ignores integration costs.  You can snap together Lego blocks, but you can’t really snap together software modules.
  • It ignores product differences.  Lego blocks are interchangeable, but software modules are not.  Selecting the right software module requires significant time, effort, and expertise.
  • It prevents realignment of tasks, which blocks improvements, reduces agility, and increases collaboration costs.  This is especially important when AI is added to the picture, because expanded collaboration is a major potential benefit from AI technologies.

Lego blocks are great toys but they’re a poor model for system development.  It’s time to move on.
 

___________________________________________________________
* My logic is that collaboration cost is essentially the amount of time spent in meetings.  This is a product of the number of meetings and the number of people in each meeting.  If you assume each task adds one more meeting and one more team member, and each meeting lasts one hour, then a one-task project has one meeting with one person (one hour, talking to herself), a two-task project has two meetings with two people in each (four hours), a three-task project has three meetings with three people (nine hours), and so on.  When tasks are done sequentially, there is presumably a kick-off at the start of each task, where the previous team hands the work off to the new team: so each task adds one meeting of two people, or two hours, a linear increase.

There’s no equivalently easy way to estimate the value added by collaboration, but it must grow by some amount with each added task (i.e., linearly), and it’s likely that diminishing returns prevent it from increasing endlessly.  So linear growth is a reasonable, if possibly conservative, assumption.  It's more clear that cumulative value will grow when tasks are performed sequentially, since otherwise the tasks wouldn't be added.  Let's again assume the increase is linear.  Presumably the value grows faster with collaboration than sequential management, but if both are growing linearly, the difference will grow linearly as well.
 
Friday, May 05, 2023

Will ChatGPT Destroy Martech?

Like everyone else, I’ve been pondering what generative AI means for martech, marketing, and the world in general.  My crystal ball is no clearer than yours but I’ll share my thoughts anyway.

Let’s start by looking at how past technology changes have played out.  My template is the transition from steam to electric power in factories.  This happened in stages: first, the new technology was used in exactly the same way as the old technology (in factories, this meant powering the shafts and belts that previously were powered by waterwheels or steam engines).  Then, the devices were modified to make better use of the new technology’s capabilities (by attaching motors directly to machine tools).  Finally, the surrounding architecture was changed to take advantage of the new possibilities (freed from the need to connect to a central mechanical energy source, factories went from being small, vertical structures to large horizontal ones, which allowed greater scale and efficiency).  We should probably add one more stage, when the factories started to produce new products that were made possible by the technology, such as washing machines with electric motors.

During the earliest stages of the transition, attention focused on the new technology itself: factories had “chief electricians” and companies had “chief electricity officers”, whose main advantage was that they were among the few people who understood the new technology.  Those roles faded as the technology became more widely adopted.  The exact analogy today is the “prompt engineer” in AI, a role that will likely be even shorter-lived.

Most of the discussion I see today about generative AI is very much stuck in the first phase: vendors are furiously releasing tools that replace this worker or that worker, or even promising a suite of tools that replace pretty much everyone in the marketing department.  (See, for example, this announcement from Zeta Global: https://zetaglobal.com/press-releases/zeta-introduces-generative-ai-agents-powered-by-zoe/.)  Much debate is devoted to whether AI will make workers in those jobs more productive (hurrah!) or replace them entirely (boo!).  I don’t find this particular topic terribly engaging since the answer is so obviously “both”: first the machines will help, and, as they gradually get better at helping, they will eventually take over.  Or, to put it in other terms: as humans become more productive, companies will need fewer of them to get their work done.  Either way, lots of marketers lose their jobs.

(I don’t buy the wishful alternative that the number of marketers will stay the same and they’ll produce vastly more, increasingly targeted materials.  The returns on the increasing personalization are surely diminishing, and it’s unrealistic to expect company managers to pass up an opportunity to reduce headcount.)

While the exact details of the near future are important – especially if your job is at stake – this discussion is still about the first stage of technology adoption, a one-for-one replacement of the old technology (human workers) with the new technology (AI workers).   The much more interesting question is what happens in the second and third stages, when the workplace is restructured to take full advantage of the new technology’s capabilities.

I believe the fundamental change will be to do away with the separate tasks that are now done by specialized individuals (copywriters, graphic designers, data analysts, campaign builders, etc.).  Those jobs have evolved because each requires complex skills that take full time study and practice to master.  The division of labor has seemed natural, if not inevitable, because it mirrors the specialization and linear flow of a factory production line – the archetype for industry organization for more than a century.  

But AI isn’t subject to the same constraints as humans.  There’s no reason a single AI cannot master all the tasks that require different human specialists.  And, critically, this change would bring a huge efficiency advantage because it would do away with the vast amount of time now spent coordinating the separate human workers, teams, and departments.  There would be other advantages in greater agility and easier data access.  Imagine that the AI can get what it needs by scanning the enterprise data lake, without the effort now needed to transform and load it into warehouses, CDPs, predictive modeling tools, and other systems.  Maintaining those systems takes another set of specialists whose jobs are likely to vanish, along with all the martech managers who spend their time connecting the different tools.

Of course, the vision of “citizen developers” using AI to create sophisticated personal applications on the fly is entirely irrelevant when the citizen developers themselves no longer have jobs.  The thousands of independent applications that make up today’s martech industry may vanish, unless the marketing AIs build and trade components among themselves – which could happen.

So far, I’ve predicted that monolithic AI systems rather than teams of (human or robotic) specialists will create marketing programs similar to today’s campaigns and interactions.  But that assumes there’s still a demand for today’s types of campaigns and interactions.  This brings us to the final type of change: in the outputs themselves.

Again, we may be in for a very fundamental transformation.  The output of a marketing department is ultimately determined by how people buy things.  It’s a safe bet that AI will change that dramatically, although we don’t know exactly how.   For the sake of argument, let’s assume that people adopt AI agents to manage more of their personal lives for them (pretty likely) and that they delegate most purchasing decisions to those agents (less certain but plausible, and again already happening to a limited degree).  If that happens, our AI marketing brains will be selling to other AIs, not to people.  To imagine what that looks like, we again have to move beyond expecting the AI to do what people do now, and look at the best way for an AI to achieve the same results.

If you think about marketing today – and for all the yesterdays that ever were – it’s based on the fundamental fact that humans have a limited amount of attention.  Every aspect of marketing is ultimately aimed at capturing that attention and feeding the most effective information to the human during the time available.

But AIs have unlimited attention.

If an AI wants to buy a product, it can look at every option on the market and collect all the information available about each one.  Capturing the AI’s attention isn’t an issue; presenting it with the right information is the challenge.  This means the goal of the marketing department is to ensure product information is available everyplace the AI might look, or maybe in just one place, if you can be sure the AI will look there.   Imagine the world as a giant market with an infinite number of sellers but also buyers who can instantly and simultaneously visit every seller and gather all the information they provide.  The classical economist’s fantasy – perfect market, perfect information, no friction – might finally come true.

And, also as the classical economists dream, the buyers will be entirely rational, not swayed by emotional appeals, brand identity, or personal loyalties.  (At least, we assume that AI buyers are rational and objective, although it won’t be easy to ensure that’s the case. That relates to trust, which is a topic for another day.)

If the role of marketing is to lay out virtual products on a virtual table in a virtual market stall, there’s no need for advertising: every buyer will pass by every stall and decide whether to engage.  With no need for advertising, there’s no need for targeting or personalization and no need for personal data to drive that targeting or personalization.  Privacy will be preserved simply because advertisers will no longer have any reason to violate it.

The key to business success in this world of omniscient, rational buyers is having a superior product, and, to a lesser extent, presenting product information in the most effective way possible.  There’s still some room for puffery and creativity in the presentation, although presumably mechanisms such as consumer reviews and independent research will keep marketers reasonably honest.  (Trust, again.)   There’s probably more room for creativity in developing the products themselves and constructing a superior experience that extends beyond the product to the full package including pricing, service, and support.

We can expect the AIs to play a major role in developing those new and optimal products and experiences, although I suspect the pro-human romantic in everyone reading this (except you, Q-2X7Y) hopes that people will still have something special to contribute.  But, wherever the products themselves come from, it will be up to the marketing AI to present them effectively to the AI shoppers.

(Side note: today’s programmatic ad buying marketplace comes fairly close to the model I’m proposing.  The obvious difference is the auction model, where buyers bid for a limited supply of ad impressions.  It’s conceivable that the consumer marketplace would also use an auction.  Again, just because most of today’s shopping is based on a fixed price model, we shouldn’t assume that model will continue in the future.  Come to think of it, an auction would probably be the best approach, since buyers could adjust their bids based on their current needs and preferences, and sellers could adjust their asking prices based on inventory and current demand.  In the traditional marketplace, this would be called haggling, or negotiating, and it's the way buying has been done for most of history.  With perfect information on both sides, the classical economists would be pleased yet again. It could be fruitful to explore other analogies with the programmatic marketplace when trying to predict how the AI-to-AI marketplace will play out.)

(You could also argue that Amazon, Expedia, and similar online marketplaces already offer a place for virtual sellers to offer their virtual wares to all comers.  Indeed they do, but the exact difference is that searching on Amazon requires a painfully inefficient use of human time.  If Amazon evolves a really good AI-based search method, and can convince users to share enough data to make the searches fully personalized, it could indeed become the basis for what I’m proposing.  The biggest barrier to this is more likely to be trust than technology.  It’s also worth noting that traditional marketing barely exists on those marketplaces.  The travel industry, where most marketing is centered on loyalty programs, may be an early indicator of where this leads.)

So what role, exactly, do humans play in this vision?  

As consumers, humans are no longer buyers.  Instead, they receive what’s purchased on their behalf.  So, their main role is to pick an AI and train it to understand their needs.  Of course, most of that training will happen without any direct human effort, as the AI watches what its owner does.  (I almost wrote “master”, but it’s not clear who’s really in charge.)  For people with disposable income, purchases are likely to move away from basic goods to luxury goods and experiences.  Those are inherently less susceptible to purely rational buying decisions, so there’s a good chance that conventional buying and attention-based marketing will still apply.

As workers, humans are in trouble.  Farming and manufacturing have been shrinking for decades and AI is likely to take over many of the remaining service jobs.  Some conventional jobs will remain to do research and supervise the AI-driven machines, and there may be more jobs where it matters that the worker is human, such as sports and handcrafts, and where human interaction is part of the value, such as healthcare.  But total employment seems likely to decrease and income inequalities to grow.  It’s possible that wealthy nations will provide a guaranteed annual income to the under-employed.  But even if that happens, meaningful work will become harder to find.

I’ll admit this isn’t a terribly pleasant prospect.  The good news is, predictions are hard, so the odds are slim that I’m right.   I’m also aware that we’re at the peak of the hype cycle for ChatGPT and perhaps for AI in general.  Maybe what I’ve described above isn’t technically possible.  But, given how quickly AI and the underlying technologies evolve, I wouldn’t bet on technology bottlenecks blocking these changes indefinitely.  Quantum AI, anyone?

All that said, of the three major predictions, I’m most confident about the first.  It’s pretty likely that a monolithic marketing AI will emerge from the specialized AI bots that are being offered today.  The potential benefits are huge, the path from separate bots to an integrated system is an incremental progression, and some people are already moving in that direction.   (Pro tip: it’s easier to predict things that have already happened.)

The emergence of an AI-driven marketplace to replace conventional human buying is much less certain.  If it does happen, the delegation will emerge in stages.  The first will cover markets where the stakes are low and buying is boring.  Groceries are a likely example.  How quickly it spreads to other sectors will depend on how much time people have and how much intrinsic enjoyment they derive from the shopping itself.

The role of humans is least predictable of all.  Mass under-employment probably isn’t sustainable in the long run, although you could argue it’s already the reality in some parts of the world.  The range of possible long-term outcomes runs from delightful to horrific.  Where we end up will depend on many factors other than the development of AI.  The best we can do is try to understand developments as they happen and try to steer things in the best directions possible.


Thursday, August 04, 2022

Martech in the Apocalypse

I once read that the most accurate weather forecast is that tomorrow will be the same as today. That may or may not be true, but it doesn’t matter.  What’s really important is predicting when the weather will change. That’s what warns you to bring an umbrella for the afternoon when it’s sunny in the morning, or buy milk before a blizzard, or evacuate before a hurricane.

The marketing industry is about to face not one but several of those seemingly sudden brutal changes.

Privacy

Privacy is the change that gets the most industry attention.  Our long season of customer data raining freely from the heavens is being replaced – seemingly overnight – by a customer data drought. This is one change we can’t blame on global warming, although it is man-made.
  • It’s people’s concerns about privacy that prompt governments to pass laws like GDPR and CCPA, and their counterparts around the globe. 
  • It’s people’s concerns about being tracked that lead Chrome, Safari, Firefox, and every other major browser to block third-party cookies, although you all know that Google just deferred that on Chrome yet again.
  • It’s people’s concerns about controlling their data that lead Apple and Android to shut down unauthorized data sharing by smartphone apps. 
  • And it’s people’s concerns about identity theft and fraud that lead companies to impose ever-tighter controls on how they collect, use, and share customer data, even when it is legally permitted.

The immediate impact of the data drought falls on advertisers.  They now find it much harder to assemble audiences, target individuals programmatically, and connect media impressions with sales results.

Big Tech Under Siege

But there’s an arguably larger, secondary impact.  Privacy changes threaten the flow of third-party data into the surveillance marketing engines of Google and Facebook, which between them capture nearly 70% of global digital ad revenues and 50% – half – of all global advertising revenues.

And privacy is just one of the challenges facing Google, Facebook, and other Big Tech companies like Apple and Amazon.  Governments everywhere are reining in the almost unbridled power they’ve let those firms accumulate. Power over not just advertising, but news and commerce, and future technologies like cloud hosting and artificial intelligence. 

  • Europe recently enacted its Digital Services Act and Digital Markets Act, which are squarely targeted at reducing the power of Big Tech. 

  • The U.S. Congress may or may not pass new privacy and anti-trust laws, but anti-trust legal action is ramping up either way. 

  • China has had its own Big Tech clampdown under way for several years now, and Russia has recently followed suit.

What these changes mean is more work for marketers.  In recent years, they've been happy to shovel dollars into one end of the Big Tech black box and pipe revenue out from the other, without worrying much about what happened in between.  In fact, Big Tech channels have already become less efficient and harder to use, and their volumes are falling.  The Duopoly’s share of the US digital ad market actually peaked in 2017, if we believe eMarketer. 

This means marketers have to look at new channels, which indeed are flourishing. Retail media, connected TV, podcasts, in-game ads, in-app ads, social commerce, even out-of-home media and humble direct mail, are all gaining attention as marketers are forced to scramble for alternatives to walled gardens that they weren’t really all that eager to escape.

But there’s more.

Pandemic

You may have heard of this little thing called the pandemic?

It certainly accelerated the growth of digital media, contributing to marketers’ search for outlets to supplement Google and Facebook. But the larger impact was a growth in digital commerce, as consumers were forced to buy more online than they had ever intended. This, in turn, accelerated companies’ interest in digital transformations of all sorts: in ecommerce, in hybrid retail like curbside pickup of online purchases, and in remote work for employees.

The result was a crisis-driven acceleration in the pace of change for corporate systems and processes, and dare I even say culture, as companies were forced to adapt to ever-changing customer expectations, working conditions, and supply chain bottlenecks. For truly deep thinkers, I’ll throw in a power shift towards junior employees, who were more attuned than their doddering elders to digital technologies and less committed to old school, top-down management styles.

Fortunately for doddering elders like myself, the kids were trapped at home with us, so they could explain how to use that damned remote.

These changes have very specific technical implications.

  • Adoption of no code and low code systems, which make it easier for business users to make changes without help from corporate IT teams.

  • Broader access to corporate data to feed the low/no code systems, and to provide feedback about how the new processes were working out.

  • Growth of the Customer Data Platform industry, as companies quickly realized that a foundation of clean, accessible customer data was essential to reach many of their new business goals.

Everything Else

The pandemic has faded into the background while our attention shifts to new crises: war, political unrest, inflation, recession, and the unignorable symptoms of accelerating climate change. It feels a bit silly to do this, but let’s look only at how these affect marketing.

  • From a tech vendor perspective, uncertainties make it harder to raise new capital or increase prices, and we’ve begun to see some scattered industry layoffs as a result.

  • From a tech buyer perspective, company belts are being tightened. While we haven’t yet seen a slowdown in martech revenues, we do know that buyers are more cautious than before. 

  • Consumers are feeling the pinch of inflation and higher interest rates today, and worry about a recession tomorrow.

It’s an environment where marketing plans are less a roadmap than a deck of contingency cards, any one of which might be played depending on how things develop.

Welcome to the Future

So welcome to the apocalypse: a chaotic present on an unstable path to an uncertain future. What’s a marketer to do?

The best advice anybody can give comes from The Hitchhiker’s Guide to the Galaxy: Don’t Panic. 

The second-best advice is to recognize that change right now is both inevitable and unpredictable. So rather than picking one most likely future, preparing for it, and hoping you’re right, your best bet is to be agile, so you can adapt effectively to whatever the future may be.

This is more than just a platitude. We don’t know which channels will dominate as the roles of Facebook and Google continue to diminish. But we do know that there will be a lot of channels, certainly in the immediate future, as media fragmentation grows.

Fragmented Media

There are specific things you can do to deal with that fragmentation.

  • Build more adaptable customer data systems. This means systems that can easily connect to any new channel, both to collect data from the channel as customers interact and to feed data to the channel to support personalized interactions. Easy connection implies open APIs, schema-free data storage, and low- and no-code interfaces.

  • Build scalable systems for higher volume and variety of data. Schema-free data stores are part of the solution, but so are cloud platforms that allow effortless, affordable growth on demand.

  • Even more important, rely on artificial intelligence to make sense of all this new, ever-changing data.  AI removes, or at least eases, the critical bottleneck of using human experts to map each new data source. Remember you might be able to capture data in a raw form without mapping it to a schema, but you still need to identify the data elements before you can use them in a customer profile.  
And AI can go beyond just identifying a data type, to extracting meaning from unstructured or semi-structured contents. This might be understanding relations among entities mentioned in call notes, or classifying the sentiment expressed in a social media post.  This sort of tagging is critical to extracting value from unstructured and semi-structured data.

Doing those things automatically, at scale, and with a minimum of set-up for new data feeds and types, is critical to adapting to change as it happens.

  • Find ways to create more content. You’ll almost certainly need a greater variety of messages, as you communicate to customers in a greater variety of circumstances – which is what happens when you have more change.  You’ll also need to distribute each message in more channels, because everything no longer goes to just Facebook and Google.
Our friend artificial intelligence will be critical here as well. It might actually create messages, although so far those capabilities are limited. But it will definitely be able to convert messages from one format to another, either entirely unsupervised or as a productivity multiplier for skilled humans.
  • Finally, you’ll need to do a better job of measurement. That’s always been a challenge for marketers, although third-party cookies and convenient if self-serving attribution reports from the walled garden vendors made it easier in recent years. But those cookies and those attribution reports are exactly what we’re losing as we enter the new era.  So we’ll have to work harder to find measurement methods that apply to different situations: some where we can identify each customer from start to finish, others where each audience is entirely anonymous, and everything in between.

Anonymous measurement is not a new problem. Some of the old-school methods will still apply, like test/control experiments and media mix models. You’ll also have new options, like data clean rooms and probabilistic models, that old-school marketers couldn’t imagine. The key here is to find techniques that apply to many different channels, because the number of channels will continue to increase.

More First Party Data

The second thing we know for certain is that in the future your own data – first-party data – will become more important.

In part, that’s a natural consequence of the loss of third-party data: you make the best use of what’s available.  When you can no longer just buy and test different prospect lists to see which work, you build look-alike models based on your existing customers and have media partners use those models to select your best prospects from their own lists.  Or you use a data clean room to match your customers with their customers, which is a more effective way of doing the same thing.

But there’s more to it than that. Consumers are increasingly wary of sharing their data with strangers, but your customers don’t see you as a stranger. They see you as someone they’ve chosen to do business with, in part because they think they can trust you to handle their data properly, and to use that data in their interest.

Let me be clear: this isn’t optional. Customers know you have their data, and they expect that you’ll use it to help them.

Now, your customers’ definition of ‘help’ isn’t to send them personalized advertisements: in fact, survey after survey shows that most customers say they don’t want any advertisements at all.  

What they do want is better service, which in their minds means greater convenience for things like placing an order, receiving a delivery, or making a return. The good news is they also want marketing that looks like good service, such as suggesting a useful product upgrade or more suitable pricing plan. If you do that kind of marketing really well, they’ll rave about how great it is to be your customer.

Let me make my point more directly: the only way you can give really terrific service is to have really terrific customer data. That data is what lets you understand and anticipate what each customer wants and how they want it delivered.

You’ll also need powerful analytics to make sense of that data, which brings us back once more to our friend artificial intelligence.

The Story So Far

Let’s stop here for a quick recap.

  • Change is accelerating. And while we can’t predict the specifics, we do know that channels will fragment, and personal data will be harder to get. 
  • Agility is the key to dealing with both those changes. Agility allows you to exploit whichever channels turn out to be important. And agility lets you take full advantage of whatever data you’re able to collect.

Building for Agility

The final question, then, is how do you build for agility?    And that’s a paradox, right? How can you build a solid, stable platform that allows you to be flexible?

Well, it’s not really a paradox.  Imagine you want to build a ballet theater.

The stage of that theater will feature the world’s most agile, flexible, dynamic ballerinas, appearing in a wide variety of shows. In fact, that same stage may also host operas, orchestras, musical theater, and maybe even a circus or two. The theater has to be flexible and adaptable to accommodate these diverse uses, but also strong and solid enough to support leaping ballerinas and trudging elephants without collapsing under the strain.

It’s the same for your marketing stack. The customer-facing systems in your stack are ballerinas: talented, hard-working, and exciting, they’re the stars of the show. 

But they’re also replaceable. Dancers come and go from season to season and even one performance to the next. Lose any single dancer, and the show will go on without missing a performance. Lose the entire corps de ballet, and at worst you’re delayed a few weeks as you rehearse their replacements.

But if the theater burns down, you’re out of business for a very long time.

As you’ve no doubt figured out by now, your Customer Data Platform is that theater. It’s the repository of accumulated knowledge and resources that enable your business applications to adapt and thrive in the face of change.

We’ve already listed technical features that make this possible:

  • easy connectivity to new sources and destinations
  • flexible, scalable data storage
  • efficient content creation and conversion from one channel to another
  • alternative measurement techniques for different situations
  • privacy-supporting technologies such as data clean rooms and policy enforcement, and
  • enabling technologies like artificial intelligence, composable architectures, and no code/low code tools

The key point here is you need a clear separation between your stable, central customer data system and the applications which depend on that system.  That separation is what makes it relatively easy to add or change applications without disrupting other parts of the operation. Violating the separation is dangerous because any application that also stores centralized data cannot easily be replaced.

The ballet analogy for that is a principal dancer who’s also the director and choreographer.  Losing that person shuts down the operation, or at least requires a long period of retooling while you recruit replacements for both jobs. 

There may be times where you want to take this risk, because the one person, or one system, is so good at both roles that you build your organization around it. But you should at least recognize what you’re doing is dangerous, and be sure it’s really worth taking the chance. You’ll also want to take out the technology equivalent of key person insurance, doing whatever you can to minimize your reliance on that dual-purpose system and to separate the platform from application functions.

I should also stress that separating the platform from applications doesn’t free you to use any applications you want. You still need applications that integrate well with your platform, and ideally will complement the strengths and weaknesses of the applications already in place.

We could say it’s like hiring ballerinas who fit the company style, but let’s give those poor ballerinas a rest.

Agility Beyond Technology

You should also recognize that technology is just one part of agility. People and process are at least as important. You have to support the technology with training, governance, organizational alignment, leadership, recruiting, and reward systems that empower staff to respond effectively as new conditions arise.

A few guidelines include:

  • Decide on data, not intuition. That’s been the goal forever. But it’s much more important when intuitions are less reliable because they've developed under conditions that no longer exist. 
  • Experiment, don’t optimize. Optimization is based on fine-tuning parameters over time. This only works when the underlying conditions are stable. In a period of rapid change, it’s more important to uncover big new opportunities than to squeeze small improvements out of the old ones. 
  • Measure outcomes, not inputs. When conditions are stable, the relation between inputs and outcomes can be discovered.  This means you can measure one to predict the other. In unstable conditions, the old assumptions don’t apply. So you’ll have to put in the extra effort it takes to measure outcomes directly. 
  • Manage customers, not departments. What I really mean here is to manage the customer experience across all departments. Customers don’t know or care which department controls which part of their experience. They only judge the company on their experience as a whole.  Departments working in isolation from each other cannot see the ultimate results of their actions. The only way to deliver good outcomes is to cooperate and manage from the customer’s view.

Let me focus on that last point for a moment, because it’s especially important.

Agility cannot be just random leaping about. It needs a purpose, so you know which direction to leap. And in business, the purpose of agility is to deliver great customer experience.

Summary

I hope the core message is clear:

  • In uncertain times, agility is the key to success. 
  • Agility is more than being flexible. Quick, effective response to change requires a strong, stable base of technical and human resources – including a solid Customer Data Platform.
  • The purpose of agility is to meet the one goal that never changes: delivering a great customer experience.

Thursday, December 30, 2021

Game of Thrones Meets Big Bang Theory: Welcome to CDP Industry's Next Phase

The CDP Institute just published its latest Industry Update, our semi-annual overview of CDP vendors with data on employment, funding, locations, and more. (Download here.)  There were three pieces of information that stood out:

  • Only four new vendors were added, compared with an average of fifteen in past reports.

  • Four companies reported funding rounds over $100 million, compared with one round that size across all past reports.

  • Nearly all employment growth (85%) came from previously listed vendors, compared with just 36% in past reports.


Of course, it makes sense that most growth would come from existing vendors if we added few new ones. But industry growth overall was in line with past trends, and actually a bit stronger: up 12% over the past six months. This meant that the growth rate of existing vendors was high enough to compensate for the “loss” of new vendors. In fact, the existing vendor growth of 11% was the highest since 2018, when the industry was just taking off.
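
As a sanity check on that arithmetic: if previously listed vendors supplied 85% of the growth and made up essentially all of the starting employment base, their own growth rate should be roughly 0.85 × 12% ≈ 10%, which lands close to the 11% figure (the small gap comes from rounding and from the exact base used). The snippet below simply walks through that calculation with an illustrative, hypothetical headcount; the report’s actual employment counts are not reproduced here.

    # Back-of-the-envelope check of the growth arithmetic.
    base_employment = 10_000       # hypothetical industry headcount at the start of the period
    overall_growth_rate = 0.12     # 12% growth over six months (from the report)
    share_from_existing = 0.85     # 85% of that growth came from previously listed vendors

    total_new_jobs = base_employment * overall_growth_rate         # 1,200
    jobs_from_existing = total_new_jobs * share_from_existing      # 1,020
    existing_vendor_growth = jobs_from_existing / base_employment  # ~0.10

    print(f"Existing-vendor growth: {existing_vendor_growth:.0%}")  # ~10%, near the 11% cited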

Connecting these dots reveals a clear picture: an industry that has stopped attracting new entrants but is now growing strongly on its own – with leading vendors stockpiling funds to compete against each other in an elimination round where only a few can emerge as winners. Think Survivor meets Game of Thrones with a dash of Big Bang Theory.

It’s a picture that makes a lot of sense. Customer Data Platforms are now widely accepted as an essential component of a modern data architecture, so it’s a market worth fighting for. But the situation facing potential entrants is daunting:

  • The leading independent CDP vendors now have mature products, big customer bases, high brand recognition, and lots of funding.

  • Enterprise software companies, including Salesforce, Adobe, Oracle, Microsoft, and SAP, are chipping away at the market by selling CDPs as part of their packages.

  • Marketing automation, customer support, ecommerce, and other vendors increasingly offer CDP modules baked into their own systems.

  • IT teams show growing interest in building their own CDP equivalent, supported by a growing array of components that make the job easier.

Some mid-tier CDP vendors have already given up the fight and been acquired, most often by firms needing a CDP to anchor a multi-channel customer experience suite. The acquisition wave may have peaked, since there were just three acquisitions in the latest report, compared with a dozen over the previous two. 

Among the remaining firms, some may compete successfully as generalists.  But the more promising path in most cases will be to offer specialized products that can be the best in a particular niche. Those niches might be defined by a particular industry, region, company size, or CDP function.

The functional niches are most intriguing because they serve the growing market for CDP components. We’ve seen some movement in that direction, as vendors offer parts of their CDP as stand-alone modules for identity resolution, data collection, data distribution (“reverse ETL”), and campaign management. Those vendors see their modules as a point of entry into clients who will later buy more of their products. They may be right, but I wonder how many companies that buy best-of-breed components will reverse course and favor components from a single source. What’s certain is that this approach exposes the CDP vendors to competition from point-solution specialists in each area while discarding the advantage that comes from purchasing a CDP with a full range of pre-integrated functions. Here’s a sampling of that competitive landscape:

I also suspect that companies like Snowflake, Amazon Web Services, and Google Cloud, which now position themselves as providing one piece of a “composable” solution, will eventually add features that match what the independent component providers now offer. Actually, that’s already happening, so I don’t get much credit for predicting it. It’s a dynamic we’ve seen repeatedly in other markets: primary vendors expand their features to secure their position with clients by adding more value (hooray!) and increasing the cost of switching (boo!).

Let’s be clear: both the added value and the switching costs come from the cost of integration. No matter how many promises are made about easy integration, the fact remains that any non-trivial connection between two systems takes skill to plan, deploy, and maintain. Integration is often the top-ranked vendor selection criterion in surveys, which some see as showing that the problem is well understood. I draw the opposite conclusion: people list integration as a consideration because they know it’s poorly understood. This forces them to invest time in trying to avoid integration problems, and even to sacrifice other benefits in order to do so. If integration were really easy, no one would worry about it.

Right now, someone reading this is saying, “Ah, but no-code changes everything.” I don’t think so. No-code works best when automating simple processes with a few users, where flaws are acceptable. The more complex, widely deployed, and mission-critical a process is, the more important it is to apply professional-grade design and quality control. Prebuilt components don’t change this: configuring those components and connecting them to each other still takes great care and understanding.

This isn’t (just) a cranky-old-man digression. CDP functions rank high in complexity, scale, and risk, so they are poor candidates for no-code development. CDPs certainly can have no-code interfaces that empower business users to do things that might otherwise require a developer. But those interfaces will control carefully defined and constrained tasks, not create core functionality. Assembling CDP-equivalent systems from composable functions is a different matter, and, yes, that should become increasingly possible for people with the right integration skills. What I doubt is that selling modules with those functions will be good business for CDP vendors: they are likely to commoditize their products and ultimately to be pushed aside by platform developers who integrate key functions directly.

I’m not saying that CDP vendors who can’t raise several hundred million dollars are doomed. I am saying that most will have to pick a niche to succeed. One promising option is building customer data profiles, especially for big enterprises. It’s a single function that incorporates enough separate components for CDP vendors to provide value by sparing buyers the cost of integrating them.

The other big niche, or set of niches, is integrated customer experience solutions. Our latest report already shows that campaign and delivery CDPs, our name for systems that do this, account for two-thirds of the industry’s vendor count and funding, and nearly three-quarters of employment. Their actual share may be greater still: immediately after completing the latest report, I happened to glance at G2 Crowd’s list of CDPs and found that our report doesn’t include several large, fast-growing retail marketing automation and messaging specialists (Insider, Listrak, SALESmanago, Klaviyo, and Ometria) that offer what looks like CDP-grade multi-source profile building.

Whether those vendors are true CDPs depends on whether they make those profiles available to other systems. Either way, the point is that there’s a large and growing market for cross-channel retail marketing systems with unified customer profiles at their core. There are similar markets outside of retail, where we already see specialist campaign and delivery CDPs in hospitality, financial services, telecommunications, healthcare, education, and elsewhere.

The value of industry-specific systems is, once again, reduced integration cost. In this case, the key integration is with industry-specific operational systems such as airline reservations, core banking, phone billing, health records, and student management. Vendors in these niches compete primarily on the marketing functions they offer, which makes them more departmental than enterprise solutions and pushes them to add marketing features tailored to their particular industry. Systems like this are hard to dislodge once they’re deployed because they are populated with many complex, difficult-to-replicate campaigns, reports, predictive models, and content libraries. This stickiness is what enables many successful vendors to co-exist in each niche, and what makes it hard for non-specialists to enter.

The division of the CDP industry into enterprise-level data CDPs and industry-specific, department-level campaign and delivery CDPs is not a new trend. What is new is the maturity of the competitors within many niches, which will make it increasingly difficult for new entrants to succeed. What’s also new is that building CDP-style customer profiles is increasingly common, making it a standard feature rather than a product differentiator. This encourages vendors to position themselves as something other than a CDP, even though they need to show buyers that their CDP features are first-rate.

My final conclusion is this: the CDP industry will continue to grow, and it will remain important for buyers to find the right CDP, even as the CDP itself slips from the spotlight.

Sunday, November 21, 2021