Sunday, September 13, 2020

Software Review: Osano Manages Cookie Consent and Access Requests

The next stop on our privacy software tour is Osano, which bills itself as “the only privacy platform you’ll ever need”. That's a bit of an overstatement: Osano is largely limited to data subject interactions, which is only one of the four primary privacy system functions I defined in my first post on this topic. (The other three are: discovering personal data in company systems, defining policies for data use, and enforcing those policies.) But Osano handles the interactions quite well and adds several other functions that are unique. So it’s certainly worth knowing.

The two main types of data subject interactions are consent management and data subject access requests (DSARs). Osano offers structured, forms-based solutions to both of these, available in a Software-as-a-Service (SaaS) model that lets users deploy them on Web sites with a single line of JavaScript or on Android and iOS mobile apps with an SDK.

The consent management solution provides a prebuilt interface that automatically adapts its dialog to local laws, using geolocation to determine the site visitor's location. There are versions for 40+ countries and 30+ languages, which Osano updates as local laws change. Because it is delivered as a SaaS platform, the changes made by Osano are automatically applied to its clients. This is a major time-saver for organizations that would otherwise need their own resources to monitor local laws and update their systems to conform to changes.
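A rough sketch of how such geography-driven adaptation might work under the hood. The rule entries and field names here are invented for illustration; they are not Osano's actual configuration:

```python
# Hypothetical sketch: choose consent-dialog behavior from the visitor's
# detected region. Rule entries are illustrative, not Osano's actual data.
CONSENT_RULES = {
    "DE":    {"regime": "GDPR", "prior_consent": True,  "granular_choices": True},
    "US-CA": {"regime": "CCPA", "prior_consent": False, "granular_choices": True},
    "default": {"regime": "none", "prior_consent": False, "granular_choices": False},
}

def banner_config(region_code: str) -> dict:
    """Return the consent-dialog settings for a detected region."""
    return CONSENT_RULES.get(region_code, CONSENT_RULES["default"])
```

The value of the SaaS model is that the vendor, not the client, keeps a table like this current as laws change.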

Details will vary, but Osano generally lets Web visitors consent to or reject different cookie uses including essential, analytics, marketing, and personalization. Where required by laws like the California Consumer Privacy Act (CCPA), it will also collect permission for data sharing. Osano stores these consents in a blockchain, which prevents anyone from tampering with them and provides legally-acceptable proof that consent was obtained. Osano retains only a hashed version of the visitor’s personal identifiers, thus avoiding the risk of a PII leak while still enabling users to search for consent on a known individual.
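To illustrate the hashed-identifier approach, here is a minimal Python sketch (not Osano's implementation): only a one-way hash of the identifier is stored, yet a search for a known individual still works because the query is hashed the same way.

```python
import hashlib

def hash_identifier(email: str) -> str:
    # Normalize, then hash, so the stored record contains no raw PII.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

consent_log = {}  # hashed identifier -> list of consent events

def record_consent(email: str, categories: dict) -> None:
    consent_log.setdefault(hash_identifier(email), []).append(categories)

def find_consent(email: str) -> list:
    # Searching for a known individual re-hashes their identifier.
    return consent_log.get(hash_identifier(email), [])
```

A leak of `consent_log` would expose which cookie categories were accepted, but not who accepted them.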

Osano’s use of blockchain to store consent records is unusual. Also unusual: Osano will search its client’s Website to check for first- and third-party cookies and scripts. The system will tentatively categorize these, let users confirm or change the classifications, and then let site visitors decide which cookies and scripts to allow or block. There’s an option to show visitors details about each cookie or script.

Osano also provides customer-facing forms to accept Data Subject Access Requests. The system backs these with an inventory of customer data, built by users who manually define systems, data elements, and system owners. Put another way: there’s no automated data discovery. The DSAR form collects the user’s information and then sends an authentication email to confirm they are who they claim.  Once the request is accepted, Osano sends notices to the owners of the related systems, specifying the data elements included and the action requested (review, change, delete, redact), and tracks the owners’ reports on completion of the required action. Osano doesn’t collect the data itself or make any changes in the source systems.

The one place where Osano does connect directly with source systems is through an API that tracks sharing of personal data with outside entities. This requires system users to embed an API call within each application or workflow that shares such data: again, there’s no automated discovery of such flows. Osano receives notification of data sharing as it happens, encrypts the personal identifiers, and stores them in a blockchain along with event details. Users can search the blockchain for the encrypted identifiers to build a history of when each customer’s data was shared.
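A hash chain is one simple way to picture this kind of append-only, tamper-evident event store. The Python sketch below is illustrative only and far simpler than a real blockchain (no distribution, no consensus); the field names are invented:

```python
import hashlib, json

chain = []  # each block links to the previous block's hash

def append_sharing_event(hashed_id: str, recipient: str, timestamp: str) -> dict:
    # Chain each event to its predecessor so tampering is detectable:
    # altering any earlier block changes its hash and breaks every link after it.
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = {"id": hashed_id, "recipient": recipient, "ts": timestamp, "prev": prev_hash}
    block = {**payload,
             "hash": hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()}
    chain.append(block)
    return block

def history_for(hashed_id: str) -> list:
    # Build a sharing history by scanning for the hashed identifier.
    return [b for b in chain if b["id"] == hashed_id]
```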

Perhaps the most unusual feature of Osano is the company’s database of privacy policies and related information for more than 11,000 companies. Osano gathers this data from public Web sites and has privacy attorneys review the contents and score each company on 163 data points. This lets Osano rate firms based on the quality of their privacy processes. It runs Web spiders that continuously check for changes and will adjust privacy ratings when appropriate. Osano also keeps watch on other information, such as data breach reports and lawsuits, which might also affect ratings. This lets Osano alert its clients if they are sharing data with a risky partner.

Osano is offered in a variety of configurations, ranging from free (cookie blocking only) to $199/month (cookie blocking and consent management for up to 50,000 monthly unique Web site visitors) to enterprise (all features, negotiated prices). The company was started in 2018 and says its free version is installed on more than 750,000 Web sites.

Sunday, September 06, 2020

When CDPs Fail: Insights from the CDP Institute Survey

We released a new member survey last week at the CDP Institute. You can (and should) download the full report, so I won’t go through all the details. You can also view a discussion of this on Scott Brinker's Chief Martech Show.  But here are three major findings. 

Martech Best Practices Matter 

We identified the top 20% of respondents as leaders, based on outcomes including overall martech satisfaction, customer data unification, advanced privacy practices, and CDP deployment. We then compared martech practices of leaders vs. others. This is a slightly different approach from our previous surveys but the result was the same: the most successful companies deploy structured management methods, put a dedicated martech team within marketing, and select their systems based on features and integration, not cost or familiarity. No surprise, but still good to reaffirm.




Martech Architectures are More Unified 

For years, our own and other surveys showed a frustratingly static 15%-20% of companies reporting access to unified customer data. This report finally showed a substantial increase, to 26% or 52% depending on whether you think feeding data into a marketing automation or CRM system qualifies as true unification. (Lots of data in the survey suggests not, incidentally.)


 

CDPs Are Making Good Progress 

The survey showed a sharp growth in CDP deployment, up from 19% in 2017 to 29% in 2020. Bear in mind that we’re surveying members of the CDP Institute, so this is not a representative industry sample. But it’s progress nevertheless. 


Where things got really interesting was a closer look at the relationship of customer data architectures to CDP status. You might think that pretty much everyone with a deployed CDP would have a unified customer database – after all, that’s the basic definition of a CDP and the numbers from the two questions are very close. But it turns out that just 43% of the respondents who said they had a deployed CDP also said they had a unified database (15% with the database alone and 28% with a database and shared orchestration engine). What’s going on here? 


 

The obvious answer is that people don’t understand what a CDP really is. Certainly we’ve heard that complaint many times. But these are CDP Institute members – a group that we know are generally smarter and better looking and, more to the point, should understand CDP accurately even if no one else does. Sure enough, when we look at the capabilities that people with a deployed CDP say they expect from a CDP, the rankings are virtually identical whether or not they report they have a unified database. 

(Do you like this chart format? It’s designed to highlight the differences in answers between the two groups while still showing the relative popularity of each item. It took many hours to get it to this stage. To clarify, the first number on each bar shows the percentage for the group that selected the answer less often and the second number shows the group that selected it more often. So, on the first bar above, 73% of people with a unified customer database said they felt a CDP should collect data from all sources and 76% of those without a unified database said the same. The color of the values and of the bar tip shows which group chose the item more often: green means it was more common among people with a unified database and red means it was more common among people without a unified database. Apologies if you’re colorblind.) 

Answers regarding CDP benefits were also pretty similar, although there begins to be an interesting divergence: respondents without a unified database were more likely to cite advanced applications including orchestration, message selection, and predictive models. Some CDPs offer those and some don’t, and it’s fair to think that people who prioritized them might consider their CDP deployment complete even if they haven’t unified all their data. 


But the differences in the benefits are still pretty minor. Where things really get interesting is when we look at obstacles to customer data use (not to CDP in particular). Here, there’s a huge divergence: people without a unified database were almost twice as likely to cite challenges assembling unified data and using that data. 


Combining this with previous answers, I read the results this way: people who say they have a deployed CDP but not a unified database know quite well that a CDP is supposed to create a unified database. They just haven’t been able to make that happen. 

This of course raises the question of Why? We see from the obstacle chart that the people without unified data are substantially more likely to cite IT resources as an issue, with smaller differences in senior management support and data extraction. It’s intriguing that they are actually less likely to cite organizational issues, marketing staff time, or budget. 

Going back to our martech practices, we also see that those without a unified database are more likely to employ “worst practices” of using outside consultants to compensate for internal weaknesses and letting each group within marketing select its own technology. They’re less likely to have a Center of Excellence, use agile techniques, or follow a long-term martech selection plan. (If the sequencing of this chart looks a bit odd, it's because the items are arranged in order of total frequency, including respondents without a deployed CDP. The fact that items at the bottom of the chart have relatively high values shows that deployed CDP owners selected those items substantially more often than people without a CDP.)

 

So, whatever the problems with their IT staff, it seems at least some of their problems reflect martech management weaknesses as well. 

But There's More...

The survey report includes two other analyses that touch on this same theme of management maturity as a driver of success. The first focuses on cross-channel orchestration as a marker of CDP understanding.  It turns out that the closer people get to actually deploying a CDP, the less they see orchestration as a benefit. My interpretation is that orchestration is an appealing goal but, as people learn more about CDP, they realize a CDP alone can't deliver it.  They then give higher priority to less demanding benefits.   (To be clear: some CDPs do orchestration but there are other technical and organizational issues that must also be resolved.)  


We see a similar evolution in understanding of obstacles to customer data use. These also change across the CDP journey: organizational issues including management support, budget, and cooperation are most prominent at the start of the process. Once companies start deployment, technical challenges rise to the top.  Finally, after the CDP is deployed, the biggest problem is lack of marketing staff resources to take advantage of it. You may not be able to avoid this pattern, but it’s good to know what to expect. 


The other analysis looks at CDP results. In the current survey, 83% of respondents with a deployed CDP said it was delivering significant value while 17% said it was not. The dissatisfied share has been stable: it was 16% in our 2017 survey and 18% in 2019. 

I compared the satisfied vs dissatisfied CDP owners and found they generally agreed on capabilities and benefits, with orchestration again popping out as an exception: 65% of dissatisfied CDP owners cited it as a CDP benefit compared with just 45% of the satisfied owners. By contrast, satisfied owners were more likely to cite the less demanding goals of improved segmentation, predictive modeling, and data management efficiency. Similarly, the satisfied CDP users were less likely to cite coordinated customer treatments as a CDP capability and more likely to cite data collection. (Data collection still topped the list for both groups, at 77% for the satisfied owners and 65% for the others.) 

When it came to obstacles, the dissatisfied owners were much more likely to cite IT and marketing staff limits and organizational cooperation. The divergence was even greater on measures of martech management, including selection, responsibility, and techniques. 


In short, the dissatisfied CDP owners were much less mature martech managers than their satisfied counterparts. As CDP adoption moves into the mainstream, it becomes even more important for managers to recognize that their success depends on more than the CDP technology itself. 

There’s more in the report, including information on privacy compliance, and breakouts by region, company size, and company type. Again, you can download it here for free.

Thursday, August 27, 2020

Software Review: BigID for Privacy Data Discovery

Until recently, most marketers were content to leave privacy compliance in the hands of data and legal teams. But laws like GDPR and CCPA now require increasingly prominent consent notifications and impose increasingly stringent limits on data use. This means marketers must become increasingly involved with privacy systems to ensure a positive customer experience, gain access to the data they need, and use that data appropriately. 

I feel your pain: it’s another chore for your already-full agenda.  But no one else can represent marketers’ perspectives as companies decide how to implement expanded privacy programs.  If you want to see what happens when marketers are not involved, just check out the customer-hostile consent notices and privacy policies on most Web sites.

To ease the burden a bit, I’m going to start reviewing privacy systems in this blog. The first step is to define a framework of the functions required for a privacy solution.   This gives a checklist of components so you know when you have a complete set. Of course, you’ll also need a more detailed checklist for each component so you can judge whether a particular system is adequate for the task. But let’s not get ahead of ourselves. 

At the highest level, the components of a privacy solution are:

  • Data discovery.  This is searching company systems to build a catalog of sensitive data, including the type and location of each item. Discovery borders on data governance, quality, and identity resolution, although these are generally outside the scope of a privacy system. Identity resolution is on the border because responding to data subject requests (see next section) requires assembling all data belonging to the same person. Some privacy systems include identity resolution to make this possible, but others rely on external systems to provide a personal ID to use as a link.

  • Data subject interactions.  These are interactions between the system and the people whose data it holds (“data subjects”).  The main interactions are to gather consent when the data is collected and to respond to subsequent “data subject access requests” (DSARs) to view, update, export, or delete their data. Consent collection and request processing are distinct processes.  But they are certainly related and both require customer interactions.  So it makes sense to consider them together. They are also where marketers are most likely to be directly involved in privacy programs.

  • Policy definition.  This specifies how each data type can be used.  There are often different rules based on location (usually where the data subject resides or is a citizen, but sometimes where the data is captured, where it’s stored, etc.), consent status, purpose, person or organization using the data, and other variables. Since regulations and company policies change frequently, this component includes processes to identify changes and either automatically adjust rules to reflect them or alert managers that adjustments may be needed.

  • Policy application.  This monitors how data is actually used to ensure it complies with policies, send alerts if something is not compliant, and keep records of what’s done. Marketers may be heavily involved here but more as system users than system managers. Policy application is often limited to assessing data requests that are executed in other systems but it sometimes includes actions such as generating lists for marketing campaigns. It also includes security functions related specifically to data privacy, such as rules for masking of sensitive data or practices to prevent and react to data breaches. Again, security features may be limited to checking that rules are followed or include running the processes themselves. Security features in the privacy system are likely to work with corporate security systems in at least some areas, such as user access management. If general security systems are adequate, there may be no need for separate privacy security features. 
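As a concrete illustration of policy application, here is a hypothetical sketch of checking a proposed data use against simple rules before it executes. The rule structure is invented for illustration; real policy engines are far richer (location, regulation, data sensitivity, and more):

```python
# Hypothetical policy rules: which item may be used for which purpose,
# and what consent (if any) that use requires.
POLICIES = [
    {"item": "email", "purpose": "marketing", "requires_consent": "marketing"},
    {"item": "email", "purpose": "service",   "requires_consent": None},
]

def is_compliant(item: str, purpose: str, consents: set) -> bool:
    """Check a proposed data use against the policy rules."""
    for rule in POLICIES:
        if rule["item"] == item and rule["purpose"] == purpose:
            need = rule["requires_consent"]
            return need is None or need in consents
    return False  # no matching rule: deny by default
```

A monitoring-only system would run a check like this and raise an alert on a `False` result rather than block the request itself.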

Bear in mind that one system need not provide all these functions.  Companies may prefer to stitch together several “best of breed” components or to find a privacy solution within a larger system. They might even use different privacy components from several larger systems, for example using a consent manager built into a Customer Data Platform and a data access manager built into a database’s core security functions. 

Whew.

Now that we have a framework, let's apply it to a specific product.  We'll start with BigID.

Data Discovery

BigID is a specialist in data discovery. The system applies a particularly robust set of automated tools to examine and classify all types of data – structured, semi-structured, and unstructured; cloud and on-premise; in any language. For identified items, it builds a list showing the application, object name, data type, server, geographic location, and other details. 

Of course, an item list is table stakes for data discovery.  BigID goes beyond this to organize the items into clusters related to particular purposes, such as medical claims, invoices, and employee information. It also draws maps of relations across data sources, such as how the transaction ID in one table connects to the transaction ID in another table (even if the field names are not the same). Other features highlight data sources holding sensitive information, alert users if these are not properly secured from unauthorized access, and calculate privacy risk scores. 

The relationship maps provide a foundation for identity resolution, since BigID can compare values across systems to find matches and use the results to stitch together related records. The system supports fuzzy as well as exact matches and can compare combinations of items (such as street, city, and zip) in one rule.  But the matching is done by reading data from source systems for one person at a time, usually in response to an access request. This means that BigID could assemble a profile of an individual customer but won’t create the persistent profiles you’d see in a Customer Data Platform or other type of customer database. It also can’t pull the data together quickly enough to support real-time Web site personalization, although it might be fast enough for a call center. 
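A rough sketch of fuzzy matching on a combination of items, using Python's standard-library `difflib` as a stand-in similarity measure (BigID's actual matching logic is proprietary, and the field names and threshold here are invented):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Simple string-similarity ratio in [0, 1]; real systems use
    # more sophisticated measures and per-field weighting.
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def records_match(rec_a: dict, rec_b: dict,
                  fields=("street", "city", "zip"), threshold=0.85) -> bool:
    # Compare a combination of fields as one composite string, so a minor
    # variation in any single field doesn't block an otherwise strong match.
    combo_a = " ".join(str(rec_a.get(f, "")) for f in fields)
    combo_b = " ".join(str(rec_b.get(f, "")) for f in fields)
    return similarity(combo_a, combo_b) >= threshold
```

Run on demand for one person at a time, as described above, this assembles a profile per request rather than maintaining a persistent matched database.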

In fact, BigID doesn’t store any data outside of the source systems except for metadata.  So there's no reason to confuse it with a data lake, data warehouse, CRM, or CDP.

Data Subject Interactions

BigID doesn’t offer interfaces to capture consent but does provide applications that let data subjects view, edit, and delete their data and update preferences. When a data access request is submitted, the system creates a case that is sent to other systems or people to execute. BigID provides a workflow to track the status of these cases but won’t directly change data in source systems. 

Policy Definition 

BigID doesn’t have an integrated policy management system that lets users define and enforce data privacy rules. But it does have several components to support the process:

  • “Agreements” let users document the consent terms and conditions associated with specific items. This does not extend to checking the status of consent for a particular individual but does create a way to check whether a consent-gathering option is available for an item.

  • “Business flows” map the movement of data through business processes such as reviewing a resume or onboarding a new customer. Users can document flows manually or let the system discover them in the data it collects during its scan of company systems. Users specify which items are used within a flow and the legal justification for using sensitive items. The system will compare this with the list of consent agreements and alert users if an item is not properly authorized. BigID will also alert process owners if a scan uncovers a sensitive new data item in a source system.  The owner can then indicate whether the business flow uses the new item and attach a justification. BigID also uses the business flows to create reports, required by some regulations, on how personal data is used and with whom it is shared. 

  • “Policies” let users define queries to find data in specified situations, such as EU citizen data stored outside the EU. The system runs these automatically each time it scans the company systems. Query results can create an alert or task for someone to investigate. Policies are not connected to agreements or business flows, although this may change in the future. 
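The EU-citizen example can be pictured as a simple query over the metadata catalog built during scans. This sketch invents the catalog structure for illustration:

```python
# Hypothetical metadata catalog rows produced by a system scan.
EU = {"DE", "FR", "IE", "NL"}  # abbreviated member list for illustration

catalog = [
    {"item": "email", "subject_region": "DE", "stored_in": "US"},
    {"item": "phone", "subject_region": "FR", "stored_in": "IE"},
]

def run_policy(catalog: list) -> list:
    """Flag EU data subjects' items stored outside the EU."""
    return [row for row in catalog
            if row["subject_region"] in EU and row["stored_in"] not in EU]
```

Each flagged row would become an alert or an investigation task, as described above.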

Policy Enforcement

BigID doesn’t directly control any data processing, so it can’t enforce privacy rules. But the alerts issued by the policy, agreement, and business flow components do help users to identify violations. Alerts can create tasks in workflow systems to ensure they are examined and resolved. The system also lets users define workflows to assess and manage a data breach should one occur. 

Technology 

As previously mentioned, BigID reads data from source systems without making its own copies or changing any data in those systems. Clients can run it in the cloud or on-premises. System functions are exposed via APIs which let the company, clients, or third parties build apps on top of the core product. In fact, the data subject access request and preference portal functions are among the applications that BigID created for itself. It recently launched an app marketplace to make its own and third party apps more easily available to its clients. 

Business 

BigID has raised $146 million in venture funding and reports nearly 200 employees. Pricing is based on the number of data sources: the company doesn’t release details but it’s not cheap. It also doesn’t release the number of clients but says the count is “substantial” and that most are large enterprises.

Tuesday, August 18, 2020

Data Security is a Problem Marketers Must Help Fix


Everything you need to know about 2020 is covered by the fact that “apocalypse bingo” is already an over-used cliché. So I doubt many marketers have found spare time to worry about data security – which most would consider someone else’s problem. But bear in mind that 92% of consumers say they would avoid a company after a data breach. So, like it or not, security is a marketer’s problem too. 

Unfortunately, the problem is a big one. I recently took a quick scan of research on the issue, prompted in particular by a headline that nearly half of companies release software they know contains security flaws.  Sounds irresponsible, don't you think?  The main culprit in that case is pressure to meet deadlines, compounded by poor training in security procedures. If there’s any good news, it’s that the most-used applications have fewer unresolved security flaws than average, suggesting that developers pay more attention when they know it’s most important. 

The research is not reassuring. It may be a self-fulfilling prophecy, but most security professionals see data breaches as inevitable. Indeed, many think a breach is good for their career, presumably because the experience makes them better at handling the next one. Let’s just be grateful they're not airline pilots. 

Still, the professionals have a point. Nearly every company reports a business-impacting cyberattack in the past twelve months. Even before COVID-19, fewer than half of IT experts were confident their organizations can stop data breaches with current resources.

The problems are legion. In addition to deadline pressures and poor training, researchers cite poorly vetted third-party code libraries (charmingly described as “shadow code”), compromised employee accounts, insecure cloud configurations, and attacks on Internet of Things devices.

Insecure work-from-home practices during the pandemic only add new risk. One bit of good news is that CIOs are spending more on security,  prioritizing access management and remote enablement. 

What’s a marketer to do?  One choice is to just shift your attention to something less stressful, like fire tornados and murder hornets. It’s been a tough year: I won’t judge. 

But you can also address the problem. System security in general is managed outside of most marketing departments. But marketers can still ensure their own teams are careful when handling customer data (see this handy list of tips from the CDP Institute). 

Marketers can also take a closer look at privacy compliance projects, which often require tighter controls on access to customer data. Here’s an overview of what that stack looks like. CDP Institute also has a growing library of papers on the topic.

Vendors like TrustArc, BigID, OneTrust, Privitar, and many others, offer packaged solutions to address these issues. So do many CDP vendors. Those solutions involve customer interactions, such as consent gathering and response to Data Subject Access Requests.  Marketers should help design those interactions, which are critical in convincing consumers to share personal data that marketers need for success. The policies and processes underlying those interfaces are even more important for delivering on the promises the interfaces make. 

In short, while privacy and security are not the same thing, any privacy solution includes a major security component. Marketers can play a major role in ensuring their company builds solid solutions for both. 

Or you can worry about locusts.

 

Saturday, July 25, 2020

Don't Misuse Proof of Concept in System Selection

Call me a cock-eyed optimist, but marketers may actually be getting better at buying software. Our research has long shown that the most satisfied buyers base their selection on features, not cost or ease of use. But feature lists alone are never enough: even if buyers had the knowledge and patience to precisely define their actual requirements, no set of checkboxes could capture the nuance of what it’s actually like to use a piece of software for a specific task. This is why experts like Tony Byrne at Real Story Group argue instead for defining key use cases (a.k.a. user stories) and having vendors demonstrate those. (If you really want to be trendy, you can call this a Clayton Christensen-style “job to be done”.)

In fact, use cases have become something of an obsession in their own right. This is partly because they are a way of getting concrete answers about the value of a system: when someone asks, “What’s the use case for system X”, they’re really asking, “How will I benefit from buying it?” That’s quite different from the classic definition of a use case as a series of steps to achieve a task. It’s this traditional definition that matters when you apply use cases to system selection, since you want the use case to specify the features to be demonstrated. You can download the CDP Institute’s use case template here.

But I suspect the real reason use cases have become so popular is that they offer a shortcut past the swamp of defining comprehensive system requirements. Buyers in general, and marketers in particular, lack the time and resources to create complete requirements lists based on their actual needs (although they're perfectly capable of copying huge, generic lists that apply to no one).  Many buyers are convinced it’s not necessary and perhaps not even possible to build meaningful requirements lists: they point to the old-school “waterfall” approach used in systems design, which routinely takes too long and produces unsatisfactory results. Instead, buyers correctly see use cases as part of an agile methodology that evolves a solution by solving a sequence of concrete, near-term objectives.

Of course, any agile expert will freely admit that chasing random enhancements is not enough.  There also needs to be an underlying framework to ensure the product can mature without extensive rework. The same applies to software selection: a collection of use cases will not necessarily test all the features you’ll ultimately need. There’s an unstated but, I think, implicit assumption that use cases are a type of sampling technique: that is, a system that meets the requirements of the selected use cases will also meet other, untested requirements.   It’s a dangerous assumption. (To be clear: a system that can’t support the selected use cases is proven inadequate. So sample use cases do provide a valuable screening function.)

Consciously or subconsciously, smart buyers know that sample use cases are not enough. This may be why I’ve recently noticed a sharp rise in the use of proof of concept (POC) tests. Those go beyond watching a demonstration of selected use cases to actually installing a trial version of a system and seeing how it runs. This is more work than use case demonstrations but gives much more complete information.

Proof of concept engagements used to be fairly rare. Only big companies could afford to run them because they cost quite a bit in both cash (most vendors required some payment) and staff time (to set up and evaluate the results). Even big companies would deploy POCs only to resolve specific uncertainties that couldn’t be settled without a live deployment.

The barriers to POCs have fallen dramatically with cloud systems and Software-as-a-Service. Today, buyers can often set up a test system with just a few mouse clicks (although it may take several days of preparation before those clicks will work). As a result, POCs are now so common that they can almost be considered a standard part of the buying process.

Like the broader application of use cases, having more POCs is generally a good thing. But, also like use cases, POCs can be applied incorrectly.

In particular, I’ve recently seen several situations where POCs were used as an alternative to basic information gathering. The most frightening was a company that told me they had selected half a dozen wildly different systems and were going to do a POC with each of them to figure out what kind of system they really needed.

The grimace they didn’t see when I heard this is why I keep my camera off during Zoom meetings. Even if the vendors do the POCs for free, this is still a major commitment of staff time that won’t actually answer the question. At best, they’ll learn about the scope of the different products. But that won’t tell them what scope is right for them.

Another company told me they ran five different POCs, taking more than six months to complete the process, only to discover later that they couldn't load the data sources they expected (but hadn't included in their POCs). Yet another company let their technical staff manage a POC and declare it successful, only to learn later that the system had been configured in a way that didn't meet actual user needs.

You’re probably noticing a dreary theme here: there’s no shortcut for defining your requirements. You’re right about that, and you’re also right that I’m not much fun at parties. As to POCs, they do have an important role but it’s the same one they played when they were harder to do: they resolve uncertainties that can’t be resolved any other way.

For Customer Data Platforms, the most common uncertainty is probably the ability to integrate different data sources.  Technical nuances and data quality are almost impossible to assess without actually trying to load each system.  Since these issues have more to do with the data source than the CDP, this type of POC is more about CDP feasibility in general than CDP system selection. That means you can probably defer your POC until you’ve narrowed your selection to one or two options – something that will reduce the total effort, encourage the vendor to learn more about your situation, and help you to learn about the system you’re most likely to use.

The situation may be different with other types of software. For example, you might want to test a wide variety of predictive modeling systems if the key uncertainty is how well their models will perform. That's closer to the classic multi-vendor "bake-off". But beware of such situations: the more products you test, the less likely your staff is to learn each product well.

With a predictive modeling tool, it’s obvious that user skill can have a major impact on results. With other tools, the impact of user training on outcomes may not be obvious. But users who are assessing system power or usability may still misjudge a product if they haven’t invested enough time in learning it.  Training wheels are good for beginners but get in the way of an expert. Remember that your users will soon be experts, so don’t judge a system by the quality of its training wheels.

This brings us back to my original claim. Are marketers really getting better at buying software? I'll stand by that and point to the broader use of tools like use cases and proofs of concept as evidence. But I'll repeat my caution that use cases and POCs must be used to develop and supplement requirements, not to replace them. Otherwise they become an alternate route to poor decisions rather than guideposts on the road to success.







Monday, April 27, 2020

Here's a Game about Building Your Martech Stack

TL;DR: you can play the game here.

I’ve recently been running workshops to help companies plan deployment of their Customer Data Platforms. Much of the discussion revolves around defining use cases and, in particular, deciding which to deliver first. This requires balancing the desire to include many data sources in the first release of the system against the desire to deliver value quickly. The challenge is to find an optimal deployment sequence that starts with the minimum number of sources needed for an important use case and then incrementally adds new sources that support new use cases. I’ve always found that an intriguing problem although I’ll admit few others have shared my fascination.

As coronavirus forces most marketers to work from home, I’ve also been pondering ways to deliver information that are more engaging than traditional Webinars and, ahem, blog posts. The explosion of interest in games in particular seems to offer an opportunity for creative solutions.

So it was fairly natural to conceive of a game that addresses the deployment sequence puzzle. The problem seems like a good candidate: it's governed by a few simple dynamics that become interestingly complex when they interact. The core dynamic is that one new data source may support multiple new use cases, while different combinations of sources support different use cases. This means you could calculate the impact of different sequences to compare their value.

Of course, some use cases are worth more than others and some sources cost more to integrate than others; you also have to consider the availability of the CDP itself, of central analytical and campaign systems, and of delivery systems that can use the outputs. But for game purposes, you could simplify matters by assuming that each system costs the same and each use case has the same value. This still leaves in place the core dynamic of balancing the cost of adding one system against the value of enabling multiple use cases with that system.

To make things even more interesting and realistic, you could add the fact that some use cases are possible with a few systems but become more valuable as new systems come online.  It might be that their data adds value – say, by making predictions more accurate – or because they enable delivery of messages in more channels.

In the end, then, you end up with a matrix that crosses a list of systems (data sources, CDPs, analytics, campaign management, and delivery systems) against a list of use cases. Each cell in the matrix indicates whether a particular system is essential or optional for a particular use case. Value for any given period would include: the one-time cost of adding a new system; the recurring cost of operating active systems; and the value generated by each active use case. That use case value would include a base value earned by running the use case plus incremental value from each optional system. Using red to indicate required systems and grey to indicate optional systems, the matrix looks like this:



The game play would then be to select systems one at a time, calculate the value generated as the period revenue, and then repeat until you run out of systems to add. Sometimes you'd select a system because it made a new use case possible, sometimes because it added optional value to already-active use cases, and sometimes to make more use cases possible in the future. Fun!
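To make the scoring mechanics concrete, here is a minimal sketch of the period-value calculation described above. All system names, costs, and values are hypothetical placeholders, not the figures from the actual game:

```python
# Illustrative sketch of the per-period scoring logic.
# System names, costs, and use-case values are made up for the example.

SYSTEM_COSTS = {"crm": 10, "web": 10, "cdp": 10}   # one-time cost to add each system
OPERATING_COST = 2                                  # recurring cost per active system

# Each use case lists its required systems, optional systems, a base value,
# and an incremental value earned per active optional system.
USE_CASES = [
    {"required": {"crm", "cdp"}, "optional": {"web"}, "base": 20, "per_optional": 5},
    {"required": {"web"},        "optional": {"crm"}, "base": 10, "per_optional": 3},
]

def period_value(active, newly_added):
    """Value for one period: use-case revenue minus one-time and recurring costs."""
    value = -sum(SYSTEM_COSTS[s] for s in newly_added)  # one-time cost of new systems
    value -= OPERATING_COST * len(active)               # recurring cost of active systems
    for uc in USE_CASES:
        if uc["required"] <= active:                    # use case runs only if all required systems are active
            value += uc["base"]
            value += uc["per_optional"] * len(uc["optional"] & active)
    return value

# Example turn: "crm" and "cdp" were already active, "web" is added this period.
print(period_value({"crm", "cdp", "web"}, {"web"}))  # → 22
```

The interesting tension is visible even in this toy version: adding "web" costs 10 up front but unlocks a new use case and boosts an existing one, so sequencing decisions compound across periods.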

I then showed this to a professional game designer, whose response was “you may have found the least fun form factor imaginable: the giant data-filled spreadsheet. I'm kind of impressed.”

Ouch, but he had a point. I personally found the game to be playable using a computer to do the calculations but others also found it impenetrable. A version using physical playing cards was clearly impossible.

So, after much pondering, I came up with a vastly simplified version that collapsed the 19 systems in the original model into three categories, and only required each use case to have a specified number of systems in each category. I did keep the distinction between required and optional systems, since that has a major impact on the effectiveness of different solutions. I also simplified the value calculations by removing system cost, since that would be the same across all solutions so long as you add one system per period.


The result was a much simpler matrix, with just six columns (required and optional counts for each of the three system types) and the same number of rows per use case (22 in the example). I built this into a spreadsheet that does the scoring calculations and stores results for each period, so the only decision players need to make in any turn is which of the three system types to select. Even my game designer grudgingly allowed that it “made sense pretty quickly” and was “kinda fun”. That’s about all the enthusiasm I can hope for, I suspect.
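The simplified scoring rule can be sketched in a few lines. This is my own illustration of the mechanic described above (each use case needs a minimum count of systems per category, with extra value for optional systems); the function name and all numbers are invented for the example:

```python
# Hypothetical sketch of the simplified model: three system categories, with each
# use case specifying required and optional system counts per category.
CATEGORIES = ("data", "analytics", "delivery")

def use_case_value(active_counts, required, optional, base, per_optional):
    """Score one use case given how many systems of each category are active."""
    if any(active_counts[c] < required[c] for c in CATEGORIES):
        return 0  # a required system is missing, so the use case earns nothing yet
    # Count surplus active systems, capped at each category's optional allowance.
    extra = sum(min(active_counts[c] - required[c], optional[c]) for c in CATEGORIES)
    return base + per_optional * extra

counts = {"data": 2, "analytics": 1, "delivery": 1}   # systems active so far
req    = {"data": 1, "analytics": 1, "delivery": 1}   # minimum needed to run
opt    = {"data": 2, "analytics": 0, "delivery": 1}   # extras that add value
print(use_case_value(counts, req, opt, base=10, per_optional=2))  # → 12
```

Summing this over all 22 use cases each turn gives the period score, which is simple enough to run in a spreadsheet but still rewards planning which category to add next.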

I’ve put a working version of this in a Google spreadsheet that you can access here.

Go ahead and give it a play – it just takes a few minutes to complete once you grasp how it works (put a ‘1’ in the column for each period to select the class of system to add during that period). Most of the spreadsheet is write-protected but there’s a leaderboard if you can beat my high score of 1,655.

Needless to say, I’m interested in feedback. You can reach me through LinkedIn here.
Although this started as a CDP planning exercise, it’s really a martech stack building game, something I think we can all agree the world desperately needs right now. I also have worked out a physical card game version that would have a number of additional features to make games more interesting and last longer. Who wants to play?







Thursday, April 02, 2020

A Dozen Market Research Studies on COVID-19 Business Impact

This sums it up.  From Bank of America via Twitter but I can't find a link to the original.
As marketers finish their initial emergency adjustments to coronavirus lockdowns, they are starting to think about longer-term plans. While the shape of things to come is impossible to guess, reporting on industry changes has become a marketing trend of its own. Here are a dozen-plus studies I’ve seen in the past week, most of which are on-going.

Retail Behavior Data

Adobe this week launched their Digital Economy Index, a long-term project that gained unexpected immediate relevance. The index draws on trillions of Web visits tracked by Adobe systems to construct a digital consumer shopping basket tracking a mix of products including apparel, electronics, home and garden, computers, groceries, and more. The headline finding of the initial report would have been a continuing drop in prices driven by electronics, but this was overshadowed by short-term changes including a 225% increase in ecommerce from March 1-11 to March 13-15. Online groceries, cold medications, fitness equipment and computers surged, as did preordering for in-store pickup. Extreme growth was concentrated in hard-hit areas including California, New Hampshire and Oregon.

Customer Data Platform vendor Amperity reported a less rosy result in its COVID-19 Retail Monitor, which draws data from Amperity’s retail clients. They report that total retail demand fell by 86% by the end of March and even online revenue is down 73%.  Food and health products fell after an initial stock-up surge in mid-March.

Retail foot traffic tracker Placer.ai has packaged its in-store data in a COVID-19 Retail Impact tracker, which not surprisingly shows an end to traffic at shuttered entertainment and clothing outlets, near-total drop at restaurants, and mixed results for grocery stores and pharmacies. Results are reported by day by brand, if you really want to wallow in the gruesome details.

Grocery merchandising experts Symphony RetailAI have also launched a COVID-19 Insights Hub, which reports snippets of information with explanations. These range from obvious (consumers are accepting more product substitutions in the face of stock-outs) to intriguing (canned goods sales rose twice as much in the U.S. as in Europe because of smaller families and less storage space).

Retail Behavior Surveys

Showing just how quickly the world changed, retail consumer research platform First Insight found that the impact of coronavirus on U.S. shopping behavior doubled between surveys on February 28 and March 17. In the later survey, 49% of consumers said they were buying less in-store and 34% were shopping more online. Women and baby boomers went from changing their behavior slightly less than average in the first survey to changing slightly more than average in the second.

Ecommerce platform Yotpo ran its own survey on March 17, reaching 2,000 consumers across the U.S., Canada, and United Kingdom. They found consumers evenly split between expecting to spend more or less overall, with just 32% expecting to shift purchases online. Food, healthcare, and, yes, toilet paper were high on their shopping lists.

The situation was clearer by the end of March, when Retail Systems Research surveyed 1,200 American consumers for Yottaa. By this time, 90% were hesitant to shop in-store, 94% expected online shopping would be important during the crisis, and their top concerns were unavailable inventory, no free shipping, and slow websites. (Really, no free shipping?) More surprising but prescient, given Amazon's labor troubles: just 42% felt confident that Amazon could get their online orders delivered on time.

Media Consumption

Nobody wins any prizes for figuring out that Web traffic went up when people were locked down. But digital analytics vendor Contentsquare did provide a detailed analysis of which kinds of Web sites attracted more traffic (supermarkets, media, telecom, and tech retail) and which went down the most (luxury goods, tourism, and live entertainment) in the U.S., UK, and France. Week-by-week data since January shows a sharp rise starting March 16. Less easily predictable: supermarket and media conversion rates went down as consumers spent more time searching for something they wanted.

Media tracking company Comscore has also weighed in with an ongoing series of coronavirus analyses. Again, no surprises: streaming video, data, newscasts, and daytime TV viewing are all up. Same for Canada and India, incidentally.


You also won’t be shocked to learn that Upfluence found a 24% viewing increase in the live-streaming game platform Twitch in Europe. Consumption growth tracked national lockdowns, jumping in Italy during the week of March 8-14 and in France and Spain the week after.

Consumer review collector PowerReviews has its own data, based on 1.5 million product pages across 1,200 Web sites. Unlike Contentsquare, they found traffic was fairly flat but conversion rates jumped on March 15 and doubled by March 20. Their explanation is people were buying basic products that took less consideration.  People read many more reviews but submission levels and sentiment were stable. Reviews were shorter as consumers likely had other things on their minds.


Influencer marketing agency Izea got ahead of the game with a March 12 survey, asking social media consumers how they thought they’d behave during a lockdown. More social media consumption was one answer, with Facebook and Youtube heading the list. Izea also predicted that influencer advertising prices would fall as more influencers post more content.


Consumer Attitudes


Researching broader consumer attitudes, ITWP companies Toluna, Harris Interactive and KurunData launched a Global Barometer: Consumer Reactions to COVID-19 series covering the U.S., UK, Australia, India, and Singapore. The first wave of data was collected March 25-27. People in the U.S. and India were generally more satisfied with how businesses had behaved and more optimistic about how quickly things would return to normal. But U.S. respondents ranked support from the national government considerably lower than anyone else.


Edelman Trust Barometer issued a ten market Special Report on COVID-19, although the data was gathered during the good old days of March 6-10. Even then, most people were following the news closely and 74% worldwide felt there was a lot of false information being spread. Major news outlets were the primary information source everywhere (64%) but the U.S. government was by far less relied upon (25%) than anywhere else (31% to 63%). Interestingly, people put more faith in their employers than anyone except health authorities. They also expected business to protect their workers and local communities.


Kantar Media has yet another COVID-19 Barometer, although they reserve nearly all results for paying clients. The findings they did publish echo the others: more online media consumption, low trust in government, and expectation that employers will look after their employees. Kantar says that just 8% of consumers expect brands to stop advertising but 77% want advertising to show how brands are being helpful, 75% think brands should avoid exploiting coronavirus and (only?) 40% feel brands should avoid “humorous tones”.

Survey company YouGov publishes a continuously-updated International COVID-19 Tracker with timelines on changing opinions in 26 countries.  Behaviors including avoiding public places and not going to work change quickly; others such as fear of catching coronavirus and wearing masks move more slowly.  Other attitudes have barely shifted, including avoiding raw meat and improving personal hygiene.  The timing of changes correlates with the situation in each country.

Job Listings


There’s also an intriguing little niche of companies offering job information. PR agency Global Results Communications just launched a COVID-19 Job Board to help people find work.  So far, it's not very impressive: as of April 1 it had under 100 random listings from Walmart to Metrolina Greenhouses to the South Carolina National Guard.


Tech salary negotiator Candor (did you know that was a business?) has a vastly more useful site, listing 2,500+ companies that are reported to be hiring, freezing hiring, rescinding offers, or laying people off. At the moment, half the companies on the list are hiring. The site offers a very interesting break-down by industry: transportation, retail, consulting, energy, and automotive are in the worst shape. Defense, productivity and education software, and communications are doing the best.