In itself, this isn’t terrible: much of the value in the Self-Assessment comes from giving users a comprehensive checklist of items to consider, thereby highlighting the true scope of a Composable project. Still, it’s clear the Self-Assessment isn’t helping buyers as much as we had expected, so we’re considering what other tools might be more useful.
The discussions surrounding this have helped to clarify my own understanding of where Composable CDP fits into the greater scheme of things. The key insight is that the choice to use a Composable or packaged CDP is ultimately a tactical decision that addresses just one stage of the CDP project. It doesn’t affect the previous stage, which is requirements definition, or the following stages, which are deployment and activation. To overstate the case just a bit, marketers and other CDP users don’t care how their CDP gets built; they just want to use the resulting profiles to do their jobs. To the extent that CDP users do care about the Composable vs packaged decision, it’s because that choice impacts the cost, timing, and quality of their system.
This insight in turn clarifies the distinction between knowing enough to make the Composable vs packaged decision, and actually deciding which is the best choice. The knowledge needed to make the decision includes:
- Defining business requirements, which requires business users who understand their needs.
- Understanding existing systems, so you can identify gaps that must be filled to meet business requirements.
- Understanding the organization’s capabilities for filling the gaps with either a Composable or packaged solution. These capabilities depend on technical resources, including staff skills and budgets. I believe that assembling a Composable solution requires more extensive technical resources than buying a packaged CDP, although some Composable advocates might disagree.
- Estimating the costs of delivering a Composable solution and a packaged solution so you can compare them.
The Self-Assessment tool addresses only the first two of those items: it asks users what they need and what their current systems can provide. So, even if the users could answer all the questions, it wouldn’t provide all the information needed to make a sound decision. Again, this isn’t exactly a flaw, since those are two important types of information. But it is a limitation.
Only after all the information listed above has been gathered can the company address the second question of whether Composable or packaged is its best choice. Answering this question depends on the combination of capabilities and costs. That is, for Composable to be a viable option, the company needs to have adequate technical resources to assemble a Composable solution, and for Composable to be the best option, it has to offer the best combination of cost and value.
(If we assume that either option delivers the same value, then the only difference is cost. In theory, every system that meets the same requirements will deliver the same value. But in the real world, there will be differences in the quality of the resulting systems. These will depend on the specific tools that are chosen, not simply on whether the solution is Composable or packaged. This means that buyers must evaluate specific alternatives for either approach.)
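To make that decision rule concrete, here is a minimal sketch in Python. The names, scales, and thresholds are illustrative assumptions of mine, not part of any Institute tool: Composable first has to clear a resource bar to be viable at all, and then has to beat the packaged option on the combination of cost and value.

```python
# Illustrative sketch of the decision rule described above.
# All names, scales, and thresholds are hypothetical assumptions.

def choose_approach(resources_rating, composable_cost, packaged_cost,
                    composable_value, packaged_value, min_resources=3):
    """Return 'Composable' or 'Packaged' under the stated rule."""
    # Viability test: without adequate technical resources,
    # a Composable build is not an option at all.
    if resources_rating < min_resources:
        return "Packaged"

    # Best-choice test: here, "combination of cost and value" is
    # approximated as value delivered per dollar spent.
    composable_score = composable_value / composable_cost
    packaged_score = packaged_value / packaged_cost
    return "Composable" if composable_score > packaged_score else "Packaged"


# Example: resources are adequate, but the packaged option delivers
# the same value at a lower cost, so it wins.
print(choose_approach(resources_rating=4,
                      composable_cost=500_000, packaged_cost=400_000,
                      composable_value=1.0, packaged_value=1.0))  # Packaged
```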
Circling back to the original question of how the Institute can help users who are considering a Composable CDP, it’s clear that few people will have all the information they need available when they start the process. So the best we can do is to provide a framework that identifies the four kinds of information to collect and, perhaps, provides a place to store it over time. That could be as simple as a spreadsheet, which isn’t very exciting from a technical standpoint. But if it’s what our members need the most, then that’s what we’ll deliver.
Addendum: Taking things a step further, here's what the tables described above could look like. It would be interesting to collect separate answers from business users and IT/data teams, who very likely would disagree in many places. I could easily convert these tables into an online format and have the system do some light analysis, including a comparison of answers from business users vs IT/data teams (a rough sketch of that comparison appears after the tables). But we'd still run into the same problem: it takes time to assemble the answers. So this pushes me back to a spreadsheet that people can fill out at their leisure. You can download that here.
1 & 2: Requirements & Existing Systems |
|||||
Data Sources |
Not needed |
Needed and Fully Available |
Needed and Partly Available |
Needed and Not Available |
Don’t Know |
- Website |
|
|
|
|
|
- CRM |
|
|
|
|
|
- Data warehouse/ - data lake |
|
|
|
|
|
|
|
|
|
|
|
- Third party data |
|
|
|
|
|
- eCommerce |
|
|
|
|
|
- Web advertising |
|
|
|
|
|
- Social media advertising |
|
|
|
|
|
- Point of Sale |
|
|
|
|
|
|
|
|
|
|
|
Profile Building Functions |
|
|
|
|
|
- Capture data |
|
|
|
|
|
- Ingest data |
|
|
|
|
|
- Prepare data |
|
|
|
|
|
- Store data |
|
|
|
|
|
- Link data |
|
|
|
|
|
- Build profiles |
|
|
|
|
|
- Share profiles |
|
|
|
|
|
- Integrate profiles |
|
|
|
|
|
- Segment profiles |
|
|
|
|
|
|
|
|
|
|
|
Activation Functions |
|
|
|
|
|
- Audience selection |
|
|
|
|
|
- Predictive analytics |
|
|
|
|
|
- Campaign definition |
|
|
|
|
|
- Real time interactions |
|
|
|
|
|
- Cross-channel orchestration |
|
|
|
|
|
3. Resources

| | Rating (1=low, 5=high) |
|---|---|
| Customer data infrastructure | |
| IT/Data team: customer data experience | |
| IT/Data team availability for this project | |
| Business users: customer data experience | |
| Business user availability for this project | |
| Training resources | |
| Measurement resources | |
| Budget for this project | |
| Management Support | |
4. Cost

| Hard costs (enter dollar amounts): | Composable | Packaged |
|---|---|---|
| Licenses/Vendor Fees | | |
| Labor/Development & Integration Costs | | |
| Operations (hardware, software, on-going maintenance & upgrades) | | |
| **Other Considerations: (1=disadvantage, 2=same, 3=advantage)** | | |
| Time to Value | | |
| Delivery Risk (time, cost overruns) | | |
| Performance Risk (scalability, performance) | | |
| Roadmap (control over features) | | |
| Security/privacy (control over data) | | |
| Quality (best components, competitive advantage, vendor, usability/consistent interface) | | |
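And to illustrate the "light analysis" mentioned in the addendum, here is a minimal sketch that compares the checklist answers collected from business users with those from the IT/data team and lists the items where the two groups disagree. The dictionaries and sample answers below are hypothetical placeholders; a working version would read the completed spreadsheet instead.

```python
# Hypothetical sketch of the "light analysis" described in the addendum:
# flag checklist items where business users and the IT/data team disagree.

business_answers = {
    "Website": "Needed and Fully Available",
    "CRM": "Needed and Partly Available",
    "Data warehouse/data lake": "Don't Know",
}

it_answers = {
    "Website": "Needed and Partly Available",
    "CRM": "Needed and Partly Available",
    "Data warehouse/data lake": "Needed and Fully Available",
}

def find_disagreements(business, it):
    """Return items where the two groups chose different answers."""
    return {
        item: (business[item], it[item])
        for item in business
        if item in it and business[item] != it[item]
    }

for item, (biz_answer, it_answer) in find_disagreements(business_answers, it_answers).items():
    print(f"{item}: business says '{biz_answer}', IT/data says '{it_answer}'")
```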