It has taken me a while to connect the dots, but I’m now pretty sure I see a new type of software emerging. These systems gather customer data from multiple sources, combine information related to the same individuals, perform predictive analytics on the resulting database, and use the results to guide marketing treatments across multiple channels. This differs quite radically from standard marketing automation systems, which use databases built elsewhere, rarely include integrated predictive modeling, and are focused primarily on moving customers through multi-step campaigns. In fact, the new systems complement rather than compete with marketing automation, which they treat as just one of several execution platforms. The new systems can also feed sales, customer service, online advertising, point of sale, and any other customer-facing systems.
Given how much vendors and analysts love to create new categories, I’m genuinely perplexed that no one has yet named this one. I’ll step in myself, and hereby christen the concept as “Customer Data Platform”. Aside from having a relatively available three letter abbreviation (see Acronym Finder for other uses of CDP), the merits of this name include:
- “Customer” shows the scope extends to all customer-related functions, not just marketing;
- “Data” shows the primary focus is on data, not execution; and
- “Platform” shows it does more than data management while supporting other systems.
But, you may ask, is this really new? Certainly systems for Customer Data Integration (CDI) have been around for decades: these include specialized products like Harte-Hanks Trillium and SAS DataFlux, CDI features within general data management suites like Informatica and Pentaho, and integration within cloud-based business intelligence products like GoodData and Birst. Many of those products have limited capabilities for working with newer data sources like Web sites and social networks, but the real distinction between them and CDPs is that the older systems are mainly designed to assemble data. Some also provide analytics, but they don't extend to real-time decisions based on predictive models.
Similarly, there have long been specialized systems for real-time interaction management (such as Infor Interaction Advisor and Oracle Real Time Decisions) and for predictive modeling (SAS, IBM SPSS, KXEN). Some interaction managers do create predictive models, and the really big vendors (IBM, SAS, Oracle) have all three key components (CDI, real-time decisions, and predictive models) somewhere in their stables. But systems that closely couple just those features with the goal of feeding data as well as recommendations to execution systems? Those are something new.
By now, you’re probably wondering if I’ll ever get around to actually naming the vendors I have in mind. I’ve recently written about some of them, including Reachforce/SetLogik and Lattice Engines. I also include RedPoint in the mix, because it has all the key capabilities (database development, predictive models, and real time decisions) even though it also offers conventional campaign management. Others I haven’t yet written about include Mintigo and Gainsight. Of course, each has a different mix of features and its own market position. Indeed, several have specifically told me they do not compete with the others. Fair enough, but I still see enough similarity to group them together.
All this is a very long-winded introduction to Causata, yet another member of this new class. By now, you can probably guess Causata’s main functions: assemble customer data from multiple sources, consolidate it by customer, place it in an analytics-friendly format, run predictive models against it, and respond in real time to recommendation requests from other systems including Web sites, email, banner ads, and call centers. And you’d be right.
But that’s not the end of the story. With any product, it’s the details that matter. Causata is particularly strong in the data management department, accepting both batch and real-time data feeds and storing data as different types of events (email sent, Web site visit, call center interaction, etc.), each having predefined attributes. The system also has a particularly sophisticated “identity association” service, which looks for simultaneous events involving different identifiers as a way to link them, and can chain identifiers that were linked at different times. When I spoke with Causata about two months ago, the association rules were pretty much the same for all clients, but they promised users would get more control in the future. Users could already choose which types of associations to use in specific queries.
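To make the identity piece concrete, here’s a minimal sketch of how that kind of identifier chaining could work, using a simple union-find structure. The event format and linking rule are my own assumptions for illustration, not Causata’s actual implementation:

```python
# Hypothetical sketch of identifier chaining via union-find.
# The event format and linking rule are assumptions for illustration,
# not Causata's actual "identity association" logic.

class IdentityGraph:
    def __init__(self):
        self.parent = {}

    def _find(self, ident):
        # Path-compressing find: returns the canonical root for an identifier.
        self.parent.setdefault(ident, ident)
        while self.parent[ident] != ident:
            self.parent[ident] = self.parent[self.parent[ident]]
            ident = self.parent[ident]
        return ident

    def link(self, identifiers):
        # Identifiers seen in the same (simultaneous) event are merged.
        identifiers = list(identifiers)
        root = self._find(identifiers[0])
        for other in identifiers[1:]:
            self.parent[self._find(other)] = root

    def same_customer(self, a, b):
        return self._find(a) == self._find(b)


graph = IdentityGraph()
# A Web visit where the visitor both carried a cookie and logged in with an email:
graph.link({"cookie:abc123", "email:jane@example.com"})
# A later call-center interaction that captured the email and an account number:
graph.link({"email:jane@example.com", "account:0042"})

# The cookie and the account number were never seen together, but they
# chain through the shared email address.
print(graph.same_customer("cookie:abc123", "account:0042"))  # True
```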
Causata stores the assembled data in HBase, a Hadoop-based database management system that is particularly well suited to large data volumes, many different data types, and ad hoc queries. In addition to the raw data, the system can store derived values such as aggregations (e.g., number of Web page views in the past 24 hours) and model scores. Users can run SQL queries to extract data for analysis and predictive modeling in third-party software including QlikView, Tableau, SAS, and R. Prebuilt QlikView reports show the predictive power of different variables for user-specified events. The lack of native analysis and modeling tools creates some friction for users, but also lets them stick with familiar products. So the pros and cons probably cancel each other out.
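As an illustration of the derived-value idea, here is roughly how a trailing 24-hour page view count could be computed from the raw event stream. The event schema and the pandas approach are my own assumptions, not Causata’s internals:

```python
import pandas as pd

# Assumed raw event table: one row per event, typed by event kind.
events = pd.DataFrame({
    "customer_id": ["c1", "c1", "c1", "c2"],
    "event_type": ["page_view", "page_view", "email_sent", "page_view"],
    "timestamp": pd.to_datetime([
        "2013-04-24 07:00", "2013-04-24 21:00",
        "2013-04-25 08:00", "2013-04-25 10:00",
    ]),
})

as_of = pd.Timestamp("2013-04-25 08:30")
window_start = as_of - pd.Timedelta(hours=24)

# Derived value: page views per customer in the trailing 24 hours.
recent_views = events[
    (events["event_type"] == "page_view")
    & (events["timestamp"] > window_start)
    & (events["timestamp"] <= as_of)
]
print(recent_views.groupby("customer_id").size())
# c1    1   (the 07:00 view has aged out of the window; c2's view is after as_of)
```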
The system’s decision tools are straightforward. For each situation, users define a “decision engine” that can select among multiple options, such as campaigns, products, or marketing content. These options can have qualification rules. To make a decision, the system can test the options in sequence and pick the first one for which a customer is qualified, or pick the option with the highest predictive model score. Users can also specify a percentage of customers to receive a random option, to gather data for future decisions. An engine can return multiple decisions for situations that require more than one option, such as a Web page with several offers. Causata has some machine learning algorithms to help with the decision process. It plans to expand these to automatically select the best option in a given situation.
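Here’s a minimal sketch of that decision logic as I understand it, with invented option names, qualification rules, and scores; the real system obviously applies configured rules and live model scores rather than hard-coded functions:

```python
import random

# Hypothetical sketch of a "decision engine": named options with qualification
# rules, chosen either first-qualified or by highest model score, plus a small
# random slice for learning. Details are illustrative, not Causata's code.

OPTIONS = [
    {"name": "platinum_card_offer",
     "qualifies": lambda c: c["income"] >= 100_000,
     "score": lambda c: 0.8 if c["recent_visits"] > 3 else 0.3},
    {"name": "balance_transfer_offer",
     "qualifies": lambda c: c["has_card"],
     "score": lambda c: 0.5},
    {"name": "generic_brand_message",
     "qualifies": lambda c: True,
     "score": lambda c: 0.1},
]

def decide(customer, mode="highest_score", random_fraction=0.05):
    # Only options whose qualification rule passes are eligible.
    qualified = [o for o in OPTIONS if o["qualifies"](customer)]
    # A small random slice gets an arbitrary qualified option, to keep
    # gathering data for future decisions.
    if random.random() < random_fraction:
        return random.choice(qualified)["name"]
    if mode == "first_qualified":
        return qualified[0]["name"]
    # Otherwise pick the qualified option with the highest predicted score.
    return max(qualified, key=lambda o: o["score"](customer))["name"]

customer = {"income": 120_000, "has_card": True, "recent_visits": 5}
print(decide(customer))  # usually "platinum_card_offer"
```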
Decision engines are called by external systems through a Web services API that can respond in under 50 milliseconds. This is fast enough to manage Web banner ads – something not all interaction managers can achieve. Model scores and other data are updated in real time during an interaction.
Causata can be deployed on-premise by a client or as a cloud-based service. The vendor says a typical implementation starts with three or four data sources and is deployed in about 30 days – very fast for this type of system. In February, Causata introduced prebuilt applications for cross-sell, acquisition, and return programs in financial services, communications, and digital media. These will further speed deployment.
Pricing is based on the number of data sources and touchpoints, with additional charges based on data storage. Cost begins around $150,000 per year.
Wednesday, April 17, 2013
Lattice Engines Automates All Steps in Prospect Discovery
There’s nothing new about using public information to identify business opportunities: it’s why lawyers chase ambulances and bankers phone lottery winners. But the Internet has exponentially increased the amount of data available and made it easily accessible. What’s needed to fully exploit this resource is technology that automates the end-to-end process of assembling the information, identifying opportunities, and delivering the results to sales and marketing systems.
Lattice Engines was founded in 2006 to fill this gap. The system scans public databases, company Web pages, and selected social networks to find significant events such as title changes, product launches, job openings, new locations, and investments. It supplements this with data from the clients' own systems including customer profiles, Web site visits, and purchases. It then looks at past data to find patterns which predict selected outcomes, such as making a first purchase, buying an additional product, or renewing. It uses these patterns to identify the best current prospects for each outcome, and makes the lists available to marketing systems or sales people. The sales people also see explanations of why each person was chosen, what they should be offered, and recommended talking points.
Each of these steps takes significant technology. Lattice Engines currently monitors Web sites of five to 10 million U.S. businesses, checking daily for changes. The system’s semantic engine reads structured texts such as management biographies and press releases, extracting entities and relationships but not trying to understand more subtle meanings such as sentiment. Clients specify blogs to follow, which receive similar treatment. The company also monitors Twitter, Facebook company pages, Quora, and LinkedIn profiles of people within each sales person’s network. Additional data comes from standard sources such as business directories and from special databases requested by clients. Information from all these sources is loaded into a single database available to all Lattice Engines clients.
Lattice Engines also imports data from the clients’ own systems, although of course this isn’t shared with anyone else. Again, there’s some clever technology needed to recognize individuals and companies across multiple sources. Lattice Engines doesn’t try to link personal and business identities for individuals.
All this information is placed in a timeline so that modeling systems can look at events before and after the target activities. The models themselves are built automatically, once users specify the target activity, product, and time horizon. Users can then build a list of customers or prospects, have the model score it, and send high-ranking names to marketing or sales for further contact. Results can be exported to a marketing automation system or appear within the sales person’s CRM interface. Lattice Engines is directly integrated with cloud-based CRM from Salesforce.com, Microsoft Dynamics, and Oracle, and via file transfer with SAP CRM. Users can export lists to Excel and Marketo, with connectors for Eloqua and other marketing automation systems on the way.
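To show the general shape of that timeline-to-score process, here is a toy sketch with invented event types and a generic logistic regression standing in for whatever models Lattice Engines actually builds:

```python
from collections import Counter
from sklearn.linear_model import LogisticRegression

# Illustrative sketch of the timeline-to-score idea: count event types that
# occurred before a cutoff, fit a model against known outcomes, then score a
# new prospect. Event names, cutoff, and model choice are my assumptions.

EVENT_TYPES = ["job_opening", "title_change", "web_visit", "product_launch"]

def features(timeline, cutoff):
    # Count each event type observed before the cutoff date.
    counts = Counter(e["type"] for e in timeline if e["date"] < cutoff)
    return [counts.get(t, 0) for t in EVENT_TYPES]

history = [
    # (timeline of earlier events, did the company buy within the horizon?)
    ([{"type": "job_opening", "date": 1}, {"type": "web_visit", "date": 2}], 1),
    ([{"type": "product_launch", "date": 1}], 0),
    ([{"type": "job_opening", "date": 1}, {"type": "title_change", "date": 3}], 1),
    ([{"type": "web_visit", "date": 2}], 0),
]
X = [features(timeline, cutoff=5) for timeline, _ in history]
y = [outcome for _, outcome in history]

model = LogisticRegression().fit(X, y)

prospect = [{"type": "job_opening", "date": 4}, {"type": "web_visit", "date": 4}]
print(model.predict_proba([features(prospect, cutoff=5)])[0][1])  # purchase likelihood
```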
The net result of this is a single system that performs all the tasks needed to exploit the wide range of information available about customers and prospects. Marketers could theoretically use separate systems for each step in the process, and integrate the results for themselves. But few really have the skills to do this. And, in most cases, it would be more expensive than purchasing a single system like Lattice Engines. It's particularly helpful that Lattice Engines supports both prospecting and customer management -- further reducing the need for multiple products, and further encouraging cooperation between marketing and sales departments.
Pricing for Lattice Engines starts at $75,000 per year and grows based on the number of data sources and sales users. Client data volume doesn't affect the cost, since Lattice Engines’ own databases are vastly larger than any client data. The company has close to 50 deployments, nearly all at large B2B marketers including Dell, HP, Microsoft, ADP, and Staples.
Thursday, April 11, 2013
Adometry Combines Attribution with Optimization
So…my last two posts on attribution systems (MMA and VisualIQ) were among the least popular ever, right down there with Marketing Lessons from Chernobyl (which, let’s face it, was in pretty poor taste). But vox populi isn’t always vox Dei, eh? I think it’s an important topic, so here we go again.
The lucky recipient of that less-than-stirring introduction is Adometry, which in no way deserves any disrespect. From humble beginnings in click fraud prevention, they have grown in recent years to be one of the leaders in algorithmic response attribution. Their latest expansion moves them beyond digital channels to offline media including direct mail, television, and print. They have also moved from attributing past results to using predictive models to optimize current and future campaigns. Impressive.
The core of Adometry’s attribution methodology is to compile the sequence of marketing messages seen by each individual, and then compare results of individuals whose sequence differs by only one message. Any difference in results is then attributed to that message. This is conceptually simple, but requires clever treatments to handle low volumes for specific sequences and to isolate the impact of attributes such as placement, time slot, creative, and list segment. Adometry also lets users model against multiple events in the customer life cycle, such as sign-ups, first purchase, and repeat purchase. It calls all of these conversions, which I personally found a bit confusing but suppose I would quickly get used to.
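Here is a toy version of that “differ by one message” comparison, with invented data; the real method obviously adds the statistical machinery needed for sparse sequences and attribute-level effects:

```python
from collections import defaultdict

# Minimal sketch of the "differ by one message" idea: group people by message
# sequence, then credit a message with the lift between a sequence that
# includes it and the same sequence without it. Data and names are invented.

people = [
    {"sequence": ("display", "email"), "converted": True},
    {"sequence": ("display", "email"), "converted": True},
    {"sequence": ("display",), "converted": False},
    {"sequence": ("display",), "converted": True},
    {"sequence": ("email",), "converted": False},
    {"sequence": (), "converted": False},
]

# Conversion rate per observed sequence.
totals, converts = defaultdict(int), defaultdict(int)
for p in people:
    totals[p["sequence"]] += 1
    converts[p["sequence"]] += p["converted"]
rate = {seq: converts[seq] / totals[seq] for seq in totals}

# Credit each message with the rate difference vs. the same sequence without it.
credit = defaultdict(list)
for seq in rate:
    for i, message in enumerate(seq):
        shorter = seq[:i] + seq[i + 1:]
        if shorter in rate:
            credit[message].append(rate[seq] - rate[shorter])

for message, lifts in credit.items():
    print(message, sum(lifts) / len(lifts))   # display 0.75, email 0.25
```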
The system also classifies each conversion as attributable, multi-touch, and multi-channel, depending on whether it was linked to at least one message (attributable), to multiple messages (multi-touch), and to messages in multiple channels (multi-channel). For each category, it shows the conversion count and revenue: so, for example, you see the number and revenue for multi-touch repeat purchases. That’s a lot of information to digest, but it does give a great deal of insight into the effect of different promotions and channels on different parts of the business. This encourages marketers to look beyond any single measure, such as cost per order, that tells only a small part of the business story.
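The classification itself is simple enough to sketch directly; note that a single conversion can carry more than one label (the record format and field names here are invented):

```python
# Sketch of the attributable / multi-touch / multi-channel classification,
# using an invented conversion record; the real reporting adds counts and
# revenue per category.

def classify(conversion):
    touches = conversion["touches"]          # list of (message, channel) pairs
    labels = []
    if len(touches) >= 1:
        labels.append("attributable")
    if len(touches) > 1:
        labels.append("multi-touch")
    if len({channel for _, channel in touches}) > 1:
        labels.append("multi-channel")
    return labels

conversion = {
    "revenue": 120.0,
    "touches": [("spring_banner", "display"), ("april_newsletter", "email")],
}
print(classify(conversion))  # ['attributable', 'multi-touch', 'multi-channel']
```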
The system’s optimization process begins with the attribution analysis, but then adds auto-generated predictive models to estimate the impact of future ad plans, including interactions across channels. Users can enter scenarios with budgets for multiple channels and campaigns, and then apply other constraints such as limits on the change in spending per channel. They also define output measures for the system to optimize against: like other optimization systems, Adometry can only optimize against a single measure, but this can be a composite of several items. For each scenario, the system will determine the optimal budget allocation and show the expected results across each output measure. Users can modify the recommended plan and have the system re-forecast the results. The final plan can be output to a spreadsheet for further editing. Adometry can also be connected directly to ad buying platforms, including systems for real time bidding on individual impressions. The company says optimization typically yields a 20% to 40% improvement in ad-to-sales ratios.
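For a sense of how that kind of scenario optimization works mechanically, here is a hedged sketch using scipy: a single composite objective, a fixed total budget, and limits on how far each channel can move from the current plan. The response curves are invented stand-ins for the models Adometry fits from its attribution data:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative budget-allocation sketch: maximize a single measure subject to
# a fixed total budget and limits on per-channel spending changes.

channels = ["display", "email", "search"]
current = np.array([400.0, 100.0, 500.0])       # current spend ($K), invented
total_budget = current.sum()

def predicted_conversions(spend):
    # Assumed diminishing-returns response curve per channel.
    effectiveness = np.array([3.0, 8.0, 5.0])
    return float(np.sum(effectiveness * np.sqrt(spend)))

def objective(spend):
    return -predicted_conversions(spend)        # minimize the negative

# Total spend must equal the budget; each channel may move at most 30%.
constraints = [{"type": "eq", "fun": lambda s: s.sum() - total_budget}]
bounds = [(0.7 * c, 1.3 * c) for c in current]

result = minimize(objective, current, bounds=bounds, constraints=constraints)
for name, spend in zip(channels, result.x):
    print(f"{name}: {spend:.0f}")
print("predicted conversions:", -result.fun)
```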
The database of marketing messages per individual can be used for other types of analysis. These include reach and frequency reports, which show the number of individuals reached in total, reached in each channel, and reached exclusively for each channel. The reports count impressions as well as individuals; show how many people were reached in each combination of channels; show the number of people with each number of impressions (one, two, three, etc.); and show the current member count in each funnel stage.
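Those reach and frequency counts fall out of the per-individual message log fairly directly; here is a small sketch with an invented log format:

```python
from collections import Counter, defaultdict

# Sketch of reach and frequency counting from an impression log of
# (person, channel) pairs; the log format is an assumption for illustration.

impressions = [
    ("p1", "display"), ("p1", "display"), ("p1", "email"),
    ("p2", "display"), ("p3", "email"), ("p3", "email"),
]

channels_per_person = defaultdict(set)
frequency = Counter()
for person, channel in impressions:
    channels_per_person[person].add(channel)
    frequency[person] += 1

total_reach = len(channels_per_person)
reach_by_channel = Counter(ch for chans in channels_per_person.values() for ch in chans)
exclusive_reach = Counter(
    next(iter(chans)) for chans in channels_per_person.values() if len(chans) == 1
)
combination_reach = Counter(tuple(sorted(chans)) for chans in channels_per_person.values())
frequency_distribution = Counter(frequency.values())   # people with 1, 2, 3... impressions

print(total_reach)              # 3 individuals reached in total
print(reach_by_channel)         # display: 2, email: 2
print(exclusive_reach)          # display: 1, email: 1 (reached in only that channel)
print(combination_reach)        # one person per channel combination in this toy data
print(frequency_distribution)   # one person each with 1, 2, and 3 impressions
```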
Adometry’s data comes primarily from tags embedded in advertisements, emails, and other online messages, which drop cookies to identify who sees which message. The system can also draw data from Web server logs or third party tags. Adometry can further enrich its database by appending external information about individuals, using both online and offline sources. This lets it profile the audiences associated with different events, channels, campaigns, and other attributes. Optimization models can use data that can’t be tied to specific individuals, such as weather, economic conditions, and mass media like television and print. The system can also verify which ads were actually seen by individuals, providing more precise inputs to the attribution calculations.
Pricing for Adometry is based on the number of channels and volume of data. It starts around $100,000 per year for the smallest clients with enough volume to use the system effectively (about 30 to 50 million impressions per month). Currently, more than 50 companies use Adometry’s attribution services.
Wednesday, April 03, 2013
ReachForce Buys SetLogik: One-Stop-Shopping for B2B Marketing Data Plus Database
B2B marketing data vendor ReachForce today announced its purchase of SetLogik, which provides technology to build cloud-based marketing databases and do predictive modeling against them. (See my post from last October for more on SetLogik.)
There’s an obvious peanut butter-meets-jelly type of logic to this match. Reachforce’s core business is assembling data on marketing prospects, which it then sells for as many uses as possible: appending to Web leads, enhancing existing databases, and buying as lists. The SetLogik acquisition takes this a step further by letting them build databases to hold their data, thereby expanding the market beyond people with a database already in place. Conversely, having a readily-available data source encourages marketers to build their own database. SetLogik’s predictive modeling features make it even easier for marketers to get a return on their investment once the database is in place. Everybody wins!
The two products will be combined in what ReachForce calls the “Connected Marketing Data Hub”. The name is frightfully generic, but the key points are:
- cloud-based system, making it easy to deploy
- comprehensive customer view including data from marketing automation, CRM, transaction systems, and ReachForce’s own sources
- continuously updated and cleansed
- connectors available for Salesforce.com, Eloqua, and Marketo
In other words, the ReachForce solution supplements rather than replaces your marketing automation or CRM database. As I wrote in my earlier SetLogik review, one particularly attractive result is the ability to match sales revenues with marketing leads, always a challenge in measuring the value of marketing programs.
ReachForce has just begun to offer the combined system, which is currently deployed at one pilot client. Pricing is based on data volume, whether the client wants a one-time append or continuous cleaning, and on the data sources included. Minimum is $625 per month for continuous cleaning on 50,000 records.
Tuesday, April 02, 2013
Marketo Files for IPO: Will High Growth Outweigh High Losses?
Marketo made good today on its promise to file for an initial public offering (IPO). Congratulations to them for reaching this step. It’s a major accomplishment.
The S-1 registration statement gives considerable new information about Marketo’s business. Revenue for 2012 is reported at $58.4 million, an impressive 80% growth rate vs. 2011 although not quite the doubling that the company had forecast earlier.
More significant, the company continues to have huge losses – it lost $34.4 million in 2012, or 59% of revenue. By comparison, Eloqua lost just 7% of revenue in the year before its IPO, and even Salesforce.com, the benchmark for all Software as a Service (SaaS) start-ups, lost just 20% of revenue in its final year as a private company.
A loss that big is pretty scary. Part is due to heavy spending on sales and marketing – 65% of revenue – but that’s not the whole story: Salesforce.com had also spent 65% on sales and marketing before its IPO (although Eloqua spent just 40%).
The difference is that cost of revenue (costs of delivering service to clients, including subscription, support, professional services, and other) was 42% for Marketo, vs. 20% for Salesforce.com and 32% for Eloqua. That figure hasn’t changed in recent years, suggesting economies of scale have yet to appear. A high cost of revenue makes it hard for a company to become profitable even as it grows, since much of the new revenue is spent on the new customers. SaaS economics aren’t supposed to work that way.
Marketo’s other operating costs (research and development and general and administrative) are also high – 52% of revenue, compared with 35% for Salesforce.com and 33% for Eloqua. That percentage has also been pretty much stable for the past three years – again suggesting that expected scale economies haven’t appeared yet.
Another way to look at it is this: Marketo would earn just 6% profit even if its sales and marketing costs were zero. So its losses aren’t simply due to high investment in new customers. The comparable figures for Eloqua and Salesforce were 39% and 45%, respectively.
The S-1 also reports the company had 339 employees as of December 2012. Of course, the average for the year was much lower but, ignoring that, this still yields a perfectly respectable $172,000 revenue per employee. But it also means expenses are $273,000 per employee – much higher than the $200,000 rule of thumb. I know everyone at Marketo works incredibly hard, but something is clearly out of line in their cost structure.
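Both the 6% figure and the per-employee figures are easy to reproduce from the reported numbers; here is the back-of-envelope arithmetic:

```python
# Back-of-envelope check of the figures above, using the reported 2012 numbers.
revenue = 58.4e6          # reported 2012 revenue
loss = 34.4e6             # reported 2012 net loss
employees = 339           # headcount as of December 2012

cost_of_revenue_pct = 0.42
other_operating_pct = 0.52     # R&D plus general and administrative

# Margin if sales and marketing cost nothing at all:
print(1 - cost_of_revenue_pct - other_operating_pct)   # ~0.06, i.e. 6%

# Revenue and expense per employee:
print(revenue / employees)            # ~172,000
print((revenue + loss) / employees)   # ~273,700
```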
Perhaps stock investors will look only at Marketo’s growth rate. There is certainly an argument that the company will eventually become profitable as it spreads its fixed costs over more revenue. On the other hand, as I argued recently in DemandGen Report, it may not be possible for any large marketing automation firm to thrive as an independent. If that's correct, then Marketo's growth will never happen and the investors' only hope will be a buy-out by a larger firm. Let’s hope the stock market sees hope somewhere in all this: otherwise, Marketo stock will be much harder to sell than its software.
Monday, April 01, 2013
InfusionCon 2013: InfusionSoft Keeps Its Focus on Helping Entrepreneurs
I spent part of last week at Infusionsoft’s annual conference, InfusionCon, drinking the Kool-Aid and soaking up the Arizona sun.
Pleasant as the 80 degree temperatures were to a refugee from the still-wintry Northeast, the real warmth at the conference came from 2,300 attendees bubbling with enthusiasm for their entrepreneurial adventures and how Infusionsoft supports them. Keynote speaker Jay Baer captured the mood perfectly when he went “all Oprah” on the crowd by promising them each a free Camaro. (Either he was joking or I registered incorrectly.) The group was indeed drenched in Oprah-style self-empowerment.
As you’ve probably guessed, this isn’t my native habitat. Even though Raab Associates itself is a small business and runs in part on an Infusionsoft-like system (OfficeAutoPilot), I’m a professional manager by training and most of my clients are mid-size and big businesses. What really matters, though, is that Infusionsoft itself remains committed to its small business customers, despite growing to nearly 400 people and $40 million revenue. This consistency is no accident: Infusionsoft managers are quite vocal on their very conscious efforts to build a culture that is committed to helping entrepreneurs and is itself entrepreneurial. It’s a tall order, but there’s some serious missionary zeal at every level, so they might just succeed.
In any event, I did manage to spend most of the conference in my own comfort zone of analyzing Infusionsoft’s business. A long conversation with Chief Marketing Officer Gregg Head provided some interesting tidbits, including:
- the company’s customers fall into three main groups, each roughly one third of the total. These are: Internet-enabled business coaches and experts, who are selling books, videos and other products in addition to their personal time; local service providers, such as dentists, home services, and fitness centers; and businesses selling to other small businesses.
- most clients want either to increase sales or free up the owner's time. The latter goal – taking back your life from an all-consuming business – seemed to resonate more than anything with the attendees. Reducing costs is a lower priority.
- Measuring return on investment isn’t much of an issue. Small businesses can see changes in revenue or free time immediately. Detailed analysis isn't needed.
- Some companies are too small even for Infusionsoft. A client must have a stable revenue base to expand, or be successful enough that the owner is looking for some free time. The average Infusionsoft client has been in business for five years, which means that nearly all were in business for at least several years before purchasing the system.
- Facebook is by far the most important online channel for Infusionsoft customers, in many cases replacing Web sites as the primary online presence. Search engine marketing and blogs are much less important. The primary sources of new customers are still offline: referrals, partners, events, and direct mail. (Incidentally, trendsters, direct mail in general and post cards in particular are hot. But that might be old news. I did receive a message about personalized pizzas today, but am pretty sure it was an April Fools joke.)
And what of Infusionsoft itself? The company did announce its next release at InfusionCon, although by its own admission the changes were incremental enhancements in usability rather than major expansions in function. The main items were more efficient scheduling of personal tasks, a simple way to prepare quotes, and branding templates that automatically deploy style changes across all types of content. Campaigns can also now easily include GroSocial Facebook campaigns (GroSocial being a social marketing firm acquired by Infusionsoft in January). Modest as these changes are, the company says its users wanted them more than new acquisition channels.
Infusionsoft also announced several non-technical initiatives, again with the goal of making users more productive. These included a set of prebuilt campaigns, including actual content; on-demand training videos integrated with the product; and accelerated expansion of the sales and service partner networks. The onboarding process has also been revamped to deliver results in 30 days rather than 60, the main change being that Infusionsoft staff now does more of the actual setup for new clients and spends less time on a conceptual success map.
All these changes confirm what was already obvious: that Infusionsoft’s entrepreneurial customers are a separate breed from the professional marketers who use traditional marketing automation systems. The functional differences between the two sets of systems may be hard to spot, but there’s no mistaking the difference in the services and attitudes that surround them.