Friday, March 16, 2007

The Market for Web Testing Systems is Young

I’ve received online newsletters in the past week from three Web optimization vendors: [x+1], Memetrics and Optimost. All are interesting. The article that particularly caught my attention was a JupiterResearch report available from [x+1], which contained results from a December 2005 survey of 251 Web site decision makers at companies with $50 million or more in annual revenues.

What was startling about the report was its finding that 32% of the respondents said they had deployed a “testing/optimization” application, and that another 28% planned to do so within the next twelve months. Since a year has now passed, the current total should be around 60%.

With something like 400,000 U.S. companies with $50 million or more in revenue, this would imply roughly 240,000 installations. Yet my conversations with Web testing vendors show they collectively have maybe 500 installations, and certainly fewer than 1,000.

To put it mildly, that's a big difference.
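The back-of-envelope arithmetic above is easy to restate in a few lines of Python. All of the inputs are the post's own estimates (the 400,000-company count and the sub-1,000 installation ceiling), not audited data:

```python
# Figures quoted in the post -- estimates, not audited data.
deployed_share = 0.32        # survey: already deployed testing/optimization
planned_share = 0.28         # survey: planned within twelve months
companies = 400_000          # U.S. companies with $50M+ revenue (post's estimate)
actual_installs_cap = 1_000  # upper bound from conversations with vendors

implied_share = deployed_share + planned_share          # 0.60 a year later
implied_installs = int(companies * implied_share)       # 240,000
overstatement = implied_installs / actual_installs_cap  # at least 240x

print(f"implied installs: {implied_installs:,}")
print(f"gap vs. vendor counts: at least {overstatement:,.0f}x")
```

Even if the company count were off by an order of magnitude, the implied installed base would still exceed the vendors' own totals by more than twenty to one.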

There may be some definitional issues here. It's extremely unlikely that JupiterResearch accidentally found 80 (32% of 251) of the 500 companies using the testing systems, especially since the total was probably under 500 at the time of the survey. So, presumably, some respondents who said they had testing systems were using products not on the standard list. (That list would be Optimost, Offermatica, Memetrics, SiteSpect and Vertster. JupiterResearch adds what I label as “behavioral targeting” vendors [x+1] and Touch Clarity (now part of Omniture), e-commerce platform vendor ATG, and testing/targeting hybrid Kefta.) Maybe some other respondents weren’t using anything and chose not to admit it.

But I suspect the main factor is sample bias. JupiterResearch doesn’t say where the original survey list came from, but it was probably weighted toward advanced Web site users. As in any group, the people most involved in the topic are the most likely to have responded, further skewing the results.

Sample bias is a well-known issue among researchers. Major public opinion polls use elaborate adjustments to compensate for it. I don’t mean to criticize JupiterResearch for not doing something similar: they never claim their sample was representative or that the numbers can be projected across all $50 million+ companies.
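A small numerical sketch shows how strong the effect can be. The response propensities below are invented purely for illustration (nothing in the report supports these particular values); the point is only that when adopters answer a survey about testing far more readily than non-adopters do, the measured penetration rate can dwarf the true one:

```python
# Hypothetical illustration of response bias -- all numbers invented.
population = 400_000
true_rate = 0.005              # assume only 0.5% truly have a testing system
p_respond_adopter = 0.30       # adopters are keen to answer a testing survey
p_respond_non_adopter = 0.002  # everyone else mostly ignores it

adopters = population * true_rate
non_adopters = population - adopters

# Expected respondent counts under these propensities.
responding_adopters = adopters * p_respond_adopter              # 600
responding_non = non_adopters * p_respond_non_adopter           # 796

measured_rate = responding_adopters / (responding_adopters + responding_non)
print(f"true adoption: {true_rate:.1%}, "
      f"measured among respondents: {measured_rate:.1%}")
```

Under these made-up assumptions, a true 0.5% adoption rate shows up as roughly 43% among respondents, which is exactly the kind of gap between survey results and vendor installation counts discussed above.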

Still, the report certainly gives the impression that a large fraction of potential users have already adopted testing/optimization systems. Given what we know about vendor installation totals, this is false. And it’s an error with consequences: vendors and investors act differently in mature and immature markets. Working from the wrong assumptions will lead them to costly mistakes.

Damage to customers is harder to identify. If anything, a fear of being the last company without this technology may prompt them to move more quickly. This would presumably be a benefit. But the hype may also lead them to believe that offerings and vendors are more mature than they actually are. This could lead them to give less scrutiny to individual vendors than they would if they knew the market was young. And maybe it’s just me, but I believe as a general principle that people do better to base their decisions on accurate information.

I don’t think the situation here is unique. Surveys like these often give penetration numbers that seem unrealistically high to me. The reasons are probably the same as the ones I’ve listed above. It’s important for information consumers to recognize that while such surveys give valuable insights into how users are behaving, they do have their limits.

1 comment:

Demi said...

Thank you for this post - it's bang on. I'd add the following observations: 1. Unless an online marketer's compensation is directly tied to the bottom line, s/he would rather pass on having his/her performance measured; 2. Lip service among marketing professionals is huge (i.e., "self positioning"), so naturally they would report being involved in testing and optimization; and 3. most marketers (unless they came up the DM route) have no clue about what "testing" means and would rather not relive their 12th-grade math experience.