What began as a whimsical “landscape of landscapes” led to the serious realization that crowd-sourced review sites are the most common type of vendor directory. Fifteen of the 23 sources listed in my original graphic fell into that category. This begged for a deeper look at the review sites to understand how they differ and which, if any, could replace the work of professional reviewers (like me) and software guides (like my VEST report).
The first question was which sites draw a big enough crowd to be useful. I used Alexa traffic rankings, which are far from perfect but good enough for this sort of project. (Compete.com gave similar rankings except that TrustRadius came in lower, although still in the top 10.) After adding two review sites that I learned about after the original post, I had 17 to consider, which I ranked in order of their Alexa traffic.
Since crowd wisdom without a crowd can’t be terribly effective, I limited further analysis to the top 10 sites. Of these, AlternativeTo.net, SocialCompare, and Cloudswave were different enough from the standard model that it made sense to exclude them. This left seven sites worth a closer look.
The next question was how well the sites covered marketing technology. Every site except TrustRadius covered a broad range of business software, from accounting to human resources to supply chain, as well as CRM and marketing. TrustRadius focused more narrowly on customer-related systems, although it still included business intelligence and accounting. The numbers of categories, subcategories, and marketing subcategories differed widely but didn't seem terribly significant, apart from SoftwareInsider and DiscoverCloud looking a bit thin. Differences in the numbers of products in the main marketing categories also didn't seem meaningful, although they do illustrate how many products there are, in case anyone needs reminding.
What did look interesting was the number of ratings and/or reviews for specific products. I sampled leading marketing automation vendors for different-sized companies. It turns out that G2Crowd and TrustRadius had consistently huge leads over the others. I didn’t check similar statistics for other software categories, but this is probably the one that counts for most marketers.
Of course, quality matters as well as quantity. In fact, it probably matters more: my primary objection to crowd-sourced software reviews has always been that users’ needs for software are so varied that simple voting based on user satisfaction isn't a useful indication of how well a system will work for any particular buyer. This is different from things like restaurants, hotels, and plumbers, where most buyers want roughly the same thing.
Software review sites address this problem by gathering more detail about both the products and the reviewers. Detailed product information includes separate numeric ratings on topics such as ease of use, value for money, and customer support; detailed ratings on specific features; and open-ended questions about what reviewers liked most and least, how they used the system, and what they’d recommend to others. Reviewer information on all sites except Software Advice starts with verifying that the user is a real person by requiring a LinkedIn log-in. This lets the review site check the reviewer’s name, title, company, and industry, although these are not always fully displayed. Some sites verify that the reviewer actually uses the product. Some provide other background about the reviewer’s activities on the review site and how their work has been rated.
I can't show how each site handles each of those items without going into excruciating detail. But the following table gives a sense of how much information each site collects. Of course, reviewers don’t necessarily answer all these questions. (Caution: this information is based on a relatively quick scan of each site, so I’ve probably missed some details. If you spot any errors, let me know and I’ll correct them.) When it comes to depth, TrustRadius and DiscoverCloud stand out, although I was also impressed by the feature details and actual pricing information in G2Crowd.
The number and depth of reviews are clearly the most important attributes of review sites. But they also differ in other ways. Selection tools to identify suitable vendors are remarkably varied – in fact, the only filter shared by all sites is users' company size. Industry is a close second (missing only in DiscoverCloud), while even selections based on ratings are found in just four of the seven sites. Only three sites let users select based on the presence of specific features, an option I believe is extremely important.
Looking beyond selection tools: most sites supplement the reviews with industry reports, buyer guides, comparison grids, and similar information to help users make choices. Several sites also let users pose questions to other members.
So, back to my original question: can crowd-sourced review sites replace professional software reviews? I still don’t think so: the coherent evaluation of a practiced reviewer isn’t available in the brief comments provided by users, even if those comments are accompanied by information about specific product features. This may sound like self-serving mumbo-jumbo, but I do think a professional reviewer can articulate the essence of many products more effectively than users who report only on their personal experience. (Yes, I really just wrote "articulate the essence".)
But whether sites can replace professional reviewers is really the wrong question. What matters is the value the review sites offer on their own. I’d say that is considerable: given enough volume, they indicate the rough market share of different products, the types of users who buy each system, and what worked well or poorly for different user types. User comments give a sense of what each writer found important and how they reached their judgements. This in turn lets readers assess whether that reviewer’s needs were similar to their own. Buyers still need to understand their own requirements, but that’s something that no type of review can replace.