Comparing news aggregators: it's never easy
Jinfo Blog

14th December 2011

By Robin Neidorf


Among the features of VIP Reports on Products that customers seem to like the most is the "VIP's View" – a short summary of what the reviewer liked and disliked about the product under scrutiny. Every product has its strengths and weaknesses, and every reviewer brings a slightly different perspective on what an effective product should include. While VIP's View represents one person's opinion about product performance, it is a useful lens through which readers can understand the rest of the analysis included in the report.

Even with a brief summary like VIP's View to rely on, however, it is still difficult to create comparisons across different products. We're often asked to produce comparative reviews: put two products side by side and run exactly the same tests in exactly the same way on both. While this approach naturally appeals to those who evaluate products, vendors are more sceptical, and their concerns are valid. This type of testing does not allow a product to demonstrate what it is truly good at, nor does it enable the reviewer to understand how the vendor intends it to be used or how the product development team designed it.

To overcome some of these challenges, VIP has taken a different approach to comparing products. We continue to conduct independent, single-product reviews. After amassing a number of reviews of products that meet similar needs, we can then compare the results of the review process for each product. Each product thus benefits from the undivided attention of a researcher who delves into its strengths, while at the same time we're able to extract details from the reviews that enable comparisons.

The most recent result of this is the FreePint Research Report: Comparison of News Aggregators, which pulls results from product reviews of Factiva.com (November 2011), NexisUK (November 2011), FirstRain (August 2011), Newsdesk 4 (November 2011), NewsEdge.com (October 2010) and Eureka.ca (October 2010). The analysis compares researcher findings about each of these products on included content, output formats, search capabilities and functions, pricing approaches and other dimensions of interest. Laying these comparisons out in tables gives readers an easier route to comparing the different products and to identifying which might be worthy of closer scrutiny.

Most of the tables included in the report are based on facts: the number of sources reported and their types, for instance, or the different languages represented within the content. To my thinking, however, the most interesting table is the one based on opinions: the summary table of VIP's View for each of these product reviews.

A quick overview of the contents of this table provides insight into what reviewers are generally looking for in the products they review for VIP: comprehensive content, an easy-to-navigate interface, unique areas of coverage, and clever programming that puts the content right where the user wants it (e.g., into an alert or onto a mobile device). A scan of the table also shows some of the areas where vendors tend to stumble, such as navigation that is less than intuitive or other usability challenges.

These are hardly surprising findings, and yet their value comes from the fact that they are research-based. They are not just an individual's opinion, but the accretion of opinion, insight and testing from consistent researchers across a series of products and over a period of time.

The full FreePint Research Report: Comparison of News Aggregators provides a unique view into the pros and cons of many business news providers. Is there another product category for which you'd like to see a similar analysis? Let me know.
