Show HN: Analyzing ~10k professional product reviews to calculate a single score
criticaster.com

For years I've trusted Metacritic's approach to rating games, movies, and TV shows: collect all the professional reviews, score them, and calculate a meta score. In my experience it's the best way to discover new things to watch or play.
I often wished something like this existed for physical products, since right now I'm usually stuck doing days of manual research online myself.
Over the past 4 weeks I’ve been building an experiment that applies the same aggregation idea to tech products. It collects professional reviews, extracts and normalizes scores, and produces a single “critic score” per product.
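To make the normalization step concrete, here's a minimal sketch of how scores from different publication scales (5-star, 10-point, percent, letter grades) could be mapped onto a common 0–100 scale and averaged into one critic score. This is an illustrative assumption on my part, not the site's actual implementation; the function names and grade mapping are hypothetical.

```python
# Hypothetical normalization: map raw review scores from mixed scales
# onto 0-100, then average them into a single critic score.

# Illustrative letter-grade mapping (an assumption, not Criticaster's).
LETTER_GRADES = {"A+": 100, "A": 95, "A-": 90, "B+": 85, "B": 80,
                 "B-": 75, "C+": 70, "C": 65, "C-": 60, "D": 50, "F": 30}

def normalize(raw: str) -> float:
    """Map a raw score like '4/5', '8.5/10', '90%', or 'B+' to 0-100."""
    raw = raw.strip()
    if raw in LETTER_GRADES:
        return float(LETTER_GRADES[raw])
    if raw.endswith("%"):
        return float(raw[:-1])
    value, _, scale = raw.partition("/")  # e.g. '4/5' -> 4 out of 5
    return float(value) / float(scale) * 100

def critic_score(raw_scores: list[str]) -> float:
    """Unweighted mean of normalized scores, rounded to one decimal."""
    return round(sum(normalize(s) for s in raw_scores) / len(raw_scores), 1)

print(critic_score(["4/5", "9/10", "90%"]))  # -> 86.7
```

A real version would likely need per-publication weighting and calibration (some outlets grade systematically harsher than others), which is exactly the bias question raised below.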
So far the dataset includes ~9,630 reviews across ~1,340 products. As a small sanity check, I compared the results against two recent purchases of mine, and the "best for most" recommendation matched what I eventually chose after many hours of manual research.
I'm curious what you think about this approach, especially around score normalization, bias between publications, and whether a single aggregated score is actually useful when evaluating products. Here to answer any questions :-)