Improving the review experience on Product Hunt

We wanted to update the review experience on Product Hunt to increase information density, give us more data to work with, and improve the overall experience on the platform. Reviews play a significant role in SEO performance and in helping our "researcher" user segment make more educated decisions about products.

However, *more* reviews aren't good enough. We want to make sure that they’re complete, accurate, and contain valuable content. We want to make it clear what exactly is being reviewed, and avoid confusion about the purpose of reviews.

A few issues we sought to address:

  • Reviews were too closely aligned (both visually and copy-wise) with comments. Visitors were leaving "reviews" in the style of "comments" -- a scenario that is neither useful for makers nor good for data integrity.
  • How do we acquire more structured data we can use to differentiate similar products? The primarily free-text content of reviews made it difficult to establish standardized rankings, ratings, and comparisons.
  • Should reviews be tied to launches at all? Because we were actively soliciting reviews on "launches", we were potentially incentivizing friends + family to leave reviews to boost initial launch engagement. It seems unlikely that many (authentic) users have actually used the product at the "launch" stage.

Dedicated pages

Our updated and streamlined approach to reviews had two main components.

One –– We designed a dedicated review page that amalgamated all reviews from previous launches of a single product. Because of how our data was structured, much of our review data was split across individual launches of the same product. This update depended on significant back-end work, data migration, and a concurrent design project focused on creating "hub" pages to act as the home for all launches related to a single product (Ex. "Figma" = hub, "Figma Community" + "Figma Widgets" + "Figma Auto Layout" = launches).
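As a rough illustration of the hub/launch relationship that makes this aggregation possible (the names and fields below are hypothetical, not our production schema), reviews that previously hung off individual launches roll up to a single product hub:

```typescript
// Hypothetical sketch of the hub/launch relationship (not our actual schema).
interface Review {
  id: string;
  body: string;
  rating: number;
}

interface Launch {
  id: string;
  name: string;          // e.g. "Figma Widgets"
  hubId: string;         // every launch belongs to one product hub
  reviews: Review[];
}

interface ProductHub {
  id: string;
  name: string;          // e.g. "Figma"
  launches: Launch[];
}

// The dedicated review page amalgamates reviews across all launches of a hub.
function reviewsForHub(hub: ProductHub): Review[] {
  return hub.launches.flatMap((launch) => launch.reviews);
}
```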

Two –– We moved the primary review experience away from launches in order to incentivize more authentic reviews. We also updated the review experience with positive and negative tagging. This gave us structured data that was consistent across products –– data we could use for future features around comparisons and alternatives.
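A minimal sketch of what structured positive/negative tagging could look like on a review record (the field and tag names here are illustrative assumptions, not our actual data model):

```typescript
// Illustrative sketch of structured review tags (names are assumptions).
type Sentiment = "positive" | "negative";

interface ReviewTag {
  label: string;        // e.g. "Easy to use", "Expensive"
  sentiment: Sentiment;
}

interface StructuredReview {
  productId: string;
  body: string;
  tags: ReviewTag[];    // consistent across products, so they can be aggregated
}

// Because tags are consistent across products, counts like these could feed
// comparison and "alternatives" features.
function tagCounts(reviews: StructuredReview[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const review of reviews) {
    for (const tag of review.tags) {
      const key = `${tag.sentiment}:${tag.label}`;
      counts.set(key, (counts.get(key) ?? 0) + 1);
    }
  }
  return counts;
}
```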

Results

We had a couple of different metrics for success. First, completing the data migration was a major team success in its own right. It was a necessary step that gave us an improved foundation to build upon.

Secondly, we looked at visits to /review pages and our rate of reviews. Visits improved, largely because the consolidated reviews created a more useful, information-dense page with a major SEO benefit.

Our rate of reviews actually decreased for a time, which was an expected immediate side effect of this change. By moving reviews away from launches, the CTA to review products became less visible to users. However, we were comfortable with this trade-off, as we felt it would increase long-term trust in our reviews.

After the completion of this project, we continued to develop and test additional ways to drive new reviews that increased visibility without sacrificing quality.