Kyle Frost

32% Engagement Lift Through Product Hub Redesign

Product Hubs consolidate all launches from a single product in one place, showing version history, updates, and evolution over time. Usage data showed users weren't clicking through to maker websites at expected rates because the primary CTA was competing with too much noise on the page.

I redesigned the page around what data showed users actually cared about: understanding the current product version and getting to the maker's site. Everything else was stripped back or removed.

+32% click-through rate to maker websites and app stores

+21% pages per session as users engaged with more product versions

Lower bounce rate, as a clearer page purpose kept users on the site

Problems

The V1 Hub page tried to do too much. Every element competed for attention, and the things users actually cared about (understanding the product and getting to the maker's site) were buried under metrics, awards, and a noisy activity feed.

Product Hunt Hub page before redesign with annotated issues
1. Awards are good social proof but lose relevance quickly, especially when tied to a specific day or week. They're less meaningful to organic visitors unfamiliar with Product Hunt's daily rhythm.

2. Aggregate counts of votes, launches, and followers took up prominent space but didn't help users make decisions about the product.

3. "Follow" was the primary CTA, but we wanted to shift focus toward getting users to the maker's site, the action that actually provided value to makers.

4. Only three images were supported (often cropped poorly), and important links like social profiles and app store URLs were hidden behind "More info."

5. Activity mixed updates, announcements, and launches together. Announcements were prominent despite many being duplicates of awards. Visitors and makers primarily cared about actual product launches.

Approach

I started by analyzing usage data, reviewing heatmaps and click tracking to understand where people were engaging and where they weren't. The data was clear. Users primarily cared about two things: understanding the current version of the product, and getting to the maker's website quickly. Everything else on the page was noise.

The biggest internal debate was around aggregate metrics (total upvotes, launch counts, follower numbers). Stakeholders liked showing impressive numbers, but the data showed these weren't helping users make decisions. I used the click tracking data to make the case for stripping them back and refocusing the page around the two actions users actually took.
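For illustration, the case against aggregate metrics reduces to a click-share breakdown by page element. Here's a minimal sketch of that kind of analysis in Python; the element names and counts are hypothetical, not Product Hunt's actual tracking data.

    # Hypothetical per-element click counts pulled from click tracking.
    # Element names and numbers are illustrative only.
    from collections import Counter

    clicks = Counter({
        "get_it_button": 4310,    # visiting the maker's site
        "launch_timeline": 1840,  # exploring product versions
        "follow_button": 620,
        "awards_badge": 210,
        "aggregate_metrics": 90,  # the stats stakeholders wanted to keep
    })

    # Share of all tracked clicks captured by each element.
    total = sum(clicks.values())
    for element, count in clicks.most_common():
        print(f"{element:>18}: {count / total:6.1%} of clicks")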

Solution

The redesigned Hub page put the "Get it" button front and center in a clean hero section with product name, tagline, and maker info. I removed all competing elements that were drawing attention away from that primary action.

Below the hero, a chronological launch timeline showed the product's evolution. The current version dominated the top of the page. Historical launches appeared smaller and less saturated, clearly secondary but still there for users who wanted to explore the product's history.

Each launch card showed only what mattered: date, tagline, upvotes, comments. No redundant descriptions, no duplicate metadata. The entire page worked toward getting users to understand the product and click through to the maker's site.
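To make "only what mattered" concrete, here's the minimal data each card needs under this design, sketched in Python with hypothetical field names rather than Product Hunt's actual schema.

    # Hypothetical launch card model: just the fields the redesigned
    # page renders. Names are illustrative, not the production schema.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class LaunchCard:
        launched_on: date
        tagline: str
        upvotes: int
        comments: int
        is_current: bool  # current version renders large; history smaller

    # e.g. the card for the current version:
    current = LaunchCard(date(2019, 6, 4), "Now with team accounts",
                         812, 64, is_current=True)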

Product Hunt Hubs redesign

Impacts

📈

32% Increase in CTR

Click-through rate to maker websites and app stores increased by 32% after the redesign, directly supporting our goal of driving traffic to makers.

📄

21% More Pages Per Session

Clearer hierarchy helped users navigate between current and historical launches, increasing engagement with multiple versions of products.

👍

Positive Maker Feedback

Makers reported seeing increased traffic to their websites, validating that the redesign was achieving its primary objective.

📉

Reduced Bounce Rate

Bounce rate decreased as users found it easier to understand what Hubs offered and navigate to relevant information quickly.

Reflections

1. V1 was based on assumptions about what users would value. V2 was informed by actual usage patterns, and it showed.

2. The biggest challenge was balancing stakeholder desires (showing impressive aggregate metrics) with user needs (finding current product information quickly). Data helped make the case for prioritizing user goals.

3. If I could do it again, I'd implement A/B testing from the V1 launch to gather comparative data sooner. We relied on pre/post metrics, but controlled experiments would have given us more confidence in specific changes (a sketch of that kind of comparison follows).
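To illustrate the difference: with a proper control/variant split, a change in click-through rate can be tested directly instead of inferred from before/after traffic, which is confounded by seasonality and traffic mix. A minimal sketch of that comparison in Python; all the traffic numbers are hypothetical.

    # Two-sided z-test for a difference in CTR between a control (V1)
    # and a variant (V2). All counts below are hypothetical.
    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(clicks_a, visits_a, clicks_b, visits_b):
        p_a = clicks_a / visits_a
        p_b = clicks_b / visits_b
        pooled = (clicks_a + clicks_b) / (visits_a + visits_b)
        se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal distribution.
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return p_b - p_a, p_value

    lift, p = two_proportion_z_test(clicks_a=820, visits_a=10000,
                                    clicks_b=1080, visits_b=10000)
    print(f"Absolute CTR lift: {lift:+.2%} (p = {p:.4g})")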