poddit

Reddit has incredible discussions, but reading through threads takes time. What if you could listen to the top posts from your favorite subreddits while commuting, exercising, or doing chores?

Poddit is an experimental app I built that transforms Reddit's top daily posts into AI-generated podcast episodes.


the concept

Every day, fascinating conversations happen on Reddit across thousands of communities. But consuming this content requires sitting at a computer or phone, reading thread after thread.

Podcasts solve this for traditional media - you can stay informed while doing other things. I wanted to bring that same convenience to Reddit content.

Poddit automatically:

  1. Fetches the top posts from yesterday
  2. Compiles titles and top comments
  3. Generates a podcast-style script using AI
  4. Converts it to audio using text-to-speech
  5. Delivers it as a listenable episode
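The five steps can be sketched as a single pipeline. Everything below is illustrative - the function names and types stand in for the real Snoowrap, LLM, TTS, and Firebase calls:

```typescript
interface RedditPost {
  title: string;
  topComments: string[];
}

// Each step is injected as a function; in the real app these would wrap
// Snoowrap, the LLM API, the TTS service, and Firebase respectively.
type FetchTop = (subreddit: string) => Promise<RedditPost[]>;
type WriteScript = (digest: string) => Promise<string>;
type Synthesize = (script: string) => Promise<Uint8Array>;
type Publish = (subreddit: string, audio: Uint8Array) => Promise<string>;

// Runs steps 1-5 for one subreddit and returns the episode's storage path.
async function generateEpisode(
  subreddit: string,
  fetchTop: FetchTop,
  writeScript: WriteScript,
  synthesize: Synthesize,
  publish: Publish,
): Promise<string> {
  const posts = await fetchTop(subreddit);            // 1. fetch top posts
  const digest = posts                                // 2. compile titles + comments
    .map(p => `${p.title}\n${p.topComments.join('\n')}`)
    .join('\n---\n');
  const script = await writeScript(digest);           // 3. LLM podcast script
  const audio = await synthesize(script);             // 4. text-to-speech
  return publish(subreddit, audio);                   // 5. store + deliver
}
```

Injecting the steps as functions also makes the pipeline easy to exercise with stubs before wiring in the real APIs.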

the pipeline

The system runs on Firebase Cloud Functions with a scheduled job:

Step 1: Reddit scraping
Using Snoowrap (a Reddit API wrapper), I fetch the top 3 posts from the past 24 hours, along with their top comments. This captures the most engaging content and the community's reactions.

Step 2: Transcript generation
An LLM processes the raw Reddit data and transforms it into a conversational podcast script. Instead of just reading posts verbatim, it creates natural dialogue that flows like a real podcast discussion.
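As a sketch, the compiled Reddit data might be wrapped in a prompt like the one below. The wording is purely illustrative - Poddit's actual prompt isn't reproduced here:

```typescript
interface ScrapedPost {
  title: string;
  comments: string[];
}

// Builds an illustrative podcast-script prompt from scraped posts:
// instructions up front, then the raw material the model should cover.
function buildScriptPrompt(subreddit: string, posts: ScrapedPost[]): string {
  const material = posts
    .map((p, i) =>
      `Post ${i + 1}: ${p.title}\nTop comments:\n` +
      p.comments.map(c => `- ${c}`).join('\n'))
    .join('\n\n');
  return [
    `You are a podcast host summarizing yesterday's top posts from r/${subreddit}.`,
    'Write a conversational script with natural transitions between posts.',
    'Summarize opinions neutrally; do not read posts verbatim.',
    '',
    material,
  ].join('\n');
}
```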

Step 3: Audio synthesis
The generated script is fed to a text-to-speech system, creating the final audio file. The result is a podcast episode summarizing the day's top content.

Step 4: Storage and delivery
The MP3 file is stored in Firebase Storage and referenced in Firestore, making it accessible through the Angular frontend.
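A minimal sketch of what the Firestore record for an episode could look like - the field names here are hypothetical, not Poddit's actual schema:

```typescript
// Hypothetical Firestore document shape for a finished episode.
interface EpisodeDoc {
  subreddit: string;
  date: string;        // YYYY-MM-DD of the day the episode covers
  storagePath: string; // MP3 location in Firebase Storage
  durationSec?: number;
}

// Builds the document the frontend reads to list and play episodes.
function makeEpisodeDoc(
  subreddit: string,
  covered: Date,
  storagePath: string,
): EpisodeDoc {
  return {
    subreddit,
    date: covered.toISOString().slice(0, 10),
    storagePath,
  };
}
```

Keeping only a Storage path in Firestore (rather than the audio itself) keeps documents small and lets the frontend stream the MP3 directly.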

scheduled automation

A cron job runs daily at 9 AM, automatically generating podcasts for subscribed subreddits. Users wake up to fresh episodes without manual intervention.
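One practical detail of a daily job is idempotency: if the 9 AM run is retried, it shouldn't produce duplicate episodes. A hedged sketch, assuming episodes are keyed by the date they cover:

```typescript
// Derives the YYYY-MM-DD key for the day the episode covers
// ("yesterday" relative to the scheduled run). Using this key as the
// episode ID means a retried job overwrites rather than duplicates.
// Illustrative approach, not Poddit's confirmed behavior.
function yesterdayKey(now: Date): string {
  const yesterday = new Date(now.getTime() - 24 * 60 * 60 * 1000);
  return yesterday.toISOString().slice(0, 10);
}
```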

This "set and forget" model makes Poddit genuinely useful - you don't need to remember to generate content; it just appears.

the frontend

Built with Angular and server-side rendering (SSR), the web app lets users:

  • Browse available subreddit podcasts
  • Subscribe to communities for daily episodes
  • Listen to past episodes
  • Request new subreddit coverage

Authentication via Firebase lets users manage their subscriptions and track listening history.

technical challenges

Several hurdles required creative solutions:

API rate limits
Reddit's API enforces strict rate limits. I had to batch requests carefully and cache data to stay under the limits while still fetching fresh content.
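One simple way to batch requests under a quota is to process them in fixed-size groups with a pause in between. This is a generic sketch, not Poddit's exact throttling code - the batch size and delay are whatever the quota allows:

```typescript
// Splits an array into fixed-size batches.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Runs `fn` over all items, one batch at a time, pausing between
// batches so the per-minute API quota is never exceeded.
async function runBatched<T, R>(
  items: T[],
  batchSize: number,
  delayMs: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = [];
  for (const batch of chunk(items, batchSize)) {
    results.push(...(await Promise.all(batch.map(fn))));
    if (results.length < items.length) {
      await new Promise<void>(res => setTimeout(res, delayMs));
    }
  }
  return results;
}
```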

Context length
LLMs have finite context windows. Compiling multiple posts with their comments quickly exceeds the limit, requiring intelligent summarization before script generation.
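A rough character-based token estimate is often enough to trim lower-ranked comments until the digest fits. The 4-chars-per-token ratio and the budget below are illustrative assumptions, not the values Poddit uses:

```typescript
// Rough estimate: ~4 characters per token for English text.
const CHARS_PER_TOKEN = 4;

// Keeps comments in ranked order until the token budget is spent,
// dropping lower-ranked comments first.
function fitToBudget(comments: string[], maxTokens: number): string[] {
  const kept: string[] = [];
  let used = 0;
  for (const c of comments) {
    const cost = Math.ceil(c.length / CHARS_PER_TOKEN);
    if (used + cost > maxTokens) break;
    kept.push(c);
    used += cost;
  }
  return kept;
}
```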

Audio quality
Early TTS voices sounded robotic. Finding the right voice model and adjusting pacing made a huge difference in listenability.

Cost management
LLM API calls and TTS generation aren't free. I added safeguards to limit daily episode generation and prevent runaway costs.
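The safeguard can be as simple as a per-day counter that refuses work past a cap. A minimal sketch, with a hypothetical limit - the real guard would persist the count (e.g. in Firestore) rather than in memory:

```typescript
// Illustrative guard: caps how many episodes the pipeline will
// generate per day so a bug or traffic spike can't run up API bills.
class DailyCap {
  private counts = new Map<string, number>();

  constructor(private readonly limit: number) {}

  // Returns true (and records the run) if today's cap allows one more.
  tryConsume(dateKey: string): boolean {
    const used = this.counts.get(dateKey) ?? 0;
    if (used >= this.limit) return false;
    this.counts.set(dateKey, used + 1);
    return true;
  }
}
```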

ai integration

The LLM component is the heart of the pipeline - it's what transforms raw Reddit text into something listenable. The prompt engineering involved:

  • Understanding Reddit's culture and tone
  • Maintaining neutrality while summarizing opinions
  • Creating natural transitions between topics
  • Handling controversial content appropriately

Getting this right took multiple attempts. Early versions felt too formal or missed the essence of discussions.

use cases

Poddit works best for:

  • Niche communities - Stay updated on specialized topics
  • News subreddits - Get daily summaries of current events
  • Hobby communities - Learn about new developments while on the go
  • Educational subreddits - Absorb information passively

It's less suitable for image-heavy or meme-focused communities where the visual content matters most.

product approach

Product thinking shaped Poddit's direction. I focused on simplicity - creating daily digests rather than trying to cover every feature Reddit offers.

The design prioritized a clean and approachable frontend, avoiding the temptation to overcomplicate the interface.

ethics

Transforming user-generated content raised questions:

  • Attribution - post authors are credited
  • Context - summarization can lose nuance; the app is transparent about that
  • Permissions - only publicly available content is used
  • Monetization - any future monetization would need to respect original creators

These aren't fully solved problems, but I approached them thoughtfully.

lessons learned

Building Poddit taught me:

  1. AI pipelines need guardrails - Costs and quality both require monitoring
  2. Scheduled automation is powerful - Users value "set it and forget it" features
  3. Audio consumption is different - What works in text doesn't always work spoken
  4. Integration is complex - Combining multiple APIs (Reddit, LLM, TTS, Firebase) creates many failure points

reflection

Poddit represents an interesting experiment in content conversion. Reddit's value lies in its community discussions, but the format isn't always convenient.

By automating the conversion to audio, Poddit makes that value accessible in new contexts - during commutes, workouts, or household chores.

The name "Poddit" is simply "podcast" + "Reddit" - sometimes the obvious name is the right name.

While the project remains experimental, it demonstrates how AI can reshape the way we consume existing content platforms, making them fit into our lives rather than requiring us to fit into theirs.