slash-twitter

Building an audience on social media requires consistent posting. But creating original content daily is exhausting, especially when you're curating content for a niche community.

I built slash-twitter, a bot that automatically reposts top Reddit content to Twitter, keeping a Twitter account active with high-quality community content without manual effort.

the use case

Many subreddit communities have quality content that never reaches Twitter. Meanwhile, Twitter accounts in the same niche struggle to maintain consistent posting schedules.

The bot bridges this gap: it monitors a subreddit for hot posts, filters them based on engagement thresholds, and automatically shares the best ones on Twitter with proper attribution.

I used this for communities like r/succulents, where beautiful plant photos and care tips would perform well on both platforms.

how it works

The implementation is refreshingly simple:

  1. Fetch recent tweets to avoid posting duplicates
  2. Query Reddit for hot posts above a vote threshold
  3. Compare titles to ensure we haven't posted this before
  4. Post to Twitter with the title, Reddit link, and hashtag
  5. Stop after one successful post to avoid spam
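The steps above can be sketched as a single pure-Python loop. The helper names, data shapes, and the 500-vote threshold here are my own illustration, not the project's actual code:

```python
# Sketch of the posting loop; helper names and data shapes are illustrative.

def already_posted(title, recent_tweets):
    """Stateless duplicate check: compare the first 20 characters of the
    title against the start of each recent tweet."""
    return any(t.startswith(title[:20]) for t in recent_tweets)

def compose_tweet(title, url, hashtag):
    """Tweet text: Reddit title, permalink, and a discoverability hashtag."""
    return f"{title} {url} #{hashtag}"

def run_once(hot_posts, recent_tweets, min_score=500):
    """Post at most one item per run: return the first eligible tweet, or None."""
    for post in hot_posts:
        if post["score"] < min_score:
            continue  # not enough engagement
        if already_posted(post["title"], recent_tweets):
            continue  # avoid reposting
        return compose_tweet(post["title"], post["url"], "succulents")
    return None
```

Returning after the first match is what keeps the bot to one post per run: the caller tweets the result and exits.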

The bot runs on a schedule via Heroku Scheduler - every few hours, it checks for new content and posts at most one item if it finds something worthy.

tech stack

Built with Python and minimal dependencies:

  • PRAW (Python Reddit API Wrapper) for Reddit integration
  • Tweepy for Twitter API access
  • Heroku for hosting and scheduling
  • Environment variables for API credentials

The entire logic fits in about 20 lines of code. No database needed - the bot is stateless and relies on checking recent tweets to avoid duplicates.
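Keeping the bot stateless also means credentials live in the environment rather than in code or a database. A minimal sketch of that pattern - the variable names are illustrative, not the project's actual keys:

```python
import os

def load_credentials():
    """Read API keys from environment variables (names are illustrative)
    and fail fast with a clear error if any are missing."""
    required = [
        "REDDIT_CLIENT_ID", "REDDIT_CLIENT_SECRET",
        "TWITTER_API_KEY", "TWITTER_API_SECRET",
        "TWITTER_ACCESS_TOKEN", "TWITTER_ACCESS_SECRET",
    ]
    missing = [name for name in required if name not in os.environ]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return {name: os.environ[name] for name in required}
```

Failing fast on a missing key beats a cryptic OAuth error halfway through a scheduled run.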

smart filtering

Not every Reddit post should be tweeted. The bot includes several filters:

  • Upvote threshold - only posts with sufficient engagement
  • Pinned posts excluded - stickied moderator posts aren't content
  • Title length limits - avoid truncated tweets
  • Duplicate detection - check the first 20 characters of recent tweets

These constraints ensure quality and prevent spam, making the Twitter feed feel curated rather than automated.
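The four filters are easy to express as a single predicate. The threshold values and field names below are my own guesses, not the project's actual numbers:

```python
MIN_SCORE = 500       # assumed upvote threshold
MAX_TITLE_LEN = 200   # keep room for the link and hashtag in the tweet

def should_tweet(post, recent_tweets):
    """Apply the four filters: engagement, pinned status, title length,
    and prefix-based duplicate detection."""
    if post["score"] < MIN_SCORE:
        return False  # not enough engagement
    if post["stickied"]:
        return False  # moderator announcements aren't content
    if len(post["title"]) > MAX_TITLE_LEN:
        return False  # would truncate in the tweet
    prefix = post["title"][:20]
    if any(t.startswith(prefix) for t in recent_tweets):
        return False  # already posted recently
    return True
```

Comparing only a 20-character prefix is a deliberately cheap duplicate check: it tolerates the link and hashtag that get appended to the tweet, at the cost of occasionally rejecting two distinct posts with similar openings.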

deployment

Heroku makes this trivial to run:

  1. Connect the GitHub repository
  2. Set environment variables for API keys
  3. Add Heroku Scheduler
  4. Configure the schedule to run python main.py
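With the Heroku CLI, the setup amounts to a handful of commands. This is a sketch of the flow - the app name and variable names are illustrative, and the job itself (run python main.py every few hours) is defined in the Scheduler dashboard:

```shell
# Create the app and set credentials (names illustrative)
heroku create slash-twitter
heroku config:set REDDIT_CLIENT_ID=... REDDIT_CLIENT_SECRET=...
heroku config:set TWITTER_API_KEY=... TWITTER_API_SECRET=...

# Add the Scheduler add-on, then open its dashboard to define the job
heroku addons:create scheduler:standard
heroku addons:open scheduler
```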

No servers to manage, no complex infrastructure. The bot runs in the background, costs almost nothing, and requires zero maintenance.

api considerations

Working with social media APIs taught me important lessons:

  • Rate limits are real - be conservative with requests
  • OAuth is mandatory - both platforms require proper authentication
  • Breaking changes happen - I had to use a forked version of Tweepy when the main library wasn't updated for Twitter API changes
  • ToS compliance matters - automation should add value, not spam
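For the rate-limit point in particular, a conservative client wraps each API call in retry-with-backoff. This is a generic sketch rather than the bot's actual error handling - in real code the caught exception would be the client library's rate-limit error (e.g. Tweepy's TooManyRequests, or its built-in wait_on_rate_limit option):

```python
import time

def with_backoff(call, retries=3, base_delay=1.0):
    """Retry a rate-limited API call with exponential backoff.

    `call` is any zero-argument function; RuntimeError stands in for the
    client library's rate-limit exception in this sketch."""
    for attempt in range(retries):
        try:
            return call()
        except RuntimeError:
            if attempt == retries - 1:
                raise  # out of retries: surface the error
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

Doubling the delay on each failure keeps the bot well under the limits while still recovering quickly from a single throttled request.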

evolution

The bot went through several versions:

  • Initially posted too frequently, feeling spammy
  • Tightened upvote thresholds to ensure quality
  • Added hashtags for better discoverability
  • Excluded pinned posts that weren't interesting content
  • Reduced text length to avoid truncation

Each change made the output feel more human and less robotic.

reflection

Slash-twitter demonstrates that automation doesn't have to be complex. With two API clients and a scheduler, you can create a content pipeline that runs indefinitely.

The project also highlighted the value of cross-platform content syndication. Reddit communities generate amazing content that deserves wider reach, and Twitter audiences appreciate curated content from trusted sources.

While I no longer maintain active instances, the bot successfully ran for months, posting hundreds of pieces of quality content and growing Twitter followings organically.

live examples