# ClawFeed: AI-Powered News Digest with Structured Summaries from Twitter/RSS and Web Dashboard
ClawFeed is an open-source AI news aggregation tool that automatically curates content from Twitter, RSS, HackerNews, Reddit, GitHub Trending, and more. It uses AI for smart filtering and noise reduction to generate structured digests at four frequencies: 4-hourly, daily, weekly, and monthly.
Key highlights:
- **Source Packs**: Users can package and share curated source bundles with the community — like npm for information sources
- **Mark & Deep Dive**: Bookmark content to trigger AI-powered deep analysis, going beyond summaries into real insight
- **Feed Output**: Every user's digest is subscribable via RSS/JSON Feed, enabling "human curation + AI processing" information redistribution
- **Multi-tenant**: SQLite + Google OAuth, with independent source management per user
Clean tech stack: Node.js backend + SQLite + SPA frontend. Works as an OpenClaw/Zylos skill or standalone deployment. A feature-complete, hackable reference for developers building personal information aggregation systems.
## ClawFeed Overview
ClawFeed is an open-source AI news aggregation tool by Kevin He. Core philosophy: "Stop scrolling. Start knowing." — using AI to filter signal from noise across thousands of information sources.
## Core Features

### Multi-Frequency Digests
Four time granularities: 4-hourly briefs, daily highlights, weekly reviews, monthly summaries. Each frequency is independently generated; users choose their subscription level.
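The four granularities map naturally onto cron schedules. As an illustration only (the schedule times and the `clawfeed digest` CLI entry point are hypothetical, not part of ClawFeed's documented interface), a crontab might look like:

```
# 4-hourly briefs
0 */4 * * *  clawfeed digest --frequency 4h
# daily highlights at 08:00
0 8 * * *    clawfeed digest --frequency daily
# weekly review every Monday morning
0 9 * * 1    clawfeed digest --frequency weekly
# monthly summary on the 1st
0 9 1 * *    clawfeed digest --frequency monthly
```

Since each frequency is generated independently, a user subscribed only to the weekly review is unaffected by the 4-hourly runs.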
### Rich Source Support
- **Twitter/X**: Follow specific users (@karpathy) or Twitter lists
- **RSS/Atom**: Any RSS feed
- **HackerNews**: HN front page
- **Reddit**: Specific subreddits (e.g., /r/MachineLearning)
- **GitHub Trending**: Filter by programming language
- **Website scraping**: Any web page
- **Custom API**: JSON endpoints
- **Digest Feed**: Subscribe to other ClawFeed users' digests (meta-curation)
### Source Packs
Users can bundle curated source sets into Packs and publish them for one-click community installation. Think npm packages but for information curation — interested in AI frontiers? Just install someone's AI Source Pack.
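The pack format itself is not specified in this overview; purely as an illustration (all field names here are hypothetical), a Source Pack manifest bundling the source types listed above might look like:

```json
{
  "name": "ai-frontiers",
  "description": "Curated AI research and engineering sources",
  "sources": [
    { "type": "twitter", "handle": "karpathy" },
    { "type": "rss", "url": "https://example.com/ai-blog.xml" },
    { "type": "reddit", "subreddit": "MachineLearning" },
    { "type": "github-trending", "language": "python" }
  ]
}
```

Installing a pack would then amount to merging its `sources` list into the user's own source configuration.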
### Mark & Deep Dive
Bookmark interesting items while browsing digests. Bookmarked items trigger AI-powered deep analysis, generating detailed reports beyond summaries.
### Feed Output
Every user's digest auto-generates three subscription formats:
- HTML page (`/feed/:slug`)
- JSON Feed (`/feed/:slug.json`)
- RSS (`/feed/:slug.rss`)
Your AI-curated results become subscribable, so one user's digest can serve as another user's source and information propagates along curation chains.
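As a minimal sketch of consuming these feeds (the base URL and slug are placeholders; only the `/feed/:slug` route shapes come from the list above, and the JSON Feed structure follows the public JSON Feed spec):

```javascript
// Build the three subscription URLs for a digest slug.
// The /feed/:slug routes come from the list above; the base URL is a placeholder.
function feedUrls(base, slug) {
  return {
    html: `${base}/feed/${slug}`,
    json: `${base}/feed/${slug}.json`,
    rss: `${base}/feed/${slug}.rss`,
  };
}

// Pull item titles out of a JSON Feed payload
// (per the JSON Feed spec: a top-level `items` array, each item with `title`, `url`, ...).
function digestTitles(feed) {
  return (feed.items || []).map((item) => item.title);
}

const urls = feedUrls("https://clawfeed.example.com", "ai-weekly");
console.log(urls.json); // https://clawfeed.example.com/feed/ai-weekly.json
```

A consumer would fetch `urls.json`, parse it, and pass the result to `digestTitles` (or any standard JSON Feed reader, since the format is an open spec).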
## Technical Architecture
- **Backend**: Node.js, port 8767
- **Database**: SQLite (zero-config, portable)
- **Auth**: Google OAuth 2.0 (multi-user; read-only without OAuth)
- **Frontend**: SPA with Dark/Light theme toggle
- **i18n**: English + Chinese UI
- **AI configurable**: `templates/curation-rules.md` for filtering rules, `templates/digest-prompt.md` for output format
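The contents of these templates are user-editable and not fixed by ClawFeed; as a purely illustrative sketch (not the shipped defaults), `templates/curation-rules.md` might encode filtering preferences like:

```markdown
<!-- templates/curation-rules.md (illustrative content, not the shipped defaults) -->
- Keep: original research, release announcements, substantive technical write-ups
- Drop: engagement bait, duplicate coverage of the same story, pure marketing
- Prefer primary sources over commentary when both cover the same event
```

The AI applies these rules during filtering, while `templates/digest-prompt.md` separately controls how the surviving items are formatted into the digest.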
## Deployment

### As OpenClaw Skill

```shell
clawhub install clawfeed
```
Auto-detected via SKILL.md. The agent generates digests on a cron schedule, serves the dashboard, and handles bookmark commands.
### As Zylos Skill

```shell
cd ~/.zylos/skills/
git clone https://github.com/kevinho/clawfeed.git
```
### Standalone

```shell
git clone https://github.com/kevinho/clawfeed.git
cd clawfeed
npm install
cp .env.example .env
npm start
```

Supports Caddy reverse proxy deployment under a path prefix.
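A minimal Caddyfile sketch for such a setup, assuming the app listens on port 8767 (per the architecture notes above) and is served under a `/clawfeed` prefix (the domain and prefix are placeholders):

```
news.example.com {
    # handle_path strips the /clawfeed prefix before proxying
    handle_path /clawfeed/* {
        reverse_proxy localhost:8767
    }
}
```

`handle_path` (rather than `handle`) matters here: it removes the matched prefix, so the Node.js backend sees the same paths as in a root deployment.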
## API
Full RESTful API: digest CRUD, authentication, bookmarks, source management, Source Packs, Feed output, changelog and roadmap queries. Write operations require auth; read operations (digest browsing, Feed subscriptions) are public.
## Use Cases
- Personal information aggregation: Replace manual Twitter/HN/Reddit scrolling
- Team knowledge sharing: Via Source Packs and Feed output
- AI Agent integration: As OpenClaw/Zylos skill for automated operation
License: MIT | Live Demo: https://clawfeed.kevinhe.io