3goodsources¶
GitHub Repo | Updated: February 2026
A trust registry for AI agents built on the Model Context Protocol. When an agent needs to learn Rust or set up a Bitcoin node, 3goodsources delivers three curated, cryptographically signed sources instead of hundreds of SEO-gamed search results.
Tech Stack:

- Rust
- Model Context Protocol (MCP)
- PKARR (cryptographic identity)
- Fuzzy matching (normalized Levenshtein)
Why I Built This¶
AI agents have access to the entire web, but most of it is garbage. When Claude asks for sources on setting up a Bitcoin node, traditional search returns hundreds of results optimized for PageRank, not accuracy. Listicles stuffed with affiliate links. Blog posts that copy other blog posts. SEO-gamed content that looks authoritative but isn't.
The problem isn't access to information. The problem is that agents have no quality signal. They can't distinguish between the official Bitcoin Core documentation and some random Medium post that's third in the results because the author knew SEO tricks.
I wanted to build a system where a human curator researches a topic, picks exactly three trusted sources, and signs those recommendations cryptographically. The constraint is deliberate: three sources per topic, always. Quality over quantity. Primary sources over blog posts. Practical value over PageRank.
The Model Context Protocol was the perfect fit. It's a standardized way for agents to call tools and get structured responses. Building an MCP server meant any MCP-compatible agent could use the registry without custom integration code.
How It Works¶
At its core, 3goodsources is a fuzzy-matching query engine backed by a human-curated registry. The registry is a JSON file where each category contains exactly three ranked sources with explanations for why each source matters.
Query Flow:

```
AI Agent → MCP Handler → Query Matcher → Registry
                                             ↓
                                     3 Ranked Sources
```
When an agent sends a query like "learn rust programming", the matcher normalizes the text (lowercase, strip punctuation, remove stop words), then runs fuzzy matching against category patterns, slugs, and names using normalized Levenshtein distance. It also applies keyword boosting if query terms appear in category metadata.
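Sketched in Rust, the normalization and fuzzy step looks roughly like this. I'm assuming the strsim crate for the normalized Levenshtein score, and the stop-word list is illustrative rather than the registry's actual list:

```rust
use strsim::normalized_levenshtein;

/// Lowercase the query, strip punctuation, and drop common stop words.
/// The stop-word list here is illustrative, not the registry's real list.
fn normalize(query: &str) -> String {
    const STOP_WORDS: &[&str] = &["how", "do", "i", "a", "the", "to", "up", "for"];
    let cleaned: String = query
        .to_lowercase()
        .chars()
        .map(|c| if c.is_alphanumeric() || c.is_whitespace() { c } else { ' ' })
        .collect();
    cleaned
        .split_whitespace()
        .filter(|word| !STOP_WORDS.contains(word))
        .collect::<Vec<_>>()
        .join(" ")
}

/// Best fuzzy score of a normalized query against a category's
/// patterns, slug, and name (1.0 = identical, 0.0 = nothing shared).
fn fuzzy_score(query: &str, candidates: &[&str]) -> f64 {
    candidates
        .iter()
        .copied()
        .map(|candidate| normalized_levenshtein(query, &normalize(candidate)))
        .fold(0.0, f64::max)
}
```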
The scoring is weighted: 70 percent fuzzy match, 30 percent keyword boost. Only matches above a threshold (default 0.4 out of 1.0) return results. This ensures "learn rust" matches the rust-learning category even if the exact phrasing differs from stored patterns.
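Combining the two signals is a weighted sum with a cutoff. A sketch using the weights and threshold above; the keyword boost shown is a simplified term-overlap measure, not necessarily the one the server uses:

```rust
const FUZZY_WEIGHT: f64 = 0.7;
const KEYWORD_WEIGHT: f64 = 0.3;
const MATCH_THRESHOLD: f64 = 0.4;

/// Fraction of query terms that appear in the category metadata,
/// used as a keyword boost in [0.0, 1.0].
fn keyword_boost(query: &str, metadata: &str) -> f64 {
    let terms: Vec<&str> = query.split_whitespace().collect();
    if terms.is_empty() {
        return 0.0;
    }
    let hits = terms.iter().filter(|t| metadata.contains(*t)).count();
    hits as f64 / terms.len() as f64
}

/// Final score: 70 percent fuzzy match, 30 percent keyword boost.
/// Categories below the threshold are dropped from the results.
fn combined_score(fuzzy: f64, boost: f64) -> Option<f64> {
    let score = FUZZY_WEIGHT * fuzzy + KEYWORD_WEIGHT * boost;
    (score >= MATCH_THRESHOLD).then_some(score)
}
```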
Each source includes a rank (1-3), name, URL, type (documentation/tutorial/tool), and a "why" explanation. The curator's identity is verified via PKARR, a public key addressing system. The server's public key is exposed at /health, and the get_provenance tool returns curator metadata so agents can verify the source of recommendations.
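As serde structs, a registry entry looks roughly like this. Field names are inferred from the description above, so the actual schema may differ:

```rust
use serde::{Deserialize, Serialize};

/// One of the exactly-three curated sources in a category.
#[derive(Debug, Serialize, Deserialize)]
struct Source {
    rank: u8,                 // 1, 2, or 3
    name: String,
    url: String,
    #[serde(rename = "type")]
    source_type: String,      // "documentation", "tutorial", or "tool"
    why: String,              // the curator's explanation
}

/// A curated topic that incoming queries are matched against.
#[derive(Debug, Serialize, Deserialize)]
struct Category {
    slug: String,             // e.g. "rust-learning"
    name: String,
    patterns: Vec<String>,    // phrasings the matcher compares against
    sources: Vec<Source>,     // always exactly three
}
```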
The registry itself is public at /registry, so anyone can inspect the full list of categories and sources. Transparency over obscurity.
What I Learned¶
Building an MCP server in Rust taught me that the protocol is simpler than I expected. It's JSON-RPC 2.0 over HTTP with a specific tool call structure. No complex streaming or state management required. The hardest part was designing the query matching algorithm to balance precision and recall.
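To give a sense of how small that structure is, here is a sketch of the request an MCP client sends for a tool call. The get_sources tool name and its argument are placeholders; only get_provenance is named above:

```rust
use serde_json::json;

fn main() {
    // A JSON-RPC 2.0 tool call as an MCP client would send it.
    // The tool name "get_sources" and its argument are assumed for illustration.
    let request = json!({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "get_sources",
            "arguments": { "query": "learn rust programming" }
        }
    });
    println!("{}", serde_json::to_string_pretty(&request).unwrap());
}
```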
I initially used exact string matching, but users would ask "how do I run a bitcoin full node" when the category pattern was "setting up bitcoin core". Fuzzy matching fixed that, but required careful tuning of weights and thresholds. Too low and everything matches. Too high and nothing does.
The "three sources" constraint was the best design decision. It forces curators to prioritize ruthlessly. You can't just dump a list of 20 links. You have to pick the official docs, the best tutorial, and the essential tool. That curation is the entire value proposition.
PKARR integration was straightforward, but the ecosystem is still early. I'm using it to sign the curator's identity; the broader vision of federated trust registries (where curators endorse each other) is scaffolded but not yet implemented. That's a future phase.
Links¶
Curated sources beat algorithmic noise every time.