What it does:
- Semantic search over local folders and notes
- Works across multiple synced directories
- RAG-style answers with citations from your own files
How it works:
- Calls `POST /search/query` with `local_folders`
- Uses `search_mode: sources` to return answers + file references
Example:
vault ask "What are my notes about project planning?"
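Under the hood, a query like the one above translates into roughly this request. A minimal sketch: the endpoint path, `local_folders`, and `search_mode: sources` come from the description above; the base URL, folder paths, and response field names are assumptions.

```python
# Hypothetical sketch of the request behind `vault ask`; only the path,
# `local_folders`, and `search_mode` are documented above, the rest is assumed.
import requests

payload = {
    "query": "What are my notes about project planning?",
    "local_folders": ["~/notes", "~/projects"],  # assumed example paths
    "search_mode": "sources",                    # answers + file references
}

resp = requests.post("http://localhost:8080/search/query", json=payload)
resp.raise_for_status()
result = resp.json()

print(result.get("answer"))
for src in result.get("sources", []):  # field name assumed
    print("-", src)
```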
I've been working on a programming language called G. It is designed to be memory-safe and extremely fast, with a focus on a tiny footprint.
The entire interpreter is written in D and weighs in at only 2.4MB. I built it because I wanted a modern scripting language that feels lightweight but has the safety of a high-level language.
Key Features:
- Small: The binary is ~2.4MB.
- Fast: Optimized for x86_64.
- Safe: Memory-safe execution.
- Std Lib: Includes std.echo, std.newline, etc.
- GitHub: https://github.com/pouyathe/glang
I would love to get some feedback on the syntax or the architecture from the community!
I built Axiomeer, an open-source marketplace protocol for AI agents. The idea: instead of hardcoding tool integrations into every agent, agents shop a catalog at runtime, and the marketplace ranks, executes, validates, and audits everything.
How it works:
- Providers publish products (APIs, datasets, model endpoints) via 10-line JSON manifests
- Agents describe what they need in natural language or structured tags
- The router scores all options by capability match (70%), latency (20%), and cost (10%), with hard constraint filters (sketched below)
- The top pick is executed, its output is validated (citations required? timestamps?), and evidence quality is assessed deterministically
- If the evidence is mock, fake, or low-quality, the agent abstains rather than hallucinating
- Every execution is logged as an immutable receipt
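The routing step is easiest to see as code. This is a sketch of the ranking logic as described above, not the project's actual implementation; the class, field names, and score normalization are assumptions.

```python
# Sketch of the scoring pass: hard constraints filter first, then a
# 70/20/10 weighted score over capability, latency, and cost picks the winner.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    capability_match: float  # 0..1, how well the product matches the request
    latency_score: float     # 0..1, higher = faster
    cost_score: float        # 0..1, higher = cheaper
    meets_constraints: bool  # e.g. region, max price, required auth

def rank(candidates: list[Candidate]) -> list[Candidate]:
    eligible = [c for c in candidates if c.meets_constraints]  # hard filter
    return sorted(
        eligible,
        key=lambda c: 0.7 * c.capability_match + 0.2 * c.latency_score + 0.1 * c.cost_score,
        reverse=True,
    )

providers = [
    Candidate("open-meteo", 0.9, 0.8, 1.0, True),
    Candidate("mock-weather", 0.9, 1.0, 1.0, False),  # fails a hard constraint
]
print(rank(providers)[0].name)  # -> open-meteo
```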
The trust layer is the part I think is missing from existing approaches. MCP standardizes how you connect to a tool server. Axiomeer operates one layer up: which tool, from which provider, and can you trust what came back?
Stack: Python, FastAPI, SQLAlchemy, Ollama (local LLM, no API keys). v1 ships with weather providers (Open-Meteo + mocks). The architecture supports any HTTP endpoint that returns structured JSON.
Looking for contributors to add real providers across domains (finance, search, docs, code execution). Each provider is ~30 lines + a manifest.
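To give a feel for the size, here's a hypothetical provider in roughly that shape: a manifest plus a thin wrapper around a real HTTP API. The Open-Meteo endpoint is real; every Axiomeer-specific name (the manifest fields, the `execute` signature) is an assumption, not the project's actual interface.

```python
# Hypothetical provider sketch: manifest + ~30-line wrapper that returns
# structured JSON. Manifest fields and execute() signature are assumptions.
import requests

MANIFEST = {
    "id": "open-meteo.current-weather",
    "kind": "api",
    "description": "Current weather for a latitude/longitude pair",
    "tags": ["weather", "forecast", "free"],
    "cost_per_call_usd": 0.0,
    "output_schema": {"temperature_c": "number", "windspeed_kmh": "number"},
}

def execute(params: dict) -> dict:
    """Call the upstream API and return structured JSON for validation."""
    r = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": params["latitude"],
            "longitude": params["longitude"],
            "current_weather": "true",
        },
        timeout=10,
    )
    r.raise_for_status()
    current = r.json()["current_weather"]
    return {
        "temperature_c": current["temperature"],
        "windspeed_kmh": current["windspeed"],
        "source": "open-meteo",
        "timestamp": current["time"],
    }
```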