Screenpipe at South Park Commons — the talk
2 min read
screenpipe, architecture, local-first, open-source, ai-agents, south-park-commons
We gave a technical talk at South Park Commons on what screenpipe actually is under the hood.
The same questions kept coming up before and after the talk:
- Where does the data live?
- What does the index actually look like?
- How does an agent get from a `pipe.md` on disk to a Slack message?
- What runs on the device and what touches the cloud?
So we wrote the talk down. Open the deck →
It's an interactive deck — 16 slides, keyboard-driven (← → space). You can scrub the timeline, click through chat presets, swap pipes, and inspect the data model and API surface, all in the same B&W brand we ship in the desktop app.
What's in it
- Architecture — three layers: capture, store, act. All three run on device.
- Capture — Apple Vision + Tesseract OCR, native AX tree on macOS / UIA on Windows, Whisper STT, 512-dim speaker embeddings via SQLite's `vec_f32` extension.
- UI — interactive mocks of Rewind (timeline scrubber), Ask (chat with streaming presets), Search (frames + chats + pipe runs), and Pipes (clickable per-pipe live activity).
- Data model — every active SQLite table at `~/.screenpipe/db.sqlite`, post the search consolidation into `frames.full_text` + `frames_fts`.
- API surface — every route on `localhost:3030`, including the WebSocket streams for events, health, meeting status, and metrics.
- Agent runtime — how a `pipe.md` on disk gets resolved, prompted, spawned, observed, and recorded in `pipe_executions`.
- Privacy — local-first by default, attested TEE via tinfoil.sh for PII redaction, MIT license, BYOK for every model call.
- Enterprise — single license key, per-fleet policy, signed builds, MDM-ready packaging.
- Hiring — three open roles: founding engineer, applied AI engineer, enterprise BD.
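The data model slide centers on `frames.full_text` plus the `frames_fts` full-text index. As a rough sketch of what a query against that layout could look like — the column names beyond `full_text`, and the FTS5 external-content wiring, are our assumptions for illustration, and we use an in-memory database rather than the real `~/.screenpipe/db.sqlite`:

```python
import sqlite3

# Hypothetical mirror of the frames.full_text + frames_fts layout named in
# the deck. Everything except those two names is assumed for this sketch.
db = sqlite3.connect(":memory:")  # a real install would open ~/.screenpipe/db.sqlite
db.executescript("""
CREATE TABLE frames (
    id        INTEGER PRIMARY KEY,
    timestamp TEXT,
    full_text TEXT
);
-- FTS5 index over the consolidated OCR text, backed by the frames table
CREATE VIRTUAL TABLE frames_fts USING fts5(
    full_text, content='frames', content_rowid='id'
);
""")

db.execute(
    "INSERT INTO frames (timestamp, full_text) VALUES (?, ?)",
    ("2024-06-01T10:00:00Z", "standup notes: ship the pipe runner"),
)
# Populate the external-content index from the base table
db.execute("INSERT INTO frames_fts (rowid, full_text) SELECT id, full_text FROM frames")

# Full-text search over everything that crossed the screen
rows = db.execute(
    """
    SELECT f.timestamp, snippet(frames_fts, 0, '[', ']', '...', 8)
    FROM frames_fts
    JOIN frames f ON f.id = frames_fts.rowid
    WHERE frames_fts MATCH ?
    """,
    ("pipe",),
).fetchall()
print(rows[0][0])  # -> 2024-06-01T10:00:00Z
```

The point of the consolidation is exactly this shape: one text column, one FTS index, so search never fans out across per-source tables.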
Try it in two commands
Start recording:

```bash
npx screenpipe record
```

Plug it into Claude Code:

```bash
claude mcp add screenpipe -- npx -y screenpipe-mcp
```

Star the repo if any of this resonates: github.com/screenpipe/screenpipe. The fastest signal you can give us is a star — it tells us where to invest.
