
Local AI models connected like Unix pipes. Drop a file, watch it flow.

We'll email you when it's ready. No spam.

The missing orchestration layer

AI models are powerful, but they're stuck in silos. A vision model in one app. An LLM in another. A transcription tool somewhere else. They don't talk to each other.

MachineFabric connects them. Drop a file, watch it flow through extractors, vision models, LLMs—whatever chain makes sense. The models don't know about each other. MachineFabric handles the wiring.

AI models as Unix commands

Unix solved this fifty years ago: small tools that do one thing, connected by pipes. The output of one becomes the input of the next.

MachineFabric treats AI models the same way. Each model is a black-box command. A vision model takes an image, outputs a description. An LLM takes text, outputs text. MachineFabric chains them together.

Input: research-paper.pdf
Extract: text and images from each page
Vision: describe figures and diagrams
LLM: summarize, translate, analyze
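The chain above is just function composition: each stage sees only the previous stage's output. A rough sketch in Python, with stand-in functions (these names are illustrative, not MachineFabric's actual API):

```python
# Illustrative sketch of chaining black-box stages, Unix-pipe style.
# The function names are hypothetical stand-ins for plugins.

def extract(pdf_path):
    # Stand-in for a PDF extractor: one text chunk per page.
    return [f"text of page {n} of {pdf_path}" for n in range(1, 3)]

def describe(pages):
    # Stand-in for a vision plugin describing figures on each page.
    return [f"figures described for: {p}" for p in pages]

def summarize(descriptions):
    # Stand-in for an LLM plugin producing a single summary.
    return "summary of " + "; ".join(descriptions)

# Each stage only sees the previous stage's output, like a Unix pipe.
result = summarize(describe(extract("research-paper.pdf")))
print(result)
```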


Loose coupling, real interoperability

Plugins advertise what they can do using capability URNs. What inputs they accept, what outputs they produce. MachineFabric matches capabilities to find chains automatically.

The protocol is minimal: stdin/stdout with CBOR frames. Any language that can read and write bytes can be a plugin. The community writes plugins without coordinating with us.
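To give a feel for how minimal that is, here is a sketch of frame I/O assuming a simple length-prefixed layout. The actual frame format is defined by the MachineFabric protocol and not shown here; the payload bytes below merely stand in for CBOR-encoded data.

```python
import io
import struct

# Hypothetical framing sketch: a 4-byte big-endian length prefix
# followed by the CBOR payload. The real frame layout belongs to
# the MachineFabric protocol; this is only an illustration.

def write_frame(stream, payload: bytes) -> None:
    stream.write(struct.pack(">I", len(payload)))
    stream.write(payload)

def read_frame(stream) -> bytes:
    (length,) = struct.unpack(">I", stream.read(4))
    return stream.read(length)

# Round-trip a frame through an in-memory stream (a real plugin
# would use stdin/stdout instead).
buf = io.BytesIO()
write_frame(buf, b"\xa1\x62op\x67extract")  # CBOR for {"op": "extract"}
buf.seek(0)
assert read_frame(buf) == b"\xa1\x62op\x67extract"
```

Because the wire format is just bytes over stdin/stdout, any language with a CBOR library can implement the same loop.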

Extract: cap:in="media:application/pdf";op=extract
Describe: cap:in="media:image/png";op=describe
Translate: cap:in="media:text/plain";op=translate;lang=fr
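Matching on URNs like these can be sketched as simple field comparison. The exact capability grammar is MachineFabric's; this toy parser only handles the key=value;key=value shape visible in the examples above:

```python
# Toy parser for capability URNs of the shape shown above.
# The real grammar may be richer; this handles only key=value pairs.

def parse_cap(urn: str) -> dict:
    assert urn.startswith("cap:")
    fields = {}
    for part in urn[len("cap:"):].split(";"):
        key, _, value = part.partition("=")
        fields[key] = value.strip('"')
    return fields

def can_chain(producer_out: str, consumer_cap: str) -> bool:
    # A producer's output media type must match the consumer's "in".
    return parse_cap(consumer_cap)["in"] == producer_out

extract_cap = 'cap:in="media:application/pdf";op=extract'
describe_cap = 'cap:in="media:image/png";op=describe'

print(parse_cap(extract_cap)["op"])             # extract
print(can_chain("media:image/png", describe_cap))  # True
```

Finding a full chain is then a search over these pairwise matches, which is how loose coupling stays automatic.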

Local AI, no API keys

MachineFabric runs AI models on your Mac using MLX and Metal. Vision models, LLMs, transcription—all local.

No API keys. No usage limits. No sending your files to a server. Download a model once, run it forever. Works offline.

Vision: image and document understanding
LLM: analysis, summarization, translation
MLX: Apple Silicon-optimized inference

Community plugins

Plugins are standalone binaries. Any language that can read stdin and write stdout works. We have reference implementations in Rust, Swift, Go, and Python.

Someone wraps a new open-source model? Package it as a plugin. A better PDF extractor drops? Wrap it. The ecosystem grows without bottlenecks.

Bundled: pdfcartridge, txtcartridge, mlxcartridge, ggufcartridge
Coming: audio transcription, video processing
Your model here: write it

Native macOS, not Electron

MachineFabric is a native Mac app. SwiftUI, AppKit, Metal for GPU acceleration. It feels like part of the system because it is.

Drag files from Finder. Results appear in a native window. No web views, no embedded browsers, no "this page is unresponsive" dialogs.

Your files stay on your Mac

No cloud uploads. No "processing on our servers." AI models run locally. Everything is stored locally. Unplug your internet—MachineFabric keeps working.

Local AI · Local storage · Works offline

What's shipping

We're launching with plugins for PDFs, EPUBs, images, and text files. Vision models for image understanding. LLMs for text processing.

Working now: PDF extraction, image description, MLX inference, the plugin protocol, native macOS UI.

Coming: More models, more file types, audio transcription, video processing—whatever the community builds.

Try it

If you want to try MachineFabric, put your email in. We'll send you a download link when it's ready.