LOCAL · Issue No. 01 APRIL 2026

Local.

Preview — Forthcoming

Inference or Die

A tract on running it yourself

“No centralization. No data exfiltration. Cheap. The Linux moment for AI looks like this.”

Abstract

Three tiers of inference: hosted APIs, self-hosted clusters, and the GPU in your laptop. This issue argues that the third tier has been underrated — and that a consumer-grade Mac Mini plus an open-weights model is already enough to unmake a chunk of the SaaS stack.
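The "already enough" claim can be checked on the back of an envelope. A minimal sketch follows; the figures (a 7B-parameter model, 4-bit quantization, 16 GB of unified memory) are illustrative assumptions, not benchmarks of any particular machine or model.

```python
# Back-of-envelope check: do open-weights fit in consumer memory?
# Figures here are rough assumptions, not measured benchmarks.

def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight footprint in gigabytes (1 GB = 1e9 bytes)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 7B-parameter model quantized to 4 bits per weight:
print(weights_gb(7, 4))   # 3.5 GB — comfortable on a 16 GB Mac Mini

# The same model at 16-bit precision:
print(weights_gb(7, 16))  # 14.0 GB — tight once KV cache and the OS are counted
```

The gap between the two numbers is the whole argument for quantized local inference: the weights shrink fourfold while the hardware stays the same.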

Preface

A Tract

Why this issue is angrier than the others. The argument for running inference you own.

LOCAL · № 01
A laptop held aloft like a raised fist with red and violet lightning bolts radiating, severed cables on the ground.
Preface · Inference you own, held up in the room.

LOCAL is a tract, not a survey. The tone is deliberate.

Most of the magazines in this family take the patient view: here are the forces, here are the proposals, here is where the stack is heading. LOCAL is the one that picks a side. Running inference you own is not a curiosity or a hobbyist’s position — it is the architecture that makes every other good thing about this wave cheaper, safer, and more durable.

The unusual typography is the point. When the rest of the publication is in cream paper and serif body copy, LOCAL is the one with the black cover and the stenciled title, and you should take that as a statement.

Forthcoming.

§