# Getting Started
## Prerequisites

- Docker + Docker Compose
- Bun
- Go 1.25+ (agents, MCP)
- CompileDaemon (`go install github.com/githubnemo/CompileDaemon@latest`)
- Flutter SDK
- Tilt
`tilt up` launches the Flutter demo client too, so the Flutter SDK is still part of the default local stack. If you are evaluating the platform, focus on the dashboard and server-side services first.
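Before cloning, it can help to confirm the tools above are on your PATH. A minimal sketch (the tool list mirrors the prerequisites; Docker Compose usually ships as the `docker compose` plugin rather than a separate binary, so it is not checked here):

```shell
# Report which prerequisite tools are installed; "MISSING" lines need attention.
for tool in docker bun go flutter tilt; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```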
## 1. Clone and configure

```shell
git clone https://github.com/arcnem-ai/arcnem-vision.git
cd arcnem-vision
```

Copy every `.env.example` to `.env`:

```shell
cp server/packages/api/.env.example server/packages/api/.env
cp server/packages/db/.env.example server/packages/db/.env
cp server/packages/dashboard/.env.example server/packages/dashboard/.env
cp models/agents/.env.example models/agents/.env
cp models/mcp/.env.example models/mcp/.env
cp client/.env.example client/.env
```

Add your provider keys:
- OpenAI API key → `OPENAI_API_KEY` in `models/agents/.env`
- Same OpenAI key (recommended) → `OPENAI_API_KEY` in `server/packages/api/.env` for dashboard collection chat and AI workflow draft generation
- Replicate API token → `REPLICATE_API_TOKEN` in `models/mcp/.env`
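For example, after editing, `models/agents/.env` would contain a line like the following (placeholder shown, not a real key):

```
# models/agents/.env
OPENAI_API_KEY=<your-openai-api-key>
```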
Everything else is already configured for local development. Postgres, Redis, and MinIO come from `docker-compose.yaml`.
## 2. Start everything

```shell
tilt up
```

Tilt installs dependencies, starts infrastructure, runs migrations, and launches the API, dashboard, agents, MCP server, Inngest, docs site, and the Flutter demo client. Open the Tilt UI at http://localhost:10350 for logs and manual resources like `seed` and `introspection`.
## 3. Seed the database

In the Tilt UI, trigger `seed-database`.
The seed creates:
- a demo organization, project, workflow keys, service keys, and API keys
- editable workflows and reusable workflow templates
- sample images for the description, OCR, quality-review, and segmentation paths
- stored OCR results, descriptions, embeddings, segmentations, and example run history
- a local debug dashboard session
Because `server/packages/api/.env.example` enables `API_DEBUG=true`, the dashboard can bootstrap into the seeded local session after seeding.
## 4. Walk the core product

- Open the dashboard at http://localhost:3001.
- In Projects & API Keys, inspect seeded workflow keys, service keys, and their attached workflows.
- In Workflow Library, browse templates, click Generate With AI, or open a graph in the canvas.
- In Docs, inspect seeded documents or upload a new one from the dashboard.
- In Runs, open a run and inspect its initial state, per-step deltas, final state, timing, and errors.
## 5. Exercise the two ingestion paths

### Workflow / API-key path

Use a workflow API key to run the automated flow:

```shell
curl -X POST http://localhost:3000/api/uploads/presign \
  -H "Content-Type: application/json" \
  -H "x-api-key: ${API_KEY}" \
  -d '{"contentType":"image/png","size":12345}'
```

Then upload to the returned S3 URL and call `/api/uploads/ack`. That acknowledgement verifies the object, creates the document, and queues `document/process.upload` for the workflow key’s bound workflow.
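The full presign → upload → ack sequence can be sketched end to end. This is a hedged sketch, not the definitive contract: the field names in the presign response and ack body (`url`, `key`) are assumptions, `sample.png` is a hypothetical local file, and `jq` is assumed to be installed — check the API reference for the real shapes.

```shell
# Hypothetical end-to-end sketch of the workflow/API-key ingestion path.
# Field names ("url", "key") and sample.png are assumptions for illustration.
API=http://localhost:3000
if curl -s -o /dev/null --max-time 2 "$API/health"; then
  presign=$(curl -s -X POST "$API/api/uploads/presign" \
    -H "Content-Type: application/json" \
    -H "x-api-key: ${API_KEY}" \
    -d '{"contentType":"image/png","size":12345}')
  url=$(echo "$presign" | jq -r '.url')
  key=$(echo "$presign" | jq -r '.key')
  # Upload the object to the presigned URL, then acknowledge it.
  curl -s -X PUT "$url" -H "Content-Type: image/png" --data-binary @sample.png
  curl -s -X POST "$API/api/uploads/ack" \
    -H "Content-Type: application/json" \
    -H "x-api-key: ${API_KEY}" \
    -d "{\"key\":\"$key\"}"
else
  echo "API not reachable at $API; run 'tilt up' first"
fi
```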
### Dashboard path

In the Docs tab:
- Click Add From Dashboard.
- Upload an image into a project.
- Open the saved document.
- Queue any saved workflow against it.
This path is useful for ad-hoc analysis, reruns, and operator-driven evaluation because the document is not tied to a workflow key by default.
## Health checks

```
GET http://localhost:3000/health   # API
GET http://localhost:3020/health   # Agents
GET http://localhost:3021/health   # MCP
```
## S3 config details

Default local dev uses MinIO from `docker-compose.yaml`. The `.env.example` files ship with working defaults:
```
S3_ACCESS_KEY_ID=minioadmin
S3_SECRET_ACCESS_KEY=minioadmin
S3_BUCKET=arcnem-vision
S3_ENDPOINT=http://localhost:9000
S3_REGION=us-east-1
S3_USE_PATH_STYLE=true
```
For hosted storage, substitute your AWS S3, Cloudflare R2, Railway Object Storage, or Backblaze B2 credentials.
- Set `S3_USE_PATH_STYLE` explicitly for your provider.
- Cloudflare R2 commonly needs `S3_REGION=auto` and `S3_USE_PATH_STYLE=false`.
- When the dashboard uploads directly from the browser to storage, some providers such as R2 also need bucket CORS configured to allow your dashboard origin and `PUT` requests.
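For the CORS case, an illustrative S3-style rule follows. This is a sketch under assumptions: the exact schema and the way you apply it vary by provider (dashboard, API, or CLI), and the dashboard origin shown is the local default from this guide.

```json
[
  {
    "AllowedOrigins": ["http://localhost:3001"],
    "AllowedMethods": ["PUT"],
    "AllowedHeaders": ["Content-Type"]
  }
]
```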