
Building a Full App: End-to-End Example

This guide walks through building a bookstore API from scratch. By the end, you’ll have:

  • Cookie-based authentication with signup
  • Organization-scoped multi-tenancy
  • Two related resources: authors and books (with a foreign key)
  • Full CRUD endpoints with generated tests
  • Tenancy isolation tests proving data can’t leak between organizations
  • An OpenAPI spec, docs UI, and TypeScript client

This mirrors the patterns used in ShipQ’s own end-to-end test suite.

Make sure you have:

  • Go 1.25+ installed
  • ShipQ CLI built from source (Installation)
  • A database engine available (Postgres, MySQL, or SQLite)

For this walkthrough, we’ll assume Postgres. If you don’t have Postgres, ShipQ will fall back to SQLite automatically.

mkdir bookstore && cd bookstore
shipq init

This creates:

  • go.mod with your module name
  • shipq.ini with [db] and [typescript] sections
  • .gitignore configured to exclude .shipq/
# For Postgres:
export DATABASE_URL="postgres://localhost:5432/bookstore_dev?sslmode=disable"
shipq db setup
# Or just let ShipQ auto-detect (falls back to SQLite if no DB server found):
# shipq db setup

Verify that shipq.ini now has a database_url:

[db]
database_url = postgres://localhost:5432/bookstore_dev?sslmode=disable
[typescript]
framework = react
http_output = .
shipq auth
go mod tidy

This generates:

  • Migrations for organizations, accounts, and sessions
  • Login/logout/me handlers in api/auth/
  • Auth middleware that protects routes by default
  • Tests in api/auth/spec/

Now add the signup endpoint:

shipq signup
go mod tidy

Verify the auth system works:

go test ./api/auth/spec/... -v -count=1

You should see all auth tests passing — login, logout, session management, and signup.

Open shipq.ini and add the scope setting under [db]:

[db]
database_url = postgres://localhost:5432/bookstore_dev?sslmode=disable
scope = organization_id
[auth]
protect_by_default = true
[typescript]
framework = react
http_output = .

Setting scope = organization_id tells ShipQ to:

  • Auto-inject organization_id:references:organizations into every new migration
  • Generate queries that filter by organization_id
  • Generate handlers that extract organization_id from the authenticated user’s context
  • Generate tenancy isolation tests
shipq migrate new authors name:string bio:text

Because scope = organization_id is set, ShipQ automatically injects organization_id:references:organizations into the migration. The effective column list is:

name:string bio:text organization_id:references:organizations

You don’t have to type the scope column — it’s injected by the compiler.

shipq migrate new books title:string isbn:string published:bool author_id:references:authors

Again, organization_id is auto-injected. This migration creates a books table with:

  • title (string)
  • isbn (string)
  • published (bool)
  • author_id (foreign key → authors)
  • organization_id (foreign key → organizations, auto-injected)
  • Plus the standard id, public_id, created_at, updated_at, deleted_at columns
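If you want to picture the effective schema, the books migration is roughly equivalent to DDL like the following (an illustrative Postgres sketch, not ShipQ's exact generated output — column types and defaults are assumptions):

```sql
CREATE TABLE books (
    id              BIGSERIAL PRIMARY KEY,
    public_id       TEXT NOT NULL UNIQUE,
    title           TEXT NOT NULL,
    isbn            TEXT NOT NULL,
    published       BOOLEAN NOT NULL,
    author_id       BIGINT NOT NULL REFERENCES authors (id),
    organization_id BIGINT NOT NULL REFERENCES organizations (id),
    created_at      TIMESTAMPTZ NOT NULL DEFAULT now(),
    updated_at      TIMESTAMPTZ NOT NULL DEFAULT now(),
    deleted_at      TIMESTAMPTZ
);
```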
shipq migrate up

This runs the schema compiler, which:

  1. Discovers all migration files (auth tables + authors + books)
  2. Builds a canonical MigrationPlan
  3. Writes shipq/db/migrate/schema.json
  4. Generates typed schema bindings in shipq/db/schema/schema.go
  5. Applies the schema to both dev and test databases

After this step, you have typed table and column references available:

  • schema.Authors.Name(), schema.Authors.Bio(), etc.
  • schema.Books.Title(), schema.Books.AuthorId(), etc.

Generate all CRUD operations for both resources:

shipq resource authors all
shipq resource books all
go mod tidy

For each resource, this generates:

File                          Method   Route
api/authors/create.go         POST     /authors
api/authors/get_one.go        GET      /authors/:id
api/authors/list.go           GET      /authors
api/authors/update.go         PATCH    /authors/:id
api/authors/soft_delete.go    DELETE   /authors/:id
api/books/create.go           POST     /books
api/books/get_one.go          GET      /books/:id
api/books/list.go             GET      /books
api/books/update.go           PATCH    /books/:id
api/books/soft_delete.go      DELETE   /books/:id

Plus register.go, query definitions, and tests for each.

Since protect_by_default = true (from shipq auth), all routes require authentication. The generated tests include both authenticated CRUD tests and 401 rejection tests.

Since scope = organization_id is set, all routes are tenant-scoped. The generated tests include tenancy isolation tests.

go test ./... -v -count=1

This runs every generated test, including:

  • Auth tests: login, logout, signup, session management
  • Authors CRUD tests: create, read, update, delete with valid auth
  • Books CRUD tests: create, read, update, delete with valid auth
  • 401 tests: verifying unauthenticated requests are rejected
  • Tenancy isolation tests: verifying Organization A’s data is invisible to Organization B

The tenancy tests follow this pattern:

  1. Create User A in Organization A
  2. Create User B in Organization B
  3. User A creates a resource (e.g., an author)
  4. User B tries to read that resource → gets 404 (not 200)
  5. User B lists resources → gets an empty list (not User A’s data)

If all tests pass, your data isolation is correct by construction.

go run ./cmd/server
# or:
shipq start server

Your API is now running! In dev mode, visit:

  • GET /docs — Interactive API documentation with all 10+ endpoints
  • GET /openapi — Raw OpenAPI 3.1 JSON spec
curl -c cookies.txt -X POST http://localhost:8080/auth/signup \
-H "Content-Type: application/json" \
-d '{"email": "[email protected]", "password": "securepassword123"}'

The response sets a signed session cookie. The -c cookies.txt flag saves it for subsequent requests.

curl -c cookies.txt -X POST http://localhost:8080/auth/login \
-H "Content-Type: application/json" \
-d '{"email": "[email protected]", "password": "securepassword123"}'
curl -b cookies.txt -X POST http://localhost:8080/authors \
-H "Content-Type: application/json" \
-d '{"name": "J.R.R. Tolkien", "bio": "Author of The Lord of the Rings"}'
curl -b cookies.txt -X POST http://localhost:8080/books \
-H "Content-Type: application/json" \
-d '{"title": "The Hobbit", "isbn": "978-0547928227", "published": true, "author_id": "<author-public-id>"}'
curl -b cookies.txt http://localhost:8080/books

Let’s take stock of everything ShipQ generated from just a handful of commands:

shipq init
shipq db setup
shipq auth
shipq signup
shipq migrate new authors name:string bio:text
shipq migrate new books title:string isbn:string published:bool author_id:references:authors
shipq migrate up
shipq resource authors all
shipq resource books all
shipq handler compile
Together, those commands produced:

  • Database schema: 5 tables (organizations, accounts, sessions, authors, books) with proper foreign keys, indexes, and soft-delete support
  • Cookie-based authentication: signup, login, logout, session management
  • 10 CRUD endpoints: full create/read/update/delete/list for both authors and books
  • Multi-tenancy: every query scoped to organization_id, enforced at the SQL level
  • Typed query runners: compile-time type-safe database access for all operations
  • Comprehensive tests: auth tests, CRUD tests, 401 tests, tenancy isolation tests
  • OpenAPI 3.1 spec: auto-generated from handler metadata
  • API docs UI: interactive documentation at /docs
  • Admin UI: for manual testing and exploration
  • TypeScript HTTP client: fully typed, ready for your frontend
  • React hooks: data-fetching hooks for every endpoint
  • Self-contained Go project: no runtime dependency on ShipQ

From here, you can:

  • Add file uploads: shipq files → S3-compatible managed file storage
  • Add background jobs: shipq workers → Redis job queue + Centrifugo WebSocket channels
  • Add OAuth: shipq auth google or shipq auth github for social login
  • Add email verification: shipq email for email verification and password reset (requires workers)
  • Write custom queries: add query definitions in querydefs/ using the PortSQL DSL
  • Write custom handlers: add handler packages in api/ and register them with the handler registry
  • Deploy: shipq docker to generate production Dockerfiles

As you develop, here’s which command to run when things change:

What changed                     Command
New or edited migration          shipq migrate up
New or edited query definition   shipq db compile
New or edited handler            shipq handler compile
New table + full CRUD            shipq migrate new <table> ..., then shipq resource <table> all
Channel definitions changed      shipq workers compile
Everything (nuclear option)      shipq migrate reset → regenerate resources → shipq handler compile