Local Development

Purpose

This document defines the supported local development and testing workflow for the standalone server/ repo.

Supported local modes

Use one of these modes locally:

  1. native uv workflow (primary development mode): fastest for editing, tests, and manual API work

  2. local Docker workflow (smoke test of the production container image): useful before Cloud Run deployment

Use the native uv workflow for day-to-day development. Use Docker to validate packaging and runtime parity.

Prerequisites

Required:

  • Python 3.12

  • uv

  • PostgreSQL reachable from the local machine

Required for full media/final flows:

  • ffprobe / ffmpeg

  • OPENAI_API_KEY for live extraction and summary generation
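A quick way to confirm the media toolchain is visible before starting the server is a small PATH check. This is an illustrative sketch; `ffprobe_available` is a hypothetical helper, not part of the repo:

```python
import shutil
import subprocess

def ffprobe_available(binary: str = "ffprobe") -> bool:
    """Return True if the given ffprobe binary is on PATH and runs."""
    path = shutil.which(binary)
    if path is None:
        return False
    try:
        # Ask the binary for its version; failure means it is unusable.
        subprocess.run([path, "-version"], capture_output=True, check=True)
        return True
    except (OSError, subprocess.CalledProcessError):
        return False
```

The same check applies to an explicit TRAQ_FFPROBE_BIN override: pass that path as `binary`.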

Repo bootstrap

From the repo root:

cd /path/to/server
uv sync

This creates the local virtual environment and installs the pinned toolchain.

Environment file

Local development may use server/.env.

Minimum required values:

  • TRAQ_DATABASE_URL

  • TRAQ_API_KEY

Common local values:

  • TRAQ_STORAGE_ROOT=./local_data

  • TRAQ_ENABLE_DISCOVERY=true

  • TRAQ_AUTO_CREATE_SCHEMA=true

  • TRAQ_ENABLE_FILE_LOGGING=true

  • TRAQ_PLANTNET_BASE_URL=https://my-api.plantnet.org

  • TRAQ_PLANTNET_PROJECT=all

Optional local secret:

  • TRAQ_PLANTNET_API_KEY for real tree-identification requests

The server loads .env automatically unless variables are already set in the shell.
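Putting the values above together, a minimal server/.env for local work might look like this (all values are placeholders; the database name is an example):

```ini
TRAQ_DATABASE_URL=postgresql+psycopg://traq_app:<password>@127.0.0.1:5432/traq_dev
TRAQ_API_KEY=demo-key
TRAQ_STORAGE_ROOT=./local_data
TRAQ_ENABLE_DISCOVERY=true
TRAQ_AUTO_CREATE_SCHEMA=true
TRAQ_ENABLE_FILE_LOGGING=true
TRAQ_PLANTNET_BASE_URL=https://my-api.plantnet.org
TRAQ_PLANTNET_PROJECT=all
# Optional, only needed for real tree-identification requests:
# TRAQ_PLANTNET_API_KEY=<key>
```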

Local PostgreSQL workflow

Recommended local pattern:

  • use a dedicated local database for active development

  • keep a separate working database if you want to preserve current data

Development modes:

  1. convenience bootstrap

    • leave TRAQ_AUTO_CREATE_SCHEMA=true

    • useful for ad hoc local work and quick experiments

  2. migration-driven local work

    • set TRAQ_AUTO_CREATE_SCHEMA=false

    • run:

      uv run alembic upgrade head
      
    • use this when validating deployment-like behavior

If you are changing schema, use Alembic even locally.
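For reference, boolean flags such as TRAQ_AUTO_CREATE_SCHEMA are conventionally parsed along these lines. This is a sketch of common env-flag handling, not necessarily how the server parses them:

```python
import os

# Common truthy spellings for environment flags.
TRUTHY = {"1", "true", "yes", "on"}

def env_flag(name: str, default: bool = False) -> bool:
    """Interpret an environment variable as a boolean flag."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in TRUTHY
```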

Running the server

Normal local run:

uv run traq-server --reload --port 8000

If ffprobe is not on the default PATH, set TRAQ_FFPROBE_BIN in the shell before starting the server.

Running the admin CLI

Interactive mode:

uv run traq-admin

One-shot examples:

uv run traq-admin device list
uv run traq-admin job list-assignments
uv run traq-admin customer list
uv run traq-admin artifact fetch --job J0002 --kind report-pdf

Artifact retrieval workflow

Use the admin CLI when an operator needs a customer-facing artifact by job number.

Examples:

uv run traq-admin artifact fetch --job J0002 --kind report-pdf
uv run traq-admin artifact fetch --job J0002 --kind traq-pdf
uv run traq-admin artifact fetch --job J0002 --kind transcript
uv run traq-admin artifact fetch --job J0002 --kind final-json

Artifacts are exported into a canonical local folder:

  • ./exports/J0002/J0002_report_letter.pdf

  • ./exports/J0002/J0002_traq_page1.pdf

  • ./exports/J0002/J0002_transcript.txt

  • ./exports/J0002/J0002_final.json

If a correction artifact exists, the exported filename includes the correction variant.

The final-json export is DB-backed. It works for jobs whose archived final or correction payload has been persisted to job_finals. Older file-only finals may still support PDF retrieval but not final-json export.
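The canonical export layout above amounts to a simple kind-to-filename mapping. The sketch below is illustrative only; `export_path` is a hypothetical helper, not the CLI's actual implementation:

```python
from pathlib import Path

# Hypothetical mapping from artifact kind to exported filename suffix,
# mirroring the canonical export layout shown above.
_KIND_SUFFIX = {
    "report-pdf": "report_letter.pdf",
    "traq-pdf": "traq_page1.pdf",
    "transcript": "transcript.txt",
    "final-json": "final.json",
}

def export_path(job: str, kind: str, root: str = "./exports") -> Path:
    """Return ./exports/<job>/<job>_<suffix> for a known artifact kind."""
    return Path(root) / job / f"{job}_{_KIND_SUFFIX[kind]}"
```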

Automated tests

The canonical local validation path now lives in docs/testing_and_validation.rst.

Required pre-push regression command:

UV_CACHE_DIR=/tmp/uv-cache uv run python scripts/release_verify.py pre-deploy

Use narrower test targets while iterating on one area, then run the canonical pre-deploy gate before pushing deployment-facing changes.

Release verification helper

Use the shared release verification helper when you want the same pre-deploy gate that CI runs:

UV_CACHE_DIR=/tmp/uv-cache uv run python scripts/release_verify.py pre-deploy

This runs the current deployment-facing regression set with CI-safe local env defaults.

PostgreSQL parity lane

GitHub Actions now has a dedicated PostgreSQL-backed integration lane. It:

  1. boots a temporary PostgreSQL instance in CI


  2. runs uv run alembic upgrade head

  3. executes tests.test_postgres_ci_smoke

That lane exists to catch migration and database-engine regressions that would not show up in the SQLite-backed fast regression path.

Manual smoke testing

Recommended order:

  1. start the server

  2. verify CLI access

  3. fetch assigned jobs on device

  4. edit and submit one review

  5. test one audio/transcript path

  6. test one image path

  7. submit final

  8. verify the finalized job is unassigned and does not reappear as active work

Local Docker smoke test

Build:

docker build -t traq-server:local .

Run against local PostgreSQL on Linux:

docker run --rm --network host \
  -e TRAQ_DATABASE_URL='postgresql+psycopg://traq_app:<password>@127.0.0.1:5432/traq_demo' \
  -e TRAQ_API_KEY='demo-key' \
  -e OPENAI_API_KEY='<key>' \
  -e TRAQ_ARTIFACT_BACKEND='local' \
  -e TRAQ_STORAGE_ROOT='/tmp/traq-local-data' \
  traq-server:local

If the local .env already contains the required values, use the helper script instead:

./scripts/run_local_container.sh

Then verify:

curl -H 'X-API-Key: demo-key' http://127.0.0.1:8000/health
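The same check can be scripted. The sketch below only builds the authenticated request; the /health route and X-API-Key header come from the curl example above:

```python
import urllib.request

def health_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build the authenticated request for the /health smoke check."""
    return urllib.request.Request(
        f"{base_url}/health",
        headers={"X-API-Key": api_key},
    )

# Usage (requires a running server):
#   with urllib.request.urlopen(
#       health_request("http://127.0.0.1:8000", "demo-key")
#   ) as resp:
#       print(resp.status)
```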

Local data handling

TRAQ_STORAGE_ROOT is local-only and git-ignored.

It contains:

  • artifact bytes

  • generated outputs

  • debug/export compatibility files

  • local rotating log files when file logging is enabled

It is disposable if the database remains intact, but deleting it will remove local artifacts and logs.
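Since the storage root is disposable, it can be helpful to see how much space it is using before clearing it. This is a hypothetical helper, not part of the repo:

```python
from pathlib import Path

def storage_size_bytes(root: str = "./local_data") -> int:
    """Total bytes of all files under the local storage root (0 if absent)."""
    base = Path(root)
    if not base.exists():
        return 0
    # Walk every file under the root and sum file sizes.
    return sum(p.stat().st_size for p in base.rglob("*") if p.is_file())
```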

Practical guidance

  • use native uv mode for normal coding

  • use Docker only to validate the deployment package

  • use Alembic for schema changes

  • do not treat local artifact storage as authoritative runtime state