Local Development¶
Purpose¶
This document defines the supported local development and testing workflow for
the standalone server/ repo.
Supported local modes¶
Use one of these modes locally:
native uv workflow - primary development mode - fastest for editing, tests, and manual API work
local Docker workflow - smoke test for the production container image - useful before Cloud Run deployment
Use the native uv workflow for day-to-day development. Use Docker to
validate packaging and runtime parity.
Prerequisites¶
Required:
Python 3.12
uv
PostgreSQL reachable from the local machine
Required for full media/final flows:
ffprobe/ffmpeg
OPENAI_API_KEY for live extraction and summary generation
Repo bootstrap¶
From the repo root:
cd /path/to/server
uv sync
This creates the local virtual environment and installs the pinned toolchain.
Environment file¶
Local development may use server/.env.
Minimum required values:
TRAQ_DATABASE_URL
TRAQ_API_KEY
Common local values:
TRAQ_STORAGE_ROOT=./local_data
TRAQ_ENABLE_DISCOVERY=true
TRAQ_AUTO_CREATE_SCHEMA=true
TRAQ_ENABLE_FILE_LOGGING=true
TRAQ_PLANTNET_BASE_URL=https://my-api.plantnet.org
TRAQ_PLANTNET_PROJECT=all
Optional local secret:
TRAQ_PLANTNET_API_KEY for real tree-identification requests
The server loads .env automatically unless variables are already set in the
shell.
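A minimal sketch of that precedence rule, for illustration only (this is not the server's actual loader; `load_env_file` is a hypothetical name): a value from `.env` is applied only when the variable is not already set in the shell.

```shell
# Illustrative loader (not the server's implementation): a .env value is
# exported only when the variable is not already set in the shell.
load_env_file() {
  file="$1"
  while IFS='=' read -r key value; do
    # skip blank lines and comments
    case "$key" in ''|'#'*) continue ;; esac
    # export only if the variable is currently unset in the shell
    if eval "[ -z \"\${${key}+x}\" ]"; then
      export "$key=$value"
    fi
  done < "$file"
}

printf 'TRAQ_API_KEY=from-file\nTRAQ_STORAGE_ROOT=./local_data\n' > /tmp/demo.env
unset TRAQ_STORAGE_ROOT
export TRAQ_API_KEY=from-shell   # already set: .env must not override it
load_env_file /tmp/demo.env
echo "$TRAQ_API_KEY"      # prints: from-shell
echo "$TRAQ_STORAGE_ROOT" # prints: ./local_data
```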
Local PostgreSQL workflow¶
Recommended local pattern:
use a dedicated local database for active development
keep a separate working database if you want to preserve current data
Development modes:
convenience bootstrap
leave TRAQ_AUTO_CREATE_SCHEMA=true
useful for ad hoc local work and quick experiments
migration-driven local work
set TRAQ_AUTO_CREATE_SCHEMA=false
run:
uv run alembic upgrade head
use this when validating deployment-like behavior
If you are changing schema, use Alembic even locally.
Running the server¶
Normal local run:
uv run traq-server --reload --port 8000
If ffprobe is not on the default PATH, set TRAQ_FFPROBE_BIN in the
shell before starting the server.
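A small sketch of that fallback; `resolve_ffprobe` and the commented paths are illustrative, not part of the project:

```shell
# Illustrative helper: prefer ffprobe from PATH, otherwise fall back to an
# explicit TRAQ_FFPROBE_BIN value set in the shell.
resolve_ffprobe() {
  if command -v ffprobe >/dev/null 2>&1; then
    command -v ffprobe
  else
    # empty output means neither PATH nor TRAQ_FFPROBE_BIN provides ffprobe
    printf '%s\n' "${TRAQ_FFPROBE_BIN:-}"
  fi
}

# e.g. before starting the server (placeholder path):
#   export TRAQ_FFPROBE_BIN=/opt/homebrew/bin/ffprobe
#   uv run traq-server --reload --port 8000
```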
Running the admin CLI¶
Interactive mode:
uv run traq-admin
One-shot examples:
uv run traq-admin device list
uv run traq-admin job list-assignments
uv run traq-admin customer list
uv run traq-admin artifact fetch --job J0002 --kind report-pdf
Artifact retrieval workflow¶
Use the admin CLI when an operator needs a customer-facing artifact by job number.
Examples:
uv run traq-admin artifact fetch --job J0002 --kind report-pdf
uv run traq-admin artifact fetch --job J0002 --kind traq-pdf
uv run traq-admin artifact fetch --job J0002 --kind transcript
uv run traq-admin artifact fetch --job J0002 --kind final-json
Artifacts are exported into a canonical local folder:
./exports/J0002/J0002_report_letter.pdf
./exports/J0002/J0002_traq_page1.pdf
./exports/J0002/J0002_transcript.txt
./exports/J0002/J0002_final.json
If a correction artifact exists, the exported filename includes the correction variant.
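The kind-to-filename mapping above can be sketched as a small helper; `export_path` is a hypothetical function for illustration, not a project command, and correction variants are omitted because their exact filenames are not listed here:

```shell
# Illustrative mapping from job number and artifact kind to the canonical
# export path, covering only the four kinds documented above.
export_path() {
  job="$1"; kind="$2"
  case "$kind" in
    report-pdf) suffix="report_letter.pdf" ;;
    traq-pdf)   suffix="traq_page1.pdf" ;;
    transcript) suffix="transcript.txt" ;;
    final-json) suffix="final.json" ;;
    *) echo "unknown kind: $kind" >&2; return 1 ;;
  esac
  printf './exports/%s/%s_%s\n' "$job" "$job" "$suffix"
}

export_path J0002 report-pdf   # prints: ./exports/J0002/J0002_report_letter.pdf
```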
The final-json export is DB-backed. It works for jobs whose archived final or
correction payload has been persisted to job_finals. Older file-only finals
may still support PDF retrieval but not final-json export.
Automated tests¶
The canonical local validation path now lives in
docs/testing_and_validation.rst.
Required pre-push regression command:
UV_CACHE_DIR=/tmp/uv-cache uv run python scripts/release_verify.py pre-deploy
Use narrower test targets while iterating on one area, then run the canonical pre-deploy gate before pushing deployment-facing changes.
Release verification helper¶
Use the shared release verification helper when you want the same pre-deploy gate that CI runs:
UV_CACHE_DIR=/tmp/uv-cache uv run python scripts/release_verify.py pre-deploy
This runs the current deployment-facing regression set with CI-safe local env defaults.
PostgreSQL parity lane¶
GitHub Actions now has a dedicated PostgreSQL-backed integration lane. It:
boots temporary PostgreSQL in CI
runs uv run alembic upgrade head
executes tests.test_postgres_ci_smoke
That lane exists to catch migration and database-engine regressions that would not show up in the SQLite-backed fast regression path.
Manual smoke testing¶
Recommended order:
start the server
verify CLI access
fetch assigned jobs on device
edit and submit one review
test one audio/transcript path
test one image path
submit final
verify the finalized job is unassigned and does not reappear as active work
Local Docker smoke test¶
Build:
docker build -t traq-server:local .
Run against local PostgreSQL on Linux:
docker run --rm --network host \
-e TRAQ_DATABASE_URL='postgresql+psycopg://traq_app:<password>@127.0.0.1:5432/traq_demo' \
-e TRAQ_API_KEY='demo-key' \
-e OPENAI_API_KEY='<key>' \
-e TRAQ_ARTIFACT_BACKEND='local' \
-e TRAQ_STORAGE_ROOT='/tmp/traq-local-data' \
traq-server:local
If the local .env already contains the required values, use the helper script:
./scripts/run_local_container.sh
Then verify:
curl -H 'X-API-Key: demo-key' http://127.0.0.1:8000/health
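The container may take a moment to come up, so retrying the health check is more reliable than a single curl. A generic retry helper sketch (`wait_for` is not a project script):

```shell
# Retry a command up to N times with a one-second delay between attempts;
# returns 0 on the first success, 1 if every attempt fails.
wait_for() {
  attempts="$1"; shift
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# e.g.:
#   wait_for 30 curl -fsS -H 'X-API-Key: demo-key' http://127.0.0.1:8000/health
```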
Local data handling¶
TRAQ_STORAGE_ROOT is local-only and git-ignored.
It contains:
artifact bytes
generated outputs
debug/export compatibility files
local rotating log files when file logging is enabled
It is disposable if the database remains intact, but deleting it will remove local artifacts and logs.
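A sketch of that "disposable" property, assuming TRAQ_STORAGE_ROOT is set as in the env-file section; `wipe_storage_root` is an illustrative helper, not a project script:

```shell
# Reset the local storage root (artifacts, generated outputs, logs) without
# touching the database. The :? guard aborts when the variable is unset or
# empty, so we never rm -rf an unintended path.
wipe_storage_root() {
  root="${TRAQ_STORAGE_ROOT:?TRAQ_STORAGE_ROOT is not set}"
  rm -rf "$root"
  mkdir -p "$root"
}
```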
Practical guidance¶
use native uv mode for normal coding
use Docker only to validate the deployment package
use Alembic for schema changes
do not treat local artifact storage as authoritative runtime state