Jack Levy cba19c7bb3 feat: LLM Batch API — OpenAI + Anthropic 50% cost reduction (v0.9.8)
Submit up to 1000 unbriefed documents to the provider Batch API in one
shot instead of making individual synchronous LLM calls. Results are
polled every 30 minutes by a new Celery beat task and imported
automatically.
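The per-document request shape for the OpenAI Batch API can be sketched as below. The function name, document tuple shape, and prompt are illustrative, not the actual worker code; only the JSONL envelope (`custom_id` / `method` / `url` / `body`) follows the documented Batch API format.

```python
import json

def build_openai_batch_jsonl(docs, model="gpt-4o-mini"):
    """Build one JSONL line per document for the OpenAI Batch API.

    `docs` is a list of (doc_id, text) pairs -- an assumed shape for
    illustration. Each line is an independent request; `custom_id` is
    how results are matched back to documents when the batch completes.
    """
    lines = []
    for doc_id, text in docs:
        req = {
            "custom_id": f"doc-{doc_id}",
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": model,
                "messages": [{"role": "user", "content": f"Summarize:\n{text}"}],
            },
        }
        lines.append(json.dumps(req))
    return "\n".join(lines)

# Submission itself (requires the `openai` package and an API key):
#   file = client.files.create(file=jsonl_bytes, purpose="batch")
#   client.batches.create(input_file_id=file.id,
#                         endpoint="/v1/chat/completions",
#                         completion_window="24h")
```

Anthropic's Message Batches API takes a plain request list instead of an uploaded JSONL file, which is why the worker branches on provider.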

- New worker: llm_batch_processor.py
  - submit_llm_batch: guards against duplicate batches, builds JSONL
    (OpenAI) or request list (Anthropic), stores state in AppSetting
  - poll_llm_batch_results: checks batch status, imports completed
    results with idempotency, emits notifications + triggers news fetch
- celery_app: register worker, route to llm queue, beat every 30 min
- admin API: POST /submit-llm-batch + GET /llm-batch-status endpoints
- Frontend: submitLlmBatch + getLlmBatchStatus in adminAPI; the settings
  page shows a batch control row (openai/anthropic only) with a live
  progress line while a batch is processing
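The idempotent import in poll_llm_batch_results can be sketched as a guard keyed on `custom_id`; the function and parameter names here are hypothetical, and the persisted set stands in for whatever state lives in AppSetting:

```python
def import_batch_results(results, already_imported):
    """Import completed batch results exactly once.

    `results` is an iterable of {"custom_id": ...} result dicts;
    `already_imported` is a set of custom_ids persisted between polls
    (e.g. in AppSetting). Re-polling a finished batch is then a no-op
    for anything already imported.
    """
    imported = []
    for r in results:
        cid = r["custom_id"]
        if cid in already_imported:  # idempotency guard: skip on re-poll
            continue
        already_imported.add(cid)
        imported.append(r)
    return imported
```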
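The 30-minute beat entry routed to the llm queue might look like the following; the schedule key and task path are assumptions, and the dict would be assigned to `app.conf.beat_schedule` in celery_app:

```python
# Celery accepts a plain number of seconds as a schedule interval.
beat_schedule = {
    "poll-llm-batch-results": {
        "task": "workers.llm_batch_processor.poll_llm_batch_results",  # assumed module path
        "schedule": 30 * 60.0,          # every 30 minutes
        "options": {"queue": "llm"},    # route to the llm queue
    },
}
```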

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-14 17:35:15 -04:00