Export models and wire into dashboard #102

@William-Hill

Description

Summary

Register winning GGUF models in Ollama and wire the dashboard to use them via model-client.ts.

Depends On

Tasks

  • Create Ollama Modelfiles for each task with matched system prompts:
    • bishop-state-narrator:{size}
    • bishop-state-summarizer:{size}
    • bishop-state-explainer:{size}
  • Register models via ollama create
  • Verify each model responds correctly via ollama run
  • Refactor explain-pairing/route.ts to use generateExplanation() from model-client.ts
  • Refactor query-summary/route.ts to use generateSummary() from model-client.ts
  • Remove duplicate createOpenAI instances from refactored routes
  • Update generate_readiness_scores.py default model string for --enrich-with-llm
  • Add env vars to .env.example: MODEL_BACKEND, OLLAMA_BASE_URL, MODEL_SIZE, SCHOOL_CODE
  • Test with MODEL_BACKEND=ollama: all 3 tasks serve correctly
  • Test with MODEL_BACKEND=openai: fallback still works
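A Modelfile for one of these tasks might look like the sketch below; the GGUF path, parameter values, and system prompt are placeholders, not the project's actual values:

```
# Hypothetical Modelfile for the narrator task; FROM path and SYSTEM prompt are placeholders
FROM ./models/narrator.gguf
PARAMETER temperature 0.2
SYSTEM """You are the Bishop State narrator. Describe state changes concisely."""
```

Assuming a size tag of `3b`, the model would then be registered with `ollama create bishop-state-narrator:3b -f Modelfile` and smoke-tested with `ollama run bishop-state-narrator:3b`, per the tasks above.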

Ollama Model Naming

bishop-state-narrator:{size}
bishop-state-summarizer:{size}
bishop-state-explainer:{size}
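The MODEL_BACKEND switch behind this naming scheme could be sketched as follows. Since model-client.ts is not shown, every identifier here is an illustrative assumption, including the OpenAI fallback model:

```typescript
// Hypothetical sketch of the model-selection logic in model-client.ts.
// Names and the OpenAI fallback model are assumptions, not the project's code.
type Task = "narrator" | "summarizer" | "explainer";

// Placeholder fallback model; the real routes presumably configure their own.
const OPENAI_FALLBACK_MODEL = "gpt-4o-mini";

// Build an Ollama model name following the bishop-state-{task}:{size} scheme.
function ollamaModelName(task: Task, size: string): string {
  return `bishop-state-${task}:${size}`;
}

// Resolve the model identifier from MODEL_BACKEND, falling back to OpenAI.
function resolveModel(task: Task, backend: string, size: string): string {
  return backend === "ollama" ? ollamaModelName(task, size) : OPENAI_FALLBACK_MODEL;
}
```

With a helper in this shape, generateExplanation() and generateSummary() could share one resolver instead of each route instantiating its own OpenAI client.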

Acceptance Criteria

  • MODEL_BACKEND=ollama serves all 3 tasks without OpenAI
  • MODEL_BACKEND=openai still works as before
  • No duplicate OpenAI client instantiations remain in refactored routes
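The env vars from the task list could be added to .env.example along these lines; the values shown are illustrative defaults (11434 is Ollama's standard port), not the project's actual settings:

```
# Select the model backend: "ollama" or "openai"
MODEL_BACKEND=ollama
# Base URL of the local Ollama server
OLLAMA_BASE_URL=http://localhost:11434
# Size tag used in the bishop-state-*:{size} model names
MODEL_SIZE=
# Project-specific school identifier (purpose defined elsewhere in the repo)
SCHOOL_CODE=
```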
