Export models and wire into dashboard #102
Open
Labels
`area:ai` (AI/ML, NLQ features) · `area:frontend` (UI, React components) · `fine-tuning: student-explainability` (Fine-tune Qwen 3.5 for SHAP narrator, summarizer, and explainer tasks) · `type:feature` (New feature)
Summary
Register the winning GGUF models in Ollama and wire the dashboard to use them via `model-client.ts`.
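As a rough sketch of the wiring the summary describes, the snippet below shows how `model-client.ts` might resolve a backend and model tag from the environment variables this issue introduces (`MODEL_BACKEND`, `OLLAMA_BASE_URL`, `MODEL_SIZE`). The function name `resolveModelTarget`, the OpenAI fallback model string, and the default port are all assumptions, not the actual implementation:

```typescript
type Backend = "ollama" | "openai";
type Task = "narrator" | "summarizer" | "explainer";

interface ModelTarget {
  backend: Backend;
  baseUrl: string;
  model: string;
}

// Hypothetical resolver: pick the Ollama-served fine-tune unless
// MODEL_BACKEND=openai explicitly requests the fallback path.
function resolveModelTarget(
  task: Task,
  env: { MODEL_BACKEND?: string; OLLAMA_BASE_URL?: string; MODEL_SIZE?: string },
): ModelTarget {
  if (env.MODEL_BACKEND === "openai") {
    // Fallback: keep routing through OpenAI (placeholder model string).
    return { backend: "openai", baseUrl: "https://api.openai.com/v1", model: "gpt-4o-mini" };
  }
  const size = env.MODEL_SIZE ?? "7b"; // assumed default size
  return {
    backend: "ollama",
    baseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434", // Ollama's default port
    model: `bishop-state-${task}:${size}`,
  };
}

console.log(resolveModelTarget("narrator", { MODEL_BACKEND: "ollama", MODEL_SIZE: "7b" }));
```

The point of centralizing this in one resolver is that `explain-pairing/route.ts` and `query-summary/route.ts` never need to know which backend is active.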
Depends On
Tasks
- [ ] Create `bishop-state-narrator:{size}`, `bishop-state-summarizer:{size}`, and `bishop-state-explainer:{size}` with `ollama create`, then smoke-test each with `ollama run`
- [ ] Refactor `explain-pairing/route.ts` to use `generateExplanation()` from `model-client.ts`
- [ ] Refactor `query-summary/route.ts` to use `generateSummary()` from `model-client.ts`
- [ ] Remove `createOpenAI` instances from refactored routes
- [ ] Update the `generate_readiness_scores.py` default model string for `--enrich-with-llm`
- [ ] Add `MODEL_BACKEND`, `OLLAMA_BASE_URL`, `MODEL_SIZE`, and `SCHOOL_CODE` to `.env.example`
- [ ] Verify `MODEL_BACKEND=ollama`: all 3 tasks serve correctly
- [ ] Verify `MODEL_BACKEND=openai`: fallback still works

Ollama Model Naming
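The naming scheme used above (`bishop-state-<task>:{size}`) can be sketched as a small helper that emits the `ollama create` commands for all three tasks at a given size. The Modelfile paths (`Modelfile.<task>`) are an assumption for illustration; only the tag format comes from this issue:

```typescript
const TASKS = ["narrator", "summarizer", "explainer"] as const;

// Build the Ollama tag for one task, e.g. "bishop-state-narrator:7b".
function modelTag(task: (typeof TASKS)[number], size: string): string {
  return `bishop-state-${task}:${size}`;
}

// One `ollama create` command per task; -f points at the Modelfile
// (hypothetical paths) that wraps the exported GGUF.
function registrationCommands(size: string): string[] {
  return TASKS.map((task) => `ollama create ${modelTag(task, size)} -f Modelfile.${task}`);
}

console.log(registrationCommands("7b").join("\n"));
```

Keeping the task name in the tag lets `model-client.ts` derive the model string from the task alone, with `{size}` as the only other variable.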
Acceptance Criteria
- `MODEL_BACKEND=ollama` serves all 3 tasks without OpenAI
- `MODEL_BACKEND=openai` still works as before