Files
famlaw/activeblue_familylaw/models/fl_ai_engine.py
Carlos Garcia 1d52d85a78 Phase 1: core models, security, seed data, and backend views
Implements full Phase 1 of the activeblue_familylaw Odoo 18 module:
- 17 Python models (fl.case, fl.party, fl.child, fl.support.calculation,
  fl.fee.waiver, fl.income.withholding, fl.deadline, fl.hearing,
  fl.deposition, fl.discovery, fl.document, fl.caselaw, fl.analysis,
  fl.ai.engine, fl.argument, fl.statute, fl.issue.tag) + hr.expense extension
- 3 wizard stubs (intake, analysis, generate-packet)
- Security: 4 groups (admin/paralegal/portal-petitioner/portal-respondent)
  + record rules scoping portal users to their own cases
- Seed data: issue tags, FL statutes, FL DCF support schedule, ir.sequence
- 13 backend view XML files with FL 61.30 worksheet, fee waiver
  eligibility banner, DV safety resources, emancipation alerts
- Static CSS/JS stubs for Phase 6 portal

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-04 18:52:04 -04:00
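The FL 61.30 worksheet views mentioned above rest on the standard Florida guideline mechanism: the basic support obligation from the schedule is prorated between the parents by each one's share of combined monthly net income. A minimal sketch of that proration step (the function name and signature are illustrative, not the module's actual API):

```python
def prorate_obligation(basic_obligation, parent_a_net, parent_b_net):
    """Split a guideline support obligation by income share (Fla. Stat. 61.30 style).

    Returns (parent_a_share, parent_b_share), each rounded to cents.
    """
    combined = parent_a_net + parent_b_net
    if combined <= 0:
        raise ValueError("combined net income must be positive")
    # Each parent owes the basic obligation times their fraction of combined income.
    share_a = parent_a_net / combined
    return (round(basic_obligation * share_a, 2),
            round(basic_obligation * (1 - share_a), 2))
```

The real worksheet also handles overnights, health insurance, and childcare adjustments; this shows only the core proration.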

import json

from odoo import models

OLLAMA_URL = 'http://192.168.2.10:11434/api/generate'
OLLAMA_MODEL = 'llama3.1'
class FlAiEngine(models.AbstractModel):
    """
    Phase 1: stub service model.
    Phase 5 will add full Ollama integration.

    This is an AbstractModel, so it is never stored in the database;
    it is used as a service class for AI analysis calls.
    """
    _name = 'fl.ai.engine'
    _description = 'Family Law AI Analysis Engine (Ollama)'
    def analyze_case(self, case_id):
        """
        Phase 5 entry point.

        Full workflow:
        1. Rule-based issue tagging
        2. Build case context JSON
        3. Call Ollama (llama3.1)
        4. Parse JSON response
        5. Store fl.analysis record
        """
        case = self.env['fl.case'].browse(case_id)
        analysis = self.env['fl.analysis'].create({
            'case_id': case.id,
            # The stub completes synchronously, so the record is created
            # directly in the 'complete' state; Phase 5 will use 'pending'
            # while the Ollama call runs.
            'state': 'complete',
            'model_used': OLLAMA_MODEL,
            'plain_english_summary': (
                'AI analysis not yet implemented. '
                'Full analysis will be available in Phase 5.'
            ),
            'plain_english_summary_es': (
                'El análisis de IA aún no está implementado. '
                'El análisis completo estará disponible en la Fase 5.'
            ),
        })
        return analysis
    def _call_ollama(self, prompt):
        """Call the Ollama generate API and return the parsed JSON response."""
        try:
            import requests
        except ImportError:
            raise RuntimeError(
                'requests library not available. '
                'Install with: pip install requests'
            )
        response = requests.post(
            OLLAMA_URL,
            json={
                'model': OLLAMA_MODEL,
                'prompt': prompt,
                'stream': False,
                'options': {
                    'temperature': 0.1,
                    'top_p': 0.9,
                    'num_predict': 2000,
                },
            },
            timeout=180,
        )
        response.raise_for_status()
        raw = response.json().get('response', '{}').strip()
        # Strip markdown code fences if the model wrapped its JSON in ``` ... ```
        if raw.startswith('```'):
            parts = raw.split('```')
            raw = parts[1] if len(parts) > 1 else raw
            if raw.startswith('json'):
                raw = raw[4:]
        return json.loads(raw.strip())
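The fence-stripping logic at the end of `_call_ollama` can be factored into a standalone helper, which makes it unit-testable without a running Ollama server. A sketch under that assumption (the helper name is ours, not part of the module):

```python
import json

def parse_fenced_json(raw):
    """Parse a JSON payload that a model may have wrapped in ```json ... ``` fences."""
    raw = raw.strip()
    if raw.startswith('```'):
        # Splitting on the fence yields ['', 'json\n{...}\n', ''] for a fenced reply.
        parts = raw.split('```')
        raw = parts[1] if len(parts) > 1 else raw
        # Drop an opening-fence language tag like ```json.
        if raw.startswith('json'):
            raw = raw[4:]
    return json.loads(raw.strip())
```

Usage: `parse_fenced_json(response.json().get('response', '{}'))` accepts both bare JSON and fenced replies, so the same path handles models that do or do not add markdown fences.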