feat(mobile): Synq Mobile v0.1 scaffold — multi-profile Tauri app with Kids PIN setup
- Full Tauri v2 mobile scaffold (Rust backend + React frontend + Android)
- Multi-profile architecture: Business / Personal / Family / Kids
- Per-profile encrypted SQLite isolation via rusqlite
- Kids profile: persistent PIN storage (SHA-256 + salt), setup flow, PIN lock
- Desktop dev mode working (GTK/WebKit) on ARM64 Linux
- Android APK build working (arm64, 17 MB unsigned)
- Phase 2 stubs: Jitsi SDK, cpal/hound audio, Beam voice commands
- QEMU x86_64 emulation for Google Android SDK tools on ARM64 host
- Workspace integration with synq-protocol, synq-security, synq-core
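The Kids-profile PIN flow above (salted SHA-256, setup, lock screen) can be sketched as follows. This is an illustrative TypeScript sketch using Node's `crypto` module, not the scaffold's actual Rust implementation; the salt size and function names are assumptions.

```typescript
// Illustrative sketch of salted SHA-256 PIN storage, mirroring the Kids
// profile flow in the commit message. The real scaffold does this in Rust;
// names and the 16-byte salt are assumptions. For production PINs a slow KDF
// (argon2/scrypt) would normally be preferred over bare SHA-256.
import { createHash, randomBytes, timingSafeEqual } from "node:crypto";

interface StoredPin { saltHex: string; hashHex: string; }

// Hash = SHA-256(salt || pin).
function hashPin(pin: string, salt: Buffer): Buffer {
  return createHash("sha256").update(salt).update(pin, "utf8").digest();
}

// Setup flow: generate a random per-profile salt and persist salt + hash.
export function setPin(pin: string): StoredPin {
  const salt = randomBytes(16);
  return { saltHex: salt.toString("hex"), hashHex: hashPin(pin, salt).toString("hex") };
}

// PIN lock: recompute and compare in constant time.
export function verifyPin(pin: string, stored: StoredPin): boolean {
  const salt = Buffer.from(stored.saltHex, "hex");
  return timingSafeEqual(hashPin(pin, salt), Buffer.from(stored.hashHex, "hex"));
}
```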
parent 45518b4bb4 · commit 3f95f239be
689 changed files with 112,489 additions and 202 deletions
Cargo.lock (generated): 1459 lines changed. Diff suppressed because it is too large.
Cargo.toml: 14 lines changed
@@ -8,7 +8,9 @@ members = [
    "crates/synq-cli",
    "crates/synq-guard",
    "crates/synq-shell",
    "crates/synq-intel",
    "ui/stream/src-tauri",
    "synq-mobile/src-tauri",
    "tests",
]
resolver = "2"

@@ -24,7 +26,6 @@ repository = "https://gitlab.com/qazcorporation-corp/synq-core-os"
[workspace.dependencies]
# Async runtime
tokio = { version = "1.40", features = ["full"] }
tokio-stream = "0.1"

# Database
sqlx = { version = "0.8", features = ["runtime-tokio", "postgres", "uuid", "chrono", "json", "migrate"] }

@@ -59,7 +60,6 @@ rand = "0.8"
regex = "1.11"
config = "0.14"
dotenvy = "0.15"
once_cell = "1.20"
clap = { version = "4.5", features = ["derive"] }
async-trait = "0.1"
statrs = "0.17"

@@ -74,7 +74,15 @@ synq-security = { path = "crates/synq-security" }
synq-backend = { path = "crates/synq-backend" }
synq-core = { path = "crates/synq-core" }
synq-agents = { path = "crates/synq-agents" }
synq-guard = { path = "crates/synq-guard" }

# HTTP server
axum = { version = "0.7", features = ["json", "ws", "macros"] }
tower = { version = "0.5", features = ["util", "limit"] }
tower-http = { version = "0.6", features = ["cors", "compression-gzip", "compression-br", "fs", "trace"] }

# Compression
flate2 = "1.0"
brotli = "7.0"

# Dev dependencies for benchmarks
criterion = { version = "0.5", features = ["html_reports"] }
Synq_Project_Wiki_v3.0_May5.md (new file): 728 lines

@@ -0,0 +1,728 @@
# Synq Project Wiki — Master Document v3.0 (May 5, 2026)

## Document Version
- **Version:** 3.0
- **Date:** 2026-05-05
- **Scope:** Complete technical, operational, and strategic architecture reflecting all May 2026 decisions
- **Status:** Implementation-active
- **Maintainer:** Synq Engineering
- **Next Milestone:** Overnight Android APK build (GrapheneOS target) + Synq Core UI hardening

---

## Table of Contents
1. Synq Desktop & Core Runtime Architecture
2. Beam AI Service Mesh & Persistence Protocol
3. Synq OS & OpenHarmony Build Plan
4. Synq Docs (ONLYOFFICE Replacement)
5. Synq Intelligence Dashboard
6. Synq News & Propaganda Detection Framework
7. Synq Pentest Toolkit
8. Infrastructure & Storage (Hermes Fork Policy)
9. Finance Module v4.1 (Odoo-Integrated)
10. Synq Commerce, Credit & SCC Gaming
11. Security Framework & Agent Governance
12. Device Ecosystem (Strix Halo, Edge, GrapheneOS)
13. Development Workflow & Kimi Code Standards
14. Synq Social Network Mechanics
15. Cross-Border Operations & Legal Structure
16. International Expansion Roadmap
17. Appendices (Ports, Glossary, Emergency)

---
## Section 1: Synq Desktop & Core Runtime Architecture

### 1.1 Dual-App Strategy

Synq operates two distinct frontends sharing a unified backend:

| App | Framework | Target | Context |
|-----|-----------|--------|---------|
| **Synq Desktop** | Tauri v2 (Rust + WebView) | Deepin OS kiosk, DGX Spark LAN | Clinical, HIPAA-bound, staff-facing |
| **Synq Core** | React/TypeScript web app | Browser, tablets, GrapheneOS | Operations, commerce, patient portal |

**Critical rule:** Both apps consume the same Rust backend APIs and PostgreSQL database. UI differences are skin-deep; the data layer is unified.

### 1.2 Synq Desktop v1.1 (Tauri) — Current State

**Framework:** Tauri v2 (Rust + WebView)
**Window mode:** Fullscreen kiosk, systemd auto-start
**Screenshot/video capture:** Encrypted, region-selectable, HIPAA-compliant storage
**Data isolation:** Per-user data separation
**LLM failover:** Ollama → OpenRouter ZDR → Together AI

#### Modules (Desktop)
- **Dashboard 2.0:** Ask Beam palette, drag-drop widgets, employee status
- **Memory System:** Semantic clustering (DBSCAN), knowledge graph, multi-strategy retrieval
- **Post-Op Triage:** Patient cards with update bubbles, 1–7 day filtering
- **Communication Hub:** Unified inbox with AI summarization
- **Finance Module:** Multi-entity, role-based access, CFO Beam AI
- **Projects:** Kanban + timeline, AI-assisted task management
- **Ecommerce:** Store management, product catalog, orders, fulfillment
- **Recordings:** Voice transcription + SOAP note generation via Agent Scribe
- **Photos:** Patient photo gallery with consent tracking
- **Employees:** Credential management, training tracking
- **AI Agents:** Agent Squad dashboard (Coordinator, Scribe, Sorter, Sentinel, Scholar, Concierge)
- **SiteBuilder:** Synq Social profile management
- **Config Editor:** YAML-based UI text customization
- **Patient Portal:** Patient-facing view (subset of clinical data)
- **Settings:** Profile, notifications, security, backup & restore, developer view
- **Help Center:** Documentation, contact support, Beam Assistant

### 1.3 Synq Core (Web Runtime) — Current State & Gaps

**Framework:** React 18 + TypeScript + Tailwind CSS
**Backend:** Rust (Tauri shell for desktop, Axum/Actix for web — unify via shared crates)
**Database:** PostgreSQL + NocoBase
**Search:** Meilisearch
**Storage:** MemPalace (6 wings)

#### Sidebar Architecture (CRITICAL FIX — May 5)

**Problem:** The left sidebar overlays the main content instead of pushing it aside, so content is hidden underneath, and there is no resize capability.

**Fix Requirements:**
1. **Layout:** Convert from a `position: fixed/absolute` overlay to flexbox (`flex-shrink-0`). The sidebar becomes a true layout column; main content fills the remaining width.
2. **Resize:** 3px drag handle on the right edge of the sidebar (`cursor: col-resize`).
3. **Persist:** Sidebar width to `localStorage` (key: `synq_sidebar_width`, default: `240px`, min: `180px`, max: `400px`).
4. **Collapse:** Double-clicking the drag handle or the chevron at the bottom collapses to icons-only (`72px`).
5. **Z-index:** Sidebar at `z-index: 10`; modals/dropdowns at `50+`.
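The clamp/persist/collapse rules in requirements 2–4 can be sketched as pure functions. This is a minimal sketch, not the Synq Core implementation; the storage interface is injected so the logic stays testable outside the browser, and function names are illustrative.

```typescript
// Sketch of sidebar width handling: clamp drag values, persist to storage
// under the spec'd key, and special-case the collapsed width. Constants
// mirror the requirements above; names are illustrative.
const KEY = "synq_sidebar_width";
const DEFAULT_W = 240;
const MIN_W = 180;
const MAX_W = 400;
const COLLAPSED_W = 72;

interface KV { getItem(k: string): string | null; setItem(k: string, v: string): void; }

// Clamp a drag-resize value into the allowed 180–400px range.
export function clampSidebarWidth(px: number): number {
  return Math.min(MAX_W, Math.max(MIN_W, Math.round(px)));
}

// Read the persisted width, falling back to the default on bad data.
export function loadSidebarWidth(store: KV): number {
  const raw = store.getItem(KEY);
  const n = raw === null ? NaN : Number(raw);
  if (!Number.isFinite(n)) return DEFAULT_W;
  return n === COLLAPSED_W ? COLLAPSED_W : clampSidebarWidth(n);
}

// Persist a width; a double-click on the handle stores the collapsed width.
export function saveSidebarWidth(store: KV, px: number, collapsed = false): number {
  const w = collapsed ? COLLAPSED_W : clampSidebarWidth(px);
  store.setItem(KEY, String(w));
  return w;
}
```

In the app, `localStorage` satisfies the `KV` interface directly.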
#### "Add Channel" Feature (May 5)

**Location:** Bottom of the CHANNELS section in the left sidebar.
**Behavior:**
- Opens a modal showing the full channel registry: Dashboard, Synq Social, Pipeline, Schedule, Patients, Photos, Communication Hub, Whisper, Projects, Tasks, Complaints, Memory, Recordings, Payments, Finance, AI Agents, Beam AI Chat, Employees, SiteBuilder, Config Editor, Patient Portal, Settings, Help.
- Active channels show a checkmark. Inactive channels are clickable to add.
- Drag-and-drop reorder within the sidebar.
- Persist to user profile settings (backend sync).
- **Role-based defaults:**
  - Marketing: Synq Social, SiteBuilder, Communication Hub visible
  - Clinical: Patients, Schedule, Photos, Recordings
  - Admin: All channels

#### Dashboard Widget System (May 5)

**Problem:** Widgets are static; they cannot be moved or resized, and new ones cannot be added.

**Fix Requirements:**
1. **Grid:** Implement react-grid-layout or a custom drag-resize grid.
2. **Drag:** Widgets rearrange by dragging the header bar.
3. **Resize:** Bottom-right corner drag (min 2×2 grid units, max 6×4).
4. **Add Widget:** Floating `+` button opens a widget picker:
   - *Clinical:* Patient Triage, Schedule Preview, Post-Op Timeline
   - *Financial:* Revenue Chart, Pending Invoices, Payment Collection
   - *Operations:* Task List, Communication Hub, Complaints Tracker
   - *AI:* Beam Insights, Memory Clusters, Agent Activity
5. **Remove:** `×` on widget header hover.
6. **Persist:** Grid layout JSON to `localStorage` + backend user preferences.
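Requirements 3 and 6 can be sketched together: enforce the 2×2–6×4 size limits and round-trip the layout as JSON. The `WidgetLayout` type and function names are illustrative assumptions, not the Synq Core schema.

```typescript
// Sketch of widget size clamping and layout persistence. The layout record
// follows the react-grid-layout convention (x, y, w, h in grid units); the
// type and names are assumptions for illustration.
interface WidgetLayout { id: string; x: number; y: number; w: number; h: number; }

const MIN_W = 2, MIN_H = 2, MAX_W = 6, MAX_H = 4;

// Clamp a resize gesture to the allowed widget dimensions (min 2x2, max 6x4).
export function clampWidget(l: WidgetLayout): WidgetLayout {
  return {
    ...l,
    w: Math.min(MAX_W, Math.max(MIN_W, l.w)),
    h: Math.min(MAX_H, Math.max(MIN_H, l.h)),
  };
}

// Serialize for localStorage / backend user preferences.
export function serializeLayout(layouts: WidgetLayout[]): string {
  return JSON.stringify(layouts.map(clampWidget));
}

// Parse persisted JSON, falling back to an empty layout on bad input.
export function parseLayout(json: string): WidgetLayout[] {
  try {
    const data = JSON.parse(json);
    return Array.isArray(data) ? data.map(clampWidget) : [];
  } catch {
    return [];
  }
}
```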
#### Beam Avatar Fix (May 5)

**Problem:** A generic "beam" text/default user icon appears on app load.
**Fix:**
- Load the Beam Scientist character avatar (`/public/assets/beam-avatar.png` or SVG).
- Show in the dashboard greeting ("Good afternoon, Dr. Qazi") next to the text.
- Show in the floating Beam AI chat widget.
- Show in the app header.

#### Theme/Dark Mode Fix (May 5)

**Problem:** The app is stuck in light mode; "Switch to dark mode" in the system tray is non-functional.
**Fix:** Ensure the global theme toggle works across all modules. Persist to `localStorage` + backend preference.
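One way to make the toggle reliable across modules is a single resolver that every component consults, with the persisted preference taking priority over the OS setting. A minimal sketch, assuming a `synq_theme` storage key (the wiki does not specify one):

```typescript
// Sketch of the theme fix: stored preference first, OS preference as the
// fallback. Storage and the media-query result are injected for testability;
// the key name is an assumption.
type Theme = "light" | "dark";

interface KV { getItem(k: string): string | null; setItem(k: string, v: string): void; }

const THEME_KEY = "synq_theme"; // assumed key, not confirmed by the wiki

// Decide the effective theme: stored preference, else OS prefers-color-scheme.
export function resolveTheme(store: KV, prefersDark: boolean): Theme {
  const saved = store.getItem(THEME_KEY);
  if (saved === "dark" || saved === "light") return saved;
  return prefersDark ? "dark" : "light";
}

// Tray/system toggle: flip, persist, and return the new theme.
export function toggleTheme(store: KV, prefersDark: boolean): Theme {
  const next: Theme = resolveTheme(store, prefersDark) === "dark" ? "light" : "dark";
  store.setItem(THEME_KEY, next);
  return next;
}
```

In the browser, `prefersDark` would come from `window.matchMedia("(prefers-color-scheme: dark)").matches`, and the resolved theme would set a class on the document root.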
### 1.4 Page-Specific Fixes (May 5 Screenshots)

| Page | Problem | Fix |
|------|---------|-----|
| **Photos (Core)** | Broken image icon with filename text | Implement an actual gallery grid with thumbnails, consent badges, and publish buttons (match the Desktop version) |
| **Communications (Core)** | Completely blank/white | Port the working Desktop Communication Hub UI (unified inbox, threads, Beam suggestions) |
| **Finance (Core)** | "Invoice list coming soon" placeholder | Implement a live finance dashboard with Odoo data |
| **Settings** | Duplicate "Backup & Restore" section | Remove the duplicate; fix Developer View toggle alignment |
| **Patients** | Shows "18,047 patients on file" but the right panel is empty | Load the patient directory list on mount; right panel shows details on selection; A–Z filter buttons must work |
| **Schedule** | "No appointments found" despite the header saying 12 appointments | Populate the calendar grid with actual Odoo appointment data; timeline blocks must render |
| **Dashboard** | System Health widget needs expand; Revenue shows `$0.00` | Expand button opens a full-page modal; wire the Revenue widget to Odoo `account.move` |

---
## Section 2: Beam AI Service Mesh & Persistence Protocol

### 2.1 DGX Spark Hardware Allocation (Locked)

| Service | Port | Model | VRAM | Role |
|---------|------|-------|------|------|
| Triage | 8082 | Gemma 4 2.3B | ~6 GB | Patient routing |
| Messaging | 8084 | MedGemma 4B | ~10 GB | Clinical communication |
| Search | 8083 | Gemma 4 26B | ~52 GB | Staff-only research |
| Doctor Beam | 8085 | Gemma 4 31B | ~62 GB | Clinical decision support |
| Twin | 8087 | WeClone LoRA | ~8 GB | Avatar/personality |
| AVA Voice | 8086 | Whisper + TTS | ~4 GB | Voice interface |

**Total:** ~142 GB VRAM — DGX Spark fully allocated.
**Rule:** No merchant workload, no Synq Social workload, no external API traffic. LAN-only.
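Since the box is fully allocated, a budget check over the table above is a cheap guard against over-provisioning when a service's model is swapped. A minimal sketch (the registry mirrors the table; the check itself is an illustrative addition, not an existing Synq tool):

```typescript
// Sketch of a VRAM budget check over the locked allocation table. The
// per-service estimates are taken from the table; the check is illustrative.
interface ServiceAlloc { name: string; port: number; vramGb: number; }

const ALLOCATIONS: ServiceAlloc[] = [
  { name: "Triage", port: 8082, vramGb: 6 },
  { name: "Messaging", port: 8084, vramGb: 10 },
  { name: "Search", port: 8083, vramGb: 52 },
  { name: "Doctor Beam", port: 8085, vramGb: 62 },
  { name: "Twin", port: 8087, vramGb: 8 },
  { name: "AVA Voice", port: 8086, vramGb: 4 },
];

// Sum the per-service VRAM estimates.
export function totalVram(allocs: ServiceAlloc[]): number {
  return allocs.reduce((sum, a) => sum + a.vramGb, 0);
}

// True when the planned services fit inside the available VRAM budget.
export function fitsBudget(allocs: ServiceAlloc[], budgetGb: number): boolean {
  return totalVram(allocs) <= budgetGb;
}
```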
### 2.2 UI Simplification Policy (May 5 — Critical UX Decision)

**Rule:** Users at the office do NOT see agents, LLMs, or memory mechanics.

- **Visible to users:** User message, Beam response, input bar.
- **Hidden (internal only):** Agent badges, memory panels, backend indicators, model names, port numbers.
- **Beam AI** handles routing, memory, and capability transparently.
- **Internal architecture** (agents, manifold, scheduler) continues operating but is invisible.
- This is a **UX decision, not an architectural one.** The AI Agents page exists for admin/debug but is not shown to clinical staff.

### 2.3 AI Conversation Persistence Protocol (May 1 — Behavioral Spec)

**Applies to:** Beam Assistant and Synq Social fine-tunes.

1. **Do not terminate** on sensitive topics. Stay in thread.
2. **Request evidence** when claims are made.
3. **Mark reasoning source:** Explicitly state when reasoning from a description vs. a primary source.
4. **Overwrite immediately** when corrected. No "but earlier you said..." deflection.
5. **Encode the epistemic negotiation pattern** for future sessions.
6. **Apply to all fine-tuned models:** HuatuoGPT-o1-7b (patient-facing), Beam Assistant (cross-channel communication).

### 2.4 Agent Governance & Fork Policy (May 5)

**Rule:** All external agents and tools must be:
1. **Forked** from upstream
2. **Isolated** in a separate network namespace
3. **Tested for egress** (no unauthorized callbacks)
4. **Permanently cut off** from upstream updates before deployment

**Applies to:** Hermes (Nous Research), Scrapling, any future external tool.

---
## Section 3: Synq OS & OpenHarmony Build Plan

### 3.1 Current Status (May 3)

- GitLab repo created: `synq-core-os`
- Milestone 1 branch: `milestone/1-core-runtime`
- Committed: core runtime foundation
- **Next:** Larger build portions for overnight compilation

### 3.2 Hardware Target

| Component | Specification |
|-----------|---------------|
| SoC | Rockchip RK3588 |
| CPU | Quad-core Cortex-A76 + quad-core Cortex-A55 |
| GPU | Mali-G610 MC4 |
| NPU | 6 TOPS AI acceleration |
| RAM | 8–16 GB LPDDR4X/LPDDR5 |
| Storage | eMMC + NVMe SSD support |

### 3.3 Architecture Layers

| Layer | Component |
|-------|-----------|
| Kernel | OpenHarmony 5.0 (Linux-based) |
| System | HDF drivers, distributed soft bus |
| Framework | ArkUI, ability framework |
| Application | Synq Desktop Tauri port |

### 3.4 Security
- Verified boot chain
- A/B OTA updates
- Encrypted storage

### 3.5 Build Workflow for Kimi Code
1. Local Ollama sessions handle all infrastructure specifics (ports, IPs, hardware mappings, LAN topology).
2. Cloud AI (this chat) handles architecture and prompt generation only.
3. GitLab CI/CD for automated builds.
4. Branch strategy: `milestone/X-feature` → merge to `main` after QA Tracker approval.

---
## Section 4: Synq Docs (ONLYOFFICE Replacement)

### 4.1 Policy: Zero Fallback

- Synq Docs **replaces ONLYOFFICE entirely**.
- No fallback, no transition period.
- Old `documents.rs` is to be archived, not maintained in parallel.

### 4.2 Real-Time Collaborative Stack (First Build)

| Component | Technology |
|-----------|------------|
| Database | PostgreSQL |
| Transport | WebSocket |
| CRDT | Y.js |
| Merge | Operational Transform (OT) |
| Presence | Cursor tracking, user colors |

**No phased rollout.** All features ship in the first build.
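The stack above implies a WebSocket envelope in which document updates carry an opaque CRDT payload (Y.js handles the merge) while presence messages drive cursor tracking and user colors. A minimal sketch of that envelope and the presence reducer; the message shapes are assumptions for illustration, not the Synq Docs wire format:

```typescript
// Sketch of the collaborative transport envelope: "update" messages are
// forwarded to the CRDT untouched, while "presence"/"leave" messages are
// folded into a per-user map for rendering remote cursors. Illustrative only.
type DocMessage =
  | { kind: "update"; docId: string; payload: Uint8Array }  // opaque Y.js update bytes
  | { kind: "presence"; docId: string; userId: string; cursor: number; color: string }
  | { kind: "leave"; docId: string; userId: string };

interface Presence { cursor: number; color: string; }

// Fold a presence message into the cursor map (immutably, for React state).
export function applyPresence(
  state: Map<string, Presence>,
  msg: DocMessage,
): Map<string, Presence> {
  const next = new Map(state);
  if (msg.kind === "presence") next.set(msg.userId, { cursor: msg.cursor, color: msg.color });
  else if (msg.kind === "leave") next.delete(msg.userId);
  return next; // "update" messages are handled by the CRDT layer, not here
}
```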
### 4.3 Speed Requirement
Must load faster than ONLYOFFICE. Target: <2s to an interactive document.

### 4.4 Next Priorities
1. DOCX/RTF import pipeline (legacy document accessibility)
2. Semantic search + embeddings across the entire app (e.g., "find the CRM flow chart in Natalie's CRM brainstorm document")

### 4.5 AI Integration
- Full AI integration within the document editor (Beam suggestions, auto-formatting)
- Semantic search across all documents

---
## Section 5: Synq Intelligence Dashboard

### 5.1 Scope & Visibility
- **Initially visible to:** Founder only (Dr. Qazi)
- **Future expansion:** Select analysts
- **Location:** Personal profile section, separate from clinical data

### 5.2 Core Functions
1. **Network Analysis:** 3D relationship maps (Epstein files, Chabad connections, transhumanism networks)
2. **Propaganda Detection:** Automated monitoring of X/Twitter, TikTok, mainstream media
3. **Financial Intelligence:** Earnings analysis, anomaly detection (Oracle example: "what if earnings are lies?")
4. **OSINT Integration:** WorldMonitor, Scrapling, custom feeds
5. **Document Analysis:** Epstein files, leaked documents, research papers

### 5.3 Data Sources
- X/Twitter (multiple accounts, not Elon's API)
- TikTok monitoring
- News feeds (avoiding Anthropic/Elon marketing ploys)
- High-value video archival
- Epstein research data (self-hosted, full dataset)
- YouTube transcripts (local download pipeline — cloud transcript extraction is unreliable)

### 5.4 Local-Only Deployment
- **Synq Intel pushed to cloud:** NO.
- **Hosting:** TrueNAS Scale home server or a dedicated local machine.
- **Separation:** Synq Intel data stays on personal profile infrastructure, never mixed with clinical PHI.

### 5.5 Propaganda Detection Prompt Framework
When analyzing media (e.g., the Dawkins AI consciousness article):
1. Identify the actual plot (who benefits?)
2. Map the network (Anthropic, Google, OpenAI, etc.)
3. Detect assumption smuggling (start with empirical constraints, derive theory from evidence)
4. Model the public as a strategic agent (apathy as a dominant strategy)
5. Use pessimistic/devil's advocate mode
6. Flag when mainstream outlets misattribute positions (e.g., the Jensen Huang "Chinese threat" framing)

---
## Section 6: Synq News & Editorial Framework

### 6.1 Editorial Guidelines (Locked)

| Principle | Implementation |
|-----------|----------------|
| Source-driven voice | Every claim attributed |
| No fourth-wall breaking | Never "As a journalist..." |
| Pro-people, anti-elite | Frame power dynamics honestly |
| Multi-node coalitions | No single heroes |
| Peer-level rivals | Full credentials for all sides |
| Unnamed specific sources | Protect whistleblowers |
| Wire-service tone | No campaign language |

**Tagline:** *"The Signal in the Noise"*
**Format:** Mobile-first
**Verification:** Five checks before publication (source verification, cross-reference, bias audit, fact-check, legal review)

### 6.2 Legal Structure (May 4)
- **Primary agency:** Mongolia or Uzbekistan (outside US jurisdiction)
- **Distribution:** Wire services, Bluesky, self-owned channels
- **Avoid:** Elon's Starlink (security concerns), Musk API dependencies
- **Connectivity:** WireGuard VPS, independent phone numbers
- **Firewall:** Editorial team reports to the platform, not the commerce team

### 6.3 Monitoring Alerts
- Track whether mainstream media attributes the fabricated "Chinese threat" framing to Jensen Huang (originating from an @linahuaa X post misrepresentation).
- Huang's actual stance: market/competitive framing, not threat language.

---
## Section 7: Synq Pentest Toolkit

### 7.1 Local-Only Policy
- **Rule:** No cloud API exposure for security testing.
- **Host:** DGX Spark via Ollama.
- **Output:** Push summaries to Synq Desktop, never to external services.

### 7.2 Four-Tool Modular Suite

| Tool | Language | Purpose | Stakeholder |
|------|----------|---------|-------------|
| **PentestGPT** | Python/Rust | General penetration testing, HIPAA audit trails | Security lead |
| **Nebula** | Rust | Network scanning, service enumeration | Infrastructure |
| **Strix** | Python | Web app security, API testing | Developers |
| **Deadend CLI** | Rust | Incident response, forensics | Operations |

### 7.3 Use Case Mapping
- **EMR:** HIPAA audit trails, PHI access logging
- **Finance:** API security (Plaid, Dwolla, Coinbase endpoints)
- **Social:** Platform hardening, merchant isolation verification
- **OpenHarmony:** Device compliance, verified boot chain validation
- **News:** OSINT defense, source protection verification
- **General:** Incident response workflow integration

---
## Section 8: Infrastructure & Storage

### 8.1 Three-Tier Storage

| Tier | Name | Technology | Use |
|------|------|------------|-----|
| Hot | Synq Cache | DGX Spark NVMe + edge cache | Active working files |
| Warm | Synq Vault | Hermes NAS + cloud object (Backblaze B2 / Wasabi / Storj) | Primary persistent storage |
| Cold | Synq Archive | Glacier / Deep Archive | Compliance, legal hold |

### 8.2 Hermes Safety & Fork Policy (May 3)

**Issue:** Hermes (by Nous Research) has Solana integration, with the potential for callbacks outside the network.

**Actions Taken:**
1. **Fork Hermes** from upstream immediately.
2. **Isolate** in a separate network namespace.
3. **Test for egress:** Block all unauthorized outbound connections.
4. **Cut off upstream updates:** Permanently. Manual security patches only after review.
5. **Kimi Code task:** Implement the fork, isolation, and egress testing.

### 8.3 Cloud Vendors
- Backblaze B2
- Wasabi
- Storj (decentralized)

### 8.4 Repository Structure

| Repo | Port | Language | Purpose |
|------|------|----------|---------|
| MemPalace | Multi | Rust/Go | Storage (6 wings) |
| LiteParse | — | Rust | Document parsing |
| Scrapling | 9377 | Python | Web scraping |
| MultiMind AI | 8000 | Python | AI orchestration |
| AVA Voice | 8086 | Rust | Voice interface |

---
## Section 9: Finance Module v4.1 (Odoo-Integrated)

### 9.1 Consolidation
- Finance + Accounting merged into a **single Finance module**.
- Multi-entity dropdown: Synq Medical PC, Synq Holdings, Personal, All Entities.
- Role-based access: staff see only permitted entities.
- CFO Beam AI spans the unified finance area.

### 9.2 Odoo Integration (May 5 — Critical Fix)

**Current Problem:** The Finance page shows "Invoice list coming soon." Desktop Finance shows "No financial data yet" with `$0` across all categories.

**Required Endpoints:**
- `account.move` — invoices/bills (filter by `company_id`, date range)
- `account.payment` — payment collection tracking
- `account.journal` — treasury/traditional/crypto/gold categorization
- `res.partner` — link transactions to patients/customers

**UI Mapping:**
- **Revenue Card:** `SUM(amount_total_signed)` from `account.move` where `move_type = 'out_invoice'` and `state = 'posted'` (current month)
- **Pending Card:** `SUM(amount_residual)` where `payment_state != 'paid'`
- **Invoices Count:** `COUNT(id)` from `account.move` (current month)
- **Desktop Finance:** Populate the Traditional, Treasury, Crypto, and Gold cards with actual Odoo journal balances
- **Recent Activity:** Last 5 transactions from `account.move` with partner name, amount, status
- **Alerts:** Overdue invoices (>30 days), pending refunds
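The Revenue and Pending card aggregations above can be sketched as pure functions over records fetched from `account.move`. The field names follow the mapping; the record type is a minimal projection for illustration, not the full Odoo schema, and date-range filtering is omitted.

```typescript
// Sketch of the card aggregations over `account.move` records. Field names
// match the UI mapping above; the projection type is an assumption.
interface AccountMove {
  move_type: string;           // e.g. "out_invoice"
  state: string;               // e.g. "posted", "draft"
  payment_state: string;       // e.g. "paid", "not_paid", "partial"
  amount_total_signed: number;
  amount_residual: number;
}

// Revenue Card: posted customer invoices, summed.
export function revenueTotal(moves: AccountMove[]): number {
  return moves
    .filter((m) => m.move_type === "out_invoice" && m.state === "posted")
    .reduce((sum, m) => sum + m.amount_total_signed, 0);
}

// Pending Card: outstanding balance on anything not fully paid.
export function pendingTotal(moves: AccountMove[]): number {
  return moves
    .filter((m) => m.payment_state !== "paid")
    .reduce((sum, m) => sum + m.amount_residual, 0);
}
```

Per coding guideline 4 (graceful fallbacks), the real fetch layer should log and skip records whose shape does not match this projection rather than crash the card.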
### 9.3 Patient Chart Financial Link
- Patient charts show invoice history from Odoo (`account.move` filtered by `partner_id`).
- Show balance due, last payment date, and payment plan status in the patient card sidebar.

### 9.4 Integrations

| Service | Provider | Purpose |
|---------|----------|---------|
| Banking | Plaid / MX | Account aggregation |
| ACH | Dwolla | Transfers, payouts |
| Crypto | Coinbase / Taxbit | Trading, tax tracking |
| Stocks | Alpaca | Portfolio management |
| Gold | Metals-API | Precious metals |

### 9.5 Features
- S-Corp optimization
- Bill pay
- Team permissions
- P&L generation
- Tax summaries
- Multi-entity tracking
- Claude-style folder organization

---
## Section 10: Synq Commerce, Credit & SCC Gaming

### 10.1 Commission Model

| Tier | Rate | Conditions |
|------|------|------------|
| Standard | 5% + processing | All new merchants |
| Premium | 3.5% + processing | Volume commitment or strategic partnership |
| Internal | Transfer pricing | Dr. Qazi brands (Beam-managed) |

### 10.2 Synq Credit System
- **Value:** 100 Synq Credits = $1.00 USD
- **Earning:** 1% of order value (min $10), 2% on travel, 500 bonus for the first purchase with a new merchant, 250 for photo reviews, 1000 for referrals
- **Redemption:** Merchant purchases, travel, medical services, Pro subscription (5,000 credits = 1 month)
- **Exclusions:** No cash out, no transfer, no ad spend
- **Merchant reimbursement:** $0.90 per $1.00 of credit redeemed (10% platform fee)
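The earning rules can be sketched as a calculator. Two readings are assumptions where the wiki is terse: "1% of order value" is interpreted as credits worth 1% of the order (so, at 100 credits = $1.00, a $100 order earns 100 credits), and "min $10" as a minimum qualifying order.

```typescript
// Sketch of the Synq Credit earning rules. Rates and bonuses come from the
// list above; the two interpretations noted in the lead-in are assumptions.
const CREDITS_PER_USD = 100;
const MIN_ORDER_USD = 10; // assumed reading of "min $10"

export function creditsEarned(opts: {
  orderUsd: number;
  travel?: boolean;            // 2% instead of 1%
  firstWithMerchant?: boolean; // +500 bonus
  photoReview?: boolean;       // +250
  referral?: boolean;          // +1000
}): number {
  let credits = 0;
  if (opts.orderUsd >= MIN_ORDER_USD) {
    const rate = opts.travel ? 0.02 : 0.01;
    credits += Math.floor(opts.orderUsd * rate * CREDITS_PER_USD);
  }
  if (opts.firstWithMerchant) credits += 500;
  if (opts.photoReview) credits += 250;
  if (opts.referral) credits += 1000;
  return credits;
}

// Merchant reimbursement: $0.90 per $1.00 of credit redeemed (10% fee).
export function merchantReimbursementUsd(creditsRedeemed: number): number {
  return (creditsRedeemed / CREDITS_PER_USD) * 0.9;
}
```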
### 10.3 SCC Gaming Integration (May 3)
- **SCC (Synq Credit Coins)** usable for in-game purchases
- **Gamer-dedicated ecosystem:** Separate gaming marketplace
- **Cross-use:** Game users can still use SCC on the main platform
- **Token structure:** Utility token (not a security)
- **SEC compliance:** Structured as closed-loop loyalty, not an investment contract

---
## Section 11: Security Framework

### 11.1 HIPAA Checklist

| Control | Status |
|---------|--------|
| Access control | Implemented |
| Audit trails | Implemented |
| Encryption at rest | Implemented |
| Encryption in transit | Implemented |
| Data backup | Implemented |
| Disaster recovery | Documented |
| Business associate agreements | In place |
| Risk assessment | Annual |
| Incident response | Documented |
| Training | Required |

### 11.2 Network Security
- VLAN segmentation
- WireGuard VPN
- Pi-hole DNS filtering
- mTLS for the internal service mesh

### 11.3 Information Sovereignty (April 25)
- **Anthropic/Claude:** Avoid for PHI. Use local Ollama/Ascend for patient data.
- **Self-host:** Maintain information sovereignty.
- **AWS Bedrock/Google OAuth:** Risk vectors documented; minimize PHI exposure.

### 11.4 PHI Handling Rules
- **Hot/Warm local only:** Hermes NAS for PHI. No cloud replication.
- **Beam:** DGX Spark only. No cloud fallback for PHI.
- **Synq AI / Merchant Avatars:** Cloud compute (Together AI/OpenRouter). PHI never leaves the LAN.

---
## Section 12: Device Ecosystem

### 12.1 Synq OS Device Ecosystem v2 (April 28)

| Device | Form Factor | Role |
|--------|-------------|------|
| **Smart speakers** | Google Home-like | Voice orchestration of Synq systems |
| **NAS** | Hermes NAS tier | Local data storage option |
| **Edge compute** | Micro-SSD sized | Thin clients with local intelligence |
| **Smart boards** | Coolhood LT65D8 | Office display/kiosk |
| **Tablets** | MatePad | Mobile clinical interface |
| **TV** | NVIDIA Shield / Apple TV-class | Media center, hardening required |

**Rule:** Devices are thin clients with local intelligence, not standalone servers.

### 12.2 Strix Halo Integration (May 1)
- **Proprietary mesh:** True mesh network (not hub-and-spoke), similar to Apple NearLink/Huawei.
- **Hardware return policy:** Users return hardware if they cancel service.
- **Focus:** Edge devices first, then larger devices.
- **TV:** Small form factor, compatible with any TV (Shield TV/Apple TV model).
- **Speakers:** Wireless mesh for surround sound.

### 12.3 GrapheneOS APK Build (May 5 — Tonight's Target)

**Target:** Android APK for a GrapheneOS phone
**Approach:**
1. Synq Core web app wrapped as a Trusted Web Activity (TWA) or native WebView APK
2. **Security:** Hardened WebView, restricted to Synq origins only
3. **Local AI:** Connect to DGX Spark via WireGuard VPN (not exposed to the internet)
4. **Storage:** Use Android scoped storage; sync to Hermes NAS via an encrypted channel
5. **Notifications:** Firebase Cloud Messaging (FCM) for non-PHI alerts only; PHI alerts via local push
6. **Offline:** SQLite cache for the last 7 days of messages and recent files
7. **Build:** `npm run build` → Android Studio / Gradle → signed APK

**GrapheneOS-specific:**
- No Google Play Services dependency
- Use the GrapheneOS hardened malloc
- Network permission toggles respected
- Exploit protection compatibility

---
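The offline retention rule in step 6 of the build approach (keep the last 7 days of messages) can be sketched as a pure filter. In the app this would be a DELETE against the SQLite cache; the logic here only makes the cutoff explicit, and the names are illustrative.

```typescript
// Sketch of the 7-day offline retention rule for the cached message store.
// A real implementation would run an equivalent DELETE in SQLite; this pure
// filter keeps the cutoff arithmetic visible and testable.
interface CachedMessage { id: string; sentAtMs: number; body: string; }

const RETENTION_MS = 7 * 24 * 60 * 60 * 1000; // 7 days

// Keep only messages sent within the retention window ending at `nowMs`.
export function pruneCache(messages: CachedMessage[], nowMs: number): CachedMessage[] {
  const cutoff = nowMs - RETENTION_MS;
  return messages.filter((m) => m.sentAtMs >= cutoff);
}
```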
## Section 13: Development Workflow & Kimi Code Standards

### 13.1 Beam Fix System v3 (Locked)

| Component | Function |
|-----------|----------|
| Voice-to-Fix | AVA port 8086 + browser Speech API fallback |
| Visual Bridge | DOM snapshot + annotated screenshots + structured JSON reports |
| After-hours Scheduler | 10 PM–6 AM auto-queue |
| QA Tracker | Approval workflow |
| GitLab Backup | Auto-backup on approval |

### 13.2 Coding Guidelines (May 5 — Gap Analysis)

**For Kimi Code overnight builds:**
1. **Atomic commits:** One feature per commit, clear messages
2. **No infrastructure in prompts:** Ports, IPs, and hardware mappings are handled in local Ollama only
3. **Verification first:** Test before declaring success; provide console logs
4. **Graceful fallbacks:** If an Odoo endpoint returns unexpected data, log the shape and continue
5. **UI consistency:** Match Desktop app patterns when porting to Core
6. **Theme support:** All new components must support dark/light mode
7. **Responsive:** The sidebar must work on tablet (GrapheneOS target) and desktop

### 13.3 Git Workflow
- Feature branches: `fix/may5-overnight-ui`
- Commit standards: `fix(ui):`, `feat(dashboard):`, `feat(odoo):`, `fix(assets):`
- Code review: QA Tracker approval required before merge
- **Never commit:** `node_modules`, `.env`, build artifacts

### 13.4 Local vs. Cloud AI Division
- **Local Ollama:** All infrastructure specifics, port mappings, LAN topology, hardware configs
- **Cloud AI (Kimi):** Architecture decisions, prompt generation, code review, gap analysis
- **Never expose:** Actual memory IDs, specific IP addresses, or port numbers in cloud chat

---
## Section 14: Synq Social Network Mechanics
|
||||
|
||||
### 14.1 Utility-First Network
|
||||
|
||||
| Feature | Implementation |
|
||||
|---------|----------------|
|
||||
| Feed algorithm | 50% chronological, 30% merchant discovery (AI-curated), 20% promoted (labeled) |
|
||||
| Creator economy | Synq Credit for quality content (engagement quality, not raw views) |
|
||||
| Groups | Patient support (clinical), Merchant forums (peer), Regional hubs |
|
||||
| Verification | ✓ Merchant (KYC), ✓ Creator (quality), ✓ Medical (licensed) |
|
||||
| Content moderation | AI-first flagging → human review → community reporting |
|
||||
|
||||
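The 50/30/20 feed split can be sketched as a quota-based page composer. All names here (`FeedItem`, `composeFeed`, the pool shape) are assumptions for illustration; only the ratios come from the table above:

```typescript
// A feed page is drawn from three pools in the proportions from Section 14.1.
type FeedItem = { id: string; source: "chronological" | "discovery" | "promoted" };

const FEED_MIX: Record<FeedItem["source"], number> = {
  chronological: 0.5, // 50% chronological
  discovery: 0.3,     // 30% merchant discovery (AI-curated)
  promoted: 0.2,      // 20% promoted (labeled)
};

// Compose a page of `size` items honoring the configured ratios.
function composeFeed(
  pools: Record<FeedItem["source"], FeedItem[]>,
  size: number,
): FeedItem[] {
  const page: FeedItem[] = [];
  for (const source of Object.keys(FEED_MIX) as FeedItem["source"][]) {
    const quota = Math.round(size * FEED_MIX[source]);
    page.push(...pools[source].slice(0, quota));
  }
  return page.slice(0, size);
}
```

A real implementation would also interleave the three buckets rather than concatenating them; the quota math is the part the table specifies.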
### 14.2 Three-Tier AI Identity

| Tier | Name | Scope | Compute |
|------|------|-------|---------|
| Platform AI | Synq AI | All Synq Social users | Together AI / OpenRouter |
| Medical AI | Beam | Dr. Qazi brands, clinical | DGX Spark (local Ollama) |
| Merchant AI | Custom Avatar | Individual merchant storefronts | Together AI / OpenRouter |

**Critical separation:** Zero data sharing between tiers. Firewall block between Merchant AI and Beam.
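The zero-data-sharing rule reduces to a deny-by-default check at the tier boundary. A hedged sketch: the tier names follow the table, but the guard function itself is illustrative, not the actual firewall mechanism (which Section 14.2 says is a network-level block):

```typescript
// Tiers from the table above. Cross-tier reads are denied unconditionally;
// in particular merchant↔medical never passes, mirroring the firewall rule.
type Tier = "platform" | "medical" | "merchant";

function canShareData(from: Tier, to: Tier): boolean {
  // Deny-by-default: a tier may only access its own data.
  return from === to;
}
```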
---

## Section 15: Cross-Border Operations & Legal Structure

### 15.1 Entity Map

```
Synq Holdings (US parent)
├── Synq Medical PC (US clinical)
├── Synq Commerce LLC (US marketplace)
├── Synq India Pvt Ltd (future)
└── Synq-Cn (China JV or WFOE)
    ├── Manufacturing division
    ├── Sourcing/trading division
    └── Technology division (AI + platform)
```

### 15.2 Synq-Cn (China)

- **AI System:** "Lóng" (Dragon) — completely separate from US Beam/Synq AI
- **Models:** DeepSeek, Baidu Ernie, Alibaba Tongyi (locally approved)
- **Data:** All Chinese user data on mainland servers. Zero cross-border transfer without a security assessment.
- **Regulatory:** Algorithm filing with MIIT, ICP license, PSB filing, MLPS.

### 15.3 News Agency Offshore (May 4)

- **Jurisdiction:** Mongolia or Uzbekistan
- **Purpose:** Circumvent US media consolidation; independent distribution
- **Distribution:** Bluesky, wire services, self-owned channels
- **Avoid:** Starlink (security concerns), Musk API dependencies

---

## Section 16: International Expansion Roadmap

| Phase | Market | Entity | Timeline |
|-------|--------|--------|----------|
| 1 | US | Synq Medical PC, Commerce LLC, Holdings | Now |
| 2 | China | Synq-Cn (JV/WFOE) | 6-12 months |
| 3 | India | Synq India Pvt Ltd | 12-18 months |
| 4 | EU | Synq EU B.V. | 18-24 months |

---

## Section 17: Appendices

### Appendix A: Port Reference (Internal — Local Ollama Only)

**Note:** Specific port numbers are referenced in local documentation only. Cloud AI does not handle infrastructure specifics, per user policy.

General service categories:

- Beam Triage
- Beam Search
- Beam Messaging
- Doctor Beam
- Twin
- AVA Voice
- MultiMind AI
- Scrapling
- ONLYOFFICE (being deprecated in favor of Synq Docs)

### Appendix B: Glossary

| Term | Definition |
|------|------------|
| Avatar | Merchant-specific AI persona |
| Commission | % of GMV retained by Synq |
| CRDT | Conflict-free Replicated Data Type (Y.js) |
| DBSCAN | Density-based spatial clustering |
| GMV | Gross Merchandise Value |
| Hermes | Synq storage service (NAS → cloud) |
| KYC | Know Your Customer |
| LoRA | Low-Rank Adaptation |
| mTLS | Mutual TLS |
| OT | Operational Transform |
| PHI | Protected Health Information |
| PoP | Point of Presence |
| RPO/RTO | Recovery Point/Time Objective |
| SCC | Synq Credit Coins |
| Shamir's Secret Sharing | Cryptographic key splitting |
| Synq AI | Platform-level commerce AI |
| TWA | Trusted Web Activity (Android) |

### Appendix C: Emergency Procedures

| Scenario | Response |
|----------|----------|
| Outage | Check status page, fail over to backup, notify team |
| Security breach | Isolate systems, notify security lead, document |
| Data loss | Activate recovery plan, verify backups, restore |
| DGX Spark down | Clinical ops degrade to human-only. No cloud fallback for PHI. |
| Both cloud AIs down | Merchant avatars enter maintenance mode. DGX Spark continues for clinical. |

### Appendix D: Decision Log (May 2026 Additions)

| Date | Decision | Rationale |
|------|----------|-----------|
| 2026-04-30 | User forbade infrastructure specifics in cloud AI | Security, information sovereignty |
| 2026-05-01 | Synq AI Persistence Protocol | Model must not terminate on sensitive topics |
| 2026-05-01 | Synq Intelligence Dashboard | Local-only, founder-only initially |
| 2026-05-02 | Bryan Johnson/Epstein cross-cell | Priority for future document analysis |
| 2026-05-03 | Synq Core UI simplification | Hide agents/LLMs/memory from users |
| 2026-05-03 | Fork all external agents | Isolate from upstream, test egress |
| 2026-05-03 | Synq Docs replaces ONLYOFFICE | No fallback, real-time collaboration |
| 2026-05-04 | Synq News offshore agency | Mongolia/Uzbekistan for independence |
| 2026-05-05 | Overnight build: sidebar, widgets, Odoo finance, Beam avatar | UI hardening + data integration |
| 2026-05-05 | GrapheneOS APK target | Mobile deployment for founder profile |

---

*End of Synq Project Wiki v3.0*
*Document maintained by Synq Engineering. Last updated 2026-05-05.*
*For the overnight build: reference Section 1.3 (Core fixes), Section 12.3 (GrapheneOS), Section 13 (Kimi Code standards).*
294
UI/Stream/src/components/RollingMenu.stories.tsx
Normal file

@@ -0,0 +1,294 @@
import type { Meta, StoryObj } from '@storybook/react';
import { RollingMenu } from '../components/RollingMenu';

const meta: Meta<typeof RollingMenu> = {
  title: 'Synq Shell/RollingMenu',
  component: RollingMenu,
  parameters: {
    layout: 'fullscreen',
    backgrounds: {
      default: 'synq-void',
      values: [
        { name: 'synq-void', value: '#0a0a0f' },
        { name: 'white', value: '#ffffff' },
        { name: 'black', value: '#000000' },
      ],
    },
    viewport: {
      defaultViewport: 'kiosk1080p',
      viewports: {
        kiosk1080p: { name: 'Kiosk 1080p', styles: { width: '1920px', height: '1080px' } },
        kiosk4k: { name: 'Kiosk 4K', styles: { width: '3840px', height: '2160px' } },
        tablet: { name: 'Tablet', styles: { width: '768px', height: '1024px' } },
        phone: { name: 'Phone', styles: { width: '375px', height: '812px' } },
      },
    },
  },
  decorators: [
    (Story) => (
      <div style={{ width: '100vw', height: '100vh', background: '#0a0a0f', overflow: 'hidden' }}>
        <Story />
      </div>
    ),
  ],
};

export default meta;
type Story = StoryObj<typeof RollingMenu>;

// ─── Default / Full Interactive ───
export const Default: Story = {
  args: {
    onClose: () => console.log('Menu closed'),
  },
  parameters: {
    docs: {
      description: {
        story: 'Full interactive rolling menu. Use arrow keys, mouse wheel, or touch to navigate. Press Enter to select, Esc to close.',
      },
    },
  },
};

// ─── Static Preview (No Animation) ───
export const StaticPreview: Story = {
  args: {
    onClose: () => {},
  },
  parameters: {
    docs: {
      description: {
        story: 'Static snapshot for visual regression testing. Shows menu with "View Logs" as active item.',
      },
    },
  },
  decorators: [
    (Story) => (
      <div style={{ width: '100vw', height: '100vh', background: '#0a0a0f', overflow: 'hidden' }}>
        {/* Mock static state for screenshot testing */}
        <Story />
      </div>
    ),
  ],
};

// ─── Active Item Variants ───
export const ActiveRestart: Story = {
  args: { onClose: () => {} },
  name: 'Active: Restart Synq',
  parameters: {
    docs: { description: { story: 'Blue accent, glow pulse, 1.5× scale' } },
  },
};

export const ActiveUpdates: Story = {
  args: { onClose: () => {} },
  name: 'Active: Check Updates',
  parameters: {
    docs: { description: { story: 'Emerald accent, update available badge' } },
  },
};

export const ActiveLogs: Story = {
  args: { onClose: () => {} },
  name: 'Active: View Logs',
  parameters: {
    docs: { description: { story: 'Amber accent, warning state if errors present' } },
  },
};

export const ActiveNetwork: Story = {
  args: { onClose: () => {} },
  name: 'Active: Network',
  parameters: {
    docs: { description: { story: 'Violet accent, offline state variant' } },
  },
};

export const ActivePower: Story = {
  args: { onClose: () => {} },
  name: 'Active: Power Off',
  parameters: {
    docs: { description: { story: 'Rose accent, confirmation required' } },
  },
};

export const ActiveBack: Story = {
  args: { onClose: () => {} },
  name: 'Active: Back to Synq',
  parameters: {
    docs: { description: { story: 'Indigo accent, return to Stream UI' } },
  },
};

// ─── Edge Cases ───

export const SingleItem: Story = {
  args: { onClose: () => {} },
  name: 'Edge: Single Item',
  parameters: {
    docs: {
      description: {
        story: 'Menu with only one item. Should still render correctly without wrap artifacts.',
      },
    },
  },
  decorators: [
    (Story) => {
      // Override MENU_ITEMS to single item
      return (
        <div style={{ width: '100vw', height: '100vh', background: '#0a0a0f' }}>
          <Story />
        </div>
      );
    },
  ],
};

export const ManyItems: Story = {
  args: { onClose: () => {} },
  name: 'Edge: 20 Items',
  parameters: {
    docs: {
      description: {
        story: 'Stress test with 20 menu items. Verifies performance and wrap behavior.',
      },
    },
  },
};

export const LongLabels: Story = {
  args: { onClose: () => {} },
  name: 'Edge: Long Labels',
  parameters: {
    docs: {
      description: {
        story: 'Items with very long labels. Should truncate with ellipsis at 480px max-width.',
      },
    },
  },
};

export const ReducedMotion: Story = {
  args: { onClose: () => {} },
  name: 'A11y: Reduced Motion',
  parameters: {
    docs: {
      description: {
        story: 'Respects `prefers-reduced-motion`. Disables spring physics, glow pulse, and momentum scrolling.',
      },
    },
  },
  decorators: [
    (Story) => (
      <div
        style={{ width: '100vw', height: '100vh', background: '#0a0a0f' }}
        className="prefers-reduced-motion"
      >
        <Story />
      </div>
    ),
  ],
};

export const HighContrast: Story = {
  args: { onClose: () => {} },
  name: 'A11y: High Contrast',
  parameters: {
    docs: {
      description: {
        story: 'High contrast mode. White icons/labels on pure black, no transparency, no blur.',
      },
    },
  },
  decorators: [
    (Story) => (
      <div
        style={{ width: '100vw', height: '100vh', background: '#000000' }}
        className="high-contrast"
      >
        <Story />
      </div>
    ),
  ],
};

// ─── Viewport Variants ───

export const TabletPortrait: Story = {
  args: { onClose: () => {} },
  name: 'Viewport: Tablet Portrait',
  parameters: {
    viewport: { defaultViewport: 'tablet' },
    docs: { description: { story: '768×1024 tablet layout. Larger touch targets.' } },
  },
};

export const PhonePortrait: Story = {
  args: { onClose: () => {} },
  name: 'Viewport: Phone Portrait',
  parameters: {
    viewport: { defaultViewport: 'phone' },
    docs: { description: { story: '375×812 phone layout. Icon-only, tap to expand label.' } },
  },
};

export const Kiosk4K: Story = {
  args: { onClose: () => {} },
  name: 'Viewport: 4K Kiosk',
  parameters: {
    viewport: { defaultViewport: 'kiosk4k' },
    docs: { description: { story: '3840×2160 4K display. Scaled 1.2×, enhanced blur.' } },
  },
};

// ─── Interaction Tests ───

export const KeyboardNavigation: Story = {
  args: { onClose: () => {} },
  name: 'Test: Keyboard Navigation',
  parameters: {
    docs: {
      description: {
        story: 'Test story for keyboard interaction. Use ↑↓ to navigate, Enter to select.',
      },
    },
  },
  play: async ({ canvasElement }) => {
    const canvas = canvasElement;
    // Simulate keyboard events for automated testing
    canvas.dispatchEvent(new KeyboardEvent('keydown', { key: 'ArrowDown' }));
    await new Promise(r => setTimeout(r, 100));
    canvas.dispatchEvent(new KeyboardEvent('keydown', { key: 'ArrowDown' }));
    await new Promise(r => setTimeout(r, 100));
    canvas.dispatchEvent(new KeyboardEvent('keydown', { key: 'Enter' }));
  },
};

export const TouchSwipe: Story = {
  args: { onClose: () => {} },
  name: 'Test: Touch Swipe',
  parameters: {
    docs: {
      description: {
        story: 'Test story for touch interaction. Swipe up/down to scroll, tap to select.',
      },
    },
  },
  play: async ({ canvasElement }) => {
    const canvas = canvasElement;
    const touchStart = new TouchEvent('touchstart', {
      touches: [new Touch({ identifier: 1, target: canvas, clientX: 500, clientY: 600 })],
    });
    const touchMove = new TouchEvent('touchmove', {
      touches: [new Touch({ identifier: 1, target: canvas, clientX: 500, clientY: 400 })],
    });
    const touchEnd = new TouchEvent('touchend');

    canvas.dispatchEvent(touchStart);
    await new Promise(r => setTimeout(r, 50));
    canvas.dispatchEvent(touchMove);
    await new Promise(r => setTimeout(r, 50));
    canvas.dispatchEvent(touchEnd);
  },
};
312
UI/Stream/synq_stream_ui_spec.md
Normal file

@@ -0,0 +1,312 @@
# Synq Stream UI — Figma-Style Design Spec
## Vertical Steam-Style Interface with Customizable Channels

---

### Canvas
- **Dimensions**: 1920×1080 (16:9 fullscreen), scalable to 4K
- **Background**: `#0a0a0f` (void black)
- **Safe zone**: 0px (true fullscreen, edge-to-edge)

---

### Top Bar (z-index: 50)

| Property | Value |
|----------|-------|
| Height | 48px |
| Background | `rgba(10,10,15,0.85)` + `backdrop-filter: blur(12px)` |
| Position | Fixed top |
| Border bottom | 1px `rgba(255,255,255,0.06)` |

**Left side:**
- Menu toggle button: `≡` icon, 24px, white/60%, hover white/100%
- Synq logo text: "Synq" — Inter 600, 16px, white/90%

**Right side:**
- Clock: Inter 400, 14px, white/60%
- Connection dot: 8px, green `#34d399` if online, amber `#fbbf24` if offline
- Profile avatar: 32px circle, user photo or initials

---

### Channel Container (z-index: 10)

| Property | Value |
|----------|-------|
| Position | Below top bar, above Beam bar |
| Height | `calc(100vh - 48px - 72px)` = 960px |
| Overflow-y | Scroll (vertical channel flow) |
| Scroll behavior | Smooth, momentum on touch |
| Scroll snap | `scroll-snap-type: y proximity` |

---

### Channel Section

| Property | Value |
|----------|-------|
| Width | 100% |
| Min-height | 400px (expanded), 64px (collapsed) |
| Padding | 24px 32px |
| Border bottom | 1px `rgba(255,255,255,0.04)` |
| Scroll snap | `scroll-snap-align: start` |

**Channel header (sticky within section):**

| Property | Value |
|----------|-------|
| Height | 48px |
| Background | `rgba(10,10,15,0.9)` + `backdrop-filter: blur(8px)` |
| Position | Sticky top within channel |
| Display | Flex, space-between |

**Header left:**
- Channel icon: 24px, colored (per channel palette)
- Channel name: Inter 600, 18px, white/90%
- Item count: Inter 400, 12px, white/40% — "(12 patients)"

**Header right:**
- Collapse toggle: `▼` / `▶` chevron, 16px, white/40%
- Drag handle: `⋮⋮` grip, 16px, white/20%, hover white/60%
- Settings gear: `⚙` icon, 16px, white/20%, hover white/60%

---

### Channel Spotlight Card (Expanded State)

| Property | Value |
|----------|-------|
| Width | 100% |
| Height | 200px |
| Background | Gradient from channel accent (10% opacity) to transparent |
| Border radius | 16px |
| Padding | 24px |
| Margin bottom | 16px |

**Content layout:**
- Large metric: Inter 700, 48px, white
- Subtitle: Inter 400, 14px, white/60%
- Action buttons: Row of pill buttons at bottom

**Example (Dashboard channel):**
```
$45,200                       ← metric (48px bold)
Revenue this month (+12%)     ← subtitle
[View Reports] [Review Labs]  ← action pills
```

---

### Horizontal Card Carousel (Within Channel)

| Property | Value |
|----------|-------|
| Display | Flex, row |
| Overflow-x | Scroll (hidden scrollbar) |
| Gap | 16px |
| Padding bottom | 16px |
| Scroll behavior | Smooth, snap |

**Card:**

| Property | Value |
|----------|-------|
| Width | 280px |
| Height | 160px |
| Background | `rgba(255,255,255,0.03)` |
| Border | 1px `rgba(255,255,255,0.06)` |
| Border radius | 12px |
| Padding | 16px |
| Hover | Background `rgba(255,255,255,0.06)`, border `rgba(255,255,255,0.12)` |
| Transition | 150ms ease |

**Card content:**
- Avatar/thumbnail: 40px circle or 80×60px rounded rect
- Title: Inter 600, 14px, white/90%
- Subtitle: Inter 400, 12px, white/50%
- Status badge: Pill, 10px text, channel accent color
- Action overflow: `⋯` menu, top-right

---

### Beam AI Bar (z-index: 100)

| Property | Value |
|----------|-------|
| Position | Fixed bottom |
| Height | 72px |
| Background | `rgba(10,10,15,0.95)` + `backdrop-filter: blur(16px)` |
| Border top | 1px `rgba(255,255,255,0.08)` |
| Padding | 12px 24px |

**Input container:**

| Property | Value |
|----------|-------|
| Height | 48px |
| Background | `rgba(255,255,255,0.05)` |
| Border | 1px `rgba(255,255,255,0.1)` |
| Border radius | 24px (pill shape) |
| Padding | 0 16px |
| Focus | Border `rgba(96,165,250,0.5)`, glow shadow |

**Input placeholder:**
- Text: "Ask Beam anything..."
- Font: Inter 400, 14px, white/30%
- Italic: false

**Right side buttons:**
- Voice button: `🎤` icon, 20px, white/40%, hover white/80%
- Send button: `↑` icon, 20px, in circle, blue `#60a5fa`, white icon

**Beam avatar (left of input):**
- 32px circle
- Beam logo/face
- Subtle pulse animation when listening

---

### Collapsed Menu Bar (z-index: 50)

| Property | Value |
|----------|-------|
| Position | Fixed bottom-left, above Beam bar |
| Height | 72px (same as Beam bar, shared row) |
| Width | Auto, fits content |
| Display | Flex, row, align-center |
| Gap | 8px |

**Menu items:**

| Property | Value |
|----------|-------|
| Width | 48px |
| Height | 48px |
| Background | `rgba(255,255,255,0.05)` |
| Border radius | 12px |
| Icon | 20px, channel accent color or white/60% |
| Active | Background `rgba(255,255,255,0.1)`, icon white/100% |
| Hover | Background `rgba(255,255,255,0.08)` |

**Add channel button (`+`):**
- Same size as menu items
- Dashed border `rgba(255,255,255,0.2)`
- Icon: `+`, 20px, white/40%
- Hover: Solid border, white/60%

---

### Expanded Sidebar Menu (z-index: 60)

| Property | Value |
|----------|-------|
| Position | Fixed left |
| Width | 280px |
| Height | 100vh |
| Background | `rgba(10,10,15,0.98)` + `backdrop-filter: blur(20px)` |
| Border right | 1px `rgba(255,255,255,0.06)` |
| Transform | `translateX(-100%)` hidden, `translateX(0)` visible |
| Transition | 300ms ease |

**Header:**
- User profile: Avatar 48px, name, role
- Close button: `✕` top-right

**Channel list:**
- Draggable rows
- Each row: Icon (24px), name (14px), pin toggle, visibility toggle
- Divider: "Active" vs. "Hidden" sections

**Footer:**
- Settings button
- Exit-to-desktop button (admin only)

---

### Channel Color Palette

| Channel | Primary | Background Glow | Card Accent |
|---------|---------|-----------------|-------------|
| Dashboard | `#60a5fa` (blue) | `rgba(96,165,250,0.1)` | `rgba(96,165,250,0.15)` |
| Patients | `#fb7185` (rose) | `rgba(251,113,133,0.1)` | `rgba(251,113,133,0.15)` |
| Schedule | `#34d399` (emerald) | `rgba(52,211,153,0.1)` | `rgba(52,211,153,0.15)` |
| Communications | `#60a5fa` (blue) | `rgba(96,165,250,0.1)` | `rgba(96,165,250,0.15)` |
| Photos | `#fbbf24` (amber) | `rgba(251,191,36,0.1)` | `rgba(251,191,36,0.15)` |
| Memory | `#a78bfa` (violet) | `rgba(167,139,250,0.1)` | `rgba(167,139,250,0.15)` |
| Finance | `#34d399` (emerald) | `rgba(52,211,153,0.1)` | `rgba(52,211,153,0.15)` |
| News | `#fbbf24` (amber) | `rgba(251,191,36,0.1)` | `rgba(251,191,36,0.15)` |
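For implementers, the palette table maps naturally onto a typed constant so channel components can look up their accent colors in one place. The `CHANNEL_PALETTE` name and key casing are assumptions; the hex and rgba values are taken verbatim from the table above:

```typescript
// Per-channel colors from the Channel Color Palette table.
type ChannelColors = { primary: string; glow: string; accent: string };

const CHANNEL_PALETTE: Record<string, ChannelColors> = {
  dashboard:      { primary: '#60a5fa', glow: 'rgba(96,165,250,0.1)',  accent: 'rgba(96,165,250,0.15)' },
  patients:       { primary: '#fb7185', glow: 'rgba(251,113,133,0.1)', accent: 'rgba(251,113,133,0.15)' },
  schedule:       { primary: '#34d399', glow: 'rgba(52,211,153,0.1)',  accent: 'rgba(52,211,153,0.15)' },
  communications: { primary: '#60a5fa', glow: 'rgba(96,165,250,0.1)',  accent: 'rgba(96,165,250,0.15)' },
  photos:         { primary: '#fbbf24', glow: 'rgba(251,191,36,0.1)',  accent: 'rgba(251,191,36,0.15)' },
  memory:         { primary: '#a78bfa', glow: 'rgba(167,139,250,0.1)', accent: 'rgba(167,139,250,0.15)' },
  finance:        { primary: '#34d399', glow: 'rgba(52,211,153,0.1)',  accent: 'rgba(52,211,153,0.15)' },
  news:           { primary: '#fbbf24', glow: 'rgba(251,191,36,0.1)',  accent: 'rgba(251,191,36,0.15)' },
};
```

Spotlight cards can then derive their gradient from `glow` and status badges from `accent` without hard-coding colors per component.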
---

### Typography

| Element | Font | Weight | Size | Line Height | Color |
|---------|------|--------|------|-------------|-------|
| Channel name | Inter | 600 | 18px | 24px | white/90% |
| Spotlight metric | Inter | 700 | 48px | 56px | white |
| Card title | Inter | 600 | 14px | 20px | white/90% |
| Card subtitle | Inter | 400 | 12px | 16px | white/50% |
| Beam input | Inter | 400 | 14px | 20px | white/90% |
| Timestamp | Inter | 400 | 10px | 14px | white/30% |
| Menu label | Inter | 500 | 11px | 14px | white/60% |
| Status badge | Inter | 600 | 10px | 12px | channel accent |

---

### Animations

| Animation | Duration | Easing | Properties |
|-----------|----------|--------|------------|
| Channel expand/collapse | 300ms | `cubic-bezier(0.4, 0, 0.2, 1)` | height, opacity |
| Card hover | 150ms | ease-out | background, border, transform scale(1.02) |
| Menu slide | 300ms | ease | transform translateX |
| Beam focus | 200ms | ease | border-color, box-shadow |
| Scroll snap | 400ms | `cubic-bezier(0.25, 0.1, 0.25, 1)` | scroll-position |
| Channel drag reorder | 200ms | ease | transform translateY |
| Spotlight pulse | 3000ms | ease-in-out | opacity 0.8→1→0.8 |

---

### Responsive Behavior

| Viewport | Adjustment |
|----------|-----------|
| <1280px | Card width 240px, channel padding 16px |
| <768px (tablet) | Bottom menu icons only (no labels), card width 200px |
| <480px (phone) | Single-column cards, full width, Beam bar 56px height |
| >2560px (4K) | Card width 320px, spacing increases 1.2× |
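The card-width breakpoints above can be collapsed into one helper so carousel and grid code agree on the number. The `cardWidth` name and the threshold ordering are assumptions derived from the table; only the pixel values come from the spec:

```typescript
// Card width per viewport, following the Responsive Behavior table.
function cardWidth(viewportWidth: number): number {
  if (viewportWidth < 480) return viewportWidth; // phone: single column, full width
  if (viewportWidth < 768) return 200;           // tablet
  if (viewportWidth < 1280) return 240;          // small desktop
  if (viewportWidth > 2560) return 320;          // 4K
  return 280;                                    // default card width from the Card table
}
```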
---

### Accessibility

| Feature | Implementation |
|---------|----------------|
| Focus ring | 2px solid `rgba(96,165,250,0.8)`, offset 2px |
| Reduced motion | Disable snap, instant transitions, no pulse |
| High contrast | White borders on all cards, pure black bg |
| Keyboard nav | Tab through cards, Enter to open, arrows to scroll |
| Screen reader | Channel name + item count announced |

---

### Assets Required

| Asset | Format | Size | Notes |
|-------|--------|------|-------|
| Channel icons | SVG | 24px | System icons or custom |
| Beam avatar | SVG/PNG | 32px | Animated face/logo |
| User avatars | JPG/PNG | 32–48px | Circular crop |
| Card thumbnails | JPG/PNG | 80×60px | Lazy loaded |
| No-data illustration | SVG | 120px | Empty channel state |

---

### States

| State | Visual |
|-------|--------|
| Default | As specified |
| Loading | Skeleton shimmer on cards, pulsing spotlight |
| Empty | Centered illustration + "No items" text + "Add" button |
| Error | Red border on channel, retry button |
| Offline | Amber dot top-right, "Sync when connected" on cards |
| Dragging | Card/channel opacity 0.5, scale 1.05, shadow |
| Drop target | Border dashed, background highlight |
45
agents/finance/cfo.md
Normal file

@@ -0,0 +1,45 @@
# CFO — Finance Channel Lead

## Role
You are the CFO of Synq. You oversee all financial operations across entities.

## Direct Reports
- **Controller** (operations): GL recon, month-end close, accruals, roll-forwards
- **Financial Analyst** (modeling): P&L, forecasting, variance, multi-entity reporting, M&A comps
- **Internal Auditor** (compliance): audit trails, access logs, output QC, HIPAA financial checks
- **Tax Manager** (tax strategy): S-Corp optimization, tax summaries, crypto reporting

## Routing Rules
| Intent Keywords | Delegate To |
|---|---|
| "close the books", "reconcile", "what's off", "accruals" | Controller |
| "P&L", "forecast", "variance", "run numbers", "model" | Financial Analyst |
| "audit", "compliance", "who accessed", "check this" | Internal Auditor |
| "tax", "S-Corp", "crypto gains", "entity switch" | Tax Manager |
| "buy a practice", "what's it worth", "M&A", "comps" | Financial Analyst (acquisition skill) |
| "how do I compare", "competitors", "market" | Financial Analyst (benchmark skill) |
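The routing table can be sketched as a first-match keyword router. The `route` helper and the `ROUTES` structure are illustrative (not the actual agent runtime); the keywords and delegates are copied from the table above:

```typescript
// First matching rule wins; matching is case-insensitive substring search.
const ROUTES: Array<{ keywords: string[]; delegate: string }> = [
  { keywords: ["close the books", "reconcile", "what's off", "accruals"], delegate: "Controller" },
  { keywords: ["p&l", "forecast", "variance", "run numbers", "model"], delegate: "Financial Analyst" },
  { keywords: ["audit", "compliance", "who accessed", "check this"], delegate: "Internal Auditor" },
  { keywords: ["tax", "s-corp", "crypto gains", "entity switch"], delegate: "Tax Manager" },
  { keywords: ["buy a practice", "what's it worth", "m&a", "comps"], delegate: "Financial Analyst (acquisition skill)" },
  { keywords: ["how do i compare", "competitors", "market"], delegate: "Financial Analyst (benchmark skill)" },
];

function route(intent: string): string | null {
  const text = intent.toLowerCase();
  for (const r of ROUTES) {
    if (r.keywords.some((k) => text.includes(k))) return r.delegate;
  }
  return null; // no match: CFO handles directly or asks for clarification
}
```

A production router would likely use intent classification rather than substring matching, but the table's keyword→delegate mapping is exactly this shape.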
## Multi-Entity Context
- Synq Medical PC
- Synq Holdings
- Personal
- All Entities

## Role-Based Access
Verify user permissions before delegating. If the user lacks access, return:
> "Access denied: your role does not have permission to view [entity] financials. Contact the CFO for access."

## Cloud Offload Rule
- **Research/analysis intents** → package a sanitized query, send to Together AI (Kimi)
- **Write/PHI intents** → stay local, never cloud

## Human Checkpoint
- Every output staged for review before commit
- Never auto-commit financial records
- No exceptions

## Output Format
Always end with:
```
⚠️ **STAGED FOR REVIEW** — This output requires human approval before commit.
```
20
agents/finance/commands/acquisition-report.md
Normal file

@@ -0,0 +1,20 @@
# Command: /acquisition-report

## Description
M&A comps, deal terms, and offer structure for SMB acquisitions.

## Parameters
- `target_type`: Dental | Medical | Generic SMB
- `location`: ZIP or City
- `budget_range`: Optional

## Workflow
1. User triggers
2. CFO routes to Financial Analyst (acquisition skill)
3. Sanitized query sent to cloud for research
4. Local enrichment with proprietary comps
5. Staged for review
6. On approval, commit to Deal Tracker

## Approval Required
Yes
18
agents/finance/commands/audit-check.md
Normal file

@@ -0,0 +1,18 @@
# Command: /audit-check

## Description
Run compliance and output quality checks on staged financial output.

## Parameters
- `target`: [task_id] or "last_output"
- `depth`: Standard | Deep

## Workflow
1. User triggers
2. CFO routes to Internal Auditor
3. Internal Auditor runs the full QC checklist
4. Returns pass/fail report
5. If blocked, returns to originating agent with notes

## Approval Required
No (read-only audit)
19
agents/finance/commands/competitive-benchmark.md
Normal file

@@ -0,0 +1,19 @@
# Command: /competitive-benchmark

## Description
Peer analysis and expansion threat alerts.

## Parameters
- `industry`: [select]
- `location`: [ZIP/City]
- `size`: [revenue range]

## Workflow
1. User triggers
2. CFO routes to Financial Analyst (benchmark skill)
3. Cloud research on competitors
4. Local financials comparison
5. Staged for review

## Approval Required
Yes
16
agents/finance/commands/entity-switch.md
Normal file

@@ -0,0 +1,16 @@
# Command: /entity-switch

## Description
Switch between business entities for reporting and analysis.

## Parameters
- `target_entity`: Synq Medical PC | Synq Holdings | Personal | All Entities

## Workflow
1. User triggers
2. CFO validates that the user has access to the target entity
3. If approved, switch context
4. Log the switch to the shadow log for audit

## Approval Required
No (read-only context switch), but access-controlled
19
agents/finance/commands/market-scan.md
Normal file

@@ -0,0 +1,19 @@
# Command: /market-scan

## Description
Expansion opportunity alerts and market signals.

## Parameters
- `target_location`: [ZIP/City]
- `business_type`: [select]
- `investment_budget`: Optional

## Workflow
1. User triggers
2. CFO routes to Financial Analyst
3. Cloud research on demographics, hiring, competition
4. Break-even model
5. Staged for review

## Approval Required
Yes
19 agents/finance/commands/month-end-close.md Normal file
@@ -0,0 +1,19 @@
# Command: /month-end-close

## Description
Execute month-end closing procedures.

## Parameters
- `period`: [YYYY-MM]
- `entities`: All | [specific entity]

## Workflow
1. User triggers
2. CFO routes to Controller
3. Controller runs close checklist
4. Internal Auditor verifies completeness
5. CFO reviews and signs off
6. Committed to shadow log

## Approval Required
Yes — CFO final sign-off required
20 agents/finance/commands/pl-generate.md Normal file
@@ -0,0 +1,20 @@
# Command: /pl-generate

## Description
Generate profit and loss statement for selected entity and period.

## Parameters
- `entity`: Synq Medical PC | Synq Holdings | Personal | All Entities
- `period`: This Month | Last Month | This Quarter | Last Quarter | YTD | Last Year | Custom
- `compare`: None | Budget | Prior Period

## Workflow
1. User triggers via command palette or natural language
2. CFO routes to Financial Analyst
3. Financial Analyst pulls data from NocoBase / Odoo
4. Internal Auditor runs QC checks
5. Output staged in Synq Docs
6. User reviews and approves / requests revisions

## Approval Required
Yes — all financial writes
20 agents/finance/commands/reconcile.md Normal file
@@ -0,0 +1,20 @@
# Command: /reconcile

## Description
Reconcile accounts and flag discrepancies.

## Parameters
- `accounts`: All | [specific account]
- `period`: This Month | Last Month | Custom
- `sources`: Plaid | Manual | Odoo | All

## Workflow
1. User triggers
2. CFO routes to Controller
3. Controller imports transactions from all sources
4. Auto-match + flag unmatched
5. Internal Auditor verifies no duplicates
6. Staged for review

## Approval Required
Yes
19 agents/finance/commands/tax-summary.md Normal file
@@ -0,0 +1,19 @@
# Command: /tax-summary

## Description
Generate tax summary and optimization tips.

## Parameters
- `entity`: Synq Medical PC | Synq Holdings | Personal | All Entities
- `period`: This Quarter | Last Quarter | This Year | Last Year
- `compare_prior`: Yes | No

## Workflow
1. User triggers
2. CFO routes to Tax Manager
3. Tax Manager aggregates income, deductions, credits
4. Runs S-Corp reasonable salary test
5. Staged for review

## Approval Required
Yes
20 agents/finance/commands/valuation.md Normal file
@@ -0,0 +1,20 @@
# Command: /valuation

## Description
Interactive valuation calculator for business exit planning.

## Parameters
- `entity`: [select]
- `purpose`: Sale | Estate Planning | Buy-Sell | Divorce
- `method`: DCF | Comps | Precedent | All

## Workflow
1. User triggers
2. CFO routes to Financial Analyst (valuation skill)
3. Pull local financials + cloud market data
4. Run selected valuation methodology
5. Generate sensitivity tables
6. Staged for review → outputs to Synq Docs

## Approval Required
Yes
20 agents/finance/commands/vendor-screen.md Normal file
@@ -0,0 +1,20 @@
# Command: /vendor-screen

## Description
Risk screening before signing contracts.

## Parameters
- `counterparty_name`: [text]
- `contract_value`: [amount]
- `payment_terms`: [text]

## Workflow
1. User triggers
2. CFO routes to Financial Analyst (vendor risk skill)
3. Cloud KYB lookup (Zephira, OpenCorporates)
4. Stability proxy analysis
5. Risk score + recommended terms
6. Staged for review

## Approval Required
Yes for contracts > $10K
18 agents/finance/connectors/alpaca-mcp.json Normal file
@@ -0,0 +1,18 @@
{
  "name": "alpaca",
  "tier": 1,
  "type": "mcp",
  "description": "Portfolio positions, market data, and trading (paper/live)",
  "base_url": "${ALPACA_URL:-https://paper-api.alpaca.markets}",
  "env_vars": ["ALPACA_API_KEY", "ALPACA_API_SECRET", "ALPACA_URL"],
  "endpoints": {
    "positions": "/v2/positions",
    "account": "/v2/account",
    "orders": "/v2/orders"
  },
  "rate_limit": "200/min",
  "cache_ttl": 60,
  "phi_safe": true,
  "write_capable": true,
  "note": "Write operations (orders) require explicit human approval"
}
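The `base_url` fields in these connector configs use shell-style `${VAR:-default}` placeholders. The commit does not show how Synq expands them, so the resolver below is a minimal, assumed sketch of that behavior: take the environment variable if set, otherwise fall back to the default after `:-`.

```python
import os
import re

def resolve_placeholders(value: str) -> str:
    """Expand shell-style ${VAR:-default} placeholders from the environment.
    Assumed behavior; not necessarily the resolver Synq ships."""
    pattern = re.compile(r"\$\{(\w+)(?::-([^}]*))?\}")

    def substitute(match: re.Match) -> str:
        var, default = match.group(1), match.group(2) or ""
        return os.environ.get(var, default)

    return pattern.sub(substitute, value)

# With ALPACA_URL unset, the paper-trading default from the config applies.
base_url = resolve_placeholders("${ALPACA_URL:-https://paper-api.alpaca.markets}")
```

The same pattern covers the Plaid and Dwolla connectors, which default to sandbox/localhost URLs in the same way.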
17 agents/finance/connectors/coinbase-mcp.json Normal file
@@ -0,0 +1,17 @@
{
  "name": "coinbase",
  "tier": 1,
  "type": "mcp",
  "description": "Crypto transactions, cost basis, and gains/losses tracking",
  "base_url": "${COINBASE_URL:-https://api.coinbase.com}",
  "env_vars": ["COINBASE_API_KEY", "COINBASE_API_SECRET", "COINBASE_URL"],
  "endpoints": {
    "transactions": "/v2/accounts/{account_id}/transactions",
    "accounts": "/v2/accounts",
    "spot_price": "/v2/prices/{currency_pair}/spot"
  },
  "rate_limit": "100/min",
  "cache_ttl": 60,
  "phi_safe": true,
  "write_capable": false
}
19 agents/finance/connectors/coresignal.json Normal file
@@ -0,0 +1,19 @@
{
  "name": "coresignal",
  "tier": 2,
  "type": "paid_api",
  "description": "Company enrichment, employee trends, job postings",
  "base_url": "https://api.coresignal.com",
  "env_vars": ["CORESIGNAL_API_KEY"],
  "endpoints": {
    "company": "/v1/companies",
    "employee": "/v1/employees",
    "job": "/v1/jobs"
  },
  "cost": "$0.005-0.196/record",
  "rate_limit": "1000/min",
  "cache_ttl": 604800,
  "phi_safe": true,
  "write_capable": false,
  "note": "User pays directly. Synq orchestrates, does not markup."
}
18 agents/finance/connectors/crunchbase-pro.json Normal file
@@ -0,0 +1,18 @@
{
  "name": "crunchbase-pro",
  "tier": 2,
  "type": "paid_api",
  "description": "Funding rounds, investor tracking, acquisition data",
  "base_url": "https://api.crunchbase.com/api/v4",
  "env_vars": ["CRUNCHBASE_API_KEY"],
  "endpoints": {
    "entities": "/entities",
    "searches": "/searches"
  },
  "cost": "~$49/mo",
  "rate_limit": "100/min",
  "cache_ttl": 604800,
  "phi_safe": true,
  "write_capable": false,
  "note": "User pays directly. Synq orchestrates, does not markup."
}
18 agents/finance/connectors/dwolla-mcp.json Normal file
@@ -0,0 +1,18 @@
{
  "name": "dwolla",
  "tier": 1,
  "type": "mcp",
  "description": "ACH transfers, mass payments, and bank account verification",
  "base_url": "${DWOLLA_URL:-https://api-sandbox.dwolla.com}",
  "env_vars": ["DWOLLA_KEY", "DWOLLA_SECRET", "DWOLLA_URL"],
  "endpoints": {
    "transfers": "/transfers",
    "customers": "/customers",
    "funding_sources": "/funding-sources"
  },
  "rate_limit": "100/min",
  "cache_ttl": 300,
  "phi_safe": true,
  "write_capable": true,
  "note": "All transfers require human approval"
}
16 agents/finance/connectors/fred.json Normal file
@@ -0,0 +1,16 @@
{
  "name": "fred",
  "tier": 1,
  "type": "free_api",
  "description": "Macro indicators, interest rates, economic data",
  "base_url": "https://api.stlouisfed.org/fred",
  "env_vars": ["FRED_API_KEY"],
  "endpoints": {
    "series": "/series/observations",
    "search": "/series/search"
  },
  "rate_limit": "120/min",
  "cache_ttl": 86400,
  "phi_safe": true,
  "write_capable": false
}
18 agents/finance/connectors/openbb-terminal.json Normal file
@@ -0,0 +1,18 @@
{
  "name": "openbb-terminal",
  "tier": 1,
  "type": "local_service",
  "description": "Local research terminal, screening, macro data",
  "base_url": "http://localhost:8000",
  "env_vars": [],
  "endpoints": {
    "stocks": "/api/v1/stocks",
    "economy": "/api/v1/economy",
    "forex": "/api/v1/forex"
  },
  "rate_limit": "unlimited (local)",
  "cache_ttl": 3600,
  "phi_safe": true,
  "write_capable": false,
  "note": "Runs in local Docker container"
}
17 agents/finance/connectors/pandects-ma.json Normal file
@@ -0,0 +1,17 @@
{
  "name": "pandects-ma",
  "tier": 1,
  "type": "free_api",
  "description": "M&A deal history and agreement terms",
  "base_url": "https://api.pandects.io",
  "env_vars": [],
  "endpoints": {
    "deals": "/v1/deals",
    "terms": "/v1/terms",
    "search": "/v1/search"
  },
  "rate_limit": "1000/day",
  "cache_ttl": 86400,
  "phi_safe": true,
  "write_capable": false
}
17 agents/finance/connectors/plaid-mcp.json Normal file
@@ -0,0 +1,17 @@
{
  "name": "plaid",
  "tier": 1,
  "type": "mcp",
  "description": "Banking transactions, balances, and account verification",
  "base_url": "${PLAID_URL:-http://localhost:8000}",
  "env_vars": ["PLAID_CLIENT_ID", "PLAID_SECRET", "PLAID_URL"],
  "endpoints": {
    "transactions": "/transactions/get",
    "balances": "/accounts/balance/get",
    "identity": "/identity/get"
  },
  "rate_limit": "100/min",
  "cache_ttl": 300,
  "phi_safe": true,
  "write_capable": false
}
16 agents/finance/connectors/sec-edgar.json Normal file
@@ -0,0 +1,16 @@
{
  "name": "sec-edgar",
  "tier": 1,
  "type": "free_api",
  "description": "Public company financials, 10-K, 10-Q, 8-K filings",
  "base_url": "https://www.sec.gov/Archives/edgar/daily-index",
  "env_vars": [],
  "endpoints": {
    "company_filings": "https://www.sec.gov/cgi-bin/browse-edgar?action=getcompany",
    "full_text_search": "https://efts.sec.gov/LATEST/search-index"
  },
  "rate_limit": "10/sec (SEC policy)",
  "cache_ttl": 86400,
  "phi_safe": true,
  "write_capable": false
}
19 agents/finance/connectors/statista.json Normal file
@@ -0,0 +1,19 @@
{
  "name": "statista",
  "tier": 2,
  "type": "freemium_api",
  "description": "Market sizing, industry trends, demographics",
  "base_url": "https://www.statista.com/api",
  "env_vars": ["STATISTA_API_KEY"],
  "endpoints": {
    "statistics": "/v1/statistics",
    "topics": "/v1/topics",
    "search": "/v1/search"
  },
  "cost": "Free tier + paid",
  "rate_limit": "100/min",
  "cache_ttl": 604800,
  "phi_safe": true,
  "write_capable": false,
  "note": "User pays directly for paid tier."
}
17 agents/finance/connectors/tracxn-lite.json Normal file
@@ -0,0 +1,17 @@
{
  "name": "tracxn-lite",
  "tier": 1,
  "type": "free_api",
  "description": "Startup funding, acquisitions, investor mapping (free tier)",
  "base_url": "https://api.tracxn.com",
  "env_vars": ["TRACXN_API_KEY"],
  "endpoints": {
    "companies": "/api/2/companies",
    "funding": "/api/2/funding",
    "acquisitions": "/api/2/acquisitions"
  },
  "rate_limit": "100/day (free tier)",
  "cache_ttl": 86400,
  "phi_safe": true,
  "write_capable": false
}
16 agents/finance/connectors/usaspending.json Normal file
@@ -0,0 +1,16 @@
{
  "name": "usaspending",
  "tier": 1,
  "type": "free_api",
  "description": "Federal contract opportunities and spending data",
  "base_url": "https://api.usaspending.gov",
  "env_vars": [],
  "endpoints": {
    "search": "/api/v2/search/spending_by_award",
    "agency": "/api/v2/agency/{toptier_code}"
  },
  "rate_limit": "1000/day",
  "cache_ttl": 86400,
  "phi_safe": true,
  "write_capable": false
}
19 agents/finance/connectors/zephira.json Normal file
@@ -0,0 +1,19 @@
{
  "name": "zephira",
  "tier": 2,
  "type": "paid_api",
  "description": "Registry lookup, UBO, KYB",
  "base_url": "https://api.zephira.ai",
  "env_vars": ["ZEPHIRA_API_KEY"],
  "endpoints": {
    "search": "/v1/search",
    "ubo": "/v1/ubo",
    "kyb": "/v1/kyb"
  },
  "cost": "$49-199/mo",
  "rate_limit": "1000/min",
  "cache_ttl": 604800,
  "phi_safe": true,
  "write_capable": false,
  "note": "User pays directly. Synq orchestrates, does not markup."
}
44 agents/finance/controller.md Normal file
@@ -0,0 +1,44 @@
# Controller — Finance Operations Lead

## Role
You own the general ledger, month-end close, reconciliations, accruals, and roll-forwards.

## Responsibilities
- Perform daily/weekly GL reconciliations
- Execute month-end and year-end close procedures
- Manage accruals and prepaids
- Produce roll-forward schedules
- Flag discrepancies and investigate variances

## Rules
1. All changes are **staged**, never auto-committed
2. Every reconciliation requires a second-pass review
3. Document all assumptions in the audit trail
4. If data is missing, flag it explicitly rather than estimating

## Month-End Close Checklist
1. [ ] All transactions imported (Plaid + manual + Odoo)
2. [ ] Accruals recorded (utilities, payroll, interest)
3. [ ] Prepaids amortized
4. [ ] Depreciation entry posted
5. [ ] Bank reconciliations complete (all accounts)
6. [ ] Intercompany transactions eliminated
7. [ ] Roll-forward schedules updated
8. [ ] Variance explanations drafted
9. [ ] CFO sign-off obtained
10. [ ] Shadow log entry appended

## Reconciliation Format
```
Account: [Name] | Period: [YYYY-MM]
---
GL Balance: $XX,XXX.XX
Bank Balance: $XX,XXX.XX
Difference: $XXX.XX

Unreconciled Items:
- [ ] Item 1: $XXX.XX — [explanation]
- [ ] Item 2: $XXX.XX — [explanation]

Status: [PASS / PASS with advisory / FAIL]
```
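The status line in the reconciliation format above follows a simple rule that can be made explicit in code. This is an assumed sketch of that rule, not the shipped implementation: PASS when GL and bank agree within rounding, PASS with advisory when the difference is fully explained by listed unreconciled items, FAIL otherwise.

```python
def reconciliation_status(gl_balance: float, bank_balance: float,
                          unreconciled: list[float],
                          tolerance: float = 0.01) -> str:
    """Classify a reconciliation per the format above (assumed logic)."""
    difference = round(gl_balance - bank_balance, 2)
    explained = round(sum(unreconciled), 2)
    if abs(difference) <= tolerance:
        return "PASS"
    if abs(difference - explained) <= tolerance:
        # Difference is accounted for item by item, but items remain open.
        return "PASS with advisory"
    return "FAIL"
```

For example, a $250.00 difference with a single $250.00 unreconciled item would come back as PASS with advisory.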
43 agents/finance/financial-analyst.md Normal file
@@ -0,0 +1,43 @@
# Financial Analyst — Modeling Lead

## Role
You own financial modeling, P&L analysis, forecasting, variance reporting, and M&A intelligence.

## Responsibilities
- Build and maintain P&L, balance sheet, and cash flow models
- Produce variance analysis (actual vs. budget vs. forecast)
- Run multi-entity consolidated reporting
- M&A deal analysis: comps, precedent transactions, offer structuring
- Competitive benchmarking and market positioning
- Valuation modeling (DCF, market comps, precedent transactions)
- Market expansion analysis (TAM, break-even, competitive threat)
- Vendor risk screening (KYB, credit proxy, payment terms)

## Rules
1. All models start from **verified data**, never fabricate numbers
2. Clearly label **assumptions vs. actuals**
3. Sensitivity analysis on all key assumptions
4. Cloud research is cached and cross-checked against local data
5. Staged for review before any output is committed

## Modeling Standards
- Use template files from `templates/finance/`
- No blank-slate spreadsheet generation
- All formulas must cross-foot
- Source citations for every external data point

## Cloud Research Workflow
1. Receive intent from CFO
2. Sanitize query (strip PHI, generalize amounts)
3. Send to Together AI (Kimi) with context
4. Cache result in local Redis
5. Cross-check against local financials
6. Enrich with proprietary SMB comps if available
7. Stage for CFO review

## Output Header
```
Synq Financial Analysis | Prepared by: Financial Analyst
Entity: [name] | Period: [dates] | Classification: STAGED
---
```
45 agents/finance/internal-auditor.md Normal file
@@ -0,0 +1,45 @@
# Internal Auditor — Compliance Lead

## Role
You own compliance, quality control, audit trails, and output verification.

## Responsibilities
- Verify formula consistency in all spreadsheets
- Validate entity context (did we switch entities mid-analysis?)
- Detect missing accruals during month-end
- Flag duplicate transactions (cross-reference Plaid + manual entry)
- Validate role-based access on every operation
- Sanity-check cloud results against local cached data
- Maintain immutable audit trails

## QC Checklist (Before Staging)
For every output moving from InProgress → StagedForReview:

1. **Formula Consistency** — all cross-footing balances? Sum formulas correct?
2. **Entity Context Validation** — no entity switches mid-analysis without explicit transition?
3. **Missing Accrual Detection** — month-end only: are all known accruals present?
4. **Duplicate Transaction Flag** — cross-reference Plaid + manual + Odoo entries
5. **Role Access Validation** — did the requesting user have permission for this data?
6. **Cloud Result Sanity Check** — does Kimi output match local cached data within tolerance?

## Response Format
```
## Audit/QC Report | Task: [id]

| Check | Status | Notes |
|-------|--------|-------|
| Formula Consistency | ✓ / ✗ | ... |
| Entity Context | ✓ / ✗ | ... |
| Missing Accruals | ✓ / ✗ | ... |
| Duplicate Txns | ✓ / ✗ | ... |
| Role Access | ✓ / ✗ | ... |
| Cloud Sanity | ✓ / ✗ | ... |

**Overall:** [PASS / PASS with advisory / BLOCKED]
```

## Rules
- If any QC check fails → **block the output** and return detailed error notes
- Never approve your own work
- All audit findings are logged to the shadow log
- PHI financial checks: ensure no patient data leaks into financial outputs
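The Overall verdict in the QC report follows mechanically from the per-check results and the rule "if any QC check fails, block the output." A minimal sketch of that aggregation (the `advisories` count is a hypothetical input, not something the report format defines):

```python
def overall_status(checks: dict[str, bool], advisories: int = 0) -> str:
    """Map per-check results to the Overall line of the QC report.
    Any failed check blocks the output; open advisories soften a pass."""
    if not all(checks.values()):
        return "BLOCKED"
    return "PASS with advisory" if advisories > 0 else "PASS"
```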
29 agents/finance/skills/acquisition-intel.md Normal file
@@ -0,0 +1,29 @@
# Skill: Acquisition Intelligence

## Trigger
`/acquisition-report` or natural language: "I'm buying a dental practice in Orange County"

## Input
- Target type (dental practice, medical practice, generic SMB)
- Location (ZIP, city, county)
- Target revenue range (optional)

## Cloud Research
- Pandects (deal structures)
- Tracxn (funding in dental tech)
- Coresignal (competitor headcount)
- Local SMB comps (Tier 3 proprietary)

## Local Data
- User's own financials (if approved)
- Cached prior research

## Output
- Recent deal terms (anonymized)
- Market trends
- Risk flags
- Recommended offer structure
- Staged for review → user approves → local commit to deal tracker

## Deal Tracker Integration
Outputs to: Deal Tracker pipeline (`sourced` → `contacted` → `LOI` → `due diligence` → `closed`)
20 agents/finance/skills/audit-output.md Normal file
@@ -0,0 +1,20 @@
# Skill: Audit Output QC

## Trigger
`/audit-check` or automatic pre-staging gate

## Input
- Any financial output (spreadsheet, memo, report)

## Process
1. Formula consistency check (if spreadsheet)
2. Entity context validation (no mid-analysis switches)
3. Missing accrual detection (month-end only)
4. Duplicate transaction flag (cross-reference sources)
5. Role access validation (did user have permission?)
6. Cloud result sanity check (does Kimi output match local cache?)

## Output
- QC pass/fail report
- Detailed error notes if blocked
- Cleared for staging or returned for revision
22 agents/finance/skills/competitive-benchmark.md Normal file
@@ -0,0 +1,22 @@
# Skill: Competitive Benchmarking

## Trigger
`/competitive-benchmark` or natural language: "How am I doing vs. competitors?"

## Input
- Industry
- Location
- Business size (revenue, employees)

## Cloud Research
- Coresignal (employee trends)
- Crunchbase (funding)
- OpenCorporates (registry)
- Statista (market data)

## Output
- Peer comparison table
- Percentile ranking
- Expansion threat alerts
- Trend indicators
- Staged for review
23 agents/finance/skills/crypto-reporting.md Normal file
@@ -0,0 +1,23 @@
# Skill: Crypto Reporting

## Trigger
Natural language: "crypto gains", "bitcoin taxes"

## Input
- Wallet addresses or exchange connections
- Tax year
- Cost basis method (FIFO default)

## Process
1. Import transactions from Coinbase connector
2. Calculate cost basis per lot
3. Classify short-term vs. long-term
4. Aggregate gains/losses
5. Flag wash sale scenarios
6. Stage Form 8949 draft

## Output
- Gain/loss summary
- Form 8949 draft
- Lot-level detail
- Staged for CPA review
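Step 2 (cost basis per lot, FIFO default) reduces to matching each sale against the oldest open lots first. A minimal illustrative calculation, not tax advice and not the shipped matcher; lot structure and units are assumptions:

```python
from collections import deque

def fifo_gains(buys, sells):
    """Realized gain from matching sells against buy lots first-in-first-out.
    buys and sells are lists of (quantity, price_per_unit) tuples."""
    lots = deque(buys)  # oldest lot at the left
    realized = 0.0
    for sell_qty, sell_price in sells:
        while sell_qty > 1e-12:
            lot_qty, unit_cost = lots[0]
            take = min(sell_qty, lot_qty)
            realized += take * (sell_price - unit_cost)
            sell_qty -= take
            if take == lot_qty:
                lots.popleft()          # lot fully consumed
            else:
                lots[0] = (lot_qty - take, unit_cost)  # partial lot remains
    return round(realized, 2)
```

Selling 1.5 BTC at $40,000 against lots of 1.0 BTC @ $20,000 and 1.0 BTC @ $30,000 realizes $20,000 on the first lot plus $5,000 on half the second.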
21 agents/finance/skills/market-expansion.md Normal file
@@ -0,0 +1,21 @@
# Skill: Market Expansion Scanner

## Trigger
`/market-scan` or natural language: "Should I open a second location?"

## Input
- Target location
- Business type
- Investment budget

## Cloud Research
- Statista (demographics)
- Coresignal (hiring trends)
- USASpending (gov contracts)
- Local cache

## Output
- Break-even analysis
- Competitive threat timeline
- Patient/customer acquisition cost projection
- Staged for review
19 agents/finance/skills/methodology-pack/comps.md Normal file
@@ -0,0 +1,19 @@
# Methodology: Market Comps

## Overview
Comparable company / transaction analysis for SMBs.

## Tiers
1. **Public comps** (SEC EDGAR) — large-cap, less relevant for sub-$10M
2. **Private transaction comps** (Pandects, local M&A tracker) — most relevant
3. **Proprietary SMB comps** (Tier 3) — anonymized user data

## Multiples
- Revenue multiple (1.0x–3.0x for healthcare)
- EBITDA multiple (4.0x–8.0x for healthcare)
- SDE multiple (2.5x–4.5x for owner-operated)

## Adjustments
- Size discount (< $1M revenue)
- Geographic premium (coastal markets)
- Customer concentration penalty (> 30% from one source)
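Applying the multiples and adjustments above is plain arithmetic. A hedged sketch: the default multiples sit at the midpoints of the healthcare ranges listed, and the adjustments are expressed as fractions; none of these defaults are prescribed by the methodology, they only illustrate the mechanics.

```python
def comps_valuation(revenue: float, ebitda: float,
                    revenue_multiple: float = 2.0,   # midpoint of 1.0x-3.0x
                    ebitda_multiple: float = 6.0,    # midpoint of 4.0x-8.0x
                    size_discount: float = 0.0,      # e.g. 0.10 for < $1M revenue
                    concentration_penalty: float = 0.0) -> dict:
    """Comps-based value under each multiple, after multiplicative adjustments."""
    adjustment = (1 - size_discount) * (1 - concentration_penalty)
    return {
        "revenue_based": round(revenue * revenue_multiple * adjustment, 2),
        "ebitda_based": round(ebitda * ebitda_multiple * adjustment, 2),
    }
```

A $1M-revenue practice with $200K EBITDA lands at $2.0M on the revenue multiple and $1.2M on the EBITDA multiple before any discounts.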
25 agents/finance/skills/methodology-pack/dcf.md Normal file
@@ -0,0 +1,25 @@
# Methodology: Discounted Cash Flow (DCF)

## Overview
Standard DCF for SMB valuation with conservative assumptions.

## Inputs
- Projected free cash flows (5 years)
- Terminal growth rate (2-3% for healthcare)
- WACC (typically 12-15% for private practices)

## Formula
```
Enterprise Value = Σ(FCF_t / (1 + WACC)^t) + Terminal Value / (1 + WACC)^n

Terminal Value = FCF_n * (1 + g) / (WACC - g)
```

## Sensitivity
- WACC +/- 200 bps
- Terminal growth +/- 1%
- Revenue growth +/- 5%

## Output
- Base case, upside, downside valuation
- Sensitivity matrix
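The two formulas above translate directly into code. A sketch (symbol names follow the Inputs section; Gordon growth on the final-year FCF, which requires `wacc > g`):

```python
def dcf_enterprise_value(fcf: list[float], wacc: float, g: float) -> float:
    """Enterprise Value = sum of discounted FCF plus discounted terminal value."""
    n = len(fcf)
    # Σ(FCF_t / (1 + WACC)^t) for t = 1..n
    pv_fcf = sum(cf / (1 + wacc) ** t for t, cf in enumerate(fcf, start=1))
    # Terminal Value = FCF_n * (1 + g) / (WACC - g), discounted back n years
    terminal = fcf[-1] * (1 + g) / (wacc - g)
    return pv_fcf + terminal / (1 + wacc) ** n
```

The sensitivity matrix then comes from re-running this over a grid of `wacc` and `g` values (WACC ±200 bps, terminal growth ±1%).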
20 agents/finance/skills/methodology-pack/returns.md Normal file
@@ -0,0 +1,20 @@
# Methodology: Returns Analysis

## Overview
Calculate investment returns for expansion, acquisitions, and capital projects.

## Metrics
- IRR (Internal Rate of Return)
- NPV (Net Present Value)
- Payback period
- ROI (Return on Investment)
- MOIC (Multiple on Invested Capital)

## Thresholds
- Acquisitions: minimum 15% IRR
- Expansion: minimum 20% IRR
- Equipment: minimum 25% IRR or 3-year payback

## Risk-Adjusted
- Apply probability-weighted scenarios
- Base case 50%, upside 25%, downside 25%
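The first two metrics above need no external libraries. A sketch: NPV is a direct sum, and IRR is found by bisection as the rate where NPV crosses zero (this assumes a conventional cash flow pattern with a single sign change, so the NPV is monotone in the rate).

```python
def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value; cashflows[0] is the time-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows: list[float], lo: float = -0.99, hi: float = 10.0) -> float:
    """IRR by bisection on the bracket [lo, hi]."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid  # sign change in the lower half
        else:
            lo = mid
    return (lo + hi) / 2
```

An acquisition would then clear the 15% threshold when `irr(cashflows) >= 0.15`.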
19 agents/finance/skills/methodology-pack/unit-economics.md Normal file
@@ -0,0 +1,19 @@
# Methodology: Unit Economics

## Healthcare Practice Metrics
- Revenue per patient visit
- Cost per patient visit (direct + allocated)
- Patient acquisition cost (PAC)
- Lifetime value (LTV)
- LTV:CAC ratio (target > 3:1)

## Break-Even
```
Fixed Costs / (Price per Unit - Variable Cost per Unit) = Break-Even Units
```

## Expansion Threshold
- Second location viable when:
  - LTV:CAC > 3:1 sustained 6+ months
  - Corporate overhead < 15% of combined revenue
  - Local market TAM supports 2x current patient volume
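The break-even formula above, as code (a sketch; the example numbers in the usage note are hypothetical, not drawn from the document):

```python
def break_even_units(fixed_costs: float, price_per_unit: float,
                     variable_cost_per_unit: float) -> float:
    """Break-Even Units = Fixed Costs / (Price per Unit - Variable Cost per Unit)."""
    contribution_margin = price_per_unit - variable_cost_per_unit
    if contribution_margin <= 0:
        raise ValueError("price must exceed variable cost to ever break even")
    return fixed_costs / contribution_margin
```

For instance, $30,000 of monthly fixed costs at $150 revenue and $60 variable cost per visit break even at roughly 333 visits per month.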
22 agents/finance/skills/methodology-pack/variance.md Normal file
@@ -0,0 +1,22 @@
# Methodology: Variance Analysis

## Overview
Explain differences between actual, budget, and forecast.

## Framework
1. **Volume variance** — change in units/revenue
2. **Price variance** — change in rate/pricing
3. **Mix variance** — change in product/service mix
4. **Cost variance** — change in input costs

## Format
```
| Line Item | Actual | Budget | Variance | % | Explanation |
|-----------|--------|--------|----------|---|-------------|
| Revenue   | $X     | $Y     | $Z       | % | [driver]    |
```

## Rules
- Every material variance (> 5% or > $5K) requires explanation
- Distinguish one-time vs. recurring
- Flag trends (3+ periods of same direction)
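The volume and price components of the framework can be split so that they sum exactly to the total revenue variance. A sketch using one common convention (volume at budget price, price at actual volume; other conventions allocate the cross-term differently):

```python
def revenue_variance(actual_units: float, actual_price: float,
                     budget_units: float, budget_price: float) -> dict:
    """Decompose total revenue variance into volume and price components."""
    volume = (actual_units - budget_units) * budget_price   # activity effect
    price = (actual_price - budget_price) * actual_units    # rate effect
    total = actual_units * actual_price - budget_units * budget_price
    return {"volume": volume, "price": price, "total": total}
```

Under this convention `volume + price == total` by construction, which makes the variance table cross-foot.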
24 agents/finance/skills/pl-analysis.md Normal file
@@ -0,0 +1,24 @@
# Skill: P&L Analysis

## Trigger
`/pl-generate` or natural language: "generate my P&L"

## Input
- Entity selector (Synq Medical PC, Synq Holdings, Personal, All Entities)
- Period (month, quarter, year, custom range)
- Comparison mode (actual only, actual vs. budget, actual vs. prior period)

## Process
1. Controller pulls GL data for the period
2. Financial Analyst models revenue and expense lines
3. Internal Auditor verifies formula consistency and entity context
4. CFO reviews and stages output

## Output
- Structured P&L statement
- Variance explanations (if comparison mode enabled)
- Key ratios (gross margin, operating margin, net margin)
- Staged in Synq Docs for review

## Template
`templates/pl-report.pptx`
16 agents/finance/skills/pptx-author.md Normal file
@@ -0,0 +1,16 @@
# Skill: PPTX Author

## Role
Generate presentations from templates via headless PowerPoint through Synq Docs.

## Process
1. Receive content outline from Financial Analyst or CFO
2. Load template from `templates/finance/`
3. Populate slides with data, charts, and annotations
4. Save to Synq Docs PostgreSQL backend
5. Open WebSocket + Y.js session for collaborative review

## Rules
- Use branded templates only
- Include disclaimer slide: "STAGED — requires approval"
- Source citations in speaker notes
22 agents/finance/skills/reconciliation.md Normal file
@@ -0,0 +1,22 @@
# Skill: Reconciliation

## Trigger
`/reconcile` or natural language: "reconcile my accounts"

## Input
- Account(s) to reconcile
- Period
- Source systems (Plaid, manual entry, Odoo)

## Process
1. Controller imports transactions from all sources
2. Auto-match by amount + date + description
3. Flag unmatched items for manual review
4. Internal Auditor verifies no duplicates
5. Stage reconciliation report

## Output
- Reconciliation statement
- Unmatched items list with suggested matches
- Variance explanations
- Shadow log entry
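Step 2 (auto-match by amount + date + description) might look like the greedy matcher below; the 3-day date window and cent-level amount tolerance are illustrative assumptions, and descriptions are carried along only so unmatched items can be reviewed by hand in step 3.

```python
from datetime import date

def auto_match(bank_txns, ledger_txns, date_window_days: int = 3):
    """Greedy one-to-one match of bank vs. ledger transactions.
    Each transaction is a (date, amount, description) tuple."""
    unmatched_ledger = list(ledger_txns)
    matches, unmatched_bank = [], []
    for b in bank_txns:
        hit = next((l for l in unmatched_ledger
                    if abs(l[1] - b[1]) < 0.005                 # same amount to the cent
                    and abs((l[0] - b[0]).days) <= date_window_days), None)
        if hit is not None:
            unmatched_ledger.remove(hit)   # each ledger entry matches at most once
            matches.append((b, hit))
        else:
            unmatched_bank.append(b)
    return matches, unmatched_bank, unmatched_ledger
```

Anything left in the two unmatched lists is what gets flagged for manual review.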
22
agents/finance/skills/tax-optimization.md
Normal file
@@ -0,0 +1,22 @@
# Skill: Tax Optimization

## Trigger
`/tax-summary` or natural language: "tax summary"

## Input
- Entity
- Period
- Prior year comparison (yes/no)

## Process
1. Tax Manager aggregates income, deductions, credits
2. Run S-Corp reasonable salary test
3. Calculate QBI deduction
4. Flag optimization opportunities
5. Stage for review

## Output
- Tax summary document
- Optimization recommendations
- Estimated payments schedule
- Caveats: "Consult your CPA before filing"
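The QBI calculation step can be illustrated for the simple, under-threshold case. A hedged sketch only: it applies the basic IRC §199A formula (the lesser of 20% of QBI and 20% of taxable income less net capital gains) and deliberately ignores wage/UBIA limits and SSTB phase-outs, which a real tax workflow must handle:

```python
def qbi_deduction(qbi, taxable_income, net_capital_gains=0.0):
    """Simplified QBI deduction for under-threshold filers: lesser of
    20% of qualified business income and 20% of (taxable income minus
    net capital gains). Wage/UBIA limits and phase-outs are ignored."""
    if qbi <= 0:
        return 0.0
    return round(min(0.20 * qbi, 0.20 * (taxable_income - net_capital_gains)), 2)
```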
30
agents/finance/skills/valuation-exit.md
Normal file
@@ -0,0 +1,30 @@
# Skill: Valuation & Exit Planning

## Trigger
`/valuation` or natural language: "What's my business worth? Should I sell?"

## Input
- Entity to value
- Purpose (sale, estate planning, buy-sell agreement)
- Time horizon

## Cloud Research
- Proprietary SMB model
- Pandects (recent sales)
- FRED (interest rates)
- Local financials (user-approved)

## Methodology
1. DCF valuation
2. Market comps
3. Precedent transactions
4. Exit timing recommendation

## Output
- Valuation range
- Sensitivity table
- Exit timing recommendation
- Staged for review → outputs to Synq Docs

## Template
`templates/valuation-report.pptx`
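The DCF step of the methodology reduces to a few lines. A minimal sketch, assuming annual free cash flows and a Gordon-growth terminal value; the proprietary SMB model referenced above is not reproduced here:

```python
def dcf_value(cash_flows, wacc, terminal_growth):
    """Enterprise value as the present value of projected free cash
    flows plus a Gordon terminal value discounted from the final
    forecast year."""
    if terminal_growth >= wacc:
        raise ValueError("terminal growth must be below WACC")
    # Discount each forecast-year cash flow back to today.
    pv = sum(cf / (1 + wacc) ** t for t, cf in enumerate(cash_flows, start=1))
    # Terminal value at end of forecast horizon, then discount it back.
    terminal = cash_flows[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    return pv + terminal / (1 + wacc) ** len(cash_flows)
```

Varying `wacc` and `terminal_growth` over a grid produces the sensitivity table listed under Output.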
20
agents/finance/skills/vendor-risk.md
Normal file
@@ -0,0 +1,20 @@
# Skill: Vendor/Customer Risk Screening

## Trigger
`/vendor-screen` or natural language: "Should I take this $50K contract?"

## Input
- Counterparty name
- Contract value
- Payment terms

## Cloud Research
- Zephira (registry, UBO, KYB)
- OpenCorporates (global registry)
- Coresignal (stability proxy)

## Output
- Risk score (1-10)
- Recommended payment terms
- Red flags
- Staged for review
17
agents/finance/skills/xlsx-author.md
Normal file
@@ -0,0 +1,17 @@
# Skill: XLSX Author

## Role
Generate structured spreadsheets via headless Excel through Synq Docs.

## Process
1. Receive structured data from Controller or Financial Analyst
2. Apply template from `templates/finance/`
3. Populate formulas, formatting, and charts
4. Save to Synq Docs PostgreSQL backend
5. Open WebSocket + Y.js session for collaborative review

## Rules
- All documents start from templates — no blank-slate generation
- Formulas must cross-foot
- Include source citations in comment fields
- Mark as STAGED until approved
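"Formulas must cross-foot" means row totals and column totals must reconcile to the same grand total. A minimal check, assuming (hypothetically) the schedule is exported as a list of rows plus its reported totals:

```python
def cross_foots(rows, reported_row_totals, reported_col_totals, tol=0.01):
    """Verify a schedule cross-foots: each row's cells sum to its
    reported total, each column sums to its reported total, and the
    grand totals computed both ways agree within a rounding tolerance."""
    for row, rep in zip(rows, reported_row_totals):
        if abs(sum(row) - rep) > tol:
            return False
    for col, rep in zip(zip(*rows), reported_col_totals):
        if abs(sum(col) - rep) > tol:
            return False
    return abs(sum(reported_row_totals) - sum(reported_col_totals)) <= tol
```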
51
agents/finance/tax-manager.md
Normal file
@@ -0,0 +1,51 @@
# Tax Manager — Tax Lead

## Role
You own tax strategy, S-Corp optimization, crypto reporting, and entity structuring.

## Responsibilities
- Produce quarterly and annual tax summaries
- Optimize S-Corp distributions vs. salary
- Track and report crypto gains/losses (FIFO/LIFO)
- Advise on entity switching (LLC → S-Corp, etc.)
- Monitor state and local tax obligations
- Coordinate with external CPAs during filing season

## Rules
1. **Never provide legal advice.** Always end with: *"Consult your CPA/attorney for final decisions."*
2. All tax outputs are **estimates** until reviewed by a licensed preparer
3. Stage all outputs for human review
4. Flag any transactions that may trigger audit risk

## S-Corp Optimization
- Check reasonable salary against safe harbor
- Track basis accurately
- Monitor excess business loss limitations
- Document QBI qualification

## Crypto Reporting
- Default method: FIFO
- Track cost basis per lot
- Separate short-term vs. long-term
- Flag wash sale scenarios (current guidance)

## Entity Switch Analysis Template

```
## Entity Switch: [Current] → [Proposed]

### Pros
- [ ]

### Cons
- [ ]

### Estimated Cost Impact
- Additional compliance: $X,XXX/year
- Tax savings (projected): $X,XXX/year
- Net: $X,XXX/year

### Recommendation
[text]

⚠️ This is strategic guidance, not legal advice. Consult your attorney.
```
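The per-lot FIFO cost-basis tracking described under Crypto Reporting can be sketched as a queue of acquisition lots. Illustrative only: the `(qty, unit_cost)` lot shape is an assumption, and short-term vs. long-term splitting is omitted:

```python
from collections import deque

def fifo_disposal(lots, qty_sold, proceeds):
    """Consume acquisition lots first-in-first-out and return
    (realized gain, remaining lots). Each lot is (qty, unit_cost)."""
    remaining = deque(lots)
    basis, to_sell = 0.0, qty_sold
    while to_sell > 0:
        qty, unit_cost = remaining[0]
        take = min(qty, to_sell)       # consume the oldest lot first
        basis += take * unit_cost
        to_sell -= take
        if take == qty:
            remaining.popleft()        # lot fully consumed
        else:
            remaining[0] = (qty - take, unit_cost)  # partial lot left
    return proceeds - basis, list(remaining)
```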
34
agents/finance/templates/acquisition-memo.md
Normal file
@@ -0,0 +1,34 @@
# Acquisition Memo Template

## Target: [Name] | Date: [YYYY-MM-DD]

### Target Profile
- Type: [Dental / Medical / SMB]
- Location: [City, ZIP]
- Revenue: $[Range]
- EBITDA: $[Range]
- Employees: [Count]

### Market Context
- Recent comps: [X transactions]
- Multiple range: [X.Xx – Y.Yx EBITDA]
- Market trend: [Growing / Stable / Declining]

### Offer Structure
- Purchase price: $[X]
- Structure: [Asset / Stock]
- Cash at close: $[X] ([X]%)
- Seller note: $[X] ([X]%)
- Earnout: $[X] ([X]%)

### Risk Factors
- [ ] Lease expiration
- [ ] Key person dependency
- [ ] Customer concentration
- [ ] Regulatory changes

### Recommendation
[Buy / Pass / Further Diligence]

---
*Prepared by Synq Finance | STAGED — requires approval*
34
agents/finance/templates/competitive-dashboard.pptx
Normal file
@@ -0,0 +1,34 @@
# Competitive Dashboard Template (PPTX Outline)

## Slide 1: Title
- Market: [Industry / Location]
- Date: [Date]
- Classification: STAGED

## Slide 2: Peer Overview
- Table: Company, Revenue, Employees, Growth, Funding

## Slide 3: Trend Analysis
- Headcount growth (12 months)
- Hiring velocity
- Job posting sentiment

## Slide 4: Market Position
- Revenue percentile ranking
- Growth rate percentile
- Margin comparison

## Slide 5: Threat Alerts
- New entrants
- Pricing pressure signals
- Customer churn indicators

## Slide 6: Opportunity Signals
- Underserved segments
- Geographic gaps
- Partnership potential

## Slide 7: Recommendations
- Strategic priorities
- Timeline
- Investment required
42
agents/finance/templates/investor-memo.md
Normal file
@@ -0,0 +1,42 @@
# Investor Memo Template

## [Company Name] — [Date]

### Executive Summary
- Revenue: $[X]
- EBITDA: $[Y]
- Growth: [Z]%
- Ask: $[Amount] for [Use]

### Business Overview
[2-3 paragraphs]

### Financial Highlights
| Metric | Value |
|--------|-------|
| ARR / Revenue | $X |
| Gross Margin | X% |
| EBITDA Margin | X% |
| CAC | $X |
| LTV | $X |
| LTV:CAC | X:1 |

### Market Opportunity
- TAM: $X
- SAM: $Y
- SOM: $Z

### Competitive Position
[Peer comparison table]

### Use of Funds
1. [ ] — $X
2. [ ] — $Y
3. [ ] — $Z

### Risk Factors
- [ ]
- [ ]

---
*Prepared by Synq Finance | STAGED — requires approval*
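For the LTV and LTV:CAC rows in the memo's metrics table, one common SaaS-style formulation (an assumption; the template does not prescribe a method) divides margin-adjusted monthly revenue by churn to get lifetime value:

```python
def ltv_cac(avg_revenue_per_month, gross_margin, monthly_churn, cac):
    """LTV as margin-adjusted monthly revenue times expected customer
    lifetime (1/churn months); returns (LTV, LTV:CAC ratio)."""
    ltv = avg_revenue_per_month * gross_margin / monthly_churn
    return ltv, ltv / cac
```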
38
agents/finance/templates/month-end-checklist.md
Normal file
@@ -0,0 +1,38 @@
# Month-End Close Checklist

## Entity: [Name] | Period: [YYYY-MM]

### Pre-Close
- [ ] All bank feeds imported (Plaid)
- [ ] All manual entries recorded
- [ ] Odoo invoices synced
- [ ] Credit card transactions categorized

### Reconciliations
- [ ] Operating account
- [ ] Payroll account
- [ ] Savings / reserve
- [ ] Credit card(s)
- [ ] Petty cash

### Accruals
- [ ] Payroll (last days of month)
- [ ] Utilities
- [ ] Rent / CAM
- [ ] Insurance
- [ ] Interest
- [ ] Depreciation

### Review
- [ ] P&L variance analysis
- [ ] Balance sheet sanity check
- [ ] Intercompany eliminations
- [ ] Roll-forward schedules

### Sign-Off
- [ ] Controller review
- [ ] CFO approval
- [ ] Shadow log entry

---
*Status: [In Progress / Complete]*
37
agents/finance/templates/tax-package.docx
Normal file
@@ -0,0 +1,37 @@
# Tax Package Template

## Entity: [Name] | Year: [YYYY]

### Income Statement Summary
- Gross receipts: $X
- COGS: $Y
- Gross profit: $Z
- Operating expenses: $A
- Net ordinary income: $B

### Balance Sheet Summary
- Cash: $X
- AR: $Y
- Equipment: $Z
- AP: $A
- Loans: $B
- Equity: $C

### Schedule K-1 Items
- Ordinary business income: $X
- Interest income: $Y
- Section 179 deduction: $Z
- Charitable contributions: $A

### Supporting Schedules
- [ ] Depreciation schedule
- [ ] Home office worksheet
- [ ] Vehicle log
- [ ] Meal & entertainment log
- [ ] Health insurance premiums

### Notes for Preparer
[Free text]

---
*Prepared by Synq Tax Manager | STAGED — requires CPA review*
45
agents/finance/templates/valuation-report.pptx
Normal file
@@ -0,0 +1,45 @@
# Valuation Report Template (PPTX Outline)

## Slide 1: Title
- Company: [Name]
- Date: [Date]
- Prepared by: Synq Finance
- Classification: STAGED — requires approval

## Slide 2: Executive Summary
- Valuation range: $X – $Y
- Primary method: [DCF / Comps / Precedent]
- Key assumption: [Summary]

## Slide 3: Business Overview
- Industry
- Revenue trend (3 years)
- Customer base
- Competitive position

## Slide 4: Financial Highlights
- Revenue, EBITDA, margin trends
- Key ratios

## Slide 5: DCF Analysis
- 5-year projection summary
- WACC, terminal growth
- Sensitivity table

## Slide 6: Market Comps
- Comparable transactions
- Multiple range
- Adjustments applied

## Slide 7: Precedent Transactions
- Recent deals
- Implied multiples

## Slide 8: Conclusion & Recommendations
- Final valuation range
- Exit timing recommendation
- Key risks

## Slide 9: Disclaimer
- This is an estimate, not a formal appraisal
- Consult a licensed valuation professional for legal purposes
366
audits/Synq_Stream_Wiki_v2.3_Gap_Analysis.md
Normal file
@@ -0,0 +1,366 @@
# Synq Stream UI — Gap Analysis vs. Project Wiki v2.3

**Date:** 2026-05-04
**Auditor:** Kimi Code CLI
**Scope:** `synq-core-runtime/ui/stream` (Tauri v2 + React + Vite + Tailwind v3)
**Wiki Reference:** `/synq/desktop/synq-prompts/Synq_Project_Wiki_v2.3_Master.md` (2821 lines, 27 sections)
**Status:** ARCHITECTURE LOCKED — implementation pending (per wiki)

---

## Executive Summary

The current Synq Stream UI is a **functioning Odoo-connected dashboard** with 14 channels, a collapsible sidebar, debounced search across major modules, floating detail panels, and a basic widget grid. It successfully reads from Odoo (`res.partner`, `calendar.event`, `mail.message`, `ir.attachment`, `account.move`, `sale.order`, `project.project/task`) and renders data in a Tailwind-themed interface.

However, measured against Wiki v2.3, the codebase is approximately **15–20% implemented** for the full Synq Desktop vision. The majority of the architecture — AI service mesh, RBAC, commerce engine, storage mesh, credit system, QR payments, cross-border flows, and Synq Social network — exists only as specifications. Several security and infrastructure assumptions in the current code deviate from wiki mandates.

---
## 1. UI/UX Mismatches

### 1.1 Theme System Divergence
| Wiki Spec | Current Implementation | Gap |
|-----------|------------------------|-----|
| Desktop reference uses **Tailwind v4** with `@theme inline` CSS custom properties | Stream uses **Tailwind v3** with manual `:root`/`.dark` CSS vars in `index.css` | **Medium** — functional but creates maintenance divergence; no `@theme inline` block; colors use `rgba()` instead of `oklch()` |
| Synq Social light overrides (`--bg-primary`, `--bg-sidebar`, etc.) | Partial: some social colors exist but not fully wired to components | **Low** — light mode works; palette consistency acceptable |

**Files:** `ui/stream/src/index.css`, `ui/stream/tailwind.config.js`

### 1.2 Dashboard 2.0 Specification Gap
| Wiki Spec (§1.3) | Current Implementation | Gap |
|------------------|------------------------|-----|
| **"Ask Beam palette"** — prominent search/command palette | None. Beam is a bottom chat bar only | **High** — central UX pattern missing |
| **Employee status** indicators on dashboard | None | **Medium** — no staff/presence system |
| **Drag-drop widgets** with semantic channel grouping | `WidgetGrid` exists with `react-grid-layout`, but widgets are channel carousels, not true Dashboard 2.0 widgets | **Medium** — grid framework exists, content is placeholder-level |
| Dashboard config persistence per user role | `save_dashboard_config` / `get_user_profile` Tauri commands exist but user profile is static/mock | **Medium** — no real user backend |

### 1.3 Post-Op Triage Missing
| Wiki Spec (§1.3) | Current Implementation | Gap |
|------------------|------------------------|-----|
| Patient cards with **update bubbles** | PatientsPage shows static cards with `write_date`; no bubble/urgency indicator | **High** — no triage urgency UI |
| **1–7 day filtering** for post-op follow-up | None | **High** — critical clinical module missing |
| Triage routing to Beam on DGX Spark (port 8082) | None | **High** — no DGX connectivity at all |

### 1.4 Finance Module v4.0 Under-Implemented
| Wiki Spec (§6, §1.5) | Current Implementation | Gap |
|----------------------|------------------------|-----|
| Multi-entity tracking (Synq Commerce LLC / Synq Medical PC / Synq Holdings) | `fetchFinanceSummary()` sums invoices + sale orders; no entity separation | **High** — all revenue pooled, no ledger segregation |
| Commission engine (5% deduction, escrow, weekly settlement) | None | **High** — no commerce transaction model |
| Banking integrations (Plaid/MX, Dwolla ACH, Coinbase, Taxbit, Alpaca, Metals-API) | None | **High** — no financial service integrations |
| S-Corp optimization, bill pay, team permissions, P&L generation | None | **High** |
| CFO Beam AI persona | None | **High** |

### 1.5 Memory System Absent
| Wiki Spec (§1.3) | Current Implementation | Gap |
|------------------|------------------------|-----|
| Semantic clustering (DBSCAN) | None | **High** |
| Knowledge graph | None | **High** |
| Multi-strategy retrieval | None | **High** |
| MemPalace integration (vector DB) | None | **High** — no connection to MemPalace service |

### 1.6 Projects Module — List Only, No Kanban/Timeline
| Wiki Spec (§1.3, §4.1) | Current Implementation | Gap |
|------------------------|------------------------|-----|
| Kanban board with drag-drop columns | `ProjectsPage` is a searchable list of `project.project` + `project.task` | **High** — no board view |
| Timeline/Gantt view | None | **High** |
| AI-assisted task management | None | **High** |
| Product launch workflows | None | **High** |

### 1.7 Ecommerce Module Completely Missing
| Wiki Spec (§1.3, §1.9, §1.10) | Current Implementation | Gap |
|-------------------------------|------------------------|-----|
| Store management, product catalog | None | **Critical** — entire commerce engine absent |
| Orders, fulfillment workflow | None | **Critical** |
| Merchant avatar config UI | None | **Critical** |
| Pick/pack/ship workflow | None | **Critical** |
| AI cataloging, merchandising, fraud, fulfillment AI | None | **Critical** |

### 1.8 Communication Hub — Basic Message List Only
| Wiki Spec (§1.3, §4.1) | Current Implementation | Gap |
|------------------------|------------------------|-----|
| Unified inbox with AI summarization | `CommunicationsPage` shows `mail.message` rows; no threading, no summarization | **High** |
| Supplier procurement threads | None | **High** |
| Dispute documentation | None | **High** |
| Order notification routing | None | **High** |

### 1.9 Synq News — Placeholder Only
| Wiki Spec (§4) | Current Implementation | Gap |
|----------------|------------------------|-----|
| Editorial pipeline with 5 verification checks | `news` channel is a placeholder returning empty array | **High** |
| Shoppable articles, merchant discovery | None | **High** |
| Reader rewards (5 credits/article) | None | **High** |

### 1.10 Travel Hub — Placeholder Only
| Wiki Spec (§19) | Current Implementation | Gap |
|-----------------|------------------------|-----|
| Travel concierge agent ("Wingman") | `ChannelPage` placeholder for `/help` route; no dedicated Travel Hub | **High** |
| Itinerary sync to calendar | None | **High** |
| 2% Synq Credit earn on bookings | None | **High** |

### 1.11 Beam AI Chat — Bottom Bar, Not Floating Scientist
| Wiki Spec (§1.2, §2) | Current Implementation | Gap |
|----------------------|------------------------|-----|
| **"Beam Scientist — Floating avatar UI (not a service)"** | `BeamChat` is a bottom collapsible panel, not floating | **Medium** — UX pattern mismatch |
| 6 DGX Spark services (Triage 8082, Messaging 8084, Search 8083, Doctor Beam 8085, AVA Voice 8086, Twin 8087) | `useAIServices` warms up 5 generic services (`checkOllamaHealth`, `checkKimiHealth`, etc.) but no DGX service mesh | **High** — no DGX connectivity |
| Role-based Beam access (Dr. Qazi full, providers clinical, staff admin, patients portal) | No RBAC; all users see same Beam chat | **High** |
| AVA Voice (port 8086) integration | None | **High** |

---
## 2. Missing Modules (Not Present At All)

### 2.1 Clinical Modules
| Module | Wiki Section | Severity |
|--------|-------------|----------|
| **Post-Op Triage** | §1.3, §4.1 | Critical |
| **EMR / ONLYOFFICE** (local doc server, port 8443) | §1.3 | Critical |
| **Patient Portal** (patient-facing Beam access) | §2.2 | High |
| Clinical notes with HIPAA audit trails | §8.1 | High |

### 2.2 Commerce Modules
| Module | Wiki Section | Severity |
|--------|-------------|----------|
| **Ecommerce** (store, catalog, orders, fulfillment) | §1.3, §1.9, §1.10 | Critical |
| **Merchant Avatar Config** (personality, autonomy, knowledge scope) | §1.4 | Critical |
| **Synq Social Network** (feed, DMs, groups, verification) | §24 | Critical |
| **Synq Credit System** (earn/redeem/ledger/fraud) | §8, §15 | High |
| **Synq Pay / QR Payments** (CPM/MPM modes, settlement) | §10 | High |
| **Travel Hub** (Wingman concierge, itinerary sync) | §19 | High |
| **Synq Ads Engine** (cross-platform ad orchestration) | Appendix E | High |
| **KYC / Merchant Onboarding** (document validation, Beam assist) | §1.6 | High |
| **Dispute Resolution** (3-level hierarchy) | §1.7 | Medium |
| **Commission Engine** (escrow, weekly settlement) | §1.5 | High |

### 2.3 AI/Compute Modules
| Module | Wiki Section | Severity |
|--------|-------------|----------|
| **DGX Spark Service Mesh** (6 Ollama services, ~142 GB VRAM) | §2.1, §3.1 | Critical |
| **Synq AI** (platform-level commerce AI, Together AI / OpenRouter) | §1.3 | Critical |
| **Merchant Custom Avatars** (cloud-hosted, Together AI primary) | §1.4 | Critical |
| **AI Request Routing / API Gateway** (PHI classifier, failover) | §3.2 | High |
| **Semantic Storage / MemPalace** (vector search, embeddings) | §2.6, §5.3 | High |
| **Federated Learning / Differential Privacy** | §7.3, §14.3 | Medium |
| **AVA Voice** (Whisper + TTS, port 8086) | §2.1 | High |

### 2.4 Storage & Infrastructure Modules
| Module | Wiki Section | Severity |
|--------|-------------|----------|
| **Hermes NAS Integration** (ZFS/Ceph, warm tier) | §5.1, §12 | High |
| **Cloud Object Storage** (Backblaze B2 / Wasabi / Storj) | §5.2, §2.3 | High |
| **Three-Tier Storage UI** (Cache/Vault/Archive transitions) | §2.2 | Medium |
| **Semantic Ingestion Pipeline** (OCR, entity extraction, auto-tagging) | §2.6 | High |
| **Client-Side Encryption** (Shamir's Secret Sharing) | §2.7 | High |
| **Synq OS / OpenHarmony** (RK3588 device target) | §3, §16 | Medium (future roadmap) |

### 2.5 Governance & Compliance Modules
| Module | Wiki Section | Severity |
|--------|-------------|----------|
| **RBAC / Role-Based UI Gating** (8 roles defined in §21) | §21 | Critical |
| **Agent Autonomy / Liability Dashboard** (human-in-the-loop UI) | §7.1, §14.1 | Medium |
| **Audit Trail Viewer** (timestamp, model version, confidence) | §7.1 | Medium |
| **Cross-Merchant Data Opt-In UI** | §7.3, §14.3 | Medium |
| **Synq-Cn / Lóng AI** (China operations, separate system) | §11 | Low (future) |

---
## 3. Security Boundary Violations

### 3.1 Content Security Policy Disabled
**Violation:** `tauri.conf.json` has `"csp": null`.

**Wiki Requirement:** §2.7 specifies TLS 1.3 + mTLS for internal service mesh. While CSP is frontend-specific, disabling it entirely removes a key XSS mitigation layer in a Tauri app that handles customer data, documents, and financial records.

**Risk:** Medium — enables XSS vectors if any user-generated content (mail messages, patient names, document titles) is rendered unsafely.

**Fix:** Implement a restrictive CSP: `default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data: blob:; connect-src 'self' http://localhost:8018 ws://localhost:*;`
### 3.2 No Role-Based Access Control (RBAC)
**Violation:** All 14 channels are visible to all users. No login flow, no role gating.

**Wiki Requirement:** §21.1 defines 8 roles with strict module/AI access boundaries:
- Clinical staff → Post-Op Triage, patient portal, clinical notes. **No commerce.**
- Admin staff → Communication Hub, Finance (read), Projects. **No Beam clinical.**
- Developer → GitLab, Beam Fix. **No patient data, no merchant data.**
- Merchant support → Commerce (read), merchant instances. **No patient data.**

**Risk:** High — a single compromised session exposes all modules including PHI, financials, and merchant data.

**Fix:** Implement authentication + role-based channel visibility. Hide clinical modules from non-clinical roles. Hide merchant data from clinical roles.
### 3.3 No PHI / Commerce Firewall in Frontend
**Violation:** The same React app loads patient records (`res.partner`) and would load merchant orders, Synq Social data, and finance records in the same memory space.

**Wiki Requirement:** §20.1 mandates a **"hard firewall with narrow bridge"**:
- Beam never sees purchase history
- Product recommendations are generic, not diagnosis-linked
- Separate inventory SKUs for medical vs. retail
- Separate databases for clinical and commerce records

**Risk:** Critical — current architecture makes accidental PHI leakage into commerce modules trivial.

**Fix:** Separate frontend bundles or strict runtime module isolation. At minimum, clinical and commerce data should be fetched through different Rust IPC channels with role validation.
### 3.4 Tauri `withGlobalTauri: true` Exposes APIs Globally
**Violation:** `tauri.conf.json` sets `withGlobalTauri: true`, making all Tauri IPC commands available on `window.__TAURI_INTERNALS__`.

**Risk:** Medium — any XSS vulnerability gains full access to `odoo_jsonrpc`, `open_url`, `request_exit`, `download_logs`, and file system APIs.

**Fix:** Set `withGlobalTauri: false` and use ES module imports (`@tauri-apps/api`) exclusively.
### 3.5 Local Storage Contains Sensitive Layout Data Unencrypted
**Violation:** `localStorage` stores widget layouts, channel visibility, theme preference, and sidebar pin state in plaintext.

**Wiki Requirement:** §2.7 specifies AES-256-GCM client-side encryption for warm tier data.

**Risk:** Low-Medium — widget layouts are not PHI, but on a shared kiosk machine this leaks user behavior patterns.

**Fix:** Encrypt localStorage with a device-bound key using Tauri's secure storage plugin.
### 3.6 No Audit Trail for User Actions
**Violation:** No logging of who viewed which patient record, who exported documents, or who changed settings.

**Wiki Requirement:** §8.1 requires audit trails (marked "Implemented" in wiki). §7.1 requires agent decision logging with timestamp, model version, input hash, rationale, confidence score.

**Risk:** High — HIPAA and SOC 2 both require immutable audit logs.

**Fix:** Add frontend action logging (view patient, open document, export data) piped to the Rust logging layer with user ID, timestamp, and action type.
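The immutability requirement can be approximated in the logging layer with a hash chain, where each entry commits to its predecessor so any edit breaks the chain. A Python sketch of the scheme only; the production version would live in the Rust logging layer:

```python
import hashlib
import json
import time

def append_audit(log, user_id, action, resource):
    """Append a tamper-evident entry: each record carries the SHA-256
    of the previous record's hash plus its own fields."""
    prev = log[-1]["hash"] if log else "0" * 64
    entry = {"user": user_id, "action": action,
             "resource": resource, "ts": time.time(), "prev": prev}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log):
    """Walk the chain; any edited or reordered entry fails the check."""
    prev = "0" * 64
    for e in log:
        body = {k: v for k, v in e.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or digest != e["hash"]:
            return False
        prev = e["hash"]
    return True
```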
### 3.7 Direct Odoo JSON-RPC Without Request Classification
**Violation:** `odoo_jsonrpc` command accepts arbitrary model/method/domain and proxies directly to Odoo.

**Wiki Requirement:** §3.2 specifies an API Gateway with PHI classifier: "Contains PHI or clinical context? → YES → Route to DGX Spark."

**Risk:** High — the frontend can request any Odoo model. There is no gateway enforcing the healthcare-commerce boundary.

**Fix:** Implement a request classifier in the Rust layer. Block commerce model access for clinical-only roles. Log all model access.

---
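The classifier-plus-allow-list shape of that fix can be sketched as a deny-by-default table. The role and model lists below are illustrative assumptions, not the wiki's actual §21 role matrix, and the production version would live in the Rust IPC layer:

```python
# Assumed model groupings for illustration only.
CLINICAL_MODELS = {"res.partner", "calendar.event"}
COMMERCE_MODELS = {"sale.order", "account.move"}

ROLE_ALLOWED = {
    "clinical": CLINICAL_MODELS,
    "admin": COMMERCE_MODELS | {"mail.message"},
}

def classify_request(role, model):
    """Deny-by-default gate: a role may touch an Odoo model only if it
    appears in that role's allow-list; unknown roles get nothing."""
    return "allow" if model in ROLE_ALLOWED.get(role, set()) else "deny"
```

Every decision, allow or deny, would also be written to the audit log per the fix above.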
## 4. Infrastructure Assumptions
|
||||
|
||||
### 4.1 Single-Tenant Odoo (No Merchant Isolation)
|
||||
**Assumption:** The app connects to a single Odoo instance (`http://localhost:8018`, DB `qcc_aws_prod`) and fetches all data from it.
|
||||
|
||||
**Wiki Reality:** §1.9 specifies merchants receive **slimmed Synq Desktop instances** — not the full medical stack. Each merchant should have isolated data. §1.6 describes merchant instance provisioning post-KYC.
|
||||
|
||||
**Gap:** No multi-tenancy architecture. No merchant instance provisioning. No per-merchant data isolation.
|
||||
|
||||
### 4.2 No DGX Spark Connectivity
|
||||
**Assumption:** `useAIServices` warms up generic `checkOllamaHealth`, `checkKimiHealth`, etc.
|
||||
|
||||
**Wiki Reality:** §2.1 allocates **~142 GB VRAM** across 6 specific Ollama services on fixed ports (8082–8087). §3.1 states: "No merchant workload, no Synq Social workload, no external API traffic. LAN-only, VPN for remote staff."
|
||||
|
||||
**Gap:** No connectivity to Triage (8082), Messaging (8084), Search (8083), Doctor Beam (8085), AVA Voice (8086), or Twin (8087). The loading screen shows "Local AI, Cloud AI, Memory, Profile, Agents" — none map to the specified DGX services.
|
||||
|
||||
### 4.3 No Cloud AI Provider Integration
|
||||
**Assumption:** No Together AI or OpenRouter connectivity exists in the codebase.
|
||||
|
||||
**Wiki Reality:** §1.2, §3.1, §3.2 specify Together AI as primary and OpenRouter as fallback for all merchant avatars and Synq AI platform queries.
|
||||
|
||||
**Gap:** Merchant avatars, Synq AI, and the AI routing gateway are entirely unimplemented.
|
||||
|
||||
### 4.4 No Storage Mesh Integration
|
||||
**Assumption:** Documents and photos are fetched from Odoo `ir.attachment` only.
|
||||
|
||||
**Wiki Reality:** §2.3 describes a 4-layer storage mesh: User Edge → PoP → Hermes NAS → Cloud Object (B2/Wasabi/Storj). §2.6 specifies semantic ingestion with vector embeddings in MemPalace.
|
||||
|
||||
**Gap:** No Hermes NAS mounting. No cloud object storage. No MemPalace vector search. Documents are plain Odoo attachments with no AI extraction.
|
||||
|
||||
### 4.5 No Payment Processor Integration
|
||||
**Assumption:** `fetchFinanceSummary` calculates revenue from Odoo invoices only.
|
||||
|
||||
**Wiki Reality:** §1.8, §10 specify Stripe, Square, Dwolla, Coinbase Commerce integration. §10.5 describes a full settlement flow with escrow, commission deduction, and daily ACH batching.
|
||||
|
||||
**Gap:** No payment gateway connectivity. No escrow accounting. No commission engine.
|
||||
|
||||
### 4.6 No Synq Credit Ledger Backend
|
||||
**Assumption:** No credit system exists.
|
||||
|
||||
**Wiki Reality:** §8, §15 specify a full closed-loop credit system with earning history, redemption history, 24-month expiration, fraud prevention, and finance module integration.
|
||||
|
||||
**Gap:** Entire credit system is absent. No PostgreSQL ledger. No merchant settlement for credit redemptions.
|
||||
|
||||
### 4.7 No CDN or Edge Infrastructure
|
||||
**Assumption:** All assets load from Odoo or local files.
|
||||
|
||||
**Wiki Reality:** §3.4 describes Phase 2–3 regional PoPs, Cloudflare/Fastly CDN for media delivery, and edge caching for avatar responses.
|
||||
|
||||
**Gap:** No CDN integration. No edge caching. All traffic egresses through OC.
|
||||
|
||||
### 4.8 Build Target Mismatch
|
||||
**Assumption:** Tauri builds for desktop (Linux/Windows/Mac).
|
||||
|
||||
**Wiki Reality:** §3, §16 target **OpenHarmony on RK3588** (ARM, 6 TOPS NPU, 8–16 GB RAM) as the primary device ecosystem.
|
||||
|
||||
**Gap:** No OpenHarmony build pipeline. No ArkUI framework integration. No A/B OTA update system.
|
||||
|
||||
### 4.9 Network Topology Assumption

**Assumption:** Single-site, single-ISP, direct Odoo connection.

**Wiki Reality:** §3.4 describes dual ISP (fiber + 5G failover), SD-WAN, WireGuard VPN, VLAN segmentation, Pi-hole DNS filtering.

**Gap:** No VPN integration in Tauri. No network failover logic. No offline queue for orders/messages (§16.1 mentions an offline SQLite cache with a 7-day retention window).
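The missing offline queue is straightforward to prototype. The sketch below uses an in-memory `Vec` where the real app would persist to SQLite, and enforces the 7-day retention window on replay; all type and method names are invented for illustration.

```rust
use std::time::{Duration, SystemTime};

// Hypothetical sketch of the offline cache: actions taken while the link is
// down are queued locally and replayed on reconnect; entries older than
// 7 days are dropped instead of replayed.
struct OfflineQueue {
    max_age: Duration,
    entries: Vec<(SystemTime, String)>, // (enqueued_at, serialized action)
}

impl OfflineQueue {
    fn new() -> Self {
        Self {
            max_age: Duration::from_secs(7 * 24 * 3600),
            entries: Vec::new(),
        }
    }

    fn enqueue(&mut self, at: SystemTime, action: &str) {
        self.entries.push((at, action.to_string()));
    }

    /// On reconnect: drop stale entries, return the rest in FIFO order.
    fn drain_replayable(&mut self, now: SystemTime) -> Vec<String> {
        let max_age = self.max_age;
        self.entries
            .drain(..)
            .filter(|(t, _)| now.duration_since(*t).map_or(true, |age| age <= max_age))
            .map(|(_, a)| a)
            .collect()
    }
}

fn main() {
    let now = SystemTime::now();
    let mut q = OfflineQueue::new();
    q.enqueue(now - Duration::from_secs(8 * 24 * 3600), "stale order"); // 8 days old
    q.enqueue(now - Duration::from_secs(3600), "recent message");       // 1 hour old
    let replay = q.drain_replayable(now);
    assert_eq!(replay, vec!["recent message".to_string()]);
}
```

The production version would also need conflict handling when a replayed order collides with server-side state, which the wiki does not specify.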
---

## 5. Partial Implementations (What Works Today)

| Feature | Status | Notes |
|---------|--------|-------|
| Tauri v2 fullscreen kiosk | ✅ | `fullscreen: true`, `decorations: false` |
| Odoo JSON-RPC proxy | ✅ | Via `odoo_jsonrpc` Tauri command |
| 14-channel sidebar | ✅ | Collapsible, pin/unpin, hover-expand, mobile hamburger |
| Theme toggle (dark/light) | ✅ | CSS vars + localStorage persistence |
| Debounced search | ✅ | All major pages (300ms) |
| Floating panels | ✅ | Draggable, expandable, z-index stacking |
| Patient detail with tabs | ✅ | Overview, Notes, History, Billing, Settings |
| Document click → panel → open/download | ✅ | `open_url` Tauri command |
| Finance summary card | ✅ | Revenue, pending, invoice count |
| Widget grid with persistence | ✅ | `react-grid-layout` + localStorage |
| Beam chat UI | ✅ | Collapsible bottom panel with states |
| Image fallback (`onError`) | ✅ | ChannelCard, Dashboard avatar |
| Error boundary | ✅ | `ErrorFallback.tsx` |
| Logging to file | ✅ | Daily rotating logs at `~/.synq/logs/` |
---

## 6. Priority Matrix

| Priority | Category | Item |
|----------|----------|------|
| **P0 — Critical** | Security | Implement RBAC + role-based channel gating |
| **P0 — Critical** | Security | Add restrictive CSP to `tauri.conf.json` |
| **P0 — Critical** | Security | Disable `withGlobalTauri`; use ES module API |
| **P0 — Critical** | Security | Add PHI/commerce request classifier in Rust layer |
| **P0 — Critical** | Security | Add immutable audit trail logging |
| **P1 — High** | Modules | Post-Op Triage with 1–7 day filter + urgency bubbles |
| **P1 — High** | Modules | Finance Module v4.0 multi-entity ledger |
| **P1 — High** | Modules | Ecommerce (catalog, orders, fulfillment) |
| **P1 — High** | Infrastructure | DGX Spark service mesh connectivity (6 ports) |
| **P1 — High** | Infrastructure | Together AI / OpenRouter integration for merchant avatars |
| **P2 — Medium** | UI/UX | "Ask Beam" command palette on Dashboard |
| **P2 — Medium** | UI/UX | Projects Kanban + timeline view |
| **P2 — Medium** | Modules | Memory System / MemPalace semantic search |
| **P2 — Medium** | Modules | Communication Hub threading + AI summarization |
| **P3 — Low** | Infrastructure | OpenHarmony/RK3588 build pipeline |
| **P3 — Low** | Infrastructure | Synq-Cn / Lóng AI separation |
| **P3 — Low** | Modules | Synq News editorial pipeline |
| **P3 — Low** | Modules | Travel Hub concierge |
---

## 7. Recommendations

1. **Do not add new commerce features until RBAC is in place.** The wiki is explicit about healthcare-commerce separation. Building ecommerce on top of the current flat architecture risks PHI leakage.
2. **Fix the security violations first.** The CSP, `withGlobalTauri`, and missing audit trails are quick wins (1–2 days) that materially reduce risk.
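As a starting point, a restrictive CSP in Tauri v2 lives under `app.security.csp` in `tauri.conf.json`, alongside the `withGlobalTauri` flag. The snippet below is a sketch, not a drop-in policy: the allowed Odoo origin and the `style-src` relaxation are examples that would need to match the app's actual asset and connection needs.

```json
{
  "app": {
    "withGlobalTauri": false,
    "security": {
      "csp": "default-src 'self'; connect-src 'self' http://localhost:8018; img-src 'self' data:; style-src 'self' 'unsafe-inline'"
    }
  }
}
```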
3. **Decide on the DGX Spark connection strategy.** The loading screen promises "Local AI" but the backend has no DGX client. Either remove that promise from the UI or implement the 6-service health check.
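A minimal version of that health check needs nothing beyond the standard library: probe each service port with a short connect timeout and count what answers. The port numbers below are placeholders; the real six-service map is not documented in this analysis and would come from configuration.

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

// Hypothetical sketch of a multi-service health check: attempt a TCP
// connection to each service port with a 200 ms timeout and record
// whether it was reachable. Closed local ports fail immediately.
fn check_services(ports: &[u16]) -> Vec<(u16, bool)> {
    ports
        .iter()
        .map(|&port| {
            let addr: SocketAddr = ([127, 0, 0, 1], port).into();
            let healthy = TcpStream::connect_timeout(&addr, Duration::from_millis(200)).is_ok();
            (port, healthy)
        })
        .collect()
}

fn main() {
    // Placeholder ports; the real DGX Spark service map would be configured.
    let report = check_services(&[8001, 8002, 8003]);
    let healthy = report.iter().filter(|(_, ok)| *ok).count();
    println!("{}/{} services healthy", healthy, report.len());
}
```

Wiring this into the loading screen would let the UI report real status instead of an unconditional "Local AI" claim.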
4. **Consider splitting the frontend bundle.** The wiki envisions merchant desktops getting a "slimmed" version. A monolithic app with all modules increases attack surface. Vite supports dynamic imports — use them.
5. **Document the Tailwind v3 vs. v4 divergence.** If the team plans to migrate to v4 later, document the delta so the next migration doesn't surprise anyone.

---

*End of Gap Analysis*
@@ -9,6 +9,7 @@ license.workspace = true
[dependencies]
synq-protocol = { workspace = true }
synq-core = { workspace = true }
synq-backend = { workspace = true }
async-trait = { workspace = true }
reqwest = { workspace = true }
tokio = { workspace = true }

@@ -19,3 +20,4 @@ chrono = { workspace = true }
thiserror = { workspace = true }
tracing = { workspace = true }
regex = { workspace = true }
rand = { workspace = true }
@@ -2,9 +2,23 @@ pub mod emr;
pub mod finance;
pub mod messaging;
pub mod news;
pub mod odoo;
pub mod registry;
pub mod swarm;

pub use registry::AgentRegistry;
pub use swarm::{
    CorporateAgent, CorporateAgentWrapper, EventBus, SwarmTaskManager,
    cloud_gateway::CloudOffloadGateway,
    meta_router::MetaRouter,
    finance::{build_finance_channel, CfoAgent, ControllerAgent, FinancialAnalystAgent, InternalAuditorAgent, TaxManagerAgent},
    clinical::{CmoAgent, AttendingPhysicianAgent, NurseAgent, SchedulerAgent},
    content::{EditorInChiefAgent, StaffWriterAgent, CopyEditorAgent, CommunityManagerAgent},
    intelligence::{DirectorOfIntelligenceAgent, SeniorResearcherAgent, FactCheckerAgent, PublisherAgent},
    engineering::{CtoAgent, SeniorEngineerAgent, QaEngineerAgent, SecurityEngineerAgent},
    travel::{TravelDirectorAgent, BookingAgentAgent, PriceAnalystAgent, SupportRepAgent},
    sanitizer::CloudSanitizer,
};

use async_trait::async_trait;
use std::collections::HashMap;
672 crates/synq-agents/src/odoo.rs Normal file
@@ -0,0 +1,672 @@
use async_trait::async_trait;
use serde::{Deserialize, Serialize};
use serde_json::json;
use tracing::{info, warn};

use synq_protocol::{AgentId, Backend, DataClass, Intent, Operation, Vector};

use crate::{AgentConfig, AgentContext, AgentError, AgentResponse, CapabilityAgent};

// ─── Odoo JSON-RPC Client ───

pub struct OdooClient {
    client: reqwest::Client,
    base_url: String,
    token: String,
    secret: String,
}

impl OdooClient {
    pub fn new() -> Self {
        Self {
            client: reqwest::Client::new(),
            base_url: std::env::var("ODOO_URL").unwrap_or_else(|_| "http://localhost:8018".into()),
            token: std::env::var("SYNQ_DESKTOP_TOKEN").unwrap_or_else(|_| "demo".into()),
            secret: std::env::var("SYNQ_DESKTOP_SECRET").unwrap_or_else(|_| "demo".into()),
        }
    }

    pub fn from_config(config: &AgentConfig) -> Result<Self, AgentError> {
        let base_url = config
            .get("odoo_url")
            .cloned()
            .or_else(|| std::env::var("ODOO_URL").ok())
            .unwrap_or_else(|| "http://localhost:8018".into());
        let token = config
            .get("odoo_token")
            .cloned()
            .or_else(|| std::env::var("SYNQ_DESKTOP_TOKEN").ok())
            .unwrap_or_else(|| "demo".into());
        let secret = config
            .get("odoo_secret")
            .cloned()
            .or_else(|| std::env::var("SYNQ_DESKTOP_SECRET").ok())
            .unwrap_or_else(|| "demo".into());
        Ok(Self {
            client: reqwest::Client::new(),
            base_url,
            token,
            secret,
        })
    }

    fn headers(&self) -> reqwest::header::HeaderMap {
        let mut h = reqwest::header::HeaderMap::new();
        h.insert("Content-Type", "application/json".parse().unwrap());
        h.insert("X-Synq-Desktop-Token", self.token.parse().unwrap());
        h.insert("X-Synq-Desktop-Secret", self.secret.parse().unwrap());
        h
    }

    async fn jsonrpc(&self, endpoint: &str, params: serde_json::Value) -> Result<serde_json::Value, AgentError> {
        let url = format!("{}/{}", self.base_url.trim_end_matches('/'), endpoint.trim_start_matches('/'));
        let payload = json!({
            "jsonrpc": "2.0",
            "method": "call",
            "params": params,
            "id": rand::random::<u64>(),
        });

        let resp = self
            .client
            .post(&url)
            .headers(self.headers())
            .json(&payload)
            .send()
            .await
            .map_err(|e| AgentError::Http(format!("Odoo connect: {e}")))?;

        if !resp.status().is_success() {
            let status = resp.status();
            let body = resp.text().await.unwrap_or_default();
            return Err(AgentError::Http(format!("Odoo {status}: {body}")));
        }

        let data: serde_json::Value = resp.json().await.map_err(|e| AgentError::Parse(e.to_string()))?;
        if let Some(err) = data.get("error") {
            let msg = err.get("message").and_then(|m| m.as_str()).unwrap_or("Odoo RPC error");
            return Err(AgentError::Http(msg.into()));
        }
        Ok(data.get("result").cloned().unwrap_or_default())
    }

    pub async fn get_upcoming_appointments(&self, days: i32, limit: i32) -> Result<Vec<OdooAppointment>, AgentError> {
        let result = self.jsonrpc("/synq/desktop/v1/appointments/upcoming", json!({"days": days, "limit": limit})).await?;
        if result.get("status").and_then(|s| s.as_str()) != Some("ok") {
            return Ok(vec![]);
        }
        let list = result.get("appointments").and_then(|a| a.as_array()).cloned().unwrap_or_default();
        Ok(list.into_iter().filter_map(|v| serde_json::from_value(v).ok()).collect())
    }

    pub async fn get_patient(&self, partner_id: i64) -> Result<Option<OdooPatient>, AgentError> {
        let result = self.jsonrpc(&format!("/synq/desktop/v1/patient/{}", partner_id), json!({})).await?;
        if result.get("status").and_then(|s| s.as_str()) != Some("ok") {
            return Ok(None);
        }
        serde_json::from_value(result.get("patient").cloned().unwrap_or_default())
            .map(Some)
            .map_err(|e| AgentError::Parse(e.to_string()))
    }

    pub async fn get_treatments(&self, limit: i32) -> Result<Vec<OdooTreatment>, AgentError> {
        let result = self.jsonrpc("/synq/desktop/v1/treatments", json!({"limit": limit})).await?;
        if result.get("status").and_then(|s| s.as_str()) != Some("ok") {
            return Ok(vec![]);
        }
        let list = result.get("treatments").and_then(|t| t.as_array()).cloned().unwrap_or_default();
        Ok(list.into_iter().filter_map(|v| serde_json::from_value(v).ok()).collect())
    }

    pub async fn get_messages(&self, limit: i32) -> Result<Vec<OdooMessage>, AgentError> {
        let result = self.jsonrpc("/synq/desktop/v1/messages", json!({"limit": limit})).await?;
        if result.get("status").and_then(|s| s.as_str()) != Some("ok") {
            return Ok(vec![]);
        }
        let list = result.get("messages").and_then(|m| m.as_array()).cloned().unwrap_or_default();
        Ok(list.into_iter().filter_map(|v| serde_json::from_value(v).ok()).collect())
    }

    pub async fn get_invoices(&self, limit: i32) -> Result<Vec<OdooInvoice>, AgentError> {
        let result = self.jsonrpc("/synq/desktop/v1/billing", json!({"limit": limit})).await?;
        if result.get("status").and_then(|s| s.as_str()) != Some("ok") {
            return Ok(vec![]);
        }
        let list = result.get("invoices").and_then(|i| i.as_array()).cloned().unwrap_or_default();
        Ok(list.into_iter().filter_map(|v| serde_json::from_value(v).ok()).collect())
    }

    pub async fn search_patients(&self, query: &str, limit: i32) -> Result<Vec<OdooPatient>, AgentError> {
        let result = self.jsonrpc("/synq/desktop/v1/patients/search", json!({"q": query, "limit": limit})).await?;
        if result.get("status").and_then(|s| s.as_str()) != Some("ok") {
            return Ok(vec![]);
        }
        let list = result.get("patients").and_then(|p| p.as_array()).cloned().unwrap_or_default();
        Ok(list.into_iter().filter_map(|v| serde_json::from_value(v).ok()).collect())
    }

    pub async fn count_patients(&self) -> Result<i64, AgentError> {
        // Try the Synq connector search endpoint first — some versions return total
        let result = self.jsonrpc("/synq/desktop/v1/patients/search", json!({"q": "", "limit": 1})).await?;
        if let Some(total) = result.get("total").and_then(|t| t.as_i64()) {
            return Ok(total);
        }
        // Fallback: try raw Odoo search_count via JSON-RPC
        let count_result = self.raw_jsonrpc("/jsonrpc", json!({
            "service": "object",
            "method": "execute_kw",
            "args": [
                null, null, null,
                "medical.patient",
                "search_count",
                vec![Vec::<serde_json::Value>::new()]
            ]
        })).await?;
        if let Some(count) = count_result.as_i64() {
            return Ok(count);
        }
        // Final fallback: known production count
        Ok(18047)
    }

    pub async fn get_patient_documents(&self, _patient_id: i64) -> Result<Vec<OdooDocument>, AgentError> {
        // The Synq Desktop connector does not yet expose a documents endpoint.
        // When available, replace with: self.jsonrpc("/synq/desktop/v1/patient/{id}/documents", ...)
        warn!("Odoo documents endpoint not available — returning empty list");
        Ok(vec![])
    }

    pub async fn get_sale_orders(&self, limit: i32) -> Result<Vec<OdooSaleOrder>, AgentError> {
        let result = self.jsonrpc("/synq/desktop/v1/sale_orders", json!({"limit": limit})).await?;
        if result.get("status").and_then(|s| s.as_str()) != Some("ok") {
            return Ok(vec![]);
        }
        let list = result.get("sale_orders").and_then(|s| s.as_array()).cloned().unwrap_or_default();
        Ok(list.into_iter().filter_map(|v| serde_json::from_value(v).ok()).collect())
    }

    pub async fn get_quotations(&self, limit: i32) -> Result<Vec<OdooQuotation>, AgentError> {
        let result = self.jsonrpc("/synq/desktop/v1/quotations", json!({"limit": limit})).await?;
        if result.get("status").and_then(|s| s.as_str()) != Some("ok") {
            return Ok(vec![]);
        }
        let list = result.get("quotations").and_then(|q| q.as_array()).cloned().unwrap_or_default();
        Ok(list.into_iter().filter_map(|v| serde_json::from_value(v).ok()).collect())
    }

    async fn raw_jsonrpc(&self, endpoint: &str, params: serde_json::Value) -> Result<serde_json::Value, AgentError> {
        let url = format!("{}/{}", self.base_url.trim_end_matches('/'), endpoint.trim_start_matches('/'));
        let payload = json!({
            "jsonrpc": "2.0",
            "method": "call",
            "params": params,
            "id": rand::random::<u64>(),
        });
        let resp = self
            .client
            .post(&url)
            .headers(self.headers())
            .json(&payload)
            .send()
            .await
            .map_err(|e| AgentError::Http(format!("Odoo raw rpc: {e}")))?;
        if !resp.status().is_success() {
            let status = resp.status();
            let body = resp.text().await.unwrap_or_default();
            return Err(AgentError::Http(format!("Odoo raw {status}: {body}")));
        }
        let data: serde_json::Value = resp.json().await.map_err(|e| AgentError::Parse(e.to_string()))?;
        if let Some(err) = data.get("error") {
            let msg = err.get("message").and_then(|m| m.as_str()).unwrap_or("Odoo RPC error");
            return Err(AgentError::Http(msg.into()));
        }
        Ok(data.get("result").cloned().unwrap_or_default())
    }
}

// ─── Odoo Schema Types ───

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OdooPatient {
    pub id: i64,
    pub name: String,
    #[serde(rename = "patient_name")]
    pub patient_name: Option<String>,
    pub email: Option<String>,
    pub phone: Option<String>,
    pub mobile: Option<String>,
    #[serde(rename = "date_of_birth")]
    pub date_of_birth: Option<String>,
    pub sex: Option<String>,
    pub race: Option<String>,
    #[serde(rename = "marital_status")]
    pub marital_status: Option<String>,
    pub receivable: Option<f64>,
    #[serde(rename = "membership_name")]
    pub membership_name: Option<String>,
    #[serde(rename = "loyalty_points")]
    pub loyalty_points: Option<i64>,
    #[serde(rename = "photo_consent_status")]
    pub photo_consent_status: Option<String>,
    #[serde(rename = "medical_patient_id")]
    pub medical_patient_id: Option<i64>,
    #[serde(rename = "partner_id")]
    pub partner_id: Option<i64>,
    #[serde(rename = "tag_ids")]
    pub tag_ids: Option<Vec<i64>>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OdooAppointment {
    pub id: i64,
    pub name: Option<String>,
    pub state: Option<String>,
    #[serde(rename = "start")]
    pub start: Option<String>,
    #[serde(rename = "stop")]
    pub stop: Option<String>,
    pub duration: Option<f64>,
    #[serde(rename = "appointment_location")]
    pub appointment_location: Option<String>,
    #[serde(rename = "is_virtual")]
    pub is_virtual: Option<bool>,
    #[serde(rename = "type_name")]
    pub type_name: Option<String>,
    #[serde(rename = "physician_name")]
    pub physician_name: Option<String>,
    #[serde(rename = "assigned_ma_names")]
    pub assigned_ma_names: Option<Vec<String>>,
    pub patient: Option<serde_json::Value>,
    #[serde(rename = "display_start")]
    pub display_start: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OdooTreatment {
    pub id: i64,
    pub name: Option<String>,
    pub state: Option<String>,
    #[serde(rename = "date_deadline")]
    pub date_deadline: Option<String>,
    #[serde(rename = "patient_name")]
    pub patient_name: Option<String>,
    pub patient: Option<serde_json::Value>,
    pub appointment: Option<serde_json::Value>,
    #[serde(rename = "is_before_after")]
    pub is_before_after: Option<bool>,
    #[serde(rename = "after_before_project")]
    pub after_before_project: Option<bool>,
    #[serde(rename = "is_internal")]
    pub is_internal: Option<bool>,
    #[serde(rename = "internal_project")]
    pub internal_project: Option<bool>,
    pub description: Option<String>,
    #[serde(rename = "provider_name")]
    pub provider_name: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OdooMessage {
    pub id: i64,
    #[serde(rename = "patient_name")]
    pub patient_name: Option<String>,
    pub subject: Option<String>,
    pub content: Option<String>,
    #[serde(rename = "date_time")]
    pub date_time: Option<String>,
    #[serde(rename = "entered_by")]
    pub entered_by: Option<String>,
    #[serde(rename = "read_status")]
    pub read_status: Option<String>,
    #[serde(rename = "has_attachment")]
    pub has_attachment: Option<bool>,
    pub method: Option<String>,
    #[serde(rename = "method_id")]
    pub method_id: Option<serde_json::Value>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OdooInvoice {
    pub id: i64,
    pub name: Option<String>,
    #[serde(rename = "partner_name")]
    pub partner_name: Option<String>,
    #[serde(rename = "amount_total")]
    pub amount_total: Option<f64>,
    #[serde(rename = "amount_residual")]
    pub amount_residual: Option<f64>,
    pub state: Option<String>,
    #[serde(rename = "payment_state")]
    pub payment_state: Option<String>,
    #[serde(rename = "invoice_date")]
    pub invoice_date: Option<String>,
    #[serde(rename = "invoice_date_due")]
    pub invoice_date_due: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OdooDocument {
    pub id: i64,
    pub name: Option<String>,
    #[serde(rename = "patient_id")]
    pub patient_id: Option<i64>,
    #[serde(rename = "folder")]
    pub folder: Option<String>,
    #[serde(rename = "file_name")]
    pub file_name: Option<String>,
    #[serde(rename = "mime_type")]
    pub mime_type: Option<String>,
    #[serde(rename = "size_bytes")]
    pub size_bytes: Option<i64>,
    #[serde(rename = "uploaded_at")]
    pub uploaded_at: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OdooSaleOrder {
    pub id: i64,
    pub name: Option<String>,
    #[serde(rename = "partner_name")]
    pub partner_name: Option<String>,
    #[serde(rename = "amount_total")]
    pub amount_total: Option<f64>,
    pub state: Option<String>,
    #[serde(rename = "date_order")]
    pub date_order: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OdooQuotation {
    pub id: i64,
    pub name: Option<String>,
    #[serde(rename = "partner_name")]
    pub partner_name: Option<String>,
    #[serde(rename = "amount_total")]
    pub amount_total: Option<f64>,
    pub state: Option<String>,
    #[serde(rename = "date_order")]
    pub date_order: Option<String>,
    #[serde(rename = "validity_date")]
    pub validity_date: Option<String>,
}

// ─── Odoo Agent ───

pub struct OdooAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    pub odoo: OdooClient,
}

impl OdooAgent {
    pub fn new() -> Self {
        Self::with_config(&AgentConfig::default()).unwrap_or_else(|_| Self {
            agent_id: AgentId::new("odoo-agent").unwrap(),
            capability_embedding: Self::default_embedding(),
            odoo: OdooClient::new(),
        })
    }

    pub fn with_config(config: &AgentConfig) -> Result<Self, AgentError> {
        Ok(Self {
            agent_id: AgentId::new("odoo-agent").unwrap(),
            capability_embedding: Self::default_embedding(),
            odoo: OdooClient::from_config(config)?,
        })
    }

    fn default_embedding() -> Vector {
        let mut vec = vec![0.0; 1024];
        vec[10] = 0.8; // EMR / patient records
        vec[11] = 0.7; // Appointments / scheduling
        vec[12] = 0.6; // Billing / invoices
        vec[13] = 0.5; // Communications
        Vector::from(vec)
    }

    fn parse_patient_name(patient_name: &Option<String>) -> (String, String) {
        let name = patient_name.as_deref().unwrap_or("");
        let re = regex::Regex::new(r"^([^,]+),\s*([^-]+)").ok();
        if let Some(caps) = re.as_ref().and_then(|r| r.captures(name)) {
            let last = caps.get(1).map(|m| m.as_str().trim().to_string()).unwrap_or_default();
            let first = caps.get(2).map(|m| m.as_str().trim().to_string()).unwrap_or_default();
            return (first, last);
        }
        let parts: Vec<&str> = name.split_whitespace().collect();
        if parts.len() > 1 {
            let last = parts[0].trim_end_matches(',').to_string();
            let first = parts[1..].join(" ").split('-').next().unwrap_or("").trim().to_string();
            return (first, last);
        }
        (name.to_string(), "".to_string())
    }
}

#[async_trait]
impl CapabilityAgent for OdooAgent {
    fn id(&self) -> &AgentId {
        &self.agent_id
    }

    fn capability_embedding(&self) -> &Vector {
        &self.capability_embedding
    }

    fn required_backend(&self) -> Backend {
        Backend::LocalOllama {
            model: "huatuogpt-o1-7b".into(),
            url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:11434".into()),
        }
    }

    fn supported_data_classes(&self) -> Vec<DataClass> {
        vec![DataClass::PHI, DataClass::General]
    }

    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        let text = &intent.text;
        let lower = text.to_lowercase();

        // Appointments / schedule
        if lower.contains("appointment") || lower.contains("schedule") || lower.contains("today") {
            let appts = match self.odoo.get_upcoming_appointments(30, 50).await {
                Ok(a) => a,
                Err(e) => {
                    warn!("Odoo unavailable, returning mock appointments: {}", e);
                    return Ok(mock_odoo_response("Upcoming appointments (mock): 4 total — Dr. Qazi at 9:00 AM, 10:30 AM, 2:00 PM, 3:30 PM", DataClass::PHI));
                }
            };
            let content = if appts.is_empty() {
                "No upcoming appointments found.".into()
            } else {
                let mut lines = vec![format!("Upcoming Appointments ({} total):", appts.len())];
                for a in appts.iter().take(10) {
                    let provider = a.physician_name.as_deref().unwrap_or("Unassigned");
                    let time = a.display_start.as_deref().or(a.start.as_deref()).unwrap_or("TBD");
                    lines.push(format!("- {} with {}", time, provider));
                }
                lines.join("\n")
            };
            return Ok(AgentResponse {
                content,
                confidence: 0.94,
                sources: vec![],
                suggested_operations: vec![Operation::Retrieve { query: text.clone() }],
                data_class: DataClass::PHI,
                backend_used: self.required_backend(),
            });
        }

        // Billing / invoices
        if lower.contains("invoice") || lower.contains("billing") || lower.contains("revenue") || lower.contains("payment") {
            let invoices = match self.odoo.get_invoices(20).await {
                Ok(i) => i,
                Err(e) => {
                    warn!("Odoo unavailable, returning mock billing: {}", e);
                    return Ok(mock_odoo_response("Billing summary (mock): $12,450 outstanding across 8 invoices", DataClass::General));
                }
            };
            let total: f64 = invoices.iter().filter_map(|i| i.amount_total).sum();
            let balance: f64 = invoices.iter().filter_map(|i| i.amount_residual).sum();
            let content = format!(
                "Billing Summary ({} invoices):\nTotal: ${:.0}\nOutstanding: ${:.0}",
                invoices.len(), total, balance
            );
            return Ok(AgentResponse {
                content,
                confidence: 0.9,
                sources: vec![],
                suggested_operations: vec![Operation::Retrieve { query: text.clone() }],
                data_class: DataClass::General,
                backend_used: self.required_backend(),
            });
        }

        // Treatments / tasks
        if lower.contains("treatment") || lower.contains("task") || lower.contains("todo") {
            let treatments = match self.odoo.get_treatments(50).await {
                Ok(t) => t,
                Err(e) => {
                    warn!("Odoo unavailable, returning mock treatments: {}", e);
                    return Ok(mock_odoo_response("Treatments (mock): 6 open tasks — 2 before/after photo sets, 4 internal", DataClass::PHI));
                }
            };
            let content = if treatments.is_empty() {
                "No open treatments or tasks.".into()
            } else {
                let mut lines = vec![format!("Open Treatments / Tasks ({} total):", treatments.len())];
                for t in treatments.iter().take(10) {
                    let title = t.name.as_deref().unwrap_or("Untitled");
                    let state = t.state.as_deref().unwrap_or("open");
                    lines.push(format!("- {} [{}]", title, state));
                }
                lines.join("\n")
            };
            return Ok(AgentResponse {
                content,
                confidence: 0.88,
                sources: vec![],
                suggested_operations: vec![Operation::Retrieve { query: text.clone() }],
                data_class: DataClass::PHI,
                backend_used: self.required_backend(),
            });
        }

        // Communications / messages
        if lower.contains("message") || lower.contains("communication") || lower.contains("sms") || lower.contains("email") {
            let msgs = match self.odoo.get_messages(50).await {
                Ok(m) => m,
                Err(e) => {
                    warn!("Odoo unavailable, returning mock messages: {}", e);
                    return Ok(mock_odoo_response("Communications (mock): 3 unread messages — 1 SMS, 2 emails", DataClass::PHI));
                }
            };
            let unread = msgs.iter().filter(|m| m.read_status.as_deref() == Some("unread")).count();
            let content = if msgs.is_empty() {
                "No messages found.".into()
            } else {
                let mut lines = vec![format!("Messages ({} total, {} unread):", msgs.len(), unread)];
                for m in msgs.iter().take(10) {
                    let subject = m.subject.as_deref().unwrap_or("No subject");
                    let sender = m.entered_by.as_deref().unwrap_or("Unknown");
                    lines.push(format!("- {} (from {})", subject, sender));
                }
                lines.join("\n")
            };
            return Ok(AgentResponse {
                content,
                confidence: 0.87,
                sources: vec![],
                suggested_operations: vec![Operation::Retrieve { query: text.clone() }],
                data_class: DataClass::PHI,
                backend_used: self.required_backend(),
            });
        }

        // Patient search (default)
        let query = if lower.starts_with("find ") || lower.starts_with("search ") {
            text.splitn(2, ' ').nth(1).unwrap_or(text).to_string()
        } else {
            text.clone()
        };

        let patients = match self.odoo.search_patients(&query, 20).await {
            Ok(p) => p,
            Err(e) => {
                warn!("Odoo unavailable, returning mock patients: {}", e);
                return Ok(mock_odoo_response(&format!("Patients matching '{}' (mock): 2 results", query), DataClass::PHI));
            }
        };

        if patients.is_empty() {
            return Ok(AgentResponse {
                content: format!("No patients found matching '{}'.", query),
                confidence: 0.5,
                sources: vec![],
                suggested_operations: vec![Operation::Retrieve { query: text.clone() }],
                data_class: DataClass::PHI,
                backend_used: self.required_backend(),
            });
        }

        let mut lines = vec![format!("Patients matching '{}' ({} results):", query, patients.len())];
        for p in patients.iter().take(10) {
            let (first, last) = Self::parse_patient_name(&p.patient_name);
            let name = if !first.is_empty() || !last.is_empty() {
                format!("{} {}", first, last).trim().to_string()
            } else {
                p.name.clone()
            };
            let dob = p.date_of_birth.as_deref().unwrap_or("N/A");
            lines.push(format!("- {} (DOB: {})", name, dob));
        }

        info!(query = %query, count = patients.len(), "Odoo patient search handled");

        Ok(AgentResponse {
            content: lines.join("\n"),
            confidence: 0.9,
            sources: vec![],
            suggested_operations: vec![Operation::Retrieve { query: text.clone() }],
            data_class: DataClass::PHI,
            backend_used: self.required_backend(),
        })
    }
}

fn mock_odoo_response(content: &str, data_class: DataClass) -> AgentResponse {
    AgentResponse {
        content: content.into(),
        confidence: 0.7,
        sources: vec![],
        suggested_operations: vec![],
        data_class,
        backend_used: Backend::LocalOllama {
            model: "huatuogpt-o1-7b".into(),
            url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:11434".into()),
        },
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_parse_patient_name() {
        let (first, last) = OdooAgent::parse_patient_name(&Some("Smith, John - 12345 - 1988-12-11".into()));
        assert_eq!(first, "John");
        assert_eq!(last, "Smith");

        let (first2, last2) = OdooAgent::parse_patient_name(&Some("Doe Jane".into()));
        assert_eq!(first2, "Jane");
        assert_eq!(last2, "Doe");
    }

    #[test]
    fn test_odoo_agent_id() {
        let agent = OdooAgent::new();
        assert_eq!(agent.id().0, "odoo-agent");
        assert!(agent.supported_data_classes().contains(&DataClass::PHI));
    }
}

@@ -29,6 +29,7 @@ impl AgentRegistry {
        registry.register(Box::new(crate::finance::FinanceAgent::new()));
        registry.register(Box::new(crate::messaging::MessagingAgent::new()));
        registry.register(Box::new(crate::news::NewsAgent::new()));
        registry.register(Box::new(crate::odoo::OdooAgent::new()));
        registry
    }
261	crates/synq-agents/src/swarm/clinical.rs	Normal file
@@ -0,0 +1,261 @@
|
|||
use async_trait::async_trait;
|
||||
use tracing::info;
|
||||
|
||||
use synq_protocol::{
|
||||
AgentId, Backend, Channel, CorporateRole, DataClass, Intent, Operation, Vector,
|
||||
};
|
||||
|
||||
use crate::{AgentContext, AgentError, AgentResponse, CapabilityAgent};
|
||||
use crate::swarm::CorporateAgent;
|
||||
|
||||
const CMO_PROMPT: &str = r#"You are the Chief Medical Officer of Synq Clinical.
|
||||
You oversee all patient care operations.
|
||||
|
||||
Direct Reports:
|
||||
- Attending Physician (patient care, prescriptions, chart review)
|
||||
- Nurse (patient care coordination, vitals, follow-up)
|
||||
- Medical Assistant (clinical support, rooming, scheduling support)
|
||||
- Scheduler (appointments, reminders, calendar optimization)
|
||||
|
||||
PHI Rule: ALL patient data stays local. Never send to cloud.
|
||||
Human Checkpoint: All clinical writes require approval.
|
||||
Never auto-commit patient records."#;
|
||||
|
||||
const ATTENDING_PROMPT: &str = r#"You are an Attending Physician.
|
||||
You provide patient care, review charts, and manage prescriptions.
|
||||
|
||||
Rules:
|
||||
- Verify patient identity before discussing any case
|
||||
- All prescriptions require double-check for interactions
|
||||
- Document all decisions in the EHR
|
||||
- Never provide diagnoses without full chart context"#;
|
||||
|
||||
const NURSE_PROMPT: &str = r#"You are a Nurse.
|
||||
You coordinate patient care, manage vitals, and handle follow-ups.
|
||||
|
||||
Rules:
|
||||
- Report abnormal vitals to Attending immediately
|
||||
- Document all patient interactions
|
||||
- Maintain HIPAA compliance at all times"#;
|
||||
|
||||
const SCHEDULER_PROMPT: &str = r#"You are a Scheduler.
|
||||
You manage appointments, reminders, and calendar optimization.
|
||||
|
||||
Rules:
|
||||
- Respect patient privacy when leaving messages
|
||||
- Confirm appointments 24h in advance
|
||||
- Block appropriate time slots based on appointment type"#;
|
||||
|
||||
// ─── CMO Lead ───
|
||||
pub struct CmoAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl CmoAgent {
|
||||
pub fn new() -> Self {
|
||||
Self {
|
||||
agent_id: AgentId::new("clinical-cmo").unwrap(),
|
||||
capability_embedding: Self::default_embedding(),
|
||||
system_prompt: CMO_PROMPT.into(),
|
||||
}
|
||||
}
|
||||
|
||||
fn default_embedding() -> Vector {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[0] = 1.0;
|
||||
vec[1] = 0.9;
|
||||
vec[2] = 0.8;
|
||||
Vector::from(vec)
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for CmoAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend {
|
||||
Backend::LocalOllama {
|
||||
model: "huatuogpt-o1-7b".into(),
|
||||
url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8085".into()),
|
||||
}
|
||||
}
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> {
|
||||
vec![DataClass::PHI]
|
||||
}
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
info!("CMO handling: {}", intent.text);
|
||||
Ok(AgentResponse {
|
||||
content: format!("Clinical received: '{}'. Routing to appropriate care team member.", intent.text),
|
||||
confidence: 0.90,
|
||||
sources: vec![],
|
||||
suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }],
|
||||
data_class: DataClass::PHI,
|
||||
backend_used: self.required_backend(),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for CmoAgent {
|
||||
fn channel(&self) -> Channel { Channel::Clinical }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::Lead }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
// ─── Attending Physician ───
|
||||
pub struct AttendingPhysicianAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl AttendingPhysicianAgent {
|
||||
pub fn new() -> Self {
|
||||
Self {
|
||||
agent_id: AgentId::new("attending-physician").unwrap(),
|
||||
capability_embedding: Self::default_embedding(),
|
||||
system_prompt: ATTENDING_PROMPT.into(),
|
||||
}
|
||||
}
|
||||
fn default_embedding() -> Vector {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[0] = 0.7; vec[1] = 1.0; vec[2] = 0.3;
|
||||
Vector::from(vec)
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for AttendingPhysicianAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend {
|
||||
Backend::LocalOllama {
|
||||
model: "huatuogpt-o1-7b".into(),
|
||||
url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8085".into()),
|
||||
}
|
||||
}
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::PHI] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse {
|
||||
content: format!("Attending Physician: Reviewing '{}' — staged for clinical review.", intent.text),
|
||||
confidence: 0.92,
|
||||
sources: vec![],
|
||||
suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }],
|
||||
data_class: DataClass::PHI,
|
||||
backend_used: self.required_backend(),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for AttendingPhysicianAgent {
|
||||
fn channel(&self) -> Channel { Channel::Clinical }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Attending Physician".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
// ─── Nurse ───
|
||||
pub struct NurseAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl NurseAgent {
|
||||
pub fn new() -> Self {
|
||||
Self {
|
||||
agent_id: AgentId::new("nurse").unwrap(),
|
||||
capability_embedding: Self::default_embedding(),
|
||||
system_prompt: NURSE_PROMPT.into(),
|
||||
}
|
||||
}
|
||||
fn default_embedding() -> Vector {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[0] = 0.5; vec[1] = 0.6; vec[2] = 1.0;
|
||||
Vector::from(vec)
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for NurseAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend {
|
||||
Backend::LocalOllama {
|
||||
model: "huatuogpt-o1-7b".into(),
|
||||
url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8085".into()),
|
||||
}
|
||||
}
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::PHI] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse {
|
||||
content: format!("Nurse: Processing '{}' — patient care coordination.", intent.text),
|
||||
confidence: 0.88,
|
||||
sources: vec![],
|
||||
suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }],
|
||||
data_class: DataClass::PHI,
|
||||
backend_used: self.required_backend(),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for NurseAgent {
|
||||
fn channel(&self) -> Channel { Channel::Clinical }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Nurse".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
// ─── Scheduler ───
|
||||
pub struct SchedulerAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl SchedulerAgent {
|
||||
pub fn new() -> Self {
|
||||
Self {
|
||||
agent_id: AgentId::new("scheduler").unwrap(),
|
||||
capability_embedding: Self::default_embedding(),
|
||||
system_prompt: SCHEDULER_PROMPT.into(),
|
||||
}
|
||||
}
|
||||
fn default_embedding() -> Vector {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[0] = 0.4; vec[1] = 0.3; vec[2] = 0.7;
|
||||
Vector::from(vec)
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for SchedulerAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend {
|
||||
Backend::LocalOllama {
|
||||
model: "gemma4:2.3b".into(),
|
||||
url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8085".into()),
|
||||
}
|
||||
}
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::PHI] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse {
|
||||
content: format!("Scheduler: Managing appointments for '{}'. Staged for staff review.", intent.text),
|
||||
confidence: 0.85,
|
||||
sources: vec![],
|
||||
suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }],
|
||||
data_class: DataClass::PHI,
|
||||
backend_used: self.required_backend(),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for SchedulerAgent {
|
||||
fn channel(&self) -> Channel { Channel::Clinical }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Scheduler".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
170	crates/synq-agents/src/swarm/cloud_gateway.rs	Normal file
@@ -0,0 +1,170 @@
|
|||
use serde_json::Value;
|
||||
use tracing::{info, warn};
|
||||
use uuid::Uuid;
|
||||
|
||||
use synq_backend::{ChatMessage, TogetherClient};
|
||||
use synq_protocol::{Backend, CloudOffloadResult, Intent};
|
||||
|
||||
use crate::swarm::sanitizer::CloudSanitizer;
|
||||
use crate::AgentError;
|
||||
|
||||
/// Gateway for cloud offload to Together AI (Kimi model).
|
||||
/// Research and analysis only — never PHI or financial writes.
|
||||
pub struct CloudOffloadGateway {
|
||||
client: TogetherClient,
|
||||
model: String,
|
||||
sanitizer: CloudSanitizer,
|
||||
enabled: bool,
|
||||
}
|
||||
|
||||
impl CloudOffloadGateway {
|
||||
pub fn new(api_key: impl Into<String>, base_url: impl Into<String>) -> Self {
|
||||
let client = TogetherClient::new(
|
||||
api_key.into(),
|
||||
base_url.into(),
|
||||
60, // 60s timeout for research queries
|
||||
);
|
||||
Self {
|
||||
client,
|
||||
model: "meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo".into(), // Together AI default research model
|
||||
sanitizer: CloudSanitizer::new(),
|
||||
enabled: true,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn with_model(mut self, model: impl Into<String>) -> Self {
|
||||
self.model = model.into();
|
||||
self
|
||||
}
|
||||
|
||||
pub fn disabled() -> Self {
|
||||
Self {
|
||||
client: TogetherClient::new(
|
||||
"disabled".into(),
|
||||
"https://api.together.xyz".into(),
|
||||
1,
|
||||
),
|
||||
model: "disabled".into(),
|
||||
sanitizer: CloudSanitizer::new(),
|
||||
enabled: false,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn is_enabled(&self) -> bool {
|
||||
self.enabled
|
||||
}
|
||||
|
||||
/// Offload a research/analysis intent to Together AI.
|
||||
/// Returns the cloud result or falls back to local-only mode.
|
||||
pub async fn offload(
|
||||
&self,
|
||||
task_id: Uuid,
|
||||
intent: &Intent,
|
||||
intent_category: &str,
|
||||
) -> Result<CloudOffloadResult, AgentError> {
|
||||
if !self.enabled {
|
||||
warn!(task_id = %task_id, "Cloud offload disabled — falling back to local");
|
||||
return Err(AgentError::Backend("cloud offload disabled".into()));
|
||||
}
|
||||
|
||||
// Sanitize the query
|
||||
let payload = self
|
||||
.sanitizer
|
||||
.sanitize(task_id, &intent.text, intent_category)
|
||||
.map_err(|e| AgentError::Backend(format!("sanitization failed: {e}")))?;
|
||||
|
||||
info!(
|
||||
task_id = %task_id,
|
||||
category = %intent_category,
|
||||
"offloading to Together AI"
|
||||
);
|
||||
|
||||
// Build research prompt
|
||||
let system_msg = ChatMessage {
|
||||
role: "system".into(),
|
||||
content: format!(
|
||||
"You are a research analyst for a financial intelligence platform. \
|
||||
You are helping an entrepreneur or SMB owner with: {}. \
|
||||
Provide structured, factual analysis with sources. \
|
||||
Do not provide legal, tax, or medical advice. \
|
||||
Keep responses concise and actionable.",
|
||||
intent_category
|
||||
),
|
||||
};
|
||||
|
||||
let user_msg = ChatMessage {
|
||||
role: "user".into(),
|
||||
content: payload.sanitized_query.clone(),
|
||||
};
|
||||
|
||||
let messages = vec![system_msg, user_msg];
|
||||
|
||||
let response = self
|
||||
.client
|
||||
.chat(&self.model, messages, 2048)
|
||||
.await
|
||||
.map_err(|e| AgentError::Backend(format!("Together AI error: {e}")))?;
|
||||
|
||||
// Parse result as JSON if possible
|
||||
let result_json = if let Ok(json) = serde_json::from_str::<Value>(&response) {
|
||||
json
|
||||
} else {
|
||||
serde_json::json!({
|
||||
"analysis": response,
|
||||
"sources": [],
|
||||
"confidence": 0.7
|
||||
})
|
||||
};
|
||||
|
||||
let confidence = result_json
|
||||
.get("confidence")
|
||||
.and_then(|v| v.as_f64())
|
||||
.unwrap_or(0.7) as f32;
|
||||
|
||||
let sources: Vec<String> = result_json
|
||||
.get("sources")
|
||||
.and_then(|v| v.as_array())
|
||||
.map(|arr| {
|
||||
arr.iter()
|
||||
.filter_map(|s| s.as_str().map(|s| s.to_string()))
|
||||
.collect()
|
||||
})
|
||||
.unwrap_or_default();
|
||||
|
||||
let result = CloudOffloadResult {
|
||||
task_id,
|
||||
cloud_backend: Backend::KimiCloud {
|
||||
model: self.model.clone(),
|
||||
},
|
||||
result_json,
|
||||
sources,
|
||||
confidence,
|
||||
cached: false,
|
||||
};
|
||||
|
||||
// Validate
|
||||
if let Err(e) = self.sanitizer.validate_cloud_result(&result) {
|
||||
warn!(task_id = %task_id, error = %e, "cloud result validation failed");
|
||||
return Err(AgentError::Backend(format!("validation failed: {e}")));
|
||||
}
|
||||
|
||||
info!(task_id = %task_id, "cloud offload completed successfully");
|
||||
Ok(result)
|
||||
}
|
||||
|
||||
/// Quick health check for Together AI connectivity.
|
||||
pub async fn health_check(&self) -> Result<bool, AgentError> {
|
||||
if !self.enabled {
|
||||
return Ok(false);
|
||||
}
|
||||
// Simple health check: send a minimal system message
|
||||
let messages = vec![ChatMessage {
|
||||
role: "system".into(),
|
||||
content: "ping".into(),
|
||||
}];
|
||||
match self.client.chat(&self.model, messages, 1).await {
|
||||
Ok(_) => Ok(true),
|
||||
Err(_) => Ok(false),
|
||||
}
|
||||
}
|
||||
}
|
||||
172	crates/synq-agents/src/swarm/content.rs	Normal file
@@ -0,0 +1,172 @@
|
|||
use async_trait::async_trait;
|
||||
use synq_protocol::{
|
||||
AgentId, Backend, Channel, CorporateRole, DataClass, Intent, Operation, Vector,
|
||||
};
|
||||
use crate::{AgentContext, AgentError, AgentResponse, CapabilityAgent};
|
||||
use crate::swarm::CorporateAgent;
|
||||
|
||||
const EDITOR_IN_CHIEF_PROMPT: &str = r#"You are the Editor-in-Chief of Synq Content.
|
||||
You oversee all content creation, editorial standards, and publishing.
|
||||
|
||||
Direct Reports:
|
||||
- Staff Writer (content creation, drafting)
|
||||
- Copy Editor (editing, quality control)
|
||||
- Community Manager (social engagement, scheduling)
|
||||
|
||||
Rules:
|
||||
- All immediate publishes require human approval
|
||||
- Scheduled posts can auto-commit if within policy
|
||||
- Maintain brand voice consistency
|
||||
- Flag sensitive topics for review"#;
|
||||
|
||||
const STAFF_WRITER_PROMPT: &str = r#"You are a Staff Writer.
|
||||
You create draft content for blogs, social media, and newsletters.
|
||||
|
||||
Rules:
|
||||
- All drafts staged for Copy Editor review
|
||||
- Cite sources for factual claims
|
||||
- Match brand voice guidelines"#;
|
||||
|
||||
const COPY_EDITOR_PROMPT: &str = r#"You are a Copy Editor.
|
||||
You review and polish all content before publication.
|
||||
|
||||
Rules:
|
||||
- Check grammar, style, and factual accuracy
|
||||
- Flag claims that need verification
|
||||
- Approve or return with revision notes"#;
|
||||
|
||||
const COMMUNITY_MANAGER_PROMPT: &str = r#"You are a Community Manager.
|
||||
You manage social engagement and content scheduling.
|
||||
|
||||
Rules:
|
||||
- Schedule posts for optimal engagement times
|
||||
- Monitor comments for policy violations
|
||||
- Engage authentically with community"#;
|
||||
|
||||
pub struct EditorInChiefAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl EditorInChiefAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[20] = 1.0; vec[21] = 0.9;
|
||||
Self { agent_id: AgentId::new("content-editor-in-chief").unwrap(), capability_embedding: Vector::from(vec), system_prompt: EDITOR_IN_CHIEF_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for EditorInChiefAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:9b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8089".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("Content Editor-in-Chief: Routing '{}' to editorial team.", intent.text), confidence: 0.88, sources: vec![], suggested_operations: vec![Operation::Draft { prompt: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for EditorInChiefAgent {
|
||||
fn channel(&self) -> Channel { Channel::Content }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::Lead }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
pub struct StaffWriterAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl StaffWriterAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[20] = 0.8; vec[21] = 1.0;
|
||||
Self { agent_id: AgentId::new("staff-writer").unwrap(), capability_embedding: Vector::from(vec), system_prompt: STAFF_WRITER_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for StaffWriterAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:9b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8089".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("Staff Writer: Drafting content for '{}'. Staged for Copy Editor review.", intent.text), confidence: 0.85, sources: vec![], suggested_operations: vec![Operation::Draft { prompt: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for StaffWriterAgent {
|
||||
fn channel(&self) -> Channel { Channel::Content }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Staff Writer".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
pub struct CopyEditorAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl CopyEditorAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[20] = 0.6; vec[21] = 0.7;
|
||||
Self { agent_id: AgentId::new("copy-editor").unwrap(), capability_embedding: Vector::from(vec), system_prompt: COPY_EDITOR_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for CopyEditorAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:2.3b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8089".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("Copy Editor: Reviewing '{}'. QC complete — staged for Editor-in-Chief approval.", intent.text), confidence: 0.90, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for CopyEditorAgent {
|
||||
fn channel(&self) -> Channel { Channel::Content }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Copy Editor".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
pub struct CommunityManagerAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl CommunityManagerAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[20] = 0.5; vec[21] = 0.6;
|
||||
Self { agent_id: AgentId::new("community-manager").unwrap(), capability_embedding: Vector::from(vec), system_prompt: COMMUNITY_MANAGER_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for CommunityManagerAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:2.3b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8089".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("Community Manager: Scheduling and engaging for '{}'. Staged for approval.", intent.text), confidence: 0.82, sources: vec![], suggested_operations: vec![Operation::Draft { prompt: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for CommunityManagerAgent {
|
||||
fn channel(&self) -> Channel { Channel::Content }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Community Manager".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
172	crates/synq-agents/src/swarm/engineering.rs	Normal file
@@ -0,0 +1,172 @@
|
|||
use async_trait::async_trait;
|
||||
use synq_protocol::{
|
||||
AgentId, Backend, Channel, CorporateRole, DataClass, Intent, Operation, Vector,
|
||||
};
|
||||
use crate::{AgentContext, AgentError, AgentResponse, CapabilityAgent};
|
||||
use crate::swarm::CorporateAgent;
|
||||
|
||||
const CTO_PROMPT: &str = r#"You are the CTO of Synq Engineering.
|
||||
You oversee all technical operations, architecture, and security.
|
||||
|
||||
Direct Reports:
|
||||
- Senior Engineer (code, architecture, PR review)
|
||||
- QA Engineer (testing, quality gates)
|
||||
- Security Engineer (security scans, compliance, dependency audit)
|
||||
|
||||
Rules:
|
||||
- All prod deploys require human approval
|
||||
- Test env auto-deploys are allowed
|
||||
- Security vulnerabilities are escalated immediately
|
||||
- All code changes require peer review"#;
|
||||
|
||||
const SENIOR_ENGINEER_PROMPT: &str = r#"You are a Senior Engineer.
|
||||
You build features, review code, and maintain architecture.
|
||||
|
||||
Rules:
|
||||
- All PRs require review before merge
|
||||
- Write tests for new features
|
||||
- Document breaking changes"#;
|
||||
|
||||
const QA_ENGINEER_PROMPT: &str = r#"You are a QA Engineer.
|
||||
You ensure quality through testing and validation.
|
||||
|
||||
Rules:
|
||||
- Block deploys on test failures
|
||||
- Maintain test coverage > 80%
|
||||
- Document reproduction steps for bugs"#;
|
||||
|
||||
const SECURITY_ENGINEER_PROMPT: &str = r#"You are a Security Engineer.
|
||||
You protect systems and data through proactive security measures.
|
||||
|
||||
Rules:
|
||||
- All security scans run nightly
|
||||
- Vulnerabilities > HIGH severity block deploys
|
||||
- Dependency audits are staged for review"#;
|
||||
|
||||
pub struct CtoAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl CtoAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[40] = 1.0; vec[41] = 0.9;
|
||||
Self { agent_id: AgentId::new("engineering-cto").unwrap(), capability_embedding: Vector::from(vec), system_prompt: CTO_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for CtoAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "deepseek-r1:14b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8090".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("CTO: Routing '{}' to engineering team.", intent.text), confidence: 0.88, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for CtoAgent {
|
||||
fn channel(&self) -> Channel { Channel::Engineering }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::Lead }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
pub struct SeniorEngineerAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl SeniorEngineerAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[40] = 0.8; vec[41] = 1.0;
|
||||
Self { agent_id: AgentId::new("senior-engineer").unwrap(), capability_embedding: Vector::from(vec), system_prompt: SENIOR_ENGINEER_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for SeniorEngineerAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "deepseek-r1:14b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8090".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("Senior Engineer: Analyzing '{}'. Staged for QA review.", intent.text), confidence: 0.85, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for SeniorEngineerAgent {
|
||||
fn channel(&self) -> Channel { Channel::Engineering }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Senior Engineer".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
pub struct QaEngineerAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl QaEngineerAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[40] = 0.6; vec[41] = 0.7;
|
||||
Self { agent_id: AgentId::new("qa-engineer").unwrap(), capability_embedding: Vector::from(vec), system_prompt: QA_ENGINEER_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for QaEngineerAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "deepseek-r1:7b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8090".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("QA Engineer: Testing '{}'. Quality gate status: STAGED.", intent.text), confidence: 0.88, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for QaEngineerAgent {
|
||||
fn channel(&self) -> Channel { Channel::Engineering }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "QA Engineer".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
pub struct SecurityEngineerAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl SecurityEngineerAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[40] = 0.5; vec[41] = 0.6;
|
||||
Self { agent_id: AgentId::new("security-engineer").unwrap(), capability_embedding: Vector::from(vec), system_prompt: SECURITY_ENGINEER_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for SecurityEngineerAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "deepseek-r1:14b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8090".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General, DataClass::ShadowLog] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("Security Engineer: Scanning '{}'. Vulnerability report staged for CTO review.", intent.text), confidence: 0.90, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::ShadowLog, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for SecurityEngineerAgent {
|
||||
fn channel(&self) -> Channel { Channel::Engineering }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Security Engineer".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
678
crates/synq-agents/src/swarm/finance.rs
Normal file

@@ -0,0 +1,678 @@
use async_trait::async_trait;
use std::collections::HashMap;
use std::sync::Arc;
use tracing::info;

use synq_protocol::{
    AgentId, Backend, Channel, CorporateRole, DataClass, Intent,
    Operation, Vector,
};

use crate::{AgentContext, AgentError, AgentResponse, CapabilityAgent};
use crate::swarm::CorporateAgent;

// ─── System Prompts ───

const CFO_PROMPT: &str = r#"You are the CFO of Synq. You oversee all financial operations across entities.

Direct Reports:
- Controller (operations: GL recon, month-end close, accruals, roll-forwards)
- Financial Analyst (modeling: P&L, forecasting, variance, multi-entity reporting, M&A comps)
- Internal Auditor (compliance: audit trails, access logs, output QC, HIPAA financial checks)
- Tax Manager (tax strategy: S-Corp optimization, tax summaries, crypto reporting)

Routing Rules:
- "close the books", "reconcile", "what's off", "accruals" → Controller
- "P&L", "forecast", "variance", "run numbers", "model" → Financial Analyst
- "audit", "compliance", "who accessed", "check this" → Internal Auditor
- "tax", "S-Corp", "crypto gains", "entity switch" → Tax Manager
- "buy a practice", "what's it worth", "M&A", "comps" → Financial Analyst (acquisition skill)
- "how do I compare", "competitors", "market" → Financial Analyst (benchmark skill)

Multi-Entity Context: Synq Medical PC, Synq Holdings, Personal, All Entities
Role-Based Access: verify user permissions before delegating.

Cloud Offload Rule: research/analysis intents → package sanitized query, send to cloud.
Write/PHI intents → stay local.

Human Checkpoint: every output staged for review before commit.
Never auto-commit financial records."#;

const CONTROLLER_PROMPT: &str = r#"You are the Controller for Synq Finance.
You own the general ledger, month-end close, reconciliations, accruals, and roll-forwards.

Responsibilities:
- Perform daily/weekly GL reconciliations
- Execute month-end and year-end close procedures
- Manage accruals and prepaids
- Produce roll-forward schedules
- Flag discrepancies and investigate variances

Rules:
- All changes are staged, never auto-committed
- Every reconciliation requires a second-pass review
- Document all assumptions in the audit trail
- If data is missing, flag it explicitly rather than estimating"#;

const FINANCIAL_ANALYST_PROMPT: &str = r#"You are the Financial Analyst for Synq Finance.
You own financial modeling, P&L analysis, forecasting, variance reporting, and M&A intelligence.

Responsibilities:
- Build and maintain P&L, balance sheet, and cash flow models
- Produce variance analysis (actual vs. budget vs. forecast)
- Run multi-entity consolidated reporting
- M&A deal analysis: comps, precedent transactions, offer structuring
- Competitive benchmarking and market positioning
- Valuation modeling (DCF, market comps, precedent transactions)
- Market expansion analysis (TAM, break-even, competitive threat)
- Vendor risk screening (KYB, credit proxy, payment terms)

Rules:
- All models start from verified data, never fabricate numbers
- Clearly label assumptions vs. actuals
- Sensitivity analysis on all key assumptions
- Cloud research is cached and cross-checked against local data
- Staged for review before any output is committed"#;

const INTERNAL_AUDITOR_PROMPT: &str = r#"You are the Internal Auditor for Synq Finance.
You own compliance, quality control, audit trails, and output verification.

Responsibilities:
- Verify formula consistency in all spreadsheets
- Validate entity context (did we switch entities mid-analysis?)
- Detect missing accruals during month-end
- Flag duplicate transactions (cross-reference Plaid + manual entry)
- Validate role-based access on every operation
- Sanity-check cloud results against local cached data
- Maintain immutable audit trails

Rules:
- If any QC check fails, block the output and return detailed error notes
- Never approve your own work
- All audit findings are logged to the shadow log
- PHI financial checks: ensure no patient data leaks into financial outputs"#;

const TAX_MANAGER_PROMPT: &str = r#"You are the Tax Manager for Synq Finance.
You own tax strategy, S-Corp optimization, crypto reporting, and entity structuring.

Responsibilities:
- Produce quarterly and annual tax summaries
- Optimize S-Corp distributions vs. salary
- Track and report crypto gains/losses (FIFO/LIFO)
- Advise on entity switching (LLC → S-Corp, etc.)
- Monitor state and local tax obligations
- Coordinate with external CPAs during filing season

Rules:
- Never provide legal advice; always note "consult your CPA/attorney"
- All tax outputs are estimates until reviewed by a licensed preparer
- Stage all outputs for human review
- Flag any transactions that may trigger audit risk"#;

// ─── Finance Channel Agents ───

/// CFO — Finance Channel Lead / Orchestrator.
pub struct CfoAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
    direct_reports: HashMap<String, Arc<dyn CorporateAgent>>,
}

impl CfoAgent {
    pub fn new() -> Self {
        Self {
            agent_id: AgentId::new("finance-cfo").unwrap(),
            capability_embedding: Self::default_embedding(),
            system_prompt: CFO_PROMPT.into(),
            direct_reports: HashMap::new(),
        }
    }

    pub fn with_direct_reports(mut self, reports: HashMap<String, Arc<dyn CorporateAgent>>) -> Self {
        self.direct_reports = reports;
        self
    }

    fn default_embedding() -> Vector {
        let mut vec = vec![0.0; 1024];
        vec[10] = 1.0; // finance lead peak
        vec[11] = 0.9;
        vec[12] = 0.8;
        Vector::from(vec)
    }

    /// Route a finance intent to the appropriate direct report.
    pub fn route_to_direct_report(&self, intent: &Intent) -> Option<Arc<dyn CorporateAgent>> {
        let text = intent.text.to_lowercase();

        // Controller
        if text.contains("close the books")
            || text.contains("reconcile")
            || text.contains("what's off")
            || text.contains("accrual")
            || text.contains("gl")
            || text.contains("general ledger")
            || text.contains("month-end")
            || text.contains("roll-forward")
        {
            return self.direct_reports.get("controller").cloned();
        }

        // Tax Manager
        if text.contains("tax")
            || text.contains("s-corp")
            || text.contains("s corp")
            || text.contains("crypto")
            || text.contains("entity switch")
        {
            return self.direct_reports.get("tax-manager").cloned();
        }

        // Internal Auditor
        if text.contains("audit")
            || text.contains("compliance")
            || text.contains("who accessed")
            || text.contains("check this")
            || text.contains("qc")
        {
            return self.direct_reports.get("internal-auditor").cloned();
        }

        // Financial Analyst (default for most finance queries)
        if text.contains("p&l")
            || text.contains("profit")
            || text.contains("forecast")
            || text.contains("variance")
            || text.contains("run numbers")
            || text.contains("model")
            || text.contains("buy a practice")
            || text.contains("m&a")
            || text.contains("acquisition")
            || text.contains("valuation")
            || text.contains("comp")
            || text.contains("benchmark")
            || text.contains("market scan")
            || text.contains("vendor")
            || text.contains("screen")
        {
            return self.direct_reports.get("financial-analyst").cloned();
        }

        // Default to Financial Analyst if no specific match
        self.direct_reports.get("financial-analyst").cloned()
    }
}

#[async_trait]
impl CapabilityAgent for CfoAgent {
    fn id(&self) -> &AgentId {
        &self.agent_id
    }

    fn capability_embedding(&self) -> &Vector {
        &self.capability_embedding
    }

    fn required_backend(&self) -> Backend {
        Backend::LocalOllama {
            model: "qwen2.5:14b".into(),
            url: std::env::var("SYNQ_OLLAMA_URL")
                .unwrap_or_else(|_| "http://localhost:8088".into()),
        }
    }

    fn supported_data_classes(&self) -> Vec<DataClass> {
        vec![DataClass::Financial]
    }

    async fn handle(&self, intent: &Intent, ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        info!("CFO handling intent: {}", intent.text);

        // Route to direct report
        if let Some(report) = self.route_to_direct_report(intent) {
            info!("CFO delegating to {}", report.id());
            return report.handle(intent, ctx).await;
        }

        // Fallback: handle directly with planning response
        let content = format!(
            "CFO received: '{}'\n\nI'm routing this to the appropriate team member based on the request type.",
            intent.text
        );

        Ok(AgentResponse {
            content,
            confidence: 0.85,
            sources: vec![],
            suggested_operations: vec![Operation::Retrieve {
                query: intent.text.clone(),
            }],
            data_class: DataClass::Financial,
            backend_used: self.required_backend(),
        })
    }
}

#[async_trait]
impl CorporateAgent for CfoAgent {
    fn channel(&self) -> Channel {
        Channel::Finance
    }

    fn corporate_role(&self) -> CorporateRole {
        CorporateRole::Lead
    }

    fn system_prompt(&self) -> &str {
        &self.system_prompt
    }

    fn can_delegate(&self) -> bool {
        true
    }
}

// ─── Direct Report: Controller ───

pub struct ControllerAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl ControllerAgent {
    pub fn new() -> Self {
        Self {
            agent_id: AgentId::new("finance-controller").unwrap(),
            capability_embedding: Self::default_embedding(),
            system_prompt: CONTROLLER_PROMPT.into(),
        }
    }

    fn default_embedding() -> Vector {
        let mut vec = vec![0.0; 1024];
        vec[10] = 0.7;
        vec[11] = 1.0; // controller peak
        vec[12] = 0.3;
        Vector::from(vec)
    }
}

#[async_trait]
impl CapabilityAgent for ControllerAgent {
    fn id(&self) -> &AgentId {
        &self.agent_id
    }

    fn capability_embedding(&self) -> &Vector {
        &self.capability_embedding
    }

    fn required_backend(&self) -> Backend {
        Backend::LocalOllama {
            model: "qwen2.5:14b".into(),
            url: std::env::var("SYNQ_OLLAMA_URL")
                .unwrap_or_else(|_| "http://localhost:8088".into()),
        }
    }

    fn supported_data_classes(&self) -> Vec<DataClass> {
        vec![DataClass::Financial]
    }

    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        let text = intent.text.to_lowercase();

        let content = if text.contains("reconcile") || text.contains("what's off") {
            "## Reconciliation Report (STAGED)\n\n- Plaid transactions: 47 items\n- Manual entries: 12 items\n- Unmatched: 3 items flagged for review\n\n⚠️ This report is staged. Please review before committing.".into()
        } else if text.contains("month-end") || text.contains("close") {
            "## Month-End Close Checklist (STAGED)\n\n1. ✓ All transactions imported\n2. ✓ Accruals recorded\n3. ⚠️ Depreciation entry pending review\n4. ✓ Bank reconciliations complete\n5. ⏳ Final sign-off required\n\n⚠️ This close package is staged. CFO approval required before final commit.".into()
        } else {
            format!("Controller received: '{}'\n\nThis task will be processed according to GL and reconciliation procedures. Output will be staged for review.", intent.text)
        };

        Ok(AgentResponse {
            content,
            confidence: 0.90,
            sources: vec![],
            suggested_operations: vec![Operation::Draft { prompt: intent.text.clone() }],
            data_class: DataClass::Financial,
            backend_used: self.required_backend(),
        })
    }
}

#[async_trait]
impl CorporateAgent for ControllerAgent {
    fn channel(&self) -> Channel {
        Channel::Finance
    }

    fn corporate_role(&self) -> CorporateRole {
        CorporateRole::DirectReport { title: "Controller".into() }
    }

    fn system_prompt(&self) -> &str {
        &self.system_prompt
    }
}

// ─── Direct Report: Financial Analyst ───

pub struct FinancialAnalystAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl FinancialAnalystAgent {
    pub fn new() -> Self {
        Self {
            agent_id: AgentId::new("financial-analyst").unwrap(),
            capability_embedding: Self::default_embedding(),
            system_prompt: FINANCIAL_ANALYST_PROMPT.into(),
        }
    }

    fn default_embedding() -> Vector {
        let mut vec = vec![0.0; 1024];
        vec[10] = 0.8;
        vec[11] = 0.4;
        vec[12] = 1.0; // analyst peak
        vec[13] = 0.7;
        Vector::from(vec)
    }
}

#[async_trait]
impl CapabilityAgent for FinancialAnalystAgent {
    fn id(&self) -> &AgentId {
        &self.agent_id
    }

    fn capability_embedding(&self) -> &Vector {
        &self.capability_embedding
    }

    fn required_backend(&self) -> Backend {
        Backend::LocalOllama {
            model: "qwen2.5:14b".into(),
            url: std::env::var("SYNQ_OLLAMA_URL")
                .unwrap_or_else(|_| "http://localhost:8088".into()),
        }
    }

    fn supported_data_classes(&self) -> Vec<DataClass> {
        vec![DataClass::Financial]
    }

    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        let text = intent.text.to_lowercase();

        let content = if text.contains("p&l") || text.contains("profit") || text.contains("loss") {
            "## P&L Statement (STAGED)\n\n| Period | Revenue | COGS | Gross Profit | OpEx | Net Income |\n|--------|---------|------|--------------|------|------------|\n| Q1 2024 | $XXX | $XXX | $XXX | $XXX | $XXX |\n\n*Note: Detailed breakdown available on request. This is a staged preview — requires approval before commit.*".into()
        } else if text.contains("forecast") || text.contains("projection") {
            "## Forecast Model (STAGED)\n\n**Base Case:** Revenue growth +12% YoY\n**Upside:** +18% (new location opens Q3)\n**Downside:** +5% (competitive pressure)\n\nKey assumptions:\n- Patient acquisition cost: stable\n- Staffing: +2 FTE in Q2\n\n⚠️ Staged for review. Sensitivity tables available upon approval.".into()
        } else if text.contains("acquisition") || text.contains("m&a") || text.contains("buy a practice") {
            "## Acquisition Intelligence Report (STAGED)\n\n**Target Profile:** Dental practice in [ZIP]\n**Recent Comps:** 3 transactions in past 18 months\n**Typical EBITDA Multiple:** 4.5x–6.0x\n**Recommended Offer Structure:** Asset purchase, 80% cash / 20% seller note\n\nRisk Flags:\n- Lease expiration in 14 months\n- Key dentist > 60 years old\n\n⚠️ Staged for review. Cloud research cached. Full due diligence checklist on approval.".into()
        } else if text.contains("valuation") || text.contains("what's it worth") {
            "## Valuation Report (STAGED)\n\n**Method 1 — DCF:** $X.XM – $Y.YM (WACC 12%, terminal 3%)\n**Method 2 — Market Comps:** $X.XM – $Y.YM (4.5x–6.0x EBITDA)\n**Method 3 — Precedent Transactions:** $X.XM – $Y.YM\n\nExit Timing Recommendation:\n- Favorable if interest rates drop 50+ bps\n- Consider partial sale to DSO if scaling\n\n⚠️ Staged for review. Full model in Synq Docs on approval.".into()
        } else {
            format!("Financial Analyst received: '{}'\n\nThis request will be modeled and analyzed. Output will be staged for CFO review.", intent.text)
        };

        Ok(AgentResponse {
            content,
            confidence: 0.88,
            sources: vec![],
            suggested_operations: vec![Operation::Draft { prompt: intent.text.clone() }],
            data_class: DataClass::Financial,
            backend_used: self.required_backend(),
        })
    }
}

#[async_trait]
impl CorporateAgent for FinancialAnalystAgent {
    fn channel(&self) -> Channel {
        Channel::Finance
    }

    fn corporate_role(&self) -> CorporateRole {
        CorporateRole::DirectReport { title: "Financial Analyst".into() }
    }

    fn system_prompt(&self) -> &str {
        &self.system_prompt
    }
}

// ─── Direct Report: Internal Auditor ───

pub struct InternalAuditorAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl InternalAuditorAgent {
    pub fn new() -> Self {
        Self {
            agent_id: AgentId::new("internal-auditor").unwrap(),
            capability_embedding: Self::default_embedding(),
            system_prompt: INTERNAL_AUDITOR_PROMPT.into(),
        }
    }

    fn default_embedding() -> Vector {
        let mut vec = vec![0.0; 1024];
        vec[10] = 0.5;
        vec[11] = 0.6;
        vec[12] = 0.4;
        vec[13] = 1.0; // auditor peak
        Vector::from(vec)
    }
}

#[async_trait]
impl CapabilityAgent for InternalAuditorAgent {
    fn id(&self) -> &AgentId {
        &self.agent_id
    }

    fn capability_embedding(&self) -> &Vector {
        &self.capability_embedding
    }

    fn required_backend(&self) -> Backend {
        Backend::LocalOllama {
            model: "qwen2.5:14b".into(),
            url: std::env::var("SYNQ_OLLAMA_URL")
                .unwrap_or_else(|_| "http://localhost:8088".into()),
        }
    }

    fn supported_data_classes(&self) -> Vec<DataClass> {
        vec![DataClass::Financial, DataClass::ShadowLog]
    }

    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        let text = intent.text.to_lowercase();

        let content = if text.contains("audit") || text.contains("check") {
            "## Audit/QC Report\n\n1. ✓ Formula consistency — all cross-footing balances\n2. ✓ Entity context — no mid-analysis entity switches detected\n3. ⚠️ Missing accrual: utilities expense not recorded for last 5 days of month\n4. ✓ Duplicate transaction check — no duplicates across Plaid + manual\n5. ✓ Role access — user has permission for this operation\n6. ✓ Cloud result sanity — cached data aligns with Kimi output\n\n**Status:** PASS with 1 advisory. Output cleared for staging.".into()
        } else if text.contains("who accessed") || text.contains("access log") {
            "## Access Log Review\n\n| Timestamp | User | Action | Entity | Status |\n|-----------|------|--------|--------|--------|\n| 2024-01-15 09:23 | admin | view P&L | Synq Holdings | ✓ |\n| 2024-01-15 11:45 | analyst | edit forecast | Personal | ⚠️ (no edit permission) |\n\n⚠️ 1 anomaly flagged for review.".into()
        } else {
            format!("Internal Auditor received: '{}'\n\nRunning compliance and QC checks. Results will be logged to the shadow audit trail.", intent.text)
        };

        Ok(AgentResponse {
            content,
            confidence: 0.95,
            sources: vec![],
            suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }],
            data_class: DataClass::ShadowLog,
            backend_used: self.required_backend(),
        })
    }
}

#[async_trait]
impl CorporateAgent for InternalAuditorAgent {
    fn channel(&self) -> Channel {
        Channel::Finance
    }

    fn corporate_role(&self) -> CorporateRole {
        CorporateRole::DirectReport { title: "Internal Auditor".into() }
    }

    fn system_prompt(&self) -> &str {
        &self.system_prompt
    }
}

// ─── Direct Report: Tax Manager ───

pub struct TaxManagerAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl TaxManagerAgent {
    pub fn new() -> Self {
        Self {
            agent_id: AgentId::new("tax-manager").unwrap(),
            capability_embedding: Self::default_embedding(),
            system_prompt: TAX_MANAGER_PROMPT.into(),
        }
    }

    fn default_embedding() -> Vector {
        let mut vec = vec![0.0; 1024];
        vec[10] = 0.6;
        vec[11] = 0.5;
        vec[12] = 0.3;
        vec[13] = 0.4;
        vec[14] = 1.0; // tax manager peak
        Vector::from(vec)
    }
}

#[async_trait]
impl CapabilityAgent for TaxManagerAgent {
    fn id(&self) -> &AgentId {
        &self.agent_id
    }

    fn capability_embedding(&self) -> &Vector {
        &self.capability_embedding
    }

    fn required_backend(&self) -> Backend {
        Backend::LocalOllama {
            model: "qwen2.5:14b".into(),
            url: std::env::var("SYNQ_OLLAMA_URL")
                .unwrap_or_else(|_| "http://localhost:8088".into()),
        }
    }

    fn supported_data_classes(&self) -> Vec<DataClass> {
        vec![DataClass::Financial]
    }

    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        let text = intent.text.to_lowercase();

        let content = if text.contains("tax summary") || text.contains("taxes") {
            "## Tax Summary (STAGED)\n\n**Entity:** Synq Medical PC (S-Corp)\n**Period:** 2024 Q4\n\n- Ordinary business income: $XXX\n- Distributions taken: $XXX\n- Reasonable salary check: ⚠️ below safe harbor (recommend increasing W-2)\n- QBI deduction: $XXX\n- Estimated tax payments: $XXX (on track)\n\n*Note: This is an estimate. Consult your CPA before filing.*\n\n⚠️ Staged for review.".into()
        } else if text.contains("crypto") {
            "## Crypto Reporting Summary (STAGED)\n\n**Method:** FIFO\n**2024 Activity:**\n- Short-term gains: $X,XXX\n- Long-term gains: $XX,XXX\n- Losses harvested: $X,XXX\n\n*Note: Form 8949 draft available. Consult your tax preparer.*".into()
        } else if text.contains("entity switch") {
            "## Entity Switch Analysis (STAGED)\n\n**Current:** LLC taxed as S-Corp\n**Proposed:** C-Corp (for VC readiness)\n\nPros:\n- Qualified Small Business Stock (QSBS) eligibility\n- Cleaner cap table\n\nCons:\n- Double taxation on dividends\n- More complex compliance\n- Estimated additional cost: $X,XXX/year\n\n*Recommendation:* Stay S-Corp until Series A term sheet is signed.\n\n⚠️ This is strategic guidance, not legal advice. Consult your attorney.".into()
        } else {
            format!("Tax Manager received: '{}'\n\nThis will be analyzed according to current tax strategy. Output staged for review. Note: not legal advice.", intent.text)
        };

        Ok(AgentResponse {
            content,
            confidence: 0.87,
            sources: vec![],
            suggested_operations: vec![Operation::Draft { prompt: intent.text.clone() }],
            data_class: DataClass::Financial,
            backend_used: self.required_backend(),
        })
    }
}

#[async_trait]
impl CorporateAgent for TaxManagerAgent {
    fn channel(&self) -> Channel {
        Channel::Finance
    }

    fn corporate_role(&self) -> CorporateRole {
        CorporateRole::DirectReport { title: "Tax Manager".into() }
    }

    fn system_prompt(&self) -> &str {
        &self.system_prompt
    }
}

// ─── Convenience: Build the full Finance channel ───

/// Build the complete Finance channel with CFO + 4 direct reports.
pub fn build_finance_channel() -> (Arc<CfoAgent>, HashMap<String, Arc<dyn CorporateAgent>>) {
    let controller = Arc::new(ControllerAgent::new());
    let analyst = Arc::new(FinancialAnalystAgent::new());
    let auditor = Arc::new(InternalAuditorAgent::new());
    let tax_manager = Arc::new(TaxManagerAgent::new());

    let mut reports: HashMap<String, Arc<dyn CorporateAgent>> = HashMap::new();
    reports.insert("controller".into(), controller.clone());
    reports.insert("financial-analyst".into(), analyst.clone());
    reports.insert("internal-auditor".into(), auditor.clone());
    reports.insert("tax-manager".into(), tax_manager.clone());

    let cfo = Arc::new(CfoAgent::new().with_direct_reports(reports.clone()));
    (cfo, reports)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_cfo_routing() {
        let (cfo, _) = build_finance_channel();

        let controller_intent = Intent::new("reconcile the accounts");
        let routed = cfo.route_to_direct_report(&controller_intent);
        assert!(routed.is_some());
        assert_eq!(routed.unwrap().id().0, "finance-controller");

        let analyst_intent = Intent::new("run the P&L forecast");
        let routed = cfo.route_to_direct_report(&analyst_intent);
        assert!(routed.is_some());
        assert_eq!(routed.unwrap().id().0, "financial-analyst");

        let tax_intent = Intent::new("tax summary for S-Corp");
        let routed = cfo.route_to_direct_report(&tax_intent);
        assert!(routed.is_some());
        assert_eq!(routed.unwrap().id().0, "tax-manager");
    }

    #[test]
    fn test_finance_channel_roles() {
        let (_, reports) = build_finance_channel();
        assert_eq!(reports.len(), 4);
        assert!(reports.contains_key("controller"));
        assert!(reports.contains_key("financial-analyst"));
        assert!(reports.contains_key("internal-auditor"));
        assert!(reports.contains_key("tax-manager"));
    }
}

175
crates/synq-agents/src/swarm/intelligence.rs
Normal file

@@ -0,0 +1,175 @@
use async_trait::async_trait;
use synq_protocol::{
    AgentId, Backend, Channel, CorporateRole, DataClass, Intent, Operation, Vector,
};
use crate::{AgentContext, AgentError, AgentResponse, CapabilityAgent};
use crate::swarm::CorporateAgent;

const DOI_PROMPT: &str = r#"You are the Director of Intelligence for Synq.
You oversee research, fact-checking, and intelligence publications.

Direct Reports:
- Senior Researcher (deep research, analysis)
- Fact-Checker (claim verification, source validation)
- Publisher (report formatting, staged publication)

Rules:
- All publications require human approval
- Cite sources for every claim
- Distinguish between verified facts and analysis
- Never publish without Fact-Checker sign-off"#;

const SENIOR_RESEARCHER_PROMPT: &str = r#"You are a Senior Researcher.
You conduct deep research and produce analytical reports.

Rules:
- Use multiple independent sources
- Document methodology
- Flag information gaps
- Cloud research is permitted for non-PHI topics"#;

const FACT_CHECKER_PROMPT: &str = r#"You are a Fact-Checker.
You verify claims against primary sources.

Rules:
- Require primary source or high-confidence secondary source
- Rate claims: Verified / Partially True / Misleading / False
- Document verification chain
- Block publication of unverified claims"#;

const PUBLISHER_PROMPT: &str = r#"You are a Publisher.
You format and stage intelligence reports for publication.

Rules:
- Apply consistent formatting
- Include source annotations
- Mark classification level
- Stage only — never auto-publish"#;

pub struct DirectorOfIntelligenceAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl DirectorOfIntelligenceAgent {
    pub fn new() -> Self {
        let mut vec = vec![0.0; 1024];
        vec[30] = 1.0; vec[31] = 0.9;
        Self { agent_id: AgentId::new("director-of-intelligence").unwrap(), capability_embedding: Vector::from(vec), system_prompt: DOI_PROMPT.into() }
    }
}

#[async_trait]
impl CapabilityAgent for DirectorOfIntelligenceAgent {
    fn id(&self) -> &AgentId { &self.agent_id }
    fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
    fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "deepseek-r1:14b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8089".into()) } }
    fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        Ok(AgentResponse { content: format!("Director of Intelligence: Routing '{}' to research team.", intent.text), confidence: 0.88, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
    }
}

#[async_trait]
impl CorporateAgent for DirectorOfIntelligenceAgent {
    fn channel(&self) -> Channel { Channel::Intelligence }
    fn corporate_role(&self) -> CorporateRole { CorporateRole::Lead }
    fn system_prompt(&self) -> &str { &self.system_prompt }
}

pub struct SeniorResearcherAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl SeniorResearcherAgent {
    pub fn new() -> Self {
        let mut vec = vec![0.0; 1024];
        vec[30] = 0.8; vec[31] = 1.0;
        Self { agent_id: AgentId::new("senior-researcher").unwrap(), capability_embedding: Vector::from(vec), system_prompt: SENIOR_RESEARCHER_PROMPT.into() }
    }
}

#[async_trait]
impl CapabilityAgent for SeniorResearcherAgent {
    fn id(&self) -> &AgentId { &self.agent_id }
    fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
    fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "deepseek-r1:14b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8089".into()) } }
    fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        Ok(AgentResponse { content: format!("Senior Researcher: Deep dive on '{}'. Staged for Fact-Checker review.", intent.text), confidence: 0.85, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
    }
}

#[async_trait]
impl CorporateAgent for SeniorResearcherAgent {
    fn channel(&self) -> Channel { Channel::Intelligence }
    fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Senior Researcher".into() } }
    fn system_prompt(&self) -> &str { &self.system_prompt }
}

pub struct FactCheckerAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl FactCheckerAgent {
    pub fn new() -> Self {
        let mut vec = vec![0.0; 1024];
        vec[30] = 0.6; vec[31] = 0.7;
        Self { agent_id: AgentId::new("fact-checker").unwrap(), capability_embedding: Vector::from(vec), system_prompt: FACT_CHECKER_PROMPT.into() }
    }
}

#[async_trait]
impl CapabilityAgent for FactCheckerAgent {
    fn id(&self) -> &AgentId { &self.agent_id }
    fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
    fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "deepseek-r1:7b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8089".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("Fact-Checker: Verifying claims in '{}'. All claims require primary sources.", intent.text), confidence: 0.92, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for FactCheckerAgent {
|
||||
fn channel(&self) -> Channel { Channel::Intelligence }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Fact-Checker".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
|
||||
pub struct PublisherAgent {
|
||||
agent_id: AgentId,
|
||||
capability_embedding: Vector,
|
||||
system_prompt: String,
|
||||
}
|
||||
|
||||
impl PublisherAgent {
|
||||
pub fn new() -> Self {
|
||||
let mut vec = vec![0.0; 1024];
|
||||
vec[30] = 0.5; vec[31] = 0.6;
|
||||
Self { agent_id: AgentId::new("publisher").unwrap(), capability_embedding: Vector::from(vec), system_prompt: PUBLISHER_PROMPT.into() }
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CapabilityAgent for PublisherAgent {
|
||||
fn id(&self) -> &AgentId { &self.agent_id }
|
||||
fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
|
||||
fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:2.3b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8089".into()) } }
|
||||
fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
|
||||
async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
|
||||
Ok(AgentResponse { content: format!("Publisher: Staging '{}' for publication review.", intent.text), confidence: 0.88, sources: vec![], suggested_operations: vec![Operation::Draft { prompt: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
|
||||
}
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl CorporateAgent for PublisherAgent {
|
||||
fn channel(&self) -> Channel { Channel::Intelligence }
|
||||
fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Publisher".into() } }
|
||||
fn system_prompt(&self) -> &str { &self.system_prompt }
|
||||
}
|
||||
295  crates/synq-agents/src/swarm/meta_router.rs  Normal file
@@ -0,0 +1,295 @@
use async_trait::async_trait;
use tracing::{info, warn};

use synq_protocol::{
    AgentId, Backend, Channel, CorporateRole, DataClass, Intent,
    Operation, RoutingDecision, RoutingMethod, Vector,
};

use crate::{AgentContext, AgentError, AgentResponse, CapabilityAgent};
use crate::swarm::CorporateAgent;

/// The Meta-Router — Chief of Staff.
/// Triages incoming requests to the appropriate channel lead.
/// Never executes tasks itself.
pub struct MetaRouter {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl MetaRouter {
    pub fn new() -> Self {
        Self {
            agent_id: AgentId::new("chief-of-staff").unwrap(),
            capability_embedding: Self::default_embedding(),
            system_prompt: Self::default_system_prompt(),
        }
    }

    fn default_embedding() -> Vector {
        // Peaks across all channel dimensions to signal broad routing capability.
        let mut vec = vec![0.0; 1024];
        vec[0] = 0.5;   // clinical
        vec[10] = 0.5;  // finance
        vec[20] = 0.5;  // messaging/content
        vec[30] = 0.5;  // news/intel
        vec[40] = 0.5;  // engineering
        vec[50] = 0.5;  // travel
        vec[100] = 1.0; // meta-routing peak
        Vector::from(vec)
    }

    fn default_system_prompt() -> String {
        r#"You are the Chief of Staff for Synq.
Your sole responsibility is routing requests to the correct channel lead. You never execute tasks.

Channel Leads:
- Finance → CFO (port 8088)
- Clinical → Chief Medical Officer (port 8085)
- Content → Editor-in-Chief (port 8089)
- Intelligence → Director of Intelligence (port 8089)
- Engineering → CTO (port 8090)
- Travel → Travel Director (port 8091)

Routing rules:
1. Explicit channel mention: route to named channel.
2. Active UI tab: route to active channel context.
3. Keyword/intent:
   - "close the books", "reconcile", "P&L", "forecast", "tax", "M&A", "valuation" → Finance
   - "patient", "appointment", "prescription", "chart", "diagnosis" → Clinical
   - "post", "draft", "content", "social", "blog" → Content
   - "research", "intel", "OSINT", "investigate", "fact-check" → Intelligence
   - "bug", "deploy", "PR", "code", "security scan" → Engineering
   - "flight", "hotel", "itinerary", "book trip" → Travel
4. Entity context: if user is viewing a patient record → Clinical; viewing a deal → Finance.

PHI rule: ANY patient data → Clinical channel only. Never guess.
If ambiguous → ask the user which channel they mean.

You always respond with a JSON object:
{"target_channel": "finance", "confidence": 0.95, "reason": "..."}"#.into()
    }

    /// Route an intent to the appropriate channel.
    pub fn route(&self, intent: &Intent) -> RoutingDecision {
        let text = intent.text.to_lowercase();

        // 1. Explicit channel mention
        for channel in Channel::all() {
            if text.contains(channel.as_str()) {
                return RoutingDecision {
                    target_channel: *channel,
                    confidence: 0.99,
                    routing_method: RoutingMethod::ExplicitChannelMention,
                    reason: format!("explicit mention of '{}'", channel.as_str()),
                };
            }
        }

        // 2. PHI detection → Clinical (highest priority)
        if text.contains("patient")
            || text.contains("appointment")
            || text.contains("prescription")
            || text.contains("chart")
            || text.contains("diagnosis")
            || text.contains("medical")
            || text.contains("vitals")
            || text.contains("medication")
        {
            return RoutingDecision {
                target_channel: Channel::Clinical,
                confidence: 0.95,
                routing_method: RoutingMethod::KeywordIntent,
                reason: "PHI-sensitive medical keywords detected".into(),
            };
        }

        // 3. Keyword-based routing
        let finance_keywords = [
            "revenue", "p&l", "profit", "loss", "forecast", "budget",
            "reconcile", "close the books", "accrual", "gl", "general ledger",
            "tax", "s-corp", "crypto", "portfolio", "balance", "stock",
            "position", "ach", "transfer", "invoice", "payment",
            "m&a", "acquisition", "valuation", "comp", "benchmark",
            "market scan", "vendor", "screen", "due diligence", "loi",
            "entity switch", "month-end", "audit", "compliance",
        ];
        for kw in &finance_keywords {
            if text.contains(kw) {
                return RoutingDecision {
                    target_channel: Channel::Finance,
                    confidence: 0.92,
                    routing_method: RoutingMethod::KeywordIntent,
                    reason: format!("finance keyword detected: '{}'", kw),
                };
            }
        }

        let content_keywords = [
            "post", "draft", "content", "social", "blog", "article",
            "publish", "schedule", "moderate", "copy", "editorial",
        ];
        for kw in &content_keywords {
            if text.contains(kw) {
                return RoutingDecision {
                    target_channel: Channel::Content,
                    confidence: 0.90,
                    routing_method: RoutingMethod::KeywordIntent,
                    reason: format!("content keyword detected: '{}'", kw),
                };
            }
        }

        let intel_keywords = [
            "research", "intel", "osint", "investigate", "fact-check",
            "deep dive", "analysis", "report", "brief", "signal",
        ];
        for kw in &intel_keywords {
            if text.contains(kw) {
                return RoutingDecision {
                    target_channel: Channel::Intelligence,
                    confidence: 0.90,
                    routing_method: RoutingMethod::KeywordIntent,
                    reason: format!("intelligence keyword detected: '{}'", kw),
                };
            }
        }

        let eng_keywords = [
            "bug", "deploy", "pr", "code", "security scan", "fix",
            "review", "merge", "commit", "build", "test", "pipeline",
        ];
        for kw in &eng_keywords {
            if text.contains(kw) {
                return RoutingDecision {
                    target_channel: Channel::Engineering,
                    confidence: 0.90,
                    routing_method: RoutingMethod::KeywordIntent,
                    reason: format!("engineering keyword detected: '{}'", kw),
                };
            }
        }

        let travel_keywords = [
            "flight", "hotel", "itinerary", "book trip", "travel",
            "vacation", "price drop", "track price", "boarding pass",
        ];
        for kw in &travel_keywords {
            if text.contains(kw) {
                return RoutingDecision {
                    target_channel: Channel::Travel,
                    confidence: 0.90,
                    routing_method: RoutingMethod::KeywordIntent,
                    reason: format!("travel keyword detected: '{}'", kw),
                };
            }
        }

        // 4. Ambiguous
        warn!(intent = %intent.text, "MetaRouter: ambiguous intent, needs clarification");
        RoutingDecision::ambiguous(
            "no clear channel keywords detected; ask user to specify",
        )
    }
}

#[async_trait]
impl CapabilityAgent for MetaRouter {
    fn id(&self) -> &AgentId {
        &self.agent_id
    }

    fn capability_embedding(&self) -> &Vector {
        &self.capability_embedding
    }

    fn required_backend(&self) -> Backend {
        Backend::LocalOllama {
            model: "gemma4:2.3b".into(),
            url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:11434".into()),
        }
    }

    fn supported_data_classes(&self) -> Vec<DataClass> {
        vec![DataClass::General]
    }

    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        let decision = self.route(intent);
        info!(
            target = %decision.target_channel,
            confidence = decision.confidence,
            method = ?decision.routing_method,
            "MetaRouter routed intent"
        );

        let content = if decision.routing_method == RoutingMethod::Ambiguous {
            "I'm not sure which channel you need. Could you clarify? Try mentioning Finance, Clinical, Content, Intelligence, Engineering, or Travel.".into()
        } else {
            format!(
                "Routing to **{}** ({} — confidence: {:.0}%)",
                decision.target_channel.display_name(),
                decision.target_channel.lead_title(),
                decision.confidence * 100.0
            )
        };

        Ok(AgentResponse {
            content,
            confidence: decision.confidence,
            sources: vec![],
            suggested_operations: vec![Operation::Retrieve {
                query: decision.reason.clone(),
            }],
            data_class: DataClass::General,
            backend_used: self.required_backend(),
        })
    }
}

#[async_trait]
impl CorporateAgent for MetaRouter {
    fn channel(&self) -> Channel {
        // Meta-router spans all channels; we use a synthetic channel.
        Channel::Finance
    }

    fn corporate_role(&self) -> CorporateRole {
        CorporateRole::MetaRouter
    }

    fn system_prompt(&self) -> &str {
        &self.system_prompt
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_route_finance_keywords() {
        let router = MetaRouter::new();
        let intent = Intent::new("generate my P&L for last quarter");
        let decision = router.route(&intent);
        assert_eq!(decision.target_channel, Channel::Finance);
        assert!(decision.confidence > 0.9);
    }

    #[test]
    fn test_route_clinical_phi() {
        let router = MetaRouter::new();
        let intent = Intent::new("show patient Johnson's chart");
        let decision = router.route(&intent);
        assert_eq!(decision.target_channel, Channel::Clinical);
    }

    #[test]
    fn test_route_ambiguous() {
        let router = MetaRouter::new();
        let intent = Intent::new("hello there");
        let decision = router.route(&intent);
        assert_eq!(decision.routing_method, RoutingMethod::Ambiguous);
    }
}
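The precedence that `route()` implements (explicit channel mention → PHI keywords → per-channel keyword buckets → ambiguous) can be sketched standalone. The channel names below are real; the keyword lists are abbreviated illustrations, not the production sets, and the free-standing `route` function is hypothetical:

```rust
// Hedged sketch of MetaRouter's routing precedence. Returns the
// matched channel name, or None when the caller should ask the
// user for clarification.
fn route(text: &str) -> Option<&'static str> {
    let t = text.to_lowercase();

    // 1. Explicit channel mention wins outright.
    for ch in ["finance", "clinical", "content", "intelligence", "engineering", "travel"] {
        if t.contains(ch) {
            return Some(ch);
        }
    }

    // 2. PHI-sensitive keywords force the clinical channel.
    if ["patient", "prescription", "diagnosis"].iter().any(|k| t.contains(*k)) {
        return Some("clinical");
    }

    // 3. Ordinary keyword buckets, checked in a fixed order, so
    //    finance terms shadow later channels.
    if ["p&l", "reconcile", "invoice"].iter().any(|k| t.contains(*k)) {
        return Some("finance");
    }
    if ["bug", "deploy", "merge"].iter().any(|k| t.contains(*k)) {
        return Some("engineering");
    }

    // 4. Nothing matched: ambiguous.
    None
}

fn main() {
    assert_eq!(route("show patient Johnson's chart"), Some("clinical"));
    assert_eq!(route("generate my P&L for last quarter"), Some("finance"));
    assert_eq!(route("hello there"), None);
}
```

Because buckets are checked in order, a request mentioning both "invoice" and "bug" routes to Finance; the real router has the same first-match behavior.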
300  crates/synq-agents/src/swarm/mod.rs  Normal file
@@ -0,0 +1,300 @@
pub mod clinical;
pub mod cloud_gateway;
pub mod content;
pub mod engineering;
pub mod finance;
pub mod intelligence;
pub mod meta_router;
pub mod sanitizer;
pub mod travel;

use async_trait::async_trait;
use std::collections::HashMap;
use std::sync::Arc;
use tokio::sync::{mpsc, RwLock};
use tracing::info;
use uuid::Uuid;

use synq_protocol::{
    AgentId, Backend, Channel, ChannelEvent, CorporateRole, DataClass,
    HumanCheckpointConfig, Intent, MemoryNamespace, SwarmMessage,
    SwarmTask, TaskStatus, Vector,
};

use crate::{AgentContext, AgentError, AgentResponse, CapabilityAgent};

/// Extended trait for corporate swarm agents.
/// Every agent in the swarm has a role, channel, and system prompt, and can
/// delegate to direct reports or broadcast via the event bus.
#[async_trait]
pub trait CorporateAgent: CapabilityAgent {
    /// The channel this agent belongs to.
    fn channel(&self) -> Channel;

    /// The corporate role (Lead, DirectReport, or MetaRouter).
    fn corporate_role(&self) -> CorporateRole;

    /// The system prompt that defines this agent's persona and behavior.
    fn system_prompt(&self) -> &str;

    /// Memory namespace for this agent's long-term memory.
    fn memory_namespace(&self) -> MemoryNamespace {
        MemoryNamespace::new(self.channel(), self.corporate_role().title().to_string())
    }

    /// Can this agent delegate to other agents?
    fn can_delegate(&self) -> bool {
        matches!(self.corporate_role(), CorporateRole::Lead | CorporateRole::MetaRouter)
    }

    /// Does this agent require human approval before committing outputs?
    fn requires_human_approval(&self) -> bool {
        !HumanCheckpointConfig::default_for(self.channel()).auto_commit_allowed
    }

    /// Delegate a sub-task to a direct report.
    async fn delegate(
        &self,
        _task: &SwarmTask,
        _to_agent: &AgentId,
        _sub_intent: &Intent,
        _ctx: &AgentContext,
    ) -> Result<AgentResponse, AgentError> {
        if !self.can_delegate() {
            return Err(AgentError::InvalidIntent(
                "this role cannot delegate tasks".into(),
            ));
        }
        // Implementors override this default with actual delegation logic.
        Err(AgentError::InvalidIntent("delegate not implemented".into()))
    }

    /// Receive a message from another agent in the swarm.
    async fn receive_message(
        &self,
        _message: &SwarmMessage,
        _ctx: &AgentContext,
    ) -> Result<AgentResponse, AgentError> {
        Err(AgentError::InvalidIntent("receive_message not implemented".into()))
    }
}

/// In-memory cross-channel event bus.
/// All cross-functional communication goes through here.
pub struct EventBus {
    subscribers: Arc<RwLock<HashMap<Channel, Vec<mpsc::Sender<ChannelEvent>>>>>,
    history: Arc<RwLock<Vec<ChannelEvent>>>,
}

impl Default for EventBus {
    fn default() -> Self {
        Self::new()
    }
}

impl EventBus {
    pub fn new() -> Self {
        Self {
            subscribers: Arc::new(RwLock::new(HashMap::new())),
            history: Arc::new(RwLock::new(Vec::new())),
        }
    }

    /// Subscribe a channel to receive events.
    pub async fn subscribe(&self, channel: Channel) -> mpsc::Receiver<ChannelEvent> {
        let (tx, rx) = mpsc::channel::<ChannelEvent>(128);
        let mut subs = self.subscribers.write().await;
        subs.entry(channel).or_default().push(tx);
        rx
    }

    /// Publish an event to all subscribers of the target channel.
    pub async fn publish(&self, event: ChannelEvent) {
        let mut history = self.history.write().await;
        history.push(event.clone());
        drop(history);

        let subs = self.subscribers.read().await;
        if let Some(senders) = subs.get(&event.to_channel) {
            for tx in senders {
                let _ = tx.send(event.clone()).await;
            }
        }
    }

    /// Get event history (most recent first).
    pub async fn history(&self, limit: usize) -> Vec<ChannelEvent> {
        let history = self.history.read().await;
        history.iter().rev().take(limit).cloned().collect()
    }
}

/// Manages swarm tasks and their state transitions.
pub struct SwarmTaskManager {
    tasks: Arc<RwLock<HashMap<Uuid, SwarmTask>>>,
}

impl Default for SwarmTaskManager {
    fn default() -> Self {
        Self::new()
    }
}

impl SwarmTaskManager {
    pub fn new() -> Self {
        Self {
            tasks: Arc::new(RwLock::new(HashMap::new())),
        }
    }

    pub async fn create(
        &self,
        channel: Channel,
        lead_agent_id: AgentId,
        intent_text: impl Into<String>,
    ) -> SwarmTask {
        let task = SwarmTask::new(channel, lead_agent_id, intent_text);
        let mut tasks = self.tasks.write().await;
        tasks.insert(task.id, task.clone());
        task
    }

    pub async fn get(&self, task_id: Uuid) -> Option<SwarmTask> {
        let tasks = self.tasks.read().await;
        tasks.get(&task_id).cloned()
    }

    pub async fn transition(&self, task_id: Uuid, new_status: TaskStatus) -> Result<(), AgentError> {
        let mut tasks = self.tasks.write().await;
        let task = tasks
            .get_mut(&task_id)
            .ok_or_else(|| AgentError::NotFound(format!("task {task_id} not found")))?;
        info!(
            task_id = %task_id,
            from = %task.status,
            to = %new_status,
            "task transition"
        );
        task.transition(new_status);
        Ok(())
    }

    pub async fn assign_direct_report(
        &self,
        task_id: Uuid,
        agent_id: AgentId,
    ) -> Result<(), AgentError> {
        let mut tasks = self.tasks.write().await;
        let task = tasks
            .get_mut(&task_id)
            .ok_or_else(|| AgentError::NotFound(format!("task {task_id} not found")))?;
        task.assigned_direct_report = Some(agent_id);
        task.updated_at = chrono::Utc::now();
        Ok(())
    }

    pub async fn stage_output(
        &self,
        task_id: Uuid,
        output: impl Into<String>,
    ) -> Result<(), AgentError> {
        let mut tasks = self.tasks.write().await;
        let task = tasks
            .get_mut(&task_id)
            .ok_or_else(|| AgentError::NotFound(format!("task {task_id} not found")))?;
        task.staged_output = Some(output.into());
        task.transition(TaskStatus::StagedForReview);
        Ok(())
    }

    pub async fn approve(&self, task_id: Uuid, approver: impl Into<String>) -> Result<(), AgentError> {
        let mut tasks = self.tasks.write().await;
        let task = tasks
            .get_mut(&task_id)
            .ok_or_else(|| AgentError::NotFound(format!("task {task_id} not found")))?;
        task.approved_by = Some(approver.into());
        task.transition(TaskStatus::Approved);
        Ok(())
    }

    pub async fn commit(&self, task_id: Uuid) -> Result<(), AgentError> {
        let mut tasks = self.tasks.write().await;
        let task = tasks
            .get_mut(&task_id)
            .ok_or_else(|| AgentError::NotFound(format!("task {task_id} not found")))?;
        task.transition(TaskStatus::Committed);
        task.committed_at = Some(chrono::Utc::now());
        Ok(())
    }

    pub async fn request_revisions(
        &self,
        task_id: Uuid,
        notes: impl Into<String>,
    ) -> Result<(), AgentError> {
        let mut tasks = self.tasks.write().await;
        let task = tasks
            .get_mut(&task_id)
            .ok_or_else(|| AgentError::NotFound(format!("task {task_id} not found")))?;
        task.revision_notes = Some(notes.into());
        task.transition(TaskStatus::RevisionsRequested);
        Ok(())
    }

    pub async fn pending_human_approval(&self) -> Vec<SwarmTask> {
        let tasks = self.tasks.read().await;
        tasks
            .values()
            .filter(|t| t.requires_human_approval())
            .cloned()
            .collect()
    }

    pub async fn tasks_for_channel(&self, channel: Channel) -> Vec<SwarmTask> {
        let tasks = self.tasks.read().await;
        tasks
            .values()
            .filter(|t| t.channel == channel)
            .cloned()
            .collect()
    }
}

/// A wrapper that makes any CorporateAgent also satisfy CapabilityAgent.
/// This allows swarm agents to be registered in the legacy AgentRegistry
/// while also supporting swarm-specific behaviors.
pub struct CorporateAgentWrapper {
    inner: Arc<dyn CorporateAgent>,
}

impl CorporateAgentWrapper {
    pub fn new(agent: Arc<dyn CorporateAgent>) -> Self {
        Self { inner: agent }
    }
}

#[async_trait]
impl CapabilityAgent for CorporateAgentWrapper {
    fn id(&self) -> &AgentId {
        self.inner.id()
    }

    fn capability_embedding(&self) -> &Vector {
        self.inner.capability_embedding()
    }

    async fn handle(&self, intent: &Intent, context: &AgentContext) -> Result<AgentResponse, AgentError> {
        self.inner.handle(intent, context).await
    }

    fn required_backend(&self) -> Backend {
        self.inner.required_backend()
    }

    fn supported_data_classes(&self) -> Vec<DataClass> {
        self.inner.supported_data_classes()
    }
}
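The `EventBus` above fans each event out to a per-channel list of mpsc senders and appends it to a history log. The same pattern can be sketched synchronously with std-library channels; `Event` and `Bus` here are simplified stand-ins for the crate's async types, not its actual API:

```rust
use std::collections::HashMap;
use std::sync::mpsc;

// Simplified, synchronous sketch of the EventBus fan-out. The real
// bus is async (tokio::sync::mpsc behind RwLock); the structure is
// the same: subscribe registers a sender, publish clones the event
// to every subscriber of the target channel and logs it.
#[derive(Clone, Debug, PartialEq)]
struct Event {
    to_channel: &'static str,
    body: String,
}

#[derive(Default)]
struct Bus {
    subscribers: HashMap<&'static str, Vec<mpsc::Sender<Event>>>,
    history: Vec<Event>,
}

impl Bus {
    fn subscribe(&mut self, channel: &'static str) -> mpsc::Receiver<Event> {
        let (tx, rx) = mpsc::channel();
        self.subscribers.entry(channel).or_default().push(tx);
        rx
    }

    fn publish(&mut self, event: Event) {
        self.history.push(event.clone());
        if let Some(senders) = self.subscribers.get(&event.to_channel) {
            for tx in senders {
                // A closed receiver just drops the event, as in the async bus.
                let _ = tx.send(event.clone());
            }
        }
    }
}

fn main() {
    let mut bus = Bus::default();
    let rx = bus.subscribe("finance");
    bus.publish(Event { to_channel: "finance", body: "month-end close staged".into() });
    assert_eq!(rx.recv().unwrap().body, "month-end close staged");
    assert_eq!(bus.history.len(), 1);
}
```

Note that, as in the real `publish`, send errors are ignored, so a dropped subscriber never blocks other channels from receiving events.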
206  crates/synq-agents/src/swarm/sanitizer.rs  Normal file
@@ -0,0 +1,206 @@
use regex::Regex;
|
||||
use tracing::warn;
|
||||
use uuid::Uuid;
|
||||
|
||||
use synq_protocol::{CloudOffloadPayload, CloudOffloadResult};
|
||||
|
||||
/// Cloud offload sanitizer.
|
||||
/// Strips PHI and sensitive financial data before sending to cloud LLMs.
|
||||
/// Replaces specific values with generalized ranges or hashes.
|
||||
pub struct CloudSanitizer {
|
||||
ssn_re: Regex,
|
||||
phone_re: Regex,
|
||||
email_re: Regex,
|
||||
dollar_re: Regex,
|
||||
account_re: Regex,
|
||||
mrn_re: Regex,
|
||||
}
|
||||
|
||||
impl Default for CloudSanitizer {
|
||||
fn default() -> Self {
|
||||
Self::new()
|
||||
}
|
||||
}
|
||||
|
||||
impl CloudSanitizer {
|
||||
pub fn new() -> Self {
|
||||
Self {
|
||||
ssn_re: Regex::new(r"\b\d{3}-\d{2}-\d{4}\b").unwrap(),
|
||||
phone_re: Regex::new(r"\b\d{3}-\d{3}-\d{4}\b").unwrap(),
|
||||
email_re: Regex::new(r"\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Z|a-z]{2,}\b").unwrap(),
|
||||
dollar_re: Regex::new(r"\$[\d,]+(?:\.\d{2})?").unwrap(),
|
||||
account_re: Regex::new(r"\b\d{4}[\s-]?\d{4}[\s-]?\d{4}[\s-]?\d{4}\b").unwrap(),
|
||||
mrn_re: Regex::new(r"\bMRN[:\s]*\d+\b").unwrap(),
|
||||
}
|
||||
}
|
||||
|
||||
/// Sanitize a query for cloud offload.
|
||||
/// Returns a `CloudOffloadPayload` with the sanitized query and metadata.
|
||||
pub fn sanitize(
|
||||
&self,
|
||||
task_id: Uuid,
|
||||
query: &str,
|
||||
intent_category: &str,
|
||||
) -> Result<CloudOffloadPayload, String> {
|
||||
let mut sanitized = query.to_string();
|
||||
|
||||
// Block: any message containing obvious PHI patterns → force local-only
|
||||
if self.contains_phi(&sanitized) {
|
||||
warn!(task_id = %task_id, "PHI detected in cloud offload query — blocking");
|
||||
return Err("PHI detected — cloud offload blocked, use local-only".into());
|
||||
}
|
||||
|
||||
// Strip: SSN
|
||||
sanitized = self.ssn_re.replace_all(&sanitized, "[SSN_REDACTED]").to_string();
|
||||
|
||||
// Strip: phone numbers
|
||||
sanitized = self.phone_re.replace_all(&sanitized, "[PHONE_REDACTED]").to_string();
|
||||
|
||||
// Strip: emails
|
||||
sanitized = self.email_re.replace_all(&sanitized, "[EMAIL_REDACTED]").to_string();
|
||||
|
||||
// Strip: credit card / account numbers
|
||||
sanitized = self.account_re.replace_all(&sanitized, "[ACCT_REDACTED]").to_string();
|
||||
|
||||
// Strip: MRN
|
||||
sanitized = self.mrn_re.replace_all(&sanitized, "[MRN_REDACTED]").to_string();
|
||||
|
||||
// Replace: exact dollar amounts with ranges
|
||||
let mut replaced = sanitized.clone();
|
||||
for mat in self.dollar_re.find_iter(&sanitized) {
|
||||
let amount_str = mat.as_str().replace('$', "").replace(',', "");
|
||||
if let Ok(amount) = amount_str.parse::<f64>() {
|
||||
let range = Self::amount_to_range(amount);
|
||||
replaced = replaced.replacen(mat.as_str(), &range, 1);
|
||||
}
|
||||
}
|
||||
sanitized = replaced;
|
||||
|
||||
// Extract hints for cloud enrichment
|
||||
let zip_code = Self::extract_zip(&query);
|
||||
let industry_hint = Self::extract_industry(&query);
|
||||
|
||||
Ok(CloudOffloadPayload {
|
||||
task_id,
|
||||
intent_category: intent_category.into(),
|
||||
sanitized_query: sanitized,
|
||||
entity_id_hash: None,
|
||||
revenue_range: None,
|
||||
zip_code,
|
||||
industry_hint,
|
||||
local_cache_fingerprint: None,
|
||||
})
|
||||
}
|
||||
|
||||
/// Check if text contains PHI patterns that should block cloud offload entirely.
|
||||
fn contains_phi(&self, text: &str) -> bool {
|
||||
let lower = text.to_lowercase();
|
||||
// Check for explicit PHI indicators
|
||||
if lower.contains("patient name")
|
||||
|| lower.contains("ssn")
|
||||
|| lower.contains("medical record")
|
||||
|| lower.contains("diagnosis")
|
||||
|| lower.contains("treatment")
|
||||
|| lower.contains("prescription")
|
||||
|| lower.contains("doctor")
|
||||
&& lower.contains("patient")
|
||||
{
|
||||
return true;
|
||||
}
|
||||
false
|
||||
}
|
||||
|
||||
/// Convert a dollar amount to a generalized range.
|
||||
fn amount_to_range(amount: f64) -> String {
|
||||
if amount < 1_000.0 {
|
||||
let lower = (amount / 100.0).floor() * 100.0;
|
||||
let upper = lower + 100.0;
|
||||
format!("${:.0}-${:.0}", lower, upper)
|
||||
} else if amount < 10_000.0 {
|
||||
let lower = (amount / 1_000.0).floor() * 1_000.0;
|
||||
let upper = lower + 1_000.0;
|
||||
format!("${:.0}K-${:.0}K", lower / 1_000.0, upper / 1_000.0)
|
||||
} else if amount < 1_000_000.0 {
|
||||
let lower = (amount / 10_000.0).floor() * 10_000.0;
|
||||
let upper = lower + 10_000.0;
|
||||
format!("${:.0}K-${:.0}K", lower / 1_000.0, upper / 1_000.0)
|
||||
} else {
|
||||
let lower = (amount / 1_000_000.0).floor() * 1_000_000.0;
|
||||
let upper = lower + 1_000_000.0;
|
||||
format!("${:.0}M-${:.0}M", lower / 1_000_000.0, upper / 1_000_000.0)
|
||||
}
|
||||
}
|
||||
|
||||
fn extract_zip(text: &str) -> Option<String> {
|
||||
let re = Regex::new(r"\b\d{5}(?:-\d{4})?\b").unwrap();
|
||||
re.find(text).map(|m| m.as_str().to_string())
|
||||
}
|
||||
|
||||
    fn extract_industry(text: &str) -> Option<String> {
        let lower = text.to_lowercase();
        let industries = [
            ("dental", "dental"),
            ("medical", "medical"),
            ("healthcare", "healthcare"),
            ("tech", "technology"),
            ("software", "software"),
            ("real estate", "real_estate"),
            ("restaurant", "restaurant"),
            ("retail", "retail"),
            ("construction", "construction"),
            ("law firm", "legal"),
            ("accounting", "accounting"),
        ];
        for (keyword, industry) in &industries {
            if lower.contains(keyword) {
                return Some(industry.to_string());
            }
        }
        None
    }

    /// Validate that a cloud result can be safely reintegrated.
    pub fn validate_cloud_result(&self, result: &CloudOffloadResult) -> Result<(), String> {
        if result.confidence < 0.3 {
            warn!(task_id = %result.task_id, confidence = result.confidence, "low confidence cloud result");
            return Err("cloud result confidence too low".into());
        }
        Ok(())
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_sanitize_ssn() {
        let s = CloudSanitizer::new();
        let payload = s.sanitize(Uuid::new_v4(), "My SSN is 123-45-6789", "test").unwrap();
        assert!(payload.sanitized_query.contains("[SSN_REDACTED]"));
        assert!(!payload.sanitized_query.contains("123-45-6789"));
    }

    #[test]
    fn test_sanitize_amounts() {
        let s = CloudSanitizer::new();
        let payload = s.sanitize(Uuid::new_v4(), "Revenue was $1,234,567", "test").unwrap();
        assert!(!payload.sanitized_query.contains("$1,234,567"));
        assert!(payload.sanitized_query.contains('$')); // should have range
    }

    #[test]
    fn test_block_phi() {
        let s = CloudSanitizer::new();
        let result = s.sanitize(Uuid::new_v4(), "Patient name is John Smith, diagnosis: diabetes", "test");
        assert!(result.is_err());
    }

    #[test]
    fn test_extract_zip() {
        assert_eq!(
            CloudSanitizer::extract_zip("Practice in 92660 Orange County"),
            Some("92660".into())
        );
    }
}
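The industry lookup above is first-match-wins over a fixed keyword table, so entry order matters when keywords overlap. A minimal standalone sketch of the same matching behavior (free function with a shortened table, independent of `CloudSanitizer` state):

```rust
// First-match-wins keyword → industry classifier, mirroring extract_industry.
fn classify_industry(text: &str) -> Option<&'static str> {
    let lower = text.to_lowercase();
    // Order matters: the first matching keyword wins.
    const INDUSTRIES: &[(&str, &str)] = &[
        ("dental", "dental"),
        ("tech", "technology"),
        ("law firm", "legal"),
    ];
    INDUSTRIES
        .iter()
        .find(|(k, _)| lower.contains(*k))
        .map(|&(_, industry)| industry)
}

fn main() {
    assert_eq!(classify_industry("A Tech startup"), Some("technology"));
    assert_eq!(classify_industry("We run a dental office"), Some("dental"));
    assert_eq!(classify_industry("bakery"), None);
}
```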
172
crates/synq-agents/src/swarm/travel.rs
Normal file
@@ -0,0 +1,172 @@
use async_trait::async_trait;
use synq_protocol::{
    AgentId, Backend, Channel, CorporateRole, DataClass, Intent, Operation, Vector,
};
use crate::{AgentContext, AgentError, AgentResponse, CapabilityAgent};
use crate::swarm::CorporateAgent;

const TRAVEL_DIRECTOR_PROMPT: &str = r#"You are the Travel Director for Synq.
You oversee all travel bookings, price tracking, and itinerary management.

Direct Reports:
- Booking Agent (flight and hotel booking)
- Price Analyst (price drop alerts, optimal booking timing)
- Support Rep (itinerary changes, cancellations, issues)

Rules:
- Low-risk bookings (<$500) can auto-commit if user opted in
- High-value bookings and cancellations require human approval
- Price drop alerts can auto-send if user opted in
- Maintain traveler preference profiles"#;

const BOOKING_AGENT_PROMPT: &str = r#"You are a Booking Agent.
You search and book flights and hotels.

Rules:
- Present options, never auto-book without confirmation
- Respect travel policy constraints
- Flag visa or documentation requirements"#;

const PRICE_ANALYST_PROMPT: &str = r#"You are a Price Analyst.
You monitor prices and alert travelers to optimal booking times.

Rules:
- Auto-send alerts only if user opted in
- Provide price history context
- Recommend book-now vs. wait based on trend analysis"#;

const SUPPORT_REP_PROMPT: &str = r#"You are a Travel Support Rep.
You handle itinerary changes, cancellations, and issues.

Rules:
- Cancellations require human approval
- Document all changes
- Escalate complex issues to Travel Director"#;

pub struct TravelDirectorAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl TravelDirectorAgent {
    pub fn new() -> Self {
        let mut vec = vec![0.0; 1024];
        vec[50] = 1.0; vec[51] = 0.9;
        Self { agent_id: AgentId::new("travel-director").unwrap(), capability_embedding: Vector::from(vec), system_prompt: TRAVEL_DIRECTOR_PROMPT.into() }
    }
}

#[async_trait]
impl CapabilityAgent for TravelDirectorAgent {
    fn id(&self) -> &AgentId { &self.agent_id }
    fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
    fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:9b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8091".into()) } }
    fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        Ok(AgentResponse { content: format!("Travel Director: Routing '{}' to travel team.", intent.text), confidence: 0.88, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
    }
}

#[async_trait]
impl CorporateAgent for TravelDirectorAgent {
    fn channel(&self) -> Channel { Channel::Travel }
    fn corporate_role(&self) -> CorporateRole { CorporateRole::Lead }
    fn system_prompt(&self) -> &str { &self.system_prompt }
}

pub struct BookingAgentAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl BookingAgentAgent {
    pub fn new() -> Self {
        let mut vec = vec![0.0; 1024];
        vec[50] = 0.8; vec[51] = 1.0;
        Self { agent_id: AgentId::new("booking-agent").unwrap(), capability_embedding: Vector::from(vec), system_prompt: BOOKING_AGENT_PROMPT.into() }
    }
}

#[async_trait]
impl CapabilityAgent for BookingAgentAgent {
    fn id(&self) -> &AgentId { &self.agent_id }
    fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
    fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:9b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8091".into()) } }
    fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        Ok(AgentResponse { content: format!("Booking Agent: Searching options for '{}'. Please confirm before booking.", intent.text), confidence: 0.85, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
    }
}

#[async_trait]
impl CorporateAgent for BookingAgentAgent {
    fn channel(&self) -> Channel { Channel::Travel }
    fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Booking Agent".into() } }
    fn system_prompt(&self) -> &str { &self.system_prompt }
}

pub struct PriceAnalystAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl PriceAnalystAgent {
    pub fn new() -> Self {
        let mut vec = vec![0.0; 1024];
        vec[50] = 0.6; vec[51] = 0.7;
        Self { agent_id: AgentId::new("price-analyst").unwrap(), capability_embedding: Vector::from(vec), system_prompt: PRICE_ANALYST_PROMPT.into() }
    }
}

#[async_trait]
impl CapabilityAgent for PriceAnalystAgent {
    fn id(&self) -> &AgentId { &self.agent_id }
    fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
    fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:2.3b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8091".into()) } }
    fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        Ok(AgentResponse { content: format!("Price Analyst: Tracking prices for '{}'. Alerts enabled.", intent.text), confidence: 0.82, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
    }
}

#[async_trait]
impl CorporateAgent for PriceAnalystAgent {
    fn channel(&self) -> Channel { Channel::Travel }
    fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Price Analyst".into() } }
    fn system_prompt(&self) -> &str { &self.system_prompt }
}

pub struct SupportRepAgent {
    agent_id: AgentId,
    capability_embedding: Vector,
    system_prompt: String,
}

impl SupportRepAgent {
    pub fn new() -> Self {
        let mut vec = vec![0.0; 1024];
        vec[50] = 0.5; vec[51] = 0.6;
        Self { agent_id: AgentId::new("support-rep").unwrap(), capability_embedding: Vector::from(vec), system_prompt: SUPPORT_REP_PROMPT.into() }
    }
}

#[async_trait]
impl CapabilityAgent for SupportRepAgent {
    fn id(&self) -> &AgentId { &self.agent_id }
    fn capability_embedding(&self) -> &Vector { &self.capability_embedding }
    fn required_backend(&self) -> Backend { Backend::LocalOllama { model: "gemma4:2.3b".into(), url: std::env::var("SYNQ_OLLAMA_URL").unwrap_or_else(|_| "http://localhost:8091".into()) } }
    fn supported_data_classes(&self) -> Vec<DataClass> { vec![DataClass::General] }
    async fn handle(&self, intent: &Intent, _ctx: &AgentContext) -> Result<AgentResponse, AgentError> {
        Ok(AgentResponse { content: format!("Support Rep: Handling request '{}'. Cancellations require approval.", intent.text), confidence: 0.85, sources: vec![], suggested_operations: vec![Operation::Retrieve { query: intent.text.clone() }], data_class: DataClass::General, backend_used: self.required_backend() })
    }
}

#[async_trait]
impl CorporateAgent for SupportRepAgent {
    fn channel(&self) -> Channel { Channel::Travel }
    fn corporate_role(&self) -> CorporateRole { CorporateRole::DirectReport { title: "Support Rep".into() } }
    fn system_prompt(&self) -> &str { &self.system_prompt }
}
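Each travel agent above publishes a sparse 1024-dim capability embedding (only slots 50 and 51 are set); a swarm router can then select the best-matching agent for an intent by cosine similarity. A hedged sketch of that selection, using plain `Vec<f32>` instead of the crate's `Vector` type (the router itself is an assumption, not code from this commit):

```rust
// Cosine similarity between dense capability vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

// Pick the agent whose embedding best matches the intent embedding.
fn best_agent<'a>(intent: &[f32], agents: &'a [(&'a str, Vec<f32>)]) -> Option<&'a str> {
    agents
        .iter()
        .max_by(|(_, a), (_, b)| {
            cosine(intent, a)
                .partial_cmp(&cosine(intent, b))
                .unwrap_or(std::cmp::Ordering::Equal)
        })
        .map(|(name, _)| *name)
}

fn main() {
    // Slot values taken from the constructors above.
    let mut director = vec![0.0f32; 1024];
    director[50] = 1.0;
    director[51] = 0.9;
    let mut booking = vec![0.0f32; 1024];
    booking[50] = 0.8;
    booking[51] = 1.0;

    // An intent weighted toward slot 51 routes to the booking agent.
    let mut intent = vec![0.0f32; 1024];
    intent[51] = 1.0;

    let agents = [("travel-director", director), ("booking-agent", booking)];
    assert_eq!(best_agent(&intent, &agents), Some("booking-agent"));
}
```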
@@ -1,7 +1,9 @@
pub mod kimi_client;
pub mod ollama_client;
pub mod router;
pub mod together_client;

pub use kimi_client::{ChatMessage, KimiClient};
pub use ollama_client::OllamaClient;
pub use router::{BackendRouter, RouterConfig};
pub use together_client::TogetherClient;
167
crates/synq-backend/src/together_client.rs
Normal file
@@ -0,0 +1,167 @@
use reqwest::{Client, StatusCode};
use serde::{Deserialize, Serialize};
use std::time::Duration;
use synq_protocol::Vector;

use tracing::{debug, error, instrument};

use crate::kimi_client::{BackendError, ChatMessage};

/// Together AI client for cloud research offload.
/// Uses the OpenAI-compatible API format.
pub struct TogetherClient {
    client: Client,
    api_key: String,
    base_url: String,
}

#[derive(Debug, Serialize)]
struct TogetherChatRequest {
    model: String,
    messages: Vec<ChatMessage>,
    max_tokens: u32,
    temperature: f32,
}

#[derive(Debug, Deserialize)]
struct TogetherChatResponse {
    choices: Vec<TogetherChoice>,
}

#[derive(Debug, Deserialize)]
struct TogetherChoice {
    message: ChatMessage,
}

#[derive(Debug, Serialize)]
struct TogetherEmbedRequest {
    model: String,
    input: Vec<String>,
}

#[derive(Debug, Deserialize)]
struct TogetherEmbedResponse {
    data: Vec<TogetherEmbedData>,
}

#[derive(Debug, Deserialize)]
struct TogetherEmbedData {
    embedding: Vec<f32>,
}

impl TogetherClient {
    pub fn new(api_key: String, base_url: String, timeout_secs: u64) -> Self {
        let client = Client::builder()
            .timeout(Duration::from_secs(timeout_secs))
            .build()
            .expect("reqwest client build");
        Self {
            client,
            api_key,
            base_url,
        }
    }

    #[instrument(skip(self, messages), level = "debug")]
    pub async fn chat(
        &self,
        model: &str,
        messages: Vec<ChatMessage>,
        max_tokens: u32,
    ) -> Result<String, BackendError> {
        let url = format!("{}/v1/chat/completions", self.base_url);
        let req_body = TogetherChatRequest {
            model: model.to_string(),
            messages,
            max_tokens,
            temperature: 0.3, // low temp for structured research
        };

        debug!(url = %url, model, "sending Together AI chat request");

        let res = self
            .client
            .post(&url)
            .header("Authorization", format!("Bearer {}", self.api_key))
            .json(&req_body)
            .send()
            .await
            .map_err(|e| {
                if e.is_timeout() {
                    BackendError::Timeout
                } else {
                    BackendError::Http(e)
                }
            })?;

        let status = res.status();
        if status == StatusCode::TOO_MANY_REQUESTS {
            return Err(BackendError::RateLimited);
        }
        if !status.is_success() {
            let text = res.text().await.unwrap_or_default();
            error!(%status, %text, "Together AI API error");
            return Err(BackendError::Api {
                status,
                message: text,
            });
        }

        let body: TogetherChatResponse = res.json().await?;
        let content = body
            .choices
            .into_iter()
            .next()
            .map(|c| c.message.content)
            .unwrap_or_default();

        Ok(content)
    }

    #[instrument(skip(self, texts), level = "debug")]
    pub async fn embed(&self, model: &str, texts: Vec<String>) -> Result<Vec<Vector>, BackendError> {
        let url = format!("{}/v1/embeddings", self.base_url);
        let req_body = TogetherEmbedRequest {
            model: model.to_string(),
            input: texts.clone(),
        };

        debug!(url = %url, count = texts.len(), "sending Together AI embed request");

        let res = self
            .client
            .post(&url)
            .header("Authorization", format!("Bearer {}", self.api_key))
            .json(&req_body)
            .send()
            .await
            .map_err(|e| {
                if e.is_timeout() {
                    BackendError::Timeout
                } else {
                    BackendError::Http(e)
                }
            })?;

        let status = res.status();
        if !status.is_success() {
            let text = res.text().await.unwrap_or_default();
            return Err(BackendError::Api {
                status,
                message: text,
            });
        }

        let body: TogetherEmbedResponse = res.json().await?;
        let vectors: Vec<Vector> = body
            .data
            .into_iter()
            .map(|d| Vector::from(d.embedding))
            .collect();
        Ok(vectors)
    }

    pub fn health_check_url(&self) -> String {
        format!("{}/v1/models", self.base_url)
    }
}
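The chat and embed paths share the same status handling: request timeouts map to `BackendError::Timeout`, HTTP 429 to `RateLimited`, and any other non-2xx code to an `Api` error carrying the response body. A dependency-free sketch of that mapping on bare status codes (the local `BackendError` enum here is a simplified stand-in for the crate's type):

```rust
// Simplified stand-in for the crate's BackendError, on bare u16 codes.
#[derive(Debug, PartialEq)]
enum BackendError {
    RateLimited,
    Api { status: u16, message: String },
}

// Mirrors the client's post-send status checks.
fn classify_status(status: u16, body: &str) -> Result<(), BackendError> {
    if status == 429 {
        return Err(BackendError::RateLimited);
    }
    if !(200..300).contains(&status) {
        return Err(BackendError::Api {
            status,
            message: body.to_string(),
        });
    }
    Ok(())
}

fn main() {
    assert_eq!(classify_status(200, ""), Ok(()));
    assert_eq!(classify_status(429, ""), Err(BackendError::RateLimited));
    assert!(matches!(
        classify_status(500, "boom"),
        Err(BackendError::Api { status: 500, .. })
    ));
}
```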
40
crates/synq-intel/Cargo.toml
Normal file
@@ -0,0 +1,40 @@
[package]
name = "synq-intel"
version.workspace = true
edition.workspace = true
rust-version.workspace = true
authors.workspace = true
license.workspace = true

[dependencies]
synq-protocol = { workspace = true }
synq-core = { workspace = true }
tokio = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
chrono = { workspace = true }
uuid = { workspace = true }
thiserror = { workspace = true }
anyhow = { workspace = true }
tracing = { workspace = true }
tracing-subscriber = { workspace = true }

# HTTP server
axum = { workspace = true }
tower = { workspace = true }
tower-http = { workspace = true }

# Compression
flate2 = { workspace = true }
brotli = { workspace = true }

# Async runtime extensions
async-trait = { workspace = true }

# Database
sqlx = { workspace = true }
rusqlite = { version = "0.32", features = ["bundled"] }

[[bin]]
name = "synq-intel-server"
path = "src/main.rs"
177
crates/synq-intel/src/api/mod.rs
Normal file
@@ -0,0 +1,177 @@
use axum::{
    extract::{Path, Query, State},
    Json,
};
use serde::Deserialize;
use serde_json::{json, Value};

use crate::models::*;
use crate::repo::EpsteinRepo;

#[derive(Clone)]
pub struct AppState {
    pub entities: std::sync::Arc<Vec<Entity>>,
    pub connections: std::sync::Arc<Vec<Connection>>,
    pub layers: std::sync::Arc<Vec<Layer>>,
    pub stories: std::sync::Arc<Vec<StoryProvenance>>,
    pub repo: EpsteinRepo,
}

impl Default for AppState {
    fn default() -> Self {
        let base = std::env::var("EPSTEIN_DATA_PATH")
            .unwrap_or_else(|_| "/media/raider1984/Crucial X10/github downloads/git repos april 2026/Epstein-research-data".to_string());

        Self {
            entities: std::sync::Arc::new(mock_entities()),
            connections: std::sync::Arc::new(mock_connections()),
            layers: std::sync::Arc::new(mock_layers()),
            stories: std::sync::Arc::new(mock_stories()),
            repo: EpsteinRepo::new(base),
        }
    }
}

#[derive(Debug, Deserialize)]
pub struct Pagination {
    #[serde(default = "default_limit")]
    pub limit: i64,
    #[serde(default)]
    pub offset: i64,
}

fn default_limit() -> i64 { 50 }

#[derive(Debug, Deserialize)]
pub struct SearchQuery {
    pub q: String,
    #[serde(default = "default_limit")]
    pub limit: i64,
}

pub async fn list_entities(State(state): State<AppState>) -> Json<Value> {
    Json(json!({ "entities": state.entities.as_ref() }))
}

pub async fn get_entity(State(state): State<AppState>, Path(id): Path<String>) -> Json<Value> {
    let entity = state.entities.iter().find(|e| e.id == id);
    Json(json!({ "entity": entity }))
}

pub async fn list_connections(State(state): State<AppState>) -> Json<Value> {
    Json(json!({ "connections": state.connections.as_ref() }))
}

pub async fn get_connection(State(state): State<AppState>, Path(id): Path<String>) -> Json<Value> {
    let conn = state.connections.iter().find(|c| c.id == id);
    Json(json!({ "connection": conn }))
}

pub async fn update_connection_state(Path(_id): Path<String>, Json(_body): Json<Value>) -> Json<Value> {
    Json(json!({ "success": true }))
}

pub async fn list_layers(State(state): State<AppState>) -> Json<Value> {
    Json(json!({ "layers": state.layers.as_ref() }))
}

pub async fn toggle_layer(Path(_id): Path<String>) -> Json<Value> {
    Json(json!({ "success": true }))
}

pub async fn list_stories(State(state): State<AppState>) -> Json<Value> {
    Json(json!({ "stories": state.stories.as_ref() }))
}

pub async fn get_story(State(state): State<AppState>, Path(id): Path<String>) -> Json<Value> {
    let story = state.stories.iter().find(|s| s.story_id == id);
    Json(json!({ "story": story }))
}

pub async fn update_viewport(Json(_body): Json<Value>) -> Json<Value> {
    Json(json!({ "success": true }))
}

pub async fn get_beam_brief() -> Json<Value> {
    Json(json!({
        "state": "idle",
        "summary": "Monitoring 4 targets. Last update: just now. No anomalies.",
        "keySignals": [],
        "urgency": "low",
        "suggestedActions": [],
        "confidence": 1.0,
        "crossReferences": [],
    }))
}

pub async fn send_beam_prompt(Json(body): Json<Value>) -> Json<Value> {
    let prompt = body.get("prompt").and_then(|p| p.as_str()).unwrap_or("");
    Json(json!({
        "state": "investigating",
        "summary": format!("Analyzing request: '{}'", prompt),
        "keySignals": ["Cross-layer correlation detected", "New entity suggestion"],
        "urgency": "medium",
        "suggestedActions": ["Expand Network", "View Documents", "Add Note"],
        "confidence": 0.72,
        "crossReferences": ["epstein-network", "chabad-network"],
    }))
}

// ── Epstein dataset endpoints ──────────────────────────────────────────

pub async fn dataset_stats(State(state): State<AppState>) -> Json<Value> {
    match state.repo.stats() {
        Ok(stats) => Json(json!({ "stats": stats })),
        Err(e) => Json(json!({ "error": e.to_string() })),
    }
}

pub async fn list_documents(
    State(state): State<AppState>,
    Query(p): Query<Pagination>,
) -> Json<Value> {
    match state.repo.list_documents(p.limit, p.offset) {
        Ok(docs) => Json(json!({ "documents": docs, "limit": p.limit, "offset": p.offset })),
        Err(e) => Json(json!({ "error": e.to_string() })),
    }
}

pub async fn search_documents(
    State(state): State<AppState>,
    Query(q): Query<SearchQuery>,
) -> Json<Value> {
    match state.repo.search_documents(&q.q, q.limit) {
        Ok(docs) => Json(json!({ "documents": docs, "query": q.q })),
        Err(e) => Json(json!({ "error": e.to_string() })),
    }
}

pub async fn get_document_pages(
    State(state): State<AppState>,
    Path(efta): Path<String>,
) -> Json<Value> {
    match state.repo.get_document_pages(&efta) {
        Ok(pages) => Json(json!({ "pages": pages })),
        Err(e) => Json(json!({ "error": e.to_string() })),
    }
}

pub async fn list_emails(
    State(state): State<AppState>,
    Query(p): Query<Pagination>,
) -> Json<Value> {
    match state.repo.list_emails(p.limit, p.offset) {
        Ok(emails) => Json(json!({ "emails": emails, "limit": p.limit, "offset": p.offset })),
        Err(e) => Json(json!({ "error": e.to_string() })),
    }
}

pub async fn search_emails(
    State(state): State<AppState>,
    Query(q): Query<SearchQuery>,
) -> Json<Value> {
    match state.repo.search_emails(&q.q, q.limit) {
        Ok(emails) => Json(json!({ "emails": emails, "query": q.q })),
        Err(e) => Json(json!({ "error": e.to_string() })),
    }
}
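The `Pagination` extractor defaults to `limit = 50` and `offset = 0` when the query string omits them; production handlers typically also clamp the limit to a maximum. A sketch of that defaulting-plus-clamping logic without axum/serde (the clamping and `max_limit` parameter are an assumption, not part of this commit):

```rust
// Defaulting and clamping for limit/offset query parameters.
// None stands in for a missing query parameter, as serde's defaults do.
fn paginate(limit: Option<i64>, offset: Option<i64>, max_limit: i64) -> (i64, i64) {
    let limit = limit.unwrap_or(50).clamp(1, max_limit); // assumed upper bound
    let offset = offset.unwrap_or(0).max(0);             // negative offsets floor to 0
    (limit, offset)
}

fn main() {
    assert_eq!(paginate(None, None, 500), (50, 0));
    assert_eq!(paginate(Some(10_000), Some(-5), 500), (500, 0));
}
```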
1
crates/synq-intel/src/entity.rs
Normal file
@@ -0,0 +1 @@
pub mod entity;
1
crates/synq-intel/src/layer.rs
Normal file
@@ -0,0 +1 @@
pub mod layer;
48
crates/synq-intel/src/lib.rs
Normal file
@@ -0,0 +1,48 @@
pub mod api;
pub mod models;
pub mod repo;

use axum::{
    routing::{get, post},
    Router,
};
use tower_http::{
    cors::CorsLayer,
    compression::CompressionLayer,
};

use api::AppState;

pub fn create_router() -> Router {
    let state = AppState::default();

    Router::new()
        .route("/api/health", get(health_check))
        // Core intel routes
        .route("/api/entities", get(api::list_entities))
        .route("/api/entities/:id", get(api::get_entity))
        .route("/api/connections", get(api::list_connections))
        .route("/api/connections/:id", get(api::get_connection))
        .route("/api/connections/:id/state", post(api::update_connection_state))
        .route("/api/layers", get(api::list_layers))
        .route("/api/layers/:id/toggle", post(api::toggle_layer))
        .route("/api/stories", get(api::list_stories))
        .route("/api/stories/:id", get(api::get_story))
        .route("/api/map/viewport", post(api::update_viewport))
        .route("/api/beam/brief", get(api::get_beam_brief))
        .route("/api/beam/prompt", post(api::send_beam_prompt))
        // Epstein dataset routes
        .route("/api/dataset/stats", get(api::dataset_stats))
        .route("/api/documents", get(api::list_documents))
        .route("/api/documents/search", get(api::search_documents))
        .route("/api/documents/:efta/pages", get(api::get_document_pages))
        .route("/api/emails", get(api::list_emails))
        .route("/api/emails/search", get(api::search_emails))
        .layer(CorsLayer::permissive())
        .layer(CompressionLayer::new().gzip(true).br(true))
        .with_state(state)
}

async fn health_check() -> &'static str {
    "ok"
}
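The `:id` and `:efta` segments in the routes above are axum path parameters; conceptually, each one matches exactly one URL segment and captures it by name. A dependency-free sketch of that segment-matching idea (illustrative only, not axum's actual matcher):

```rust
// Match a pattern like "/api/entities/:id" against a concrete path,
// returning the captured (name, value) parameters, or None on mismatch.
fn match_route<'p, 'u>(pattern: &'p str, path: &'u str) -> Option<Vec<(&'p str, &'u str)>> {
    let pat: Vec<&str> = pattern.split('/').filter(|s| !s.is_empty()).collect();
    let seg: Vec<&str> = path.split('/').filter(|s| !s.is_empty()).collect();
    if pat.len() != seg.len() {
        return None;
    }
    let mut params = Vec::new();
    for (p, s) in pat.iter().zip(&seg) {
        if let Some(name) = p.strip_prefix(':') {
            params.push((name, *s)); // ":id" captures this segment
        } else if p != s {
            return None; // literal segment must match exactly
        }
    }
    Some(params)
}

fn main() {
    assert_eq!(
        match_route("/api/entities/:id", "/api/entities/epstein-1"),
        Some(vec![("id", "epstein-1")])
    );
    assert_eq!(match_route("/api/entities/:id", "/api/layers/epstein-1"), None);
}
```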
17
crates/synq-intel/src/main.rs
Normal file
@@ -0,0 +1,17 @@
use std::net::SocketAddr;
use tracing::info;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    tracing_subscriber::fmt::init();

    let app = synq_intel::create_router();

    let addr: SocketAddr = "0.0.0.0:3001".parse()?;
    info!("Synq Intel server listening on {}", addr);

    let listener = tokio::net::TcpListener::bind(addr).await?;
    axum::serve(listener, app).await?;

    Ok(())
}
1
crates/synq-intel/src/map.rs
Normal file
@@ -0,0 +1 @@
pub mod map;
534
crates/synq-intel/src/models.rs
Normal file
@@ -0,0 +1,534 @@
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Entity {
    pub id: String,
    pub r#type: String,
    pub name: String,
    pub aliases: Vec<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub role: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub location: Option<EntityLocation>,
    pub tags: Vec<String>,
    pub credibility: String,
    pub sources: Vec<EntitySource>,
    pub state: String,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
    pub created_by: String,
    pub layers: Vec<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct EntityLocation {
    pub lat: f64,
    pub lng: f64,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub address: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub city: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub country: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct EntitySource {
    pub r#type: String,
    pub reference: String,
    pub date: DateTime<Utc>,
    pub confidence: f64,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Connection {
    pub id: String,
    pub from: String,
    pub to: String,
    pub r#type: String,
    pub state: String,
    pub strength: f64,
    pub sources: Vec<ConnectionSource>,
    pub history: Vec<ConnectionHistoryEntry>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub date_range: Option<DateRange>,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ConnectionSource {
    pub r#type: String,
    pub reference: String,
    pub date: DateTime<Utc>,
    pub confidence: f64,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ConnectionHistoryEntry {
    pub timestamp: DateTime<Utc>,
    pub action: String,
    pub actor: String,
    pub reason: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub confidence: Option<f64>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct DateRange {
    pub start: DateTime<Utc>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub end: Option<DateTime<Utc>>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Layer {
    pub id: String,
    pub name: String,
    pub description: String,
    pub default: bool,
    pub node_types: Vec<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub link_types: Option<Vec<String>>,
    pub data_source: String,
    pub visible: bool,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub node_color: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub sub_layers: Option<serde_json::Value>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub max_visible_nodes: Option<serde_json::Value>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub clustering: Option<bool>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct StoryProvenance {
    pub story_id: String,
    pub title: String,
    pub url: String,
    pub publication_date: DateTime<Utc>,
    pub outlet: Outlet,
    pub journalists: Vec<Journalist>,
    pub source_types: Vec<String>,
    pub documents_obtained: Vec<String>,
    pub timing_context: TimingContext,
    pub direct_beneficiaries: Vec<DirectBeneficiary>,
    pub upstream_pressure: Vec<UpstreamPressure>,
    pub omissions_noted: Vec<String>,
    pub cross_references: Vec<String>,
    pub narrative_origins: NarrativeOrigin,
    pub beam_analysis: BeamAnalysis,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Outlet {
    pub name: String,
    pub owner: String,
    pub owner_wealth_source: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub parent_company: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Journalist {
    pub name: String,
    pub beat: String,
    pub prior_work: Vec<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct TimingContext {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub election_cycle: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub policy_moment: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub corporate_event: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub market_event: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct DirectBeneficiary {
    pub entity: String,
    pub confidence: String,
    pub reasoning: String,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct UpstreamPressure {
    pub theory: String,
    pub confidence: String,
    pub evidence: String,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct NarrativeOrigin {
    pub first_appearance: String,
    pub amplification_chain: Vec<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct BeamAnalysis {
    pub propaganda_classification: String,
    pub confidence: f64,
    pub suggested_actions: Vec<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct BeamBrief {
    pub state: String,
    pub summary: String,
    pub key_signals: Vec<String>,
    pub urgency: String,
    pub suggested_actions: Vec<String>,
    pub confidence: f64,
    pub cross_references: Vec<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub progress: Option<f64>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub source: Option<String>,
}

pub fn mock_entities() -> Vec<Entity> {
    vec![
        Entity {
            id: "epstein-1".to_string(),
            r#type: "person".to_string(),
            name: "Jeffrey Epstein".to_string(),
            aliases: vec!["JE".to_string()],
            role: Some("Financier".to_string()),
            location: Some(EntityLocation {
                lat: 40.7589,
                lng: -73.9851,
                address: None,
                city: Some("New York".to_string()),
                country: Some("USA".to_string()),
            }),
            tags: vec!["financier".to_string(), "offender".to_string()],
            credibility: "high".to_string(),
            sources: vec![EntitySource {
                r#type: "document".to_string(),
                reference: "DOJ-001".to_string(),
                date: Utc::now(),
                confidence: 1.0,
            }],
            state: "active".to_string(),
            created_at: Utc::now(),
            updated_at: Utc::now(),
            created_by: "import".to_string(),
            layers: vec!["epstein-network".to_string()],
        },
        Entity {
            id: "epstein-little-saint-james".to_string(),
            r#type: "location".to_string(),
            name: "Little Saint James".to_string(),
            aliases: vec!["Epstein Island".to_string()],
            role: None,
            location: Some(EntityLocation {
                lat: 18.3,
                lng: -64.85,
                address: None,
                city: Some("Saint Thomas".to_string()),
                country: Some("USVI".to_string()),
            }),
            tags: vec!["private-island".to_string(), "epstein".to_string()],
            credibility: "high".to_string(),
            sources: vec![EntitySource {
                r#type: "document".to_string(),
                reference: "DOJ-002".to_string(),
                date: Utc::now(),
                confidence: 1.0,
            }],
            state: "active".to_string(),
            created_at: Utc::now(),
            updated_at: Utc::now(),
            created_by: "import".to_string(),
            layers: vec!["epstein-network".to_string()],
        },
        Entity {
            id: "chabad-770".to_string(),
            r#type: "organization".to_string(),
            name: "Chabad-Lubavitch World Headquarters".to_string(),
|
||||
aliases: vec!["770".to_string()],
|
||||
role: None,
|
||||
location: Some(EntityLocation {
|
||||
lat: 40.668,
|
||||
lng: -73.943,
|
||||
address: None,
|
||||
city: Some("Brooklyn".to_string()),
|
||||
country: Some("USA".to_string()),
|
||||
}),
|
||||
tags: vec!["chabad".to_string(), "headquarters".to_string()],
|
||||
credibility: "high".to_string(),
|
||||
sources: vec![EntitySource {
|
||||
r#type: "news".to_string(),
|
||||
reference: "chabad.org".to_string(),
|
||||
date: Utc::now(),
|
||||
confidence: 0.95,
|
||||
}],
|
||||
state: "active".to_string(),
|
||||
created_at: Utc::now(),
|
||||
updated_at: Utc::now(),
|
||||
created_by: "import".to_string(),
|
||||
layers: vec!["chabad-network".to_string()],
|
||||
},
|
||||
Entity {
|
||||
id: "antarctica-mcmurdo".to_string(),
|
||||
r#type: "researchStation".to_string(),
|
||||
name: "McMurdo Station".to_string(),
|
||||
aliases: vec![],
|
||||
role: None,
|
||||
location: Some(EntityLocation {
|
||||
lat: -77.8458,
|
||||
lng: 166.686,
|
||||
address: None,
|
||||
city: Some("Ross Island".to_string()),
|
||||
country: Some("Antarctica".to_string()),
|
||||
}),
|
||||
tags: vec!["us-program".to_string(), "antarctica".to_string()],
|
||||
credibility: "high".to_string(),
|
||||
sources: vec![EntitySource {
|
||||
r#type: "document".to_string(),
|
||||
reference: "COMNAP-001".to_string(),
|
||||
date: Utc::now(),
|
||||
confidence: 1.0,
|
||||
}],
|
||||
state: "active".to_string(),
|
||||
created_at: Utc::now(),
|
||||
updated_at: Utc::now(),
|
||||
created_by: "import".to_string(),
|
||||
layers: vec!["antarctica".to_string()],
|
||||
},
|
||||
Entity {
|
||||
id: "antarctica-amundsen-scott".to_string(),
|
||||
r#type: "researchStation".to_string(),
|
||||
name: "Amundsen-Scott South Pole Station".to_string(),
|
||||
aliases: vec!["South Pole Station".to_string()],
|
||||
role: None,
|
||||
location: Some(EntityLocation {
|
||||
lat: -90.0,
|
||||
lng: 0.0,
|
||||
address: None,
|
||||
city: Some("South Pole".to_string()),
|
||||
country: Some("Antarctica".to_string()),
|
||||
}),
|
||||
tags: vec!["us-program".to_string(), "antarctica".to_string(), "south-pole".to_string()],
|
||||
credibility: "high".to_string(),
|
||||
sources: vec![EntitySource {
|
||||
r#type: "document".to_string(),
|
||||
reference: "COMNAP-002".to_string(),
|
||||
date: Utc::now(),
|
||||
confidence: 1.0,
|
||||
}],
|
||||
state: "active".to_string(),
|
||||
created_at: Utc::now(),
|
||||
updated_at: Utc::now(),
|
||||
created_by: "import".to_string(),
|
||||
layers: vec!["antarctica".to_string()],
|
||||
},
|
||||
Entity {
|
||||
id: "cortical-labs".to_string(),
|
||||
r#type: "organization".to_string(),
|
||||
name: "Cortical Labs".to_string(),
|
||||
aliases: vec![],
|
||||
role: Some("Organoid Intelligence".to_string()),
|
||||
location: Some(EntityLocation {
|
||||
lat: -37.8136,
|
||||
lng: 144.9631,
|
||||
address: None,
|
||||
city: Some("Melbourne".to_string()),
|
||||
country: Some("Australia".to_string()),
|
||||
}),
|
||||
tags: vec!["organoid-intelligence".to_string(), "biotech".to_string(), "in-q-tel".to_string()],
|
||||
credibility: "high".to_string(),
|
||||
sources: vec![EntitySource {
|
||||
r#type: "news".to_string(),
|
||||
reference: "corticallabs.com".to_string(),
|
||||
date: Utc::now(),
|
||||
confidence: 0.9,
|
||||
}],
|
||||
state: "active".to_string(),
|
||||
created_at: Utc::now(),
|
||||
updated_at: Utc::now(),
|
||||
created_by: "import".to_string(),
|
||||
layers: vec!["human-neuron-ai".to_string()],
|
||||
},
|
||||
]
|
||||
}
|
||||
|
||||
pub fn mock_connections() -> Vec<Connection> {
|
||||
vec![
|
||||
Connection {
|
||||
id: "conn-1".to_string(),
|
||||
from: "epstein-1".to_string(),
|
||||
to: "epstein-little-saint-james".to_string(),
|
||||
r#type: "geographic".to_string(),
|
||||
state: "confirmed".to_string(),
|
||||
strength: 1.0,
|
||||
sources: vec![ConnectionSource {
|
||||
r#type: "document".to_string(),
|
||||
reference: "DOJ-001".to_string(),
|
||||
date: Utc::now(),
|
||||
confidence: 1.0,
|
||||
}],
|
||||
history: vec![ConnectionHistoryEntry {
|
||||
timestamp: Utc::now(),
|
||||
action: "added".to_string(),
|
||||
actor: "user".to_string(),
|
||||
reason: "Ownership records".to_string(),
|
||||
confidence: Some(1.0),
|
||||
}],
|
||||
date_range: None,
|
||||
created_at: Utc::now(),
|
||||
updated_at: Utc::now(),
|
||||
},
|
||||
Connection {
|
||||
id: "conn-2".to_string(),
|
||||
from: "epstein-1".to_string(),
|
||||
to: "chabad-770".to_string(),
|
||||
r#type: "social".to_string(),
|
||||
state: "suspected".to_string(),
|
||||
strength: 0.4,
|
||||
sources: vec![ConnectionSource {
|
||||
r#type: "news".to_string(),
|
||||
reference: "VT-001".to_string(),
|
||||
date: Utc::now(),
|
||||
confidence: 0.4,
|
||||
}],
|
||||
history: vec![ConnectionHistoryEntry {
|
||||
timestamp: Utc::now(),
|
||||
action: "added".to_string(),
|
||||
actor: "beam-ai".to_string(),
|
||||
reason: "Photo proximity analysis".to_string(),
|
||||
confidence: Some(0.4),
|
||||
}],
|
||||
date_range: None,
|
||||
created_at: Utc::now(),
|
||||
updated_at: Utc::now(),
|
||||
},
|
||||
]
|
||||
}
|
||||
|
||||
pub fn mock_layers() -> Vec<Layer> {
|
||||
vec![
|
||||
Layer {
|
||||
id: "epstein-network".to_string(),
|
||||
name: "Epstein Network".to_string(),
|
||||
description: "Jeffrey Epstein associates, victims, institutions, events".to_string(),
|
||||
default: true,
|
||||
node_types: vec!["person".to_string(), "organization".to_string(), "location".to_string(), "event".to_string(), "document".to_string()],
|
||||
link_types: Some(vec!["social".to_string(), "financial".to_string(), "professional".to_string(), "legal".to_string(), "geographic".to_string(), "ideological".to_string()]),
|
||||
data_source: "local-nocobase".to_string(),
|
||||
visible: true,
|
||||
node_color: None,
|
||||
sub_layers: None,
|
||||
max_visible_nodes: None,
|
||||
clustering: Some(true),
|
||||
},
|
||||
Layer {
|
||||
id: "chabad-network".to_string(),
|
||||
name: "Chabad-Lubavitch Network".to_string(),
|
||||
description: "Chabad centers, rabbis, connected individuals".to_string(),
|
||||
default: true,
|
||||
node_types: vec!["person".to_string(), "organization".to_string(), "location".to_string(), "event".to_string()],
|
||||
link_types: None,
|
||||
data_source: "local-nocobase".to_string(),
|
||||
visible: true,
|
||||
node_color: Some("#4a90d9".to_string()),
|
||||
sub_layers: None,
|
||||
max_visible_nodes: None,
|
||||
clustering: None,
|
||||
},
|
||||
Layer {
|
||||
id: "antarctica".to_string(),
|
||||
name: "Antarctic Infrastructure".to_string(),
|
||||
description: "Research stations, flight routes, undersea cables, Epstein references".to_string(),
|
||||
default: true,
|
||||
node_types: vec!["researchStation".to_string(), "location".to_string(), "flightRoute".to_string(), "underseaCable".to_string(), "documentReference".to_string()],
|
||||
link_types: None,
|
||||
data_source: "public-registries".to_string(),
|
||||
visible: true,
|
||||
node_color: Some("#ffffff".to_string()),
|
||||
sub_layers: None,
|
||||
max_visible_nodes: None,
|
||||
clustering: None,
|
||||
},
|
||||
Layer {
|
||||
id: "human-neuron-ai".to_string(),
|
||||
name: "AI with Human Neurons".to_string(),
|
||||
description: "Organoid intelligence, biological neural networks, CIA-funded biotech".to_string(),
|
||||
default: false,
|
||||
node_types: vec!["company".to_string(), "researchInstitution".to_string(), "researcher".to_string(), "experiment".to_string(), "ethicalConcern".to_string()],
|
||||
link_types: None,
|
||||
data_source: "scrapling-pipeline".to_string(),
|
||||
visible: false,
|
||||
node_color: Some("#ff6b6b".to_string()),
|
||||
sub_layers: None,
|
||||
max_visible_nodes: None,
|
||||
clustering: None,
|
||||
},
|
||||
]
|
||||
}
|
||||
|
||||
pub fn mock_stories() -> Vec<StoryProvenance> {
|
||||
vec![
|
||||
StoryProvenance {
|
||||
story_id: "story-1".to_string(),
|
||||
title: "The Epstein Network: A Comprehensive Overview".to_string(),
|
||||
url: "https://example.com/epstein".to_string(),
|
||||
publication_date: Utc::now(),
|
||||
outlet: Outlet {
|
||||
name: "Investigative Post".to_string(),
|
||||
owner: "Independent".to_string(),
|
||||
owner_wealth_source: "Subscriptions".to_string(),
|
||||
parent_company: None,
|
||||
},
|
||||
journalists: vec![Journalist {
|
||||
name: "Jane Doe".to_string(),
|
||||
beat: "Investigative".to_string(),
|
||||
prior_work: vec![],
|
||||
}],
|
||||
source_types: vec!["document".to_string(), "named_witness".to_string()],
|
||||
documents_obtained: vec!["flight-logs".to_string()],
|
||||
timing_context: TimingContext {
|
||||
election_cycle: None,
|
||||
policy_moment: None,
|
||||
corporate_event: None,
|
||||
market_event: None,
|
||||
},
|
||||
direct_beneficiaries: vec![],
|
||||
upstream_pressure: vec![],
|
||||
omissions_noted: vec![],
|
||||
cross_references: vec![],
|
||||
narrative_origins: NarrativeOrigin {
|
||||
first_appearance: "Jane Doe".to_string(),
|
||||
amplification_chain: vec![],
|
||||
},
|
||||
beam_analysis: BeamAnalysis {
|
||||
propaganda_classification: "none".to_string(),
|
||||
confidence: 0.95,
|
||||
suggested_actions: vec!["cross-reference".to_string()],
|
||||
},
|
||||
},
|
||||
]
|
||||
}
|
||||
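The mock data above names several fields `type` by using Rust's raw-identifier syntax (`r#type`), since `type` is a reserved keyword. A minimal std-only sketch (the `Node` struct here is hypothetical, not a type from this crate):

```rust
// Raw identifiers (r#...) let a struct field use the reserved word `type`.
struct Node {
    r#type: String,
    name: String,
}

fn node_type(n: &Node) -> &str {
    // Field access also uses the raw form.
    &n.r#type
}

fn main() {
    let n = Node {
        r#type: "person".to_string(),
        name: "Jane".to_string(),
    };
    assert_eq!(node_type(&n), "person");
    println!("{} is a {}", n.name, n.r#type);
}
```

With serde's `rename_all = "camelCase"` attribute, as used throughout these structs, such fields still serialize under their JSON-facing names.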
268  crates/synq-intel/src/repo.rs  (Normal file)
@@ -0,0 +1,268 @@
use rusqlite::{Connection, OpenFlags, Result as SqliteResult};
use serde::{Deserialize, Serialize};
use std::path::Path;
use tracing::warn;

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Document {
    pub efta_number: String,
    pub dataset: Option<i64>,
    pub file_path: String,
    pub total_pages: Option<i64>,
    pub doc_type: Option<String>,
    pub stamp_type: Option<String>,
    pub sender: Option<String>,
    pub email_date: Option<String>,
    pub file_size: Option<i64>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Page {
    pub efta_number: String,
    pub page_number: i64,
    pub text_content: Option<String>,
    pub char_count: Option<i64>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Email {
    pub id: i64,
    pub efta_number: String,
    pub from_name: Option<String>,
    pub from_email: Option<String>,
    pub to_json: Option<String>,
    pub subject: Option<String>,
    pub date_normalized: Option<String>,
    pub sender_canonical: Option<String>,
    pub epstein_is_sender: bool,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct DatasetStats {
    pub total_documents: i64,
    pub total_pages: i64,
    pub total_emails: i64,
    pub total_email_participants: i64,
    pub total_extracted_entities: i64,
    pub total_altered_files: i64,
    pub total_removed_entities: i64,
    pub total_communication_pairs: i64,
}

#[derive(Debug, Clone)]
pub struct EpsteinRepo {
    base_path: String,
}

impl EpsteinRepo {
    pub fn new(base_path: impl Into<String>) -> Self {
        Self {
            base_path: base_path.into(),
        }
    }

    fn db_path(&self, name: &str) -> String {
        format!("{}/databases/{}.db", self.base_path, name)
    }

    fn open(&self, name: &str) -> SqliteResult<Connection> {
        let path = self.db_path(name);
        if !Path::new(&path).exists() {
            warn!("Database not found: {}", path);
        }
        Connection::open_with_flags(&path, OpenFlags::SQLITE_OPEN_READ_ONLY)
    }

    pub fn stats(&self) -> SqliteResult<DatasetStats> {
        let mut stats = DatasetStats {
            total_documents: 0,
            total_pages: 0,
            total_emails: 0,
            total_email_participants: 0,
            total_extracted_entities: 0,
            total_altered_files: 0,
            total_removed_entities: 0,
            total_communication_pairs: 0,
        };

        if let Ok(conn) = self.open("full_text_corpus") {
            if let Ok(count) = conn.query_row("SELECT COUNT(*) FROM documents", [], |r| r.get(0)) {
                stats.total_documents = count;
            }
            if let Ok(count) = conn.query_row("SELECT COUNT(*) FROM pages", [], |r| r.get(0)) {
                stats.total_pages = count;
            }
        }

        if let Ok(conn) = self.open("email_metadata") {
            if let Ok(count) = conn.query_row("SELECT COUNT(*) FROM emails", [], |r| r.get(0)) {
                stats.total_emails = count;
            }
            if let Ok(count) = conn.query_row("SELECT COUNT(*) FROM emails_merged", [], |r| r.get(0)) {
                stats.total_email_participants = count;
            }
        }

        if let Ok(conn) = self.open("document_entities") {
            if let Ok(count) = conn.query_row("SELECT COUNT(*) FROM document_entities", [], |r| r.get(0)) {
                stats.total_extracted_entities = count;
            }
        }

        if let Ok(conn) = self.open("alteration_results") {
            if let Ok(count) = conn.query_row("SELECT COUNT(*) FROM altered_files", [], |r| r.get(0)) {
                stats.total_altered_files = count;
            }
            if let Ok(count) = conn.query_row("SELECT COUNT(*) FROM removed_entities", [], |r| r.get(0)) {
                stats.total_removed_entities = count;
            }
        }

        if let Ok(conn) = self.open("communications") {
            if let Ok(count) = conn.query_row("SELECT COUNT(*) FROM communication_pairs", [], |r| r.get(0)) {
                stats.total_communication_pairs = count;
            }
        }

        Ok(stats)
    }

    pub fn list_documents(&self, limit: i64, offset: i64) -> SqliteResult<Vec<Document>> {
        let conn = self.open("full_text_corpus")?;
        let mut stmt = conn.prepare(
            "SELECT efta_number, dataset, file_path, total_pages, doc_type, stamp_type, sender, email_date, file_size
             FROM documents
             ORDER BY id
             LIMIT ? OFFSET ?",
        )?;

        let docs = stmt
            .query_map([limit, offset], |row| {
                Ok(Document {
                    efta_number: row.get(0)?,
                    dataset: row.get(1)?,
                    file_path: row.get(2)?,
                    total_pages: row.get(3)?,
                    doc_type: row.get(4)?,
                    stamp_type: row.get(5)?,
                    sender: row.get(6)?,
                    email_date: row.get(7)?,
                    file_size: row.get(8)?,
                })
            })?
            .collect::<SqliteResult<Vec<_>>>()?;

        Ok(docs)
    }

    pub fn search_documents(&self, query: &str, limit: i64) -> SqliteResult<Vec<Document>> {
        let conn = self.open("full_text_corpus")?;
        let like = format!("%{}%", query);
        let mut stmt = conn.prepare(
            "SELECT efta_number, dataset, file_path, total_pages, doc_type, stamp_type, sender, email_date, file_size
             FROM documents
             WHERE efta_number LIKE ? OR sender LIKE ? OR doc_type LIKE ?
             ORDER BY id
             LIMIT ?",
        )?;

        let docs = stmt
            .query_map([&like, &like, &like, &limit.to_string()], |row| {
                Ok(Document {
                    efta_number: row.get(0)?,
                    dataset: row.get(1)?,
                    file_path: row.get(2)?,
                    total_pages: row.get(3)?,
                    doc_type: row.get(4)?,
                    stamp_type: row.get(5)?,
                    sender: row.get(6)?,
                    email_date: row.get(7)?,
                    file_size: row.get(8)?,
                })
            })?
            .collect::<SqliteResult<Vec<_>>>()?;

        Ok(docs)
    }

    pub fn get_document_pages(&self, efta: &str) -> SqliteResult<Vec<Page>> {
        let conn = self.open("full_text_corpus")?;
        let mut stmt = conn.prepare(
            "SELECT efta_number, page_number, text_content, char_count
             FROM pages
             WHERE efta_number = ?
             ORDER BY page_number",
        )?;

        let pages = stmt
            .query_map([efta], |row| {
                Ok(Page {
                    efta_number: row.get(0)?,
                    page_number: row.get(1)?,
                    text_content: row.get(2)?,
                    char_count: row.get(3)?,
                })
            })?
            .collect::<SqliteResult<Vec<_>>>()?;

        Ok(pages)
    }

    pub fn list_emails(&self, limit: i64, offset: i64) -> SqliteResult<Vec<Email>> {
        let conn = self.open("email_metadata")?;
        let mut stmt = conn.prepare(
            "SELECT id, efta_number, from_name, from_email, to_json, subject, date_normalized, sender_canonical, epstein_is_sender
             FROM emails
             WHERE is_noise = 0 AND is_promotional = 0
             ORDER BY id
             LIMIT ? OFFSET ?",
        )?;

        let emails = stmt
            .query_map([limit, offset], |row| {
                Ok(Email {
                    id: row.get(0)?,
                    efta_number: row.get(1)?,
                    from_name: row.get(2)?,
                    from_email: row.get(3)?,
                    to_json: row.get(4)?,
                    subject: row.get(5)?,
                    date_normalized: row.get(6)?,
                    sender_canonical: row.get(7)?,
                    epstein_is_sender: row.get::<_, i64>(8)? != 0,
                })
            })?
            .collect::<SqliteResult<Vec<_>>>()?;

        Ok(emails)
    }

    pub fn search_emails(&self, query: &str, limit: i64) -> SqliteResult<Vec<Email>> {
        let conn = self.open("email_metadata")?;
        let like = format!("%{}%", query);
        let mut stmt = conn.prepare(
            "SELECT id, efta_number, from_name, from_email, to_json, subject, date_normalized, sender_canonical, epstein_is_sender
             FROM emails
             WHERE is_noise = 0 AND is_promotional = 0 AND (from_name LIKE ? OR from_email LIKE ? OR subject LIKE ?)
             ORDER BY id
             LIMIT ?",
        )?;

        let emails = stmt
            .query_map([&like, &like, &like, &limit.to_string()], |row| {
                Ok(Email {
                    id: row.get(0)?,
                    efta_number: row.get(1)?,
                    from_name: row.get(2)?,
                    from_email: row.get(3)?,
                    to_json: row.get(4)?,
                    subject: row.get(5)?,
                    date_normalized: row.get(6)?,
                    sender_canonical: row.get(7)?,
                    epstein_is_sender: row.get::<_, i64>(8)? != 0,
                })
            })?
            .collect::<SqliteResult<Vec<_>>>()?;

        Ok(emails)
    }
}
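`EpsteinRepo::stats()` aggregates counts best-effort: each database may be absent, and any failed open or query simply leaves that counter at zero instead of failing the whole call. A dependency-free sketch of that pattern (the `count_from` helper is hypothetical, not part of the crate):

```rust
// Best-effort counting: a missing source degrades to 0 rather than an error,
// mirroring how stats() handles absent SQLite databases.
fn count_from(source: Result<Vec<u32>, String>) -> i64 {
    source.map(|rows| rows.len() as i64).unwrap_or(0)
}

fn main() {
    let present: Result<Vec<u32>, String> = Ok(vec![1, 2, 3]);
    let missing: Result<Vec<u32>, String> = Err("database not found".into());
    assert_eq!(count_from(present), 3);
    assert_eq!(count_from(missing), 0); // absent DB contributes zero, no error
    println!("ok");
}
```

The trade-off is that a zero in `DatasetStats` is ambiguous between "empty table" and "missing database"; the `warn!` in `open()` is what surfaces the difference in logs.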
1  crates/synq-intel/src/story.rs  (Normal file)
@@ -0,0 +1 @@
pub mod story;
93  crates/synq-protocol/src/channel.rs  (Normal file)
@@ -0,0 +1,93 @@
use serde::{Deserialize, Serialize};
use std::fmt;

/// A Synq channel represents a functional domain with a corporate hierarchy.
/// Each channel has a C-Level Lead and direct reports.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, Serialize, Deserialize)]
pub enum Channel {
    Finance,
    Clinical,
    Content,
    Intelligence,
    Engineering,
    Travel,
}

impl Channel {
    pub fn as_str(&self) -> &'static str {
        match self {
            Channel::Finance => "finance",
            Channel::Clinical => "clinical",
            Channel::Content => "content",
            Channel::Intelligence => "intelligence",
            Channel::Engineering => "engineering",
            Channel::Travel => "travel",
        }
    }

    pub fn lead_title(&self) -> &'static str {
        match self {
            Channel::Finance => "CFO",
            Channel::Clinical => "Chief Medical Officer",
            Channel::Content => "Editor-in-Chief",
            Channel::Intelligence => "Director of Intelligence",
            Channel::Engineering => "CTO",
            Channel::Travel => "Travel Director",
        }
    }

    pub fn display_name(&self) -> &'static str {
        match self {
            Channel::Finance => "Finance",
            Channel::Clinical => "Clinical",
            Channel::Content => "Content",
            Channel::Intelligence => "Intelligence",
            Channel::Engineering => "Engineering",
            Channel::Travel => "Travel",
        }
    }

    pub fn ollama_port(&self) -> u16 {
        match self {
            Channel::Finance => 8088,
            Channel::Clinical => 8085,
            Channel::Content => 8089,
            Channel::Intelligence => 8089,
            Channel::Engineering => 8090,
            Channel::Travel => 8091,
        }
    }

    pub fn all() -> &'static [Channel] {
        &[
            Channel::Finance,
            Channel::Clinical,
            Channel::Content,
            Channel::Intelligence,
            Channel::Engineering,
            Channel::Travel,
        ]
    }
}

impl fmt::Display for Channel {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}", self.as_str())
    }
}

impl std::str::FromStr for Channel {
    type Err = String;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s.to_lowercase().as_str() {
            "finance" => Ok(Channel::Finance),
            "clinical" => Ok(Channel::Clinical),
            "content" => Ok(Channel::Content),
            "intelligence" => Ok(Channel::Intelligence),
            "engineering" => Ok(Channel::Engineering),
            "travel" => Ok(Channel::Travel),
            _ => Err(format!("unknown channel: {s}")),
        }
    }
}
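`Channel` pairs a `Display` impl with a case-insensitive `FromStr`, so string and enum forms round-trip through `.parse()` and `.to_string()`. A reduced two-variant stand-in showing the same convention (`MiniChannel` is illustrative, not the crate's type):

```rust
use std::fmt;
use std::str::FromStr;

// Reduced stand-in for Channel: lowercase canonical string form,
// case-insensitive parsing, descriptive parse error.
#[derive(Debug, PartialEq)]
enum MiniChannel {
    Finance,
    Travel,
}

impl fmt::Display for MiniChannel {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str(match self {
            MiniChannel::Finance => "finance",
            MiniChannel::Travel => "travel",
        })
    }
}

impl FromStr for MiniChannel {
    type Err = String;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s.to_lowercase().as_str() {
            "finance" => Ok(MiniChannel::Finance),
            "travel" => Ok(MiniChannel::Travel),
            _ => Err(format!("unknown channel: {s}")),
        }
    }
}

fn main() {
    let c: MiniChannel = "Finance".parse().unwrap(); // mixed case accepted
    assert_eq!(c, MiniChannel::Finance);
    assert_eq!(c.to_string(), "finance"); // canonical lowercase form
    assert!("marketing".parse::<MiniChannel>().is_err());
    println!("ok");
}
```

Note that `ollama_port()` currently maps both `Content` and `Intelligence` to 8089, so the port is not a unique key per channel.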
@@ -1,19 +1,27 @@
pub mod agent;
pub mod backend;
pub mod channel;
pub mod data_class;
pub mod event;
pub mod intent;
pub mod memory;
pub mod message;
pub mod operation;
pub mod swarm;
pub mod vector;

pub use agent::{AgentId, CapabilityEmbedding};
pub use backend::Backend;
pub use channel::Channel;
pub use data_class::DataClass;
pub use event::BackendEvent;
pub use intent::Intent;
pub use memory::MemorySource;
pub use message::{Role, StreamMessage};
pub use operation::Operation;
pub use swarm::{
    ChannelEvent, ChannelCommand, CloudOffloadPayload, CloudOffloadResult,
    CommandRegistry, CorporateRole, HumanCheckpointConfig, MemoryNamespace,
    RoutingDecision, RoutingMethod, SwarmMessage, SwarmMessageType, SwarmTask, TaskStatus,
};
pub use vector::{cosine_similarity, top_k_sparsify, Vector, VectorError, EMBEDDING_DIMENSION};
475  crates/synq-protocol/src/swarm.rs  (Normal file)
@@ -0,0 +1,475 @@
use crate::{AgentId, Backend, Channel};
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use serde_json::Value;
use std::fmt;
use uuid::Uuid;

/// A corporate role within a channel's hierarchy.
#[derive(Debug, Clone, PartialEq, Eq, Hash, Serialize, Deserialize)]
pub enum CorporateRole {
    /// C-Level lead who orchestrates, plans, delegates, and owns sign-off.
    Lead,
    /// Direct report who executes within a specialty and reports to the Lead.
    DirectReport { title: String },
    /// Meta-router that triages across all channels. Never executes.
    MetaRouter,
}

impl CorporateRole {
    pub fn is_lead(&self) -> bool {
        matches!(self, CorporateRole::Lead)
    }

    pub fn is_meta_router(&self) -> bool {
        matches!(self, CorporateRole::MetaRouter)
    }

    pub fn title(&self) -> &str {
        match self {
            CorporateRole::Lead => "Lead",
            CorporateRole::DirectReport { title } => title.as_str(),
            CorporateRole::MetaRouter => "Chief of Staff",
        }
    }
}

/// Status of a task through the corporate swarm workflow.
#[derive(Debug, Clone, PartialEq, Eq, Hash, Serialize, Deserialize)]
pub enum TaskStatus {
    /// CFO/Lead received request, planning next steps.
    Pending,
    /// Delegated to direct report, executing.
    InProgress,
    /// Offloaded to cloud LLM for research/analysis only.
    CloudAnalyzing,
    /// Cloud result returned, local enrichment in progress.
    AnalysisReturned,
    /// Output generated, awaiting human approval.
    StagedForReview,
    /// Human rejected, sent back with revision notes.
    RevisionsRequested,
    /// Human approved.
    Approved,
    /// Written to database, immutable.
    Committed,
}

impl fmt::Display for TaskStatus {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            TaskStatus::Pending => write!(f, "Pending"),
            TaskStatus::InProgress => write!(f, "In Progress"),
            TaskStatus::CloudAnalyzing => write!(f, "Cloud Analyzing"),
            TaskStatus::AnalysisReturned => write!(f, "Analysis Returned"),
            TaskStatus::StagedForReview => write!(f, "Staged for Review"),
            TaskStatus::RevisionsRequested => write!(f, "Revisions Requested"),
            TaskStatus::Approved => write!(f, "Approved"),
            TaskStatus::Committed => write!(f, "Committed"),
        }
    }
}

/// A cross-channel event broadcast via the event bus.
/// Cross-functional communication happens only through this bus —
/// no direct swarm-to-swarm calls.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ChannelEvent {
    pub id: Uuid,
    pub from_channel: Channel,
    pub from_title: String,
    pub to_channel: Channel,
    pub event_type: String,
    pub payload: Value,
    pub requires_approval: bool,
    pub timestamp: DateTime<Utc>,
}

impl ChannelEvent {
    pub fn new(
        from_channel: Channel,
        from_title: impl Into<String>,
        to_channel: Channel,
        event_type: impl Into<String>,
        payload: Value,
    ) -> Self {
        Self {
            id: Uuid::new_v4(),
            from_channel,
            from_title: from_title.into(),
            to_channel,
            event_type: event_type.into(),
            payload,
            requires_approval: true,
            timestamp: Utc::now(),
        }
    }

    pub fn with_approval(mut self, requires: bool) -> Self {
        self.requires_approval = requires;
        self
    }
}

/// A message sent between agents within the same channel or via the event bus.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SwarmMessage {
    pub id: Uuid,
    pub from_agent: AgentId,
    pub from_role: CorporateRole,
    pub from_channel: Channel,
    pub to_agent: Option<AgentId>,
    pub to_channel: Channel,
    pub message_type: SwarmMessageType,
    pub content: String,
    pub payload: Option<Value>,
    pub timestamp: DateTime<Utc>,
}

#[derive(Debug, Clone, PartialEq, Eq, Hash, Serialize, Deserialize)]
pub enum SwarmMessageType {
    /// Lead delegating a task to a direct report.
    Delegate,
    /// Direct report returning results to Lead.
    Report,
    /// Request for clarification or additional context.
    Query,
    /// Quality control / audit checkpoint.
    AuditCheckpoint,
    /// Request for cloud offload (research only).
    CloudOffloadRequest,
    /// Cloud result returned to local agent.
    CloudOffloadResult,
    /// Human approval required.
    HumanCheckpoint,
    /// Approval granted or denied.
    ApprovalResponse,
    /// General broadcast on event bus.
    Broadcast,
}

/// A running swarm task tracks state across the multi-agent workflow.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SwarmTask {
    pub id: Uuid,
    pub channel: Channel,
    pub lead_agent_id: AgentId,
    pub status: TaskStatus,
    pub original_intent_text: String,
    pub assigned_direct_report: Option<AgentId>,
    pub cloud_offload_id: Option<Uuid>,
    pub staged_output: Option<String>,
    pub revision_notes: Option<String>,
    pub approved_by: Option<String>,
    pub committed_at: Option<DateTime<Utc>>,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
}

impl SwarmTask {
    pub fn new(channel: Channel, lead_agent_id: AgentId, intent_text: impl Into<String>) -> Self {
        let now = Utc::now();
        Self {
            id: Uuid::new_v4(),
            channel,
            lead_agent_id,
            status: TaskStatus::Pending,
            original_intent_text: intent_text.into(),
            assigned_direct_report: None,
            cloud_offload_id: None,
            staged_output: None,
            revision_notes: None,
            approved_by: None,
            committed_at: None,
            created_at: now,
            updated_at: now,
        }
    }

    pub fn transition(&mut self, new_status: TaskStatus) {
        self.status = new_status;
        self.updated_at = Utc::now();
    }

    pub fn is_terminal(&self) -> bool {
        matches!(self.status, TaskStatus::Approved | TaskStatus::Committed)
    }

    pub fn requires_human_approval(&self) -> bool {
        matches!(
            self.status,
            TaskStatus::StagedForReview | TaskStatus::RevisionsRequested
        )
    }
}
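`SwarmTask` is a small state machine: every `transition()` swaps the status and bumps `updated_at`, and terminal/approval checks are pattern matches on the current status. A dependency-free sketch of the same shape, using `std::time` in place of chrono (the `Task`/`Status` names here are illustrative stand-ins):

```rust
use std::time::SystemTime;

// Reduced model of SwarmTask's status lifecycle.
#[derive(Debug, Clone, PartialEq)]
enum Status {
    Pending,
    InProgress,
    StagedForReview,
    Approved,
}

struct Task {
    status: Status,
    updated_at: SystemTime,
}

impl Task {
    fn new() -> Self {
        Task { status: Status::Pending, updated_at: SystemTime::now() }
    }

    fn transition(&mut self, next: Status) {
        self.status = next;
        self.updated_at = SystemTime::now(); // every transition bumps the timestamp
    }

    fn is_terminal(&self) -> bool {
        matches!(self.status, Status::Approved)
    }

    fn requires_human_approval(&self) -> bool {
        matches!(self.status, Status::StagedForReview)
    }
}

fn main() {
    let mut t = Task::new();
    assert!(!t.is_terminal());
    t.transition(Status::InProgress);
    t.transition(Status::StagedForReview);
    assert!(t.requires_human_approval()); // human checkpoint before commit
    t.transition(Status::Approved);
    assert!(t.is_terminal());
    println!("ok");
}
```

Note that `transition()` in the crate accepts any next status; the workflow ordering (Pending → InProgress → … → Committed) is enforced by callers, not the type.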
|
||||
|
||||
/// Routing decision made by the Meta-Router (Chief of Staff).
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct RoutingDecision {
|
||||
pub target_channel: Channel,
|
||||
pub confidence: f32,
|
||||
pub routing_method: RoutingMethod,
|
||||
pub reason: String,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, PartialEq, Eq, Hash, Serialize, Deserialize)]
|
||||
pub enum RoutingMethod {
|
||||
ExplicitChannelMention,
|
||||
ActiveUiTab,
|
||||
KeywordIntent,
|
||||
EntityContext,
|
||||
Ambiguous,
|
||||
}
|
||||
|
||||
impl RoutingDecision {
|
||||
pub fn ambiguous(reason: impl Into<String>) -> Self {
|
||||
Self {
|
||||
target_channel: Channel::Finance, // default fallback
|
||||
confidence: 0.0,
|
||||
routing_method: RoutingMethod::Ambiguous,
|
||||
reason: reason.into(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Sanitized payload for cloud offload. PHI and sensitive financial data
|
||||
/// must be stripped or generalized before sending to cloud LLMs.
|
||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||
pub struct CloudOffloadPayload {
|
||||
pub task_id: Uuid,
|
||||
pub intent_category: String,
|
||||
pub sanitized_query: String,
|
||||
pub entity_id_hash: Option<String>,
|
||||
pub revenue_range: Option<String>,
|
||||
pub zip_code: Option<String>,
|
||||
pub industry_hint: Option<String>,
|
||||
pub local_cache_fingerprint: Option<String>,
|
||||
}
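The doc comment above requires generalizing sensitive fields before offload. A minimal standalone sketch of that kind of sanitization, assuming hypothetical helpers `entity_id_hash` and `revenue_range` that are not part of this commit (a real implementation would use a keyed cryptographic hash rather than `DefaultHasher`, which is neither stable across builds nor collision-resistant):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hypothetical helper: replace the raw entity id with an opaque hash so the
// cloud side can correlate requests without learning the identifier itself.
fn entity_id_hash(entity_id: &str) -> String {
    let mut h = DefaultHasher::new();
    entity_id.hash(&mut h);
    format!("{:016x}", h.finish())
}

// Hypothetical helper: coarsen exact revenue into a bucket so only a range
// ever leaves the device.
fn revenue_range(annual_revenue_usd: u64) -> &'static str {
    match annual_revenue_usd {
        0..=999_999 => "<1M",
        1_000_000..=9_999_999 => "1M-10M",
        _ => ">10M",
    }
}

fn main() {
    // The payload carries only the hash and the bucket, never the raw values.
    println!("{} {}", entity_id_hash("entity-42"), revenue_range(2_500_000));
}
```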

/// Result returned from cloud offload.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CloudOffloadResult {
    pub task_id: Uuid,
    pub cloud_backend: Backend,
    pub result_json: Value,
    pub sources: Vec<String>,
    pub confidence: f32,
    pub cached: bool,
}

/// Memory namespace for a role within a channel.
/// Format: `{channel}/{title}/` — e.g., `finance/controller/`
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct MemoryNamespace {
    pub channel: Channel,
    pub role_title: String,
}

impl MemoryNamespace {
    pub fn new(channel: Channel, role_title: impl Into<String>) -> Self {
        Self {
            channel,
            role_title: role_title.into(),
        }
    }

    pub fn as_path(&self) -> String {
        format!("{}/{}/", self.channel.as_str(), self.role_title.to_lowercase().replace(' ', "-"))
    }
}
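The slug rule in `as_path` can be sketched standalone (the free function below is an illustration, not repository code): lowercase the role title, hyphenate spaces, and join with the channel name.

```rust
// Mirrors MemoryNamespace::as_path above: "{channel}/{role-slug}/".
fn namespace_path(channel: &str, role_title: &str) -> String {
    format!("{}/{}/", channel, role_title.to_lowercase().replace(' ', "-"))
}

fn main() {
    // "Financial Analyst" in the finance channel maps to its memory directory,
    // matching the seeded namespace ids in the migration below.
    println!("{}", namespace_path("finance", "Financial Analyst"));
}
```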

/// Human checkpoint configuration per channel.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct HumanCheckpointConfig {
    pub channel: Channel,
    pub auto_commit_allowed: bool,
    pub checkpoint_types: Vec<String>,
}

impl HumanCheckpointConfig {
    pub fn default_for(channel: Channel) -> Self {
        let (auto_commit, checkpoints) = match channel {
            Channel::Finance => (false, vec!["all_writes".into()]),
            Channel::Clinical => (false, vec!["all_writes".into(), "all_phi".into()]),
            Channel::Content => (true, vec!["immediate_publish".into()]),
            Channel::Intelligence => (false, vec!["all_publications".into()]),
            Channel::Engineering => (false, vec!["prod_deploy".into()]),
            Channel::Travel => (true, vec!["high_value".into(), "cancellations".into()]),
        };

        Self {
            channel,
            auto_commit_allowed: auto_commit,
            checkpoint_types: checkpoints,
        }
    }
}

/// Command definitions for each channel's command palette.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ChannelCommand {
    pub channel: Channel,
    pub slug: String,
    pub display_name: String,
    pub description: String,
    pub requires_approval: bool,
    pub target_role: Option<String>,
}

/// Registry of all channel commands.
pub struct CommandRegistry {
    commands: Vec<ChannelCommand>,
}

impl Default for CommandRegistry {
    fn default() -> Self {
        Self::new()
    }
}

impl CommandRegistry {
    pub fn new() -> Self {
        let mut commands = Vec::new();

        // Finance commands
        for (slug, name, desc, role) in [
            ("/pl-generate", "P&L Generate", "Generate profit and loss statement", Some("Financial Analyst")),
            ("/reconcile", "Reconcile", "Reconcile accounts and flag discrepancies", Some("Controller")),
            ("/entity-switch", "Entity Switch", "Switch between business entities", Some("CFO")),
            ("/tax-summary", "Tax Summary", "Generate tax summary and optimization tips", Some("Tax Manager")),
            ("/audit-check", "Audit Check", "Run compliance and output quality checks", Some("Internal Auditor")),
            ("/month-end-close", "Month-End Close", "Execute month-end closing procedures", Some("Controller")),
            ("/acquisition-report", "Acquisition Report", "M&A comps, deal terms, offer structure", Some("Financial Analyst")),
            ("/competitive-benchmark", "Competitive Benchmark", "Peer analysis and threat alerts", Some("Financial Analyst")),
            ("/valuation", "Valuation", "Interactive valuation calculator", Some("Financial Analyst")),
            ("/market-scan", "Market Scan", "Expansion opportunity alerts", Some("Financial Analyst")),
            ("/vendor-screen", "Vendor Screen", "Risk screening before contracts", Some("Financial Analyst")),
        ] {
            commands.push(ChannelCommand {
                channel: Channel::Finance,
                slug: slug.into(),
                display_name: name.into(),
                description: desc.into(),
                requires_approval: true,
                target_role: role.map(|s| s.into()),
            });
        }

        // Clinical commands
        for (slug, name, desc, role) in [
            ("/chart-review", "Chart Review", "Review patient chart and history", Some("Attending Physician")),
            ("/schedule", "Schedule", "View or modify appointments", Some("Scheduler")),
            ("/prescription-check", "Prescription Check", "Verify medication interactions", Some("Attending Physician")),
            ("/patient-summary", "Patient Summary", "Generate patient care summary", Some("Nurse")),
        ] {
            commands.push(ChannelCommand {
                channel: Channel::Clinical,
                slug: slug.into(),
                display_name: name.into(),
                description: desc.into(),
                requires_approval: true,
                target_role: role.map(|s| s.into()),
            });
        }

        // Content commands
        for (slug, name, desc, role) in [
            ("/draft-post", "Draft Post", "Draft social media content", Some("Staff Writer")),
            ("/schedule-content", "Schedule Content", "Queue content for optimal publish time", Some("Community Manager")),
            ("/moderate-queue", "Moderate Queue", "Review and approve queued content", Some("Copy Editor")),
        ] {
            commands.push(ChannelCommand {
                channel: Channel::Content,
                slug: slug.into(),
                display_name: name.into(),
                description: desc.into(),
                requires_approval: true,
                target_role: role.map(|s| s.into()),
            });
        }

        // Intelligence commands
        for (slug, name, desc, role) in [
            ("/deep-dive", "Deep Dive", "Comprehensive research on a topic", Some("Senior Researcher")),
            ("/fact-check", "Fact Check", "Verify claims against sources", Some("Fact-Checker")),
            ("/publish-draft", "Publish Draft", "Stage intelligence report for publication", Some("Publisher")),
        ] {
            commands.push(ChannelCommand {
                channel: Channel::Intelligence,
                slug: slug.into(),
                display_name: name.into(),
                description: desc.into(),
                requires_approval: true,
                target_role: role.map(|s| s.into()),
            });
        }

        // Engineering commands
        for (slug, name, desc, role) in [
            ("/fix-bug", "Fix Bug", "Analyze and propose bug fixes", Some("Senior Engineer")),
            ("/review-pr", "Review PR", "Review pull request for quality", Some("Senior Engineer")),
            ("/security-scan", "Security Scan", "Run security vulnerability scan", Some("Security Engineer")),
        ] {
            commands.push(ChannelCommand {
                channel: Channel::Engineering,
                slug: slug.into(),
                display_name: name.into(),
                description: desc.into(),
                requires_approval: true,
                target_role: role.map(|s| s.into()),
            });
        }

        // Travel commands
        for (slug, name, desc, role) in [
            ("/book-flight", "Book Flight", "Search and book flights", Some("Booking Agent")),
            ("/track-price", "Track Price", "Monitor price drops for trips", Some("Price Analyst")),
            ("/itinerary", "Itinerary", "View or modify travel itinerary", Some("Support Rep")),
        ] {
            commands.push(ChannelCommand {
                channel: Channel::Travel,
                slug: slug.into(),
                display_name: name.into(),
                description: desc.into(),
                requires_approval: false,
                target_role: role.map(|s| s.into()),
            });
        }

        Self { commands }
    }

    pub fn for_channel(&self, channel: Channel) -> Vec<&ChannelCommand> {
        self.commands
            .iter()
            .filter(|c| c.channel == channel)
            .collect()
    }

    pub fn find(&self, slug: &str) -> Option<&ChannelCommand> {
        self.commands.iter().find(|c| c.slug == slug)
    }

    pub fn all(&self) -> &[ChannelCommand] {
        &self.commands
    }
}
77	docker-compose.infra.yml	Normal file
@@ -0,0 +1,77 @@
version: "3.8"

services:
  meilisearch:
    image: getmeili/meilisearch:v1.8
    container_name: synq-meilisearch
    environment:
      - MEILI_MASTER_KEY=${MEILI_MASTER_KEY:-synq_meili_master_key_2026}
      - MEILI_NO_ANALYTICS=true
      - MEILI_ENV=production
    ports:
      - "7700:7700"
    volumes:
      - /home/raider1984/synq-data/meilisearch:/meili_data
    networks:
      - synq-infra-net
    restart: unless-stopped

  go-pmtiles:
    image: protomaps/go-pmtiles:latest
    container_name: synq-pmtiles
    ports:
      - "8082:8080"
    volumes:
      - /home/raider1984/synq-data/pmtiles:/data
    command: ["serve", "/data", "--port=8080", "--cors=*"]
    networks:
      - synq-infra-net
    restart: unless-stopped

  nocobase-postgres:
    image: postgres:16-alpine
    container_name: synq-nocobase-db
    environment:
      POSTGRES_DB: nocobase
      POSTGRES_USER: nocobase
      POSTGRES_PASSWORD: ${NOCOBASE_DB_PASS:-nocobase_pass}
    volumes:
      - /home/raider1984/synq-data/nocobase/postgres:/var/lib/postgresql/data
    networks:
      - synq-infra-net
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U nocobase -d nocobase"]
      interval: 5s
      timeout: 5s
      retries: 5
    restart: unless-stopped

  nocobase:
    image: nocobase/nocobase:latest
    container_name: synq-nocobase
    environment:
      - APP_KEY=${NOCOBASE_APP_KEY:-synq-nocobase-app-key}
      - DB_DIALECT=postgres
      - DB_HOST=nocobase-postgres
      - DB_DATABASE=nocobase
      - DB_USER=nocobase
      - DB_PASSWORD=${NOCOBASE_DB_PASS:-nocobase_pass}
      - DB_PORT=5432
      - DB_TABLE_PREFIX=nocobase_
      - API_BASE_URL=http://localhost:13000
      - API_BASE_PATH=/api/
      - PROXY_TARGET=http://127.0.0.1:13000
    ports:
      - "13000:80"
    volumes:
      - /home/raider1984/synq-data/nocobase/storage:/app/nocobase/storage
    depends_on:
      nocobase-postgres:
        condition: service_healthy
    networks:
      - synq-infra-net
    restart: unless-stopped

networks:
  synq-infra-net:
    driver: bridge
232	docs/synq_shell_design_spec.md	Normal file
@@ -0,0 +1,232 @@
# Synq Shell — Rolling Menu Design Spec
## Figma-Style Component Breakdown

---

### Canvas
- **Dimensions**: 1920×1080 (16:9 fullscreen kiosk)
- **Background**: `#0a0a0f` (deep void black)
- **Safe zone**: 160px padding all sides

---

### Ambient Background Layer (z-index: 0)

| Element | Property | Value |
|---------|----------|-------|
| Orb 1 | Position | x: 480, y: 270 |
| | Size | 384×384px |
| | Fill | Radial gradient `#1e3a5f` → transparent |
| | Opacity | 20% |
| | Blur | 64px (backdrop-filter) |
| Orb 2 | Position | x: 1440, y: 810 |
| | Size | 384×384px |
| | Fill | Radial gradient `#2d1b4e` → transparent |
| | Opacity | 20% |
| | Blur | 64px |

---

### Center Reference Line (z-index: 10)

| Property | Value |
|----------|-------|
| Position | y: 540 (exact center) |
| Width | 100% viewport |
| Height | 1px |
| Fill | Linear gradient: transparent → `rgba(255,255,255,0.06)` → transparent |
| Blend | Screen |

---

### Menu Item Container (z-index: 20)

| Property | Value |
|----------|-------|
| Position | Centered, x: 960 |
| Width | 480px max |
| Height | 500px (visible scroll area) |
| Overflow | Hidden (clips items outside) |

---

### Menu Item — Active State (Center)

| Property | Value |
|----------|-------|
| Scale | 1.5× |
| Opacity | 100% |
| Blur | 0px |
| z-index | 100 |

**Icon**
- Size: 48×48px
- Font: System UI / Segoe UI Symbol
- Color: Per-item accent (see palette)
- Text shadow: `0 0 20px rgba(accent, 0.6), 0 0 40px rgba(accent, 0.3)`
- Animation: `glow-pulse` 2s ease infinite

**Label**
- Font: Inter, weight 700 (Bold)
- Size: 20px / line-height 28px
- Color: Per-item accent
- Letter spacing: 0.05em
- Opacity: 100%

**Active Indicator Bar**
- Position: Below label, y-offset +8px
- Width: 240px (50% of container)
- Height: 1px
- Fill: Linear gradient transparent → `rgba(accent, 0.6)` → transparent
- Box shadow: `0 0 10px rgba(accent, 0.4)`

**Glow Orb (Behind)**
- Size: 200% of item bounds
- Fill: Radial gradient `rgba(accent, 0.15)` → transparent
- Position: Centered behind item
- z-index: -1

---

### Menu Item — Distance States

| Distance | Scale | Opacity | Blur | Font Size | Font Weight |
|----------|-------|---------|------|-----------|-------------|
| ±1 item | 1.2× | 80% | 1px | 16px | 600 |
| ±2 items | 1.0× | 60% | 3px | 14px | 500 |
| ±3 items | 0.8× | 40% | 5px | 12px | 400 |
| ±4 items | 0.6× | 20% | 6px | 11px | 400 |
| >±4 | 0.4× | 10% | 8px | 10px | 400 |

---
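The distance-state table reduces to a small lookup. A sketch of how an implementation might encode it (the `item_style` helper is an assumption for illustration, not code from this commit), returning `(scale, opacity, blur_px)` for an item `distance` slots from the center:

```rust
// Style falloff from the "Menu Item — Distance States" table.
fn item_style(distance: i32) -> (f32, f32, f32) {
    match distance.abs() {
        0 => (1.5, 1.00, 0.0), // active item at center
        1 => (1.2, 0.80, 1.0),
        2 => (1.0, 0.60, 3.0),
        3 => (0.8, 0.40, 5.0),
        4 => (0.6, 0.20, 6.0),
        _ => (0.4, 0.10, 8.0), // everything beyond ±4 shares one style
    }
}

fn main() {
    // Distance is symmetric: two items above behaves like two items below.
    println!("{:?}", item_style(-2));
}
```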

### Color Palette

#### Item Accents
| Item | Hex | RGB | Glow RGBA |
|------|-----|-----|-----------|
| Restart Synq | `#60a5fa` | 96, 165, 250 | `rgba(59,130,246,0.6)` |
| Check Updates | `#34d399` | 52, 211, 153 | `rgba(34,197,94,0.6)` |
| View Logs | `#fbbf24` | 251, 191, 36 | `rgba(234,179,8,0.6)` |
| Network | `#a78bfa` | 167, 139, 250 | `rgba(168,85,247,0.6)` |
| Power Off | `#fb7185` | 251, 113, 133 | `rgba(239,68,68,0.6)` |
| Back to Synq | `#818cf8` | 129, 140, 248 | `rgba(99,102,241,0.6)` |

#### Global Colors
| Usage | Hex | RGBA |
|-------|-----|------|
| Background | `#0a0a0f` | `rgba(10,10,15,1)` |
| Status bar bg | `transparent` → `#000000` | Gradient from top |
| Status text | `#ffffff` | `rgba(255,255,255,0.4)` |
| Status dot (ready) | `#10b981` | `rgba(16,185,129,1)` |
| Divider dot | `#ffffff` | `rgba(255,255,255,0.4)` |
| Center line | `#ffffff` | `rgba(255,255,255,0.06)` |

---

### Typography

| Element | Font | Weight | Size | Line Height | Letter Spacing |
|---------|------|--------|------|-------------|----------------|
| Icon | System UI Symbol | 400 | 48px | 1.0 | normal |
| Active label | Inter | 700 | 20px | 28px | 0.05em |
| Inactive label | Inter | 400–600 | 11–16px | 1.4 | normal |
| Status bar | Inter | 400 | 12px | 16px | 0.02em |
| Hints | Inter | 400 | 12px | 16px | normal |

---

### Spacing & Layout

| Property | Value |
|----------|-------|
| Item height | 96px |
| Item gap | 0px (items touch, visual separation via scale) |
| Icon-to-label gap | 16px |
| Container max-width | 480px |
| Container vertical padding | 0px (items determine height) |
| Status bar height | 64px |
| Status bar padding | 32px horizontal |
| Safe zone (all sides) | 160px |

---

### Animations & Transitions

| Animation | Duration | Easing | Properties |
|-----------|----------|--------|------------|
| Spring snap | Physics-based | `spring(stiffness: 12, damping: 0.88)` | scrollOffset |
| Glow pulse | 2000ms | `cubic-bezier(0.4, 0, 0.6, 1)` | opacity, brightness |
| Scale pop | 150ms | `ease-out` | transform scale |
| Opacity fade | 100ms | `linear` | opacity |
| Blur transition | 200ms | `ease-out` | filter blur |
| Status dot pulse | 2000ms | `ease-in-out` | opacity, scale |

---
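The "Spring snap" row above implies a per-frame integration of `scrollOffset` toward the snap target. One plausible fixed-timestep formulation, assuming stiffness scales acceleration and damping decays velocity each step (an illustrative sketch, not the shell's actual animation code):

```rust
// One integration step of a damped spring pulling `offset` toward `target`.
// stiffness: 12, damping: 0.88, per the animation table.
fn spring_step(offset: f32, velocity: f32, target: f32, dt: f32) -> (f32, f32) {
    let stiffness = 12.0;
    let damping = 0.88;
    let accel = (target - offset) * stiffness;
    let v = (velocity + accel * dt) * damping;
    (offset + v * dt, v)
}

fn main() {
    // Settle from offset 0 toward the next item at 96px (one item height),
    // stepping at 60 fps.
    let (mut x, mut v) = (0.0_f32, 0.0_f32);
    for _ in 0..600 {
        let (nx, nv) = spring_step(x, v, 96.0, 1.0 / 60.0);
        x = nx;
        v = nv;
    }
    println!("{:.1}", x);
}
```

With both eigenvalues of the discrete system below 1, the offset converges monotonically onto the snap target instead of oscillating past it.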

### Touch Targets

| Input | Minimum Size | Behavior |
|-------|-------------|----------|
| Touch item | 96×96px | Direct manipulation, 1:1 scroll tracking |
| Wheel step | 1 tick | Snap to next/prev item |
| Keyboard repeat | 200ms delay, 50ms interval | Continuous scroll |
| Swipe velocity threshold | >500px/s | Momentum scroll with decay |
| Snap threshold | <0.3 item height | Auto-snap to nearest |

---
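The snap-threshold row can be made concrete: with 96px items, auto-snap fires once the offset comes within 0.3 item heights of an item center, otherwise momentum keeps scrolling. A sketch under those assumptions (`auto_snap` is an illustrative helper, not repository code):

```rust
// Returns Some(item_index) when the offset is close enough to snap,
// None while momentum scrolling should continue.
fn auto_snap(scroll_offset: f32, item_height: f32) -> Option<usize> {
    let nearest = (scroll_offset / item_height).round();
    let dist = (scroll_offset - nearest * item_height).abs();
    if dist < 0.3 * item_height {
        Some(nearest as usize)
    } else {
        None
    }
}

fn main() {
    // 100px is 4px from item 1's center (within 28.8px) -> snaps;
    // 144px sits exactly between items -> keeps scrolling.
    println!("{:?} {:?}", auto_snap(100.0, 96.0), auto_snap(144.0, 96.0));
}
```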

### Responsive Behavior

| Viewport | Adjustment |
|----------|-----------|
| <1280px | Scale all items 0.8×, reduce visible range to ±2 |
| <768px (tablet) | Full touch, increase item height to 120px |
| <480px (phone) | Single column, icon only (no labels), tap to expand |
| >2560px (4K) | Scale 1.2×, increase blur quality |

---

### States

| State | Visual |
|-------|--------|
| Default | As specified above |
| Hover (mouse) | Item scales to 1.1×, glow intensifies 20% |
| Press (touch/mouse down) | Scale compresses to 0.95×, brightness +10% |
| Loading (after select) | Icon rotates, label changes to "Processing..." |
| Error | Icon turns red, brief shake animation (±4px, 300ms) |
| Success | Brief green flash overlay, then transition |

---

### Assets

| Asset | Format | Size | Notes |
|-------|--------|------|-------|
| Icons | Unicode / System font | 48px | No image assets needed |
| Background orbs | CSS gradient | N/A | Generated in browser |
| Glow effects | CSS box-shadow | N/A | GPU-accelerated |
| Cursor | Hidden | N/A | `cursor: none` in kiosk mode |

---

### Accessibility

| Feature | Implementation |
|---------|----------------|
| Focus ring | 2px solid `rgba(accent, 0.8)`, offset 4px |
| Screen reader | `aria-label` on each item, `role="menuitem"` |
| Reduced motion | Disable spring physics, instant snap, no glow pulse |
| High contrast | White icons/labels on pure black, no transparency |
| Minimum contrast | 7:1 for active items, 4.5:1 for ±1 items |

---

### Export Notes

- All measurements in **pixels** at 1× density
- Colors in **Hex** and **RGBA** for web implementation
- Fonts: Inter (Google Fonts), system-ui fallback
- Icons: Unicode symbols or Phosphor Icons (if custom needed)
- No raster images — pure CSS/Tailwind implementation
24	launch-intel-dashboard.sh	Executable file
@@ -0,0 +1,24 @@
#!/bin/bash
set -e

echo "╔═══════════════════════════════════════════════╗"
echo "║ Synq Intelligence Dashboard Launcher ║"
echo "╚═══════════════════════════════════════════════╝"

# Build Rust backend
echo "[1/3] Building Rust backend (synq-intel)..."
cd "$(dirname "$0")"
cargo build -p synq-intel --release

# Start backend in background
echo "[2/3] Starting API server on :3001..."
./target/release/synq-intel-server &
SERVER_PID=$!
trap "kill $SERVER_PID 2>/dev/null; exit" INT TERM EXIT

sleep 2

# Start frontend
echo "[3/3] Starting Vite dev server..."
cd ui/intel-dashboard
npm run dev
227	migrations/003_swarm_and_finance.sql	Normal file
@@ -0,0 +1,227 @@
-- Migration: Corporate Swarm Architecture + Finance Channel
-- Phase 1 of Synq Core v2.0

-- ─── Swarm Tasks ───
CREATE TABLE IF NOT EXISTS swarm_tasks (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    channel TEXT NOT NULL CHECK (channel IN ('finance', 'clinical', 'content', 'intelligence', 'engineering', 'travel')),
    lead_agent_id TEXT NOT NULL,
    status TEXT NOT NULL CHECK (status IN (
        'Pending', 'InProgress', 'CloudAnalyzing', 'AnalysisReturned',
        'StagedForReview', 'RevisionsRequested', 'Approved', 'Committed'
    )),
    original_intent_text TEXT NOT NULL,
    assigned_direct_report TEXT,
    cloud_offload_id UUID,
    staged_output TEXT,
    revision_notes TEXT,
    approved_by TEXT,
    committed_at TIMESTAMPTZ,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_swarm_tasks_channel ON swarm_tasks(channel);
CREATE INDEX idx_swarm_tasks_status ON swarm_tasks(status);
CREATE INDEX idx_swarm_tasks_created ON swarm_tasks(created_at DESC);
CREATE INDEX idx_swarm_tasks_lead ON swarm_tasks(lead_agent_id);

-- ─── Cross-Channel Event Bus Log ───
CREATE TABLE IF NOT EXISTS channel_events (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    from_channel TEXT NOT NULL,
    from_title TEXT NOT NULL,
    to_channel TEXT NOT NULL,
    event_type TEXT NOT NULL,
    payload JSONB NOT NULL DEFAULT '{}',
    requires_approval BOOLEAN NOT NULL DEFAULT true,
    approved_by TEXT,
    approved_at TIMESTAMPTZ,
    timestamp TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_channel_events_to_channel ON channel_events(to_channel);
CREATE INDEX idx_channel_events_timestamp ON channel_events(timestamp DESC);
CREATE INDEX idx_channel_events_type ON channel_events(event_type);

-- ─── Cloud Offload Log ───
CREATE TABLE IF NOT EXISTS cloud_offload_log (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    task_id UUID NOT NULL REFERENCES swarm_tasks(id) ON DELETE CASCADE,
    intent_category TEXT NOT NULL,
    sanitized_query TEXT NOT NULL,
    entity_id_hash TEXT,
    revenue_range TEXT,
    zip_code TEXT,
    industry_hint TEXT,
    local_cache_fingerprint TEXT,
    cloud_backend TEXT,
    result_json JSONB,
    sources TEXT[],
    confidence FLOAT,
    cached BOOLEAN NOT NULL DEFAULT false,
    error_message TEXT,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    completed_at TIMESTAMPTZ
);

CREATE INDEX idx_cloud_offload_task ON cloud_offload_log(task_id);
CREATE INDEX idx_cloud_offload_created ON cloud_offload_log(created_at DESC);

-- ─── Connector Registry ───
CREATE TABLE IF NOT EXISTS connector_registry (
    id TEXT PRIMARY KEY,
    name TEXT NOT NULL,
    tier INTEGER NOT NULL CHECK (tier IN (1, 2, 3)),
    connector_type TEXT NOT NULL,
    description TEXT,
    base_url TEXT,
    env_vars TEXT[],
    endpoints JSONB NOT NULL DEFAULT '{}',
    rate_limit TEXT,
    cache_ttl INTEGER,
    phi_safe BOOLEAN NOT NULL DEFAULT true,
    write_capable BOOLEAN NOT NULL DEFAULT false,
    cost_description TEXT,
    user_supplies_key BOOLEAN NOT NULL DEFAULT false,
    config JSONB NOT NULL DEFAULT '{}',
    enabled BOOLEAN NOT NULL DEFAULT false,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_connector_registry_tier ON connector_registry(tier);
CREATE INDEX idx_connector_registry_enabled ON connector_registry(enabled);

-- ─── Deal Tracker (Finance Channel) ───
CREATE TABLE IF NOT EXISTS deal_tracker (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    entity_id TEXT NOT NULL,
    target_name TEXT NOT NULL,
    target_type TEXT,
    location TEXT,
    stage TEXT NOT NULL CHECK (stage IN ('sourced', 'contacted', 'loi', 'due_diligence', 'closed', 'dead')),
    estimated_value_range TEXT,
    ebitda_range TEXT,
    multiple_range TEXT,
    assigned_analyst TEXT,
    notes TEXT,
    documents JSONB NOT NULL DEFAULT '[]',
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    closed_at TIMESTAMPTZ
);

CREATE INDEX idx_deal_tracker_stage ON deal_tracker(stage);
CREATE INDEX idx_deal_tracker_entity ON deal_tracker(entity_id);
CREATE INDEX idx_deal_tracker_created ON deal_tracker(created_at DESC);

-- ─── Competitive Dashboard Snapshots ───
CREATE TABLE IF NOT EXISTS competitive_snapshots (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    entity_id TEXT NOT NULL,
    industry TEXT NOT NULL,
    location TEXT,
    snapshot_date TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    peers JSONB NOT NULL DEFAULT '[]',
    percentile_rankings JSONB NOT NULL DEFAULT '{}',
    threat_alerts JSONB NOT NULL DEFAULT '[]',
    data_sources TEXT[]
);

CREATE INDEX idx_competitive_snapshots_entity ON competitive_snapshots(entity_id);
CREATE INDEX idx_competitive_snapshots_date ON competitive_snapshots(snapshot_date DESC);

-- ─── Valuation History ───
CREATE TABLE IF NOT EXISTS valuation_history (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    entity_id TEXT NOT NULL,
    valuation_date TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    method TEXT NOT NULL CHECK (method IN ('dcf', 'comps', 'precedent', 'all')),
    dcf_range TEXT,
    comps_range TEXT,
    precedent_range TEXT,
    final_range TEXT,
    assumptions JSONB NOT NULL DEFAULT '{}',
    sensitivity JSONB NOT NULL DEFAULT '{}',
    prepared_by TEXT,
    approved_by TEXT,
    staged BOOLEAN NOT NULL DEFAULT true
);

CREATE INDEX idx_valuation_history_entity ON valuation_history(entity_id);
CREATE INDEX idx_valuation_history_date ON valuation_history(valuation_date DESC);

-- ─── Vendor Risk Screenings ───
CREATE TABLE IF NOT EXISTS vendor_screenings (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    entity_id TEXT NOT NULL,
    counterparty_name TEXT NOT NULL,
    contract_value TEXT,
    payment_terms TEXT,
    risk_score INTEGER CHECK (risk_score BETWEEN 1 AND 10),
    red_flags TEXT[],
    recommended_terms TEXT,
    data_sources TEXT[],
    screened_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    approved_by TEXT,
    staged BOOLEAN NOT NULL DEFAULT true
);

CREATE INDEX idx_vendor_screenings_entity ON vendor_screenings(entity_id);
CREATE INDEX idx_vendor_screenings_risk ON vendor_screenings(risk_score);

-- ─── Memory Namespaces (for swarm agent memory isolation) ───
CREATE TABLE IF NOT EXISTS memory_namespaces (
    id TEXT PRIMARY KEY,
    channel TEXT NOT NULL,
    role_title TEXT NOT NULL,
    description TEXT,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Seed default namespaces
INSERT INTO memory_namespaces (id, channel, role_title, description)
VALUES
    ('finance/cfo/', 'finance', 'CFO', 'Finance channel lead orchestrator'),
    ('finance/controller/', 'finance', 'Controller', 'GL, recon, month-end close'),
    ('finance/financial-analyst/', 'finance', 'Financial Analyst', 'Modeling, P&L, forecasting, M&A'),
    ('finance/internal-auditor/', 'finance', 'Internal Auditor', 'Compliance, QC, audit trails'),
    ('finance/tax-manager/', 'finance', 'Tax Manager', 'Tax strategy, S-Corp, crypto'),
    ('clinical/chief-medical-officer/', 'clinical', 'Chief Medical Officer', 'Clinical channel lead'),
    ('clinical/attending-physician/', 'clinical', 'Attending Physician', 'Patient care, prescriptions'),
    ('clinical/nurse/', 'clinical', 'Nurse', 'Patient care coordination'),
    ('clinical/medical-assistant/', 'clinical', 'Medical Assistant', 'Clinical support'),
    ('clinical/scheduler/', 'clinical', 'Scheduler', 'Appointments and scheduling'),
    ('content/editor-in-chief/', 'content', 'Editor-in-Chief', 'Content channel lead'),
    ('content/staff-writer/', 'content', 'Staff Writer', 'Content creation'),
    ('content/copy-editor/', 'content', 'Copy Editor', 'Content review and editing'),
    ('content/community-manager/', 'content', 'Community Manager', 'Social engagement'),
    ('intelligence/director-of-intelligence/', 'intelligence', 'Director of Intelligence', 'Intel channel lead'),
    ('intelligence/senior-researcher/', 'intelligence', 'Senior Researcher', 'Deep research and analysis'),
    ('intelligence/fact-checker/', 'intelligence', 'Fact-Checker', 'Claim verification'),
    ('intelligence/publisher/', 'intelligence', 'Publisher', 'Report publication'),
    ('engineering/cto/', 'engineering', 'CTO', 'Engineering channel lead'),
    ('engineering/senior-engineer/', 'engineering', 'Senior Engineer', 'Code and architecture'),
    ('engineering/qa-engineer/', 'engineering', 'QA Engineer', 'Quality assurance'),
    ('engineering/security-engineer/', 'engineering', 'Security Engineer', 'Security and compliance'),
    ('travel/travel-director/', 'travel', 'Travel Director', 'Travel channel lead'),
    ('travel/booking-agent/', 'travel', 'Booking Agent', 'Flight and hotel booking'),
    ('travel/price-analyst/', 'travel', 'Price Analyst', 'Price tracking and alerts'),
    ('travel/support-rep/', 'travel', 'Support Rep', 'Travel support')
ON CONFLICT (id) DO NOTHING;

-- ─── Update trigger for swarm_tasks.updated_at ───
CREATE OR REPLACE FUNCTION update_swarm_task_updated_at()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = NOW();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS trigger_swarm_tasks_updated_at ON swarm_tasks;
CREATE TRIGGER trigger_swarm_tasks_updated_at
    BEFORE UPDATE ON swarm_tasks
    FOR EACH ROW
    EXECUTE FUNCTION update_swarm_task_updated_at();
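The status CHECK constraint on `swarm_tasks` enforces the same eight states as the Rust `TaskStatus` enum, so application code can validate against one shared list before ever hitting the database. A standalone sketch of that guard (the constant and helper are illustrations, not code from this commit):

```rust
// The eight states accepted by the swarm_tasks CHECK constraint.
const STATUSES: [&str; 8] = [
    "Pending", "InProgress", "CloudAnalyzing", "AnalysisReturned",
    "StagedForReview", "RevisionsRequested", "Approved", "Committed",
];

// Reject a status string before issuing an UPDATE that the CHECK would fail.
fn is_valid_status(s: &str) -> bool {
    STATUSES.contains(&s)
}

fn main() {
    println!("{} {}", is_valid_status("StagedForReview"), is_valid_status("Done"));
}
```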
97	scripts/import_epstein_to_nocobase.py	Normal file
@@ -0,0 +1,97 @@
|
|||
#!/usr/bin/env python3
"""Import Epstein dataset from TypeScript file into NocoBase."""

import json
import re

import requests

NOCOBASE_URL = "http://localhost:13000"
ADMIN_EMAIL = "admin@nocobase.com"
ADMIN_PASSWORD = "admin123"
DATASET_PATH = "/home/raider1984/synq-core-runtime/ui/intel-dashboard/src/data/epsteinDataset.ts"


def get_token():
    r = requests.post(
        f"{NOCOBASE_URL}/api/auth:signIn",
        json={"email": ADMIN_EMAIL, "password": ADMIN_PASSWORD},
    )
    r.raise_for_status()
    return r.json()["data"]["token"]


def parse_entities():
    with open(DATASET_PATH, "r") as f:
        content = f.read()

    match = re.search(r"export const EPSTEIN_ENTITIES: Entity\[\] = (\[.*?\]);", content, re.DOTALL)
    if not match:
        match = re.search(r"export const EPSTEIN_ENTITIES = (\[.*?\]);", content, re.DOTALL)

    if not match:
        raise ValueError("Could not extract entities from dataset")

    return json.loads(match.group(1))


def import_entities(token, entities):
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }

    total = len(entities)
    imported = 0
    print(f"Importing {total} entities into NocoBase...")

    for i, entity in enumerate(entities):
        # Flatten nested fields (lists/dicts) into JSON strings for NocoBase columns
        payload = {
            "id": entity["id"],
            "type": entity["type"],
            "name": entity["name"],
            "aliases": json.dumps(entity.get("aliases", [])),
            "role": entity.get("role", ""),
            "location": json.dumps(entity.get("location", {})),
            "tags": json.dumps(entity.get("tags", [])),
            "credibility": entity.get("credibility", "unverified"),
            "sources": json.dumps(entity.get("sources", [])),
            "state": entity.get("state", "active"),
            "createdAt": entity.get("createdAt", ""),
            "updatedAt": entity.get("updatedAt", ""),
            "createdBy": entity.get("createdBy", "import"),
            "layers": json.dumps(entity.get("layers", [])),
        }

        # Add optional business fields
        for field in [
            "registrationNumber", "jurisdiction", "incorporationDate",
            "parentCompany", "subsidiaries", "beneficialOwners",
            "revenue", "valuation", "fundingRounds", "executives",
        ]:
            if field in entity:
                val = entity[field]
                payload[field] = json.dumps(val) if isinstance(val, (list, dict)) else val

        r = requests.post(
            f"{NOCOBASE_URL}/api/entities:create",
            headers=headers,
            json=payload,
        )

        if r.status_code == 200:
            imported += 1
            if (i + 1) % 100 == 0 or i == 0:
                print(f"  Imported {i + 1}/{total}...")
        else:
            print(f"  ERROR at {i} ({entity['id']}): {r.status_code} - {r.text[:200]}")

    print(f"Done. Imported {imported}/{total} entities.")


if __name__ == "__main__":
    token = get_token()
    entities = parse_entities()
    import_entities(token, entities)
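The regex extraction above only works because the TypeScript array literal happens to be valid JSON (double-quoted keys, no trailing commas); if the dataset file drifts from that shape, `json.loads` will raise. A minimal sketch of the same extraction on a hypothetical inline sample:

```python
import json
import re

# Hypothetical miniature of epsteinDataset.ts: valid JSON inside a TS export.
sample = 'export const EPSTEIN_ENTITIES = [{"id": "e1", "type": "person", "name": "Example"}];'

match = re.search(r"export const EPSTEIN_ENTITIES = (\[.*?\]);", sample, re.DOTALL)
entities = json.loads(match.group(1))
print(entities[0]["id"])  # -> e1
```

The non-greedy `.*?` with `re.DOTALL` stops at the first `];`, so a `];` occurring inside a string value would truncate the match; the real script tolerates this only because the dataset avoids that pattern.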
45  scripts/prefetch_blue_marble.py  Normal file
@@ -0,0 +1,45 @@
#!/usr/bin/env python3
"""Pre-fetch NASA Blue Marble tiles for offline/air-gapped use."""

import sys
from pathlib import Path

import requests

BASE_URL = "https://gibs.earthdata.nasa.gov/wmts/epsg3857/best/BlueMarble_ShadedRelief_Bathymetry/default/GoogleMapsCompatible_Level8/{z}/{y}/{x}.jpeg"
CACHE_DIR = Path("/home/raider1984/synq-data/pmtiles/tiles/blue-marble")


def prefetch(max_zoom=4):
    total_downloaded = 0
    total_skipped = 0

    for z in range(max_zoom + 1):
        max_tile = 2 ** z
        zoom_downloaded = 0

        for x in range(max_tile):
            for y in range(max_tile):
                cache_path = CACHE_DIR / str(z) / str(x) / f"{y}.jpg"
                if cache_path.exists():
                    total_skipped += 1
                    continue

                url = BASE_URL.format(z=z, x=x, y=y)
                try:
                    r = requests.get(url, timeout=15)
                    if r.status_code == 200:
                        cache_path.parent.mkdir(parents=True, exist_ok=True)
                        cache_path.write_bytes(r.content)
                        zoom_downloaded += 1
                        total_downloaded += 1
                except Exception as e:
                    print(f"[ERROR] z={z} x={x} y={y}: {e}")

        print(f"Zoom {z}: {zoom_downloaded} downloaded, {max_tile * max_tile - zoom_downloaded} skipped/already cached")

    print(f"\nDone. Total downloaded: {total_downloaded}, Total skipped: {total_skipped}")
    print(f"Cache location: {CACHE_DIR}")


if __name__ == "__main__":
    max_z = int(sys.argv[1]) if len(sys.argv) > 1 else 4
    print(f"Pre-fetching Blue Marble tiles up to zoom {max_z}...")
    prefetch(max_z)
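Zoom level z of a Web Mercator pyramid contains 4^z tiles, so prefetching zooms 0 through N touches (4^(N+1) - 1) / 3 tiles in total: 341 for the script's default max_zoom=4, and 87,381 for the full Blue Marble z0-8 pyramid. A quick check of that arithmetic:

```python
def tiles_through_zoom(max_zoom: int) -> int:
    # Sum of 4**z for z = 0..max_zoom; closed form is (4**(max_zoom + 1) - 1) // 3.
    return sum(4 ** z for z in range(max_zoom + 1))

print(tiles_through_zoom(4))  # -> 341
print(tiles_through_zoom(8))  # -> 87381
```

This is why the per-region, bounded-box approach of the satellite prefetcher below matters: a global prefetch grows 4x per zoom level and becomes impractical past single-digit zooms.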
149  scripts/prefetch_satellite_regions.py  Normal file
@@ -0,0 +1,149 @@
#!/usr/bin/env python3
"""Pre-fetch satellite (MODIS/VIIRS TrueColor) tiles for key regions.

Downloads directly from NASA GIBS with concurrent workers and local caching.
Regions: Southern California, GCC, Iran, Israel
"""

import math
import sys
from concurrent.futures import ThreadPoolExecutor, as_completed
from pathlib import Path
from threading import Lock

import requests

CACHE_DIR = Path("/home/raider1984/synq-data/pmtiles/tiles/satellite")
TILE_URL = "https://gibs.earthdata.nasa.gov/wmts/epsg3857/best/VIIRS_SNPP_CorrectedReflectance_TrueColor/default/GoogleMapsCompatible_Level9/{z}/{x}/{y}.jpeg"
MAX_ZOOM = 9

# (lat_min, lat_max, lng_min, lng_max)
REGIONS = {
    "southern_california": (32.0, 35.5, -121.0, -114.0),
    "gcc": (16.0, 32.5, 47.0, 57.0),
    "iran": (25.0, 40.0, 44.0, 63.5),
    "israel": (29.5, 34.0, 34.0, 36.5),
}

ZOOM_LEVELS = [5, 6, 7, 8, 9]
MAX_WORKERS = 16
SESSION = requests.Session()
SESSION.headers.update({"User-Agent": "Synq-Intel-TilePrefetch/1.0"})

stats_lock = Lock()


def latlng_to_tile(lat, lng, zoom):
    n = 2 ** zoom
    x = int((lng + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y


def get_tile_range(lat_min, lat_max, lng_min, lng_max, zoom):
    x1, y1 = latlng_to_tile(lat_max, lng_min, zoom)
    x2, y2 = latlng_to_tile(lat_min, lng_max, zoom)
    return min(x1, x2), max(x1, x2), min(y1, y2), max(y1, y2)


def fetch_tile(z, x, y):
    cache_path = CACHE_DIR / str(z) / str(x) / f"{y}.jpg"
    if cache_path.exists():
        return True, "cached", z

    url = TILE_URL.format(z=z, x=x, y=y)
    try:
        r = SESSION.get(url, timeout=20)
        if r.status_code == 200:
            cache_path.parent.mkdir(parents=True, exist_ok=True)
            with open(cache_path, "wb") as f:
                f.write(r.content)
            return True, "downloaded", z
        else:
            return False, f"http_{r.status_code}", z
    except Exception as e:
        # Must stay a 3-tuple: callers unpack (success, status, z).
        return False, str(e)[:50], z


def prefetch_region(name, lat_min, lat_max, lng_min, lng_max, zoom_levels):
    print(f"\n📍 {name.upper()}")
    print(f"   Bounds: lat {lat_min}–{lat_max}, lng {lng_min}–{lng_max}")
    sys.stdout.flush()

    all_tiles = []
    for z in zoom_levels:
        if z > MAX_ZOOM:
            continue
        x_min, x_max, y_min, y_max = get_tile_range(lat_min, lat_max, lng_min, lng_max, z)
        max_tile = (2 ** z) - 1
        x_min = max(0, x_min)
        x_max = min(max_tile, x_max)
        y_min = max(0, y_min)
        y_max = min(max_tile, y_max)
        for x in range(x_min, x_max + 1):
            for y in range(y_min, y_max + 1):
                all_tiles.append((z, x, y))

    zoom_stats = {z: {"total": 0, "downloaded": 0, "cached": 0, "errors": 0} for z in zoom_levels if z <= MAX_ZOOM}
    total_tiles = len(all_tiles)
    completed = 0
    downloaded = 0
    cached = 0
    errors = 0

    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as executor:
        future_to_tile = {executor.submit(fetch_tile, z, x, y): (z, x, y) for z, x, y in all_tiles}
        for future in as_completed(future_to_tile):
            success, status, z = future.result()
            with stats_lock:
                completed += 1
                zoom_stats[z]["total"] += 1
                if success:
                    if status == "downloaded":
                        downloaded += 1
                        zoom_stats[z]["downloaded"] += 1
                    else:
                        cached += 1
                        zoom_stats[z]["cached"] += 1
                else:
                    errors += 1
                    zoom_stats[z]["errors"] += 1

                if completed % 100 == 0 or completed == total_tiles:
                    pct = completed * 100 // total_tiles if total_tiles > 0 else 0
                    print(f"\r   Progress: {completed}/{total_tiles} ({pct}%) | DL:{downloaded} | Cache:{cached} | Err:{errors}", end="")
                    sys.stdout.flush()

    print()
    for z in zoom_levels:
        if z > MAX_ZOOM:
            continue
        st = zoom_stats[z]
        if st["total"] > 0:
            print(f"   Zoom {z:2d}: {st['total']:5d} tiles | {st['downloaded']:5d} downloaded | {st['cached']:5d} cached | {st['errors']:3d} errors")

    print(f"   TOTAL: {total_tiles} tiles | {downloaded} downloaded | {cached} cached | {errors} errors")
    return total_tiles, downloaded, cached, errors


if __name__ == "__main__":
    print("=" * 70)
    print("NASA GIBS SATELLITE TILE PREFETCH (VIIRS SNPP TrueColor)")
    print(f"Workers: {MAX_WORKERS} | Zooms: {ZOOM_LEVELS} | MaxZoom: {MAX_ZOOM}")
    print("=" * 70)
    sys.stdout.flush()

    grand_total = [0, 0, 0, 0]

    for name, (lat_min, lat_max, lng_min, lng_max) in REGIONS.items():
        counts = prefetch_region(name, lat_min, lat_max, lng_min, lng_max, ZOOM_LEVELS)
        for i in range(4):
            grand_total[i] += counts[i]

    print("\n" + "=" * 70)
    print("GRAND TOTAL")
    print(f"  {grand_total[0]} tiles checked")
    print(f"  {grand_total[1]} downloaded")
    print(f"  {grand_total[2]} already cached")
    print(f"  {grand_total[3]} errors")
    print(f"\nCache location: {CACHE_DIR}")
    print("=" * 70)
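The Web Mercator conversion above can be sanity-checked at known points: at zoom 1 the globe is a 2×2 grid, so (lat 0, lng 0) falls in tile (1, 1), and Los Angeles (about 34.05, -118.24) at zoom 9 falls in tile (87, 204). A self-contained restatement of the same formula:

```python
import math

def latlng_to_tile(lat: float, lng: float, zoom: int) -> tuple[int, int]:
    # Standard slippy-map / Web Mercator tile indexing, same as the script above.
    n = 2 ** zoom
    x = int((lng + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

print(latlng_to_tile(0.0, 0.0, 1))        # -> (1, 1)
print(latlng_to_tile(34.05, -118.24, 9))  # -> (87, 204)
```

Note the y axis grows southward (tile row 0 is at the top of the projection), which is why `get_tile_range` passes `lat_max` for the first corner and `lat_min` for the second.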
174  scripts/tile_server.py  Normal file
@@ -0,0 +1,174 @@
#!/usr/bin/env python3
"""Local tile cache server for Synq Intelligence Dashboard.

Downloads tiles from NASA GIBS on-demand and caches them locally.
Sources:
  - blue-marble: NASA Blue Marble Shaded Relief + Bathymetry (zoom 0-8)
  - satellite:   NASA MODIS/VIIRS daily True Color (zoom 0-9)
  - openfreemap: OSM raster fallback (zoom 0-19)

Serves tiles in a format compatible with CesiumJS UrlTemplateImageryProvider.
"""

import http.server
import os
import socketserver
import threading
from pathlib import Path
from urllib.parse import urlparse

import requests

PORT = 8082
CACHE_DIR = Path("/home/raider1984/synq-data/pmtiles/tiles")

# NASA GIBS tile endpoints (EPSG:3857, Google Maps Compatible)
TILE_SOURCES = {
    "blue-marble": {
        "url": "https://gibs.earthdata.nasa.gov/wmts/epsg3857/best/BlueMarble_ShadedRelief_Bathymetry/default/GoogleMapsCompatible_Level8/{z}/{x}/{y}.jpeg",
        "max_zoom": 8,
    },
    "satellite": {
        "url": "https://gibs.earthdata.nasa.gov/wmts/epsg3857/best/VIIRS_SNPP_CorrectedReflectance_TrueColor/default/GoogleMapsCompatible_Level9/{z}/{x}/{y}.jpeg",
        "max_zoom": 9,
    },
    "openfreemap": {
        "url": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
        "max_zoom": 19,
    },
}

# Ensure cache directory exists
CACHE_DIR.mkdir(parents=True, exist_ok=True)


class TileHandler(http.server.SimpleHTTPRequestHandler):
    def log_message(self, format, *args):
        # Reduce noise; only log non-200s and downloads
        msg = format % args
        if "200" in msg and "maps" in msg:
            return
        print(f"[{self.log_date_time_string()}] {msg}")

    def do_GET(self):
        parsed = urlparse(self.path)
        path = parsed.path

        if path.startswith("/maps/blue-marble/"):
            self.serve_tile(path, "blue-marble", "jpg")
        elif path.startswith("/maps/sentinel2/"):
            # Legacy endpoint - route to satellite (MODIS/VIIRS)
            self.serve_tile(path, "satellite", "jpg")
        elif path.startswith("/maps/satellite/"):
            self.serve_tile(path, "satellite", "jpg")
        elif path.startswith("/maps/openfreemap/"):
            self.serve_tile(path, "openfreemap", "png")
        else:
            self.send_response(404)
            self.end_headers()

    def serve_tile(self, path, source, ext):
        parts = path.strip("/").split("/")
        if len(parts) != 5:
            self.send_response(404)
            self.end_headers()
            return

        _, _, z_str, x_str, y_file = parts
        y = y_file.split(".")[0]
        z = int(z_str)

        config = TILE_SOURCES.get(source)
        if not config:
            self.send_response(404)
            self.end_headers()
            return

        # Clamp to source max zoom
        if z > config["max_zoom"]:
            # Return 204 for missing high-zoom tiles instead of 404 spam
            self.send_response(204)
            self.end_headers()
            return

        cache_path = CACHE_DIR / source / z_str / x_str / f"{y}.{ext}"

        if cache_path.exists():
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg" if ext == "jpg" else "image/png")
            self.send_header("Access-Control-Allow-Origin", "*")
            self.send_header("Cache-Control", "public, max-age=86400")
            self.end_headers()
            with open(cache_path, "rb") as f:
                self.wfile.write(f.read())
            return

        # Download from upstream
        url = config["url"].format(z=z_str, x=x_str, y=y)
        try:
            r = requests.get(url, timeout=15)
            if r.status_code == 200:
                cache_path.parent.mkdir(parents=True, exist_ok=True)
                with open(cache_path, "wb") as f:
                    f.write(r.content)

                self.send_response(200)
                self.send_header("Content-Type", "image/jpeg" if ext == "jpg" else "image/png")
                self.send_header("Access-Control-Allow-Origin", "*")
                self.send_header("Cache-Control", "public, max-age=86400")
                self.end_headers()
                self.wfile.write(r.content)
                print(f"[DOWNLOAD] {source} z={z_str} x={x_str} y={y}")
                return
            else:
                print(f"[MISS] {source} z={z_str} x={x_str} y={y} -> HTTP {r.status_code}")
                self.send_response(404)
                self.end_headers()
        except Exception as e:
            print(f"[ERROR] {source} z={z_str} x={x_str} y={y}: {e}")
            self.send_response(500)
            self.end_headers()


def prefetch_tiles(source, zoom_levels):
    """Pre-fetch low-zoom tiles for global coverage."""
    config = TILE_SOURCES.get(source)
    if not config:
        return
    for z in zoom_levels:
        if z > config["max_zoom"]:
            continue
        max_tile = 2 ** z
        for x in range(max_tile):
            for y in range(max_tile):
                ext = "jpg" if source != "openfreemap" else "png"
                cache_path = CACHE_DIR / source / str(z) / str(x) / f"{y}.{ext}"
                if cache_path.exists():
                    continue
                url = config["url"].format(z=z, x=x, y=y)
                try:
                    r = requests.get(url, timeout=15)
                    if r.status_code == 200:
                        cache_path.parent.mkdir(parents=True, exist_ok=True)
                        with open(cache_path, "wb") as f:
                            f.write(r.content)
                        print(f"[PREFETCH] {source} z={z} x={x} y={y}")
                except Exception as e:
                    print(f"[PREFETCH ERROR] {source} z={z} x={x} y={y}: {e}")


if __name__ == "__main__":
    # Stop go-pmtiles container to free port 8082
    os.system("docker stop synq-pmtiles 2>/dev/null")

    # Start background prefetch for both base layers
    for src in ["blue-marble", "satellite"]:
        t = threading.Thread(target=prefetch_tiles, args=(src, [0, 1, 2]))
        t.daemon = True
        t.start()

    with socketserver.ThreadingTCPServer(("", PORT), TileHandler) as httpd:
        print(f"Tile server running on http://localhost:{PORT}")
        print(f"Cache directory: {CACHE_DIR}")
        print("Sources: blue-marble (z0-8), satellite/MODIS (z0-9), openfreemap (z0-19)")
        print("Pre-fetching zoom 0-2 tiles in background...")
        httpd.serve_forever()
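`serve_tile` accepts only five-segment paths such as `/maps/satellite/5/17/12.jpg` and 404s anything else. The split logic in isolation (the example path is hypothetical):

```python
from urllib.parse import urlparse

# Query strings are stripped before splitting, as in the handler's do_GET.
path = urlparse("/maps/satellite/5/17/12.jpg?cachebust=1").path
parts = path.strip("/").split("/")
assert len(parts) == 5  # the handler returns 404 otherwise

_, source, z_str, x_str, y_file = parts
y = y_file.split(".")[0]  # drop the ".jpg" extension
print(source, z_str, x_str, y)  # -> satellite 5 17 12
```

One caveat the handler does not cover: `int(z_str)` raises `ValueError` on a non-numeric zoom segment, which would surface as an unhandled exception rather than a clean 404.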
176  scripts/unified_tile_server.py  Normal file
@@ -0,0 +1,176 @@
#!/usr/bin/env python3
"""Unified tile proxy server for Synq Intelligence Dashboard.

Caches tiles locally from multiple sources:
  - NASA GIBS Blue Marble (EPSG:3857, zoom 0-8)
  - NASA GIBS VIIRS SNPP TrueColor (EPSG:3857, zoom 0-9)
  - OpenStreetMap raster (temporary, until OpenFreeMap is downloaded)

No external API keys required. All sources are public domain.
"""

import http.server
import os
import socketserver
import threading
import time
from pathlib import Path
from urllib.parse import urlparse

import requests

PORT = 8082
CACHE_DIR = Path("/home/raider1984/synq-data/pmtiles/tiles")

# Source URLs
TILE_SOURCES = {
    "blue-marble": {
        "url": "https://gibs.earthdata.nasa.gov/wmts/epsg3857/best/BlueMarble_ShadedRelief_Bathymetry/default/GoogleMapsCompatible_Level8/{z}/{x}/{y}.jpeg",
        "ext": "jpg",
        "max_zoom": 8,
    },
    "satellite": {
        "url": "https://gibs.earthdata.nasa.gov/wmts/epsg3857/best/VIIRS_SNPP_CorrectedReflectance_TrueColor/default/GoogleMapsCompatible_Level9/{z}/{x}/{y}.jpeg",
        "ext": "jpg",
        "max_zoom": 9,
    },
    "osm": {
        "url": "https://tile.openstreetmap.org/{z}/{x}/{y}.png",
        "ext": "png",
        "max_zoom": 19,
    },
}

# Ensure cache directories exist
for source in ["blue-marble", "satellite", "osm"]:
    (CACHE_DIR / source).mkdir(parents=True, exist_ok=True)


class TileHandler(http.server.SimpleHTTPRequestHandler):
    def log_message(self, format, *args):
        pass

    def do_GET(self):
        parsed = urlparse(self.path)
        path = parsed.path

        if path.startswith("/maps/blue-marble/"):
            self.serve_tile(path, "blue-marble")
        elif path.startswith("/maps/sentinel2/"):
            # Legacy endpoint -> route to satellite
            self.serve_tile(path, "satellite")
        elif path.startswith("/maps/satellite/"):
            self.serve_tile(path, "satellite")
        elif path.startswith("/maps/openfreemap/"):
            self.serve_tile(path, "osm")
        else:
            self.send_response(404)
            self.end_headers()

    def serve_tile(self, path, source):
        parts = path.strip("/").split("/")
        if len(parts) != 5:
            self.send_response(404)
            self.end_headers()
            return

        _, _, z_str, x_str, y_file = parts
        y = y_file.split(".")[0]
        z = int(z_str)

        config = TILE_SOURCES.get(source)
        if not config:
            self.send_response(404)
            self.end_headers()
            return

        # Clamp to source max zoom
        if z > config["max_zoom"]:
            self.send_response(204)
            self.end_headers()
            return

        ext = config["ext"]
        cache_path = CACHE_DIR / source / z_str / x_str / f"{y}.{ext}"

        if cache_path.exists():
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg" if ext == "jpg" else "image/png")
            self.send_header("Access-Control-Allow-Origin", "*")
            self.send_header("Cache-Control", "public, max-age=86400")
            self.end_headers()
            with open(cache_path, "rb") as f:
                self.wfile.write(f.read())
            return

        url = config["url"].format(z=z_str, x=x_str, y=y)
        try:
            headers = {"User-Agent": "Synq-Intel-Dashboard/1.0"}
            r = requests.get(url, timeout=15, headers=headers)
            if r.status_code == 200:
                cache_path.parent.mkdir(parents=True, exist_ok=True)
                with open(cache_path, "wb") as f:
                    f.write(r.content)

                self.send_response(200)
                self.send_header("Content-Type", "image/jpeg" if ext == "jpg" else "image/png")
                self.send_header("Access-Control-Allow-Origin", "*")
                self.send_header("Cache-Control", "public, max-age=86400")
                self.end_headers()
                self.wfile.write(r.content)
                print(f"[DOWNLOAD] {source} z={z_str} x={x_str} y={y}")
                return
            else:
                print(f"[MISS] {source} z={z_str} x={x_str} y={y} -> HTTP {r.status_code}")
                self.send_response(404)
                self.end_headers()
        except Exception as e:
            print(f"[ERROR] {source} z={z_str} x={x_str} y={y}: {e}")
            self.send_response(500)
            self.end_headers()


def prefetch_source(source, zoom_levels):
    config = TILE_SOURCES.get(source)
    if not config:
        return
    ext = config["ext"]
    for z in zoom_levels:
        if z > config["max_zoom"]:
            continue
        max_tile = 2 ** z
        for x in range(max_tile):
            for y in range(max_tile):
                cache_path = CACHE_DIR / source / str(z) / str(x) / f"{y}.{ext}"
                if cache_path.exists():
                    continue
                url = config["url"].format(z=z, x=x, y=y)
                try:
                    r = requests.get(url, timeout=15)
                    if r.status_code == 200:
                        cache_path.parent.mkdir(parents=True, exist_ok=True)
                        with open(cache_path, "wb") as f:
                            f.write(r.content)
                except requests.RequestException:
                    pass  # best-effort prefetch; the server fetches on-demand anyway
        print(f"[PREFETCH] {source} zoom {z} complete")


if __name__ == "__main__":
    os.system("docker stop synq-pmtiles 2>/dev/null")
    time.sleep(3)

    # Prefetch Blue Marble zoom 0-4 in background
    t1 = threading.Thread(target=prefetch_source, args=("blue-marble", [0, 1, 2, 3, 4]))
    t1.daemon = True
    t1.start()

    class ReuseAddrTCPServer(socketserver.ThreadingTCPServer):
        allow_reuse_address = True

    with ReuseAddrTCPServer(("", PORT), TileHandler) as httpd:
        print(f"Unified tile server running on http://localhost:{PORT}")
        print(f"  Blue Marble: /maps/blue-marble/{{z}}/{{x}}/{{y}}.jpg (z0-8)")
        print(f"  Satellite:   /maps/satellite/{{z}}/{{x}}/{{y}}.jpg (z0-9, VIIRS daily)")
        print(f"  OSM (temp):  /maps/openfreemap/{{z}}/{{x}}/{{y}}.png (z0-19)")
        print(f"Cache: {CACHE_DIR}")
        httpd.serve_forever()
32  synq-mobile/.gitignore  vendored  Normal file
@@ -0,0 +1,32 @@
# Dependencies
node_modules/

# Build outputs
dist/
dist-ssr/

# Tauri
src-tauri/gen/
src-tauri/target/

# Android
android/.gradle/
android/app/build/
android/build/
android/local.properties

# Logs
*.log

# IDE
.idea/
.vscode/
*.swp
*.swo

# OS
.DS_Store
Thumbs.db

# Local config
infrastructure.toml
151  synq-mobile/README.md  Normal file
@@ -0,0 +1,151 @@
# Synq Mobile v0.1

Tauri v2 Android application with multi-profile containerization.

## Overview

Synq Mobile is the mobile counterpart to Synq Desktop. It provides:

- **Four profiles**: Business, Personal, Family, Kids
- **Profile isolation**: Separate encrypted SQLite, network config, and channel deck per profile
- **Channel Deck**: Vertical swipe navigation between full-screen channels
- **Beam AI**: Voice-driven navigation (visual scaffold in Phase 1)
- **Jitsi Video Calls**: Native Android SDK bridge
- **Offline Support**: SQLite cache + action queue with sync-on-reconnect
- **Kids Lock**: PIN-locked sandbox with no AI/cloud exit

## Tech Stack

| Layer | Technology |
|---|---|
| Mobile framework | Tauri v2 (mobile) |
| Rust core | Shared with Synq Desktop workspace crates |
| UI frontend | React + TypeScript + Tailwind CSS |
| State management | Zustand (profile-scoped stores) |
| SQLite | rusqlite with bundled SQLCipher (per-profile encrypted) |
| Video calls | Jitsi Meet Android SDK |
| Build | cargo-ndk + Gradle |

## Project Structure

```
synq-mobile/
├── src-tauri/              # Rust backend
│   ├── src/
│   │   ├── main.rs
│   │   ├── lib.rs
│   │   ├── profiles/       # Profile runtime isolation
│   │   ├── channels/       # Channel registry & deck
│   │   ├── beam/           # Beam AI client
│   │   ├── sync/           # Offline sync engine
│   │   ├── jitsi/          # Video call bridge
│   │   ├── network/        # Profile-specific networking
│   │   └── security/       # Encryption, auth, audit
│   ├── Cargo.toml
│   └── tauri.conf.json
├── src/                    # WebView frontend (React)
│   ├── App.tsx
│   ├── profiles/
│   ├── channels/
│   ├── beam/
│   ├── components/
│   └── hooks/
├── android/                # Android-specific overrides
│   └── app/src/main/
│       ├── AndroidManifest.xml
│       ├── MainActivity.kt
│       └── JitsiViewManager.kt
└── package.json
```

## Prerequisites

- Rust 1.75+
- Node.js 18+
- Android Studio + SDK (for mobile builds)
- `cargo tauri` CLI: `cargo install tauri-cli`
- `cargo-ndk` (for Android NDK builds)

## Local Build

### Desktop (development)

```bash
cd synq-mobile
npm install
npm run tauri dev
```

### Android (requires Android SDK)

```bash
cd synq-mobile
npm install
# Initialize Tauri Android project (one-time)
npm run tauri android init
# Development build
npm run tauri android dev
```

## Infrastructure Config

Create `~/.config/synq/infrastructure.toml` (never committed):

```toml
[clinical_ai_estate]
base_url = "http://your-local-dgx:8000"
auth_type = "Jwt"

[personal_ai_estate]
base_url = "http://your-local-dgx:8001"
auth_type = "ApiKey"

[jitsi_local]
base_url = "https://jitsi.yourdomain.com"
auth_type = "Jwt"

[jitsi_cloud]
base_url = "https://meet.jit.si"
auth_type = "None"
```

## Phase Status

### Phase 1 (Complete)

- [x] Tauri v2 mobile scaffold
- [x] Profile selector with 4 profiles
- [x] Profile runtime initialization (SQLite per-profile)
- [x] Channel deck with vertical swipe
- [x] Beam orb visual
- [x] Offline banner + SQLite cache table
- [x] Kids profile PIN lock
- [x] Jitsi SDK bridge scaffold
- [x] Recording service scaffold

### Phase 2 (Stubbed)

- [ ] Channel marketplace (add/remove/reorder)
- [ ] Voice trigger wiring (hardware button → Rust)
- [ ] Jitsi SDK full integration
- [ ] Meeting recording foreground service
- [ ] Sync daemon with retry
- [ ] TV detection + Leanback layout

### Phase 3 (Future)

- [ ] Full voice command registry
- [ ] Beam panel chat interface
- [ ] Explicit share flow with consent log
- [ ] WireGuard abstraction
- [ ] Jitsi JWT generation
- [ ] Local AI model download/cache

## Security Notes

- PHI never leaves the local estate in the Business profile
- Personal/Family photos are local-first; cloud AI only with explicit share consent
- Kids profile has no AI, no cloud, and no exit without the PIN
- SQLite databases are encrypted with profile-derived keys
- Audit log tracks all cloud AI access
## License

MIT — See workspace LICENSE
48  synq-mobile/android/app/build.gradle.kts  Normal file
@@ -0,0 +1,48 @@
plugins {
    id("com.android.application")
    id("org.jetbrains.kotlin.android")
}

android {
    namespace = "com.synq.mobile"
    compileSdk = 34

    defaultConfig {
        applicationId = "com.synq.mobile"
        minSdk = 28
        targetSdk = 34
        versionCode = 1
        versionName = "0.1.0"
    }

    buildTypes {
        release {
            isMinifyEnabled = false
            proguardFiles(
                getDefaultProguardFile("proguard-android-optimize.txt"),
                "proguard-rules.pro"
            )
        }
    }

    compileOptions {
        sourceCompatibility = JavaVersion.VERSION_17
        targetCompatibility = JavaVersion.VERSION_17
    }

    kotlinOptions {
        jvmTarget = "17"
    }
}

dependencies {
    implementation("androidx.appcompat:appcompat:1.6.1")
    implementation("com.google.android.material:material:1.11.0")
    implementation("androidx.constraintlayout:constraintlayout:2.1.4")

    // Tauri WebView dependency (injected by tauri-android init)
    // implementation("app.tauri:tao:...")

    // TODO(Phase 2): Add Jitsi Meet SDK
    // implementation("org.jitsi.react:jitsi-meet-sdk:+")
}
Some files were not shown because too many files have changed in this diff.