Update scheme.md
Automated system for generating and maintaining technical documentation.

[MIT License](https://opensource.org/licenses/MIT)
[Python](https://www.python.org/downloads/)
[Redis](https://redis.io/)

## 📋 Contents
- ✅ **Asynchronous data collection** from multiple infrastructure systems
- ✅ **Security isolation**: the LLM never accesses live systems
- ✅ **Change detection**: documentation is generated only when changes are detected
- ✅ **Redis cache** for data storage and performance
- ✅ **Local on-premise LLM** (Qwen) via MCP Server
- ✅ **Human-in-the-loop validation** with a GitOps workflow
- ✅ **Automated CI/CD** for publishing
The system is organized into **3 main flows**:

1. **Data Collection (Background)**: connectors periodically query the infrastructure systems via their APIs and update Redis
2. **Change Detection**: a change-detection service triggers documentation generation only when it is actually needed
3. **Generation and Publishing (Triggered)**: the local LLM (Qwen) generates markdown by reading from Redis, followed by human review and automated deployment

> **Security Principle**: the LLM never has direct access to the infrastructure systems. All data is read from Redis.

> **Efficiency Principle**: documentation is generated only when the system detects changes in the infrastructure configuration.

---
```mermaid
graph TB
%% Styling
classDef infrastructure fill:#e1f5ff,stroke:#01579b,stroke-width:3px,color:#333
classDef cache fill:#f3e5f5,stroke:#4a148c,stroke-width:3px,color:#333
classDef change fill:#fff3e0,stroke:#e65100,stroke-width:3px,color:#333
classDef llm fill:#e8f5e9,stroke:#1b5e20,stroke-width:3px,color:#333
classDef git fill:#fce4ec,stroke:#880e4f,stroke-width:3px,color:#333
classDef human fill:#fff9c4,stroke:#f57f17,stroke-width:3px,color:#333
CONN["🔌 CONNECTORS<br/>Automatic Polling"]:::infrastructure

REDIS[("💾 REDIS CACHE<br/>Infrastructure<br/>Configuration")]:::cache

INFRA -->|"Continuous<br/>API Polling"| CONN
CONN -->|"Configuration<br/>Update"| REDIS

%% ========================================
%% CHANGE DETECTION
%% ========================================

CHANGE["🔍 CHANGE DETECTOR<br/>Detects Configuration<br/>Changes"]:::change

REDIS -->|"Monitor<br/>Changes"| CHANGE

%% ========================================
%% FLOW 2: DOCUMENTATION GENERATION (Triggered)
%% ========================================

TRIGGER["⚡ TRIGGER<br/>Only on Changes"]:::change

USER["👤 USER<br/>Manual Request"]:::human

LLM["🤖 LLM ENGINE<br/>Qwen (Local)"]:::llm

MCP["🔧 MCP SERVER<br/>API Control Platform"]:::llm

DOC["📄 DOCUMENT<br/>Generated Markdown"]:::llm

CHANGE -->|"Changes<br/>Detected"| TRIGGER
USER -.->|"Optional"| TRIGGER
TRIGGER -->|"Start<br/>Generation"| LLM
LLM -->|"Tool Call"| MCP
MCP -->|"Query"| REDIS
REDIS -->|"Config Data"| MCP
MCP -->|"Context"| LLM
LLM -->|"Generate"| DOC

%% ========================================
%% FLOW 3: VALIDATION AND PUBLISHING
MKDOCS -->|"Deploy"| WEB

%% ========================================
%% ANNOTATIONS
%% ========================================

SECURITY["🔒 SECURITY<br/>LLM isolated from live systems"]:::human
EFFICIENCY["⚡ EFFICIENCY<br/>Docs generated only<br/>on changes"]:::change

LLM -.->|"NO<br/>ACCESS"| INFRA

SECURITY -.-> LLM
EFFICIENCY -.-> CHANGE
```

---
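On the read side of the diagram above, the MCP Server answers tool calls from cached data only. A hedged sketch of what a tool handler such as `getVMwareInventory` might do — the key layout and field names are illustrative assumptions, and a plain dict stands in for Redis:

```python
import json

# Stand-in for the Redis cache; in production this would be
# redis.Redis().hgetall("vmware:vc01:vms:hash"). Keys/fields are illustrative.
CACHE = {
    "vmware:vc01:vms:hash": {
        "data": json.dumps({"vm-101": {"cpu": 4, "power": "on"}}),
        "sha256": "0" * 64,  # placeholder content hash
    }
}

def get_vmware_inventory(vcenter: str) -> dict:
    """MCP tool handler: serve the cached inventory, never the live vCenter."""
    entry = CACHE.get(f"vmware:{vcenter}:vms:hash")
    if entry is None:
        return {"error": f"no cached data for vcenter '{vcenter}'"}
    return json.loads(entry["data"])

print(get_vmware_inventory("vc01"))  # {'vm-101': {'cpu': 4, 'power': 'on'}}
```

Because the handler only ever touches the cache, a compromised or misbehaving LLM prompt can at worst read stale data, never reach a live system.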
```mermaid
graph TB
%% Technical styling
classDef infra fill:#e1f5ff,stroke:#01579b,stroke-width:2px,color:#333,font-size:11px
classDef connector fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#333,font-size:11px
classDef cache fill:#f3e5f5,stroke:#4a148c,stroke-width:2px,color:#333,font-size:11px
classDef change fill:#fff3e0,stroke:#e65100,stroke-width:2px,color:#333,font-size:11px
classDef llm fill:#e8f5e9,stroke:#1b5e20,stroke-width:2px,color:#333,font-size:11px
classDef git fill:#fce4ec,stroke:#880e4f,stroke-width:2px,color:#333,font-size:11px
classDef monitor fill:#fff8e1,stroke:#f57f17,stroke-width:2px,color:#333,font-size:11px
%% =====================================

subgraph CONNECTORS["🔌 DATA COLLECTORS (Python/Go)"]
CONN_VM["VMware Collector<br/>Lang: Python 3.11<br/>Lib: pyvmomi<br/>Schedule: */15 * * * *<br/>Output: JSON → Redis"]:::connector

CONN_K8S["K8s Collector<br/>Lang: Python 3.11<br/>Lib: kubernetes-client<br/>Schedule: */5 * * * *<br/>Resources: pods,svc,ing,deploy"]:::connector
CISCO -->|"NETCONF<br/>get-config"| CONN_CSC

%% =====================================
%% LAYER 3: REDIS STORAGE
%% =====================================

subgraph STORAGE["💾 REDIS CLUSTER"]
REDIS_CLUSTER["Redis Cluster<br/>Mode: Cluster (6 nodes)<br/>Port: 6379<br/>Persistence: RDB + AOF<br/>Memory: 64GB<br/>Eviction: allkeys-lru"]:::cache

REDIS_KEYS["Key Structure:<br/>• vmware:vcenter-id:vms:hash<br/>• k8s:cluster:namespace:resource:hash<br/>• linux:hostname:info:hash<br/>• cisco:device-id:config:hash<br/>• changelog:timestamp:diff<br/>TTL: 30d for data, 90d for changelog"]:::cache
end

CONN_VM -->|"HSET/HMSET<br/>+ Hash Storage"| REDIS_CLUSTER
CONN_K8S -->|"HSET/HMSET<br/>+ Hash Storage"| REDIS_CLUSTER
CONN_LNX -->|"HSET/HMSET<br/>+ Hash Storage"| REDIS_CLUSTER
CONN_CSC -->|"HSET/HMSET<br/>+ Hash Storage"| REDIS_CLUSTER

REDIS_CLUSTER --> REDIS_KEYS

%% =====================================
%% LAYER 4: CHANGE DETECTION
%% =====================================

subgraph CHANGE_DETECTION["🔍 CHANGE DETECTION SYSTEM"]
DETECTOR["Change Detector Service<br/>Lang: Python 3.11<br/>Lib: redis-py<br/>Algorithm: Hash comparison<br/>Check interval: */5 * * * *"]:::change

DIFF_ENGINE["Diff Engine<br/>• Deep object comparison<br/>• JSON diff generation<br/>• Change classification<br/>• Severity assessment"]:::change

CHANGE_LOG["Change Log Store<br/>Key: changelog:*<br/>Data: diff JSON + metadata<br/>Indexed by: timestamp, resource"]:::change

NOTIFIER["Change Notifier<br/>• Webhook triggers<br/>• Slack notifications<br/>• Event emission<br/>Target: LLM trigger"]:::change
end

REDIS_CLUSTER -->|"Monitor<br/>key changes"| DETECTOR
DETECTOR --> DIFF_ENGINE
DIFF_ENGINE -->|"Store diff"| CHANGE_LOG
CHANGE_LOG --> REDIS_CLUSTER
DIFF_ENGINE -->|"Notify if<br/>significant"| NOTIFIER

%% =====================================
%% LAYER 5: LLM TRIGGER & GENERATION
%% =====================================

subgraph TRIGGER_SYSTEM["⚡ TRIGGER SYSTEM"]
TRIGGER_SVC["Trigger Service<br/>Lang: Python 3.11<br/>Listen: Webhook + Redis Pub/Sub<br/>Debounce: 5 min<br/>Batch: multiple changes"]:::change

QUEUE["Generation Queue<br/>Type: Redis List<br/>Priority: High/Medium/Low<br/>Processing: FIFO"]:::change
end

NOTIFIER -->|"Trigger event"| TRIGGER_SVC
TRIGGER_SVC -->|"Enqueue<br/>generation task"| QUEUE

subgraph LLM_LAYER["🤖 AI GENERATION LAYER"]
LLM_ENGINE["LLM Engine<br/>Model: Qwen (Local)<br/>API: Ollama/vLLM/LM Studio<br/>Port: 11434<br/>Temp: 0.3<br/>Max Tokens: 4096<br/>Timeout: 120s"]:::llm

MCP_SERVER["MCP Server<br/>Lang: TypeScript/Node.js<br/>Port: 3000<br/>Protocol: JSON-RPC 2.0<br/>Auth: JWT tokens"]:::llm

MCP_TOOLS["MCP Tools:<br/>• getVMwareInventory(vcenter)<br/>• getK8sResources(cluster,ns,type)<br/>• getLinuxSystemInfo(hostname)<br/>• getCiscoConfig(device,section)<br/>• getChangelog(start,end,resource)<br/>Return: JSON + Metadata"]:::llm
end

QUEUE -->|"Dequeue<br/>task"| LLM_ENGINE

LLM_ENGINE <-->|"Tool calls<br/>JSON-RPC"| MCP_SERVER
MCP_SERVER --> MCP_TOOLS

MCP_TOOLS -->|"HGETALL/MGET<br/>Read data"| REDIS_CLUSTER
REDIS_CLUSTER -->|"Config data<br/>+ Changelog"| MCP_TOOLS

MCP_TOOLS -->|"Structured Data<br/>+ Context"| LLM_ENGINE

subgraph OUTPUT["📝 DOCUMENT GENERATION"]
TEMPLATE["Template Engine<br/>Format: Jinja2<br/>Templates: markdown/*.j2<br/>Variables: from LLM"]:::llm

MARKDOWN["Markdown Output<br/>Format: CommonMark<br/>Metadata: YAML frontmatter<br/>Change summary included<br/>Assets: diagrams in mermaid"]:::llm

VALIDATOR["Doc Validator<br/>• Markdown linting<br/>• Link checking<br/>• Schema validation<br/>• Change verification"]:::llm
end

LLM_ENGINE --> TEMPLATE
GIT_API["GitLab API<br/>API: v4<br/>Auth: Project Access Token<br/>Permissions: api, write_repo"]:::git

PR_AUTO["Automated PR Creator<br/>Lang: Python 3.11<br/>Lib: python-gitlab<br/>Template: .gitlab/merge_request.md<br/>Include: change summary"]:::git
end

VALIDATOR -->|"git add/commit/push"| GIT_REPO
GIT_REPO <--> GIT_API
GIT_API --> PR_AUTO

REVIEWER["👨💼 Technical Reviewer<br/>Role: Maintainer/Owner<br/>Review: diff + validation<br/>Check: change correlation<br/>Approve: required (min 1)"]:::monitor

PR_AUTO -->|"Notification<br/>Email + Slack"| REVIEWER
REVIEWER -->|"Merge to main"| GIT_REPO
%% =====================================

subgraph OBSERVABILITY["📊 MONITORING & LOGGING"]
PROMETHEUS["Prometheus<br/>Metrics: collector updates, changes detected<br/>Scrape: 30s<br/>Retention: 15d"]:::monitor

GRAFANA["Grafana Dashboards<br/>• Collector status<br/>• Redis performance<br/>• Change detection rate<br/>• LLM response times<br/>• Pipeline success rate"]:::monitor

ELK["ELK Stack<br/>Logs: all components<br/>Index: daily rotation<br/>Retention: 30d"]:::monitor

ALERTS["Alerting<br/>• Collector failures<br/>• Redis issues<br/>• Change detection errors<br/>• Pipeline failures<br/>Channel: Slack + PagerDuty"]:::monitor
end

CONN_VM -.->|"metrics"| PROMETHEUS
CONN_K8S -.->|"metrics"| PROMETHEUS
REDIS_CLUSTER -.->|"metrics"| PROMETHEUS
DETECTOR -.->|"metrics"| PROMETHEUS
MCP_SERVER -.->|"metrics"| PROMETHEUS
GITLAB_CI -.->|"metrics"| PROMETHEUS

PROMETHEUS --> GRAFANA

CONN_VM -.->|"logs"| ELK
DETECTOR -.->|"logs"| ELK
MCP_SERVER -.->|"logs"| ELK
GITLAB_CI -.->|"logs"| ELK

GRAFANA --> ALERTS

%% =====================================
%% SECURITY & EFFICIENCY ANNOTATIONS
%% =====================================

SEC1["🔒 SECURITY:<br/>• All APIs use TLS 1.3<br/>• Secrets in Vault/K8s Secrets<br/>• Network: private VPC<br/>• LLM has NO direct access"]:::monitor

SEC2["🔐 AUTHENTICATION:<br/>• API Tokens rotated 90d<br/>• RBAC enforced<br/>• Audit logs enabled<br/>• MFA required for Git"]:::monitor

EFF1["⚡ EFFICIENCY:<br/>• Doc generation only on changes<br/>• Debounce prevents spam<br/>• Hash-based change detection<br/>• Batch processing"]:::change

SEC1 -.-> MCP_SERVER
SEC2 -.-> GIT_REPO
EFF1 -.-> DETECTOR
```

---
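The trigger path in the detailed diagram (debounce, batching of multiple changes, FIFO queue) can be sketched as follows. This is a hedged illustration only: an in-memory deque stands in for the Redis list, and the window length, task shape, and method names are assumptions:

```python
from collections import deque

class TriggerService:
    """Debounce bursts of change events, then enqueue one batched task."""

    def __init__(self, debounce_s: float = 300.0):  # documented debounce: 5 min
        self.debounce_s = debounce_s
        self.pending: set[str] = set()   # resources changed since last flush
        self.last_event = 0.0
        self.queue: deque = deque()      # stand-in for the Redis list (LPUSH/BRPOP)

    def on_change(self, resource: str, now: float) -> None:
        self.pending.add(resource)       # batch multiple changes together
        self.last_event = now

    def maybe_flush(self, now: float) -> None:
        """After a quiet period, emit a single generation task for the batch."""
        if self.pending and now - self.last_event >= self.debounce_s:
            self.queue.append({"resources": sorted(self.pending)})  # FIFO enqueue
            self.pending.clear()

svc = TriggerService(debounce_s=300)
svc.on_change("vmware:vc01:vms", now=0)
svc.on_change("k8s:prod:default:deploy", now=60)
svc.maybe_flush(now=120)   # still inside the debounce window -> nothing queued
svc.maybe_flush(now=400)   # quiet for 340 s -> one batched task
print(len(svc.queue))  # 1
```

Debouncing like this is what keeps a flurry of near-simultaneous configuration changes from spawning one generation run per change.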