feat: enhance chat service with documentation indexing and improved Docker configuration
Some checks failed
CI/CD Pipeline / Generate Documentation (push) Failing after 7m41s
CI/CD Pipeline / Lint Code (push) Failing after 7m44s
CI/CD Pipeline / Run Tests (push) Has been skipped
CI/CD Pipeline / Security Scanning (push) Has been skipped
CI/CD Pipeline / Build and Push Docker Images (api) (push) Has been skipped
CI/CD Pipeline / Build and Push Docker Images (chat) (push) Has been skipped
CI/CD Pipeline / Build and Push Docker Images (frontend) (push) Has been skipped
CI/CD Pipeline / Build and Push Docker Images (worker) (push) Has been skipped
CI/CD Pipeline / Deploy to Staging (push) Has been skipped
CI/CD Pipeline / Deploy to Production (push) Has been skipped
@@ -14,6 +14,7 @@ This client works with:
 import logging
 from typing import Any, AsyncIterator, Dict, List, Optional, Union, cast
 
+import httpx
 from openai import AsyncOpenAI
 from openai.types.chat import ChatCompletion, ChatCompletionChunk
 
@@ -79,8 +80,13 @@ class LLMClient:
         self.temperature = temperature if temperature is not None else settings.LLM_TEMPERATURE
         self.max_tokens = max_tokens or settings.LLM_MAX_TOKENS
 
-        # Initialize AsyncOpenAI client
-        self.client = AsyncOpenAI(base_url=self.base_url, api_key=self.api_key)
+        # Initialize AsyncOpenAI client with custom HTTP client (disable SSL verification for self-signed certs)
+        http_client = httpx.AsyncClient(verify=False, timeout=30.0)
+        self.client = AsyncOpenAI(
+            base_url=self.base_url,
+            api_key=self.api_key,
+            http_client=http_client
+        )
 
         logger.info(f"Initialized LLM client: base_url={self.base_url}, model={self.model}")
 