- Added OpenAI-compatible LLM endpoints to the API backend
- Introduced a web frontend with Jinja2 templates and static assets
- Implemented API proxy routes in the web service (a minimal sketch follows this list)
- Added sample db.json data for items, users, orders, reviews, categories, and llm_requests
- Updated ADC and Helm configs for separate AI and standard rate limiting
- Upgraded FastAPI and Uvicorn; added httpx, Jinja2, and python-multipart dependencies
- Added an API configuration modal and client-side JS for the web app
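The proxy routes mentioned above could look roughly like the sketch below: a catch-all FastAPI route in the web service that forwards browser requests to the API backend over httpx. This is only an illustration under stated assumptions; the `/api/{path:path}` pattern, the `proxy` function name, and the set of forwarded HTTP methods are guesses, not the PR's actual implementation.

```python
import os

import httpx
from fastapi import FastAPI, Request, Response

app = FastAPI()

# Base URL of the API backend, read from the environment (see the
# configuration example below); defaults to the local development value.
API_BASE_URL = os.getenv("API_BASE_URL", "http://localhost:8001")


@app.api_route("/api/{path:path}", methods=["GET", "POST", "PUT", "DELETE"])
async def proxy(path: str, request: Request) -> Response:
    """Forward the incoming request to the API backend and relay its reply."""
    async with httpx.AsyncClient(base_url=API_BASE_URL) as client:
        upstream = await client.request(
            request.method,
            f"/{path}",
            params=dict(request.query_params),
            content=await request.body(),
        )
    # Relay status code, body, and content type back to the browser.
    return Response(
        content=upstream.content,
        status_code=upstream.status_code,
        media_type=upstream.headers.get("content-type"),
    )
```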
Example environment configuration (Plaintext, 13 lines, 290 B):
# API Backend Configuration
# Set this to the base URL where the API service is running

# Local development
API_BASE_URL=http://localhost:8001

# Production
# API_BASE_URL=https://commandware.it/api

# Other examples
# API_BASE_URL=http://api:8001
# API_BASE_URL=http://192.168.1.100:8001
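To sanity-check which backend a given API_BASE_URL value points at, a small standalone script like the one below can help. It is only a sketch: the `check_backend` helper is hypothetical, and the `/items` path is an assumption based on the sample db.json collections listed in the summary.

```python
import os

import httpx

# Same environment variable as in the configuration example above.
API_BASE_URL = os.getenv("API_BASE_URL", "http://localhost:8001")


def check_backend() -> None:
    """Fail fast if the configured API backend is unreachable."""
    # The /items path is assumed from the sample db.json data.
    response = httpx.get(f"{API_BASE_URL}/items", timeout=5.0)
    response.raise_for_status()
    print(f"API backend reachable at {API_BASE_URL}")


if __name__ == "__main__":
    check_backend()
```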