dnviti 6f5deb0879
feat: add multilingual chat support with markdown rendering
- Fix Socket.IO proxy configuration in nginx for chat connectivity
- Add Socket.IO path routing (/socket.io/) with WebSocket upgrade support
- Fix frontend healthcheck to use curl instead of wget
- Add react-markdown and remark-gfm for proper markdown rendering
- Implement language selector in chat interface (8 languages supported)
- Add language parameter to chat agent and LLM prompts
- Support English, Italian, Spanish, French, German, Portuguese, Chinese, Japanese
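
A minimal sketch of the nginx change described above, assuming the chat service is reachable at an upstream named `chat` on port 8000 (host and port are illustrative, not taken from the repo):

```nginx
# Route Socket.IO traffic to the chat backend with WebSocket upgrade
# headers so long-lived connections survive the proxy.
location /socket.io/ {
    proxy_pass http://chat:8000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
    proxy_read_timeout 86400s;
}
```

Without the `Upgrade`/`Connection` headers, nginx speaks plain HTTP/1.0 to the backend and the Socket.IO handshake falls back to long-polling or fails outright.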

This resolves the chat connection issues and enables users to receive
AI responses in their preferred language with properly formatted markdown.
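
One way the language parameter can reach the LLM is as an instruction appended to the system prompt. A hedged sketch in TypeScript — the map, function name, and prompt wording are illustrative, not the repo's actual code:

```typescript
// Hypothetical mapping from the chat UI's language codes to the
// instruction injected into the LLM system prompt.
const SUPPORTED_LANGUAGES: Record<string, string> = {
  en: "English",
  it: "Italian",
  es: "Spanish",
  fr: "French",
  de: "German",
  pt: "Portuguese",
  zh: "Chinese",
  ja: "Japanese",
};

// Build a system prompt that asks the model to answer in the selected
// language, falling back to English for unknown codes.
function buildSystemPrompt(basePrompt: string, languageCode: string): string {
  const language = SUPPORTED_LANGUAGES[languageCode] ?? "English";
  return `${basePrompt}\nRespond in ${language}, using Markdown formatting.`;
}

console.log(buildSystemPrompt("You are a helpful assistant.", "it"));
```

Keeping the instruction in the prompt (rather than per-message) means the frontend only needs to send the two-letter code once per session.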

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-20 19:14:38 +02:00