codeflash-internal/django
mohammed ahmed 4c4b497d2a Fix: Handle LLM client close() errors gracefully
**Issue**: The `/ai/optimization_review` endpoint was returning 500 errors
when trying to close LLM clients during event loop changes.

**Root Cause**: In `aiservice/llm.py` lines 96-99, the `close()` calls on the
OpenAI and Anthropic clients were not wrapped in exception handlers. When the
httpx transport was already closed or in a bad state (for example, after the
event loop was closed or the connection had already been torn down), the
resulting exception would propagate and fail the entire request with a 500 error.

**Fix**: Wrapped both `openai_client.close()` and `anthropic_client.close()`
in try-except blocks that catch and log exceptions at DEBUG level. This
prevents transport errors from crashing requests while still attempting to
clean up resources properly.
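The change can be sketched as follows. This is a minimal illustration of the pattern described above, not the actual code from `aiservice/llm.py`; the helper name `close_llm_clients` and its signature are assumptions:

```python
import asyncio
import logging

logger = logging.getLogger("aiservice.llm")

async def close_llm_clients(openai_client, anthropic_client):
    # Close each client independently: a failure closing one client
    # must not skip cleanup of the other or bubble up to the request.
    for name, client in (("openai", openai_client), ("anthropic", anthropic_client)):
        if client is None:
            continue
        try:
            await client.close()
        except Exception as exc:
            # The httpx transport may already be closed (e.g. during event
            # loop teardown); log at DEBUG instead of letting a 500 escape.
            logger.debug("Ignoring error while closing %s client: %s", name, exc)
```

Catching broad `Exception` is deliberate here: any error during best-effort cleanup is less harmful than failing the user-facing request.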

**Impact**: Fixes 500 errors on `/ai/optimization_review` and other endpoints
that use the LLM client when event loops change or clients are in bad states.

**Testing**: Added `test_llm_client_close.py` with 2 test cases that verify:
1. Transport errors during close() are handled gracefully
2. Event loop closed errors are handled gracefully
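The two cases could be exercised with stub clients along these lines (function and class names here are illustrative; they are not the actual contents of `test_llm_client_close.py`):

```python
import asyncio
import logging

logger = logging.getLogger(__name__)

async def _close_quietly(client):
    # Minimal stand-in for the patched cleanup path (name assumed).
    try:
        await client.close()
    except Exception as exc:
        logger.debug("Ignoring close() failure: %s", exc)

class _BrokenTransportClient:
    async def close(self):
        raise RuntimeError("Transport is already closed")

class _ClosedLoopClient:
    async def close(self):
        raise RuntimeError("Event loop is closed")

def test_transport_error_handled_gracefully():
    # Must complete without raising despite the broken transport.
    asyncio.run(_close_quietly(_BrokenTransportClient()))

def test_event_loop_closed_handled_gracefully():
    asyncio.run(_close_quietly(_ClosedLoopClient()))
```

Stubbing `close()` keeps the tests fast and deterministic: no real httpx transport or event-loop churn is needed to verify the error-swallowing behavior.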

**Traces**: 312d7392, 5bbdf214, a1325051
2026-04-03 19:19:19 +00:00