codeflash-internal/django
Kevin Turcios 9b3cd48048 Raise LLMOutputUnparseable on empty LLM responses instead of silently returning ""
When Azure OpenAI or Anthropic returns null/empty content (content
filter, truncation, transient failure), call_openai/call_anthropic now
raise LLMOutputUnparseable instead of returning an empty string that
silently flows through the pipeline and produces a 422 "Could not
generate any optimizations." All optimizer callers catch
LLMOutputUnparseable to preserve cost tracking while returning None.
2026-04-21 05:59:07 -05:00
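The pattern the commit describes can be sketched as follows. This is a minimal, self-contained illustration, not the repository's actual code: the real call_openai/call_anthropic wrappers talk to the provider SDKs, and the cost-tracking and optimizer plumbing are only hinted at in comments.

```python
class LLMOutputUnparseable(Exception):
    """Raised when the LLM returns null/empty content
    (content filter, truncation, or a transient failure)."""


def call_openai(response_content: "str | None") -> str:
    # Instead of silently returning "", fail loudly so callers can react.
    if response_content is None or not response_content.strip():
        raise LLMOutputUnparseable("LLM returned empty content")
    return response_content


def optimize(response_content: "str | None") -> "str | None":
    # Optimizer callers catch LLMOutputUnparseable: by this point cost
    # tracking (not shown) has already recorded the request, and the
    # optimizer simply yields no candidate instead of an empty string.
    try:
        return call_openai(response_content)
    except LLMOutputUnparseable:
        return None
```

Raising at the wrapper boundary means the empty string can never leak into later pipeline stages, while the catch-and-return-None in each caller keeps the per-request accounting intact.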
aiservice Raise LLMOutputUnparseable on empty LLM responses instead of silently returning "" 2026-04-21 05:59:07 -05:00
.dockerignore local setup (#1898) 2025-11-17 12:35:09 -08:00