Mirror of https://github.com/codeflash-ai/codeflash-internal.git (synced 2026-05-04 18:25:18 +00:00)
When Azure OpenAI or Anthropic returns null or empty content (content filter, truncation, or transient failure), `call_openai`/`call_anthropic` now raise `LLMOutputUnparseable` instead of returning an empty string that silently flows through the pipeline and produces a 422 "Could not generate any optimizations." All optimizer callers catch `LLMOutputUnparseable` and return None, preserving cost tracking.
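The error-propagation pattern described above can be sketched as follows. The exception name `LLMOutputUnparseable` and the function name `call_openai` come from the description; the client interface, function signatures, and bodies here are hypothetical illustrations, not the actual codebase.

```python
class LLMOutputUnparseable(Exception):
    """Raised when the LLM returns null or empty content."""


def call_openai(prompt, client):
    # `client` is a hypothetical wrapper whose .complete() returns an
    # object with a .content attribute (stand-in for the real SDK call).
    response = client.complete(prompt)
    content = getattr(response, "content", None)
    if not content:
        # Content filter, truncation, or transient failure: raise
        # instead of returning "" and letting it flow downstream.
        raise LLMOutputUnparseable("LLM returned null/empty content")
    return content


def generate_optimization(prompt, client):
    # Hypothetical optimizer caller: by the time the call raises, usage
    # has already been recorded, so cost tracking is preserved; the
    # caller converts the failure into None rather than an empty string.
    try:
        return call_openai(prompt, client)
    except LLMOutputUnparseable:
        return None
```

Raising at the point of the empty response, rather than returning `""`, moves the failure to where it can be logged and handled explicitly instead of surfacing later as an opaque 422.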