
Improve error handling #723

Merged — 14 commits merged into HKUDS:main on Feb 6, 2025

Conversation

danielaskdd
Contributor

@danielaskdd danielaskdd commented Feb 6, 2025

Improve error handling, especially when the LLM service is unstable:

  • Check whether the LLM response is None, and retry if necessary.
  • Log an error for each retry.
  • Send a 'done' status to the client when streaming is canceled before finishing in the Ollama API.
  • Improve timing accuracy for the Ollama API.
  • Increase the timeout value for unit tests.
  • Refactor the concurrent unit tests.
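The retry-on-None behavior described above can be sketched roughly as follows. This is a minimal illustration, not the PR's actual code: the function names (`call_with_retry`, `flaky`) are hypothetical, and only `InvalidResponseError` is named in the commits below.

```python
import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)


class InvalidResponseError(Exception):
    """Raised when the LLM returns a None or empty response."""


def call_with_retry(llm_func, prompt, max_retries=3):
    """Call llm_func and retry when the response is None or empty.

    An error is logged for each failed attempt, as the PR describes.
    """
    last_error = None
    for attempt in range(1, max_retries + 1):
        try:
            response = llm_func(prompt)
            if not response:  # None or empty content from an unstable LLM service
                raise InvalidResponseError("LLM returned an empty response")
            return response
        except InvalidResponseError as exc:
            last_error = exc
            logger.error("LLM call attempt %d/%d failed: %s", attempt, max_retries, exc)
    raise last_error
```

With a flaky backend that returns None twice before succeeding, the third attempt returns the valid response and the two failures are logged.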

- Initialize timestamps at start to avoid null checks
- Add detailed error handling for streaming response
- Handle CancelledError and other exceptions separately
- Unify exception handling with trace_exception
- Clean up redundant code and simplify logic
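The streaming changes above can be sketched as an async generator, assuming an Ollama-style JSON-lines response. `stream_response` and its payload fields are illustrative, not the PR's exact code: timestamps are initialized at the start (no null checks later), `first_chunk_time` is set only when the first chunk actually arrives, and a final 'done' message is still sent if the stream is cancelled mid-flight.

```python
import asyncio
import json
import time


async def stream_response(chunks):
    """Yield JSON-lines chunks with timing info, Ollama-style (sketch)."""
    start_time = time.time_ns()
    first_chunk_time = None        # set only when the first chunk arrives
    last_chunk_time = start_time   # initialized up front to avoid null checks
    try:
        async for chunk in chunks:
            if first_chunk_time is None:
                first_chunk_time = time.time_ns()
            last_chunk_time = time.time_ns()
            yield json.dumps({"response": chunk, "done": False})
    except asyncio.CancelledError:
        # Client disconnected before the stream finished: still emit a
        # final 'done' message so the client can close cleanly.
        # (Production code would usually re-raise after notifying.)
        yield json.dumps({"done": True})
    else:
        yield json.dumps({
            "done": True,
            "total_duration": last_chunk_time - start_time,
            "prompt_eval_duration": (first_chunk_time or start_time) - start_time,
        })
```

Catching `CancelledError` separately from other exceptions keeps client-side disconnects (which only need a final 'done') distinct from genuine server-side failures.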
• Added McpError and ErrorCode classes
• Added detailed error collection logic
• Improved error reporting & formatting
• Added request ID tracking
• Enhanced test results visibility
• Initialize first_chunk_time as None
• Set timing only when first chunk arrives
• Add InvalidResponseError custom exception
• Improve error logging for API failures
• Add empty response content validation
• Add more detailed debug logging info
• Add retry for invalid response cases
• Add User-Agent header with version info
• Update header creation in Ollama client
• Update header creation in OpenAI client
• Ensure consistent header format
• Include Mozilla UA string for OpenAI
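The unified header creation described in the last group of commits might look roughly like this. The exact UA string, function name, and parameters are assumptions; the commits only say that a Mozilla-style User-Agent with version info is sent consistently from the Ollama and OpenAI clients.

```python
def build_headers(api_key=None, user_agent_version="1.0.0"):
    """Build a consistent header dict for LLM API clients (sketch).

    Includes a Mozilla-style User-Agent carrying the client version,
    and a Bearer token only when an API key is provided.
    """
    headers = {
        "Content-Type": "application/json",
        "User-Agent": (
            f"Mozilla/5.0 (compatible; LightRAG/{user_agent_version}; "
            "+https://github.com/HKUDS/LightRAG)"
        ),
    }
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers
```

Centralizing header construction in one helper keeps the Ollama and OpenAI clients from drifting apart in format, which is the consistency goal the commits mention.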
@danielaskdd danielaskdd marked this pull request as ready for review February 6, 2025 17:03
@LarFii LarFii merged commit 0005462 into HKUDS:main Feb 6, 2025
1 check passed
@danielaskdd danielaskdd deleted the improve-ollama-api-streaming branch February 10, 2025 10:09