Tags: teilomillet/gollm
Update Go module dependencies and enhance function calling tests

This commit updates the Go module dependencies to their latest versions, moving the module to Go 1.23.0 and bumping libraries such as `github.com/caarlos0/env`, `github.com/go-playground/validator`, and `github.com/invopop/jsonschema` to stay compatible with their latest features and fixes. The function calling tests have also been refactored for structure and maintainability: the OpenAI and Anthropic test cases are consolidated so shared logic lives in one place while provider-specific configurations stay separate. New test functions, including `get_weather_with_optional` and `record_summary`, exercise optional parameters and JSON schema outputs, and the expanded suite covers additional scenarios to validate function calls and their parameters.
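As a rough illustration of the optional-parameter case those tests target, here is a minimal sketch of a JSON-schema-style function definition; the `ToolDef` type and the exact field layout are assumptions for this example, not gollm's API:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// ToolDef is an illustrative shape for a JSON-schema-style function
// definition; it is not gollm's exported API.
type ToolDef struct {
	Name        string         `json:"name"`
	Description string         `json:"description"`
	Parameters  map[string]any `json:"parameters"`
}

func main() {
	getWeather := ToolDef{
		Name:        "get_weather_with_optional",
		Description: "Get the current weather for a location",
		Parameters: map[string]any{
			"type": "object",
			"properties": map[string]any{
				"location": map[string]any{"type": "string"},
				"unit": map[string]any{
					"type": "string",
					"enum": []string{"celsius", "fahrenheit"},
				},
			},
			// Only "location" is required; "unit" stays optional, which is
			// the distinction the optional-parameter tests exercise.
			"required": []string{"location"},
		},
	}

	b, _ := json.MarshalIndent(getWeather, "", "  ")
	fmt.Println(string(b))
}
```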
Enhance OpenAIProvider to handle max_tokens and max_completion_tokens more robustly

This commit improves the handling of token parameters in the OpenAIProvider by ensuring that the correct token limit is used for the model in play. The check for models that require `max_completion_tokens` is now more comprehensive, allowing seamless conversion between `max_tokens` and `max_completion_tokens`. The merging of options from provider and function parameters is also refined to prevent conflicts, so that only the appropriate token limit is set in the request. This improves the reliability and correctness of token management across different OpenAI models.
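A hedged sketch of that conversion logic, assuming (purely for illustration) that models whose names start with "o1" or "o3" take `max_completion_tokens`; the `normalizeTokenLimit` helper is hypothetical, not the provider's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeTokenLimit is a hypothetical sketch of the described behaviour.
// It assumes models prefixed "o1"/"o3" require max_completion_tokens while
// other models take max_tokens, and it guarantees that only one of the two
// keys survives into the final request options.
func normalizeTokenLimit(model string, opts map[string]any) {
	limit, ok := opts["max_tokens"]
	if !ok {
		limit, ok = opts["max_completion_tokens"]
	}
	if !ok {
		return // no limit set by either source
	}

	// Drop both keys first so merged provider-level and call-level options
	// cannot leave conflicting limits in the request.
	delete(opts, "max_tokens")
	delete(opts, "max_completion_tokens")

	if strings.HasPrefix(model, "o1") || strings.HasPrefix(model, "o3") {
		opts["max_completion_tokens"] = limit
	} else {
		opts["max_tokens"] = limit
	}
}

func main() {
	opts := map[string]any{"max_tokens": 1024}
	normalizeTokenLimit("o1-mini", opts)
	fmt.Println(opts) // map[max_completion_tokens:1024]
}
```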
fix: Improve TestChainedPrompts with detailed debugging output

Enhance test reliability and diagnostics by adding comprehensive logging and error checking in the TestChainedPrompts function. The changes include:

- Add fmt.Printf statements to print raw and cleaned JSON responses
- Print the unmarshalled result with indentation for better readability
- Improve assertion messages to provide more context on expected properties
- Ensure thorough validation of the JSON schema response structure
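A minimal sketch of that debugging flow, assuming a hypothetical `cleanJSON` helper and an expected top-level property name; neither is taken from the actual test:

```go
package example

import (
	"encoding/json"
	"fmt"
	"strings"
	"testing"
)

// cleanJSON is a hypothetical helper: it trims anything outside the
// outermost JSON object, e.g. markdown fences some models wrap around
// their output.
func cleanJSON(raw string) string {
	start := strings.Index(raw, "{")
	end := strings.LastIndex(raw, "}")
	if start == -1 || end == -1 || end < start {
		return strings.TrimSpace(raw)
	}
	return raw[start : end+1]
}

// checkChainedResponse mirrors the described flow: print the raw and cleaned
// responses, print the unmarshalled result with indentation, and fail with a
// message that names the missing property.
func checkChainedResponse(t *testing.T, raw, wantKey string) {
	fmt.Printf("raw response:\n%s\n", raw)

	cleaned := cleanJSON(raw)
	fmt.Printf("cleaned response:\n%s\n", cleaned)

	var result map[string]any
	if err := json.Unmarshal([]byte(cleaned), &result); err != nil {
		t.Fatalf("failed to unmarshal cleaned response: %v", err)
	}

	pretty, _ := json.MarshalIndent(result, "", "  ")
	fmt.Printf("unmarshalled result:\n%s\n", pretty)

	if _, ok := result[wantKey]; !ok {
		t.Errorf("expected property %q in response, got keys: %v", wantKey, result)
	}
}
```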
feat: Add thread-safe option management with mutex protection

Enhance concurrency safety in LLM option handling by introducing a read-write mutex to protect concurrent access to the Options map. This change prevents potential race conditions when multiple goroutines interact with provider-specific options, ensuring safe and predictable behavior across different LLM operations like generation, streaming, and option setting.
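The pattern, in a self-contained sketch (the `optionStore` type below is illustrative, not the library's actual struct):

```go
package example

import "sync"

// optionStore illustrates the described pattern: a read-write mutex guards
// the options map so generation, streaming, and option-setting can run
// concurrently without racing.
type optionStore struct {
	mu      sync.RWMutex
	options map[string]any
}

// SetOption takes the write lock for mutations.
func (s *optionStore) SetOption(key string, value any) {
	s.mu.Lock()
	defer s.mu.Unlock()
	if s.options == nil {
		s.options = make(map[string]any)
	}
	s.options[key] = value
}

// snapshot copies the map under a read lock so a request can be built
// without holding the lock while the provider call is in flight.
func (s *optionStore) snapshot() map[string]any {
	s.mu.RLock()
	defer s.mu.RUnlock()
	out := make(map[string]any, len(s.options))
	for k, v := range s.options {
		out[k] = v
	}
	return out
}
```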
feat: Enhance memory management and structured messaging support across providers

This commit introduces significant improvements to memory management and message handling:

- Added support for structured messages with cache control
- Implemented `PrepareRequestWithMessages` method for multiple providers
- Enhanced memory management with more flexible message addition
- Introduced new memory message types with metadata support
- Improved conversation context preservation across different LLM providers
- Added option to use structured or flattened message formats

The changes enable more granular control over conversation history, caching, and message metadata while maintaining backwards compatibility.
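A rough sketch of what a structured memory message with metadata and a flattened fallback could look like; `MemoryMessage`, its fields, and `Flatten` are hypothetical names used only for illustration, not gollm's exported types:

```go
package example

import "strings"

// MemoryMessage sketches the structured-message shape described above:
// a role, content, an optional cache-control hint, and free-form metadata.
type MemoryMessage struct {
	Role         string            `json:"role"`
	Content      string            `json:"content"`
	CacheControl string            `json:"cache_control,omitempty"`
	Metadata     map[string]string `json:"metadata,omitempty"`
}

// Flatten shows the backwards-compatible path: collapsing structured
// messages into a single prompt string for callers or providers that do
// not work with a message list.
func Flatten(msgs []MemoryMessage) string {
	var b strings.Builder
	for _, m := range msgs {
		b.WriteString(m.Role)
		b.WriteString(": ")
		b.WriteString(m.Content)
		b.WriteString("\n")
	}
	return b.String()
}
```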