
[FEAT]: Groq LLama3.1 support #1953

@jasonmhead

Description

What would you like to see?

When will the Llama 3.1 Groq models be supported?

Is that something that is in the development pipeline?

Is there a way to pick up whatever models a provider currently offers without explicitly listing them in the code/settings file? A rough sketch of the idea is below.
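
As a minimal sketch of what I mean, assuming Groq's OpenAI-compatible `/models` endpoint: the app could query the provider's model list at runtime and populate the model picker from the response, instead of shipping a hard-coded list. Endpoint path, response shape, and the `listGroqModels` helper name are my assumptions for illustration, not anything already in this project.

```ts
// Sketch: fetch the available model IDs from Groq's OpenAI-compatible API
// (assumed endpoint: https://api.groq.com/openai/v1/models) instead of
// hard-coding model names in a settings file. Uses Node 18+ global fetch.

interface ModelEntry {
  id: string; // e.g. a Llama 3.1 model ID reported by the provider
  owned_by?: string;
}

interface ModelListResponse {
  data: ModelEntry[];
}

async function listGroqModels(apiKey: string): Promise<string[]> {
  const res = await fetch("https://api.groq.com/openai/v1/models", {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) {
    throw new Error(`Groq /models request failed: ${res.status} ${res.statusText}`);
  }
  const body = (await res.json()) as ModelListResponse;
  return body.data.map((m) => m.id);
}

// Usage: fill the model dropdown dynamically rather than from a static list.
listGroqModels(process.env.GROQ_API_KEY ?? "")
  .then((ids) => console.log("Available Groq models:", ids))
  .catch((err) => console.error(err));
```

With something like this, newly released models (such as Llama 3.1) would show up as soon as the provider lists them, with no code change needed.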
