
[Feature Request] Request for training configuration for RWKV7 and Mamba2 #22

@Luther-Sparks

Description


Feature Request

I would like to request the addition of training configuration files (e.g., model architecture, optimizer settings, scheduler, etc.) for RWKV7 and Mamba2 models.

Providing official or recommended training configs for these models would help standardize experimentation and make it easier for the community to benchmark and train models consistently.
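To make the request concrete, below is a rough sketch of the kind of baseline config I have in mind. Every field name and value (model sizes, learning rate, scheduler choice, batch size) is my own guess for illustration only, not something taken from this repository:

```python
# Hypothetical sketch of the requested baseline configs.
# All values are illustrative assumptions, not maintainer recommendations.

rwkv7_config = {
    "model": {
        "arch": "rwkv7",        # assumed architecture identifier
        "hidden_size": 1024,
        "num_layers": 24,
        "vocab_size": 32000,
    },
    "optimizer": {
        "name": "adamw",
        "lr": 3e-4,
        "betas": [0.9, 0.95],
        "weight_decay": 0.1,
    },
    "scheduler": {
        "name": "cosine",
        "warmup_steps": 1000,
        "min_lr_ratio": 0.1,
    },
    "training": {
        "global_batch_size": 512,
        "seq_len": 2048,
        "max_steps": 20000,
        "precision": "bf16",
    },
}

# A Mamba2 variant would presumably share the optimizer/scheduler/training
# sections and differ mainly in the model block.
mamba2_config = {
    **rwkv7_config,
    "model": {
        "arch": "mamba2",       # assumed architecture identifier
        "hidden_size": 1024,
        "num_layers": 48,
        "state_size": 128,
        "vocab_size": 32000,
    },
}

if __name__ == "__main__":
    import json
    print(json.dumps(rwkv7_config, indent=2))
```

Even a single officially blessed config of this shape per model (plus the tokenizer and dataset assumptions behind any reported numbers) would be enough to serve as a reproducible starting point.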

Motivation

I’m currently exploring training pipelines for RWKV7 and Mamba2, and it’s been a challenge to reproduce results or even get started without a reliable baseline configuration.

Having official training configs would greatly reduce setup time and lower the barrier to entry for new users. It would also help ensure training aligns with best practices and model-specific requirements.

This feature request aims to address the recurring friction of inconsistent or undocumented training setups. If this is already covered elsewhere, I’d appreciate a pointer!

Your Contribution

I have been trying to train Mamba2 and RWKV-7 myself, but I have not been able to reproduce the reported results, so I cannot yet contribute a working configuration.

