Feature Request
I would like to request the addition of training configuration files (covering model architecture, optimizer settings, learning-rate scheduler, and similar hyperparameters) for the RWKV7 and Mamba2 models.
Providing official or recommended training configs for these models would help standardize experimentation and make it easier for the community to benchmark and train models consistently.
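For illustration only, here is a rough sketch of the kind of config I have in mind. Every field name and value below is a hypothetical placeholder I made up for this request, not an actual setting from this repository:

```python
from dataclasses import dataclass

# Hypothetical sketch of a training config for RWKV7 / Mamba2.
# All names and values are placeholders, not settings from this repo.
@dataclass
class TrainingConfig:
    # model architecture
    model_name: str = "rwkv7"        # or "mamba2"
    hidden_size: int = 768
    num_layers: int = 12
    # optimizer
    optimizer: str = "adamw"
    learning_rate: float = 3e-4
    weight_decay: float = 0.1
    betas: tuple = (0.9, 0.95)
    # learning-rate scheduler
    scheduler: str = "cosine"
    warmup_steps: int = 1000
    # training loop
    batch_size: int = 32
    seq_len: int = 2048
    max_steps: int = 100_000
```

Even a single file per model along these lines (in whatever format the project prefers) would go a long way.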
Motivation
I’m currently exploring training pipelines for RWKV7 and Mamba2, and it’s been a challenge to reproduce results or even get started without a reliable baseline configuration.
Having official training configs would greatly reduce setup time and lower the barrier to entry for new users. It would also help ensure training aligns with best practices and model-specific requirements.
This feature request aims to address the recurring friction of inconsistent or undocumented training setups. If this is already covered elsewhere, I’d appreciate a pointer!
Your Contribution
I have been trying to train Mamba2 and RWKV-7 myself, but so far I have not been able to reproduce the reported results.