
Integrating X-LoRA #1472

@EricLBuehler

Description

Hello all, and thank you for your great work!

Earlier this week, we announced X-LoRA, a flexible mixture-of-experts (MoE) approach for LoRA adapters. It implements deep layer-wise and token-wise scalings across multiple LoRA adapters, and we provide an implementation (https://github.com/EricLBuehler/xlora) that can be applied straightforwardly to any model to which peft LoRA adapters can be applied. This makes it possible to orchestrate adapters at a much finer granularity, producing new per-token, deep layer-wise combinations of adapter parameters to solve specific tasks. Sample weights are available at https://huggingface.co/lamm-mit/x-lora, with examples from protein science in the paper.
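
To make the core idea concrete, here is a minimal, self-contained PyTorch sketch of how per-token scalings can mix multiple LoRA adapters on a single linear layer. This is illustrative only, not the xlora implementation: the names (`XLoRALinear`, the inline classifier stand-in) are hypothetical, and in actual X-LoRA the scalings come from a classifier operating on the frozen base model's hidden states via an additional forward pass.

```python
import torch
import torch.nn as nn

class XLoRALinear(nn.Module):
    """Illustrative sketch: a frozen base linear layer plus several LoRA
    adapters whose outputs are mixed by per-token, per-adapter scalings."""

    def __init__(self, in_features, out_features, n_adapters=3, rank=8):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.requires_grad_(False)  # base weights stay frozen
        # One low-rank (A, B) pair per adapter.
        self.loras_a = nn.ModuleList(
            nn.Linear(in_features, rank, bias=False) for _ in range(n_adapters)
        )
        self.loras_b = nn.ModuleList(
            nn.Linear(rank, out_features, bias=False) for _ in range(n_adapters)
        )
        # Stand-in for the X-LoRA scaling classifier (hypothetical; the real
        # classifier sees the whole model's hidden states, not just this layer).
        self.classifier = nn.Linear(in_features, n_adapters)

    def forward(self, x):  # x: (batch, seq, in_features)
        # Per-token, per-adapter scalings, normalized over adapters.
        scalings = torch.softmax(self.classifier(x), dim=-1)  # (batch, seq, n)
        out = self.base(x)
        for i, (a, b) in enumerate(zip(self.loras_a, self.loras_b)):
            # Each adapter's contribution is scaled token-wise.
            out = out + scalings[..., i : i + 1] * b(a(x))
        return out

layer = XLoRALinear(64, 64)
y = layer(torch.randn(2, 10, 64))
print(y.shape)  # torch.Size([2, 10, 64])
```

Because the scalings are computed per token and per layer, different tokens can draw on different adapter mixtures at different depths, which is what enables the fine-grained orchestration described above.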

Would you be interested in integrating X-LoRA into peft? I would be happy to work on this if there is interest from you and the community.
