
position_embeddings error when using custom config #261

@yagami-yon

Description

Hello, and thank you for this library, but I've run into an issue while using Longformer. My Transformers version is 4.48.3. When I try to initialize my own Longformer model from a custom configuration, the following error occurs:

AssertionError: Padding_idx must be within num_embeddings

After reviewing the library code, I found that the issue likely stems from the following code in Longformer:

self.position_embeddings = nn.Embedding(
    config.max_position_embeddings, config.hidden_size, padding_idx=self.padding_idx
)
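
For context, here is a minimal sketch of how the assertion can be triggered. The concrete values are only an assumption about the custom configuration (a pad_token_id greater than or equal to max_position_embeddings), since nn.Embedding asserts that padding_idx < num_embeddings:

# Hypothetical reproduction (assumed values): a custom config whose pad_token_id
# is >= max_position_embeddings. Building the position embeddings then fails,
# because nn.Embedding asserts padding_idx < num_embeddings.
from transformers import LongformerConfig, LongformerModel

config = LongformerConfig(
    vocab_size=32001,
    hidden_size=768,
    max_position_embeddings=4098,
    pad_token_id=32000,  # larger than max_position_embeddings
)
model = LongformerModel(config)
# AssertionError: Padding_idx must be within num_embeddings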

The padding_idx argument on position_embeddings appears to have been introduced by mistake; I believe position embeddings should not have a padding index. For comparison, the corresponding code in BERT and RoBERTa is:

self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size)

Therefore, I suggest removing this parameter from position_embeddings. I hope this information is helpful. Thank you!
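
Until such a change lands, one possible workaround (only a sketch, assuming you control the configuration) is to keep pad_token_id smaller than max_position_embeddings, so the assertion in nn.Embedding is never hit:

# Possible workaround (assumed values): use a small padding index, as
# RoBERTa-style configs do, so that padding_idx < max_position_embeddings holds.
from transformers import LongformerConfig, LongformerModel

config = LongformerConfig(
    vocab_size=32001,
    hidden_size=768,
    max_position_embeddings=4098,
    pad_token_id=1,  # below max_position_embeddings, so the model builds
)
model = LongformerModel(config)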
