Hello, thank you very much for your work, but I've encountered an issue while using Longformer. My Transformers library version is 4.48.3. When I try to initialize my own Longformer model from a configuration, the following error occurs:

```
AssertionError: Padding_idx must be within num_embeddings
```
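For reference, here is a minimal reproduction. The configuration values below are illustrative, not my actual setup; the trigger is any `pad_token_id` that falls outside the position-embedding table (i.e., `pad_token_id >= max_position_embeddings`):

```python
from transformers import LongformerConfig, LongformerModel

# Illustrative values: pad_token_id (200) is a valid token id for the
# vocabulary (1000 entries) but is not a valid row index for the
# position-embedding table (128 entries).
config = LongformerConfig(
    vocab_size=1000,
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    max_position_embeddings=128,
    attention_window=[32, 32],
    pad_token_id=200,
)

# Raises: AssertionError: Padding_idx must be within num_embeddings
model = LongformerModel(config)
```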
After reviewing the library code, I found that the issue might stem from the following code in Longformer:

```python
self.position_embeddings = nn.Embedding(
    config.max_position_embeddings, config.hidden_size, padding_idx=self.padding_idx
)
```
This line may have incorrectly introduced `padding_idx`, which I believe should not be present in `position_embeddings`. I checked the code for BERT and RoBERTa, and in the corresponding part they use:
```python
self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size)
```
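The difference matters because PyTorch's `nn.Embedding` asserts that `padding_idx` is a valid row index, which is easy to confirm in isolation:

```python
import torch.nn as nn

nn.Embedding(128, 64, padding_idx=1)    # fine: 1 < 128
nn.Embedding(128, 64, padding_idx=200)  # AssertionError: Padding_idx must be within num_embeddings
```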
Therefore, I would suggest considering removing this parameter.
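In the meantime, a workaround that worked in my testing (my own understanding, not an official recommendation) is to keep `pad_token_id` inside the position-embedding table when building a custom config. Since RoBERTa-style models generate position ids starting at `pad_token_id + 1`, reserving `pad_token_id + 1` extra rows seems safest:

```python
from transformers import LongformerConfig, LongformerModel

max_seq_len = 1024
pad_token_id = 1  # keep the pad id small, as in the released checkpoints

config = LongformerConfig(
    vocab_size=1000,
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    attention_window=[32, 32],
    pad_token_id=pad_token_id,
    # Position ids start at pad_token_id + 1, so reserve extra rows
    # (the released longformer-base config uses 4098 = 4096 + 1 + 1).
    max_position_embeddings=max_seq_len + pad_token_id + 1,
)
model = LongformerModel(config)  # constructs without the AssertionError
```

I hope this information is helpful to you. Thank you!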