This repository was archived by the owner on Oct 31, 2023. It is now read-only.

Using pretrained networks and small amounts of labelled data #19

@brocksar

Firstly, great and interesting work on PAWS! I have been working on applying PAWS to my specific use case and I have a couple of questions. Have you ever used a pretrained model as the backbone and then used PAWS for further training? In particular, I'm trying to take a backbone pretrained on ImageNet and use PAWS to fine-tune the weights on a smaller dataset with very few labels per class. I was wondering if you've ever experimented with a training setup like this, and what you would recommend hyperparameter-wise.
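For context, here is a minimal sketch of the setup I have in mind: a pretrained trunk and a freshly initialized projection head placed in separate optimizer parameter groups, so the pretrained weights move with a much smaller learning rate than the new head. The module shapes and learning rates below are illustrative placeholders, not values from the PAWS code:

```python
import torch
import torch.nn as nn

# Illustrative stand-ins: `trunk` plays the role of an ImageNet-pretrained
# backbone (in practice e.g. a ResNet-50 with loaded weights), and `head`
# is a randomly initialized projection head trained from scratch.
trunk = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
)
head = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 32),
)

# Two parameter groups: the pretrained trunk gets a much smaller
# learning rate than the new head, so fine-tuning with very few
# labels is less likely to destroy the pretrained features.
optimizer = torch.optim.SGD(
    [
        {"params": trunk.parameters(), "lr": 1e-4},  # pretrained: small lr
        {"params": head.parameters(), "lr": 1e-2},   # new head: larger lr
    ],
    lr=1e-3,  # default, overridden by the per-group values above
    momentum=0.9,
    weight_decay=1e-6,
)
```

Whether to freeze the trunk entirely for the first few epochs, and how far apart the two learning rates should be, is exactly the kind of hyperparameter question I'm asking about.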

My second question touches on the 'very few labels per class' I just mentioned. In the ablation study in the paper you go down to 4 labels per class. Have you experimented with anything lower? I've been working on training PAWS with one labeled image per class, and was wondering what insights you have on training in this regime. I know you have mentioned that increasing the number of classes per support batch is important, but I was wondering if there is anything else to consider.
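To make the one-label-per-class setting concrete, here is a sketch of the class-stratified support sampling I'm doing (the function name and arguments are my own, not the sampler from the PAWS repo): pick some number of classes per batch, then draw labeled indices from each class with replacement, so a class with a single labeled image simply repeats it and relies on distinct augmentations for variety:

```python
import random
from collections import defaultdict


def sample_support_batch(labels, classes_per_batch, imgs_per_class, rng=None):
    """Hypothetical class-stratified support sampler.

    labels: label of each image in the labeled pool (index -> class).
    Picks `classes_per_batch` distinct classes, then `imgs_per_class`
    indices per class, sampling with replacement so that even a class
    with a single labeled image can fill its slots.
    """
    rng = rng or random.Random()
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)

    chosen = rng.sample(sorted(by_class), classes_per_batch)
    batch = []
    for c in chosen:
        # With 1 label per class the same index repeats here; different
        # augmentations of that image still give distinct support views.
        batch.extend(rng.choices(by_class[c], k=imgs_per_class))
    return batch
```

With one labeled image per class, every support view of a class comes from the same underlying image, which is why I suspect augmentation strength matters as much as the classes-per-batch count here.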

Thanks so much!
