
Add LR schedulers to more algorithms #586

@alexnikulkov

Description

- I have marked all applicable categories:
  - exception-raising bug
  - RL algorithm bug
  - documentation request (i.e. "X is missing from the documentation.")
  - new feature request
- I have visited the source website
- I have searched through the issue tracker for duplicates
- I have mentioned version numbers, operating system and environment, where applicable:

  ```python
  import tianshou, gym, torch, numpy, sys
  print(tianshou.__version__, gym.__version__, torch.__version__,
        numpy.__version__, sys.version, sys.platform)
  ```

I see that PG and algorithms which inherit from it support LR schedulers. Can we add support for LR schedulers to more algorithms? I'm primarily interested in DDPG and SAC.
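For context, the existing support takes a standard `torch.optim.lr_scheduler` object that the policy steps once per update. A minimal sketch of that pattern in plain torch (the linear network and the linear-decay schedule here are placeholders, not tianshou code):

```python
import torch

# Stand-in for an actor network; tianshou would wrap a real policy net.
net = torch.nn.Linear(4, 2)
optim = torch.optim.Adam(net.parameters(), lr=1e-3)

# Linearly decay the learning rate to zero over 100 update steps.
# This is the kind of scheduler a PG-style policy accepts and steps.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optim, lr_lambda=lambda step: max(0.0, 1 - step / 100)
)

for _ in range(50):
    optim.step()       # parameter update would happen here
    scheduler.step()   # the policy steps the scheduler after each update

print(optim.param_groups[0]["lr"])  # halfway through the decay: 5e-4
```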

As a more ambitious refactor, can we add support for LR schedulers to all algorithms by allowing a tuple of (optim, LR_scheduler) to be used instead of an optim when we want to use a scheduler?
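The tuple idea could be implemented with a small helper in the policy base class; the sketch below is hypothetical (the name `unpack_optim` and the call site are assumptions, not tianshou API):

```python
import torch

def unpack_optim(optim):
    """Hypothetical helper for the proposed API: accept either a bare
    optimizer or an (optimizer, lr_scheduler) tuple, and return both parts.
    A policy base class could call this once in __init__, then invoke
    scheduler.step() after each gradient update when a scheduler is present.
    """
    if isinstance(optim, tuple):
        optimizer, scheduler = optim
        return optimizer, scheduler
    return optim, None

net = torch.nn.Linear(4, 2)
adam = torch.optim.Adam(net.parameters(), lr=1e-3)

# Bare optimizer: behaves exactly as today, no scheduler attached.
opt, sched = unpack_optim(adam)
assert opt is adam and sched is None

# (optimizer, scheduler) tuple: the policy would step the scheduler.
step_lr = torch.optim.lr_scheduler.StepLR(adam, step_size=10, gamma=0.1)
opt, sched = unpack_optim((adam, step_lr))
assert opt is adam and sched is step_lr
```

This would keep every existing call site working unchanged while letting any algorithm opt into scheduling, instead of adding an `lr_scheduler` keyword to each policy one by one.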

Metadata

Labels: enhancement (Feature that is not a new algorithm or an algorithm enhancement)
