Closed
Labels
enhancement (Feature that is not a new algorithm or an algorithm enhancement)
Description
New feature request
import tianshou, gym, torch, numpy, sys
print(tianshou.__version__, gym.__version__, torch.__version__, numpy.__version__, sys.version, sys.platform)
0.4.10 0.21.0 1.12.1.post200 1.23.3 3.10.6 | packaged by conda-forge | (main, Aug 22 2022, 20:35:26) [GCC 10.4.0] linux
The types of the args norm_layer and activation differ between tianshou.utils.net.common.Net, where they are Optional[ModuleType], and MLP, where they are Optional[Union[ModuleType, Sequence[ModuleType]]]. In Net.__init__, the call to MLP() passes norm_layer and activation through without adapting them to the Sequence[ModuleType] case. I think it's better to keep the types of these args consistent between Net and MLP.
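For illustration (a sketch assuming MLP's documented per-hidden-layer list semantics in 0.4.10):

import torch.nn as nn
from tianshou.utils.net.common import MLP, Net

# MLP's annotation allows one activation class per hidden layer.
mlp = MLP(4, 2, hidden_sizes=[64, 64], activation=[nn.ReLU, nn.Tanh])

# Net forwards `activation` to MLP, but its own annotation is only
# Optional[ModuleType], so the same per-layer list is rejected by type
# checkers even though MLP itself could accept it.
net = Net(4, 2, hidden_sizes=[64, 64], activation=[nn.ReLU, nn.Tanh])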
Besides norm_layer and activation, I suggest adding companion args for passing args or kwargs to the constructors of norm_layer and activation.
For example,
from typing import Any, Dict, List, Optional, Tuple, Type, Union
import torch.nn as nn

ModuleType = Type[nn.Module]
ArgsType = Union[Tuple[Any, ...], Dict[str, Any]]  # positional or keyword args

def miniblock(
    input_size: int,
    output_size: int = 0,
    norm_layer: Optional[ModuleType] = None,
    norm_args: Optional[ArgsType] = None,
    activation: Optional[ModuleType] = None,
    act_args: Optional[ArgsType] = None,
    linear_layer: Type[nn.Linear] = nn.Linear,
) -> List[nn.Module]:
    ...  # body sketched below
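A minimal sketch of how the body could consume these args (my illustration, not tianshou code; the helper name _apply_args is made up): a tuple is splatted as positional args, a dict as keyword args.

def _apply_args(cls: ModuleType, args: Optional[ArgsType], *prefix: Any) -> nn.Module:
    # Hypothetical helper: instantiate cls with required positional args in
    # `prefix` (e.g. output_size for a norm layer) plus the user-supplied
    # positional (tuple) or keyword (dict) args.
    if args is None:
        return cls(*prefix)
    if isinstance(args, dict):
        return cls(*prefix, **args)
    return cls(*prefix, *args)

# Inside miniblock:
#     layers = [linear_layer(input_size, output_size)]
#     if norm_layer is not None:
#         layers.append(_apply_args(norm_layer, norm_args, output_size))
#     if activation is not None:
#         layers.append(_apply_args(activation, act_args))
#     return layers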
Similarly, add act_args and norm_args, accepting args, kwargs, or a sequence of args/kwargs (to match the Sequence[ModuleType] case), to Net and MLP.
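Usage could then look like this (hypothetical API; norm_args and act_args do not exist in 0.4.10):

import torch.nn as nn
from tianshou.utils.net.common import MLP

# Hypothetical: configure LayerNorm's eps and LeakyReLU's slope per block.
mlp = MLP(
    4, 2, hidden_sizes=[64, 64],
    norm_layer=nn.LayerNorm, norm_args={"eps": 1e-6},
    activation=nn.LeakyReLU, act_args={"negative_slope": 0.1},
)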