
AutoTuner should not export the learner parameters that are being set by the tuner #141

Open
@mb706

Description

I am not sure this is possible with reasonable effort, because the tuner does not know which parameters it sets. Something like what I propose in mlr-org/paradox#225 might make this possible, since that would be a $trafo() that knows which parameters it sets. Alternatively, the user could somehow inform the AutoTuner which parameters are set by the tuner and should therefore not be set by the user.

Example of a trafo that sets either the maxdepth or the cp parameter, but does not make it obvious to the outside that it refers to maxdepth or cp:

> ps = ParamSet$new(list(ParamLgl$new("x")))
> ps$trafo = function(x, param_set)
+   if (!"x" %in% names(x)) x else  # circumvent bug in EvalPerf
+   if (x$x) list(cp = 0.5) else
+   list(maxdepth = 1)
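For illustration, the trafo above can be called directly to see which learner parameter it produces (a minimal sketch; it assumes the trafo is invoked with the sampled values and the param set, as paradox does during tuning):

```r
# x = TRUE selects cp, x = FALSE selects maxdepth; neither
# parameter name is visible from the ParamSet itself.
> ps$trafo(list(x = TRUE), ps)   # list(cp = 0.5)
> ps$trafo(list(x = FALSE), ps)  # list(maxdepth = 1)
```

This is exactly why the AutoTuner cannot tell, from the ParamSet alone, that cp and maxdepth are tuner-controlled.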

(Tuning then happens as follows:)

> tune = TunerRandomSearch$new(
+   pe = PerfEval$new(task = "iris", learner = "classif.rpart",
+     resampling = "cv", measure = "classif.ce", param_set = ps),
+   terminator = TerminatorEvaluations$new(3))$tune()
