skopt_tuner#

Bayesian Optimizer.

Module Contents#

Classes Summary#

SKOptTuner

Bayesian Optimizer.

Attributes Summary#

logger

Contents#

evalml.tuners.skopt_tuner.logger#
class evalml.tuners.skopt_tuner.SKOptTuner(pipeline_hyperparameter_ranges, random_seed=0)[source]#

Bayesian Optimizer.

Parameters
  • pipeline_hyperparameter_ranges (dict) – A set of hyperparameter ranges corresponding to a pipeline’s parameters.

  • random_seed (int) – The seed for the random number generator. Defaults to 0.

Examples

>>> tuner = SKOptTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}})
>>> proposal = tuner.propose()
>>> assert proposal.keys() == {'My Component'}
>>> assert proposal['My Component'] == {'param a': 5.928446182250184, 'param b': 'c'}

Subsequent points are determined using a Bayesian optimization approach.

>>> for each in range(7):
...     print(tuner.propose())
{'My Component': {'param a': 8.57945617622757, 'param b': 'c'}}
{'My Component': {'param a': 6.235636967859724, 'param b': 'b'}}
{'My Component': {'param a': 2.9753460654447235, 'param b': 'a'}}
{'My Component': {'param a': 2.7265629458011325, 'param b': 'b'}}
{'My Component': {'param a': 8.121687287754932, 'param b': 'b'}}
{'My Component': {'param a': 3.927847961008298, 'param b': 'c'}}
{'My Component': {'param a': 3.3739616041726843, 'param b': 'b'}}

Methods

add

Add score to sample.

get_starting_parameters

Gets the starting parameters given the pipeline hyperparameter ranges.

is_search_space_exhausted

Optional. If the search space for the tuner is finite, this method indicates whether all possible parameter combinations have been scored.

propose

Returns a suggested set of parameters to train and score a pipeline with, based on the search space dimensions and prior samples.

add(self, pipeline_parameters, score)[source]#

Add score to sample.

Parameters
  • pipeline_parameters (dict) – A dict of the parameters used to evaluate a pipeline.

  • score (float) – The score obtained by evaluating the pipeline with the provided parameters.

Returns

None

Raises
  • Exception – If the skopt tuner errors.

  • ParameterError – If skopt receives invalid parameters.
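The add/propose feedback loop can be sketched as follows. Because this reference page doesn't ship a runnable environment, the sketch uses a hypothetical `RandomTunerSketch` class (stdlib only, not evalml code) that mimics the tuner interface with random search; the real `SKOptTuner` uses the scores passed to `add` to fit its Bayesian surrogate model instead.

```python
import random


class RandomTunerSketch:
    """Hypothetical stand-in for SKOptTuner's interface (not evalml code).

    Illustrates the add/propose loop: propose() suggests parameters,
    add() records the score so future proposals can account for it.
    """

    def __init__(self, pipeline_hyperparameter_ranges, random_seed=0):
        self._ranges = pipeline_hyperparameter_ranges
        self._rng = random.Random(random_seed)
        self._history = []  # (parameters, score) pairs

    def propose(self):
        proposal = {}
        for component, params in self._ranges.items():
            chosen = {}
            for name, space in params.items():
                if len(space) == 2 and all(isinstance(v, (int, float)) for v in space):
                    # Numeric [min, max] range: sample uniformly.
                    chosen[name] = self._rng.uniform(space[0], space[1])
                else:
                    # Categorical choices: pick one at random.
                    chosen[name] = self._rng.choice(space)
            proposal[component] = chosen
        return proposal

    def add(self, pipeline_parameters, score):
        # A real Bayesian tuner would refit its surrogate model here.
        self._history.append((pipeline_parameters, score))


tuner = RandomTunerSketch({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}})
for _ in range(3):
    proposal = tuner.propose()
    score = -abs(proposal['My Component']['param a'] - 5.0)  # toy objective
    tuner.add(proposal, score)
```

The loop shape (propose, evaluate, add) is the same with the real tuner; only the proposal strategy differs.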

get_starting_parameters(self, hyperparameter_ranges, random_seed=0)#

Gets the starting parameters given the pipeline hyperparameter ranges.

Parameters
  • hyperparameter_ranges (dict) – The custom hyperparameter ranges passed in during search. Used to determine the starting parameters.

  • random_seed (int) – The random seed to use. Defaults to 0.

Returns

The starting parameters, randomly chosen, to initialize a pipeline with.

Return type

dict
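The behavior described above (randomly chosen starting parameters, reproducible via a seed) can be illustrated with a stdlib-only sketch. The function below is a hypothetical illustration, not evalml's implementation; it assumes each entry in the ranges dict is either a two-number `[min, max]` range or a list of categorical choices.

```python
import random


def starting_parameters_sketch(hyperparameter_ranges, random_seed=0):
    """Hypothetical illustration (not evalml's implementation) of choosing
    random starting parameters from a hyperparameter-ranges dict."""
    rng = random.Random(random_seed)
    chosen = {}
    for name, space in hyperparameter_ranges.items():
        if len(space) == 2 and all(isinstance(v, (int, float)) for v in space):
            chosen[name] = rng.uniform(space[0], space[1])  # numeric [min, max] range
        else:
            chosen[name] = rng.choice(space)  # categorical choices
    return chosen
```

Because the generator is seeded, the same `random_seed` always yields the same starting parameters.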

is_search_space_exhausted(self)#

Optional. If the search space for the tuner is finite, this method indicates whether all possible parameter combinations have been scored.

Returns

True if all possible parameter combinations in the search space have been scored.

Return type

bool
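For a purely categorical (and therefore finite) search space, "exhausted" means every combination has been scored. The helper below is a hypothetical illustration of that check, not evalml's implementation; it assumes the ranges dict maps parameter names to lists of categorical choices.

```python
from itertools import product


def is_exhausted_sketch(categorical_ranges, scored_parameter_sets):
    """Hypothetical illustration (not evalml code): a purely categorical
    search space is exhausted once every combination has been scored."""
    names = sorted(categorical_ranges)
    # Every possible combination, as tuples in a fixed parameter order.
    all_combos = {combo for combo in product(*(categorical_ranges[n] for n in names))}
    # Combinations already scored, in the same order.
    seen = {tuple(params[n] for n in names) for params in scored_parameter_sets}
    return all_combos <= seen
```

A tuner with a continuous dimension (e.g. a `[0.0, 10.0]` range) can never be exhausted this way, which is why the method is marked optional.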

propose(self)[source]#

Returns a suggested set of parameters to train and score a pipeline with, based on the search space dimensions and prior samples.

Returns

Proposed pipeline parameters.

Return type

dict