Tuners¶
EvalML tuner classes.
Package Contents¶
Classes Summary¶
GridSearchTuner – Grid Search Optimizer, which generates all of the possible points to search for using a grid.
RandomSearchTuner – Random Search Optimizer.
SKOptTuner – Bayesian Optimizer.
Tuner – Base Tuner class.
Exceptions Summary¶
NoParamsException – Raised when a tuner exhausts its search space and runs out of parameters to propose.
ParameterError – Raised when a tuner encounters an error with the parameters being used with it.
Contents¶
class evalml.tuners.GridSearchTuner(pipeline_hyperparameter_ranges, n_points=10, random_seed=0)[source]¶

Grid Search Optimizer, which generates all of the possible points to search for using a grid.
Parameters
    pipeline_hyperparameter_ranges (dict) – a set of hyperparameter ranges corresponding to a pipeline’s parameters
    n_points (int) – The number of points to sample from along each dimension defined in the space argument. Defaults to 10.
    random_seed (int) – Seed for random number generator. Unused in this class, defaults to 0.
Examples
>>> tuner = GridSearchTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}}, n_points=5)
>>> proposal = tuner.propose()
>>> assert proposal.keys() == {'My Component'}
>>> assert proposal['My Component'] == {'param a': 0.0, 'param b': 'a'}
Determines points using a grid search approach.
>>> for each in range(5):
...     print(tuner.propose())
{'My Component': {'param a': 0.0, 'param b': 'b'}}
{'My Component': {'param a': 0.0, 'param b': 'c'}}
{'My Component': {'param a': 10.0, 'param b': 'a'}}
{'My Component': {'param a': 10.0, 'param b': 'b'}}
{'My Component': {'param a': 10.0, 'param b': 'c'}}
Methods
    add – Not applicable to grid search tuner as generated parameters are not dependent on scores of previous parameters.
    is_search_space_exhausted – Checks if it is possible to generate a set of valid parameters. Stores generated parameters in self.curr_params to be returned by propose().
    propose – Returns parameters from _grid_points iterations.
add(self, pipeline_parameters, score)[source]¶

Not applicable to grid search tuner as generated parameters are not dependent on scores of previous parameters.
Parameters
    pipeline_parameters (dict) – a dict of the parameters used to evaluate a pipeline
    score (float) – the score obtained by evaluating the pipeline with the provided parameters
is_search_space_exhausted(self)[source]¶

Checks if it is possible to generate a set of valid parameters. Stores generated parameters in self.curr_params to be returned by propose().

Returns
    If no more valid parameters exist in the search space, return False.
Return type
    bool
Raises
    NoParamsException – If the search space is exhausted, this exception is thrown.
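Grid exhaustion can be made concrete with a small stand-alone sketch (not evalml source): it enumerates grid points as described above, dividing each numeric range into n_points evenly spaced samples, and raises a NoParamsException-style error once every combination has been proposed. All names in this sketch are illustrative.

```python
from itertools import product

class NoParamsException(Exception):
    """Raised here when the grid has no points left to propose."""

def grid_points(ranges, n_points=10):
    """Build the full grid: numeric ranges get n_points evenly spaced
    samples, categorical lists are used as-is."""
    dims = []
    for values in ranges.values():
        if all(isinstance(v, (int, float)) for v in values):
            lo, hi = min(values), max(values)
            step = (hi - lo) / (n_points - 1)
            dims.append([lo + i * step for i in range(n_points)])
        else:
            dims.append(list(values))
    for combo in product(*dims):
        yield dict(zip(ranges.keys(), combo))

class SketchGridTuner:
    def __init__(self, ranges, n_points=10):
        self._points = grid_points(ranges, n_points)

    def propose(self):
        try:
            return next(self._points)
        except StopIteration:
            raise NoParamsException("Grid search has exhausted all possible parameters.")

tuner = SketchGridTuner({'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}, n_points=2)
proposals = []
while True:
    try:
        proposals.append(tuner.propose())
    except NoParamsException:
        break
print(len(proposals))  # 2 numeric samples x 3 categories = 6 combinations
```

Because the grid is finite, callers are expected to handle the exhaustion exception, as the loop above does.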
exception evalml.tuners.NoParamsException[source]¶

Raised when a tuner exhausts its search space and runs out of parameters to propose.
exception evalml.tuners.ParameterError[source]¶

Raised when a tuner encounters an error with the parameters being used with it.
class evalml.tuners.RandomSearchTuner(pipeline_hyperparameter_ranges, with_replacement=False, replacement_max_attempts=10, random_seed=0)[source]¶

Random Search Optimizer.
Parameters
    pipeline_hyperparameter_ranges (dict) – a set of hyperparameter ranges corresponding to a pipeline’s parameters
    with_replacement (bool) – If False, only unique hyperparameters will be shown.
    replacement_max_attempts (int) – The maximum number of tries to get a unique set of random parameters. Only used if tuner is initialized with with_replacement=False.
    random_seed (int) – Seed for random number generator. Defaults to 0.
Example
>>> tuner = RandomSearchTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}}, random_seed=42)
>>> proposal = tuner.propose()
>>> assert proposal.keys() == {'My Component'}
>>> assert proposal['My Component'] == {'param a': 3.7454011884736254, 'param b': 'c'}
Determines points using a random search approach.
>>> for each in range(7):
...     print(tuner.propose())
{'My Component': {'param a': 7.3199394181140525, 'param b': 'b'}}
{'My Component': {'param a': 1.5601864044243654, 'param b': 'a'}}
{'My Component': {'param a': 0.5808361216819947, 'param b': 'c'}}
{'My Component': {'param a': 6.011150117432089, 'param b': 'c'}}
{'My Component': {'param a': 0.2058449429580245, 'param b': 'c'}}
{'My Component': {'param a': 8.32442640800422, 'param b': 'a'}}
{'My Component': {'param a': 1.8182496720710064, 'param b': 'a'}}
Methods
    add – Not applicable to random search tuner as generated parameters are not dependent on scores of previous parameters.
    is_search_space_exhausted – Checks if it is possible to generate a set of valid parameters. Stores generated parameters in self.curr_params to be returned by propose().
    propose – Generate a unique set of parameters.
add(self, pipeline_parameters, score)[source]¶

Not applicable to random search tuner as generated parameters are not dependent on scores of previous parameters.
Parameters
    pipeline_parameters (dict) – A dict of the parameters used to evaluate a pipeline
    score (float) – The score obtained by evaluating the pipeline with the provided parameters
is_search_space_exhausted(self)[source]¶

Checks if it is possible to generate a set of valid parameters. Stores generated parameters in self.curr_params to be returned by propose().

Returns
    If no more valid parameters exist in the search space, return False.
Return type
    bool
Raises
    NoParamsException – If the search space is exhausted, this exception is thrown.
propose(self)[source]¶

Generate a unique set of parameters.

If the tuner was initialized with with_replacement=False and is unable to generate a unique set of parameters after replacement_max_attempts tries, then NoParamsException is raised.

Returns
    Proposed pipeline parameters
Return type
    dict
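The retry behavior around replacement_max_attempts can be sketched without evalml as rejection sampling: keep drawing random parameter sets until an unseen one appears, and give up after a bounded number of attempts. The helper below is a hypothetical illustration, not the library's implementation.

```python
import random

class NoParamsException(Exception):
    """Raised here when no unique parameter set can be found."""

def propose_unique(ranges, seen, rng, max_attempts=10):
    """Draw random parameter sets until one has not been proposed before,
    giving up after max_attempts tries."""
    for _ in range(max_attempts):
        params = {}
        for name, values in ranges.items():
            if all(isinstance(v, (int, float)) for v in values):
                params[name] = rng.uniform(min(values), max(values))
            else:
                params[name] = rng.choice(values)
        key = tuple(sorted(params.items()))
        if key not in seen:
            seen.add(key)
            return params
    raise NoParamsException("Cannot create a unique set of unexplored parameters.")

rng = random.Random(42)
seen = set()
ranges = {'param b': ['a', 'b', 'c']}
# Three unique draws from a three-category space: each category appears once.
picks = [propose_unique(ranges, seen, rng, max_attempts=100)['param b'] for _ in range(3)]
print(sorted(picks))
```

When sampling with replacement instead, the uniqueness check (and hence the exception path) is skipped entirely.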
class evalml.tuners.SKOptTuner(pipeline_hyperparameter_ranges, random_seed=0)[source]¶

Bayesian Optimizer.
Parameters
    pipeline_hyperparameter_ranges (dict) – A set of hyperparameter ranges corresponding to a pipeline’s parameters.
    random_seed (int) – The seed for the random number generator. Defaults to 0.
Examples
>>> tuner = SKOptTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}})
>>> proposal = tuner.propose()
>>> assert proposal.keys() == {'My Component'}
>>> assert proposal['My Component'] == {'param a': 5.928446182250184, 'param b': 'c'}
Determines points using a Bayesian Optimizer approach.
>>> for each in range(7):
...     print(tuner.propose())
{'My Component': {'param a': 8.57945617622757, 'param b': 'c'}}
{'My Component': {'param a': 6.235636967859724, 'param b': 'b'}}
{'My Component': {'param a': 2.9753460654447235, 'param b': 'a'}}
{'My Component': {'param a': 2.7265629458011325, 'param b': 'b'}}
{'My Component': {'param a': 8.121687287754932, 'param b': 'b'}}
{'My Component': {'param a': 3.927847961008298, 'param b': 'c'}}
{'My Component': {'param a': 3.3739616041726843, 'param b': 'b'}}
Methods
    add – Add score to sample.
    is_search_space_exhausted – Optional. If the search space for the tuner is finite, this method indicates whether or not all possible parameters have been scored.
    propose – Returns a suggested set of parameters to train and score a pipeline with, based off the search space dimensions and prior samples.
add(self, pipeline_parameters, score)[source]¶

Add score to sample.
Parameters
    pipeline_parameters (dict) – A dict of the parameters used to evaluate a pipeline
    score (float) – The score obtained by evaluating the pipeline with the provided parameters
Returns
    None
Raises
    Exception – If skopt tuner errors.
    ParameterError – If skopt receives invalid parameters.
is_search_space_exhausted(self)¶

Optional. If the search space for the tuner is finite, this method indicates whether or not all possible parameters have been scored.

Returns
    True if all possible parameters in the search space have been scored.
Return type
    bool
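A Bayesian tuner like SKOptTuner is driven by a propose -> evaluate -> add loop. The sketch below shows the loop shape using a hypothetical stand-in tuner that samples uniformly at random; with the real SKOptTuner, add() feeds scores back to the optimizer so later proposals concentrate on promising regions. The objective function and all class names here are illustrative.

```python
import random

class StandInTuner:
    """Hypothetical stand-in with the same propose()/add() surface as a tuner.

    It samples uniformly at random; a Bayesian tuner would instead use the
    scores passed to add() to bias future proposals."""

    def __init__(self, pipeline_hyperparameter_ranges, random_seed=0):
        self._ranges = pipeline_hyperparameter_ranges
        self._rng = random.Random(random_seed)
        self.history = []

    def propose(self):
        return {
            component: {name: self._rng.uniform(*bounds) for name, bounds in params.items()}
            for component, params in self._ranges.items()
        }

    def add(self, pipeline_parameters, score):
        self.history.append((pipeline_parameters, score))

def evaluate(params):
    # Hypothetical objective: best when 'param a' is near 7.
    a = params['My Component']['param a']
    return -(a - 7.0) ** 2

tuner = StandInTuner({'My Component': {'param a': (0.0, 10.0)}})
best = None
for _ in range(20):
    proposal = tuner.propose()   # 1. ask the tuner for parameters
    score = evaluate(proposal)   # 2. train/score a pipeline with them
    tuner.add(proposal, score)   # 3. report the score back
    if best is None or score > best[1]:
        best = (proposal, score)
print(len(tuner.history))  # 20 scored parameter sets
```

In AutoML search, step 2 would be a full pipeline fit and cross-validation score rather than a toy objective.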
class evalml.tuners.Tuner(pipeline_hyperparameter_ranges, random_seed=0)[source]¶

Base Tuner class.
Tuners implement different strategies for sampling from a search space. They’re used in EvalML to search the space of pipeline hyperparameters.
Parameters
    pipeline_hyperparameter_ranges (dict) – a set of hyperparameter ranges corresponding to a pipeline’s parameters.
    random_seed (int) – The random state. Defaults to 0.
Methods
    add – Register a set of hyperparameters with the score obtained from training a pipeline with those hyperparameters.
    is_search_space_exhausted – Optional. If the search space for the tuner is finite, this method indicates whether or not all possible parameters have been scored.
    propose – Returns a suggested set of parameters to train and score a pipeline with, based off the search space dimensions and prior samples.
abstract add(self, pipeline_parameters, score)[source]¶

Register a set of hyperparameters with the score obtained from training a pipeline with those hyperparameters.
Parameters
    pipeline_parameters (dict) – a dict of the parameters used to evaluate a pipeline
    score (float) – the score obtained by evaluating the pipeline with the provided parameters
Returns
    None
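A custom strategy plugs in by implementing this same interface: add(), propose(), and optionally is_search_space_exhausted(). The toy tuner below is a sketch that does not import evalml; it replays a fixed candidate list purely to show the shape of such a class.

```python
class ListTuner:
    """Toy tuner following the Tuner interface: it replays a fixed list of
    candidate parameter sets and records the scores reported for them."""

    def __init__(self, pipeline_hyperparameter_ranges, candidates, random_seed=0):
        self._ranges = pipeline_hyperparameter_ranges
        self._candidates = list(candidates)
        self._index = 0
        self._scores = []

    def add(self, pipeline_parameters, score):
        """Register the score obtained for a set of parameters."""
        self._scores.append((pipeline_parameters, score))

    def propose(self):
        """Return the next candidate set of parameters."""
        params = self._candidates[self._index % len(self._candidates)]
        self._index += 1
        return params

    def is_search_space_exhausted(self):
        """True once every fixed candidate has been proposed."""
        return self._index >= len(self._candidates)

tuner = ListTuner(
    {'My Component': {'param b': ['a', 'b']}},
    candidates=[{'My Component': {'param b': 'a'}}, {'My Component': {'param b': 'b'}}],
)
first = tuner.propose()
tuner.add(first, score=0.9)
print(first['My Component']['param b'], tuner.is_search_space_exhausted())
# -> a False
```

A real subclass would derive from evalml.tuners.Tuner so the search loop can use it interchangeably with the built-in tuners.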