skopt_tuner
===========

.. py:module:: evalml.tuners.skopt_tuner

.. autoapi-nested-parse::

   Bayesian Optimizer.


Module Contents
---------------

Classes Summary
~~~~~~~~~~~~~~~

.. autoapisummary::

   evalml.tuners.skopt_tuner.SKOptTuner


Attributes Summary
~~~~~~~~~~~~~~~~~~

.. autoapisummary::

   evalml.tuners.skopt_tuner.logger


Contents
~~~~~~~~

.. py:data:: logger

.. py:class:: SKOptTuner(pipeline_hyperparameter_ranges, random_seed=0)

   Bayesian Optimizer.

   :param pipeline_hyperparameter_ranges: A set of hyperparameter ranges corresponding to a pipeline's parameters.
   :type pipeline_hyperparameter_ranges: dict
   :param random_seed: The seed for the random number generator. Defaults to 0.
   :type random_seed: int

   .. rubric:: Examples

   >>> tuner = SKOptTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}})
   >>> proposal = tuner.propose()
   ...
   >>> assert proposal.keys() == {'My Component'}
   >>> assert proposal['My Component'] == {'param a': 5.928446182250184, 'param b': 'c'}

   Determines points using a Bayesian optimization approach.

   >>> for each in range(7):
   ...     print(tuner.propose())
   {'My Component': {'param a': 8.57945617622757, 'param b': 'c'}}
   {'My Component': {'param a': 6.235636967859724, 'param b': 'b'}}
   {'My Component': {'param a': 2.9753460654447235, 'param b': 'a'}}
   {'My Component': {'param a': 2.7265629458011325, 'param b': 'b'}}
   {'My Component': {'param a': 8.121687287754932, 'param b': 'b'}}
   {'My Component': {'param a': 3.927847961008298, 'param b': 'c'}}
   {'My Component': {'param a': 3.3739616041726843, 'param b': 'b'}}

   **Methods**

   .. autoapisummary::
      :nosignatures:

      evalml.tuners.skopt_tuner.SKOptTuner.add
      evalml.tuners.skopt_tuner.SKOptTuner.get_starting_parameters
      evalml.tuners.skopt_tuner.SKOptTuner.is_search_space_exhausted
      evalml.tuners.skopt_tuner.SKOptTuner.propose

   .. py:method:: add(self, pipeline_parameters, score)

      Add score to sample.
      :param pipeline_parameters: A dict of the parameters used to evaluate a pipeline.
      :type pipeline_parameters: dict
      :param score: The score obtained by evaluating the pipeline with the provided parameters.
      :type score: float

      :returns: None

      :raises Exception: If the skopt tuner errors.
      :raises ParameterError: If skopt receives invalid parameters.

   .. py:method:: get_starting_parameters(self, hyperparameter_ranges, random_seed=0)

      Gets the starting parameters given the pipeline hyperparameter ranges.

      :param hyperparameter_ranges: The custom hyperparameter ranges passed in during search. Used to determine the starting parameters.
      :type hyperparameter_ranges: dict
      :param random_seed: The random seed to use. Defaults to 0.
      :type random_seed: int

      :returns: The starting parameters, randomly chosen, to initialize a pipeline with.
      :rtype: dict

   .. py:method:: is_search_space_exhausted(self)

      Optional. If the tuner's search space is finite, this method indicates whether all possible parameters have been scored.

      :returns: True if all possible parameters in the search space have been scored.
      :rtype: bool

   .. py:method:: propose(self)

      Returns a suggested set of parameters to train and score a pipeline with, based on the search space dimensions and prior samples.

      :returns: Proposed pipeline parameters.
      :rtype: dict