Tuners
=======================

.. py:module:: evalml.tuners

.. autoapi-nested-parse::

   EvalML tuner classes.


Submodules
----------

.. toctree::
   :titlesonly:
   :maxdepth: 1

   grid_search_tuner/index.rst
   random_search_tuner/index.rst
   skopt_tuner/index.rst
   tuner/index.rst
   tuner_exceptions/index.rst


Package Contents
----------------

Classes Summary
~~~~~~~~~~~~~~~

.. autoapisummary::

   evalml.tuners.GridSearchTuner
   evalml.tuners.RandomSearchTuner
   evalml.tuners.SKOptTuner
   evalml.tuners.Tuner


Exceptions Summary
~~~~~~~~~~~~~~~~~~

.. autoapisummary::

   evalml.tuners.NoParamsException
   evalml.tuners.ParameterError


Contents
~~~~~~~~~~~~~~~~~~~

.. py:class:: GridSearchTuner(pipeline_hyperparameter_ranges, n_points=10, random_seed=0)

   Grid Search Optimizer, which generates all of the possible points to search for using a grid.

   :param pipeline_hyperparameter_ranges: A set of hyperparameter ranges corresponding to a pipeline's parameters.
   :type pipeline_hyperparameter_ranges: dict
   :param n_points: The number of points to sample along each dimension defined in the ``space`` argument. Defaults to 10.
   :type n_points: int
   :param random_seed: Seed for the random number generator. Unused in this class. Defaults to 0.
   :type random_seed: int

   .. rubric:: Examples

   >>> tuner = GridSearchTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}}, n_points=5)
   >>> proposal = tuner.propose()
   ...
   >>> assert proposal.keys() == {'My Component'}
   >>> assert proposal['My Component'] == {'param a': 0.0, 'param b': 'a'}

   Determines points using a grid search approach.

   >>> for each in range(5):
   ...     print(tuner.propose())
   {'My Component': {'param a': 0.0, 'param b': 'b'}}
   {'My Component': {'param a': 0.0, 'param b': 'c'}}
   {'My Component': {'param a': 10.0, 'param b': 'a'}}
   {'My Component': {'param a': 10.0, 'param b': 'b'}}
   {'My Component': {'param a': 10.0, 'param b': 'c'}}

   **Methods**

   .. autoapisummary::
      :nosignatures:

      evalml.tuners.GridSearchTuner.add
      evalml.tuners.GridSearchTuner.get_starting_parameters
      evalml.tuners.GridSearchTuner.is_search_space_exhausted
      evalml.tuners.GridSearchTuner.propose

   .. py:method:: add(self, pipeline_parameters, score)

      Not applicable to the grid search tuner, as generated parameters are not dependent on the scores of previous parameters.

      :param pipeline_parameters: A dict of the parameters used to evaluate a pipeline.
      :type pipeline_parameters: dict
      :param score: The score obtained by evaluating the pipeline with the provided parameters.
      :type score: float

   .. py:method:: get_starting_parameters(self, hyperparameter_ranges, random_seed=0)

      Gets the starting parameters given the pipeline hyperparameter range.

      :param hyperparameter_ranges: The custom hyperparameter ranges passed in during search. Used to determine the starting parameters.
      :type hyperparameter_ranges: dict
      :param random_seed: The random seed to use. Defaults to 0.
      :type random_seed: int

      :returns: The starting parameters, randomly chosen, to initialize a pipeline with.
      :rtype: dict

   .. py:method:: is_search_space_exhausted(self)

      Checks if it is possible to generate a set of valid parameters. Stores generated parameters in ``self.curr_params`` to be returned by ``propose()``.

      :returns: If no more valid parameters exist in the search space, return False.
      :rtype: bool

      :raises NoParamsException: If the search space is exhausted, this exception is raised.

   .. py:method:: propose(self)

      Returns parameters from ``_grid_points`` iterations.

      If all possible combinations of parameters have been scored, then ``NoParamsException`` is raised.

      :returns: Proposed pipeline parameters.
      :rtype: dict


.. py:exception:: NoParamsException

   Raised when a tuner exhausts its search space and runs out of parameters to propose.


.. py:exception:: ParameterError

   Raised when a tuner encounters an error with the parameters being used with it.


.. py:class:: RandomSearchTuner(pipeline_hyperparameter_ranges, with_replacement=False, replacement_max_attempts=10, random_seed=0)

   Random Search Optimizer.

   :param pipeline_hyperparameter_ranges: A set of hyperparameter ranges corresponding to a pipeline's parameters.
   :type pipeline_hyperparameter_ranges: dict
   :param with_replacement: If False, only unique hyperparameter sets will be proposed.
   :type with_replacement: bool
   :param replacement_max_attempts: The maximum number of tries to get a unique set of random parameters. Only used if the tuner is initialized with ``with_replacement=True``.
   :type replacement_max_attempts: int
   :param random_seed: Seed for the random number generator. Defaults to 0.
   :type random_seed: int

   .. rubric:: Example

   >>> tuner = RandomSearchTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}}, random_seed=42)
   >>> proposal = tuner.propose()
   ...
   >>> assert proposal.keys() == {'My Component'}
   >>> assert proposal['My Component'] == {'param a': 3.7454011884736254, 'param b': 'c'}

   Determines points using a random search approach.

   >>> for each in range(7):
   ...     print(tuner.propose())
   {'My Component': {'param a': 7.3199394181140525, 'param b': 'b'}}
   {'My Component': {'param a': 1.5601864044243654, 'param b': 'a'}}
   {'My Component': {'param a': 0.5808361216819947, 'param b': 'c'}}
   {'My Component': {'param a': 6.011150117432089, 'param b': 'c'}}
   {'My Component': {'param a': 0.2058449429580245, 'param b': 'c'}}
   {'My Component': {'param a': 8.32442640800422, 'param b': 'a'}}
   {'My Component': {'param a': 1.8182496720710064, 'param b': 'a'}}

   **Methods**

   .. autoapisummary::
      :nosignatures:

      evalml.tuners.RandomSearchTuner.add
      evalml.tuners.RandomSearchTuner.get_starting_parameters
      evalml.tuners.RandomSearchTuner.is_search_space_exhausted
      evalml.tuners.RandomSearchTuner.propose

   .. py:method:: add(self, pipeline_parameters, score)

      Not applicable to the random search tuner, as generated parameters are not dependent on the scores of previous parameters.

      :param pipeline_parameters: A dict of the parameters used to evaluate a pipeline.
      :type pipeline_parameters: dict
      :param score: The score obtained by evaluating the pipeline with the provided parameters.
      :type score: float

   .. py:method:: get_starting_parameters(self, hyperparameter_ranges, random_seed=0)

      Gets the starting parameters given the pipeline hyperparameter range.

      :param hyperparameter_ranges: The custom hyperparameter ranges passed in during search. Used to determine the starting parameters.
      :type hyperparameter_ranges: dict
      :param random_seed: The random seed to use. Defaults to 0.
      :type random_seed: int

      :returns: The starting parameters, randomly chosen, to initialize a pipeline with.
      :rtype: dict

   .. py:method:: is_search_space_exhausted(self)

      Checks if it is possible to generate a set of valid parameters. Stores generated parameters in ``self.curr_params`` to be returned by ``propose()``.

      :returns: If no more valid parameters exist in the search space, return False.
      :rtype: bool

      :raises NoParamsException: If the search space is exhausted, this exception is raised.

   .. py:method:: propose(self)

      Generate a unique set of parameters.

      If the tuner was initialized with ``with_replacement=True`` and is unable to generate a unique set of parameters after ``replacement_max_attempts`` tries, then ``NoParamsException`` is raised.

      :returns: Proposed pipeline parameters.
      :rtype: dict


.. py:class:: SKOptTuner(pipeline_hyperparameter_ranges, random_seed=0)

   Bayesian Optimizer.

   :param pipeline_hyperparameter_ranges: A set of hyperparameter ranges corresponding to a pipeline's parameters.
   :type pipeline_hyperparameter_ranges: dict
   :param random_seed: The seed for the random number generator. Defaults to 0.
   :type random_seed: int

   .. rubric:: Examples

   >>> tuner = SKOptTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}})
   >>> proposal = tuner.propose()
   ...
   >>> assert proposal.keys() == {'My Component'}
   >>> assert proposal['My Component'] == {'param a': 5.928446182250184, 'param b': 'c'}

   Determines points using a Bayesian optimization approach.

   >>> for each in range(7):
   ...     print(tuner.propose())
   {'My Component': {'param a': 8.57945617622757, 'param b': 'c'}}
   {'My Component': {'param a': 6.235636967859724, 'param b': 'b'}}
   {'My Component': {'param a': 2.9753460654447235, 'param b': 'a'}}
   {'My Component': {'param a': 2.7265629458011325, 'param b': 'b'}}
   {'My Component': {'param a': 8.121687287754932, 'param b': 'b'}}
   {'My Component': {'param a': 3.927847961008298, 'param b': 'c'}}
   {'My Component': {'param a': 3.3739616041726843, 'param b': 'b'}}

   **Methods**

   .. autoapisummary::
      :nosignatures:

      evalml.tuners.SKOptTuner.add
      evalml.tuners.SKOptTuner.get_starting_parameters
      evalml.tuners.SKOptTuner.is_search_space_exhausted
      evalml.tuners.SKOptTuner.propose

   .. py:method:: add(self, pipeline_parameters, score)

      Add score to sample.

      :param pipeline_parameters: A dict of the parameters used to evaluate a pipeline.
      :type pipeline_parameters: dict
      :param score: The score obtained by evaluating the pipeline with the provided parameters.
      :type score: float

      :returns: None

      :raises Exception: If the skopt tuner errors.
      :raises ParameterError: If skopt receives invalid parameters.

   .. py:method:: get_starting_parameters(self, hyperparameter_ranges, random_seed=0)

      Gets the starting parameters given the pipeline hyperparameter range.

      :param hyperparameter_ranges: The custom hyperparameter ranges passed in during search. Used to determine the starting parameters.
      :type hyperparameter_ranges: dict
      :param random_seed: The random seed to use. Defaults to 0.
      :type random_seed: int

      :returns: The starting parameters, randomly chosen, to initialize a pipeline with.
      :rtype: dict

   .. py:method:: is_search_space_exhausted(self)

      Optional. If the search space for the tuner is finite, this method indicates whether or not all possible parameters have been scored.

      :returns: True if all possible parameters in the search space have been scored.
      :rtype: bool

   .. py:method:: propose(self)

      Returns a suggested set of parameters to train and score a pipeline with, based on the search space dimensions and prior samples.

      :returns: Proposed pipeline parameters.
      :rtype: dict


.. py:class:: Tuner(pipeline_hyperparameter_ranges, random_seed=0)

   Base Tuner class.

   Tuners implement different strategies for sampling from a search space. They're used in EvalML to search the space of pipeline hyperparameters.

   :param pipeline_hyperparameter_ranges: A set of hyperparameter ranges corresponding to a pipeline's parameters.
   :type pipeline_hyperparameter_ranges: dict
   :param random_seed: The random state. Defaults to 0.
   :type random_seed: int

   **Methods**

   .. autoapisummary::
      :nosignatures:

      evalml.tuners.Tuner.add
      evalml.tuners.Tuner.get_starting_parameters
      evalml.tuners.Tuner.is_search_space_exhausted
      evalml.tuners.Tuner.propose

   .. py:method:: add(self, pipeline_parameters, score)
      :abstractmethod:

      Register a set of hyperparameters with the score obtained from training a pipeline with those hyperparameters.

      :param pipeline_parameters: A dict of the parameters used to evaluate a pipeline.
      :type pipeline_parameters: dict
      :param score: The score obtained by evaluating the pipeline with the provided parameters.
      :type score: float

      :returns: None

   .. py:method:: get_starting_parameters(self, hyperparameter_ranges, random_seed=0)

      Gets the starting parameters given the pipeline hyperparameter range.

      :param hyperparameter_ranges: The custom hyperparameter ranges passed in during search. Used to determine the starting parameters.
      :type hyperparameter_ranges: dict
      :param random_seed: The random seed to use. Defaults to 0.
      :type random_seed: int

      :returns: The starting parameters, randomly chosen, to initialize a pipeline with.
      :rtype: dict

   .. py:method:: is_search_space_exhausted(self)

      Optional. If the search space for the tuner is finite, this method indicates whether or not all possible parameters have been scored.

      :returns: True if all possible parameters in the search space have been scored.
      :rtype: bool

   .. py:method:: propose(self)
      :abstractmethod:

      Returns a suggested set of parameters to train and score a pipeline with, based on the search space dimensions and prior samples.

      :returns: Proposed pipeline parameters.
      :rtype: dict
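All tuners share the same ``propose``/``add`` loop: repeatedly ask the tuner for parameters, evaluate a pipeline with them, report the score back, and stop when ``NoParamsException`` signals an exhausted search space. The sketch below illustrates that interface contract with a minimal, self-contained toy grid tuner over categorical values only; ``ToyGridTuner`` is a hypothetical stand-in for illustration, not evalml's implementation.

```python
import itertools


class NoParamsException(Exception):
    """Raised when the search space is exhausted (mirrors evalml.tuners.NoParamsException)."""


class ToyGridTuner:
    """Toy tuner over a single component's categorical parameters."""

    def __init__(self, pipeline_hyperparameter_ranges):
        # Expect {'Component Name': {'param': [value, ...], ...}} with list-valued ranges.
        self._component = next(iter(pipeline_hyperparameter_ranges))
        params = pipeline_hyperparameter_ranges[self._component]
        self._names = list(params)
        # Cartesian product of all parameter value lists forms the grid.
        self._grid = iter(itertools.product(*(params[n] for n in self._names)))
        self.history = []

    def propose(self):
        # Return the next grid point, or raise once every point has been proposed.
        try:
            values = next(self._grid)
        except StopIteration:
            raise NoParamsException("Grid search has exhausted all possible parameters.")
        return {self._component: dict(zip(self._names, values))}

    def add(self, pipeline_parameters, score):
        # Grid search ignores scores; record them anyway for later inspection.
        self.history.append((pipeline_parameters, score))


# The propose/add loop, with NoParamsException as the termination signal.
tuner = ToyGridTuner({'My Component': {'param b': ['a', 'b', 'c']}})
results = []
while True:
    try:
        proposal = tuner.propose()
    except NoParamsException:
        break
    score = 1.0  # stand-in for actually training and scoring a pipeline
    tuner.add(proposal, score)
    results.append(proposal)
```

The same loop structure applies to the real tuners; only the strategy behind ``propose`` (grid, random, or Bayesian) differs.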