random_search_tuner
===========================================

.. py:module:: evalml.tuners.random_search_tuner

.. autoapi-nested-parse::

   Random Search Optimizer.



Module Contents
---------------

Classes Summary
~~~~~~~~~~~~~~~

.. autoapisummary::

   evalml.tuners.random_search_tuner.RandomSearchTuner


Contents
~~~~~~~~~~~~~~~~~~~

.. py:class:: RandomSearchTuner(pipeline_hyperparameter_ranges, with_replacement=False, replacement_max_attempts=10, random_seed=0)

   Random Search Optimizer.

   :param pipeline_hyperparameter_ranges: A set of hyperparameter ranges corresponding to a pipeline's parameters.
   :type pipeline_hyperparameter_ranges: dict
   :param with_replacement: If False, only unique hyperparameters will be shown.
   :type with_replacement: bool
   :param replacement_max_attempts: The maximum number of tries to get a unique set of random parameters. Only used if the tuner is initialized with with_replacement=False.
   :type replacement_max_attempts: int
   :param random_seed: Seed for random number generator. Defaults to 0.
   :type random_seed: int

   .. rubric:: Example

   >>> tuner = RandomSearchTuner({'My Component': {'param a': [0.0, 10.0], 'param b': ['a', 'b', 'c']}}, random_seed=42)
   >>> proposal = tuner.propose()
   ...
   >>> assert proposal.keys() == {'My Component'}
   >>> assert proposal['My Component'] == {'param a': 3.7454011884736254, 'param b': 'c'}

   Determines points using a random search approach.

   >>> for each in range(7):
   ...     print(tuner.propose())
   {'My Component': {'param a': 7.3199394181140525, 'param b': 'b'}}
   {'My Component': {'param a': 1.5601864044243654, 'param b': 'a'}}
   {'My Component': {'param a': 0.5808361216819947, 'param b': 'c'}}
   {'My Component': {'param a': 6.011150117432089, 'param b': 'c'}}
   {'My Component': {'param a': 0.2058449429580245, 'param b': 'c'}}
   {'My Component': {'param a': 8.32442640800422, 'param b': 'a'}}
   {'My Component': {'param a': 1.8182496720710064, 'param b': 'a'}}

   **Methods**

   .. autoapisummary::
      :nosignatures:

      evalml.tuners.random_search_tuner.RandomSearchTuner.add
      evalml.tuners.random_search_tuner.RandomSearchTuner.get_starting_parameters
      evalml.tuners.random_search_tuner.RandomSearchTuner.is_search_space_exhausted
      evalml.tuners.random_search_tuner.RandomSearchTuner.propose

   .. py:method:: add(self, pipeline_parameters, score)

      Not applicable to the random search tuner, as generated parameters are not dependent on the scores of previous parameters.

      :param pipeline_parameters: A dict of the parameters used to evaluate a pipeline.
      :type pipeline_parameters: dict
      :param score: The score obtained by evaluating the pipeline with the provided parameters.
      :type score: float


   .. py:method:: get_starting_parameters(self, hyperparameter_ranges, random_seed=0)

      Gets the starting parameters given the pipeline hyperparameter range.

      :param hyperparameter_ranges: The custom hyperparameter ranges passed in during search. Used to determine the starting parameters.
      :type hyperparameter_ranges: dict
      :param random_seed: The random seed to use. Defaults to 0.
      :type random_seed: int

      :returns: The starting parameters, randomly chosen, to initialize a pipeline with.
      :rtype: dict


   .. py:method:: is_search_space_exhausted(self)

      Checks if it is possible to generate a set of valid parameters. Stores generated parameters in ``self.curr_params`` to be returned by ``propose()``.

      :returns: Returns False if a valid set of unexplored parameters can still be generated.
      :rtype: bool

      :raises NoParamsException: If the search space is exhausted, this exception is raised.


   .. py:method:: propose(self)

      Generate a unique set of parameters.

      If the tuner was initialized with ``with_replacement=False`` and is unable to generate a unique set of parameters within ``replacement_max_attempts`` tries, a ``NoParamsException`` is raised.

      :returns: Proposed pipeline parameters.
      :rtype: dict
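To make the uniqueness and exhaustion behavior of ``propose()`` concrete, the sketch below reimplements the same idea in plain Python. ``TinyRandomSearch`` is a hypothetical, simplified stand-in written for illustration only, not evalml's implementation; it mirrors the ``with_replacement`` / ``replacement_max_attempts`` semantics described above, raising a plain ``RuntimeError`` where evalml would raise ``NoParamsException``.

```python
import random


class TinyRandomSearch:
    """Illustrative sketch of random search over a discrete space with
    optional uniqueness (not evalml's actual implementation)."""

    def __init__(self, space, with_replacement=False,
                 replacement_max_attempts=10, random_seed=0):
        self.space = space  # {param_name: list_of_choices}
        self.with_replacement = with_replacement
        self.max_attempts = replacement_max_attempts
        self.rng = random.Random(random_seed)
        self.seen = set()

    def propose(self):
        for _ in range(self.max_attempts):
            # Draw one random value per parameter.
            sample = tuple((name, self.rng.choice(choices))
                           for name, choices in self.space.items())
            if self.with_replacement or sample not in self.seen:
                self.seen.add(sample)
                return dict(sample)
        # Analogous to evalml's NoParamsException.
        raise RuntimeError("Cannot create a unique set of unexplored parameters.")


space = {'param b': ['a', 'b', 'c']}
tuner = TinyRandomSearch(space, replacement_max_attempts=100, random_seed=42)

# With only three possible values, three proposals exhaust the space.
unique = {tuner.propose()['param b'] for _ in range(3)}
print(sorted(unique))  # ['a', 'b', 'c']

# A fourth proposal cannot be unique, so the tuner gives up after
# replacement_max_attempts tries and raises.
try:
    tuner.propose()
except RuntimeError as err:
    print("exhausted:", err)
```

Passing ``with_replacement=True`` instead would skip the uniqueness check entirely, so ``propose()`` could return the same combination repeatedly and never raise.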