evalml.objectives.AUCMicro.objective_function

AUCMicro.objective_function(y_true, y_predicted, X=None)
Computes the relative value of the provided predictions compared to the actual labels, according to a specified metric.
Arguments:

y_predicted (pd.Series): predicted values of length [n_samples]

y_true (pd.Series): actual class labels of length [n_samples]

X (pd.DataFrame or np.array): extra data of shape [n_samples, n_features] necessary to calculate score

Returns:

Numerical value used to calculate score
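For illustration, a minimal usage sketch, assuming evalml is installed and that predicted class labels are accepted as the pd.Series described above; the label values are hypothetical sample data:

    import pandas as pd
    from evalml.objectives import AUCMicro

    # Hypothetical actual and predicted class labels for a 3-class problem.
    y_true = pd.Series([0, 1, 2, 2, 1, 0])
    y_predicted = pd.Series([0, 2, 2, 2, 1, 0])

    objective = AUCMicro()
    # Returns the micro-averaged AUC computed across all classes.
    score = objective.objective_function(y_true, y_predicted)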