Comparing multiple reports

The ComparisonReport class provides a report for comparing EstimatorReport or CrossValidationReport instances interactively. The report's functionality is exposed through accessors.

ComparisonReport(reports, *[, n_jobs])

Report for comparing reports.
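
For instance, two classifiers trained on the same split can be compared as follows. This is a minimal sketch: the dataset and model names are illustrative, and passing a dict to name each report is an assumption about the reports argument.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

from skore import ComparisonReport, EstimatorReport

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One EstimatorReport per candidate model, all sharing the same split.
split = dict(X_train=X_train, y_train=y_train, X_test=X_test, y_test=y_test)
reports = {
    "logistic_regression": EstimatorReport(LogisticRegression(), **split),
    "random_forest": EstimatorReport(
        RandomForestClassifier(random_state=0), **split
    ),
}

comparison = ComparisonReport(reports)
comparison.help()  # overview of everything the report exposes
```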

Methods

ComparisonReport.help()

Display report help using rich or HTML.

ComparisonReport.cache_predictions([...])

Cache the predictions of the underlying estimator reports.

ComparisonReport.clear_cache()

Clear the cache.

ComparisonReport.create_estimator_report(*, name)

Create an estimator report from one of the reports in the comparison.

ComparisonReport.get_predictions(*, data_source)

Get predictions from the underlying reports.
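
The methods above combine into a simple workflow, sketched here as a continuation of the comparison built in the first example; the data_source value and the report name are illustrative assumptions.

```python
# Precompute predictions once so later metric calls reuse them.
comparison.cache_predictions()

# One set of predictions per compared report (the exact return type
# is an assumption).
predictions = comparison.get_predictions(data_source="test")

# Discard everything cached so far.
comparison.clear_cache()

# Pull a single sub-report back out of the comparison by its name.
report = comparison.create_estimator_report(name="random_forest")
```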

Accessors

ComparisonReport.inspection

Accessor for model inspection related operations.

ComparisonReport.metrics

Accessor for metrics-related operations.

Metrics

The metrics accessor helps you evaluate the statistical performance of the compared estimators. It also provides plotting methods, such as roc and precision_recall, that produce the common performance-metric visualizations.
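
A typical entry point is summarize(), sketched below on the comparison built earlier. The data_source keyword and the .frame() call to obtain a pandas DataFrame from the returned display are assumptions about the API.

```python
# Summarize the default set of metrics for every compared estimator.
summary = comparison.metrics.summarize(data_source="test")
summary.frame()  # assumed: one column per estimator, one row per metric
```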

ComparisonReport.metrics.help()

Display accessor help using rich or HTML.

ComparisonReport.metrics.summarize(*[, ...])

Report a set of metrics for the estimators.

ComparisonReport.metrics.accuracy(*[, ...])

Compute the accuracy score.

ComparisonReport.metrics.brier_score(*[, ...])

Compute the Brier score.

ComparisonReport.metrics.confusion_matrix(*)

Plot the confusion matrix.

ComparisonReport.metrics.custom_metric(...)

Compute a custom metric.

ComparisonReport.metrics.log_loss(*[, ...])

Compute the log loss.

ComparisonReport.metrics.precision(*[, ...])

Compute the precision score.

ComparisonReport.metrics.precision_recall(*)

Plot the precision-recall curve.

ComparisonReport.metrics.prediction_error(*)

Plot the prediction error of a regression model.

ComparisonReport.metrics.r2(*[, ...])

Compute the R² score.

ComparisonReport.metrics.recall(*[, ...])

Compute the recall score.

ComparisonReport.metrics.rmse(*[, ...])

Compute the root mean squared error.

ComparisonReport.metrics.roc(*[, ...])

Plot the ROC curve.

ComparisonReport.metrics.roc_auc(*[, ...])

Compute the ROC AUC score.

ComparisonReport.metrics.timings([aggregate])

Get all measured processing times related to the different estimators.
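
The scalar metrics and the plotting methods follow the same pattern. The sketch below, again continuing from the earlier comparison, assumes that data_source selects the evaluation set and that plotting methods return display objects with a .plot() method.

```python
# Scalar metric: one value per compared estimator.
comparison.metrics.accuracy(data_source="test")

# Curve displays overlay one curve per estimator.
display = comparison.metrics.roc(data_source="test")
display.plot()  # assumed: renders the overlaid curves with matplotlib

# Fit and predict times measured for each estimator.
comparison.metrics.timings()
```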

Inspection

The inspection accessor helps you inspect your model, for example by evaluating the importance of its features.

ComparisonReport.inspection.help()

Display accessor help using rich or HTML.

ComparisonReport.inspection.coefficients()

Retrieve the coefficients for each report, including the intercepts.
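
For example, when every compared estimator is a linear model, their coefficients can be placed side by side. This sketch reuses the split dict from the first example; coefficients() is only meaningful for estimators that expose coefficients, so the random forest used earlier would not qualify.

```python
from sklearn.linear_model import LogisticRegression, RidgeClassifier

from skore import ComparisonReport, EstimatorReport

linear_reports = {
    "logistic": EstimatorReport(LogisticRegression(), **split),
    "ridge": EstimatorReport(RidgeClassifier(), **split),
}
linear_comparison = ComparisonReport(linear_reports)

# Assumed layout: one row per feature (plus the intercept),
# one column per compared report.
linear_comparison.inspection.coefficients()
```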