metrics.confusion_matrix

confusion_matrix(y_true: string[] | number[], y_pred: string[] | number[], options?: { labels?: any[] }): number[][]

A confusion matrix is a technique for summarizing the performance of a classification algorithm.

Classification accuracy alone can be misleading if you have an unequal number of observations in each class or if you have more than two classes in your dataset.

Calculating a confusion matrix can give you a better idea of what your classification model is getting right and what types of errors it is making.

Usage

import { confusion_matrix } from 'machinelearn/metrics';

const matrix1 = confusion_matrix([1, 2, 3], [1, 2, 3]);
console.log(matrix1); // [ [ 1, 0, 0 ], [ 0, 1, 0 ], [ 0, 0, 1 ] ]

const matrix2 = confusion_matrix(
  ['cat', 'ant', 'cat', 'cat', 'ant', 'bird'],
  ['ant', 'ant', 'cat', 'cat', 'ant', 'cat']
);
console.log(matrix2); // [ [ 1, 2, 0 ], [ 2, 0, 0 ], [ 0, 1, 0 ] ]

Defined in metrics/classification.ts:192
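For intuition, the underlying computation can be sketched in a few lines of standalone TypeScript. This sketch follows the common scikit-learn convention (rows index the true labels, columns the predicted labels, with labels taken in sorted order); machinelearn's own ordering of rows and columns may differ, so treat this as an illustration of the idea rather than the library's implementation:

```typescript
// Sketch of a confusion-matrix computation (scikit-learn convention):
// rows = true labels, columns = predicted labels, labels in sorted order.
function confusionMatrixSketch(
  yTrue: (string | number)[],
  yPred: (string | number)[],
): number[][] {
  // Collect every label that appears at least once, in sorted order.
  const labels = Array.from(new Set([...yTrue, ...yPred])).sort();
  const index = new Map<string | number, number>();
  labels.forEach((label, i) => index.set(label, i));

  // Count each (true, predicted) pair into its cell.
  const matrix = labels.map(() => labels.map(() => 0));
  yTrue.forEach((t, i) => {
    matrix[index.get(t)!][index.get(yPred[i])!] += 1;
  });
  return matrix;
}
```

For perfectly matching inputs such as `[1, 2, 3]` vs `[1, 2, 3]`, this produces the identity-like matrix shown in the first usage example above.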

Parameters:

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| y_true | string[] or number[] | null | Ground truth (correct) target values. |
| y_pred | string[] or number[] | null | Estimated targets as returned by a classifier. |
| options.labels | any[] | null | List of labels to index the matrix. This may be used to reorder or select a subset of labels. If none is given, those that appear at least once in y_true or y_pred are used in sorted order. |
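The effect of the documented labels option can be illustrated with a small standalone helper (a sketch of the described behavior, not the library's code): pairs whose labels are not listed are dropped, and rows and columns follow the order of the given list.

```typescript
// Hypothetical sketch of the documented `labels` option: count
// (true, predicted) pairs only for the listed labels, in the given order.
function confusionMatrixWithLabels(
  yTrue: string[],
  yPred: string[],
  labels: string[],
): number[][] {
  const index = new Map<string, number>();
  labels.forEach((label, i) => index.set(label, i));

  const matrix = labels.map(() => labels.map(() => 0));
  yTrue.forEach((t, i) => {
    const row = index.get(t);
    const col = index.get(yPred[i]);
    // Skip pairs involving labels outside the requested subset.
    if (row !== undefined && col !== undefined) matrix[row][col] += 1;
  });
  return matrix;
}

// Restrict the cat/ant/bird example above to just ['ant', 'cat']:
// the single 'bird' observation is excluded from the counts.
const sub = confusionMatrixWithLabels(
  ['cat', 'ant', 'cat', 'cat', 'ant', 'bird'],
  ['ant', 'ant', 'cat', 'cat', 'ant', 'cat'],
  ['ant', 'cat'],
);
```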

Returns:

number[][] (the confusion matrix as a two-dimensional array)