---
title: Vision.Metrics
keywords: fastai
sidebar: home_sidebar
summary: "Metrics for tracking the performance of self-supervised training. They give an idea of the quality of the learned representations during training when a labeled validation set is available, and can be used to decide how much longer to keep training. Since self-supervised models usually favor longer training, this can save time and computation."
description: "Metrics for tracking the performance of self-supervised training. They give an idea of the quality of the learned representations during training when a labeled validation set is available, and can be used to decide how much longer to keep training. Since self-supervised models usually favor longer training, this can save time and computation."
nb_path: "nbs/70 - vision.metrics.ipynb"
---
{% raw %}
{% endraw %} {% raw %}
{% endraw %} {% raw %}

class KNNProxyMetric[source]

KNNProxyMetric(after_create=None, before_fit=None, before_epoch=None, before_train=None, before_batch=None, after_pred=None, after_loss=None, before_backward=None, before_step=None, after_cancel_step=None, after_step=None, after_cancel_batch=None, after_batch=None, after_cancel_train=None, after_train=None, before_validate=None, after_cancel_validate=None, after_validate=None, after_cancel_epoch=None, after_epoch=None, after_cancel_fit=None, after_fit=None) :: Callback

A metric which calculates kNN-1 (single nearest neighbor) accuracy. Use with a labeled validation set.

{% endraw %} {% raw %}
{% endraw %}
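
Conceptually, the metric embeds the labeled validation images with the encoder as it trains and checks, for each embedding, whether its nearest neighbor (excluding itself) carries the same label. A minimal sketch of that computation with cosine similarity (an illustration of the idea, not the library's exact implementation):

```python
import torch
import torch.nn.functional as F

def knn1_accuracy(embeddings, labels):
    "Leave-one-out 1-NN accuracy over a set of embeddings"
    emb = F.normalize(embeddings, dim=1)  # unit-length rows
    sim = emb @ emb.t()                   # pairwise cosine similarity
    sim.fill_diagonal_(-float('inf'))     # exclude self-matches
    nearest = sim.argmax(dim=1)           # index of each row's nearest neighbor
    return (labels[nearest] == labels).float().mean()
```

Higher values mean the representation clusters same-class samples together, which tends to track downstream linear-probe performance.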

Example Usage

{% raw %}
from self_supervised.layers import *
from self_supervised.vision.simclr import *
{% endraw %}

Create your dataset as usual, making sure the validation set has labels. For example, if 10% of your data is labeled and 90% is unlabeled, you can use that 10% (or a portion of it) as your validation set. Assign a dummy label to the 90% that has no real label so that `dls` construction doesn't break.
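
For instance, a label function along these lines keeps `Datasets` construction from breaking (a hypothetical sketch: `labeled_items` is a stand-in for however you track which files have real labels):

```python
from fastai.vision.all import *

labeled_items = set()  # hypothetical: paths of the ~10% with real labels

def get_label(path):
    "True label for labeled files, a single dummy class for the rest"
    return parent_label(path) if path in labeled_items else 'unlabeled'
```

You would then pass `get_label` in place of `parent_label` when building the `Datasets` below.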

{% raw %}
path = untar_data(URLs.MNIST_TINY)
items = get_image_files(path)
tds = Datasets(items, [PILImageBW.create, [parent_label, Categorize()]], splits=GrandparentSplitter()(items))
dls = tds.dataloaders(bs=5, after_item=[ToTensor(), IntToFloatTensor()], device='cpu')
{% endraw %}

Create your model and augmentations as usual

{% raw %}
fastai_encoder = create_encoder('xresnet18', n_in=1, pretrained=False)
model = create_simclr_model(fastai_encoder, hidden_size=2048, projection_size=128)
aug_pipelines = get_simclr_aug_pipelines(size=28, rotate=False, jitter=False, bw=False, blur=False, stats=None, cuda=False)
{% endraw %}

Define the self-supervised algorithm in `cbs`. (For quick smoke tests you can also append fastai's `ShortEpochCallback`, which cuts epochs short.)

{% raw %}
cbs=[SimCLR(aug_pipelines, temp=0.07, print_augs=True)]
Pipeline: RandomResizedCrop -> RandomHorizontalFlip
Pipeline: RandomResizedCrop -> RandomHorizontalFlip
{% endraw %}

Since KNNProxyMetric is implemented as a Callback rather than a plain function, we wrap it with fastai's ValueMetric. ValueMetric expects a function that returns a value when called; here that function is KNNProxyMetric.accuracy.

{% raw %}
knn_metric_cb = KNNProxyMetric()
cbs += [knn_metric_cb]
metric = ValueMetric(knn_metric_cb.accuracy, metric_name='knn_accuracy')
{% endraw %}

Construct the Learner with the previously defined dls, cbs, and metric.

{% raw %}
learn = Learner(dls, model, cbs=cbs, metrics=metric)
{% endraw %} {% raw %}
learn.validate()
(#2) [1.6862106323242188,0.9871244430541992]
{% endraw %}
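
The second number returned by `validate` is the `knn_accuracy`; the first is the validation loss. During training the metric is logged at the end of every epoch, so you can watch it plateau and decide when to stop:

```python
learn.fit(1)  # knn_accuracy is shown in the epoch table next to the losses;
              # real self-supervised runs train for many more epochs
```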