---
title: triplet_margin_loss
emoji: 🐠
colorFrom: blue
colorTo: blue
sdk: gradio
sdk_version: 3.0.10
app_file: app.py
pinned: false
---

# Metric Card for Triplet Margin Loss

## Metric Description

Triplet margin loss is a loss function that measures the relative similarity between samples. A triplet comprises a reference input (the anchor, a), a matching input (a positive example, p), and a non-matching input (a negative example, n). The loss for each triplet is given by L(a, p, n) = max{d(a, p) - d(a, n) + margin, 0}, where d(x, y) is the second-order (Euclidean) pairwise distance between x and y.
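The formula can be computed by hand in plain Python, a minimal sketch independent of this metric's implementation (the helper name below is illustrative, not part of the API):

```python
import math

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    """L(a, p, n) = max(d(a, p) - d(a, n) + margin, 0), Euclidean d."""
    def d(x, y):
        return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))
    return max(d(anchor, positive) - d(anchor, negative) + margin, 0.0)

# Same vectors as in the usage examples below
loss = triplet_margin_loss(
    [-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
    [0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
    [0.1971, 0.7246, 0.6729, 0.0941, 0.1011],
)
print(round(loss, 2))  # 1.59
```

Note that when the negative is already farther from the anchor than the positive by more than the margin, the loss clamps to 0 and the triplet contributes no gradient.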

## How to Use

At minimum, this metric requires anchor, positive, and negative examples.

```python
>>> import evaluate
>>> triplet_margin_loss = evaluate.load("theAIguy/triplet_margin_loss")
>>> loss = triplet_margin_loss.compute(
...     anchor=[-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
...     positive=[0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
...     negative=[0.1971, 0.7246, 0.6729, 0.0941, 0.1011])
>>> print(loss)
{'triplet_margin_loss': 1.59}
```

You may also specify a custom margin (default: 1.0).

```python
>>> triplet_margin_loss = evaluate.load("theAIguy/triplet_margin_loss")
>>> loss = triplet_margin_loss.compute(
...     anchor=[-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
...     positive=[0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
...     negative=[0.1971, 0.7246, 0.6729, 0.0941, 0.1011],
...     margin=2.0)
>>> print(loss)
{'triplet_margin_loss': 2.59}
```

## Inputs

- **anchor** (list of float): Reference inputs.
- **positive** (list of float): Matching inputs.
- **negative** (list of float): Non-matching inputs.
- **margin** (float, optional): Margin for the loss (default: 1.0).

## Output Values

- **triplet_margin_loss** (float): Total loss.

### Output Example(s)

```python
{'triplet_margin_loss': 2.59}
```

This metric outputs a dictionary containing the triplet margin loss.

## Examples

### Example 1: A simple example

```python
>>> triplet_margin_loss = evaluate.load("theAIguy/triplet_margin_loss")
>>> loss = triplet_margin_loss.compute(
...     anchor=[-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
...     positive=[0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
...     negative=[0.1971, 0.7246, 0.6729, 0.0941, 0.1011])
>>> print(loss)
{'triplet_margin_loss': 1.59}
```

### Example 2: The same as Example 1, but with the margin set to 2.0

```python
>>> triplet_margin_loss = evaluate.load("theAIguy/triplet_margin_loss")
>>> loss = triplet_margin_loss.compute(
...     anchor=[-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
...     positive=[0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
...     negative=[0.1971, 0.7246, 0.6729, 0.0941, 0.1011],
...     margin=2.0)
>>> print(loss)
{'triplet_margin_loss': 2.59}
```

## Limitations and Bias

When this loss is used to cluster data points, care must be taken to include feature-rich data points; otherwise, dissimilar data points may be clustered together. Hard negative mining is widely used alongside this loss function to penalize such data points.
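As a rough illustration of hard negative mining (the function name and toy vectors below are hypothetical, not part of this metric's API), the idea is to pick, from a pool of candidate negatives, the one currently closest to the anchor, i.e. the one that violates the margin the most and therefore yields the most informative gradient:

```python
import math

def euclidean(x, y):
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

def hardest_negative(anchor, candidates):
    # Hard negative mining: of all non-matching samples, choose the one
    # nearest the anchor, since easy (distant) negatives give zero loss.
    return min(candidates, key=lambda n: euclidean(anchor, n))

anchor = [0.0, 0.0]
negatives = [[3.0, 4.0], [1.0, 1.0], [5.0, 0.0]]
print(hardest_negative(anchor, negatives))  # [1.0, 1.0]
```

In practice this selection is done per mini-batch over embedding vectors rather than over raw inputs.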

## Citation(s)

```bibtex
@article{schultz2003learning,
  title={Learning a distance metric from relative comparisons},
  author={Schultz, Matthew and Joachims, Thorsten},
  journal={Advances in neural information processing systems},
  volume={16},
  year={2003}
}
```