Archived by the Wayback Machine: https://web.archive.org/web/20200827075616/https://github.com/topics/triplet-loss
Here are 111 public repositories matching this topic.
Siamese and triplet networks with online pair/triplet mining in PyTorch (Python, updated Aug 4, 2020)
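Most of the repositories on this page implement variants of the same objective. As a minimal, framework-free sketch of the standard triplet loss (plain Python; the function names here are illustrative, not taken from any repository listed):

```python
import math

def euclidean(u, v):
    # Euclidean distance between two embedding vectors
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Standard triplet loss: the positive should sit closer to the anchor
    # than the negative by at least `margin`; otherwise the loss is positive.
    return max(euclidean(anchor, positive) - euclidean(anchor, negative) + margin, 0.0)

# A satisfied triplet (positive much closer than negative) gives zero loss.
print(triplet_loss([0.0, 0.0], [0.1, 0.0], [1.0, 1.0]))  # -> 0.0
```

In practice the distance is computed on learned embeddings inside a deep network; the margin (0.2 here) is a tunable hyperparameter.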
Implementation of triplet loss in TensorFlow (Python, updated May 9, 2019)
Implementations of label smoothing, AM-Softmax, focal loss, and triplet loss (Cuda, updated Aug 25, 2020)
Learning to Regress 3D Face Shape and Expression from an Image without 3D Supervision (Python, updated Mar 31, 2020)
Margin Sample Mining Loss: A Deep Learning Based Method for Person Re-identification (Python, updated Jan 12, 2019)
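Several entries here revolve around how triplets are mined within a batch rather than the loss itself. As an illustration of one common strategy, here is a sketch of batch-hard mining (a widely used variant, not the exact scheme of the Margin Sample Mining Loss paper; the helper name is hypothetical):

```python
def batch_hard_triplets(dists, labels):
    # Batch-hard mining sketch: for each anchor, pick the farthest
    # same-label sample as the positive and the closest different-label
    # sample as the negative. `dists` is a full pairwise distance matrix.
    triplets = []
    for a in range(len(labels)):
        pos = [j for j in range(len(labels)) if labels[j] == labels[a] and j != a]
        neg = [j for j in range(len(labels)) if labels[j] != labels[a]]
        if not pos or not neg:
            continue  # no valid triplet for this anchor within the batch
        triplets.append((a,
                         max(pos, key=lambda j: dists[a][j]),   # hardest positive
                         min(neg, key=lambda j: dists[a][j])))  # hardest negative
    return triplets
```

Mining the hardest examples per batch tends to produce more informative gradients than sampling triplets uniformly at random.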
Person re-ID baseline with triplet loss (Python, updated Dec 3, 2019)
Unsupervised Scalable Representation Learning for Multivariate Time Series: Experiments (Jupyter Notebook, updated Aug 26, 2020)
Keras implementation of "Deep Speaker: an End-to-End Neural Speaker Embedding System" (speaker recognition) (Python, updated Apr 27, 2020)
Complete code for "Hard-Aware-Deeply-Cascaded-Embedding" (Python, updated Aug 6, 2017)
Re-implementation of the triplet loss function from FaceNet (Python, updated Jul 6, 2017)
A PyTorch implementation of Google's FaceNet [1] for training a facial recognition model with triplet loss, together with an implementation of the Shenzhen Institutes of Advanced Technology's center loss [2] combined with cross-entropy loss, trained on the VGGFace2 dataset; a pre-trained triplet-loss model is available for download. (Python, updated Aug 26, 2020)
MassFace: an efficient implementation using triplet loss for face recognition (Python, updated Sep 27, 2019)
Learning Fine-grained Image Similarity with Deep Ranking combines a multi-scale architecture with a triplet loss, producing a network that can perform image search; this repository is a simplified implementation of that approach. (Python, updated Apr 13, 2019)
Using the SigComp'11 dataset for signature verification (Jupyter Notebook, updated Oct 7, 2019)
A project for large-scale image retrieval (Python, updated Feb 9, 2018)
No description provided (Python, updated Nov 10, 2019)
Who is your doppelgänger, and more, with Keras face recognition (Jupyter Notebook, updated May 20, 2019)
R implementation of selected machine learning methods with deep learning frameworks (Keras, TensorFlow) (Shell, updated Apr 5, 2019)
Image similarity using triplet loss (Jupyter Notebook, updated Jan 28, 2020)
My solution to the Global Data Science Challenge (Jupyter Notebook, updated May 6, 2020)
Image retrieval experiment using triplet loss (Python, updated Dec 12, 2016)
One-shot learning with triplet CNNs in PyTorch (Python, updated Jul 18, 2020)
A TensorFlow Siamese network implementation, illustrated with signature recognition/identification (Python, updated Jun 25, 2019)
An example of MovieLens recommendations using triplet loss in Keras (Python, updated Aug 2, 2016)
Simple Keras implementation of triplet-center loss on the MNIST dataset (Python, updated Jul 3, 2019)
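Triplet-center loss replaces the sampled positive and negative with learned class centers: each embedding is pulled toward its own class center and pushed at least a margin away from the nearest other center. A plain-Python sketch of the idea (illustrative only, not the repository's code):

```python
import math

def triplet_center_loss(embedding, label, centers, margin=1.0):
    # Pull the embedding toward its own class center and push it at least
    # `margin` away from the nearest other class center.
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    d_own = dist(embedding, centers[label])
    d_other = min(dist(embedding, c) for lbl, c in centers.items() if lbl != label)
    return max(d_own - d_other + margin, 0.0)

# A sample near its own center and far from all others incurs zero loss.
centers = {0: [0.0, 0.0], 1: [10.0, 0.0]}
print(triplet_center_loss([0.5, 0.0], 0, centers))  # -> 0.0
```

Compared with the plain triplet loss, this needs only one comparison per sample instead of mining explicit triplets, at the cost of maintaining a center per class.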
Fine-tune and train a CNN pre-trained on the ImageNet dataset using triplet loss (Python, updated Apr 20, 2018)
No description provided (Python, updated Jan 31, 2019)
Deep Metric and Hash Code Learning Network for Content Based Retrieval of Remote Sensing Images (Python, updated Mar 1, 2020)
A generic triplet data loader for image classification problems, and a triplet-loss network demo (Python, updated Aug 6, 2020)
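A triplet data loader's core job is turning a labeled dataset into (anchor, positive, negative) index triples. A minimal sketch of that sampling step (plain Python; the helper name is hypothetical and not taken from the repository above):

```python
import random

def sample_triplet(labels, rng=random):
    # Sample (anchor, positive, negative) indices from a list of class
    # labels: the positive shares the anchor's label, the negative does not.
    # Assumes at least one class has two samples and at least two classes exist.
    by_class = {}
    for idx, lbl in enumerate(labels):
        by_class.setdefault(lbl, []).append(idx)
    anchor_class = rng.choice([c for c, idxs in by_class.items() if len(idxs) >= 2])
    negative_class = rng.choice([c for c in by_class if c != anchor_class])
    anchor, positive = rng.sample(by_class[anchor_class], 2)
    negative = rng.choice(by_class[negative_class])
    return anchor, positive, negative
```

A real loader for image classification would wrap this in a dataset class that loads and transforms the three images, but the label bookkeeping is the essential part.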