ML-helper-tools

This repo contains helper functions and skeletons for training ML models, collected so the same code does not have to be searched for and rewritten for every project.

The idea is to have a single repo to refer to for implementations that can be copied and edited instead of written from scratch; this is not meant to be a repo of ready-to-run scripts.

 

Losses

WassersteinLoss

  • Earth-mover distance is a better metric than MSELoss for small perturbations (e.g. shifts) in image data; see the sketch below.
  • Original repo
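
A minimal sketch of the underlying idea, not this repo's WassersteinLoss API: for 1D distributions the earth-mover (W1) distance reduces to the L1 distance between the CDFs, so the penalty grows smoothly with how far the mass is shifted.

```python
import torch

def wasserstein_1d(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    """W1 (earth-mover) distance between batches of 1D distributions.

    p, q: shape (batch, n), each row non-negative and summing to 1.
    For 1D distributions, W1 equals the L1 distance between the CDFs.
    """
    cdf_p = torch.cumsum(p, dim=-1)
    cdf_q = torch.cumsum(q, dim=-1)
    return (cdf_p - cdf_q).abs().sum(dim=-1).mean()

# Two point masses one bin apart: W1 reflects how far the mass moved,
# while MSE gives the same value no matter how large the shift is.
a = torch.zeros(1, 10); a[0, 4] = 1.0
b = torch.zeros(1, 10); b[0, 5] = 1.0
print(wasserstein_1d(a, b))                  # tensor(1.)
print(torch.nn.functional.mse_loss(a, b))    # tensor(0.2000)
```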

PerceptualLoss

  • Uses an encoder network to compare features extracted from the target and the generated image; useful for super-resolution (see the sketch below). paper
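
A minimal sketch assuming a pretrained VGG16 from torchvision as the encoder (downloads weights) and 3-channel, ImageNet-normalized inputs; the PerceptualLoss in this repo may use a different encoder, layers, or weighting.

```python
import torch
import torch.nn as nn
from torchvision import models

class VGGPerceptualLoss(nn.Module):
    """Compare encoder features of the generated and target images."""

    def __init__(self, layer_idx: int = 16):
        super().__init__()
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features
        self.encoder = vgg[:layer_idx].eval()          # frozen feature extractor
        for p in self.encoder.parameters():
            p.requires_grad_(False)

    def forward(self, generated: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return nn.functional.mse_loss(self.encoder(generated), self.encoder(target))

loss_fn = VGGPerceptualLoss()
loss = loss_fn(torch.rand(1, 3, 224, 224), torch.rand(1, 3, 224, 224))
```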

TotalVariationLoss

  • Jitters the image and calculates MSE (see the sketch below).
  • Used in SR paper
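
A sketch of the "jitter and MSE" idea under the assumption that the jitter is a one-pixel shift along each spatial axis (the function name and mean reduction are illustrative, not this repo's API):

```python
import torch

def total_variation_loss(img: torch.Tensor) -> torch.Tensor:
    """Smoothness penalty for a (B, C, H, W) batch: squared differences
    between each pixel and its shifted (jittered) neighbour."""
    dh = img[:, :, 1:, :] - img[:, :, :-1, :]   # vertical neighbours
    dw = img[:, :, :, 1:] - img[:, :, :, :-1]   # horizontal neighbours
    return dh.pow(2).mean() + dw.pow(2).mean()

img = torch.rand(2, 3, 64, 64, requires_grad=True)
total_variation_loss(img).backward()            # noisy images are penalized more
```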

DeepEnergyLoss

  • Maximizes the score gap between positive and negative samples rather than pushing outputs to fixed 1/0 targets (sketched below).
  • Docs
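
A hedged sketch of a contrastive, energy-style objective in this spirit; the exact form used by DeepEnergyLoss may differ, and the alpha regularizer is an assumption:

```python
import torch

def energy_contrastive_loss(real_scores: torch.Tensor,
                            fake_scores: torch.Tensor,
                            alpha: float = 0.1) -> torch.Tensor:
    """Widen the score gap between real and generated samples instead of
    regressing the scores onto fixed 0/1 targets."""
    gap = fake_scores.mean() - real_scores.mean()   # minimizing this raises real above fake
    reg = alpha * (real_scores.pow(2).mean() + fake_scores.pow(2).mean())  # keeps scores bounded
    return gap + reg

loss = energy_contrastive_loss(torch.randn(8), torch.randn(8))
```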

CycleGANLoss

  • Extracted CycleGAN loss pipeline in case we want to abstract it into something different.
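
For reference, a sketch of the cycle-consistency term at the core of that pipeline; the full CycleGAN objective also includes adversarial (and usually identity) terms, and the class in this repo may be structured differently:

```python
import torch
import torch.nn as nn

def cycle_consistency_loss(g_ab: nn.Module, g_ba: nn.Module,
                           real_a: torch.Tensor, real_b: torch.Tensor,
                           weight: float = 10.0) -> torch.Tensor:
    """Translating A -> B -> A (and B -> A -> B) should reproduce the input."""
    rec_a = g_ba(g_ab(real_a))
    rec_b = g_ab(g_ba(real_b))
    return weight * (nn.functional.l1_loss(rec_a, real_a)
                     + nn.functional.l1_loss(rec_b, real_b))

# Identity modules stand in for the two generators in this toy call.
g_ab, g_ba = nn.Identity(), nn.Identity()
loss = cycle_consistency_loss(g_ab, g_ba, torch.rand(1, 3, 32, 32), torch.rand(1, 3, 32, 32))
```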

 

Models

CycleGAN

Siamese Network

  • Classifies differences in the latent space of a shared encoder (see the sketch below).
  • This is mine :)
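
A minimal sketch of the pattern with placeholder layers (the actual architecture and comparison used here may differ): both inputs pass through one shared encoder, and a small head classifies the difference of the embeddings.

```python
import torch
import torch.nn as nn

class SiameseClassifier(nn.Module):
    def __init__(self, in_features: int = 784, latent: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_features, latent), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(latent, 1), nn.Sigmoid())

    def forward(self, x1: torch.Tensor, x2: torch.Tensor) -> torch.Tensor:
        z1, z2 = self.encoder(x1), self.encoder(x2)   # same weights for both inputs
        return self.head((z1 - z2).abs())             # classify the latent difference

model = SiameseClassifier()
same_prob = model(torch.rand(4, 784), torch.rand(4, 784))   # shape (4, 1)
```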

ResNet Generator

MultiLayer Perceptron (Linear and Convolutional)

  • ¯\_(ツ)_/¯

 

Utils

LightningWrapper for training
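
Illustrative only: the standard LightningModule pattern such a wrapper follows (GenericLightningWrapper is a hypothetical name; the signature of the actual LightningWrapper in this repo may differ).

```python
import torch
import pytorch_lightning as pl

class GenericLightningWrapper(pl.LightningModule):
    """Wrap an arbitrary torch model and loss so pl.Trainer runs the training loop."""

    def __init__(self, model: torch.nn.Module, loss_fn, lr: float = 1e-3):
        super().__init__()
        self.model, self.loss_fn, self.lr = model, loss_fn, lr

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = self.loss_fn(self.model(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)

# Usage: pl.Trainer(max_epochs=1).fit(GenericLightningWrapper(net, loss_fn), train_loader)
```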

TripletDataset

  • Handles tuples of 3 (anchor [reference], positive [similar], negative [different]); a toy sketch follows below.
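
A toy sketch of the contract (hypothetical ToyTripletDataset, not this repo's class): every item yields an anchor, a same-label sample, and a different-label sample.

```python
import random
import torch
from torch.utils.data import Dataset

class ToyTripletDataset(Dataset):
    def __init__(self, data: torch.Tensor, labels: torch.Tensor):
        self.data, self.labels = data, labels

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        anchor, label = self.data[idx], self.labels[idx]
        pos_idx = random.choice((self.labels == label).nonzero().flatten().tolist())
        neg_idx = random.choice((self.labels != label).nonzero().flatten().tolist())
        return anchor, self.data[pos_idx], self.data[neg_idx]   # (anchor, positive, negative)

labels = torch.tensor([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
ds = ToyTripletDataset(torch.rand(10, 8), labels)
anchor, positive, negative = ds[0]
```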

ABDataset

  • Matches 2 classes of data in pairs.

MulticlassDataset

  • Matches N classes of data in pairs (a toy sketch covering both cases follows).
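
A toy sketch of the pairing idea that covers both the A/B and the N-class case (hypothetical ToyPairDataset; the actual datasets likely build their combinations differently, e.g. from files or labelled datasets):

```python
import itertools
import torch
from torch.utils.data import Dataset

class ToyPairDataset(Dataset):
    """One item per combination of indices, one element drawn from each class."""

    def __init__(self, *class_tensors: torch.Tensor):
        self.class_tensors = class_tensors
        self.index_tuples = list(itertools.product(*(range(len(t)) for t in class_tensors)))

    def __len__(self):
        return len(self.index_tuples)

    def __getitem__(self, idx):
        return tuple(t[i] for t, i in zip(self.class_tensors, self.index_tuples[idx]))

pairs = ToyPairDataset(torch.rand(3, 8), torch.rand(4, 8))   # A/B case: 3 * 4 = 12 pairs
a, b = pairs[0]
```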

ZCA Whitening

  • Normalizes images so that the covariance $\Sigma$ is the identity matrix, leading to decorrelated features.
  • According to the paper, it should be applied batch-wise (a sketch follows below).
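
A minimal batch-wise sketch (ZCA via an eigendecomposition of the feature covariance; the flattening scheme and eps are assumptions, and the D x D covariance gets large for big images):

```python
import torch

def zca_whiten(batch: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """ZCA-whiten a (B, C, H, W) batch so the feature covariance is ~identity."""
    b = batch.shape[0]
    flat = batch.reshape(b, -1)
    flat = flat - flat.mean(dim=0, keepdim=True)                 # zero-center each feature
    cov = flat.T @ flat / (b - 1)                                # (D, D) covariance
    eigvals, eigvecs = torch.linalg.eigh(cov)                    # symmetric eigendecomposition
    zca = eigvecs @ torch.diag((eigvals + eps).rsqrt()) @ eigvecs.T
    return (flat @ zca).reshape_as(batch)

white = zca_whiten(torch.rand(16, 3, 8, 8))
```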
