PyTorch VAE

A collection of Variational Autoencoders (VAEs) implemented in PyTorch, with a focus on reproducibility. The aim of this project is to provide
a quick and simple working example for many of the cool VAE models out there. All the models are trained on the CelebA dataset
for consistency and comparison. The architectures of all the models are kept as similar as possible, with the same layers, except where the original paper necessitates
a radically different architecture (e.g., VQ-VAE uses residual layers and no batch normalization, unlike the other models).
The results of each model are shown below.
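
All the implementations share the same basic recipe: an encoder maps an image to the mean and log-variance of a latent Gaussian, a latent vector is drawn via the reparameterization trick, and a decoder maps it back to pixel space. The snippet below is a minimal sketch of that shared pattern in plain PyTorch; it is illustrative only, with arbitrary layer sizes and loss weighting, and is not the repository's exact code.

```python
import torch
from torch import nn
from torch.nn import functional as F

class MiniVAE(nn.Module):
    """Minimal VAE sketch for 64x64 images; illustrative, not the repo's architecture."""

    def __init__(self, in_channels: int = 3, latent_dim: int = 128):
        super().__init__()
        # Encoder: four stride-2 convolutions take 64x64 down to a 4x4 feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.LeakyReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.LeakyReLU(),
            nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.LeakyReLU(),
            nn.Flatten(),
        )
        self.fc_mu = nn.Linear(256 * 4 * 4, latent_dim)
        self.fc_var = nn.Linear(256 * 4 * 4, latent_dim)
        # Decoder: a mirror of the encoder built from transposed convolutions.
        self.decoder_input = nn.Linear(latent_dim, 256 * 4 * 4)
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 3, stride=2, padding=1, output_padding=1), nn.LeakyReLU(),
            nn.ConvTranspose2d(128, 64, 3, stride=2, padding=1, output_padding=1), nn.LeakyReLU(),
            nn.ConvTranspose2d(64, 32, 3, stride=2, padding=1, output_padding=1), nn.LeakyReLU(),
            nn.ConvTranspose2d(32, in_channels, 3, stride=2, padding=1, output_padding=1), nn.Tanh(),
        )

    def reparameterize(self, mu, log_var):
        # z = mu + sigma * eps with eps ~ N(0, I); keeps sampling differentiable.
        std = torch.exp(0.5 * log_var)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        h = self.encoder(x)
        mu, log_var = self.fc_mu(h), self.fc_var(h)
        z = self.reparameterize(mu, log_var)
        recon = self.decoder(self.decoder_input(z).view(-1, 256, 4, 4))
        return recon, mu, log_var

def vae_loss(recon, x, mu, log_var, kld_weight=1.0):
    # Negative ELBO: reconstruction term plus KL(q(z|x) || N(0, I)).
    recon_loss = F.mse_loss(recon, x)
    kld = torch.mean(-0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim=1))
    return recon_loss + kld_weight * kld
```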


Requirements

All Python dependencies are listed in requirements.txt and are installed as part of the steps below.

Installation

```sh
$ git clone https://github.com/AntixK/PyTorch-VAE
$ cd PyTorch-VAE
$ pip install -r requirements.txt
```


Usage

```sh
$ cd PyTorch-VAE
$ python run.py -c configs/<config-file-name.yaml>
```
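
The run script is driven entirely by the chosen config file. The sketch below shows the typical shape of such a config-driven entry point, assuming PyYAML and PyTorch Lightning; it is an assumption about the general flow, not run.py's exact contents, and the default config path here is only an example.

```python
# Hypothetical sketch of a config-driven runner; not run.py's exact code.
import argparse

import pytorch_lightning as pl
import yaml

parser = argparse.ArgumentParser(description="Runner for VAE experiments")
parser.add_argument("-c", "--config", dest="filename", default="configs/vae.yaml")
args = parser.parse_args()

with open(args.filename, "r") as f:
    config = yaml.safe_load(f)

# model_params pick and parameterize the model, exp_params configure data and
# optimization, and trainer_params are forwarded to the Lightning Trainer
# (the set of valid keys depends on the installed Lightning version).
print(f"Training {config['model_params']['name']}")
trainer = pl.Trainer(**config["trainer_params"])
# trainer.fit(experiment)  # 'experiment' would be a LightningModule wrapping the VAE
```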

Config file template
```yaml
model_params:
  name: "<name of the model>"
  in_channels: 3
  latent_dim: <latent dimension>
  .         # Other parameters required by the model
  .
  .

exp_params:
  data_path: "<path to the dataset>"
  img_size: 64    # Models are designed to work for this size
  batch_size: 64  # Better to have a square number
  LR: 0.005
  weight_decay:
  .         # Other arguments required for training, like scheduler etc.
  .
  .

trainer_params:
  gpus: 1
  max_nb_epochs: 50
  gradient_clip_val: 1.5
  .
  .
  .

logging_params:
  save_dir: "logs/"
  name: "<experiment name>"
  manual_seed:
```


View TensorBoard Logs
```sh
$ cd logs/<experiment name>/version_<the version you want>
$ tensorboard --logdir tf
```

Results

| Model                                    | Paper | Reconstruction | Samples |
|------------------------------------------|-------|----------------|---------|
| VAE (Code, Config)                       | Link  |                |         |
| Conditional VAE (Code, Config)           | Link  |                |         |
| WAE - MMD (RBF Kernel) (Code, Config)    | Link  |                |         |
| WAE - MMD (IMQ Kernel) (Code, Config)    | Link  |                |         |
| Beta-VAE (Code, Config)                  | Link  |                |         |
| Disentangled Beta-VAE (Code, Config)     | Link  |                |         |
| Beta-TC-VAE (Code, Config)               | Link  |                |         |
| IWAE (K = 5) (Code, Config)              | Link  |                |         |
| MIWAE (K = 5, M = 3) (Code, Config)      | Link  |                |         |
| DFCVAE (Code, Config)                    | Link  |                |         |
| MSSIM VAE (Code, Config)                 | Link  |                |         |
| Categorical VAE (Code, Config)           | Link  |                |         |
| Joint VAE (Code, Config)                 | Link  |                |         |
| Info VAE (Code, Config)                  | Link  |                |         |
| LogCosh VAE (Code, Config)               | Link  |                |         |
| SWAE (200 Projections) (Code, Config)    | Link  |                |         |
| VQ-VAE (K = 512, D = 64) (Code, Config)  | Link  |                | N/A     |
| DIP VAE (Code, Config)                   | Link  |                |         |
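
Sample grids like the ones tabulated above are produced by decoding latent vectors drawn from the standard-normal prior (the N/A entry for VQ-VAE reflects that its discrete latents need a separately learned prior to sample from). Below is a minimal sketch of that step, reusing the hypothetical MiniVAE from the snippet earlier; the repository's models expose their own interfaces, which may differ, and in practice a trained checkpoint would be loaded first.

```python
import torch
import torchvision.utils as vutils

# Assumes the hypothetical MiniVAE sketch from above is in scope.
model = MiniVAE(in_channels=3, latent_dim=128)
# In practice: model.load_state_dict(torch.load("<checkpoint path>")) first.
model.eval()

with torch.no_grad():
    z = torch.randn(64, 128)  # 64 draws from the N(0, I) prior
    samples = model.decoder(model.decoder_input(z).view(-1, 256, 4, 4))
    vutils.save_image(samples, "samples.png", normalize=True, nrow=8)
```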

Contributing

If you have trained a better model using these implementations by fine-tuning the hyper-parameters in the config file,
I would be happy to include your results (along with your config file) in this repo, citing your name 😊.


Additionally, if you would like to contribute some models, please submit a PR.


License

Apache License 2.0


| Permissions | Limitations | Conditions |
|------------------|-------------------|----------------------------------|
| ✔ Commercial use | ❌ Trademark use | ⓘ License and copyright notice |
| ✔ Modification | ❌ Liability | ⓘ State changes |
| ✔ Distribution | ❌ Warranty | |
| ✔ Patent use | | |
| ✔ Private use | | |


Citation

```bibtex
@misc{Subramanian2020,
  author = {Subramanian, A.K},
  title = {PyTorch-VAE},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/AntixK/PyTorch-VAE}}
}
```