H-GAN: the power of GANs in your Hands

Published in IJCNN, 2021

  1. Full citation

    Oprea, S., Karvounas, G., Martinez-Gonzalez, P., Kyriazis, N., Orts-Escolano, S., Oikonomidis, I., Garcia-Garcia, A., Tsoli, A., Garcia-Rodriguez, J., & Argyros, A. (2021, July). H-GAN: the power of GANs in your Hands. IEEE International Joint Conference on Neural Networks (IJCNN 2021), to appear. Also available on arXiv (CoRR).

    Abstract

    We present HandGAN (H-GAN), a cycle-consistent adversarial learning approach implementing multi-scale perceptual discriminators. It is designed to translate synthetic images of hands to the real domain. Synthetic hands provide complete ground-truth annotations, yet they are not representative of the target distribution of real-world data. We strive to provide the perfect blend of a realistic hand appearance with synthetic annotations. Relying on image-to-image translation, we improve the appearance of synthetic hands to approximate the statistical distribution underlying a collection of real images of hands. H-GAN tackles not only cross-domain tone mapping but also structural differences in localized areas such as shading discontinuities. Results are evaluated on a qualitative and quantitative basis, improving on previous works. Furthermore, we successfully apply the generated images to the hand classification task.
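    For readers unfamiliar with cycle-consistent translation, below is a minimal PyTorch sketch of the generator-side objective (adversarial plus cycle-consistency loss) used in this kind of setup. The module and variable names (TinyGenerator, TinyDiscriminator, G_s2r, G_r2s) are illustrative stand-ins, not the actual H-GAN architecture, which additionally relies on multi-scale perceptual discriminators.

    ```python
    # Minimal sketch of a cycle-consistent translation objective.
    # Hypothetical toy modules; NOT the authors' implementation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyGenerator(nn.Module):
        """Toy image-to-image generator (stand-in for the real architecture)."""
        def __init__(self, channels=3):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, channels, 3, padding=1), nn.Tanh(),
            )
        def forward(self, x):
            return self.net(x)

    class TinyDiscriminator(nn.Module):
        """Toy patch discriminator producing a real/fake score map."""
        def __init__(self, channels=3):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(channels, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                nn.Conv2d(32, 1, 4, stride=2, padding=1),
            )
        def forward(self, x):
            return self.net(x)

    def synth_to_real_loss(G_s2r, G_r2s, D_real, synth, lambda_cyc=10.0):
        """Generator loss for the synthetic->real direction of a cycle GAN."""
        fake_real = G_s2r(synth)        # translate synthetic hands to the real domain
        rec_synth = G_r2s(fake_real)    # map the translation back to the synthetic domain
        # Least-squares adversarial term: fool the real-domain discriminator
        score = D_real(fake_real)
        adv = F.mse_loss(score, torch.ones_like(score))
        # Cycle-consistency term: the round trip should reconstruct the input
        cyc = F.l1_loss(rec_synth, synth)
        return adv + lambda_cyc * cyc

    if __name__ == "__main__":
        G_s2r, G_r2s, D_real = TinyGenerator(), TinyGenerator(), TinyDiscriminator()
        synth = torch.rand(2, 3, 64, 64) * 2 - 1   # dummy batch of synthetic hand crops
        print(synth_to_real_loss(G_s2r, G_r2s, D_real, synth).item())
    ```

    A symmetric loss is computed for the real-to-synthetic direction, and the discriminators are trained with the complementary objective; the paper's contribution lies in how the discriminators judge realism at multiple perceptual scales rather than in this basic cycle structure.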

    Presentation video

    Code

    The code can be found here.