mixup: Beyond Empirical Risk Minimization
Authors
Hongyi Zhang, Moustapha Cisse, Yann N. Dauphin, David Lopez-Paz
Venue
ICLR 2018
Abstract
Introduces mixup, a data augmentation technique that trains on convex combinations of pairs of inputs and their labels. This simple, data-independent, and model-agnostic approach improves generalization and robustness (a minimal code sketch follows the BibTeX entries below).
Tags
Links
https://arxiv.org/abs/1710.09412
BibTeX
Local Entry
@inproceedings{zhang2018mixup,
title = {mixup: Beyond Empirical Risk Minimization},
author = {Hongyi Zhang and Moustapha Cisse and Yann N. Dauphin and David Lopez-Paz},
year = {2018},
booktitle = {International Conference on Learning Representations (ICLR)},
url = {https://arxiv.org/abs/1710.09412},
abstract = {Introduces mixup, a data augmentation technique that trains on convex combinations of pairs of inputs and their labels. This simple, data-independent, and model-agnostic approach improves generalization and robustness.}
}
From OpenAlex
@inproceedings{zhang2018mixup,
title = {mixup: Beyond Empirical Risk Minimization},
author = {Hongyi Zhang and Moustapha Cissé and Yann Dauphin and David Lopez-Paz},
year = {2017},
booktitle = {arXiv (Cornell University)},
doi = {10.48550/arxiv.1710.09412}
}
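Code Sketch
The abstract describes training on convex combinations of input pairs and their labels: given two examples (x_i, y_i) and (x_j, y_j), mixup trains on x̃ = λx_i + (1 − λ)x_j and ỹ = λy_i + (1 − λ)y_j, with λ drawn from Beta(α, α). Below is a minimal NumPy sketch under those definitions; the function name mixup_batch, the variable names, and the choice of a single λ per batch (mixing a batch with a shuffled copy of itself) are illustrative conventions, not code from the paper.

# Minimal sketch of mixup (Zhang et al., 2018); names here are illustrative.
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Return convex combinations of a batch and a shuffled copy of it.

    x     : (batch, ...) array of inputs
    y     : (batch, num_classes) array of one-hot labels
    alpha : Beta distribution parameter; lambda ~ Beta(alpha, alpha)
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)        # mixing coefficient in [0, 1]
    perm = rng.permutation(len(x))      # pair each example with a random partner
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]
    return x_mix, y_mix

# Example: mix a toy batch of four 2-feature inputs with 3 classes.
x = np.random.rand(4, 2).astype(np.float32)
y = np.eye(3, dtype=np.float32)[[0, 1, 2, 0]]   # one-hot labels
x_mix, y_mix = mixup_batch(x, y, alpha=0.2)
print(x_mix.shape, y_mix.shape)                 # (4, 2) (4, 3)

Because the mixed labels ỹ are soft (convex combinations of one-hot vectors), the mixed batch is typically trained with a cross-entropy loss that accepts soft targets; equivalently, the loss can be computed as λ·loss(pred, y_i) + (1 − λ)·loss(pred, y_j).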