mixup: Beyond Empirical Risk Minimization
Authors
Hongyi Zhang, Moustapha Cisse, Yann N. Dauphin, David Lopez-Paz
Venue
ICLR 2018
Abstract
Introduces mixup, a data augmentation technique that trains on convex combinations of input pairs and their labels. A simple, data-independent, and model-agnostic approach that improves generalization and robustness.
Tags
Links
https://arxiv.org/abs/1710.09412
BibTeX
Local Entry
@inproceedings{zhang2018mixup,
title = {mixup: Beyond Empirical Risk Minimization},
author = {Hongyi Zhang and Moustapha Cisse and Yann N. Dauphin and David Lopez-Paz},
year = {2018},
booktitle = {International Conference on Learning Representations (ICLR)},
url = {https://arxiv.org/abs/1710.09412},
abstract = {Introduces mixup, a data augmentation technique that trains on convex combinations of input pairs and their labels. Simple, data-independent, and model-agnostic approach that improves generalization and robustness.}
}
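The augmentation summarized in the abstract — training on convex combinations of input pairs and their labels — can be sketched in a few lines. This is an illustrative NumPy implementation, not the authors' reference code; the permutation-based pairing (mixing each batch with a shuffled copy of itself) is a common implementation choice assumed here, and `mixup_batch` is a hypothetical helper name.

```python
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Return a mixup-augmented batch.

    x : (batch, ...) array of inputs.
    y : (batch, num_classes) array of one-hot labels.
    alpha : Beta-distribution parameter; the mixing weight is drawn
            as lam ~ Beta(alpha, alpha), as described in the paper.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)          # single mixing coefficient in [0, 1]
    idx = rng.permutation(len(x))         # random pairing via a shuffled copy
    x_mix = lam * x + (1 - lam) * x[idx]  # convex combination of inputs
    y_mix = lam * y + (1 - lam) * y[idx]  # same combination of labels
    return x_mix, y_mix, lam
```

Because the labels are mixed with the same coefficient as the inputs, each output label row remains a valid probability distribution (it still sums to 1), so the standard cross-entropy loss can be applied unchanged.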