Coresets for Data-efficient Training of Machine Learning Models
Authors
Baharan Mirzasoleiman, Jeff Bilmes, Jure Leskovec
Venue
International Conference on Machine Learning (ICML)
Abstract
Introduces CRAIG (Coresets for Accelerating Incremental Gradient descent), which selects weighted subsets of training data whose weighted gradients closely approximate the full gradient, yielding 2-3x training speedups while maintaining accuracy.
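For orientation, a minimal NumPy sketch of the selection idea as the abstract describes it: greedily pick medoids whose gradients cover the rest of the data (a facility-location objective), then weight each medoid by how many points it covers. The function name, the use of precomputed per-example gradient features, and the O(n^2) pairwise distances are illustrative assumptions here, not the paper's exact implementation.

import numpy as np

def craig_coreset(grads: np.ndarray, k: int):
    """Greedy facility-location selection over per-example gradients.

    grads: (n, d) array of per-example gradient features (an assumption;
    the paper works with cheap upper bounds on gradient differences).
    Returns selected indices and per-element weights for weighted SGD.
    """
    n = grads.shape[0]
    # Similarity = max distance minus distance, so values are nonnegative
    # and a point is best "covered" by its nearest selected medoid.
    dists = np.linalg.norm(grads[:, None, :] - grads[None, :, :], axis=-1)
    sims = dists.max() - dists

    best = np.zeros(n)  # each point's coverage by the current subset
    selected: list[int] = []
    for _ in range(k):
        # Marginal facility-location gain of adding each candidate j.
        gains = np.maximum(sims, best[:, None]).sum(axis=0) - best.sum()
        gains[selected] = -np.inf  # do not reselect an element
        j = int(np.argmax(gains))
        selected.append(j)
        best = np.maximum(best, sims[:, j])

    # Weight each medoid by the number of points it covers best, so the
    # weighted coreset gradient approximates the full gradient.
    assign = sims[:, selected].argmax(axis=1)
    weights = np.bincount(assign, minlength=k).astype(float)
    return np.array(selected), weights

# Example: select a 10-point weighted coreset from 200 random gradients.
rng = np.random.default_rng(0)
idx, w = craig_coreset(rng.normal(size=(200, 5)), k=10)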
Tags
Links
BibTeX
Local Entry
@inproceedings{mirzasoleiman2020craig,
title = {Coresets for Data-efficient Training of Machine Learning Models},
author = {Baharan Mirzasoleiman and Jeff Bilmes and Jure Leskovec},
year = {2020},
booktitle = {International Conference on Machine Learning (ICML)},
url = {https://proceedings.mlr.press/v119/mirzasoleiman20a.html},
abstract = {Introduces CRAIG (Coresets for Accelerating Incremental Gradient descent), which selects weighted subsets of training data whose weighted gradients closely approximate the full gradient, yielding 2-3x training speedups while maintaining accuracy.}
}
From OpenAlex
@misc{mirzasoleiman2019craig,
title = {Coresets for Data-efficient Training of Machine Learning Models},
author = {Baharan Mirzasoleiman and Jeff Bilmes and Jure Leskovec},
year = {2019},
eprint = {1906.01827},
archiveprefix = {arXiv},
doi = {10.48550/arxiv.1906.01827}
}