Data Leverage References


Coresets for Data-efficient Training of Machine Learning Models

2020 · inproceedings · mirzasoleiman2020craig
Authors
Baharan Mirzasoleiman, Jeff Bilmes, Jure Leskovec
Venue
International Conference on Machine Learning (ICML)
Abstract
Introduces CRAIG (Coresets for Accelerating Incremental Gradient descent), which selects weighted data subsets whose gradients approximate the full-dataset gradient, yielding 2-3x training speedups while maintaining performance.
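The selection idea described in the abstract can be sketched as greedy facility-location maximization over per-example gradient estimates: pick k representative points, then weight each by the number of examples it best represents. This is a simplified, illustrative stand-in for CRAIG's actual submodular formulation (the function name and the distance-based similarity are assumptions, not the paper's code):

```python
import numpy as np

def craig_coreset(grads, k):
    """Greedy facility-location selection (sketch of the coreset idea).

    grads: (n, d) array of per-example gradient estimates.
    k: coreset size.
    Returns (indices, weights); each selected point's weight is the
    number of examples whose gradient it represents best.
    """
    n = grads.shape[0]
    # Similarity = negative pairwise Euclidean distance between gradients.
    sims = -np.linalg.norm(grads[:, None, :] - grads[None, :, :], axis=-1)
    selected = []
    best = np.full(n, -1e12)  # best similarity of each example to the set
    for _ in range(k):
        # Marginal gain of adding candidate j to the selected set.
        gains = np.maximum(sims - best[:, None], 0.0).sum(axis=0)
        j = int(np.argmax(gains))
        selected.append(j)
        best = np.maximum(best, sims[:, j])
    # Assign every example to its most similar selected point.
    assign = np.argmax(sims[:, selected], axis=1)
    weights = np.bincount(assign, minlength=k)
    return np.array(selected), weights
```

Training then runs incremental gradient steps on the selected examples, scaling each example's gradient by its weight so the weighted sum tracks the full gradient.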

BibTeX

Local Entry
@inproceedings{mirzasoleiman2020craig,
  title = {Coresets for Data-efficient Training of Machine Learning Models},
  author = {Baharan Mirzasoleiman and Jeff Bilmes and Jure Leskovec},
  year = {2020},
  booktitle = {International Conference on Machine Learning (ICML)},
  url = {https://proceedings.mlr.press/v119/mirzasoleiman20a.html},
  abstract = {Introduces CRAIG (Coresets for Accelerating Incremental Gradient descent), which selects weighted data subsets whose gradients approximate the full-dataset gradient, yielding 2-3x training speedups while maintaining performance.}
}
From OpenAlex
@inproceedings{mirzasoleiman2020craig,
  title = {Coresets for Data-efficient Training of Machine Learning Models},
  author = {Baharan Mirzasoleiman and Jeff Bilmes and Jure Leskovec},
  year = {2019},
  booktitle = {arXiv (Cornell University)},
  doi = {10.48550/arxiv.1906.01827}
}