Coresets for Data-efficient Training of Machine Learning Models
Authors
Baharan Mirzasoleiman, Jeff Bilmes, Jure Leskovec
Venue
International Conference on Machine Learning (ICML)
Abstract
Introduces CRAIG (Coresets for Accelerating Incremental Gradient descent), which selects weighted subsets of training data whose gradients closely approximate the full gradient, yielding 2-3x training speedups while maintaining model performance.
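The subset-selection idea in the abstract can be illustrated with a greedy facility-location pass over per-example gradients: pick points whose gradients best "cover" everyone else's, then weight each pick by how many examples it represents. This is a minimal sketch under assumed inputs (a precomputed `grads` matrix of per-example gradients), not the paper's implementation, which works with efficiently computable gradient upper bounds rather than exact gradients.

```python
import numpy as np

def greedy_gradient_coreset(grads, k):
    """Sketch of CRAIG-style selection: greedily choose k examples whose
    gradients cover the full set, via facility-location maximization.
    grads: (n, d) array of per-example gradients (an assumed input)."""
    n = len(grads)
    # Pairwise gradient distances, shifted into non-negative similarities.
    dists = np.linalg.norm(grads[:, None, :] - grads[None, :, :], axis=-1)
    sims = dists.max() - dists
    selected, coverage = [], np.zeros(n)
    for _ in range(k):
        # Marginal facility-location gain of adding each candidate point.
        gains = np.maximum(sims, coverage).sum(axis=1) - coverage.sum()
        best = int(np.argmax(gains))
        selected.append(best)
        coverage = np.maximum(coverage, sims[best])
    # Weight each selected example by how many points it covers best,
    # so the weighted coreset gradient approximates the full-data gradient.
    assign = np.argmax(sims[selected], axis=0)
    weights = np.bincount(assign, minlength=k)
    return selected, weights
```

During training, each selected example's loss would be scaled by its weight, so a gradient step on the coreset approximates a step on the full dataset at a fraction of the cost.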
Tags
Links
BibTeX
Local Entry
@inproceedings{mirzasoleiman2020craig,
title = {Coresets for Data-efficient Training of Machine Learning Models},
author = {Baharan Mirzasoleiman and Jeff Bilmes and Jure Leskovec},
year = {2020},
booktitle = {International Conference on Machine Learning (ICML)},
url = {https://proceedings.mlr.press/v119/mirzasoleiman20a.html},
abstract = {Introduces CRAIG (Coresets for Accelerating Incremental Gradient descent), which selects weighted subsets of training data whose gradients closely approximate the full gradient, yielding 2-3x training speedups while maintaining model performance.}
}