Research
Our research has been generously supported by ARO, NSF, AFRL, IARPA, BlueHalo, and Salesforce.
2019
Li, Xilai; Zhou, Yingbo; Wu, Tianfu; Socher, Richard; Xiong, Caiming
Learn to Grow: A Continual Structure Learning Framework for Overcoming Catastrophic Forgetting (Proceedings Article)
In: International Conference on Machine Learning (ICML), 2019.
@inproceedings{Learn2grow,
  title     = {Learn to Grow: A Continual Structure Learning Framework for Overcoming Catastrophic Forgetting},
  author    = {Xilai Li and Yingbo Zhou and Tianfu Wu and Richard Socher and Caiming Xiong},
  booktitle = {International Conference on Machine Learning (ICML)},
  year      = {2019},
  url       = {https://arxiv.org/abs/1904.00310}
}
Addressing catastrophic forgetting is one of the key challenges in continual learning, where machine learning systems are trained on sequential or streaming tasks. Despite recent remarkable progress in state-of-the-art deep learning, deep neural networks (DNNs) are still plagued by the catastrophic forgetting problem. This paper presents a conceptually simple yet general and effective framework for handling catastrophic forgetting in continual learning with DNNs. The proposed method consists of two components: a neural structure optimization component and a parameter learning and/or fine-tuning component. By separating explicit neural structure learning from parameter estimation, the proposed method is not only capable of evolving neural structures in an intuitively meaningful way, but also shows a strong ability to alleviate catastrophic forgetting in experiments. Furthermore, the proposed method outperforms all other baselines on the permuted MNIST dataset, the split CIFAR100 dataset, and the Visual Domain Decathlon dataset in the continual learning setting.
- https://arxiv.org/abs/1904.00310
- https://news.ncsu.edu/2019/05/ai-continual-learning-framework/
- https://www.army.mil/article/222090/army_funded_research_boosts_memory_of_ai_systems
- https://news.science360.gov/archives/20190517
- https://techxplore.com/news/2019-05-framework-artificial-intelligence.html
- https://www.wraltechwire.com/2019/05/15/researchers-create-framework-to-help-artificial-intelligence-systems-be-less-forgetful/
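The abstract above describes two alternating components: a structure search that decides, per layer, how the network should grow for an incoming task, and an ordinary parameter learning/fine-tuning step. Below is a minimal PyTorch sketch of that idea, assuming a DARTS-style mixture over per-layer "reuse / adapt / new" candidates; the class and variable names (MixedLayer, the adapter design, the optimizer settings) are illustrative assumptions, not the authors' released code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedLayer(nn.Module):
    # For a new task, mix three candidates for one layer of the existing
    # network: 'reuse' (frozen shared weights), 'adapt' (frozen weights plus
    # a small trainable adapter), and 'new' (freshly initialized weights).
    # Architecture logits alpha weight the mixture during the search.
    def __init__(self, shared):
        super().__init__()
        self.shared = shared                       # weights learned on earlier tasks
        for p in self.shared.parameters():         # 'reuse'/'adapt' leave them frozen
            p.requires_grad_(False)
        self.adapter = nn.Linear(shared.out_features, shared.out_features)
        self.new = nn.Linear(shared.in_features, shared.out_features)
        self.alpha = nn.Parameter(torch.zeros(3))  # logits over [reuse, adapt, new]

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        h = self.shared(x)
        return w[0] * h + w[1] * self.adapter(h) + w[2] * self.new(x)

layer = MixedLayer(nn.Linear(784, 256))            # e.g. a permuted-MNIST-sized input
head = nn.Linear(256, 10)                          # task-specific classifier head
arch_opt = torch.optim.Adam([layer.alpha], lr=3e-4)
param_opt = torch.optim.SGD(
    list(layer.adapter.parameters()) + list(layer.new.parameters())
    + list(head.parameters()), lr=0.01)

def search_step(x_train, y_train, x_val, y_val):
    # Component 1: structure optimization -- update alpha on validation data.
    arch_opt.zero_grad()
    F.cross_entropy(head(F.relu(layer(x_val))), y_val).backward()
    arch_opt.step()
    # Component 2: parameter learning -- update weights on training data.
    param_opt.zero_grad()
    F.cross_entropy(head(F.relu(layer(x_train))), y_train).backward()
    param_opt.step()

After the search converges, one would keep only the argmax candidate per layer (torch.argmax(layer.alpha)) and fine-tune the retained task-specific parameters on the new task, which corresponds to the paper's second component; reused layers stay frozen, so earlier tasks are not forgotten.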