Systems | Information | Learning | Optimization
SILO: Searching for architectures and BERT moments in specialized AI applications

Abstract: In 2018, advances in architecture design and self-supervised learning led to the “BERT moment” in natural language processing, in which supervised learning workflows were permanently supplanted by the pretraining and fine-tuning of massive Transformer models. This spurred scientists in more specialized areas (e.g., genomics, satellite imaging, and time series forecasting) to develop …