Need Computing Resources? Take a Queue Token!

Training large neural models for speech and language processing (NLP) requires not only a lot of input data (read here on why and here on how SELMA is handling this) but also a lot of computing resources. Nowadays, a fair share of computing resources are...
How to satisfy data-hungry machine learning

Machine learning requires large quantities of labeled training data (for more insights, read this post). That means that, to reach acceptable performance, training current speech recognition systems demands thousands of hours of transcribed speech. For...