Microsoft's Bing team reshapes Google's BERT in their own Azure-powered image • DEVCLASS

Researchers at Microsoft’s Bing organisation have open sourced a brace of recipes for pre-training and fine-tuning BERT, the NLP model which Google itself open sourced just last November. Google describes BERT as “the first deeply bidirectional, unsupervised language representation, pre-trained only using a plain text corpus” – the corpus in question being Wikipedia. Wikipedia’s collective …
July 21, 2019