Issue 487 · Week of May 05, 2026

Microsoft's Bing team reshape Google's BERT in their own Azure-powered image • DEVCLASS

Researchers at Microsoft’s Bing organisation have open sourced a brace of recipes for pre-training and fine-tuning BERT, the NLP model which Google itself open sourced just last November. Google describes BERT as “the first deeply bidirectional, unsupervised language representation, pre-trained only using a plain text corpus” – the corpus in question being Wikipedia. Wikipedia’s collective …
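The “unsupervised, plain text corpus” pre-training Google describes rests on BERT’s masked-language-model objective: hide a fraction of the tokens and train the model to predict each one from context on both sides, which is what makes the representation “deeply bidirectional.” As a rough illustration only (this is the scheme from Google’s published BERT work, not Microsoft’s Azure recipe; the tokens and vocabulary below are invented), the masking step can be sketched as:

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, vocab, mask_prob=0.15, rng=random):
    """Return (masked_tokens, labels); labels[i] is None where no loss applies."""
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # model must predict the original token here
            r = rng.random()
            if r < 0.8:
                masked.append(MASK_TOKEN)         # 80%: replace with [MASK]
            elif r < 0.9:
                masked.append(rng.choice(vocab))  # 10%: replace with a random token
            else:
                masked.append(tok)                # 10%: keep the original token
        else:
            masked.append(tok)
            labels.append(None)  # position not selected: no prediction target
    return masked, labels

if __name__ == "__main__":
    vocab = ["the", "cat", "sat", "on", "mat", "dog"]
    toks = "the cat sat on the mat".split()
    masked, labels = mask_tokens(toks, vocab, rng=random.Random(0))
    print(masked)
    print(labels)
```

Fine-tuning then swaps this objective for a small task-specific head (classification, QA, and so on) trained on labelled data, which is the half of the workflow the Bing recipes also cover.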