This week, Microsoft and Nvidia announced the Megatron-Turing Natural Language Generation model, which they call the “most powerful monolithic transformer language model trained to date”. Dr. Nicole Ackermans breaks down the science of language models to help us understand the implications of this announcement.
You can get an ad-free feed of Daily Tech Headlines for $3 a month here.
A special thanks to all our supporters. Without you, none of this would be possible.
Big thanks to Dan Lueders for the theme music.
Big thanks to Mustafa A. from thepolarcat.com for the logo!
Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit.
Send us email at [email protected]