Nvidia and Microsoft develop 530 billion parameter AI model, but it still suffers from bias
Nvidia and Microsoft have developed a 530-billion-parameter AI model, but it still suffers from bias.
The pair claim their Megatron-Turing Natural Language Generation (MT-NLG) model is the "most powerful monolithic transformer language model trained to date".
For comparison, OpenAI’s much-lauded GPT-3 has 175 billion parameters.
The duo trained the model on 15 datasets totalling 339 billion tokens. Various sampling weights...