GPT-NeoX-20B - Open-Source huge language model by EleutherAI (Interview w/ co-founder Connor Leahy)
#eleuther #gptneo #gptj
EleutherAI announces GPT-NeoX-20B, a 20-billion-parameter open-source language model in the spirit of GPT-3. Co-founder Connor Leahy joins me to discuss the training process, how the group obtained the necessary hardware, what the new model can do, and how anyone can try it out!
OUTLINE:
0:00 - Intro
1:00 - Start of interview
2:00 - How did you get all the hardware?
3:50 - What’s the scale of this model?
6:00 - A look into the experimental results
11:15 - Why are there GPT-Neo, GPT-J, and GPT-NeoX?
14:15 - How difficult is training these big models?
17:00 - Try out the model on GooseAI
19:00 - Final thoughts
Read the announcement:
Try out the model:
Check out EleutherAI:
Read the code:
Hardware sponsor:
Links:
TabNine Code Completion (Referral):