GPT-J, an open-source mimic of the GPT-3 neural net

Sunday 4th July, 2021 - Bruce Sterling

*This seems like it might be significant.


https://towardsdatascience.com/cant-access-gpt-3-here-s-gpt-j-its-open-source-cousin-8af86a638b11

(…)

EleutherAI project: Open-sourcing AI research

The project was born in July 2020 as a quest to replicate OpenAI's GPT-family models. A group of researchers and engineers decided to give OpenAI a "run for their money," and so the project began.

Their ultimate goal is to replicate GPT-3-175B and break the "OpenAI-Microsoft monopoly" on transformer-based language models.
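Part of what makes GPT-J notable is that, unlike GPT-3, anyone can download and run it. As a minimal, hypothetical sketch (it assumes a transformers library version with GPT-J support and enough memory for the roughly 24 GB float32 checkpoint), loading the 6-billion-parameter model from the Hugging Face Hub might look like this:

```python
# Minimal sketch: loading EleutherAI's 6B-parameter GPT-J from the
# Hugging Face Hub. Assumes a transformers version with GPT-J support
# and enough memory for the checkpoint (~24 GB in float32).
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

prompt = "EleutherAI was born in July 2020 as a quest to"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation, GPT-3-style.
output_ids = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,  # GPT-J defines no pad token by default
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The contrast with GPT-3 is the point: the same completion task behind OpenAI's gated API reduces here to two library calls against a publicly downloadable checkpoint.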

Since the transformer was invented in 2017, we've seen a growing effort to build powerful language models. GPT-3 is the one that became a superstar, but companies and institutions all over the world are competing to find an edge that would let them claim a hegemonic position. In the words of Alexander Rush, a computer science professor at Cornell University, "There is something akin to an NLP space race going on."

Because powerful language models need huge amounts of computing power, big tech companies are best positioned to tackle the challenge. But they put their need for profit ahead of their interest in advancing science and helping humanity toward a better future. OpenAI started as a non-profit organization but soon realized they'd need to change their approach to fund their projects. As a result, they partnered with Microsoft and received $1 billion.

Now, OpenAI has to navigate between the commercial requirements imposed by Microsoft and its original mission.

EleutherAI is trying to compete with these two AI giants, and others, with help from Google and CoreWeave, their cloud computing providers. OpenAI's models and their specific characteristics aren't public, so EleutherAI's researchers are trying to solve the puzzle by combining their extensive knowledge with the sparse bits of information OpenAI has published in its papers (GPT-1, GPT-2, GPT-3, and others)….