A program that can automate website development.
A bot that writes letters on behalf of nature.

An AI-written blog that trended on Hacker News.
But what has been less discussed is how GPT-3 has transformed OpenAI itself.
And hanging in the balance is the very mission for which OpenAI was founded.

In March 2019, OpenAI announced that it would be transitioning from a non-profit lab to a capped-profit company.
But why the structural change?

The key phrase here is "talent and compute."
Talent and compute costs are two of the key challenges of AI research.
The talent pool for the kind of research OpenAI does is very small.

Andrej Karpathy, another AI genius, works at Tesla.
DeepMind, another AI research lab, reported paying more than $483 million to its 700 employees in 2018.
According to one estimate, training GPT-3 would cost at least $4.6 million.

And to be clear, training deep learning models is not a clean, one-shot process.
There's a lot of trial and error and hyperparameter tuning, which would probably increase the cost several-fold.
OpenAI is not the first AI research lab to adopt a commercial model.
Facing similar problems, DeepMind accepted a $650-million acquisition proposal from Google in 2014.
Before Altman, Greg Brockman was the face of the organization.
Brockman, co-founder and CTO of OpenAI, is a seasoned scientist and engineer.
But in the tech investment space, reputation and product management skills are much more valued than scientific genius.
And Altman is exactly the kind of person investors trust with their money.
During his tenure at Y Combinator, he helped launch many successful companies including Airbnb and Dropbox.
In an interview with TechCrunch in May 2019, Altman said, "We have never made any revenue. We have no current plans to make revenue. We have no idea how we may one day generate revenue."
But this didn't deter investors from pouring money into OpenAI.
But AGI is a lofty goal that is at least decades away by expert estimates.
And tech investors are not known for their decades-long patience.
How will OpenAI strike the right balance between AGI research and keeping its funders satisfied?
But there are clear signs that OpenAI is becoming, at least in part, a product company.
Microsoft tapped into OpenAI's talent to create what Altman described as "our dream system."
Less than two weeks later, the first version of the GPT-3 paper was published on the arXiv preprint server.
Unlike its predecessor GPT-2, GPT-3 will not be released to the public.
The OpenAI API announcement was made on June 11, though some developers were given early access to the technology.
This will at least help OpenAI return some of the investment Microsoft has made in the company.
The commercial release of GPT-3 brings OpenAI one step closer to becoming an AI product company.
And that's one step away from nonprofit, scientific AI research.
GPT-3, at 175 billion parameters, is more than two orders of magnitude larger than the 1.5-billion-parameter GPT-2.
One of the key problems in deep learning language models is memory span.
The AI starts to lose coherence as the text it generates becomes longer.
For their part, OpenAI executives have tried to downplay the hype surrounding GPT-3.
In July, Sam Altman dismissed the GPT-3 hype in a tweet.
"The GPT-3 hype is way too much. It's impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes," Altman wrote. "AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot still to figure out."
Many developers and entrepreneurs have posted tweets of GPT-3 generating poems, memes, tweets, and website mockups.
GPT-3 is not likely to take away any jobs soon.
But GPT-3 has distinct benefits and potentially presents a tipping point in the business of AI.
One of the key limits of deep learning systems is that they are narrow AI systems.
They perform well on specific tasks but are poor at generalizing to other domains.
This limitation has stunted the deployment of AI services as platforms.
GPT-3, by contrast, shows strong few-shot performance: you might adapt it to many new applications without retuning its parameters.
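Adapting the model to a new task works by packing a few solved examples into the input text itself, so the model can infer the task pattern with no gradient updates. A minimal sketch of assembling such a "few-shot" prompt (the translation task and formatting here are illustrative assumptions, not OpenAI's actual API):

```python
# Sketch of few-shot prompting: the mechanism that lets one pre-trained
# model be pointed at many tasks without retuning its parameters.
# Task, labels, and layout below are hypothetical, for illustration only.

def build_few_shot_prompt(task_description, examples, query):
    """Pack a task description, a handful of solved examples, and a new
    query into a single text prompt. The model infers the task from the
    examples alone; no parameters are updated."""
    lines = [task_description, ""]
    for source, target in examples:
        lines.append(f"English: {source}")
        lines.append(f"French: {target}")
        lines.append("")  # blank line separates examples
    lines.append(f"English: {query}")
    lines.append("French:")  # the model would complete this final line
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    "Translate English to French.",
    [("cheese", "fromage"), ("hello", "bonjour")],
    "goodbye",
)
print(prompt)
```

Switching tasks means only switching the examples in the prompt, which is what makes a single hosted model viable as a general-purpose platform.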
This capability has already spawned many ideas for using the AI model to create new services.
Debuild.co is a company that uses GPT-3 to create web applications.
And OthersideAI is using GPT-3 to provide creativity tools to users.
GPT-3 is going to change the way you work.
This is very different from releasing an open-source AI model and letting developers do what they want with it.
OpenAI must now satisfy customers, scale its infrastructure, deal with compliance issues, and much more.
OpenAI will still have to handle problems such as removing harmful biases and dealing with model decay.
Those are all costly tasks, especially when dealing with a 175-billion-parameter deep learning model.
And OpenAI still has to figure out how to do all these things while also remaining profitable.
As OpenAI further wades into the realm of product management, it will need even more help from Microsoft.
OpenAI already relies on Microsoft's cloud infrastructure to train and run its models.
For the moment, the popular belief is that bigger deep learning models will lead to more advanced AI systems.
The only organizations willing to dole out such amounts of cash for the moment are large tech companies.
In time, the larger company might completely absorb the lab into its own commercial goals.
We've already seen this play out, after a fashion, after Google acquired DeepMind.
But DeepMind has yet to offset the costs it is incurring for its owner.
As for OpenAI, the company is now walking a fine line.