Why “generative AI” is suddenly on everyone’s lips: it’s an “open field”

If you’ve been closely following the progress of OpenAI, the company run by Sam Altman whose neural nets can now write original text and create original pictures with astonishing ease and speed, you might just skip this piece.

If, on the other hand, you’ve only been vaguely paying attention to the company’s progress and the traction that other so-called “generative” AI companies are suddenly gaining, and you want to better understand why, you might benefit from this interview with James Currier, a five-time founder and now venture investor who cofounded the firm NFX five years ago with several of his serial-founder friends.

Currier falls into the camp of people following the progress closely — so closely that NFX has made numerous related investments in “generative tech,” as he describes it, and it’s garnering more of the team’s attention every month. In fact, Currier doesn’t think the buzz around this new wrinkle on AI is hype so much as a recognition that the broader startup world is suddenly facing a very big opportunity for the first time in a long time. “Every 14 years,” says Currier, “we get one of these Cambrian explosions. We had one around the internet in ’94. We had one around mobile phones in 2008. Now we’re having another one in 2022.”

In retrospect, this editor wishes she’d asked better questions, but she’s learning here, too. Excerpts from our chat follow, edited for length and clarity. You can listen to our longer conversation here.

TC: There’s a lot of confusion about generative AI, including how new exactly it is, or whether it’s just become the latest buzzword.

JC: I think what happened to the AI world in general is that we had a sense that we could have deterministic AI, which would help us identify the truth of something. For example, is that a broken piece on the manufacturing line? Is that an appropriate meeting to have? It’s where you’re determining something using AI in the same way that a human determines something. That’s largely what AI has been for the last 10 to 15 years.

The other set of algorithms in AI were these diffusion algorithms, which were intended to look at huge corpuses of content and then generate something new from them, saying, ‘Here are 10,000 examples. Can we create the 10,001st example that is similar?’

Those were pretty fragile, pretty brittle, up until about a year and a half ago. [Now] the algorithms have gotten better. But more importantly, the corpuses of content we’ve been looking at have gotten bigger because we just have more processing power. So what’s happened is, these algorithms are riding Moore’s law — [with vastly improved] storage, bandwidth, speed of computation — and have suddenly become able to produce something that looks very much like what a human would produce. That means the face value of the text that it will write, and the face value of the drawing it will draw, looks very similar to what a human will do. And that’s all taken place in the last two years. So it’s not a new idea, but it’s newly at that threshold. That’s why everyone looks at this and says, ‘Wow, that’s magic.’
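To make that “10,001st example” idea concrete, here is a minimal editorial sketch (not something Currier referenced) using the open source Hugging Face transformers library and a small publicly available language model; the model choice, prompt, and settings are illustrative assumptions only:

```python
# Minimal sketch: ask a model trained on a large text corpus to continue
# a prompt, i.e., to produce a plausible "next example" of what it has seen.
# Assumes the open source `transformers` package is installed; the model
# and prompt are illustrative, not from the interview.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

result = generator(
    "Generative AI is suddenly everywhere because",
    max_new_tokens=40,       # how much new text to produce
    num_return_sequences=1,  # one continuation is enough for the sketch
)
print(result[0]["generated_text"])
```

The same pattern, a model trained on a huge corpus asked to produce something new that resembles what it has seen, underlies the image generators as well.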

So it was compute power that suddenly changed the game, not some previously missing piece of tech infrastructure?

It didn’t change suddenly; it just changed gradually until the quality of its generation got to where it was meaningful for us. So the answer is generally no, the algorithms have been very similar. In these diffusion algorithms, they have gotten somewhat better. But really, it’s about the processing power. Then, about two years ago, the [powerful language model] GPT came out, which was an on-premise type of calculation; then GPT-3 came out, where [the AI company OpenAI] would do [the calculation] for you in the cloud; because the data models were so much bigger, they needed to do it on their own servers. You just can’t afford to do it [on your own]. And at that point, things really took a jump up.

We know because we invested in a company doing AI-based generative games, including “AI Dungeon,” and I think the vast majority of all GPT-3’s computation was coming through “AI Dungeon” at one point.

Does “AI Dungeon” then require a smaller team than another game-maker might? 

That’s one of the big advantages, absolutely. They don’t have to spend all that money to house all that data, and they can, with a small group of people, produce tens of gaming experiences that all take advantage of that. [In fact] the idea is that you’re going to add generative AI to old games, so your non-player characters can actually say something more interesting than they do today, though you’re also going to get fundamentally different gaming experiences coming out of AI, versus adding AI into existing games.

So a big change is in the quality? Will this technology plateau at some point?

No, it will always be incrementally better. It’s just that the increments will be smaller over time, because they’re already getting pretty good.

But the other big change is that OpenAI wasn’t really open. They generated this amazing thing, but then it wasn’t open and it was very expensive. So groups like Stability AI and other folks got together and said, ‘Let’s just make open source versions of this.’ And at that point, the cost dropped by 100x, just in the last two or three months.
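For context (an editorial aside, not part of Currier’s remarks): the open source releases he is referring to can be run locally with a few lines of code. A minimal sketch using the diffusers library and one of the publicly released Stable Diffusion checkpoints, assuming a CUDA GPU and the relevant packages are installed; the checkpoint name and prompt are illustrative:

```python
# Minimal sketch of running an open source image model locally.
# Assumes `diffusers`, `transformers`, and `torch` are installed and a
# CUDA GPU is available; checkpoint and prompt are illustrative only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Generate one image from a text prompt and save it to disk.
image = pipe("an astronaut sketching in an open field, watercolor").images[0]
image.save("generated.png")
```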

These are not offshoots of OpenAI.

All this generative tech is not going to be built just on the OpenAI GPT-3 model; that was just the first one. The open source community has now replicated a lot of their work, and they’re probably six or eight months behind in terms of quality. But it’s going to get there. And because the open source versions are a third or a fifth or a twentieth the cost of OpenAI’s, you’re going to see a lot of price competition, and you’re going to see a proliferation of these models that compete with OpenAI. And you’re probably going to end up with five, or six, or eight, or maybe 100 of them.

Then on top of those will be built unique AI models. So you might have an AI model that really looks at making poetry, or AI models that really look at how you make visual images of dogs and dog hair, or you’ll have one that’s really specialized in writing sales emails. You’re going to have a whole layer of these specialized AI models that will then be purpose built. Then on top of those, you’ll have all the generative tech, which will be: how do you get people using the product? How do you get people paying for the product? How do you get people to sign in? How do you get people to share it? How do you create network effects?
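To illustrate the layering Currier describes (again, an editorial sketch rather than his example): a “specialized” model can start out as little more than a purpose-built wrapper around a general open source model, with the application layer handling sign-in, sharing, and payments on top. The model name, prompt template, and settings below are hypothetical:

```python
# Hypothetical sketch of a "specialized" layer for sales emails, built as
# a thin wrapper over a general open source text model. Model name,
# prompt template, and generation settings are illustrative assumptions.
from transformers import pipeline

_base_model = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

def draft_sales_email(product: str, recipient: str) -> str:
    prompt = (
        f"Write a short, friendly sales email to {recipient} "
        f"introducing {product}.\n\nEmail:\n"
    )
    out = _base_model(
        prompt,
        max_new_tokens=120,
        do_sample=True,
        temperature=0.8,
    )
    # Return only the newly generated text, not the prompt itself.
    return out[0]["generated_text"][len(prompt):]

print(draft_sales_email("an AI note-taking app", "a busy sales manager"))
```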

Who makes money here?

The application layer where people are going to go after the distribution and the network effects is where you’re going to make the money.

What about large companies that will be able to incorporate this technology into their networks? Won’t it be very hard for a company that doesn’t have that advantage to come out of nowhere and make money?

I think what you’re looking for is something like Twitch, where YouTube could have integrated that into its model, but it didn’t. And Twitch created a new platform and a valuable new part of culture, and value for the investors and the founders, even though it was hard. So you’re going to have great founders who are going to use this technology to give themselves an advantage. And that will create a seam in the market. And while the big guys are doing other things, they’ll be able to build billion-dollar companies.

The New York Times ran a piece recently featuring a handful of creatives who said the generative AI apps they’re using in their respective fields are tools in a broader toolbox. Are people being naive here? Are they at risk of being replaced by this technology? As you mentioned, the team working on “AI Dungeon” is smaller. That’s good for the company but potentially bad for developers who might have worked on the game otherwise.

I think with most technologies, there is a sort of discomfort people have with [for example] robots replacing a job at an auto factory. When the internet came along, a lot of the people who were doing direct mail felt threatened that companies would be able to sell direct and not use their paper-based advertising services. But [after] they embraced digital marketing, or digital communication through email, they probably had tremendous bumps in their careers; their productivity went up, their speed and efficiency went up. The same thing happened with credit cards online. We didn’t feel comfortable putting credit cards online until maybe 2002. But those who embraced [this wave in] 2000 to 2003 did better.

I think that’s what’s happening now. The writers and designers and architects who are thinking forward and embracing these tools to give themselves a 2x or 3x or 5x productivity lift are going to do incredibly well. I think the whole world is going to end up seeing a productivity lift over the next 10 years. It’s a huge opportunity for 90% of people to just do more, be more, make more, connect more.

Do you think it was a misstep on the part of OpenAI not to [open source] what it was building, given what’s sprung up around it?

The leader ends up behaving differently than the followers. I don’t know, I’m not inside the company, I can’t really tell. What I do know is there’s going to be a big ecosystem of AI models, and it’s not clear to me how an AI model stays differentiated as they all asymptote toward the same quality and it just becomes a price game. It seems to me that the people who win are Google Cloud and AWS because we’re all just going to be generating stuff like crazy.

It might be that OpenAI ends up moving up or moving down. Maybe they become like an AWS themselves, or maybe they start making specialized AIs that they sell to certain verticals. I think everyone in this space is going to have an opportunity to do well if they navigate properly; they’re just going to have to be smart about it.

NFX has much more on its site about generative AI that’s worth reading, by the way; you can find that here.
