“We have no secret sauce,” said Luke Sernau, adding that Google’s best hope is to learn from what others are doing outside the company.
The competition between big tech companies is reaching new heights with the dawn of the generative AI era, and some have even declared it an ‘AI war.’ Microsoft appears to be at the forefront thanks to its arsenal of generative AI-powered products like Bing Chat and Image Creator. Meanwhile, Google doesn’t seem too far behind, even though it has yet to launch a standalone generative AI product or plug the technology into its biggest product, Google Search.
However, a senior software engineer at Google has asserted that it is not Microsoft or OpenAI that the search giant should fear, but open source.
Luke Sernau, Senior Software Engineer at Google and founder of Better Engineering, stated in a document that neither Google nor OpenAI is in a position to win the AI arms race. He suggested that Google’s rivalry with OpenAI has distracted the company from the rapid developments being made in open-source technology: “While we’ve been squabbling, a third faction has been quietly eating our lunch. I’m talking, of course, about open source.”
Sernau’s statements were part of a document published on an internal system at Google in early April. It has since been shared thousands of times among Googlers, according to a report citing a person familiar with the matter. On May 4, the document was published by consulting firm SemiAnalysis and has since been circulating in Silicon Valley.
“We have no secret sauce”
When it comes to large language models, Meta’s LLaMA appears to be the open-source community’s favourite. The model, released in February, is claimed by Meta to outperform GPT-3 across many tasks, including natural language processing and sentiment analysis. It is also much more adaptable: its weights can be customised, allowing it to run on less powerful hardware, which makes it more developer-friendly.
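For illustration, here is a minimal sketch of how a developer might load an open LLaMA-family model with 4-bit quantization so it fits on modest hardware. The model path is a placeholder, and the transformers and bitsandbytes libraries are assumed; none of this comes from the memo itself.

```python
# Minimal sketch (not from the memo): loading an open LLaMA-family model with
# 4-bit quantization so it fits on a single consumer GPU. The model path is a
# placeholder; access to the weights and the transformers + bitsandbytes packages
# is assumed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "path/to/llama-13b"  # placeholder: any LLaMA-family checkpoint you have access to

# Quantize weights to 4-bit at load time, cutting memory use roughly 4x vs. fp16.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # spread layers across whatever hardware is available
)

# Generate a short completion to confirm the model runs.
inputs = tokenizer("Open-source models are", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```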
But LLaMA isn’t the only developer-friendly LLM out there, and Sernau is clearly aware of this.
“While our models still hold a slight edge in terms of quality, the gap is closing astonishingly quickly. Open-source models are faster, more customizable, more private, and pound-for-pound more capable. They are doing things with $100 and 13B params that we struggle with at $10M and 540B. And they are doing so in weeks, not months.”
“We have no secret sauce,” Sernau wrote. “Our best hope is to learn from and collaborate with what others are doing outside Google.”
He also suggested that clients would be unwilling to pay for restrictive models when other high-quality models are available for free.
“Giant models are slowing us down”
The number of parameters has often been touted as the primary factor determining accuracy. But training an LLM with hundreds of billions of parameters requires substantial computing resources, which translates into high costs and energy consumption. Running such models is highly expensive too.
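A rough back-of-the-envelope calculation shows why parameter count dominates cost. The sketch below uses the common C ≈ 6 × N × D approximation for dense-transformer training compute (a rule of thumb from the scaling-law literature, not a figure from the memo); the token budgets are illustrative assumptions.

```python
# Back-of-the-envelope illustration (not from the memo) of why parameter count
# dominates training cost, using the common C ~ 6 * N * D approximation for
# dense-transformer training FLOPs. Token counts are illustrative assumptions.

def train_flops(n_params: float, n_tokens: float) -> float:
    """Approximate training compute in FLOPs for a dense transformer."""
    return 6.0 * n_params * n_tokens

# Illustrative configurations: a small open model vs. a frontier-scale model.
small = train_flops(n_params=13e9, n_tokens=1e12)    # 13B params, ~1T tokens
giant = train_flops(n_params=540e9, n_tokens=1e12)   # 540B params, same token budget

print(f"13B model:  ~{small:.2e} FLOPs")
print(f"540B model: ~{giant:.2e} FLOPs")
print(f"Ratio: ~{giant / small:.0f}x more compute at the same token budget")
```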
Sernau suggests that giant models are slowing Google down: “In the long run, the best models are the ones which can be iterated upon quickly. We should make small variants more than an afterthought, now that we know what is possible in the <20B parameter regime.”
To put things into perspective, the engineer cited Vicuna-13B as an example of a model achieving, with $100 and 13 billion parameters, results that Google struggles to match at $10 million and 540 billion parameters. He added that large models are not more capable in the long run if smaller models, which are cheaper to update, can be iterated on faster.
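Cheap fine-tunes of this kind typically rely on parameter-efficient methods such as LoRA (low-rank adaptation), which train small adapter matrices rather than all of the model’s weights. The sketch below shows the general idea using the Hugging Face PEFT library; the checkpoint path and hyperparameters are placeholder assumptions, not the actual Vicuna recipe.

```python
# Illustrative sketch (not the Vicuna recipe): attaching LoRA adapters to a causal LM
# so that only a few million parameters are trained instead of all 13B.
# The checkpoint path and hyperparameters are placeholders; requires transformers + peft.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("path/to/llama-13b")  # placeholder checkpoint

lora_config = LoraConfig(
    r=8,                      # rank of the low-rank update matrices
    lora_alpha=16,            # scaling factor applied to the adapters
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections, a common LoRA target
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # typically a fraction of a percent of the 13B weights
# The wrapped model can then be fine-tuned with an ordinary transformers Trainer loop.
```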
Sernau concluded by urging Google to adopt a more open-source-friendly policy.
“In the end, OpenAI doesn’t matter. They are making the same mistakes we are in their posture relative to open source, and their ability to maintain an edge is necessarily in question. Open-source alternatives can and will eventually eclipse them unless they change their stance. In this respect, at least, we can make the first move,” he concluded.
Source: indianexpress.com