Cohere says rivals are building Bugatti sports cars: “We build F-150s”


“What’s really needed is a fleet of F-150 pickup trucks,” Kon said. “We make F-150s.”

Founded by former Google AI researchers and backed by Nvidia, Cohere is betting on generative AI for the enterprise rather than consumer chatbots, which have been the talk of the tech industry since OpenAI launched ChatGPT in late 2022.

In June, Cohere raised $270 million at a valuation of $2.2 billion, with Salesforce and Oracle participating in the funding round. Company executives have attended AI forums at the White House. And Cohere is reportedly in talks to raise up to a billion dollars in additional capital.

“We don’t comment on rumors,” Kon told CNBC. “But someone once told me that startups are always growing.”

The field of generative AI has exploded over the past year, with a record $29.1 billion invested across nearly 700 deals in 2023, an increase of more than 260% in deal value from the previous year, according to PitchBook. It has become the most talked-about phrase on corporate earnings calls quarter after quarter, and some form of the technology is automating tasks in nearly every industry, from financial services and biomedical research to logistics, online travel and utilities.

Although Cohere is often mentioned alongside AI heavyweights like OpenAI, Anthropic, Google, and Microsoft, the startup’s focus on enterprise-only chatbots has set it apart.

Competitors offer AI products for both consumers and businesses. OpenAI, for example, launched ChatGPT Enterprise in August, and Anthropic opened consumer access to its previously enterprise-only chatbot Claude in July.

Kon, who is also the company’s chief operating officer, said that by staying focused only on enterprise customers, Cohere can run efficiently and keep costs under control even amid a chip shortage, ever-changing prices for graphics processing units (GPUs) and licensing fees for AI models.

“In my career, I have rarely seen companies that can successfully serve consumers and businesses at the same time, let alone a startup,” Kon said. He added: “We don’t have to raise billions of dollars to offer a free service to the consumer.”

Current clients include Notion, Oracle and Bamboo HR, according to Cohere’s website. Many clients fall into the banking, financial services and insurance categories, Kon said. In November, Cohere told CNBC that it had seen an increase in customer interest after OpenAI’s sudden and temporary firing of CEO Sam Altman.

Kon acknowledges that changing dynamics in the hardware industry have presented persistent challenges. The company has had a stockpile of Google chips for more than two years, Kon said, secured in Cohere’s early days to help it train its models.

Now, Cohere is moving toward using more of Nvidia’s H100 GPUs, which power most of today’s large language models.

Cohere’s relationships with strategic investors are another area where it differentiates itself from generative AI competitors, Kon said. Many rivals have taken money from companies like Nvidia and Microsoft with conditions attached to the use of their software or chips.

Kon insists that Cohere never accepted a conditional investment and that every check it cashed (including the one from Nvidia) had no conditions.

“In our last round, we had several checks of the same size; we had no conditions associated with any of them,” Kon said. “We made that decision explicitly so we could say we are not beholden to anyone.”

Cohere’s decision to focus on enterprise-only chatbots may help the company steer clear of the murky territory of concerns about misinformation, especially as the election season approaches.

In January, the Federal Trade Commission announced an inquiry into the AI investments and partnerships of Amazon, Alphabet, Microsoft, OpenAI and Anthropic. FTC Chair Lina Khan described it as a “market investigation into the investments and partnerships being formed between AI developers and major cloud service providers.” Cohere was not named.

Kon says the company’s growth so far has largely focused on areas like search and retrieval, which require their own separate AI models. He calls it “tool use,” and it involves training models on where, when, and how to look for the information an enterprise customer needs, even if the model wasn’t originally trained with that data.
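
The “tool use” Kon describes is essentially a retrieve-then-generate loop: the model first decides whether and where to look for a customer’s data, then grounds its answer in whatever it finds. The sketch below is a generic, heavily simplified illustration of that pattern in Python; the document store, the search_tool helper and the model_decide stub are hypothetical stand-ins, not Cohere’s models or API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ToolCall:
    name: str
    query: str

# Hypothetical enterprise document store -- data the model was never trained on.
DOCUMENTS = {
    "q3_revenue": "Q3 revenue was $12.4M, up 8% quarter over quarter.",
    "pto_policy": "Employees accrue 1.5 PTO days per month.",
}

def search_tool(query: str) -> str:
    """Toy keyword lookup standing in for an enterprise search/retrieval system."""
    hits = [text for key, text in DOCUMENTS.items() if key in query.lower()]
    return " ".join(hits) or "No matching documents found."

def model_decide(prompt: str) -> Optional[ToolCall]:
    """Stand-in for a model trained to decide where, when, and how to look things up."""
    if "revenue" in prompt.lower():
        return ToolCall(name="search", query="q3_revenue")
    return None  # answer without calling any tool

def answer(prompt: str) -> str:
    call = model_decide(prompt)
    if call is not None:
        context = search_tool(call.query)
        return f"Grounded answer using retrieved context: {context}"
    return "Answer generated from the model's own parameters alone."

if __name__ == "__main__":
    print(answer("What was our Q3 revenue?"))
```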

Search, Kon said, is a key piece of generative AI that receives less attention than other areas.

“That, without a doubt, will be the true unlock for companies,” he said.

Discussing the expansion timeline, Kon called 2023 “the year of proof of concept.”

“We believe 2024 is becoming the year of deployment at scale,” he said.
