You.com launches new APIs to connect LLMs to the web

When OpenAI connected ChatGPT to the internet, it supercharged the AI chatbot’s capabilities. Now, the search engine You.com wants to do the same for every large language model (LLM) out there. The company today announced the launch of a set of APIs aimed at giving LLMs like Meta’s Llama 2 real-time access to the open web — or narrower slices of it. Starting at $100 per month, You.com’s APIs augment LLMs’ answers to questions from users (e.g. “Which holidays are this week?”) with up-to-date context from the internet.

Customers including LlamaIndex, Anthropic and Cohere have already integrated the APIs with their models.

“[We’ve] received many requests for an API with these capabilities,” You.com CEO and founder Richard Socher told TechCrunch in an email interview. “When you ask about a recent event, like a Super Bowl score on the day of the Super Bowl, our API will search for those scores on the web and then you can add that information, in that moment, to the … LLM and it can then use it to answer your question more accurately.”

Most LLMs are trained on publicly available, static data scraped from public web pages, e-books and elsewhere. That’s sufficient to get them to perform tasks from writing emails to drafting cover letters and essays. But it limits the LLMs’ knowledge to the data’s time range; an LLM trained on info prior to September 2021 wouldn’t be aware of events that happened yesterday, of course. You.com’s new APIs enable LLMs to overcome this limitation by creating an index of long snippets of websites — a key point of differentiation over the standard search APIs provided by Bing and Google, which Socher claims serve only very short snippets “designed to entice someone to click a link.” LLMs can leverage this custom-built index when answering questions, identifying the relevant snippet and summarizing it to provide an updated answer.
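The “identify the relevant snippet” step Socher describes can be approximated with something as simple as keyword overlap between the user’s question and each retrieved snippet. A toy sketch of that idea (the function and scoring here are illustrative, not You.com’s actual, proprietary ranking):

```python
import re

def words(text: str) -> set[str]:
    """Lowercase a string and split it into a set of alphanumeric tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def pick_relevant_snippet(question: str, snippets: list[str]) -> str:
    """Return the snippet sharing the most words with the question."""
    q = words(question)
    return max(snippets, key=lambda s: len(q & words(s)))
```

A real system would use semantic embeddings rather than raw word overlap, but the pipeline shape is the same: retrieve long snippets, rank them against the question, and hand the winner to the LLM.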

“Every LLM gets a prompt — a description for how it should behave and answer questions,” Socher explained. “You can add your own question to the end of that prompt to have a conversation with an LLM. What this API enables is that you can add a lot of up-to-date context from the web into the prompt, after the question is asked.” You.com is providing three “flavors” of API at launch: web search, news and RAG. Web search gives LLMs access to the aforementioned index of long snippets, while news — as the name implies — provides exclusively news results. As for RAG (which stands for “retrieval-augmented generation”), it pairs You.com’s web search results with an LLM to generate what Socher claims are “more factual” responses, although the jury’s obviously out on that.
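Socher’s description maps onto the standard retrieval-augmented generation pattern: fetch fresh web snippets, then splice them into the prompt ahead of the user’s question. A minimal sketch of that prompt assembly (the function name and prompt layout are my own illustration, not You.com’s API):

```python
def build_augmented_prompt(system_prompt: str, question: str,
                           web_snippets: list[str]) -> str:
    """Append freshly retrieved web snippets to the prompt so the
    LLM can ground its answer in up-to-date information."""
    context = "\n".join(f"- {s}" for s in web_snippets)
    return (
        f"{system_prompt}\n\n"
        f"Up-to-date context from the web:\n{context}\n\n"
        f"Question: {question}"
    )
```

The resulting string is what actually gets sent to the model, so the LLM “sees” the Super Bowl score (or holiday list) as part of its own instructions rather than having to recall it from stale training data.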

Now, an LLM with web access can be a risky prospect — no matter which APIs it’s tapping. The live web is less curated than a static training dataset and, by implication, less filtered. Search results can be gamed, and they aren’t necessarily representative of the totality of the web. Because most ranking algorithms prioritize websites that use modern web technologies like encryption, mobile support and schema markup, sites with otherwise high-quality content can get lost in the shuffle.

Socher admitted that You.com’s API has weaknesses, particularly in the area of localized “near me”-style questions (e.g. “Where’s good sushi near me?”), since the API doesn’t know LLM users’ locations. But improvements are already being made, including upgrades that’ll allow You.com’s APIs to code and “produce much more complex answers” with traceable citations, Socher says.

“We’ll soon merge news and general web search to make it even easier for companies using our APIs,” he added. “By incorporating our API into whatever solution is built by creators, their answers will be more relevant and helpful for their end users … [The solution can turn] to the web to verify facts.”

The new APIs have this writer wondering if search is the next battlefield on the generative AI front. As open source LLMs approach the level of some of their closed-source counterparts, the strength of the search engine backing those closed-source LLMs (Bing in ChatGPT’s case, Google in Bard’s) becomes a stronger selling point — unless APIs like You.com’s effectively level the playing field.

That’s a big “if” — no API’s perfect, and You.com’s surely has flaws beyond those Socher mentions. But I’d argue that competition is always a good thing.

The new APIs start at $100 per month for 14,200 API calls after a 60-day trial that comes with 1,000 free monthly calls. You.com also offers bespoke packages for larger enterprise deals that come with annual subscriptions and discounts.

