

What We Shipped: February 2026

New in February: Tavily is joining Nebius, we're live on Cursor's MCP marketplace, a Generative UI Research Canvas with LangChain and CopilotKit, and a look at how Writer uses Tavily in production.


By Tavily Team

March 6, 2026


Here's a rundown of what's new this month and how to start using it.

Tavily is joining Nebius

Tavily is joining Nebius. The combination brings core AI infrastructure and a production-grade web access layer closer together, so agentic systems can rely on the web safely, reliably, and at scale.

The most important thing to say clearly: nothing changes about how you use Tavily today. The API remains the same, your data policies remain the same, and zero data retention remains core.

What does change is what we can build going forward. Backed by Nebius's global infrastructure, we can improve latency and uptime, support larger and more complex deployments, and invest more deeply in the reliability and performance that production systems depend on.

Read the full announcement

Tavily is now on Cursor's MCP marketplace

Tavily is now available in Cursor's MCP marketplace, giving your coding agent real-time web access for tasks like researching libraries, pulling live docs, and grounding responses in up-to-date information.

Install the plugin
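If you prefer to wire the server up by hand rather than installing from the marketplace, a Cursor MCP entry along these lines should work. This is a sketch to check against Tavily's and Cursor's docs: the `tavily-mcp` npm package name and the `.cursor/mcp.json` location are assumptions, and the API key placeholder is yours to fill in.

```json
{
  "mcpServers": {
    "tavily": {
      "command": "npx",
      "args": ["-y", "tavily-mcp@latest"],
      "env": { "TAVILY_API_KEY": "<your-tavily-api-key>" }
    }
  }
}
```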

Build a Generative UI Research Canvas with LangChain, Tako, and CopilotKit

Most research tools return a wall of text. A Generative UI Research Canvas turns that into a dynamic, interactive interface where users can explore, drill down, and act on findings in real time. Tavily now has an official integration showcased alongside LangChain, Tako, and CopilotKit to help you build exactly that inside your own applications.

See it in action and explore how you can bring the same generative UI pattern into your own stack.

See how Writer uses Tavily in production

Writer is the enterprise AI platform built for agentic work. One of their AI engineers joined our latest webinar to show exactly how their Research Agent is wired up. The pattern is clean: /search to find the right pages, /extract to pull the full content. Broad retrieval first, deep extraction second, all grounded in live web data on top of Writer's Palmyra LLMs and graph-based RAG layer.

Watch the session and share it with your team as a reference for how a production Tavily integration can look.

From the community: Tavily powering a real-world WhatsApp agent

Abdallah Obaid built a production-grade WhatsApp AI agent for a renewable energy company using Tavily for domain-restricted web search. He configured Tavily to browse only the company's official website, ensuring the agent surfaces verified, reliable information without going off-script. The agent also integrates LangChain, LangGraph, and Pinecone for a full agentic stack.

See how he built it