Search was the first grand prize of the internet, currently generating about $300 billion annually. For years, incumbents dominated through scale: more data improves search quality, and more users create advertising leverage. Consumer challengers such as struggled to drive enough user adoption, and enterprise search largely failed outright.
But large language models and innovations in agentic reasoning, such as -R1 and the recently launched deep research mode in Gemini and , transform what's possible in search. These advancements allow companies to build much more powerful products with much less data.
While isn't disappearing, the search market is about to change enormously, with exciting new opportunities in consumer advertising, domain-specific search and infrastructure.
How LLMs change what's possible in search
Traditional search engines rely on a multistep process of query understanding, query execution and response generation.
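Those three stages can be sketched in a few lines. This is a toy illustration, not any engine's real internals; the function names and the in-memory inverted index are invented for the example:

```python
# Toy sketch of a traditional search pipeline: understand the query,
# execute it against an inverted index, and render a page of ranked links.

def understand_query(raw: str) -> list[str]:
    """Query understanding: normalize and tokenize into keywords."""
    return raw.lower().split()

def execute_query(terms: list[str], index: dict[str, set[str]]) -> list[str]:
    """Query execution: score documents by how many terms they match."""
    scores: dict[str, int] = {}
    for term in terms:
        for doc in index.get(term, set()):
            scores[doc] = scores.get(doc, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

def generate_response(ranked: list[str]) -> str:
    """Response generation: a list of links, not a direct answer."""
    return "\n".join(f"{i + 1}. {doc}" for i, doc in enumerate(ranked))

# Tiny hypothetical index mapping keywords to documents.
index = {"best": {"review-site"}, "shoes": {"review-site", "shoe-store"}}
print(generate_response(execute_query(understand_query("Best shoes"), index)))
```

Note that the user, not the engine, still does the final step of reading the linked pages and extracting an answer; that is the step LLMs absorb.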

Charts courtesy of Theory Ventures.

Google has leveraged language modeling, specifically semantic vector embeddings and transformers, in search for years. But modern LLMs pre-trained on much of the internet bring new capabilities in language comprehension, information retrieval and basic reasoning.
They allow search engines to:
- Understand complex queries beyond short keywords;
- Evaluate and rank results without the complex knowledge graphs that depend on billions of users, instead leveraging an LLM's world model to determine which data is best;
- Synthesize responses to answer a user's question directly instead of providing multiple sources; and
- Create agentic search, which decomposes a user question into multiple queries, iteratively analyzes each result, and provides a refined answer. This approach can replace entire research workflows that previously would have taken many separate searches.
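The agentic pattern in the last bullet can be sketched as a loop. This is a hypothetical sketch, and `llm` and `search` are toy stand-ins (not a real API) so the code runs; a real system would wire in a model client and a retrieval backend:

```python
# Sketch of agentic search: decompose the question into sub-queries,
# retrieve results for each, check whether more digging is needed,
# then synthesize one refined answer.

def llm(prompt: str) -> str:
    # Toy stand-in for a real model call.
    if prompt.startswith("Decompose"):
        return "query one\nquery two"  # pretend decomposition
    if "Reply DONE" in prompt:
        return "DONE"                  # pretend the findings suffice
    return "Synthesized answer based on findings."

def search(query: str) -> list[str]:
    # Toy stand-in for a retrieval backend.
    return [f"result for: {query}"]

def agentic_search(question: str, max_rounds: int = 3) -> str:
    notes: list[str] = []
    queries = llm(f"Decompose into search queries: {question}").splitlines()
    for _ in range(max_rounds):
        for q in queries:
            notes.extend(search(q))    # iteratively gather evidence
        followup = llm(
            f"Question: {question}\nFindings: {notes}\n"
            "Reply DONE, or list follow-up queries."
        )
        if followup.strip() == "DONE":
            break
        queries = followup.splitlines()  # refine and search again
    return llm(f"Answer the question using only these findings: {notes}")
```

Each pass through the loop plays the role of one of the "many separate searches" a human researcher would otherwise run by hand.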

New market opportunities in LLM-powered search
LLM-powered search will create new market opportunities in three key areas.
Transforming consumer search and advertising: Consumer expectations change quickly. Once you have asked an AI assistant an open-ended question, or received a simple yes/no answer, it's painful to go back to keyword-based search and a page full of links. Consumer search will soon be fully open-ended and multimodal in both inputs and outputs.
Incumbents will maintain an advantage through the breadth and scale of general search, first-party data (e.g. Google Maps), and massive distribution; but the advertising and SEO ecosystem around them will change.
When the result for "what are the best shoes?" changes from a series of links to a generated recommendation, shoe companies will need new ways to do SEO. When you ask an assistant to "book me a hotel for the weekend," hotel brands will need interfaces/applets for the agent to query availability and for the user to look at photos of their options.
Proliferation of domain-specific and enterprise search: Because traditional search is data-hungry, expensive and complex to build, most domain-specific and enterprise search tools have underperformed. Across industries, LLM-powered startups can disrupt legacy systems such as , and PubMed by automating complex research workflows. For example, in medicine, they can find dozens of relevant clinical trials, filter them based on study criteria, then synthesize the findings.
LLMs will also make enterprise search finally work. is an early leader in general internal search, but there are numerous opportunities to build workflow-specific solutions around core systems of record (ERP, CRM, SIEM), functions (security, operations, finance) or customer-facing applications (product search).
New infrastructure to serve the exploding search market: As more companies build LLM-powered search, infrastructure demands will grow, including in retrieval systems for other AI products (see: ). Areas of opportunity will include:
- Databases and query engines: optimized for hybrid and multimodal search, with high-throughput, low-latency retrieval at scale;
- Neural information retrieval: tooling and models to support embedding, indexing and retrieval for different use cases (e.g., portfolio company );
- Search orchestration: query planning and decomposition, multistep retrieval orchestration, ranking/re-ranking, fact verification and more.
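As a concrete example of the ranking/re-ranking layer, one widely used technique for hybrid search is reciprocal rank fusion (RRF), which merges ranked lists from different retrievers (say, a keyword index and a vector index) into a single ranking. A minimal sketch, with invented document names:

```python
# Reciprocal rank fusion (RRF): each document's fused score is the sum,
# over the input lists, of 1 / (k + rank in that list). k = 60 is the
# conventional smoothing constant; documents ranked highly by any
# retriever rise toward the top of the merged list.

def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

keyword_hits = ["doc-a", "doc-b", "doc-c"]  # e.g., from BM25
vector_hits = ["doc-c", "doc-a", "doc-d"]   # e.g., from embedding search
print(rrf([keyword_hits, vector_hits]))     # doc-a first: high in both lists
```

RRF needs only ranks, not comparable scores, which is why it is a popular default for fusing lexical and neural retrievers before a heavier re-ranking model runs.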
The evolution of search is already underway, making now the perfect moment for startups to build solutions redefining how information is accessed and utilized in an LLM-powered age.
is a partner at , where he invests in early-stage AI, data and ML companies. His background is both as an operator and an investor: He was the first product manager at , a -backed startup creating synthetic data products for the public sector, and a machine learning investor at . Triedman started his career at and studied computational neuroscience at , where he built brain-computer interfaces and trained machine learning algorithms on neural data.