Developers have used Elasticsearch to build search experiences for over a decade. At Microsoft Build this year, we announced the launch of Elasticsearch Relevance Engine, a set of tools that enables developers to build AI-powered search applications. With generative AI, large language models (LLMs), and vector search capabilities gaining mindshare, we are delighted to expand our range of tools and enable our customers to build the next generation of search apps.
One example of what a next-generation search experience might look like comes from Relativity — the eDiscovery and legal search technology company. At Build, we shared the stage with the Relativity team as they spoke about how they’re using Elasticsearch and Microsoft Azure. You can read about Relativity’s coverage of Build on their blog.
This blog explores how Relativity uses Elasticsearch and Azure OpenAI to build next-generation search experiences. It also examines the key components of an AI-powered search experience and the architectural considerations to keep in mind.
About Relativity
Relativity is the company behind RelativityOne, a leading cloud-based eDiscovery software solution. Relativity partners with Microsoft to innovate and deliver solutions to hundreds of organizations, helping them manage, search, and act on large amounts of heterogeneous data. RelativityOne is respected in the industry for its global reach, and it is powered by Microsoft Azure infrastructure and other Azure services, such as Cognitive Services Translator.
RelativityOne is built with scale in mind. Typical use cases involve ingesting large volumes of data for legal eDiscovery and presenting that data to legal teams through a search interface. To support high-quality legal investigations, the search experience must return accurate, relevant results every time. Elasticsearch meets these requirements and is a key underlying technology.
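As a rough illustration of the kind of relevance-ranked query that backs such a search interface, here is a minimal sketch using the official Elasticsearch Python client. The endpoint, index name, and field names are hypothetical and are not taken from Relativity's implementation.

```python
# A minimal sketch of a relevance-ranked full-text search, assuming a
# hypothetical "legal-documents" index with "title" and "body" text fields.
from elasticsearch import Elasticsearch

# Connect to a hypothetical Elasticsearch endpoint.
es = Elasticsearch("https://localhost:9200", api_key="<api-key>")

# Search both fields; BM25 scoring ranks the most relevant documents first,
# with a boost on title matches.
response = es.search(
    index="legal-documents",
    query={
        "multi_match": {
            "query": "breach of contract indemnification",
            "fields": ["title^2", "body"],
        }
    },
    size=10,
)

# Print the top hits in descending relevance order.
for hit in response["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```

In a production eDiscovery workload, a query like this would typically be combined with filters, access controls, and, increasingly, vector or hybrid retrieval, but the core pattern of ranking documents by relevance remains the same.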