A look at how OpenSearch has grown into a trusted open-source platform for next-gen analytics, AI, and global collaboration
One year ago, the OpenSearch Project joined the Linux Foundation with a bold ambition: to build the future of search and analytics in the open. This move represented more than just an organizational change; it marked a turning point in the project’s evolution. By embracing community-driven development and transparent governance, OpenSearch signaled its commitment to progress shaped by many, not controlled by one.
Today, the need for that model has never been more urgent. As agentic AI transforms how we search, analyze, and act on data, open source isn’t just a development model; it’s an imperative. The future of intelligent systems depends on shared innovation, interoperable infrastructure, and community-led guardrails. And in the past 12 months, OpenSearch has emerged as a leading example of how open ecosystems can adapt and thrive.
A global community, growing together
The past year has shown what’s possible when the open-source model is paired with an engaged and growing ecosystem.
- More than 400 organizations and over 3,300 individual contributors are actively engaged, reflecting a deep and distributed commitment to the project’s success.
- The launch of the OpenSearch Software Foundation in September 2024 catalyzed more than 8,800 contributions – evidence of real, sustained momentum.
- OpenSearch now ranks #17 among all Linux Foundation projects by contributor activity, up from #70 just one year ago.
- Total downloads have surpassed 1 billion, up 78% year over year.
“In just under 12 months, OpenSearch has grown rapidly, welcoming new contributors, expanding use cases, and hitting major technical milestones. More importantly, it has evolved into a truly open, community-driven project. What began as an effort led by a single organization is now stewarded by a diverse and global contributor base. At the Linux Foundation, we’re proud to support OpenSearch’s transition to vendor-neutral governance and to help foster the transparency, inclusivity, and momentum that define its path forward. The future of OpenSearch is bright, and it’s being built together.” – Michael Dolan, Senior Vice President, Legal and Strategic Programs, The Linux Foundation
But growth isn’t just about the numbers; it’s about people coming together.
OpenSearch today is shaped by vibrant design discussions, transparent quarterly roadmaps, and regular community meetings. Forums, GitHub discussions, and events like OpenSearchCon have turned into collaborative spaces where innovation thrives.
Here’s what some of our community members have been saying about OpenSearch:
“With OpenSearch, we’re not just consumers, we’re co-creators. That gives us the agility to move fast while ensuring that our contributions benefit others facing similar scale and performance challenges.”
– Charu Jain, engineering manager, Uber – “Uber’s Innovation for A Cloud Native Future” – OpenSearchCon India
“Going from a black box to an open platform has allowed us to optimize how we build our products and features and get the most out of OpenSearch.”
– Fernando Garcia Valenzuela, head of cloud storage, Atlassian – “OpenSearch at Atlassian Scale,” OpenSearchCon India
“With OpenTelemetry and OpenSearch, we can finally look at all our observability data in one place, filter it instantly, and understand exactly what’s happening across thousands of workloads. That changes how quickly we can act.”
– Karsten Schnitter, software architect, SAP – “The Power of Data Visualizations,” OpenSearchCon EU
“Amazon EC2 team recently leveraged OpenSearch’s Model Context Protocol (MCP) server to revolutionize their droplet quality investigations. By connecting AI agents with multiple OpenSearch clusters simultaneously, we have transformed a previously manual, time-consuming process into an efficient, automated workflow.”
– Reena Gupta, senior product manager, Amazon EC2
“We evaluated five different vector databases. OpenSearch stood out, not just in terms of performance and cost, but because it was already the foundation of our search stack. With OpenSearch, we can deliver unified, hybrid search experiences across text, chat, and multimodal data, all with sub-millisecond latency.”
– Sreedhar Gade, vice president of engineering and responsible AI, Freshworks
“Leveraging OpenSearch has enabled us to build a powerful observability solution with minimal effort, providing critical insights that allow us to resolve issues before they have a chance to impact our customers. As the OpenSearch Software Foundation celebrates its first anniversary, we’re excited to see the project continue to evolve and innovate in ways that help us deliver a seamless, high-quality experience to our customers.”
– Michael Lehr, head of code, KRUU
In addition, backed by our engaged Governing Board and Technical Steering Committee, the project continues to evolve not just as a technology platform, but as a model for open, inclusive innovation and long-term sustainability.
Innovation for the AI era
OpenSearch’s technical evolution over the past year has been shaped by one clear trend: the shift toward agentic AI, intelligent applications, and real-time, high-volume decision-making. This shift is not gradual; it’s exponential.
To meet that demand, OpenSearch delivered a series of groundbreaking capabilities:
- Next-gen vector search: OpenSearch 3.0 introduced a re-architected vector engine with GPU support, designed for scale, performance, and RAG-based retrieval in GenAI workloads.
- Agent-ready infrastructure and agentic AI features: With support for hybrid search, streaming ingestion, multi-modal embeddings, MCP support and experimental features such as agentic search and agent memory, OpenSearch is optimized for agentic AI pipelines that require context-aware, dynamic interactions.
- Performance at scale: Smarter query execution and indexing speed gains are enabling production-grade, low-latency performance in high-traffic environments. Compared to OpenSearch 1.3, query latency in OpenSearch 3.2 has been reduced by about 91% (geometric mean across key query types), meaning queries are on average over 11x faster than in 1.3.
- Observability and security: Piped Processing Language (PPL) upgrades, enhanced trace analytics, role-based access controls, and anomaly detection ensure that as systems get smarter, they also get safer.
- Modular extensibility: The pluggable, cloud-native architecture makes it easy to extend OpenSearch with custom models, pipelines, and plugins tailored to emerging AI use cases.
- Democratized releases (OSCAR): OpenSearch 3.2 introduces OSCAR, a multi-agent, AI-driven chatbot that streamlines releases by offering intelligent assistance, knowledge retrieval, and automated workflow management – making it easier for community members to manage releases while reducing repetitive tasks.
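To make the vector and hybrid search capabilities above concrete, here is a minimal sketch of the request bodies involved. The index name, field names (`text`, `embedding`), embedding dimension, and pipeline name are illustrative assumptions, not details from this article; with a running cluster, the bodies would be sent via the opensearch-py client as shown in the trailing comments, and the hybrid query assumes a search pipeline with a normalization processor has already been configured.

```python
# Sketch: request bodies for an OpenSearch k-NN index and a hybrid
# (lexical + vector) query. Field names and the 768-dim embedding size
# are hypothetical examples.

def knn_index_body(dimension: int = 768) -> dict:
    """Index settings and mappings for a knn_vector field (HNSW, Lucene engine)."""
    return {
        "settings": {"index": {"knn": True}},
        "mappings": {
            "properties": {
                "text": {"type": "text"},
                "embedding": {
                    "type": "knn_vector",
                    "dimension": dimension,
                    "method": {
                        "name": "hnsw",
                        "space_type": "cosinesimil",
                        "engine": "lucene",
                    },
                },
            }
        },
    }


def hybrid_query_body(query_text: str, query_vector: list, k: int = 10) -> dict:
    """Hybrid query combining a BM25 match clause with k-NN vector retrieval.
    Score blending is handled by a normalization search pipeline (assumed)."""
    return {
        "query": {
            "hybrid": {
                "queries": [
                    {"match": {"text": {"query": query_text}}},
                    {"knn": {"embedding": {"vector": query_vector, "k": k}}},
                ]
            }
        }
    }


# Against a live cluster, these bodies would be used roughly like this
# (client setup and the "norm-pipeline" search pipeline are assumed):
#   client.indices.create(index="docs", body=knn_index_body())
#   client.search(index="docs", body=hybrid_query_body("open source", vec, k=10),
#                 params={"search_pipeline": "norm-pipeline"})
```

Building the bodies as plain dictionaries keeps the sketch self-contained: the same structures work whether they are sent through opensearch-py, curl, or the Dashboards Dev Tools console.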
This kind of evolution is only possible in open source. The speed, diversity, and interoperability required to support agentic systems demand an ecosystem where innovation can come from anywhere and be verified by everyone.
Behind the scenes, OpenSearch’s contribution infrastructure has also matured: a more automated CI/CD process, simplified workflows, and expanded maintainership – now over 1,000 maintainers across 133 repositories – ensure the platform continues to scale alongside its use cases.
Collective innovation, real-world impact
OpenSearch is powered by the global community building and contributing to its success. From startups to enterprise leaders, the contributions over the past year reflect how the project is being adapted and advanced across sectors:
- Uber contributed cloud-native features like pull-based ingestion, gRPC support, and reader/writer separation, crucial for streaming-scale architectures
- ByteDance introduced performance gains for segment replication and contributed to conversational search, derived source, and multi-vector enhancements
- Intel brought AVX-512 vector acceleration and zstd compression, optimizing OpenSearch for modern hardware and cloud-native performance
- SAP delivered support for FIPS compliance, enabling secure deployments in regulated industries
- DataStax contributed the JVector engine, expanding options for Java-native vector search with AstraDB integration
- Atlassian, Aiven, and others made contributions across search, performance, multi-tenancy, and deployment tooling
- NVIDIA partnered with OpenSearch to enable GPU acceleration for the vector engine
- The Terraform and Kubernetes Operator ecosystems saw over 15 first-time contributors from 14 new organizations, reinforcing OpenSearch’s growing operational footprint
This level of engagement signals more than feature velocity; it’s a sign of platform trust. These are organizations placing OpenSearch at the core of their architecture and investing directly in its evolution.
The road ahead
OpenSearch was built for the future, and that future is arriving faster than anyone expected. As agentic systems, LLMs, and real-time analytics reshape the digital stack, organizations need platforms that are open, extensible, and resilient.
Open source is not just an alternative. It’s the foundation. It ensures innovation isn’t gated, decisions aren’t hidden, and infrastructure remains adaptable as needs evolve.
In its second year under the Linux Foundation, OpenSearch will continue to invest in:
- Enterprise-grade performance for next-gen agentic AI workloads
- Community-centered governance to ensure openness and fairness
- Ecosystem expansion to grow adoption and integrations
- Inclusive engagement to welcome new contributors, use cases, and ideas
The opportunity ahead is immense, and OpenSearch is ready.
The best part? We’re just getting started.
Whether you’re an engineer building with agents, an enterprise scaling search infrastructure, or an open-source enthusiast looking to contribute: now is the perfect time to get involved.
The future of search and analytics is being built in the open.
Let’s keep building. Together.