Bits from ElasticON Global 2023: Elasticsearch Stateless
Last December, I wrote a few thoughts about the then-recent release of Amazon OpenSearch Serverless. Three months later, I still haven’t gotten my hands on it beyond a playful hour in a sandbox environment. Shame, but that’s not what I feel guilty of.
Last week, I attended the ElasticON Global 2023 conference hosted by Elastic.co. One highlight of the keynote by CPO Ken Exner was the announcement of the upcoming Elasticsearch serverless offering. Exner didn’t give a release date, but I wouldn’t be surprised if it lands sometime this year, given that Amazon OpenSearch Serverless is already among us. Further details came from Uri Cohen, VP of Product for data and compute at Elastic, who described it as the next logical step in the evolution of Elasticsearch from a deeply stateful system to a stateless one, enabled by key building blocks introduced in recent releases, such as searchable snapshots and index lifecycle management (ILM).
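To make the searchable-snapshot/ILM point a bit more concrete, here is a minimal sketch of a lifecycle policy whose cold phase mounts older data as a searchable snapshot from an object-store repository. The cluster endpoint, credentials, policy name, and repository name (my-s3-repo) are all placeholders of mine, not anything shown at the conference:

```python
# Minimal sketch: register an ILM policy that rolls data over while hot and
# mounts it as a searchable snapshot once cold. Endpoint, credentials,
# policy name and repository name are placeholders.
import requests

ES_URL = "https://my-cluster.example.com:9243"   # placeholder endpoint
AUTH = ("elastic", "changeme")                   # placeholder credentials

policy = {
    "policy": {
        "phases": {
            "hot": {
                "actions": {
                    # Roll over the write index when a primary shard reaches ~50 GB.
                    "rollover": {"max_primary_shard_size": "50gb"}
                }
            },
            "cold": {
                "min_age": "30d",
                "actions": {
                    # Mount the index from an object-store snapshot repository,
                    # so local disks no longer need to hold a full copy of the data.
                    "searchable_snapshot": {"snapshot_repository": "my-s3-repo"}
                }
            },
            "delete": {
                "min_age": "365d",
                "actions": {"delete": {}}
            }
        }
    }
}

# PUT _ilm/policy/<name> creates or updates the lifecycle policy.
resp = requests.put(f"{ES_URL}/_ilm/policy/logs-retention", json=policy, auth=AUTH)
resp.raise_for_status()
print(resp.json())  # {"acknowledged": true} on success
```

Once data lives in the snapshot repository, local disks stop being the source of truth, which is precisely the property a stateless architecture builds on.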
As I was preparing a write-up of these two talks — happy to share what I believed was fresh news — I realised that Elastic had already published all about it last October 😅: Stateless — your new state of find with Elasticsearch. Do take the time to go through it; it’s an excellent read, with a good balance between high-level vision and internal engineering details.
And that’s what I feel guilty of. The Elastic blog post came out two months before AWS announced Amazon OpenSearch Serverless at re:Invent. Had it been on my radar, I would have referenced it in my article “Amazon OpenSearch goes Serverless!”, since the idea is pretty much the same: use an object store as the persistence layer and run two independently scalable tiers for indexing and querying operations.
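Neither vendor publishes code for this, so the following is purely my own toy sketch of that shared idea, assuming an S3 bucket (search-segments-demo, made up) as the only thing the two tiers have in common: the indexing tier flushes immutable segment-like blobs, and the query tier discovers and reads them on its own, with no direct coordination between the two.

```python
# Toy illustration of the shared architecture (my sketch, not either vendor's code):
# the object store is the only meeting point between the indexing tier and the
# query tier. Bucket name and key layout are made up.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "search-segments-demo"   # hypothetical bucket


def indexing_tier_flush(index: str, segment_id: str, docs: list) -> None:
    """Indexing tier: write an immutable 'segment' blob to the object store."""
    body = json.dumps(docs).encode("utf-8")
    s3.put_object(Bucket=BUCKET, Key=f"{index}/segments/{segment_id}.json", Body=body)


def query_tier_search(index: str, term: str) -> list:
    """Query tier: discover segments via a listing and scan them, with no
    coordination with the indexing tier beyond the bucket itself."""
    hits = []
    listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=f"{index}/segments/")
    for obj in listing.get("Contents", []):
        blob = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
        hits += [doc for doc in json.loads(blob) if term in json.dumps(doc)]
    return hits


# Because the tiers share nothing but the bucket, each can scale up, down,
# or (in principle) to zero without the other noticing.
indexing_tier_flush("logs", "seg-0001", [{"msg": "disk full"}, {"msg": "all good"}])
print(query_tier_search("logs", "disk"))
```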
Anyhow. Comparing the AWS and Elastic serverless offerings is not yet possible, as only the AWS option is currently available. However, in another ElasticON Global talk, Product Marketing Director George Kobar hinted at how Elasticsearch serverless will set itself apart from its AWS counterpart. Kobar highlighted three characteristics of Amazon OpenSearch Serverless that, in his view, prevent it from being a truly serverless solution:
- It is not entirely a consumption-based model, as there is a non-negligible minimum price of $691/month.
- It is not a true resource-based model, as it doesn’t support auto-scaling to zero nodes.
- It is not entirely a configuration-free service, as network policies and resource types still need to be configured (a rough sketch of that minimum configuration follows this list).
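To illustrate that last point, here is roughly what the minimum setup looks like with boto3 today. The collection name, policy names, and policy documents are my own illustrative guesses based on the AWS documentation, not an official recipe:

```python
# Illustrative only: the minimum setup Amazon OpenSearch Serverless expects
# before a collection can be used. Names and policy documents are my own
# guesses from the AWS docs, not an official recipe.
import json
import boto3

aoss = boto3.client("opensearchserverless", region_name="us-east-1")

# An encryption policy is mandatory before any collection can be created.
encryption_policy = {
    "Rules": [{"ResourceType": "collection", "Resource": ["collection/app-logs"]}],
    "AWSOwnedKey": True,   # use the AWS-owned KMS key for the demo
}
aoss.create_security_policy(
    name="app-logs-encryption",
    type="encryption",
    policy=json.dumps(encryption_policy),
)

# A network policy is also required to make the collection reachable at all.
network_policy = [
    {
        "Rules": [
            {"ResourceType": "collection", "Resource": ["collection/app-logs"]},
            {"ResourceType": "dashboard", "Resource": ["collection/app-logs"]},
        ],
        "AllowFromPublic": True,   # a VPC endpoint would go here in real life
    }
]
aoss.create_security_policy(
    name="app-logs-network",
    type="network",
    policy=json.dumps(network_policy),
)

# Finally, the resource type (search vs. time series) has to be chosen up front.
aoss.create_collection(name="app-logs", type="TIMESERIES")
```

None of this is hard, but it is configuration, which is exactly Kobar’s point.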
George Kobar didn’t disclose how Elasticsearch serverless will address these concerns, but his remarks strongly implied that it will overcome them right from the start. Meanwhile, the Amazon OpenSearch Serverless team is unlikely to sit idle and wait to be outdone; we can expect significant improvements before Elastic’s offering launches.
Well, it’s a fascinating race, really.
If you found this article useful, give me a high five 👏🏻 so others can find it too. Follow me here on Medium, LinkedIn, GitHub, or Stack Overflow to stay up to date with my work. Thanks for reading!