I’m very excited to announce that we have agreed to be acquired by Databricks, the leading data and AI company and pioneer of the Lakehouse. This is the ideal pairing of two technologies that are vital to the future of enterprise AI. For me personally, it also validates a long-held thesis about the evolution of the AI market.
Two and a half years ago, I met Raj at Cafe Venecia in Palo Alto. At the time, I was an EIR at Redpoint Ventures, obsessed with solving the data access issues that I knew were slowing the adoption of AI. How did I know? Well, for the prior decade, I had been selling AI to banks, health systems, insurers, and governments. The technology we sold delivered game-changing advantages by tapping into data stored within their transactional systems. But there were some really daunting challenges as well.
As the demand for AI technology grew, it became increasingly obvious that the hardest part of delivering performant apps was not the AI itself; it was the prosaic task of getting access to source data. Every implementation featured a difficult, iterative, and lengthy data pipe building project. These pipes were almost always based on batch queries that ran weekly or monthly. But building a pipe was just the start of the challenge. Schemas frequently changed on source systems, so it was very common for a new batch to show up with different data or nulls that broke the downstream models.
And while the data teams tasked with solving these challenges struggled, the C-suite increasingly demanded AI that tapped into the ever-growing stream of real-time data flowing through their systems.
So when Raj first described what Arcion (then Blitzz) did, the lightbulb went off. Digging in, it was clear that Arcion solved the problems mentioned above in a modern and scalable way. It was also built on a cloud-native architecture that mirrored the platforms on which AI applications were being developed. So we got to work, capitalized the company, and started scaling it up.
Over the last two years, the demand for resilient data pipes, transactional integrity, and real-time delivery has exploded. We’ve worked with a number of partners, but Databricks always stood out. First, the solutions our customers were building on Databricks were truly innovative. They weren’t just trying to speed up queries or make snappier dashboards; Databricks use cases were pushing the envelope of what a data application could deliver. The best minds on our customers' data teams were deeply involved in almost every deployment, which told us that thought leaders preferred Databricks. As an added bonus, our teams worked together seamlessly, and it was clear that we shared a common vision of the data and AI market.
Looking forward, it’s clear that enterprises that want to derive value from AI need to choose the right platform on which to build apps. Having worked with many platforms as partners, I am certain that Databricks is that platform. This acquisition further proves the point. By integrating Arcion, Databricks customers will be able to reduce or remove the friction of building and maintaining data pipes, and enable new, innovative, and highly valuable AI use cases.
So in closing, a few big thanks.
First, thanks to our amazing team. Having worked in and with many startups, I can say this is a one-of-a-kind team that has shown incredible skill and an amazing level of customer dedication. Building a highly complex and exacting product of this sort is very difficult, yet you’ve made it look easy.
I am also very appreciative of our investors (Bessemer, Uncorrelated, Tectonic, First Rays, Emergent, HPE Pathfinder, and Databricks Ventures), who gave us the room to flesh out this vision. Infrastructure investing is hard and takes lots of patience, so I’m glad that patience paid off. Your advice and counsel along the way were much appreciated and contributed greatly to our success.
Finally, to Raj: your vision was amazing, and it was great to partner with you. I really look forward to seeing what value we’ll create under the Databricks banner.