It’s been about a year since I last wrote about Legacy and modern architectures, and what a year it has been! We have seriously changed our working culture, adapting to the impact of COVID-19 and the need for social distancing. In-person meetings were replaced with video conferencing, and we learned to live with those pesky interruptions: surprise spouses, kids, and pets joining the video calls, along with the technical difficulties that often came with them (Am I on mute?).
Reflecting on the changes that have shaped information and analytics, it has also been a wild adventure. In 1979, when Teradata was founded, commercial computing systems worked in batch environments, processing kilobytes of data. Teradata’s founders had a vision of working with data sets in excess of 1 billion kilobytes (a terabyte) and designed a massively parallel architecture that could support that kind of growth. Almost 10 years after its founding, the first 1 Terabyte system was put into production, and roughly 20 years after that, customers began deploying 1+ Petabyte systems.
There have been a number of disruptive events over the last 40 years and each time Teradata has risen to meet those challenges on behalf of its customers.
The first relational database systems were deployed for decision support, most often financial reporting and analysis of corporate performance. As data volumes grew and departments and divisions wanted to do more of their own reporting, the Data Mart emerged. Teradata countered with a data mart consolidation strategy: build out an Enterprise Data Warehouse. To support this, Teradata enhanced its optimizer and indexing strategies to handle the scale, and integrated with the common ETL and BI tools its customers relied on.
Over time, companies seeking a competitive edge demanded more recent and more frequent data to support operational decisions. This brought forth a new data platform: the Operational Data Store, where users could see near-real-time views of their organization’s performance and make adjustments. Again, Teradata provided the technology underpinnings to evolve into an Active Data Warehouse. Workload management was the star, delivering consistent SLAs for known workloads while managing resources for the longer-running decision support queries. Data was loaded through trickle feed or mini-batch, and users had immediate access to it.
The next major event was the rise of open source and of data platforms built around the Hadoop architecture. The premise was simple, and in some ways similar to Teradata’s MPP (Massively Parallel Processing) architecture: take a large problem, break it into many smaller problems, solve those independently, and then bring the results back together. While a number of analytics worked well on it, Hadoop had enough downsides that it could not take the place of an Enterprise Data Warehouse. Teradata embraced an “and” strategy with these Hadoop systems, building out connectors, supporting multi-structured data sets, expanding its analytic functions, and enabling the Connected Ecosystem for its customers.
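That divide-and-conquer premise can be shown in a few lines of code. This is purely an illustrative sketch, not Teradata or Hadoop code: the function and variable names (`partition`, `partial_sum`, `parallel_sum`) are my own, and a real MPP system or MapReduce job would distribute the partitions across many machines rather than across threads in one process.

```python
from concurrent.futures import ThreadPoolExecutor

def partition(data, n_parts):
    """Break the large problem into n_parts smaller, roughly equal chunks."""
    size, rem = divmod(len(data), n_parts)
    chunks, start = [], 0
    for i in range(n_parts):
        end = start + size + (1 if i < rem else 0)
        chunks.append(data[start:end])
        start = end
    return chunks

def partial_sum(chunk):
    """Solve one sub-problem independently (one worker per chunk)."""
    return sum(chunk)

def parallel_sum(data, n_workers=4):
    """Split the work, compute partial results in parallel, then merge."""
    chunks = partition(data, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(partial_sum, chunks)
    # Bring the independent partial results back together again.
    return sum(partials)

print(parallel_sum(list(range(1, 1001))))  # 500500
```

The same shape underlies both worlds: an MPP database fans a query out to its parallel units and merges their partial answers, while MapReduce does the map step per chunk and the reduce step at the end.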
Over the past few years, cloud computing has grown into a commercially viable solution for companies of all sizes. The allure of paying for what you use, shedding independent data centers, and gaining elasticity across compute and storage has hit the mark with customers. Today, cloud service providers offer hosting solutions that meet the SLAs and security needs of the most demanding customers. Teradata has embraced a modern analytics ecosystem approach, giving its customers a choice across Amazon, Microsoft, or Google as well as flexible options for on-premises deployment. Cloud brought us object stores, elastic features, consumption-based pricing models, and a comprehensive ecosystem of services to integrate with.
I’m reminded again that some of our competition tries to brand us as “legacy,” but what they fail to understand is that over the last 40 years we have risen and adapted to every one of these changes to meet our customers’ needs. We do have a legacy, and we’re proud of it; that is what is different. We don’t want our customers to have to rip and replace their analytics environment every time a disruptive event happens. They have neither the time nor the money to waste.
What will be the next disruptive event? Perhaps we’ll see something around quantum computing, or maybe some breakthrough sentient AI applied to analytics? Whatever it is, I am looking forward to the journey as we continue to build our legacy.