Preface
Delta helps you generate reliable insights at scale and simplifies the architecture around data pipelines, allowing you to focus on refining the use cases you are working on. This is especially important because the same architecture is reused when onboarding new use cases.
In this book, you'll learn the principles of distributed computing, data modeling techniques, and big data design patterns and templates that solve end-to-end data flow problems for common scenarios and are reusable across use cases and industry verticals. You'll also learn how to recover from errors and the best practices for handling structured, semi-structured, and unstructured data using Delta. Next, you'll get to grips with features such as ACID transactions on big data, disciplined schema evolution, time travel to rewind a dataset to an earlier point in time or version, and unified batch and streaming capabilities that will help you build agile and robust data products.

By the end of this book, you'll be able to use Delta as the foundational block for creating analytics-ready data that fuels all your AI/BI use cases.