Data Modeling with Snowflake

Unlocking the Power of Modeling

The word modeling has come to mean very different things over the half century that it has been practiced in database systems. This opening chapter prefaces the book’s overall aim: to demystify modeling, along with its applications, methodologies, and benefits. Throughout this journey, the concept of modeling will unfold into a set of methods and terms that help organizations design and manage data and, more importantly, help them understand themselves.

In its broadest interpretation, modeling is a selective simplification that aids in navigating or designing something more complex. Any system can be broken down into smaller, more manageable pieces. Manipulating any piece individually may be straightforward, but doing so without regard to an overall strategy is a tenuous proposition that is sure to encumber scalability and maintenance down the line.

While modeling is generally considered database-agnostic, modern cloud data platforms, such as Snowflake, present their users with many unique features thanks to their innovative architecture and consumption-based pricing. A clear and forward-looking design that takes advantage of the native features of the platform that supports it is the key to building cost-effective solutions capable of meeting and anticipating business needs.

As the analytical requirements of a data-driven organization are notoriously complex and constantly evolving, modeling must keep pace and accompany data teams from idea to execution. To achieve this, modeling must go beyond the structure and relationships of database tables and embrace the transformational logic that moves and shapes the underlying data. Only by leaning into the specifics of Snowflake features and architecture can a model be built efficiently from beginning to end.

In this chapter, we’re going to cover the following main topics:

  • Recognizing the utility of models in our daily lives
  • Getting a glimpse of modeling conventions in action
  • Getting acquainted with the tools in the modeling toolkit
  • Uncovering the benefits of modeling for enterprise teams
  • Incorporating modeling into strategic planning
  • Understanding modeling applications for transactional and analytical systems

Technical requirements

This book focuses on data modeling specifically for the Snowflake Data Cloud. While modeling includes many system-agnostic terms and conventions, this book will leverage unique features of Snowflake architecture, data types, and functions when building physical models and Structured Query Language (SQL) transformations.

To follow along with the exercises in the following chapters, you will need a Snowflake account with access to a sandbox area where you can create schemas and objects and load data.

You can sign up for a 30-day free trial of Snowflake (https://signup.snowflake.com/) if you do not already have access.
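
If you do not already have a dedicated sandbox, a few statements are enough to carve one out. This is only a minimal sketch: the database, schema, and warehouse names are illustrative assumptions rather than anything this book prescribes, so adjust them to your account’s conventions and your role’s privileges.

-- Hypothetical sandbox objects; any names will do
CREATE DATABASE IF NOT EXISTS modeling_sandbox;
CREATE SCHEMA IF NOT EXISTS modeling_sandbox.ch01;

-- A small warehouse that suspends itself to preserve trial credits
CREATE WAREHOUSE IF NOT EXISTS sandbox_wh
  WITH WAREHOUSE_SIZE = 'XSMALL'
       AUTO_SUSPEND = 60
       AUTO_RESUME = TRUE;

USE SCHEMA modeling_sandbox.ch01;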

This book will frequently use visual modeling diagrams as part of the modeling process. While a diagram can be drawn by hand or constructed in a general-purpose tool such as PowerPoint or Lucidchart, the exercises in this book take the reader from conceptual, database-agnostic diagrams all the way to deployable, runnable Snowflake code. For this reason, a tool that supports common database modeling features across the various modeling types and can forward engineer Snowflake syntax is recommended.

The diagrams in this book were generated using the SqlDBM online database modeling tool (https://sqldbm.com/Home/), which supports the previously mentioned features and offers a 2-week free trial.

Modeling with purpose

Models are used to simplify complex systems. Take a modern city as an example, and you will see that it consists of intricately linked systems such as highways, electrical grids, and transit systems. While these systems operate in the same physical territory, they require very different models to help us understand them. For example, a subway system snakes and curves below a city’s varied terrain, but our model of it—a subway map—uses straight lines and places stations at nearly equidistant intervals. The subway map is not the city—it is a selective simplification of the city that makes it easier for passengers to visualize their journey. The transit map is a model so ubiquitous that it’s hard to imagine doing it any other way—yet it took time to evolve.

The subway map, as we know it today, was invented by Harry Beck in 1931 while re-designing the map used by the London Underground. The old design was confusing to riders because it focused on the wrong goal—geographical exactness. Here’s what it looked like before Beck:

Figure 1.1 – London tube map, before Beck (Legacy Tube map)

Thankfully, Beck was not a cartographer—he was an engineer. By sacrificing topographical detail, Beck’s design allowed passengers to quickly count the number of stops required for their journey while retaining their overall sense of direction. This story reminds us (quite literally) of the refrain, the map is not the territory.

As with maps, various kinds of modeling exist to help teams within an organization make sense of the many layers that make up its operational landscape. Also, like maps, models help organizations prepare for the journey ahead. But how does one use a model to navigate a database, let alone plan its future?

Leveraging the modeling toolkit

Before we continue, we need to formally delineate three distinct concepts that are often used together in the service of modeling. Doing so will make it simpler to refer to a specific tool in the modeling toolkit in later sections. Understanding where each piece fits in the broader domain of database design and management will also make the deeper technical concepts later in the book more meaningful and easier to digest.

The three components are listed here:

  • Natural language semantics—words
  • Technical semantics—SQL
  • Visual semantics—diagrams

Let’s discuss each of these in detail, as follows:

  • Natural language semantics: Terminology employed in communicating details of a model between people. These are agreed-upon words that employ pre-defined conventions to encapsulate more complex concepts in simpler terms. For example, when both parties involved in a verbal exchange understand the concept of a surrogate key, it saves them from having to explain that it is a unique identifier for a table record that holds no intrinsic business meaning, such as an integer sequence or a hash value.

To ensure effective technical conversations, it helps to be fluent in the semantics of modeling. Not only does it save time by succinctly communicating a complex concept, but it also saves even more time by not miscommunicating it. A waiter would bring back very different food for an order of chips in London than in Los Angeles, but a properly modeled database would never return different records for the same surrogate key (a minimal example of declaring a surrogate key follows this list).

  • Technical semantics: SQL is a domain-specific language used to manage data in a Relational Database Management System (RDBMS). Unlike a general-purpose language (for example, Python), a domain-specific language has a much narrower range of application but offers much richer nuance and precision within its domain. While SQL can’t format a website or send an email, it allows us to create the structure of our database and manipulate its contents.

SQL bridges modeling concepts (expressed in words or images) and what is physically defined in the database. Snowflake uses an American National Standards Institute (ANSI)-compliant SQL syntax, meaning its basic commands (such as SELECT, UPDATE, DELETE, INSERT, and WHERE) are compatible with other database vendors who use this standard. Snowflake also offers many extra functions, clauses, and conventions that go beyond ANSI-standard SQL and give users added flexibility to manage the database.

Unfortunately, due to its domain-specific nature, SQL presents a significant limitation: it can only express what the database explicitly understands. While SQL can define table structure and precisely manipulate data, it is too detailed to easily articulate the underlying business requirements.

  • Visual semantics: Through their simplicity, images can convey a density of information that other forms of language simply cannot. In modeling, diagrams combine the domain-specific precision of SQL with the nuance of natural language. This gives diagrams a lot to work with to capture a data model’s business meaning and technical specifics.

To start, diagrams vary in the level of detail they present—giving the observer exactly what they’re looking for without overwhelming (or underwhelming) them with information. Most importantly, the semantic conventions used in diagrams are universal and can be understood by people besides data analysts and engineers. Yes—modeling diagrams are considered technical drawings; they represent strict technical concepts through agreed-upon visual conventions. However, in their simplest form, models can be understood almost intuitively with no prior knowledge. Even at the more advanced levels, such as logical and physical, learning to read a model is much simpler than learning SQL.
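
As promised above, here is a minimal sketch of what the surrogate key convention looks like once it reaches SQL. The table and column names are illustrative assumptions; the only point is that the key carries no business meaning, whether generated as an integer sequence or derived as a hash:

-- Surrogate key as an auto-incrementing integer with no business meaning
CREATE TABLE customer (
    customer_sk NUMBER AUTOINCREMENT START 1 INCREMENT 1,  -- surrogate key
    customer_nk VARCHAR(20),                               -- natural (business) key, e.g., an account number
    full_name   VARCHAR(100),
    CONSTRAINT pk_customer PRIMARY KEY (customer_sk)       -- note: Snowflake does not enforce primary keys on standard tables
);

-- Alternatively, a hash of the natural key can serve as the surrogate
SELECT MD5(customer_nk) AS customer_hash_sk
FROM customer;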

When all these semantics come together and are understood by the entire organization, they form a ubiquitous language, a concept first described by Eric Evans in Domain-Driven Design. Modeling then forms a part of the vocabulary that is understood universally throughout the organization to describe its business and store the data assets that support it. But that is just one of the many benefits that modeling provides.

The benefits of database modeling

Tactics without strategy is the noise before defeat. (Sun Tzu)

For many people, database modeling brings to mind stale diagrams, arcane symbols, or extra work at the end of a project. Only a decade ago, fueled by the rise of distributed computing in the early 2000s—which popularized the concept of big data—the notion that modeling is dead gained notoriety. More precisely, it was thought that cheap and near-limitless computing power had made planning and designing a thing of the past. It was said that flexible semi-structured data formats and the ability to parse them on the fly—known as schema-on-read—had made modeling obsolete.

Eventually, operating and maintenance costs caught up with reality and revealed two great shortcomings of the schema-on-read approach. The first is that no matter how data is structured, it must be functionally bound to the business that it helps support. In other words, semi-structured formats are neither a panacea nor an excuse to forgo the process of business validation. The second, and more important, is that a model is not simply the shape that data takes once uploaded to a database but rather the blueprint for business operations, without which it is impossible to build sustainable architectures.

Sustainable solutions require a long-term strategy to ensure their design matches the underlying business model. Without this, schema-on-read (discussed in Chapter 15, Modeling Semi-Structured Data), star schema (discussed in Chapter 17, Scaling Data Models through Modern Techniques), or any other schema are narrow-sighted tactics that lead nowhere. But done right, modeling makes developing database architectures more agile and helps the project evolve from the idea phase to implementation. At every stage of development, the model serves as a guide for supporting the conversations necessary to propel the design into the next phase and provide additional business context. Once implemented, the model becomes a living document that helps users understand, navigate, and evolve the system it helped create.

While every organization models in the technical sense—creating tables and transforming data—not everyone models strategically, end to end, in the broad sense of the word, thereby forgoing the long-term benefits. Some of these benefits include the following:

  • Consensus and visibility of the broader business model
  • More productive conversations with business teams
  • Better quality of requirements
  • Higher signal, lower noise in technical conversations
  • Cross-platform, cross-domain, and widely understood conventions
  • Big-picture visual overview of the business and its database footprint
  • Preliminary designs become implementation blueprints
  • Accelerating onboarding of new team members
  • Making data more accessible and unlocking self-service within organizations
  • Keeping the database landscape manageable at scale
  • Getting a handle on complex data pipelines

To demonstrate the difficulties of working without formal modeling, we can take a simple schema based on Snowflake’s shared TPC-H dataset (available in the shared database called SNOWFLAKE_SAMPLE_DATA), which, at first glance, looks like this:

Figure 1.2 – A list of tables in the Snowsight UI
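
If you would like to reproduce a listing like the one in Figure 1.2, the shared sample data can be queried directly. The schema name TPCH_SF1 is an assumption on my part (the share contains several scale factors), and the example schema used in this chapter renames some of these tables, so expect the names to differ slightly:

-- Browse the shared TPC-H sample schema (a read-only share)
SHOW TABLES IN SCHEMA SNOWFLAKE_SAMPLE_DATA.TPCH_SF1;

-- Or inspect it through the information schema
SELECT table_name, row_count
FROM SNOWFLAKE_SAMPLE_DATA.INFORMATION_SCHEMA.TABLES
WHERE table_schema = 'TPCH_SF1'
ORDER BY table_name;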

While these tables have been modeled in the strict sense of the word and even contain data, we get very little information on what that data represents, how it relates to data in other tables, or where it fits in the broad context of business operations.

Intuition suggests that SALES_ORDER and CUSTOMER share a relationship, but this assertion needs to be tested. Even in this trivial example of only eight tables, it will take considerable time to thoroughly sift through the data to understand its context.

The irony is that many of the details we’re looking for are already baked into the design of the physical tables, having been modeled at some point in the past. We just can’t see them. Without a map, the terrain is lost from view.

Here is the same set of tables visualized through a modeling convention called an Entity-Relationship Diagram (ERD):

Figure 1.3 – A conceptual model using crow’s foot notation

At a glance, the big picture comes into focus. Diagrams such as this one allow us to understand the business concepts behind the data and ensure they are aligned. Having a visual model also lets us zoom out from individual tables and understand the semantics of our business: What are the individual pieces involved and how do they interact? This global perspective gives everyone in the organization a means of finding and making sense of data assets without requiring a technical background—thus, business analysts or new hires can unlock the value of the information without any help from the data team.

As the organization grows, expanding in personnel and data assets, it will inevitably become too big for any person, or even a team of people, to coordinate. Here, organizations that have embraced data modeling will stand out from those that did not. Modeling can be the thing that helps organizations scale their data landscape, or it can be the technical debt that holds them back.

Yet, for all its benefits, modeling is not a cookie-cutter solution that guarantees success. There are many approaches to modeling and various modeling methodologies that are suited for different workloads. Throughout this book, we will tackle the fundamentals of modeling that will allow you to understand these differences and apply the best solution using a first-principles approach. First, we will begin by breaking down the two main database use cases and observing the role modeling plays in each of them.

Operational and analytical modeling scenarios

The relational database as we know it today emerged in the 1970s—allowing organizations to store their data in a centralized repository instead of on individual tapes. Later that decade, Online Transaction Processing (OLTP) emerged, enabling faster access to data and unlocking new uses for databases such as booking and bank teller systems. This was a paradigm shift for databases, which evolved from data archives to operational systems.

Due to limited resources, data analysis could not be performed on the same database that ran operational processes. The need to analyze operational data gave rise, in the 1980s, to Management Information Systems (MIS), or Decision Support Systems (DSS) as they later became known. Data would be extracted from the operational database to the DSS, where it could be analyzed according to business needs. OLTP architecture is not well suited to this kind of analysis, so Online Analytical Processing (OLAP) emerged to enable users to analyze multidimensional data from multiple perspectives using complex queries. This is the same paradigm used today by modern data platforms such as Snowflake.

The approach to storing and managing data in OLAP systems fundamentally differs from the operational or transactional database. Data in OLAP systems is generally stored in data warehouses (also known as DWs or DWHs)—centralized repositories that store structured data from various sources for the purpose of analysis and decision-making. While the transactional system keeps the up-to-date version of the truth and is generally concerned with individual records, the data warehouse snapshots many historical versions and aggregates volumes of data to satisfy various analytical needs.

Data originates in the transactional database when daily business operations (for example, bookings, sales, withdrawals) are recorded. In contrast, the warehouse does not create data but rather loads information extracted from one or more source systems. The functional differences between transactional databases and warehouses present different modeling challenges.

A transactional system must be modeled to fit the nature of the data it is expected to process. This means knowing the format, relationships, and attributes required for a transaction.

The main concern of a transactional database model is the structure and relationships between its tables.

By contrast, the data warehouse loads existing data from the source system. A data warehouse isn’t concerned with defining a single transaction but with analyzing multitudes of transactions across various dimensions to answer business questions. To do this, a data warehouse must transform the source data to satisfy multiple business analyses, which often means creating copies with varying granularity and detail.

Modeling in a data warehouse builds upon the relational models of its source systems by conforming common elements and transforming the data using logic.

Wait—if transformational logic is a core concept in data warehouse modeling, why is it so consistently absent from modeling discussions? Because in order to do transformational modeling justice, one must forgo the universality of general modeling principles and venture into the realm of platform specifics (that is, syntax, storage, and memory utilization). This book, in contrast, will embrace Snowflake specifics and go beyond physical modeling by diving into the transformation logic behind the physical tables. This approach provides a fuller understanding of the underlying modeling concepts and equips the reader with the required SQL recipes to not only build models but to load and automate them in the most efficient way possible. As we’ll see in later chapters, this is where Snowflake truly shines and confers performance and cost-saving benefits.

Is Snowflake limited to OLAP?

Snowflake’s primary use case is that of a data warehouse—with all the OLAP properties to enable multidimensional analysis at scale over massive datasets. However, at the 2022 Snowflake Summit, the company announced a new table type called Hybrid Unistore, which features both an OLTP-storage table and an OLAP analysis table under one semantic object. This announcement means Snowflake users can now design transactional OLTP database schemas while leveraging the analytical performance that Snowflake is known for. Hybrid Unistore tables are discussed in more detail in later chapters.
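
For readers curious about the shape this takes in practice, the following is a hedged sketch rather than a definitive reference: hybrid tables were announced in preview, availability and syntax may vary by account, and the table and column names here are invented for illustration.

-- Hybrid table: row-oriented storage with an enforced primary key,
-- queryable alongside standard (columnar) tables in the same database
CREATE HYBRID TABLE order_status (
    order_id   NUMBER NOT NULL,
    status     VARCHAR(20),
    updated_at TIMESTAMP_NTZ,
    CONSTRAINT pk_order_status PRIMARY KEY (order_id)
);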

Although OLAP and OLTP systems are optimized for different kinds of database operations, they are still databases at heart and operate on the same set of objects (such as tables, constraints, and views) using SQL. However, each use case requires very different approaches to modeling the data within. The following section demonstrates what modeling will typically look like in each scenario.

A look at relational and transformational modeling

The previous section describes how modeling varies between operational and data warehouse scenarios. Before exploring the modeling process in detail, it’s helpful to understand the look and feel of relational and transformational modeling and what we’re working toward. To that end, the following table summarizes the main differences between transactional databases and data warehouses:

Transactional Database                        | Data Warehouse
----------------------------------------------|------------------------------------------------------------
Supports daily operations                     | Provides operational insight
Operates on single records                    | Summarizes many records
Accurate as of the present instant            | Historical snapshots over time
Single source of truth (SSOT), non-redundant  | Redundant to support different analyses
Data models defined by business operations    | Data models generated by business questions
Static and structured data model              | Inherited structure and dynamically transformed data model
Single-application data                       | Multiple sources of converging data

Figure 1.4 – Common differences between transactional databases and warehouses

Given these differences, the following sections demonstrate what modeling looks like in each system and what it aims to achieve.

What modeling looks like in operational systems

Setting aside the modeling workflow that got us here (it will be covered in later chapters), we can observe an example of the type of modeling most commonly seen in transactional systems. The physical diagram in Figure 1.5 serves both as a blueprint for declaring the required tables and as a guide to understanding their business context.

Following modeling conventions (don’t worry if they are still unfamiliar—they will be covered thoroughly in the coming chapters), we can infer a lot of information from this simple diagram. For example, a person is uniquely identified by an eight-digit identifier (the primary key) and must have a Social Security number (SSN), driver’s license, name, and birth date.

The one-to-many relationship between the two tables establishes that while a person does not necessarily need to have an account created, an account must belong to just one person:

Figure 1.5 – A physical model using crow’s foot notation
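
To connect the diagram to the technical semantics discussed earlier, here is a hedged sketch of the DDL that such a physical model might forward engineer. The data types and exact column list are assumptions based on the description above (an eight-digit identifier, SSN, driver’s license, name, and birth date), not a transcription of the figure:

-- Sketch of the PERSON and ACCOUNT physical model (column types are assumptions)
CREATE TABLE person (
    person_id       NUMBER(8,0)  NOT NULL,  -- eight-digit unique identifier
    ssn             VARCHAR(11)  NOT NULL,
    drivers_license VARCHAR(20)  NOT NULL,
    name            VARCHAR(100) NOT NULL,
    birth_date      DATE         NOT NULL,
    CONSTRAINT pk_person PRIMARY KEY (person_id)
);

CREATE TABLE account (
    account_id   NUMBER(8,0) NOT NULL,
    account_type VARCHAR(20),
    person_id    NUMBER(8,0) NOT NULL,      -- each account belongs to exactly one person
    CONSTRAINT pk_account PRIMARY KEY (account_id),
    CONSTRAINT fk_account_person FOREIGN KEY (person_id) REFERENCES person (person_id)
);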

These details, combined with the list of attributes, data types, and constraints, not only dictate what kinds of data can be written to these tables but also provide an idea of how the business operates. So, how does this differ in analytical databases?

What modeling looks like in analytical systems

In a data warehouse scenario, the PERSON and ACCOUNT tables would not be defined from scratch—they would be extracted from the source in which they exist and loaded—bringing both structure and data into the process. Then, the analytical transformations begin in answer to the organization’s business questions. This is a process known as Extract Transform Load (ETL). (Although ELT has become the preferred processing order, the original term stuck.)

Suppose the management team wanted to analyze which age groups (by decade) were opening which account types, and to store the result in a separate table for independent analysis.

The following diagram shows the resulting relational model of an object obtained through transformational analysis but provides no business context:

Figure 1.6 – A relational model of a transformational requirement

Although physical modeling could describe such a table (as seen in Figure 1.6)—containing the account type with age and count of accounts as integers—such a model would fail to communicate the most relevant details, presented here:

  • The logic used to perform the analysis
  • The relationship between the source tables and the output

The business requirement for ACCOUNT_TYPE_AGE_ANALYSIS in this example purposely excludes the source key fields from the target table, preventing the possibility of establishing any relational links. However, the relational model still serves a vital role: it tells us how the sources are related and how to join them correctly to produce the required analysis.

The logic could then be constructed by joining PERSON and ACCOUNT, as shown here:

CREATE TABLE account_type_age_analysis AS
SELECT
    a.account_type,
    -- Round each age to the nearest decade (for example, 34 becomes 30)
    ROUND(DATEDIFF(years, p.birth_date, CURRENT_DATE()), -1) AS age_decade,
    COUNT(a.account_id) AS total_accounts
FROM account AS a
INNER JOIN person AS p
    ON a.person_id = p.person_id
GROUP BY 1, 2;
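
Once the table exists, the analysis can be consumed with an ordinary query. The following is only an illustrative follow-up, not part of the book’s exercise:

-- Inspect the aggregated result by account type and decade
SELECT account_type, age_decade, total_accounts
FROM account_type_age_analysis
ORDER BY account_type, age_decade;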

Although there is no relational connection between ACCOUNT_TYPE_AGE_ANALYSIS and its sources, there is still a clear dependency on them and their columns. Instead of using ERDs, which convey entities and relationships, transformational pipelines are visualized through a lineage diagram. This type of diagram gives a column-level mapping from source to target, including all intermediate steps, as shown here:

Figure 1.7 – Transformational modeling seen visually

Paired with the SQL logic used to construct it, the lineage graph gives a complete picture of the transformational relationship between sources and targets in an analytical/warehousing scenario.

Having seen both the relational and the transformational approaches to modeling, we can appreciate that both play a vital role in navigating the complex, dynamic environments one is liable to encounter in an enterprise-scale Snowflake deployment.

Although we have only skimmed the surface of what modeling entails and the unique features of the Snowflake platform that can be leveraged to this end, this chapter has hopefully given you an idea of the vital role that modeling plays in building, maintaining, and documenting database systems. Before diving into the specifics of verbal, technical, and visual modeling semantics in the chapters to come, let’s review what we learned.

Summary

There is no escaping modeling. We use it in our everyday lives to plan and navigate the complexities of the world around us—databases are no exception. For some readers, the modeling styles presented in this chapter may be a new way of conceptualizing their database landscape, while others may be getting reacquainted with its notation and use cases. Whether thinking about a company business model or sharing a finalized design with team members, we all engage in modeling to varying degrees.

Embracing database modeling and learning to speak in a commonly understood language unlocks many time-saving and collaborative benefits for the entire organization. Thinking long-term and modeling strategically, as opposed to reacting tactically, aligns database designs to the business that they underpin, ensuring their viability. Having seen the advantages that modeling uncovers and where it can be implemented, we can begin to analyze its components to understand precisely where they should be used and how they form a natural design progression.

In the next chapter, we will explore the four modeling types used in database design, discussing where each excels and how they build on one another to help evolve an idea into a technical system design while generating living project artifacts for navigating and maintaining the final product.

Further reading

Eric Evans’ book, mentioned earlier in this chapter, explores how to create effective models by going beyond the surface and getting to the intention of the system itself. It is a recommended read for those wishing to go deeper into the realm of effective communication through models, unrestricted by specific technical domains, methods, or conventions:

Evans, Eric. Domain-Driven Design: Tackling Complexity in the Heart of Software. Addison-Wesley Professional, 2004.
