Hands-On SQL Server 2019 Analysis Services

You're reading from Hands-On SQL Server 2019 Analysis Services: Design and query tabular and multi-dimensional models using Microsoft's SQL Server Analysis Services

Product type: Paperback
Published in: Oct 2020
Publisher: Packt
ISBN-13: 9781800204768
Length: 474 pages
Edition: 1st Edition
Author: Steven Hughes
Table of Contents

Preface
Section 1: Choosing Your Model
Chapter 1: Analysis Services in SQL Server 2019
Chapter 2: Choosing the SQL Server 2019 Analytic Model for Your BI Needs
Section 2: Building and Deploying a Multidimensional Model
Chapter 3: Preparing Your Data for Multidimensional Models
Chapter 4: Building a Multidimensional Cube in SSAS 2019
Chapter 5: Adding Measures and Calculations with MDX
Section 3: Building and Deploying Tabular Models
Chapter 6: Preparing Your Data for Tabular Models
Chapter 7: Building a Tabular Model in SSAS 2019
Chapter 8: Adding Measures and Calculations with DAX
Section 4: Exposing Insights while Visualizing Data from Your Models
Chapter 9: Exploring and Visualizing Your Data with Excel
Chapter 10: Creating Interactive Reports and Enhancing Your Models in Power BI
Section 5: Security, Administration, and Managing Your Models
Chapter 11: Securing Your SSAS Models
Chapter 12: Common Administration and Maintenance Tasks
Other Books You May Enjoy

Data optimization considerations

Another consideration when preparing your data for tabular models is the set of data refresh options available. Typically, data is imported into your tabular model, much as it was for the multidimensional models. Imported data is loaded into memory and optimized by the VertiPaq engine, which applies a high degree of compression using columnar storage techniques. Compression and in-memory storage combine to produce a compact, high-performing model. Here are some key considerations when using data refresh (a sample refresh command follows these points):

  • Refresh frequency: The data is only as fresh as the last import. If the data source has been updated recently, the model may be out of sync. This is less of an issue when you are loading data from a data warehouse, which is typically loaded in batches as well. If you match your refreshes to those batch loads, your data will remain consistent with the data warehouse. If you have chosen to use the transactional database...
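For example, if the warehouse batch load finishes overnight, a scheduled job can trigger the tabular model refresh immediately afterward so the two stay in step. The following is a minimal sketch of a TMSL refresh command, which can be run from an XMLA query window in SSMS or from a SQL Server Agent job step; the database name SalesTabular is a placeholder for your own deployed model:

    {
      "refresh": {
        "type": "full",
        "objects": [
          {
            "database": "SalesTabular"
          }
        ]
      }
    }

A full refresh reloads the data and recalculates the model; lighter-weight types such as dataOnly or calculate are available when only part of that work is needed.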