Microsoft SQL Server 2012 Integration Services: An Expert Cookbook


Working with flat files in Data Flow


In this recipe, we will demonstrate the use of the Flat File Source component, which is often used in data integration projects. As explained in Chapter 2, Control Flow Tasks, for security reasons the owners of operational systems (OS) usually prefer to push data to an external location instead of providing direct access to the OS.

Data quality problems exist even in conventional databases such as SQL Server, Oracle, DB2, and so on, so it is easy to imagine the problems that can arise when dealing with flat files. The construction of these files can introduce several issues, because the external system that reads each record in the file (for example, the Extract step of the ETL process) needs to know how to split the content into rows and columns. The Row delimiter is required in order to split the file into rows, whereas the Column delimiter is required to split each row into columns. But in many cases, the Column delimiter can appear in the column content...
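To make the delimiter problem concrete, here is a short C# sketch (illustrative only; the class and method names such as FlatFileParsingDemo and SplitRow are hypothetical and not part of the recipe). It contrasts a naive split on the column delimiter with a qualifier-aware split, which is roughly the behaviour the Text qualifier setting of a Flat File Connection Manager provides when a value may contain the Column delimiter:

using System;
using System.Collections.Generic;

// A minimal sketch, not the SSIS engine itself: shows why splitting a row
// purely on the column delimiter breaks when the delimiter also appears
// inside a column value, and how a text qualifier (") resolves it.
class FlatFileParsingDemo
{
    // Splits one row into columns, honouring a text qualifier so that a
    // column delimiter inside a qualified value is kept as data.
    static List<string> SplitRow(string row, char columnDelimiter, char textQualifier)
    {
        var columns = new List<string>();
        var current = new System.Text.StringBuilder();
        bool insideQualifiedValue = false;

        foreach (char c in row)
        {
            if (c == textQualifier)
            {
                // Toggle on the opening/closing qualifier character.
                insideQualifiedValue = !insideQualifiedValue;
            }
            else if (c == columnDelimiter && !insideQualifiedValue)
            {
                // A delimiter outside the qualifiers ends the current column.
                columns.Add(current.ToString());
                current.Clear();
            }
            else
            {
                // Ordinary character, or a delimiter inside a qualified value.
                current.Append(c);
            }
        }
        columns.Add(current.ToString());
        return columns;
    }

    static void Main()
    {
        // The second column contains the column delimiter (,) inside its value.
        string row = "1001,\"Smith, John\",Redmond";

        // Naive split: produces 4 columns instead of the expected 3.
        Console.WriteLine("Naive split:     " + string.Join(" | ", row.Split(',')));

        // Qualifier-aware split: produces the expected 3 columns.
        Console.WriteLine("Qualified split: " + string.Join(" | ", SplitRow(row, ',', '"')));
    }
}

Running the sketch, the naive split breaks the sample row into four columns, while the qualifier-aware split returns the expected three, which is why a text qualifier is commonly configured alongside the row and column delimiters.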
