Pentaho Data Integration Cookbook - Second Edition

You're reading from Pentaho Data Integration Cookbook - Second Edition. The premier open source ETL tool is at your command with this recipe-packed cookbook. Learn to use data sources in Kettle, avoid pitfalls, and dig out the advanced features of Pentaho Data Integration the easy way.

Product type: Paperback
Published in: Dec 2013
Publisher: Packt
ISBN-13: 9781783280674
Length: 462 pages
Edition: 2nd Edition
Table of Contents (21 chapters)

Pentaho Data Integration Cookbook Second Edition
Credits
About the Author
About the Reviewers
www.PacktPub.com
Preface
1. Working with Databases
2. Reading and Writing Files
3. Working with Big Data and Cloud Sources
4. Manipulating XML Structures
5. File Management
6. Looking for Data
7. Understanding and Optimizing Data Flows
8. Executing and Re-using Jobs and Transformations
9. Integrating Kettle and the Pentaho Suite
10. Getting the Most Out of Kettle
11. Utilizing Visualization Tools in Kettle
12. Data Analytics
Data Structures
References
Index

Performing repetitive database design tasks from PDI


While we have cautioned that database design tasks should not normally be performed using PDI, some tasks are so repetitive that building a simple transformation to execute them can save a lot of time. For instance, suppose new auditing columns need to be added to every table in a given database. With PDI and the database's metadata repository, adding those columns is very straightforward.

In this recipe we will create a transformation that reads the list of tables in the books database from the database's metadata repository and builds a dynamic query for each table.

Getting ready

For this recipe you will need the books database, which can be created and populated by running the script books.sql, found in the code bundle for this chapter.

As the recipe reads metadata from the books database, explore the database's metadata repository before proceeding to see what is inside. In particular, run the following statement to see what tables are available in the books database:

SELECT
    TABLE_NAME
    , TABLE_TYPE
    , ENGINE
    , VERSION
    , ROW_FORMAT
    , TABLE_ROWS
    , AVG_ROW_LENGTH
    , DATA_LENGTH
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = 'books';

+--------------+------------+--------+---------+------------+------------+----------------+-------------+
| TABLE_NAME   | TABLE_TYPE | ENGINE | VERSION | ROW_FORMAT | TABLE_ROWS | AVG_ROW_LENGTH | DATA_LENGTH |
+--------------+------------+--------+---------+------------+------------+----------------+-------------+
| authors      | BASE TABLE | InnoDB |      10 | Compact    |         13 |           1260 |       16384 |
| books        | BASE TABLE | InnoDB |      10 | Compact    |         34 |            481 |       16384 |
+--------------+------------+--------+---------+------------+------------+----------------+-------------+
2 rows in set (0.00 sec)

Note

The preceding statement and the following tutorial are written in MySQL syntax. If you are using a different DBMS, review the statements and adjust them as needed.

How to do it...

  1. Create a transformation. Add a Table Input step that reads TABLE_NAME from the TABLES table in MySQL's information_schema database, filtering on TABLE_SCHEMA = 'books' so that only the tables of the books database are returned.

  2. From the Scripting category, add an Execute SQL script step. Check the option Execute for each row? and add TABLE_NAME to the Parameters: section. For the SQL script to execute, use the following:

    ALTER TABLE ?
      ADD COLUMN create_date DATETIME DEFAULT '1900-01-01 00:00:00',
      ADD COLUMN update_date TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
  3. Create a hop between the Table Input and Execute SQL script steps. Save and run the transformation. When completed, check the tables in the books database. All of them should now have a create_date and an update_date column added.
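The mechanics of the transformation can be sketched outside PDI as well. The following Python snippet is a minimal stand-in, not the recipe itself: it uses SQLite instead of MySQL, so sqlite_master plays the role of information_schema and the update_date default is simplified to a constant (SQLite's ALTER TABLE does not support ON UPDATE CURRENT_TIMESTAMP). The table names and column definitions mirror the recipe; everything else is illustrative.

```python
import sqlite3

# Stand-in for the books database; in the recipe this is MySQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE books   (id INTEGER PRIMARY KEY, title TEXT);
""")

# Step 1 equivalent: read the table names from the metadata repository
# (sqlite_master here, information_schema.TABLES in MySQL).
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

# Step 2 equivalent: execute the DDL once per incoming row. Identifiers
# cannot be bound as ? parameters outside PDI, so the table name is
# interpolated into the statement here.
for table in tables:
    conn.execute(f"ALTER TABLE {table} ADD COLUMN create_date TEXT "
                 f"DEFAULT '1900-01-01 00:00:00'")
    conn.execute(f"ALTER TABLE {table} ADD COLUMN update_date TEXT "
                 f"DEFAULT '1900-01-01 00:00:00'")

# Step 3 equivalent: confirm every table gained both audit columns.
for table in tables:
    cols = [c[1] for c in conn.execute(f"PRAGMA table_info({table})")]
    print(table, "create_date" in cols and "update_date" in cols)
```

Running it prints one line per table showing that both audit columns were added, which is exactly the check suggested at the end of step 3.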

How it works...

A database's metadata repository is a very powerful and effective way to discover the structure of a given source database. This transformation took advantage of that by reading the list of all the tables in the books database and running a parameterized query against each one, adding two columns to each table based on the table name.

Try adding additional filters to target only certain tables in the books database. MySQL's information_schema database also has a table that details the columns of each table (aptly named COLUMNS). For larger databases, you may want to process only a subset of tables, selected by the columns or data types they contain.
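For example, a query along these lines (a sketch in MySQL syntax; it assumes the create_date audit column from this recipe) would feed the Table Input step only the tables that have not yet been altered, making the transformation safe to re-run:

```sql
-- Select only the books tables that still lack the create_date column
SELECT t.TABLE_NAME
FROM information_schema.TABLES t
WHERE t.TABLE_SCHEMA = 'books'
  AND t.TABLE_TYPE = 'BASE TABLE'
  AND NOT EXISTS (
    SELECT 1
    FROM information_schema.COLUMNS c
    WHERE c.TABLE_SCHEMA = t.TABLE_SCHEMA
      AND c.TABLE_NAME  = t.TABLE_NAME
      AND c.COLUMN_NAME = 'create_date');
```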

Note

While it has been stated before, it bears mentioning again that this technique must be used with extreme caution since it can drastically alter your database depending on the type of query executed!

See also

  • Getting data from a database by running a query built at runtime

  • The Executing part of a job once for every row in a dataset recipe in Chapter 8, Executing and Re-using Jobs and Transformations

  • Building SQL queries based on database metadata
