Polars Cookbook

Polars Cookbook: Over 60 practical recipes to transform, manipulate, and analyze your data using Python Polars 1.x

eBook: $9.99 (reduced from $39.99)
Paperback: $49.99
Subscription: Free Trial; renews at $19.99 per month

What do you get with eBook?

  • Instant access to your digital eBook purchase
  • Download this book in EPUB and PDF formats
  • Access this title in our online reader with advanced features
  • DRM FREE: read whenever, wherever, and however you want
  • AI Assistant (beta) to help accelerate your learning

Polars Cookbook

Reading and Writing Files

Reading and writing files is a fundamental step in any data workflow: every data processing pipeline has inputs and outputs. Knowing how to read and write your data effectively is essential to a successful Polars implementation.

Polars can read and write files in lazy mode by using the scan and sink methods. These push predicates (filters) and projections (column selections) down to the scan level, and they can stream output that is larger than RAM to disk instead of materializing it in memory.
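
As a quick illustration (a minimal sketch, not one of the book's recipes; the file paths and column names below are assumptions), a lazy pipeline that scans a CSV file, pushes a filter and a column selection down to the scan, and streams the result to a Parquet file might look like this:

    import polars as pl

    # Lazily scan the CSV file; nothing is loaded into memory yet.
    # The path and column names are illustrative placeholders.
    lazy_df = pl.scan_csv('../data/customer_shopping_data.csv')

    # The filter (predicate) and select (projection) are pushed down
    # to the scan, so only the needed rows and columns are read.
    result = (
        lazy_df
        .filter(pl.col('quantity') > 1)
        .select(['invoice_no', 'customer_id', 'quantity'])
    )

    # Stream the output to disk without collecting the full result in RAM.
    result.sink_parquet('../data/output/filtered_shopping_data.parquet')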

In this chapter, you'll learn how to read and write files in various formats.

We will explore the following recipes in this chapter:

  • Reading and writing CSV files
  • Reading and writing Parquet files
  • Reading and writing Delta Lake tables
  • Reading and writing JSON files
  • Reading and writing Excel files
  • Reading and writing other data file formats
  • Reading and writing multiple files
  • Working with databases

Technical requirements

You can download the datasets and code for this chapter from the GitHub repository:

Also, it is assumed that you have installed the Polars library in your Python environment:

>>> pip install polars

It is also assumed that you have imported it in your code:

import polars as pl

Reading and writing CSV files

Comma-separated values (CSV) is one of the most commonly used file formats for storing data. The way you read a CSV file may already be familiar to you if you have worked with another DataFrame library such as pandas.

In this recipe, we’ll examine how to read and write a CSV file in Polars with some parameters. We’ll also look at how we can do the same in a LazyFrame.

How to do it...

Here are the steps and examples for how to read and write CSV files in Polars:

  1. Read the customer_shopping_data.csv dataset into a DataFrame:
    df = pl.read_csv('../data/customer_shopping_data.csv')
    df.head()

    The preceding code will return the following output:

Figure 2.1 – The first five rows of the customer shopping dataset

  2. If the CSV file doesn’t have a header, Polars will treat the first row as the header:
    df = pl.read_csv('../data/customer_shopping_data_no_header...
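
The preview cuts off here. As a hedged sketch of what reading a header-less CSV and writing a CSV back out might look like (the file path and column names are assumptions, not the book's code):

    import polars as pl

    # Tell Polars the file has no header row and supply column names ourselves.
    # The path and column names are illustrative placeholders.
    df = pl.read_csv(
        '../data/customer_shopping_data_no_header.csv',
        has_header=False,
        new_columns=['invoice_no', 'customer_id', 'category', 'quantity'],
    )

    # Write the DataFrame back out to a CSV file.
    df.write_csv('../data/output/customer_shopping_data_output.csv')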

Reading and writing Parquet files

Parquet is an open source columnar file format that's efficient for data storage and processing. Its column-oriented layout suits analytical workloads and compresses well, which is why Parquet is very common in big data analytics.

In this recipe, you will learn how to read and write Parquet files in both a DataFrame and LazyFrame.

Getting ready

Toward the end of the recipe, you’ll need the pyarrow library. If you haven’t yet installed it, run the following command:

>>> pip install pyarrow

How to do it...

We’ll first cover reading a Parquet file:

  1. Read a Parquet file:
    parquet_input_file_path = '../data/venture_funding_deals.parquet'
    df = pl.read_parquet(
        parquet_input_file_path,
        columns=['Company', 'Amount', 'Valuation', 'Industry'],
        row_index_name='row_cnt'
    )
    df.head()

    The preceding code...
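
The preview cuts off here. As a hedged sketch of the writing side (the output path and compression choice are assumptions, not the book's code), you might write a DataFrame to Parquet and then scan it back lazily:

    import polars as pl

    # Re-read the source Parquet file used earlier in this recipe.
    df = pl.read_parquet('../data/venture_funding_deals.parquet')

    # Write it back out; zstd is one of the supported compression codecs.
    # The output path is an illustrative placeholder.
    parquet_output_file_path = '../data/output/venture_funding_deals_output.parquet'
    df.write_parquet(parquet_output_file_path, compression='zstd')

    # Scan the file lazily; filters and column selections are pushed down to the scan.
    lf = pl.scan_parquet(parquet_output_file_path)
    print(lf.select(['Company', 'Amount']).collect().head())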

Reading and writing Delta Lake tables

Delta Lake is an open source storage layer built on top of the Parquet format. It adds features that plain Parquet lacks, such as versioning and ACID guarantees; essentially, it's Parquet with some extra capabilities.

Many data pipelines today are built on a lakehouse architecture, which blends data lakes and data warehouses. Delta Lake tables are a popular choice used by many companies: they can be stored in your data lake yet queried and used like relational tables. So, Polars being able to work with Delta Lake tables is a big plus.

In this recipe, we’ll look at how to read and write Delta Lake tables with a few useful parameters.

Getting ready

This recipe requires you to install another Python library, deltalake. It’s a dependency required for Polars to work with Delta Lake tables. Run the following command to install it in your Python environment:

>>> pip install deltalake
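
The preview cuts off after the installation step. As a hedged sketch of reading and writing a Delta Lake table in Polars (the table path and write mode are assumptions, not the book's code):

    import polars as pl

    # A local path for the Delta table; an illustrative placeholder.
    delta_table_path = '../data/delta/letters'

    # Write a small DataFrame as a Delta Lake table.
    # mode='overwrite' replaces the table if it already exists.
    df = pl.DataFrame({'Letter': ['A', 'B', 'C'], 'Value': [1, 2, 3]})
    df.write_delta(delta_table_path, mode='overwrite')

    # Read the Delta Lake table back into a DataFrame.
    df_from_delta = pl.read_delta(delta_table_path)
    print(df_from_delta)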

Reading and writing JSON files

JavaScript Object Notation (JSON) is an open source file format used to store and transport data. It can easily be parsed into a JavaScript object. JSON is language independent and is used across programming languages wherever a lightweight data exchange format is needed. JSON stores and represents data as key-value pairs; in Python terms, JSON data looks very much like a Python dictionary.

In this recipe, we’ll cover how to read and write JSON files in Polars. We’ll also cover how to work with a different variation of JSON: Newline Delimited JSON (NDJSON). It is also called JSON Lines (JSONL) or Line-Delimited JSON (LDJSON). As the name suggests, each line is a JSON object.

How to do it...

Next, we’ll dive into how to work with JSON files in Polars:

  1. Read a JSON file, showing the first 10 columns:
    df = pl.read_json('../data/world_population.json')
    df.select(df.columns[:10]).head(...
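
The preview cuts off here. As a hedged sketch of the NDJSON side (the file paths are assumptions, not the book's code), reading and writing newline-delimited JSON looks like this:

    import polars as pl

    # Read a newline-delimited JSON file: one JSON object per line.
    # The path is an illustrative placeholder.
    df = pl.read_ndjson('../data/world_population.ndjson')

    # Write the DataFrame back out as NDJSON and as regular JSON.
    df.write_ndjson('../data/output/world_population_output.ndjson')
    df.write_json('../data/output/world_population_output.json')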

Reading and writing Excel files

Excel remains one of the most popular data analysis tools and the one most of us are familiar with, so being able to work with Excel files in Polars is essential for data analysts. In this recipe, we'll go through reading and writing Excel files and look at some of their useful parameters.

Getting ready

This recipe requires a few Python libraries on top of Polars. You can install them with the following command:

>>> pip install xlsx2csv xlsxwriter

How to do it...

We’ll cover how to read and write Excel files using the following steps:

  1. Let’s first read a CSV file into a DataFrame and write it to an Excel file:
    output_file_path = '../data/output/financial_sample_output.xlsx'
    df = pl.read_csv('../data/customer_shopping_data.csv')
    df.write_excel(
        output_file_path,
        worksheet='Output Sheet1',
    ...
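
The preview cuts off here. As a hedged sketch of the reading side (the path and sheet name are assumptions taken from the step above, not the book's code), reading the Excel file back into a DataFrame could look like this:

    import polars as pl

    # Read a worksheet from the Excel file written in the previous step.
    # The path and sheet name are illustrative placeholders.
    df = pl.read_excel(
        '../data/output/financial_sample_output.xlsx',
        sheet_name='Output Sheet1',
    )
    print(df.head())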

Reading and writing other data file formats

There are many formats aside from the ones introduced earlier, and Polars keeps adding support for more in its frequent updates. We'll go over a few other data file formats you can read from and write to in Polars.

In this recipe, we'll cover reading from and/or writing to the Arrow IPC format, Apache Avro, and Apache Iceberg.

These formats are less common than the ones covered in earlier recipes, but there are still plenty of use cases where companies and people need to work with them.

Getting ready

You'll need to install a few libraries besides Polars for this recipe: pyiceberg, numpy, and pyarrow. Run the following commands in your terminal to install them if you haven't already:

>>> pip install pyiceberg
>>> pip install numpy
>>> pip install pyarrow

How to do it...

Here are the steps for working with other data...
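
The preview cuts off here. As a hedged sketch (the paths are assumptions, not the book's code), writing and reading Arrow IPC and Apache Avro files looks like this:

    import polars as pl

    df = pl.DataFrame({'Letter': ['A', 'B', 'C'], 'Value': [1, 2, 3]})

    # Arrow IPC (Feather V2) format.
    df.write_ipc('../data/output/example.arrow')
    df_ipc = pl.read_ipc('../data/output/example.arrow')

    # Apache Avro format.
    df.write_avro('../data/output/example.avro')
    df_avro = pl.read_avro('../data/output/example.avro')

    # Both round-trips should produce identical DataFrames.
    print(df_ipc.equals(df_avro))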

Reading and writing multiple files

When working on actual data projects, there are cases where data is split into multiple files in a directory. Dealing with each file one by one can be a pain and may distract you from working on other critical components of your project.

In this recipe, we’ll cover reading multiple files into a single DataFrame or into multiple DataFrames, as well as writing a DataFrame to multiple files.

How to do it...

Here are some ways to work with multiple files:

  1. Write a DataFrame to multiple CSV files:
    1. Create a DataFrame:
    data = {'Letter': ['A','B','C'], 'Value': [1,2,3]}
    df = pl.DataFrame(data)
    2. Split it into multiple DataFrames:
    dfs = df.group_by(['Letter'])
    print(dfs)

    The preceding code will return the following output:

    >> <polars.dataframe.group_by.GroupBy object at 0x154373390>
    3. Write them to CSV files:
    for name, df in dfs:
        df.write_csv(f'...
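
The preview cuts off here. For the reading side, here is a hedged sketch (the directory and file pattern are assumptions, not the book's code) of reading many CSV files into a single DataFrame with a glob pattern:

    import polars as pl

    # Read every CSV file matching the pattern into one DataFrame.
    # The directory and pattern are illustrative placeholders.
    df_all = pl.read_csv('../data/output/letter_*.csv')

    # The lazy equivalent scans all matching files without loading them eagerly.
    lf_all = pl.scan_csv('../data/output/letter_*.csv')
    print(lf_all.collect().shape)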

Working with databases

In addition to working with files, it’s very common to work with databases. Polars has integrations with various databases, whether they’re hosted in the cloud or on-premises. Once you understand how to work with one database in Polars, you can apply the same patterns to various other databases.

In this recipe, we’ll specifically look at ways to read from and write to a popular database: Postgres. We’ll look at how to work with cloud databases in Chapter 11, Working with Common Cloud Data Sources.

Getting ready

This recipe requires a few additional dependencies. You’ll need to install the following libraries:

  • connectorx
  • adbc-driver-postgresql
  • pyarrow
  • pg8000 (or psycopg2)

You'll also need to have a Postgres database on your local machine. You can refer to the official PostgreSQL documentation for more information on how to install a Postgres database locally.
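
The preview of this recipe ends here. As a hedged sketch (the connection URI, query, and table name are placeholders, not the book's code), reading from and writing to Postgres with Polars can look like this:

    import polars as pl

    # A placeholder connection URI for a local Postgres instance.
    uri = 'postgresql://username:password@localhost:5432/mydb'

    # Read the result of a SQL query into a DataFrame (connectorx engine).
    df = pl.read_database_uri(
        query='SELECT * FROM customer_shopping LIMIT 100',
        uri=uri,
        engine='connectorx',
    )

    # Write the DataFrame to a Postgres table (ADBC engine),
    # replacing the table if it already exists.
    df.write_database(
        table_name='customer_shopping_copy',
        connection=uri,
        if_table_exists='replace',
        engine='adbc',
    )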


Key benefits

  • Unlock the power of Python Polars for faster and more efficient data analysis workflows
  • Master the fundamentals of Python Polars with step-by-step recipes
  • Discover data manipulation techniques to apply across multiple data problems
  • Purchase of the print or Kindle book includes a free PDF eBook

Description

The Polars Cookbook is a comprehensive, hands-on guide to Python Polars, one of the first resources dedicated to this powerful data processing library. Written by Yuki Kakegawa, a seasoned data analytics consultant who has worked with industry leaders like Microsoft and Stanford Health Care, this book offers targeted, real-world solutions to data processing, manipulation, and analysis challenges. The book also includes a foreword by Marco Gorelli, a core contributor to Polars, ensuring expert insights into Polars' applications. From installation to advanced data operations, you’ll be guided through data manipulation, advanced querying, and performance optimization techniques. You’ll learn to work with large datasets, conduct sophisticated transformations, leverage powerful features like chaining, and understand its caveats. This book also shows you how to integrate Polars with other Python libraries such as pandas, numpy, and PyArrow, and explore deployment strategies for both on-premises and cloud environments like AWS, BigQuery, GCS, Snowflake, and S3. With use cases spanning data engineering, time series analysis, statistical analysis, and machine learning, Polars Cookbook provides essential techniques for optimizing and securing your workflows. By the end of this book, you'll possess the skills to design scalable, efficient, and reliable data processing solutions with Polars.

Who is this book for?

This book is for data analysts, data scientists, and data engineers who want to learn how to use Polars in their workflows. Working knowledge of the Python programming language is required. Experience working with a DataFrame library such as pandas or PySpark will also be helpful.

What you will learn

  • Read from different data sources and write to various files and databases
  • Apply aggregations, window functions, and string manipulations
  • Perform common data tasks such as handling missing values and performing list and array operations
  • Discover how to reshape and tidy your data by pivoting, joining, and concatenating
  • Analyze your time series data in Python Polars
  • Create better workflows with testing and debugging

Product Details

Publication date : Aug 23, 2024
Length : 394 pages
Edition : 1st
Language : English
ISBN-13 : 9781805125150

Frequently bought together

  • Polars Cookbook: $49.99
  • Expert Data Modeling with Power BI, Second Edition: $59.99
  • Python Data Cleaning Cookbook: $49.99

Total: $159.97

Table of Contents

14 Chapters
Chapter 1: Getting Started with Python Polars
Chapter 2: Reading and Writing Files
Chapter 3: An Introduction to Data Analysis in Python Polars
Chapter 4: Data Transformation Techniques
Chapter 5: Handling Missing Data
Chapter 6: Performing String Manipulations
Chapter 7: Working with Nested Data Structures
Chapter 8: Reshaping and Tidying Data
Chapter 9: Time Series Analysis
Chapter 10: Interoperability with Other Python Libraries
Chapter 11: Working with Common Cloud Data Sources
Chapter 12: Testing and Debugging in Polars
Index
Other Books You May Enjoy

Customer reviews

Rating distribution
5.0 out of 5 (5 Ratings)
5 star: 100%
4 star: 0%
3 star: 0%
2 star: 0%
1 star: 0%
george baptista Oct 29, 2024
5 out of 5 stars
"Polars Cookbook" is a great, practical resource to learn Polars. It has plenty of good examples and opportunities to work through the nuances of various Polars operations. Since this is a "cookbook"-style book, the emphasis is on practical and straightforward to use content. The material is organized around common real-world problems, and provides useful solutions. The code-snippets are clear, clean and easily understandable. I particularly found useful Chapter 7 (Working with Nested Data Structures) and Chapter 8 (Reshaping and Tidying Data). For me those two chapters alone were worth the price of the book. All in all, I highly recommend this book to anyone interested in a hands-on approach to learning Polars.
Amazon Verified review
anon Sep 29, 2024
5 out of 5 stars
Polars Cookbook is an excellent guide to getting started with Polars. When I expressed my frustration with learning Pandas to a friend they gave me a short introduction to Polars and I found the syntax to be exactly what I was looking for. However, I still felt that I needed a more structured introduction to Polars that went a bit deeper. Polars Cookbook fit that need, and after a few chapters I felt ready to take on my first project using Polars. I'd recommend this book to anyone who wants a quick, no-fluff guide to getting started in Polars!
Amazon Verified review
Daigo Tanaka Sep 29, 2024
5 out of 5 stars
As a Polars newbie, I love Polars Cookbook because I can use it first as a step-by-step tutorial and then as a reference later. The book is thoughtfully organized to be useful both ways. On the table of topics, I loved seeing how it progressed seamlessly from the basic topics to more advanced topics. Starting from how to set up Polars, the book covers end-to-end topics for data analysts and engineers, from the key concepts that make Polars performant, data I/O, and basic data transformation to practical use cases for analytics, such as handling missing data, string manipulation, and so on. It also covers data engineering topics like cloud data integration, testing, and debugging. All sections come with easy-to-understand code examples and data visualizations when applicable. The author (Yuki Kakegawa) is known for Polars tips on LinkedIn for tens of thousands of followers. I always wished his tips were organized for beginners; this book is a dream come true, and I highly recommend it to everyone who wants to get started with Polars (with or without Python Pandas experience!)
Amazon Verified review
Alierwai Oct 08, 2024
5 out of 5 stars
I recently had the opportunity to review Yuki's book on the Polars Python library, and I must say that Yuki did a wonderful job putting it together. In addition to reviewing his book, I have been following Yuki on LinkedIn for several months and have learned many useful Polars tricks and tips from him. Yuki and Matt Harrison have reignited my interest in learning Polars. Whether you are a beginner looking to learn Polars or a seasoned user needing a reference, this book is an excellent guide. Yuki not only demonstrates the ins and outs of Polars, but he also shows how to integrate other Python packages with Polars. For example, he showcases how to visualize data with the Plotly package (p. 81). Furthermore, he has included a chapter on testing and debugging, covering topics such as performing unit tests with pytest and using Cualle for data quality testing. After reading this chapter, I implemented data quality testing in my work projects. "Polars Cookbook" is one of the best Polars books I have read so far, and I highly recommend checking it out. Suggestion/Recommendation: I believe this book would benefit from the inclusion of more real-world datasets, especially when developing the second edition.
Amazon Verified review
McCall Sep 23, 2024
5 out of 5 stars
The author, Yuki, does a great job taking a complex Python library and distilling it down to consumable pieces. I highly recommend if you’re new to Python programming and want to understand how to process datasets.
Amazon Verified review

FAQs

How do I buy and download an eBook?

Where there is an eBook version of a title available, you can buy it from the book details for that title. Add either the standalone eBook or the eBook and print book bundle to your shopping cart. Your eBook will show in your cart as a product on its own. After completing checkout and payment in the normal way, you will receive your receipt on the screen containing a link to a personalised PDF download file. This link will remain active for 30 days. You can download backup copies of the file by logging in to your account at any time.

If you already have Adobe Reader installed, then clicking on the link will download and open the PDF file directly. If you don't, then save the PDF file on your machine and download the Reader to view it.

Please Note: Packt eBooks are non-returnable and non-refundable.

Packt eBook and Licensing: When you buy an eBook from Packt Publishing, completing your purchase means you accept the terms of our licence agreement. Please read the full text of the agreement. In it we have tried to balance the need for the eBook to be usable for you, the reader, with our need to protect our rights as publishers and those of our authors. In summary, the agreement says:

  • You may make copies of your eBook for your own use onto any machine
  • You may not pass copies of the eBook on to anyone else
How can I make a purchase on your website?

If you want to purchase a video course, eBook, or Bundle (Print + eBook), please follow the steps below:

  1. Register on our website using your email address and a password.
  2. Search for the title by name or ISBN using the search option.
  3. Select the title you want to purchase.
  4. Choose the format you wish to purchase the title in; if you order the Print Book, you get a free eBook copy of the same title. 
  5. Proceed with the checkout process (payment to be made using Credit Card, Debit Card, or PayPal).
Where can I access support around an eBook?
  • If you experience a problem with using or installing Adobe Reader, contact Adobe directly.
  • To view the errata for the book, see www.packtpub.com/support and view the pages for the title you have.
  • To view your account details or to download a new copy of the book, go to www.packtpub.com/account
  • To contact us directly if a problem is not resolved, use www.packtpub.com/contact-us
What eBook formats does Packt support?

Our eBooks are currently available in a variety of formats such as PDF and ePub. In the future, this may well change with trends and developments in technology, but please note that our PDFs are not in Adobe eBook Reader format, which has greater restrictions on security.

You will need to use Adobe Reader v9 or later in order to read Packt's PDF eBooks.

What are the benefits of eBooks?
  • You can get the information you need immediately
  • You can easily take them with you on a laptop
  • You can download them an unlimited number of times
  • You can print them out
  • They are copy-paste enabled
  • They are searchable
  • There is no password protection
  • They are lower in price than print editions
  • They save resources and space
What is an eBook?

Packt eBooks are a complete electronic version of the print edition, available in PDF and ePub formats. Every piece of content down to the page numbering is the same. Because we save the costs of printing and shipping the book to you, we are able to offer eBooks at a lower cost than print editions.

When you have purchased an eBook, simply log in to your account and click on the link in Your Download Area. We recommend saving the file to your hard drive before opening it.

For optimal viewing of our eBooks, we recommend you download and install the free Adobe Reader version 9.