Databricks Certified Associate Developer for Apache Spark Using Python: The ultimate guide to getting certified in Apache Spark using practical examples with Python


Overview of the Certification Guide and Exam

Preparing for any task starts with thoroughly understanding the problem at hand and then devising a strategy to tackle it. An effective part of this planning phase is creating a step-by-step methodology for addressing each aspect of the challenge. Breaking the work into smaller tasks that can be handled individually helps you progress systematically without feeling overwhelmed.

This chapter demonstrates this step-by-step approach to working through your Spark certification exam. We will cover the following topics:

  • Overview of the certification exam
  • Different types of questions to expect in the exam
  • Overview of the rest of the chapters in this book

We’ll start by providing an overview of the certification exam.

Overview of the certification exam

The exam consists of 60 questions, and you are given 120 minutes to attempt them, which works out to about 2 minutes per question.

To pass the exam, you need a score of 70%, which means answering at least 42 of the 60 questions correctly.

If you are well prepared, this should be enough time to answer all the questions and review your answers before time runs out.

Next, we will see how the questions are distributed throughout the exam.

Distribution of questions

Exam questions fall into a few broad categories. The following table provides a breakdown of questions by category:

Topic | Percentage of Exam | Number of Questions
Spark Architecture: Understanding of Concepts | 17% | 10
Spark Architecture: Understanding of Applications | 11% | 7
Spark DataFrame API Applications | 72% | 43

Table 1.1: Exam breakdown

Looking at this distribution, you will want to focus most of your preparation on the Spark DataFrame API, since this section covers about 72% of the exam (around 43 questions). If you can answer these questions correctly, passing the exam becomes much easier.

This does not mean you should neglect the Spark architecture areas. Architecture questions vary in difficulty and can occasionally be confusing, but most are straightforward and let you score relatively easy points.

Let’s look at some of the other resources available that can help you prepare for this exam.

Resources to prepare for the exam

When you start planning to take the certification exam, the first thing you must do is master the Spark concepts; this book will help you with that. Once you have done so, it is useful to take mock exams, and this book includes two for you to take advantage of.

In addition, Databricks provides a practice exam, which is very useful for exam preparation. You can find it here: https://files.training.databricks.com/assessments/practice-exams/PracticeExam-DCADAS3-Python.pdf.

Resources available during the exam

During the exam, you will be given access to the Spark documentation. This is provided via Webassessor, and its interface is a little different from the regular Spark documentation you will find on the internet, so it is worth familiarizing yourself with it beforehand. You can find the interface at https://www.webassessor.com/zz/DATABRICKS/Python_v2.html. I recommend going through it and looking up different Spark packages and functions so that you are comfortable navigating it during the exam.

Next, we will look at how we can register for the exam.

Registering for your exam

Databricks is the company that prepares this exam and certification. You can register for the exam at https://www.databricks.com/learn/certification/apache-spark-developer-associate.

Next, we will look at some of the prerequisites for the exam.

Prerequisites for the exam

You should meet some prerequisites before taking the exam to give yourself the best chance of passing the certification. The major ones are as follows (a short code sketch after this list illustrates several of the DataFrame operations mentioned):

  • Grasp the fundamentals of Spark architecture, encompassing the principles of Adaptive Query Execution.
  • Utilize the Spark DataFrame API proficiently for various data manipulation tasks, such as the following:
    • Performing column operations, such as selection, renaming, and manipulation
    • Executing row operations, including filtering, dropping, sorting, and aggregating data
    • Conducting DataFrame-related tasks, such as joining, reading, writing, and implementing partitioning strategies
    • Demonstrating proficiency in working with user-defined functions (UDFs) and Spark SQL functions
  • While not explicitly tested, a functional understanding of either Python or Scala is expected. The examination is available in both programming languages.
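
To make these prerequisites concrete, the following is a minimal PySpark sketch of the kinds of DataFrame operations listed above. It is illustrative only: the application name, the employees and departments DataFrames, and all column names and values are made up and do not come from the exam or this book.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("prereq-sketch").getOrCreate()

    # Hypothetical data, used only to demonstrate the operations listed above
    employees = spark.createDataFrame(
        [(1, "Alice", "Sales", 50000), (2, "Bob", "HR", 42000)],
        ["employeeId", "name", "department", "salary"],
    )
    departments = spark.createDataFrame(
        [("Sales", "NY"), ("HR", "CA")],
        ["department", "state"],
    )

    # Column operations: selection, renaming, and manipulation
    selected = employees.select("employeeId", "name", F.col("salary").alias("baseSalary"))

    # Row operations: filtering, sorting, and aggregating
    high_paid = employees.filter(F.col("salary") > 45000).orderBy(F.col("salary").desc())
    avg_by_dept = employees.groupBy("department").agg(F.avg("salary").alias("avg_salary"))

    # DataFrame-level operations: joining two DataFrames
    joined = employees.join(departments, on="department", how="left")

    # A user-defined function (UDF) used alongside built-in Spark SQL functions
    upper_udf = F.udf(lambda s: s.upper(), StringType())
    with_upper = employees.withColumn("name_upper", upper_udf(F.col("name")))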

Hopefully, by the end of this book, you will be able to fully grasp all these concepts and have done enough practice on your own to be prepared for the exam with full confidence.

Now, let’s discuss what to expect during the online proctored exam.

Online proctored exam

The Spark certification exam is an online proctored exam. What this means is that you will be taking the exam from the comfort of your home, but someone will be proctoring the exam online. I encourage you to understand the procedures and rules of the proctored exam in advance. This will save you a lot of trouble and anxiety at the time of the exam.

To give you an overview, throughout the exam session, the following procedures will be in place:

  • Webcam monitoring will be conducted by a Webassessor proctor to ensure exam integrity
  • You will need to present a valid form of identification with a photo
  • You will need to conduct the exam alone
  • Your desk needs to be decluttered and there should be no other electronic devices in the room except the laptop that you’ll need for the exam
  • There should not be any posters or charts on the walls of the room that may aid you in the exam
  • The proctor will be listening to you during the exam as well, so you’ll want to make sure that you’re sitting in a quiet and comfortable environment
  • It is recommended not to use your work laptop for this exam, as the exam requires software to be installed and your antivirus and firewall to be disabled

The proctor’s responsibilities are as follows:

  • Overseeing your exam session to maintain exam integrity
  • Addressing any queries related to the exam delivery process
  • Offering technical assistance if needed

Note that the proctor will not offer any form of assistance regarding the exam content.

I recommend that you take sufficient time before the exam to set up the environment where you’ll be taking the exam. This will ensure a smooth online exam procedure where you can focus on the questions and not worry about anything else.

Now, let’s talk about the different types of questions that may appear in the exam.

Types of questions

There are different categories of questions that you will find in the exam. They can be broadly divided into theoretical and code-based questions. We will look at both categories and their respective subcategories in this section.

Theoretical questions

Theoretical questions ask about your conceptual understanding of certain topics. They can be subdivided further into different categories. Let's look at some of these categories, along with example questions from previous exams that fall into them.

Explanation questions

Explanation questions ask you to define or explain something, including how it works and what it does. Let's look at an example.

Which of the following describes a worker node?

  1. Worker nodes are the nodes of a cluster that perform computations.
  2. Worker nodes are synonymous with executors.
  3. Worker nodes always have a one-to-one relationship with executors.
  4. Worker nodes are the most granular level of execution in the Spark execution hierarchy.
  5. Worker nodes are the coarsest level of execution in the Spark execution hierarchy.

Connection questions

Connection questions ask you to describe how different things relate to, or differ from, each other. Let's look at an example to demonstrate this.

Which of the following describes the relationship between worker nodes and executors?

  1. An executor is a Java Virtual Machine (JVM) running on a worker node.
  2. A worker node is a JVM running on an executor.
  3. There are always more worker nodes than executors.
  4. There are always the same number of executors and worker nodes.
  5. Executors and worker nodes are not related.

Scenario questions

Scenario questions involve defining how things work in different if-else scenarios, for example, "If ______ occurs, then _____ happens." This category also includes questions that ask which statement about a scenario is incorrect. Let's look at an example to demonstrate this.

If Spark is running in cluster mode, which of the following statements about nodes is incorrect?

  1. There is a single worker node that contains the Spark driver and the executors.
  2. The Spark driver runs in its own non-worker node without any executors.
  3. Each executor is a running JVM inside a worker node.
  4. There is always more than one node.
  5. There might be more executors than total nodes or more total nodes than executors.

Categorization questions

Categorization questions ask you to identify the categories that something belongs to. Let's look at an example to demonstrate this.

Which of the following statements accurately describes stages?

  1. Tasks within a stage can be simultaneously executed by multiple machines.
  2. Various stages within a job can run concurrently.
  3. Stages comprise one or more jobs.
  4. Stages temporarily store transactions before committing them through actions.

Configuration questions

Configuration questions ask you to outline how things will behave under different cluster configurations. Let's look at an example to demonstrate this.

Which of the following statements accurately describes Spark’s cluster execution mode?

  1. Cluster mode runs executor processes on gateway nodes.
  2. Cluster mode involves the driver being hosted on a gateway machine.
  3. In cluster mode, the Spark driver and the cluster manager are not co-located.
  4. The driver in cluster mode is located on a worker node.

Next, we’ll look at the code-based questions and their subcategories.

Code-based questions

The next category is code-based questions, which is where a large number of the Spark API questions lie. In code-based questions, you are given a code snippet and asked questions about it. These can be subdivided further into different categories. Let's look at some of these subcategories, along with example questions taken from previous exams.

Function identification questions

Function identification questions ask you to identify which function performs a particular operation. It is important to know the different functions that are available in Spark for data manipulation, along with their syntax. Let's look at an example to demonstrate this.

Which of the following code blocks returns a copy of the df DataFrame, where the column salary has been renamed employeeSalary?

  1. df.withColumn(["salary", "employeeSalary"])
  2. df.withColumnRenamed("salary").alias("employeeSalary")
  3. df.withColumnRenamed("salary", "employeeSalary")
  4. df.withColumn("salary", "employeeSalary")
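
As a quick reference for this style of question, the following sketch shows how withColumnRenamed behaves: it takes the existing column name followed by the new name and returns a copy of the DataFrame with that column renamed. The DataFrame contents below are made up for illustration.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rename-sketch").getOrCreate()

    # Hypothetical DataFrame with the columns used in the question
    df = spark.createDataFrame([(1, 1000)], ["employeeId", "salary"])

    # Returns a new DataFrame in which "salary" is now "employeeSalary";
    # the original df is left unchanged
    renamed = df.withColumnRenamed("salary", "employeeSalary")
    renamed.printSchema()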

Fill-in-the-blank questions

Fill-in-the-blank questions ask you to complete a code block by filling in the blanks. Let's look at an example to demonstrate this.

The following code block should return a DataFrame with the employeeId, salary, bonus, and department columns from the transactionsDf DataFrame. Choose the answer that correctly fills the blanks to accomplish this.

df.__1__(__2__)

  1. 1: drop; 2: "employeeId", "salary", "bonus", "department"
  2. 1: filter; 2: "employeeId, salary, bonus, department"
  3. 1: select; 2: ["employeeId", "salary", "bonus", "department"]
  4. 1: select; 2: col(["employeeId", "salary", "bonus", "department"])
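
For reference, here is a small sketch of how select with a list of column names returns just those columns. The DataFrame contents are invented for illustration and reuse the transactionsDf name from the question.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("select-sketch").getOrCreate()

    # Hypothetical contents for the transactionsDf DataFrame from the question
    transactionsDf = spark.createDataFrame(
        [(1, 50000, 5000, "Sales", "NY"), (2, 42000, 3000, "HR", "CA")],
        ["employeeId", "salary", "bonus", "department", "state"],
    )

    # select accepts individual column names or a list of names and
    # returns a DataFrame containing only those columns
    subset = transactionsDf.select(["employeeId", "salary", "bonus", "department"])
    subset.show()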

Order-lines-of-code questions

Order-lines-of-code questions ask you to place lines of code in the correct order to execute an operation. Let's look at an example to demonstrate this.

Which of the following code blocks creates a DataFrame that shows the mean of the salary column of the salaryDf DataFrame based on the department and state columns, where age is greater than 35?

  i. salaryDf.filter(col("age") > 35)
  ii. .filter(col("employeeID")
  iii. .filter(col("employeeID").isNotNull())
  iv. .groupBy("department")
  v. .groupBy("department", "state")
  vi. .agg(avg("salary").alias("mean_salary"))
  vii. .agg(average("salary").alias("mean_salary"))

  1. i, ii, v, vi
  2. i, iii, v, vi
  3. i, iii, vi, vii
  4. i, ii, iv, vi
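
To show how fragments like these compose, here is one runnable way to build the requested result: filter on age, group by department and state, and aggregate with avg (pyspark.sql.functions provides avg, not average). The salaryDf contents below are made up for illustration.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import avg, col

    spark = SparkSession.builder.appName("order-sketch").getOrCreate()

    # Hypothetical contents for the salaryDf DataFrame from the question
    salaryDf = spark.createDataFrame(
        [(1, 40, "Sales", "NY", 60000),
         (2, 30, "Sales", "NY", 50000),
         (3, 45, "HR", "CA", 55000)],
        ["employeeID", "age", "department", "state", "salary"],
    )

    # Filter rows where age > 35, group by both columns, then take the mean salary
    mean_salary = (
        salaryDf.filter(col("age") > 35)
        .groupBy("department", "state")
        .agg(avg("salary").alias("mean_salary"))
    )
    mean_salary.show()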

Summary

This chapter provided an overview of the certification exam. At this point, you know what to expect in the exam and how best to prepare for it, including the different types of questions you will encounter.

Going forward, each chapter of this book will equip you with practical knowledge and hands-on examples so that you can harness the power of Apache Spark for various data processing and analytics tasks.


Key benefits

  • Understand the fundamentals of Apache Spark to design robust and fast Spark applications
  • Explore various data manipulation components for each phase of your data engineering project
  • Prepare for the certification exam with sample questions and mock exams
  • Purchase of the print or Kindle book includes a free PDF eBook

Description

Spark has become a de facto standard for big data processing. Migrating data processing to Spark saves resources, streamlines your business focus, and modernizes workloads, creating new business opportunities through Spark’s advanced capabilities. Written by a senior solutions architect at Databricks, with experience in leading data science and data engineering teams in Fortune 500s as well as startups, this book is your exhaustive guide to achieving the Databricks Certified Associate Developer for Apache Spark certification on your first attempt. You’ll explore the core components of Apache Spark, its architecture, and its optimization, while familiarizing yourself with the Spark DataFrame API and its components needed for data manipulation. You’ll also find out what Spark streaming is and why it’s important for modern data stacks, before learning about machine learning in Spark and its different use cases. What’s more, you’ll discover sample questions at the end of each section along with two mock exams to help you prepare for the certification exam. By the end of this book, you’ll know what to expect in the exam and gain enough understanding of Spark and its tools to pass the exam. You’ll also be able to apply this knowledge in a real-world setting and take your skillset to the next level.

Who is this book for?

This book is for data professionals such as data engineers, data analysts, BI developers, and data scientists looking for a comprehensive resource to achieve the Databricks Certified Associate Developer certification, as well as for individuals who want to venture into the world of big data and data engineering. Although working knowledge of Python is required, no prior knowledge of Spark is necessary; experience with PySpark will be beneficial.

What you will learn

  • Create and manipulate SQL queries in Apache Spark
  • Build complex Spark functions using Spark's user-defined functions (UDFs)
  • Architect big data apps with Spark fundamentals for optimal design
  • Apply techniques to manipulate and optimize big data applications
  • Develop real-time or near-real-time applications using Spark Streaming
  • Work with Apache Spark for machine learning applications
Product Details

Publication date: Jun 14, 2024
Length: 274 pages
Edition: 1st
Language: English
ISBN-13: 9781804619780
Vendor: Databricks

Table of Contents

17 Chapters
Part 1: Exam Overview
Chapter 1: Overview of the Certification Guide and Exam
Part 2: Introducing Spark
Chapter 2: Understanding Apache Spark and Its Applications
Chapter 3: Spark Architecture and Transformations
Part 3: Spark Operations
Chapter 4: Spark DataFrames and their Operations
Chapter 5: Advanced Operations and Optimizations in Spark
Chapter 6: SQL Queries in Spark
Part 4: Spark Applications
Chapter 7: Structured Streaming in Spark
Chapter 8: Machine Learning with Spark ML
Part 5: Mock Papers
Chapter 9: Mock Test 1
Chapter 10: Mock Test 2
Index
Other Books You May Enjoy

Customer reviews

Rating: 5 out of 5 (4 ratings, all 5-star)

Kindle Customer, Oct 06, 2024 (5 stars, Amazon verified review)
Saba did a wonderful job creating this book. The material was very easy to digest and understand. I found the practical hands-on examples very helpful to follow along with in the Databricks Community Edition notebooks. This aided in cementing my fundamental knowledge of Spark syntax. I recommend this book to anyone who wants to upskill in Spark and test their knowledge by sitting for the exam.

Alexander, Sep 06, 2024 (5 stars, Amazon verified review)
Thanks to this book, I was able to clarify many concepts and I became certified.

Michael Thomsen, Jul 23, 2024 (5 stars, Feefo verified review)

Raghu Kundurthi, Jul 19, 2024 (5 stars, Amazon verified review)
Proud and honored to present my review of the book "Databricks Certified Associate Developer for Apache Spark Using Python", a 274-page guide that preps one to clear the certification and serves as a guide to building a successful career as a Databricks Apache Spark developer. A lucid explanation of the concepts of Apache Spark (pgs 13-31) for several audiences (data/business analysts, data and ML engineers, citizen data scientists, and data scientists) that empowers anybody to be an effective data advocate. Through simple examples, the author nudges the reader to practice, and rivets the reader to stay on target and prepare to win (exam and career)! The "Types of questions" section has gainworthy points for a developer, not only for preparing for the exam but also for laying the foundation of a career as a knowledge worker in the Apache Spark world. The author summarizes decades of SDLC coding patterns in 10 pages (51-61) under Spark Operations that introduce core Spark features. The sections "Spark Architecture and Transformations" and "Advanced Operations" are treasure troves for getting into the weeds of Apache Spark. The sections on pgs 161-189 guide ML engineers in building ML model experiments. The legion of SQL users, Excel users (pivots, data modelers, macro users), BI users, business analysts, SMEs, and power users can focus on the section "SQL Queries in Spark"; the author has perhaps addressed this discerning gap for the most crucial audience, the one that forms a bridge between the geeks and the leaders who sign project funding checks. The abundance of learning material (~200 pgs) and the exam prep content (code and theoretical questions, plus 120 questions in two mock tests) should make any novice clear the exam in one sitting! A seminal read for anyone building a career as a Databricks Apache Spark engineer, and a "must have" for the top shelf of your bookshelf! Good luck and best wishes to clear the exam and launch an enriching career as a Databricks Apache Spark developer! Godspeed to many more books on Databricks!
