
Stream Analytics with Microsoft Azure: Real-time data processing for quick insights using Azure Stream Analytics

eBook: ₹799 (was ₹2919.99)
Paperback: ₹3649.99
Subscription: Free Trial, renews at ₹800 p/m

What do you get with Print?

  • Instant access to your digital eBook copy whilst your print order is shipped
  • Paperback book shipped to your preferred address
  • Download this book in EPUB and PDF formats
  • Access this title in our online reader with advanced features
  • DRM FREE - read whenever, wherever and however you want

Stream Analytics with Microsoft Azure

Introducing Stream Processing and Real-Time Insights

The popularity of streaming data platforms has increased significantly in recent times due to the requirement for real-time access to information. Enterprises are transitioning parts of their data infrastructure to a streaming paradigm to meet changing business needs.
The streaming model represents a significant shift: from point queries against stationary data to standing temporal queries that consume moving data. Fundamentally, it enables insight on the data before it is stored in the analytics repository, which introduces a new paradigm of thinking. Before going deep into stream processing, we have to cover a few basic concepts related to events and streams. In this chapter, we'll explore the basics of the following topics:

  • Publish/Subscribe (Pub/Sub)
  • Stream processing
  • Real-Time Insights

The core theme of this book is the Azure Streaming Service. Before diving deeper into it, we should take a moment to consider why we need stream processing and real-time insights, and why they are tools worth adding to your repertoire.

Understanding stream processing

So what is stream processing and why is it important? In traditional data processing, data is typically processed in batch mode, on a regular schedule. One fundamental challenge with conventional data processing is that it is inherently reactive, because it focuses on ageing information. Stream processing, on the other hand, processes data in real time as it flows through the system.

The following are some of the highlights of why stream processing is critical:

  • Response time is critical:
    • Reducing decision latency can unlock business value
    • Need to ask questions about data in motion
    • Can't wait for data to get to rest before running computation
  • Actions by human actors:
    • See and seize insights
    • Live visualization
    • Alerts and alarms
    • Dynamic aggregation
  • Machine-to-machine interactions:
    • Data movement with enrichment
    • Kick-off workflows for automation

Before diving into stream analytics, it is essential to understand the core basics around events and the different models of publishing and consuming them. In the following sections, we will explore queues, Pub/Sub, and events, which will help you understand the later chapters.

Understanding queues, Pub/Sub, and events

In this section, we will review two key concepts—queues and Publish/Subscribe models, followed by event-based messaging models.

Queues

A queue implements one-way communication: the sender places a message on the queue and a receiver collects it asynchronously. Features such as dead-letter queues, paired namespaces, active/passive replication, and auto-forwarding to a chained queue in the same namespace provide a rich feature set for message flow between applications and for building a highly available solution.

A queue consists of three key elements:

  • Sender: Sends the message to the receiver through a durable entity.
  • Durable entity: Stores the received durable message and offers persistence. The messages are stored until they are collected by the receiver.
  • Receiver: The final recipient of the message.

The key advantages of a queue are as follows:

  • Queues operate on the principle of first in, first out (FIFO): Consider a simple queue where you put messages in at one end and receive them at the other end in the same order. Service Bus queues, for example, implement the FIFO pattern.
  • Point-to-point: Queues are fundamentally point-to-point messaging; even though there may be multiple senders of messages, there is only one receiver.
  • Asynchronous communication: Senders and receivers are not connected directly; they communicate through named channels (queues). Asynchronous communication helps with building decoupled architectures and allows higher resilience, because messages can still be added and processed when either the publisher or the consumer has downtime.
  • Security: Because senders and receivers are mutually known, senders know where the data will land, and it is easier to enforce security policies.

The following figure illustrates the preceding concept:
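
Before moving on, here is a minimal, hedged sketch of the FIFO, point-to-point behaviour described above, using Python's standard library queue module as an in-memory stand-in for a durable entity such as a Service Bus queue (the message names are illustrative only):

    import queue
    import threading

    # One durable entity (here, an in-memory queue) between exactly one
    # logical sender and one receiver: point-to-point messaging.
    message_queue = queue.Queue()

    def sender() -> None:
        # Messages are placed on the queue in order...
        for i in range(5):
            message_queue.put(f"order-{i}")

    def receiver() -> None:
        # ...and collected asynchronously, in the same (FIFO) order.
        for _ in range(5):
            message = message_queue.get()
            print(f"processed {message}")
            message_queue.task_done()

    threading.Thread(target=sender).start()
    threading.Thread(target=receiver).start()
    message_queue.join()  # block until every message has been processed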

Publish and Subscribe model

Publish/Subscribe is a communication paradigm for large-scale systems. It enables loose coupling between mutually anonymous components and supports many-to-many communication.

The core concept of the Publish/Subscribe model is very simple: a publisher publishes information on a topic, and anyone interested in that information can receive it simply by subscribing to it. A well-known example of this pattern is news feeds: end users subscribe to the types of feeds they want to follow. Let's review the key components of the Publish/Subscribe paradigm:

  • Publisher (message sender):
    • Publishers connect to the Publish/Subscribe middleware to communicate
    • Publishers produce events without any dependence on subscribers
    • Publishers advertise the events they are prepared to publish
    • The publisher announces an event without having any understanding of the potential subscribers
  • Topic:
    • A topic is conceptually similar to a queue, but a topic can forward a copy of a given message to multiple subscriptions
    • Topics and subscriptions provide a one-to-many form of communication based on the Publish/Subscribe pattern
  • Subscriber (receiver):
    • Subscribers register their interest in receiving events through a subscription that the middleware handles
    • The subscriber can subscribe to and unsubscribe from events
    • The subscriber expresses interest in one or more events and receives only events related to that interest, without any knowledge of which publishers provide a given event
  • Subscription:
    • Once an event is received and consumed by a subscriber, it cannot be replayed, and new subscribers will not see it; this eliminates duplicate processing of events
    • A subscription is similar to a virtual queue that receives copies of the messages sent to the topic; you can optionally include filter rules for a topic on a per-subscription basis, which allows you to filter messages as illustrated:

Key benefits of the Pub/Sub model are as follows:

  • Decoupling (loose coupling): The model decouples publisher and subscriber in space, time, and synchronization:
    • Space: The publisher and subscriber don't need to know each other, by name or IP address, for instance.
    • Time: The publisher and subscriber don't need to run at the same time.
    • Synchronization: Operations can continue at both ends (publishing and receiving).
  • Highly parallel:
    • The model is highly parallel in that subscribers can process events while, at the same time, the publisher keeps publishing events.
  • Scalability:
    • Due to the decoupled and parallel nature of the model, Pub/Sub is highly scalable.
    • To achieve higher velocity, events can be cached and smarter routing to subscribers can be configured.
    • The key challenge with the Pub/Sub model is scaling to millions of publishers and subscribers.

In addition to the preceding, there are two key challenges with the Pub/Sub model:

  • There is no guarantee of message delivery, because of the decoupled nature of the model
  • For applications that depend entirely on guaranteed message delivery, the Pub/Sub model is not a good fit; a queue-based model suits them better
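
To make the model concrete, the following is a minimal in-memory Publish/Subscribe sketch in Python. It illustrates the paradigm only, not how Azure Service Bus topics are implemented; the topic name and handlers are made up for the example:

    from collections import defaultdict
    from typing import Callable

    class PubSubBroker:
        """A toy broker: publishers and subscribers only know the topic name,
        never each other (space decoupling)."""

        def __init__(self) -> None:
            self._subscriptions = defaultdict(list)  # topic -> list of handlers

        def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
            # Each subscription receives its own copy of every event on the topic.
            self._subscriptions[topic].append(handler)

        def publish(self, topic: str, event: dict) -> None:
            # The publisher has no knowledge of how many subscribers exist.
            for handler in self._subscriptions[topic]:
                handler(event)

    broker = PubSubBroker()
    broker.subscribe("news/sports", lambda e: print("mobile app got:", e))
    broker.subscribe("news/sports", lambda e: print("email digest got:", e))
    broker.publish("news/sports", {"headline": "Match result published"})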

Real-world implementations of the Publish/Subscribe model

TIBCO is one of the pioneers that preached the Publish/Subscribe model back in the era of centralized batch processing. The TIBCO approach changed the paradigm on the stock trading floor.

RSS feeds use the Pub/Sub model: you subscribe to an RSS feed, to one or more forums on a discussion platform, or follow someone on Twitter; in each case, there is one publisher and multiple subscribers involved.

Companies such as IBM have built protocols like Message Queue Telemetry Transport (MQTT). Some examples of products built on the Pub/Sub model are:

  • IBM MQ, one of the early implementations of the Pub/Sub model
  • WebSphere
  • Wormhole Pub/Sub system from Facebook
  • Google Cloud Pub/Sub

In the next section, we will look at how Azure implements queues and Pub/Sub models.

Azure implementation of queues and Publish/Subscribe models

Azure supports two types of queuing mechanisms:

  • Storage queues
  • Service bus queues

Storage queues are part of the Azure Storage infrastructure and offer a simple REST-based GET/PUT/PEEK interface. They provide reliable, persistent messaging within and between services.

Service Bus queues are part of the enterprise offering of the Azure messaging infrastructure, which includes queues, Pub/Sub, and advanced integration patterns.

Both of these queuing technologies exist in parallel to cater to different types of use cases. Storage queues were offered first, as a service on top of the Azure Storage service. Service Bus queues came later and support wider use cases and scenarios. For example, if your components span multiple communication protocols, data contracts, trust domains, and network environments, Azure Service Bus queues are the ideal solution.

There are a couple of key technical differences between Azure Storage queues and Azure Service Bus queues.

You should consider Azure Storage queues when:

  1. Your queue can grow larger than 80 GB and your messages have a lifetime shorter than seven days.
  2. You are building on Azure worker roles and you want to preserve messages across worker role crashes.
  3. Server-side logs are required for all transactions executed against your queues.

You should consider Azure Service Bus queues when:

  1. You require guaranteed FIFO ordered delivery.
  2. Your solution requires duplicate detection.
  3. Your message time to live (TTL) can exceed seven days and your queue will not grow larger than 80 GB.

If you would like to understand the detailed differences between Azure Storage queues versus Azure Service Bus queues, visit https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted.

Azure Service Bus messaging is an implementation of the message queuing concept in Microsoft Azure, delivered as a Platform as a Service (PaaS) offering. All Azure PaaS services are built for high resiliency and high availability.
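
As a brief sketch of what the Storage queue's PUT/GET interface looks like from code, the following assumes the azure-storage-queue Python package, an existing queue named orders, and a connection string in the AZURE_STORAGE_CONNECTION_STRING environment variable (all names here are assumptions for illustration):

    # Sketch only: pip install azure-storage-queue
    import os
    from azure.storage.queue import QueueClient

    queue_client = QueueClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"],
        queue_name="orders",  # assumed to exist already
    )

    # PUT: enqueue a message.
    queue_client.send_message("order-received:42")

    # GET: dequeue messages and delete them once processed.
    for message in queue_client.receive_messages():
        print("dequeued:", message.content)
        queue_client.delete_message(message)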

In this section, we briefly reviewed how Azure implements queues and Pub/Sub models. Since the goal of this book is stream event processing, we will explore Azure events in more depth in the next section.

What is an event?

An event is made up of two parts: the event header and the event body. The event header contains a name, the timestamp of the event, and the type of event. The event body contains the details of the event.

Events can be triggered by the business process or by many different types of activities.
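
A minimal sketch of this header/body split follows; the field names are generic and illustrative, not a specific Azure event schema:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Any, Dict

    @dataclass
    class Event:
        # Event header: identifies and timestamps the event.
        name: str
        event_type: str
        timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
        # Event body: the details of what happened.
        body: Dict[str, Any] = field(default_factory=dict)

    order_placed = Event(
        name="OrderPlaced",
        event_type="business",
        body={"order_id": 42, "amount": 799.00},
    )
    print(order_placed)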

Event streaming

Events are written to a common log. One of the key characteristics of event streaming is that events are strictly ordered (within a partition) and durable. One key difference between Azure Service Bus and event streaming is that clients don't subscribe to the stream; instead, a client has the flexibility to read from any part of the stream, and this opens up multiple possibilities.

One major advantage is that a client can read from any part of the stream and the client is solely responsible for advancing their position in the stream. This enables the client to join at any given time and to replay events.
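
The following sketch illustrates this idea with a plain in-memory append-only log; it is a conceptual illustration only, not the Event Hubs or Kafka client API:

    # Events are strictly ordered in the log; each client owns its own position.
    log = []

    def append(event):
        log.append(event)  # the log never advances any reader's position

    class Reader:
        def __init__(self, start_offset=0):
            self.offset = start_offset  # the client, not the broker, owns this

        def read(self):
            events = log[self.offset:]
            self.offset = len(log)  # the client advances its own position
            return events

    append("sensor:21.5")
    append("sensor:22.0")
    append("sensor:22.4")

    replay_reader = Reader(start_offset=0)        # joins late, replays from the start
    live_reader = Reader(start_offset=len(log))   # joins now, sees only future events
    print(replay_reader.read())                   # ['sensor:21.5', 'sensor:22.0', 'sensor:22.4']
    print(live_reader.read())                     # []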

Event correlation

Event correlation is the process of trying to identify the cause of a situation or condition when massive amounts of data points (potentially related to the situation) exist.

Azure implementation of event processing

If you need to receive and process millions of events per second, Azure Event Hubs is the ideal solution. Typical use cases include tracking and monitoring telemetry collected from industrial machines, mobile devices, and connected vehicles, or capturing in-game events from console applications.

Event Hubs works with low latency and at massive scale, and serves as the on-ramp for big data:

The following screenshot shows a canonical implementation of event processing on Azure:

The Advanced Message Queuing Protocol 1.0 is a standardized framing and transfer protocol for asynchronously, securely, and reliably transferring messages between two parties. It is the primary protocol for Azure Service Bus Messaging and Azure Event Hubs. Both services also support HTTPS. The proprietary SBMP protocol that is also supported is being phased out in favor of AMQP.

AMQP 1.0 is the result of broad industry collaboration that brought together middleware vendors, such as Microsoft and Red Hat, with messaging middleware users such as JP Morgan Chase, representing the financial services industry. The technical standardization forum for the Advanced Message Queuing Protocol (AMQP) protocol and extension specifications is OASIS, and the protocol has achieved formal approval as an international standard, ISO/IEC 19464.

Architectural components of Event Hubs

Event Hubs contain the following key elements:

  • Event producers/publishers: Events can be published via AMQP or HTTPS.
  • Capture: Azure Blob storage is used as the data repository for captured events.
  • Partitions: If a consumer wants to read a specific subset, or partition, of the event stream, partitions provide that option.
  • SAS tokens: Identity and authentication for the event publisher are provided by SAS tokens.
  • Event consumers (receivers): Event consumers connect using AMQP 1.0. Any entity can read event data from an Event Hub.
  • Consumer groups: Consumer groups enable scale by providing separate views of the event stream, giving each consuming application its own view of the stream so that consumers can act independently.
  • Throughput units: Throughput units provide scaling options; customers can pre-purchase units of capacity. A single partition has a maximum scale of one throughput unit.
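
A brief sketch of what publishing looks like from the producer's side follows. It assumes the azure-eventhub Python package, an existing hub (named telemetry here purely for illustration), and a connection string in an environment variable; treat the exact names as assumptions:

    # Sketch only: pip install azure-eventhub
    import json
    import os
    from azure.eventhub import EventData, EventHubProducerClient

    producer = EventHubProducerClient.from_connection_string(
        os.environ["EVENT_HUB_CONNECTION_STRING"],
        eventhub_name="telemetry",  # assumed, illustrative hub name
    )

    with producer:
        # Events that share a partition key land on the same partition,
        # so ordering is preserved per key while partitions scale out.
        batch = producer.create_batch(partition_key="device-42")
        batch.add(EventData(json.dumps({"device": "device-42", "temperature": 71.3})))
        producer.send_batch(batch)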

Azure Service Bus works on the competing consumer pattern: multiple consumers process messages from the same queue, as illustrated in the following image. This improves scalability and availability, and the pattern is also well suited to asynchronous message processing:

Event Hubs, on the other hand, works on the concept of partitions. An Event Hub is composed of multiple partitions that receive messages from publishers. As the volume of messages increases, the number of partitions can be increased to handle the additional load.

Having partitions increases the capacity to handle more messages and delivers high throughput:
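
The routing idea behind partitions can be sketched in a few lines: a stable hash of a partition key decides which partition receives each event, so events for the same key stay ordered while different partitions are consumed in parallel (the partition count and keys below are arbitrary):

    from zlib import crc32

    PARTITION_COUNT = 4
    partitions = [[] for _ in range(PARTITION_COUNT)]

    def route(partition_key, event):
        # A stable hash keeps every event for a given key on one partition.
        index = crc32(partition_key.encode()) % PARTITION_COUNT
        partitions[index].append(event)
        return index

    for device in ("device-1", "device-2", "device-1"):
        route(device, f"reading from {device}")

    print([len(p) for p in partitions])  # both device-1 readings share a partition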

In summary, real-time streaming is all around us, be it a simple thermostat, your car's telemetry, or household electricity meter data. Data is constantly streamed without anyone realizing it. For instance, when you are driving a car, the onboard computer is constantly performing calculations on telemetry data on the fly.

The final decision maker in the car is the driver in the driving seat, but the same may not be true with modern-day collision avoidance systems: if the onboard computer has enough data points to conclude that the car will collide with the vehicle in front, it will decide to slow you down. That's where real-time decision making comes into play.

The key objective of this book is to give you a strong foundation in event processing using Azure; as a reader, you can then go on to do bigger and better things with this technology.

Before we dive further into this broad topic, let's see some of the core basics you need to know to get started.

An event-driven architecture can involve a Publish/Subscribe model, an event streaming model, and a processing system:

  • Publish/Subscribe: The underlying messaging infrastructure keeps track of subscriptions. Each subscriber receives an event when it is published.
    • After the event is received, it cannot be replayed and new subscribers cannot see the event. In other words, you get only one opportunity to process the message; there is no way to go back to the message to re-process or retry it.
  • Event streaming: In event streaming, clients are independent of the event producers, and they read from a common log.
    • The client can read from any part of the stream and is responsible for advancing its own position in the stream.
    • This also gives clients the flexibility to join at any time and replay events as they want. One key feature of event streaming is that, within a given partition, events are sequentially ordered and durable.
  • Message or event data is simply data with a timestamp. This data needs to be processed by applying business logic or rules to derive an outcome. There are three well-known processing systems:
    • Simple event processing
    • Event stream processing
    • Complex event processing

Simple event processing 

An event immediately triggers an action in the consumer. For instance, an Azure Function can execute when it receives a message on a Service Bus topic.

Simple event processing (SEP) is used whenever you need to handle events in a straightforward way: there are not many differences between the events, and the system simply processes all of them.

In simple event processing, many single events land in the processing engine, where they are filtered, transformed, split, and routed. A classic example is URL matching in a web server. Let's say you have a shopping portal and users can select different products, or click on different or the same products. In that instance, the web server filters each request based on the URL it receives from the interaction and routes it accordingly.

The key characteristic of simple event processing is that a single event is processed without looking at other events; events are processed one at a time.

The following are the stages of the SEP:

  • Filter: Filtering the event stream for a specific type of event
  • Transform: Transforming the event schema from one form to another
  • Enrich: Augmenting the event payload with additional data
  • Split: Splitting the events into multiple events and processing them
  • Route: Moving the event from one channel or stream to another
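
A minimal sketch of the filter, transform, enrich, and route stages, applied one event at a time, might look as follows (the event shape, region value, and stream names are invented for the example):

    def keep(event):
        return event.get("type") == "click"                           # Filter

    def transform(event):
        return {"url": event["url"].lower(), "user": event["user"]}   # Transform

    def enrich(event):
        return {**event, "region": "emea"}                            # Enrich (illustrative value)

    def route(event):
        return "checkout-stream" if "/cart" in event["url"] else "browse-stream"  # Route

    raw_events = [
        {"type": "click", "url": "/Cart/Add", "user": "u1"},
        {"type": "scroll", "url": "/home", "user": "u2"},
    ]

    # Each event is handled on its own, with no reference to other events.
    for raw in raw_events:
        if keep(raw):
            event = enrich(transform(raw))
            print(route(event), event)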

Event stream processing

Continuous streams of data are processed in real time by applying a series of operations (stream processors) to each data point. The event stream processors (ESPs) act to process or transform the stream of data.

For example, one can use data streaming platforms, such as Azure IoT Hub or Apache Kafka, as a pipeline to ingest events and feed them to stream processors, as showcased in the following illustration. Depending on the scale and complexity, there may be more than one stream processor working on various subsystems of a given application. This approach is a good fit for clickstream analytics, IoT and device telemetry, and credit card fraud detection:
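
As a minimal illustration of the idea, independent of any particular platform, a chain of stream processors can be modelled as generators applied to a continuous source; the field names and threshold are invented:

    from typing import Iterable, Iterator

    def to_celsius(stream: Iterable[dict]) -> Iterator[dict]:
        # First stream processor: transform every reading as it arrives.
        for reading in stream:
            yield {**reading, "temp_c": round((reading["temp_f"] - 32) * 5 / 9, 1)}

    def flag_overheat(stream: Iterable[dict]) -> Iterator[dict]:
        # Second stream processor: derive a signal from each transformed reading.
        for reading in stream:
            yield {**reading, "overheat": reading["temp_c"] > 85}

    source = iter([
        {"device": "d1", "temp_f": 190.0},
        {"device": "d2", "temp_f": 150.0},
    ])

    for processed in flag_overheat(to_celsius(source)):
        print(processed)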

Complex event processing 

Forrester defines a CEP platform as "a software infrastructure that can detect patterns of events (and expected events that didn’t occur) by filtering, correlating, contextualizing, and analyzing data captured from disparate live data sources to respond as defined using the platform’s development tools."

Complex event processing (CEP) is a subset of event stream processing. CEP enables you to gain insights from large volumes of data in near real-time by monitoring, analyzing, and acting on data while it is in motion. Data is typically generated by business or system events such as placing an order or adding a message to a queue. CEP is the continuous monitoring and processing of events from multiple sources on a near real-time basis. Since CEP enables the analysis of data in real-time, it lends itself to predictive scenarios to enable more proactive decisions. 

Typical scenarios may include:

  • Monitoring the effectiveness of key performance indicators (KPIs) by using data from event streams
  • Monitoring the health and availability of servers, networks and service level threshold compliance
  • Fraud detection
  • Stock ticker analysis—taking action when certain events occur or price points are achieved
  • Performance history—predicting spikes
  • Buying patterns (what product/pricing combinations are most popular)

The concept behind CEP is the aggregation of information over a time window or looking for a pattern and generating a notification when the aggregation of data or pattern breaches a defined condition. The emphasis is placed on detection of the event.

CEP has its origins in the stock market and, because of this fact, it is tuned for low latency and often responds in a few milliseconds or sub-milliseconds. Some of the events can be ignored without impact.

Internet of Things (IoT) applications are very good use cases for CEP, since they produce time series data that is auto-correlated. IoT use cases are usually complex and go beyond simple aggregation and calculation of data; they require complex operations such as time windows and temporal query patterns. Due to the availability of temporal operators, time series data can be processed efficiently. The following figure showcases the CEP flow:
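
As a minimal sketch of the windowing idea at the heart of CEP, the following aggregates events over a tumbling time window and raises a notification when the aggregate breaches a condition; the window length, threshold, and event shape are invented for the example:

    from collections import defaultdict

    WINDOW_SECONDS = 60
    THRESHOLD = 3  # e.g. alert on 3+ failed logins per account per minute

    def tumbling_window_alerts(events):
        counts = defaultdict(int)
        for event in events:
            window = int(event["timestamp"] // WINDOW_SECONDS)  # which window the event falls into
            counts[(window, event["account"])] += 1

        return [
            f"alert: {account} had {count} failures in window {window}"
            for (window, account), count in counts.items()
            if count >= THRESHOLD
        ]

    failed_logins = [
        {"timestamp": 5, "account": "alice"},
        {"timestamp": 20, "account": "alice"},
        {"timestamp": 42, "account": "alice"},
        {"timestamp": 70, "account": "bob"},
    ]
    print(tumbling_window_alerts(failed_logins))  # alice breaches the threshold in window 0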

Summary

In this chapter, we covered Publish/Subscribe, events, and complex event processing.

In the following chapters, we will look at hands-on labs and also more details into how these features can be implemented in Azure.


Key benefits

  • Analyze your data from various sources using Microsoft Azure Stream Analytics
  • Develop, manage and automate your stream analytics solution with Microsoft Azure
  • A practical guide to real-time event processing and performing analytics on the cloud

Description

Microsoft Azure is a very popular cloud computing service used by many organizations around the world. Its latest analytics offering, Stream Analytics, allows you to process and get actionable insights from different kinds of data in real time. This book is your guide to understanding the basics of how Azure Stream Analytics works, and to building your own analytics solution using its capabilities. You will start by understanding what Stream Analytics is and why it is a popular choice for getting real-time insights from data. Then, you will be introduced to Azure Stream Analytics and see how you can use its tools and functions to develop your own streaming analytics. Over the course of the book, you will be given comparative guidance on using Azure streaming with other Microsoft data platform resources, such as integrating with a big data Lambda architecture for real-time data analysis, and on how architectural design scenarios differ between Azure HDInsight Hadoop clusters with Storm and Stream Analytics. The book also shows you how you can manage, monitor, and scale your solution for optimal performance. By the end of this book, you will be well-versed in using Azure Stream Analytics to develop an efficient analytics solution that can work with any type of data.

Who is this book for?

If you are looking for a resource that teaches you how to process continuous streams of data in real time, this book is what you need. A basic understanding of the concepts in analytics is all you need to get started with this book.

What you will learn

  • Perform real-time event processing with Azure Stream Analytics
  • Incorporate the features of the Big Data Lambda architecture pattern in real-time data processing
  • Design a streaming pipeline for storage and batch analysis
  • Implement data transformation and computation activities over a stream of events
  • Automate your streaming pipeline using PowerShell and the .NET SDK
  • Integrate your streaming pipeline with popular machine learning and predictive analytics modelling algorithms
  • Monitor and troubleshoot your Azure Streaming jobs effectively
Product Details

Publication date : Dec 01, 2017
Length : 322 pages
Edition : 1st
Language : English
ISBN-13 : 9781788395908
Vendor : Microsoft




Table of Contents

11 Chapters
  1. Introducing Stream Processing and Real-Time Insights
  2. Introducing Azure Stream Analytics and Key Advantages
  3. Designing Real-Time Streaming Pipelines
  4. Developing Real-Time Event Processing with Azure Streaming
  5. Building Using Stream Analytics Query Language
  6. How to Achieve Seamless Scalability with Automation
  7. Integration of Microsoft Business Intelligence and Big Data
  8. Designing and Managing Stream Analytics Jobs
  9. Optimizing Intelligence in Azure Streaming
  10. Understanding Stream Analytics Job Monitoring
  11. Use Cases for Real-World Data Streaming Architectures

Customer reviews

Rating distribution
4.2 out of 5
(5 Ratings)
5 star 80%
4 star 0%
3 star 0%
2 star 0%
1 star 20%

Zhong, Feb 19, 2018, 5 stars
A very comprehensive book showing how to build effective stream processing pipelines with the Azure Stream Analytics service. Would like to see more content on real-world use cases and complex end-to-end scenarios. Nevertheless, it's an excellent introductory book for stream analytics.
Amazon Verified review

Jan, Mar 02, 2018, 5 stars
I highly recommend this book for anyone who needs to understand Stream Analytics in Azure and wants to specialize in this area. With so many things to learn, it becomes challenging to find a "structured" path to learn it; this book offers a learning path to understand the core things related to Stream Analytics in Azure.
Amazon Verified review

RR, Mar 02, 2018, 5 stars
This is the book if you want to go completely serverless for your near real-time analytics. The book covers everything from event processing to complex event processing. The authors did a great job of not only going deep into Stream Analytics capabilities in Azure but also covering other relevant Azure services that are used to build end-to-end real-time analytics solutions.
Amazon Verified review

CLAUDIA ANGELELLI, Jul 04, 2020, 5 stars
Excellent book; it describes real-time data processing scenarios perfectly.
Amazon Verified review

Martin Smith, Aug 08, 2019, 1 star
So far, I've got up to chapter 4. Needs severe editing. The vast majority of paragraphs in the book are very difficult to read, as they are not written in coherent, flowing English. In far too many places it just becomes word salad and it is difficult to determine what, if any, information the authors are trying to convey.
Amazon Verified review

FAQs

What is the delivery time and cost of the print book?

Shipping Details

USA:


Economy: Delivery to most addresses in the US within 10-15 business days

Premium: Trackable Delivery to most addresses in the US within 3-8 business days

UK:

Economy: Delivery to most addresses in the U.K. within 7-9 business days.
Shipments are not trackable

Premium: Trackable delivery to most addresses in the U.K. within 3-4 business days!
Add one extra business day for deliveries to Northern Ireland and Scottish Highlands and islands

EU:

Premium: Trackable delivery to most EU destinations within 4-9 business days.

Australia:

Economy: Can deliver to P. O. Boxes and private residences.
Trackable service with delivery to addresses in Australia only.
Delivery time ranges from 7-9 business days for VIC and 8-10 business days for Interstate metro
Delivery time is up to 15 business days for remote areas of WA, NT & QLD.

Premium: Delivery to addresses in Australia only
Trackable delivery to most P. O. Boxes and private residences in Australia within 4-5 days based on the distance to a destination following dispatch.

India:

Premium: Delivery to most Indian addresses within 5-6 business days

Rest of the World:

Premium: Countries in the American continent: Trackable delivery to most countries within 4-7 business days

Asia:

Premium: Delivery to most Asian addresses within 5-9 business days

Disclaimer:
All orders received before 5 PM U.K time would start printing from the next business day. So the estimated delivery times start from the next day as well. Orders received after 5 PM U.K time (in our internal systems) on a business day or anytime on the weekend will begin printing the second to next business day. For example, an order placed at 11 AM today will begin printing tomorrow, whereas an order placed at 9 PM tonight will begin printing the day after tomorrow.


Unfortunately, due to several restrictions, we are unable to ship to the following countries:

  1. Afghanistan
  2. American Samoa
  3. Belarus
  4. Brunei Darussalam
  5. Central African Republic
  6. The Democratic Republic of Congo
  7. Eritrea
  8. Guinea-Bissau
  9. Iran
  10. Lebanon
  11. Libyan Arab Jamahiriya
  12. Somalia
  13. Sudan
  14. Russian Federation
  15. Syrian Arab Republic
  16. Ukraine
  17. Venezuela
What is custom duty/charge?

Customs duties are charges levied on goods when they cross international borders. They are taxes imposed on imported goods. These duties are charged by special authorities and bodies created by local governments and are meant to protect local industries, economies, and businesses.

Do I have to pay customs charges for the print book order?

Orders shipped to countries that are listed under the EU27 will not bear customs charges; these are paid by Packt as part of the order.

List of EU27 countries: www.gov.uk/eu-eea:

Custom duty or localized taxes may be applicable on shipments to recipient countries outside the EU27. These duties are charged by the recipient country, must be paid by the customer, and are not included in the shipping charges on the order.

How do I know my custom duty charges?

The amount of duty payable varies greatly depending on the imported goods, the country of origin and several other factors like the total invoice amount or dimensions like weight, and other such criteria applicable in your country.

For example:

  • If you live in Mexico, and the declared value of your ordered items is over $ 50, for you to receive a package, you will have to pay additional import tax of 19% which will be $ 9.50 to the courier service.
  • Whereas if you live in Turkey, and the declared value of your ordered items is over € 22, for you to receive a package, you will have to pay additional import tax of 18% which will be € 3.96 to the courier service.
How can I cancel my order?

Cancellation Policy for Published Printed Books:

You can cancel any order within 1 hour of placing the order. Simply contact customercare@packt.com with your order details or payment transaction id. If your order has already started the shipment process, we will do our best to stop it. However, if it is already on the way to you then when you receive it, you can contact us at customercare@packt.com using the returns and refund process.

Please understand that Packt Publishing cannot provide refunds or cancel any order except for the cases described in our Return Policy (i.e. where Packt Publishing agrees to replace your printed book because it arrives damaged or with a material defect); otherwise, Packt Publishing will not accept returns.

What is your returns and refunds policy?

Return Policy:

We want you to be happy with your purchase from Packtpub.com. We will not hassle you with returning print books to us. If the print book you receive from us is incorrect, damaged, doesn't work or is unacceptably late, please contact Customer Relations Team on customercare@packt.com with the order number and issue details as explained below:

  1. If you ordered (eBook, Video or Print Book) incorrectly or accidentally, please contact Customer Relations Team on customercare@packt.com within one hour of placing the order and we will replace/refund you the item cost.
  2. Sadly, if your eBook or Video file is faulty or a fault occurs during the eBook or Video being made available to you, i.e. during download then you should contact Customer Relations Team within 14 days of purchase on customercare@packt.com who will be able to resolve this issue for you.
  3. You will have a choice of replacement or refund of the problem items (damaged, defective, or incorrect).
  4. Once Customer Care Team confirms that you will be refunded, you should receive the refund within 10 to 12 working days.
  5. If you are only requesting a refund of one book from a multiple order, then we will refund you the appropriate single item.
  6. Where the items were shipped under a free shipping offer, there will be no shipping costs to refund.

On the off chance your printed book arrives damaged or with a material defect, contact our Customer Relations Team on customercare@packt.com within 14 days of receipt of the book with appropriate evidence of the damage, and we will work with you to secure a replacement copy if necessary. Please note that each printed book you order from us is individually made by Packt's professional book-printing partner on a print-on-demand basis.

What tax is charged?

Currently, no tax is charged on the purchase of any print book (subject to change based on the laws and regulations). A localized VAT fee is charged only to our European and UK customers on eBooks, Video and subscriptions that they buy. GST is charged to Indian customers for eBooks and video purchases.

What payment methods can I use?

You can pay with the following card types:

  1. Visa Debit
  2. Visa Credit
  3. MasterCard
  4. PayPal