OpenNI Cookbook

OpenNI Cookbook: Learn how to write NIUI-based applications and motion-controlled games

eBook: $9.99 (reduced from $28.99)
Paperback: $48.99
Subscription: free trial, renews at $19.99/month

What do you get with Print?

  • Instant access to your digital eBook copy while your print order is shipped
  • Paperback book shipped to your preferred address
  • Download this book in EPUB and PDF formats
  • Access this title in our online reader with advanced features
  • DRM free: read whenever, wherever, and however you want

OpenNI Cookbook

Chapter 1. Getting Started

The first step before writing an application or game using OpenNI is to install OpenNI itself, the drivers, and any other prerequisites. In this chapter, we will cover this process and get everything ready for writing an app using OpenNI.

In this chapter, we will cover the following recipes:

  • Downloading and installing OpenNI

  • Downloading and installing NiTE

  • Downloading and installing the Microsoft Kinect SDK

  • Connecting Asus Xtion and PrimeSense sensors

  • Connecting Microsoft Kinect

Introduction


By way of introduction, it is important for you to have an idea of the technology behind the topics just mentioned and our reasons for writing this book, as well as to know about the different devices and middleware libraries that can be used with OpenNI.

Introduction to the "Introduction"

Motion detectors are part of our everyday life, from simple alarm systems to complicated military radars and earthquake warning systems; they all use different methods and different sensors, but for the same purpose: detecting motion in the environment.

Yet until recent years, they were rarely used to control computers or devices. This was usually because of the high price of capable devices and the lack of powerful consumer software and hardware, and perhaps because end users did not yet need this technology. Fortunately, the situation changed after some of the powerful players in computer technology embraced the idea and supported smaller, innovative companies in this task.

We believe that the idea of controlling computers and other devices with environment-aware input devices is going to grow even more in the computer industry in the coming years. Computers can no longer rely on just a keyboard and a mouse to learn about the real environment; as they come to control more and more parts of our everyday life, they need an ever better understanding of our living environment. So if you are interested in being part of this change, work through this book.

In this book, we are going to show you how to start using current devices and software to write your own applications or games to interact with the real world.

In this chapter, we will introduce you to some usable technologies and devices, then to some of the frameworks and middleware, before speaking a little about how you can build applications or games with Natural Interactive User Interfaces (NIUI).

Tip

This way of interacting with a computer is known as 3DUI (3D User Interaction or 3D User Interfaces), RBI (Reality-Based Interaction), or NI (Natural Interaction). To learn more, visit http://en.wikipedia.org/wiki/3D_user_interaction and http://en.wikipedia.org/wiki/Natural_user_interface.

Motion-capture devices and the technologies behind them

The keyboard and mouse are two of the most used input devices for computers; they are how a computer learns about the world outside its box. But what these two devices can convey is very limited, and there is a real gap between the physical world and the computer's understanding of its surrounding environment.

To fill this gap, different projects arose to reconstruct 3D environments for computers using different methods. Read more about these techniques at http://en.wikipedia.org/wiki/Range_imaging.

For example, vSlam is one famous project, designed for robotics researchers, that tries to do this using one or two RGB cameras. The project is open source and is available at http://www.ros.org/wiki/vslam.

However, since most of these solutions depend on the camera's movement, or on detecting matching patterns from two cameras and then using stereo triangulation to build a 3D map of the environment, they require a large number of calculations and complex algorithms. This makes them slow, and their output unreliable and/or inaccurate.

There are more expensive methods for solving these problems when high accuracy is needed. Methods such as Laser Imaging, Detection, and Ranging (LIDAR) use one or more laser beams to scan the environment. These methods are expensive and not really a good option for targeting end users; the devices are usually big, and mid-level models are slow at scanning a 3D environment completely. Yet, because they use ToF (Time of Flight) to calculate distances, they have very good accuracy and very good range. Laser-beam devices are used mainly for scanning huge objects, buildings, surfaces, and landforms (in geology) from the ground, an airplane, or a satellite. Read more at http://en.wikipedia.org/wiki/Lidar.
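The ToF principle itself is easy to express in code: the sensor measures the round-trip time of a light pulse, and the distance is half of that round trip multiplied by the speed of light. Here is a minimal, self-contained C++ sketch of the calculation; the measured time is a made-up number for illustration, not data from any real device:

    #include <iostream>

    int main()
    {
        const double speedOfLight = 299792458.0; // meters per second

        // Hypothetical round-trip time of a laser pulse: 20 nanoseconds.
        double roundTripSeconds = 20e-9;

        // The pulse travels to the object and back, so halve the round trip.
        double distanceMeters = speedOfLight * roundTripSeconds / 2.0;

        std::cout << "Estimated distance: " << distanceMeters << " m\n"; // ~3 m
        return 0;
    }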

To know more about the other types of 3D scanners, visit http://en.wikipedia.org/wiki/3D_scanner.

In 2010, Microsoft released the Kinect device so that Xbox 360 users could control their console and games without a controller. Kinect originally used PrimeSense's technology and its SoC (System on Chip) to capture and analyze the depth of the environment. PrimeSense's method of scanning the environment is based on projecting a pattern of infrared laser beams into the environment and capturing them with a simple CMOS image sensor (a.k.a. Active Pixel Sensor, or APS) placed behind an infrared-passing filter. PrimeSense's SoC then compares the captured pattern with the projected one and creates a displacement map of the captured pattern relative to the projected pattern. This displacement map is, with some minor changes, the same depth map that the device later provides to developers. This technology is called structured-light 3D scanning. Its accuracy, size, and error rate (below 70 millimeters in the worst possible case), weighed against its cost, make it a reasonable choice for a consumer-targeted device.
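To make the displacement-to-depth relationship concrete, here is a small, hypothetical C++ sketch using the classic triangulation approximation (depth = focal length x baseline / disparity). The constants are illustrative placeholders, not actual Kinect or PrimeSense calibration values:

    #include <cstdio>

    // Convert a structured-light displacement (disparity, in pixels) into depth,
    // using the standard triangulation approximation:
    //     depth = focalLength * baseline / disparity
    // The constants below are illustrative only, not real calibration data.
    double disparityToDepthMeters(double disparityPixels)
    {
        const double focalLengthPixels = 580.0; // hypothetical IR camera focal length
        const double baselineMeters    = 0.075; // hypothetical projector-camera baseline

        if (disparityPixels <= 0.0)
            return 0.0; // no measurable shift: depth unknown

        return focalLengthPixels * baselineMeters / disparityPixels;
    }

    int main()
    {
        // A larger pattern shift means a closer object.
        std::printf("disparity 20 px -> %.2f m\n", disparityToDepthMeters(20.0)); // ~2.18 m
        std::printf("disparity 40 px -> %.2f m\n", disparityToDepthMeters(40.0)); // ~1.09 m
        return 0;
    }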

To know more about Kinect, visit http://en.wikipedia.org/wiki/Kinect.

After Kinect was released, PrimeSense decided to release similar devices of its own. Carmine 1.08, Carmine 1.09 (a short-range version of Carmine 1.08), and Capri 1.25 (an embeddable version) are the three devices from PrimeSense; in this book, we will call them all PrimeSense sensors. A list of the available devices from PrimeSense can be viewed at http://www.primesense.com/solutions/sensor/.

Before the release of the PrimeSense sensors, Asus had released two sensors in 2011, named Asus Xtion (with only depth and IR output) and Asus Xtion Pro Live (with depth, color, IR, and audio output). They are built on PrimeSense's technology and chipset, just like Kinect, but lack some of Kinect's features, such as tilting, its custom design, and its higher resolution and frame rate. From what PrimeSense told us, the Asus Xtion series and PrimeSense's sensors share the same design and are almost identical.

Both PrimeSense's sensors and the Asus Xtion series are almost twice as expensive as Microsoft Kinect, yet they are still priced more acceptably than the other competitors (in the U.K., Microsoft Kinect is priced at $110).

Here is an illustration to help you understand how Kinect, Asus Xtion, and PrimeSense sensors work:

More information about this method is available on Wikipedia at http://en.wikipedia.org/wiki/Structured-light_3D_scanner.

After the release of Kinect, other devices aimed to give users better and faster output while keeping the price in an acceptable range. These devices usually use ToF to scan the environment and should, at least in theory, have better accuracy. SoftKinetic's DepthSense series and the pmd[vision]® CamBoard nano are two of the notable designs. Currently there is no support for them in OpenNI, and they are not very popular compared to Kinect, Asus Xtion, and PrimeSense's sensors. Their resolution is lower than what PrimeSense-based devices can offer, but their frame rate is usually better because producing a depth frame requires only a simple calculation. Current devices offer 60 to 120 frames per second at resolutions from 160 x 120 to 320 x 240, whereas Kinect, Asus Xtion, and PrimeSense's sensors can give you up to 640 x 480 at 30 to 60 frames per second. Also, these devices usually cost more than PrimeSense-based devices (from $250 to $690 at the time of writing this book).

Microsoft introduced Xbox One in 2013 with a new version of Kinect, known as Kinect for Xbox One (a.k.a. Kinect 2), which uses ToF technology and a custom-made CMOS sensor to capture both RGB and depth data while projecting laser beams. From what Microsoft told the media, it is made entirely by Microsoft; unlike the first version of Kinect, no third-party company was involved this time. It is unknown whether this new version of Kinect is compatible with OpenNI, but Microsoft has promised a Windows SDK, which means we can at least expect a custom OpenNI module from the community.

You can read more about ToF-based cameras and their technologies on Wikipedia at http://en.wikipedia.org/wiki/Time-of-flight_camera.

Fotonic is another manufacturer of 3D imaging cameras. The Fotonic E series products are OpenNI-compatible ToF devices. You can check their website (http://www.fotonic.com/) for more information.

In this book, we use Asus Xtion Pro Live and Kinect, but you can use any of PrimeSense's sensors and get the same results as with Asus Xtion Pro without any headache. We even expect the same results with any other OpenNI-compatible device (for example, the Fotonic E70 or E40).

What is OpenNI?

After having good hardware for capturing the 3D environment, it is very important to have a good interface for communicating with a device and reading data from it. Even though each device may have its own SDK, it is important for developers to be able to use one interface for all the different devices.

Unfortunately, there is no universal interface for such devices yet. But OpenNI, as the default framework and SDK for PrimeSense-based devices (such as Kinect, the PrimeSense sensors, and Asus Xtion), has the capacity to become one.

OpenNI is also the name of the organization responsible for the framework of the same name. The framework (which we will simply call OpenNI in this book) is an open source project, open to changes from any developer, and was founded by PrimeSense itself. The project became famous for being the first framework with unofficial Kinect support, at a time when there was no reliable alternative. In the current version of OpenNI, Kinect is officially supported via the Microsoft Kinect SDK.

On one hand, OpenNI gives device producers the ability to connect their devices to the framework; on the other hand, it gives developers the ability to work with the same API across different devices. At the same time, other companies and individuals can develop their own middleware and expand OpenNI's API. These features give this framework a value that competing frameworks don't have.

As the title of the book suggests, we will use OpenNI as a way to get to know this field better and to develop our applications.

What is NiTE?

NiTE is a middleware library built on top of the OpenNI framework, developed by PrimeSense as an enterprise project.

NiTE gives us higher-level information about a scene, based on the information in a device's depth stream.

We will use NiTE in this book for accessing users' data and body tracking, as well as hand tracking and gesture recognition; the sketch after this paragraph gives an early taste.
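To give you an early feel for the API, which later chapters cover properly, here is a minimal, hedged C++ sketch that initializes NiTE 2, creates a user tracker on the default device, and reads a single frame; it assumes OpenNI and NiTE are installed as described later in this chapter:

    #include <cstdio>
    #include "NiTE.h"

    int main()
    {
        // Initialize NiTE; this also initializes OpenNI under the hood.
        if (nite::NiTE::initialize() != nite::STATUS_OK)
            return 1;

        // Create a user tracker on the default device and read one frame.
        nite::UserTracker tracker;
        if (tracker.create() == nite::STATUS_OK)
        {
            nite::UserTrackerFrameRef frame;
            if (tracker.readFrame(&frame) == nite::STATUS_OK)
                std::printf("Users visible in this frame: %d\n",
                            frame.getUsers().getSize());
        }

        nite::NiTE::shutdown();
        return 0;
    }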

NiTE is not the only middleware; there are others that you can use along with OpenNI. A list of SDKs and middleware libraries is available at OpenNI.org (http://www.openni.org/software/?cat_slug=file-cat1).

Developing applications and games with the Natural Interactive User Interface

With the seventh generation of video game consoles, interacting with users via motion detection became a popular way to improve the gaming experience, starting with the Nintendo Wii controller and followed by Microsoft Kinect and Sony PlayStation Move.

But gaming isn't the only field that can use these new methods. There are different cases where interacting with users in natural ways is a better option than the traditional ones, or at least a useful improvement. Just think of how you could use it in advertising panels, or to give product information to users. Or you could design an intelligent house that can identify a user and understand his or her orders. Just look at what companies such as Samsung did with their Smart TV product line.

As devices' accuracy and usable field of view improve, you can expect applications for personal computers to become reasonable too: for example, moving and rotating a 3D model in 3D modeling apps, assisting in drawing apps, or interacting with the Windows 8 Modern interface and other similar interfaces.

As a developer, you can think of it as a 3D touch screen, and one can do a lot of work with a 3D touch screen. All it takes is a little creativity and innovation to find and create ways to use these methods to interact with users.

Yet developing games and applications is not the only area where you can use this technology. There are projects already underway for creating more environment-aware indoor robots and different indoor security systems, as well as for completely reconstructing and scanning an environment (such as the KinectFusion project and similar efforts). It's also hard not to mention the available motion capture applications (for example, iPi Motion Capture™).

As you can see, there are lots of possibilities in which you can use OpenNI, NiTE, and other middleware libraries.

But in this book, we are not going to show you how to do anything specific to one of the preceding categories. Instead, we are going to cover how to use OpenNI and NiTE; how you apply the information in this book is entirely up to you.

In this chapter, we are going to introduce OpenNI and cover the process of initializing it, as well as the process of accessing different devices. The next step in this book is reading RAW data from devices and using OpenNI to customize that data right from the device. NiTE can then help you convert this data into understandable information about the current scene, which can be used to interact with users. We are going to cover NiTE and its features in this book too.

By using this information, you will be able to create your own body-controlled game, an application with an NI interface, or even custom systems and projects with better understanding of the world and with the possibility of interacting more easily and in natural ways with users.

The main programming language for OpenNI is C, but a C++ wrapper ships with each release. This book makes conservative use of C++ for simplicity. We also use a little OpenGL, via the GLUT library, to visualize some of the information; a tiny sanity-check example follows this paragraph. So you may need to know C++ and have a basic understanding of what OpenGL and 2D drawing are.
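If you have never used GLUT, the following minimal C++ sketch (not taken from the book's own examples) opens an empty window; if it compiles, links, and shows a black window, your OpenGL/GLUT setup is ready for the visualization samples:

    #include <GL/glut.h>

    // Draw callback: just clear the window. Later examples draw depth
    // frames here instead.
    static void display()
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glutSwapBuffers();
    }

    int main(int argc, char** argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
        glutInitWindowSize(640, 480);
        glutCreateWindow("GLUT sanity check");
        glutDisplayFunc(display);
        glutMainLoop(); // runs until the window is closed
        return 0;
    }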

Currently, there are two official wrappers for OpenNI and NiTE: C++ and Java. There is as yet no official wrapper for .NET, Unity, or other languages and environments.

At the time of writing, community-maintained wrappers include NiWrapper.Net, an open source project that exposes OpenNI and NiTE functionality to .NET developers, and ZDK for Unity3D, a commercial project that adds OpenNI 2 and NiTE 2 support to Unity. Of course, there are other frameworks that use OpenNI as a backend, but none of them fit within the scope of this book.

OpenNI is a multiplatform framework supporting Windows (32-bit and 64-bit; an ARM edition was not yet available at the time of writing), Mac OS X, and Linux (32-bit, 64-bit, and ARM editions). In this book, we are going to use Windows (mainly 64-bit) for the projects, but porting the code to other platforms is easily possible and unlikely to create serious problems if you decide to do so.

Downloading and installing OpenNI


The first step in using OpenNI to develop any application or game is installing the OpenNI framework on your development machine. In this recipe, we will show you how to install OpenNI; it is actually as easy as 1-2-3.

How to do it...

  1. Open your browser and navigate to www.openni.org/openni-sdk. The following screen will be displayed:

  2. Download the latest version of OpenNI for your platform and CPU architecture. We recommend downloading both 32-bit (x86) and 64-bit versions of OpenNI if you are using a 64-bit OS.

  3. Open the downloaded file; it is usually a ZIP file that can be opened by different programs (including, but not limited to, WinZip, WinRAR, and 7-Zip) and even Windows Explorer. Then run the OpenNI installer from within the archive:

  4. Click on Install in the installer dialog:

  5. Wait for the installation process to complete. If a dialog appears asking you to approve drivers during the installation, simply click on the Install button:

  6. At the end of the installation, click on Finish and you are done.

How it works...

There is nothing special here; we downloaded and opened the archive file, executed the installer package, and accepted the installation of the new drivers into the Windows driver catalog.
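If you want to verify the installation from code, the following minimal C++ sketch (assuming a default OpenNI 2 setup, with the headers and libraries visible to your compiler) initializes the framework and prints its version:

    #include <cstdio>
    #include "OpenNI.h"

    int main()
    {
        // Initialize OpenNI; this loads all installed driver modules.
        if (openni::OpenNI::initialize() != openni::STATUS_OK)
        {
            std::printf("Initialization failed: %s\n",
                        openni::OpenNI::getExtendedError());
            return 1;
        }

        // Print the installed OpenNI version as a quick sanity check.
        openni::Version v = openni::OpenNI::getVersion();
        std::printf("OpenNI %d.%d.%d.%d initialized successfully.\n",
                    v.major, v.minor, v.maintenance, v.build);

        openni::OpenNI::shutdown();
        return 0;
    }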

See also

  • The Downloading and installing NiTE recipe

Downloading and installing NiTE


If you want to use high-level output and some of the advanced tracking and recognition features, you need to install NiTE as well. NiTE is a middleware library based on the OpenNI framework and must be installed after it.

Getting ready

Before installing NiTE, you need to have OpenNI installed using the Downloading and installing OpenNI recipe in this chapter.

How to do it...

  1. Before downloading NiTE, you need to register at OpenNI.org. To do so, open your browser and navigate to www.openni.org/my-profile, fill in all the fields, and click on the Submit button:

  2. After registering, if everything goes fine, you'll be able to download NiTE. Open your browser and navigate to www.openni.org/files/nite:

  3. Download NiTE using the big DOWNLOAD button in the upper-right corner, then select your desired version:

  4. Open the downloaded file; it is usually a ZIP file that can be opened by different programs (including, but not limited to, WinZip, WinRAR, and 7-Zip) and even Windows Explorer. Then run the actual installer from within the archive:

  5. Read and accept the license agreement, then click on Next and then on Install in the installer dialog:

  6. Wait for the installation to complete, then click on Finish and you are done:

How it works...

Actually, we did nothing special here either; all we did was register, then download and install NiTE for our OS version and CPU architecture.
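As with OpenNI, you can sanity-check the NiTE installation from code. This hedged sketch (assuming the NiTE 2 headers and libraries are visible to your compiler) initializes NiTE and prints its version:

    #include <cstdio>
    #include "NiTE.h"

    int main()
    {
        // Initialize NiTE; this also initializes OpenNI under the hood.
        if (nite::NiTE::initialize() != nite::STATUS_OK)
        {
            std::printf("NiTE initialization failed; check the installation.\n");
            return 1;
        }

        // Print the installed NiTE version as a quick sanity check.
        nite::Version v = nite::NiTE::getVersion();
        std::printf("NiTE %d.%d.%d.%d initialized successfully.\n",
                    v.major, v.minor, v.maintenance, v.build);

        nite::NiTE::shutdown();
        return 0;
    }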

See also

  • The Downloading and installing OpenNI recipe

Downloading and installing the Microsoft Kinect SDK


To use Kinect on Windows 7 and Windows 8, you need to install the Microsoft Kinect SDK. This SDK lets OpenNI access both Kinect for Windows and Kinect for Xbox devices.

How to do it...

  1. Open your browser and navigate to www.microsoft.com/en-us/kinectforwindows/develop/developer-downloads.aspx:

  2. Download the Kinect SDK by clicking on the center-left button named DOWNLOAD LATEST SDK.

    Note

    Please note that the current version of OpenNI (OpenNI 2.2) works only with Version 1.6 and higher of Microsoft Kinect SDK. The current stable version of Kinect SDK is 1.7.

  3. Open the installer package after it's downloaded, read and accept the license agreement, and click on the Install button.

    Note

    Please note that Microsoft Kinect SDK can only be installed on Windows 7 and later.

  4. Wait for the installation process to complete. At the end of the installation, click on Close and you are done.

How it works...

Just as with the previous two recipes, we did nothing worth explaining; we simply downloaded and installed the Kinect SDK, the Kinect drivers, and the Kinect Runtime.

See also

  • The Downloading and installing OpenNI recipe

  • The Downloading and installing NiTE recipe

Connecting Asus Xtion and PrimeSense sensors


After installing OpenNI, you need to connect your device to your PC. In this recipe, we will show you how to connect it and what to expect when Windows recognizes your device. Both of these devices use only one USB port, and their drivers are part of OpenNI, so in this recipe we are not going to do anything other than connecting and waiting.

Getting ready

Before connecting your device, you need to have OpenNI installed using the Downloading and installing OpenNI recipe in this chapter.

How to do it...

  1. Unbox your device and connect its USB cable to one of your computer's USB ports. If any message appears on the screen about a failure to recognize your device, simply try another USB port and see if it makes any difference.

    Note

    Please note that your device may not be compatible with USB 3.0. PrimeSense and Asus Xtion users can update their device firmware to add support for audio and USB 3.0. Check the PrimeSense website (http://www.primesense.com/updates/) to download the latest firmware.

  2. Now, the following pop-up will appear in your Windows notification area:

  3. Wait for the installation to complete, or click on the pop-up to watch the installation steps:

  4. When Ready to use is displayed, everything is fine and you have connected your device successfully.

  5. You can check whether the device installed successfully in Device Manager. To open Device Manager, right-click on the My Computer icon and select Manage, then select Device Manager in the left-hand tree. Check that the PrimeSense node appears in the right-hand panel, as shown in the following screenshot:

How it works...

The preceding steps are quite self-explanatory; we connected our device, waited for it to be recognized by Windows, and let the automatic installation finish.
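To confirm from code that OpenNI can actually see the connected sensor, a short enumeration sketch like the following (again assuming a standard OpenNI 2 installation) lists every recognized device:

    #include <cstdio>
    #include "OpenNI.h"

    int main()
    {
        if (openni::OpenNI::initialize() != openni::STATUS_OK)
        {
            std::printf("Initialization failed: %s\n",
                        openni::OpenNI::getExtendedError());
            return 1;
        }

        // Ask OpenNI for every device it currently recognizes.
        openni::Array<openni::DeviceInfo> devices;
        openni::OpenNI::enumerateDevices(&devices);

        std::printf("Connected devices: %d\n", devices.getSize());
        for (int i = 0; i < devices.getSize(); ++i)
        {
            std::printf("  %s %s (URI: %s)\n",
                        devices[i].getVendor(),
                        devices[i].getName(),
                        devices[i].getUri());
        }

        openni::OpenNI::shutdown();
        return 0;
    }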

See also

  • The Downloading and installing OpenNI recipe

  • The Downloading and installing NiTE recipe

Connecting Microsoft Kinect


If you are going to use Kinect, you need to connect the device properly to your PC. In this recipe, we are going to show you this operation. There are only a few differences from the previous recipe, concerning the Adapter and Power Supply Kit needed by the Kinect for Xbox version of Kinect; everything else stays pretty much the same.

Getting ready

Before connecting Kinect, you need to have both OpenNI and the Microsoft Kinect SDK installed, using the Downloading and installing OpenNI and Downloading and installing the Microsoft Kinect SDK recipes in this chapter.

How to do it...

  1. Unbox your Kinect device and, if you are using Kinect for Xbox, connect the Kinect Sensor Power Supply Kit to the device. This kit converts Kinect's special port into a power plug and a USB port so that you can connect it to your PC. If you have Kinect for Windows, your device probably has this kit built in. If you don't have the kit, you can buy it from the Microsoft Store:

    www.microsoftstore.com/store/msstore/pd/Kinect-Sensor-Power-Supply/productID.221244000/catID.50606600/parentCategoryID.50790100/categoryID.57399200/list.true

    Or you can use a short URL from tinyurl.com (tinyurl.com/kinectpowerkit):

  2. Then connect its USB cable to a USB port on your computer, and plug its adapter into a power outlet.

    Note

    Unlike the other devices, Kinect has been compatible with USB 3.0 ports from the very beginning.

    This is because Kinect uses an internal USB 2.0 hub, which is compatible with USB 3.0 even when the device itself may not be. It also means that connecting Kinect to a USB hub, or next to another sensor, can make it unusable or undetectable.

  3. Now, the following pop-up will appear in your Windows notification area:

  4. Wait for the installation to complete, or click on the pop-up to watch the installation steps:

  5. When Ready to use is displayed, everything is fine and you have connected your device successfully.

  6. You can check whether the device installed successfully in Device Manager. To open Device Manager, right-click on the My Computer icon and select Manage, then select Device Manager in the left-hand tree. Check that the Microsoft Kinect node appears in the right-hand panel, as shown in the following screenshot:

How it works...

The preceding steps are quite self-explanatory; we connected our device, waited for it to be recognized by Windows, and let the automatic installation finish.
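As with the previous recipe, you can verify the connection from code. This hedged sketch uses the standard OpenNI 2 API (with the Kinect SDK providing the driver underneath) to open the first available device:

    #include <cstdio>
    #include "OpenNI.h"

    int main()
    {
        if (openni::OpenNI::initialize() != openni::STATUS_OK)
            return 1;

        // Open the first available device; with the Kinect SDK installed,
        // a connected Kinect should be picked up here.
        openni::Device device;
        if (device.open(openni::ANY_DEVICE) == openni::STATUS_OK)
        {
            std::printf("Opened device: %s\n", device.getDeviceInfo().getName());
            device.close();
        }
        else
        {
            std::printf("No device could be opened: %s\n",
                        openni::OpenNI::getExtendedError());
        }

        openni::OpenNI::shutdown();
        return 0;
    }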

See also

  • The Downloading and installing OpenNI recipe

  • The Downloading and installing NiTE recipe

  • The Downloading and installing the Microsoft Kinect SDK recipe


Key benefits

  • Use OpenNI for all your needs, from games and application UI to low-level data processing or motion detection
  • Learn more about the Natural Interaction features of OpenNI
  • Useful for both beginners and professionals, covering the most basic to the most advanced concepts in OpenNI
  • Full of illustrations, examples, and tips for understanding different aspects of the topics, with clear step-by-step instructions to get different parts of OpenNI working for you

Description

The release of Microsoft Kinect, then the PrimeSense sensor, and then Asus Xtion opened new doors for developers to interact with users, redesign their applications' UI, and make them environment (context) aware. For this purpose, developers need a good framework that provides a complete application programming interface (API), and OpenNI is the first choice in this field. This book introduces the new version of OpenNI. "OpenNI Cookbook" will show you how to start developing a Natural Interaction UI for your applications or games with high-level APIs, while also accessing RAW data from the different sensors of the different hardware supported by OpenNI using low-level APIs. It also deals with expanding OpenNI by writing new modules, and with extending applications using different OpenNI-compatible middleware, including NiTE. "OpenNI Cookbook" favors practical examples over plain theory, giving you a more hands-on experience to help you learn. It starts with installing devices and retrieving RAW data from them, and then shows how to use this data in applications. You will learn how to access a device, read data from it, and display it using OpenGL, or use middleware (especially NiTE) to track and recognize users and hands, and to estimate the skeleton of a person in front of a device, all through examples. You will also learn about more advanced topics, such as how to write a simple module or middleware for OpenNI itself. In short, "OpenNI Cookbook" shows you how to start with, and experiment with, both NIUI designs and OpenNI itself using examples.

Who is this book for?

If you are a beginner or a professional in NIUI and want to write serious applications or games, then this book is for you. Even OpenNI 1 and OpenNI 1.x programmers who want to move to the new version of OpenNI can use this book as a starting point. This book uses C++ as its primary language, but there are some examples in C# and Java too, so you need a basic working knowledge of C or C++ for most cases.

What you will learn

  • Retrieve and use depth, vision, and audio data from compatible devices
  • Get basic information about the environment
  • Recognize hands and humans, and track their skeletons and movements
  • Customize frames right from the device itself
  • Identify basic gestures like pushing or swiping
  • Select between devices, or use more than one device to read data
  • Recognize pre-defined hand gestures and detect user poses

Product Details

Publication date: Jul 26, 2013
Length: 324 pages
Edition: 1st
Language: English
ISBN-13: 9781849518468



Table of Contents (7 chapters)

  1. Getting Started
  2. OpenNI and C++
  3. Using Low-level Data
  4. More about Low-level Outputs
  5. NiTE and User Tracking
  6. NiTE and Hand Tracking
  7. NiTE and Skeleton Tracking

Customer reviews

Rating distribution: 4.2 out of 5 (5 ratings)
5 star: 20% | 4 star: 80% | 3 star: 0% | 2 star: 0% | 1 star: 0%

Chilijung (Amazon verified review), Dec 07, 2013, 5/5
I really love this book! I started learning OpenNI recently and found that there was a lack of documentation and tutorials. In this book, the author teaches OpenNI from the basic to the advanced level. The book also covers some low-level data usage, which made it easy for me to learn more about the details. I do wish the book also covered installation on Linux or Mac OS, since I mostly use those two platforms for my project development; otherwise, the book is really great. I'll definitely recommend it to friends who are starting with OpenNI.

carles (Amazon verified review), Feb 17, 2014, 4/5
As a previous user of OpenNI, I already knew some of its API; this book has completed my understanding of both the underlying technology and the API. It is a cookbook on OpenNI (version 2.x), so it is basically a structured list of "How to..." recipes. It is meant to be read linearly: it covers everything from the most basic material (the inner workings of a depth sensor) to the most high-level (skeleton tracking), but every recipe is independent enough to be understood alone. Every recipe has the same structure: first the goal description, then the code, and finally an explanation. While this structure may be useful for a reader who only wants a single recipe, I feel that just printing the code in a book is not the best way to present it. Fortunately, the source code is distributed electronically in its original format, so the reader can study it in a proper code editor. I found the introduction to depth sensor technologies at the beginning of the book extremely useful and interesting. It is well documented, with links to Wikipedia articles that let the reader deepen their knowledge of the field. Introductions in other chapters explain not only the most common settings but all the options, which is also very useful for getting the whole picture. I would say that this part, the documentation, is where the value of this book lies. On the other hand, I think its coding standards are not the best. The use of vectors, for instance, is not consistent throughout the book and may lead to confusion. Although I understand that this book may be designed for beginners, I would have recommended better C++-centered standards instead of mixed-in C idioms; after all, the book centers on the C++ bindings of OpenNI, not its C API. Coding style apart, the explanation of the code is good enough. A valuable addition is the step-by-step guide to setting up the programming environment. Newbies will love this, as it guarantees they can focus on the problem and not on the sometimes over-complex configuration of the IDE. If you are beginning to use OpenNI, or plan to, and want a list of common solutions with enough detail to go deeper, this book is for you.

Googigo (Amazon verified review), Nov 21, 2013, 4/5
This is the kind of book that is helpful if you have just started developing OpenNI applications on Windows. The code in this book uses the OpenNI 2 library, which is the latest version of OpenNI. See more of my review at the link below. [...]

SeaLark (Amazon verified review), Jan 16, 2014, 4/5
This book looks great and was certainly needed by all the people interested in OpenNI. That said, go to the OpenNI website blog and read the comments about the absence of the entire staff and the total lack of new support. I only report this because anyone considering basing an expensive development effort on OpenNI should consider the likelihood that the open source code's future is very dark!

videoman (Amazon verified review), Nov 15, 2013, 4/5
Nice book. I am a beginner at OpenNI and found this book useful for getting started and for understanding the development environment along with the details of the API. In the "Getting Ready" sections, there are plenty of screenshots to get you through the installation, along with background on both the hardware and software technology to start you off in Chapter 1. Chapter 2 continues with step-by-step "How to do it" instructions and screenshots showing how to set up a new Visual Studio 2010 project. I like the technique of walking you step by step through a procedure and then summarizing the steps afterwards in the "How it works" section; every recipe has this format. Chapter 2 also includes examples of how to get OpenNI version info, device info, and so on, along with how to capture errors. This chapter and the following ones are chock-full of code samples, easily cut and pasted from the PDF. Another nice touch is that the book explains C++ syntax and constructs (things like templates, for example). The book then dives into the details of frame capture and processing. Being a beginner at OpenNI, chapters 2 through 4 were of particular interest to me as I worked through some of the examples. The rest of the book goes on to discuss and teach the NiTE middleware SDK; the author describes the object model in detail, followed by many examples in the "Getting Ready", "How to do it", and "How it works" format. All in all, a great tutorial and reference for getting up and running with OpenNI. About me: I am a longtime software engineer interested in the latest enabling technologies, always looking for a good foundation to build upon when adopting new technologies.

FAQs

What is the delivery time and cost of a print book?

Shipping Details

USA:


Economy: Delivery to most addresses in the US within 10-15 business days

Premium: Trackable Delivery to most addresses in the US within 3-8 business days

UK:

Economy: Delivery to most addresses in the U.K. within 7-9 business days.
Shipments are not trackable

Premium: Trackable delivery to most addresses in the U.K. within 3-4 business days!
Add one extra business day for deliveries to Northern Ireland and Scottish Highlands and islands

EU:

Premium: Trackable delivery to most EU destinations within 4-9 business days.

Australia:

Economy: Can deliver to P. O. Boxes and private residences.
Trackable service with delivery to addresses in Australia only.
Delivery time ranges from 7-9 business days for VIC and 8-10 business days for Interstate metro
Delivery time is up to 15 business days for remote areas of WA, NT & QLD.

Premium: Delivery to addresses in Australia only
Trackable delivery to most P. O. Boxes and private residences in Australia within 4-5 days based on the distance to a destination following dispatch.

India:

Premium: Delivery to most Indian addresses within 5-6 business days

Rest of the World:

Premium: Countries in the American continent: Trackable delivery to most countries within 4-7 business days

Asia:

Premium: Delivery to most Asian addresses within 5-9 business days

Disclaimer:
All orders received before 5 PM U.K. time will start printing on the next business day, so the estimated delivery times also start from the next day. Orders received after 5 PM U.K. time (in our internal systems) on a business day, or at any time on the weekend, will begin printing on the second business day after. For example, an order placed at 11 AM today will begin printing tomorrow, whereas an order placed at 9 PM tonight will begin printing the day after tomorrow.


Unfortunately, due to several restrictions, we are unable to ship to the following countries:

  1. Afghanistan
  2. American Samoa
  3. Belarus
  4. Brunei Darussalam
  5. Central African Republic
  6. The Democratic Republic of Congo
  7. Eritrea
  8. Guinea-Bissau
  9. Iran
  10. Lebanon
  11. Libyan Arab Jamahiriya
  12. Somalia
  13. Sudan
  14. Russian Federation
  15. Syrian Arab Republic
  16. Ukraine
  17. Venezuela
What is a customs duty/charge?

Customs duties are charges levied on goods when they cross international borders; they are taxes imposed on imported goods. These duties are charged by special authorities and bodies created by local governments and are meant to protect local industries, economies, and businesses.

Do I have to pay customs charges for my print book order?

Orders shipped to countries listed under the EU27 will not incur customs charges; these are paid by Packt as part of the order.

List of EU27 countries: www.gov.uk/eu-eea

For shipments to recipient countries outside the EU27, customs duty or localized taxes may be applicable. These must be paid by the customer and are not included in the shipping charges on the order.

How do I know my customs duty charges?

The amount of duty payable varies greatly depending on the imported goods, the country of origin, and several other factors, such as the total invoice amount, dimensions like weight, and other criteria applicable in your country.

For example:

  • If you live in Mexico and the declared value of your ordered items is over $50, you will have to pay an additional import tax of 19%, which will be $9.50, to the courier service in order to receive your package.
  • Whereas if you live in Turkey and the declared value of your ordered items is over €22, you will have to pay an additional import tax of 18%, which will be €3.96, to the courier service in order to receive your package.
How can I cancel my order?

Cancellation Policy for Published Printed Books:

You can cancel any order within 1 hour of placing it. Simply contact customercare@packt.com with your order details or payment transaction ID. If your order has already started the shipment process, we will do our best to stop it. However, if it is already on its way to you, then once you receive it you can contact us at customercare@packt.com using the returns and refund process.

Please understand that Packt Publishing cannot provide refunds or cancel any order except in the cases described in our Return Policy (i.e., where Packt Publishing agrees to replace your printed book because it arrived damaged or with a material defect); otherwise, Packt Publishing will not accept returns.

What is your returns and refunds policy?

Return Policy:

We want you to be happy with your purchase from Packtpub.com. We will not hassle you about returning print books to us. If the print book you receive from us is incorrect, damaged, doesn't work, or is unacceptably late, please contact the Customer Relations Team at customercare@packt.com with the order number and issue details, as explained below:

  1. If you ordered an item (eBook, Video, or Print Book) incorrectly or accidentally, please contact the Customer Relations Team at customercare@packt.com within one hour of placing the order and we will replace or refund the item cost.
  2. If your eBook or Video file is faulty, or a fault occurs while the eBook or Video is being made available to you (i.e., during download), contact the Customer Relations Team within 14 days of purchase at customercare@packt.com, and they will resolve the issue for you.
  3. You will have a choice of replacement or refund for the problem items (damaged, defective, or incorrect).
  4. Once the Customer Care Team confirms that you will be refunded, you should receive the refund within 10 to 12 working days.
  5. If you are requesting a refund of only one book from a multiple-item order, we will refund the appropriate single item.
  6. Where the items were shipped under a free shipping offer, there will be no shipping costs to refund.

On the off chance that your printed book arrives damaged or with a material defect, contact our Customer Relations Team at customercare@packt.com within 14 days of receipt of the book, with appropriate evidence of the damage, and we will work with you to secure a replacement copy if necessary. Please note that each printed book you order from us is individually made by Packt's professional book-printing partner on a print-on-demand basis.

What tax is charged?

Currently, no tax is charged on the purchase of any print book (subject to change based on laws and regulations). A localized VAT fee is charged only to our European and UK customers on the eBooks, videos, and subscriptions they buy. GST is charged to Indian customers on eBook and video purchases.

What payment methods can I use?

You can pay with the following card types:

  1. Visa Debit
  2. Visa Credit
  3. MasterCard
  4. PayPal