
Tech News - Augmented Reality / Virtual Reality

8 Articles

Google announces Glass Enterprise Edition 2: an enterprise-based augmented reality headset

Amrata Joshi
21 May 2019
3 min read
Today, the team at Google announced a new version of its Google Glass, called Glass Enterprise Edition 2, an augmented reality headset. Glass Enterprise Edition 2 is now an official Google product.

https://youtu.be/5IK-zU51MU4

Glass Enterprise Edition has been useful for workers in a variety of industries ranging from logistics to manufacturing to field services. It helps workers access checklists, view instructions, and send inspection photos or videos. The headset is no longer part of Google's parent company Alphabet's X "Moonshot Factory". The official blog reads, "Now, in order to meet the demands of the growing market for wearables in the workplace and to better scale our enterprise efforts, the Glass team has moved from X to Google."

https://twitter.com/Theteamatx/status/1130504636501090305
https://twitter.com/jetscott/status/1130506213379235840

Glass Enterprise Edition 2 helps businesses improve the efficiency of their employees. It costs $999 and is not being sold directly to consumers.

Features of Glass Enterprise Edition 2

- An improved camera with better performance and quality that builds on the existing first-person video streaming and collaboration features.
- A new processor built on the Qualcomm Snapdragon XR1 platform, which provides a powerful multicore CPU (central processing unit) and a new artificial intelligence engine.
- A USB-C port for fast charging.
- A thicker, bulkier design, which accommodates a larger 820mAh battery compared to the original's 570mAh. It also helps with power savings, enhanced performance, and support for computer vision and advanced machine learning capabilities.

The Google team further mentions, "We've also partnered with Smith Optics to make Glass-compatible safety frames for different types of demanding work environments, like manufacturing floors and maintenance facilities."

Glass Enterprise Edition 2 is built on Android, which makes it easier for customers to integrate the services and APIs they already use. It also supports Android Enterprise Mobile Device Management in order to scale deployments.

Other big tech companies such as Microsoft, Vuzix, and Epson are also working on business-focused augmented reality glasses to strengthen their positions in this market.

To know more about this news, check out the official blog post by Google.

Google AI engineers introduce Translatotron, an end-to-end speech-to-speech translation model
Introducing Minecraft Earth, Minecraft's AR-based game for Android and iOS users
As US-China tech cold war escalates, Google revokes Huawei's Android support, allows only those covered under open source licensing


Introducing Minecraft Earth, Minecraft's AR-based game for Android and iOS users

Amrata Joshi
20 May 2019
4 min read
Last week, the team at Minecraft introduced a new AR-based game called 'Minecraft Earth', which is free for Android and iOS users. The most striking feature of Minecraft Earth is that it builds on the real world with augmented reality, which is sure to remind you of Pokémon Go.

https://twitter.com/minecraftearth/status/1129372933565108224

Minecraft has around 91 million active players, and Microsoft is now looking to take the Pokémon Go concept to the next level by letting Minecraft players create and share whatever they've made in the game with friends in the real world. Users can build something in Minecraft on their phones and then drop it into their local park for all their friends to see together at the same location. The game aims to transform single-user AR gaming into multi-user gaming, while letting users access a virtual world that is shared by everyone.

Read Also: Facebook launched new multiplayer AR games in Messenger

Minecraft Earth will be available in beta on iOS and Android this summer. The game includes modes like Creative, which has unlimited blocks and items, and Survival, where you lose all your items when you die.

Torfi Olafsson, game director of Minecraft Earth, explains, "This is an adaptation, this is not a direct translation of Minecraft. While it's an adaptation, it's built on the existing Bedrock engine so it will be very familiar to existing Minecraft players. If you like building Redstone machines, or you're used to how the water flows, or how sand falls down, it all works."

Olafsson further added, "All of the mobs of animals and creatures in Minecraft are available, too, including a new pig that really loves mud. We have tried to stay very true to the kind of core design pillars of Minecraft, and we've worked with the design team in Stockholm to make sure that the spirit of the game is carried through."

Players have to venture out into the real world to collect things, just like in Pokémon Go. Minecraft Earth has something similar to PokéStops called "tappables", which are randomly placed in the world around the player. They are designed to give players rewards, and players need to collect as many of them as possible in order to get the resources and items to build vast structures in the building mode.

The maps in this game are based on OpenStreetMap, which has allowed Microsoft to place Minecraft adventures into the world. On the Minecraft Earth map, these adventures spawn dynamically and are designed for multiple people to get involved in. Players can play together while sitting side by side to experience the same adventure at the exact same time and place. They can also fight monsters, break down structures for resources together, and even stand in front of a friend to physically block them from killing a virtual sheep. Players can even see the tools that fellow players have in their hands on their phone's screen, alongside their username.

Microsoft is also using its Azure Spatial Anchors technology in Minecraft Earth, which uses machine vision algorithms so that real-world objects can be used as anchors for digital content.

Niantic, the Pokémon Go developer, recently had to settle a lawsuit with angry homeowners who had PokéStops placed near their houses. What happened with Pokémon Go could become a threat for games like Minecraft Earth too, as there are many challenges in bringing augmented reality into private spaces.

Saxs Persson, creative director of Minecraft, said, "There are lots of very real challenges around user-generated content. It's a complicated problem at the scale we're talking about, but that doesn't mean we shouldn't tackle it."

https://twitter.com/Toadsanime/status/1129374278384795649
https://twitter.com/ExpnandBanana/status/1129419087216562177
https://twitter.com/flamnhotsadness/status/1129429075490160642
https://twitter.com/pixiebIush/status/1129455271833550848

To know more about Minecraft Earth, check out Minecraft's page.

Game rivals, Microsoft and Sony, form a surprising cloud gaming and AI partnership
Obstacle Tower Environment 2.0: Unity announces Round 2 of its 'Obstacle Tower Challenge' to test AI game players
OpenAI Five beats pro Dota 2 players; wins 2-1 against the gamers


Valve reveals new Index VR Kit with detailed specs; costs up to $999

Fatema Patrawala
02 May 2019
4 min read
Valve introduced its new VR headset kit, the Valve Index, only a month ago, saying preorders would begin on May 1st and units would ship in June. Today, Valve is fully detailing the Index headset for the first time and revealing exactly how much it will cost: $999.

The price seems relatively high by today's VR headset standards. In comparison, Facebook announced that the Oculus Quest and Oculus Rift S will ship on May 21st for $399. But Valve says it will let you buy parts piecemeal if you need to, which is a good deal if you do not wish to buy the whole kit. And if you've already got a Vive or Vive Pro and/or don't need the latest Knuckles controllers, you won't necessarily need to spend the whole $999 to get started. Get the best look yet at the Index headset on the Valve Index website.

Like the HTC Vive, which was co-designed with Valve, the Index will still be a tethered experience, with a 5-meter cable that plugs into a gaming PC. It also uses the company's laser-firing Lighthouse base stations to figure out where the headset is at any given time. That's how it lets you walk around a room's worth of space in VR, up to a huge 10 x 10 meter area. Valve is not using cameras for inside-out tracking; the company says the twin stereo RGB cameras here are designed for passthrough (letting you see the real world through the headset) and for the computer vision community.

Instead, Valve says the Index's focus is on delivering the highest-fidelity VR experience possible, meaning improved lenses, screens, and audio. It includes a pair of 1440 x 1600-resolution RGB LCDs, rather than the higher-resolution OLED screens that much of the competition is already using. But Valve says its screens run faster, at 120Hz with an experimental 144Hz mode, and are better at combating the "screen door effect" and the blurring you see when you move your head, persistence issues that first-generation VR headsets struggled with. The Valve Index also has an IPD slider to adjust for the distance between your eyes, and lenses that Valve says offer a 20-degree larger field of view than the HTC Vive "for typical users."

Most interesting are the built-in headphones shown on the website, which aren't actually headphones but speakers. They are designed not to touch your ears, instead firing their sound toward your head. This is similar to how Microsoft's HoloLens visors produce audio, which means that while people around you could theoretically hear what you're doing, there will be less fiddling with the mechanism to get the audio aligned with your ears. Valve has also provided a 3.5mm headphone jack if you want to plug in your own headphones.

Another interesting part of the package is the Valve Index Controllers, formerly known as Knuckles, which can be purchased separately for $279 and might be the most intuitive way to get your hands into VR yet. While a strap holds the controller to your hand, 87 sensors track the position of your hands and fingers and even how hard you're pressing down. Theoretically, you could easily reach, grab, and throw virtual objects with such a setup, something that wasn't really possible with the HTC Vive or Oculus Touch controllers. Valve is showing off a gameplay example on its website.

Another small improvement is to the company's Lighthouse base stations. Since they now use only a single laser and no IR blinker, Valve says they play nicer with other IR devices, which means you can turn your TV on and off without needing to power the base stations off first.

According to Polygon, which got an early hands-on with the Valve Index, the Knuckles feel great, the optics are sharp, and it may be the most comfortable VR headset yet to wear over a pair of glasses. Polygon also explained the $999 price point: during Valve's demonstration, a spokesperson said that the Index is the sort of thing that is likely to appeal to a virtual reality enthusiast who (a) must have the latest thing and (b) enjoys sufficient disposable income to satisfy that desire. It's an interesting contrast with Facebook's strategy for the Rift, which is pushing hard for the price tipping point at which VR suddenly becomes a mass-market thing, like smartphones did a decade ago.

Get to know the pricing details of the Valve Index kit on its official page.

Top 7 tools for virtual reality game developers
Game developers say Virtual Reality is here to stay
Facebook releases DeepFocus, an AI-powered rendering system to make virtual reality more real


Adobe Acquires Allegorithmic, a popular 3D editing and authoring company

Amrata Joshi
24 Jan 2019
3 min read
Yesterday, Adobe announced that it has acquired Allegorithmic, the creator of Substance and other 3D editing and authoring tools for gaming, entertainment, and post-production. Allegorithmic's customer base is diverse, ranging across the gaming, film and television, e-commerce, retail, automotive, architecture, design, and advertising industries. Popular Allegorithmic users include Electronic Arts, Ubisoft, Ikea, BMW, Louis Vuitton, and Foster + Partners, among others. Allegorithmic's Substance tools are used in games such as Call of Duty, Assassin's Creed, and Forza, and have been used for visual effects and animation in popular movies like Blade Runner 2049, Pacific Rim Uprising, and Tomb Raider.

Adobe will help accelerate Allegorithmic's product roadmap and go-to-market strategy and further extend its reach among enterprise, SMB, and individual customers. Sebastien Deguy, CEO and founder of Allegorithmic, will take up a leadership role as vice president, handling Adobe's broader 3D and immersive design efforts. With this acquisition, Adobe also wants to make Creative Cloud (a set of applications and services from Adobe Systems that gives users access to a collection of software used for graphic design, video editing, and more) the home of 3D design tools.

How will Creative Cloud benefit from Allegorithmic?

Adobe and Allegorithmic previously worked together three years ago. As a result of that work, Adobe introduced a standard PBR material for its Project Aero, Adobe Dimension, Adobe Capture, and every 3D element in Adobe Stock. Now, Adobe will empower video game creators, VFX artists, designers, and marketers by combining Allegorithmic's Substance 3D design tools with Creative Cloud's imaging, video, and motion graphics tools. Creative Cloud can benefit from Allegorithmic's tools in gaming, entertainment, and retail, and for designing the textures and materials that give 3D content detail and realism. Creative Cloud tools such as Photoshop, Premiere Pro, Dimension, and After Effects are already staples for content creators; the addition of Allegorithmic's Substance tools will make the suite even more powerful.

In a blog post, Scott Belsky, chief product officer and executive vice president at Creative Cloud, said, "Our goal with Creative Cloud is to provide creators with all the tools they need for whatever story they choose to tell. Increasingly, stories are being told with 3D content. That's why I'm excited to announce that today Adobe has acquired Allegorithmic, the industry standard in tools for 3D material and texture creation for gaming and entertainment."

Sebastien Deguy said, "Allegorithmic and Adobe share the same passion for bringing inspiring technologies to creators. We are excited to join the team, bring together the strength of Allegorithmic's industry-leading tools with the Creative Cloud platform and transform the way businesses create powerful, interactive content and experiences."

In the future, Adobe might focus on making Allegorithmic tools available via subscription. Some users are concerned about the termination of the perpetual license and are unhappy about the news. It will be interesting to see the next set of updates from the team at Adobe.

https://twitter.com/sudokuloco/status/1088101391871107073
https://twitter.com/2017_nonsense/status/1088181496710479872

Adobe set to acquire Marketo putting Adobe Experience Cloud at the heart of all marketing
Adobe glides into Augmented Reality with Adobe Aero
Adobe to spot fake images using Artificial Intelligence


Magic Leap teams with Andy Serkis’ Imaginarium Studios to enhance Augmented Reality

Sugandha Lahoti
06 Sep 2018
2 min read
Since its launch, Magic Leap has been in the limelight for its unique AR experiences. To take things one step further, Magic Leap has announced a tie-up with motion capture actor Andy Serkis and his UK-based The Imaginarium Studios.

Andy Serkis and his Imaginarium Studios are best known for innovative motion capture and performance capture roles such as Gollum, King Kong, Caesar, Captain Haddock, and Supreme Leader Snoke. Magic Leap and Serkis' The Imaginarium plan to create additional content for the Magic Leap platform and to support third-party developers with its production resources, including the motion capture stage at its London studio.

Matthew Brown, CEO of The Imaginarium Studios, says, "In using performance capture, we have the opportunity to authentically portray the intent of the actor. With Magic Leap's technology, that performance is pulled off the screen and into the world of the viewer. This is storytelling as it should be: immersive, personal, and integrated into our world."

As the first product of this partnership, Andy has created a performance-captured character for the Magic Leap One Creator Edition mixed reality headset. This character, named Grishneck, is the first Magic Leap character played by Serkis. Grishneck is described as a 3D Orc-like creature that interacts with the viewer wearing the Magic Leap One goggles in a controlled environment. In the story, Grishneck is a character who was rejected from a project because he wasn't scary enough.

Andy has expressed enthusiasm for the potential of augmented reality. In an interview with The Hollywood Reporter, he said, "We now have a new way of experiencing story. What is thrilling is that with Magic Leap, the relationship of the performances to the real world can change hugely and they can be placed and manifested in countless ways."

Magic Leap's executive creative director Andy Lanning confirmed that Imaginarium and Magic Leap have been in conversations for at least five years, and Imaginarium is currently working on several Magic Leap projects. Magic Leap One, the company's first mixed reality headset, is now available at $2,295.

Magic Leap's first augmented reality headset, powered by Nvidia Tegra X2, is coming this Summer
Understanding the hype behind Magic Leap's New Augmented Reality Headsets


Facebook Reality Labs launch SUMO Challenge to improve 3D scene understanding and modeling algorithms

Sugandha Lahoti
04 Sep 2018
3 min read
Facebook Reality Labs has launched the Scene Understanding and Modeling (SUMO) Challenge. The challenge was designed by a group of computer vision researchers at Facebook with collaborators from Stanford, Princeton, and Virginia Tech. Its goal is to aid the development of comprehensive 3D scene understanding and modeling algorithms.

For the SUMO Challenge, participants are required to generate an instance-based 3D representation of an indoor scene given only a 360-degree RGB-D image taken from a single viewpoint. The generated scene is modeled as a collection of elements, each of which represents one object, such as a wall, the floor, or a chair.

What are the three types of tasks?

The SUMO Challenge is organized into three performance tracks based on the output representation of the scene. Participants can join any of the three increasingly detailed and difficult tracks:

- Bounding Boxes Track: The scene is represented by a collection of oriented bounding boxes. This is similar to the SUN RGB-D Object Detection Challenge. A bounding box consists of the coordinates of the rectangular border that fully encloses a digital image when it is placed over a page, a canvas, or a screen.
- Voxels Track: The scene is represented by a collection of oriented voxel grids. A voxel represents a value on a regular grid in three-dimensional space.
- Meshes Track: The scene is represented by a collection of textured surface meshes. A mesh is a collection of vertices, edges, and faces that defines the shape of a polyhedral object in 3D computer graphics and solid modeling.

How are the tasks evaluated?

The SUMO evaluation metrics focus on four aspects of the representation: geometry, appearance, semantics, and perception (GASP). Participants will be evaluated on their ability to consistently infer the correct geometry, pose, appearance, and semantics of the elements in each scene.

The challenge runs from August 29th until November 16th, 2018. The top entries in each track will receive prizes, including cash rewards and NVIDIA Titan X GPUs:

- 1st prize, winner of the mesh track: $2,500 in cash + Titan X GPU
- 2nd prize, winner of the voxel track: $2,000 in cash + Titan X GPU
- 3rd prize, winner of the bounding box track: $1,500 in cash + Titan X GPU

Winners will be announced at the SUMO Challenge Workshop on December 2nd at ACCV 2018, where they will present their results.

How to Participate

1. Familiarize yourself with the input and output formats.
2. Download the SUMO software and the data set. See the data set page for details.
3. Develop your algorithm.
4. Submit your results using EvalAI.

For more information, visit the SUMO Challenge website.

Facebook Watch is now available world-wide challenging video streaming rivals, YouTube, Twitch, and more
Facebook launched new multiplayer AR games in Messenger
Facebook to launch AR ads on its news feed to let you try on products virtually
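As a rough illustration of the Bounding Boxes track described above, the sketch below shows one way a single oriented-bounding-box scene element could be held in code. It is written in C# to match the rest of this page and is purely an assumption for illustration; the type name, field names, and layout are not the challenge's actual submission format (see the SUMO website for that).

    using System;

    // Illustrative only: a minimal container for one oriented-bounding-box scene element.
    // The field names and layout are assumptions, not the SUMO submission format.
    public class SceneElementBox
    {
        public string Category;                      // e.g. "wall", "floor", "chair"
        public double[] Center = new double[3];      // box center in scene coordinates (x, y, z)
        public double[] HalfExtents = new double[3]; // half the box size along each local axis
        public double[] Rotation = new double[4];    // orientation as a quaternion (x, y, z, w)

        // Volume of the box, handy as a quick sanity check on an annotation.
        public double Volume()
        {
            return 8.0 * HalfExtents[0] * HalfExtents[1] * HalfExtents[2];
        }
    }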

SteamVR introduces new controllers for game developers, the SteamVR Input system

Sugandha Lahoti
16 May 2018
2 min read
SteamVR has announced a new input system that adds accessibility features to the virtual reality ecosystem. The SteamVR Input system lets you build controller bindings for any game, "even for controllers that didn't exist when the game was written," says Valve's Joe Ludwig in a Steam forum post. What this essentially means is that any past, present, or future game can hypothetically add support for any SteamVR-compatible controller.

Supported controllers include the Xbox One gamepad, Vive Tracker, Oculus Touch, and the motion controllers for HTC Vive and Windows Mixed Reality VR headsets.

The key-binding system of SteamVR Input allows users to build binding configurations. Users can adapt the controls of games to take into account behaviors such as left-handedness, a disability, or personal preference. These configurations can also be shared easily with other users of the same game via the Steam Workshop.

For developers, the new SteamVR Input system means easier adaptation of games to diverse controllers. Developers entirely control the default bindings for each controller type, and they can offer alternate control schemes directly without the need to change the games themselves. SteamVR Input works with every SteamVR application; it doesn't require developers to update their app to support it.

Hardware designers are also free to try more types of input, beyond the Vive Tracker, Oculus Touch, and so on. They can expose whatever input controls exist on their device and then describe that device to the system. Most importantly, the entire mechanism is captured in an easy-to-use UI that is available in-headset under the Settings menu.

For now, SteamVR Input is in beta. Details for developers are available on the OpenVR SDK 1.0.15 page. You can also see the documentation to enable native support in your applications. Hardware developers can read the driver API documentation to see how they can enable this new system for their devices.

Google open sources Seurat to bring high precision graphics to Mobile VR
Oculus Go, the first stand-alone VR headset arrives!
Google Daydream powered Lenovo Mirage solo hits the market


Build a Virtual Reality Solar System in Unity for Google Cardboard

Sugandha Lahoti
25 Apr 2018
21 min read
In today's tutorial, we will feature the visualization of a newly discovered solar system. We will leverage the virtual reality development process for this project in order to illustrate the power of VR and the ease of use of the Unity 3D engine. This project is a dioramic scene, where the user floats in space, observing the movement of planets within the TRAPPIST-1 planetary system. In February 2017, astronomers announced the discovery of seven planets orbiting an ultra-cool dwarf star slightly larger than Jupiter. We will use this information to build a virtual environment to run on Google Cardboard (Android and iOS) or other compatible devices.

We will additionally cover the following topics:

- Platform setup: Download and install the platform-specific software needed to build an application on your target device. Experienced mobile developers with the latest Android or iOS SDK may skip this step.
- Google Cardboard setup: This package of development tools facilitates display and interaction on a Cardboard device.
- Unity environment setup: Initializing Unity's Project Settings in preparation for a VR environment.
- Building the TRAPPIST-1 system: Design and implement the Solar System project.
- Build for your device: Build and install the project onto a mobile device for viewing in Google Cardboard.

Platform setup

Before we begin building the solar system, we must set up our computer environment to build the runtime application for a given VR device. If you have never built a Unity application for Android or iOS, you will need to download and install the Software Development Kit (SDK) for your chosen platform. An SDK is a set of tools that will let you build an application for a specific software package, hardware platform, game console, or operating system. Installing the SDK may require additional tools or specific files to complete the process, and the requirements change from year to year as operating systems and hardware platforms undergo updates and revisions.

To deal with this nightmare, Unity maintains an impressive set of platform-specific instructions to ease the setup process. Their list contains detailed instructions for the following platforms:

- Apple Mac
- Apple TV
- Android
- iOS
- Samsung TV
- Standalone
- Tizen
- Web Player
- WebGL
- Windows

For this project, we will be building for the most common mobile devices: Android or iOS. The first step is to visit either of the following links to prepare your computer:

- Android: Android users will need Android Developer Studio, the Java Virtual Machine (JVM), and assorted drivers. Follow this link for installation instructions and files: https://docs.unity3d.com/Manual/Android-sdksetup.html.
- Apple iOS: iOS builds are created on a Mac and require an Apple Developer account and the latest version of the Xcode development tools. However, if you've previously built an iOS app, these conditions will have already been met by your system. For the complete instructions, follow this link: https://docs.unity3d.com/Manual/iphone-GettingStarted.html.

Google Cardboard setup

Like the Unity documentation website, Google also maintains an in-depth guide for the Google VR SDK for Unity set of tools and examples. This SDK provides the following features on the device:

- User head tracking
- Side-by-side stereo rendering
- Detection of user interactions (via trigger or controller)
- Automatic stereo configuration for a specific VR viewer
- Distortion correction
- Automatic gyro drift correction

These features are all contained in one easy-to-use package that will be imported into our Unity scene.
Download the SDK from the following link before moving on to the next step: http://developers.google.com/cardboard/unity/download. At the time of writing, the current version of the Google VR SDK for Unity is version 1.110.1, available via a GitHub repository. The previous link should take you to the latest version of the SDK; however, when starting a new project, be sure to compare the SDK version requirements with your installed version of Unity.

Setting up the Unity environment

Like all projects, we will begin by launching Unity and creating a new project. The first steps will create a project folder which contains several files and directories:

1. Launch the Unity application.
2. Choose the New option after the application splash screen loads.
3. Create a new project and save it as Trappist1 in a location of your choice, as demonstrated in Figure 2.2.

To prepare for VR, we will adjust the Build Settings and Player Settings windows:

1. Open Build Settings from File | Build Settings.
2. Select the Platform for your target device (iOS or Android).
3. Click the Switch Platform button to confirm the change. The Unity icon in the right-hand column of the platform panel indicates the currently selected build platform. By default, it will appear next to the Standalone option. After switching, the icon should now be on the Android or iOS platform, as shown in Figure 2.3.

Note for Android developers: Ericsson Texture Compression (ETC) is the standard texture compression format on Android. Unity defaults to ETC (default), which is supported on all current Android devices, but it does not support textures that have an alpha channel. ETC2 supports alpha channels and provides improved quality for RGB textures on Android devices that support OpenGL ES 3.0. Since we will not need alpha channels, we will stick with ETC (default) for this project.

4. Open the Player Settings by clicking the button at the bottom of the window. The PlayerSettings panel will open in the Inspector panel.
5. Scroll down to Other Settings (Unity 5.5 through 2017.1) or XR Settings and check the Virtual Reality Supported checkbox. A list of choices will appear for selecting VR SDKs. Add Cardboard support to the list, as shown in Figure 2.4.
6. You will also need to create a valid Bundle Identifier or Package Name under the Identification section of Other Settings. The value should follow the reverse-DNS format of com.yourCompanyName.ProjectName, using alphanumeric characters, periods, and hyphens. The default value must be changed in order to build your application.

Android development note: Bundle Identifiers are unique. When an app is built and released for Android, the Bundle Identifier becomes the app's package name and cannot be changed. This restriction and other requirements are discussed in this Android documentation link: http://developer.Android.com/reference/Android/content/pm/PackageInfo.html.

Apple development note: Once you have registered a Bundle Identifier to a Personal Team in Xcode, the same Bundle Identifier cannot be registered to another Apple Developer Program team in the future. This means that, while testing your game using a free Apple ID and a Personal Team, you should choose a Bundle Identifier that is for testing only; you will not be able to use the same Bundle Identifier to release the game. An easy way to do this is to add Test to the end of whatever Bundle Identifier you were going to use, for example, com.MyCompany.VRTrappistTest.
When you release an app, its Bundle Identifier must be unique to your app and cannot be changed after your app has been submitted to the App Store.

7. Set the Minimum API Level to Android Nougat (API level 24) and leave the Target API on Automatic.
8. Close the Build Settings window and save the project before continuing.
9. Choose Assets | Import Package | Custom Package... to import the GoogleVRForUnity.unitypackage previously downloaded from http://developers.google.com/cardboard/unity/download. The package will begin decompressing the scripts, assets, and plugins needed to build a Cardboard product. When it has finished, confirm that all options are selected and choose Import.

Once the package has been installed, a new menu titled GoogleVR will be available in the main menu. This provides easy access to the GoogleVR documentation and Editor Settings. Additionally, a directory titled GoogleVR will appear in the Project panel.

10. Right-click in the Project panel and choose Create | Folder to add the following directories: Materials, Scenes, and Scripts.
11. Choose File | Save Scenes to save the default scene. I'm using the very original Main Scene and saving it to the Scenes folder created in the previous step.
12. Choose File | Save Project from the main menu to complete the setup portion of this project.

Building the TRAPPIST-1 System

Now that we have Unity configured to build for our device, we can begin building our space-themed VR environment. We have designed this project to focus on building and deploying a VR experience. If you are moderately familiar with Unity, this project will be very simple; again, this is by design. However, if you are relatively new, then the basic 3D primitives, a few textures, and a simple orbiting script will be a great way to expand your understanding of the development platform:

1. Create a new script by selecting Assets | Create | C# Script from the main menu. By default, the script will be titled NewBehaviourScript. Single-click this item in the Project window and rename it OrbitController. Finally, keep the project organized by dragging OrbitController's icon to the Scripts folder.
2. Double-click the OrbitController script item to edit it. Doing this will open a script editor as a separate application and load the OrbitController script for editing. The following code block illustrates the default script text:

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    public class OrbitController : MonoBehaviour {

        // Use this for initialization
        void Start () {
        }

        // Update is called once per frame
        void Update () {
        }
    }

This script will be used to determine each planet's location, orientation, and relative velocity within the system. The specific dimensions will be added later, but we will start by adding some public variables.

3. Starting on line 7, add the following five statements:

    public Transform orbitPivot;
    public float orbitSpeed;
    public float rotationSpeed;
    public float planetRadius;
    public float distFromStar;

Since we will be referring to these variables in the near future, we need a better understanding of how they will be used:

- orbitPivot stores the position of the object that each planet will revolve around (in this case, it is the star TRAPPIST-1).
- orbitSpeed is used to control how fast each planet revolves around the central star.
- rotationSpeed is how fast an object rotates around its own axis.
- planetRadius represents a planet's radius compared to Earth. This value will be used to set the planet's size in our environment.
- distFromStar is a planet's distance in Astronomical Units (AU) from the central star.

4. Continue by adding the following lines of code to the Start() method of the OrbitController script:

    // Use this for initialization
    void Start () {
        // Creates a random position along the orbit path
        Vector2 randomPosition = Random.insideUnitCircle;
        transform.position = new Vector3 (randomPosition.x, 0f, randomPosition.y) * distFromStar;

        // Sets the size of the GameObject to the Planet radius value
        transform.localScale = Vector3.one * planetRadius;
    }

As shown within this script, the Start() method is used to set the initial position of each planet. We will add the dimensions when we create the planets, and this script will pull those values to set the starting point of each game object at runtime.

5. Next, modify the Update() method by adding two additional lines of code, as indicated in the following code block:

    // Update is called once per frame. This code block updates the Planet's position during each
    // runtime frame.
    void Update () {
        this.transform.RotateAround (orbitPivot.position, Vector3.up, orbitSpeed * Time.deltaTime);
        this.transform.Rotate (Vector3.up, rotationSpeed * Time.deltaTime);
    }

This method is called once per frame while the program is running. Within Update(), the location for each object is determined by computing where the object should be during the next frame. this.transform.RotateAround uses the sun's pivot point to determine where the current GameObject (identified in the script by this) should appear in this frame. Then this.transform.Rotate updates how much the planet has rotated since the last frame.

6. Save the script and return to Unity.

Now that we have our first script, we can begin building the star and its planets. For this process, we will use Unity's primitive 3D GameObjects to create the celestial bodies:

1. Create a new sphere using GameObject | 3D Object | Sphere. This object will represent the star TRAPPIST-1. It will reside in the center of our solar system and will serve as the pivot for all seven planets.
2. Right-click on the newly created Sphere object in the Hierarchy window and select Rename. Rename the object Star.
3. Using the Inspector tab, set the object to Position: 0,0,0 and Scale: 1,1,1.
4. With the Star selected, locate the Add Component button in the Inspector panel. Click the button and enter orbitcontroller in the search box. Double-click on the OrbitController script icon when it appears. The script is now a component of the star.
5. Create another sphere using GameObject | 3D Object | Sphere and position it anywhere in the scene, with the default scale of 1,1,1. Rename the object Planet b.

Figure 2.5, from the TRAPPIST-1 Wikipedia page, shows the relative orbital period, distance from the star, radius, and mass of each planet. We will use these dimensions and names to complete the setup of our VR environment. Each value will be entered as a public variable for its associated GameObject.

6. Apply the OrbitController script to the Planet b asset by dragging the script icon to the planet in the Scene window or to the Planet b object in the Hierarchy window. Planet b is our first planet and it will serve as a prototype for the rest of the system.
7. Set the Orbit Pivot point of Planet b in the Inspector. Do this by clicking the Selector Target next to the Orbit Pivot field (see Figure 2.6). Then, select Star from the list of objects. The field value will change from None (Transform) to Star (Transform).
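At this point the OrbitController script is complete. For reference, the snippets above assemble into the following full OrbitController.cs; nothing new is added here, so it should match what is already in your script editor.

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    public class OrbitController : MonoBehaviour {

        public Transform orbitPivot;   // the object each planet revolves around (the Star)
        public float orbitSpeed;       // how fast the planet revolves around the central star
        public float rotationSpeed;    // how fast the planet spins around its own axis
        public float planetRadius;     // planet radius relative to Earth, used to scale the sphere
        public float distFromStar;     // distance from the central star, in Astronomical Units

        // Use this for initialization
        void Start () {
            // Creates a random position along the orbit path
            Vector2 randomPosition = Random.insideUnitCircle;
            transform.position = new Vector3 (randomPosition.x, 0f, randomPosition.y) * distFromStar;

            // Sets the size of the GameObject to the Planet radius value
            transform.localScale = Vector3.one * planetRadius;
        }

        // Update is called once per frame
        void Update () {
            // Revolve around the star, then spin around the planet's own axis
            this.transform.RotateAround (orbitPivot.position, Vector3.up, orbitSpeed * Time.deltaTime);
            this.transform.Rotate (Vector3.up, rotationSpeed * Time.deltaTime);
        }
    }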
Our script will use the origin point of the selected GameObject as its pivot point. Go back and select the Star GameObject and set its Orbit Pivot to Star, as we did with Planet b. Save the scene.

Now that our template planet has the OrbitController script, we can create the remaining planets:

1. Duplicate the Planet b GameObject six times by right-clicking on it and choosing Duplicate.
2. Rename each copy Planet c through Planet h.
3. Set the public variables for each GameObject, using the following chart:

    GameObject | Orbit Speed | Rotation Speed | Planet Radius | Dist From Star
    Star       | 0           | 2              | 6             | 0
    Planet b   | 0.151       | 5              | 0.85          | 11
    Planet c   | 0.242       | 5              | 1.38          | 15
    Planet d   | 0.405       | 5              | 0.41          | 21
    Planet e   | 0.61        | 5              | 0.62          | 28
    Planet f   | 0.921       | 5              | 0.68          | 37
    Planet g   | 1.235       | 5              | 1.34          | 45
    Planet h   | 1.80        | 5              | 0.76          | 60

    Table 2.1: TRAPPIST-1 GameObject Transform settings

4. Create an empty GameObject by right-clicking in the Hierarchy panel and selecting Create Empty. This item will help keep the Hierarchy window organized. Rename the item Planets and drag Planet b through Planet h into the empty item.

This completes the layout of our solar system, and we can now focus on setting a location for the stationary player. Our player will not have the luxury of motion, so we must determine the optimal point of view for the scene:

1. Run the simulation. Figure 2.7 illustrates the layout being used to build and edit the scene.
2. With the scene running and the Main Camera selected, use the Move and Rotate tools or the Transform fields to readjust the position of the camera in the Scene window, or to find a position with a wide view of the action in the Game window, or a position with an interesting vantage point. Do not stop the simulation when you identify a position; stopping the simulation will reset the Transform fields back to their original values.
3. Click the small Options gear in the Transform panel and select Copy Component. This will store a copy of the Transform settings to the clipboard.
4. Stop the simulation. You will notice that the Main Camera position and rotation have reverted to their original settings. Click the Transform gear again and select Paste Component Values to set the Transform fields to the desired values.
5. Save the scene and project.

You might have noticed that we cannot really tell how fast the planets are rotating. This is because the planets are simple spheres without details. This can be fixed by adding materials to each planet. Since we do not really know what these planets look like, we will take a creative approach and go for aesthetics over scientific accuracy. The internet is a great source for the images we need; a simple Google search for planetary textures will return thousands of options. Use a collection of these images to create materials for the planets and the TRAPPIST-1 star:

1. Open a web browser and search Google for planet textures. You will need one texture for each planet and one more for the star.
2. Download the textures to your computer and rename them something memorable (that is, planet_b_mat...). Alternatively, you can download a complete set of textures from the Resources section of the supporting website: http://zephyr9.pairsite.com/vrblueprints/Trappist1/.
3. Copy the images to the Trappist1/Assets/Materials folder.
4. Switch back to Unity and open the Materials folder in the Project panel.
5. Drag each texture to its corresponding GameObject in the Hierarchy panel. Notice that each time you do this, Unity creates a new material and assigns it to the planet GameObject.
6. Run the simulation again and observe the movement of the planets.
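If you would rather not type the Table 2.1 values into the Inspector for each planet by hand, a small helper script can apply them when the scene loads. This is an optional addition, not part of the original project: the SystemInitializer name and its lookup of objects by name are assumptions for illustration, and the Orbit Pivot references still need to be set in the Inspector as described above.

    using UnityEngine;

    // Optional, illustrative helper: applies the Table 2.1 values to the Star and planets
    // at scene load instead of entering them in the Inspector. Attach it to any object in
    // the scene (for example, the empty "Planets" object). Object names must match the
    // tutorial exactly ("Star", "Planet b" ... "Planet h"). Orbit Pivot is not set here.
    public class SystemInitializer : MonoBehaviour {

        void Awake () {
            // Awake runs before any Start() method, so these values are in place
            // before each OrbitController positions and scales its planet.
            Apply ("Star",     0f,     2f, 6f,    0f);
            Apply ("Planet b", 0.151f, 5f, 0.85f, 11f);
            Apply ("Planet c", 0.242f, 5f, 1.38f, 15f);
            Apply ("Planet d", 0.405f, 5f, 0.41f, 21f);
            Apply ("Planet e", 0.61f,  5f, 0.62f, 28f);
            Apply ("Planet f", 0.921f, 5f, 0.68f, 37f);
            Apply ("Planet g", 1.235f, 5f, 1.34f, 45f);
            Apply ("Planet h", 1.80f,  5f, 0.76f, 60f);
        }

        void Apply (string objectName, float orbitSpeed, float rotationSpeed, float planetRadius, float distFromStar) {
            GameObject body = GameObject.Find (objectName);
            if (body == null) {
                Debug.LogWarning ("Could not find " + objectName + " in the scene.");
                return;
            }

            OrbitController controller = body.GetComponent<OrbitController> ();
            if (controller == null) {
                Debug.LogWarning (objectName + " has no OrbitController component.");
                return;
            }

            controller.orbitSpeed = orbitSpeed;
            controller.rotationSpeed = rotationSpeed;
            controller.planetRadius = planetRadius;
            controller.distFromStar = distFromStar;
        }
    }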
Adjust the individual planet Orbit Speed and Rotation Speed values until the movement feels natural. Take a bit of creative license here, leaning more on the scene's aesthetic quality than on scientific accuracy. Then save the scene and the project.

For the final design phase, we will add a space-themed background using a Skybox. Skyboxes are rendered components that create the backdrop for Unity scenes. They illustrate the world beyond the 3D geometry, creating an atmosphere to match the setting. Skyboxes can be constructed of solids, gradients, or images using a variety of graphic programs and applications. For this project, we will find a suitable component in the Asset Store:

1. Load the Asset Store from the Window menu.
2. Search for a free space-themed skybox using the phrase space skybox price:0.
3. Select a package and use the Download button to import the package into the Scene.
4. Select Window | Lighting | Settings from the main menu.
5. In the Scene section, click on the Selector Target for the Skybox Material and choose the newly downloaded skybox.
6. Save the scene and the project.

With that last step complete, we are done with the design and development phase of the project. Next, we will move on to building the application and transferring it to a device.

Building the application

To experience this simulation in VR, we need to have our scene run on a head-mounted display as a stereoscopic display. The app needs to compile the proper viewing parameters, capture and process head-tracking data, and correct for visual distortion. When you consider the number of VR devices we would have to account for, the task is nothing short of daunting. Luckily, Google VR facilitates all of this in one easy-to-use plugin.

The process for building the mobile application will depend on the mobile platform you are targeting. If you have previously built and installed a Unity app on a mobile device, many of these steps will have already been completed, and a few will apply updates to your existing software.

Note: Unity is a fantastic software platform with a rich community and an attentive development staff. During the writing of this book, we tackled software updates (5.5 through 2017.3) and various changes in the VR development process. Although we are including the simplified building steps, it is important to check Google's VR documentation for the latest software updates and detailed instructions:

- Android: https://developers.google.com/vr/unity/get-started
- iOS: https://developers.google.com/vr/unity/get-started-ios

Android Instructions

If you are just starting out building applications from Unity, we suggest starting with the Android process. The workflow for getting your project exported from Unity and playing on your device is short and straightforward:

1. On your Android device, navigate to Settings | About phone or Settings | About Device | Software Info.
2. Scroll down to Build number and tap the item seven times. A popup will appear, confirming that you are now a developer.
3. Now navigate to Settings | Developer options | Debugging and enable USB debugging.

Building an Android application

1. In your project directory (at the same level as the Assets folder), create a Build folder.
2. Connect your Android device to the computer using a USB cable. You may see a prompt asking you to confirm that you wish to enable USB debugging on the device. If so, click OK.
3. In Unity, select File | Build Settings to load the Build dialog.
4. Confirm that the Platform is set to Android. If not, choose Android and click Switch Platform.
5. Note that Scenes/Main Scene should be loaded and checked in the Scenes In Build portion of the dialog. If not, click the Add Open Scenes button to add Main Scene to the list of scenes to be included in the build.
6. Click the Build button. This will create an Android executable application with the .apk file extension.

Invalid command Android error

Some Android users have reported an error relating to the Android SDK Tools location. The problem has been confirmed in many installations prior to Unity 2017.1. If this problem occurs, the best solution is to downgrade to a previous version of the SDK Tools. This can be done by following the steps outlined here:

1. Locate and delete the Android SDK Tools folder [Your Android SDK Root]/tools. This location will depend on where the Android SDK package was installed. For example, on my computer the Android SDK Tools folder is found at C:\Users\cpalmer\AppData\Local\Android\sdk.
2. Download SDK Tools from http://dl-ssl.google.com/Android/repository/tools_r25.2.5-windows.zip.
3. Extract the archive to the SDK root directory.
4. Re-attempt the Build project process.

If this is the first time you are creating an Android application, you might get an error indicating that Unity cannot locate your Android SDK root directory. If this is the case, follow these steps:

1. Cancel the build process and close the Build Settings... window.
2. Choose Edit | Preferences... from the main menu.
3. Choose External Tools and scroll down to Android.
4. Enter the location of your Android SDK root folder. If you have not installed the SDK, click the download button and follow the installation process.

Install the app onto your phone and load the phone into your Cardboard device.

iOS Instructions

The process for building an iOS app is much more involved than the Android process. There are two different types of builds:

- Build for testing
- Build for distribution (which requires an Apple Developer License)

In either case, you will need the following items to build a modern iOS app:

- A Mac computer running OS X 10.11 or later
- The latest version of Xcode
- An iOS device and USB cable
- An Apple ID
- Your Unity project

For this demo, we will build an app for testing, and we will assume you have completed the Getting Started steps (https://docs.unity3d.com/Manual/iphone-GettingStarted.html) from Section 1. If you do not yet have an Apple ID, obtain one from the Apple ID site (http://appleid.apple.com/). Once you have obtained an Apple ID, it must be added to Xcode:

1. Open Xcode.
2. From the menu bar at the top of the screen, choose Xcode | Preferences. This will open the Preferences window.
3. Choose Accounts at the top of the window to display information about the Apple IDs that have been added to Xcode.
4. To add an Apple ID, click the plus sign at the bottom left corner and choose Add Apple ID.
5. Enter your Apple ID and password in the resulting popup box. Your Apple ID will then appear in the list.
6. Select your Apple ID. Apple Developer Program teams are listed under the heading of Team. If you are using the free Apple ID, you will be assigned to a Personal Team. Otherwise, you will be shown the teams you are enrolled in through the Apple Developer Program.

Preparing your Unity project for iOS

1. Within Unity, open the Build Settings from the top menu (File | Build Settings).
2. Confirm that the Platform is set to iOS. If not, choose iOS and click Switch Platform at the bottom of the window.
3. Select the Build & Run button.

Building an iOS application

Xcode will launch with your Unity project.
1. Select your platform and follow the standard process for building an application from Xcode.
2. Install the app onto your phone and load the phone into your Cardboard device.

We looked at the basic Unity workflow for developing VR experiences. We also provided a stationary solution so that we could focus on the development process. The Cardboard platform provides access to VR content from a mobile platform, but it also allows for touch and gaze controls.

You read an excerpt from the book Virtual Reality Blueprints, written by Charles Palmer and John Williamson. In this book, you will learn how to create compelling virtual reality experiences for mobile and desktop with three top platforms: Cardboard VR, Gear VR, and OculusVR.

Read More

Top 7 modern Virtual Reality hardware systems
Virtual Reality for Developers: Cardboard, Gear VR, Rift, and Vive