Tech News - 3D Game Development

56 Articles

Think Silicon open sources GLOVE: An OpenGL ES over Vulkan middleware

Savia Lobo
03 Aug 2018
2 min read
Think Silicon, a firm that delivers ultra-low power graphics IP technology, recently open sourced GLOVE (GL Over Vulkan). GLOVE is a middleware that allows Android, Linux, and Windows developers to seamlessly run OpenGL ES on supported hardware by translating OpenGL ES API calls into Vulkan API commands at runtime.

Why GLOVE (GL Over Vulkan)?

OpenGL and OpenGL ES are the most widely used standards for the development of graphics-based applications. The increased complexity of driver implementations based on OpenGL and OpenGL ES led to the introduction of Vulkan, a lower-level API that transfers much of the driver's functionality to the application side. However, in most situations, a direct transition to Vulkan requires additional effort from developers and vendors, who are forced to maintain driver support for both Vulkan and OpenGL. With the introduction of GLOVE, developers can seamlessly transition their software between the Vulkan and OpenGL ES APIs, and vendors can discard duplicate OpenGL ES drivers and rely solely on a lighter implementation of the Vulkan API.

Additional features of GLOVE

GLOVE allows running OpenGL ES calls not only on Android and Linux systems but also on Windows, which was not possible before. It also offers the possibility of running legacy applications and games on top of Vulkan with minimal effort, which saves a lot of work and provides backward compatibility. It lets you quickly explore Vulkan driver capabilities and performance using existing OpenGL ES code, and its modular design can be easily extended to encompass implementations of other client APIs as well.

Dimitris Georgakakis, Team Lead, Graphics Software Stack at Think Silicon, said: "We are excited to release GLOVE™ as an Open Source Project to the graphics developer community and we will continue our efforts to support more platforms and features to ensure GLOVE™ can be useful in a lot of use cases."

Read more about GLOVE (GL Over Vulkan) in the Think Silicon release notes.

Implementing Unity 2017 Game Audio [Tutorial]
Unity assets to create interactive 2D games [Tutorial]


Unreal Engine 4.20 released with focus on mobile and immersive (AR/VR/MR) devices

Sugandha Lahoti
20 Jul 2018
4 min read
Following the release of Unreal Engine 4.19 this April, Epic Games has launched Unreal Engine 4.20. This major update focuses on enhancing scalability and creativity, helping developers create more realistic characters and immersive environments for games, film, TV, and VR/AR devices.

Multiple optimizations for mobile game development

Epic Games brought over 100 optimizations created for Fortnite on iOS and Android to Unreal Engine 4.20. Hardware occlusion queries are now supported on high-end iOS and Android devices that support ES 3.1 or Vulkan using the GPU. Developers can also iterate and debug on Android without having to repackage the UE4 project, and they now have unlimited Landscape Material layers on mobile devices.

Mixed Reality Capture

Unreal Engine 4.20 provides new Mixed Reality Capture functionality, which makes it easy to composite real players into a virtual space for mixed reality applications. It has three components: video input, calibration, and in-game compositing. You can use supported webcams and HDMI capture devices to pull real-world green-screened video into the Unreal Engine from a variety of sources. Setup and calibration are done through a standalone calibration tool that can be reused across Unreal Engine 4 titles.

Niagara visual effects editor

The Niagara visual effects editor is available as an early access plugin. While the Niagara editor builds on the same particle manipulation methods as Cascade (UE4's previous VFX editor), unlike Cascade, Niagara is fully modular. UE 4.20 adds multiple improvements to Niagara effect design and creation. All of Niagara's modules have been updated to support commonly used behaviors in building effects for games, and new UI features have been added to the Niagara stack that mimic the options developers have with UProperties in C++. Niagara now supports GPU simulation on DX11, PS4, Xbox One, OpenGL (ES3.1), and Metal platforms; Niagara CPU simulation works on PC, PS4, Xbox One, OpenGL (ES3.1), and Metal. Niagara was showcased at GDC 2018, and you can watch the presentation Programmable VFX with Unreal Engine's Niagara for a complete overview.

Cinematic Depth of Field

Unreal Engine 4.20 also adds Cinematic Depth of Field, with which developers can achieve cinema-quality camera effects in real time. Cinematic DoF provides a cleaner depth of field effect, producing a cinematic appearance through a procedural Bokeh simulation. It also features dynamic resolution stability, supports the alpha channel, and includes settings to scale it down for console projects. For additional information, see the Depth of Field documentation.

Proxy LOD improvements

The Proxy LOD tool is now production-ready. This tool improves performance by reducing the rendering cost of poly count, draw calls, and material complexity, resulting in significant gains when developing for mobile and console platforms. The production-ready version has several enhancements over the experimental version found in UE4.19. Improved normal control: the user may now supply the hard-edge cutoff angle and the method used in computing the vertex normal. Gap filling: the Proxy system automatically discards any inaccessible structures, resulting in fewer total triangles and better use of the limited texture resource.

Magic Leap One early access support

With Unreal Engine 4.20, game developers can now build for Magic Leap One. Unreal Engine 4 support for Magic Leap One uses built-in UE4 frameworks such as camera control, world meshing, motion controllers, and forward and deferred rendering. For developers with access to hardware, Unreal Engine 4.20 can deploy and run on the device in addition to supporting Zero Iteration workflows through Play In Editor.

Read more
The hype behind Magic Leap's New Augmented Reality Headsets
Magic Leap's first AR headset, powered by Nvidia Tegra X2, is coming this Summer

Apple ARKit 2.0 and Google ARCore 1.2 support

Unreal Engine 4.20 adds support for Apple's ARKit 2.0, bringing better tracking quality, vertical plane detection, face tracking, 2D and 3D image detection, and persistent and shared AR experiences. It also adds support for Google's ARCore 1.2, including vertical plane detection, Augmented Images, and Cloud Anchors for building collaborative AR experiences.

These are just a select few updates to the Unreal Engine. The full list of release notes is available on the Unreal Engine blog.

What's new in Unreal Engine 4.19?
Game Engine Wars: Unity vs Unreal Engine


Meet yuzu – an experimental emulator for the Nintendo Switch

Sugandha Lahoti
17 Jul 2018
3 min read
The makers of Citra, an emulator for the Nintendo 3DS, have released a new emulator called yuzu. This emulator is made for the Nintendo Switch, the seventh major video game console from Nintendo.

The journey so far for yuzu

yuzu was initiated as an experimental setup by Citra's lead developer bunnei after he saw signs that the Switch's operating system was based on the 3DS's operating system. yuzu has the same core code as Citra and much of the same OS High-Level Emulation (HLE). The core emulation and memory management of yuzu are based on Citra, albeit modified to work with 64-bit addresses. It also has a loader for Switch games and Unicorn integration for CPU emulation.

yuzu uses a reverse-engineering (RE) process to figure out how games work and how the Switch GPU works. The Switch's GPU is more advanced than the 3DS GPU emulated in Citra and poses multiple challenges to reverse engineer. However, the RE process of yuzu is essentially the same as Citra's: most of the RE and other development is done in a trial-and-error manner.

OS emulation

The Switch's OS is based on the Nintendo 3DS's OS, so the developers reused a large part of Citra's OS HLE code for yuzu. The loader and file system service were reused from Citra and modified to support Switch game dump files. The kernel OS threading, scheduling, and synchronization fixes for yuzu were also ported from Citra's OS implementation, and the save data functionality, which allows games to read and write files to the save data directory, was likewise taken from the 3DS. Switchbrew helped them create libnx, a userland library for writing homebrew apps for the Nintendo Switch. (Homebrew is a popular term for applications created and executed on a video game console by hackers, programmers, developers, and consumers.)

The Switch IPC (inter-process communication) system is much more robust and complicated than the 3DS's, with different command modes, a typical IPC request response, and a Domain to efficiently conduct multiple service calls. yuzu uses the Nvidia services to configure the video driver and get graphics output. However, Nintendo re-purposed the Android graphics stack for rendering on the Switch, so the yuzu developers had to implement this even to get homebrew applications to display graphics.

The next steps

Being at a nascent stage, yuzu still has a long way to go. The developers still have to add HID (user input) support, such as support for all 9 controllers, rumble, LEDs, layouts, and so on. Audio HLE is currently in progress, but audio playback is yet to be implemented. Audio playback, if implemented properly, would be a major breakthrough, as many complicated games hang or deadlock because of this issue. They are also working on minor fixes to help games like Super Mario Odyssey, 1-2-Switch, and The Binding of Isaac boot further.

Be sure to read the entire progress report on the yuzu blog.

AI for game developers: 7 ways AI can take your game to the next level
AI for Unity game developers: How to emulate real-world senses in your NPC agent behavior
Unity 2018.2: Unity release for this year 2nd time in a row!

What’s got game developers excited about Unity 2018.2?

Amarabha Banerjee
26 Jun 2018
3 min read
The undisputed leader of game engines over the last few years has been Unity. It brings .NET professionals and enthusiasts from across the globe under the gaming umbrella with its C# game scripting feature. Unity also boasts a very active community and an even busier release schedule. Unity follows semantic versioning: under this scheme, version numbers and the way they change convey meaning about the underlying code and what has been modified from one version to the next. Unity has just released its 2018.2 beta version. Here are some exciting features you can look forward to while working in Unity 2018.2.

Texture mipmap streaming: If you are a game developer, saving GPU memory is probably one of your top priorities. Unity 2018.2 gives you control over which mipmap levels are loaded, with the CPU deciding what to stream. Previous versions loaded all the mipmaps at the same time, putting a huge amount of load on GPU memory. While this memory allocation helps reduce GPU load, it does add a small amount of CPU load.

Improved Package Manager: Unity 2018.2 comes with an improved package manager. The improvements are in the UI font and the status of package labels. It also now has the ability to dock the window and provides easy access to both documentation and the list of changes.

Improvements in the Particle System: The Unity 2018.2 beta comes with an improved particle system and new scripting APIs for baking the geometry of a Particle System into a Mesh. Unity now allows up to eight texture coordinates to be used on meshes and passed to shaders. Particle Systems will also now convert their colors into linear space, when appropriate, before uploading them to the GPU.

Camera improvements: Unity has made some major improvements to the camera, the way it functions, and the way it renders objects in the game to portray them like real-life objects.

Animation Jobs C# API: Unity 2018.2 improves the AnimationPlayables by allowing users to write their own C# Playables that can interact directly with the animation data. This allows integration of user-made IK solvers, procedural animation, or even custom mixers into the current animation system.

These features, along with other improvements and bug fixes, are sure to help developers create better and smarter games with the latest Unity 2018.2. To learn more about the Unity 2018.2 features, you can visit the official Unity blog.

How to use arrays, lists, and dictionaries in Unity for 3D game development
Build an ARCore app with Unity from scratch
Implementing lighting & camera effects in Unity 2018
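The mipmap streaming feature described above is driven from C# through the quality settings. The snippet below is a minimal sketch of how a project might opt in at startup, assuming Unity 2018.2 or later; the component name, the serialized texture field, and the memory budget value are illustrative assumptions, not code from the release.

using UnityEngine;

// Hypothetical bootstrap component: enables texture mipmap streaming
// and requests a coarser mip level for one distant texture.
public class MipmapStreamingBootstrap : MonoBehaviour
{
    // Assumed to be assigned in the Inspector and marked as
    // "Streaming Mipmaps" in its texture import settings.
    [SerializeField] private Texture2D distantTexture;

    private void Start()
    {
        // Turn on the streaming system and cap the memory it may use (in MB).
        QualitySettings.streamingMipmapsActive = true;
        QualitySettings.streamingMipmapsMemoryBudget = 512f; // illustrative budget

        if (distantTexture != null)
        {
            // Ask the streaming system to load only mip 2 and smaller for this texture.
            distantTexture.requestedMipmapLevel = 2;
        }
    }
}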


Unite Berlin 2018 Keynote: Unity partners with Google, launches ML-Agents Toolkit 0.4, Project MARS and more

Sugandha Lahoti
20 Jun 2018
5 min read
Unite Berlin 2018, Unity's annual developer conference, kicked off on June 19, 2018. This three-day extravaganza is filled with new announcements, sessions, and workshops from the creators of Unity. It's a place to develop, network, and participate with artists, developers, filmmakers, researchers, storytellers, and other creators. Day 1 was inaugurated with the keynote, presented by John Riccitiello, CEO of Unity Technologies. It featured previews of upcoming Unity technology, most prominently Unity's alliance with Google Cloud to help developers build connected games. Let's take a look at what was showcased.

Connected games with Unity and Google Cloud

Unity and Google Cloud have collaborated to help developers create real-time multiplayer games. They are building a suite of managed services and tools to help developers build, test, and run connected experiences while offloading the hard work of quickly scaling game servers to Google Cloud. Games can easily be scaled to meet the needs of the players, and game developers can harness the massive power of Google Cloud without having to be cloud experts. Here's what Google Cloud with Unity has in store: game-server hosting (streamlined resources to develop and scale hosted multiplayer games), a sample FPS (a production-quality sample project of a real-time multiplayer game), and a new ECS networking layer (fast, flexible networking code that delivers performant multiplayer by default).

Unity ML-Agents Toolkit v0.4

A new version of the Unity ML-Agents Toolkit was also announced at Unite Berlin. The v0.4 toolkit hosts multiple updates requested by the Unity community. Game developers now have the option to train environments directly from the Unity editor, rather than as built executables: they simply launch the learn.py script and then press the "play" button from within the editor to perform training. Two new challenging environments have also been launched, Walker and Pyramids; Walker is a physics-based humanoid ragdoll and Pyramids is a complex sparse-reward environment. There are also algorithmic improvements in reinforcement learning, so agents now learn to solve tasks that were previously learned only with great difficulty. Unity is also partnering with Udacity to launch the Deep Reinforcement Learning Nanodegree to help students and professionals gain a deeper understanding of reinforcement learning.

Augmented reality with Project MARS

Unity also announced Project MARS, a Mixed and Augmented Reality Studio that will be provided as a Unity extension. The studio will allow game developers to build AR and MR applications that intelligently interact with any real-world environment, with little-to-no custom coding.

Unite Berlin - AR Keynote Reel

MARS will include abstract layers for object recognition, location, and map data. It will have sample templates with simulated rooms for testing against different environments inside the editor. AR-specific gizmos will be provided to easily define spatial conditions like plane size, elevation, and proximity without requiring code or precise measurements. It will also support elements ranging from face masks to avatars to entire rooms of digital art. Project MARS will be coming to Unity as an experimental package later this year. Unity has also unveiled a Facial AR Remote component. Powered by augmented reality, this component captures performances and drives animated characters with them, allowing filmmakers and CGI developers to shoot CG content with body movement, just as they would with live action.

Kinematica - machine learning powered animation system

Unity also showcased its AI research by announcing Kinematica, an all-new ML-powered animation system. Kinematica goes beyond traditional animation systems, which generally require animators to explicitly define transitions. Kinematica does not have any superimposed structure, like graphs or blend trees; it generates smooth transitions and movements by applying machine learning to any data source. Game developers and animators no longer need to manually map out animation graphs.

Unite Berlin 2018 - Kinematica Demo

Kinematica decides in real time how to combine data clips from a single library into a sequence that matches the controller input, the environment content, and the gameplay requests. As with Project MARS, Kinematica will also be available later this year as an experimental package.

New Prefab workflows

The entire Prefab system has been revamped with multiple improvements, and the improved Prefab workflow is now available as a preview build. New additions include Prefab Mode, prefab variants, and nested prefabs. Prefab Mode allows faster, more efficient, and safer editing of Prefabs in an isolated mode, without adding them to the actual scene. Developers can now edit model prefabs, and the changes are propagated to all prefab variants. With nested prefabs, teams can work on different parts of a prefab and then come together for the final asset.

Predictive Personalized Placements

Personalized Placements bring the best of both worlds to players and the commercial business. With this new feature, game developers can create tailor-made game experiences for each player. The feature runs on an engine powered by predictive analytics, which determines what to show each player based on what will drive the highest engagement and lifetime value. This could be an ad, an IAP promotion, a notification of a new feature, or a cross-promotion, and the algorithm will only get better with time.

These were only a select few of the announcements presented in the Unite Berlin keynote. You can watch the full video on YouTube. Details on other sessions, seminars, and activities are available on the Unite website.

GitHub for Unity 1.0 is here with Git LFS and file locking support
Unity announces a new automotive division and two-day Unity AutoTech Summit
Put your game face on! Unity 2018.1 is now available


GitHub for Unity 1.0 is here with Git LFS and file locking support

Sugandha Lahoti
19 Jun 2018
3 min read
GitHub for Unity is now available in version 1.0. GitHub for Unity 1.0 is a free and open source Unity editor extension that brings Git into Unity 5.6, 2017.x, and 2018.x. GitHub for Unity was announced as an alpha version in March 2017, and the beta version was released earlier this year. Now the full release, GitHub for Unity 1.0, is available just in time for Unite Berlin 2018, scheduled for June 19-21.

GitHub for Unity 1.0 helps you stay in sync with your team: you can collaborate with other developers, pull down recent changes, and lock files to avoid troublesome merge conflicts. It also introduces two key features for game developers and their teams: managing large assets and critical scene files with Git, with the same ease as managing code files.

Updates to Git LFS

GitHub for Unity 1.0 has improved Git and Git LFS support for Mac. Git Large File Storage (LFS) replaces large files such as audio samples, videos, datasets, and graphics with text pointers inside Git. Previously, the package included full portable installations of Git and Git LFS; now these are downloaded when needed, reducing the package size to 1.6 MB. Critical Git and Git LFS updates and patches are now distributed faster and in a more flexible way.

File locking

File locking management is now a top-level view within the GitHub window. With this new feature, developers can lock or unlock multiple files.

Other features include:
Diffing support to visualize changes to files. The diffing program can be customized (set in the "Unity Preferences" area) directly from the "Changes" view in the GitHub window.
No command-line hassles: developers can view project history, experiment in branches, craft a commit from their changes, and push their code to GitHub without leaving Unity.
A Git action bar for essential operations.
A notification within Unity whenever a new version is available, with the choice to download or skip the update.
Easy email sign-in: developers can sign in to their GitHub account with their GitHub username or the email address associated with their account.

GitHub for Unity 1.0 is available for download at unity.github.com and from the Unity Asset Store. Andreia Gaita, the project's lead developer, will give a GitHub for Unity talk on June 19 at Unite Berlin to explain how to incorporate Git into your game development workflow.

Put your game face on! Unity 2018.1 is now available
Unity announces a new automotive division and two-day Unity AutoTech Summit
AI for Unity game developers: How to emulate real-world senses in your NPC agent behavior
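Under the hood, Git LFS decides which files to replace with pointers based on patterns in a .gitattributes file, and the lockable attribute marks files that should be locked before editing. The snippet below is an illustrative .gitattributes for a Unity project; the specific patterns are assumptions for the sake of example, not defaults shipped with the GitHub for Unity extension.

# Illustrative .gitattributes for a Unity project using Git LFS
# (patterns are examples, not defaults of the extension)

# Store large binary assets in LFS instead of the Git object database
*.psd filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text

# Scene and prefab files: keep in LFS and mark them lockable, so a lock
# is taken before editing and merge conflicts are avoided
*.unity  filter=lfs diff=lfs merge=lfs -text lockable
*.prefab filter=lfs diff=lfs merge=lfs -text lockable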

CRYENGINE 5.5 preview 3 goes live!

Natasha Mathur
25 May 2018
3 min read
After 400+ improvements, the latest CRYENGINE 5.5 Preview 3 is live. Typically, CRYENGINE goes through three preview cycles before a stable release. The latest, Preview 3, is packed with code interface changes, improvements to animation and graphics, and changes to the AI and core systems, along with other additional fixes. The developer team at CRYENGINE worked hard on the earlier previews to improve efficiency for fellow game developers across the globe, and feedback from the CRYENGINE community played a pivotal role in this process. Here's a quick rundown of the major new features and updates in CRYENGINE 5.5 Preview 3:

Code interface changes

The new CryGamePlatform plugin (wrapping the Steam and PSN APIs) replaces CryLobby. The r_WindowType CVar has replaced the r_Fullscreen and r_FullscreenWindow CVars. The Windows key is now enabled in the engine by default; use g_disableWinKeys=1 to return to the prior behavior. The IID function is no longer needed by Entity Components. The ICryPlugin interface has been renamed to IEnginePlugin and moved to the Cry namespace. Plugins no longer need to implement the GetName and GetCategory functions, because default implementations have been added. CryAction, a game framework, is currently being removed; its vital parts will move into other parts of the engine. Get the full information in the CRYENGINE code interface changes documentation.

Animation and graphics improvements

Animation: Attachments such as pistols and shotguns were wrongly attached in cases where compute skinning is used; this has been resolved. A race-condition crash in PoseAlignerChain when using ground alignment IK has also been fixed.

Graphics: A new Profile section has been added to help detect GetOrCreateInputLayout's shader reflection requests. A new annotating HLSL shader that includes resource layout descriptors has been added for Vulkan, and HLSLcc local shader compilation is another newly added Vulkan feature. Transformation and rotation issues for objects with sub-objects have been fixed. New screen fill stats have been added. The new ActivateRandom feature has enhanced grouping capabilities and supports all the possibilities of the old SecondGen features.

AI system and core/system changes

AI system: Islands connectivity, a brand-new feature, takes navigation annotations into account. A bug where registering a smart object as an off-mesh link did not work after creating a new level has also been fixed.

Core: A new CryAPIExamples module has been introduced to compile Doxygen snippets, and simple IEntityAudioComponent examples have been added. Loading plugins from disk has been disabled when shared libraries are unsupported, which helps fix the PS4 startup issue. Preview drawing has been added to the Physics Constraint Components.

Apart from these changes and improvements, there are additional fixes: a CVar has been added to draw the full extent of the last cached shadow cascade, an unused ColorGrading technique that was causing compiler errors has been removed, and more. You can access the 5.5.0 Preview 3 release through the CRYENGINE launcher and GitHub.

These are just a few selected updates to CRYENGINE 5.5 Preview 3. The full list of over 400 changes, updates, and other known issues is available in the CRYENGINE release notes documentation.

What's new in Unreal Engine 4.19?
Put your game face on! Unity 2018.1 is now available
Working with Unity Variables to script powerful Unity 2017 games


Put your game face on! Unity 2018.1 is now available

Sugandha Lahoti
07 May 2018
4 min read
Unity Technologies has announced the release of its latest platform update, Unity 2018.1, giving artists, developers, and game engineers the power to express their talents and collaborate more efficiently to build games. Unity 2018.1 also marks the start of a new release cycle. Since 2017, Unity has followed a release plan under which a new version ships every quarter, and Unity 2018.1 is the first version of the 2018 series. According to Brett Bibby, VP of Engineering at Unity Technologies, "With Unity 2018.1 we are introducing one of the largest upgrades in the history of our company, and it's centered around two major concepts - next-level rendering and performance by default." This release features two new upgrades, the Scriptable Render Pipeline and the Entity Component System, which together make it easier for creators to build richer experiences that use modern hardware to deliver beautiful graphics.

Next-level rendering with the Scriptable Render Pipeline (SRP)

The Scriptable Render Pipeline (SRP) is available in preview in Unity 2018.1. With SRP, developers and technical artists can work directly with hardware and GPUs without having to go through millions of lines of C++ engine code. SRP makes it easy to customize the rendering pipeline via C# code and material shaders. Unity 2018.1 also introduces two render pipelines: the High-Definition Render Pipeline (HD RP) for developers with AAA aspirations, and the Lightweight Render Pipeline (LW RP) for those looking for a combination of graphics and speed, which optimizes battery life for mobile devices and similar platforms.

Performance by default with the C# Job System and Entity Component System (ECS)

The C# Job System enables developers to write very fast, parallelized code in C# to take full advantage of multicore processors. It also provides protection from the pitfalls of multithreading, such as race conditions and deadlocks. The runtime system is combined with a new programming model, the Entity Component System. This new runtime enables developers to use multicore processors without worrying about the low-level programming, and to spend that power on more effects, more complexity, or AI that makes their creations richer and more immersive. It uses a data-oriented design instead of an object-oriented approach, which makes the code easier to reuse and easier for others to understand and work on.

Level design and shaders

Unity 2018.1 reduces the time and effort required by artists, designers, and developers by allowing them to create levels, cinematic content, and gameplay sequences without coding. New tools like ProBuilder/Polybrush and the new visual Shader Graph offer intuitive ways to design levels and create shaders without programming skills. ProBuilder is a unique hybrid of 3D-modeling and level-design tools, optimized for building simple geometry but capable of detailed editing and UV unwrapping as needed. With Polybrush, developers can blend textures and colors, sculpt meshes, and scatter objects directly in the Unity editor. Shader Graph builds shaders visually using a designer tool, without writing a single line of code, offering drag-and-drop usability to create and connect nodes in a graph network.

Unity Package Manager UI

Unity 2018.1 builds on the package manager introduced in Unity 2017.2. It adds a newly released Package Manager user interface, the Hub, and Project Templates to help start new projects faster and more efficiently. The Unity Package Manager UI improves the following aspects of the project management workflow: quick access to newly released features, instant access to the latest fixes, access to preview features, and easy sharing of lightweight projects.

Unity 2018.1 offers support for more than 25 platforms, including Magic Leap One, Oculus Go, ARCore 1.1, Android ARM64, Daydream Standalone, and more. You can refer to the release notes for the full list of new features, improvements, and fixes. Unity will be showcasing all of its latest innovations during Unite Berlin, scheduled for June 19-21, 2018.

Unity plugins for augmented reality application development
Game Engine Wars: Unity vs Unreal Engine
Unity releases ML-Agents v0.3: Imitation Learning, Memory-Enhanced Agents and more
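The C# Job System described above is exposed through the Unity.Jobs and Unity.Collections APIs. Below is a minimal sketch of a parallel job, assuming Unity 2018.1 or later; the job name and the data it processes are invented for illustration rather than taken from the release.

using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

// Hypothetical example: scale a buffer of values on worker threads
// using the C# Job System introduced in Unity 2018.1.
public class JobSystemExample : MonoBehaviour
{
    struct ScaleJob : IJobParallelFor
    {
        public NativeArray<float> values;
        public float factor;

        public void Execute(int index)
        {
            values[index] *= factor;
        }
    }

    void Start()
    {
        var values = new NativeArray<float>(1024, Allocator.TempJob);
        var job = new ScaleJob { values = values, factor = 2f };

        // Schedule the work across worker threads in batches of 64,
        // then block until every batch has completed.
        JobHandle handle = job.Schedule(values.Length, 64);
        handle.Complete();

        Debug.Log("First scaled value: " + values[0]);
        values.Dispose(); // NativeArray memory must be released manually
    }
}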


How to create non-player Characters (NPC) with Unity 2018

Amarabha Banerjee
26 Apr 2018
10 min read
Today, we will learn to create game characters while focusing mainly on non-player characters. Our Cucumber Beetles will serve as our game's non-player characters and will be the Cucumber Man's enemies. We will incorporate Cucumber Beetles in our game through direct placement. We will review the beetles' 11 animations and make changes to the non-player character's animation controller. In addition, we will write scripts to control the non-player characters. We will also add cucumber patches, cucumbers, and cherries to our game world.

Understanding the non-player characters

Non-player characters, commonly referred to as NPCs, are simply game characters that are not controlled by a human player. These characters are controlled through scripts, and their behaviors are usually responsive to in-game conditions. Our game's non-player characters are the Cucumber Beetles. These beetles, as depicted in the following screenshot, have six legs that they can walk on; under special circumstances, they can also walk on their hind legs. Cucumber Beetles are real insects and are a threat to cucumbers. They cannot really walk on their hind legs, but they can in our game.

Importing the non-player characters into our game

You are now ready to import the asset package for our game's non-player character, the Cucumber Beetle. Go through the following steps to import the package:

1. Download the Cucumber_Beetle.unitypackage file from the publisher's companion website.
2. In Unity, with your game project open, select Assets | Import Package | Custom Package from the top menu.
3. Navigate to the location of the asset package you downloaded in step 1 and click the Open button.
4. When presented with the Import Asset Package dialog window, click the Import button.

As you will notice, the Cucumber_Beetle asset package contains several assets related to the Cucumber Beetles, including a controller, scripts, a prefab, animations, and other assets. Now that the Cucumber_Beetle asset package has been imported into our game project, we should save our project; use the File | Save Project menu option.

Next, let's review what was imported. In the Project panel, under Assets | Prefabs, you will see a new Beetle.Prefab. Also in the Project panel, under Assets, you will see a Beetle folder. It is important that you understand what each component in the folder is for. Please refer to the following screenshot for an overview of the assets that you will be using in this chapter in regard to the Cucumber Beetle. The other assets in that screenshot that were not called out include a readme.txt file, the texture and materials for the Cucumber Beetle, and the source files. We will review the Cucumber Beetle's animations in the next section.

Animating our non-player characters

Several Cucumber Beetle animations have been prepared for use in our game. Here is a list of the animation names as they appear in our project, along with brief descriptions of how we will incorporate each animation into our game.
The animations are listed in alphabetical order by name (animation name - usage details):

Attack_Ground - The beetle attacks the Cucumber Man's feet from the ground
Attack_Standing - The beetle attacks the Cucumber Man from a standing position
Die_Ground - The beetle dies from a starting position on the ground
Die_Standing - The beetle dies from a starting position of standing on its hind legs
Eat_Ground - The beetle eats cucumbers while on the ground
Idle_Ground - The beetle is not eating, walking, fighting, or standing
Idle_Standing - The beetle is standing, but not walking, running, or attacking
Run_Standing - The beetle runs on its hind legs
Stand - The beetle goes from an on-the-ground position to standing (it stands up)
Walk_Ground - The beetle walks using its six legs
Walk_Standing - The beetle walks on its hind legs

You can preview these animations by clicking on an animation file, such as Eat_Ground.fbx, in the Project panel. Then, in the Inspector panel, click the play button to watch the animation. There are 11 animations for our Cucumber Beetle, and we will use scripting later to determine when an animation is played. In the next section, we will add the Cucumber Beetle to our game.

Incorporating the non-player characters into our game

First, let's simply drag the Beetle.Prefab from the Assets/Prefab folder in the Project panel into our game in Scene view. Place the beetle somewhere in front of the Cucumber Man so that the beetle can be seen as soon as you put the game into game mode; a suggested placement is illustrated in the following screenshot. When you put the game into game mode, you will notice that the beetle cycles through its animations. If you double-click the Beetle.controller in the Assets | Beetle folder in the Project panel, you will see, as shown in the following screenshot, that we currently have several animations set to play successively and repeatedly. This initial setup is intended to give you a first, quick way of previewing the various animations. In the next section, we will modify the animation controller.

Working with the Animation Controller

We will use an Animation Controller to organize our NPCs' animations. The Animation Controller will also be used to manage the transitions between animations. Before we start making changes to our Animation Controller, we need to identify what states our beetle has and then determine what transitions each state can have in relation to other states. Here are the states that the beetle can have, each tied to an animation:

Idle on Ground
Walking on Ground
Eating on Ground
Attacking on Ground
Die on Ground
Stand
Standing Idle
Standing Walk
Standing Run
Standing Attack
Die Standing

With the preceding list of states, we can assign the following transitions. From Idle on Ground to: Walking on Ground, Running on Ground, Eating on Ground, Attacking on Ground, and Stand. From Stand to: Standing Idle, Standing Walk, Standing Run, and Standing Attack. Reviewing the transitions from Idle on Ground to Stand demonstrates the type of state-to-state transition decisions you need to make for your game.

Let's turn our attention back to the Animation Controller window. You will notice that there are two tabs in the left panel of that window: Layers and Parameters. The Layers tab shows a Base Layer; while we can create additional layers, we do not need to do this for our game. The Parameters tab is empty, and that is fine. We will make our changes using the Layout area of the Animation Controller window, which is the area with the grid background.
Let's start by making the following changes. For all 11 New State buttons, do the following:

1. Left-click the state button.
2. Look in the Inspector panel to determine which animation is associated with the state button.
3. Rename the state in the Inspector panel to reflect the animation.
4. Click the return button.
5. Double-check the state button to ensure your change was made.

When you have completed the preceding five steps for all 11 states, your Animation Controller window should match the following screenshot. If you were to put the game into game mode, you would see that nothing has changed; we only changed the state names so they made more sense to us. So, we have some more work to do with the Animation Controller.

Currently, the Attacking on Ground state is the default. That is not what we want; it makes more sense for the Idle on Ground state to be our default. To make that change, right-click the Idle on Ground state and select Set as Layer Default State.

Next, we need to make a series of changes to the state transitions. There are a lot of states, and there will be a lot of transitions. In order to make things easier, we will start by deleting all the default transitions. To accomplish this, left-click each white line with an arrow and press your keyboard's Delete key. Do not delete the orange line that goes from Entry to Idle on Ground. After all transitions have been deleted, you can drag your states around so you have more working room; you might temporarily reorganize them in a manner similar to what is shown in the following screenshot.

Our next task is to create all of our state transitions. Follow these steps for each state transition you want to add:

1. Right-click the originating state.
2. Select Create Transition.
3. Click on the destination state.

Once you have made all your transitions, you can reorganize your states to declutter the Animation Controller's layout area. A suggested final organization is provided in the following screenshot. As you can see in our final arrangement, we have 11 states and over two dozen transitions. You will also note that the Die on Ground and Die Standing states do not have any transitions. In order for us to use these animations in our game, they must be placed into an Animation Controller.

Let's run a quick experiment:

1. Select the Beetle character in the Hierarchy panel.
2. In the Inspector panel, click the Add Component button.
3. Select Physics | Box Collider.
4. Click the Edit Collider button.
5. Modify the size and position of the box collider so that it encases the entire beetle body.
6. Click the Edit Collider button again to get out of edit mode.

Your box collider should look similar to what is depicted in the following screenshot. Next, let's create a script that invokes the Die on Ground animation when the Cucumber Man character collides with the beetle. This will simulate the Cucumber Man stepping on the beetle. Follow these steps:

1. Select the Beetle character in the Hierarchy panel.
2. In the Inspector panel, click the Add Component button.
3. Select New Script.
4. Name the script BeetleNPC.
5. Click the Create and Add button.
6. In the project view, select Favorites | All Scripts | BeetleNPC.
7. Double-click the BeetleNPC script file.
8. Edit the script so that it matches the following code block:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class BeetleNPC : MonoBehaviour
{
    Animator animator;

    // Use this for initialization
    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Collision Detection Test
    void OnCollisionEnter(Collision col)
    {
        if (col.gameObject.CompareTag("Player"))
        {
            animator.Play("Die on Ground");
        }
    }
}

This code detects a collision between the Cucumber Man and the beetle. If a collision is detected, the Die on Ground animation is played. As you can see in the following screenshot, the Cucumber Man defeated the Cucumber Beetle.

This short test demonstrated two important things that will help us further develop this game. First, earlier in this section you renamed all the states in the Animation Controller window, and the names you gave the states are the ones you will reference in code. Second, since the animation we used did not have any transitions to other states, the Cucumber Beetle will remain in the final position of the animation unless we script it otherwise. So, if we had 100 beetles and defeated them all, all 100 would remain on their backs in the game world.

This was a simple and successful scripting test for our Cucumber Beetle. We will need to write several more scripts to manage the beetles in our game (one possible next step is sketched after this extract). First, there are some game world modifications we will make.

To summarize, we discussed how to create interesting character animations and bring them to life using the Unity 2018 platform. You read an extract from the book Getting Started with Unity 2018 written by Dr. Edward Lavieri. This book gives you a practical understanding of how to get started with Unity 2018.

Read More
Unity 2D & 3D game kits simplify Unity game development for beginners
Build a Virtual Reality Solar System in Unity for Google Cardboard
Unity plugins for augmented reality application development
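The extract above ends by noting that several more scripts will be needed to manage the beetles. As one illustration of where that scripting could go next, the sketch below plays the attack state when the Cucumber Man gets close. It is not code from the book: the component name, the public player reference, and the detection radius are assumptions for illustration; only the state name matches the Animation Controller set up in the extract.

using UnityEngine;

// Hypothetical follow-up to BeetleNPC: switch the beetle into its
// attack state when the player character comes within range.
public class BeetleProximityAttack : MonoBehaviour
{
    public Transform player;        // assumed: assigned in the Inspector
    public float attackRange = 3f;  // illustrative detection radius

    Animator animator;
    bool attacking;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        if (attacking || player == null)
        {
            return;
        }

        float distance = Vector3.Distance(transform.position, player.position);
        if (distance <= attackRange)
        {
            attacking = true;
            // State name must match the one defined in the Animation Controller.
            animator.Play("Attacking on Ground");
        }
    }
}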

Unity 2D &amp; 3D game kits simplify Unity game development for beginners

Amey Varangaonkar
18 Apr 2018
2 min read
The rise of the video game industry over the last two decades has been staggering, to say the least. In an area with massive revenue potential, we have seen a revolution in the way games are designed, developed, and played across various platforms. Unity, the most popular cross-platform game development platform, is now encouraging even non-programmers to take up Unity game development by equipping them with state-of-the-art tools for designing interactive games.

Unity game development simplified for non-developers

These days, there are a lot of non-developers, game designers, and even artists who wish to build their own games. Well, they are now in for a treat. Unity has come up with its 2D and 3D Game Kits, with which users can develop 2D or 3D gameplay without the need to code. With the help of these game kits, beginners can utilize the elements, tools, and systems within the kit to design their gameplay. The Unity 2D Game Kit currently supports Unity 2017.3 and higher, while the 3D Game Kit requires Unity 2018.1 or higher.

Visual scripting with Bolt

Unity has also introduced a new visual scripting tool called Bolt, which allows non-programmers to create new gameplay from scratch and design interactive systems in Unity without having to write a single line of code. With live editing, predictive debugging, and a whole host of other features, Bolt ensures you can get started designing your own game in no time at all.

The idea behind introducing these game kits and the Bolt scripting engine is to encourage more and more non-programmers to take up game development and let their creative juices flow. They will also serve as a starting point for absolute beginners to start their journey in game development. To know more about how to use these Unity game kits, check out the introduction to game kits by Unity.


What's new in Unreal Engine 4.19?

Sugandha Lahoti
16 Apr 2018
3 min read
The highly anticipated Unreal Engine 4.19 is now generally available. This release hosts a new Live Link plugin, improvements to Sequencer, a new Dynamic Resolution feature, and multiple workflow and usability improvements. In addition to these major updates, the release also features a massive 128 improvements based on submissions from the Unreal Engine developer community on GitHub. Unreal Engine 4.19 lets game developers know exactly what their finished game will look like at every step of the development process. The update comes with three major goals: let developers step inside the creative process, build gaming worlds that run faster than ever before, and give developers full control. Here's a list of the major features and what they bring to the game development process.

New Unreal Engine 4.19 features

Live Link Plugin improvements

The Maya Live Link Plugin is now available and can be used to establish a connection between Maya and UE4 to preview changes in real time. Virtual Subjects have been added to Live Link, and it can also be used with Motion Controllers. Live Link Sources can now define their own custom settings, and a virtual Initialization function and an Update DeltaTime parameter have been added to the Live Link Retargeter API.

Unified Unreal AR framework

The Unreal Augmented Reality Framework provides a unified framework for building augmented reality (AR) apps for both Apple and Google handheld platforms using a single code path. Features include functions supporting alignment, light estimation, pinning, session state, trace results, and tracking.

Temporal upsampling

The new upscaling method, Temporal Upsample, works with two separate screen percentages: a primary screen percentage that by default uses the spatial upscale pass as before, and a secondary screen percentage that is a static, spatial-only upscale performed at the very end of post-processing, before the UI draws.

Dynamic resolution

Dynamic Resolution adjusts the resolution to achieve the desired frame rate for games on PlayStation 4 and Xbox One. It uses a heuristic to set the primary screen percentage based on the previous frame's GPU workload.

Physical light units

All light units are now defined using physically based units. The new light unit property can be edited per light, changing how the engine interprets the intensity property when doing lighting-related computations.

Landscape rendering optimization

The landscape level of detail (LOD) system now uses screen size to determine detail for a component, similar to how the Static Mesh LOD system works.

Starting from this release, all existing UE4 content that supports SteamVR is compatible with HTC's newly announced Vive Pro. These are just a select few updates to the Unreal Engine. The full list of release notes is available on the Unreal Engine forums.