
Tech News - Game Development

93 Articles

Corona Labs open sources Corona, its free and cross-platform 2D game engine

Natasha Mathur
03 Jan 2019
3 min read
Corona Labs announced yesterday that it is making Corona, its free and cross-platform 2D game engine, available as open source under the GPLv3 license as well as commercial licenses. The license for builds and releases remains unchanged; the change applies only to the source code of the engine.

Corona is a popular engine for creating 2D games and apps for mobile, desktop systems, TV platforms, and the web. It is based on the Lua language and makes use of over 1,000 built-in APIs and plugins, as well as Corona Native extensions (C/C++/Obj-C/Java).

According to Vlad Sherban, product manager for Corona Labs, the Corona team had been discussing making Corona open source ever since it was acquired by Appodeal back in 2017. "We believe that this move will bring transparency to the development process, and will allow users to contribute features or bug fixes to make the project better for everyone," said Sherban.

The team also says that transitioning to open source will help it respond quickly to market shifts and changes, and will ensure that Corona stays relevant for all mobile app developers. Moreover, now that Corona is open source, the development process becomes more visible: users can see what the engine team is working on and where the project is going. It also offers extra benefits for businesses, which can acquire a commercial license for the source code and customize the engine for certain commercial projects.

Additionally, Corona Labs won't be collecting any statistics from apps built with daily build 2018.3454 or later. When Corona was a closed-source product, it collected basic app usage stats such as the number of sessions, daily average users, and so on. With Corona available as open source, there is no need to collect this data.

"Powered by the new open source model and supported by the development of new features and bug fixes will make Corona more community driven — but not without our help and guidance — going open source will provide confidence in the future of the engine and an opportunity to grow community involvement in engine development," said Sherban.

NVIDIA open sources its game physics simulation engine, PhysX, and unveils PhysX SDK 4.0
Microsoft open sources Trill, a streaming engine that employs algorithms to process "a trillion events per day"
Facebook contributes to MLPerf and open sources Mask R-CNN2Go, its CV framework for embedded and mobile devices

Fortnite server suffered a minor outage, Epic Games was quick to address the issue

Prasad Ramesh
27 Dec 2018
2 min read
Yesterday, many Fortnite players reported long queues for matches and timeouts while trying to play the game. The outage hit during the holiday season.

https://twitter.com/Soldier_Dimitri/status/1078029461461913614

Epic Games knew about the issue and tweeted that an investigation was underway to find the cause of the timeouts and slowdowns some users were seeing when trying to log in and play. Epic told players to check the status on its website.

https://twitter.com/FortniteGame/status/1078027774034657282

TechCrunch noted the Fortnite outage and was able to replicate it: the game held players for about five minutes and then timed out. Epic Games reported a "minor service outage" that affected game services. Within three hours of acknowledging the problem, Epic Games issued a fix and let users know on Twitter:

https://twitter.com/FortniteGame/status/1078065448585965568

A member of the Epic Games team explained the reason for the outage on Reddit: "Quick summary is that deploying a fix for elf challenge reward not being granted exposed a latent bug in our profile migration code (the code that fixes up players). This caused players to be kicked etc and triggered our waiting room. We fixed the issue and deployed a new backend, however, didn't see a recovery in login success. This ended up due to 'sticky session' configuration having been lost on our waiting room load balancers when moving them to ALBs. This meant that there was a 90% chance of having to requeue after hitting the front of the line. D'oh. This should all be fixed now and we are seeing a recovery in numbers/login throughput / waiting room / etc."

Gamers appreciated the company's transparency in letting users know what happened: "As a company, I applaud you for your transparency. It is highly unusual and I hope you continue to set precedence in this industry."

Fortnite creator Epic Games launches Epic Games store where developers get 88% of revenue earned, challenging Valve's dominance
Epic Games CEO calls Google "irresponsible" for disclosing the security flaw in Fortnite Android Installer before patch was ready
Google is missing out on $50 million because of Fortnite's decision to bypass Play Store
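The 90% requeue figure quoted in the Reddit explanation above has a simple interpretation: if a player's waiting-room position lives on one specific backend and an un-sticky load balancer spreads each request uniformly across the pool, the chance of landing back on the "right" backend is only 1/N. The sketch below is a hypothetical illustration of that effect; the pool size of ten backends is an assumption chosen only to match the quoted 90%, not a figure from Epic's post.

```python
import random

# Hypothetical model of the lost "sticky session" behaviour described above:
# a player's waiting-room state lives on one backend, but without stickiness
# every request is routed to a uniformly random backend. The pool size is an
# assumption picked to match the 90% requeue chance Epic quoted.
BACKENDS = 10
TRIALS = 100_000

requeues = 0
for _ in range(TRIALS):
    state_host = random.randrange(BACKENDS)  # backend holding the queue position
    next_host = random.randrange(BACKENDS)   # backend the next request hits
    if next_host != state_host:
        requeues += 1                        # state not found -> back of the line

print(f"Simulated requeue chance: {requeues / TRIALS:.1%}")
print(f"Analytic value 1 - 1/N:   {1 - 1 / BACKENDS:.1%}")
```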

Blender 2.8 beta released with a revamped user interface and a high-end viewport, among other features

Natasha Mathur
26 Dec 2018
2 min read
The Blender team released the 2.8 beta of Blender, its free and open-source 3D creation software, earlier this week. Blender 2.8 beta comes with new features and updates such as EEVEE, a high-end viewport, Collections, Cycles improvements, and 2D animation, among others.

Blender is a 3D creation suite that covers the entirety of the 3D pipeline, including modeling, rigging, animation, simulation, rendering, compositing, and motion tracking. It supports video editing as well as game creation.

What's new in Blender 2.8 Beta?

EEVEE
Blender 2.8 beta comes with EEVEE, a new physically based real-time renderer. EEVEE works as a renderer for final frames and also as the engine driving Blender's real-time viewport. It offers advanced features such as volumetrics, screen-space reflections and refractions, subsurface scattering, soft and contact shadows, depth of field, camera motion blur, and bloom.

A new 3D viewport
The 3D viewport has been completely rewritten. The new, modern viewport is optimized for current graphics cards and adds powerful new features. It includes a workbench engine that helps visualize your scene in flexible ways, and EEVEE also powers the viewport to enable interactive modeling and painting with PBR materials.

2D animation
There are new and improved 2D drawing capabilities, including a new Grease Pencil. Grease Pencil is a powerful new 2D animation system with a native 2D grease pencil object type, modifiers, and shader effects. In a nutshell, it provides a user-friendly interface for the 2D artist.

Collections
Blender 2.8 beta introduces 'collections', a new concept that lets you organize your scene with the help of Collections and View Layers.

Cycles
Blender 2.8 beta updates the Cycles renderer with new principled volume and hair shaders, bevel and ambient occlusion shaders, along with many other improvements and optimizations.

Other features
Dependency graph: the core object evaluation and computation system has been rewritten, so Blender now performs better on modern many-core CPUs and is prepared for new features in future releases.
Multi-object editing: Blender 2.8 beta lets you enter edit modes for multiple objects together.

For more information, check out the official Blender 2.8 beta release notes.

Mozilla partners with Khronos Group to bring glTF format to Blender
Building VR objects in React V2 2.0: Getting started with polygons in Blender
Blender 2.5: Detailed Render of the Earth from Space

NVIDIA launches GeForce Now’s (GFN) 'recommended router' program to enhance the overall performance and experience of GFN

Natasha Mathur
24 Dec 2018
2 min read
NVIDIA launched a 'recommended router' program last week to improve the overall experience of its GeForce NOW (GFN) cloud gaming service for PC and Mac. The GeForce NOW game-streaming service has transformed the experience of playing high-performance games, and NVIDIA has now rolled out a few enhancements, in beta, to improve the quality of the service through its recommended router program.

The program comprises the latest generation of routers tuned for cloud gaming in the home, alongside video streaming and downloading. These routers let users configure settings so that GeForce NOW traffic is prioritized over all other data. Recommended routers are certified as "factory-enabled" with a GeForce NOW "quality of service (QoS) profile" that ensures cloud game play at its best quality, and the router settings are loaded automatically once GeForce NOW launches. Network latency, the biggest drawback of cloud gaming, stays low with these routers, and they also offer better streaming speeds for GeForce NOW.

"We're working closely with ASUS, D-LINK, Netgear, Razer, TP-Link, Ubiquiti Networks and other router manufacturers to build GeForce NOW recommended routers. They're committed to building best-in-class cloud gaming routers — just as we're committed to delivering best-in-class gaming experiences," says the NVIDIA team.

GFN recommended routers are now available in the U.S. and Canada, starting with the AmpliFi HD Gamer's Edition by Ubiquiti Networks. AmpliFi makes use of multiple self-configuring radios and advanced antenna technology to deliver powerful, whole-home Wi-Fi coverage.

For more information, read the official NVIDIA blog.

NVIDIA demos a style-based generative adversarial network that can generate extremely realistic images; has ML community enthralled
NVIDIA makes its new "brain for autonomous AI machines", Jetson AGX Xavier Module, available for purchase
NVIDIA open sources its game physics simulation engine, PhysX, and unveils PhysX SDK 4.0

Unity and Baidu collaborate on simulation for the development of autonomous vehicles

Amrata Joshi
21 Dec 2018
3 min read
This week, Unity Technologies, creator of the real-time 3D development platform, announced its collaboration with Baidu Inc., China's leading Internet giant, to develop a real-time simulation product that creates virtual environments and lets developers test autonomous vehicles in real-world situations.

This real-time simulation will be available to developers taking part in Baidu's Apollo platform, an open and reliable platform for the development, testing, and deployment of Level 3, 4, and 5 autonomous vehicles. Apollo covers the different areas of the self-driving technology spectrum, from perception and localization to 3D simulation and end-to-end training and testing of autonomous vehicles.

The collaboration between Baidu and Unity is expected to speed up the development of a simulation environment for testing autonomous driving software, and the simulation will enable developers to digitalize the entire development process. Using simulations and virtual environments in development and testing has clear advantages: risky or implausible scenarios that would be impossible or too dangerous to stage in the real world can be generated and tested safely in simulation.

"The ability to accurately conduct autonomous testing in a simulated environment allows for millions of simulations to simultaneously occur, providing Apollo partners with a competitive advantage while helping to keep their business costs down," said Tim McDonough, general manager of Industrial, Unity Technologies.

Unity's real-time 3D platform helps reduce errors and risks while increasing the efficiency and speed of testing by offering simulations that replicate real-world scenarios. Apart from Baidu, Unity also works with the largest OEMs in the world, improving the way they design, build, service, and sell automobiles; it counts experts from companies such as BMW, Toyota, General Motors, Volvo, and the Volkswagen Group. Unity also recently launched the SimViz Solution Template, a package that helps OEMs build simulation environments.

Jaewon Jung, Chief Architect of Baidu's Intelligent Driving Group, said, "By using a platform like Unity, our developers can focus on testing and research without worrying about non-functional environments or building something from scratch. The Unity-powered game engine simulation for Apollo has the ability to expedite autonomous vehicle validation and training with precise ground truth data in a more effective and safer way."

It will be interesting to see how this collaboration accelerates Baidu's development of autonomous driving software. To read more about this news, check out Business Wire's post.

Unity ML-Agents Toolkit v0.6 gets two updates: improved usability of Brains and workflow for Imitation Learning
Unity 2018.3 is here with improved Prefab workflows, Visual Effect graph and more
Unity introduces guiding Principles for ethical AI to promote responsible use of AI

Unity ML-Agents Toolkit v0.6 gets two updates: improved usability of Brains and workflow for Imitation Learning

Sugandha Lahoti
19 Dec 2018
2 min read
Unity ML-Agents Toolkit v0.6 brings two major enhancements, the Unity team announced in a blog post on Monday. The first update turns Brains from MonoBehaviours into ScriptableObjects, improving their usability. The second lets developers record expert demonstrations and use them for offline training, providing a better user workflow for Imitation Learning.

Brains are now ScriptableObjects
In previous versions of the ML-Agents Toolkit, Brains were GameObjects attached as children to the Academy GameObject, which made it difficult to re-use Brains across Unity scenes within the same project. In the v0.6 release, Brains are ScriptableObjects, making them manageable as standard Unity assets. This makes it easy to use them across scenes and to create Agent Prefabs with Brains pre-attached.

The Unity team has introduced the Learning Brain ScriptableObject, which replaces the previous Internal and External Brains, as well as Player and Heuristic Brain ScriptableObjects, which replace the Player and Heuristic Brain types, respectively. Developers can no longer change the type of Brain with the Brain Type dropdown; instead they create separate Brains for Player and Learning from the Assets menu. The BroadcastHub in the Academy component keeps track of which Brains are being trained.

Record expert demonstrations for offline training
The Demonstration Recorder allows users to record the actions and observations of an Agent while playing a game. These recordings can be used to train Agents at a later time via Imitation Learning, or to analyze the data. In essence, the Demonstration Recorder lets training data be reused across multiple training sessions instead of being captured anew every time. Users add the Demonstration Recorder component to their Agent, check Record, and give the demonstration a name. To train an Agent with the recording, users modify the hyperparameters in the training configuration.

Check out the documentation on GitHub for more information. Read more about the new enhancements on the Unity Blog.

Getting started with ML agents in Unity [Tutorial]
Unity releases ML-Agents toolkit v0.5 with Gym interface, a new suite of learning environments
Unite Berlin 2018 Keynote: Unity partners with Google, launches Ml-Agents ToolKit 0.4, Project MARS and more

Anthony Levandowski announces Pronto AI and makes a coast-to-coast self-driving trip

Sugandha Lahoti
19 Dec 2018
2 min read
Anthony Levandowski is back in the self-driving space with a new company, Pronto AI. This Tuesday, he announced in a blog post on Medium that he has completed a trip across the country in a self-driving car without any human intervention. He is also developing a $5,000 aftermarket driver assistance system for semi-trucks that will handle the steering, throttle, and brakes on the highway.

https://twitter.com/meharris/status/1075036576143466497

Previously, Levandowski was at the center of a controversy between Alphabet's self-driving car company Waymo and Uber. Levandowski had allegedly taken confidential documents with him, over which the companies got into a legal battle, and he was briefly barred from the autonomous driving industry during the trial. The companies settled the case early this year.

After laying low for a while, he is back with Pronto AI and its first ADAS (advanced driver assistance system). "I know what some of you might be thinking: 'He's back?'" Levandowski wrote in his Medium post announcing Pronto's launch. "Yes, I'm back."

Levandowski told the Guardian that he traveled in a self-driving vehicle from San Francisco to New York without human intervention. He didn't touch the steering wheel or pedals, except during periodic rest stops, for the full 3,099 miles. He posted a video that shows a portion of the drive, though it's hard to fact-check the full journey. The car was a modified Toyota Prius that used only video cameras, computers, and basic digital maps to make the cross-country trip.

In the Medium post, he also announced the development of a new camera-based ADAS. Named Copilot by Pronto, it delivers advanced features built specifically for Class 8 vehicles, with driver comfort and safety top of mind. It will offer lane keeping, cruise control, and collision avoidance for commercial semi-trucks, and will roll out in early 2019.

Alphabet's Waymo to launch the world's first commercial self-driving cars next month
Apex.AI announced Apex.OS and Apex.Autonomy for building failure-free autonomous vehicles
Uber manager warned the leadership team of the inadequacy of safety procedures in their prototype robo-taxis early March, reports The Information

Discord to adopt 90/10 revenue split for game developers starting from 2019

Bhagyashree R
17 Dec 2018
2 min read
Last week, Discord announced that developers will be allowed to self-publish games and keep 90% of the revenue once its game store opens to all creators in 2019.

https://twitter.com/discordapp/status/1073606080188637189

Major game distribution platforms generally take a 30 percent cut of revenue from any games sold through their online stores. But with increasing competition and more developers opting for self-publishing, this trend has started to change. Earlier this month, Epic launched its own store, which takes only a 12 percent share of total revenue. Valve also recently updated its Steam Distribution Agreement to include new revenue share tiers for games that hit certain revenue levels.

As per Discord's announcement, game distribution shouldn't cost 30 percent: "Turns out, it does not cost 30% to distribute games in 2018. After doing some research, we discovered that we can build amazing developer tools, run them, and give developers the majority of the revenue share." Discord further added that it will try to reduce its 10% share as well: "...and we'll explore lowering it by optimizing our tech and making things more efficient."

The beta version of the Discord game store was first launched in August and now includes up to 100 titles. The new self-serve publishing platform will give developers, irrespective of how big the game or the team is, access to the Discord game store and the new 90 percent revenue share. In addition to the 90/10 revenue split, Discord will also focus on empowering developers to communicate with their players by improving Verified Servers.

Read Discord's official announcement on Medium.

Unity 2018.3 is here with improved Prefab workflows, Visual Effect graph and more
Uses of Machine Learning in Gaming
Key Takeaways from the Unity Game Studio Report 2018
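For a sense of what the split announced above means in practice, here is a minimal sketch comparing a conventional 30% store cut with Discord's announced 10% cut. The $20 price is a made-up example, not a figure from the announcement.

```python
# Hypothetical comparison of store cuts on a $20 game (the price is an
# example for illustration, not a number from Discord's announcement).
PRICE = 20.00

for store, cut in [("typical 30% storefront", 0.30), ("Discord from 2019", 0.10)]:
    developer_share = PRICE * (1 - cut)  # developer keeps the remainder
    print(f"{store}: developer keeps ${developer_share:.2f} of ${PRICE:.2f}")
```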

Unity 2018.3 is here with improved Prefab workflows, Visual Effect graph and more

Sugandha Lahoti
14 Dec 2018
3 min read
Yesterday, the team at Unity released the next update of Unity for 2018. Unity 2018.3 ships with improved Prefab workflows, the Visual Effect Graph (in preview), and an updated Terrain system, along with more than 2,000 new features, fixes, and improvements.

Improved Prefab workflows
Prefab workflows have been improved in Unity 2018.3 with a focus on reusability, control, and safety. These updates add support for nesting and make working with Prefabs safer and more efficient for teams of all sizes. Nested Prefabs make it easier to:
- Split up Prefabs into multiple entities for greater efficiency
- Reuse any content, from small to large
- Work on different parts of content simultaneously

Visual Effect Graph (preview)
The Visual Effect Graph makes it easy for artists to create stand-out VFX for games and other real-time projects. Developers can create both simple and complex effects with this tool, and it includes an API for creating custom nodes to meet the needs of advanced creators.

Updated Terrain system
The updated terrain tools give developers better performance and improved usability. Operations have been shifted to the GPU, giving creators faster tools, larger brush sizes, improved previews, and the ability to paint Terrain tile borders with automatic seam-stitching. Improvements have also been made to support the High Definition Render Pipeline (HDRP) and the Lightweight Render Pipeline (LWRP).

FPS Sample project
Unity 2018.3 comes with the FPS Sample, which gives game developers source code access to a connected multiplayer FPS experience. They can download the sample and use it as a starting point to learn the latest technologies such as HDRP, or for their next connected game.

Unity 2018.3 also makes significant improvements to how Timeline Animation Tracks handle animations on the root transform of a hierarchy, including new Track Offset modes, "Adapts to scale", an improved editor preview, and the deprecation of Root Motion. Mobile improvements include Dynamic Resolution Scaling support for Vulkan and Metal, Android App Bundle generation support, and faster APK package build times on Android with APKzlib.

Plans for 2019
For 2019, the team is planning further announcements and innovations. These include a new MegaCity demo to showcase Unity's approach to Data-Oriented Design, and Cinecast (experimental), an AI cinematography system that enables the creation of movie-like cinematic sequences from gameplay in real time. Project MARS is an extension for Unity that helps developers build applications that intelligently interact with any real-world environment, with little to no custom coding. 2019 will also see the preview of Project Tiny, Unity's new, highly modular runtime and Editor mode for creating small, light, and fast instant games and experiences.

Find the full list of Unity 2018.3 features on the Unity blog.

Unity introduces guiding Principles for ethical AI to promote responsible use of AI
Unity has won the Technology and Engineering Emmy Award for excellence in engineering creativity
What you should know about Unity 2018 Interface

Minecraft Bedrock beta 1.9.0.3 is out with experimental scripting API!

Natasha Mathur
10 Dec 2018
2 min read
The Minecraft team released Minecraft Bedrock beta 1.9.0.3 last week. The latest release introduces new features such as an experimental scripting API, along with minor changes and fixes. Let's have a look at what's new in Minecraft Bedrock 1.9.0.3 (beta).

Experimental scripting API
Minecraft Bedrock beta 1.9.0.3 comes with a new scripting API that allows users to tweak the inner components of the game by writing commands. The Minecraft Script Engine uses the JavaScript language; scripts can be written and bundled with Behaviour Packs to invoke different actions. These actions include listening and responding to game events and retrieving and modifying data in the components that entities have, which can affect different parts of the game. This feature is currently only available on Windows 10, after enabling the "Use Experimental Gameplay" setting.

Changes and fixes
- A minor change has been made to the size of the crossbow, which now appears bigger in the hands of pillagers (Minecraft's hostile illager mobs with crossbows).
- A crash occurring during gameplay has been fixed.
- The issue of tamed llamas turning into bioluminescent creatures on opening an inventory has been resolved.
- Items in hand appearing completely white in colour have been fixed.
- Rare instances of players getting teleported into a boat while travelling near water have been fixed.
- The issue of the logo not being visible on the loading screen after suspending and resuming the game has been fixed.
- Players no longer have the option to respawn in a semi-dead state if they are killed while in a bed.
- The texture of Beacon beams has been improved.
- Inventory blocks once again follow the textures set in blocks.json.
- Optimizations have been made for proper synchronization between client and server.

Minecraft Bedrock beta 1.9.0.3 is available only on Xbox One, Windows 10, and Android (Google Play). For more information, check out the official release notes.

Minecraft Java team are open sourcing some of Minecraft's code as libraries
Minecraft is serious about global warming, adds a new (spigot) plugin to allow changes in climate mechanics
A Brief History of Minecraft Modding

Fortnite creator Epic Games launches Epic Games store where developers get 88% of revenue earned, challenging Valve's dominance

Sugandha Lahoti
05 Dec 2018
3 min read
The game studio that brought the phenomenal online video game Fortnite to life has launched the Epic Games store. In a blog post on the Unreal Engine website, Epic stated that the store will have "fair economics and a direct relationship with players". All players who buy a game will be subscribed to the developer's newsfeed, where they can be contacted with updates and news about upcoming releases. Developers can also control their game pages and connect with YouTube content creators, Twitch streamers, and bloggers through the recently launched Support-A-Creator program.

The Epic Games store will also follow an 88/12 revenue split. "Developers receive 88% of revenue," the company wrote. "There are no tiers or thresholds. Epic takes 12%. And if you're using Unreal Engine, Epic will cover the 5% engine royalty for sales on the Epic Games store, out of Epic's 12%."

Source: Unreal Engine

Epic's 88/12 split may well have been prompted by Valve's Steam store (a major competitor to Epic Games), which has tweaked its own revenue model. "Starting from October 1, 2018, when a game makes over $10 million on Steam, the revenue share for that application will adjust to 75 percent/25 percent on earnings beyond $10 million," Valve wrote in its official blog post. "At $50 million, the revenue share will adjust to 80 percent/20 percent on earnings beyond $50 million."

The Epic Games store will launch with a few selected games on PC and Mac, then open up to other games and to Android and other open platforms throughout 2019. With this move, Epic Games is looking to attract more gamers and developers to its platform, and a better revenue split will do most of that work for it. A developer-favouring split should also open up a market where the long-standing 30 percent store cut had left PC game distribution with little competition.

Twitter users were fairly happy with the announcement and agreed that it poses a threat to Valve.

https://twitter.com/Grummz/status/1069975572984385537
https://twitter.com/SpaceLyon/status/1069979966501208065
https://twitter.com/lucasmtny/status/1069970212424953857
https://twitter.com/nickchester/status/1069970684112265217

The Epic Games team will reveal more details on upcoming game releases at the Game Awards this Thursday. Read the blog post by Epic Games to know more.

Epic Games CEO calls Google "irresponsible" for disclosing the security flaw in Fortnite Android Installer before patch was ready
Google is missing out on $50 million because of Fortnite's decision to bypass Play Store
Implementing fuzzy logic to bring AI characters alive in Unity based 3D games
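The two revenue models quoted in the article above are easy to compare with a small calculation. The sketch below encodes Valve's tiered split exactly as described in its blog post (70% up to $10 million, 75% on earnings between $10 million and $50 million, 80% beyond $50 million) next to Epic's flat 88%. The sample gross figures are arbitrary examples, and the separate Unreal Engine royalty a developer might owe on other stores is ignored.

```python
def steam_developer_share(gross: float) -> float:
    """Developer's cut under Valve's tiered split quoted above:
    70% up to $10M, 75% on earnings between $10M and $50M, 80% beyond $50M."""
    tiers = [(10_000_000, 0.70), (50_000_000, 0.75), (float("inf"), 0.80)]
    share, lower = 0.0, 0.0
    for upper, rate in tiers:
        if gross > lower:
            share += (min(gross, upper) - lower) * rate
        lower = upper
    return share


def epic_developer_share(gross: float) -> float:
    """Developer's cut on the Epic Games store: a flat 88%. Per the post
    quoted above, the 5% Unreal royalty is covered out of Epic's 12%."""
    return gross * 0.88


# Arbitrary sample revenues in dollars (illustrative, not from the article).
for gross in (1_000_000, 20_000_000, 60_000_000):
    print(f"${gross / 1e6:>4.0f}M gross -> "
          f"Steam dev share ${steam_developer_share(gross) / 1e6:5.2f}M, "
          f"Epic dev share ${epic_developer_share(gross) / 1e6:5.2f}M")
```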

NVIDIA open sources its game physics simulation engine, PhysX, and unveils PhysX SDK 4.0

Natasha Mathur
04 Dec 2018
2 min read
The NVIDIA team unveiled PhysX SDK 4.0 yesterday and announced that it is making its popular real-time physics simulation engine, PhysX, available as open source under the simple BSD-3 license. "We're doing this because physics simulation — a long key to immersive games and entertainment — turns out to be more important than we ever thought. PhysX will now be the only free, open-source physics solution that takes advantage of GPU acceleration and can handle large virtual environments," says the NVIDIA team.

NVIDIA designed PhysX specifically for hardware acceleration on powerful processors with hundreds of processing cores. This design gives a dramatic boost in physics processing power, which in turn takes the gaming experience to a whole new level, offering richer and more immersive physical gaming environments.

The new PhysX SDK 4.0 is a scalable, open source, multi-platform game physics solution that supports a wide range of devices, from smartphones to high-end multicore CPUs and GPUs. It has been upgraded to offer industrial-grade simulation quality at game simulation performance levels.

PhysX 4.0 comes with a Temporal Gauss-Seidel Solver (TGS), which adjusts constraints with each iteration depending on the bodies' relative motion. Beyond that, overall stability has been improved, and new filtering rules for kinematics and statics are supported. Other major features of PhysX SDK 4.0 include more effective memory usage management, support for different measurement units and scales, multiple broad-phase algorithms, and convex-mesh, triangle-mesh, and primitive-shape collision detection.

PhysX SDK 4.0 will be made available on December 20, 2018.

Public reaction to the news has been largely positive: PhysX was previously free for commercial use, but now that it's open source, people can dig into the physics engine and modify it to their needs at no cost.

https://twitter.com/puradawid/status/1069614540671909888
https://twitter.com/tauke/status/1069603803463184384

For more information, check out the official NVIDIA blog post.

NVIDIA open sources its material definition language, MDL SDK
NVIDIA unveils a new Turing architecture: "The world's first ray tracing GPU"
BlazingDB announces BlazingSQL, a GPU SQL Engine for NVIDIA's open source RAPIDS
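TGS builds on the classic Gauss-Seidel family of iterative solvers, which sweep through the unknowns one at a time and immediately reuse each freshly updated value. The snippet below is a generic textbook Gauss-Seidel iteration for a small linear system, included only to illustrate that underlying idea; it is not NVIDIA's TGS implementation, and the example matrix is made up.

```python
import numpy as np

def gauss_seidel(A, b, iterations=25):
    """Plain Gauss-Seidel iteration for A x = b.

    Each sweep updates one unknown at a time and immediately reuses the
    newest values -- the iterative family that constraint solvers such as
    PhysX's TGS build on (this is a generic sketch, not NVIDIA's code).
    """
    x = np.zeros_like(b, dtype=float)
    n = len(b)
    for _ in range(iterations):
        for i in range(n):
            # Sum of the already-known contributions from the other unknowns.
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - sigma) / A[i, i]
    return x

# Made-up, diagonally dominant system (guarantees convergence).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([5.0, 5.0, 5.0])

print(gauss_seidel(A, b))       # iterative estimate
print(np.linalg.solve(A, b))    # exact solution for comparison
```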

Unity has won the Technology and Engineering Emmy Award for excellence in engineering creativity

Amrata Joshi
22 Nov 2018
2 min read
Yesterday, Unity Technologies won its first Technology and Engineering Emmy Award for excellence in engineering creativity. Unity won the award for its collaboration with Disney Television Animation on the broadcast-quality shorts Baymax Dreams. The National Academy of Television Arts and Sciences (NATAS), a service organization for the advancement of the arts and sciences of television, has acknowledged the work Unity put into Baymax Dreams. Earlier this month, Unity also won an Emmy for 3D Engine Software for the Production of Animation.

The Baymax Dreams series is based on the story of Hiro, a 14-year-old tech genius, and his robot, Baymax. The characterization and the visual effects are mesmerizing.

https://www.youtube.com/watch?v=DpuUnNLZf5k

Different Unity teams from animation, film, virtual reality (VR), and gaming united for this creative project. They designed the entire workflow to match the story and the vision of the director. Graphics engineer John Parsaie used Unity's High Definition Render Pipeline (HDRP) to create materials such as Baymax's emissive 'night light'. He also worked with Unity artist Keijiro Takahashi on the voxelization effect, which can be seen when Baymax first enters his dream state.

The characters and artwork were built and reviewed in VR, which helped increase clarity, made the visuals easier to imagine, and enabled experimenting with different styles and formats. The team at Unity made use of multiple tools, including Unity's multi-track sequencer Timeline, Cinemachine's suite of smart cameras, and Post-Processing Stack v2, for layout, lighting, and compositing.

To read more about this news, check out the official blog post by Unity. The Technology and Engineering Emmy Awards, in partnership with the NAB Show (National Association of Broadcasters), will be held in Las Vegas on Sunday, April 7, 2019.

Exploring shaders and materials in Unity 2018.x to develop scalable mobile games
Building your own Basic Behavior tree in Unity [Tutorial]
Getting started with ML agents in Unity [Tutorial]

Introducing Zink: An OpenGL implementation on top of Vulkan

Amrata Joshi
02 Nov 2018
3 min read
Erik Kusma Faye-Lund, a graphics programmer, introduced Zink on Wednesday. Zink is an OpenGL implementation on top of Vulkan: a Mesa Gallium driver that uses the OpenGL implementation in Mesa to provide hardware-accelerated OpenGL when only a Vulkan driver is available.

Currently, Zink is only available as source code; distro packages aren't available yet, and it has only been tested on Linux. To build Zink, one needs Git, Vulkan headers and libraries, Meson, and Ninja, as well as the build dependencies needed to compile Mesa. Erik says, "And most importantly, we are not a conformant OpenGL implementation. I'm not saying we will never be, but as it currently stands, we do not do conformance testing, and as such we neither submit conformance results to Khronos."

What Zink may offer

1. Just one API
OpenGL is a big API and is well established as a requirement for applications and desktop compositors. Since the release of Vulkan, however, there are two APIs for essentially the same hardware functionality, and both are important. As the software world works hard to implement Vulkan support everywhere, this leads to complexity. In the future, things like desktop compositors would only need to support one API, and OpenGL's role could become purely one of legacy application compatibility. Maybe Zink can help in making that future better.

2. Lessen the workload of GPU drivers
Everyone wants less code to maintain for legacy hardware, but the set of drivers to maintain keeps growing, and new drivers are still being written for old hardware. If the hardware is capable of supporting Vulkan, it could be easier to only support Vulkan "natively" and do OpenGL through Zink. There aren't infinite programmers who can maintain every GPU driver forever, but with Zink, driver support might get better and easier.

3. Zink comes with benefits
Since Zink is implemented as a Gallium driver in Mesa, there are some side benefits that come "for free". For instance, projects like Gallium Nine or Clover could, in theory, work on top of the i965 Vulkan driver through Zink in the future. In the coming years, Zink might also act as a cooperation layer between OpenGL and Vulkan code in the same application.

4. Zink could run on top of a closed-source Vulkan driver
Zink might also run smoothly on top of a closed-source Vulkan driver and still get proper window system integration.

What does Zink require?
Currently, Zink requires a Vulkan 1.0 implementation and the following extensions:
- VK_KHR_maintenance1: required for viewport flipping.
- VK_KHR_external_memory_fd: required for getting the rendered result on screen.

Erik has also shared a list of features that Zink doesn't support yet:
- glPointSize() is not supported, though writing to gl_PointSize from the vertex shader does work.
- Texture borders are currently black, due to Vulkan's lack of arbitrary border-color support.
- No control flow is supported in shaders.
- There is no GL_ALPHA_TEST or glShadeModel(GL_FLAT) support yet.

It will be interesting to see how Zink turns out once these features land. Read more about this news on Kusma's official website.

Valve's Steam Play Beta uses Proton, a modified WINE, allowing Linux gamers to play Windows games
UI elements and their implementation
Game Engine Wars: Unity vs Unreal Engine

Electronic Arts (EA) announces Project Atlas, a futuristic cloud-based, AI-powered game development platform

Natasha Mathur
02 Nov 2018
4 min read
Electronic Arts (EA) announced Project Atlas, a new AI-powered, cloud-computing-based game development platform, earlier this week. Project Atlas comes with high-quality LIDAR data, improved scalability, a cloud-based engine, and enhanced security, among other things. When Project Atlas will be generally available hasn't been disclosed yet.

"We're calling this Project Atlas and we believe in it so much that we have over 1,000 EA employees working on building it every day, and dozens of studios around the world contributing their innovations, driving priorities, and already using many of the components," said Ken Moss, Chief Technology Officer at Electronic Arts.

Let's discuss the features of Project Atlas.

High-quality LIDAR data
Project Atlas will use high-quality LIDAR data about real mountain ranges. This data is then passed through a deep neural network trained to create terrain-building algorithms. With the help of this AI-assisted terrain generation, designers will be able to generate not just a single mountain but a series of mountains, along with the surrounding environment, to bring in the realism of the real world. "This is just one example of dozens or even hundreds where we can apply advanced technology to help game teams of all sizes scale to build bigger and more fun games," says Moss.

Improved scalability
Earlier, all simulation or rendering of in-game actions was limited either to the processing performance of the player's console or to a single server interacting with your system. Now, with the help of the cloud, players will be able to tap into a network of many servers dedicated to computing complex tasks. This will deliver hyper-realistic destruction in new HD games that is nearly indistinguishable from real life. "We're working to deploy that level of gaming immersion on every device," says Moss. Moreover, integrating distributed networks at the rendering level means infinite scalability from the cloud, so whether you're on a team of 500 or just 5, you'll be able to scale games and create immersive experiences in unprecedented ways.

Cloud-based engine and moddable asset database
With Project Atlas, you can turn your own vision into reality, share the creation with your friends and the whole world, and market your ideas to the community. With this in mind, the Project Atlas team is planning a cloud-enabled engine that can seamlessly integrate different services. Along with a moddable asset database, there will also be a common marketplace where users can share and rate other players' creations. "Players and developers want to create. We want to help them. By blurring the line between content producers and players, this will truly democratize the game experience," adds Moss.

Enhanced security
Project Atlas comes with a unified platform where game makers can seamlessly deploy security measures such as SSL certificates, configuration, appropriate encryption of data, and zero-downtime patches for every feature from a single secure source. This will allow them to focus more on creating games and less on taking the required security measures. "We're solving for some of the manually intensive demands by bringing together AI capabilities in an engine and cloud-enabled services at scale. With an integrated platform that delivers consistency and seamless delivery from the game, game makers will free up time, brainspace, and energy for the creative pursuit," says Moss.

For more information, check out the official Project Atlas blog.

Xenko 3.0 game engine is here, now free and open-source
Meet yuzu – an experimental emulator for the Nintendo Switch
AI for Unity game developers: How to emulate real-world senses in your NPC agent behavior