
Tech News - 3D Game Development

56 Articles

Japanese Anime studio Khara is switching its primary 3D CG tools to Blender

Sugandha Lahoti
19 Aug 2019
4 min read
Popular Japanese animation studio Khara announced on Friday that it will be moving to the open source 3D software Blender as its primary 3D CG tool. Khara is a motion picture planning and production company and is currently working on "EVANGELION:3.0+1.0", a film to be released in June 2020. Initially, the studio will use Blender only partially for 'EVANGELION:3.0+1.0', but it will make the full switch once that project is finished. Khara is also helping the Blender Foundation by joining the Development Fund as a corporate member.

Last month, Epic Games granted Blender $1.2 million in cash. Following Epic Games, Ubisoft also joined the Blender Development Fund and adopted Blender as its main DCC tool.

Why did Khara opt for Blender?

Khara had been using Autodesk's 3ds Max as its primary 3D CG tool so far. However, the scale of its projects grew beyond what was practical with 3ds Max. 3ds Max is also quite expensive; according to Autodesk's website, the annual fee for a single user is $2,396. Khara also had to reach out to small and medium-sized businesses for its projects. Another complaint was that Autodesk took time to release improvements to its proprietary software, something that happens at a much faster rate in an open source environment.

They had also considered Maya as an alternative, but dropped the idea as it would have duplicated work and resources. Finally, they switched to Blender, as it is open source and free. They were also intrigued by the new Blender 2.8 release, which provided them with a 3D creation tool that works like "paper and pencil". Blender's Grease Pencil feature enables you to combine the 2D and 3D worlds right in the viewport. It comes with a new multi-frame editing mode with which you can change and edit several frames at the same time, and a Build modifier to animate drawings similar to the Build modifier for 3D objects.

"I feel the latest Blender 2.8 is intentionally 'filling the gap' with 3ds Max to make those users feel at home when coming to Blender. I think the learning curve should be no problem," said Takumi Shigyo of the Project Studio Q Production Department. Khara founded "Project Studio Q, Inc." in 2017, a company focusing mainly on movie production and the training of anime artists.

Providing more information on their use of Blender, Hiroyasu Kobayashi, General Manager of the Digital Department and Director of the Board at Khara, said in the announcement, "Preliminary testing has been done already. We are now at the stage to create some cuts actually with Blender as 'on live testing'. However, not all the cuts can be done by Blender yet. But we think we can move out from our current stressful situation if we place Blender into our work flows. It has enough potential 'to replace existing cuts'."

While Blender will be used for the bulk of the work, Khara does have a backup plan for anything Blender struggles with. Kobayashi added, "There are currently some areas where Blender cannot take care of our needs, but we can solve it with the combination with Unity. Unity is usually enough to cover 3ds Max and Maya as well. Unity can be a bridge among environments." Khara is also speaking with its partner companies about using Blender together.
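To make the "paper and pencil" Grease Pencil workflow described above a little more concrete, here is a minimal Python sketch that creates a Grease Pencil object and draws a single stroke through Blender's scripting API. It is only an illustration, assuming the bpy data API as it appears around Blender 2.80 (names such as grease_pencils, layers.new, and frames.new); it is not part of Khara's pipeline and omits materials and animation.

```python
# Minimal sketch: create a Grease Pencil object and draw one 3D stroke.
# Assumes Blender ~2.80's bpy API; run from Blender's scripting workspace.
import bpy

gp_data = bpy.data.grease_pencils.new("sketch")       # grease pencil datablock
gp_obj = bpy.data.objects.new("sketch", gp_data)      # object wrapping it
bpy.context.collection.objects.link(gp_obj)           # make it visible in the scene

layer = gp_data.layers.new("lines", set_active=True)  # drawing layer
frame = layer.frames.new(1)                           # keyframe at frame 1

stroke = frame.strokes.new()                          # one stroke...
stroke.display_mode = '3DSPACE'                       # ...drawn in 3D space
stroke.points.add(count=3)                            # ...with three points
for point, co in zip(stroke.points, [(0, 0, 0), (1, 0, 0.5), (2, 0, 0)]):
    point.co = co
```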
Khara's transition was well received by the community.
https://twitter.com/docky/status/1162279830785646593
https://twitter.com/eoinoneillPDX/status/1154161101895950337
https://twitter.com/BesuBaru/status/1154015669110710273

Blender 2.80 released with a new UI interface, Eevee real-time renderer, grease pencil, and more
Following Epic Games, Ubisoft joins Blender Development fund; adopts Blender as its main DCC tool
Epic Games grants Blender $1.2 million in cash to improve the quality of their software development projects


Unity 2019.2 releases with updated ProBuilder, Shader Graph, 2D Animation, Burst Compiler and more

Fatema Patrawala
31 Jul 2019
3 min read
Yesterday, the Unity team announced the release of Unity 2019.2. This release adds more than 170 new features and enhancements for artists, designers, and programmers, with updates to ProBuilder, Shader Graph, 2D Animation, Burst Compiler, UI Elements, and more.

Major highlights of Unity 2019.2

ProBuilder 4.0 ships as verified with 2019.2. It is a unique hybrid of 3D modeling and level design tools, optimized for building simple geometry but capable of detailed editing and UV unwrapping as needed.

Polybrush is now available via the Package Manager as a Preview package. This versatile tool lets you sculpt complex shapes from any 3D model, position detail meshes, paint in custom lighting or coloring, and blend textures across meshes directly in the Editor.

DSPGraph, the new audio rendering/mixing system built on top of Unity's C# Job System, is now available as a Preview package.

UI Elements, Unity's new UI framework, has been improved; it renders the UI for graph-based tools such as Shader Graph, Visual Effect Graph, and Visual Scripting.

To help you better organize complex graphs, Unity has added subgraphs to Visual Effect Graph. You can share, combine, and reuse subgraphs for blocks and operators, and also embed complete VFX within VFX. The integration between Visual Effect Graph and the High-Definition Render Pipeline (HDRP) has also improved: HDRP now pulls VFX Graph in by default, providing additional rendering features.

With Shader Graph you can now use Color Modes to highlight nodes on your graph based on various features, or select your own colors to improve readability. This is especially useful in large graphs.

The team has added swappable Sprites functionality to the 2D Animation tool. With this new feature, you can change a GameObject's rendered Sprites while reusing the same skeleton rig and animation clips. This lets you quickly create multiple characters using different Sprite Libraries or customize parts of them with Sprite Resolvers.

Burst Compiler 1.1 ships with this release and includes several improvements to JIT compilation time as well as some C# improvements. Additionally, the Visual Studio Code and JetBrains Rider integrations are available as packages.

Mobile developers will benefit from improved OpenGL support: the team has added OpenGL multithreading support on iOS to improve performance on low-end iOS devices that don't support Metal.

As with all releases, 2019.2 includes a large number of improvements and bug fixes. You can find the full list of features, improvements, and fixes in the Unity 2019.2 Release Notes.

How to use arrays, lists, and dictionaries in Unity for 3D game development
OpenWrt 18.06.4 released with updated Linux kernel, security fixes, Curl and the Linux kernel and much more!
How to manage complex applications using Kubernetes-based Helm tool [Tutorial]


Blender 2.80 released with a new UI interface, Eevee real-time renderer, grease pencil, and more

Bhagyashree R
31 Jul 2019
3 min read
After about three long years of development, the much-awaited Blender 2.80 finally shipped yesterday. This release comes with a redesigned user interface, workspaces, templates, the Eevee real-time renderer, Grease Pencil, and much more.

The user interface is revamped with a focus on usability and accessibility

Blender's user interface has been revamped with a better focus on usability and accessibility. It has a fresh look and feel with a dark theme and a modern icon set. The icons change color based on the theme you select so that they remain readable against bright or dark backgrounds. Users can access the most-used features via the default shortcut keys or map their own. You can fully use Blender with a one-button trackpad or pen input, as it now uses the left mouse button for selection by default. It provides a new right-click context menu for quick access to important commands in the given context, and a Quick Favorites popup menu where you can add your favorite commands.

Get started with templates and workspaces

You can now choose from multiple application templates when starting a new file. These include templates for 3D modeling, shading, animation, rendering, Grease Pencil-based 2D drawing and animation, sculpting, VFX, video editing, and more. Workspaces give you a screen layout for specific tasks like modeling, sculpting, animating, or editing. Each template you choose provides a default set of workspaces that can be customized. You can create new workspaces or copy them from the templates as well.

Completely rewritten 3D viewport

Blender 2.8's completely rewritten 3D viewport is optimized for modern graphics and offers several new features. The new Workbench render engine helps you get work done in the viewport for tasks like scene layout, modeling, and sculpting. Viewport overlays let you decide which utilities are visible on top of the render. The new LookDev shading mode allows you to test multiple lighting conditions (HDRIs) without affecting the scene settings. The smoke and fire simulations have been overhauled to make them look as realistic as possible.

Eevee real-time renderer

Blender 2.80 has a new physically based real-time renderer called Eevee. It performs two roles: a renderer for final frames, and the engine driving Blender's real-time viewport for creating assets. Among other features, it supports volumetrics, screen-space reflections and refractions, depth of field, camera motion blur, and bloom. You can create Eevee materials using the same shader nodes as Cycles, which makes it easier to render existing scenes.

2D animation with Grease Pencil

Grease Pencil enables you to combine the 2D and 3D worlds right in the viewport. With this release, it has become a "full 2D drawing and animation system." It comes with a new multi-frame editing mode with which you can change and edit several frames at the same time, and a Build modifier to animate drawings similar to the Build modifier for 3D objects. Many other features have been added to Grease Pencil. Watch this video to get a glimpse of what you can create with it: https://www.youtube.com/watch?v=JF3KM-Ye5_A

Check out the official website for more Blender 2.80 features.

Blender celebrates its 25th birthday!
Following Epic Games, Ubisoft joins Blender Development fund; adopts Blender as its main DCC tool
Epic Games grants Blender $1.2 million in cash to improve the quality of their software development projects
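If you want to poke at Eevee from Blender's scripting side, the short sketch below switches a scene over to the new renderer and toggles a few of the effects mentioned above. It is a minimal illustration that assumes property names from Blender 2.80's Python API (scene.eevee.use_ssr, use_bloom, and so on), which may differ in later releases.

```python
# Minimal sketch: enable Eevee and a few of its effects via Blender 2.80's
# Python API (run inside Blender's scripting workspace, not standalone).
import bpy

scene = bpy.context.scene

# Switch the scene renderer to Eevee.
scene.render.engine = 'BLENDER_EEVEE'

# Toggle some of the effects discussed above (property names assumed from 2.80).
scene.eevee.use_ssr = True              # screen-space reflections
scene.eevee.use_ssr_refraction = True   # screen-space refractions
scene.eevee.use_bloom = True            # bloom
scene.eevee.use_motion_blur = True      # camera motion blur
```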


Following Epic Games, Ubisoft joins Blender Development fund; adopts Blender as its main DCC tool

Vincy Davis
23 Jul 2019
5 min read
Yesterday, Ubisoft Animation Studio (UAS) announced that it will fund the development of Blender as a corporate Gold member through the Blender Foundation's Development Fund. Ubisoft will also be adopting the open source animation software Blender as its main digital content creation (DCC) tool. The exact funding amount has not been disclosed.

Gold corporate members of the Blender Development Fund get their logo displayed prominently on the blender.org dev fund page and are credited as Corporate Gold Members on blender.org and in official Blender Foundation communication. Gold corporate members also have a strong voice in approving projects for Blender, and they donate a minimum of EUR 30,000 per year for as long as they remain members.

Pierrot Jacquet, Head of Production at UAS, said in the press release, "Blender was, for us, an obvious choice considering our big move: it is supported by a strong and engaged community, and is paired up with the vision carried by the Blender Foundation, making it one of the most rapidly evolving DCCs on the market." He also believes that since Blender is an open source project, it will allow Ubisoft to share some of its own developed tools with the community. "We love the idea that this mutual exchange between the foundation, the community, and our studio will benefit everyone in the end," he adds.

As part of its new workflow, Ubisoft is creating a development environment supported by open source and inner source solutions. Blender will replace Ubisoft's in-house digital content creation tool and will be used to produce short content with the incubator. Later, Blender will also be used in Ubisoft's upcoming shows in 2020.

Per Jacquet, Blender 2.8 will be a "game-changer for the CGI industry". The Blender 2.8 beta is already out, and its stable version is expected to be released in the coming days. Ubisoft was impressed with the growth of the internal Blender community as well as with the innovations expected in Blender 2.8, which will bring a revamped UX, Grease Pencil, Eevee real-time rendering, and new 3D viewport and UV editor tools to enhance the user experience. Ubisoft was thus convinced that this is the "right time to bring support to our artists and productions that would like to add Blender to their toolkit."

This news comes a week after Epic Games announced that it is awarding the Blender Foundation $1.2 million in cash over three years to accelerate the quality of its software development projects. With two big companies funding Blender, the future does look bright for the project. The Blender 2.8 preview features likely prompted both companies to step forward and support Blender, as both Epic and Ubisoft announced their funding just days before the stable release of Blender 2.8. In addition to Epic and Ubisoft, corporate members include animation studio Tangent, Valve, Intel, Google, and Canonical's Ubuntu Linux distribution.

Ton Roosendaal, founder and chairman of the Blender Foundation, is surely a happy man. "Good news keeps coming," he said. "It's such a miracle to witness the industry jumping on board with us! I've always admired Ubisoft, as one of the leading games and media producers in the world. I look forward to working with them and help them find their ways as a contributor to our open source projects on blender.org."
https://twitter.com/tonroosendaal/status/1153376866604113920

Users are very happy and feel that this is a big step forward for Blender.
https://twitter.com/nazzagnl/status/1153339812105064449
https://twitter.com/Nahuel_Belich/status/1153302101142978560
https://twitter.com/DJ_Link/status/1153300555986550785
https://twitter.com/cgmastersnet/status/1153438318547406849

Many also see this move as the industry's way of sidelining Autodesk, the company popularly known for its DCC tools.
https://twitter.com/flarb/status/1153393732261072897

A Hacker News user comments, "Kudos to blender's marketing team. They get a bit of free money from this. But the true motive for Epic and Unisoft is likely an attempt to strong-arm Autodesk into providing better support and maintenance. Dissatisfaction with Autodesk, lack of care for their DCC tools has been growing for a very long time now, but studios also have a huge investment into these tools as part of their proprietary pipelines. Expect Autodesk to kowtow soon and make sure that none of these companies will make the switch. If it means that Autodesk actually delivers bug fixes for the version the customer has instead of one or two releases down the road, it is a good outcome for the studios."

Visit the Ubisoft website for more details.

CraftAssist: An open-source framework to enable interactive bots in Minecraft by Facebook researchers
What to expect in Unreal Engine 4.23?
Pluribus, an AI bot built by Facebook and CMU researchers, has beaten professionals at six-player no-limit Texas Hold 'Em Poker


CraftAssist: An open-source framework to enable interactive bots in Minecraft by Facebook researchers

Vincy Davis
19 Jul 2019
5 min read
Two days ago, researchers from Facebook AI Research published a paper titled "CraftAssist: A Framework for Dialogue-enabled Interactive Agents". The authors are Facebook AI research engineers Jonathan Gray and Kavya Srinet, Facebook AI research scientists C. Lawrence Zitnick and Arthur Szlam, and Yacine Jernite, Haonan Yu, Zhuoyuan Chen, Demi Guo, and Siddharth Goyal.

The paper describes the implementation of an assistant bot called CraftAssist, which appears and interacts like another player in the open sandbox game Minecraft. The framework enables players to interact with the bot via in-game chat through various implemented tools and platforms, and to record these interactions. The main aim of the bot is to be a useful and entertaining assistant for the tasks listed and evaluated by the human players.

Image source: CraftAssist paper

To motivate the wider AI research community to use the CraftAssist platform in their own experiments, the Facebook researchers have open-sourced the framework, the baseline assistant, the data, and the models. The released data includes the functions which were used to build the 2,586 houses in Minecraft, the labeling data for the walls, roofs, etc. of the houses, human rephrasings of fixed commands, and the conversion of natural language commands into bot-interpretable logical forms. The technology that allows recording human and bot interaction on a Minecraft server has also been released so that researchers can independently collect data.

Why is the Minecraft protocol used?

Minecraft is a popular multiplayer volumetric pixel (voxel) 3D game based on building and crafting, which allows multiplayer servers and players to collaborate and build, survive, or compete with each other. It operates through a client and server architecture. The CraftAssist bot acts as a client and communicates with the Minecraft server using the Minecraft network protocol. The Minecraft protocol allows the bot to connect to any Minecraft server without the need to install server-side mods. This lets the bot easily join a multiplayer server along with human players or other bots, and also lets it join an alternative server that implements the server-side component of the Minecraft network protocol. The CraftAssist bot uses the third-party open source Cuberite server, a fast and extensible game server for Minecraft.

Read More: Introducing Minecraft Earth, Minecraft's AR-based game for Android and iOS users

How does CraftAssist function?

The block diagram below demonstrates how the bot handles incoming in-game chats and reaches the desired target.

Image source: CraftAssist paper

First, the incoming text is transformed into a logical form called the action dictionary. The action dictionary is then translated by a dialogue object, which interacts with the memory module of the bot and produces an action or a chat response to the user. The bot's memory uses a relational database which is structured to recognize the relationships between stored items of information. The major advantage of this type of memory is that the semantic parser's output can easily be converted into fully specified tasks.

The bot responds to higher-level actions, called Tasks. Tasks are interruptible processes that follow a clear objective through step-by-step actions. They can adjust to long pauses between steps and can also push other Tasks onto a stack, the way functions can call other functions in a standard programming language. Move, Build, and Destroy are a few of the many basic Tasks assigned to the bot.

The Dialogue Manager checks for illegal or profane words, then queries the semantic parser. The semantic parser takes the chat as input and produces an action dictionary. The action dictionary indicates that the text is a command given by a human and then specifies the high-level action to be performed by the bot. Once the task is created and pushed onto the Task stack, it is the responsibility of the command task 'Move' to compare the bot's current location to the target location. This makes the bot undertake a sequence of low-level step movements to reach the target.

The core of the bot's natural language understanding depends on a neural semantic parser called the Text-to-Action-Dictionary (TTAD) model. This model receives the incoming command/chat and classifies it into an action dictionary, which is then interpreted by the Dialogue Object.
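To make the pipeline above more tangible, the sketch below shows, in plain Python, how a chat command might be parsed into an action dictionary and turned into a Move Task on a stack. The field names and the rule-based parser are illustrative stand-ins only; the paper's actual schema and its neural TTAD parser differ.

```python
# Illustrative sketch (not the paper's exact schema) of a chat flowing through
# a semantic parser into an action dictionary and then a Task on a stack.

def parse_chat(chat: str) -> dict:
    """Stand-in for the TTAD semantic parser: map a command to an action
    dictionary. The real parser is a trained neural model, not rules."""
    if chat.startswith("move to"):
        x, y, z = (int(v) for v in chat.split()[2:5])
        return {"dialogue_type": "HUMAN_GIVE_COMMAND",
                "action": {"action_type": "MOVE",
                           "location": {"coordinates": (x, y, z)}}}
    return {"dialogue_type": "NOOP"}

class MoveTask:
    """Interruptible Task: steps the bot toward a target, one block per axis at a time."""
    def __init__(self, bot, target):
        self.bot, self.target = bot, target

    def step(self):
        # Compare the current location to the target and take one low-level move.
        pos = self.bot["pos"]
        self.bot["pos"] = tuple(p + (1 if t > p else -1 if t < p else 0)
                                for p, t in zip(pos, self.target))
        return self.bot["pos"] == self.target   # True when the Task is finished

# Usage: the dialogue manager parses a chat, builds a Task, and pushes it on a stack.
bot = {"pos": (0, 0, 0)}
action = parse_chat("move to 2 0 3")
task_stack = []
if action["dialogue_type"] == "HUMAN_GIVE_COMMAND":
    task_stack.append(MoveTask(bot, action["action"]["location"]["coordinates"]))
while task_stack:
    if task_stack[-1].step():
        task_stack.pop()
print(bot["pos"])   # (2, 0, 3)
```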
The CraftAssist framework thus enables bots in Minecraft to interact and play with players by understanding human interactions, using the implemented tools. The researchers hope that since the CraftAssist dataset is now open-sourced, more developers will be empowered to contribute to this framework by assisting or training the bots, which might lead to the bots learning from human dialogue interactions in the future.

Developers have found the CraftAssist framework interesting.
https://twitter.com/zehavoc/status/1151944917859688448

A user on Hacker News comments, "Wow, this is some amazing stuff! Congratulations!"

Check out the paper CraftAssist: A Framework for Dialogue-enabled Interactive Agents for more details.

Epic Games grants Blender $1.2 million in cash to improve the quality of their software development projects
What to expect in Unreal Engine 4.23?
A study confirms that pre-bunk game reduces susceptibility to disinformation and increases resistance to fake news


Epic Games grants Blender $1.2 million in cash to improve the quality of their software development projects

Vincy Davis
16 Jul 2019
4 min read
Yesterday, Epic Games announced that it is awarding the Blender Foundation $1.2 million in cash over three years to accelerate the quality of its software development projects. Blender is a free and open-source 3D creation suite which offers a full range of tools to empower artists to create 3D graphics, animation, special effects, or games.

Ton Roosendaal, founder and chairman of the Blender Foundation, thanked Epic Games in a statement: "Thanks to the grant we will make a significant investment in our project organization to improve on-boarding, coordination and best practices for code quality. As a result, we expect more contributors from the industry to join our projects."
https://twitter.com/tonroosendaal/status/1150793424536313862

The $1.2 million grant from Epic is part of its $100 million MegaGrants program, announced this March. Tim Sweeney, CEO of Epic Games, had announced that Epic would offer $100 million in grants to game developers to boost the growth of the gaming industry by supporting enterprise professionals, media and entertainment creators, students, educators, and tool developers doing excellent work with Unreal Engine or enhancing open-source capabilities for the 3D graphics community. Sweeney believes that open tools, libraries, and platforms are critical to the future of the digital content ecosystem. "Blender is an enduring resource within the artistic community, and we aim to ensure its advancement to the benefit of all creators," he adds.

This is the biggest award announced by Epic so far. Blender has no obligation to use or promote Epic Games' storefront or engine; the grant is a purely generous offer from Epic Games with "no strings attached". In April, Magic Leap revealed that it will provide 500 Magic Leap One Creator Edition spatial computing devices as giveaways under the Epic MegaGrants program.

Blender users are appreciative of the support and generosity of Epic Games.
https://twitter.com/JeannotLandry/status/1150812155412963328
https://twitter.com/DomAnt2/status/1150798726379839488

A Redditor comments, "There's a reason Epic as a company has an extremely positive reputation with people in the industry. They've been doing this kind of thing for years, and a huge amount of money they're making from Fortnite is planned to be turned into grants as well. Say what you want about them, they are without question the top company in gaming when it comes to actually using their profits to immediately reinvest/donate to the gaming industry itself. It doesn't hurt that every company who works with them consistently says that they're possibly the very best company in gaming to work with."

A comment on Hacker News reads, "Epic are doing a great job improving fairness in the gaming industry, and the economic conditions for developers. I'm looking forward to their Epic Store opening up to more (high quality) Indie games."

In 2015, Epic launched Unreal Dev Grants, offering a pool of $5 million to independent developers with interesting Unreal Engine 4 projects to fund their development. In December 2018, Epic also launched the Epic Games store, where developers keep 88% of the revenue earned.

Epic's large donation holds even more value considering that the highly anticipated release of Blender 2.8 is around the corner. Though its release candidate is already out, users are quite excited for the stable release. Blender 2.8 will have new 3D viewport and UV editor tools to enhance the user experience. With Blender aiming to increase the quality of its projects, such grants from major game publishers will only help it get bigger.
https://twitter.com/ddiakopoulos/status/1150826388229726209

A user on Hacker News comments, "Awesome. Blender is on the cusp of releasing a major UI overhaul (2.8) that will make it more accessible to newcomers (left-click is now the default!). I'm excited to see it getting some major support from the gaming industry as well as the film industry."

What to expect in Unreal Engine 4.23?
Epic releases Unreal Engine 4.22, focuses on adding "photorealism in real-time environments"
Blender celebrates its 25th birthday!

What to expect in Unreal Engine 4.23?

Vincy Davis
12 Jul 2019
3 min read
A few days ago, Epic released the first preview of Unreal Engine 4.23 for the developer community to check out its features and report back in case of any issues before the final release. This version adds Skin Weight Profiles, VR Scouting tools, and new Pro Video Codecs, along with many updates to features like XR, animation, core, virtual production, gameplay and scripting, and audio. The previous version, Unreal Engine 4.22, focused on adding photorealism in real-time environments.

Some updates in Unreal Engine 4.23

XR
HoloLens 2 Native Support.
Stereo Panoramic Capture Tool Improvements: These will make it much easier to capture high-quality stereoscopic stills and videos of the virtual world in industry-standard formats, and to view those captures in an Oculus or Gear VR headset.

Animation
Skin Weight Profiles: The new Skin Weight Profile system will enable users to override the original Skin Weights that are stored with a Skeletal Mesh.
Animation Streaming: This is aimed at improving memory management for animation data.
Sub Animation Graphs: New Sub Anim Graphs will allow dynamic switching of sub-sections of an Animation Graph, enabling multi-user collaboration and memory savings for vaulted or unavailable items.

Core
Unreal Insights Tool: This will help developers collect and analyze data about the Engine's behavior in a uniform fashion. The system has three components: the Trace System API gathers information from runtime systems in a consistent format and captures it for later processing, with multiple live sessions able to contribute data at the same time; the Analysis API processes data from the Trace System API and converts it into a form that the Unreal Insights tool can use; and the Unreal Insights tool provides an interactive visualization of data processed through the Analysis API, giving developers a unified interface for stats, logs, and metrics from their application.

Virtual production
Remote Control over HTTP, an extended LiveLink Plugin, new VR Scouting tools, new Pro Video Codecs, nDisplay warp and blend for curved surfaces, and Virtual Camera improvements.

Gameplay & Scripting
UMG Widget Diffing: Expanded and improved Blueprint Diffing will now support Widget Blueprints as well as Actor and Animation Blueprints.

Audio
Open Sound Control: A native implementation of the Open Sound Control (OSC) standard in an Unreal Engine plugin.
Wave Table Synthesis: The new monophonic wavetable synthesizer leverages UE4's built-in curve editor to author time-domain wavetables, enabling a wide range of sound design capabilities driven by gameplay parameters.

There are many more updates to the Editor, the Niagara editor, physics simulation, the rendering system, and the Sequencer multi-track editor in Unreal Engine 4.23. The Unreal Engine team has notified users that the preview release is not fully quality tested and should be considered unstable until the final release.

Users are excited to try the latest version of Unreal Engine 4.23.
https://twitter.com/ClicketyThe/status/1149070536762372096
https://twitter.com/cinedatabase/status/1149077027565309952
https://twitter.com/mygryphon/status/1149334005524750337

Visit the Unreal Engine page for more details.

Unreal Engine 4.22 update: support added for Microsoft's DirectX Raytracing (DXR)
Unreal Engine 4.20 released with focus on mobile and immersive (AR/VR/MR) devices
What's new in Unreal Engine 4.19?


Unity Learn Premium, a learning platform for professionals to master real-time 3D development

Sugandha Lahoti
27 Jun 2019
3 min read
Unity has announced a new learning platform for professionals and hobbyists to advance their Unity knowledge and skills within their industry. Unity Learn Premium builds upon the launch of the free Unity Learn platform, which hosts hundreds of free projects and tutorials, including two new beginner projects. Users can search learning materials by topic, content type, and level of expertise. Tutorials come with how-to instructions, video clips, and code snippets, making it easier to switch between Unity Learn and the Unity Editor.

The Unity Learn Premium service allows creators to get immediate answers, feedback, and guidance directly from experts with Learn Live, biweekly interactive sessions with Unity-certified instructors. Learners can also track progress on guided learning paths, work through shared challenges with peers, and access an exclusive library of resources updated every month with the latest Unity releases. The premium version will offer live access to Unity experts and learning content across industries, including architecture, engineering, and construction; automotive, transportation, and manufacturing; media and entertainment; and gaming.

The Unity Learn Premium announcement comes on the heels of the launch of the Unity Academic Alliance, a membership program through which educators and institutions can incorporate Unity into their curriculum.

Jessica Lindl, VP and Global Head of Education, Unity Technologies, wrote to us in a statement, "Until now, there wasn't a definitive learning resource for learning intermediate to advanced Unity skills, particularly for professionals in industries beyond gaming. The workplace of today and tomorrow is fast-paced and driven by innovation, meaning workers need to become lifelong learners, using new technologies to upskill and ultimately advance their careers. We hope that Unity Learn Premium will be the perfect tool for professionals to continue on this learning path."

She further wrote, "Through our work to enable the success of creators around the world, we discovered there is no definitive source for advancing from beginner to expert across all industries, which is why we're excited to launch the Unity Learn Platform. The workplace of today and tomorrow is fast-paced and driven by innovation, forcing professionals to constantly be reskilling and upskilling in order to succeed. We hope the Unity Learn Platform enables these professionals to excel in their respective industries."

Unity Learn Premium will be available at no additional cost for Plus and Pro subscribers and offered as a standalone subscription for $15/month. You can access more information here.

Related News
Developers can now incorporate Unity features into native iOS and Android apps
Unity Editor will now officially support Linux
Obstacle Tower Environment 2.0: Unity announces Round 2 of its 'Obstacle Tower Challenge' to test AI game players


Google announces early access of ‘Game Builder’, a platform for building 3D games with zero coding

Bhagyashree R
17 Jun 2019
3 min read
Last week, a team within Area 120, Google's workshop for experimental products, introduced an experimental prototype of Game Builder. It is a "game building sandbox" that enables you to build and play 3D games in just a few minutes. It is currently in early access and is available on Steam.
https://twitter.com/artofsully/status/1139230946492682240

Here's how Game Builder makes "building a game feel like playing a game".

Image source: Google

Following are some of the features that Game Builder comes with:

Everything is multiplayer
Game Builder's always-on multiplayer feature allows multiple users to build and play games simultaneously. Your friends can also play the game while you are working on it.

Thousands of 3D models from Google Poly
You can find thousands of free 3D models (such as a rocket ship, a synthesizer, or an ice cream cone) to use in your games from Google Poly. You can also "remix" most of the models using the Tilt Brush and Google Blocks application integration to make them fit your game. Once you find the right 3D model, you can easily and instantly use it in your game.

No code, no compilation required
This platform is designed for all skill levels, from enabling players to build their first game to providing game developers a faster way to realize their game ideas. Game Builder's card-based visual programming allows you to bring your game to life with a bare minimum of programming knowledge: you just drag and drop cards to answer questions like "How do I move?" You can also create your own cards with Game Builder's extensive JavaScript API, which allows you to script almost everything in the game. As the code is live, you just need to save the changes and you are ready to play the game without any compilation.

Apart from these features, you can also create levels with terrain blocks, edit the physics of objects, create lighting and particle effects, and more. Once the game is ready, you can share your creations on Steam Workshop.

Many people are commending this easy way of building games, but also think that it is nothing new; we have seen such platforms in the past, for instance GameMaker by YoYo Games. "I just had a play with it. It seems very well thought out. It has a very nice tutorial that introduces all the basic concepts. I am looking forward to trying out the multiplayer aspect, as that seems to be the most compelling thing about it," a Hacker News user commented.

You can read Google's official announcement for more details.

Google Research Football Environment: A Reinforcement Learning environment for AI agents to master football
Google Walkout organizer, Claire Stapleton resigns after facing retaliation from management
Ian Lance Taylor, Golang team member, adds another perspective to Go being Google's language


Introducing Minecraft Earth, Minecraft's AR-based game for Android and iOS users

Amrata Joshi
20 May 2019
4 min read
Last week, the team at Minecraft introduced a new AR-based game called 'Minecraft Earth', which is free for Android and iOS users. The most striking feature of Minecraft Earth is that it builds on the real world with augmented reality; it will no doubt remind you of Pokémon Go.
https://twitter.com/minecraftearth/status/1129372933565108224

Minecraft has around 91 million active players, and Microsoft is now looking to take the Pokémon Go concept to the next level by letting Minecraft players create and share whatever they've made in the game with friends in the real world. Users can build something in Minecraft on their phones and then drop it into their local park for all their friends to see together at the same location. The game aims to take single-user AR gaming to multi-user gaming while letting users access a virtual world that's shared by everyone.

Read Also: Facebook launched new multiplayer AR games in Messenger

Minecraft Earth will be available in beta on iOS and Android this summer. The game brings modes like Creative, with unlimited blocks and items, and Survival, where you lose all your items when you die.

Torfi Olafsson, game director of Minecraft Earth, explains, "This is an adaptation, this is not a direct translation of Minecraft. While it's an adaptation, it's built on the existing Bedrock engine so it will be very familiar to existing Minecraft players. If you like building Redstone machines, or you're used to how the water flows, or how sand falls down, it all works." Olafsson further added, "All of the mobs of animals and creatures in Minecraft are available, too, including a new pig that really loves mud. We have tried to stay very true to the kind of core design pillars of Minecraft, and we've worked with the design team in Stockholm to make sure that the spirit of the game is carried through."

Players have to venture out into the real world to collect things, just as in Pokémon Go. Minecraft Earth has something similar to PokéStops called "tappables", which are randomly placed in the world around the player. They are designed to give players rewards that allow them to build things, and players need to collect as many of these as possible in order to get resources and items to build vast structures in the building mode. The maps in the game are based on OpenStreetMap, which has allowed Microsoft to place Minecraft adventures into the world. On the Minecraft Earth map, these adventures spawn dynamically and are designed for multiple people to get involved in.

Players can play together while sitting side by side to experience the same adventure at the exact same time and spot. They can also fight monsters, break down structures for resources together, and even stand in front of a friend to block them from physically killing a virtual sheep. Players can even see the tools that fellow players have in their hands on their phone's screen, alongside their username. Microsoft is also using its Azure Spatial Anchors technology in Minecraft Earth, which uses machine vision algorithms so that real-world objects can be used as anchors for digital content.

Niantic, the developer of Pokémon Go, recently had to settle a lawsuit with angry homeowners who had PokéStops placed near their houses. What happened with Pokémon Go in the past could be a threat for games like Minecraft Earth too, as there are many challenges in bringing augmented reality into private spaces. Saxs Persson, creative director of Minecraft, said, "There are lots of very real challenges around user-generated content. It's a complicated problem at the scale we're talking about, but that doesn't mean we shouldn't tackle it."

https://twitter.com/Toadsanime/status/1129374278384795649
https://twitter.com/ExpnandBanana/status/1129419087216562177
https://twitter.com/flamnhotsadness/status/1129429075490160642
https://twitter.com/pixiebIush/status/1129455271833550848

To know more about Minecraft Earth, check out Minecraft's page.

Game rivals, Microsoft and Sony, form a surprising cloud gaming and AI partnership
Obstacle Tower Environment 2.0: Unity announces Round 2 of its 'Obstacle Tower Challenge' to test AI game players
OpenAI Five beats pro Dota 2 players; wins 2-1 against the gamers

Epic releases Unreal Engine 4.22, focuses on adding “photorealism in real-time environments”

Sugandha Lahoti
03 Apr 2019
4 min read
Epic Games released a new version of its flagship game engine, Unreal Engine 4.22. This release comes with a total of 174 improvements, focused on "pushing the boundaries of photorealism in real-time environments". It also comes with improved build times, up to 3x faster, and new features such as real-time ray tracing. Unreal Engine 4.22 also adds support for Microsoft HoloLens remote streaming and Visual Studio 2019.

What's new in Unreal Engine 4.22?

Real-Time Ray Tracing and Path Tracing (Early Access): The ray tracing features, first introduced in a preview in mid-February, are composed of a series of ray tracing shaders and ray tracing effects. They help in achieving natural, realistic-looking lighting effects in real time. The Path Tracer includes a full global illumination path for indirect lighting that creates ground-truth reference renders right inside the engine. This improves workflow on content in a scene without needing to export to a third-party offline path tracer for comparison.

New Mesh Drawing Pipeline: The new pipeline for mesh drawing results in faster caching of information for static scene elements. Automatic instancing merges draw calls where possible, resulting in four to six times fewer lines of code. This change is a big one, so backwards compatibility for Drawing Policies is not possible; any custom Drawing Policies will need to be rewritten as FMeshPassProcessors in the new architecture.

Multi-user editing (Early Access): Simultaneous multi-user editing allows multiple level designers and artists to connect multiple instances of Unreal Editor together to work collaboratively in a shared editing session.

Faster C++ iterations: Unreal has licensed Molecular Matters' Live++ for all developers to use on their Unreal Engine projects, and integrated it as the new Live Coding feature. Developers can now make C++ code changes in their development environment and compile and patch them into a running editor or standalone game in a few seconds. UE 4.22 also optimizes UnrealBuildTool and UnrealHeaderTool, reducing build times and resulting in up to 3x faster iterations when making C++ code changes.

Improved audio with TimeSynth (Early Access): TimeSynth is a new audio component with features like sample-accurate starting, stopping, and concatenation of audio clips. It also includes precise and synchronous audio event queuing.

Enhanced Animation: Unreal Engine 4.22 comes with a new Animation Plugin which is based upon the Master-Pose Component system and adds blending and additive Animation States. It reduces the overall amount of animation work required for a crowd of actors. This release also features an Anim Budgeter tool to help developers set a fixed budget per platform (ms of work to perform on the game thread).

Improvements in the Virtual Production Pipeline:
New Composure UI: Unreal's built-in compositing tool Composure has an updated UI for real-time compositing capabilities to build images, video feeds, and CG elements directly within the Unreal Engine.
OpenColorIO (OCIO) color profiles: Unreal Engine now supports the OpenColorIO framework for transforming the color space of any Texture or Composure Element directly within the Unreal Engine.
Hardware-accelerated video decoding (Experimental): On Windows platforms, UE 4.22 can use the GPU to speed up the processing of H.264 video streams to reduce the strain on the CPU when playing back video streams.
New Media I/O Formats: UE 4.22 ships with new professional video I/O input formats and devices, including 4K UHD inputs for both AJA and Blackmagic, and AJA Kona 5 devices.
nDisplay improvements (Experimental): Several new features make the nDisplay multi-display rendering system more flexible, handling new kinds of hardware configurations and inputs.

These were just a select few updates. To learn more about Unreal Engine 4.22, head over to the Unreal Engine blog.

Unreal Engine 4.22 update: support added for Microsoft's DirectX Raytracing (DXR)
Unreal Engine 4.20 released with focus on mobile and immersive (AR/VR/MR) devices
Implementing an AI in Unreal Engine 4 with AI Perception components [Tutorial]


Epic Games announces: Epic MegaGrants, RTX-powered Ray tracing demo, and free online services for game developers

Natasha Mathur
22 Mar 2019
4 min read
Epic Games, an American video game and software development company, made a series of announcements earlier this week:
Epic Games' CEO, Tim Sweeney, will offer $100 million in grants to game developers.
A stunning RTX-powered ray tracing demo named Troll.
The launch of Epic's free Online Services for game developers.

Epic MegaGrants: $100 million in funds for game developers

Tim Sweeney, CEO of Epic Games Inc., announced earlier this week that he will be offering $100 million in grants to game developers to boost the growth of the gaming industry. Sweeney made the announcement during a presentation on Wednesday at the Game Developers Conference (GDC), the world's largest professional game industry event, which ended yesterday in San Francisco.

Epic Games had previously created a $5 million fund for grants that have been disbursed over the last three years. Now Epic Games is building a new fund called Epic MegaGrants. These are "no-strings-attached" grants, meaning they don't involve any contracts requiring game developers to do anything for Epic. All that game developers need to do is apply for the grants and create an innovative project; if Epic's judges find it worthy, they'll offer them the funds. "There are no commercial hooks back to Epic. You don't have to commit to any deliverables. This is our way of sharing Fortnite's unbelievable success with as many developers as we can," said Sweeney.

Troll: a ray tracing Unreal Engine 4 demo

Another eye-grabbing moment at GDC this year was a visually stunning ray tracing demo revealed by Goodbye Kansas and Deep Forest Films, called "Troll". Troll was rendered in real time using Unreal Engine 4.22 ray tracing and camera effects, powered by a single NVIDIA GeForce RTX 2080 Ti graphics card. Troll is visually inspired by Swedish painter and illustrator John Bauer, whose illustrations for the Swedish folklore and fairy tale anthology 'Among Gnomes and Trolls' are famous.
https://www.youtube.com/watch?v=Qjt_MqEOcGM

"Ray tracing is more than just reflections — it's about all the subtle lighting interactions needed to create a natural, beautiful image. Ray tracing adds these subtle lighting effects throughout the scene, making everything look more real and natural," said Nick Penwarden, Director of Engineering for Unreal Engine at Epic Games. The NVIDIA team states in a blog post that Epic Games has been working to integrate RTX-accelerated ray tracing into its popular Unreal Engine 4. In fact, Unreal Engine 4.22 will support the new Microsoft DXR API for real-time ray tracing.

Epic's free Online Services launch for game developers

Epic Games also announced the launch of free tools and services, part of Epic Online Services, which was announced in December 2018. The SDK is available via the new developer portal for immediate download and use, and currently supports Windows, Mac, and Linux. As part of the release, the SDK provides support for two free services: game analytics and player ticketing. Game analytics helps developers understand player behavior, featuring DAU (daily active users), MAU (monthly active users), retention, new player counts, game launch counts, online user counts, and more. The ticketing system connects players directly with developers and allows them to report bugs or other problems.

These two services will continue to evolve along with the rest of Epic Online Services (EOS) to offer the infrastructure and tools developers need to launch, operate, and scale high-quality online games. Epic Games will also be offering additional free services throughout 2019, including player data storage, player reports, leaderboards and stats, player identity, player inventory, matchmaking, and more. "We are committed to developing EOS with features that can be used with any engine, any store and that can support any major platform...these services will allow developers to deliver cross-platform gameplay experiences that enable players to enjoy games no matter what platform they play on," states the Epic Games team.

Fortnite server suffered a minor outage, Epic Games was quick to address the issue
Epic games CEO calls Google "irresponsible" for disclosing the security flaw in Fortnite Android installer
Fortnite creator Epic games launch Epic games store where developers get 88% of revenue earned


Researcher shares a Wolfenstein real-time ray tracing demo in WebGL1

Bhagyashree R
18 Mar 2019
3 min read
Last week, Reinder Nijhoff, a computer vision researcher, created a project that does real-time ray tracing in WebGL1 using Nvidia's RTX graphics card. The demo was inspired by Metro's real-time global illumination.
https://twitter.com/ReinderNijhoff/status/1106193109376008193

The demo uses a hybrid rendering engine created with WebGL1. It renders all the polygons in a frame with traditional rasterization technologies and then combines the result with ray traced shadows, diffuse GI, and reflections.

Image credit: Reinder Nijhoff

What is the ray tracing technique?

In computer graphics, ray tracing is a technique for rendering 3D graphics with very complex light interactions. In this technique, an algorithm traces the path of light and then simulates the way the light interacts with virtual objects. Light interacts with virtual objects in three ways: it can be reflected from one object to another, causing reflections; it can be blocked by objects, causing shadows; and it can pass through transparent or semi-transparent objects, causing refractions. All these interactions are combined to determine the final color of a pixel.

Ray tracing has long been used for offline rendering because of its ability to accurately model the physical behavior of light in the real world. Due to its computationally intensive nature, it was often not the first choice for real-time rendering. However, this changed with the introduction of the Nvidia RTX graphics cards, which add custom acceleration hardware and make real-time ray tracing relatively straightforward.

What was this demo about?

The project's prototype was based on a forward renderer that first draws all the geometry in the scene. Next, the shader used to rasterize the geometry (convert it into pixels) calculates the direct lighting. The shader also casts random rays from the surface of the rendered geometry to collect the indirect light reflected by non-shiny surfaces, using a ray tracer.

Nijhoff started with a very simple scene for the prototype, containing a single light and only a few spheres and cubes, which kept the ray tracing code fairly straightforward. Once the prototype was complete, he wanted to take it to the next level by adding more geometry and a lot of lights to the scene. Despite the complexity of the environment, Nijhoff wanted to ray trace the scene in real time. Generally, a bounding volume hierarchy (BVH) is used as an acceleration structure to speed up ray tracing. However, with WebGL1 shaders it is difficult to pre-calculate and use a BVH, which is why Nijhoff decided to use a Wolfenstein 3D level for this demo.

To know more in detail, check out the original post shared by Reinder Nijhoff.

Unity switches to WebAssembly as the output format for the Unity WebGL build target
NVIDIA shows off GeForce RTX, real-time raytracing GPUs, as the holy grail of computer graphics to gamers
Introducing SCRIPT-8, an 8-bit JavaScript-based fantasy computer to make retro-looking games
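For readers new to the idea, the self-contained Python sketch below illustrates the core loop described above: trace a primary ray to the nearest sphere, then cast a shadow ray toward the light. Nijhoff's demo does this (and far more) inside a WebGL1 GLSL shader; this snippet only demonstrates the math, not his implementation.

```python
# Minimal sketch of the ray tracing idea described above: find the nearest
# sphere hit for a primary ray, then cast a shadow ray toward the light.
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance t to the nearest intersection, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c              # direction is assumed normalized (a == 1)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None      # small epsilon avoids self-intersection

spheres = [((0.0, 0.0, -5.0), 1.0), ((2.0, 0.0, -6.0), 1.0)]
light = (5.0, 5.0, 0.0)

def shade(origin, direction):
    # Find the nearest hit among all spheres.
    hits = [(t, s) for s in spheres if (t := ray_sphere(origin, direction, *s))]
    if not hits:
        return "background"
    t, _ = min(hits)
    point = tuple(o + t * d for o, d in zip(origin, direction))
    # Shadow ray: if any sphere blocks the path to the light, the point is shadowed.
    to_light = [l - p for l, p in zip(light, point)]
    norm = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / norm for v in to_light]
    shadowed = any(ray_sphere(point, to_light, *s) for s in spheres)
    return "in shadow" if shadowed else "lit"

print(shade((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)))   # primary ray straight ahead -> "lit"
```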

Godot 3.1 released with improved C# support, OpenGL ES 2.0 renderer and much more!

Savia Lobo
15 Mar 2019
4 min read
On Wednesday, March 13, the Godot developers announced the release of a new version of the open source 2D and 3D cross-platform game engine, Godot 3.1. This new version includes many of the most requested improvements to the major Godot 3.0 release.

Improved features in Godot 3.1

OpenGL ES 2.0 renderer
Rendering is done entirely in sRGB color space (the GLES3 renderer uses linear color space). This is much more efficient and compatible, but it means that HDR is not supported. Some advanced PBR features, such as subsurface scattering, are not supported, and unsupported features are not visible when editing materials. Some shader features will not work and will throw an error when used. Some post-processing effects are not present either, and unsupported features are not visible when editing environments. GPU-based particles will not work, as there is no transform feedback support; users can use the new CPUParticles node instead.

Optional typing in GDScript
This has been one of the most requested Godot features from day one. GDScript lets you write code quickly within a controlled environment. The code editor now shows which lines are type-safe with a slight highlight of the line number. This will be vital in the future for optimizing small pieces of code which may require more performance.

Revamped Inspector
The Godot inspector has been rewritten from scratch. It includes features such as proper vector field editing, sub-inspectors for resource editing, better custom visual editors for many types of objects, comfortable spin-slider controls, better array and dictionary editing, and many more features.

KinematicBody2D (and 3D) improvements
Kinematic bodies are among Godot's most useful nodes. They allow creating very game-like character motion with little effort. For Godot 3.1 they have been considerably improved with support for snapping the body to the floor, support for RayCast shapes in kinematic bodies, and support for synchronizing kinematic movement to physics, avoiding a one-frame delay.

New axis handling system
Godot 3.1 uses the novel concept of "action strength". This approach allows using actions for all use cases, and it makes it very easy to create in-game customizable mappings and customization screens.

Visual Shader Editor
This was a pending feature to re-implement in Godot 3.0, but it couldn't be done in time back then. The new version has new features such as PBR outputs, port previews, and easier-to-use mapping to inputs.

2D meshes
Godot now supports 2D meshes, which can be used from code or converted from sprites to avoid drawing large transparent areas.

2D skeletons
It is now possible to create 2D skeletons with the new Skeleton2D and Bone2D nodes. Additionally, Polygon2D vertices can be assigned bones and weight painted. Adding internal vertices for better deformation is also supported.

Constructive Solid Geometry (CSG)
CSG tools have been added for fast level prototyping, allowing generic primitives and custom meshes to be combined via boolean operations to generate more complex shapes. They can also become colliders to test together with physics.

CPU-based particle system
Godot 3.0 integrated a GPU-based particle system, which allows emitting millions of particles at little performance cost. The developers have now added alternative CPUParticles and CPUParticles2D nodes that perform particle processing on the CPU (and draw using the MultiMesh API). These nodes open the window for adding features such as physics interaction, sub-emitters, or manual emission, which are not possible on the GPU.

More VCS-friendly
The new 3.1 version includes some much-requested enhancements: folded properties are no longer saved in scenes, which avoids unnecessary history pollution, and non-modified properties are no longer saved, which reduces text files considerably and makes history even more readable.

Improved C# support
In Godot 3.1, C# projects can be exported to Linux, macOS, and Windows. Support for Android, iOS, and HTML5 will come soon.

To learn about the other improvements in detail, visit the changelog or the official website.

Microsoft announces Game stack with Xbox Live integration to Android and iOS
OpenAI introduces Neural MMO, a multiagent game environment for reinforcement learning agents
Google teases a game streaming service set for Game Developers Conference


Microsoft announces Game stack with Xbox Live integration to Android and iOS

Natasha Mathur
15 Mar 2019
3 min read
Microsoft has good news for all the game developers out there. Yesterday it launched a new initiative called Microsoft Game Stack, which brings together different Microsoft tools and services into a single robust ecosystem to 'empower game developers'. Whether you're a rookie indie developer or an AAA studio, this developer-focused platform aims to make the game development process much easier for you.

The main goal of Game Stack is to help developers easily find the different tools and services required for game development in one spot. These tools range from Azure, PlayFab, DirectX, and Visual Studio to Xbox Live, App Center, and Havok.

Image source: Microsoft

Cloud plays a major role in Game Stack, and it relies on Azure to fulfill this requirement. Azure, globally available in 54 regions, will help scale Project xCloud (a service that streams games to PCs, consoles, and mobile devices) to provide an uninterrupted gaming experience for players worldwide. Companies like Rare, Ubisoft, and Wizards of the Coast are already hosting multiplayer game servers and storing their player data on Azure. Azure is also capable of analyzing game telemetry, protecting games from DDoS attacks, and training AI. Moreover, Microsoft Game Stack is device-agnostic, which makes it very convenient for gamers.

Another major component of Game Stack is PlayFab, a backend service for building and operating new games. PlayFab offers game development services, real-time analytics, and LiveOps capabilities to Game Stack. PlayFab is also device-agnostic: it supports iOS, Android, PC, web, Xbox, Sony PlayStation, Nintendo Switch, and all the major game engines such as Unity and Unreal. Microsoft has also released previews of five new PlayFab services. Of these, PlayFab Matchmaking is open for public preview, while the other four, PlayFab Party, PlayFab Insights, PlayFab PubSub, and PlayFab User-Generated Content, are in private preview.

Game Stack also comes with Xbox Live, one of the most engaging and interactive gaming communities in the world. Xbox Live will provide identity and community services in Game Stack. Microsoft has also expanded the cross-platform capabilities of Xbox Live under Game Stack with a new SDK for iOS and Android devices. Mobile developers will be able to easily connect with some of the most highly engaged and passionate gamers on the planet using Xbox Live. Other benefits of the Xbox Live SDK include letting developers focus on building games while leveraging Microsoft's trusted identity network for log-in, privacy, online safety, and child accounts. Features like Gamerscore and "hero" stats also help keep gamers engaged.

Components such as Mixer, DirectX, Azure App Center, Visual Studio, Visual Studio Code, and Havok are all part of Game Stack as well. For more information, check out the official Microsoft Game Stack blog post.

Microsoft open sources the Windows Calculator code on GitHub
Microsoft open sources 'Accessibility Insights for Web', a chrome extension to help web developers fix their accessibility issue
Microsoft researchers introduce a new climate forecasting model and a public dataset to train these models