
Packt
07 Jul 2016
13 min read

Packaging the Game

A game is not just art, code, and game design packaged within an executable. You have to deal with stores, publishers, ratings, console providers, and producing assets and videos for stores and marketing, among the other minor things required to fully ship a game. This article by Muhammad A. Moniem, author of the book Mastering Unreal Engine 4.X, covers the last steps you need to take within the Unreal environment in order to get this packaged executable up and running. Anything post-Unreal you will need to handle on your own, but from my seat, I'm telling you: you have already done the complex, hard, huge, and long part; what comes next is a lot simpler! This article will help us understand and use Unreal's Project Launcher, patch a project, and create DLCs (downloadable content).

(For more resources related to this topic, see here.)

Project Launcher and DLCs

The first and most important thing to keep in mind is that the Project Launcher is still in development, and the process of creating DLCs is not final yet; it might change in future engine releases. While writing this book I have been using Unreal 4.10 and testing everything I do and write against the Unreal 4.11 preview version, and the DLC process still remains experimental. So be advised that you might find it a little different in the future as the engine evolves.

While we previously packaged the game through the File menu using Packaging Project, there is another, more detailed, more professional way to do the same job: the Project Launcher. It ships as a separate app with Unreal (Unreal Frontend), but you also have the choice to run it directly from the editor. You can access the Project Launcher from the Window menu by choosing Project Launcher, which will launch it right away. However, I have a question here: why would you go through these extra steps rather than just doing the packaging process in one click? Well, extensibility is the answer.
Using the Unreal Project Launcher allows you to create several profiles, each with different build settings, and you can later fire off each build whenever you need it. Not only that, but profiles can be made for different projects, which means you can have a ready-made setup for all your projects with all the different build configurations. And even that's not everything; it comes in handier still when you have to cook the content of a game several times. Rather than keep doing it through the File menu, you can cook the content for all the different platforms at once. For example, if you have to change one texture in a game that is supported on five platforms, you can make a profile that cooks the content for all the platforms and arranges them for you at once, while you spend that time doing something else. The Project Launcher does the whole thing for you.

What if you have to cook the game content for different languages? Say the game supports 10 languages; do you have to do it one by one for each language? The answer is simple: the Project Launcher will do it for you. So you can think of the Project Launcher as a batch processor, a custom command-line tool, or even a form of build server. You set up the configurations and requests, then leave it alone to do the whole thing for you while you spend your time on something else. It is all about productivity!

The most important part about the Project Launcher is that you can create DLCs very easily. Just by setting up a profile with a different set of options and settings, you can get a DLC or game mode done without any complications. In a word, it is all about profiles, so let's discuss how to create profiles that can serve different purposes. Sometimes the Project Launcher proposes a standard profile matching the platform you are using.
That is good, but those profiles usually don't have everything we need, which is why it is recommended to always create new profiles to serve our goals. The Project Launcher window is divided into two sections: the upper part contains the default profiles, while the lower part contains the custom profiles. To create a new profile, all you have to do is hit the plus sign in the bottom part, titled Custom Launch Profiles.

Pressing it will take you to a wizard, or better described as a window, where you can set up the new profile's options. Those options are drastic; changing between them leads to completely different results, so you have to be careful. In general, you will mostly be building either a project for release, or a DLC or patch for an already released project. You can even do other types of builds that serve different goals, such as a language package for an already released game, which is treated like a patch or DLC but has a different setup and options than either. We will focus on the two main types of process that developers usually have to deal with in the Project Launcher: release and patch.

Packaging a release

After the new Custom Launch Profile wizard window opens, there are settings you need to change in order to make our Release build of the project. This includes:

General: This has the following fields: Give a name to the profile; this name will be displayed in the Project Launcher main window. Give a description to the profile to make its goals clear for you in the future, or for anyone else who is going to use it.

Project: This has the following sections: Select a project, the one that needs to be built.
Or you can leave this at Any Project, in order to build the current active project.

Build: You have to check the Build box in order to make a build and activate this section of options. From the Build Configuration dropdown, choose a build type, which is Shipping in this case. Finally, you can check the Build UAT (Unreal Automation Tool) option under Advanced Settings in this section. The UAT can be considered a bunch of scripts creating a set of automated processes, but in order to decide whether to run it or not, you have to really understand what the UAT is:

Written in C# (may convert to C++ in the future)
Automates repetitive tasks through automation scripts
Builds, cooks, packages, deploys, and launches projects
Invokes UBT for compilation
Analyzes and fixes game content files
Performs code surgery when updating to new engine versions
Distributes compilation (XGE) and provides build system integration
Generates code documentation
Automates testing of code and content
And many others: you can add your own scripts!

Now you can decide whether you want to enable it or not.

Cook: In this section, you need to set the cook mode to By the book. This means you need to define exactly what needs to be cooked and for which platforms. It is enough for now to set it to WindowsNoEditor, and check the cultures you want from the list. I have chosen all of them (this is faster than picking one at a time) and then excluded the ones that I don't want. Then you need to check which maps should be cooked; if you can't see any maps, it is probably the first build, and later you'll find the maps listed. In any case, you must keep all the maps listed in the Maps folder under the Content directory. Now, from the Release / DLC / Patching Settings section, you have to check the option Create a release version of the game for distribution, as this version is going to be the distribution one.
And from the same section, give the build a version number. This is going to create some extra files that will be used in the future if we are going to create patches or DLCs.

You can expand the Advanced Settings section to set your own options. By default, Compress Content and Save Packages without versions are both checked, and both are good for the type of build we are making. You can also set Store all content in a single file (UnrealPak) to keep things tidy; one .pak file is better than lots of separate files. Finally, you can set Cooker Build Configuration to Shipping, since we set the Build Configuration itself to Shipping.

Package: From this section's drop-down menu, choose Package & store locally, which will save the packages on the drive. You can't set anything else here, unless you want to store the packaged project in a repository.

Deploy: The Deploy section is meant to build the game onto the device of your choice, which is not what we want here. If you want to put the game onto a device, you can do it directly with Launch from within the editor itself. So, let's set this section to Do Not Deploy.

Launch: If you have chosen to deploy the game to a device, you'll be able to edit this section; otherwise, the options here will be disabled. The set of options here is meant to choose the configuration of the deployed build, as it will run once it is deployed to the device. Here you can set things like the language culture, the default startup map, command-line arguments, and so on. As we are not deploying now, this section will be disabled.

Now that we have finished editing our profile, you can find a back arrow at the top of this wizard. Pressing it will take you back to the Project Launcher main window, where you can find our profile in the bottom section. Any other profiles you make in the future will be listed there too. Now there is one step left to finish the build.
In the right corner of the profile there is a button that says Launch This Profile. Hitting it will start the process of building, cooking, and packaging this profile for the selected project. Hit it right away if you want the process to start. And keep in mind, any time you need to change any of the previously set settings, there is always an Edit button for each profile.

The Project Launcher will start processing this profile. It will take some time, though how much depends on your choices, and you'll be able to see all the steps while it is happening. Not only that, but you can also watch a detailed log; you can save this log, or even cancel the process at any time.

Once everything is done, a new button will appear at the bottom: Done. Hitting it will take you back again to the Project Launcher main window. You can find the build in the Saved\StagedBuilds\WindowsNoEditor directory of your project, which in my case is:

C:\Users\Muhammad\Desktop\Bellz\Saved\StagedBuilds\WindowsNoEditor

The most important thing now, if you are planning to create patches or DLCs for this project, is the version number you set in the Cook section. Setting it produced some files that you can find in ProjectName\Releases\ReleaseVersion\Platform, which in my case is:

C:\Users\Muhammad\Desktop\Bellz\Releases\1.0\WindowsNoEditor

There are two files there; make sure that you keep a backup of them on your drive for future use. Now you can ship the game and upload it to the distribution channel!

Packaging a patch or DLC

The good news is, there isn't much new to do here. In other words, you have to do lots of things, but it is a repetitive process. You'll create a new profile in the Project Launcher and set 90% of the options the same as in the previous release profile; the only difference will be in the Cook options.
This means the settings that will remain the same are:

Project
Build
Package
Deploy
Launch

The only difference is that in the Release/DLC/Patching Settings section of the Cook section you have to:

Disable Create a release version of the game for distribution.
Set the version number of the base build (the release) as the release version this is based on; this choice makes sure the previous content is compared with the current one.
Check Generate patch, if the current build is a patch, not a DLC.
Check Build DLC, if the current build is a DLC, not a patch.

Now you can launch this profile and wait until it is done. The patching process creates a *.pak file in the directory ProjectName\Saved\StagedBuilds\PlatformName\ProjectName\Content\Paks. This .pak file is the patch that you'll be uploading to the distribution channel! The most common way to handle this type of patch is by creating an installer; in this case, you'll create an installer that copies the *.pak file into the player's directory, ProjectName\Releases\VersionNumber\PlatformName, which is where the original content *.pak file of the release version is. In my case, I copy the *.pak file from:

C:\Users\Muhammad\Desktop\Bellz\Releases\1.0\WindowsNoEditor

to:

C:\Users\Muhammad\Desktop\Bellz\Saved\StagedBuilds\WindowsNoEditor\Bellz\Content\Paks

Now you know the way to create patches and downloadable content. And regardless of the time you spent creating them this time, it will be faster in the future, as you get more used to the process and Epic keeps working on making it better and better.

Summary

The Project Launcher is a very powerful tool shipped with the Unreal ecosystem. Using it is not mandatory, but sometimes it is needed to save time, and you learned how and when to use this powerful tool. Many games nowadays have downloadable content; it helps keep the game community growing and the game earning more revenue.
Having DLCs is not essential, but it is good to have them; as we discussed, they must be planned early, and you've learned how to manage them within Unreal Engine, as well as how to make patches and DLCs using the Unreal Project Launcher.

Packt
22 Jun 2016
39 min read

Development Tricks with Unreal Engine 4

In this article by Benjamin Carnall, the author of Unreal Engine 4 by Example, we will look at some development tricks with Unreal Engine 4.

(For more resources related to this topic, see here.)

Creating the C++ world objects

With the character constructed, we can now start to build the level. We are going to create a block out for the lanes that we will be using for the level; we can then use this block out to construct a mesh that we can reference in code. Before we get into the level creation, we should ensure the functionality that we implemented for the character works as intended.

With the BountyDashMap open, navigate to the C++ Classes folder of the Content Browser. Here you will be able to see the BountyDashCharacter. Drag and drop the character into the game level onto the platform, then search for TargetPoint in the Modes panel. Drag and drop three of these target points into the game level, and you will be presented with the following:

Now, press the Play button to enter PIE (Play In Editor) mode. The character will be automatically possessed and used for input. Ensure that when you press A or D, the character moves to the next available target point.

Now that we have the base of the character implemented, we should start to build the level. We require three lanes for the player to run down and obstacles for the player to dodge. For now, we will focus on the lanes that the player will be running on. Let's start by blocking out how the lanes will appear in the level. Drag a BSP Box brush into the game world. You can find the Box brush in the Modes panel under the BSP section, under the name Box. Place the box at world location (0.0f, 0.0f, -100.0f). This will place the box in the center of the world. Now, change the X property of the box under the Brush Settings section of the Details panel to 10000.
We require this lane to be so long so that, later on, we can hide the end using fog without obscuring the objects the player will need to dodge. Next, we need to create two more copies of this box. You can do this by holding Alt while moving an object via the transform widget. Position one box copy at world location (0.0f, -230.0f, -100.0f) and the next at (0.0f, 230.0f, -100.0f). The last thing we need to do to finish blocking out the level is place the Target Points in the center of each lane. You will be presented with this when you are done:

Converting BSP brushes into a static mesh

The next thing we need to do is convert the lane brushes we made into one mesh, so we can reference it within our code base. Select all of the boxes in the scene; you can do this by holding Ctrl while selecting the box brushes in the editor. With all of the brushes selected, address the Details panel. Ensure that the transformation of your selection is positioned in the middle of the three brushes. If it is not, you can either reselect the brushes in a different order, or group the brushes by pressing Ctrl + G while the boxes are selected. This is important, as the position of the transform widget shows what the origin of the generated mesh will be. With the group or boxes selected, address the Brush Settings section in the Details panel. There is a small white expansion arrow at the bottom of the section; click on this now. You will then be presented with a Create Static Mesh button; press this now. Name this mesh Floor_Mesh_BountyDash, and save it under the Geometry/Meshes/ folder of the content folder.

Smoke and Mirrors with C++ objects

We are going to create the illusion of movement within our level. You may have noticed that we have not included any facilities in our character to move forward in the game world. This is because our character will never advance past his X position at 0. Instead, we are going to be moving the world toward and past him.
This way, we can create very easy spawning and processing logic for the obstacles and the game world, without having to worry about continuously spawning objects that the player can move past further and further down the X axis.

We require some of the level assets to move through the world, so we can establish the illusion of movement for the character. One of these moving objects will be the floor. This requires some logic that will reposition floor meshes as they reach a certain depth behind the character. We will be creating a swap chain of sorts that will work with three meshes. The meshes will be positioned in a contiguous line. As the meshes move underneath and behind the player, we need to move any mesh that is far enough behind the player back to the front of the swap chain. The effect is a never-ending chain of floor meshes constantly flowing underneath the player. The following diagram may help you understand the concept:

Obstacles and coin pickups will follow a similar logic; however, they will simply be destroyed upon reaching the Kill point in the preceding diagram.

Modifying the BountyDashGameMode

Before we start to create code classes that will feature in our world, we are going to modify the BountyDashGameMode that was generated when the project was created. The game mode is going to be responsible for all of the game state variables and rules. Later on, we are going to use the game mode to determine how the player respawns when the game is lost.

BountyDashGameMode Class Definition

The game mode is going to be fairly simple; we are going to add a few member variables that will hold the current state of the game, such as game speed, game level, and the number of coins needed to increase the game speed.
Navigate to BountyDashGameMode.h and add the following code:

    UCLASS(minimalapi)
    class ABountyDashGameMode : public AGameMode
    {
        GENERATED_BODY()

        UPROPERTY()
        float gameSpeed;

        UPROPERTY()
        int32 gameLevel;

As you can see, we have two private member variables called gameSpeed and gameLevel. These are private, as we wish no other object to be able to modify the contents of these values. You will also note that the class has been specified with minimalapi. This specifier effectively informs the engine that other code modules will not need information from this object outside of the class type. This means you will be able to cast this class type, but functions cannot be called within other modules. This is specified as a way to optimize compile times, as no module outside of this project API will require interactions with our game mode.

Next, we declare the public functions and members that we will be using within our game mode. Add the following code to the ABountyDashGameMode class definition:

    public:
        ABountyDashGameMode();

        void CharScoreUp(unsigned int charScore);

        UFUNCTION()
        float GetInvGameSpeed();

        UFUNCTION()
        float GetGameSpeed();

        UFUNCTION()
        int32 GetGameLevel();

The function called CharScoreUp() takes in the player's current score (held by the player) and changes game values based on this score. This means we are able to make the game more difficult as the player scores more points. The next three functions are simply accessor methods that we can use to get the private data of this class in other objects.
Next, we need to declare our protected members, which we have exposed as EditAnywhere, so we may adjust them from the editor for testing purposes:

    protected:
        UPROPERTY(EditAnywhere, BlueprintReadOnly)
        int32 numCoinsForSpeedIncrease;

        UPROPERTY(EditAnywhere, BlueprintReadWrite)
        float gameSpeedIncrease;
    };

The numCoinsForSpeedIncrease variable will determine how many coins it takes to increase the speed of the game, and the gameSpeedIncrease value will determine how much faster the objects move when the numCoinsForSpeedIncrease value has been met.

BountyDashGameMode Function Definitions

Let's begin adding some definitions to the BountyDashGameMode functions. They will be very simple at this point. Let's start by providing some default values for our member variables within the constructor, and by assigning the class that is to be used for our default pawn. Add the definition for the ABountyDashGameMode constructor:

    ABountyDashGameMode::ABountyDashGameMode()
    {
        // set default pawn class to our ABountyDashCharacter
        DefaultPawnClass = ABountyDashCharacter::StaticClass();

        numCoinsForSpeedIncrease = 5;
        gameSpeed = 10.0f;
        gameSpeedIncrease = 5.0f;
        gameLevel = 1;
    }

Here, we are setting the default pawn class by calling StaticClass() on the ABountyDashCharacter. As we have just referenced the ABountyDashCharacter type, ensure that #include "BountyDashCharacter.h" is added to the BountyDashGameMode.cpp include list. The StaticClass() function is provided by default for all objects, and it returns the class type information of the object as a UClass*. We then establish some default values for member variables. The player will have to pick up five coins to increase the level. The game speed is set to 10.0f (10 m/s), and the game will speed up by 5.0f (5 m/s) every time the coin quota is reached.
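These defaults imply a concrete progression. As a sanity check, here is a small standalone mirror of the quota rule in plain C++ (the GameState struct name is ours, and plain types replace Unreal's; this is only a sketch of the arithmetic, not engine code):

```cpp
#include <cassert>

// Standalone mirror of the coin-quota rule: every time the score reaches a
// multiple of numCoinsForSpeedIncrease, speed up the game and bump the level.
struct GameState {
    float gameSpeed = 10.0f;        // 10 m/s starting speed
    int   gameLevel = 1;
    int   numCoinsForSpeedIncrease = 5;
    float gameSpeedIncrease = 5.0f; // +5 m/s per quota

    void CharScoreUp(unsigned int charScore) {
        if (charScore != 0 && charScore % numCoinsForSpeedIncrease == 0) {
            gameSpeed += gameSpeedIncrease;
            gameLevel++;
        }
    }
};
```

Feeding scores 1 through 10 into this rule triggers exactly two increases (at 5 and 10 coins), taking the speed from 10 m/s to 20 m/s and the level from 1 to 3.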
Next, let's add a definition for the CharScoreUp() function:

    void ABountyDashGameMode::CharScoreUp(unsigned int charScore)
    {
        if (charScore != 0 &&
            charScore % numCoinsForSpeedIncrease == 0)
        {
            gameSpeed += gameSpeedIncrease;
            gameLevel++;
        }
    }

This function is quite self-explanatory. The character's current score is passed into the function. We then check that the character's score is not currently 0, and that the remainder of the score is 0 after being divided by the number of coins needed for a speed increase; that is, that it divides evenly, meaning the quota has been reached. If so, we increase the game speed by the gameSpeedIncrease value and increment the level.

The last thing we need to add is the accessor methods described previously. They do not require much explanation, short of the GetInvGameSpeed() function. This function will be used by objects that wish to be pushed down the X axis at the game speed:

    float ABountyDashGameMode::GetInvGameSpeed()
    {
        return -gameSpeed;
    }

    float ABountyDashGameMode::GetGameSpeed()
    {
        return gameSpeed;
    }

    int32 ABountyDashGameMode::GetGameLevel()
    {
        return gameLevel;
    }

Getting our game mode via Template functions

The ABountyDashGameMode now contains information and functionality that will be required by most of the BountyDash objects we create going forward. We need to create a lightweight method to retrieve our custom game mode while ensuring that the type information is preserved. We can do this by creating a template function that takes in a world context and returns the correct game mode handle. Traditionally, we could just use a direct cast to ABountyDashGameMode; however, this would require including BountyDashGameMode.h in BountyDash.h. As not all of our objects will require knowledge of the game mode, this is wasteful. Navigate to the BountyDash.h file now.
You will be presented with the following:

    #pragma once

    #include "Engine.h"

What currently exists in the file is very simple. #pragma once has again been used to ensure that the compiler only builds and includes the file once. Then Engine.h has been included, so every other object in BOUNTYDASH_API (they include BountyDash.h by default) has access to the functions within Engine.h. This is a good place to include utility functions that you wish all objects to have access to. In this file, include the following lines of code:

    template<typename T>
    T* GetCustomGameMode(UWorld* worldContext)
    {
        return Cast<T>(worldContext->GetAuthGameMode());
    }

This code, simply put, is a template function that takes in a game world handle. It gets the game mode from this context via the GetAuthGameMode() function, and then casts this game mode to the template type provided to the function. We must cast to the template type, as GetAuthGameMode() simply returns an AGameMode handle. Now, with this in place, let's begin coding our never-ending floor.

Coding the floor

The construction of the floor will be quite simple in essence, as we only need a few variables and a tick function to achieve the functionality we need. Use the class wizard to create a class named Floor that inherits from AActor. We will start by modifying the class definition found in Floor.h. Navigate to this file now.

Floor Class Definition

The class definition for the floor is very basic. All we need is a Tick function and some accessor methods, so we may provide some information about the floor to other objects. I have also removed the BeginPlay function provided by default by the class wizard, as it is not needed.
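Returning for a moment to the GetCustomGameMode() template above: the same pattern can be exercised outside the engine. In this sketch, Cast<T> is replaced with dynamic_cast, and the World and game mode types are hypothetical stand-ins used only to show how the template preserves type information for the caller:

```cpp
#include <cassert>

// Stand-ins for AGameMode / UWorld, just to exercise the template pattern.
struct GameModeBase { virtual ~GameModeBase() = default; };
struct MyGameMode : GameModeBase { int gameLevel = 1; };

struct World {
    GameModeBase* authGameMode = nullptr;
    GameModeBase* GetAuthGameMode() { return authGameMode; }
};

// Equivalent of GetCustomGameMode<T>: fetch the base handle, cast to T.
// Returns nullptr if the active game mode is not actually a T.
template <typename T>
T* GetCustomGameMode(World* worldContext) {
    return dynamic_cast<T*>(worldContext->GetAuthGameMode());
}
```

The caller writes GetCustomGameMode<MyGameMode>(world) and gets back a typed pointer, while only this one utility needs to know how the untyped handle is retrieved; this is the same trade-off the book makes to avoid including BountyDashGameMode.h everywhere.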
The following is the AFloor class definition in its entirety; replace what is present in Floor.h with this now (keeping the #include list intact):

    UCLASS()
    class BOUNTYDASH_API AFloor : public AActor
    {
        GENERATED_BODY()

    public:
        // Sets default values for this actor's properties
        AFloor();

        // Called every frame
        virtual void Tick( float DeltaSeconds ) override;

        float GetKillPoint();
        float GetSpawnPoint();

    protected:
        UPROPERTY(EditAnywhere)
        TArray<USceneComponent*> FloorMeshScenes;

        UPROPERTY(EditAnywhere)
        TArray<UStaticMeshComponent*> FloorMeshes;

        UPROPERTY(EditAnywhere)
        UBoxComponent* CollisionBox;

        int32 NumRepeatingMesh;
        float KillPoint;
        float SpawnPoint;
    };

We have three UPROPERTY-declared members, the first two being TArrays that will hold handles to the USceneComponent and UStaticMeshComponent objects that will make up the floor. We require the TArray of scene components because the USceneComponent objects provide us with a world transform that we can apply translations to, so we may update the position of the generated floor mesh pieces. The last UPROPERTY is a collision box that will be used for the actual player collisions, to prevent the player from falling through the moving floor. The reason we are using a BoxComponent instead of the meshes for collision is that we do not want the player to translate with the moving meshes. Due to surface friction simulation, having the character collide with any of the moving meshes would cause the player to move with the mesh. The last three members are protected and do not require any UPROPERTY specification. We are simply going to use the two float values, KillPoint and SpawnPoint, to save output calculations from the constructor, so we may use them in the Tick() function. The integer value called NumRepeatingMesh will be used to determine how many meshes we will have in the chain.
Floor Function Definitions

As always, we will start with the constructor of the floor. We will be performing the bulk of our calculations for this object here, creating the USceneComponents and UStaticMeshComponents we are going to use to make up our moving floor. With dynamic programming in mind, we should establish the construction algorithm so that we can create any number of meshes in the moving line. Also, as we will be getting the speed of the floor's movement from the game mode, ensure that #include "BountyDashGameMode.h" is included in Floor.cpp.

The AFloor::AFloor() constructor

Start by adding the following lines to the AFloor constructor, AFloor::AFloor(), which is found in Floor.cpp:

    RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

    ConstructorHelpers::FObjectFinder<UStaticMesh> myMesh(TEXT(
        "/Game/Barrel_Hopper/Geometry/Floor_Mesh_BountyDash.Floor_Mesh_BountyDash"));

    ConstructorHelpers::FObjectFinder<UMaterial> myMaterial(TEXT(
        "/Game/StarterContent/Materials/M_Concrete_Tiles.M_Concrete_Tiles"));

To start with, we are simply using FObjectFinders to find the assets that we require for the mesh. For the myMesh finder, ensure that you parse the reference location of the static floor mesh that we created earlier. We also created a scene component to be used as the root component for the floor object. Next, we are going to check the success of the mesh acquisition and then establish some variables for the mesh placement logic:

    if (myMesh.Succeeded())
    {
        NumRepeatingMesh = 3;

        FBoxSphereBounds myBounds = myMesh.Object->GetBounds();
        float XBounds = myBounds.BoxExtent.X * 2;
        float ScenePos = ((XBounds * (NumRepeatingMesh - 1)) / 2.0f) * -1;

        KillPoint = ScenePos - (XBounds * 0.5f);
        SpawnPoint = (ScenePos * -1) + (XBounds * 0.5f);

Note that we have just opened an if statement without closing the scope; from time to time, I will split the segments of the code within a scope across multiple pages.
If you are ever lost as to the current scope that we are working from, then look for a comment like // <-- Closing if (myMesh.Succeeded()), or a similarly named comment in the future.

Firstly, we initialize the NumRepeatingMesh value with 3. We are using a variable here instead of a hard-coded value so that we may update the number of meshes in the chain without having to refactor the remaining code base. We then get the bounds of the mesh object using the function called GetBounds() on the mesh asset that we just retrieved. This returns an FBoxSphereBounds structure, which will provide you with all of the bounding information of a static mesh asset. We then use the X component of the member called BoxExtent to initialize XBounds. BoxExtent is a vector that holds the extent of the bounding box of this mesh. We save the X component of this vector so we can use it for the mesh chain placement logic. We have doubled this value, as the BoxExtent vector only represents the extent of the box from the origin to one corner of the mesh. Meaning, if we wish for the total bounds of the mesh, we must double any of the BoxExtent components.

Next, we calculate the initial position of the first USceneComponent we will be attaching a mesh to, and store it in ScenePos. We can determine this position by getting the total length of all of the meshes in the chain, (XBounds * (NumRepeatingMesh - 1)), and then halving the resulting value, so we can get the distance the first scene component will be from the origin along the X axis. We also multiply this value by -1 to make it negative, as we wish to start our mesh chain behind the character (at the X position 0). We then use ScenePos to specify KillPoint, which represents the point in space that floor mesh pieces should reach before swapping back to the start of the chain.
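It can help to work the placement numbers through by hand. The sketch below reproduces just this arithmetic as a standalone function (the ChainLayout struct and ComputeLayout name are ours, not engine code): for a full mesh length of 1000 units and three pieces, the first scene component sits at -1000, with KillPoint at -1500 and SpawnPoint at 1500.

```cpp
#include <cassert>

// Output of the placement arithmetic from the AFloor constructor.
struct ChainLayout {
    float FirstScenePos; // X of the first scene component in the chain
    float KillPoint;     // recycle threshold behind the chain
    float SpawnPoint;    // recycle target ahead of the chain
};

// Same math as the constructor: XBounds is the full mesh length
// (BoxExtent.X * 2), and the chain is centered around the character at X = 0.
ChainLayout ComputeLayout(float XBounds, int NumRepeatingMesh) {
    ChainLayout layout;
    layout.FirstScenePos = ((XBounds * (NumRepeatingMesh - 1)) / 2.0f) * -1;
    layout.KillPoint = layout.FirstScenePos - (XBounds * 0.5f);
    layout.SpawnPoint = (layout.FirstScenePos * -1) + (XBounds * 0.5f);
    return layout;
}
```

Running this with other mesh lengths or chain counts is a quick way to check that the chain stays centered on the character before touching the engine-side code.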
For the purposes of the swap chain, whenever a scene component is half a mesh piece length behind the position of the first scene component in the chain, it should be moved to the other side of the chain. With all of our variables in place, we can now iterate through the number of meshes we desire (3) and create the appropriate components. Add the following code to the scope of the if statement that we just opened:

for (int i = 0; i < NumRepeatingMesh; ++i)
{
    // Initialize Scene
    FString SceneName = "Scene" + FString::FromInt(i);
    FName SceneID = FName(*SceneName);
    USceneComponent* thisScene = CreateDefaultSubobject<USceneComponent>(SceneID);
    check(thisScene);

    thisScene->AttachTo(RootComponent);
    thisScene->SetRelativeLocation(FVector(ScenePos, 0.0f, 0.0f));
    ScenePos += XBounds;

    FloorMeshScenes.Add(thisScene);

Firstly, we create a name for the scene component by appending Scene with the iteration value that we are up to. We then convert this appended FString to an FName and provide it to the CreateDefaultSubobject template function. With the resultant USceneComponent handle, we call AttachTo() to bind it to the root component. Then, we set the relative location of the USceneComponent. Here, we are passing the ScenePos value that we calculated earlier as the X component of the new relative location. The relative location of this component will always be based off of the position of the root scene component that we created earlier.

With the USceneComponent appropriately placed, we increment the ScenePos value by that of the XBounds value. This will ensure that subsequent USceneComponents created in this loop will be placed an entire mesh length away from the previous one, forming a contiguous chain of meshes attached to scene components. Lastly, we add this new USceneComponent to FloorMeshScenes, so we may later perform translations on the components.
Next, we will construct the mesh components; add the following code to the loop:

    // Initialize Mesh
    FString MeshName = "Mesh" + FString::FromInt(i);
    UStaticMeshComponent* thisMesh = CreateDefaultSubobject<UStaticMeshComponent>(FName(*MeshName));
    check(thisMesh);

    thisMesh->AttachTo(FloorMeshScenes[i]);
    thisMesh->SetRelativeLocation(FVector(0.0f, 0.0f, 0.0f));
    thisMesh->SetCollisionProfileName(TEXT("OverlapAllDynamic"));

    if (myMaterial.Succeeded())
    {
        thisMesh->SetStaticMesh(myMesh.Object);
        thisMesh->SetMaterial(0, myMaterial.Object);
    }

    FloorMeshes.Add(thisMesh);
} // <-- Closing for(int i = 0; i < NumRepeatingMesh; ++i)

As you can see, we have performed a similar name creation process for the UStaticMeshComponents as we did for the USceneComponents. The preceding construction process was quite simple. We attach the mesh to the scene component so that the mesh will follow any translation that we apply to the parent USceneComponent. We then ensure that the mesh's origin will be centered around the USceneComponent by setting its relative location to (0.0f, 0.0f, 0.0f). We then ensure that the meshes do not collide with anything in the game world; we do so with the SetCollisionProfileName() function. If you remember, when we used this function earlier, we provided a profile name whose collision properties we wished the object to use. In our case, we wish this mesh to overlap all dynamic objects, thus we pass OverlapAllDynamic. Without this line of code, the character may collide with the moving floor meshes, and this would drag the player along at the same speed, breaking the illusion of motion we are trying to create. Lastly, we assign the static mesh object and material we obtained earlier with the FObjectFinders. We ensure that we add this new mesh object to the FloorMeshes array in case we need it later. We also close the loop scope that we created earlier.
The next thing we are going to do is create the collision box that will be used for character collisions. With the box set to collide with everything and the meshes set to overlap everything, we will be able to collide with the stationary box while the meshes whip past under our feet. The following code will create the box collider:

collisionBox = CreateDefaultSubobject<UBoxComponent>(TEXT("CollisionBox"));
check(collisionBox);

collisionBox->AttachTo(RootComponent);
collisionBox->SetBoxExtent(FVector(SpawnPoint, myBounds.BoxExtent.Y, myBounds.BoxExtent.Z));
collisionBox->SetCollisionProfileName(TEXT("BlockAllDynamic"));

} // <-- Closing if(myMesh.Succeeded())

As you can see, we initialize the UBoxComponent as we always initialize components. We then attach the box to the root component, as we do not wish it to move. We also set the box extent to match the length of the entire swap chain by passing the SpawnPoint value as the X extent of the collider. We set the collision profile to BlockAllDynamic. This means it will block any dynamic actor, such as our character. Note that we have also closed the scope of the if statement opened earlier. With the constructor definition finished, we might as well define the accessor methods for SpawnPoint and KillPoint before we move on to the Tick() function:

float AFloor::GetKillPoint()
{
    return KillPoint;
}

float AFloor::GetSpawnPoint()
{
    return SpawnPoint;
}

AFloor::Tick()

Now, it is time to write the function that will move the meshes and ensure that they move back to the start of the chain when they reach KillPoint. Add the following code to the Tick() function found in Floor.cpp:

for (auto Scene : FloorMeshScenes)
{
    Scene->AddLocalOffset(FVector(GetCustomGameMode<ABountyDashGameMode>(GetWorld())->GetInvGameSpeed(), 0.0f, 0.0f));

    if (Scene->GetComponentTransform().GetLocation().X <= KillPoint)
    {
        Scene->SetRelativeLocation(FVector(SpawnPoint, 0.0f, 0.0f));
    }
}

Here we use a C++11 range-based for loop.
Meaning that for each element inside FloorMeshScenes, it will populate the Scene handle of type auto with a pointer to whatever type is contained by FloorMeshScenes; in this case, it is USceneComponent*. For every scene component contained within FloorMeshScenes, we add a local offset each frame. The amount we offset each frame is dependent on the current game speed. We are getting the game speed from the game mode via the template function that we wrote earlier. As you can see, we have specified the template function to be of the ABountyDashGameMode type, thus we will have access to the BountyDash game mode functionality. We have done this so that the floor will move faster under the player's feet as the speed of the game increases.

The next thing we do is check the X value of the scene component's location. If this value is less than or equal to the value stored in KillPoint, we reposition the scene component back to the spawn point. As we attached the meshes to the USceneComponents earlier, the meshes will also translate with the scene components. Lastly, ensure that you have added #include "BountyDashGameMode.h" to the include list of Floor.cpp.

Placing the Floor in the level!

We are done making the floor! Compile the code and return to the level editor. We can now place this new floor object in the level! Delete the static mesh that would have replaced our earlier box brushes, and drag and drop the Floor object into the scene. The floor object can be found under the C++ Classes folder of the content browser. Select the floor in the level, and ensure that its location is set to (0.0f, 0.0f, -100.0f). This will place the floor just below the player's feet around the origin. Also, ensure that the ATargetPoints that we placed earlier are in the right positions above the lanes. With all this in place, you should be able to press Play and observe the floor moving underneath the player indefinitely.
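Before moving on, the swap-chain update performed in Tick() can be sketched in plain C++ outside of Unreal. The function below is a hypothetical stand-in: it treats each scene component as just an X coordinate and uses a positive speed that is subtracted each step (in the actual game, GetInvGameSpeed() is assumed to supply a negative offset to AddLocalOffset directly):

```cpp
#include <cassert>
#include <vector>

// Each update moves every "scene" toward the kill point, and any scene
// that reaches it snaps back to the spawn point, keeping the chain endless.
void tickFloor(std::vector<float>& sceneX, float speed,
               float killPoint, float spawnPoint) {
    for (float& x : sceneX) {
        x -= speed;             // mirrors AddLocalOffset along X
        if (x <= killPoint) {
            x = spawnPoint;     // mirrors SetRelativeLocation(SpawnPoint, 0, 0)
        }
    }
}
```

Starting from positions {-400, 0, 400} with a kill point of -600 and a spawn point of 600, a large step of 250 wraps the rearmost piece to 600 while the others slide back, which is exactly the treadmill effect you see under the player's feet.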
You will see something similar to this:

You will notice that as you move between the lanes by pressing A and D, the player maintains the X position of the target points but nicely travels to the center of each lane.

Creating the obstacles

The next step for this project is to create the obstacles that will come flying at the player. These obstacles are going to be very simple and contain only a few members and functions. These obstacles only serve as a blockade for the player, and all of the collision with the obstacles will be handled by the player itself. Use the class wizard to create a new class named Obstacle, and inherit this object from AActor. Once the class has been generated, modify the class definition found in Obstacle.h so that it appears as follows:

UCLASS(BlueprintType)
class BOUNTYDASH_API AObstacle : public AActor
{
    GENERATED_BODY()

    float KillPoint;

public:
    // Sets default values for this actor's properties
    AObstacle();

    // Called when the game starts or when spawned
    virtual void BeginPlay() override;

    // Called every frame
    virtual void Tick(float DeltaSeconds) override;

    void SetKillPoint(float point);
    float GetKillPoint();

protected:
    UFUNCTION()
    virtual void MyOnActorOverlap(AActor* otherActor);

    UFUNCTION()
    virtual void MyOnActorEndOverlap(AActor* otherActor);

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    USphereComponent* Collider;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    UStaticMeshComponent* Mesh;
};

You will notice that the class has been declared with the BlueprintType specifier! This object is simple enough to justify extension into blueprint, as there is no new learning to be found within this simple object, and we can use blueprint for convenience. For this class, we have added a private member called KillPoint that will be used to determine when an AObstacle should destroy itself.
We have also added the accessor methods for this private member. You will notice that we have added the MyOnActorOverlap and MyOnActorEndOverlap functions that we will bind to the appropriate delegates, so we can provide a custom collision response for this object. The definitions of these functions are not too complicated either. Ensure that you have included #include "BountyDashGameMode.h" in Obstacle.cpp. Then, we can begin filling out our function definitions; the following is the code we will use for the constructor:

AObstacle::AObstacle()
{
    PrimaryActorTick.bCanEverTick = true;

    Collider = CreateDefaultSubobject<USphereComponent>(TEXT("Collider"));
    check(Collider);

    RootComponent = Collider;
    Collider->SetCollisionProfileName("OverlapAllDynamic");

    Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
    check(Mesh);
    Mesh->AttachTo(Collider);
    Mesh->SetCollisionResponseToAllChannels(ECR_Ignore);
    KillPoint = -20000.0f;

    OnActorBeginOverlap.AddDynamic(this, &AObstacle::MyOnActorOverlap);
    OnActorEndOverlap.AddDynamic(this, &AObstacle::MyOnActorEndOverlap);
}

The only thing of note within this constructor is that, again, we set the mesh of this object to ignore the collision response to all channels; this means that the mesh will not affect collision in any way. We have also initialized KillPoint with a default value of -20000.0f. Following that, we bind the custom MyOnActorOverlap and MyOnActorEndOverlap functions to the appropriate delegates; note that the end-overlap handler is bound to OnActorEndOverlap, not OnActorBeginOverlap. The Tick() function of this object is responsible for translating the obstacle during play.
Add the following code to the Tick function of AObstacle:

void AObstacle::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);
    float gameSpeed = GetCustomGameMode<ABountyDashGameMode>(GetWorld())->GetInvGameSpeed();

    AddActorLocalOffset(FVector(gameSpeed, 0.0f, 0.0f));

    if (GetActorLocation().X < KillPoint)
    {
        Destroy();
    }
}

As you can see, the Tick function will add an offset to the AObstacle each frame along the X axis via the AddActorLocalOffset function. The value of the offset is determined by the game speed set in the game mode; again, we are using the template function that we created earlier to get the game mode and call GetInvGameSpeed(). The AObstacle is also responsible for its own destruction; upon passing the bound defined by KillPoint, the AObstacle will destroy itself. The last thing we need to add is the function definitions for the overlap functions and the KillPoint accessors:

void AObstacle::MyOnActorOverlap(AActor* otherActor)
{
}

void AObstacle::MyOnActorEndOverlap(AActor* otherActor)
{
}

void AObstacle::SetKillPoint(float point)
{
    KillPoint = point;
}

float AObstacle::GetKillPoint()
{
    return KillPoint;
}

Now, let's abstract this class into blueprint. Compile the code and go back to the game editor. Within the content folder, create a new blueprint object that inherits from the Obstacle class that we just made, and name it RockObstacleBP. Within this blueprint, we need to make some adjustments. Select the Collider component that we created, and expand the Shape section in the Details panel. Change the Sphere Radius property to 100.0f. Next, select the Mesh component and expand the Static Mesh section. From the provided drop-down menu, choose the SM_Rock mesh.
Next, expand the Transform section of the Mesh component's Details panel and match these values:

You should end up with an object that looks similar to this:

Spawning Actors from C++

Despite the obstacles being fairly easy to implement from a C++ standpoint, the complication usually comes from the spawning system that we will be using to create these objects in the game. We will leverage a system similar to the player's movement by basing the spawn locations off of the ATargetPoints that are already in the scene. We can then randomly select a spawn target whenever we require a new object to spawn. Open the class wizard now, and create a class that inherits from Actor and call it ObstacleSpawner. We inherit from AActor because, even though this object does not have a physical presence in the scene, we still require the ObstacleSpawner to tick.

The first issue we are going to encounter is that our current target points give us a good indication of the Y position for our spawns, but the X position is centered around the origin. This is undesirable for the obstacle spawn point, as we would like to spawn these objects a fair distance away from the player, so we can do two things. One, obscure the popping of the spawned objects via fog, and two, present the player with enough obstacle information so that they may dodge them at high speeds. This means we are going to require some information from our floor object; we can use the KillPoint and SpawnPoint members of the floor to determine the spawn and kill locations of the obstacles.

Obstacle Spawner class definition

This will be another fairly simple object. It will require a BeginPlay function, so we may find the floor and all of the target points that we require for spawning. We also require a Tick function so that we may process the spawning logic on a per-frame basis. Thankfully, both of these are provided by default by the class wizard.
We have created a protected SpawnObstacle() function, so we may group that functionality together. We are also going to require a few UPROPERTY-declared members that can be edited from the level editor. We need a list of obstacle types to spawn; we can then randomly select one of the types each time we spawn an obstacle. We also require the spawn targets (though, we will be populating these upon beginning play). Finally, we will need a spawn time that we can set for the interval between obstacles spawning. To accommodate all of this, navigate to ObstacleSpawner.h now and modify the class definition to match the following:

UCLASS()
class BOUNTYDASH_API AObstacleSpawner : public AActor
{
    GENERATED_BODY()

public:
    // Sets default values for this actor's properties
    AObstacleSpawner();

    // Called when the game starts or when spawned
    virtual void BeginPlay() override;

    // Called every frame
    virtual void Tick(float DeltaSeconds) override;

protected:
    void SpawnObstacle();

public:
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    TArray<TSubclassOf<class AObstacle>> ObstaclesToSpawn;

    UPROPERTY()
    TArray<class ATargetPoint*> SpawnTargets;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float SpawnTimer;

    UPROPERTY()
    USceneComponent* Scene;

private:
    float KillPoint;
    float SpawnPoint;
    float TimeSinceLastSpawn;
};

I have again used TArrays for our containers of obstacle objects and spawn targets. As you can see, the obstacle list is of the type TArray<TSubclassOf<class AObstacle>>. This means that the objects in the TArray will be class types that inherit from AObstacle. This is very useful, as not only will we be able to use these array elements for spawn information, but the engine will also filter our search when we add object types to this array from the editor. With these class types, we will be able to spawn objects that inherit from AObstacle (including blueprints) when required.
We have also included a scene component, so that we can arbitrarily place the AObstacleSpawner somewhere in the level, and two private members that will hold the kill and spawn points of the objects. The last element is a float timer that will be used to gauge how much time has passed since the last obstacle spawn.

Obstacle Spawner function definitions

Okay, now we can create the body of the AObstacleSpawner object. Before we do so, ensure that the include list in ObstacleSpawner.cpp appears as follows:

#include "BountyDash.h"
#include "BountyDashGameMode.h"
#include "Engine/TargetPoint.h"
#include "Floor.h"
#include "Obstacle.h"
#include "ObstacleSpawner.h"

Following this, we have a very simple constructor that establishes the root scene component:

// Sets default values
AObstacleSpawner::AObstacleSpawner()
{
    // Set this actor to call Tick() every frame. You can turn this off to improve performance if you don't need it.
    PrimaryActorTick.bCanEverTick = true;

    Scene = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
    check(Scene);
    RootComponent = Scene;

    SpawnTimer = 1.5f;
}

Following the constructor, we have BeginPlay(). Inside this function, we are going to do a few things. Firstly, we perform the same in-level object retrieval that we executed in ABountyDashCharacter to get the locations of the ATargetPoints. However, this object also requires information from the floor object in the level. We are going to get the floor object the same way we did the ATargetPoints, by utilizing TActorIterators. We will then get the required kill and spawn point information.
We will also set TimeSinceLastSpawn to SpawnTimer so that we begin spawning objects instantaneously:

// Called when the game starts or when spawned
void AObstacleSpawner::BeginPlay()
{
    Super::BeginPlay();

    for (TActorIterator<ATargetPoint> TargetIter(GetWorld()); TargetIter; ++TargetIter)
    {
        SpawnTargets.Add(*TargetIter);
    }

    for (TActorIterator<AFloor> FloorIter(GetWorld()); FloorIter; ++FloorIter)
    {
        if (FloorIter->GetWorld() == GetWorld())
        {
            KillPoint = FloorIter->GetKillPoint();
            SpawnPoint = FloorIter->GetSpawnPoint();
        }
    }
    TimeSinceLastSpawn = SpawnTimer;
}

The next function we will understand in detail is Tick(), which is responsible for the bulk of the AObstacleSpawner functionality. Within this function, we need to check whether we require a new object to be spawned, based on the amount of time that has passed since we last spawned an object. Add the following code to AObstacleSpawner::Tick() underneath Super::Tick():

TimeSinceLastSpawn += DeltaTime;

float trueSpawnTime = SpawnTimer / (float)GetCustomGameMode<ABountyDashGameMode>(GetWorld())->GetGameLevel();

if (TimeSinceLastSpawn > trueSpawnTime)
{
    TimeSinceLastSpawn = 0.0f;
    SpawnObstacle();
}

Here, we accumulate the delta time in TimeSinceLastSpawn, so we may gauge how much real time has passed since the last obstacle was spawned. We then calculate the trueSpawnTime of the AObstacleSpawner. This is the base SpawnTimer divided by the current game level, retrieved from the game mode via the GetCustomGameMode() template function. This means that as the game level increases and the obstacles begin to move faster, the obstacle spawner will also spawn objects at a faster rate. If the accumulated TimeSinceLastSpawn is greater than the calculated trueSpawnTime, we need to call SpawnObstacle() and reset the TimeSinceLastSpawn timer to 0.0f.
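The interval scaling above is easy to verify in isolation. The following plain C++ sketch mirrors the arithmetic; the function names are illustrative, not part of the Unreal API:

```cpp
#include <cassert>

// The effective interval shrinks as the game level rises, so obstacles
// appear faster at higher levels.
float trueSpawnTime(float baseSpawnTime, int gameLevel) {
    return baseSpawnTime / static_cast<float>(gameLevel);
}

// Mirrors the Tick() check: spawn once the accumulated time exceeds
// the scaled interval.
bool shouldSpawn(float timeSinceLastSpawn, float baseSpawnTime, int gameLevel) {
    return timeSinceLastSpawn > trueSpawnTime(baseSpawnTime, gameLevel);
}
```

With the default SpawnTimer of 1.5 seconds, game level 1 spawns at most every 1.5 seconds, while game level 3 brings the interval down to 0.5 seconds.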
Getting information from components in C++

Now, we need to write the spawn function. This spawn function is going to have to retrieve some information from the components of the object that is being spawned. As we have allowed our AObstacle class to be extended into blueprint, we have also exposed the object to a level of versatility that we must compensate for in the codebase. With the ability to customize the mesh and the bounds of the sphere collider that makes up any given obstacle, we must be sure to spawn the obstacle in the right place regardless of its size! To do this, we are going to need to obtain information from the components contained within the spawned AObstacle class. This can be done via GetComponentByClass(). It will take the UClass* of the component you wish to retrieve, and it will return a handle to the component if it has been found. We can then cast this handle to the appropriate type and retrieve the information that we require! Let's begin detailing the spawn function; add the following code to ObstacleSpawner.cpp:

void AObstacleSpawner::SpawnObstacle()
{
    if (SpawnTargets.Num() > 0 && ObstaclesToSpawn.Num() > 0)
    {
        short Spawner = FMath::Rand() % SpawnTargets.Num();
        short Obstical = FMath::Rand() % ObstaclesToSpawn.Num();
        float CapsuleOffset = 0.0f;

Here, we ensure that both of the arrays have been populated with at least one valid member. We then generate the random lookup integers that we will use to access the SpawnTargets and ObstaclesToSpawn arrays. This means that every time we spawn an object, both the lane spawned in and the type of the object will be randomized. We do this by generating a random value with FMath::Rand(), and then we find the remainder of this number divided by the number of elements in the corresponding array. The result will be a random number between zero and the number of objects in either array minus one, which is perfect for our needs.
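The modulo trick is worth seeing on its own. Here is a minimal sketch with std::rand standing in for FMath::Rand(); the helper name is illustrative:

```cpp
#include <cassert>
#include <cstdlib>

// Taking the remainder of a random value divided by the array size
// always yields a valid index in the range [0, arraySize - 1].
int randomIndex(int arraySize) {
    return std::rand() % arraySize;
}
```

No matter what value the generator produces, the remainder can never reach arraySize, so the lookup is always in bounds as long as the array is non-empty; that is exactly why SpawnObstacle() guards with Num() > 0 first.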
Continue by adding the following code:

        FActorSpawnParameters SpawnInfo;

        FTransform myTrans = SpawnTargets[Spawner]->GetTransform();
        myTrans.SetLocation(FVector(SpawnPoint, myTrans.GetLocation().Y, myTrans.GetLocation().Z));

Here, we are using a struct called FActorSpawnParameters. The default values of this struct are fine for our purposes. We will soon be passing this struct to a function in our world context. After this, we create a transform that we will be providing to the world context as well. The transform of the spawner will suffice apart from the X component of the location. We need to adjust this so that the X value of the spawn transform matches the spawn point that we retrieved from the floor. We do this by setting the X component of the spawn transform's location to the SpawnPoint value that we received earlier, and we make sure that the other components of the location vector remain the Y and Z components of the current location.

The next thing we must do is actually spawn the object! We are going to utilize a template function called SpawnActor() that can be called from the UWorld* handle returned by GetWorld(). This function will spawn an object of a specified type in the game world at a specified location. The type of the object is determined by passing the UClass* handle that holds the object type we wish to spawn. The transform and spawn parameters of the object are also determined by the corresponding input parameters of SpawnActor(). The template type of the function will dictate the type of object that is spawned and the handle that is returned from the function. In our case, we require an AObstacle to be spawned. Add the following code to the SpawnObstacle function:

        AObstacle* newObs = GetWorld()->SpawnActor<AObstacle>(ObstaclesToSpawn[Obstical], myTrans, SpawnInfo);

        if (newObs)
        {
            newObs->SetKillPoint(KillPoint);

As you can see, we are using SpawnActor() with a template type of AObstacle.
We use the random lookup integer that we generated before to retrieve the class type from the ObstaclesToSpawn array. We also provide the transform and spawn parameters we created earlier to SpawnActor(). If the new AObstacle was created successfully, we save the return of this function into an AObstacle handle that we use to set the kill point of the obstacle via SetKillPoint().

We must now adjust the height of this object. In its current state, the object will more than likely spawn in the ground. We need to get access to the sphere component of the obstacle so that we may get its radius and adjust the position of the obstacle so that it sits above the ground. We can use the sphere as a reliable reference, as it is the root component of the obstacle; thus, we can move the obstacle entirely out of the ground if we assume the base of the sphere will line up with the base of the mesh. Add the following code to the SpawnObstacle() function:

            USphereComponent* obsSphere = Cast<USphereComponent>(newObs->GetComponentByClass(USphereComponent::StaticClass()));

            if (obsSphere)
            {
                newObs->AddActorLocalOffset(FVector(0.0f, 0.0f, obsSphere->GetUnscaledSphereRadius()));
            }
        } // <-- Closing if(newObs)
    } // <-- Closing if(SpawnTargets.Num() > 0 && ObstaclesToSpawn.Num() > 0)
}

Here, we are getting the sphere component out of the newObs handle that we obtained from SpawnActor() via GetComponentByClass(), which was mentioned previously. We pass the class type of USphereComponent, via the static function called StaticClass(), to the function. This will return a valid handle if newObs does indeed contain a USphereComponent (which we know it does). We then cast the result of this function to USphereComponent* and save it in the obsSphere handle. We ensure that this handle is valid; if it is, we then offset the actor that we just spawned on the Z axis by the unscaled radius of the sphere component. This will result in all of the spawned obstacles sitting in line with the top of the floor!
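The height fix reduces to one line of arithmetic, sketched here in plain C++. The function name and the assumption that the floor surface sits at a known Z are illustrative, not part of the book's code:

```cpp
#include <cassert>

// If the sphere collider is the root component and its center would
// otherwise land on the floor surface, raising the actor by one sphere
// radius rests the sphere's base on the floor.
float groundedZ(float floorSurfaceZ, float sphereRadius) {
    return floorSurfaceZ + sphereRadius;   // mirrors AddActorLocalOffset on Z
}
```

This is why the offset uses the unscaled sphere radius: a blueprint that enlarges the collider (such as RockObstacleBP's 100.0f radius) is lifted further, so obstacles of any size line up with the top of the floor.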
Ensuring the obstacle spawner works

Okay, now is the time to bring the obstacle spawner into the scene. Be sure to compile the code, and then navigate to the C++ Classes folder of the content browser. From here, drag and drop the ObstacleSpawner into the scene. Select the new ObstacleSpawner via the World Outliner, and address the Details panel. You will see the exposed members under the Obstacle Spawner section, like so:

Now, to add the RockObstacleBP that we made earlier to the ObstaclesToSpawn array, press the small white plus next to the property in the Details panel; this will add an element to the TArray that you will then be able to customize. Select the drop-down menu that currently says None. Within this menu, search for RockObstacleBP and select it. If you wish to create and add more obstacle types to this array, feel free to do so. We do not need to add any members to the Spawn Targets property, as this will happen automatically. Now, press Play and behold a legion of moving rocks.

Summary

This article gave an overview of various development tricks associated with Unreal Engine 4.

Resources for Article:

Further resources on this subject:

Special Effects [article]
Bang Bang – Let's Make It Explode [article]
The Game World [article]

First Person Shooter Part 1 – Creating Exterior Environments

Packt
19 May 2016
13 min read
In this article by John P. Doran, the author of the book Unity 5.x Game Development Blueprints, we will be creating a first-person shooter; however, instead of shooting a gun to damage our enemies, we will be shooting a picture in a survival horror environment, similar to the Fatal Frame series of games and the recent indie title DreadOut. To get started on our project, we're first going to look at creating our level or, in this case, our environments starting with the exterior. In the game industry, there are two main roles in level creation: an environment artist and a level designer. An environment artist is a person who builds the assets that go into the environment. He/she uses tools such as 3Ds Max or Maya to create the model and then uses other tools such as Photoshop to create textures and normal maps. The level designer is responsible for taking the assets that the environment artist created and assembling them in an environment for players to enjoy. He/she designs the gameplay elements, creates the scripted events, and tests the gameplay. Typically, a level designer will create environments through a combination of scripting and using a tool that may or may not be in development as the game is being made. In our case, that tool is Unity. One important thing to note is that most companies have their own definition for different roles. In some companies, a level designer may need to create assets and an environment artist may need to create a level layout. There are also some places that hire someone to just do lighting or just to place meshes (called a mesher) because they're so good at it. (For more resources related to this topic, see here.) Project overview In this article, we take on the role of an environment artist who has been tasked to create an outdoor environment. We will use assets that I've placed in the example code as well as assets already provided to us by Unity for mesh placement. 
In addition, you will also learn some beginner level design.

Your objectives

This project will be split into a number of tasks. It will be a simple step-by-step process from beginning to end. Here is the outline of our tasks:

Creating the exterior environment—terrain
Beautifying the environment—adding water, trees, and grass
Building the atmosphere
Designing the level layout and background

Project setup

At this point, I assume that you have a fresh installation of Unity and have started it. You can perform the following steps:

With Unity started, navigate to File | New Project.
Select a project location of your choice somewhere on your hard drive and ensure that you have Setup defaults for set to 3D. Then, put in a Project name (I used First Person Shooter). Once completed, click on Create project.
Here, if you see the Welcome to Unity popup, feel free to close it, as we won't be using it.

Level design 101 – planning

Even though we are going to be diving straight into Unity, I feel that it's important to talk a little more about how level design is done in the game industry. Although you may think a level designer will just jump into the editor and start building, the truth is that you normally need to do a ton of planning ahead of time before you even open up your tool. In general, a level begins with an idea. This can come from anything; maybe you saw a really cool building, or a photo on the Internet gave you a certain feeling; maybe you want to teach the player a new mechanic. Turning this idea into a level is what a level designer does. Taking all of these ideas, the level designer will create a level design document, which will outline exactly what you're trying to achieve with the entire level from start to end.
A level design document will describe everything inside the level; listing all of the possible encounters, puzzles, so on and so forth, which the player will need to complete as well as any side quests that the player will be able to achieve. To prepare for this, you should include as many references as you can with maps, images, and movies similar to what you're trying to achieve. If you're working with a team, making this document available on a website or wiki will be a great asset so that you know exactly what is being done in the level, what the team can use in their levels, and how difficult their encounters can be. In general, you'll also want a top-down layout of your level done either on a computer or with a graph paper, with a line showing a player's general route for the level with encounters and missions planned out. Of course, you don't want to be too tied down to your design document and it will change as you playtest and work on the level, but the documentation process will help solidify your ideas and give you a firm basis to work from. For those of you interested in seeing some level design documents, feel free to check out Adam Reynolds (Level Designer on Homefront and Call of Duty: World at War) at http://wiki.modsrepository.com/index.php?title=Level_Design:_Level_Design_Document_Example. If you want to learn more about level design, I'm a big fan of Beginning Game Level Design, John Feil (previously my teacher) and Marc Scattergood, Cengage Learning PTR. For more of an introduction to all of game design from scratch, check out Level Up!: The Guide to Great Video Game Design, Scott Rogers, Wiley and The Art of Game Design, Jesse Schell, CRC Press. 
For some online resources, Scott has a neat GDC talk named Everything I Learned About Level Design I Learned from Disneyland, which can be found at http://mrbossdesign.blogspot.com/2009/03/everything-i-learned-about-game-design.html, and World of Level Design (http://worldofleveldesign.com/) is a good source for learning about level design, though it does not talk about Unity specifically.

Introduction to terrain

Terrain is basically used for non-manmade ground: things such as hills, deserts, and mountains. Unity's way of dealing with terrain is different than what most engines use, in that there are two ways to make terrains: one uses a height map, and the other is sculpting from scratch.

Height maps

Height maps are a common way for game engines to support terrains. Rather than creating tools to build a terrain within the level, they use a piece of graphics software to create an image, and then that image is translated into a terrain, with its grayscale values mapped to different height levels (hence the name height map). The lighter in color an area is, the higher its height, so in this instance, black represents the terrain's lowest areas, whereas white represents the highest. The terrain's Terrain Height property sets how high white actually is compared with black.

In order to apply a height map to a terrain object, inside the object's Terrain component, click on the Settings button and scroll down to Import Raw….

For more information on Unity's height tools, check out http://docs.unity3d.com/Manual/terrain-Height.html. If you want to learn more about creating your own height maps using Photoshop, the following tutorial is for UDK, but the Photoshop work is the same: http://worldofleveldesign.com/categories/udk/udk-landscape-heightmaps-photoshop-clouds-filter.php. Others also use software such as Terragen to create height maps; more information on that is at http://planetside.co.uk/products/terragen3.
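To make the grayscale-to-height mapping concrete, here is a small sketch in Python. This is not Unity code; the nested lists stand in for an image's pixel rows, and `terrain_height` stands in for the Terrain Height property described above:

```python
def heightmap_to_heights(pixels, terrain_height):
    """Map 8-bit grayscale values (0 = black, 255 = white) to world-space
    heights: black becomes the terrain's lowest point, white its highest."""
    return [[value / 255.0 * terrain_height for value in row]
            for row in pixels]

# A tiny 1x3 "height map": black, mid-grey, white
pixels = [[0, 128, 255]]
heights = heightmap_to_heights(pixels, terrain_height=100.0)
print(heights[0])  # [0.0, 50.196..., 100.0]
```

The Terrain Height setting acts purely as the scale factor here: doubling it doubles every height in the imported terrain.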
Exterior environment – terrain

When creating exterior environments, we cannot use straight floors for the most part, unless we're creating a highly urbanized area. Our game takes place in a haunted house in the middle of nowhere, so we're going to create a natural landscape. In Unity, the best tool to use to create a natural landscape is the Terrain tool. Unity's terrain system lets us add landscapes, complete with bushes, trees, and fading materials, to our game. To show how easy it is to use the terrain tool, let's get started.

The first thing that we're going to do is actually create the terrain we'll be placing for the world. Let's first create a Terrain by selecting GameObject | 3D Object | Terrain. At this point, you should see the terrain on the screen. If for some reason you have problems seeing the terrain object, go to the Hierarchy tab and double-click on the Terrain object to focus your camera on it, and move in as needed. Right now, it's just a flat plane, but we'll be doing a lot to it to make it shine.

If you look to the right with the Terrain object selected, you'll see the terrain editing tools, which do the following (from left to right):

- Raise/Lower Height: This will allow us to raise or lower the height of our terrain in a certain radius to create hills, rivers, and more.
- Paint Height: If you already know exactly the height that a part of your terrain needs to be, this tool will allow you to paint a spot to that height.
- Smooth Height: This averages out the area that it is in, attempts to smooth out areas, and reduces the appearance of abrupt changes.
- Paint Texture: This allows us to add textures to the surface of our terrain. One of the nice features of this is the ability to lay multiple textures on top of each other.
- Place Trees: This allows us to paint objects in our environment that will appear on the surface. Unity attempts to optimize these objects by billboarding distant trees so we can have dense forests without a horrible frame rate.
By billboarding, I mean that the object is simplified to a flat image whose direction changes constantly as the object and camera move, so it always faces the camera.

- Paint Details: In addition to trees, you can also have small things like rocks or grass covering the surface of your environment, using 2D images to represent individual clumps, with bits of randomization to make them appear more natural.
- Terrain Settings: Settings that affect the overall properties of the particular terrain; options such as the size of the terrain and wind can be found here.

By default, the entire terrain is set to be at the bottom, but we want to have ground above us and below us so we can add in things like lakes. With the Terrain object selected, click on the second button from the left on the Terrain component (Paint Height mode). From there, set the Height value under Settings to 100 and then press the Flatten button. At this point, you should see the plane move up; the terrain now sits above its minimum height, so we can dig down into it later.

Next, we are going to create some interesting shapes in our world with some hills by "painting" on the surface. With the Terrain object selected, click on the first button on the left of our Terrain component (Raise/Lower Terrain mode). Once this is done, you should see a number of different brushes and shapes that you can select from. We are using terrain to create hills in the background of our scene, so that the world does not seem completely flat.

Under Settings, change the Brush Size and Opacity of your brush to 100, and left-click around the edges of the world to create some hills. You can increase the height of the current hills if you click on top of a previous hill.

When creating hills, it's a good idea to look at them from multiple angles while you're building them, so you can make sure that none are too high or too low. In general, you want to have taller hills as you go further back, or else you cannot see the smaller ones they block.
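Conceptually, each click of the Raise/Lower brush adds height inside the brush radius, with less effect toward the edge. The following Python sketch is only an approximation of the idea (Unity's actual brushes use configurable falloff curves and the brush images shown in the Inspector):

```python
import math

def apply_raise_brush(heights, cx, cz, brush_radius, strength):
    """Raise a square grid of heights around (cx, cz), with a linear
    falloff from full strength at the center to zero at the brush edge."""
    for z, row in enumerate(heights):
        for x in range(len(row)):
            dist = math.hypot(x - cx, z - cz)
            if dist < brush_radius:
                row[x] += strength * (1.0 - dist / brush_radius)
    return heights

grid = [[0.0] * 5 for _ in range(5)]
apply_raise_brush(grid, 2, 2, 2.0, 10.0)
print(grid[2])  # [0.0, 5.0, 10.0, 5.0, 0.0]
```

Holding Shift while painting, as we will do for the moat, corresponds to running the same operation with a negative strength.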
In the Scene view, to move your camera around, you can use the toolbar in the top-right corner, or hold down the right mouse button and drag in the direction you want the camera to look, pressing the W, A, S, and D keys to move. In addition, you can hold down the middle mouse button and drag to pan the camera around, and scroll the mouse wheel to zoom in and out from where the camera is.

Even though you should plan out the level ahead of time on something like graph paper to plan out encounters, you will want to avoid building the level entirely from a top-down view, as the player will (most likely) never actually see the game from a bird's eye view. Referencing the map from the same perspective as your character will help ensure that the map looks great. To see many different angles at one time, you can use a layout with multiple views of the scene, such as the 4 Split.

Once we have our land done, we want to create some holes in the ground, which we will fill with water later. This will provide a natural barrier that players will know they cannot pass. We will create a moat by first changing the Brush Size value to 50 and then holding down the Shift key and left-clicking around the middle of our terrain. In this case, it's okay to use the Top view; remember that this will eventually be water that fills in lakes, rivers, and so on, as shown in the following screenshot:

To make this easier to see, you can click on the sun-looking light icon in the Scene tab to disable lighting for the time being.

At this point, we have done what is referred to in the industry as "grayboxing": making the level in the engine in the simplest way possible, without artwork (also known as "whiteboxing" or "orangeboxing" depending on the company you're working for).
At this point in a traditional studio, you'd spend time playtesting the level and iterating on it before an artist (or you) takes the time to make it look great. However, for our purposes, we want to create a finished project as soon as possible. When making your own games, be sure to play your level, and have others play it, before you polish it.

For more information on grayboxing, check out http://www.worldofleveldesign.com/categories/level_design_tutorials/art_of_blocking_in_your_map.php. For an example with images of a graybox through to the final level, PC Gamer has a nice article available at http://www.pcgamer.com/2014/03/18/building-crown-part-two-layout-design-textures-and-the-hammer-editor/.

Summary

With this, we now have a great-looking exterior level for our game! In addition, we covered a lot of features that exist in Unity for you to use in your own future projects.

Resources for Article:

Further resources on this subject:
- Learning NGUI for Unity [article]
- Components in Unity [article]
- Saying Hello to Unity and Android [article]

Packt
10 Mar 2016
7 min read

Creating a Coin Material

In this article by Alan Thorn, the author of Unity 5.x By Example, the coin object, as a concept, represents a basic or fundamental unit in our game logic because the player character should be actively searching the level looking for coins to collect before a timer runs out. This means that the coin is more than mere appearance; its purpose in the game is not simply eye candy, but is functional. It makes an immense difference to the game outcome whether the coin is collected by the player or not. Therefore, the coin object, as it stands, is lacking in two important respects. Firstly, it looks dull and grey; it doesn't really stand out and grab the player's attention. Secondly, the coin cannot actually be collected yet. Certainly, the player can walk into the coin, but nothing appropriate happens in response.

Figure 2.1: The coin object so far

The completed CollectionGame project, as discussed in this article and the next, can be found in the book companion files in the Chapter02/CollectionGame folder.

(For more resources related to this topic, see here.)

In this section, we'll focus on improving the coin appearance using a material. A material defines an algorithm (or instruction set) specifying how the coin should be rendered. A material doesn't just say what the coin should look like in terms of color; it defines how shiny or smooth a surface is, as opposed to rough and diffuse. This is important to recognize, and it is why a texture and a material refer to different things. A texture is simply an image file loaded in memory, which can be wrapped around a 3D object via its UV mapping. In contrast, a material defines how one or more textures can be combined together and applied to an object to shape its appearance.

To create a new material asset in Unity, right-click on an empty area in the Project panel, and from the context menu, choose Create | Material. See Figure 2.2. You can also choose Assets | Create | Material from the application menu.
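The texture-versus-material distinction can be made concrete with a small sketch. The following is plain Python, not Unity's API, with nested lists standing in for image data: a texture is just pixels you can sample by UV, while a material is a recipe that combines sampled texels with other parameters:

```python
def sample(texture, u, v):
    """A texture is just an image in memory: look up a texel by UV coordinates."""
    height, width = len(texture), len(texture[0])
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

def apply_material(texture, tint, u, v):
    """A material combines one or more textures with parameters; here, a
    single sampled texel is modulated by a color tint, channel by channel."""
    texel = sample(texture, u, v)
    return tuple(t * c for t, c in zip(texel, tint))

white = (1.0, 1.0, 1.0)
texture = [[white, white], [white, white]]
gold_tint = (1.0, 0.77, 0.34)
print(apply_material(texture, gold_tint, 0.0, 0.0))  # (1.0, 0.77, 0.34)
```

A real material such as Unity's Standard shader combines several such inputs (albedo, normal, metallic, and so on) per pixel, but the separation of roles is the same: image data in, shaded color out.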
Figure 2.2: Creating a material

A material is sometimes called a shader. If needed, you can create custom materials using a shader language, or you can use a Unity add-on, such as Shader Forge.

After creating a new material, assign it an appropriate name from the Project panel. As I'm aiming for a gold look, I'll name the material mat_GoldCoin. Prefixing the asset name with mat helps me know, just from the asset name, that it's a material asset. Simply type a new name in the text edit field to name the material. You can also click on the material name twice to edit the name at any time later. See Figure 2.3:

Figure 2.3: Naming a material asset

Next, select the material asset in the Project panel, if it's not already selected, and its properties display immediately in the Object Inspector. There are lots of properties listed! In addition, a material preview displays at the bottom of the Object Inspector, showing you how the material would look, based on its current settings, if it were applied to a 3D object, such as a sphere. As you change material settings from the Inspector, the preview panel updates automatically to reflect your changes, offering instant feedback on how the material would look. See the following screenshot:

Figure 2.4: Material properties are changed from the Object Inspector

Let's now create a gold material for the coin. When creating any material, the first setting to choose is the Shader type, because this setting affects all other parameters available to you. The Shader type determines which algorithm will be used to shade your object. There are many different choices, but most material types can be approximated using either Standard or Standard (Specular setup). For the gold coin, we can leave the Shader as Standard. See the following screenshot:

Figure 2.5: Setting the material Shader type

Right now, the preview panel displays the material as a dull grey, which is far from what we need. To define a gold color, we must specify the Albedo.
To do this, click on the Albedo color slot to display a color picker, and from the color picker dialog, select a gold color. The material preview updates in response to reflect the changes. Refer to the following screenshot:

Figure 2.6: Selecting a gold color for the Albedo channel

The coin material is looking better than it did, but it's still supposed to represent a metallic surface, which tends to be shiny and reflective. To add this quality to our material, click and drag the Metallic slider in the Object Inspector to the right-hand side, setting its value to 1. This indicates that the material represents a fully metal surface, as opposed to a diffuse surface such as cloth or hair. Again, the preview panel will update to reflect the change. See Figure 2.7:

Figure 2.7: Creating a metallic material

We now have a gold material created, and it's looking good in the preview panel. If needed, you can change the kind of object used for a preview. By default, Unity assigns the created material to a sphere, but other primitive objects are allowed, including cubes, cylinders, and toruses. This helps you preview materials under different conditions. You can change objects by clicking on the geometry button directly above the preview panel to cycle through them. See Figure 2.8:

Figure 2.8: Previewing a material on an object

When your material is ready, you can assign it directly to meshes in your scene just by dragging and dropping. Let's assign the coin material to the coin. Click and drag the material from the Project panel to the coin object in the scene. On dropping the material, the coin will change appearance. See Figure 2.9:

Figure 2.9: Assigning the material to the coin

You can confirm that material assignment occurred successfully, and even identify which material was assigned, by selecting the coin object in the scene and viewing its Mesh Renderer component in the Object Inspector.
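The effect of the Metallic slider can be given a rough numeric intuition. The following Python sketch is not Unity's shader code, just the standard physically based idea it approximates: as metallic approaches 1, the diffuse term vanishes and reflections take on the albedo color (0.04 is a conventional reflectance for non-metals):

```python
def metallic_workflow(albedo, metallic):
    """Split an albedo color into diffuse and specular tint the way a
    metallic workflow does: metals lose diffuse and tint their reflections."""
    diffuse = tuple(c * (1.0 - metallic) for c in albedo)
    specular = tuple(0.04 * (1.0 - metallic) + c * metallic for c in albedo)
    return diffuse, specular

gold = (1.0, 0.77, 0.34)
diffuse, specular = metallic_workflow(gold, 1.0)
print(diffuse)   # (0.0, 0.0, 0.0) -- a pure metal has no diffuse color
print(specular)  # (1.0, 0.77, 0.34) -- its reflections are gold-tinted
```

This is why a fully metallic material with a gold Albedo reads as gold only where it catches reflections, which is exactly what the preview sphere shows.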
The Mesh Renderer component is responsible for making sure that a mesh object is actually visible in the scene when the camera is looking at it. The Mesh Renderer component contains a Materials field, which lists all materials currently assigned to the object. By clicking on the material name in the Materials field, Unity automatically selects the material in the Project panel, making it quick and simple to locate materials. See Figure 2.10.

Mesh objects may have multiple materials, with different materials assigned to different faces. For best in-game performance, use as few unique materials on an object as necessary, and make the extra effort to share materials across multiple objects, if possible. Doing so can significantly enhance the performance of your game. For more information on optimizing rendering performance, see the online documentation at http://docs.unity3d.com/Manual/OptimizingGraphicsPerformance.html.

Figure 2.10: The Mesh Renderer component lists all materials assigned to an object

That's it! You now have a complete and functional gold material for the collectible coin. It's looking good. However, we're still not finished with the coin. The coin looks right, but it doesn't behave right. Specifically, it doesn't disappear when touched, and we don't yet keep track of how many coins the player has collected overall. To address this, we'll need to script.

Summary

Excellent work! In this article, you've completed the coin collection game as well as your first game in Unity.

Resources for Article:

Further resources on this subject:
- Animation features in Unity 5 [article]
- Saying Hello to Unity and Android [article]
- Learning NGUI for Unity [article]

Packt
19 Feb 2016
5 min read

Lighting basics

In this article by Satheesh PV, author of the book Unreal Engine 4 Game Development Essentials, we will learn that lighting is an important factor in your game, one that can easily be overlooked, and wrong usage can severely impact performance. But with proper settings, combined with post process, you can create very beautiful and realistic scenes. We will see how to place lights and how to adjust some important values.

(For more resources related to this topic, see here.)

Placing lights

In Unreal Engine 4, lights can be placed in two different ways: through the Modes tab, or by right-clicking in the level.

- Modes tab: In the Modes tab, go to the Place tab (Shift + 1) and then to the Lights section. From there, you can drag and drop various lights.
- Right-clicking: Right-click in the viewport and, under Place Actor, select your light.

Once a light is added to the level, you can use the transform tool (W to move, E to rotate) to change the position and rotation of the selected light. Since a Directional Light casts light from an infinitely distant source, updating its location has no effect.

Various lights

Unreal Engine 4 features four different types of light Actors. They are:

- Directional Light: Simulates light from a source that is infinitely far away. Since all shadows cast by this light are parallel, it is the ideal choice for simulating sunlight.
- Spot Light: Emits light from a single point in a cone shape. There are two cones (inner cone and outer cone). Within the inner cone, light achieves full brightness; between the inner and outer cone, a falloff takes place, which softens the illumination. That means that after the inner cone, light slowly loses its illumination as it approaches the outer cone.
- Point Light: Emits light from a single point in all directions, much like a real-world light bulb.
- Sky Light: Does not really emit light, but instead captures the distant parts of your scene (for example, Actors that are placed beyond the Sky Distance Threshold) and applies them as light.
That means you can have light coming from your atmosphere, distant mountains, and so on. Note that a Sky Light will only update when you rebuild your lighting or press Recapture Scene (in the Details panel, with the Sky Light selected).

Common light settings

Now that we know how to place lights in a scene, let's take a look at some of the common settings of a light. Select your light in a scene, and in the Details panel you will see these settings:

- Intensity: Determines the intensity (energy) of the light. This is in lumen units; for example, 1700 lm (lumens) corresponds to a 100 W bulb.
- Light Color: Determines the color of the light.
- Attenuation Radius: Sets the limit of the light's reach, and is also used to calculate the light's falloff. This setting is only available for Point Lights and Spot Lights. (Attenuation Radius from left to right: 100, 200, 500.)
- Source Radius: Defines the size of specular highlights on surfaces. This effect can be subdued by adjusting the Min Roughness setting. It also affects building light using Lightmass: a larger Source Radius will cast softer shadows. Since this is processed by Lightmass, it will only work on lights with mobility set to Static. (Source Radius 0. Notice the sharp edges of the shadow.)
- Source Length: Same as Source Radius.

Light mobility

Light mobility is an important setting to keep in mind when placing lights in your level, because it changes the way the light works and impacts performance. There are three settings that you can choose from. They are as follows:

- Static: A completely static light that has no impact on performance. This type of light will not cast shadows or specular highlights on dynamic objects (for example, characters and movable objects). Example usage: use this light where the player will never reach, such as distant cityscapes, ceilings, and so on. You can literally have millions of lights with static mobility.
- Stationary: This is a mix of static and dynamic light. It can change its color and brightness while the game is running, but cannot move or rotate. Stationary lights can interact with dynamic objects and are used where the player can go.
- Movable: This is a completely dynamic light, and all of its properties can be changed at runtime. Movable lights are heavier on performance, so use them sparingly.

Only four or fewer stationary lights are allowed to overlap each other. If you have more than four stationary lights overlapping each other, the light icon will change to a red X, which indicates that the light is using dynamic shadows at a severe performance cost! In the following screenshot, you can easily see the overlapping lights. Under View Mode, you can change to Stationary Light Overlap to see which light is causing the issue.

Summary

We will look into different light mobilities and learn more about Lightmass Global Illumination, which is the static Global Illumination solver created by Epic Games. We will also learn how to prepare assets to be used with it.

Resources for Article:

Further resources on this subject:
- Understanding Material Design [article]
- Build a First Person Shooter [article]
- Machine Learning With R [article]

Packt
19 Feb 2016
5 min read

Ragdoll Physics

In this article we will learn how to apply Ragdoll physics to a character.

(For more resources related to this topic, see here.)

Applying Ragdoll physics to a character

Action games often make use of Ragdoll physics to simulate the character's body reacting to being unconsciously under the effect of a hit or explosion. In this recipe, we will learn how to set up and activate Ragdoll physics for our character whenever she steps on a landmine object. We will also use the opportunity to reset the character's position and animations a number of seconds after that event has occurred.

Getting ready

For this recipe, we have prepared a Unity package named Ragdoll, containing a basic scene that features an animated character and two prefabs, already placed into the scene, named Landmine and Spawnpoint. The package can be found inside the 1362_07_08 folder.

How to do it...

To apply Ragdoll physics to your character, follow these steps:

Create a new project and import the Ragdoll Unity package. Then, from the Project view, open the mecanimPlayground level. You will see the animated book character and two discs: Landmine and Spawnpoint.

First, let's set up our Ragdoll. Access the GameObject | 3D Object | Ragdoll... menu and the Ragdoll wizard will pop up. Assign the transforms as follows:

- Root: mixamorig:Hips
- Left Hips: mixamorig:LeftUpLeg
- Left Knee: mixamorig:LeftLeg
- Left Foot: mixamorig:LeftFoot
- Right Hips: mixamorig:RightUpLeg
- Right Knee: mixamorig:RightLeg
- Right Foot: mixamorig:RightFoot
- Left Arm: mixamorig:LeftArm
- Left Elbow: mixamorig:LeftForeArm
- Right Arm: mixamorig:RightArm
- Right Elbow: mixamorig:RightForeArm
- Middle Spine: mixamorig:Spine1
- Head: mixamorig:Head
- Total Mass: 20
- Strength: 50

From the Project view, create a new C# script named RagdollCharacter.cs.
Open the script and add the following code:

    using UnityEngine;
    using System.Collections;

    public class RagdollCharacter : MonoBehaviour {

        void Start () {
            // Begin in the animated, player-controlled state
            DeactivateRagdoll();
        }

        public void ActivateRagdoll(){
            // Hand control of the body over to physics
            gameObject.GetComponent<CharacterController>().enabled = false;
            gameObject.GetComponent<BasicController>().enabled = false;
            gameObject.GetComponent<Animator>().enabled = false;
            foreach (Rigidbody bone in GetComponentsInChildren<Rigidbody>()) {
                bone.isKinematic = false;
                bone.detectCollisions = true;
            }
            foreach (Collider col in GetComponentsInChildren<Collider>()) {
                col.enabled = true;
            }
            StartCoroutine (Restore ());
        }

        public void DeactivateRagdoll(){
            // Restore animation and move the character back to the spawn point
            gameObject.GetComponent<BasicController>().enabled = true;
            gameObject.GetComponent<Animator>().enabled = true;
            transform.position = GameObject.Find("Spawnpoint").transform.position;
            transform.rotation = GameObject.Find("Spawnpoint").transform.rotation;
            foreach (Rigidbody bone in GetComponentsInChildren<Rigidbody>()) {
                bone.isKinematic = true;
                bone.detectCollisions = false;
            }
            foreach (CharacterJoint joint in GetComponentsInChildren<CharacterJoint>()) {
                joint.enableProjection = true;
            }
            foreach (Collider col in GetComponentsInChildren<Collider>()) {
                col.enabled = false;
            }
            gameObject.GetComponent<CharacterController>().enabled = true;
        }

        IEnumerator Restore(){
            // Give the ragdoll five seconds before resetting the character
            yield return new WaitForSeconds(5);
            DeactivateRagdoll();
        }
    }

Save and close the script.

Attach the RagdollCharacter.cs script to the book Game Object. Then, select the book character and, from the top of the Inspector view, change its tag to Player.

From the Project view, create a new C# script named Landmine.cs.
Open the script and add the following code:

    using UnityEngine;
    using System.Collections;

    public class Landmine : MonoBehaviour {
        public float range = 2f;
        public float force = 2f;
        public float up = 4f;
        private bool active = true;

        void OnTriggerEnter (Collider collision){
            if (collision.gameObject.tag == "Player" && active) {
                active = false;
                StartCoroutine(Reactivate());
                // Switch the character into ragdoll mode before pushing it away
                collision.gameObject.GetComponent<RagdollCharacter>().ActivateRagdoll();
                Vector3 explosionPos = transform.position;
                // Apply an explosion force to every rigidbody within range
                Collider[] colliders = Physics.OverlapSphere(explosionPos, range);
                foreach (Collider hit in colliders) {
                    if (hit.GetComponent<Rigidbody>())
                        hit.GetComponent<Rigidbody>().AddExplosionForce(force, explosionPos, range, up);
                }
            }
        }

        IEnumerator Reactivate(){
            // Rearm the landmine after a short delay
            yield return new WaitForSeconds(2);
            active = true;
        }
    }

Save and close the script, then attach it to the Landmine Game Object.

Play the scene. Using the WASD keyboard control scheme, direct the character to the Landmine Game Object. Colliding with it will activate the character's Ragdoll physics and apply an explosion force to it. As a result, the character will be thrown a considerable distance and will no longer be in control of its body movements, akin to a ragdoll.

How it works...

Unity's Ragdoll wizard assigns, to the selected transforms, the components Collider, Rigidbody, and Character Joint. In conjunction, those components make Ragdoll physics possible. However, those components must be disabled whenever we want our character to be animated and controlled by the player. In our case, we switch those components on and off using the RagdollCharacter script and its two functions: ActivateRagdoll() and DeactivateRagdoll(), the latter including instructions to re-spawn our character in the appropriate place. For testing purposes, we have also created the Landmine script, which calls the RagdollCharacter script's function named ActivateRagdoll().
It also applies an explosion force to our ragdoll character, throwing it outside the explosion site.

There's more...

Instead of resetting the character's transform settings, you could have destroyed its gameObject and instantiated a new one over the respawn point using Tags. For more information on this subject, check Unity's documentation at http://docs.unity3d.com/ScriptReference/GameObject.FindGameObjectsWithTag.html.

Summary

In this article we learned how to apply Ragdoll physics to a character, and how to set up the Ragdoll for the game's character. To learn more, please refer to the following books:

- Learning Unity 2D Game Development by Example: https://www.packtpub.com/game-development/learning-unity-2d-game-development-example
- Unity Game Development Blueprints: https://www.packtpub.com/game-development/unity-game-development-blueprints
- Getting Started with Unity: https://www.packtpub.com/game-development/getting-started-unity

Resources for Article:

Further resources on this subject:
- Using a collider-based system [article]
- Looking Back, Looking Forward [article]
- The Vertex Functions [article]
Packt
18 Feb 2016
29 min read

Build a First Person Shooter

We will be creating a first person shooter; however, instead of shooting a gun to damage our enemies, we will be shooting a picture in a survival horror environment, similar to the Fatal Frame series of games and the recent indie title DreadOut. To get started on our project, we're first going to look at creating our level or, in this case, our environments, starting with the exterior.

In the game industry, there are two main roles in level creation: the environment artist and the level designer. An environment artist is a person who builds the assets that go into the environment. He/she uses tools such as 3ds Max or Maya to create the model, and then uses other tools such as Photoshop to create textures and normal maps.

The level designer is responsible for taking the assets that the environment artist has created and assembling them into an environment for players to enjoy. He/she designs the gameplay elements, creates the scripted events, and tests the gameplay. Typically, a level designer will create environments through a combination of scripting and using a tool that may or may not be in development as the game is being made. In our case, that tool is Unity.

One important thing to note is that most companies have their own definition for different roles. In some companies, a level designer may need to create assets, and an environment artist may need to create a level layout. There are also some places that hire someone just to do lighting, or just to place meshes (called a mesher), because they're so good at it.

(For more resources related to this topic, see here.)

Project overview

In this article, we take on the role of an environment artist who has been tasked with creating an outdoor environment. We will use assets that I've placed in the example code, as well as assets already provided to us by Unity, for mesh placement. In addition to this, you will also learn some beginner-level design.

Your objectives

This project will be split into a number of tasks.
It will be a simple step-by-step process from beginning to end. Here is an outline of our tasks:

- Creating the exterior environment – terrain
- Beautifying the environment – adding water, trees, and grass
- Building the atmosphere
- Designing the level layout and background

The project setup

At this point, I assume you have a fresh installation of Unity and have started it. You can perform the following steps:

1. With Unity started, navigate to File | New Project.
2. Select a project location of your choice somewhere on your hard drive and ensure that you have Setup defaults for set to 3D.
3. Once completed, click on Create. Here, if you see the Welcome to Unity pop up, feel free to close it as we won't be using it.

Level design 101 – planning

Now just because we are going to be diving straight into Unity, I feel it's important to talk a little more about how level design is done in the gaming industry. While you may think a level designer will just jump into the editor and start playing, the truth is you normally need to do a ton of planning ahead of time before you even open up your tool.

Generally, a level design begins with an idea. This can come from anything: maybe you saw a really cool building, or a photo on the Internet gave you a certain feeling; maybe you want to teach the player a new mechanic. Turning this idea into a level is what a level designer does. Taking all of these ideas, the level designer will create a level design document, which will outline exactly what you're trying to achieve with the entire level from start to end. A level design document will describe everything inside the level, listing all of the possible encounters, puzzles, and so on, which the player will need to complete, as well as any side quests that the player will be able to achieve. To prepare for this, you should include as many references as you can, with maps, images, and movies similar to what you're trying to achieve.
If you're working with a team, making this document available on a website or wiki will be a great asset so that you know exactly what is being done in the level, what the team can use in their levels, and how difficult their encounters can be. Generally, you'll also want a top-down layout of your level, done either on a computer or with graph paper, with a line showing the player's general route for the level, with encounters and missions planned out. Of course, you don't want to be too tied down to your design document; it will change as you playtest and work on the level, but the documentation process will help solidify your ideas and give you a firm basis to work from.

For those of you interested in seeing some level design documents, feel free to check out Adam Reynolds (Level Designer on Homefront and Call of Duty: World at War) at http://wiki.modsrepository.com/index.php?title=Level_Design:_Level_Design_Document_Example.

If you want to learn more about level design, I'm a big fan of Beginning Game Level Design by John Feil (previously my teacher) and Marc Scattergood, Cengage Learning PTR. For more of an introduction to game design from scratch, check out Level Up!: The Guide to Great Video Game Design by Scott Rogers, Wiley, and The Art of Game Design by Jesse Schell, CRC Press. For some online resources, Scott has a neat GDC talk called Everything I Learned About Level Design I Learned from Disneyland, which can be found at http://mrbossdesign.blogspot.com/2009/03/everything-i-learned-about-game-design.html, and World of Level Design (http://worldofleveldesign.com/) is a good source for learning about level design, though it does not talk about Unity specifically.

Exterior environment – terrain

When creating exterior environments, we cannot use straight floors for the most part, unless you're creating a highly urbanized area. Our game takes place in a haunted house in the middle of nowhere, so we're going to create a natural landscape.
In Unity, the best tool to use to create a natural landscape is the Terrain tool. Unity's terrain system lets us add landscapes, complete with bushes, trees, and fading materials, to our game. To show how easy it is to use the terrain tool, let's get started.

The first thing that we're going to want to do is actually create the terrain we'll be placing for the world. Let's first create a terrain by navigating to GameObject | Create Other | Terrain.

If you are using Unity 4.6 or later, navigate to GameObject | 3D Object | Terrain to create the Terrain.

At this point, you should see the terrain. Right now, it's just a flat plane, but we'll be adding a lot to it to make it shine. If you look to the right with the Terrain object selected, you'll see the Terrain editing tools, which do the following (from left to right):

- Raise/Lower Height: This option will allow us to raise or lower the height of our terrain up to a certain radius to create hills, rivers, and more.
- Paint Height: If you already know the exact height that a part of your terrain needs to be, this option will allow you to paint a spot to that height.
- Smooth Height: This option averages out the area that it is in, and then attempts to smooth out areas and reduce the appearance of abrupt changes.
- Paint Texture: This option allows us to add textures to the surface of our terrain. One of the nice features of this is the ability to lay multiple textures on top of each other.
- Place Trees: This option allows us to paint objects in our environment, which will appear on the surface. Unity attempts to optimize these objects by billboarding distant trees so that we can have dense forests without a horrible frame rate.
- Paint Details: In addition to trees, we can also have small things such as rocks or grass covering the surface of our environment. We can use 2D images to represent individual clumps, using bits of randomization to make them appear more natural.
- Terrain Settings: These are settings that affect the overall properties of a particular terrain; options such as the size of the terrain and wind can be found here.

By default, the entire terrain is set to be at the bottom, but we want to have some ground above and below us; so first, with the terrain object selected, click on the second button from the left of the terrain component (the Paint Height mode). From here, set the Height value under Settings to 100 and then click on the Flatten button. At this point, you should notice the plane moving up, so now everything is above it by default.

Next, we are going to add some interesting shapes to our world with some hills by painting on the surface. With the Terrain object selected, click on the first button from the left of our Terrain component (the Raise/Lower Terrain mode). Once this is completed, you should see a number of different brushes and shapes that you can select from. Our use of terrain is to create hills in the background of our scene so that it does not seem like the world is completely flat.

Under the Settings area, change the Brush Size and Opacity values of your brush to 100 and left-click around the edges of the world to create some hills. You can increase the height of the current hills if you click on top of a previous hill. This is shown in the following screenshot:

When creating hills, it's a good idea to look at them from multiple angles while you're building them, so you can make sure that none are too high or too short. Generally, you want to have taller hills as you go further back; otherwise, you cannot see the smaller ones, since they would be blocked. In the Scene view, to move your camera around, you can use the toolbar in the top-right corner, or hold down the right mouse button and drag in the direction you want the camera to move, pressing the W, A, S, and D keys to pan. In addition, you can hold down the middle mouse button and drag to move the camera around.
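If you prefer to drive the Flatten step from code rather than the Inspector, the same result can be achieved through the TerrainData API. The following is a minimal sketch (not part of the original steps); it assumes a Terrain object already exists in the scene, and the class name and `height` field are my own:

```csharp
using UnityEngine;

// Sketch: flattens the active terrain to a uniform world-space height
// from code, mirroring the Paint Height + Flatten step described above.
// Attach to any GameObject in a scene that contains a Terrain.
public class FlattenTerrain : MonoBehaviour
{
    public float height = 100f; // world-space height to flatten to

    void Start()
    {
        TerrainData data = Terrain.activeTerrain.terrainData;
        int res = data.heightmapResolution;
        float[,] heights = new float[res, res];

        // SetHeights expects values normalized to 0..1 of the terrain's
        // total height (data.size.y)
        float normalized = height / data.size.y;
        for (int y = 0; y < res; y++)
            for (int x = 0; x < res; x++)
                heights[y, x] = normalized;

        data.SetHeights(0, 0, heights);
    }
}
```

This is handy when you want to reset a terrain during iteration without repainting by hand.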
The mouse wheel can be scrolled to zoom in and out from where the camera is. Even though you should plan out the level ahead of time on something like a piece of graph paper to plan out encounters, you will want to avoid making the level entirely from the preceding section, as the player will not actually see the game with a bird's eye view in the game at all (most likely). Referencing the map from the same perspective of your character will help ensure that the map looks great. To see many different angles at one time, you can use a layout with multiple views of the scene, such as the 4 Split. Once we have our land done, we now want to create some holes in the ground, which we will fill in with water later. This will provide a natural barrier to our world that players will know they cannot pass, so we will create a moat by first changing the Brush Size value to 50 and then holding down the Shift key, and left-clicking around the middle of our texture. In this case, it's okay to use the Top view; remember this will eventually be water to fill in lakes, rivers, and so on, as shown in the following screenshot: At this point, we have done what is referred to in the industry as "greyboxing", making the level in the engine in the simplest way possible but without artwork (also known as "whiteboxing" or "orangeboxing" depending on the company you're working for). At this point in a traditional studio, you'd spend time playtesting the level and iterating on it before an artist or you takes the time to make it look great. However, for our purposes, we want to create a finished project as soon as possible. When doing your own games, be sure to play your level and have others play your level before you polish it. For more information on greyboxing, check out http://www.worldofleveldesign.com/categories/level_design_tutorials/art_of_blocking_in_your_map.php. 
For an example with images of a greybox to the final level, PC Gamer has a nice article available at http://www.pcgamer.com/2014/03/18/building-crown-part-two-layout-design-textures-and-the-hammer-editor/.

This is interesting enough, but being in an all-white world would be quite boring. Thankfully, it's very easy to add textures to everything. However, first we need to have some textures to paint onto the world, and for this instance, we will make use of some of the free assets that Unity provides us with. So, with that in mind, navigate to Window | Asset Store.

The Asset Store option is home to a number of free and commercial assets, created by both Unity and the community, that can be used with Unity to help you create your own projects. While we will not be using any unofficial assets, the Asset Store option may help you in the future to save time in programming or art asset creation.

At the top-right corner, you'll see a search bar; type terrain assets and press Enter. Once there, the first asset you'll see is Terrain Assets, which is released by Unity Technologies for free. Left-click on it and then, once at the menu, click on the Download button:

Once it finishes downloading, you should see the Importing Package dialog box pop up. If it doesn't pop up, click on the Import button where the Download button used to be:

Generally, you'll only want to select the assets that you want to use and uncheck the others. However, since you're exploring the tools, we'll just click on the Import button to place them all. Close the Asset Store screen, if it's still open, and go back into our game view. You should notice the new Terrain Assets folder placed in our Assets folder. Double-click on it and then enter the Textures folder:

These will be the textures we will be placing in our environment: Select the Terrain object and then click on the fourth button from the left to select the Paint Texture button.
Here, you'll notice that it looks quite similar to the previous sections we've seen. However, there is also a Textures section which, as of now, displays the message No terrain textures defined. So let's fix that. Click on the Edit Textures button and then select Add Texture. You'll see an Add Terrain Texture dialog pop up. Under the Texture variable, place the Grass (Hill) texture and then click on the Add button:

At this point, you should see the entire world change to green if you're far away. If you zoom in, you'll see that the entire terrain uses the Grass (Hill) texture now:

Now, we don't want the entire world to have grass. Next, we will add cliffs around the edges where the water is. To do this, add an additional texture by navigating to Edit Textures... | Add Texture. Select Cliff (Layered Rock) as the texture and then select Add. Now, if you select the terrain, you should see two textures. With the Cliff (Layered Rock) texture selected, paint the edges of the water by clicking and holding the mouse, and modifying the Brush Size value as needed:

We now want to create a path for our player to follow, so we're going to create yet another texture, this time using the GoodDirt material. Since this is a path the player may take, I'm going to change the Brush Size value to 8 and the Opacity value to 30, and use the second brush from the left, which is slightly less faded. Once finished, I'm going to paint in some trails that the player can follow. One thing that you will want to try to do is make sure that the player shouldn't go too far before having to backtrack, and reward the player for exploration. The following screenshot shows the path:

However, you'll notice that there are two problems with it currently. Firstly, it is too big to fit in with the world, and secondly, you can tell that it repeats.
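Texture painting can also be controlled from code through the terrain's alphamaps, which store the blend weight of each terrain texture at each point. The sketch below is an illustration only (the class name is my own); it assumes at least one terrain texture has already been added via Edit Textures..., as in the steps above:

```csharp
using UnityEngine;

// Sketch: paints the entire terrain with the first terrain texture
// (layer 0, e.g. Grass (Hill)) from code, the scripted equivalent of
// the base Paint Texture step described above.
public class PaintBaseTexture : MonoBehaviour
{
    void Start()
    {
        TerrainData data = Terrain.activeTerrain.terrainData;
        int w = data.alphamapWidth;
        int h = data.alphamapHeight;
        int layers = data.alphamapLayers;

        // One weight per texture layer at every alphamap cell; weights
        // at each cell should sum to 1
        float[,,] maps = new float[h, w, layers];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                maps[y, x, 0] = 1f; // full weight to layer 0

        data.SetAlphamaps(0, 0, maps);
    }
}
```

Painting cliffs or paths programmatically is the same idea: weight a different layer index in the regions you want covered.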
To reduce the appearance of texture duplication, we can introduce new materials with a very soft opacity, which we place in patches in areas where there is just plain ground. For example, let's create a new texture with the Grass (Meadow) texture. Change the Brush Size value to 16 and the Opacity value to something really low, such as 6, and then start painting the areas that look too static. Feel free to select the first brush again to have a smoother touch up. Now, if we zoom into the world as if we were a character there, I can tell that the first grass texture is way too big for the environment but we can actually change that very easily. Double-click on the texture to change the Size value to (8,8). This will make the texture smaller before it duplicates. It's a good idea to have different textures with different sizes so that the seams of each texture aren't visible to others. The following screenshot shows the size options: Do the same changes as in the preceding step for our Dirt texture as well, changing the Size option to (8,8): With this, we already have a level that looks pretty nice! However, that being said, it's just some hills. To really have a quality-looking title, we are going to need to do some additional work to beautify the environment. Beautifying the environment – adding water, trees, and grass We now have a base for our environment with the terrain, but we're still missing a lot of polish that can make the area stand out and look like a quality environment. Let's add some of those details now: First, let's add water. This time we will use another asset from Unity, but we will not have to go to the Asset Store. Navigate to Assets | Import Package | Water (Basic) and import all of the files included in the package. We will be creating a level for the night time, so navigate to Standard Assets | Water Basic and drag-and-drop the Nighttime Simple Water prefab onto the scene. 
Once there, set the Position values to (1000, 50, 1000) and the Scale values to (1000, 1, 1000):

At this point, you want to repaint your cliff materials to better reflect being next to the water.

Next, let's add some trees to make this forest level come to life. Navigate to Terrain Assets | Trees Ambient-Occlusion and drag-and-drop a tree into your world (I'm using ScotsPineTree). By default, these trees do not contain collision information, so our player could just walk through them. This is actually great for areas that the player will not reach, as we can add more trees without having to do meaningless calculations, but we need to stop the player from walking into these. To do that, we're going to need to add a collider. To do so, navigate to Component | Physics | Capsule Collider and then change the Radius value to 2. You have to use a Capsule Collider in order to have the collision carried over to the terrain.

After this, move our newly created tree into the Assets folder under the Project tab and change its name to CollidingTree. Then, delete the object from the Hierarchy view. With this finished, go back to our Terrain object and then click on the Place Trees mode button. Just like working with painting textures, there are no trees here by default, so navigate to Edit Trees… | Add Tree, add our CollidingTree object created earlier in this step, and then select Add.

Next, under the Settings section, change the Tree Density value to 15 and then, with our new tree selected, paint the areas on the main island that do not have paths on them. Once you've finished placing those trees, raise the Tree Density value to 50 and then paint the areas that are far away from paths to make it less likely that players go that way. You should also enable Create Tree Collider in the terrain's Terrain Collider component:

In our last step to create an environment, let's add some details. The mode next to the Place Trees mode is Paint Details.
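Tree placement can also be scripted through the terrain's tree instances, which is useful for procedurally scattering large forests. The following is a sketch, not part of the original steps (the class name and `count` field are my own); it assumes a tree such as the CollidingTree above has already been added under Edit Trees..., so prototype index 0 is valid:

```csharp
using UnityEngine;

// Sketch: scatters trees from code using the terrain's tree prototypes,
// the scripted equivalent of painting with the Place Trees brush.
public class ScatterTrees : MonoBehaviour
{
    public int count = 100; // how many trees to place

    void Start()
    {
        Terrain terrain = Terrain.activeTerrain;
        for (int i = 0; i < count; i++)
        {
            TreeInstance tree = new TreeInstance();
            // Tree positions are normalized (0..1) across the terrain;
            // the terrain supplies the Y from its heightmap
            tree.position = new Vector3(Random.value, 0f, Random.value);
            tree.prototypeIndex = 0; // first tree added via Edit Trees...
            tree.widthScale = 1f;
            tree.heightScale = 1f;
            tree.color = Color.white;
            tree.lightmapColor = Color.white;
            terrain.AddTreeInstance(tree);
        }
        terrain.Flush(); // apply the changes to the rendered terrain
    }
}
```

In a real project you would constrain the random positions away from paths and water rather than scattering uniformly.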
Next, click on the Edit Details… button and select Add Grass Texture. Select the Grass texture for the Detail Texture option and then click on Add. In the terrain's Settings mode (the one on the far right), change the Detail Distance value to 250, and then paint the grass where there isn't any dirt along the route in the Paint Details mode: You may not see the results unless you zoom your camera in, which you can do by using the mouse wheel. Don't go too far in though, or the results may not show as well. This aspect of level creation isn't very difficult, just time consuming. However, it's taking time to enter these details that really sets a game apart from the other games. Generally, you'll want to playtest and make sure your level is fun before performing these actions; but I feel it's important to have an idea of how to do it for your future projects. Lastly, our current island is very flat, and while that's okay for cities, nature is random. Go back into the Raise/Lower Height tool and gently raise and lower some areas of the level to give the illusion of depth. Do note that your trees and grass will raise and fall with the changes that you make, as shown in the following screenshot: With this done, let's now add some details to the areas that the player will not be visiting, such as the outer hills. Go into the Place Trees mode and create another tree, but this time select the one without collision and then place it around the edges of the mountains, as shown in the following screenshot: At this point, we have a nice exterior shape created with the terrain tools! If you want to add even more detail to your levels, you can add additional trees and/or materials to the level area as long as it makes sense for them to be there. For more information on the terrain engine that Unity has, please visit http://docs.unity3d.com/Manual/script-Terrain.html. 
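Like trees and textures, detail layers such as grass can be filled from code. The sketch below is an illustration only (the class name and the density value are my own choices); it assumes a grass Detail Texture has already been added via Edit Details..., so detail layer 0 exists:

```csharp
using UnityEngine;

// Sketch: fills detail layer 0 (the grass texture added above) across
// the whole terrain, the scripted equivalent of the Paint Details brush.
public class CoverWithGrass : MonoBehaviour
{
    void Start()
    {
        TerrainData data = Terrain.activeTerrain.terrainData;
        int w = data.detailWidth;
        int h = data.detailHeight;

        // Each cell holds a clump density; higher values mean thicker grass
        int[,] layer = new int[h, w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                layer[y, x] = 4;

        data.SetDetailLayer(0, 0, 0, layer);
    }
}
```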
Creating our player

Now that we have the terrain and its details, it's hard to get a good picture of what the game looks like without being able to see what it looks like down on the surface, so next we will do just that. However, instead of creating our player from scratch as we've done previously, we will make use of the code that Unity has provided us. We will perform the following steps:

1. Start off by navigating to Assets | Import Package | Character Controller. When the Importing Package dialog comes up, we only need to import the files shown in the following screenshot:
2. Now drag-and-drop the First Person Controller prefab under the Prefabs folder in our Project tab into your world, where you want the player to spawn, setting the Y Position value to above 100. If you see yourself fall through the world instead of hitting the ground when you spawn, then increase the Y Position value until you land on the terrain.
3. If you open up the First Person Controller object in the Hierarchy tab, you'll notice that it has a Main Camera object already, so delete the Main Camera object that already exists in the world.
4. Right now, if we played the game, you'd see that everything is dark because we don't have any light. For the purposes of demonstration, let's add a directional light by navigating to GameObject | Create Other | Directional Light. If you are using Unity 4.6 or later, navigate to GameObject | Light | Directional Light instead.
5. Save your scene and hit the Play button to drop into your level:

At this point, we have a playable level that we can explore and move around in!

Building the atmosphere

Now that the base of our world has been created, let's add some effects to make the game even more visually appealing, so it will start to fit in with the survival horror feel that we're going to be giving the game. The first part of creating the atmosphere is to add something for the sky aside from the light blue color that we currently use by default.
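Rather than hand-tuning the Y Position until the player no longer falls through the world, the spawn height can be derived from the terrain itself. This is a sketch, not part of the original steps (the class name and `clearance` field are my own):

```csharp
using UnityEngine;

// Sketch: places this object (e.g. the First Person Controller) just
// above the terrain surface at its current X/Z, so you don't have to
// guess a safe spawn height by hand.
public class SnapToTerrain : MonoBehaviour
{
    public float clearance = 2f; // extra height so we don't spawn inside the ground

    void Start()
    {
        Vector3 pos = transform.position;
        // SampleHeight returns the height relative to the terrain's own
        // Y position, so add that back to get a world-space height
        float ground = Terrain.activeTerrain.SampleHeight(pos)
                     + Terrain.activeTerrain.transform.position.y;
        transform.position = new Vector3(pos.x, ground + clearance, pos.z);
    }
}
```

Attach it to the First Person Controller and it will snap to the ground on Play, wherever you drop it in the scene.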
To fix this, we will be using a skybox. A skybox is a method of creating a background that makes the area seem bigger than it really is: an image is drawn in the areas currently filled with the light blue color, and it does not move relative to the player, just as the real sky doesn't appear to move to us because it's so far away. It is called a skybox because it is made up of six textures that form the inside of a box (one for each side of a cube). Game engines such as Unreal have skydomes, which serve the same purpose, but they are done with a hemisphere instead of a cube.

We will perform the following steps to build the atmosphere:

1. To add in our skybox, we are going to navigate to Assets | Import Package | Skyboxes. We want our level to display the night, so we'll be using the Starry Night skybox. Just select the StarryNight Skybox.mat file and the textures inside the Standard Assets/Skyboxes/Textures/StarryNight/ location, and then click on Import:
2. With this file imported, we need to navigate to Edit | Render Settings next. Once there, we need to set the Skybox Material option to the Starry Night skybox:
3. If you go into the game, you'll notice the level starting to look nicer already with the addition of the skybox, except for the fact that the sky says night while the world says it's daytime. Let's fix that now.
4. Switch to the Game tab so that you can see the changes we'll be making next. While still at the RenderSettings menu, let's turn on the Fog property by clicking on the checkbox next to its name and changing the Fog Color value to a black color. You should notice that the surroundings are already turning very dark. Play around with the Fog Density value until you're comfortable with how much the player can see ahead of them; I used 0.005.

Fog obscures faraway objects, which adds to the atmosphere and saves rendering power. The denser the fog, the more the game will feel like a horror game.
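The same atmosphere settings can be applied at runtime through the RenderSettings API, which is convenient if you want to switch between day and night scenes. This is a sketch under my own naming (the class and fields are assumptions; the skybox material must be assigned in the Inspector after importing it as above):

```csharp
using UnityEngine;

// Sketch: applies the night-time atmosphere from code, mirroring the
// Render Settings steps above (skybox, black fog, density 0.005).
public class NightAtmosphere : MonoBehaviour
{
    public Material skybox;          // e.g. the StarryNight Skybox material
    public float fogDensity = 0.005f;

    void Start()
    {
        if (skybox != null)
            RenderSettings.skybox = skybox;

        RenderSettings.fog = true;
        RenderSettings.fogColor = Color.black;
        RenderSettings.fogDensity = fogDensity;
    }
}
```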
The first game of the Silent Hill franchise used fog to make the game run at an acceptable frame rate due to the large 3D environment it had on early PlayStation hardware. Because it spooked players so well, fog continued to be used in later games, even though newer technology could render larger areas.

Let's add some lighting tweaks to make the environment that the player is walking in seem more like night. Go into the DirectionalLight properties section and change the Intensity value to 0.05. You'll see the scene get darker, as shown in the following screenshot:

If, for some reason, you'd like to make the world pitch black, you'll need to set the Ambient Light property to black inside the RenderSettings section. By default, it is dark grey, which will show up even if there are no lights placed in the world. In the preceding example, I increased the Intensity value to make the world easier to see so that readers can follow along, but in your project, you probably don't want the player to see so far out with such clarity.

With this, we now have a believable exterior area at night! Now that we have this basic knowledge, let's add a flashlight so the players can see where they are going.

Creating a flashlight

Now that our level looks like a dark night, we still want to give our players the ability to see what's in front of them with a flashlight. We will customize the First Person Controller object to fit our needs:

1. Create a spotlight by navigating to GameObject | Create Other | Spotlight.
2. Once created, we are going to make the spotlight a child of the First Person Controller object's Main Camera object by dragging-and-dropping it on top of it.
3. Once it is a child, change the Transform Position value to (0, -.95, 0). Since positions are relative to your parent's position, this places the light slightly lower than the camera's center, just like a hand holding a flashlight.
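The flashlight setup above can also be built entirely from code, which makes it easy to reuse across scenes. This is a sketch, not the book's method (the class name is mine; the values mirror the steps described here):

```csharp
using UnityEngine;

// Sketch: builds the flashlight spotlight from code instead of the
// GameObject menu. Attach to the Main Camera of the First Person
// Controller; the offset matches the hand-held position above.
public class Flashlight : MonoBehaviour
{
    void Start()
    {
        GameObject holder = new GameObject("Flashlight");
        holder.transform.parent = transform; // child of the camera
        holder.transform.localPosition = new Vector3(0f, -0.95f, 0f);
        holder.transform.localRotation = Quaternion.identity;

        Light spot = holder.AddComponent<Light>();
        spot.type = LightType.Spot;
        spot.range = 1000f;   // reach far into the distance
        spot.spotAngle = 45f; // widen the light cone
    }
}
```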
4. Now change the Rotation value to (0, 0, 0), or give it a slight diagonal angle across the scene if you don't want it to look like it's coming straight out.
5. Now, we want the flashlight to reach out into the distance. So we will change the Range value to 1000, and to make the light wider, we will change the Spot Angle value to 45. The effects are shown in the following screenshot:

If you have Unity Pro, you can also give shadows to the world based on your lights by setting the Shadow Type property.

We now have a flashlight, so the player can focus on a particular area and not worry.

Walking / flashlight bobbing animation

Now the flashlight looks fine in a screenshot, but if you walk through the world, it will feel very static and unnatural. If a person is actually walking through the forest, there will be a slight bob as they walk, and if someone is actually holding a flashlight, it won't be stable the entire time because their hand would move. We can solve both of these problems by writing yet another script. We perform the following steps:

1. Create a new folder called Scripts.
2. Inside this folder, create a new C# script called BobbingAnimation.
3. Open the newly created script and use the following code inside it:

```csharp
using UnityEngine;
using System.Collections;

/// <summary>
/// Allows the attached object to bob up and down through movement or
/// by default.
/// </summary>
public class BobbingAnimation : MonoBehaviour
{
  /// <summary>
  /// The elapsed time.
  /// </summary>
  private float elapsedTime;

  /// <summary>
  /// The starting y offset from the parent.
  /// </summary>
  private float startingY;

  /// <summary>
  /// The controller.
  /// </summary>
  private CharacterController controller;

  /// <summary>
  /// How far up and down the object will travel.
  /// </summary>
  public float magnitude = .2f;

  /// <summary>
  /// How often the object will move up and down.
  /// </summary>
  public float frequency = 10;

  /// <summary>
  /// Do you always want the object to bob up and down, or only with
  /// movement?
  /// </summary>
  public bool alwaysBob = false;

  void Start()
  {
    startingY = transform.localPosition.y;

    // The CharacterController lives on the parent (the First Person
    // Controller), not on this child object, so search up the hierarchy
    controller = GetComponentInParent<CharacterController>();
  }

  void Update()
  {
    // Only increment elapsedTime if you want the object to bob;
    // keeping it the same will keep it still
    if (alwaysBob)
    {
      elapsedTime += Time.deltaTime;
    }
    else
    {
      if ((Input.GetAxis("Horizontal") != 0.0f) || (Input.GetAxis("Vertical") != 0.0f))
        elapsedTime += Time.deltaTime;
    }

    float yOffset = Mathf.Sin(elapsedTime * frequency) * magnitude;

    // If the player controller exists and is jumping (not grounded),
    // we shouldn't be bobbing
    if (controller && !controller.isGrounded)
    {
      return;
    }

    // Set our position
    Vector3 pos = transform.position;
    pos.y = transform.parent.transform.position.y + startingY + yOffset;
    transform.position = pos;
  }
}
```

The preceding code will tweak the object it's attached to so that it bobs up and down whenever the player moves along the horizontal or vertical input axes. I've also added a variable called alwaysBob, which, when true, will make the object always bob.

Math is a game developer's best friend, and here we are using sin (pronounced sine). Taking the sin of an angle gives you the ratio of the length of the side opposite the angle to the length of the hypotenuse of a right-angled triangle.
If that didn't make any sense to you, don't worry. The neat feature of sin is that as the number it takes gets larger, it continuously gives us a value between -1 and 1 that goes up and down forever, giving us a smooth, repetitive oscillation. For more information on sine waves, visit http://en.wikipedia.org/wiki/Sine_wave.

While we're using sin just for the player's movement and the flashlight, it could be used in a lot of effects, such as having save points/portals bob up and down, or any kind of object you would want to have slight movement or some special FX.

Next, attach the BobbingAnimation component to the Main Camera object, leaving all the values at their defaults. After this, attach the BobbingAnimation component to the spotlight as well. With the spotlight selected, turn the Always Bob option on and change the Magnitude value to .05 and the Frequency value to 3. The effects are shown in the following screenshot:

Summary

To learn more about FPS games, the following book published by Packt Publishing (https://www.packtpub.com/) is recommended: Building an FPS Game with Unity (https://www.packtpub.com/game-development/building-fps-game-unity).
Audio and Animation: Hand in Hand

Packt
16 Feb 2016
5 min read
In this article, we are going to learn techniques to match audio pitch to animation speed. This is crucial when editing videos and creating animated content.

Matching the audio pitch to the animation speed

Many artifacts sound higher in pitch when accelerated and lower when slowed down: car engines, fan coolers, vinyl record players; the list goes on. If you want to simulate this kind of sound effect in an animated object whose speed can be changed dynamically, follow this article.

Getting ready

For this, you'll need an animated 3D object and an audio clip. Please use the files animatedRocket.fbx and engineSound.wav, available in the 1362_09_01 folder that you can find in the code bundle of the book Unity 5.x Cookbook at https://www.packtpub.com/game-development/unity-5x-cookbook.

How to do it...

To change the pitch of an audio clip according to the speed of an animated object, please follow these steps:

1. Import the animatedRocket.fbx file into your Project.
2. Select the animatedRocket.fbx file in the Project view. Then, from the Inspector view, check its Import Settings. Select Animations, then select the clip Take 001, and make sure to check the Loop Time option. Click on the Apply button, shown as follows, to save the changes:

The reason why we didn't need to check the Loop Pose option is that our animation already loops in a seamless fashion. If it didn't, we could have checked that option to automatically create a seamless transition from the last to the first frame of the animation.

3. Add the animatedRocket GameObject to the scene by dragging it from the Project view into the Hierarchy view.
4. Import the engineSound.wav audio clip.
5. Select the animatedRocket GameObject. Then, drag engineSound from the Project view into the Inspector view, adding it as an Audio Source for that object.
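The Audio Source setup described here can also be done from a script, which is useful when the same sound configuration is needed on many objects. This is a sketch of my own (the class name is an assumption; the engineSound clip must be assigned in the Inspector), mirroring the drag-and-drop and Loop steps:

```csharp
using UnityEngine;

// Sketch: adds a looping Audio Source for the engine sound from code
// instead of configuring it through the Inspector.
public class EngineAudioSetup : MonoBehaviour
{
    public AudioClip engineSound; // assign engineSound.wav in the Inspector

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = engineSound;
        source.loop = true; // same as checking the Loop option
        source.Play();
    }
}
```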
6. In the Audio Source component of animatedRocket, check the box for the Loop option, as shown in the following screenshot:
7. We need to create a Controller for our object. In the Project view, click on the Create button and select Animator Controller. Name it rocketController.
8. Double-click on the rocketController object to open the Animator window, as shown. Then, right-click on the gridded area and select the Create State | Empty option from the contextual menu.
9. Name the new state spin and set Take 001 as its motion in the Motion field:
10. From the Hierarchy view, select animatedRocket. Then, in the Animator component (in the Inspector view), set rocketController as its Controller and make sure that the Apply Root Motion option is unchecked, as shown:
11. In the Project view, create a new C# script and rename it to ChangePitch.
12. Open the script in your editor and replace everything with the following code:

```csharp
using UnityEngine;

public class ChangePitch : MonoBehaviour
{
  public float accel = 0.05f;
  public float minSpeed = 0.0f;
  public float maxSpeed = 2.0f;
  public float animationSoundRatio = 1.0f;

  private float speed = 0.0f;
  private Animator animator;
  private AudioSource audioSource;

  void Start()
  {
    animator = GetComponent<Animator>();
    audioSource = GetComponent<AudioSource>();
    speed = animator.speed;
    AccelRocket(0f);
  }

  void Update()
  {
    if (Input.GetKey(KeyCode.Alpha1))
      AccelRocket(accel);
    if (Input.GetKey(KeyCode.Alpha2))
      AccelRocket(-accel);
  }

  public void AccelRocket(float accel)
  {
    speed += accel;
    speed = Mathf.Clamp(speed, minSpeed, maxSpeed);
    animator.speed = speed;
    float soundPitch = animator.speed * animationSoundRatio;
    audioSource.pitch = Mathf.Abs(soundPitch);
  }
}
```

13. Save your script and add it as a component to the animatedRocket GameObject.
14. Play the scene and change the animation speed by pressing key 1 (accelerate) or 2 (decelerate) on your alphanumeric keyboard. The audio pitch will change accordingly.

How it works...
In the Start() method, besides storing the Animator and AudioSource components in variables, we get the initial speed from the Animator and call the AccelRocket() function, passing 0 as an argument, only for that function to calculate the resulting pitch for the Audio Source. In the Update() function, the if (Input.GetKey(KeyCode.Alpha1)) and if (Input.GetKey(KeyCode.Alpha2)) lines detect whenever the 1 or 2 keys are being pressed on the alphanumeric keyboard, calling the AccelRocket() function and passing the accel float variable as an argument. The AccelRocket() function, in its turn, increments speed with the received argument (the accel float variable). However, it uses the Mathf.Clamp() command to limit the new speed value between the minimum and maximum speed as set by the user. Then, it changes the Animator speed and Audio Source pitch according to the new speed's absolute value (the reason for using an absolute value is to keep the pitch a positive number, even when the animation is reversed by a negative speed value). Also, please note that setting the animation speed, and therefore the sound pitch, to 0 will cause the sound to stop, making it clear that stopping the object's animation also prevents the engine sound from playing. There's more... Here is some information on how to fine-tune and customize this recipe. Changing the Animation/Sound Ratio If you want the audio clip pitch to be more or less affected by the animation speed, change the value of the Animation/Sound Ratio parameter. Accessing the function from other scripts The AccelRocket() function was made public so that it can be accessed from other scripts. As an example, we have included the ExtChangePitch.cs script in the 1362_09_01 folder. Try attaching this script to the Main Camera object and use it to control the speed by clicking the left and right mouse buttons.
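The clamp-and-pitch arithmetic at the heart of AccelRocket() is engine-agnostic. Here is a minimal sketch of it in plain Java (not Unity C#; the class and method names are made up for illustration), with the speed range passed in so the absolute-value step is visible when negative speeds are allowed:

```java
public class PitchFromSpeed {

    // Mirrors Mathf.Clamp: constrain a value to the [min, max] range.
    static float clamp(float value, float min, float max) {
        return Math.max(min, Math.min(max, value));
    }

    // Returns the audio pitch for a given animation speed, following the
    // recipe's logic: clamp the speed, scale it by the animation/sound
    // ratio, and keep the result positive.
    static float pitchFor(float speed, float minSpeed, float maxSpeed,
                          float animationSoundRatio) {
        float clamped = clamp(speed, minSpeed, maxSpeed);
        // Math.abs keeps the pitch positive even when the animation
        // is reversed by a negative speed value.
        return Math.abs(clamped * animationSoundRatio);
    }

    public static void main(String[] args) {
        System.out.println(pitchFor(1.5f, 0f, 2f, 1f));   // in range: 1.5
        System.out.println(pitchFor(3.0f, 0f, 2f, 1f));   // clamped to max: 2.0
        System.out.println(pitchFor(-1.0f, -2f, 2f, 1f)); // reversed: 1.0
    }
}
```

Note that with the recipe's default minSpeed of 0, the clamp alone already prevents negative speeds; the absolute value only matters if you lower minSpeed below zero to allow reversed playback.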
Summary In this article, we learned how to match the audio pitch to the animation speed, and how to change the Animation/Sound Ratio. To learn more, please refer to the following books: Learning Unity 2D Game Development by Example: https://www.packtpub.com/game-development/learning-unity-2d-game-development-example. Unity Game Development Blueprints: https://www.packtpub.com/game-development/unity-game-development-blueprints. Getting Started with Unity: https://www.packtpub.com/game-development/getting-started-unity. Resources for Article: Further resources on this subject: The Vertex Functions [article] Lights and Effects [article] Virtual Machine Concepts [article]
Welcome to the Land of BludBorne

Packt
23 Oct 2015
12 min read
In this article by Patrick Hoey, the author of Mastering LibGDX Game Development, we will jump into creating the world of BludBourne (that's our game!). We will first learn some concepts and tools related to creating tile-based maps, and then we will look into starting with BludBourne! We will cover the following topics in this article: Creating and editing tile-based maps Implementing the starter classes for BludBourne (For more resources related to this topic, see here.) Creating and editing tile-based maps For the BludBourne project map locations, we will be using tilesets, which are terrain and decoration sprites in the shape of squares. These are easy to work with, since LibGDX supports tile-based maps with its core library. The easiest method to create these types of maps is to use a tile-based editor. There are many different types of tilemap editors, but there are two primary ones that are used with LibGDX because they have built-in support: Tiled: This is a free and actively maintained tile-based editor. I have used this editor for the BludBourne project. Download the latest version from http://www.mapeditor.org/download.html. Tide: This is a free tile-based editor built using Microsoft XNA libraries. The targeted platforms are Windows, Xbox 360, and Windows Phone 7. Download the latest version from http://tide.codeplex.com/releases. For the BludBourne project, we will be using Tiled. The following figure is a screenshot from one of the editing sessions when creating the maps for our game: The following is a quick guide on how we can use Tiled for this project: Map View (1): The map view is the part of the Tiled editor where you display and edit your individual maps. Numerous maps can be loaded at once, using a tab approach, so that you can switch between them quickly. There is a zoom feature available for this part of Tiled in the lower right-hand corner, and it can be easily customized depending on your workflow.
The maps are provided in the project directory (under core/assets/maps), but when you wish to create your own maps, you can simply go to File | New. In the New Map dialog box, first set the Tile size dimensions, which, for our project, will be a width of 16 pixels and a height of 16 pixels. The other setting is Map size, which represents the size of your map in unit size, using the tile size dimensions as your unit scale. An example would be creating a map that is 100 units by 100 units; if our tiles have a dimension of 16 pixels by 16 pixels, then this would give us a map size of 1600 pixels by 1600 pixels. Layers (2): This represents the different layers of the currently loaded map. You can think of creating a tile map like painting a scene, where you paint the background first and build up the various elements until you get to the foreground. Background_Layer: This tile layer represents the first layer created for the tilemap. This will be the layer to create the ground elements, such as grass, dirt paths, water, and stone walkways. Nothing else will be shown below this layer. Ground_Layer: This tile layer will be the second layer created for the tilemap. This layer will contain buildings built on top of the ground, or other structures like mountains, trees, and villages. The primary reason is to convey a feeling of depth to the map, as well as the fact that structural tiles such as walls have transparency (an alpha channel) so that they look like they belong on the ground where they are being created. Decoration_Layer: This third tile layer will contain elements meant to decorate the landscape in order to remove repetition and make more interesting scenes. These elements include rocks, patches of weeds, flowers, and even skulls. MAP_COLLISION_LAYER: This fourth layer is a special layer designated as an object layer. This layer does not contain tiles, but will have objects, or shapes.
This is the layer that you will configure to create areas in the map that the player character and non-player characters cannot traverse, such as walls of buildings, mountain terrain, ocean areas, and decorations such as fountains. MAP_SPAWNS_LAYER: This fifth layer is another special object layer designated only for player and non-playable character spawns, such as people in the towns. These spawns will represent the various starting locations where these characters will first be rendered on the map. MAP_PORTAL_LAYER: This sixth layer is the last object layer designated for triggering events in order to move from one map into another. These will be locations where the player character walks over, triggering an event which activates the transition to another map. An example would be in the village map, when the player walks outside of the village map, they will find themselves on the larger world map. Tilesets (3): This area of Tiled represents all of the tilesets you will work with for the current map. Each tileset, or spritesheet, will get its own tab in this interface, making it easy to move between them. Adding a new tileset is as easy as clicking the New icon in the Tilesets area, and loading the tileset image in the New Tileset dialog. Tiled will also partition out the tilemap into the individual tiles after you configure the tile dimensions in this dialog. Properties (4): This area of Tiled represents the different additional properties that you can set for the currently selected map element, such as a tile or object. An example of where these properties can be helpful is when we create a portal object on the portal layer. We can create a property defining the name of this portal object that represents the map to load. So, when we walk over a small tile that looks like a town in the world overview map, and trigger the portal event, we know that the map to load is TOWN because the name property on this portal object is TOWN. 
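Before moving on, the unit-to-pixel arithmetic from the New Map dialog is worth pinning down, since you will use it whenever you size a map. Here is a one-line sketch in Java (a hypothetical helper for illustration; not part of Tiled or LibGDX):

```java
public class MapSize {

    // Converts a map dimension given in tile units to pixels,
    // using the tile size set in Tiled's New Map dialog.
    static int unitsToPixels(int units, int tileSizePx) {
        return units * tileSizePx;
    }

    public static void main(String[] args) {
        // A 100 x 100 unit map with 16 px tiles is 1600 x 1600 px.
        System.out.println(unitsToPixels(100, 16)); // 1600
    }
}
```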
After reviewing a very brief description of how we can use the Tiled editor for BludBourne, the following screenshots show the three maps that we will be using for this project. The first screenshot is of the TOWN map, which will be where our hero will discover clues from the villagers, obtain quests, and buy armor and weapons. The town has shops, an inn, as well as a few small homes of local villagers: The next screenshot is of the TOP_WORLD map, which will be the location where our hero will battle enemies, find clues throughout the land, and eventually make his way to the evil antagonist holed up in his castle. The hero can see how the pestilence of evil has started to spread across the lands and lay ruin upon the only harvestable fields left: Finally, we make our way to the CASTLE_OF_DOOM map, which will be where our hero, once leveled enough, will battle the evil antagonist holed up in the throne room of his own castle. Here, the hero will find many high-level enemies, as well as high-valued items for trade: Implementing the starter classes for BludBourne Now that we have created the maps for the different locations of BludBourne, we can begin to develop the initial pieces of our source code project in order to load these maps, and move around in our world. The following diagram represents a high-level view of all the relevant classes that we will be creating: This class diagram is meant to show not only all the classes we will be reviewing in this article, but also the relationships that these classes share so that we are not developing them in a vacuum. The main entry point for our game (and the only platform-specific class) is DesktopLauncher, which will instantiate BludBourne and add it along with some configuration information to the LibGDX application lifecycle. BludBourne will derive from Game to minimize the lifecycle implementation needed by the ApplicationListener interface. BludBourne will maintain all the screens for the game.
MainGameScreen will be the primary gameplay screen that displays the different maps and the player character moving around in them. MainGameScreen will also create the MapManager, Entity, and PlayerController. MapManager provides helper methods for managing the different maps and map layers. Entity will represent the primary class for our player character in the game. PlayerController implements InputProcessor and will be the class that controls the player's input and controls on the screen. Finally, we have some asset manager helper methods in the Utility class used throughout the project. DesktopLauncher The first class that we will need to modify is DesktopLauncher, which the gdx-setup tool generated:

package com.packtpub.libgdx.bludbourne.desktop;

import com.badlogic.gdx.Application;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;
import com.packtpub.libgdx.bludbourne.BludBourne;

The Application class is responsible for setting up a window, handling resize events, rendering to the surfaces, and managing the application during its lifetime. Specifically, Application will provide the modules for dealing with graphics, audio, input and file I/O handling, logging facilities, memory footprint information, and hooks for extension libraries. The Gdx class is an environment class that holds static instances of Application, Graphics, Audio, Input, Files, and Net modules as a convenience for access throughout the game. The LwjglApplication class is the backend implementation of the Application interface for the desktop. The backend package that LibGDX uses for the desktop is called LWJGL. This implementation for the desktop will provide cross-platform access to native APIs for OpenGL. This interface becomes the entry point that the platform OS uses to load your game.
The LwjglApplicationConfiguration class provides a single point of reference for all the properties associated with your game on the desktop:

public class DesktopLauncher {
    public static void main (String[] arg) {
        LwjglApplicationConfiguration config = new LwjglApplicationConfiguration();
        config.title = "BludBourne";
        config.useGL30 = false;
        config.width = 800;
        config.height = 600;

        Application app = new LwjglApplication(new BludBourne(), config);
        Gdx.app = app;

        //Gdx.app.setLogLevel(Application.LOG_INFO);
        Gdx.app.setLogLevel(Application.LOG_DEBUG);
        //Gdx.app.setLogLevel(Application.LOG_ERROR);
        //Gdx.app.setLogLevel(Application.LOG_NONE);
    }
}

The config object is an instance of the LwjglApplicationConfiguration class where we can set top-level game configuration properties, such as the title to display on the display window, as well as the display window dimensions. The useGL30 property is set to false, so that we use the much more stable and mature implementation of OpenGL ES, version 2.0. The LwjglApplicationConfiguration properties object, as well as our starter class instance, BludBourne, are then passed to the backend implementation of the Application class, and an object reference is then stored in the Gdx class. Finally, we will set the logging level for the game. There are four values for the logging levels, which represent various degrees of granularity for application-level messages output to standard out. LOG_NONE is a logging level where no messages are output. LOG_ERROR will only display error messages. LOG_INFO will display all messages that are not debug-level messages. Finally, LOG_DEBUG is a logging level that displays all messages.
The class diagram for BludBourne shows the attributes and method signatures for our implementation: The import packages for BludBourne are as follows:

package com.packtpub.libgdx.bludbourne;

import com.packtpub.libgdx.bludbourne.screens.MainGameScreen;
import com.badlogic.gdx.Game;

The Game class is an abstract base class which wraps the ApplicationListener interface and delegates the implementation of this interface to the Screen class. This provides a convenience for setting the game up with different screens, including ones for a main menu, options, gameplay, and cutscenes. The MainGameScreen is the primary gameplay screen that the player will see as they move their hero around in the game world:

public class BludBourne extends Game {
    public static final MainGameScreen _mainGameScreen = new MainGameScreen();

    @Override
    public void create(){
        setScreen(_mainGameScreen);
    }

    @Override
    public void dispose(){
        _mainGameScreen.dispose();
    }
}

The gdx-setup tool generated our starter class BludBourne. This is the first place where we begin to set up our game lifecycle. An instance of BludBourne is passed to the backend constructor of LwjglApplication in DesktopLauncher, which is how we get hooks into the lifecycle of LibGDX. BludBourne will contain all of the screens used throughout the game, but for now we are only concerned with the primary gameplay screen, MainGameScreen. We must override the create() method so that we can set the initial screen for when BludBourne is initialized in the game lifecycle. The setScreen() method will check to see if a screen is already currently active. If the current screen is already active, then it will be hidden, and the screen that was passed into the method will be shown. In the future, we will use this method to start the game with a main menu screen. We should also override dispose() since BludBourne owns the screen object references. We need to make sure that we dispose of the objects appropriately when we are exiting the game.
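The hide-then-show behavior of setScreen() described above is easy to model outside LibGDX. The following simplified Java sketch (illustrative only, not the actual LibGDX API) shows the delegation that the Game base class performs for us:

```java
// Simplified model of LibGDX's Screen/Game delegation (illustrative only).
interface SimpleScreen {
    void show(); // called when the screen becomes active
    void hide(); // called when another screen replaces it
}

abstract class SimpleGame {
    private SimpleScreen current;

    // If a screen is already active, hide it first, then show the new one.
    public void setScreen(SimpleScreen screen) {
        if (current != null) current.hide();
        current = screen;
        if (current != null) current.show();
    }

    public SimpleScreen getScreen() {
        return current;
    }
}
```

This is why BludBourne only needs to call setScreen(_mainGameScreen) in create(): any future menu or options screen can be swapped in the same way, with the lifecycle calls handled by the base class.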
Summary In this article, we first learned about tile based maps and how to create them with the Tiled editor. We then learned about the high level architecture of the classes we will have to create and implemented starter classes which allowed us to hook into the LibGDX application lifecycle. Have a look at Mastering LibGDX Game Development to learn about textures, TMX formatted tile maps, and how to manage them with the asset manager. Also included is how the orthographic camera works within our game, and how to display the map within the render loop. You can learn to implement a map manager that deals with collision layers, spawn points, and a portal system which allows us to transition between different locations seamlessly. Lastly, you can learn to implement a player character with animation cycles and input handling for moving around the game map. Resources for Article: Further resources on this subject: Finding Your Way [article] Getting to Know LibGDX [article] Replacing 2D Sprites with 3D Models [article]
Using the Tiled map editor

Packt
13 Oct 2015
5 min read
LibGDX is a game framework and not a game engine, which is why it doesn't include an editor for placing game objects or building levels. Tiled is a 2D level/map editor well-suited for this purpose. In this article by Indraneel Potnis, the author of LibGDX Cross-platform Development Blueprints, we will learn how to draw objects and create animations. LibGDX has excellent support for rendering and reading maps/levels made with Tiled. (For more resources related to this topic, see here.) Drawing objects Sometimes, simple tiles may not satisfy your requirements. You might need to create objects with complex shapes. You can define these shape outlines in the editor easily. The first thing you need to do is create an object layer. Go to Layer | Add Object Layer: You will notice that a new layer has been added to the Layers pane called Object Layer 1. You can rename it if you like: With this layer selected, you can see the object toolbar getting enabled: You can draw basic shapes, such as a rectangle or an ellipse/circle: You can also draw a polygon and a polyline by selecting the appropriate options from the toolbar. Once you have added all the edges, click on the right mouse button to stop drawing the current object: Once the polygon/polyline is drawn, you can edit it by selecting the Edit Polygons option from the toolbar: After this, select the area that encompasses your polygon in order to change to the edit mode. You can edit your polygons/polylines now: You can also add custom properties to your polygons by right-clicking on them and selecting Object Properties: You can then add custom properties as mentioned previously: You can also add tiles as an object. Click on the Insert Tile icon in the toolbar: Once you select this, you can insert tiles as objects into the map.
You will observe that the tiles can be placed anywhere now, irrespective of the grid boundaries: To select and move multiple objects, you can select the Select Objects option from the toolbar: You can then select the area that encompasses the objects. Once they are selected, you can move them by dragging them with your mouse cursor: You can also rotate the object by dragging the indicators at the corners after they are selected: Tile animations and images Tiled allows you to create animations in the editor. Let's make an animated shining crystal. First, we will need an animation sheet of the crystal. I am using this one, which is 16 x 16 pixels per crystal: The next thing we need to do is add this sheet as a tileset to the editor and name it crystals. After you add the tileset, you can see a new tab in the Tilesets pane: Go to View | Tile Animation Editor to open the animation editor: A new window will open that will allow you to edit the animations: On the right-hand side, you will see the individual animation frames that make up the animation. This is the animation tileset, which we added. Hold Ctrl on your keyboard, and select all of them with your mouse. Then, drag them to the left window: The numbers beside the images indicate the amount of time each image will be displayed in milliseconds. The images are displayed in this order and repeat continuously. In this example, every image will be shown for 100ms or 1/10th of a second. In the bottom-left corner, you can preview the animation you just created. Click on the Close button. You can now see something like this in the Tilesets pane: The first tile represents the animation, which we just created. Select it, and you can draw the animation anywhere in the map. You can see the animation playing within the map: Lastly, we can also add images to our map. To use them, we need to add an image layer to our map. Go to Layer | Add Image Layer. You will notice that a new layer has been added to the Layers pane. 
Rename it House: To use an image, we need to set the image's path as a property for this layer. In the Properties pane, you will find a property called Image. There is a file picker next to it where you can select the image you want: Once you set the image, you can use it to draw on the map:   Summary In this article, we learned about a tool called Tiled, and we also learned how to draw various objects and make tile animations and add images. Carry on with LibGDX Cross-platform Development Blueprints to learn how to develop great games, such as Monty Hall Simulation, Whack a Mole, Bounce the Ball, and many more. You can also take a look at the vast array of LibGDX titles from Packt Publishing, a few among these are as follows: Learning Libgdx Game Development, Andreas Oehlke LibGDX Game Development By Example, James Cook LibGDX Game Development Essentials, Juwal Bose Resources for Article:   Further resources on this subject: Getting to Know LibGDX [article] Using Google's offerings [article] Animations in Cocos2d-x [article]

Lights and Effects

Packt
29 Sep 2015
27 min read
 In this article by Matt Smith and Chico Queiroz, authors of Unity 5.x Cookbook, we will cover the following topics: Using lights and cookie textures to simulate a cloudy day Adding a custom Reflection map to a scene Creating a laser aim with Projector and Line Renderer Reflecting surrounding objects with Reflection Probes Setting up an environment with Procedural Skybox and Directional Light (For more resources related to this topic, see here.) Introduction Whether you're willing to make a better-looking game, or add interesting features, lights and effects can boost your project and help you deliver a higher quality product. In this article, we will look at the creative ways of using lights and effects, and also take a look at some of Unity's new features, such as Procedural Skyboxes, Reflection Probes, Light Probes, and custom Reflection Sources. Lighting is certainly an area that has received a lot of attention from Unity, which now features real-time Global Illumination technology provided by Enlighten. This new technology provides better and more realistic results for both real-time and baked lighting. For more information on Unity's Global Illumination system, check out its documentation at http://docs.unity3d.com/Manual/GIIntro.html. The big picture There are many ways of creating light sources in Unity. Here's a quick overview of the most common methods. Lights Lights are placed into the scene as game objects, featuring a Light component. They can function in Realtime, Baked, or Mixed modes. Among the other properties, they can have their Range, Color, Intensity, and Shadow Type set by the user. 
There are four types of lights: Directional Light: This is normally used to simulate the sunlight Spot Light: This works like a cone-shaped spot light Point Light: This is a bulb lamp-like, omnidirectional light Area Light: This baked-only light type is emitted in all directions from a rectangle-shaped entity, allowing for a smooth, realistic shading For an overview of the light types, check Unity's documentation at http://docs.unity3d.com/Manual/Lighting.html. Different types of lights Environment Lighting Unity's Environment Lighting is often achieved through the combination of a Skybox material and sunlight defined by the scene's Directional Light. Such a combination creates an ambient light that is integrated into the scene's environment, and which can be set as Realtime or Baked into Lightmaps. Emissive materials When applied to static objects, materials featuring the Emission colors or maps will cast light over surfaces nearby, in both real-time and baked modes, as shown in the following screenshot: Projector As its name suggests, a Projector can be used to simulate projected lights and shadows, basically by projecting a material and its texture map onto the other objects. Lightmaps and Light Probes Lightmaps are basically texture maps generated from the scene's lighting information and applied to the scene's static objects in order to avoid the use of processing-intensive real-time lighting. Light Probes are a way of sampling the scene's illumination at specific points in order to have it applied onto dynamic objects without the use of real-time lighting. The Lighting window The Lighting window, which can be found through navigating to the Window | Lighting menu, is the hub for setting and adjusting the scene's illumination features, such as Lightmaps, Global Illumination, Fog, and much more. It's strongly recommended that you take a look at Unity's documentation on the subject, which can be found at http://docs.unity3d.com/Manual/GlobalIllumination.html. 
Using lights and cookie textures to simulate a cloudy day As can be seen in many first-person shooters and survival horror games, lights and shadows can add a great deal of realism to a scene, helping immensely to create the right atmosphere for the game. In this recipe, we will create a cloudy outdoor environment using cookie textures. Cookie textures work as masks for lights, modulating the intensity of the light projection according to the cookie texture's alpha channel. This allows for a silhouette effect (just think of the bat-signal) or, as in this particular case, subtle variations that give a filtered quality to the lighting. Getting ready If you don't have access to an image editor, or prefer to skip the texture map elaboration in order to focus on the implementation, please use the image file called cloudCookie.tga, which is provided inside the 1362_06_01 folder. How to do it... To simulate a cloudy outdoor environment, follow these steps: In your image editor, create a new 512 x 512 pixel image. Using black as the foreground color and white as the background color, apply the Clouds filter (in Photoshop, this is done by navigating to the Filter | Render | Clouds menu). Learning about the Alpha channel is useful, but you could get the same result without it. Skip steps 3 to 7, save your image as cloudCookie.png and, when changing the texture type in step 9, leave Alpha from Greyscale checked. Select your entire image and copy it. Open the Channels window (in Photoshop, this can be done by navigating to the Window | Channels menu). There should be three channels: Red, Green, and Blue. Create a new channel. This will be the Alpha channel. In the Channels window, select the Alpha 1 channel and paste your image into it. Save your image file as cloudCookie.PSD or TGA. Import your image file to Unity and select it in the Project view. From the Inspector view, change its Texture Type to Cookie and its Light Type to Directional.
Then, click on Apply, as shown: We will need a surface to actually see the lighting effect. You can either add a plane to your scene (via navigating to the GameObject | 3D Object | Plane menu), or create a Terrain (menu option GameObject | 3D Object | Terrain) and edit it, if you so wish. Let's add a light to our scene. Since we want to simulate sunlight, the best option is to create a Directional Light. You can do this through the drop-down menu named Create | Light | Directional Light in the Hierarchy view. Using the Transform component of the Inspector view, reset the light's Position to X: 0, Y: 0, Z: 0 and its Rotation to X: 90; Y: 0; Z: 0. In the Cookie field, select the cloudCookie texture that you imported earlier. Change the Cookie Size field to 80, or a value that you feel is more appropriate for the scene's dimension. Please leave Shadow Type as No Shadows. Now, we need a script to translate our light and, consequently, the Cookie projection. Using the Create drop-down menu in the Project view, create a new C# Script named MovingShadows.cs. Open your script and replace everything with the following code:

using UnityEngine;
using System.Collections;

public class MovingShadows : MonoBehaviour {
    public float windSpeedX;
    public float windSpeedZ;

    private float lightCookieSize;
    private Vector3 initPos;

    void Start() {
        initPos = transform.position;
        lightCookieSize = GetComponent<Light>().cookieSize;
    }

    void Update() {
        // Reset the light once it has drifted a full cookie-size away.
        Vector3 pos = transform.position;
        float xPos = Mathf.Abs(pos.x);
        float zPos = Mathf.Abs(pos.z);
        float xLimit = Mathf.Abs(initPos.x) + lightCookieSize;
        float zLimit = Mathf.Abs(initPos.z) + lightCookieSize;
        if (xPos >= xLimit) pos.x = initPos.x;
        if (zPos >= zLimit) pos.z = initPos.z;
        transform.position = pos;

        // Drift the light (and its cookie) according to the wind speed.
        float windX = Time.deltaTime * windSpeedX;
        float windZ = Time.deltaTime * windSpeedZ;
        transform.Translate(windX, 0, windZ, Space.World);
    }
}

Save your script and apply it to the Directional Light. Select the Directional Light.
In the Inspector view, change the parameters Wind Speed X and Wind Speed Z to 20 (you can change these values as you wish, as shown). Play your scene. The shadows will be moving. How it works... With our script, we are telling the Directional Light to move across the X and Z axes, causing the Light Cookie texture to be displaced as well. Also, we reset the light object to its original position whenever it travels a distance equal to or greater than the Light Cookie Size. The light position must be reset to prevent it from traveling too far, causing problems with real-time rendering and lighting. The Light Cookie Size parameter is used to ensure a smooth transition. The reason we are not enabling shadows is that the light angle for the X axis must be 90 degrees (or there will be a noticeable gap when the light resets to the original position). If you want dynamic shadows in your scene, please add a second Directional Light. There's more... In this recipe, we have applied a cookie texture to a Directional Light. But what if we were using Spot or Point Lights? Creating Spot Light cookies Unity documentation has an excellent tutorial on how to make Spot Light cookies. This is great for simulating shadows coming from projectors, windows, and so on. You can check it out at http://docs.unity3d.com/Manual/HOWTO-LightCookie.html. Creating Point Light Cookies If you want to use a cookie texture with a Point Light, you'll need to change the Light Type in the Texture Importer section of the Inspector.
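The reset test performed in the MovingShadows Update() method is the part of this recipe that is easiest to get wrong, so here is the per-axis wrap logic isolated as a plain Java function (illustrative names, not Unity code), following the same rule as the script: once the light has drifted to or beyond its initial position plus the Light Cookie Size, it snaps back to the start:

```java
public class LightWrap {

    // Returns the corrected position for one axis: if the absolute
    // position has reached |initPos| + cookieSize, reset to initPos;
    // otherwise keep the current position (mirrors the script's test).
    static float wrapAxis(float pos, float initPos, float cookieSize) {
        float limit = Math.abs(initPos) + cookieSize;
        return (Math.abs(pos) >= limit) ? initPos : pos;
    }

    public static void main(String[] args) {
        System.out.println(wrapAxis(40f, 0f, 80f)); // still drifting: 40.0
        System.out.println(wrapAxis(85f, 0f, 80f)); // past the limit: 0.0
    }
}
```

Because the travel limit includes the cookie size, the texture scrolls through a full repetition before the snap, which is what the recipe relies on to keep the transition smooth.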
This new method can be a real time-saver, allowing you to quickly assign the same reflection map to every object in the scene. Also, as you can imagine, it helps keep the overall look of the scene coherent and cohesive. In this recipe, we will learn how to take advantage of the Reflection Source feature. Getting ready For this recipe, we will prepare a Reflection Cubemap, which is basically the environment to be projected as a reflection onto the material. It can be made from either six images or, as shown in this recipe, a single image file. To help us with this recipe, we have provided a Unity package containing a prefab made of a 3D object and a basic Material (using a TIFF as Diffuse map), and also a JPG file to be used as the reflection map. All these files are inside the 1362_06_02 folder. How to do it... To add Reflectiveness and Specularity to a material, follow these steps: Import batteryPrefab.unitypackage to a new project. Then, select the battery_prefab object from the Assets folder, in the Project view. From the Inspector view, expand the Material component and observe the asset preview window. Thanks to the Specular map, the material already features a reflective look. However, it looks as if it is reflecting the scene's default Skybox, as shown: Import the CustomReflection.jpg image file. From the Inspector view, change its Texture Type to Cubemap, its Mapping to Latitude - Longitude Layout (Cylindrical), and check the boxes for Glossy Reflection and Fixup Edge Seams. Finally, change its Filter Mode to Trilinear and click on the Apply button, shown as follows: Let's replace the Scene's Skybox with our newly created Cubemap, as the Reflection map for our scene. In order to do this, open the Lighting window by navigating to the Window | Lighting menu. Select the Scene section and use the drop-down menu to change the Reflection Source to Custom.
Finally, assign the newly created CustomReflection texture as the Cubemap. Check out the new reflections on the battery_prefab object.

How it works...

While it is the material's Specular map that allows for a reflective look, including the intensity and smoothness of the reflection, the reflection itself (that is, the image you see in the reflection) is given by the Cubemap that we have created from the image file.

There's more...

Reflection Cubemaps can be achieved in many ways and have different mapping properties.

Mapping coordinates

The Cylindrical mapping that we applied was well suited for the photograph that we used. However, depending on how the reflection image is generated, a Cubic or Spheremap-based mapping can be more appropriate. Also, note that the Fixup Edge Seams option will try to make the image seamless.

Sharp reflections

You might have noticed that the reflection is somewhat blurry compared to the original image; this is because we have ticked the Glossy Reflections box. To get a sharper-looking reflection, deselect this option; in that case, you can also leave the Filter Mode option at its default (Bilinear).

Maximum size

At 512 x 512 pixels, our reflection map will probably run fine on lower-end machines. However, if the quality of the reflection map is not so important in your game's context, and the original image dimensions are big (say, 4096 x 4096), you might want to change the texture's Max Size in the Import Settings to a lower number.

Creating a laser aim with Projector and Line Renderer

Although using GUI elements, such as a crosshair, is a valid way to allow players to aim, replacing (or combining) it with a projected laser dot might be a more interesting approach. In this recipe, we will use the Projector and Line Renderer components to implement this concept.
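As an aside to the previous recipe: the Reflection Source that we switched to Custom through the Lighting window can also be assigned from a script, which is handy if you need to swap reflection Cubemaps at runtime. A minimal sketch follows; the class name and the public customCubemap field are assumptions for illustration:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Assigns a custom Cubemap as the scene's Reflection Source at runtime,
// equivalent to setting Reflection Source to Custom in the Lighting window.
public class CustomReflectionSource : MonoBehaviour
{
    public Cubemap customCubemap; // e.g., the imported CustomReflection asset

    void Start()
    {
        RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
        RenderSettings.customReflection = customCubemap;
    }
}
```

Since RenderSettings is scene-wide, every material that reads the Reflection Source will pick up the new Cubemap at once, which matches the "one reflection map for the whole scene" workflow described above.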
Getting ready

To help us with this recipe, a Unity package has been provided, containing a sample scene featuring a character holding a laser pointer, and also a texture map named LineTexture. All files are inside the 1362_06_03 folder. Also, we'll make use of the Effects assets package provided by Unity (which you should have installed when installing Unity).

How to do it...

To create a laser dot aim with a Projector, follow these steps:

Import BasicScene.unitypackage to a new project. Then, open the scene named BasicScene. This is a basic scene, featuring a player character whose aim is controlled via the mouse.

Import the Effects package by navigating to the Assets | Import Package | Effects menu. If you want to import only the necessary files within the package, deselect everything in the Importing package window by clicking on the None button, and then check the Projectors folder only. Then, click on Import.

From the Project view, locate the ProjectorLight shader (inside the Assets | Standard Assets | Effects | Projectors | Shaders folder). Duplicate the file and name the new copy ProjectorLaser.

Open ProjectorLaser. In the first line of the code, change Shader "Projector/Light" to Shader "Projector/Laser". Then, locate the line of code Blend DstColor One and change it to Blend One One. Save and close the file.

The reason for editing the laser's shader was to make it stronger by changing its blend type to Additive. However, if you want to learn more about it, check out Unity's documentation on the subject, which is available at http://docs.unity3d.com/Manual/SL-Reference.html.

Now that we have fixed the shader, we need a material. From the Project view, use the Create drop-down menu to create a new Material. Name it LaserMaterial. Then, select it from the Project view and, from the Inspector view, change its Shader to Projector/Laser.

From the Project view, locate the Falloff texture.
Open it in your image editor and, except for the first and last columns of pixels, which should be black, paint everything white. Save the file and go back to Unity.

Change the LaserMaterial's Main Color to red (RGB: 255, 0, 0). Then, from the texture slots, select the Light texture as Cookie and the Falloff texture as Falloff.

From the Hierarchy view, find and select the pointerPrefab object (MsLaser | mixamorig:Hips | mixamorig:Spine | mixamorig:Spine1 | mixamorig:Spine2 | mixamorig:RightShoulder | mixamorig:RightArm | mixamorig:RightForeArm | mixamorig:RightHand | pointerPrefab). Then, from the Create drop-down menu, select Create Empty Child. Rename the new child of pointerPrefab to LaserProjector.

Select the LaserProjector object. Then, from the Inspector view, click the Add Component button and navigate to Effects | Projector. From the Projector component, set the Orthographic option to true and Orthographic Size to 0.1. Finally, select LaserMaterial in the Material slot.

Test the scene. You will be able to see the laser aim dot.

Now, let's create a material for the Line Renderer component that we are about to add. From the Project view, use the Create drop-down menu to add a new Material. Name it Line_Mat.

From the Inspector view, change the shader of Line_Mat to Particles/Additive. Then, set its Tint Color to red (RGB: 255, 0, 0).

Import the LineTexture image file. Then, set it as the Particle Texture for Line_Mat.

Use the Create drop-down menu in the Project view to add a C# script named LaserAim. Then, open it in your editor.
Replace everything with the following code:

    using UnityEngine;
    using System.Collections;

    public class LaserAim : MonoBehaviour {
        public float lineWidth = 0.2f;
        public Color regularColor = new Color (0.15f, 0, 0, 1);
        public Color firingColor = new Color (0.31f, 0, 0, 1);
        public Material lineMat;

        private Vector3 lineEnd;
        private Projector proj;
        private LineRenderer line;

        void Start () {
            line = gameObject.AddComponent<LineRenderer>();
            line.material = lineMat;
            line.material.SetColor("_TintColor", regularColor);
            line.SetVertexCount(2);
            line.SetWidth(lineWidth, lineWidth);
            proj = GetComponent<Projector> ();
        }

        void Update () {
            RaycastHit hit;
            Vector3 fwd = transform.TransformDirection(Vector3.forward);
            if (Physics.Raycast (transform.position, fwd, out hit)) {
                lineEnd = hit.point;
                float margin = 0.5f;
                proj.farClipPlane = hit.distance + margin;
            } else {
                lineEnd = transform.position + fwd * 10f;
            }
            line.SetPosition(0, transform.position);
            line.SetPosition(1, lineEnd);

            if (Input.GetButton("Fire1")) {
                float lerpSpeed = Mathf.Sin (Time.time * 10f);
                lerpSpeed = Mathf.Abs(lerpSpeed);
                Color lerpColor = Color.Lerp(regularColor, firingColor, lerpSpeed);
                line.material.SetColor("_TintColor", lerpColor);
            }
            if (Input.GetButtonUp("Fire1")) {
                line.material.SetColor("_TintColor", regularColor);
            }
        }
    }

Save your script and attach it to the LaserProjector game object.

Select the LaserProjector GameObject. From the Inspector view, find the Laser Aim component and fill the Line Material slot with the Line_Mat material.

Play the scene. The laser aim is ready.

In this recipe, the width of the laser beam and its aim dot have been exaggerated. Should you need a more realistic thickness for your beam, change the Line Width field of the Laser Aim component to 0.05, and the Orthographic Size of the Projector component to 0.025. Also, remember to make the beam more opaque by setting the Regular Color of the Laser Aim component brighter.

How it works...
The laser aim effect was achieved by combining two different components: a Projector and a Line Renderer.

A Projector, which can be used to simulate light, shadows, and more, is a component that projects a material (and its texture) onto other game objects. By attaching a projector to the Laser Pointer object, we have ensured that it will face the right direction at all times. To get the right, vibrant look, we edited the projector material's shader, making it brighter. Also, we scripted a way to prevent projections from going through objects, by setting the projector's Far Clip Plane to approximately the same level as the first object receiving the projection. The line of code responsible for this action is proj.farClipPlane = hit.distance + margin;.

Regarding the Line Renderer, we opted to create it dynamically, via code, instead of manually adding the component to the game object. The code is also responsible for setting up its appearance, updating the line vertices' positions, and changing its color whenever the fire button is pressed, giving it a glowing/pulsing look. For more details on how the script works, don't forget to check out the commented code, available within the 1362_06_03 | End folder.

Reflecting surrounding objects with Reflection Probes

If you want your scene's environment to be reflected by game objects featuring reflective materials (such as ones with high Metallic or Specular levels), then you can achieve such an effect using Reflection Probes. They allow for real-time, baked, or even custom reflections through the use of Cubemaps.

Real-time reflections can be expensive in terms of processing; in that case, you should favor baked reflections, unless it's really necessary to display dynamic objects being reflected (mirror-like objects, for instance). Still, there are some ways real-time reflections can be optimized.
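One such optimization is choosing the probe's Refresh Mode and Time Slicing deliberately, which the steps below do through the Inspector. The same settings can also be applied from a script; here is a minimal sketch (the class name is an assumption, and the probe is assumed to live on the same GameObject):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Configures a ReflectionProbe for on-demand updates with per-face
// time slicing, the cheapest real-time combination in frame-rate terms.
public class ProbeSettings : MonoBehaviour
{
    void Start()
    {
        ReflectionProbe probe = GetComponent<ReflectionProbe>();
        probe.mode = ReflectionProbeMode.Realtime;
        probe.refreshMode = ReflectionProbeRefreshMode.ViaScripting;
        probe.timeSlicingMode = ReflectionProbeTimeSlicingMode.IndividualFaces;
        probe.RenderProbe(); // render the first Cubemap manually
    }
}
```

With ViaScripting, the probe only re-renders when RenderProbe() is called, so the cost is paid exactly when the reflected objects actually change, rather than every frame.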
In this recipe, we will test three different configurations for reflection probes:

Real-time reflections (constantly updated)
Real-time reflections (updated on demand) via script
Baked reflections (from the Editor)

Getting ready

For this recipe, we have prepared a basic scene featuring three sets of reflective objects: one is constantly moving, one is static, and one moves whenever it is interacted with. The Probes.unitypackage package containing the scene can be found inside the 1362_06_04 folder.

How to do it...

To reflect the surrounding objects using Reflection Probes, follow these steps:

Import Probes.unitypackage to a new project. Then, open the scene named Probes. This is a basic scene featuring three sets of reflective objects.

Play the scene. Observe that one of the systems is dynamic, one is static, and one rotates randomly whenever a key is pressed. Stop the scene.

First, let's create a constantly updated real-time reflection probe. From the Create drop-down button of the Hierarchy view, add a Reflection Probe to the scene (Create | Light | Reflection Probe). Name it RealtimeProbe and make it a child of the System 1 Realtime | MainSphere game object. Then, from the Inspector view, in the Transform component, change its Position to X: 0; Y: 0; Z: 0.

Now, go to the Reflection Probe component. Set Type to Realtime, Refresh Mode to Every frame, and Time Slicing to No time slicing.

Play the scene. The reflections will now be updated in real time. Stop the scene.

Observe that the only object displaying the real-time reflections is System 1 Realtime | MainSphere. The reason for this is the Size of the Reflection Probe. From the Reflection Probe component, change its Size to X: 25; Y: 10; Z: 25. Note that the small red spheres are now affected as well. However, it is important to notice that all objects display the same reflection.
Since our reflection probe's origin is placed at the same location as the MainSphere, all reflective objects will display reflections from that point of view. If you want to eliminate the reflection from reflective objects within the reflection probe, such as the small red spheres, select the objects and, from the Mesh Renderer component, set Reflection Probes to Off.

Add a new Reflection Probe to the scene. This time, name it OnDemandProbe and make it a child of the System 2 On Demand | MainSphere game object. Then, from the Inspector view, in the Transform component, change its Position to X: 0; Y: 0; Z: 0.

Now, go to the Reflection Probe component. Set Type to Realtime, Refresh Mode to Via scripting, and Time Slicing to Individual faces.

Using the Create drop-down menu in the Project view, create a new C# script named UpdateProbe. Open your script and replace everything with the following code:

    using UnityEngine;
    using System.Collections;

    public class UpdateProbe : MonoBehaviour {
        private ReflectionProbe probe;

        void Awake () {
            probe = GetComponent<ReflectionProbe> ();
            probe.RenderProbe();
        }

        public void RefreshProbe() {
            probe.RenderProbe();
        }
    }

Save your script and attach it to the OnDemandProbe. Now, find the script named RandomRotation, which is attached to the System 2 On Demand | Spheres object, and open it in the code editor. Right before the Update() function, add the following lines:

    private GameObject probe;
    private UpdateProbe up;

    void Awake() {
        probe = GameObject.Find("OnDemandProbe");
        up = probe.GetComponent<UpdateProbe>();
    }

Now, locate the line of code transform.eulerAngles = newRotation; and, immediately after it, add the following line:

    up.RefreshProbe();

Save the script and test your scene. Observe how the Reflection Probe is updated whenever a key is pressed. Stop the scene.

Add a third Reflection Probe to the scene.
Name it CustomProbe and make it a child of the System 3 On Custom | MainSphere game object. Then, from the Inspector view, in the Transform component, change its Position to X: 0; Y: 0; Z: 0.

Go to the Reflection Probe component. Set Type to Custom and click on the Bake button.

A Save File dialog window will show up. Save the file as CustomProbe-reflectionHDR.exr.

Observe that the reflection map does not include the reflection of the red spheres on it. To change this, you have two options: set the System 3 On Custom | Spheres GameObject (and all its children) as Reflection Probe Static, or, from the Reflection Probe component of the CustomProbe GameObject, check the Dynamic Objects option and bake the map again (by clicking on the Bake button).

If you want your reflection Cubemap to be dynamically baked while you edit your scene, you can set the Reflection Probe Type to Baked, open the Lighting window (the Window | Lighting menu), access the Scene section, and check the Continuous Baking option. Please note that this mode won't include dynamic objects in the reflection, so be sure to set System 3 Custom | Spheres and System 3 Custom | MainSphere as Reflection Probe Static.

How it works...

Reflection Probes act like omnidirectional cameras that render Cubemaps and apply them onto the objects within their constraints. When creating Reflection Probes, it's important to be aware of how the different types work:

Real-time Reflection Probes: Cubemaps are updated at runtime. Real-time Reflection Probes have three different Refresh Modes: On Awake (the Cubemap is baked once, right before the scene starts); Every frame (the Cubemap is constantly updated); Via scripting (the Cubemap is updated whenever the RenderProbe function is called).

Since Cubemaps feature six sides, Reflection Probes feature Time Slicing, so each side can be updated independently.
There are three different types of Time Slicing:

All Faces at Once: renders all faces at once and calculates mipmaps over 6 frames; updates the probe in 9 frames.
Individual Faces: each face is rendered over a number of frames; updates the probe in 14 frames. The results can be a bit inaccurate, but it is the least expensive solution in terms of frame-rate impact.
No Time Slicing: the probe is rendered and mipmaps are calculated in one frame. It provides high accuracy, but it is also the most expensive in terms of frame rate.

Baked: Cubemaps are baked while editing the scene. Cubemaps can be either manually or automatically updated, depending on whether the Continuous Baking option is checked (it can be found in the Scene section of the Lighting window).

Custom: Custom Reflection Probes can be either manually baked from the scene (and even include dynamic objects), or created from a premade Cubemap.

There's more...

There are a number of additional settings that can be tweaked, such as Importance, Intensity, Box Projection, Resolution, HDR, and so on. For a complete view of each of these settings, we strongly recommend that you read Unity's documentation on the subject, which is available at http://docs.unity3d.com/Manual/class-ReflectionProbe.html.

Setting up an environment with Procedural Skybox and Directional Light

Besides the traditional 6 Sided and Cubemap skyboxes, Unity now features a third type: the Procedural Skybox. Easy to create and set up, the Procedural Skybox can be used in conjunction with a Directional Light to provide Environment Lighting to your scene. In this recipe, we will learn about the different parameters of the Procedural Skybox.

Getting ready

For this recipe, you will need to import Unity's Standard Assets Effects package, which you should have installed when installing Unity.

How to do it...
To set up Environment Lighting using the Procedural Skybox and a Directional Light, follow these steps:

Create a new scene inside a Unity project. Observe that a new scene already includes two objects: the Main Camera and a Directional Light.

Add some cubes to your scene, including one at Position X: 0; Y: 0; Z: 0, scaled to X: 20; Y: 1; Z: 20, to be used as the ground.

Using the Create drop-down menu in the Project view, create a new Material and name it MySkybox. From the Inspector view, use the appropriate drop-down menu to change the Shader of MySkybox from Standard to Skybox/Procedural.

Open the Lighting window (menu Window | Lighting) and access the Scene section. In the Environment Lighting subsection, populate the Skybox slot with the MySkybox material, and the Sun slot with the Directional Light from the scene.

From the Project view, select MySkybox. Then, from the Inspector view, set Sun size to 0.05 and Atmosphere Thickness to 1.4. Experiment by changing the Sky Tint color to RGB: 148; 128; 128, and the Ground color to a value that resembles the scene's cube floor color (such as RGB: 202; 202; 202). If you feel the scene is too bright, try bringing the Exposure level down to 0.85.

Select the Directional Light and change its Rotation to X: 5; Y: 170; Z: 0. Note that the scene should now resemble a dawning environment.

Let's make things even more interesting. Using the Create drop-down menu in the Project view, create a new C# script named RotateLight. Open your script and replace everything with the following code:

    using UnityEngine;
    using System.Collections;

    public class RotateLight : MonoBehaviour {
        public float speed = -1.0f;

        void Update () {
            transform.Rotate(Vector3.right * speed * Time.deltaTime);
        }
    }

Save it and add it as a component to the Directional Light.

Import the Effects assets package into your project (via the Assets | Import Package | Effects menu).
Select the Directional Light. Then, from the Inspector view, in the Light component, populate the Flare slot with the Sun flare.

From the Scene section of the Lighting window, find the Other Settings subsection. Then, set Flare Fade Speed to 3 and Flare Strength to 0.5.

Play the scene. You will see the sun rising and the Skybox colors changing accordingly.

How it works...

Ultimately, the appearance of Unity's native Procedural Skybox depends on the five parameters that make it up:

Sun size: the size of the bright yellow sun that is drawn onto the skybox, located according to the Directional Light's Rotation on the X and Y axes.

Atmosphere Thickness: simulates how dense the atmosphere is for this skybox. Lower values (less than 1.0) are good for simulating outer-space settings. Moderate values (around 1.0) are suitable for earth-based environments. Values slightly above 1.0 can be useful when simulating air pollution and other dramatic settings. Exaggerated values (more than 2.0) can help illustrate extreme conditions or even alien settings.

Sky Tint: the color used to tint the skybox. It is useful for fine-tuning or creating stylized environments.

Ground: the color of the ground. It can really affect the Global Illumination of the scene, so choose a value that is close to the level's terrain and/or geometry (or a neutral one).

Exposure: determines the amount of light that gets into the skybox. Higher levels simulate overexposure, while lower values simulate underexposure.

It is important to notice that the Skybox's appearance will respond to the scene's Directional Light, which plays the role of the Sun. In this case, rotating the light around its X axis can create dawn and sunset scenarios, whereas rotating it around its Y axis will change the position of the sun, changing the cardinal points of the scene.
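Incidentally, these parameters can also be tweaked at runtime through Material.SetFloat and Material.SetColor, using the Skybox/Procedural shader's internal property names. A minimal sketch; the "_Exposure" property name is taken from the Skybox/Procedural shader, so verify it in your Unity version before relying on it:

```csharp
using UnityEngine;

// Fades the Procedural Skybox's exposure down over time, e.g. to darken
// the sky as a rotating Directional Light brings the sun below the horizon.
public class FadeSkybox : MonoBehaviour
{
    public Material skyboxMaterial; // assign the MySkybox material
    public float fadeSpeed = 0.05f; // exposure units per second

    void Update()
    {
        float exposure = skyboxMaterial.GetFloat("_Exposure");
        exposure = Mathf.Max(0f, exposure - fadeSpeed * Time.deltaTime);
        skyboxMaterial.SetFloat("_Exposure", exposure);
    }
}
```

Note that modifying a material asset in Play mode persists in the Editor; instantiate the material first (new Material(skyboxMaterial)) if that is undesirable.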
Also, regarding the Environment Lighting, note that although we have used the Skybox as the Ambient Source, we could have chosen a Gradient or a single Color instead, in which case the scene's illumination wouldn't be attached to the Skybox's appearance.

Finally, also regarding the Environment Lighting, please note that we have set the Ambient GI to Realtime. The reason for this was to allow real-time changes in the GI, promoted by the rotating Directional Light. In case we didn't need these changes at runtime, we could have chosen the Baked alternative.

Summary

In this article you have learned about, and taken a hands-on approach to, a number of Unity's lighting system features, such as cookie textures, Reflection maps, Lightmaps, Light and Reflection Probes, and Procedural Skyboxes. The article also demonstrated the use of Projectors.

Resources for Article:

Further resources on this subject:

Animation features in Unity 5 [article]
Scripting Strategies [article]
Editor Tool, Prefabs, and Main Menu [article]

Packt
23 Sep 2015
10 min read

User Interface

This article, written by John Doran, the author of the Unreal Engine Game Development Cookbook, covers the following recipes:

Creating a main menu
Animating a menu

(For more resources related to this topic, see here.)

In order to create a good game project, you need to be able to communicate information to the player. To do this, we need to create a user interface (UI), which will allow us to display information such as the player's health, inventory, and so on. Inside Unreal 4, we use the Slate UI framework to create user interfaces; however, it's a very complex system. To make things easier for end users, Unreal also released the Unreal Motion Graphics (UMG) UI Designer, which is a visual UI authoring tool with a much easier workflow. This is what we will be using in this article. For more information on Slate, refer to https://docs.unrealengine.com/latest/INT/Programming/Slate/index.html.

Creating a main menu

A main menu can serve as an introduction to your game and is a great place for us to discuss some additional elements that UMG has, such as Text and Buttons. We'll also learn how we can make buttons do things. Let's spend some time to see just how easy it is to create one!

How to do it…

Create a new level by going to File | New Level and select Empty Level. Next, inside the Content Browser tab, go to our UI folder, then to Add New | User Interface | Widget Blueprint, and give it a name of MainMenu. Double-click on it to open the editor.

In this menu, we are going to have the title of the game and then a series of buttons the player can press.

From the Palette tab, open up the Common section and drag and drop a Button onto the middle of the screen. Select the button and change its Size X to 400 and Size Y to 80. We will also rename the button to Play Game.
Drag and drop a Text object onto the Play Game button; you should see it snap onto the button as a child. Under Content, change Text to Play Game. From here, under Appearance, change the color of the button to black and change the Font size to 32.

From the Hierarchy tab, select the Play Game button and copy and paste it to create a duplicate. Move the button down, rename it to Quit Game, and change its Text content as well. Move both of the objects so that they're on the bottom part of the HUD, slightly above the bottom and side by side.

Lastly, we'll want to set our pivots and anchors accordingly. When you select either the Quit Game or Play Game button, you may notice a sun-like widget that displays the Anchors of the object (known as the Anchor Medallion). In our case, open Anchors from the Details panel and click on the bottom-center option.

Now that we have the buttons created, we want them to actually do something when we click on them. Select the Play Game button and, from the Details tab, scroll down until you see the Events component. There should be a series of big green + buttons. Click on the green button beside OnClicked.

Next, it will take us to the Event Graph with the appropriate event created for us. To the right of the event, right-click and create an Open Level action. Under Level Name, put in whatever level you like (for example, StarterMap) and then connect the output of the OnClicked action to the input of the Open Level action.

To the right of that, create a Remove from Parent action to make sure that the menu doesn't stay on screen when we leave. Finally, create a Get Player Controller action and, to the right of it, a Set Show Mouse Cursor action, which should be disabled, so that the mouse will no longer be visible once we leave the menu (we only want to see the mouse while in the menu). (Drag Return Value from the Get Player Controller action to create a new node and search for the mouse cursor action.)
Now, go back to the Designer view and select the Quit Game button. Click on its OnClicked button as well and, to the right of this one, create a Quit Game action, connecting the output of the OnClicked action to the input of the Quit Game action.

Lastly, as a bit of polish, let's add our game's title to the screen. Drag and drop another Text object onto the scene, this time with its Anchor at the top-center. From here, change Position X to 0 and Position Y to 176. Change Alignment in the X axis to .5 and check the Size to Content option for it to automatically resize. Set the Content component's Text property to the game's name (in my case, Game Name). Under the Appearance component, set the Font size to 93 and set Justification to Center.

There are a number of other styling options that you may wish to use when developing your HUDs. For more information about them, refer to https://docs.unrealengine.com/latest/INT/Engine/UMG/UserGuide/Styling/index.html.

Compile the menu and save it. Now we need to actually have the widget show up. To do so, we'll need to take the same steps as we did earlier. Open up the Level Blueprint by going to Blueprints | Open Level Blueprint and create an Event BeginPlay event. Then, to the right of this, right-click and create a Create Widget action. From the dropdown under Class, select MainMenu and connect the arrow from Event BeginPlay to the input of the Create MainMenu_C Widget.

After this, click and drag the output arrow and create an Add to Viewport action. Then, connect the Return Value of our Create Widget action to the Target of the Add to Viewport action.

Lastly, we also want to display the player's cursor on the screen to show the buttons. To do this, right-click and select Get Player Controller. Then, from its Return Value, create a Set Show Mouse Cursor action, this time enabled. Connect the output of the Add to Viewport action to the input of the Show Mouse Cursor action.

Compile, save, and run the project! With this, our menu is completed!
We can quit the game without any problem, and pressing the Play Game button will start our level!

Animating a menu

You may have created a menu or UI element at some point, but rather than having it static and non-moving, let's spend some time looking at how we can animate menus by having them fly in and out, or animating them in some other way. This will help add to the polish of the title, as well as enable players to notice things more easily as they move in.

Getting ready

Before we start working on this, we need to have a project created and set up. Complete the previous recipe all the way through.

How to do it…

Open up the MainMenu blueprint once more and, from the Animations tab at the bottom-left, click on the +Animation button and give the new animation a name of MenuFlyIn. Select the newly created animation and you should see the window on the right-hand side brighten up.

Next, click on the Auto Key toggle to have the animation editor automatically set keys that are appropriate for our implementation.

If it's not there already, move the timeline bar (the white line with two orange ends on the top and bottom) to the 0.00 mark on the animation timeline. Next, select the Game Name object and, under Color and Opacity, open it and change the A (alpha) value to 0.

Now move the timeline bar to the 1.00 mark and then open the color again and set the A value to 1. You'll notice a transition—going from completely transparent text to fully shown. This is a good start. Let's have the buttons fly in after the text appears.

Next, move the time bar to the 2.00 mark and select the Play Game button. Now, from the Details tab, you'll notice that there are new + icons to the left of the variables. Clicking one of these saves that value for use in the animation. Click on the + icon by the Position Y value.

If you use your scroll wheel while inside the dark grey portion of the timeline bar (where the keyframe numbers are displayed), it zooms in and out.
This can be quite useful when you create more complex animations. Now move the time bar to the 1.00 mark and move the Play Game button off the screen. By doing the animation in this way, we first save where we want the button to be at the end, and then go back in time to do the animation. Do the same animation for the Quit Game button.

Now that our animation is created, let's make it play when the menu starts. Click on the Graph button and, from the MyBlueprint tab under the Graphs section, double-click on the Event Construct event, which is called as soon as we add the menu to the scene. Grab the pin on the end of it and create a Play Animation action. Drag and drop a MenuFlyIn animation variable into the graph and select Get. Connect its output pin to the In Animation property of the Play Animation action.

Now that we have the animation play when we create the menu, let's have it play when we leave the menu. Select the Play Animation and MenuFlyIn variables and copy them. Then move to the OnClicked (Play Game) event. Drag the OnClicked event over to the left and remove its original connection to the Open Level action by holding down Alt and clicking. Now paste (Ctrl + V) the new objects and connect the output pin of OnClicked (Play Game) to the input of Play Animation. Now change Play Mode to Reverse.

To the right of this, create a Delay action. For the Duration variable, we want it to wait as long as the animation lasts, so from the MenuFlyIn variable, create another pin and create a Get End Time action. Connect the Return Value of Get End Time to the input of the Delay action. Connect the output of the Play Animation action to the input of the Delay action, and the Completed output of the Delay action to the input of the Open Level action.

Now we need to do the same for the OnClicked (Quit Game) event. Then compile, save, and run the game! Our menu is now completed, and we've learned how animation works inside UMG!
For more examples of using UMG for animation, refer to https://docs.unrealengine.com/latest/INT/Engine/UMG/UserGuide/Animation/index.html.

Summary

This article gave you some insight into Slate and the UMG Editor, used to create a number of UI elements and an animated main menu to tie your whole game together. We created a main menu and also learned how to make buttons do things. We spent some time looking at how we can animate menus by having them fly in and out.

Resources for Article:

Further resources on this subject:

The Blueprint Class [article]
Adding Fog to Your Games [article]
Overview of Unreal Engine 4 [article]

Packt
22 Sep 2015
13 min read

Prototyping Levels with Prototype

Level design 101 – planning

Now, just because we are going to be diving straight into Unity, I feel it's important to talk a little more about how level design is done in the game industry. While you may think a level designer just jumps into the editor and starts building, the truth is that you normally need to do a ton of planning before you even open up your tool.

Generally, a level begins with an idea. This can come from anything: maybe you saw a really cool building, a photo on the Internet gave you a certain feeling, or maybe you want to teach the player a new mechanic. Turning this idea into a level is what a level designer does. Taking all of these ideas, the level designer will create a level design document, which outlines exactly what you're trying to achieve with the entire level from start to end. In this article by John Doran, author of Building FPS Games with Unity, we'll see that a level design document describes everything inside the level, listing all of the possible encounters, puzzles, and so on that the player will need to complete, as well as any side quests the player will be able to achieve. To prepare for this, you should include as many references as you can, with maps, images, and movies similar to what you're trying to achieve. If you're working with a team, making this document available on a website or wiki is a great asset, so that everyone knows exactly what is being done in the level, what the team can use in their levels, and how difficult their encounters can be. Generally, you'll also want a top-down layout of your level, done either on a computer or on graph paper, with a line showing the player's general route through the level and the encounters and missions planned out.

(For more resources related to this topic, see here.)

Of course, you don't want to be too tied down to your design document.
It will change as you playtest and work on the level, but the documentation process will help solidify your ideas and give you a firm basis to work from.

For those of you interested in seeing some level design documents, feel free to check out the example by Adam Reynolds (Level Designer on Homefront and Call of Duty: World at War) at http://wiki.modsrepository.com/index.php?title=Level_Design:_Level_Design_Document_Example. If you want to learn more about level design, I'm a big fan of Beginning Game Level Design by John Feil (previously my teacher) and Marc Scattergood, Cengage Learning PTR. For more of an introduction to game design from scratch, check out Level Up!: The Guide to Great Video Game Design by Scott Rogers, Wiley, and The Art of Game Design by Jesse Schell. For some online resources, Scott has a neat GDC talk called Everything I Learned About Level Design I Learned from Disneyland, which can be found at http://mrbossdesign.blogspot.com/2009/03/everything-i-learned-about-game-design.html, and World of Level Design (http://worldofleveldesign.com/) is a good source to learn about level design, though it does not talk about Unity specifically.

In addition to a level design document, you can also create a game design document (GDD) that goes beyond the scope of just the level and includes story, characters, objectives, dialogue, concept art, level layouts, and notes about the game's content. However, that is something to explore on your own.

Creating architecture overview

As a level designer, one of the most time-consuming parts of your job will be creating environments. There are many different ways to create levels. By default, Unity gives us some basic meshes such as a Box, Sphere, and Cylinder. While it's technically possible to build a level this way, it would get really tedious very quickly. Next, I'm going to quickly go through the most popular options for building levels for games made in Unity before we jump into building a level of our own.
3D modelling software

A lot of the time, opening up a 3D modelling package and building the architecture that way is what professional game studios do. This gives you maximum freedom to create your environment and allows you to do exactly what you'd like to do, but it requires you to be proficient in that tool, be it Maya, 3ds Max, Blender (which can be downloaded for free at blender.org), or some other tool. Then, you just need to export your models and import them into Unity. Unity supports a lot of different formats for 3D models (the most commonly used are .obj and .fbx), but there are a lot of issues to consider. For some best practices when it comes to creating art assets, please visit http://blogs.unity3d.com/2011/09/02/art-assets-best-practice-guide/.

Constructing geometry with brushes

Constructive Solid Geometry (CSG), commonly referred to as brushes, is a tool artists and designers use to quickly block out pieces of a level from scratch. Using brushes inside the in-game level editor has long been a common approach to creating levels: Unreal Engine 4, Hammer, Radiant, and other professional game engines use this building structure, making it quite easy to create and iterate on levels quickly through a process called white-boxing, as it's very easy to make changes to the simple shapes. While creating complex geometry in a full 3D application can have a high barrier to entry, CSG brushes provide a quick way to create shapes with ease. Unity does not support building things like this by default, but there are several tools in the Unity Asset Store that add this functionality. For example, sixbyseven studio has an extension called ProBuilder that makes it very easy to build out levels. The only possible downside is the fact that it does cost money, though it is worth every penny.
However, sixbyseven has kindly released a free version of their tools called Prototype, which we installed earlier. It contains everything we will need for this chapter, but it does not allow us to add custom textures or use some of the more advanced tools. We will be using ProBuilder later on in the book to polish the final product. You can find out more information about ProBuilder at http://www.protoolsforunity3d.com/probuilder/.

Modular tilesets

Another way to generate architecture is through the use of "tiles" created by an artist. Similar to Lego pieces, we can snap these tiles together to build walls and other parts of a building. With creative use of the tiles, you can create a large amount of content from a minimal number of assets. This is probably the easiest way to create a level, at the expense of not being able to create unique-looking buildings, since you only have a few pieces to work with. Titles such as Skyrim use this approach to great effect to create their large world environments.

Mix and match

Of course, it's also possible to use a mixture of the preceding tools in order to combine their advantages. For example, you could use brushes to block out an area and then replace the boxes with highly detailed models from a tileset, which is what a lot of AAA studios do. In addition, we could initially place brushes to test our gameplay and then add in props to break up the repetitiveness of the levels, which is what we are going to be doing.

Creating geometry

The first thing we are going to do is learn how to create geometry, as described in the following steps. From the top menu, go to File | New Scene. This will give us a fresh start to build our project. Next, because we already have Prototype installed, let's create a cube by hitting Ctrl + K. Right now, our Cube (with a name of pb-Cube-1562 or something similar) is placed at a Position of 2, -7, -2.
However, for simplicity's sake, I'm going to place it in the middle of the world. We can do this by typing 0, 0, 0 into the Position fields: left-click in the X field, type 0, and then press Tab. Notice the cursor automatically moves to the Y field. Type 0, press Tab again, and then enter 0 in the Z field. Alternatively, you can right-click on the Transform component and select Reset Position.

Next, we need to center the camera back onto our Cube object. We can do this by going over to the Hierarchy tab and double-clicking on the Cube object (or selecting it and then pressing F).

Now, to actually modify this cube, we are going to open up Prototype. We can do this by first selecting our Cube object, going to the Pb_Object component, and then clicking on the green Open Prototype button. Alternatively, you can go to Tools | Prototype | Prototype Window. This will bring up a window much like the one displayed here. This new Prototype tab can be detached from the main Unity window or, if you drag the tab within Unity, it can be "hooked" into place elsewhere, as the following screenshot shows, where I have dragged and dropped it to the right of the Hierarchy tab.

Next, select the Scene tab in the middle of the screen and press the G key to toggle into the Object/Geometry mode. Alternatively, you can click on the Element button in the Scene tab. Unlike the default Object/Top Level mode, this will allow us to modify the cube directly in order to build upon it. For more information on the different modes, check out the Modes & Elements section at http://www.protoolsforunity3d.com/docs/probuilder/#buildingAndEditingGeometry.

You'll notice the top of the Prototype tab has three buttons. These set the selection type you currently want to use. The default is Vertex (or Point) mode, which allows us to select individual points to modify. The next is Edge and the last is Face.
Face mode is a good choice at this stage, because we only want to extend things out. Select the Face mode by either clicking on the button or pressing the H key twice, until Editing Faces appears on the screen. Afterwards, select the box's right side.

For a list of keyboard shortcuts included with Prototype/ProBuilder, check out http://www.protoolsforunity3d.com/docs/probuilder/#keyboardShortcuts.

Now, pull on the red handle to extend our brush outward. Easy enough. Note that, by default, pulling things out is done in increments of 1. This is nice when we are polishing our levels and trying to place things exactly where we want them, but right now we are just prototyping, so getting something out as quickly as possible is paramount for testing whether it's enjoyable. To help with this, we can use a feature of Unity called Unit Snapping. Undo the previous change by pressing Ctrl + Z. Then, move the camera over to the other side and select our longer face. Drag it 9 units out by holding down the Ctrl key (Command on Mac).

ProCore3D also has another tool called ProGrids, which has some advanced unit snapping functionality, but we are not going to be using it. For more information on it, check out http://www.protoolsforunity3d.com/progrids/. If you'd like to change the distance traveled while using unit snapping, set it via the Edit | Snap Settings… menu.

Next, drag both sides out until the floor is 9 x 9 units. To make things easier to see, select the Directional Light object in our scene via the Hierarchy tab and reduce the Light component's Intensity to 0.5. So, at this point, we have a nice-looking floor. However, to create our room, we first need to create our ceiling. Select the floor we have created and press Ctrl + D to duplicate the brush. Once completed, change back into the Object/Top Level editing mode and move the brush so that its Position is at 0, 4, 0.
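As an aside, the unit snapping we just used is simply rounding a dragged coordinate to the nearest multiple of the grid size. A minimal sketch in Python (the function names are mine; the default grid of 1 unit mirrors the behaviour described above, and Edit | Snap Settings… is what changes that increment in Unity):

```python
# Snap a dragged coordinate to the nearest grid increment, the way
# holding Ctrl/Command constrains a face drag while unit snapping.

def snap(value, grid=1.0):
    """Round a single coordinate to the nearest multiple of grid."""
    return round(value / grid) * grid

def snap_position(position, grid=1.0):
    """Snap an (x, y, z) tuple, e.g. a Transform position."""
    return tuple(snap(axis, grid) for axis in position)
```

For example, snap(8.7) gives 9.0, so a face dragged roughly nine units lands exactly nine units out.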
Alternatively, to set the height, you can click on the duplicated object and, from the Inspector tab, change the Position's Y value to 4.

Go back into the sub-selection mode by hitting H until you are in Faces mode. Then, hold down Ctrl and select all of the edges of our floor. Click on the Extrude button in the Prototype panel. This creates a new part on each of the four edges, which is by default 0.5 wide (changeable by clicking on the + button next to it). This adds additional edges and/or faces to our object. Next, we are going to extrude again; but, rather than doing it from the menu, let's do it manually by selecting the tops of our newly created edges, holding down the Shift key, and dragging up along the Y (green) axis. Hold down Ctrl after starting the extrusion to have it snap appropriately and fit around our ceiling. Note that the box may not look right as soon as you let go, as Prototype needs time to compute lighting and materials, which it will mention in the bottom-right part of Unity.

Next, select Main Camera in the Hierarchy, hit W to switch to the Translate mode, and F to center on the selection. Then, move our camera into the room. You'll notice it's completely dark due to the ceiling, but we can add a light to the world to fix that! Let's add a point light by going to GameObject | Light | Point Light and positioning it in the center of the room towards the ceiling (in my case, at 4.5, 2.5, 3.5). Then, raise the Range to 25 so that it lights the entire room.

Finally, let's add a player to see how they interact with the room. First, delete the Main Camera object from the Hierarchy, as we won't need it. Then, go into the Project tab and open up the Assets\UFPS\Base\Content\Prefabs\Players folder. Drag and drop the AdvancedPlayer prefab into the scene, moving it so that it doesn't collide with the walls, floor, or ceiling, a little higher than the ground, as shown in the following screenshot:
Next, save our level (Chapter 3_1_CreatingGeometry) and hit the Play button.

It may be a good idea to save your levels in such a way that you are able to go back and see what was covered in each section of each chapter, thus making things easier to find in the future. Again, remember that we can pull out a weapon by pressing the 1-5 keys. With this, we now have a simple room that we can interact with!

Summary

In this article, we took on the role of a level designer who has been asked to create a level prototype to prove that our gameplay is solid. We used the free Prototype tool to help in this endeavor, and we also learned some beginning level design.

Resources for Article:

Further resources on this subject:
Unity Networking – The Pong Game [article]
Unity 3.x Scripting - Character Controller versus Rigidbody [article]
Animations in Cocos2d-x [article]
Packt
07 Sep 2015
14 min read

Using the Mannequin editor

In this article, Richard Marcoux, Chris Goodswen, Riham Toulan, and Sam Howels, the authors of the book CRYENGINE Game Development Blueprints, take us through animation in CRYENGINE. In the past, animation states were handled by a tool called Animation Graph. This was akin to Flow Graph, but it handled animations and transitions for all animated entities, and unfortunately it reduced any transitions or variation in the animations to a spaghetti graph. Thankfully, we now have Mannequin! This is an animation system in which the handling of animation states is dealt with behind the scenes; all we need to take care of are the animations themselves.

In Mannequin, an animation and its associated data is known as a fragment. Any extra detail that we might want to add (such as animation variation, styles, or effects) can be layered on top of the fragment very simply in the Mannequin editor. While complex and detailed results can be achieved with all manner of first and third person animation in Mannequin, for level design we're only really interested in the basic fragments we want our NPCs to play as part of flavor and readability within level scripting. Before we look at generating some new fragments, we'll start off by looking at how we can add detail to an existing fragment: triggering a flare particle as part of our flare firing animation.

(For more resources related to this topic, see here.)

Getting familiar with the interface

First things first, let's open Mannequin! Go to View | Open View Pane | Mannequin Editor. This is initially quite a busy view pane, so let's get our bearings on what's important to our work. You may want to drag and adjust the sizes of the windows to better see the information displayed. In the top left, we have the Fragments window. This lists all the fragments in the game that pertain to the currently loaded preview. Let's look at what this means for us when editing fragment entries.
The preview workflow

A preview is a complete list of fragments that pertains to a certain type of animation. For example, the default preview loaded is sdk_playerpreview1p.xml, which contains all the first person fragments used in the SDK. You can browse the list of fragments in this window to get an idea of what this means: everything from climbing ladders to sprinting is defined as a fragment. However, we're interested in the NPC animations. To change the currently loaded preview, go to File | Load Preview Setup and pick sdk_humanpreview.xml. This is the XML file that contains all the third person animations for human characters in the SDK. Once this is loaded, your fragment list should update to display a larger list of fragments usable by AI. This is shown in the following screenshot:

If you don't want to perform this step every time you load Mannequin, you can change the editor's default preview setup in the preferences. Go to Tools | Preferences | Mannequin | General and change the Default Preview File setting to the XML of your choice.

Working with fragments

Now that we have the correct preview populating our fragment list, let's find our flare fragment. In the box with <FragmentID Filter> in it, type flare and press Enter. This will filter down the list, leaving you with the fireFlare fragment we used earlier. You'll see that the fragment is comprised of a tree. Expanding this tree one level brings us to the tag. A tag in Mannequin is a method of choosing animations within a fragment based on a game condition. For example, in the player preview we were in earlier, the begin_reload fragment has two tags: one for SDKRifle and one for SDKShotgun. Depending on the weapon selected by the player, a different tag is applied and consequently a different animation is picked. This allows animators to group together animations of the same type that are required in different situations.
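Conceptually, a fragment is a lookup from a set of active tags to a list of animation options. A toy model in Python makes the selection behaviour concrete; note that the fragment data and animation names below are invented for illustration, and this is not CRYENGINE code:

```python
import random

# fragment ID -> { tag set -> list of options }; an empty tag set
# stands in for the <default> entry. Animation names are made up.
FRAGMENTS = {
    "begin_reload": {
        frozenset({"SDKRifle"}): ["reload_rifle_01"],
        frozenset({"SDKShotgun"}): ["reload_shotgun_01"],
    },
    "fireFlare": {
        frozenset(): ["fireFlare_option_1"],
    },
}

def pick_fragment_option(fragment_id, active_tags, rng=random):
    """Pick an option matching the active tags, falling back to <default>."""
    options_by_tags = FRAGMENTS[fragment_id]
    options = (options_by_tags.get(frozenset(active_tags))
               or options_by_tags.get(frozenset(), []))
    return rng.choice(options)  # variety comes from multiple options
```

Here the random choice among a tag's options mirrors how Mannequin adds variety when a fragment has more than one entry.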
For our fireFlare fragment, as there are no differing scenarios of this type, it simply has a <default> tag. This is shown in the following screenshot:

Inside this tag, we can see there's one fragment entry: Option 1. These options are the possible variations that Mannequin will choose from when the fragment is chosen and the required tags are applied. We only have one variation within fireFlare, but other fragments in the human preview (for example, IA_talkFunny) offer extra entries to add variety to AI actions. To load this entry for further editing, double-click Option 1. Let's get to adding that flare!

Adding effects to fragments

After loading the fragment entry, the Fragment Editor window will have updated. This is the main window in the center of Mannequin, and it comprises a preview window to view the animation and a list of all the available layers and details we can add. The main piece of information currently visible here is the animation itself, shown in AnimLayer under FullBody3P.

At the bottom of the Fragment Editor window are some buttons that are useful for editing and previewing the fragment. These include a play/pause toggle (along with a playspeed dropdown) and a jump-to-start button. You can also zoom in and out of the timeline with the mouse wheel, and scrub the timeline by click-dragging the red timeline marker around the fragment. These controls are similar to the Track View cinematics tool and should be familiar if you've used it in the past.

Procedural layers

Here, we are able to add our particle effect to the animation fragment. To do this, we need to add a ProcLayer (procedural layer) to the FullBody3P section. The ProcLayer runs parallel to the AnimLayer and is where any extra layers of detail that fragments can contain are specified, from removing character collision to attaching props. For our purposes, we need to add a particle effect clip. To do this, double-click on the timeline within the ProcLayer.
This will spawn a blank proc clip for us to categorize. Select this clip, and the Procedural Clip Properties on the right-hand side of the Fragment Editor window will be populated with a list of parameters. All we need to do now is change the type of this clip from None to ParticleEffect, which is editable in the Type dropdown list. This should present us with a ParticleEffect proc clip visible in the ProcLayer alongside our animation, as shown in the following screenshot:

Now that we have our proc clip loaded with the correct type, we need to specify the effect. The SDK has a couple of flare effects in the particle libraries (searchable by going to RollupBar | Objects Tab | Particle Entity); I'm going to pick explosions.flare.a. To apply this, select the proc clip and paste your chosen effect name into the Effect parameter. If you now scrub through the fragment, you should see the particle effect trigger! However, currently the effect fires from the base of the character in the wrong direction. We need to align the effect to the weapon of the enemy. Thankfully, the ParticleEffect proc clip already supports this in its properties. In the Reference Bone parameter, enter weapon_bone and hit Enter. The weapon_bone is the generic bone name that characters' weapons are attached to, and as such it is a good bet for any case where we require effects or objects to be placed in a character's weapon position. Scrubbing through the fragment again, the effect will now fire from the weapon hand of the character.

If we ever need to find out bone names, there are a few ways to access this information within the editor. Hovering over the character in the Mannequin previewer will display the bone names. Alternatively, in the Character Editor (we'll go into the details later), you can scroll down in the Rollup window on the right-hand side, expand Debug Options, and tick ShowJointNames. This will display the names of all bones over the character in the previewer.
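To summarise, the data on a ParticleEffect proc clip boils down to a type, an effect name, a reference bone, and a start time on the timeline. A hypothetical sketch of that data in Python (the class and field names are invented, and the 1.5 second start time is only illustrative; CRYENGINE stores the real thing in its Mannequin ADB files):

```python
# A toy representation of the procedural clip configured above. The
# class and field names are invented for illustration only.

class ProcClip:
    def __init__(self, clip_type="None", effect=None,
                 reference_bone=None, start_time=0.0):
        self.clip_type = clip_type            # e.g. "ParticleEffect"
        self.effect = effect                  # particle library entry
        self.reference_bone = reference_bone  # bone the effect attaches to
        self.start_time = start_time          # seconds into the fragment

    def fits_fragment(self, fragment_duration):
        """The clip should start within the animation it decorates."""
        return 0.0 <= self.start_time <= fragment_duration

flare_clip = ProcClip(clip_type="ParticleEffect",
                      effect="explosions.flare.a",
                      reference_bone="weapon_bone",
                      start_time=1.5)
```

A check like fits_fragment is the mental test you perform when dragging the clip around the timeline: the trigger must land inside the animation's duration to line up with the firing pose.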
With the particle attached, we can now ensure that the timing of the particle effect matches the animation. To do this, click and drag the proc clip around the timeline; around 1.5 seconds seems to match the timings for this animation. With the effect timed correctly, we now have a fully functioning fireFlare fragment! Try testing out the setup we made earlier with this change. We should now have a far more polished-looking event.

The previewer in Mannequin shares the same viewport controls as the perspective view in Sandbox. You can use this to zoom in and look around to gain a better view of the animation preview.

The final thing we need to do is save our changes to the Mannequin databases! To do this, go to File | Save Changes. When the list of changed files is displayed, press Save. Mannequin will then tell you that you're editing data from the .pak files. Click Yes to this prompt and your data will be saved to your project. The resulting changed database files will appear in GameSDK\Animations\Mannequin\ADB, and they should be distributed with your project if you package it for release.

Adding a new fragment

Now that we know how to add some effects feedback to existing fragments, let's look at making a new fragment to use as part of our scripting. This is useful to know if you have animators on your project and you want to get their assets in game quickly to hook up to your content. In our humble SDK project, we can effectively simulate this, as there are a few animations that ship with the SDK that have no corresponding fragment. Now we'll see how to browse the raw animation assets themselves, before adding them to a brand new Mannequin fragment.

The Character Editor window

Let's open the Character Editor. Apart from being used for editing characters and their attachments in the engine, this is a really handy way to browse the library of available animation assets and preview them in a viewport.
To open the Character Editor, go to View | Open View Pane | Character Editor.

On some machines, the expense of rendering two scenes at once (that is, the main viewport and the viewport in the Character Editor or Mannequin Editor) can cause both to drop to a fairly sluggish frame rate. If you experience this, either close one of the other view panes you have on screen or, if it is tabbed with other panes, simply select another tab. You can also open the Mannequin Editor or the Character Editor without a level loaded, which allows for better performance and minimal load times when editing content.

Similar to Mannequin, the Character Editor will initially look quite overwhelming. The primary aspects to focus on are the Animations window in the top-left corner and the preview viewport in the middle. In the Filter option in the Animations window, we can enter search terms to narrow down the list of animations. An example of an animation that hasn't yet been turned into a Mannequin fragment is the stand_tac_callreinforcements_nw_3p_01 animation. You can find it by entering reinforcements into the search filter:

Selecting this animation will update the debug character in the Character Editor viewport so that they start to play the chosen animation. You can see this specific animation is a one-shot wave, and it might be useful as another trigger for enemy reinforcements further into our scripting. Let's turn this into a fragment! We need to make sure we don't forget this animation, though; right-click on the animation and click Copy. This will copy the name to the clipboard for future reference in Mannequin. The animation can also be dragged and dropped into Mannequin manually to achieve the same result.

Creating fragment entries

With our animation located, let's get back to Mannequin and set up our fragment. Ensuring that we're still in the sdk_humanpreview.xml preview setup, take another look at the Fragments window in the top left of Mannequin.
You'll see there are two rows of buttons: the top row controls the creation and editing of fragment entries (the animation options we looked at earlier), and the second row covers the adding and editing of fragment IDs themselves: the top-level fragment names. This is where we need to start. Press the New ID button on the second row of buttons to bring up the New FragmentID Name dialog. Here, we need to add a name that conforms to the prefixes we discussed earlier. As this is an action, make sure you add IA_ (interest action) as the prefix for the name you choose; otherwise, it won't appear in the fragment browser in the Flow Graph.

Once our fragment is named, we'll be presented with the Mannequin FragmentID Editor. For the most part, we won't need to worry about these options, but it's useful to be aware of how they might be useful (and don't worry, they can be edited after creation). The main parameters to note are the Scope options. These control which elements of the character are controlled by the fragment. By default, all these boxes are ticked, which means that our fragment will take control of each ticked aspect of the character.

An example of where we might want to change this is the character's LookAt control. If we want an NPC to look at another entity in the world as part of a scripted sequence (using the AI:LookAt Flow Graph node), it would not be possible with the current settings, because the LookPose and Looking scopes are controlled by the fragment. If we wanted to control this via Flow Graph, these scopes would need to be unticked, freeing them up for scripted control.

With scopes covered, press OK at the bottom of the dialog box to continue adding our callReinforcements animation! We now have a fragment ID created in our Fragments window, but it has no entries! With our new fragment selected, press the New button on the first row of buttons to add an entry.
This will automatically add itself under the <default> tag, which is the desired behavior, as our fragment will be tag-agnostic for the moment. This has now created a blank fragment in the Fragment Editor.

Adding the AnimLayer

This is where our animation from earlier comes in. Right-click on the FullBody3P track in the editor and go to Add Track | AnimLayer. As we did previously with our effect on the ProcLayer, double-click on the AnimLayer to add a new clip. This will create our new Anim Clip, with some red None markup to signify the lack of an animation. Now, all we need to do is select the clip, go to the Anim Clip Properties, and paste in our animation name by double-clicking the Animation parameter.

The Animation parameter has a helpful browser that will allow you to search for animations; simply click on the browse icon in the parameter entry section. It lacks the previewer found in the Character Editor, but it can be a quick way to find animation candidates by name within Mannequin.

With our animation finally loaded into a fragment, we should now have a fragment setup that displays a valid animation name on the AnimLayer. Clicking on Play will now play our reinforcements wave animation!

Once we save our changes, all we need to do now is load our fragment in an AISequence:Animation node in Flow Graph. This can be done by repeating the steps outlined earlier. This time, our new fragment should appear in the fragment dialog.

Summary

Mannequin is a very powerful tool for handling animations in CRYENGINE, and we have looked at how to get started with it.

Resources for Article:

Further resources on this subject:
Making an entity multiplayer-ready [article]
Creating and Utilizing Custom Entities [article]
CryENGINE 3: Breaking Ground with Sandbox [article]

Packt
25 Aug 2015
15 min read

Getting to Know LibGDX

In this article written by James Cook, author of the book LibGDX Game Development By Example, the author states, "Creating games is fun, and that is why I like to do it." The process of taking an idea for a game and actually delivering it has changed over the years. Back in the 1980s, it was quite common for the top games to be created by either a single person or a very small team. However, anyone lucky enough (in my opinion) to have seen games grow from quite simplistic affairs into the complex beasts that today's AAA titles are must also have seen the resources needed for them grow accordingly.

The advent of mobile gaming reduced the barrier to entry; once again, smaller teams could produce a game that could be a worldwide hit! Now, there are games of all genres and complexities available across the major gaming platforms. Due to this explosion in the number of games being made, new general-purpose game-making tools appeared in the community. Previously, in-house teams built and maintained very specific game engines for their games; however, this led to a lot of reinventing the wheel. I hate to think how much time I would have lost if, for each of my games, I had to start from scratch. Now, instead of worrying about how to display a 2D image on the screen, I can focus on creating the fun player experience I have in my head. My tool of choice? LibGDX.

(For more resources related to this topic, see here.)

Before I dive into what LibGDX is, here is how LibGDX describes itself, from the LibGDX wiki (https://github.com/libgdx/libgdx/wiki/Introduction):

LibGDX is a cross-platform game and visualization development framework.

So what does that actually mean? What can LibGDX do for us game-makers that allows us to focus purely on the gameplay? To begin with, LibGDX is Java-based. This means you can reuse a lot, and I mean a lot, of the tools that already exist in the Java world.
I can imagine a few of you right now must be thinking, "But Java? For a game? I thought Java was supposed to be slow." To a certain extent, this can be true; after all, Java is still an interpreted language that runs in a virtual machine. However, to combat the need for the best possible performance, LibGDX takes advantage of the Java Native Interface (JNI) to implement native platform code and negate the performance disadvantage.

One of the beauties of LibGDX is that it allows you to go as low-level as you would like. Direct access to filesystems, input devices, audio devices, and OpenGL (via OpenGL ES 2.0/3.0) is provided. However, the added edge LibGDX gives is that, with the APIs built on top of these low-level facilities, displaying an image on the screen now takes only a few lines of code.

A full list of the available features of LibGDX can be found here: http://libgdx.badlogicgames.com/features.html

I am happy to wait here while you go and check it out. Impressive list of features, no?

So, how cross-platform is this gaming platform? This is probably what you are thinking now. Well, as mentioned before, games are being delivered on many different platforms, be it consoles, PCs, or mobiles. LibGDX currently supports the following platforms:

Windows
Linux
Mac OS X
Android
BlackBerry
iOS
HTML/WebGL

That is a pretty comprehensive list. Being able to write your game once and have it delivered to all the preceding platforms is pretty powerful.

At this point, I would like to mention that LibGDX is completely free and open source. You can go to https://github.com/libGDX/libGDX and check out all the code in all its glory. If the code does something and you would like to understand how, it is all possible; or, if you find a bug, you can make a fix and offer it back to the community. Along with the source code, there are plenty of tests and demos showcasing what LibGDX can do and, more importantly, how to do it.
Check out the wiki for more information:

https://github.com/libgdx/libgdx/wiki/Running-Demos
https://github.com/libgdx/libgdx/wiki/Running-Tests

"Who else uses LibGDX?" is quite a common query that comes up during a LibGDX discussion. Well, it turns out just about everyone has used it. Google released a game called Ingress (https://play.google.com/store/apps/details?id=com.nianticproject.ingress&hl=en) on the Play Store in 2013, which uses LibGDX. Even Intel (https://software.intel.com/en-us/articles/getting-started-with-libgdx-a-cross-platform-game-development-framework) has shown an interest in LibGDX.

Finally, I would like to end this section with another quote from the LibGDX website:

"LibGDX aims to be a framework rather than an engine, acknowledging that there is no one-size-fits-all solution. Instead we give you powerful abstractions that let you choose how you want to write your game or application." (libGDX wiki: https://github.com/libgdx/libgdx/wiki/Introduction)

This means that you can use the available tools if you want to; if not, you can dive deeper into the framework and create your own!

Setting up LibGDX

We know by now that LibGDX is an awesome tool for creating games across many platforms, with the ability to iterate on our code at superfast speeds. But how do we start using it? Thankfully, some helpful people have made the setup process quite easy. However, before we get to that part, we need to ensure that we have the prerequisites installed, which are as follows:

Java Development Kit 7+ (at the time of writing, version 8 is available)
Android SDK

Not that big a list! Follow the given steps:

First things first. Go to http://www.oracle.com/technetwork/java/javase/downloads/index.html. Download and install the latest JDK if you haven't already done so.
Oracle's developers are wonderful people and have provided a useful installation guide, which you can refer to if you are unsure how to install the JDK, at http://docs.oracle.com/javase/8/docs/technotes/guides/install/install_overview.html.

Once you have installed the JDK, open up the command line and run the following command:

java -version

If it is installed correctly, you should get an output showing the installed Java version. If you get an error while doing this, consult the Oracle installation documentation and try again.

One final touch is to ensure that we have JAVA_HOME configured. On the command line, perform the following:

For Windows: set JAVA_HOME=C:\Path\To\JDK
For Linux and Mac OS X: export JAVA_HOME=/Path/To/JDK/

Next, on to the Android SDK. At the time of writing, Android Studio has just been released. Android Studio is an IDE offered by Google that is built upon JetBrains' IntelliJ IDEA Java IDE. If you feel comfortable using Android Studio as your IDE (and as a developer who has used IntelliJ for the last 5 years, I suggest that you at least give it a go), you can download Android Studio and the Android SDK in a bundle from here:

http://developer.android.com/sdk/index.html

Alternatively, if you plan to use a different IDE (Eclipse or NetBeans, for example), you can just install the tools from the following URL:

http://developer.android.com/sdk/index.html#Other

You can find the installation instructions here:

https://developer.android.com/sdk/installing/index.html?pkg=tools

However, I would like to point out that the official IDE for Android is now Android Studio, and no longer Eclipse with ADT.

For the sake of simplicity, we will only focus on making games for desktops for the greater part of this article. We will look at exporting to Android and iOS later on.

Once the Android SDK is installed, it is well worth running the SDK Manager application to finalize the setup.
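As a quick sanity check of the Java side of this setup, a tiny standalone program can print the values the build tools will rely on. This is purely illustrative (the class name CheckSetup is my own choice, not part of LibGDX):

```java
public class CheckSetup {
    public static void main(String[] args) {
        // The JVM reports its own version; LibGDX needs JDK 7+
        System.out.println("java.version = " + System.getProperty("java.version"));

        // These environment variables are only set if you exported them as described above
        String javaHome = System.getenv("JAVA_HOME");
        String androidHome = System.getenv("ANDROID_HOME");
        System.out.println("JAVA_HOME    = " + (javaHome == null ? "(not set)" : javaHome));
        System.out.println("ANDROID_HOME = " + (androidHome == null ? "(not set)" : androidHome));
    }
}
```

If java.version prints and matches what java -version reported, the JDK is wired up correctly for the steps that follow.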
If you opt to use Android Studio, you can access the SDK Manager from its icon in the toolbar. Alternatively, you can also access it as follows:

On Windows: double-click the SDK Manager.exe file at the root of the Android SDK directory
On Mac/Linux: open a terminal, navigate to the tools/ directory in the location where the Android SDK is installed, and run the android tool

The SDK Manager will then appear. As a minimum configuration, select:

Android SDK Tools
Android SDK Platform-tools
Android SDK Build-tools (latest available version)
The latest version of SDK Platform

Let them download and install the selected configuration. Then that's it! Well, not really. We just need to set the ANDROID_HOME environment variable. To do this, open up a command line and run the following command:

On Windows: set ANDROID_HOME=C:/Path/To/Your/Android/Sdk
On Linux and Mac OS X: export ANDROID_HOME=/Path/To/Your/Android/Sdk

Phew! With that done, we can now move on to the best part: creating our first ever LibGDX game!

Creating a project

Follow the given steps to create your own project:

As mentioned earlier, LibGDX comes with a really useful project setup tool. Download the application from here: http://libgdx.badlogicgames.com/download.html. At the time of writing, it is the big red "Download Setup App" button in the middle of the screen.

Once downloaded, open the command line and navigate to the location of the application. You will notice that it is a JAR file, which means we need to use Java to run it. Running it will open the setup UI.

Before we hit the Generate button, let's just take a look at what we are creating here:

Name: This is the name of our game.
Package: This is the Java package our game code will be developed in.
Game class: This parameter sets the name of our game class, where the magic happens!
Destination: This is the project's directory. You can change this to any location of your choice.
Android SDK: This is the location of the SDK.
If this isn't set correctly, we can change it here. Going forward, it might be worth setting the ANDROID_HOME environment variable.

Next is the version of LibGDX we want to use. At the time of writing, the version is 1.5.4.

Now, let's move on to the subprojects. As we are only interested in desktops at the moment, let's deselect the others.

Finally, we come to extensions. Feel free to uncheck any that are checked; we won't be needing any of them at this point in time. For more information on the available extensions, check out the LibGDX wiki (https://github.com/libgdx/libgdx/wiki).

Once all is set, let's hit the Generate button! A little window at the bottom of the UI will now spring to life, showing the setup progress as it downloads the necessary setup files. Once complete, open the command line, navigate to the project directory, and run your preferred tree command (on Windows, it is just "tree"). Hopefully, you will see the generated project layout, with the core and desktop subprojects alongside the Gradle files.

The astute among you will now ask, "What is this Gradle?", and quite rightly so. I haven't mentioned it yet, although it appears twice in our project's directory.

What is Gradle? Well, Gradle is a very excellent build tool, and LibGDX leverages its abilities to look after the dependencies, the build process, and IDE integration. This is especially useful if you are going to be working in a team with a shared code base. Even if you are not, the dependency management aspect alone is worth it. Anyone who isn't familiar with dependency management may be used to downloading Java JARs manually and placing them in a libs folder, only to run into problems later when the JAR they just downloaded needs another JAR, and so on. Dependency management takes care of this for you, and even better, the LibGDX setup application takes care of it for you by already describing the dependencies that you need to run!
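To make this concrete, here is a sketch of the kind of dependency declarations the setup application writes into the project's build.gradle. Treat the exact artifact list as illustrative; it varies with the subprojects and extensions you selected:

```groovy
ext {
    gdxVersion = '1.5.4'
}

project(":core") {
    dependencies {
        // The core LibGDX API shared by every platform target
        compile "com.badlogicgames.gdx:gdx:$gdxVersion"
    }
}

project(":desktop") {
    dependencies {
        // The desktop launcher depends on our game code plus the LWJGL backend
        compile project(":core")
        compile "com.badlogicgames.gdx:gdx-backend-lwjgl:$gdxVersion"
        compile "com.badlogicgames.gdx:gdx-platform:$gdxVersion:natives-desktop"
    }
}
```

With declarations like these in place, Gradle resolves the LibGDX JARs and native binaries for you on the first build; no manual libs folder required.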
Within LibGDX, there is something called the Gradle wrapper. This is essentially the Gradle application embedded into the project. This makes our project portable: if we want someone else to run it, they can, without installing Gradle themselves.

I guess this leads us to the question: how do we use Gradle to run our project? In the LibGDX wiki (https://github.com/libgdx/libgdx/wiki/Gradle-on-the-Commandline), you will find a comprehensive list of commands that can be used while developing your game. However, for now, we will only cover the desktop project.

What you may not have noticed is that the setup application actually generates a very simple "Hello World" game for us. So, we have something we can run from the command line right away! Let's go for it! On the command line, let's run the following:

On Windows: gradlew desktop:run
On Linux and Mac OS X: ./gradlew desktop:run

Don't worry if it suddenly starts downloading dependencies. This is our dependency management in action! All those JARs and native binaries are being downloaded and put on to classpaths. But we don't care; we are here to create games! After the command prompt has finished downloading the files, it should launch the "Hello World" game.

Awesome! You have just launched your very first LibGDX game! Although, before we get too excited, you will notice that not much actually happens here. It is just a red screen with the Bad Logic Games logo. I think now is the time to look at the code!

Importing a project

So far, we have launched the "Hello World" game via the command line and haven't seen a single line of code. Let's change that. To do this, I will use IntelliJ IDEA. If you are using Android Studio, the screenshots will look familiar. If you are using Eclipse, I am sure you will be able to see the common concepts.
To begin with, we need to generate the appropriate IDE project files. Again, this uses Gradle to do the heavy lifting for us. Once again, on the command line, run the following (pick the one that applies):

On Windows: gradlew idea or gradlew eclipse
On Linux and Mac OS X: ./gradlew idea or ./gradlew eclipse

Now, Gradle will have generated some project files. Open your IDE of choice and open the project. If you require more help, check out the following wiki pages:

https://github.com/libgdx/libgdx/wiki/Gradle-and-Eclipse
https://github.com/libgdx/libgdx/wiki/Gradle-and-Intellij-IDEA
https://github.com/libgdx/libgdx/wiki/Gradle-and-NetBeans

Once the project is open, have a poke around and look at some of the files. I think our first port of call should be the build.gradle file in the root of the project. Here, you will see that the layout of our project is defined and the dependencies we require are on display. It is a good time to mention that, going forward, there will be new releases of LibGDX, and to update our project to the latest version, all we need to do is update the following property:

gdxVersion = '1.6.4'

Now, run your game and Gradle will kick in and download everything for you!

Next, we should look for our game class; remember the one we specified in the setup application, MyGdxGame.java? Find it, open it, and be in awe of how simple it is to display that red screen and the Bad Logic Games logo. In fact, I am going to paste the code here for you to see how simple it is:

```java
public class MyGdxGame extends ApplicationAdapter {
    SpriteBatch batch;
    Texture img;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(1, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.draw(img, 0, 0);
        batch.end();
    }
}
```

Essentially, we can see that when the create() method is called, it sets up a SpriteBatch and creates a texture from a given JPEG file.
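One practical note on the render() method in the listing above: it runs once per frame, so anything you animate should be scaled by the frame's elapsed time (in LibGDX, Gdx.graphics.getDeltaTime()). The following standalone sketch shows that arithmetic in plain Java, so it runs without LibGDX on the classpath; the LogoMover class is a hypothetical helper of my own, not generated by the setup tool:

```java
// Frame-rate-independent movement: the same arithmetic you would apply
// inside render() before calling batch.draw(img, x, 0).
public class LogoMover {
    private float x = 0f;              // current x position in pixels
    private final float speed = 120f;  // pixels per second

    // deltaTime is the number of seconds elapsed since the last frame
    // (in a real LibGDX game: Gdx.graphics.getDeltaTime())
    public float update(float deltaTime) {
        x += speed * deltaTime;
        return x;
    }

    public static void main(String[] args) {
        LogoMover mover = new LogoMover();
        // Two frames at 60 FPS each advance the logo by 2 pixels
        System.out.println(mover.update(1f / 60f));
        System.out.println(mover.update(1f / 60f));
    }
}
```

Scaling by deltaTime means the logo crosses the screen at the same speed whether the game runs at 30 or 144 FPS, which is why you will see this pattern throughout LibGDX code.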
Then, the render() method is called on every iteration of the game loop; it covers the screen with the color red, and then draws the texture at the (0, 0) coordinate location.

Finally, we will look at the DesktopLauncher class, which is responsible for running the game in the desktop environment. Let's take a look at the following code snippet:

```java
public class DesktopLauncher {
    public static void main (String[] arg) {
        LwjglApplicationConfiguration config = new LwjglApplicationConfiguration();
        new LwjglApplication(new MyGdxGame(), config);
    }
}
```

The preceding code shows how simple it is. We have a configuration object that defines how our desktop application runs, setting things like the screen resolution and framerate, among others. In fact, this is an excellent time to utilize the open-source aspect of LibGDX. In your IDE, click through to the LwjglApplicationConfiguration class. You will see all the properties that can be tweaked, along with notes on what they mean.

The instance of the LwjglApplicationConfiguration class is then passed to the constructor of another class, LwjglApplication, along with an instance of our MyGdxGame class. Finally, those who have worked with Java a lot in the past will recognize that it is wrapped in a main method, the traditional entry point of a Java application.

That is all that is needed to create and launch a desktop-only LibGDX game.

Summary

In this article, we looked at what LibGDX is about, how to create a standard project, how to run it from the command line, and how to import it into your preferred IDE ready for development.

Resources for Article:

Further resources on this subject:
3D Modeling [article]
Using Google's offerings [article]
Animations in Cocos2d-x [article]