How-To Tutorials - Game Design

26 Articles

Component-based approach of Unity

Packt
18 Dec 2013
4 min read
(For more resources related to this topic, see here.)

First of all, you have a project, which is essentially a folder that contains all of the files and information about your game. Some of the files are called scenes (think of them as levels). A scene contains a number of game objects that you have added to it. The contents of your scenes are determined by you, and you can have as many of them as you want. You can also make your game switch between different scenes, thus making different sets of game objects active.

On a smaller scale, you have game objects and components. A game object by itself is simply an invisible container that does not do anything. Without adding appropriate components to it, it cannot, for instance, appear in the scene, receive input from the player, or move and interact with other objects. Using components, you can easily assemble powerful game objects while reusing several small parts, each responsible for a simple task or behavior—rendering the game object, handling the input, taking damage, playing an audio effect, and so on—making your game much simpler to develop and manage. Unity relies heavily on this approach, so the better you grasp it, the faster you will get good at it.

The only component that each and every game object in Unity has attached to it by default is Transform. It lets you define the game object's position, rotation, and scale. Normally, you can attach, detach, and destroy components in any given game object at will, but you cannot remove Transform.

Each component has a number of properties that you can access and change: these can be integer or floating point numbers, strings of text, textures, scripts, or references to game objects or other components. They are used to change the way a certain component behaves, or to influence its appearance or interaction. Examples include the position, rotation, and scale properties of the Transform component. The following screenshot shows the Wall game object with the Transform, Mesh Filter, Box Collider, Mesh Renderer, and Script components attached to it; the properties of Transform are displayed. In order to reveal or hide a component's properties, you need to left-click on its name or on the small arrow to the left of its icon.

Unity has a number of predefined game objects that already have components attached to them, such as cameras, lights, and primitives. You can access them by choosing GameObject | Create from the main menu. Alternatively, you can create empty game objects by pressing command + Shift + N (Ctrl + Shift + N in Windows) and attach components to them using the Components submenu.

The following figure shows the project structure that we have discussed. Note that there can be any number of scenes within a single project, any number of game objects within a single scene, any number of components attached to a single game object, and finally, any number of properties within a single component.

One final thing that you need to know about components right now is that you can copy them by right-clicking on the name of the component in the Inspector panel and selecting Copy Component from the contextual menu shown in the following screenshot. You can also reset the properties of the components to their default values, remove components, and move them up or down for your convenience.

Summary

This article has covered the basic concept of the component-based approach of Unity, and the figures/screenshots demonstrate its various aspects.
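As a closing illustration, here is a minimal C# sketch of the same component workflow driven from script; the Bouncer class and its use of a Rigidbody are invented for the example and are not part of the article:

```csharp
using UnityEngine;

// A script is itself a component: attaching this to a game object adds the behavior.
public class Bouncer : MonoBehaviour
{
    void Start()
    {
        // Every game object has a Transform, so this is always safe to touch.
        transform.position = new Vector3(0f, 5f, 0f);

        // Look up a sibling component attached to the same game object...
        Rigidbody body = GetComponent<Rigidbody>();

        // ...and attach a new one at runtime if it is missing.
        if (body == null)
        {
            body = gameObject.AddComponent<Rigidbody>();
        }
    }
}
```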


Audio Playback

Packt
04 Sep 2013
17 min read
(For more resources related to this topic, see here.)

Understanding FMOD

One of the main reasons why I chose FMOD for this book is that it contains two separate APIs—the FMOD Ex Programmer's API, for low-level audio playback, and FMOD Designer, for high-level data-driven audio. This will allow us to cover game audio programming at different levels of abstraction without having to use entirely different technologies. Besides that reason, FMOD is also an excellent piece of software, with several advantages for game developers:

- License: It is free for non-commercial use, and has reasonable licenses for commercial projects.
- Cross-platform: It works across an impressive number of platforms. You can run it on Windows, Mac, Linux, Android, iOS, and on most of the modern video game consoles by Sony, Microsoft, and Nintendo.
- Supported formats: It has native support for a huge range of audio file formats, which saves you the trouble of having to include other external libraries and decoders.
- Programming languages: Not only can you use FMOD with C and C++, there are also bindings available for other programming languages, such as C# and Python.
- Popularity: It is extremely popular, being widely considered the industry standard nowadays. It was used in games such as BioShock, Crysis, Diablo 3, Guitar Hero, StarCraft II, and World of Warcraft. It is also used to power several popular game engines, such as Unity3D and CryEngine.
- Features: It is packed with features, covering everything from simple audio playback, streaming, and 3D sound, to interactive music, DSP effects, and low-level audio programming.

Installing FMOD Ex Programmer's API

Installing a C++ library can be a bit daunting at first. The good side is that once you have done it for the first time, the process is usually the same for every other library. Here are the steps that you should follow if you are using Microsoft Visual Studio:

1. Download the FMOD Ex Programmer's API from http://www.fmod.org and install it to a folder that you can remember, such as C:\FMOD.
2. Create a new empty project, and add at least one .cpp file to it. Then, right-click on the project node in the Solution Explorer, and select Properties from the list. For all the steps that follow, make sure that the Configuration option is set to All Configurations.
3. Navigate to C/C++ | General, and add C:\FMOD\api\inc to the list of Additional Include Directories (entries are separated by semicolons).
4. Navigate to Linker | General, and add C:\FMOD\api\lib to the list of Additional Library Directories.
5. Navigate to Linker | Input, and add fmodex_vc.lib to the list of Additional Dependencies.
6. Navigate to Build Events | Post-Build Event, and add xcopy /y "C:\FMOD\api\fmodex.dll" "$(OutDir)" to the Command Line list.
7. Include the <fmod.hpp> header file from your code.

Creating and managing the audio system

Everything that happens inside FMOD is managed by a class named FMOD::System, which we must start by instantiating with the FMOD::System_Create() function:

```cpp
FMOD::System* system;
FMOD::System_Create(&system);
```

Notice that the function returns the system object through a parameter. You will see this pattern every time one of the FMOD functions needs to return a value, because they all reserve the regular return value for an error code. We will discuss error checking in a bit, but for now let us get the audio engine up and running.
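As a quick illustration of that pattern (the error-handling branch is just a placeholder; error checking is covered properly below):

```cpp
FMOD::System* system = 0;

// The created object comes back through the parameter;
// the return value is reserved for the error code.
FMOD_RESULT result = FMOD::System_Create(&system);
if (result != FMOD_OK)
{
    // React to the error here (see the error checking section).
}
```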
Now that we have a system object instantiated, we also need to initialize it by calling the init() method:

```cpp
system->init(100, FMOD_INIT_NORMAL, 0);
```

The first parameter specifies the maximum number of channels to allocate. This controls how many sounds you are able to play simultaneously. You can choose any number for this parameter because the system performs some clever priority management behind the scenes and distributes the channels using the available resources. The second and third parameters customize the initialization process, and you can usually leave them as shown in the example.

Many features that we will use work properly only if we update the system object every frame. This is done by calling the update() method from inside your game loop:

```cpp
system->update();
```

You should also remember to shut down the system object before your game ends, so that it can dispose of all resources. This is done by calling the release() method:

```cpp
system->release();
```

Loading and streaming audio files

One of the greatest things about FMOD is that you can load virtually any audio file format with a single method call. To load an audio file into memory, use the createSound() method:

```cpp
FMOD::Sound* sound;
system->createSound("sfx.wav", FMOD_DEFAULT, 0, &sound);
```

To stream an audio file from disk without having to store it in memory, use the createStream() method:

```cpp
FMOD::Sound* stream;
system->createStream("song.ogg", FMOD_DEFAULT, 0, &stream);
```

Both methods take the path of the audio file as the first parameter, and return a pointer to an FMOD::Sound object through the fourth parameter, which you can use to play the sound. The paths in the previous examples are relative to the application path. If you are running these examples in Visual Studio, make sure that you copy the audio files into the output folder (for example, using a post-build event such as xcopy /y "$(ProjectDir)*.ogg" "$(OutDir)").

The choice between loading and streaming is mostly a tradeoff between memory and processing power. When you load an audio file, all of its data is uncompressed and stored in memory, which can take up a lot of space, but the computer can play it without much effort. Streaming, on the other hand, barely uses any memory, but the computer has to access the disk constantly, and decode the audio data on the fly. Another difference (in FMOD at least) is that when you stream a sound, you can only have one instance of it playing at any time. This limitation exists because there is only one decode buffer per stream. Therefore, for sound effects that have to be played multiple times simultaneously, you have to either load them into memory, or open multiple concurrent streams. As a rule of thumb, streaming is great for music tracks, voice cues, and ambient tracks, while most sound effects should be loaded into memory.
The second and third parameters of these methods allow us to customize the behavior of the sound. There are many different options available, but the following list summarizes the ones we will be using the most. Using FMOD_DEFAULT is equivalent to combining the first option of each of these categories:

- FMOD_LOOP_OFF and FMOD_LOOP_NORMAL: These modes control whether the sound should only play once, or loop once it reaches the end.
- FMOD_HARDWARE and FMOD_SOFTWARE: These modes control whether the sound should be mixed in hardware (better performance) or software (more features).
- FMOD_2D and FMOD_3D: These modes control whether to use 3D sound.

We can combine multiple modes using the bitwise OR operator (for instance, FMOD_DEFAULT | FMOD_LOOP_NORMAL | FMOD_SOFTWARE). We can also tell the system to stream a sound even when we are using the createSound() method, by setting the FMOD_CREATESTREAM flag. In fact, the createStream() method is simply a shortcut for this.

When we do not need a sound anymore (or at the end of the game) we should dispose of it by calling the release() method of the sound object. We should always release the sounds we create, regardless of the audio system also being released:

```cpp
sound->release();
```

Playing sounds

With the sounds loaded into memory or prepared for streaming, all that is left is telling the system to play them using the playSound() method:

```cpp
FMOD::Channel* channel;
system->playSound(FMOD_CHANNEL_FREE, sound, false, &channel);
```

The first parameter selects the channel in which the sound will play. You should usually let FMOD handle it automatically, by passing FMOD_CHANNEL_FREE as the parameter. The second parameter is a pointer to the FMOD::Sound object that you want to play. The third parameter controls whether the sound should start in a paused state, giving you a chance to modify some of its properties without the changes being audible. If you set this to true, you will also need to use the next parameter so that you can unpause it later. The fourth parameter is an output parameter that returns a pointer to the FMOD::Channel object in which the sound will play. You can use this handle to control the sound in multiple ways, which will be the main topic of the next chapter. You can ignore this last parameter if you do not need any control over the sound, and simply pass in 0 in its place. This can be useful for non-looping one-shot sounds:

```cpp
system->playSound(FMOD_CHANNEL_FREE, sound, false, 0);
```
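Putting the pieces of this section together, here is a minimal end-to-end sketch; the explosion.wav filename and the isPlaying polling loop are illustrative, and error checking is omitted as in the rest of the examples:

```cpp
#include <fmod.hpp>

int main()
{
    // Create and initialize the audio system
    FMOD::System* system;
    FMOD::System_Create(&system);
    system->init(100, FMOD_INIT_NORMAL, 0);

    // Load a one-shot effect into memory and play it
    FMOD::Sound* sound;
    system->createSound("explosion.wav", FMOD_DEFAULT, 0, &sound);

    FMOD::Channel* channel;
    system->playSound(FMOD_CHANNEL_FREE, sound, false, &channel);

    // Keep the system updated until the sound finishes;
    // in a real game this update would live inside the frame loop
    bool isPlaying = true;
    while (isPlaying)
    {
        channel->isPlaying(&isPlaying);
        system->update();
    }

    // Dispose of all resources before exiting
    sound->release();
    system->release();
    return 0;
}
```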
Checking for errors

So far, we have assumed that every operation will always work without errors. However, in a real scenario, there is room for a lot to go wrong. For example, we could try to load an audio file that does not exist. In order to report errors, every function and method in FMOD has a return value of type FMOD_RESULT, which will only be equal to FMOD_OK if everything went right. It is up to the user to check this value and react accordingly:

```cpp
FMOD_RESULT result = system->init(100, FMOD_INIT_NORMAL, 0);
if (result != FMOD_OK)
{
    // There was an error, do something about it
}
```

For starters, it would be useful to know what the error was. However, since FMOD_RESULT is an enumeration, you will only see a number if you try to print it. Fortunately, there is a function called FMOD_ErrorString() inside the fmod_errors.h header file which will give you a complete description of the error. You might also want to create a helper function to simplify the error checking process. For instance, the following function will check for errors, print a description of the error to the standard output, and exit the application:

```cpp
#include <iostream>
#include <fmod_errors.h>

void ExitOnError(FMOD_RESULT result)
{
    if (result != FMOD_OK)
    {
        std::cout << FMOD_ErrorString(result) << std::endl;
        exit(-1);
    }
}
```

You could then use that function to check for any critical errors that should cause the application to abort:

```cpp
ExitOnError(system->init(100, FMOD_INIT_NORMAL, 0));
```

The initialization process described earlier also assumes that everything will go as planned, but a real game should be prepared to deal with any errors. Fortunately, there is a template provided in the FMOD documentation which shows you how to write a robust initialization sequence. It is a bit long to cover here, so I urge you to refer to the file named "Getting started with FMOD for Windows.pdf" inside the documentation folder for more information. For clarity, all of the code examples will continue to be presented without error checking, but you should always check for errors in a real project.

Project 1: building a simple audio manager

In this project, we will be creating a SimpleAudioManager class that combines everything that was covered in this chapter. Creating a wrapper for an underlying system that only exposes the operations that we need is known as the façade design pattern, and is very useful in order to keep things nice and simple. Since we have not seen how to manipulate sound yet, do not expect this class to be powerful enough to be used in a complex game. Its main purpose will be to let you load and play one-shot sound effects with very little code (which could in fact be enough for very simple games). It will also free you from the responsibility of dealing with sound objects directly (and having to release them) by allowing you to refer to any loaded sound by its filename. The following is an example of how to use the class:

```cpp
SimpleAudioManager audio;
audio.Load("explosion.wav");
audio.Play("explosion.wav");
```

From an educational point of view, what is perhaps even more important is that you use this exercise as a way to get some ideas on how to adapt the technology to your needs. It will also form the basis of the next chapters in the book, where we will build systems that are more complex.

Class definition

Let us start by examining the class definition:

```cpp
#include <string>
#include <map>
#include <fmod.hpp>

typedef std::map<std::string, FMOD::Sound*> SoundMap;

class SimpleAudioManager
{
public:
    SimpleAudioManager();
    ~SimpleAudioManager();
    void Update(float elapsed);
    void Load(const std::string& path);
    void Stream(const std::string& path);
    void Play(const std::string& path);
private:
    void LoadOrStream(const std::string& path, bool stream);
    FMOD::System* system;
    SoundMap sounds;
};
```

From browsing through the list of public class members, it should be easy to deduce what it is capable of doing:

- The class can load audio files (given a path) using the Load() method.
- The class can stream audio files (given a path) using the Stream() method.
- The class can play audio files (given a path) using the Play() method (granted that they have been previously loaded or streamed).
- There is also an Update() method and a constructor/destructor pair to manage the sound system.

The private class members, on the other hand, can tell us a lot about the inner workings of the class. At the core of the class is an instance of FMOD::System, responsible for driving the entire sound engine.
The class initializes the sound system in the constructor, and releases it in the destructor. Sounds are stored inside an associative container, which allows us to search for a sound given its file path. For this purpose, we will be relying on one of the C++ Standard Template Library (STL) associative containers, the std::map class, as well as the std::string class for storing the keys. Looking up a string key is a bit inefficient (compared to an integer, for example), but it should be fast enough for our needs. An advantage of having all the sounds stored in a single container is that we can easily iterate over them and release them from the class destructor. Since the code for loading and streaming audio files is almost the same, the common functionality has been extracted into a private method called LoadOrStream(), to which Load() and Stream() delegate all of the work. This prevents us from repeating code needlessly.

Initialization and destruction

Now, let us walk through the implementation of each of these methods. First we have the class constructor, which is extremely simple, as the only thing that it needs to do is initialize the system object:

```cpp
SimpleAudioManager::SimpleAudioManager()
{
    FMOD::System_Create(&system);
    system->init(100, FMOD_INIT_NORMAL, 0);
}
```

Updating is even simpler, consisting of a single method call:

```cpp
void SimpleAudioManager::Update(float elapsed)
{
    system->update();
}
```

The destructor, on the other hand, needs to take care of releasing the system object, as well as all the sound objects that were created. This process is not that complicated though. First, we iterate over the map of sounds, releasing each one in turn, and clearing the map at the end. The syntax might seem a bit strange if you have never used an STL iterator before, but all that it means is to start at the beginning of the container, and keep advancing until we reach its end. Then we finish off by releasing the system object as usual:

```cpp
SimpleAudioManager::~SimpleAudioManager()
{
    // Release every sound object and clear the map
    SoundMap::iterator iter;
    for (iter = sounds.begin(); iter != sounds.end(); ++iter)
        iter->second->release();
    sounds.clear();

    // Release the system object
    system->release();
    system = 0;
}
```

Loading or streaming sounds

Next in line are the Load() and Stream() methods, but let us examine the private LoadOrStream() method first. This method takes the path of the audio file as a parameter, and checks if it has already been loaded (by querying the sound map). If the sound has already been loaded there is no need to do it again, so the method returns. Otherwise, the file is loaded (or streamed, depending on the value of the second parameter) and stored in the sound map under the appropriate key:
```cpp
void SimpleAudioManager::LoadOrStream(const std::string& path, bool stream)
{
    // Ignore call if sound is already loaded
    if (sounds.find(path) != sounds.end()) return;

    // Load (or stream) file into a sound object
    FMOD::Sound* sound;
    if (stream)
        system->createStream(path.c_str(), FMOD_DEFAULT, 0, &sound);
    else
        system->createSound(path.c_str(), FMOD_DEFAULT, 0, &sound);

    // Store the sound object in the map using the path as key
    sounds.insert(std::make_pair(path, sound));
}
```

With the previous method in place, both the Load() and the Stream() methods can be trivially implemented as follows:

```cpp
void SimpleAudioManager::Load(const std::string& path)
{
    LoadOrStream(path, false);
}

void SimpleAudioManager::Stream(const std::string& path)
{
    LoadOrStream(path, true);
}
```

Playing sounds

Finally, there is the Play() method, which works the other way around. It starts by checking if the sound has already been loaded, and does nothing if the sound is not found in the map. Otherwise, the sound is played using the default parameters:

```cpp
void SimpleAudioManager::Play(const std::string& path)
{
    // Search for a matching sound in the map
    SoundMap::iterator sound = sounds.find(path);

    // Ignore call if no sound was found
    if (sound == sounds.end()) return;

    // Otherwise play the sound
    system->playSound(FMOD_CHANNEL_FREE, sound->second, false, 0);
}
```

We could have tried to automatically load the sound in the case when it was not found. In general, this is not a good idea, because loading a sound is a costly operation, and we do not want that happening during a critical gameplay section where it could slow the game down. Instead, we should stick to having separate load and play operations.

A note about the code samples

Although this is a book about audio, all the samples need an environment to run in. In order to keep the audio portion of the samples as clear as possible, we will also be using the Simple and Fast Multimedia Library 2.0 (SFML) (http://www.sfml-dev.org). This library can very easily take care of all the miscellaneous tasks, such as window creation, timing, graphics, and user input, which you will find in any game. For example, here is a complete sample using SFML and the SimpleAudioManager class. It creates a new window, loads a sound, runs a game loop at 60 frames per second, and plays the sound whenever the user presses the space key:

```cpp
#include <SFML/Window.hpp>
#include "SimpleAudioManager.h"

int main()
{
    sf::Window window(sf::VideoMode(320, 240), "AudioPlayback");
    sf::Clock clock;

    // Place your initialization logic here
    SimpleAudioManager audio;
    audio.Load("explosion.wav");

    // Start the game loop
    while (window.isOpen())
    {
        // Only run approx 60 times per second
        float elapsed = clock.getElapsedTime().asSeconds();
        if (elapsed < 1.0f / 60.0f) continue;
        clock.restart();

        sf::Event event;
        while (window.pollEvent(event))
        {
            // Handle window events
            if (event.type == sf::Event::Closed)
                window.close();

            // Handle user input
            if (event.type == sf::Event::KeyPressed &&
                event.key.code == sf::Keyboard::Space)
                audio.Play("explosion.wav");
        }

        // Place your update and draw logic here
        audio.Update(elapsed);
    }

    // Place your shutdown logic here
    return 0;
}
```

Summary

In this article, we have seen some of the advantages of using the FMOD audio engine.
We saw how to install the FMOD Ex Programmer's API in Visual Studio; how to initialize, manage, and release the FMOD sound system; how to load or stream an audio file of any type from disk; how to play a sound that has been previously loaded by FMOD; how to check for errors in every FMOD function; and how to create a simple audio manager that encapsulates the act of loading and playing audio files behind a simple interface.


Using Specular in Unity

Packt
16 Aug 2013
14 min read
(For more resources related to this topic, see here.)

The specularity of an object's surface simply describes how shiny it is. These types of effects are often referred to as view-dependent effects in the Shader world. This is because, in order to achieve a realistic Specular effect in your Shaders, you need to include the direction from which the camera or user is facing the object's surface. Specular also requires one more component to achieve its visual believability: the light direction. By combining these two directions, or vectors, we end up with a hotspot or highlight on the surface of the object, halfway between the view direction and the light direction. This halfway direction is called the half vector, and is something new we are going to explore in this article, along with customizing our Specular effects to simulate metallic and cloth Specular surfaces.

Utilizing Unity3D's built-in Specular type

Unity has already provided us with a Specular function we can use for our Shaders. It is called the BlinnPhong Specular lighting model. It is one of the more basic and efficient forms of Specular, which you can find used in a lot of games even today. Since it is already built into the Unity Surface Shader language, we thought it best to start with that first and build on it. You can also find an example in the Unity reference manual, but we will go into a bit more depth with it and explain where the data is coming from and why it is working the way it is. This will help you get a nice grounding in setting up Specular, so that we can build on that knowledge in the future recipes in this article.

Getting ready

Let's start by carrying out the following:

1. Create a new Shader and give it a name.
2. Create a new Material, give it a name, and assign the new Shader to its shader property.
3. Then create a sphere object and place it roughly at world center.
4. Finally, let's create a directional light to cast some light onto our object.

When your assets have been set up in Unity, you should have a scene that resembles the following screenshot:

How to do it…

1. Begin by adding the required properties to the Shader's Properties block.
2. We then need to make sure we add the variables to the CGPROGRAM block, so that we can use the data in our new properties inside our Shader's CGPROGRAM block. Notice that we don't need to declare the _SpecColor property as a variable. This is because Unity has already created this variable for us in the built-in Specular model. All we need to do is declare it in our Properties block and it will pass the data along to the surf() function.
3. Our Shader now needs to be told which lighting model we want to use to light our model with. You have seen the Lambert lighting model and how to make your own lighting model, but we haven't seen the BlinnPhong lighting model yet. So, let's add BlinnPhong to our #pragma statement.
4. We then need to modify our surf() function so that it writes the Specular values into the output. A hedged reconstruction of the complete Shader these steps produce is shown below.
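In this reconstruction, the _MainTint and _SpecPower property names are our own illustrative choices rather than the recipe's, while _SpecColor must keep that exact name for Unity's built-in Specular model to pick it up:

```
Shader "Custom/BlinnPhongSpecular"
{
    Properties
    {
        _MainTint ("Diffuse Tint", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _SpecColor ("Specular Color", Color) = (1,1,1,1)
        _SpecPower ("Specular Power", Range(0,1)) = 0.5
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        // Use Unity's built-in BlinnPhong lighting model
        #pragma surface surf BlinnPhong

        sampler2D _MainTex;
        float4 _MainTint;
        float _SpecPower;
        // _SpecColor is already declared by the built-in Specular model

        struct Input
        {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            half4 c = tex2D(_MainTex, IN.uv_MainTex) * _MainTint;
            o.Albedo = c.rgb;
            // Specular controls the tightness of the highlight;
            // Gloss scales its intensity
            o.Specular = _SpecPower;
            o.Gloss = 1.0;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```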
How it works…

This basic Specular is a great starting point when you are prototyping your Shaders, as you can get a lot accomplished in terms of writing the core functionality of the Shader, while not having to worry about the basic lighting functions. Unity has provided us with a lighting model that has already taken on the task of creating your Specular lighting for you. If you look into the UnityCG.cginc file found in your Unity install directory under the Data folder, you will notice that you have Lambert and BlinnPhong lighting models available for you to use. The moment you compile your Shader with #pragma surface surf BlinnPhong, you are telling the Shader to utilize the BlinnPhong lighting function in the UnityCG.cginc file, so that we don't have to write that code over and over again. With your Shader compiled and no errors present, you should see a result similar to the following screenshot:

Creating a Phong Specular type

The most basic and performance-friendly Specular type is the Phong Specular effect. It is the calculation of the light direction reflecting off of the surface, compared to the user's view direction. It is a very common Specular model used in many applications, from games to movies. While it isn't the most realistic in terms of accurately modeling the reflected Specular, it gives a great approximation that performs well in most situations. Plus, if your object is further away from the camera and a very accurate Specular isn't needed, this is a great way to provide a Specular effect on your Shaders. In this article, we will be covering how to implement the per-vertex version of this effect, and also see how to implement the per-pixel version using some new parameters in the Surface Shader's Input struct. We will see the difference and discuss when and why to use these two different implementations for different situations.

Getting ready

Create a new Shader, Material, and object, and give them appropriate names so that you can find them later. Finally, attach the Shader to the Material and the Material to the object. To finish off your new scene, create a new directional light so that we can see our Specular effect as we code it.

How to do it…

1. You might be seeing a pattern at this point, but we always like to start out with the most basic part of the Shader-writing process: the creation of properties. So, let's add the required properties to the Shader.
2. We then have to make sure to add the corresponding variables to our CGPROGRAM block inside our SubShader block.
3. Now we have to add our custom lighting model so that we can compute our own Phong Specular. Add the lighting function to the SubShader block (a hedged reconstruction appears after this section). Don't worry if it doesn't make sense at this point; we will cover each line of code in the next section.
4. Finally, we have to tell the CGPROGRAM block that it needs to use our custom lighting function instead of one of the built-in ones. We do this by changing the #pragma statement to name our custom lighting function (for a function named LightingPhong, that is #pragma surface surf Phong).

The following screenshot demonstrates the result of our custom Phong lighting model using our own custom reflection vector:

How it works…

Let's break down the lighting function by itself, as the rest of the Shader should be pretty familiar to you at this point. We start by using a lighting function signature that gives us the view direction. Remember that Unity has given you a set of lighting functions that you can use, but in order to use them correctly you have to have the same arguments they provide. Refer to the following table, or go to http://docs.unity3d.com/Documentation/Components/SL-SurfaceShaderLighting.html:

- Not view dependent: half4 LightingName (SurfaceOutput s, half3 lightDir, half atten);
- View dependent: half4 LightingName (SurfaceOutput s, half3 lightDir, half3 viewDir, half atten);

In our case, we are doing a Specular Shader, so we need the view-dependent lighting function structure. Writing our function with that signature tells the Shader that we want to create our own view-dependent lighting model. A hedged sketch of such a function follows.
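In this sketch, the LightingPhong name must match whatever follows surf in the #pragma statement, _SpecularColor and _SpecPower are assumed to be declared among the properties, and _LightColor0 is Unity's built-in light color variable:

```
inline half4 LightingPhong (SurfaceOutput s, half3 lightDir, half3 viewDir, half atten)
{
    // Diffuse term: 1 when the normal faces the light, -1 when it faces away
    float diff = dot(s.Normal, lightDir);

    // Bend the normal towards the light to approximate the reflection vector
    float3 reflectionVector = normalize(2.0 * s.Normal * diff - lightDir);

    // Compare the reflection vector with the view direction,
    // tightening the highlight with _SpecPower
    float spec = pow(max(0, dot(reflectionVector, viewDir)), _SpecPower);
    float3 finalSpec = _SpecularColor.rgb * spec;

    half4 c;
    c.rgb = (s.Albedo * _LightColor0.rgb * diff) + (_LightColor0.rgb * finalSpec) * atten;
    c.a = 1.0;
    return c;
}
```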
Always make sure that your lighting function name is the same in your lighting function declaration and the #pragma statement, or Unity will not be able to find your lighting model.

The lighting function begins by declaring the usual Diffuse component, by dotting the vertex normal with the light direction or vector. This will give us a value of 1 when a normal on the model is facing towards the light, and a value of -1 when facing away from the light direction. We then calculate the reflection vector by taking the vertex normal, scaling it by 2.0 and by the diff value, then subtracting the light direction from it. This has the effect of bending the normal towards the light; so as a vertex normal points away from the light, it is forced to look at the light. Refer to the following screenshot for a more visual representation. The script that produces this debug effect is included at the book's support page at www.packtpub.com/support.

Then all we have left to do is create the final spec value and color. To do this, we dot the reflection vector with the view direction and take it to a power of _SpecPower. Finally, we just multiply the _SpecularColor.rgb value over the spec value to get our final Specular highlight. The following screenshot displays the final result of our Phong Specular calculation isolated out in the Shader:

Creating a BlinnPhong Specular type

Blinn is another, more efficient way of calculating and estimating specularity. It is done by getting the half vector from the view direction and the light direction. It was brought into the world of Cg by a man named Jim Blinn. He found that it was much more efficient to just get the half vector instead of calculating our own reflection vectors. It cut down on both code and processing time. If you look at the built-in BlinnPhong lighting model included in the UnityCG.cginc file, you will notice that it is using the half vector as well, hence the reason why it is named BlinnPhong. It is just a simpler version of the full Phong calculation.

Getting ready

This time, instead of creating a whole new scene, let's just use the objects and scene we have, and create a new Shader and Material and name them BlinnPhong. Once you have a new Shader, double-click on it to launch MonoDevelop, so that we can start to edit our Shader.

How to do it…

1. First, we need to add our own properties to the Properties block, so that we can control the look of the Specular highlight.
2. Then, we need to make sure that we have created the corresponding variables inside our CGPROGRAM block, so that we can access the data from our Properties block inside our subshader.
3. Now it's time to create our custom lighting model that will process our Diffuse and Specular calculations.
4. To complete our Shader, we will need to tell our CGPROGRAM block to use our custom lighting model rather than a built-in one, by modifying the #pragma statement accordingly.

The following screenshot demonstrates the results of our BlinnPhong lighting model:

How it works…

The BlinnPhong Specular is almost exactly like the Phong Specular, except that it is more efficient because it uses less code to achieve almost the same effect. You will find this approach nine times out of ten in today's modern Shaders, as it is easier to code and lighter on Shader performance. Instead of calculating our own reflection vector, we are simply going to get the vector halfway between the view direction and the light direction, basically simulating the reflection vector. It has actually been found that this approach is more physically accurate than the last approach, but we thought it necessary to show you all the possibilities. So to get the half vector, we simply need to add the view direction and the light direction together and normalize the result, as in the sketch below.
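Here, the lightDir and viewDir names assume the view-dependent lighting function signature shown earlier:

```
// The half vector sits halfway between the light direction
// and the view direction, approximating the reflection vector
float3 halfVector = normalize(lightDir + viewDir);
```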
Then, we simply need to dot the vertex normal with that new half vector to get our main Specular value. After that, we just take it to a power of _SpecPower and multiply it by the Specular color variable. It's much lighter on the code and much lighter on the math, but still gives us a nice Specular highlight that will work for a lot of real-time situations.

Masking Specular with textures

Now that we have taken a look at how to create a Specular effect for our Shaders, let's start to look into the ways in which we can modify our Specular and gain more artistic control over its final visual quality. In this next recipe, we will look at how we can use textures to drive our Specular and Specular power attributes. The technique of using Specular textures is seen in most modern game development pipelines, because it allows the 3D artists to control the final visual effect on a per-pixel basis. This provides us with a way in which we can have a matte-type surface and a shiny surface all in one Shader; or, we can drive the width of the Specular or the Specular power with another texture, to have one surface with a broad Specular highlight and another surface with a very sharp, tiny highlight. There are many effects one can achieve by mixing Shader calculations with textures, and giving artists the ability to control their Shader's final visual effect is key to an efficient pipeline. Let's see how we can use textures to drive our Specular lighting models. This article will introduce you to some new concepts, such as creating your own Input struct, and learning how the data is being passed around from the Output struct, to the lighting function, to the Input struct, and to the surf() function. Understanding the flow of data between these core Surface Shader elements is core to a successful Shader pipeline.

Getting ready

We will need a new Shader, Material, and another object to apply our Shader and Material to. With the Shader and Material connected and assigned to your object in your scene, double-click the Shader to bring it up in MonoDevelop. We will also need a Specular texture to use. Any texture will do, as long as it has some nice variation in colors and patterns. The following screenshot shows the textures we are using for this recipe:

How to do it…

1. First, let's populate our Properties block with some new properties.
2. We then need to add the corresponding variables to the subshader, just after the #pragma statement, so that we can access the data from the properties in our Properties block.
3. Now we have to add our own custom Output struct. This will allow us to store more data for use between our surf() function and our lighting model (see the sketch at the end of this recipe for an idea of its shape). Don't worry if this doesn't make sense just yet; we will cover the finer details of this Output struct in the next section of the article. Place it just after the variables in the SubShader block.
4. Just after the Output struct we just entered, we need to add our custom lighting model. In this case, we have a custom lighting model called LightingCustomPhong.
Enter it just after the Output struct we just created.
5. In order for our custom lighting model to work, we have to tell the SubShader block which lighting model we want to use, by adding it to the #pragma statement so that it loads our custom lighting model.
6. Since we are going to be using a texture to modify the values of our base Specular calculation, we need to store another set of UVs for that texture specifically. This is done inside the Input struct, by placing the word uv in front of the name of the variable that is holding the texture. Add the Input struct just after your custom lighting model.
7. To finish off the Shader, we just need to modify our surf() function so that it passes the texture information to our lighting model function, letting us use the pixel values of the texture to modify our Specular values in the lighting model function.

The following screenshot shows the result of masking our Specular calculations with a color texture and its channel information. We now have a nice variation in Specular over the entire surface, instead of just a global value for the Specular.
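For reference, here is a hedged sketch of the kind of custom Output struct this recipe relies on; the SpecularColor member name is our own illustration of the extra data carried from the surf() function into the custom lighting function:

```
struct SurfaceCustomOutput
{
    fixed3 Albedo;
    fixed3 Normal;
    fixed3 Emission;
    fixed3 SpecularColor; // written in surf(), read in LightingCustomPhong()
    half Specular;
    fixed Gloss;
    fixed Alpha;
};
```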

Configuration and Handy Tweaks for UDK

Packt
01 Mar 2012
18 min read
(For more resources on UDK, see here.)

Groundwork for adjusting configuration defaults

In this article we'll build up from simply changing some trivial configuration settings to orchestrating unique gaming experiences. To start out, we need to make clear how UDK configures content under the hood, by introducing the layout and formatting for this kind of file. The experience in this recipe can be likened to savoring morsels of cake samples in a shop before committing to a purchase; while you're not actually changing major settings yet, the aim is to make some tasty observations so that later the changes you make will come from informed decisions.

Getting ready

In your UDK installation, you have a folder called C:\UDK\~\UDKGame\Config, and it is worthwhile to browse the files here and get to know them. Treat them like the faces of colleagues in a new company. It may take a while, but you'll eventually know all their names! You may want to make an alternative install of UDK before starting this article, to protect any content you've been working on. In these examples we're going to assume you are using ConTEXT, or a notepad alternative that highlights UnrealScript syntax, like those listed at http://wiki.beyondunreal.com/Legacy:Text_Editor. The advantage of ConTEXT is that you have a history of recently opened files, and several concurrently open files can be arranged in tabs; also, you can view specific lines in the file in response to line error warnings from the UDK log, should they arise. In ConTEXT, to display line numbers, go to the Options | Environment Options menu, then click on the Editor tab in the options dialog, tick on Line Numbers, and press Apply.

How to do it...

1. Open the file C:\UDK\~\UDKGame\Config\DefaultCharInfo.INI using ConTEXT. Alongside it, open C:\UDK\~\UDKGame\Config\UDKCharInfo.INI, which is a rather similar file.
2. Values we set from an existing class are presented after a reference to the class in square brackets, which surround the folder and class name, such as [UTGame.UTCharInfo].
3. Commented-out lines or notes are distinguished using ; and the entire line isn't parsed. This differs from the // used to comment out lines in UnrealScript.
4. In some configuration files you will see BasedOn=... followed by the path to another configuration file. This helps you track where the info is coming from.
5. Values for variables are set in the configuration file as in the example: LOD1DisplayFactor=0.4. Taken together, these conventions look like the sketch below.
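A hedged, single-snippet illustration of those conventions (the section name and value are the real examples quoted above; the comment line is invented):

```ini
; this line is a note and is ignored by the parser
[UTGame.UTCharInfo]
LOD1DisplayFactor=0.4
```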
6. In ConTEXT, click on File | Open... and scroll down the list of files. Notice that it previews the contents of the highlighted .INI files before you open them. Choose DefaultWeapon.INI and edit line 3 (strictly speaking, line 1 is empty). Press Ctrl + G (or View | Go to Line) and enter the line number 3. This line specifies CrosshairColor for the weapon. If you change the value A=255 to A=0 you will effectively have hidden the weapon target. Supposing you wanted to do so, you'd just have to save this file and then reload UDK. No compiling is needed for adjusting configuration files, unlike classes, unless the configuration is defining custom scripts to be used in some way. Let's assume for now that you don't want to hide the weapon cursor, so close the file without saving by pressing Ctrl + W and choosing No for the file save option.
7. Open the file UDKEditor.INI and go to line 12: Bindings=(Key="S",SeqObjClassName="Engine.SeqAct_PlaySound"), then look at line 28: Bindings=(Key="S",bControl=true,SeqObjClassName="Engine.SeqEvent_LevelLoaded"). What's important here is the added bControl=true for the S key: S will create a Play Sound node in Kismet, while Ctrl + S will create a Level Loaded event.
8. We really don't need to change anything, but you can, for instance, change Ctrl + S to Ctrl + L in line 28, for adding a new Level Loaded event node in Kismet: Bindings=(Key="L",bControl=true,SeqObjClassName="Engine.SeqEvent_LevelLoaded"). Make sure that UDK is closed before you save the change, or the file will not be affected at all. Reloading UDK after saving the change will see it take effect. Unless you have very long hands, you will probably want to test this using the Ctrl button on the right-hand side of the keyboard, so your fingers can hold L and Ctrl and left mouse click all at once. You should get the Level Loaded event in Kismet from this key combination now.
9. On that note, when you are specifying hotkeys for your game, bear in mind the idea of a user-friendly interface as you decide what keys to use. Often-used keys should be easy to remember, fast to reach, and possibly semantically clustered together.

How it works...

What we looked at in this recipe were some formatting features that occur in every configuration file. In particular, it is important to know that edits should be made while UDK is closed, or they get scrubbed back out immediately. Also, you will have noticed that the values we change reference UnrealScript classes from the C:\UDK\~\Development\Src folder, and reading through their layout can help you learn how the default content in UDK is made to work during gameplay.

There's more...

Consider version control software for editing UDK content

There is a free version control system called Bazaar (http://bazaar.canonical.com) that integrates with Windows folders. What version control software does is keep track of changes you have made to files, protecting them through a history-based backup that lets you review and revert changes to a previous state if needed. You can init a folder, then browse it, add content, and commit changes to changed files with comments that help you track what's going on. Where needed, you can review the change history and revert files to any previously committed state. Alternatives to Bazaar are the commercial tool Alienbrain Essentials for Artists, or the free repository TortoiseSVN. The utility of version control in UDK development is to prevent unrecoverable problems when a script compile goes wrong and changes haven't been tracked (and therefore can't be restored without re-installing from scratch), and to allow assets to be overwritten with a history.

Enabling the remote control for game inspection

This is a method for turning on an extra feature of UDK called the Remote Control, which can be used to manipulate render nodes, inspect Actors, and evaluate performance.

How to do it...

1. In Windows, go to the Start menu or your desktop, find the shortcut for the UDK Editor, and right-click on it to expose its properties. In the Target field, edit it to read: C:\UDK\~\Binaries\UDKLift.exe editor -wxwindows -remotecontrol -log. The main point of this entry is so that we can launch a tool called RemoteControl. The usefulness of running the -log window increases over time. It is used for tracking what is happening while you run the editor and PIE. When troubleshooting problems in Kismet, or with missing assets for example, it is a good first port of call for seeing where and when errors occur.
In the following screenshot, the log shows the creation of a Trigger Touch event in the main Kismet sequence, based on the actor Trigger_0 in the scene:
2. Having edited the UDK launch properties to allow RemoteControl to launch, load the Midday Lighting map template and PIE (F8). If you press Tab and type remotecontrol you should get a pop-up window like this:
3. If you hit the Actors tab you get access to the properties of certain actors, and you can change properties live while playing. For example, expand Actors | DominantDirectionalLight and double-click on DominantDirectionalLight_0. Then, in the light's property Movement | Rotation | Yaw or Pitch, try out different angle values. However, the changed values will revert to the editor state after PIE is closed. See also: http://udn.epicgames.com/Three/RemoteControl.html. An additional note: if you happen to minimize the RemoteControl window, in some cases it may stay like that until you press Alt + Space. To swap between the game and the Remote Control window, press Alt + Tab.
4. Pressing Show Flags lets you display various elements of the scene, such as Collision, Bones, and Bounds.
5. Go to the Stats tab and tick on Memory in the listing, then expand it and tick on the item within it called Lightmap Memory. This shows the cost of displaying lighting baked into lightmaps.
6. Further down in the Stats tab list, tick on the item D3D9RHI and look at the DrawPrimitive calls. In the game, look straight up at the empty sky and note the value. Now look at the box on the ground. Notice that the value increases. This is because the view has to draw the added objects (ground and box). In a large scene, especially on iOS, there is a functional limit to the number of draw calls.

How it works...

The RemoteControl tool is external to the game window and is created using wxWindows. It is meant for use during PIE, for evaluation of performance. The Actors tab shows us a tree list of what is in the level, along with filters. You can access Actor Properties for an actor under the crosshairs using the toolbar icon, or access them from a list. What you see in this screenshot is the result of turning on the memory statistics within a scene and the frame rate indicator (FPS = frames per second) through the Rendering tab in the Stats section, as well as the display of bones used in the scene. In Remote Control, you can set the Game Resolution (or game window size) under Rendering | View Settings. In the next recipe, we'll look at how to do this in UDK's configuration.

Changing the Play in Editor view resolution

This is a very short method for setting the view size for PIE sessions.

How to do it...

1. In C:\UDK\~\UDKGame\Config\DefaultEngineUDK.INI, press Ctrl + F and search for [SystemSettings]. This should expose the lines:

```ini
[SystemSettings]
; NOTE THAT ANY ITEMS IN THIS SECTION AFFECT ALL PLATFORMS!
bEnableForegroundShadowsOnWorld=False
bEnableForegroundSelfShadowing=False
ResX=1024
ResY=768
```

2. Change the ResX and ResY values to suit yourself, using screen resolutions that make sense, such as 1920x1080. This will update UDKEngine.INI in the same folder, so you will see the change reflected in these lines:

```ini
PlayInEditorWidth=1920
PlayInEditorHeight=1080
```

3. Load a level and PIE to see the difference. Note that if you update UDKEngine.INI directly it will just revert to whatever is set in DefaultEngineUDK.INI. There is a lot of redundancy built into UDK's configuration that takes some time and practice to get used to.
Removing the loading hints

Quickly getting rid of all the peripheral text and imagery that wraps around a given level, especially in console mode, is not easy. A few options exist for removing the more distracting elements, such as splash screens and menus. You may want to do this if you wish to show your work without anyone else's artwork getting in the way of your own. One method is called destructive editing, where you delete or blank out assets at the source, and this isn't as safe as it is quick. Instead, you can provide your own menus, splash screens, and UI by extending the classes that call up the default ones.

How to do it...

Removing the console mode videos during map loading

1. Open C:\UDK\~\UDKGame\Config\DefaultEngine.INI.
2. Press Ctrl + F and search for [FullScreenMovie], which should expose the startup and loadmap references. Comment out the entries as follows:

```ini
[FullScreenMovie]
//+StartupMovies=UDKFrontEnd.UDK_loading
//+LoadMapMovies=UDKFrontEnd.UDK_loading
```

3. Load a level and play in console mode. You won't get the movies that precede gameplay. If you take out all the pre-loading content, there may occur the problem of getting a look at the level too early, with "pre-caching" showing up. To learn how to instead swap out the .BIK files that constitute the loading movies between levels, you can follow the video by Michael J Collins: http://www.youtube.com/watch?v=SX1VQK1w4NU.

Removing the level loading hints

To totally prevent .BIK movies during development, you can open C:\UDK\~\Engine\Config\BaseEngine.INI and search for NoMovies, then adjust the FALSE in the exposed lines:

```
// Game Type name
//class'Engine'.static.AddOverlay( LoadingScreenGameTypeNameFont, Desc, 0.1822, 0.435, 1.0, 1.0, false);
// becomes
class'Engine'.static.AddOverlay( LoadingScreenGameTypeNameFont, Desc, 0.1822, 0.435, 1.0, 0, false);

// and Map name
// class'Engine'.static.AddOverlay( LoadingScreenMapNameFont, MapName, 0.1822, 0.46, 2.0, 2.0, false);
// becomes
class'Engine'.static.AddOverlay( LoadingScreenMapNameFont, MapName, 0.1822, 0.46, 2.0, 0, false);
```

What's happening here is that the last digit of four in an entry like 1,1,1,1 is the Alpha value, controlling transparency, so 1,1,1,0 will be invisible. The first three numbers are RGB values, but they can be anything if the Alpha is 0.

Removing the default exit menu

1. Open C:\UDK\~\UDKGame\Config\DefaultInput.INI and press Ctrl + F to search for Escape. The first match will expose a removed key binding, so search from the cursor again to find, in line 205: .Bindings=(Name="Escape",Command="GBA_ShowMenu"), and comment it out with ;.
2. Add this line underneath instead: .Bindings=(Name="Escape",Command="quit") if UDK should close directly. If you want to provide a custom menu, type: .Bindings=(Name="Escape",Command="open Menu"), where players pressing Esc will be sent to Menu.UDK (a scene of your own design) instead of the default menu. This won't do anything if you don't provision a Menu.UDK map first and cook it with your game levels. The map Menu.UDK would typically include some kind of clickable exit, resume, and reload buttons.
3. If you want Esc to automatically restart the level you're playing, put in "open YOURMAPNAME", but bear in mind that the only way to exit then will be Alt + F4.

Possibly a strong way to approach the Escape option is to have a Pause command that permits a choice about leaving the game through a floating button: Resume or Exit. In addition, you might have a similar floating button when the player dies, offering Replay or Exit, rather than the default Fire to Respawn.
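To recap those Escape options in one place, here is a hedged DefaultInput.INI sketch; the Menu.UDK map name is the example used above and must exist in your cooked content:

```ini
; the default binding, commented out
;.Bindings=(Name="Escape",Command="GBA_ShowMenu")

; pick one replacement:
.Bindings=(Name="Escape",Command="quit")       ; close UDK directly
;.Bindings=(Name="Escape",Command="open Menu") ; or jump to a custom Menu.UDK scene
```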
Editing DefaultEngineUDK to allow 4096x4096 texture compression

This is a method for enabling UDK to use textures larger than its default limit. Conventional wisdom says that game textures should be highly optimized, but large-resolution artwork is always enticing for many designers, and computers are getting better all the time. Performance issues aside, it's a good goal to push the graphic envelope, and larger textures allow detail to hold up better on close inspection.

Getting ready

We've provided one really large texture, 4096x4096, that you may find convenient, intended for use as a Skydome. If you are going to use a large texture, it would most likely be on a very important model that is always close to the camera, or else on a very large model which is always visible, such as a Skydome or Skybox. A simple tutorial for making a Skybox is at http://www.worldofleveldesign.com/categories/UDK/UDK-how-add-skybox.php but this recipe assumes the use of a provided one.

How to do it...

1. With UDK closed, open C:\UDK\~\UDKGame\Config\DefaultEngineUDK.INI.
2. Press Ctrl + F in ConTEXT and search for Skybox. You should be directed to line 127: TEXTUREGROUP_Skybox=(MinLODSize=512,MaxLODSize=2048,LODBias=0,MinMagFilter=aniso,MipFilter=point).
3. Change the value MaxLODSize=2048 to 4096. To really force it, you can also set MinLODSize=4096, so that the line reads TEXTUREGROUP_Skybox=(MinLODSize=4096,MaxLODSize=4096,LODBias=0,MinMagFilter=aniso,MipFilter=point). Doing this for a Skybox is okay, since there's normally only one used in a map, but you'd risk slowing the game down if you did this with regular textures. Note that TEXTUREGROUP_Skybox will allow a texture for a Skybox to be large, but not other things like character textures. For those, you can edit the relevant values in the other TEXTUREGROUP lines. Further down, in the SystemSettingsMobile section, the texture sizes are much smaller, which is due to the relatively limited processing power of mobile devices.
4. Now save, and next we'll verify this in fact worked by adding a large sky to a scene in UDK.
5. Look in the Content Browser and search the Packt folder for Packt_SkyDome, which is a typical mesh for a sky. You can see there is a completed version, and a copy called Packt_SkyDomeStart which has no material.
6. Go to the Packt texture group. You will see there is already a provisioned 4096x4096 texture for Packt_SkyDome, but let's import a fresh one. Right-click in the Content Browser panel and choose Import, then browse to find Packt_SkydomeStart.PNG, which is just a copy of the already existing texture. The reason to import it is to verify that you understand the compression setting. In the options you will see a panel that lets you specify the name info, which you should enter as Packt.Texture.SkyDomeTest or something unique. Further down you will see the compression settings. Choose LODGroup and from the expanding list choose TEXTUREGROUP_Skybox, as shown in the next screenshot, since this is what we have set to have 4096x4096 compression enabled in the configuration.
7. The file may take some time to process, given its size. Once it is complete, you can create a new Material, Packt.Material.SkyDomeTest_mat. Open it, and in the Material Editor hold T and click to add the highlighted SkyDomeTest texture to the Emissive channel. Skies are self-lighting, so in the PreviewMaterial node's properties, set the Lighting Model to MLM_Unlit.
7. The mesh Packt_SkyDomeStart is already UV mapped to match the texture, and if you double-click on it you can assign the new Material Packt.Material.SkyDomeTest_mat in the LODGroupInfo by expanding until you access the empty Material channel. Select the Material in the Content Browser, then use the assign icon to assign it. Then you can save the package and place the mesh in the level. Be sure to access its properties (F4), and under Lighting turn off Cast Shadow, and in the Lighting Channels tick Skybox and untick Static.
8. You could scale the mesh in the scene to suit, and perhaps drop it down below Z=0 a little. You could also use an Exponential Height Fog to hide the horizon. Since there is a specific sun shown in the sky image, you will need to place a Dominant Directional light in the scene and rotate it so its arrow (representing its direction) approximates the direction the sunlight would be coming from. It would be appropriate to tint the light warmly for a sunset.

Setting the preview player size reference object

In UDK versions newer than the April 2011 release, pressing the \ key in the editor Perspective view will show a mesh that represents the player height. By default this is just a cube. The mesh to display can be set in the configuration file UDKEditorUserSettings.INI, and we'll look at how to adjust this. This helps designers maintain proper level metrics: you'll be able to gauge how tall and wide to make doors so there's sufficient space for the player to move through without getting stuck.

Getting ready

Back up C:\UDK\~\UDKGame\Config\UDKEditorUserSettings.INI, then open it in ConTEXT with UDK closed.

How to do it...

1. Press Ctrl + F and search for [EditorPreviewMesh]. Under it, we will change the entry PreviewMeshNames="EditorMeshes.TexPropCube". Note that we need to replace this with a StaticMesh, and a good place to put it to ensure loading is the Engine\Content package EditorMeshes.
2. First, open UDK and in the Content Browser search using the type field for TexPropCube. When this appears, right-click on the asset and choose Find Package. The packages list will show us Engine\Content\EditorMeshes, and in here right-click and choose Import. You'll be prompted to browse in Windows, so from the provided content folder find SkinTail.ASE, which is a character model, and import this into EditorMeshes. There's no need to set a group name for this. Importing this file as a StaticMesh enables it to be used as a preview model. By contrast, the SkeletalMesh Packt.Mesh.Packt_SkinTail won't work for what we are trying to do. If you set up a SkeletalMesh for the preview model, the log will return cannot find staticmesh Yourmodelname whenever you press \ in the editor.
3. It is optional, but you can double-click the imported StaticMesh and assign a Material to it after expanding the LOD_Info property to show the Material channel. For the SkinTail content, choose Packt.Material.Packt_CharMat. Then save the EditorMeshes package, including SkinTail, and quit UDK.
4. Use ConTEXT to edit the file UDKEditorUserSettings.INI so that the line we were looking at in Step 1 is changed to:

   PreviewMeshNames="EditorMeshes.SkinTail"

   Eventually you'll opt to use your own StaticMesh.
5. Save, close, and restart UDK. Open a map, and press \ in the editor to see if SkinTail displays. If it doesn't, run UDK using the -log option and check for error warnings when \ is pressed.
Note that PreviewMeshNames="EditorMeshes.TexPropCube" can also be adjusted in these configuration files: C:\UDK\~\Engine\Config\BaseEditorUserSettings.INI or C:\UDK\~\UDKGame\Config\DefaultEditorUserSettings.INI.
Tips and Tricks on Away3D 3.6

Packt
18 Mar 2011
7 min read
Away3D 3.6 Essentials: Take Flash to the next dimension by creating detailed, animated, and interactive 3D worlds with Away3D.

Determining the current frame rate

When we talk about the performance of an Away3D application, we are almost always referring to the number of frames per second (FPS) that are being rendered. This is also referred to as the frame rate. Higher frame rates result in a more fluid and visually-appealing experience for the end user. Although it is possible to visually determine whether an application has an acceptable frame rate, it can also be useful to get a more objective measurement. Fortunately, Away3D has this functionality built in.

By default, when it is constructed, the View3D class will create an instance of the Stats class, which is in the away3d.core.stats package. This Stats object can be accessed via the statsPanel property from the View3D class. You can display the output of the Stats object on the screen using the Away3D Project stats option in the context (or right-click) menu of an Away3D application. To see the Away3D Project stats option in the context menu you will need to click on a visible 3D object. If you click on the empty space around the 3D objects in the scene, you will see the standard Flash context menu.

The Stats window provides a number of useful measurements:

- FPS, which measures the current frames per second
- AFPS, which measures the average number of frames per second
- Max, which measures the peak value of the frames per second
- MS, which measures the time it took to render the last frame in milliseconds
- RAM, which measures how much memory the application is using
- MESHES, which measures the number of 3D objects in the scene
- SWF FR, which measures the maximum frame rate of the Flash application
- T ELEMENTS, which measures the total number of individual elements that make up the 3D objects in the scene
- R ELEMENTS, which measures the number of individual elements that make up the 3D objects that are being rendered to the screen

These values come in very handy when trying to quantify the performance of an Away3D application.

Setting the maximum frame rate

Recent versions of Flash default to a maximum frame rate of 24 frames per second. This is usually fine for animations, but changing the maximum frame rate for a game may allow you to achieve a more fluid end result. The easiest way to do this is to use the SWF frameRate meta tag, which is a line of code added before the Away3DTemplate class:

[SWF(frameRate=100)]
public class Away3DTemplate extends Sprite
{
    // class definition goes here
}

The SWF FR measurement displayed by the Away3D Stats object reflects the maximum frame rate defined by the frameRate meta tag. Note that setting the maximum frame rate using the frameRate meta tag does not mean that your application will always run at a higher frame rate, just that it can run at a higher frame rate. A slow PC will still run an Away3D application at a low frame rate even if the maximum frame rate has been set to a high value.

You also need to be aware that any calculations performed in the onEnterFrame() function, such as transforming a 3D object, can be dependent on the frame rate of the application. In the following code, we rotate a 3D object by 1 degree around the X-axis every frame:

override protected function onEnterFrame(event:Event):void
{
    super.onEnterFrame(event);
    shipModel.rotationX += 1;
}

If the frame rate is 30 FPS, the 3D object will rotate around the X-axis by 30 degrees every second. If the frame rate is 90 FPS, the 3D object will rotate around the X-axis by 90 degrees every second. If your application requires these kinds of transformations to be performed consistently regardless of the frame rate, you can use a tweening library.
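As a lighter-weight alternative to a full tweening library, you can scale each frame's transformation by the real time elapsed. The following is a minimal sketch, not taken from the book's code bundle; it assumes the shipModel property and the onEnterFrame() override shown above, and adds a hypothetical lastFrameTime property:

import flash.utils.getTimer;

// Hypothetical property recording when the previous frame was rendered
protected var lastFrameTime:int = getTimer();

override protected function onEnterFrame(event:Event):void
{
    super.onEnterFrame(event);
    // Convert the elapsed milliseconds into seconds
    var now:int = getTimer();
    var elapsed:Number = (now - lastFrameTime) / 1000;
    lastFrameTime = now;
    // Rotate at 30 degrees per second, regardless of the frame rate
    shipModel.rotationX += 30 * elapsed;
}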
Setting Flash quality to low

You may have noticed that Flash offers a number of quality settings in its context menu. This quality setting can be set to one of four options, which are defined in the StageQuality class from the flash.display package. As described by the Flash API documentation, these settings are:

- StageQuality.LOW: Low rendering quality. Graphics are not anti-aliased, and bitmaps are not smoothed, but the runtime still uses mip-mapping.
- StageQuality.MEDIUM: Medium rendering quality. Graphics are anti-aliased using a 2 x 2 pixel grid, and bitmap smoothing is dependent on the Bitmap.smoothing setting. Runtimes use mip-mapping. This setting is suitable for movies that do not contain text.
- StageQuality.HIGH: High rendering quality. Graphics are anti-aliased using a 4 x 4 pixel grid, and bitmap smoothing is dependent on the Bitmap.smoothing setting. Runtimes use mip-mapping. This is the default rendering quality setting that Flash Player uses.
- StageQuality.BEST: Very high rendering quality. Graphics are anti-aliased using a 4 x 4 pixel grid. If Bitmap.smoothing is true, the runtime uses a high-quality downscale algorithm that produces fewer artifacts.

Mip-mapping refers to the use of mip-maps, which are precomputed smaller versions of an original bitmap. They are used instead of the original bitmap when the original is scaled down by more than 50%. This bitmap scaling may occur when a 3D object with a bitmap material is itself scaled down, or is off in the distance within the scene.

The quality setting is defined by assigning one of these values to the quality property on the stage object:

stage.quality = StageQuality.LOW;

A number of demos that are supplied with Away3D set the stage quality by using the SWF quality meta tag, like so:

[SWF(quality="LOW")]

The Flex compiler does not support setting the stage quality in this way. Although this code will not raise any errors during compilation, the stage quality will remain at the default value of StageQuality.HIGH. You can find more information on the meta tags supported by the Flex compiler at http://livedocs.adobe.com/flex/3/html/help.html?content=metadata_3.html.

Setting the stage quality to low will improve the performance of your Away3D application. The increase is felt most in applications that display a large number of 3D objects. The downside to setting the stage quality to low is that it affects all the objects on the stage, not just those drawn by Away3D. The low stage quality is particularly noticeable when rendering text, so the visual quality of controls like textfields and buttons can be significantly degraded. Using the medium-quality setting offers a good compromise between speed and visual quality.
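One practical pattern, sketched below under the assumption that it runs inside a class with access to the stage (the handler name and key choice are hypothetical, not from the book), is to let testers flip between low and high quality at runtime and watch the FPS counter react:

import flash.display.StageQuality;
import flash.events.KeyboardEvent;

// Register from the constructor or an init function:
// stage.addEventListener(KeyboardEvent.KEY_DOWN, onQualityKey);
protected var lowQuality:Boolean = false;

protected function onQualityKey(event:KeyboardEvent):void
{
    if (event.keyCode == 81) // the Q key
    {
        // Flip between the two settings and let the Stats FPS
        // readout show the performance difference
        lowQuality = !lowQuality;
        stage.quality = lowQuality ? StageQuality.LOW : StageQuality.HIGH;
    }
}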
Reducing the size of the viewport

The fewer pixels that are drawn to the screen, the faster the rendering process will be. The area that the view will draw into can be defined by assigning a RectangleClipping object to the clipping property on the View3D class. To use the RectangleClipping class you first need to import it from the away3d.core.clip package. You can then define the area that Away3D will draw into by supplying the minX, maxX, minY, and maxY init object parameters to the RectangleClipping constructor, like so:

view.clipping = new RectangleClipping( { minX: -100, maxX: 100, minY: -100, maxY: 100 } );

The preceding code will limit the output of the view to an area 200 x 200 units in size. The ViewportClippingDemo application, which can be found in the code bundle on the Packt website, allows you to modify the size of the clipping rectangle at runtime using the up and down arrow keys. This makes the difference easy to see: compare the view with the clipping rectangle set to the full area of the stage against the view with a reduced clipping rectangle.
Away3D 3.6: Applying Light and Pixel Bender materials

Packt
09 Mar 2011
4 min read
Away3D 3.6 Essentials: Take Flash to the next dimension by creating detailed, animated, and interactive 3D worlds with Away3D.

This article will be easier to follow if you have read the previous articles in this series:

- Away3D 3.6: Applying Animated and Composite materials
- Materials, Lights and Shading Techniques with Away3D 3.6
- Away3D 3.6: Applying Basic and Bitmap Materials
- Models and Animations with Away3D 3.6

Light materials

Light materials can be illuminated by an external light source. There are three different types of lights that can be applied to these materials: ambient, point, and directional. Remember that a given material will not necessarily recognize every type of light, or more than one light source.

WhiteShadingBitmapMaterial

The WhiteShadingBitmapMaterial class uses flat shading to apply lighting over a bitmap texture. As the class name suggests, the lighting is always white in color, ignoring the color of the source light.

protected function applyWhiteShadingBitmapMaterial():void
{
    initSphere();
    initPointLight();
    materialText.text = "WhiteShadingBitmapMaterial";
    var newMaterial:WhiteShadingBitmapMaterial = new WhiteShadingBitmapMaterial( Cast.bitmap(EarthDiffuse) );
    currentPrimitive.material = newMaterial;
}

The WhiteShadingBitmapMaterial class extends the BitmapMaterial class, which means that the init object parameters listed for the BitmapMaterial are also valid for the WhiteShadingBitmapMaterial.

ShadingColorMaterial

The ShadingColorMaterial class uses flat shading to apply lighting over a solid base color.

protected function applyShadingColorMaterial():void
{
    initSphere();
    initPointLight();
    materialText.text = "ShadingColorMaterial";
    var newMaterial:ShadingColorMaterial = new ShadingColorMaterial( Cast.trycolor("deepskyblue") );
    currentPrimitive.material = newMaterial;
}

The ShadingColorMaterial class extends the ColorMaterial class, which means that the init object parameters listed for the ColorMaterial class are also valid for the ShadingColorMaterial class. The color parameter can accept an int or String value. However, due to a bug in the ColorMaterial class, only an int value will work correctly. In the previous example, we have manually converted the color represented by the string deepskyblue into an int with the trycolor() function from the Cast class.

PhongBitmapMaterial

The PhongBitmapMaterial uses Phong shading to apply lighting over a TransformBitmapMaterial base material.

protected function applyPhongBitmapMaterial():void
{
    initSphere();
    initDirectionalLight();
    materialText.text = "PhongBitmapMaterial";
    var newMaterial:PhongBitmapMaterial = new PhongBitmapMaterial( Cast.bitmap(EarthDiffuse) );
    currentPrimitive.material = newMaterial;
}

PhongBitmapMaterial is a composite material that passes the init object to a contained instance of the TransformBitmapMaterial class, so the init object parameters listed for the TransformBitmapMaterial are also valid for the PhongBitmapMaterial.

PhongColorMaterial

The PhongColorMaterial uses Phong shading to apply lighting over a solid color base material.
protected function applyPhongColorMaterial():void
{
    initSphere();
    initDirectionalLight();
    materialText.text = "PhongColorMaterial";
    var newMaterial:PhongColorMaterial = new PhongColorMaterial("deepskyblue");
    currentPrimitive.material = newMaterial;
}

PhongMovieMaterial

The PhongMovieMaterial uses Phong shading to apply lighting over an animated MovieMaterial base material.

protected function applyPhongMovieMaterial():void
{
    initSphere();
    initDirectionalLight();
    materialText.text = "PhongMovieMaterial";
    var newMaterial:PhongMovieMaterial = new PhongMovieMaterial(new Bear());
    currentPrimitive.material = newMaterial;
}

PhongMovieMaterial is a composite material that passes the init object to a contained instance of the MovieMaterial class, so the init object parameters listed for the MovieMaterial are also valid for the PhongMovieMaterial.

Dot3BitmapMaterial

The Dot3BitmapMaterial uses normal mapping to add depth to a 3D object.

protected function applyDot3BitmapMaterial():void
{
    initSphere();
    initDirectionalLight();
    materialText.text = "Dot3BitmapMaterial";
    var newMaterial:Dot3BitmapMaterial = new Dot3BitmapMaterial( Cast.bitmap(EarthDiffuse), Cast.bitmap(EarthNormal) );
    currentPrimitive.material = newMaterial;
}

Dot3BitmapMaterial is a composite material that passes the init object to a contained instance of the BitmapMaterial class, so the init object parameters listed for the BitmapMaterial are also valid for the Dot3BitmapMaterial.
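Putting the pieces together, here is a minimal, self-contained sketch of lighting a normal-mapped sphere with a directional light. It is written for this article rather than taken from the book, and assumes an initialized scene property plus embedded EarthDiffuse and EarthNormal bitmap classes, as used by the helper functions above:

import away3d.core.utils.Cast;
import away3d.lights.DirectionalLight3D;
import away3d.materials.Dot3BitmapMaterial;
import away3d.primitives.Sphere;
import flash.geom.Vector3D;

// A directional light positioned off to one side, pointing back at the origin
var light:DirectionalLight3D = new DirectionalLight3D( { x: 300, y: 300, direction: new Vector3D(-1, -1, 0) } );
scene.addLight(light);

// A sphere with a diffuse texture and a normal map applied
var sphere:Sphere = new Sphere();
sphere.material = new Dot3BitmapMaterial( Cast.bitmap(EarthDiffuse), Cast.bitmap(EarthNormal) );
scene.addChild(sphere);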
Away3D 3.6: Applying Animated and Composite materials

Packt
04 Mar 2011
3 min read
Away3D 3.6 Essentials: Take Flash to the next dimension by creating detailed, animated, and interactive 3D worlds with Away3D.

Animated materials

As mentioned previously, a number of materials can be used to display animations on the surface of a 3D object. These animations are usually movies that have been encoded into a SWF file. You can also display an interactive SWF file, like a form, on the surface of a 3D object.

MovieMaterial

The MovieMaterial displays the output of a Sprite object, which can be animated. The sprite usually originates from another SWF file, which in this case we have embedded and referenced via the Bear class. A new instance of the Bear class is then passed to the MovieMaterial constructor.

protected function applyMovieMaterial():void
{
    initCube();
    materialText.text = "MovieMaterial";
    var newMaterial:MovieMaterial = new MovieMaterial(new Bear());
    currentPrimitive.material = newMaterial;
}

The MovieMaterial class extends the TransformBitmapMaterial class, so the init object parameters listed for the TransformBitmapMaterial are also valid for the MovieMaterial.

AnimatedBitmapMaterial

The AnimatedBitmapMaterial class displays the frames from a MovieClip object. In order to increase performance, it will first render each frame of the supplied MovieClip into a bitmap. These bitmaps are stored in a cache, which increases playback performance at the cost of using additional memory. Because of the memory overhead resulting from this cache, the AnimatedBitmapMaterial cannot be used to display movie clips longer than two seconds. If you pass it a longer movie clip, an exception will be thrown.

The MovieClip object passed to the AnimatedBitmapMaterial constructor usually originates from another SWF file. This source SWF file needs to be implemented in the ActionScript Virtual Machine 2 (AVM2) format, which is the format used by Flash Player 9 and above. This is an important point, because a large number of video conversion tools will output AVM1 SWF files. If you need to display a SWF movie in AVM1 format, use the MovieMaterial class instead. If you try to use an AVM1 SWF file with the AnimatedBitmapMaterial class, an exception similar to the following will be thrown:

TypeError: Error #1034: Type Coercion failed: cannot convert flash.display::AVM1Movie@51e8e51 to flash.display.MovieClip.

FFmpeg is a free, cross-platform tool that can be used to convert video files into AVM2 SWF files. Precompiled Windows binaries can be downloaded from http://sourceforge.net/projects/mplayer-win32/files/FFmpeg/. The following command will convert a WMV video into a two-second AVM2 SWF file with a resolution of 320 x 240 and no audio:

ffmpeg -i Butterfly.wmv -t 2 -s 320x240 -an -f avm2 Butterfly.SWF

protected function applyAnimatedBitmapMaterial():void
{
    initCube();
    materialText.text = "AnimatedBitmapMaterial";
    var newMaterial:AnimatedBitmapMaterial = new AnimatedBitmapMaterial(new Butterfly());
    currentPrimitive.material = newMaterial;
}

The AnimatedBitmapMaterial class extends the TransformBitmapMaterial class, so the init object parameters listed for the TransformBitmapMaterial are also valid for the AnimatedBitmapMaterial.

Interactive MovieMaterial

By setting the interactive parameter to true, a MovieMaterial object can pass mouse events to the Sprite object it is displaying.
This allows you to interact with the material as if it were added directly to the Flash stage, even while it is wrapped around a 3D object.

protected function applyInteractiveMovieMaterial():void
{
    initCube();
    materialText.text = "MovieMaterial - Interactive";
    var newMaterial:MovieMaterial = new MovieMaterial( new InteractiveTexture(), { interactive: true, smooth: true } );
    currentPrimitive.material = newMaterial;
}

The constructor parameters are the same as those listed for the MovieMaterial class above.
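The Bear, Butterfly, and InteractiveTexture classes instantiated above come from embedded SWF files. For reference, one possible set of declarations, matching those used by the MaterialsDemo class elsewhere in this series (adjust the file names to your own assets):

// Embedded SWF files, each exposed as a class that can be instantiated
[Embed(source = "Bear.swf")]
protected var Bear:Class;

[Embed(source = "Butterfly.swf")]
protected var Butterfly:Class;

[Embed(source = "InteractiveTexture.swf")]
protected var InteractiveTexture:Class;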
Creating and Warping 3D Text with Away3D 3.6

Packt
11 Feb 2011
7 min read
Away3D 3.6 Essentials

The external library, swfvector, is contained in the wumedia package. More information about the swfvector library can be found at http://code.google.com/p/swfvector/. This library was not developed as part of the Away3D engine, but has been integrated since versions 2.4 and 3.4 to provide a way to create and display text 3D objects within the scene.

Embedding fonts

Creating a text 3D object in Away3D requires a source SWF file with an embedded font. To accommodate this, we will create a very simple application using the Fonts class below. This class embeds a single true-type font called Vera Sans from the Vera.ttf file. When compiled, the resulting SWF file can then be referenced by our Away3D application, allowing the embedded font file to be accessed.

When embedding fonts using the Flex 4 SDK, you may need to set the embedAsCFF property to false, like:

[Embed(mimeType="application/x-font", source="Vera.ttf", fontName="Vera Sans", embedAsCFF=false)]

This is due to the new way fonts can be embedded with the latest versions of the Flex SDK. You can find more information on the embedAsCFF property at http://help.adobe.com/en_US/flex/using/WS2db454920e96a9e51e63e3d11c0bf6320a-7fea.html.

package
{
    import flash.display.Sprite;

    public class Fonts extends Sprite
    {
        [Embed(mimeType="application/x-font", source="Vera.ttf", fontName="Vera Sans")]
        public var VeraSans:Class;
    }
}

The font used here is Bitstream Vera, which can be freely distributed, and can be obtained from http://www.gnome.org/fonts/. However, not all fonts can be freely redistributed, so be mindful of the copyright or license restrictions that may be imposed by a particular font.

Displaying text in the scene

Text 3D objects are represented by the TextField3D class, from the away3d.primitives package. Creating a text 3D object requires two steps:

1. Extracting the fonts that were embedded inside a separate SWF file.
2. Creating a new TextField3D object.

Let's create an application called FontDemo that creates a 3D textfield and adds it to the scene.

package
{

We import the TextField3D class, making it available within our application.

    import away3d.primitives.TextField3D;

The VectorText class will be used to extract the fonts from the embedded SWF file.

    import wumedia.vector.VectorText;

    public class FontDemo extends Away3DTemplate
    {

The Fonts.SWF file was created by compiling the Fonts class above. We want to embed this SWF file as raw data, so we specify the MIME type to be application/octet-stream.

        [Embed(source="Fonts.swf", mimeType="application/octet-stream")]
        protected var Fonts:Class;

        public function FontDemo()
        {
            super();
        }

        protected override function initEngine():void
        {
            super.initEngine();

Before any TextField3D objects can be created, we need to extract the fonts from the embedded SWF file. This is done by calling the static extractFont() function in the VectorText class and passing in a new instance of the embedded SWF file. Because we specified the MIME type of the embedded file to be application/octet-stream, a new instance of the class is created as a ByteArray.

            VectorText.extractFont(new Fonts());
        }

        protected override function initScene():void
        {
            super.initScene();
            this.camera.z = 0;

Here we create the new instance of the TextField3D class. The first parameter is the font name, which corresponds to the font name included in the embedded SWF file. The TextField3D constructor also takes an init object.
            var text:TextField3D = new TextField3D("Vera Sans", { text: "Away3D Essentials", align: VectorText.CENTER, z: 300 } );
            scene.addChild(text);
        }
    }
}

When the application is run, the scene will contain a single 3D object that has been created to spell out the words "Away3D Essentials", formatted using the supplied font. At this point, the text 3D object can be transformed and interacted with, just like any other 3D object.

3D Text materials

You may be aware that bitmap materials are applied to the surface of a 3D object according to its UV coordinates. The default UV coordinates defined by a TextField3D object generally do not allow bitmap materials to be applied in a useful manner. However, simple colored materials like WireframeMaterial, WireColorMaterial, and ColorMaterial can be applied to a TextField3D object.

Extruding 3D text

By default, a text 3D object has no depth (although it is visible from both sides). One of the extrusion classes, called TextExtrusion, can be used to create an additional 3D object that takes the shape of a text 3D object and extends it into the third dimension. When combined, the TextExtrusion and TextField3D objects can be used to create the appearance of a solid block of text. The FontExtrusionDemo class in the following code snippet gives an example of this process:

package
{
    import away3d.containers.ObjectContainer3D;
    import away3d.extrusions.TextExtrusion;
    import away3d.primitives.TextField3D;
    import flash.events.Event;
    import wumedia.vector.VectorText;

    public class FontExtrusionDemo extends Away3DTemplate
    {
        [Embed(source="Fonts.swf", mimeType="application/octet-stream")]
        protected var Fonts:Class;

The TextField3D 3D object and the extrusion 3D object are both added as children of an ObjectContainer3D object, referenced by the container property.

        protected var container:ObjectContainer3D;

The text property will reference the TextField3D object used to display the 3D text.

        protected var text:TextField3D;

The extrusion property will reference the TextExtrusion object used to give the 3D text some depth.

        protected var extrusion:TextExtrusion;

        public function FontExtrusionDemo()
        {
            super();
        }

        protected override function initEngine():void
        {
            super.initEngine();
            this.camera.z = 0;
            VectorText.extractFont(new Fonts());
        }

        protected override function initScene():void
        {
            super.initScene();
            text = new TextField3D("Vera Sans", { text: "Away3D Essentials", align: VectorText.CENTER } );

The TextExtrusion constructor takes a reference to the TextField3D object (or any other Mesh object). It also accepts an init object, which we have used to specify the depth of the 3D text and to make both sides of the extruded mesh visible.

            extrusion = new TextExtrusion(text, { depth: 10, bothsides: true } );

The ObjectContainer3D object is created, supplying the TextField3D and TextExtrusion 3D objects that were created above as children. The initial position of the ObjectContainer3D object is set to 300 units down the positive end of the Z-axis.

            container = new ObjectContainer3D(text, extrusion, { z: 300 } );

The container is then added as a child of the scene.

            scene.addChild(container);
        }

        protected override function onEnterFrame(event:Event):void
        {
            super.onEnterFrame(event);

The container is slowly rotated around its Y-axis by modifying the rotationY property in every frame.
In previous examples, we simply incremented the rotation property without any regard for when the value became larger than 360 degrees. After all, rotating a 3D object by 180 or 540 degrees has the same overall effect. But in this case, we do want to keep the value of the rotationY property between 0 and 360, so we can easily test whether the rotation is within a given range. To do this, we use the mod (%) operator.

            container.rotationY = (container.rotationY + 1) % 360;

Z-sorting issues can arise due to the fact that the TextExtrusion and TextField3D objects are so closely aligned. The issue results in parts of the TextField3D or TextExtrusion 3D objects showing through where it is obvious that they should be hidden. To solve this problem, we can force the sorting order of the 3D objects. Here we are assigning a positive value to the TextField3D screenZOffset property to force it to be drawn behind the TextExtrusion object when the container has been rotated between 90 and 270 degrees around the Y-axis. When the container is rotated like this, the TextField3D object is at the back of the scene. Otherwise, the TextField3D is drawn in front by assigning a negative value to the screenZOffset property.

            if (container.rotationY > 90 && container.rotationY < 270)
                text.screenZOffset = 10;
            else
                text.screenZOffset = -10;
        }
    }
}
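As noted above, simple colored materials work well on 3D text even though bitmap materials generally do not. A minimal sketch, assuming the text variable from the FontDemo example:

import away3d.materials.ColorMaterial;

// Apply a solid color to the 3D text; bitmap materials generally
// do not map usefully onto the default TextField3D UV coordinates
text.material = new ColorMaterial(0x12CD56);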
Away3D 3.6: Applying Basic and Bitmap Materials

Packt
07 Feb 2011
7 min read
Away3D 3.6 Essentials: Take Flash to the next dimension by creating detailed, animated, and interactive 3D worlds with Away3D.

- Create stunning 3D environments with highly detailed textures
- Animate and transform all types of 3D objects, including 3D Text
- Eliminate the need for expensive hardware with proven Away3D optimization techniques, without compromising on visual appeal
- Written in a practical and illustrative style, which will appeal to Away3D beginners and Flash developers alike

To demonstrate the basic materials available in Away3D, we will create a new demo called MaterialsDemo.

package
{

Some primitives show off a material better than others. To accommodate this, we will apply the various materials to the sphere, torus, cube, and plane primitive 3D objects in this demo. All primitives extend the Mesh class, which makes it the logical choice for the type of the variable that will reference instances of all four primitives.

    import away3d.core.base.Mesh;

The Cast class provides a number of handy functions that deal with the casting of objects between types.

    import away3d.core.utils.Cast;

As we saw previously, those materials that can be illuminated support point or directional light sources (and sometimes both). To show off materials that can be illuminated, one of these types of lights will be added to the scene.

    import away3d.lights.DirectionalLight3D;
    import away3d.lights.PointLight3D;

In order to load textures from external image files, we need to import the TextureLoadQueue and TextureLoader classes.

    import away3d.loaders.utils.TextureLoadQueue;
    import away3d.loaders.utils.TextureLoader;

The various material classes demonstrated by the MaterialsDemo class are imported from the away3d.materials package.

    import away3d.materials.AnimatedBitmapMaterial;
    import away3d.materials.BitmapFileMaterial;
    import away3d.materials.BitmapMaterial;
    import away3d.materials.ColorMaterial;
    import away3d.materials.CubicEnvMapPBMaterial;
    import away3d.materials.DepthBitmapMaterial;
    import away3d.materials.Dot3BitmapMaterial;
    import away3d.materials.Dot3BitmapMaterialF10;
    import away3d.materials.EnviroBitmapMaterial;
    import away3d.materials.EnviroColorMaterial;
    import away3d.materials.FresnelPBMaterial;
    import away3d.materials.MovieMaterial;
    import away3d.materials.PhongBitmapMaterial;
    import away3d.materials.PhongColorMaterial;
    import away3d.materials.PhongMovieMaterial;
    import away3d.materials.PhongMultiPassMaterial;
    import away3d.materials.PhongPBMaterial;
    import away3d.materials.ShadingColorMaterial;
    import away3d.materials.TransformBitmapMaterial;
    import away3d.materials.WhiteShadingBitmapMaterial;
    import away3d.materials.WireColorMaterial;
    import away3d.materials.WireframeMaterial;

These materials will all be applied to a number of primitive types, which are all imported from the away3d.primitives package.

    import away3d.primitives.Cube;
    import away3d.primitives.Plane;
    import away3d.primitives.Sphere;
    import away3d.primitives.Torus;

The CubeFaces class defines a number of constants that identify each of the six sides of a cube.

    import away3d.primitives.utils.CubeFaces;

The following Flash classes are used when loading textures from external image files, to handle events, to display a textfield on the screen, and to define a position or vector within the scene.
    import flash.geom.Vector3D;
    import flash.net.URLRequest;
    import flash.display.BitmapData;
    import flash.events.Event;
    import flash.events.KeyboardEvent;
    import flash.text.TextField;

The MaterialsDemo class extends the Away3DTemplate class (included in the code download for this book).

    public class MaterialsDemo extends Away3DTemplate
    {

One of the ways to manage resources that was discussed in the Resource management section was to embed them. Here, we see how an external JPG image file, referenced by the source parameter, has been embedded using the Embed keyword. Embedding an image file in this way means that instantiating the EarthDiffuse class will result in a Bitmap object populated with the image data contained in the earth_diffuse.jpg file.

        [Embed(source = "earth_diffuse.jpg")]
        protected var EarthDiffuse:Class;

A number of additional images have been embedded in the same way.

        [Embed(source = "earth_normal.jpg")]
        protected var EarthNormal:Class;

        [Embed(source = "earth_specular.jpg")]
        protected var EarthSpecular:Class;

        [Embed(source = "checkerboard.jpg")]
        protected var Checkerboard:Class;

        [Embed(source = "bricks.jpg")]
        protected var Bricks:Class;

        [Embed(source = "marble.jpg")]
        protected var Marble:Class;

        [Embed(source = "water.jpg")]
        protected var Water:Class;

        [Embed(source = "waternormal.jpg")]
        protected var WaterNormal:Class;

        [Embed(source = "spheremap.gif")]
        protected var SphereMap:Class;

        [Embed(source = "skyleft.jpg")]
        protected var Skyleft:Class;

        [Embed(source = "skyfront.jpg")]
        protected var Skyfront:Class;

        [Embed(source = "skyright.jpg")]
        protected var Skyright:Class;

        [Embed(source = "skyback.jpg")]
        protected var Skyback:Class;

        [Embed(source = "skyup.jpg")]
        protected var Skyup:Class;

        [Embed(source = "skydown.jpg")]
        protected var Skydown:Class;

Here we embed three SWF files, just like the preceding images.

        [Embed(source = "Butterfly.swf")]
        protected var Butterfly:Class;

        [Embed(source = "InteractiveTexture.swf")]
        private var InteractiveTexture:Class;

        [Embed(source = "Bear.swf")]
        private var Bear:Class;

A TextField object is used to display the name of the current material on the screen.

        protected var materialText:TextField;

The currentPrimitive property is used to reference the primitive to which we will apply the various materials.

        protected var currentPrimitive:Mesh;

The directionalLight and pointLight properties each reference a light that is added to the scene to illuminate certain materials.

        protected var directionalLight:DirectionalLight3D;
        protected var pointLight:PointLight3D;

The bounce property is set to true when we want the sphere to bounce along the Z-axis. This bouncing motion will be used to show off the effect of the DepthBitmapMaterial class.

        protected var bounce:Boolean;

The frameCount property maintains a count of the frames that have been rendered while the bounce property is set to true.

        protected var frameCount:int;

The constructor calls the Away3DTemplate constructor, which will initialize the Away3D engine.

        public function MaterialsDemo()
        {
            super();
        }

The removePrimitive() function removes the current primitive 3D object from the scene, in preparation for a new primitive to be created.

        protected function removePrimitive():void
        {
            if (currentPrimitive != null)
            {
                scene.removeChild(currentPrimitive);
                currentPrimitive = null;
            }
        }

The initSphere() function first removes the existing primitive from the scene by calling the removePrimitive() function, and then creates a new sphere primitive and adds it to the scene.
Optionally, it can set the bounce property to true, which indicates that the primitive should bounce along the Z-axis.

        protected function initSphere(bounce:Boolean = false):void
        {
            removePrimitive();
            currentPrimitive = new Sphere();
            scene.addChild(currentPrimitive);
            this.bounce = bounce;
        }

The initTorus(), initCube(), and initPlane() functions all work like the initSphere() function to add a specific type of primitive to the scene. These functions all set the bounce property to false, as none of the materials that will be applied to these primitives gain anything by having the primitive bounce within the scene.

        protected function initTorus():void
        {
            removePrimitive();
            currentPrimitive = new Torus();
            scene.addChild(currentPrimitive);
            this.bounce = false;
        }

        protected function initCube():void
        {
            removePrimitive();
            currentPrimitive = new Cube( { width: 200, height: 200, depth: 200 } );
            scene.addChild(currentPrimitive);
            this.bounce = false;
        }

        protected function initPlane():void
        {
            removePrimitive();
            currentPrimitive = new Plane( { bothsides: true, width: 200, height: 200, yUp: false } );
            scene.addChild(currentPrimitive);
            this.bounce = false;
        }

The removeLights() function will remove any lights that have been added to the scene, in preparation for a new light to be created.

        protected function removeLights():void
        {
            if (directionalLight != null)
            {
                scene.removeLight(directionalLight);
                directionalLight = null;
            }

            if (pointLight != null)
            {
                scene.removeLight(pointLight);
                pointLight = null;
            }
        }

The initPointLight() and initDirectionalLight() functions each remove any existing lights in the scene by calling the removeLights() function, and then add their specific type of light to the scene.

        protected function initPointLight():void
        {
            removeLights();
            pointLight = new PointLight3D( { x: -300, y: -300, radius: 1000 } );
            scene.addLight(pointLight);
        }

        protected function initDirectionalLight():void
        {
            removeLights();
            directionalLight = new DirectionalLight3D( { x: 300, y: 300,

The direction that the light is pointing is set to (0, 0, 0) by default, which effectively means the light is not pointing anywhere. If you have a directional light that is not being reflected off the surface of a lit material, leaving the direction property at this default value may be the cause. Here we override the default to make the light point back at the origin.

                direction: new Vector3D(-1, -1, 0) } );
            scene.addLight(directionalLight);
        }

The initScene() function has been overridden to call the applyWireColorMaterial() function, which will display a sphere with the WireColorMaterial material applied to it. We also set the position of the camera back to the origin.

        protected override function initScene():void
        {
            super.initScene();
            this.camera.z = 0;
            applyWireColorMaterial();
        }
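The apply...() functions themselves all follow one consistent pattern: pick a primitive, set up any required light, set the on-screen label, and assign the material. As an illustration (this particular function is a sketch written for this article, not a listing from the book), a plain BitmapMaterial, which needs no light source, could be applied like this:

        protected function applyBitmapMaterial():void
        {
            // No light setup call is needed: BitmapMaterial is not illuminated
            initSphere();
            materialText.text = "BitmapMaterial";
            var newMaterial:BitmapMaterial = new BitmapMaterial( Cast.bitmap(EarthDiffuse) );
            currentPrimitive.material = newMaterial;
        }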
Materials, Lights and Shading Techniques with Away3D 3.6

Packt
04 Feb 2011
6 min read
The difference between textures and materials

Throughout this article, a number of references will be made to materials and textures. A texture is simply an image, like you would create in an image editing application like Photoshop or view in a web page. Textures are then used by materials, which in Away3D are classes that can be applied to the surface of a 3D object.

Resource management

Quite a number of the materials included in Away3D rely on textures that exist in external image files, like PNG, JPG, or GIF files. There are two ways of dealing with external files: embedding them or accessing them at runtime.

ActionScript includes the Embed keyword, which can be used to embed external files directly inside a compiled SWF file. There are a number of benefits to embedded resources:

- The Flash application can be distributed as a single file
- There is no wait when accessing the resources at runtime
- The security issues associated with accessing remote resources are avoided
- There is no additional network traffic once the SWF is downloaded
- The SWF file can be run offline
- The embedded files can have additional compression applied

The downside to embedding resources is that the size of the final SWF is increased, resulting in a longer initial download time.

Alternatively, the external files can be saved separately and accessed at runtime, which has the following advantages:

- The SWF file is smaller, resulting in shorter initial download times
- Resources are only downloaded when they are needed, and cached for future access
- Resources can be updated or modified without recompiling the SWF file

There are several downsides to accessing resources at runtime:

- Permissions on the server hosting the resources may need to be configured before the external files can be accessed
- Distribution of the final Flash application is more difficult due to the increased number of individual files
- There will be a delay when the application is run, as the remote resources are downloaded

Away3D supports the use of both embedded and external resources, and both methods will be demonstrated below. Embedding the resources is usually the best option when managing resources. It prevents a number of possible errors due to unreliable networks and security restrictions, and produces a SWF file that is much simpler to distribute and publish. However, for applications where it is not possible to know beforehand what resources will be required, like a 3D image gallery, loading external resources is the only option. You may also want to load external resources for applications where there is a large volume of data that does not need to be downloaded immediately, like a large game with levels that the player won't necessarily see in a single sitting.
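To make the two approaches concrete, here is a minimal sketch contrasting them. It assumes a mesh variable referencing a 3D object and an earth_diffuse.jpg file; BitmapFileMaterial is the runtime-loading counterpart to BitmapMaterial:

import away3d.core.utils.Cast;
import away3d.materials.BitmapMaterial;
import away3d.materials.BitmapFileMaterial;

// Embedded at compile time: the image becomes part of the SWF
[Embed(source = "earth_diffuse.jpg")]
protected var EarthDiffuse:Class;

protected function applyEmbeddedTexture():void
{
    mesh.material = new BitmapMaterial( Cast.bitmap(EarthDiffuse) );
}

// Loaded at runtime: the image stays on the server as a separate file
protected function applyExternalTexture():void
{
    mesh.material = new BitmapFileMaterial("earth_diffuse.jpg");
}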
Defining colors in Away3D

The appearance of a number of materials can be modified by supplying a color. A good example is the WireColorMaterial material (the same one that is applied to a 3D object when no material is specified), whose fill and outline colors can be defined via the color and wirecolor init object parameters. Colors can be defined in Away3D in a number of different formats. Common to all the formats is the idea that a color is made up of red, green, and blue components. For example, the color purple is made up of red and blue, while yellow is made up of red and green.

By integer

Colors can be defined as an integer. These int values are usually defined in their hexadecimal form, which looks like 0x12CD56. The characters that make up the int can be digits between 0 and 9, and characters between A and F. You can think of the characters A through F as representing the numbers 10 to 15, allowing each character to represent 16 different values. For each color component, 00 is the lowest value, and FF is the highest. The first two characters define the red component of the color, the next two define the green component, and the final two define the blue component.

It is sometimes necessary to define the transparency of a color. This is done by adding two additional characters to the beginning of the hexadecimal notation, such as 0xFF12CD56. In this form, the two leading characters define the transparency, or alpha, of the color. The last six characters represent the red, green, and blue components. Smaller alpha values make a color more transparent, while higher alpha values make a color more opaque. You can see an example of a color being defined as an int in the applyWireframeMaterial() function from the MaterialsDemo class.

By string

The same hexadecimal format used by integers can also be represented as a String. The only difference is that the prefix 0x is left off. An example would be "12CD56", or "FF12CD56". The MaterialsDemo applyColorMaterial() function demonstrates the use of this color format. Away3D also recognizes a number of colors by name, such as "deepskyblue". The MaterialsDemo applyWireColorMaterial() function demonstrates the use of colors defined by name.

Pixel Bender

Pixel Bender is a technology, new to Flash Player 10, that implements generalized graphics processing using the Pixel Bender language. The programs written using Pixel Bender are known as kernels or shaders; the two terms can be used interchangeably with respect to Pixel Bender. Shaders have the advantage of being able to run across multiple CPUs and CPU cores, unlike the graphics processing done via the Flash graphics API. This gives shaders the potential to be much faster.

One of the advantages of using Away3D version 3.x over version 2.x is the ability to use Pixel Bender shaders. The implementation of these shaders is largely hidden by the material classes that utilize them, meaning that they can be used much like the regular material classes, while at the same time offering a much higher level of detail.

A common misconception is that Flash Player 10 uses the Graphics Processing Unit (GPU), which is common to most video chipsets these days, to execute shaders. This is incorrect. Unlike some other Adobe products that also make use of Pixel Bender shaders, Flash Player 10 does not utilize the GPU when executing shaders. Adobe has indicated that GPU rendering support for Pixel Bender may be included in future releases of Flash Player.
Models and Animations with Away3D 3.6

Packt
28 Jan 2011
7 min read
Away3D 3.6 Essentials: Take Flash to the next dimension by creating detailed, animated, and interactive 3D worlds with Away3D.

- Create stunning 3D environments with highly detailed textures
- Animate and transform all types of 3D objects, including 3D Text
- Eliminate the need for expensive hardware with proven Away3D optimization techniques, without compromising on visual appeal
- Written in a practical and illustrative style, which will appeal to Away3D beginners and Flash developers alike

Models and Animations

It is possible to create a 3D object from the ground up using basic elements like vertices, triangle faces, Sprite3D objects, and segments. However, creating each element manually in code is not practical for more complex models. While the classes from the away3d.primitives package offer a solution by providing a way to quickly create some standard shapes, advanced applications will need to display more complex shapes. For those situations where these standard primitive shapes do not provide enough flexibility, Away3D can load and display 3D models created by external 3D modeling applications.

3D modeling applications are specifically designed to provide a visual environment in which 3D models can be manipulated. It is certainly much more convenient to create or edit a 3D mesh in one of these applications than it is to build up a mesh in code using ActionScript. Away3D can directly load a wide range of 3D formats. The process of exporting a 3D mesh into a file that can be used with Away3D will be covered for the following 3D modeling applications:

- 3ds Max: A popular commercial modeling, animation, and rendering application which runs on Windows.
- Blender: A free and open source modeling application, which is available on a number of platforms, including Windows, Linux, and MacOS.
- MilkShape: A commercial low-polygon modeler which runs on Windows, originally designed for the game Half-Life.
- Sketch-Up: A free 3D modeling application provided by Google. A commercial version is also available that includes a number of additional features. Sketch-Up runs on Windows and MacOS.

Actually creating a model in these 3D modeling applications is outside the scope of this article. However, 3D models are provided that can be loaded and then exported from these applications, which will allow you to run through the procedure without having to know how to make a 3D model from scratch.

3D formats supported by Away3D

Away3D includes classes that can load a wide range of 3D model file formats. All the supported formats can be used to load a static 3D model, while a smaller number can be used to load animated models. Each supported format has a common file extension and a matching Away3D class that is used to load and parse it.

Exporting 3D models

The following instructions show you how to export a Collada file from a number of different 3D modeling applications. Collada is an open, XML-based format that has been designed to provide a way to exchange data between 3D applications. Away3D supports loading both static and animated 3D models from the Collada format.

Exporting from 3ds Max

3ds Max is a commercial 3D modeling application. At the time of writing, the latest version of the ColladaMax plugin, which is the plugin that we will use to export the 3D model, was 3.05C. This version supports 3ds Max 2008, 3ds Max 9, 3ds Max 8 SP3, and 3ds Max 7 SP1. Note that this version does not support 3ds Max 2010 or 2011.
A trial version of 3ds Max 9 is available, although it can be difficult to find. You should be able to find a copy if you search the Internet for Autodesk3dsMax2009_ENU_TrialDownload.exe, which is the name of the file that will install the trial version of 3ds Max 9.

1. Download and install the ColladaMax plugin from http://sourceforge.net/projects/colladamaya/files/.
2. Open 3ds Max.
3. Click File | Open. Select the MAX file you wish to open and click on the Open button.
4. Click File | Export from within 3ds Max.
5. Select COLLADA (*.DAE) from the Save as type drop-down list.
6. Select the same directory where the original MAX file was located. Type a file name for the exported file in the File name textbox, and click on the Save button.
7. In the ColladaMax Export dialog box, make sure the following checkboxes are enabled: Relative Paths, Normals, and Triangulate.
8. If you want to export animations, enable the Enable export checkbox. If you want to export a specific range of frames, enable the Sample animation checkbox and enter the required values in the Start and End textboxes.
9. Click on the OK button to export the file.

Exporting from MilkShape

The Collada exporter supplied with MilkShape does not export animations. So even if the MilkShape MS3D file we are loading contains an animated model, the exported Collada DAE file will be a static mesh. A trial version of MilkShape can be downloaded and installed from its website at http://chumbalum.swissquake.ch/.

1. Click File | Open. Select the MS3D file you wish to open and click on the Open button.
2. Click File | Export | COLLADA....
3. Select the same directory where the original MS3D file was located. Type a filename for the exported file in the File name textbox and click the Save button.

Exporting from Sketch-Up

Like MilkShape, Sketch-Up does not support exporting animated Collada files. Sketch-Up can be downloaded for free from http://sketchup.google.com/.

1. Click File | Open. Select the SKP file you wish to open and click on the Open button.
2. Click File | Export | 3D Model....
3. Select Collada File (*.dae) from the Export type combobox.
4. Select an appropriate directory, and type a filename for the exported file in the File name textbox.
5. Click on the Options... button. Make sure the Triangulate All Faces checkbox is enabled. If the Export Texture Maps option is enabled, Sketch-Up will export the textures along with the DAE file. Click on the OK button to save the options.
6. Click on the Export button to export the file.

Exporting from Blender

The latest version of the Collada exporter for Blender, which is version 0.3.162 at the time of writing, does support exporting animations. However, in most cases Away3D will not load these animations correctly. It is recommended that only static meshes be exported from Blender to a Collada file.

1. Click File | Open.... Select the BLEND file you wish to open and click on the Open button.
2. Click File | Export | COLLADA 1.4 (*.dae)....
3. In the Export File textbox, type a filename for the exported file in the directory where the original BLEND file was located.
4. Make sure the Triangles and Use Relative Paths buttons are pressed.
5. Click on the Export and Close button.

A note about the Collada exporters

Despite Collada being a free and open standard, exporting to a Collada file that can be correctly parsed by Away3D can be a hit-and-miss affair. The Collada exporters for 3ds Max are a good example.
During testing, neither the built-in Collada exporter included with 3ds Max, nor the third-party OpenCollada exporter from http://opencollada.org (version 1.2.5 was the latest at the time of writing), would export an animated Collada file that Away3D could read. At best, Away3D would display a static mesh, and at worst it would throw an exception when reading the DAE file. Likewise, neither of the Collada exporters that come with Blender (which was at version 2.49b at the time of writing) would consistently export an animated Collada mesh that was compatible with Away3D. It is important to be aware that just because a 3D modeling application says it can export to a Collada file, there is no guarantee that the resulting file can be read correctly by Away3D.
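Once you do have a DAE file that Away3D accepts, loading it into a scene is short. The following is a minimal sketch, not a listing from the book; it assumes an initialized scene property, and the model.dae file name, ColladaModel class, and loadModel() function are placeholders for your own names:

import away3d.containers.ObjectContainer3D;
import away3d.loaders.Collada;

// Embed the exported DAE file as raw binary data
[Embed(source = "model.dae", mimeType = "application/octet-stream")]
protected var ColladaModel:Class;

protected function loadModel():void
{
    // Parse the embedded Collada data into a container of meshes
    var model:ObjectContainer3D = Collada.parse(new ColladaModel());
    scene.addChild(model);
}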