Asset preparation
When you've reached a clear decision on the initial concept and design, you're ready to prototype! This means building a Unity project that demonstrates the core mechanic and game rules in action as a playable sample. After this, you typically refine the design further and prototype again, repeating the cycle until you arrive at a design you want to pursue. From here, the art team must produce assets (meshes and textures) based on the concept art, the game design, and photographic references. When producing meshes and textures for Unity, some important guidelines should be followed to achieve optimal graphical performance in-game. The aim is to structure and build assets in a smart way so that they export cleanly and easily from their originating software, import into Unity with minimal fuss, and perform as well as they can at runtime. Let's take a look at some of these guidelines for meshes and textures.
Meshes - work only with good topology
A good mesh topology consists of polygons that have only three or four sides each (no more). Additionally, Edge Loops should flow in an ordered, regular way along the contours of the model, defining its shape and form.
Clean topology
On import, Unity automatically converts any NGons (polygons with more than four sides) into triangles. However, it's better to build meshes without NGons than to rely on Unity's automated conversion. Not only does this cultivate good habits at the modeling phase, but it also avoids an automatic and unpredictable retopology of the mesh, which affects how it's shaded and animated.
Meshes - minimize polygon count
Every polygon in a mesh entails a rendering performance cost insofar as the GPU needs time to process and render it. Consequently, even though modern graphics hardware is adept at handling many polygons, it's sensible to minimize the number of polygons in a mesh. It's good practice to reduce polygons wherever possible, to the degree that it doesn't detract from your central artistic vision and style.
High-poly meshes! (try reducing polygons where possible)
There are many techniques available to reduce polygon counts. Most 3D applications (such as 3ds Max, Maya, and Blender) offer automated tools that decimate the polygons in a mesh while retaining its basic shape and outline. These methods frequently make a mess of the topology, however, leaving you with faces and edge loops running in all directions. Even so, they can still be useful for reducing polygons in static meshes (meshes that never animate), such as statues, houses, or chairs, but they're typically a poor choice for animated meshes, where topology is especially important.
Reducing mesh polygons with automated methods can produce messy topology!
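If you want a quick overview of which meshes in a scene are the heaviest, a short editor script can list their triangle counts. The following is only a rough sketch (the class and menu names are placeholders) and should live in an Editor folder:

using UnityEngine;
using UnityEditor;
using System.Linq;

// Rough sketch: logs the triangle count of every mesh in the open scene,
// heaviest first, so candidates for polygon reduction are easy to spot.
public static class PolyCountReport
{
    [MenuItem("Tools/Report Scene Triangle Counts")]
    private static void Report()
    {
        var filters = Object.FindObjectsOfType<MeshFilter>()
            .Where(f => f.sharedMesh != null)
            .OrderByDescending(f => f.sharedMesh.triangles.Length);

        foreach (var mf in filters)
        {
            // Each triangle uses three indices, so divide by three for the face count.
            Debug.Log(mf.name + ": " + (mf.sharedMesh.triangles.Length / 3) + " triangles", mf);
        }
    }
}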
Note
If you want to know the total vertex and face count of a mesh, you can use your 3D software's statistics. Blender, Maya, 3ds Max, and most 3D software let you see the vertex and face counts of selected meshes directly from the viewport. However, this information should only be considered a rough guide! This is because, after importing a mesh into Unity, the vertex count frequently turns out higher than expected! There are many reasons for this, which are explained in more depth online at http://docs.unity3d.com/Manual/OptimizingGraphicsPerformance.html.
In short, treat the Unity vertex count as the final word on the actual vertex count of your mesh. To view the vertex count for an imported mesh in Unity, click on the right-arrow on the mesh thumbnail in the Project panel. This reveals the internal mesh asset. Select this asset, and then view the vertex count from the preview pane in the object Inspector.
Viewing the vertex and face count for meshes in Unity
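If you'd rather read these figures from code, for example, as part of an automated check, the counts Unity reports can also be queried through the Mesh class. The script below is just a sketch (names are illustrative); it logs the vertex and triangle counts for whichever model assets are selected in the Project panel:

using UnityEngine;
using UnityEditor;

// Sketch: logs the vertex and triangle counts Unity itself stores for the
// mesh sub-assets of any model files selected in the Project panel.
public static class MeshStats
{
    [MenuItem("Tools/Log Selected Mesh Stats")]
    private static void LogStats()
    {
        foreach (Object selected in Selection.objects)
        {
            string path = AssetDatabase.GetAssetPath(selected);
            foreach (Object sub in AssetDatabase.LoadAllAssetsAtPath(path))
            {
                Mesh mesh = sub as Mesh;
                if (mesh == null) continue;

                // mesh.vertexCount is the figure Unity actually renders with; it is
                // often higher than the count shown in your 3D modeling software.
                Debug.Log(mesh.name + ": " + mesh.vertexCount + " vertices, " +
                          (mesh.triangles.Length / 3) + " triangles");
            }
        }
    }
}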
Meshes - simulating bump details without geometry
As mentioned, try to keep meshes as low-poly as possible. Low-poly meshes are, however, of lower quality than higher-resolution meshes; they have fewer polygons and thereby hold less detail. Yet, this need not be problematic. Techniques exist for simulating detail in low-poly meshes, making them appear at a higher resolution than they really are. Normal Mapping is one example of this. Normal Maps are special textures that define the orientation of a mesh surface across its polygons and how those polygons interact with lighting. In short, a Normal Map specifies how lighting behaves over a mesh and ultimately affects how the mesh is shaded, which influences how we perceive its details. You can produce Normal Maps in many ways, most typically using 3D modeling software. By producing two versions of a mesh (namely, a high-poly version containing all the needed details, and a low-poly version to receive them), you can bake normal information from the high-poly mesh to the low-poly mesh via a texture file. This approach (known as Normal Map Baking) can lead to stunningly accurate and believable results, as follows:
Simulating high-poly detail with Normal Maps
However, if you don't have any Normal Maps for an imported mesh, Unity can generate them from a standard, diffuse texture via the Import Settings. This may not produce results as believable and physically accurate as Normal Map Baking, but it's useful for quickly and easily generating bump details that enhance the mood and realism of a scene. To create a Normal Map from a diffuse texture, first select the imported texture from the Project panel and duplicate it, making sure that the original version is not affected. Then, from the object Inspector, change the Texture Type (for the duplicate texture) from Texture to Normal map. This changes how Unity understands and works with the texture:
Configuring a texture as a Normal map
Setting the Texture Type to Normal map configures Unity to use and work with that texture in a specialized, optimized way for generating bump details on your model. However, when creating a Normal Map from a diffuse texture, you'll also need to enable the Create from Grayscale checkbox. When enabled, Unity generates a Normal Map from a grayscale version of the diffuse texture, using the Bumpiness and Filtering settings, as follows:
Enable Create from Grayscale for Normal maps
With Create from Grayscale enabled, you can use the Bumpiness slider to strengthen or weaken the bump effect and the Filtering setting to control the roughness or smoothness of the bump. When you've adjusted the settings as needed, confirm the changes and preview the result by pressing the Apply button in the object Inspector:
Customizing an imported Normal Map
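The same Create from Grayscale workflow can also be driven from an editor script through the TextureImporter API, which is handy when many duplicated diffuse textures need converting at once. The following is only a sketch, assuming a recent Unity version where the Normal map texture type and these importer properties are available; the menu name and the Bumpiness/Filtering values are arbitrary examples:

using UnityEditor;

// Sketch: converts the texture selected in the Project panel into a Normal Map
// generated from grayscale, mirroring the Inspector settings described above.
public static class GrayscaleNormalMap
{
    [MenuItem("Tools/Convert Selected Texture To Normal Map")]
    private static void Convert()
    {
        string path = AssetDatabase.GetAssetPath(Selection.activeObject);
        var importer = AssetImporter.GetAtPath(path) as TextureImporter;
        if (importer == null) return;

        importer.textureType = TextureImporterType.NormalMap;          // Texture Type: Normal map
        importer.convertToNormalmap = true;                             // Create from Grayscale
        importer.heightmapScale = 0.25f;                                // Bumpiness (example value)
        importer.normalmapFilter = TextureImporterNormalFilter.Smooth;  // Filtering
        importer.SaveAndReimport();                                     // Equivalent to pressing Apply
    }
}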
Meshes - minimize UV seams
Seams are edge cuts inserted into a mesh during UV mapping to help it unfold, flattening it out into 2D space for the purpose of texture assignment. This unwrapping is performed in 3D modeling software, but where you place the cuts is highly important for properly unfolding a model and getting it to look as intended inside Unity. An edge is classified as a seam in UV space when it borders only one face, as opposed to two. Essentially, the seams determine how a mesh's UVs are cut apart into separate UV shells or UV islands, which are then arranged into a final UV layout. This layout maps a texture onto the mesh surface, as follows:
Creating a UV layout
Always minimize UV seams where feasible by joining together disparate edges, shells, or islands into larger units. This is done not in Unity but in your 3D modeling software. Doing so can reduce the vertex count and complexity of your mesh, which improves runtime performance in Unity, because Unity must duplicate every vertex along a seam to satisfy the rendering standards of most real-time graphics hardware. Thus, wherever there are seams, there will be a doubling up of vertices, as shown here:
Binding together edges and islands to reduce UV seams
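To see this duplication for yourself, you can compare the vertex count Unity stores for a mesh against the number of unique positions it contains. The component below is a minimal sketch; the difference it reports also includes vertices split along hard edges and material boundaries, not just UV seams:

using UnityEngine;
using System.Collections.Generic;

// Sketch: attach to any object with a MeshFilter to compare stored vertices
// against unique positions; the gap is largely made up of seam/edge splits.
public class SeamVertexCheck : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
        var uniquePositions = new HashSet<Vector3>(mesh.vertices);

        Debug.Log(mesh.name + ": " + mesh.vertexCount + " stored vertices, " +
                  uniquePositions.Count + " unique positions");
    }
}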
Meshes - export as FBX
Unity officially supports many mesh import formats, including .ma, .mb, .max, .blend, and others. Details and comparisons of these are found online at http://docs.unity3d.com/Manual/3D-formats.html. Unity divides mesh formats into two main groups: exported and proprietary. The exported formats include .fbx and .dae. These are meshes exported manually from 3D modeling software into an independent, industry-recognized data-interchange format; it's feature limited but widely supported. The proprietary formats, in contrast, are application-specific formats that support a wider range of features but at the cost of compatibility. In short, you should almost always use the exported FBX file format. This is the most widely supported, used, and tested format within the Unity community; it supports imported meshes of all types, both static and animated, and it gives the best results. If you choose a proprietary format, you'll frequently end up importing additional 3D objects that you'll never use in your game, and your Unity project becomes tied to the 3D software itself. That is, you'll need a fully licensed copy of your 3D software installed on every machine where you intend to open the Unity project, which is a significant inconvenience.
Exporting meshes to an FBX file works best with Unity
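If your team tends to drop proprietary files straight into the project, a small AssetPostprocessor can flag them at import time. This is only a sketch (the class name and message are placeholders), but it shows the idea:

using UnityEditor;
using UnityEngine;
using System.IO;

// Sketch: warns whenever a proprietary-format model file is imported,
// nudging the team toward exported FBX files instead.
public class ProprietaryFormatWarning : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        string ext = Path.GetExtension(assetPath).ToLowerInvariant();
        if (ext == ".blend" || ext == ".max" || ext == ".ma" || ext == ".mb")
        {
            Debug.LogWarning("Proprietary model format imported: " + assetPath +
                             ". Consider exporting this asset to FBX instead.");
        }
    }
}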
Meshes - use meters scale (metric)
Unity measures 3D space using the metric system, and 1 world unit is understood by the physics system to mean 1 meter. Unity is configured to work with models from most 3D applications using their default settings. Sometimes, however, your models will appear too big or small when imported. This usually happens when the world units in your 3D modeling software are not configured to metric. The details of how to change units vary between programs such as Blender, Maya, and 3ds Max, but each allows unit customization from its preferences or scene settings.
Configuring 3D software to Metric units
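A quick way to catch scale problems early is to log the size of each model as it's imported. The postprocessor below is a rough sketch; the bounds check is only approximate, and the commented Scale Factor line shows where you could compensate at import time if fixing the units in the 3D software isn't an option:

using UnityEditor;
using UnityEngine;

// Sketch: logs the local bounds of each imported model so assets that are
// wildly too big or too small (wrong source units) stand out in the Console.
public class ScaleCheck : AssetPostprocessor
{
    void OnPostprocessModel(GameObject model)
    {
        MeshFilter mf = model.GetComponentInChildren<MeshFilter>();
        if (mf != null && mf.sharedMesh != null)
        {
            // One Unity unit equals one meter; a chair reported as 180 units
            // tall suggests the source file was authored in centimeters.
            Debug.Log(assetPath + " mesh bounds: " + mf.sharedMesh.bounds.size);
        }

        // If the source units can't be fixed, the importer's Scale Factor can
        // compensate instead, for example:
        // ((ModelImporter)assetImporter).globalScale = 0.01f;
    }
}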
Textures - never use lossy compression
Always save your textures in lossless formats, such as PNG, TGA, or PSD, and avoid lossy formats such as JPG, even though they're typically smaller in file size. JPG might be ideal for website images or for sending holiday snaps to your friends and family, but for video game textures it's problematic: the format discards image data, and quality degrades further with each successive save. By using lossless formats and removing JPG from every step of your workflow (including intermediary steps), your textures will remain crisp and sharp:
Saving textures to PNG files
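It's also easy to audit an existing project for lossy source files. The editor script below is a simple sketch (menu and class names are placeholders) that lists every texture stored as a JPG so it can be replaced with a lossless version:

using UnityEditor;
using UnityEngine;

// Sketch: finds every texture asset saved as a JPG/JPEG file in the project.
public static class LossyTextureAudit
{
    [MenuItem("Tools/Find JPG Textures")]
    private static void Find()
    {
        foreach (string guid in AssetDatabase.FindAssets("t:Texture2D"))
        {
            string path = AssetDatabase.GUIDToAssetPath(guid);
            string lower = path.ToLowerInvariant();
            if (lower.EndsWith(".jpg") || lower.EndsWith(".jpeg"))
            {
                Debug.LogWarning("Lossy source texture: " + path);
            }
        }
    }
}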
Textures - power of 2 sizes
If your textures are for 3D models and meshes (not sprites or GUI elements), then make their dimensions a power of 2 for best results. The textures needn't be square (equal in width and height), but each dimension should be a power-of-2 size. Valid sizes include 32, 64, 128, 256, 512, 1024, 2048, 4096, and 8192. Sizing textures to power-of-2 dimensions helps Unity scale textures up and down, and copy pixels between textures as needed, across the widest range of graphics hardware.
Creating textures at power-2 sizes
When creating textures, it's always best to design at the largest power-of-2 size you'll actually need (as opposed to the largest size allowed), and then downscale to smaller power-of-2 sizes where appropriate for older hardware and weaker systems, such as mobile devices. For each imported texture, you can use the platform tabs in the object Inspector to specify a maximum size per platform: one for desktop systems, one for Android, one for iOS, and so on. This caps the size allowed for the selected target on a per-platform basis; the value should be the smallest size compatible with your artistic intentions and intended quality.
Overriding texture sizes for other platforms
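The same per-platform override can be applied from script via the TextureImporter API, which is useful for batch-capping many textures at once. This is a sketch assuming Unity 5.5 or later; the platform name, 512 cap, and menu path are example values only:

using UnityEditor;

// Sketch: caps the selected texture at 512 pixels on Android, mirroring the
// Android platform tab override in the object Inspector.
public static class MobileTextureCap
{
    [MenuItem("Tools/Cap Selected Texture For Android")]
    private static void Cap()
    {
        string path = AssetDatabase.GetAssetPath(Selection.activeObject);
        var importer = AssetImporter.GetAtPath(path) as TextureImporter;
        if (importer == null) return;

        TextureImporterPlatformSettings android = importer.GetPlatformTextureSettings("Android");
        android.overridden = true;        // Enable the per-platform override.
        android.maxTextureSize = 512;     // Smaller cap for weaker mobile hardware.
        importer.SetPlatformTextureSettings(android);
        importer.SaveAndReimport();
    }
}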
Textures - alpha textures
Alpha textures are textures with transparency. When applied to 3D models, they make areas of the model transparent, allowing objects behind it to show through. Alpha textures can be either TGA files with dedicated alpha channels or PNG files with transparent pixels. In either case, alpha textures can render with artifacts in Unity if they're not created and imported correctly.
Creating alpha textures
If you need to use alpha textures, ensure that you check out the official Unity documentation on how to export them for optimal results from http://docs.unity3d.com/Manual/HOWTO-alphamaps.html.
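As a starting point, the sketch below shows two alpha-related import settings that often cause edge artifacts when misconfigured; it assumes Unity 5.5 or later, and the menu name is a placeholder. Enabling Alpha Is Transparency dilates color into transparent pixels, which helps avoid dark fringes around the edges of transparent areas:

using UnityEditor;

// Sketch: configures the selected transparent PNG/TGA so its own alpha channel
// is used and colors are dilated into transparent pixels to reduce artifacts.
public static class AlphaTextureSetup
{
    [MenuItem("Tools/Configure Selected Alpha Texture")]
    private static void Configure()
    {
        string path = AssetDatabase.GetAssetPath(Selection.activeObject);
        var importer = AssetImporter.GetAtPath(path) as TextureImporter;
        if (importer == null) return;

        importer.alphaSource = TextureImporterAlphaSource.FromInput; // Use the file's alpha/transparency.
        importer.alphaIsTransparency = true;                         // Dilate colors to reduce edge fringing.
        importer.SaveAndReimport();
    }
}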