Using a collider-based system


In this article by Jorge Palacios, the author of Unity 5.x Game AI Programming Cookbook, you will learn how to implement agent awareness using a mixed approach that builds on the sensory-level algorithms learned previously.

Seeing using a collider-based system

This is probably the easiest way to simulate vision. We take a collider, be it a mesh or a Unity primitive, and use it as the tool to determine whether an object is inside the agent's vision range or not.

Getting ready

It's important to have a collider component attached to the same game object as the script in this recipe, as for the other collider-based algorithms in this chapter. In this case, it's recommended that the collider be a pyramid-shaped one in order to simulate a vision cone. The fewer polygons it has, the faster it will run in the game.
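
As a reference only, such a cone can be built procedurally. The following is a minimal sketch, not part of the original recipe, that creates a low-polygon pyramid mesh and assigns it to a convex MeshCollider; the reach and width values are placeholders to be tuned per agent:

    using UnityEngine;

    // Sketch: builds a four-sided pyramid pointing along the local +Z axis
    // and uses it as a convex MeshCollider to act as a vision cone.
    // The reach and halfWidth values are placeholders.
    public class VisionConeBuilder : MonoBehaviour
    {
       public float reach = 10f;     // distance from the apex to the base
       public float halfWidth = 4f;  // half the size of the base

       void Awake()
       {
           Mesh mesh = new Mesh();
           mesh.vertices = new Vector3[]
           {
               Vector3.zero,                                  // apex at the agent
               new Vector3(-halfWidth,  halfWidth, reach),
               new Vector3( halfWidth,  halfWidth, reach),
               new Vector3( halfWidth, -halfWidth, reach),
               new Vector3(-halfWidth, -halfWidth, reach)
           };
           mesh.triangles = new int[]
           {
               0, 2, 1,  0, 3, 2,  0, 4, 3,  0, 1, 4,  // sides
               1, 2, 3,  1, 3, 4                        // base
           };
           mesh.RecalculateNormals();

           MeshCollider coneCollider = gameObject.AddComponent<MeshCollider>();
           coneCollider.sharedMesh = mesh;
           coneCollider.convex = true; // convex keeps the collision checks cheap
       }
    }

Keep in mind that collision messages such as OnCollisionStay require a non-kinematic Rigidbody on at least one of the two colliding objects.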

How to do it…

We will create a component that is able to see enemies nearby by performing the following steps:

  1. Create the Visor component, declaring its member variables. It is important to add the corresponding tags (Wall and Enemy) to Unity's tag configuration:
    using UnityEngine;
    using System.Collections;
    
    public class Visor : MonoBehaviour
    {
       public string tagWall = "Wall";
       public string tagTarget = "Enemy";
       public GameObject agent;
    }
  2. Implement the Start function to initialize the agent member in case it hasn't been assigned in the Inspector:
    void Start()
    {
       if (agent == null)
           agent = gameObject;
    }
  3. Declare the function that is called on every frame a collision persists, and build its body over the following steps:
    public void OnCollisionStay(Collision coll)
    {
       // next steps here
    }
  4. Discard the collision if it is not a target:
    string tag = coll.gameObject.tag;
    if (!tag.Equals(tagTarget))
       return;
  5. Get the game object's position and compute its direction from the Visor:
    GameObject target = coll.gameObject;
    Vector3 agentPos = agent.transform.position;
    Vector3 targetPos = target.transform.position;
    Vector3 direction = targetPos - agentPos;
  6. Compute its length and create a new ray to be shot soon:
    float length = direction.magnitude;
    direction.Normalize();
    Ray ray = new Ray(agentPos, direction);
  7. Cast the created ray and retrieve all the hits:
    RaycastHit[] hits;
    hits = Physics.RaycastAll(ray, length);
  8. Check for any wall between the visor and the target. If there is none, we can call our own functions or trigger the desired behavior:
    int i;
    for (i = 0; i < hits.Length; i++)
    {
       GameObject hitObj;
       hitObj = hits[i].collider.gameObject;
       tag = hitObj.tag;
       if (tag.Equals(tagWall))
          return;
    }
    // TODO
    // target is visible
    // code your behaviour below

How it works…

The collider component checks every frame whether it is colliding with any other game object in the scene. We leverage the optimizations in Unity's physics engine and scene graph, and focus only on handling valid collisions.

After checking whether a target object is inside the vision range represented by the collider, we cast a ray in order to check whether it is really visible or there is a wall in between.
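
As an illustration only, the // TODO block at the end of step 8 could report the sighting and notify other components; the onTargetSighted callback below is an assumption added for this sketch, not part of the recipe:

    // Assumed extra field on Visor, added for illustration:
    //    public System.Action<GameObject> onTargetSighted;

    // target is visible; report it and notify any subscriber
    Debug.Log(agent.name + " sees " + target.name);
    if (onTargetSighted != null)
       onTargetSighted(target);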

Hearing using a collider-based system

In this recipe, we will emulate the sense of hearing by developing two entities: a sound emitter and a sound receiver. It is based on the principles proposed by Millington for simulating a hearing system, and uses the power of Unity colliders to detect receivers near an emitter.

Getting ready

As with the other recipes based on colliders, we will need collider components attached to every object to be checked and rigid body components attached to either emitters or receivers.

How to do it…

We will create the SoundReceiver class for our agents and SoundEmitter for things such as alarms:

  1. Create the class for the SoundReceiver object:
    using UnityEngine;
    using System.Collections;
    
    public class SoundReceiver : MonoBehaviour
    {
       public float soundThreshold;
    }
  2. Define the virtual function that handles the reception of sound; derived classes will override it with their own behavior:
    public virtual void Receive(float intensity, Vector3 position)
    {
       // TODO
       // code your own behavior here
    }
  3. Now, let's create the class for the SoundEmitter object:
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    
    public class SoundEmitter : MonoBehaviour
    {
       public float soundIntensity;
       public float soundAttenuation;
       public GameObject emitterObject;
       private Dictionary<int, SoundReceiver> receiverDic;
    }
  4. Initialize the dictionary of nearby receivers, as well as emitterObject in case the component is attached directly to the emitting object:
    void Start()
    {
       receiverDic = new Dictionary<int, SoundReceiver>();
       if (emitterObject == null)
           emitterObject = gameObject;
    }
  5. Implement the function for adding new receivers to the list when they enter the emitter bounds:
    public void OnCollisionEnter(Collision coll)
    {
       SoundReceiver receiver;
       receiver = coll.gameObject.GetComponent<SoundReceiver>();
       if (receiver == null)
           return;
       int objId = coll.gameObject.GetInstanceID();
       receiverDic.Add(objId, receiver);
    }
  6. Also, implement the function for removing receivers from the list when they are out of reach:
    public void OnCollisionExit(Collision coll)
    {
       SoundReceiver receiver;
       receiver = coll.gameObject.GetComponent<SoundReceiver>();
       if (receiver == null)
           return;
       int objId = coll.gameObject.GetInstanceID();
       receiverDic.Remove(objId);
    }
  7. Define the function for emitting sound waves to nearby agents:
    public void Emit()
    {
       GameObject srObj;
       Vector3 srPos;
       float intensity;
       float distance;
       Vector3 emitterPos = emitterObject.transform.position;
       // next step here
    }
  8. Compute sound attenuation for every receiver:
    foreach (SoundReceiver sr in receiverDic.Values)
    {
       srObj = sr.gameObject;
       srPos = srObj.transform.position;
       distance = Vector3.Distance(srPos, emitterPos);
       intensity = soundIntensity;
       intensity -= soundAttenuation * distance;
       if (intensity < sr.soundThreshold)
           continue;
       sr.Receive(intensity, emitterPos);
    }

How it works…

The collision events register agents in the dictionary of receivers assigned to an emitter. The sound emission function then takes each agent's distance from the emitter into account in order to decrease the sound's intensity, applying the concept of sound attenuation.
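
As a usage sketch, an alarm object could fire the emitter on a timer; the component name and interval below are assumptions:

    using UnityEngine;

    // Sketch: calls Emit() on the SoundEmitter attached to the same game
    // object at a fixed interval. The interval value is a placeholder.
    public class AlarmSiren : MonoBehaviour
    {
       public float interval = 2f;
       private SoundEmitter emitter;
       private float timer;

       void Start()
       {
           emitter = GetComponent<SoundEmitter>();
           timer = interval;
       }

       void Update()
       {
           timer -= Time.deltaTime;
           if (timer > 0f)
               return;
           timer = interval;
           if (emitter != null)
               emitter.Emit();
       }
    }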

There is more…

We can develop a more flexible algorithm by defining different types of walls that affect sound intensity. It works by casting a ray from the emitter to the receiver and adding each wall's attenuation value to the total sound attenuation (a sketch of one way to populate the wall-type dictionary follows the steps):

  1. Create a dictionary to store wall types as strings (using tags) and their corresponding attenuation:
    public Dictionary<string, float> wallTypes;
  2. Reduce sound intensity this way:
    intensity -= GetWallAttenuation(emitterPos, srPos);
  3. Define the function called in the previous step:
    public float GetWallAttenuation(Vector3 emitterPos, Vector3 receiverPos)
    {
       // next steps here
    }
  4. Compute the necessary values for ray casting:
    float attenuation = 0f;
    Vector3 direction = receiverPos - emitterPos;
    float distance = direction.magnitude;
    direction.Normalize();
  5. Cast the ray and retrieve the hits:
    Ray ray = new Ray(emitterPos, direction);
    RaycastHit[] hits = Physics.RaycastAll(ray, distance);
  6. For every wall type found via tags, add up its value (stored in the dictionary):
    int i;
    for (i = 0; i < hits.Length; i++)
    {
       GameObject obj;
       string tag;
       obj = hits[i].collider.gameObject;
       tag = obj.tag;
       if (wallTypes.ContainsKey(tag))
           attenuation += wallTypes[tag];
    }
    return attenuation;
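
Since a Dictionary cannot be filled from the Inspector, the wall types have to be populated in code. A minimal sketch follows; the tags and attenuation values are placeholders chosen for illustration:

    // Sketch: populating wallTypes inside the SoundEmitter.
    // The tags and values below are placeholders.
    void Awake()
    {
       wallTypes = new Dictionary<string, float>();
       wallTypes.Add("Wall", 10f);   // thick walls dampen the sound strongly
       wallTypes.Add("Glass", 2f);   // glass lets most of the sound through
    }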

Smelling using a collider-based system

Smelling can be simulated by computing collisions between an agent and odor particles scattered throughout the game level.

Getting ready

As with the other collider-based recipes, we will need collider components attached to every object to be checked; in this case, that means the agents and the odor particles.

How to do it…

We will develop the scripts needed to represent odor particles and agents able to smell:

  1. Create the particle's script and define its member variables for computing its lifespan:
    using UnityEngine;
    using System.Collections;
    
    public class OdorParticle : MonoBehaviour
    {
       public float timespan;
       private float timer;
    }
  2. Implement the Start function for proper validations:
    void Start()
    {
       if (timespan < 0f)
           timespan = 0f;
       timer = timespan;
    }
  3. Implement the timer and destroy the object after its life cycle:
    void Update()
    {
       timer -= Time.deltaTime;
       if (timer < 0f)
           Destroy(gameObject);
    }
  4. Create the class for representing the sniffer agent:
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    
    public class Smeller : MonoBehaviour
    {
       private Vector3 target;
       private Dictionary<int, GameObject> particles;
    }
  5. Initialize the dictionary for storing odor particles:
    void Start()
    {
       particles = new Dictionary<int, GameObject>();
    }
  6. Add to the dictionary the colliding objects that have the odor-particle component attached:
    public void OnCollisionEnter(Collision coll)
    {
       GameObject obj = coll.gameObject;
       OdorParticle op;
       op = obj.GetComponent<OdorParticle>();
       if (op == null)
           return;
       int objId = obj.GetInstanceID();
       particles.Add(objId, obj);
       UpdateTarget();
    }
  7. Release the odor particles from the local dictionary when they are out of the agent's range or are destroyed:
    public void OnCollisionExit(Collision coll)
    {
       GameObject obj = coll.gameObject;
       int objId = obj.GetInstanceID();
       bool isRemoved;
       isRemoved = particles.Remove(objId);
       if (!isRemoved)
           return;
       UpdateTarget();
    }
  8. Create the function for computing the odor centroid according to the current elements in the dictionary:
    private void UpdateTarget()
    {
       Vector3 centroid = Vector3.zero;
       foreach (GameObject p in particles.Values)
       {
           Vector3 pos = p.transform.position;
           centroid += pos;
       }
       if (particles.Count > 0)
           centroid /= particles.Count;
       target = centroid;
    }
  9. Implement the function for retrieving the odor centroid, if any:
    public Vector3? GetTargetPosition()
    {
       if (particles.Keys.Count == 0)
           return null;
       return target;
    }

How it works…

Just like the collider-based hearing recipe, we use collision events to register odor particles in an agent's perception (implemented with a dictionary). Whenever a particle is added or removed, the odor centroid is recomputed. However, we expose a function to retrieve that centroid because the internal centroid position is not updated when no odor particles are registered.
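
As a usage sketch, an agent can query the centroid and steer toward it when one is available; the speed field and the simple movement below are assumptions for illustration:

    // Sketch: inside some agent behavior script with an assumed "speed" field.
    Smeller smeller = GetComponent<Smeller>();
    Vector3? targetPos = smeller.GetTargetPosition();
    if (targetPos.HasValue)
    {
       Vector3 direction = (targetPos.Value - transform.position).normalized;
       transform.position += direction * speed * Time.deltaTime;
    }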

There is more…

The particle emission logic is left to be implemented according to our game's needs; it basically consists of instantiating odor-particle prefabs. Also, it is recommended to attach the rigid body components to the agents rather than to the particles, because odor particles are prone to being instantiated in large numbers, which would hurt the game's performance.
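
A minimal emission sketch could look like the following; the prefab reference, interval, and spawn radius are assumptions to be adapted to the game's needs:

    using UnityEngine;

    // Sketch: instantiates odor-particle prefabs around the emitter at a
    // fixed interval. The prefab is expected to carry an OdorParticle
    // component and a collider; interval and radius are placeholders.
    public class OdorEmitter : MonoBehaviour
    {
       public GameObject odorParticlePrefab;
       public float interval = 1f;
       public float radius = 0.5f;
       private float timer;

       void Update()
       {
           timer -= Time.deltaTime;
           if (timer > 0f || odorParticlePrefab == null)
               return;
           timer = interval;
           Vector3 offset = Random.insideUnitSphere * radius;
           Instantiate(odorParticlePrefab, transform.position + offset, Quaternion.identity);
       }
    }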

Seeing using a graph-based system

We will now start working with recipes that use graph-based logic to simulate senses. Again, we begin with the sense of vision.

Getting ready

It is important to have a good grasp of the chapter on pathfinding in order to understand the inner workings of the graph-based recipes.
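
For reference, this recipe only relies on two members of that Graph class. The following interface is a sketch of the expected signatures; the interface name is ours, and the actual implementation comes from the pathfinding chapter:

    using UnityEngine;
    using System.Collections.Generic;

    // Sketch of the two Graph members used by this recipe; the real Graph
    // class is developed in the pathfinding chapter.
    public interface IVisionGraph
    {
       // Returns the id of the graph vertex closest to the given game object
       int GetNearestVertex(GameObject obj);

       // Returns the ids of the vertices connected to vertex v
       List<int> GetNeighbors(int v);
    }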

How to do it…

We will just implement a new file:

  1. Create the class for handling vision:
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    
    public class VisorGraph : MonoBehaviour
    {
       public int visionReach;
       public GameObject visorObj;
       public Graph visionGraph;
    }
  2. Validate the visor object:
    void Start()
    {
       if (visorObj == null)
           visorObj = gameObject;
    }
  3. Define and start building the function needed to detect visibility of a given set of nodes:
    public bool IsVisible(int[] visibilityNodes)
    {
       int vision = visionReach;
       int src = visionGraph.GetNearestVertex(visorObj);
       HashSet<int> visibleNodes = new HashSet<int>();
       Queue<int> queue = new Queue<int>();
       queue.Enqueue(src);
       // next steps here
    }
  4. Implement a breadth-first search algorithm:
    while (queue.Count != 0)
    {
       if (vision == 0)
           break;
       vision--; // each expanded vertex consumes one unit of vision reach
       int v = queue.Dequeue();
       List<int> neighbours = visionGraph.GetNeighbors(v);
       foreach (int n in neighbours)
       {
           if (visibleNodes.Contains(n))
               continue;
           queue.Enqueue(n);
           visibleNodes.Add(n);
       }
    }
  5. Compare the set of nodes reached by the vision system with the set of nodes received as a parameter:
    foreach (int vn in visibilityNodes)
    {
       if (visibleNodes.Contains(vn))
           return true;
    }
  6. Return false if there is no match between the two sets of nodes:
    return false;

How it works…

The recipe uses the breadth-first search algorithm to discover the nodes within the visor's vision reach, and then compares this set of nodes with the set of nodes where the agents reside.
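
As a usage sketch, the caller maps the agents to their nearest vertices and passes that array in; the enemies array below is an assumed field listing the agents to test:

    // Sketch: checking whether any enemy stands on a node visible to the visor.
    // "enemies" is an assumed GameObject[] field.
    VisorGraph visor = GetComponent<VisorGraph>();
    int[] enemyNodes = new int[enemies.Length];
    for (int i = 0; i < enemies.Length; i++)
       enemyNodes[i] = visor.visionGraph.GetNearestVertex(enemies[i]);
    if (visor.IsVisible(enemyNodes))
       Debug.Log("At least one enemy is in sight");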

Summary

In this article, we explained some algorithms for simulating senses and agent awareness.
