Introduction
In the last couple of posts, we've explored some of the abilities of the UE4 Navigation System.
In this post, we're going to look at setting up some rudimentary AI logic using Blueprints/C++. We're still going to touch on some Navigation concepts, however - specifically, Filtering and Avoidance.
By the end of today, you're going to have a fully functioning AI vs AI "Game of Tag"!
Specifically, we're going to add:
- A "tagged" status, to indicate if our AI is "it" or "not-it". When the "tagged" status is changed, we're going to change:
- The colour of the mesh
- The maximum walking speed
- "it" bots can move slightly faster
- Depending upon being "tagged", the AI will behave differently:
- "it" bots will move towards the nearest "not-it" bot that it can see, or a random spot on the map if it can't see
- "not-it" bots will simply run to random points on the map
- but we'll bias this to try and run away from any visible "it" bots!
- we'll also turn on Avoidance for them, so they try and avoid being touched by the "it" bots, even when passing close by!
- We'll make it work with a little gameplay - an "it" bot can "touch" a "not-it" bot to make it "it".
Note: this variant of tag is sometimes known as "zombie tag"
- Finally, we're going to use some Navigation Filtering to change whether the bot can use the Launchpads, so that "it" bots can't use them - giving the "not-it" bots a chance to get away!
A lot of this, if not all of it, will be replaced in later stages (using the UE4 Behaviour Tree, Environment Query System, and Perception System), but it's useful to have a simple baseline script to compare against.
The project has been updated to use version 4.18. This change required no adjustments to the code that we've written so far.
If you haven't followed along, or just want to make sure you're on the same page, you can download the current state of the project from the bitbucket repository.
Apologies for the delay between posts - but hopefully you all enjoy this one! Due to the time taken to get this ready for publication, 4.19 has just been released! There appear to be no issues with using 4.19 at this stage, and the next post will be released using 4.19.
AI Character: Tagged Status
The first thing we need to do is add a "Tagged" status to the Character. We do this on the Character class because it is game logic, not input logic.
We're also going to make this a visible change, so in a new ChangeTaggedStatus function we're going to set the material of the character.
We're also going to add two collision volumes that we can use later. One will be for "it" bots to tag a new bot, and the other will be for the AI's "vision radius".
Before we start scripting, we'll need to create two Material Instances of the Mannequin Body Material found at TemplateContent/Mannequin/Character/Materials/M_UE4Man_Body.
You can then pick any two colours that you want to use for "it" and "not-it" bots. I chose Red and Blue:
Tip: If you want to learn more about Material Instances, the Documentation is a great resource!
Blueprint: Tagged Status
Although we could use an Enum to indicate the status of our bot, because we are either "tagged" or not, we're just going to create a single public Boolean variable Tagged on our Character class:
If we had more than two states, we would use an enum (and the rest of the tutorial would continue the same).
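For reference, a minimal sketch of what such an enum could look like in C++ - ETagStatus and its entries are hypothetical names, for illustration only:

// A hypothetical status enum, for if the game ever grows beyond two states.
// BlueprintType exposes it to the Blueprint editor too.
UENUM(BlueprintType)
enum class ETagStatus : uint8
{
    NotIt,
    It,
    Frozen // e.g. for a "freeze tag" variant
};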
We're now going to create a function, ChangeTaggedStatus, which will allow us to use this. This function is going to do more than just set the value of our variable: it's also going to set the material on our mesh and change our maximum walking speed:
Note: You could easily replace the 'magic' numbers 660.f and 600.f with variables that the designers could tweak in the editor.
We also want to be able to set our Tagged status in the editor, so let's call this function in the construction script:
Now we have a functioning "tag" in our game logic!
Before we continue, let's add two collision volumes. One is our "interaction" volume (where an "it" bot can touch a "not-it" bot), and the other is our "vision" volume:
Note: The reason we use a Vision volume is because we want to ray-trace against every known possible enemy. This is fairly cheap with a Collider, because they update their contained actors automatically as we move, but if we were to use (for example) Get All Actors of Class to find all possible enemies, it would be very expensive! Later in this series we'll use the built-in Unreal Engine systems to handle this for us.
C++: Tagged Status
There are a couple of small things to note when doing this in C++.
Firstly, if you've downloaded a copy of the project at the top of this blog, you'll notice that there are two characters. We will be using the one with the _cpp suffix (that does not contain any BP logic).
Halfway through the development of this post, I moved this into a cpp folder, and split up the level files into (BP/cpp/Geo). If you're following just C++, you shouldn't have an issue.
Secondly, our Controller is in C++, and you can't utilise blueprint functions from C++ without a little bit of messing around! The main reason we want the BP character, anyway, is because it's easier to set Meshes and Collision sizes there, etc. So for this section, we're going to:
- Create a new ATagCharacter class
  - Derived from the ACharacter class
  - Add the variables, functions, and components which we might want to utilise in C++
- Reparent the Blueprint class from the Character class to the TagCharacter class in the Class Settings panel
- Set the dimensions of our collision volumes in the child BP_TagCharacter_cpp class!
Once you create the ATagCharacter class, it'll have a load of cruft in there that we don't need. You can empty out the class declaration and fill it in with:
UCLASS()
class TAG_API ATagCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    ATagCharacter();

    // Denotes the area that an "it" bot can "tag"
    UPROPERTY(VisibleAnywhere)
    UShapeComponent* InteractionVolume;

    // Denotes the area a bot can see
    UPROPERTY(VisibleAnywhere)
    UShapeComponent* VisionVolume;

    // Whether the bot is "it" or "not-it"
    // Do not edit directly in any code (use ChangeTaggedStatus)
    UPROPERTY(EditAnywhere, BlueprintReadOnly)
    bool Tagged;

    // Sets the correct material and speed when changing Tag status
    UFUNCTION(BlueprintCallable)
    void ChangeTaggedStatus(bool NewStatus);

    // Call ChangeTaggedStatus if required
    void OnConstruction(const FTransform& Transform) override;

private:
    // Using static material interfaces,
    // we can ensure we only load each material once
    static UMaterialInterface* RedMaterial;
    static UMaterialInterface* BlueMaterial;
};
With just the above, you can probably fill in the implementation yourself! There's nothing particularly novel that we're doing. Let's go over each function, however.
Firstly, the constructor. All we have to do is create our two components (I chose a Box and a Sphere as my shapes of choice) and load our materials into the static member variables:
// By default, the materials are null
UMaterialInterface* ATagCharacter::RedMaterial = nullptr;
UMaterialInterface* ATagCharacter::BlueMaterial = nullptr;

ATagCharacter::ATagCharacter()
{
    // We are not going to use on-tick logic in our character if we can avoid it
    PrimaryActorTick.bCanEverTick = false;

    // Build default shapes for our volumes
    // (SetupAttachment is the constructor-safe way to attach components)
    InteractionVolume = CreateDefaultSubobject<UBoxComponent>(TEXT("Interaction Volume"));
    InteractionVolume->SetupAttachment(RootComponent);
    VisionVolume = CreateDefaultSubobject<USphereComponent>(TEXT("Vision Volume"));
    VisionVolume->SetupAttachment(RootComponent);

    // If the static material hasn't been loaded yet
    if (BlueMaterial == nullptr)
    {
        // Load the Material from our Content folder
        auto BlueMaterialLoader =
            ConstructorHelpers::FObjectFinderOptional<UMaterialInterface>(
                TEXT("/Game/TemplateContent/Mannequin/Character/Materials/MI_Body_Blue"));
        if (BlueMaterialLoader.Succeeded())
        {
            BlueMaterial = BlueMaterialLoader.Get();
        }
    }

    // If the static material hasn't been loaded yet
    if (RedMaterial == nullptr)
    {
        // Load the Material from our Content folder
        auto RedMaterialLoader =
            ConstructorHelpers::FObjectFinderOptional<UMaterialInterface>(
                TEXT("/Game/TemplateContent/Mannequin/Character/Materials/MI_Body_Red"));
        if (RedMaterialLoader.Succeeded())
        {
            RedMaterial = RedMaterialLoader.Get();
        }
    }
}
Note: If you make a mistake with the Constructor Helper calls, it may cause your project to crash upon opening. If that happens, it's easy enough to open the code file and comment the offending line out. Just be careful with the asset path and you should be ok! Consider yourselves warned.
Next, the OnConstruction function simply calls ChangeTaggedStatus (as described in the header).
void ATagCharacter::OnConstruction(const FTransform& Transform)
{
    Super::OnConstruction(Transform);
    ChangeTaggedStatus(Tagged);
}
Finally, the ChangeTaggedStatus function does three things: it sets Tagged to the argument passed in, then, based on that value, calls SetMaterial on the Mesh and sets MaxWalkSpeed on the CharacterMovementComponent.
void ATagCharacter::ChangeTaggedStatus(bool NewStatus)
{
    Tagged = NewStatus;

    // Select the correct material using the ternary operator
    auto Material = Tagged ? RedMaterial : BlueMaterial;

    // If we have a mesh
    auto Mesh = GetMesh();
    if (Mesh)
    {
        // Apply the material
        Mesh->SetMaterial(0, Material);
    }

    // Select the walk speed using the ternary operator
    auto MaxWalkSpeed = Tagged ? 660.f : 600.f;
    // Apply the walk speed
    GetCharacterMovement()->MaxWalkSpeed = MaxWalkSpeed;
}
Now that we've made our base Character class in C++, let's change the Parent Class of our BP_TagCharacter_cpp. We can do this in the Class Settings panel:
Once you've done that, you can edit the settings of the Interaction and Vision Volumes so that they fit exactly where you want them to:
Tip: This is one of the best ways to work with C++ and BP in Unreal. Using C++ to drive your core game logic, and using the BP visual editor to help select and place your Meshes/Materials/etc.
Note: You could easily replace the 'magic' numbers 660.f and 600.f with UPROPERTYs that the designers could tweak in the editor.
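For example, a quick sketch of how that might look - TaggedWalkSpeed and UntaggedWalkSpeed are hypothetical names, not part of the project code:

// TagCharacter.h - hypothetical designer-tweakable speeds
UPROPERTY(EditAnywhere, Category = "Tag")
float TaggedWalkSpeed;
UPROPERTY(EditAnywhere, Category = "Tag")
float UntaggedWalkSpeed;

// TagCharacter.cpp - ATagCharacter::ATagCharacter()
TaggedWalkSpeed = 660.f;
UntaggedWalkSpeed = 600.f;

// TagCharacter.cpp - ATagCharacter::ChangeTaggedStatus()
GetCharacterMovement()->MaxWalkSpeed = Tagged ? TaggedWalkSpeed : UntaggedWalkSpeed;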
Fixing the Launchpad
At this point, you may notice that your AI starts having weird interactions with any Launchpads around the level - this is because the extra volumes we've created are triggering the Launchpad!
We're going to have to make a couple of fixes.
Blueprint: Fixing the Launchpad
Firstly, before we trigger the launch, let's check that the Other Comp was the Root Component of the Other Actor. This means the launch will only trigger when we touch the Launchpad with our main collider:
Secondly, we'll need to change the Calculate Launch Velocity function we wrote to use the bounds of the root component, instead of the actor, when calculating our Start location for the Launch:
And now the Launchpads should work properly again!
C++: Fixing the Launchpad
Firstly, before we trigger the launch, let's check that the OtherComp was the Root Component of the Other Actor. This means the launch will only trigger when we touch the Launchpad with our main collider. To do so, change the line in OnTriggerBeginOverlap from:
if (Character)
To:
if (Character && OtherComp == Character->GetRootComponent())
Secondly, we'll need to change the CalculateLaunchVelocity function we wrote to use the bounds of the root component, instead of the actor, when calculating our Start location for the Launch.
This will be easier if we change the function's AActor* LaunchedActor parameter to ACharacter* LaunchedCharacter (as we have already cast the Character, this should work easily).
Then, the line that was:
Start.Z -= LaunchedActor->GetSimpleCollisionHalfHeight();
Becomes:
Start.Z -= LaunchedCharacter->GetCapsuleComponent()->GetScaledCapsuleHalfHeight();
And now the Launchpads should work properly again!
AI Controller: Movement Behaviour
Now that we have our status, we're going to tweak our Controller to take notice of this.
Firstly, we're going to need a way for our AI to "see" other bots!
Then, if we're "it", we'll move straight towards them!
If we're "not-it", we're just going to move to a random point. However, if we can see some "it" bots, we're going to bias our search for our random point away from any visible "its".
Blueprint: Movement Behaviour
To let our AI "see" the other bots inside the vision sphere, let's make a new pure function inside our BP_TagCharacter - GetVisibleEnemies. This function is going to get all of the AI Characters that are inside the volume, filter by "Tagged" status, and then ray-cast to them to ensure we can see them.
The initial part of the function looks like this:
The raycast is a little tricky. What we want to do is raycast from the eyes of our AI, towards the centre of the possibly visible enemy. If the first object we touch is the enemy, then we know we have a direct "line of sight"! To do this, we're going to write a new pure function, CanSeeTarget:
Note: Pay attention to the argument of the Trace Channel parameter - we change from Visibility to Camera, because Characters don't block Visibility by default!
Note: By default, Base Eye Height is 64, but we should set it to be 72 for the default mannequin. This can be set on the character, under the Camera heading.
Note: If you wanted to be a bit more accurate, you could raycast from the eyes to the feet, centre, and head, and if any of the three raycasts succeeds you can see the target. But that may be overkill!
Hook up the new function in the comment we left earlier, and it should end up looking like this:
Now that the Character has its gameplay logic hooked up, we can head over to the Controller and start changing how our AI thinks.
The first thing we need to do is store a reference to our Controlled Pawn which has been Cast correctly, because we need to start using the values (like Tagged) that we're adding to the BP_TagCharacter class. We're also going to replace calls of Go To Random Waypoint with Movement Behaviour.
Note: We have to call Movement Behaviour after we store our Tag Character.
Movement Behaviour is a really simple function which abstracts away how we're handling our movement:
In our It Behaviour function, we want to Move to Actor on the first enemy we can see, if we can see any. Otherwise, we'll Go To Random Waypoint:
Tip: You can make the behaviour a bit more deterministic (and smarter) by making it move towards the nearest target, rather than the 0th target in the array.
In our Not-it Behaviour function, we're going to use similar logic. However, if enemies are around, we're going to generate three random waypoints, and pick the one that is the best option. We'll do this logic in a new pure function, Find Flee Target.
By the time we get into the function, we are guaranteed to have at least one enemy. To bias our waypoint target away from all enemies, we're going to generate three Random Waypoints and then we'll find the one that is the furthest away from all enemies.
Note: At this point, I usually prefer to write the code in C++, as it can get quite messy in BP, but it's perfectly viable.
The first thing we need to do inside Find Flee Target is to set up some local variables:
Note: Local variables are accessible only during the function call, and will be set to their default values every time the function is called (i.e. they don't carry over information).
Next, we're going to loop through each of the Waypoints that we generated, and figure out which is the furthest away from all the visible enemies:
Now if you drop in two AI characters, one of which is Tagged by default, you should see them start chasing each other around!
C++: Movement Behaviour
Firstly, let's add a couple of functions to our TagCharacter:
public:
    UFUNCTION(BlueprintCallable)
    const TArray<ATagCharacter*> GetVisibleEnemies() const;

    UFUNCTION(BlueprintPure)
    bool CanSeeEnemy(ATagCharacter* Target) const;
The code for these is rather simple:
const TArray<ATagCharacter*> ATagCharacter::GetVisibleEnemies() const
{
    static TArray<AActor*> VisibleCharacters;
    VisionVolume->GetOverlappingActors(VisibleCharacters, ATagCharacter::StaticClass());

    TArray<ATagCharacter*> VisibleEnemies;
    for (auto Character : VisibleCharacters)
    {
        if (Character == this)
            continue;

        auto TagCharacter = Cast<ATagCharacter>(Character);
        // If they don't have the same tag as me, and I can see them
        if (TagCharacter->Tagged != Tagged && CanSeeEnemy(TagCharacter))
        {
            VisibleEnemies.Add(TagCharacter);
        }
    }
    return VisibleEnemies;
}
Note: We use function-local static initialisation to prevent us from creating TArrays for VisibleCharacters with every call to this function. It might be an unnecessary optimisation at this point - but this kind of thinking early on can help you scale your code! We don't do this for the second array, because we can rely on Return Value Optimisation to take care of that.
bool ATagCharacter::CanSeeEnemy(ATagCharacter* Target) const
{
    // Raycast from my eyes
    FVector RayOrigin{ GetCapsuleComponent()->Bounds.Origin };
    RayOrigin.Z += BaseEyeHeight;

    // To the middle of their body
    FVector RayTarget{ Target->GetCapsuleComponent()->Bounds.Origin };

    static FHitResult HitResult(ForceInit);
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);
    GetWorld()->LineTraceSingleByChannel(HitResult, RayOrigin, RayTarget, ECC_Camera, Params);
    return HitResult.Actor == Target;
}
Note: We use the same static initialisation for the HitResult. Using "magic statics" is ideal in situations where you will be creating a large data structure (>32 bytes) frequently, but do not need it to remain useful for a long period of time. This does mean, however, that you cannot truly multithread such code.
Note: Alternatively, you can use a private member variable to achieve the same effect. However, this increases the size of your TagCharacter per instance, and any function utilising it cannot be const (without using the mutable type specifier).
Pay attention to the argument of the Trace Channel parameter - we change from Visibility to Camera, because Characters don't block Visibility by default!
By default, Base Eye Height is 64, but we should set it to be 72 for the default mannequin. This can be set on the character, under the Camera heading, or in the constructor.
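If you prefer the constructor route, it's a one-liner:

// TagCharacter.cpp - ATagCharacter::ATagCharacter()
// The default is 64; 72 matches the default mannequin's eye level
BaseEyeHeight = 72.f;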
Note: If you wanted to be a bit more accurate, you could raycast from the eyes to the feet, centre, and head, and if any of the three raycasts succeeds you can see the target. But that may be overkill!
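If you did want to try that, a sketch of the multi-point check might look like the below - CanSeeEnemyMultiPoint is a hypothetical variant (declared alongside CanSeeEnemy) that reuses the same trace logic:

bool ATagCharacter::CanSeeEnemyMultiPoint(ATagCharacter* Target) const
{
    // Raycast from my eyes, as before
    FVector RayOrigin{ GetCapsuleComponent()->Bounds.Origin };
    RayOrigin.Z += BaseEyeHeight;

    // Sample the target's head, centre, and feet
    const FVector TargetOrigin = Target->GetCapsuleComponent()->Bounds.Origin;
    const float HalfHeight = Target->GetCapsuleComponent()->GetScaledCapsuleHalfHeight();
    const FVector SamplePoints[] = {
        TargetOrigin + FVector(0.f, 0.f, HalfHeight),
        TargetOrigin,
        TargetOrigin - FVector(0.f, 0.f, HalfHeight)
    };

    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);
    for (const FVector& RayTarget : SamplePoints)
    {
        FHitResult HitResult(ForceInit);
        GetWorld()->LineTraceSingleByChannel(HitResult, RayOrigin, RayTarget, ECC_Camera, Params);
        // Any single unobstructed ray counts as "visible"
        if (HitResult.Actor == Target)
        {
            return true;
        }
    }
    return false;
}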
Now that the Character has its gameplay logic hooked up, we can head over to the Controller and start changing how our AI thinks.
Let's define the following:
UFUNCTION()
void MovementBehaviour();
UFUNCTION()
AActor* FindFleeTarget(const TArray<ATagCharacter*> Enemies) const;
UPROPERTY()
ATagCharacter* TagCharacter;
First, let's replace all existing uses of GoToRandomWaypoint with MovementBehaviour, and store a pre-cast reference to our TagCharacter:
void ATagController::BeginPlay()
{
    Super::BeginPlay();
    UGameplayStatics::GetAllActorsOfClass(this, ATargetPoint::StaticClass(), Waypoints);

    TagCharacter = Cast<ATagCharacter>(GetCharacter());
    if (TagCharacter)
    {
        TagCharacter->LandedDelegate.AddUniqueDynamic(this, &ATagController::OnLanded);
        TagCharacter->MovementModeChangedDelegate.AddUniqueDynamic(this, &ATagController::OnMovementModeChanged);
        MovementBehaviour();
    }
}

void ATagController::OnMoveCompleted(FAIRequestID RequestID, const FPathFollowingResult& Result)
{
    Super::OnMoveCompleted(RequestID, Result);
    GetWorldTimerManager().SetTimer(TimerHandle, this, &ATagController::MovementBehaviour, 1.0f, false);
}
Next, let's write our MovementBehaviour function:
void ATagController::MovementBehaviour()
{
    const auto VisibleEnemies = TagCharacter->GetVisibleEnemies();
    if (VisibleEnemies.Num() == 0)
    {
        GoToRandomWaypoint();
        return;
    }

    if (TagCharacter->Tagged)
    {
        MoveToActor(VisibleEnemies[0]);
    }
    else
    {
        MoveToActor(FindFleeTarget(VisibleEnemies));
    }
}
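Tip: As the Blueprint section mentioned, you can make this behaviour a bit more deterministic (and smarter) by chasing the nearest visible enemy rather than the 0th. A minimal sketch of what could replace the MoveToActor(VisibleEnemies[0]) line:

// Chase the closest visible enemy, rather than whichever happens to be first
ATagCharacter* NearestEnemy = VisibleEnemies[0];
float NearestDistSquared = FVector::DistSquared(
    TagCharacter->GetActorLocation(), NearestEnemy->GetActorLocation());
for (auto Enemy : VisibleEnemies)
{
    const float DistSquared = FVector::DistSquared(
        TagCharacter->GetActorLocation(), Enemy->GetActorLocation());
    if (DistSquared < NearestDistSquared)
    {
        NearestDistSquared = DistSquared;
        NearestEnemy = Enemy;
    }
}
MoveToActor(NearestEnemy);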
Finally, let's write the FindFleeTarget function.
By the time we get into the function, we are guaranteed to have at least one enemy. To bias our waypoint target away from all enemies, we're going to generate three Random Waypoints and then we'll find the one that is the furthest away from all enemies.
Once we generate the waypoints, let's Algo::Sort them by distance to the ActorArrayAverageLocation for the enemies:
AActor* ATagController::FindFleeTarget(const TArray<ATagCharacter*> Enemies) const
{
    const FVector EnemyAverageLocation = UGameplayStatics::GetActorArrayAverageLocation(static_cast<TArray<AActor*>>(Enemies));
    TArray<AActor*> FleePoints = { GetRandomWaypoint(), GetRandomWaypoint(), GetRandomWaypoint() };

    // Sort by the greatest distance to the enemy average location
    Algo::Sort(FleePoints, [&](const AActor* A, const AActor* B) {
        return FVector::DistSquared(A->GetActorLocation(), EnemyAverageLocation)
            > FVector::DistSquared(B->GetActorLocation(), EnemyAverageLocation);
    });

    return FleePoints[0];
}
Note: I had to add const to GetRandomWaypoint - which should have been marked as such from the beginning!
Note: We use DistSquared for our distance check, because we don't care about the exact distance, but only the relative distance from each waypoint. Using the Squared version allows us to avoid an expensive sqrt operation.
Note: The funny-looking second parameter to Algo::Sort is a lambda. You can think of it as an unnamed function which: "captures" ([&]) all local variables (by reference), accepts two const AActor* as parameters, and has a function body with automatic return type deduction.
Note: We return true if A should be earlier in the order than B, so we return true if A's distance to the enemies is greater than B's distance to the enemies, and then return the first result.
Now if you drop in two AI characters, one of which is Tagged by default, you should see them start chasing each other around!
Gameplay: Tagging Behaviour
Now let's add "tagging" when the "it" bot gets close enough to its target.
For the Character, we want to add a couple of bits of game logic:
- When we are "it" and a "not-it" Character is in range, "tag" the target
- When we are tagged, update our behaviour
If we wanted to, we could also make this a little more "interactive", by raising an event in the Character, and having the Controller decide whether or not to "tag" the target. This would allow, for example, a small grace period if an AI bot were to catch a player bot - creating much more user-friendly gameplay.
However, as we are just making a purely AI-vs-AI demo, we're going to code everything to happen immediately.
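If you're curious, a rough sketch of that event-driven variant in C++ might look like this - the FOnTagAttempted delegate and its names are hypothetical, not something we build in this post:

// TagCharacter.h - above the ATagCharacter declaration
DECLARE_DYNAMIC_MULTICAST_DELEGATE_OneParam(FOnTagAttempted, class ATagCharacter*, Tagger);

// ...inside the ATagCharacter declaration:
UPROPERTY(BlueprintAssignable)
FOnTagAttempted OnTagAttempted;

// TagCharacter.cpp - instead of tagging the target directly, raise the event;
// the target's Controller could then start a grace-period timer
// before accepting the tag
Target->OnTagAttempted.Broadcast(this);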
Blueprint: Tagging Behaviour
Firstly, when the Character can touch something (Interaction Component Begin Overlap), and that thing is another player (BP_TagCharacter), and we are "it", and they are "not-it" - we will "tag" them!
Secondly, when we ChangeTaggedStatus (and we are actively in game, because we have a controller), let's stop movement, so that we immediately find a new place to move to (or another bot to run towards)!
However, if you start running the game, you'll notice that your bots are getting tagged very early! This is because our Interaction Component is triggering the overlap event with the target's Vision Component! We want our Interaction Component to Ignore anything that's not a Pawn (which the Mesh and Capsule components are marked as) - so let's set up some Custom Collision:
At this point, depending on how your waypoints are set up, you might get the "it" bot running blindly past a potential target on its way to its waypoint. There are a few ways you can fix this, but I'll leave this as an exercise for the reader. Here are some ideas to get you started:
- On Controller Tick (but make Tick Interval 0.5s or higher):
  - If we are moving towards a waypoint
  - But Get Visible Enemies returns something
  - Call Movement Behaviour
- On Character Vision Volume Begin Overlap:
  - If the other actor is an enemy Tag Character
  - And we Can See Target
  - Call an event dispatcher (New Target Seen)
- In the Controller:
  - Call Movement Behaviour when the Character's New Target Seen event fires
Note: Both of these methods should improve behaviour for both "it" and "not-it" bots! However, they both have pros and cons - the first might be a little bit more responsive (depending on how you tune the Tick Interval) - but it means all of our bots use the Tick event, which is slow. We'll eventually replace this with the built-in Unreal Engine Perception System and get the best of both worlds!
C++: Tagging Behaviour
Firstly, when the Character can touch something (Interaction Component Begin Overlap), and that thing is another player (TagCharacter), and we are "it", and they are "not-it" - we will "tag" them! We'll have to define a function, as well as add it as a delegate (just like we did with the Controller before):
// TagCharacter.h - ATagCharacter
UFUNCTION(BlueprintCallable)
void OnInteractionEnter(class UPrimitiveComponent* OverlappedComp, class AActor* OtherActor, class UPrimitiveComponent* OtherComp, int32 OtherBodyIndex, bool bFromSweep, const FHitResult& SweepResult);
// TagCharacter.cpp - ATagCharacter::ATagCharacter()
InteractionVolume->OnComponentBeginOverlap.AddUniqueDynamic(this, &ATagCharacter::OnInteractionEnter);
Then we'll implement the function as:
void ATagCharacter::OnInteractionEnter(UPrimitiveComponent* OverlappedComp, AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex, bool bFromSweep, const FHitResult& SweepResult)
{
    auto Target = Cast<ATagCharacter>(OtherActor);
    if (Target)
    {
        if (Tagged && !Target->Tagged)
        {
            Target->ChangeTaggedStatus(true);
        }
    }
}
Secondly, when we ChangeTaggedStatus (and we are actively in game, because we have a controller), let's stop movement, so that we immediately find a new place to move to (or another bot to run towards)!
// TagCharacter.cpp - ATagCharacter::ChangeTaggedStatus()
// If we're in-game, stop moving!
auto AIController = Cast<AAIController>(GetController());
if (AIController)
{
    AIController->StopMovement();
}
Note: We cast to AAIController instead of ATagController to avoid circular references and having to forward declare anything. This means that, unlike the blueprint version where new behaviour is immediate, C++ TagCharacters will pause when touched. You can likely figure out a way to work around this if it bothers you! ;)
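For example, one possible workaround - assuming you're willing to include TagController.h from TagCharacter.cpp and make MovementBehaviour public:

// TagCharacter.cpp - ATagCharacter::ChangeTaggedStatus()
// (hypothetical) replan immediately instead of pausing
auto TagController = Cast<ATagController>(GetController());
if (TagController)
{
    TagController->StopMovement();
    TagController->MovementBehaviour();
}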
However, if you start running the game, you'll notice that your bots are getting tagged very early! This is because our Interaction Component is triggering the overlap event with the target's Vision Component! We want our Interaction Component to Ignore anything that's not a Pawn (which the Mesh and Capsule components are marked as) - so let's set up some Custom Collision:
// TagCharacter.cpp - ATagCharacter::ATagCharacter()
FCollisionResponseContainer InteractionResponse{ ECR_Ignore };
InteractionResponse.SetResponse(ECC_Pawn, ECR_Overlap);
InteractionVolume->SetCollisionResponseToChannels(InteractionResponse);
At this point, depending on how your waypoints are set up, you might get the "it" bot running blindly past a potential target on its way to its waypoint. There are a few ways you can fix this, but I'll leave this as an exercise for the reader. Here are some ideas to get you started:
- On Controller Tick (but make Tick Interval 0.5s or higher):
  - If we are moving towards a waypoint
  - But GetVisibleEnemies returns something
  - Call MovementBehaviour
- On Character Vision Volume Begin Overlap:
  - If the other actor is an enemy Tag Character
  - And we CanSeeEnemy
  - Call a delegate (NewTargetSeen)
- In the Controller:
  - Call MovementBehaviour when the Character's NewTargetSeen event fires
Note: Both of these methods should improve behaviour for both "it" and "not-it" bots! However, they both have pros and cons - the first might be a little bit more responsive (depending on how you tune the Tick Interval) - but it means all of our bots use the Tick event, which is slow. We'll eventually replace this with the built-in Unreal Engine Perception System and get the best of both worlds!
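To give you a head start on the second idea, a rough sketch of the delegate wiring - NewTargetSeen and OnVisionEnter are hypothetical names, not part of the project code:

// TagCharacter.h - above the ATagCharacter declaration
DECLARE_DYNAMIC_MULTICAST_DELEGATE(FNewTargetSeen);

// ...inside the ATagCharacter declaration:
UPROPERTY(BlueprintAssignable)
FNewTargetSeen NewTargetSeen;
UFUNCTION()
void OnVisionEnter(class UPrimitiveComponent* OverlappedComp, class AActor* OtherActor, class UPrimitiveComponent* OtherComp, int32 OtherBodyIndex, bool bFromSweep, const FHitResult& SweepResult);

// TagCharacter.cpp - ATagCharacter::ATagCharacter()
VisionVolume->OnComponentBeginOverlap.AddUniqueDynamic(this, &ATagCharacter::OnVisionEnter);

// TagCharacter.cpp
void ATagCharacter::OnVisionEnter(UPrimitiveComponent* OverlappedComp, AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex, bool bFromSweep, const FHitResult& SweepResult)
{
    auto Other = Cast<ATagCharacter>(OtherActor);
    // Only fire for enemies we can actually see
    if (Other && Other->Tagged != Tagged && CanSeeEnemy(Other))
    {
        NewTargetSeen.Broadcast();
    }
}

// TagController.cpp - ATagController::BeginPlay(), after the existing bindings
TagCharacter->NewTargetSeen.AddUniqueDynamic(this, &ATagController::MovementBehaviour);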
Navigation: Filtering and Avoidance
Finally, let's give our "not-it" bots a chance to get away.
Let's filter our Navigation so that "it" bots don't try to use Launchpads to move, and let's make the AI try and adjust its movement so that even if it's running straight past someone, it tries to avoid touching them.
Blueprint: Filtering
We're no longer going to be able to use the Simple Move to Actor node - our move is just not that simple anymore.
However, it's got nice and predictable behaviour: there's only one predicate - whether our Character is Tagged or not!
What we're going to do is make a new function which pretends to be a Simple Move to Actor, and handles the additional logic!
To do so, find one of our calls to Simple Move to Actor, select that node and the preceding self node, right click and collapse to function:
Name our new function Move to Actor, and then go ahead and find and replace all current calls of Simple Move to Actor with our new function.
Tip: You can use ctrl + F to find nodes, just like a text editor.
Inside the function, we now want to replace Simple Move to Actor with Move To Actor:
Finally, let's add a Navigation Filter if we're tagged, and use the default Navigation Filter if we're not. We're going to do three things to set this up:
- Create a new NavArea_Launchpad, the same way we created NavArea_Stairs in the last post.
- Create a new NavFilter_Tagged, which is a new blueprint based on the NavigationQueryFilter class, and add the Area as excluded.
- In our Move To Actor function, Select which Navigation Filter we're going to use, based on our Character's Tagged state:
And that's it! Now our "not-it" AI has a chance to get away!
Note: The "it" AI can still use the launchpads if they accidentally run over them - feel free to fix that if it bothers you!
Blueprint: Avoidance
Getting the basics of avoidance right in our Character is quite easy. Simply making every agent avoid every other one just requires turning on RVOAvoidance on the Character Movement Component.
Ensure that the Avoidance Weight is set to 0.5 as well - a good default value which you can tweak as desired.
If you wanted to tweak this so that It and Not-it have different behaviour, you'll want to set up this information using the Avoidance Groups. For example, set It bots to Avoidance Group 1, and have them Ignore Group 0 for avoidance purposes. For Not-it bots, leave them as Avoidance Group 0, and let them avoid both Group 0 and 1.
Tip: Unreal offers a slightly more advanced form of Avoidance called Detour Crowd Avoidance, which is slightly "better" than RVO - but we'll implement it another time!
C++: Filtering
Generally, when we call MoveToActor, we're just passing in the first argument. However, we now want to start passing in a Filter as well, and (for now) keep the rest of the arguments default.
To do this simply, we're going to wrap the AAIController's method with our own:
// TagController.h
private:
UFUNCTION()
void TaggedMove(AActor* Goal);
The implementation is going to be incredibly simple:
// TagController.cpp
void ATagController::TaggedMove(AActor* Goal)
{
    TSubclassOf<UNavigationQueryFilter> FilterClass = nullptr;
    if (TagCharacter->Tagged)
    {
        FilterClass = UNavFilter_Tagged::StaticClass();
    }
    MoveToActor(Goal, -1.0f, true, true, true, FilterClass, true);
}
Go back and replace all calls of MoveToActor with our new TaggedMove.
As you can see, we're going to need a new class - UNavFilter_Tagged. Create this as a child of UNavFilter_AIControllerDefault.
Add a default constructor for our UNavFilter_Tagged:
// NavFilter_Tagged.cpp
UNavFilter_Tagged::UNavFilter_Tagged()
{
    FNavigationFilterArea FilterArea{};
    FilterArea.AreaClass = UNavArea_Launchpad::StaticClass();
    FilterArea.bIsExcluded = true;
    Areas.Add(FilterArea);
}
We're also going to have to create a UNavArea_Launchpad, derived from UNavArea - but we won't need to add any code to it.
We will, however, adjust our ALaunchpad class to use it:
// Launchpad.cpp - ALaunchpad::UpdateNavLinks()
Link.SetAreaClass(UNavArea_Launchpad::StaticClass());
And that's it! Now our "not-it" AI has a chance to get away!
Note: The "it" AI can still use the launchpads if they accidentally run over them - feel free to fix that if it bothers you!
C++: Avoidance
Getting the basics of avoidance right in our Character is quite easy. Simply making every agent avoid every other one just requires setting the UCharacterMovementComponent::bUseRVOAvoidance flag to true!
You can do this in either blueprint or C++.
Ensure that the AvoidanceWeight is set to 0.5 as well - a good default value which you can tweak as desired.
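In code, the whole setup is just a couple of lines in the constructor. The avoidance-group lines below sketch the optional It/Not-it split described in the Blueprint section - groups are bitmasks, and calling these from ChangeTaggedStatus is an assumption, not something we build here:

// TagCharacter.cpp - ATagCharacter::ATagCharacter()
GetCharacterMovement()->bUseRVOAvoidance = true;
GetCharacterMovement()->AvoidanceWeight = 0.5f;

// (optional, e.g. in ChangeTaggedStatus)
// bit 0 is Avoidance Group 0 ("not-it"), bit 1 is Group 1 ("it")
GetCharacterMovement()->SetAvoidanceGroup(Tagged ? (1 << 1) : (1 << 0));
// "it" bots ignore "not-it" bots; "not-it" bots avoid everyone
GetCharacterMovement()->SetGroupsToIgnore(Tagged ? (1 << 0) : 0);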
Tip: Unreal offers a slightly more advanced form of Avoidance called Detour Crowd Avoidance, which is slightly "better" than RVO - but we'll implement it another time!
Wrapping up
Now you know how to set up Navigation Filtering, how to get your AI to avoid each other, and hopefully have learned a little bit about best practices in UE4!
Everything might be feeling a little complicated now - especially in Blueprints - but from here on we start using more powerful tools to help us.
For the next post (which is hopefully a little shorter), we're going to start looking at the in-built UE4 advanced AI system - Behaviour Trees!
Tip: You can visualise what your AI is doing by using the AI Debug view. To enable this in the editor, look under Show -> Developer -> AI Debug. Clicking on one of your AI controlled bots will then show what path they're taking, as well as hordes of other useful info to help you debug!
I'd love to hear whether you're following along using blueprints or C++ (or both)! Leave a comment, or tweet at me and let me know!
You can subscribe to this blog using any old RSS reader, if it tickles your fancy to follow along! Just plug https://vikram.codes/blog into your RSS reader of choice and it should work.