Unreal Engine DevLog #22: Reaction Shots & Prototype Cover System

I've been meaning to write this dev log for quite some time. It's the last update on the X-COM: EU clone prototype that I was working on a while back. I would love to return to it sometime later, but for now, I'm concentrating on creating toolkits for the Unreal Engine Marketplace.

The last update on the project was about an LoS-based visibility system, which is covered here: http://unrealpossibilities.blogspot.in/2015/07/unreal-engine-devlog-19-line-of-sight.html

Since then, I have implemented Reaction Shots for the player-controlled units, as well as a prototype cover system that displays the cover value at each corner of every tile.

Reaction Shots Preview Video:

Prototype Cover System Preview Video:

So that's going to be the last update on the project for quite some time. It's been a great learning experience and a solid introduction to Unreal Engine 4. Also, special thanks to Knut Overbye for creating one of the best products on the Unreal Engine Marketplace, without which the project wouldn't have made it this far. I've provided a link to his toolkit below. Feel free to check it out: https://www.unrealengine.com/marketplace/advanced-turn-based-tile-toolkit

FPS Tower Defense Toolkit v4.11 Basics: Wave Spawning Systems

The FPSTDT's Wave Handler class (starting with v1.4) supports 3 types of wave spawning systems: Unit Based Generator, Group Based Generator & Threat Based Generator.

1. Unit Based Generator

Provides maximum direct control over the wave spawning system. Best suited for games with small waves, since the properties of every unit are set on an individual basis.

- The 'Enemy_Type' variable provides a dropdown to select from any of the 3 default enemy units.
- The 'Delay' variable determines the spawn time of the unit relative to the starting time of the active wave.
- The 'SpawnPoint' variable can be used to choose from any of the available 'BP_EnemySpawnPoint' actors in the level.
- The 'WaveNumber' variable specifies the unit's associated wave number.
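As a rough illustration, the per-unit data described above could be modeled as follows. This is an engine-agnostic Python sketch; the field names mirror the toolkit's variables, but the structure itself is my own, not the toolkit's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class UnitSpawnEntry:
    enemy_type: str      # mirrors 'Enemy_Type'
    delay: float         # mirrors 'Delay' (seconds after wave start)
    spawn_point: str     # mirrors 'SpawnPoint'
    wave_number: int     # mirrors 'WaveNumber'

def units_for_wave(entries, wave_number):
    """Return the given wave's units, ordered by their spawn delay."""
    wave = [e for e in entries if e.wave_number == wave_number]
    return sorted(wave, key=lambda e: e.delay)

entries = [
    UnitSpawnEntry("Grunt", 2.0, "SpawnPoint_A", 1),
    UnitSpawnEntry("Brute", 0.5, "SpawnPoint_B", 1),
    UnitSpawnEntry("Grunt", 1.0, "SpawnPoint_A", 2),
]
print([e.enemy_type for e in units_for_wave(entries, 1)])  # ['Brute', 'Grunt']
```

The point of the per-unit layout is visible here: every entry is authored by hand, which gives full control but scales poorly to large waves.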

2. Group Based Generator

Provides good control over the wave spawning system and makes it easier to test different wave formations. Bots are spawned as automated groups, which makes it best suited for games that require a large number of units per wave, while still giving designers control over the unit types and the number of units in each group.

- The 'Enemy_Type' variable provides a dropdown to select from the 3 default enemy units. A group can contain only bots of a particular enemy type.
- The 'SpawnDelay' variable can be used to specify the time delay between spawning of each bot in a group.
- The 'SpawnPoint' variable can be used to choose from any of the available 'BP_EnemySpawnPoint' actors in the level. All units in a group will spawn from this spawn point.
- The 'WaveNumber' variable specifies the unit's associated wave number.
- The 'SpawnStartingTime' variable determines the starting time for spawning the first unit in the group, relative to the starting time of the active wave.
- The 'NumberOfUnits' variable specifies the number of units in the group.
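Assuming each unit in a group spawns at a fixed interval after the group's starting time (which is how I read the 'SpawnStartingTime' and 'SpawnDelay' descriptions above; the toolkit's internals may differ), the spawn schedule for a group works out to this small Python sketch:

```python
def group_spawn_times(spawn_starting_time, spawn_delay, number_of_units):
    """Spawn times (relative to wave start) for each unit in a group.

    spawn_starting_time mirrors 'SpawnStartingTime', spawn_delay mirrors
    'SpawnDelay' & number_of_units mirrors 'NumberOfUnits'.
    """
    return [spawn_starting_time + i * spawn_delay
            for i in range(number_of_units)]

print(group_spawn_times(3.0, 0.5, 4))  # [3.0, 3.5, 4.0, 4.5]
```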

3. Threat Based Generator

Provides the least direct control over the spawning system, but facilitates automated generation of random wave formations. Bots are spawned randomly from the specified enemy types until the threat level of the wave reaches the specified value.

- The 'Enemy_Type' variable can be used to select all the bot types to be considered while generating the wave.
- The 'Delay' variable can be used to specify the time delay between the spawning of each bot in the wave.
- The 'WaveNumber' variable specifies the associated wave number.
- The 'ThreatRating' variable determines the maximum threat limit of the wave. Can be used to increase the difficulty of each wave automatically.

To facilitate the creation of a large number of waves, the 'NumberOfCycles' variable can be used to repeat the waves a specified number of times, with each cycle increasing the health of enemy units based on the active cycle number. Setting this value to zero creates endless waves.
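The threat-based idea can be sketched in plain Python. Both the stopping rule (keep adding random enemies until the total threat reaches 'ThreatRating') and the per-cycle health formula below are my own plausible readings of the descriptions above, not the toolkit's actual code; the threat cost per enemy type is likewise a made-up example.

```python
import random

def generate_threat_wave(enemy_threat_costs, threat_rating, rng=random):
    """Randomly add enemies until the wave's total threat reaches the limit.

    enemy_threat_costs: dict mapping enemy type -> assumed threat cost.
    threat_rating: mirrors 'ThreatRating'.
    """
    wave, total = [], 0
    types = list(enemy_threat_costs)
    while total < threat_rating:
        choice = rng.choice(types)
        wave.append(choice)
        total += enemy_threat_costs[choice]
    return wave, total

def scaled_health(base_health, cycle_number, per_cycle_bonus=0.25):
    """Hypothetical per-cycle health scaling for 'NumberOfCycles' repeats."""
    return base_health * (1 + per_cycle_bonus * (cycle_number - 1))

rng = random.Random(42)
wave, total = generate_threat_wave({"Grunt": 1, "Brute": 3}, 10, rng)
print(total >= 10)  # True
```

Raising 'ThreatRating' per wave (and letting cycles scale health) is what makes the difficulty curve automatic here, at the cost of direct control over the exact wave composition.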

Unreal Engine Diaries #10

  • To display the AI Perception range in the editor, go to Editor Preferences >> General >> Gameplay Debugger & tick the 'Perception' parameter. Also, in the post-process settings, set anti-aliasing to 'FXAA' or None so that the debug lines are displayed more clearly.

  • In the widget blueprint, select multiple widgets from the Hierarchy panel & then Right click >> 'Wrap with' to wrap the selected widgets within another widget like Canvas Panel, Border, etc.

  • Add a 'Rotating Movement' component to actors to have them rotate automatically. The rotation rate for each axis can be set on the component. This can be used to create interactive objects like weapon pickups in a shooter, or effects like rotating coins in a side-scrolling game.

  • Wrapping widgets with 'Invalidation Boxes' can improve performance, as the wrapped widgets are cached and repainted only when invalidated, rather than every frame. This is especially useful when there are lots of static UI elements that do not get updated at run time.

  • The 'Random unit vector in cone' node can be used to get random line trace target locations for creating shotgun spread patterns.
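The idea behind that last node can be sketched outside the engine: sample a direction uniformly within a cone around the aim direction, then push each pellet's trace target out to the trace range. This is an engine-agnostic Python approximation of what 'Random Unit Vector in Cone' provides, not UE4 source, and the function names are mine.

```python
import math
import random

def random_unit_vector_in_cone(axis, half_angle_deg, rng=random):
    """Uniformly sample a unit vector within a cone around `axis`."""
    ax, ay, az = axis
    n = math.sqrt(ax*ax + ay*ay + az*az)
    ax, ay, az = ax/n, ay/n, az/n
    # Uniform over the spherical cap: cos(theta) uniform in [cos(half), 1].
    cos_t = rng.uniform(math.cos(math.radians(half_angle_deg)), 1.0)
    sin_t = math.sqrt(1.0 - cos_t*cos_t)
    phi = rng.uniform(0.0, 2.0*math.pi)
    # Build an orthonormal basis (u, v) perpendicular to the axis.
    if abs(ax) < 0.9:
        ux, uy, uz = 0.0, az, -ay      # axis x X-unit
    else:
        ux, uy, uz = -az, 0.0, ax      # axis x Y-unit (axis nearly along X)
    un = math.sqrt(ux*ux + uy*uy + uz*uz)
    ux, uy, uz = ux/un, uy/un, uz/un
    vx, vy, vz = ay*uz - az*uy, az*ux - ax*uz, ax*uy - ay*ux
    cp, sp = math.cos(phi), math.sin(phi)
    return (cos_t*ax + sin_t*(cp*ux + sp*vx),
            cos_t*ay + sin_t*(cp*uy + sp*vy),
            cos_t*az + sin_t*(cp*uz + sp*vz))

def shotgun_trace_targets(muzzle, aim_dir, spread_deg, pellets,
                          trace_range, rng=random):
    """Line trace end points for a shotgun spread pattern."""
    targets = []
    for _ in range(pellets):
        dx, dy, dz = random_unit_vector_in_cone(aim_dir, spread_deg, rng)
        targets.append((muzzle[0] + dx*trace_range,
                        muzzle[1] + dy*trace_range,
                        muzzle[2] + dz*trace_range))
    return targets
```

Each returned point can then serve as the end location of a line trace from the muzzle, giving one pellet per trace.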

VR Tips Compilation #2

As mentioned in the previous post, I've been collecting tips about working with VR from the Unreal Engine Livestreams & GDC Talks. This is the second of the two VR Tips Compilation posts, while the first one can be found here: https://unrealpossibilities.blogspot.in/2015/12/vr-tips-compilation-1.html

  • Using force grabs to pick up objects from the environment is a good alternative to having the player actually grab them: with no physical feedback in games, direct grabbing feels awkward and quite different from how it works in real life. The design of the control device used to accept inputs adds to this disparity.

  • Remove all motion blur effects, as using them in your game can cause simulation sickness.

  • Since run & gun is generally not an option in VR games, teleportation is one of the more interesting viable alternatives. But even here, it is better to slowly fade the screen to black & then fade everything back in at the new location, instead of instantaneously shifting the player from one place to another.

  • In Unreal Engine, it is better to use precomputed reflection captures & disable screen space reflections in the post-process settings.

VR Tips Compilation #1

I've been collecting tips & advice about working with VR, from the various Unreal Engine Livestreams & GDC Talks. Even though I myself have never even used a VR device, it seemed like a good idea to get to know more about it from the people who are already working on it. This post is the first of two VR tips compilations based on the data that I've collected so far. 

  • The first and most important thing that I've heard in almost all the talks is to never take camera control away from the player. This means that traditional methods like moving the camera to shift the player's focus onto important game events would probably be a bad idea, as would changing the camera angle to show the action from different perspectives.

  • Many games use Depth of Field & Field of View changes to zoom in on important/relevant targets while blurring out the edges. This is not going to be of much use in VR, as players can create the effect naturally by closing one eye while aiming at a target. It would be better practice to cut the rendering cost of DoF & spend it elsewhere, where it's truly required.

  • When fading the screen, it's better to fade to black than to white. Unlike in reality, players cannot cover their eyes with their hands if the screen is too bright, and if they turn their head away as a natural impulse, it won't produce the expected relief, creating a break in immersion. I believe immersion could be the most important advantage of VR, and when it breaks, it's probably more frustrating than in traditional gaming experiences.

  • During this early stage of VR, when the general public hasn't yet become accustomed enough to the experience for it to feel natural, it would be a good idea not to make the experience too scary. Again, the natural response of covering your eyes doesn't work here, & that could potentially create unfavorable experiences.

With that, I conclude the first post. The second and final post should be uploaded soon.

Unreal Engine Diaries #9

  • 'Shift + F1' can be used to gain mouse control & jump between the different game instance windows during multiplayer testing in the editor.

  • While working on VR, try to match the size of in-game objects to their real-life counterparts, as not doing so can make them stand out and reduce immersion.

  • In the Material Editor details panel, turn on 'Fully Rough' [prevents reflection rendering pipelines from executing] & turn off 'Light Map Directionality' [both under the 'Mobile' category] to make materials that are less expensive to render. This is a pretty good option when dealing with faraway objects in the level that do not require a lot of detail. Setting the Shading Model to 'Unlit' can also increase performance in instances where the additional detail is not required.

  • In PIE mode, press 'Ctrl + Shift + ,' to bring up the GPU profiler. It would be a good idea to start by looking for elements that cost more than a millisecond.

  • 'Switch has Authority' can be used to determine who is executing the script: the server or the client.

Unreal Engine Diaries #8

  • When adding new input parameters to a function that's already being called multiple times throughout the project, it's always better to immediately check every instance of the function call to make sure that the new input parameter is connected as required.

  • Drag & drop a variable from the variables list onto a get/set node of another variable to automatically replace the second variable with the first.

  • When attaching moving physics actors to the player character without physics handles, disable the actor's gravity & set the linear/angular velocities of all of its components to zero in order to have it simulate physics & collision while moving.

  • Under default conditions, when a character changes its direction of movement, it instantaneously turns to face the new direction. To enable smooth rotation instead, go to the AI character blueprint >> Character Movement Component >> enable "Orient Rotation to Movement" & set the "Yaw" of the "Rotation Rate" based on how smooth the turning movement should be. Then, in the blueprint's default attributes, disable "Use Controller Rotation Yaw", and the character should now turn smoothly.

  • If you're experiencing automatic brightness changes in your game, you can disable the effect by going to your camera component >> Post Process Settings >> Auto Exposure >> and setting the min and max brightness to the same value.

Unreal Engine Diaries #7

  • While working on the Top Down Stealth Toolkit, I noticed that sometimes the character animations that worked in PIE mode did not work in Standalone mode. One solution that worked for me was to connect the 'Event Blueprint Update Animation' node in all the child anim BPs to their parent's update animation events.
  • To find the angle between two rotator variables, it is better not to use normal subtraction, as this can give odd results in certain scenarios: rotator values for actor and world rotation follow the (0, 180) & (-180, 0) ranges, so results near the ±180 boundary can wrap around and flip sign. The 'Delta (Rotator)' node can be used instead to get the normalized angular difference between the two rotators.
  • When working on Top Down games, the 'Orient rotation to movement' parameter in the character movement component of the player character can be unticked to have it face the direction of mouse cursor instead of the movement direction.
  • The following method can be used to get the dimensions of a landscape actor:
    1. First create a reference to the landscape actor either through the level blueprint or using the 'Get All Actors of Class' function.
    2. Get the landscape actor reference and then use 'Get Actor Bounds' function to get the box extent.
    3. Break the box extent vector into its float components for each axis, and then multiply each by 2 to get the length, width and height of the landscape actor.
  • In the default First Person template, if we do a line trace towards the world space equivalent of the center of the screen, it can be seen that the impact location of the trace and the crosshair location on the screen are offset by a certain amount. This is because the logic used to draw the crosshair from the HUD class does not take the size of the crosshair texture into account. To rectify this and display the crosshair at the true center, subtract half the corresponding texture dimension from both the x and y positions before plugging them into the draw texture function; in the default case, that means subtracting 8 units from each. Doing so should make the trace hit locations match the crosshair location.
    [ExtendedFirstPersonTemplate_PreciseAim Project Files link: http://unrealpossibilities.blogspot.in/2015/10/extended-first-person-template-improved.html]
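The crosshair correction in that last tip boils down to a couple of subtractions. Here it is as a Python sketch; the function name is mine, and the 16x16 default texture size is inferred from the "8 units" figure mentioned above.

```python
def centered_crosshair_position(screen_w, screen_h, tex_w=16, tex_h=16):
    """Top-left draw position that puts the crosshair texture's center
    at the screen center (subtract half the texture size on each axis)."""
    return (screen_w * 0.5 - tex_w * 0.5,
            screen_h * 0.5 - tex_h * 0.5)

print(centered_crosshair_position(1280, 720))  # (632.0, 352.0)
```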