Week 2 Day 4 - Game Development Terminology, Intro to Blueprints
We spent the last 20 minutes or so talking about Blueprints, a visual programming language used inside UE4.
First, though, we went over some of the theory and math underlying video games:
1) Normalized vectors are vectors of length 1. You normalize a vector by dividing it by its own length (there's a small C++ sketch of this, and of surface normals, after this list).
2) Normal vectors are vectors perpendicular to a surface. If you have a piece of paper, for example, its normal vector would be like a pencil standing straight up on it, eraser end down. Normal vectors are used for many things, including working out how light reflects off a surface. A "normal map" stores a different normal for each point on a surface, which lets a flat surface fake fine bumps when it's lit.
3) Mipmaps and LODs (Levels of Detail) are lower-resolution textures and lower-detail models used to render things more cheaply when they are far away. There's no reason to render a million triangles that end up in a single pixel; there's simply not enough visible detail to be worth the effort. Some systems will generate these automatically (see the mip-count sketch after this list). Unreal Engine 5 will do something similar but different automatically, streaming triangles off disk as you get closer to them, which will require a fast SSD.
4) Backface culling means drawing only the front face of each triangle and not the back. This cuts the number of triangles drawn roughly in half, but it looks weird if you ever end up inside an object, because you can see straight out through its walls.
5) Two-sided rendering (and lighting) is the opposite: the engine renders and lights both sides of a surface, which is often used for very thin things like leaves.
6) Subsurface Scattering simulates light that bounces around inside something, like human skin or jade.
7) Self-Shadowing and Ambient Occlusion are both about a model darkening itself: self-shadowing is a model casting shadows onto its own surface, while ambient occlusion darkens creases and corners that ambient light has trouble reaching.
8) Antialiasing smooths out the jagged, stair-stepped edges in a rendered image.
9) Temporal Antialiasing removes artifacts that appear when you're moving. Because of how pixels are sampled, very fine detail (like a staircase seen from a distance) will shimmer and form noticeable moire patterns as you walk toward it. Temporal antialiasing blends information across frames to smooth these artifacts out.
10) HDR can mean one of two things: 1) having extra bits of color information (and increased brightness) on a TV or monitor display, or 2) simulating your eye adapting to sudden changes in brightness. The first gives more realistic colors and reduces color banding; the second is an attempt to mimic a realistic human eye, but it can be annoying to gamers who don't like losing sight of their targets in a game.
11) PBR means Physically Based Rendering. In the olden times, we would specify just a few values for a material: a color, a specular term, and a diffuse/roughness term. Nowadays we can specify all sorts of physical parameters, like its index of refraction, a normal map to simulate bumpiness, whether it has subsurface scattering or a clearcoat, and so forth, and the renderer takes care of making it look plausible in the world (a rough sketch of a PBR parameter set follows this list).
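To make items 1 and 2 concrete, here's a minimal, self-contained C++ sketch (using a tiny hypothetical Vec3 struct rather than Unreal's FVector): normalizing a vector just divides it by its own length, and the surface normal of a triangle is the normalized cross product of two of its edges.

    #include <cmath>
    #include <cstdio>

    // Minimal 3D vector for illustration only; Unreal's FVector has all of this built in.
    struct Vec3 { float x, y, z; };

    float Length(const Vec3& v) {
        return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    }

    // Normalizing: divide a vector by its own length so the result has length 1.
    Vec3 Normalize(const Vec3& v) {
        float len = Length(v);
        return { v.x / len, v.y / len, v.z / len };
    }

    Vec3 Cross(const Vec3& a, const Vec3& b) {
        return { a.y * b.z - a.z * b.y,
                 a.z * b.x - a.x * b.z,
                 a.x * b.y - a.y * b.x };
    }

    // The surface normal of a triangle: cross two edges, then normalize the result.
    Vec3 TriangleNormal(const Vec3& p0, const Vec3& p1, const Vec3& p2) {
        Vec3 edge1 = { p1.x - p0.x, p1.y - p0.y, p1.z - p0.z };
        Vec3 edge2 = { p2.x - p0.x, p2.y - p0.y, p2.z - p0.z };
        return Normalize(Cross(edge1, edge2));
    }

    int main() {
        // A triangle lying flat in the XY plane: its normal points straight up.
        Vec3 n = TriangleNormal({0, 0, 0}, {1, 0, 0}, {0, 1, 0});
        std::printf("%.1f %.1f %.1f\n", n.x, n.y, n.z); // prints 0.0 0.0 1.0
    }

Like the pencil on the paper, the result is a unit vector sticking straight out of the triangle.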
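And a toy illustration of item 3: a full mip chain keeps halving a texture until it reaches 1x1, so the number of mip levels depends only on the largest dimension. How an engine actually picks which mip or LOD to show at a given distance varies, so treat this as a sketch rather than Unreal's logic.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // A full mip chain halves the texture in each dimension until it reaches 1x1,
    // so a WxH texture has floor(log2(max(W, H))) + 1 mip levels.
    int NumMipLevels(int width, int height) {
        return static_cast<int>(std::floor(std::log2(std::max(width, height)))) + 1;
    }

    int main() {
        // A 1024x1024 texture has 11 mips: 1024, 512, 256, ..., 2, 1.
        std::printf("%d\n", NumMipLevels(1024, 1024));
    }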
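Finally, a rough idea of what the PBR parameter set in item 11 looks like as plain data. The struct and field names here are hypothetical, only loosely modeled on the inputs a modern material editor exposes; they are not Unreal's API.

    // Hypothetical bundle of PBR inputs; names and defaults are illustrative.
    struct PBRMaterialParams {
        float       BaseColor[3]      = {1.0f, 1.0f, 1.0f}; // albedo
        float       Metallic          = 0.0f;   // 0 = dielectric, 1 = metal
        float       Roughness         = 0.5f;   // 0 = mirror-smooth, 1 = fully rough
        float       IndexOfRefraction = 1.5f;   // how strongly light bends entering it
        float       ClearCoat         = 0.0f;   // extra glossy layer, e.g. car paint
        float       Subsurface        = 0.0f;   // how much light scatters under the surface
        const void* NormalMap         = nullptr; // per-texel normals to fake fine bumps
    };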
We then built a simple Blueprint that makes an object shatter when you touch it, by following these steps (a rough C++ equivalent follows the steps):
1) Click on the object and enable hit events ("Simulation Generates Hit Events" in the Details panel). Also set its Mobility to Movable so it isn't a static object.
2) While it is selected, hit the blue button in the Details panel to add a Blueprint to it.
3) In the Event Graph, delete the existing event nodes and add an "Event Hit" node. This fires every time something collides with the object.
4) Drag out from the white execution pin on Event Hit (the white pins define the order in which nodes run when something touches the object) and create a "Spawn Emitter at Location" node. Choose the explosion particle effect, and wire the hit location from the Event Hit node into the emitter's location.
5) Drag out from the white pin on Spawn Emitter and choose "Play Sound at Location". Pick an explosion sound and wire the same hit location into its location.
6) Drag out from the Play Sound node and add a "Destroy Actor" node. Its target defaults to the actor itself, which is what we want.
7) Hit Compile & Save.
8) Go back into the world, hit Play, and walk up and touch the object!
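For comparison, here's a rough C++ sketch of what that Blueprint wires up. The class name and the asset properties (ExplosionFX, ExplosionSound) are hypothetical placeholders, but the engine calls (OnComponentHit, SpawnEmitterAtLocation, PlaySoundAtLocation, Destroy) are the C++ counterparts of the nodes used above.

    // A rough C++ equivalent of the explosion Blueprint, header-only for brevity.
    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "Components/StaticMeshComponent.h"
    #include "Kismet/GameplayStatics.h"
    #include "Particles/ParticleSystem.h"
    #include "Sound/SoundBase.h"
    #include "ExplodingProp.generated.h"

    UCLASS()
    class AExplodingProp : public AActor
    {
        GENERATED_BODY()

    public:
        AExplodingProp()
        {
            Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
            SetRootComponent(Mesh);
            Mesh->SetMobility(EComponentMobility::Movable); // step 1: not static
            Mesh->SetNotifyRigidBodyCollision(true);        // step 1: enable hit events
        }

        virtual void BeginPlay() override
        {
            Super::BeginPlay();
            // Step 3: subscribe to the same event the "Event Hit" node listens to.
            Mesh->OnComponentHit.AddDynamic(this, &AExplodingProp::OnHit);
        }

        UPROPERTY(EditAnywhere) UParticleSystem* ExplosionFX = nullptr;   // explosion particle effect
        UPROPERTY(EditAnywhere) USoundBase*      ExplosionSound = nullptr; // explosion sound

    private:
        UPROPERTY() UStaticMeshComponent* Mesh = nullptr;

        UFUNCTION()
        void OnHit(UPrimitiveComponent* HitComp, AActor* OtherActor,
                   UPrimitiveComponent* OtherComp, FVector NormalImpulse,
                   const FHitResult& Hit)
        {
            // Step 4: spawn the emitter at the hit location.
            UGameplayStatics::SpawnEmitterAtLocation(GetWorld(), ExplosionFX, Hit.Location);
            // Step 5: play the sound at the same location.
            UGameplayStatics::PlaySoundAtLocation(this, ExplosionSound, Hit.Location);
            // Step 6: destroy this actor.
            Destroy();
        }
    };

The Blueprint version is obviously faster to throw together for something this small; the C++ version is just here to show that every node maps onto an ordinary engine call.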