Essay on Game Engine and Development Structure

(Part 2)

 

Render Engine

 

Render engines are a set of tools for generating images from models or assets. A renderer receives texture and geometry information about a model or asset from another engine subsystem, such as the animation engine, and interprets this information to produce lifelike images. Advanced render engines are normally used in the development of 3D games, where representing the real world as accurately as possible has become the golden rule. Two common techniques used by these engines are scanline rendering and rasterisation. Scanline rendering draws an image row by row: after rendering one row of pixels, it moves on to the next row, and so on. The main advantage of this technique is that it does not require all of the vertex coordinates to be held in main memory at once, which saves memory space. Only the vertices touching the line currently being scanned are loaded into memory, and they are unloaded as soon as that line has been rendered.
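
To make the idea concrete, the following is a minimal C++ sketch of scanline filling for a single triangle. It is only an illustration: Vertex, framebuffer and putPixel are made-up names, and a character grid stands in for a real pixel buffer, but it shows how only the edge crossings for the row currently being scanned need to be kept in memory.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    struct Vertex { float x, y; };

    // A 40x20 character grid stands in for a real pixel framebuffer.
    const int W = 40, H = 20;
    char framebuffer[H][W];

    void putPixel(int x, int y) {
        if (x >= 0 && x < W && y >= 0 && y < H) framebuffer[y][x] = '#';
    }

    // Fill one triangle a row (scanline) at a time. Only the edge crossings
    // for the current row are computed and kept, which is the memory
    // advantage described above.
    void scanlineFill(const Vertex& a, const Vertex& b, const Vertex& c) {
        const Vertex* v[3] = { &a, &b, &c };
        int yMin = (int)std::min({ a.y, b.y, c.y });
        int yMax = (int)std::max({ a.y, b.y, c.y });

        for (int y = yMin; y <= yMax; ++y) {
            std::vector<float> xs;                        // crossings for this row only
            for (int i = 0; i < 3; ++i) {
                const Vertex* p = v[i];
                const Vertex* q = v[(i + 1) % 3];
                if ((p->y <= y && q->y > y) || (q->y <= y && p->y > y)) {
                    float t = (y - p->y) / (q->y - p->y); // where this edge crosses row y
                    xs.push_back(p->x + t * (q->x - p->x));
                }
            }
            if (xs.size() == 2) {                         // fill the span between the edges
                int x0 = (int)std::min(xs[0], xs[1]);
                int x1 = (int)std::max(xs[0], xs[1]);
                for (int x = x0; x <= x1; ++x) putPixel(x, y);
            }
        }
    }

    int main() {
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x) framebuffer[y][x] = '.';
        scanlineFill({ 5, 2 }, { 35, 8 }, { 12, 18 });
        for (int y = 0; y < H; ++y) { fwrite(framebuffer[y], 1, W, stdout); putchar('\n'); }
    }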

Rasterisation is the process of converting a vector image into a raster image. A vector image is produced by following the geometrical representation of the image itself, while a raster image is defined by its pixels. Vector images therefore stay sharp and crisp at any size because they do not rely on pixels. Rasterisation is required because displays such as LCD screens rely on pixels to produce images. Render engines operate on top of OpenGL or DirectX, relying on these environments to generate images the computer can display. Popular software with rendering capabilities includes 3ds Max, Adobe Photoshop, Adobe After Effects and Maya. Render engines also depend heavily on the capability of GPUs (graphics processing units) to function properly.
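
As a small illustration of the difference, the sketch below takes a vector description of a circle (just a centre and a radius, pure geometry) and rasterises it onto a pixel grid by testing each pixel centre against that geometry. The grid size and the character output are arbitrary choices for the example.

    #include <cstdio>

    int main() {
        const int W = 32, H = 32;
        // Vector form of the shape: just a centre and a radius, no pixels.
        const float cx = 16.0f, cy = 16.0f, r = 10.0f;

        // Raster form: decide for every pixel whether it belongs to the shape.
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x) {
                float dx = (x + 0.5f) - cx;   // sample at the pixel centre
                float dy = (y + 0.5f) - cy;
                putchar(dx * dx + dy * dy <= r * r ? '#' : '.');
            }
            putchar('\n');
        }
    }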

 

Physics Engine

 

A game feels incomplete if its world does not obey the laws of physics. There are two classes of physics engine: real-time and high-precision. Real-time engines trade accuracy for speed and are the ones used by game developers, while high-precision engines are normally used in computer animation and by scientists, where greater accuracy matters more than speed.

A real-time physics engine is a set of software that simulates the movement and collision of two or more objects in a virtual environment. Its main feature is collision detection. The engine takes values describing the objects and processes them with algorithms based on the equations governing collisions in physics. The results are then applied to the objects, and the interactions between them are simulated in the virtual environment. There are still many occasions where the simulation does not behave as intended, which is why these engines require continual revision. A popular real-time physics engine is PhysX, developed by NVIDIA. PhysX allows smoother gameplay by offloading physics calculations from the CPU, acting mainly as a hardware accelerator.
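
A hedged sketch of that idea in C++ follows: two circles, a detection test based on their radii, and a response that applies the standard elastic-collision equations along the contact normal. Body, detect and resolve are illustrative names, not the API of PhysX or any real engine.

    #include <cmath>
    #include <cstdio>

    struct Body {
        float x, y;      // position
        float vx, vy;    // velocity
        float radius;
        float mass;
    };

    // Collision detection: two circles overlap when the distance between
    // their centres is no more than the sum of their radii.
    bool detect(const Body& a, const Body& b) {
        float dx = b.x - a.x, dy = b.y - a.y;
        float rSum = a.radius + b.radius;
        return dx * dx + dy * dy <= rSum * rSum;
    }

    // Collision response: apply the one-dimensional elastic-collision
    // equations (conservation of momentum and kinetic energy) along the
    // contact normal.
    void resolve(Body& a, Body& b) {
        float dx = b.x - a.x, dy = b.y - a.y;
        float dist = std::sqrt(dx * dx + dy * dy);
        if (dist == 0.0f) return;
        float nx = dx / dist, ny = dy / dist;        // contact normal

        float va = a.vx * nx + a.vy * ny;            // normal components of velocity
        float vb = b.vx * nx + b.vy * ny;
        if (va - vb <= 0.0f) return;                 // already separating

        float ma = a.mass, mb = b.mass;
        float vaAfter = ((ma - mb) * va + 2.0f * mb * vb) / (ma + mb);
        float vbAfter = ((mb - ma) * vb + 2.0f * ma * va) / (ma + mb);

        a.vx += (vaAfter - va) * nx;  a.vy += (vaAfter - va) * ny;
        b.vx += (vbAfter - vb) * nx;  b.vy += (vbAfter - vb) * ny;
    }

    int main() {
        Body a{ 0.0f, 0.0f,  1.0f, 0.0f, 1.0f, 1.0f };   // moving right
        Body b{ 1.5f, 0.0f, -1.0f, 0.0f, 1.0f, 1.0f };   // moving left, overlapping a
        if (detect(a, b)) resolve(a, b);
        std::printf("a.vx = %.2f, b.vx = %.2f\n", a.vx, b.vx);  // equal masses swap velocities
    }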

 

Scripting Engine

 

This is the most essential part of game development: it connects the assets and creates the functions that drive the game. Most game engines use C++ as their primary programming language, including those behind the StarCraft series and other Blizzard and Electronic Arts titles. The main reason is that C++ is very flexible while remaining powerful and customizable. Newer scripting languages such as QuakeC and UnrealScript, developed by popular game studios, are based on the C family of languages. Other popular languages for game programming include Java, C, Lua and Python. Programmers usually work in a full Integrated Development Environment (IDE), which helps them compile and run code. Popular IDEs include Microsoft's Visual Studio and CodeWarrior. Together, these IDEs and scripting languages make up the scripting engine itself.
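
As an illustration of how a scripting engine ties into C++ code, the sketch below embeds Lua in a small host program and exposes one engine function to scripts. The function spawnBox and the script text are invented for the example; the Lua C API calls themselves (luaL_newstate, lua_register, luaL_dostring and so on) are standard, and the program must be linked against the Lua library to build.

    #include <cstdio>
    #include <lua.hpp>

    // An engine-side function made callable from scripts. A Lua C function
    // receives the Lua state and returns how many values it pushes back.
    static int spawnBox(lua_State* L) {
        double x = luaL_checknumber(L, 1);
        double y = luaL_checknumber(L, 2);
        std::printf("[engine] spawning box at (%.1f, %.1f)\n", x, y);
        return 0;   // nothing returned to the script
    }

    int main() {
        lua_State* L = luaL_newstate();    // create a scripting environment
        luaL_openlibs(L);                  // load Lua's standard libraries
        lua_register(L, "spawnBox", spawnBox);

        // In a real game this script would come from a level or mod file.
        const char* script = "for i = 1, 3 do spawnBox(i * 10, 5) end";
        if (luaL_dostring(L, script) != 0) {
            std::fprintf(stderr, "script error: %s\n", lua_tostring(L, -1));
        }

        lua_close(L);
        return 0;
    }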

 

Animation Engine

 

The animation engine is used to create the animations that run in-game. A basic animation normally starts with a 'ragdoll', a bare animation model without any predefined movements. Animators then use motion paths to map the movement of the model across blocks of frames. Early animation was done frame by frame and was very tedious; today, animations are created by defining a starting and an ending key frame and letting the engine generate the frames in between, a method known as tweening. The animation engine is also responsible for telling the renderer how to display the assets and models. One advanced example is the Euphoria animation engine created by NaturalMotion, which plays an animation slightly differently every time it is repeated, even when the input values are the same. Games that have used this engine include Grand Theft Auto IV and Star Wars: The Force Unleashed.
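
A minimal sketch of tweening follows: the animator supplies only the start and end key frames, and the engine interpolates the frames in between. Keyframe and tween are illustrative names chosen for the example.

    #include <cstdio>

    struct Keyframe {
        float time;     // seconds
        float x, y;     // position of a joint or model at this key frame
        float angle;    // rotation in degrees
    };

    // Linear interpolation between two key frames at time t.
    Keyframe tween(const Keyframe& a, const Keyframe& b, float t) {
        float s = (t - a.time) / (b.time - a.time);   // 0 at frame a, 1 at frame b
        return { t,
                 a.x + s * (b.x - a.x),
                 a.y + s * (b.y - a.y),
                 a.angle + s * (b.angle - a.angle) };
    }

    int main() {
        Keyframe start{ 0.0f,  0.0f, 0.0f,  0.0f };
        Keyframe end  { 1.0f, 10.0f, 5.0f, 90.0f };

        // The engine produces the in-between frames automatically.
        for (float t = 0.0f; t <= 1.0f; t += 0.25f) {
            Keyframe f = tween(start, end, t);
            std::printf("t=%.2f  pos=(%.1f, %.1f)  angle=%.1f\n", t, f.x, f.y, f.angle);
        }
    }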

 

Level Editor

 

Level editors are a set of development tools for creating game levels such as missions and stages. As the player progresses through the game, the missions become harder because each one is designed differently and the AI bots grow more difficult. Level editors let designers determine the placement of assets and models in the virtual world. For example, a box may be placed at point A in level N; in level N+1, the box may be placed at point B, at a different coordinate, forcing the player to face a different set of obstacles to finish the level. Level editors also set the textures applied to the assets in the game: gravel, tiles, tree bark and aluminum are all examples of textures used in 3D games, while 2D games are simpler and often involve only flat colours. Famous level editors include Valve's Hammer Editor, 3D World Studio, Epic's UnrealEd and id Software's Q3Radiant.
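
The sketch below shows, in rough form, the kind of data a level editor produces: for each level, a list of asset placements recording which model goes where and with which texture. The structures and sample values are invented for illustration.

    #include <cstdio>
    #include <string>
    #include <vector>

    struct Placement {
        std::string asset;     // e.g. "box"
        std::string texture;   // e.g. "gravel", "tree bark"
        float x, y, z;         // world coordinates chosen in the editor
    };

    struct Level {
        std::string name;
        std::vector<Placement> placements;
    };

    int main() {
        // Level N places the box at point A; level N+1 moves it to point B,
        // which is what changes the obstacles the player faces.
        std::vector<Level> levels = {
            { "Level N",   { { "box", "gravel",     4.0f, 0.0f,  7.0f } } },
            { "Level N+1", { { "box", "tree bark", 12.0f, 0.0f, -3.0f } } },
        };

        for (const Level& lvl : levels) {
            std::printf("%s\n", lvl.name.c_str());
            for (const Placement& p : lvl.placements) {
                std::printf("  %s (%s) at (%.1f, %.1f, %.1f)\n",
                            p.asset.c_str(), p.texture.c_str(), p.x, p.y, p.z);
            }
        }
    }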

 

 

Game Engine