
Constructed Textures

Drag[en]gine: Posted Tue 25th Jul 2017 at 13:18:41 by Dragonlord. Authorized by Dragonlord.


This post is a summary of the recent Twitter posts collected for IndieDB. To stay in the loop, watch the Epsylon Twitter feed.

See IndieDB News Article

Height Terrain Navigation Space

Drag[en]gine: Posted Sat 11th Jun 2016 at 11:59:48 by Dragonlord. Authorized by Dragonlord.

A little candy before the next news post comes around.

A lot of work is currently going on around game mechanics and more, but that will not be disclosed yet. In the meantime, to bridge the gap until everything is prepared for the news post, here are some Drag[en]gine goodies from the same period. The focus in this news post is on Height Terrain Navigation Spaces and Navigation Space optimizations. So no long words; instead, a video.

ModDB: Height Terrain Navigation Space

And here is a short list of what else went on that is Drag[en]gine specific.

  • Improving synthesizer (basic effects, loader scripts), dynamic music scripts and announcer scripts
  • Improving OpenAL handling of limited hardware voices
  • Android debugging, speed improvements (armeabi-v7a-hard support) and Android mess-up fixing
  • Working on "test project" for first time users to learn the ropes
  • Lots of improvements and new features on world, conversation and other editors
  • Various fixes and improvements on Blender export scripts
  • Various fixes and improvements on light/ambient handling with shadows
  • Eventually fixing the annoying libPNG problem
  • Navigation Spaces support on Height Terrains
  • Removing old and deprecated code to get closer to release state
  • Navigation Space Blocking improved with Optimization Pass
  • Navigation Space Linking improved for easier and more predictable use

So long until the real big news post comes around.

Dynamic Audio System

Drag[en]gine: Posted Mon 25th Jan 2016 at 19:24:23 by Dragonlord. Authorized by Dragonlord.

This time around the Audio System received some overdue additions in preparation for what is to come.

Audio systems in games are notoriously static in nature. A buzzword is "sound shaders", but that is more about parametrizing playback parameters than truly dynamic sound. For this game project I need a sound system that supports dynamic sound, and this is what I've cooked up over the last month.


At the core of the dynamic sound system sit the Synthesizers. The link points to the wiki page, which contains the in-detail information I'm going to leave out here. In a nutshell, synthesizers generate sound at run-time using sound production rules (sources). Synthesizers are assigned to speakers and played back like regular sound files, except that they are dynamic rather than static. Controllers can be defined to manipulate the generated sound at run-time, live during playback! The important feature here is that synthesizers are generic in nature, like the rest of the game engine. All of their actual use cases are implemented inside game scripts using/driving synthesizers instead of being hard-coded into the game engine. This provides much more flexibility to me, and to you if you later work with this game engine. For this, the new synthesizer editor has been added so synthesizers can be easily created and tested.
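As an illustration of the controller idea (the actual synthesizer API lives in the engine's own scripting language; the names and structure here are hypothetical), a minimal Python sketch of a generator whose parameters can be changed live while it keeps producing samples:

```python
import math

class SineSynthesizer:
    """Toy synthesizer: one controller ("pitch") modulates a sine source.
    Illustrative only, not the Drag[en]gine API."""

    def __init__(self, sample_rate=44100):
        self.sample_rate = sample_rate
        self.controllers = {"pitch": 440.0}  # adjustable during playback
        self.phase = 0.0

    def set_controller(self, name, value):
        # Called by game scripts at any time, even while playing back.
        self.controllers[name] = value

    def generate(self, num_samples):
        # Produce the next block of samples from the current controller state.
        out = []
        step = 2 * math.pi * self.controllers["pitch"] / self.sample_rate
        for _ in range(num_samples):
            out.append(math.sin(self.phase))
            self.phase += step
        return out

synth = SineSynthesizer()
block1 = synth.generate(128)        # 440 Hz
synth.set_controller("pitch", 880)  # live change mid-playback
block2 = synth.generate(128)        # continues from the same phase at 880 Hz
```

Because the phase carries over between blocks, the pitch change is seamless; the same pattern generalizes to any number of controllers driving any source parameters.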


Two example implementations of synthesizer-driving scripts are included in the game engine distribution: dynamic music and announcers. Both use a simple synthesizer with a single chain source and are ready to use.

Dynamic music allows modifying the playing music by transitioning through music parts using switches. As a test example I used the dynamic music files from Stalker Clear Sky, since they are well suited for this test case. The included scripts load the dynamic music from a hand-written XML file. The file contains the music parts (sound files), switches used by the game, and transitions between music parts depending on switch states. Everything is implemented in simple game scripts, so it can be altered and extended without limits.
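To illustrate the part/switch/transition idea (not the actual XML schema or script API; all names here are made up), a minimal Python sketch of the state machine the scripts implement:

```python
# Hypothetical dynamic-music definition in the spirit of the post:
# music parts, game-controlled switches, and transitions keyed on switch state.
PARTS = {
    "calm":   {"file": "calm.ogg"},
    "tense":  {"file": "tense.ogg"},
    "combat": {"file": "combat.ogg"},
}
# (current part, switch name, switch value) -> next part
TRANSITIONS = {
    ("calm",   "danger", True):  "tense",
    ("tense",  "danger", False): "calm",
    ("tense",  "combat", True):  "combat",
    ("combat", "combat", False): "tense",
}

class DynamicMusic:
    def __init__(self, start):
        self.part = start
        self.switches = {"danger": False, "combat": False}

    def set_switch(self, name, value):
        # The game flips switches; the music reacts at part boundaries.
        self.switches[name] = value

    def on_part_finished(self):
        # At the end of the current part, follow the first matching transition.
        for name, value in self.switches.items():
            nxt = TRANSITIONS.get((self.part, name, value))
            if nxt:
                self.part = nxt
                return self.part
        return self.part  # no transition matches: loop the current part

music = DynamicMusic("calm")
music.set_switch("danger", True)
part = music.on_part_finished()  # "calm" transitions to "tense"
```

Evaluating transitions only at part boundaries is what keeps the music sounding composed rather than abruptly cut.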

Announcers provide a simple way to build announcement systems, like automated train announcements, using a list of recorded words. As a test example I used the VOX files from Half-Life 1, since they are well suited for this test case. The included scripts load the announcer from a hand-written XML file. The file defines where the word sound files are located. Once loaded, a sentence can be given to the script and it plays back the announcement.
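The core of the announcer idea is just mapping words to recorded clips and queueing them; a hedged Python sketch (directory layout and file names are hypothetical):

```python
import os

# Illustrative announcer sketch: a sentence is split into words and each
# word maps to one recorded sound file in a word directory.
class Announcer:
    def __init__(self, word_dir):
        self.word_dir = word_dir
        self.queue = []  # sound files waiting for playback

    def announce(self, sentence):
        # Build a playback queue of one clip per word in the sentence.
        for word in sentence.lower().split():
            self.queue.append(os.path.join(self.word_dir, word + ".ogg"))

    def next_clip(self):
        # The speaker pulls the next clip when the previous one finishes.
        return self.queue.pop(0) if self.queue else None

announcer = Announcer("sounds/vox")
announcer.announce("train four now boarding")
first = announcer.next_clip()  # ".../train.ogg"
```

A real implementation would also handle missing words and insert short pauses between clips, but the data-driven principle is the same.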

The video below shows these two scripts in action from inside the test project. This is a project included in the game engine distribution, a sort of demo project to learn the ropes. The copyrighted material used for my implementation tests is obviously not included.

Dynamic Music and Announcer Test (IndieDB Video)

These are only two small examples which the game will build upon. Since these are scripts, they are simple to extend and improve. And now to something different.

Material Sounds

The game project makes heavy use of reusable world geometry. For this reason material sounds are not as simple as assigning a sound type to an object. Material-material impact sounds in particular usually require a lot of work recording tons of sound samples. Since I have neither a sound engineer nor that level of equipment, I decided to cut down the number of sounds by using combined collision sounds. Instead of playing one sound for each individual material combination, impacts now play one sound for each material involved. This reduces work a lot while allowing for more combinations. Material types now support a range of different sound events, from impacts to actor movement sounds. Sounds are either pre-recorded sound samples or possibly synthesizers. The former is used right now for ease of use, but the latter can be used for special tricks later.
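The saving is easy to see in a sketch: with N materials, per-pair recordings need on the order of N² samples, while the combined approach needs only N. A minimal Python illustration (material and file names are made up):

```python
# Combined-collision-sound sketch: one impact sample per *material*,
# not one per material pair. On a collision, both materials' sounds
# are mixed together.
IMPACT_SOUNDS = {
    "metal": "impact_metal.ogg",
    "wood":  "impact_wood.ogg",
    "stone": "impact_stone.ogg",
}

def collision_sounds(material_a, material_b):
    # Returns the list of samples to mix for this impact.
    return [IMPACT_SOUNDS[material_a], IMPACT_SOUNDS[material_b]]

# Metal hitting wood plays the metal sound and the wood sound together;
# no dedicated metal-on-wood recording is needed.
sounds = collision_sounds("metal", "wood")
```

Adding a fourth material means recording one new sample instead of three new pair combinations, and the gap widens with every material added.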

To support this, the physics system has also been improved to properly handle kinematic and dynamic collisions in a similar way. Collision shape properties are now used on all elements to link collision shapes to object materials. The world editor now supports properties on component textures, as seen in the screenshot below.

This allows assigning arbitrary properties to textures while re-defining them in the editor. The game scripts use this to re-define the per-texture material type in addition to those defined in element classes. The video below shows work in progress on adding more material sounds as well as assigning all objects their appropriate sounds.

Material Sounds (IndieDB: Video)


With the synthesizer system in place I can now work on this nifty little surprise I've been twiddling around in my head for a long time. I'm not going to say more for the time being :D .

AI! AI, Everywhere!

Drag[en]gine: Posted Sat 26th Dec 2015 at 17:57:46 by Dragonlord. Authorized by Dragonlord.


Over the last month a lot of work went into AI development, performance-optimizing scripts, and the usual feature improvements and bug fixing. Since this game is a lot about detective investigation, it revolves around dynamic interaction with characters and the environment. This requires a powerful conversation system and powerful AI. The AI does not just stand around waiting for the player to blow it in the face, nor does it act as stupid cannon fodder. The AI system now features the required basic functionality.

Game AI usually consists of a behaviour tree or state machine. This works for simple AI but is rigid and error-prone for complex AI. In this project, player and non-player actors are handled with as much shared code as possible (if the player can do something, so can the CPU; if the CPU can, so can the player). To achieve this, the game uses a 3-stage AI behaviour:

  • Actions: low-level interactions like pressing buttons or using button panels. They are the same for all actors (player and non-player) and thus scripted only once.
  • Behaviour: where the player triggers actions through his inputs, the AI triggers them using behaviours. The work of the past month focused on this part. Behaviour is context dependent; the context is the current task/mission of the actor.
  • Agenda: simulates the actor's motives (master plan). The player faces an artificial game master (the Criminal Master) which controls actor agendas and thus their behaviour. The player has to use his detective skills to figure out these (hidden) agendas to solve the case. This will be worked on next.
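
The layering above can be sketched in a few lines of Python (class names are invented for illustration; the real system is implemented in the engine's game scripts). The key point is that the action layer is the same object regardless of who triggers it:

```python
# Sketch of the three-stage split: Actions are shared by player and AI;
# a Behaviour decides which action to trigger; an Agenda picks behaviours.
class PressButtonAction:
    # Stage 1: identical for player and non-player actors.
    def run(self, actor):
        return f"{actor} presses the button"

class OpenDoorBehaviour:
    # Stage 2: the AI's stand-in for player input; it triggers the same action.
    def update(self, actor):
        return PressButtonAction().run(actor)

class Agenda:
    # Stage 3: the master plan selecting which behaviour runs right now.
    def __init__(self):
        self.behaviour = OpenDoorBehaviour()

    def update(self, actor):
        return self.behaviour.update(actor)

result = Agenda().update("npc_01")
```

A player-controlled actor would simply call `PressButtonAction().run(...)` from its input handling, bypassing the behaviour and agenda layers while reusing the exact same action code.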

All this is explained and shown in the videos below.

Actor Locomotion and Collision Avoidance

Drag[en]gine provides path calculation for AI using the Navigation System (static obstacles). Collision avoidance (or steering) around dynamic obstacles along the path is the job of the scripts, since it is very game specific. The improved collision avoidance deals with two typical scenarios: Move Around and Wait Behind.

A typical solution is a physical approach where actors and obstacles are magnets repelling each other. This solution works, but the movement looks like what it is: two repelling magnets. This is unnatural and does not match the game goals. To solve this problem, collision avoidance is implemented similar to how real humans avoid collisions: approaching an obstacle, they change direction once instead of repelling like magnets. Actors test for collisions along their moving direction at regular intervals (1-2 s). If a future collision is detected, a deviation angle is calculated that wings the obstacle. This deviation is applied for 1-2 s, then the actor heads again for his intended goal. The closer the actor is to the obstacle, the larger the deviation angle. Combined with testing at larger intervals, this leads to a pleasant avoiding behaviour.

Waiting behind obstacles is required if actors pass through narrow spaces like doors. Instead of winging, actors adjust their speed if the obstacle is slower. Actors detect nearby walls, so if they choose the wall side of the obstacle but don't fit through, they pick the other side. Navigation space cost functions are used to pick cheaper movement costs. This keeps actors inside virtual corridors, avoiding unfavourable terrain while avoiding collisions. The video below gives an impression of how this looks in action.
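The "closer obstacle, larger deviation" rule can be sketched as a simple function (the probe range and maximum angle are illustrative constants, not values from the game):

```python
import math

# Sketch of the deviation-angle steering described above: probe ahead at
# regular intervals; if an obstacle lies on the path, turn away by an
# angle that grows as the obstacle gets closer.
PROBE_RANGE = 5.0               # metres the actor tests ahead (assumed)
MAX_DEVIATION = math.radians(60)  # hardest turn allowed (assumed)

def deviation_angle(obstacle_distance, side):
    """side is +1 to wing right, -1 to wing left."""
    if obstacle_distance >= PROBE_RANGE:
        return 0.0  # no future collision detected: keep heading for the goal
    # Closer obstacle -> larger deviation, up to MAX_DEVIATION.
    closeness = 1.0 - obstacle_distance / PROBE_RANGE
    return side * MAX_DEVIATION * closeness

strong = deviation_angle(1.0, +1)  # obstacle 1 m ahead: sharp turn
gentle = deviation_angle(4.5, +1)  # obstacle 4.5 m ahead: gentle turn
```

Applying the returned angle for a second or two and then re-aiming at the goal, as the post describes, is what produces the single smooth swerve instead of continuous magnet-like jitter.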

Actor Notifications

AI has to react to dynamic changes in the world and react intelligently to the static world. To achieve this goal the AI supports notifications. Objects can send notifications with localized hints to actors, which can respect or ignore them. This is used in a couple of situations:

  • Actor approaching a dangerous place like a road crossing during a red light. Makes the actor wait before continuing to walk. Usable for all situations where actors have to wait for something to happen before going on.
  • Actor approaching a closed door he has to open by pressing a button. Makes maps quick to build since world elements are "intelligent".
  • Actors noticing nearby objects and inspecting them (so think twice about where you hide).
  • Actor moving close to stairs is notified he could use a nearby elevator instead, or other alternate routes.
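
The first situation in the list above can serve as a minimal sketch of the notification flow (class and hint names are invented; the real hints are localized, i.e. tied to a place in the world):

```python
# Sketch of the notification idea: world objects push hints to nearby
# actors, and each actor is free to respect or ignore them.
class Actor:
    def __init__(self, name, obeys_signals=True):
        self.name = name
        self.obeys_signals = obeys_signals
        self.waiting = False

    def notify(self, hint):
        # The actor decides what a hint means for its current behaviour.
        if hint == "wait_red_light" and self.obeys_signals:
            self.waiting = True

class RoadCrossing:
    # An "intelligent" world element that knows who is near it.
    def __init__(self):
        self.nearby = []

    def red_light(self):
        for actor in self.nearby:
            actor.notify("wait_red_light")

pedestrian = Actor("pedestrian")
jaywalker = Actor("jaywalker", obeys_signals=False)
crossing = RoadCrossing()
crossing.nearby = [pedestrian, jaywalker]
crossing.red_light()
# pedestrian now waits; the jaywalker ignored the hint
```

Pushing the knowledge into the world element keeps the actor scripts generic: a new kind of waiting situation only needs a new object sending the hint, not new actor code.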

All this provides a world with rich and smart actors, especially during (potentially dangerous) missions.

Actor AI - Indie DB

Actor Generation

The game defines a small set of unique characters to begin with. The majority of actors are generated from XML templates (modders can rejoice). You can modify appearance (random texture sets), starting inventory (random props), animations (all animators ever used) and more. As you progress with your investigation, the game (CM) converts selected generated characters into important actors, tying them into the story. Actors passing by can thus later become important, making each investigation unique.
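To make the template idea concrete, here is a hedged Python sketch of randomized actor generation (the field names are hypothetical, not the actual XML template schema):

```python
import random

# Illustrative actor template in the spirit of the XML templates above:
# each generated actor draws appearance, prop and animator at random.
TEMPLATE = {
    "texture_sets": ["suit_grey", "suit_brown", "coat_black"],
    "props":        ["newspaper", "umbrella", "briefcase", None],
    "animators":    ["walk_casual", "walk_hurried"],
}

def generate_actor(template, rng):
    return {
        "texture":   rng.choice(template["texture_sets"]),
        "prop":      rng.choice(template["props"]),
        "animator":  rng.choice(template["animators"]),
        "important": False,  # the game master may promote this actor later
    }

rng = random.Random(42)      # seeding makes generation reproducible
actor = generate_actor(TEMPLATE, rng)
```

Promoting a generated actor to an important one would then just flip the `important` flag and attach story data, which matches the idea that any passer-by can later matter.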

Actor Behavior

The goal is to create a living world the player investigates. Interacting with actors in different situations and locations alters the flow and outcome. The right place and time is as important as asking the right questions. These AIs have been implemented so far (more to follow).

  • CityLife. Actors casually navigating the city outside mission scenarios. Goal markers are placed throughout the city for actors to choose from. Goals are attached to Locations telling actors what behaviour to use and what props they can interact with. CityLife populates the city streets with life and makes it harder to know who is important or just a bystander.
  • Train. Actors planning to ride a train or another kind of vehicle (waiting, boarding, riding and leaving). Expect missions like tailing suspicious actors or trying to lose actors tailing you.
  • Elevator. Similar to train behaviour allowing actors to use elevators.
  • Office. Actors moving inside offices/buildings. The basic actor-at-work behaviour. You often find actors at their work places.

This is the initial version. Behaviours can interrupt and branch in a tree-like way to easily produce lively behaviour. Mission-related actor AI is coming next. Expect a living world around you to dive into. The video below shows everything mentioned so far in action.

AI Behaviour - Indie DB


Besides AI I've also started working on a game engine related topic. I think you can guess what it's all about from the image below. Readers of the previous news post (and wiki) might recognize something ;)


Preparations have also been done for a special unique feature of this game. I'm not going to talk about it now since this would spoil the fun, so look forward to it! But I'm sure you are going to like it (especially fellow indie devs and modders).


What also happened, for those interested in details (as a text file to not bloat the news post):
Helping Hands

This project is still in need of helping hands on the content production side. If you are a model artist (skilled in world props, buildings or humanoids) or a texture artist, you are welcome to get in contact with me. If you have other skills and want to help, don't be shy and send me a PM too.

Twitter Channel Online

Drag[en]gine: Posted Mon 29th Jun 2015 at 13:26:59 by Dragonlord. Authorized by Dragonlord.


You can now track the game engine and game project not only on the Drag[en]gine and Epsylon profiles, but also on the EpsylonGame Twitter channel.

So if you want to support the love for this project, tell your friends about this channel or retweet what you find there. Every bit of support helps.