This time around the Audio System received some overdue additions in preparation for what is to come.
Audio systems in games are notoriously static in nature. A buzzword is "sound shaders", but this usually means parametrizing playback parameters rather than truly dynamic sound. For this game project I need a sound system which supports dynamic sound, and this is what I've cooked up over the last month.
At the core of the dynamic sound system sit the Synthesizers. The link points to the wiki page with detailed information I'm going to leave out here. In a nutshell, synthesizers allow generating sound at run-time using sound production rules (or sources). Synthesizers are assigned to speakers and played back like regular sound files, except that they are dynamic rather than static. Controllers can be defined to manipulate the generated sound at run-time, even live during playback! The important feature here is that the synthesizers are generic in nature, like the rest of the game engine. All of their actual use cases are implemented inside game scripts using/driving synthesizers instead of being hard-coded into the game engine. This provides much more flexibility to me, and to you if you later work with this game engine. To support this, a new synthesizer editor has been added so synthesizers can be easily created and tested.
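To make the idea concrete, here is a minimal Python sketch of how a generic synthesizer with live controllers could look. The class names and structure are illustrative only, not the Drag[en]gine API:

```python
import math

class SineSource:
    """A sound production rule (source) generating a sine wave."""
    def __init__(self, rate=44100):
        self.rate = rate
        self.phase = 0.0

    def generate(self, count, frequency):
        # produce 'count' samples, continuing from the current phase
        out = []
        step = 2.0 * math.pi * frequency / self.rate
        for _ in range(count):
            out.append(math.sin(self.phase))
            self.phase += step
        return out

class Synthesizer:
    """Generic synthesizer: a source plus controllers that scripts
    can change while the sound is playing back."""
    def __init__(self, source):
        self.source = source
        self.controllers = {"pitch": 440.0}  # adjustable live

    def render(self, count):
        return self.source.generate(count, self.controllers["pitch"])

synth = Synthesizer(SineSource())
block1 = synth.render(256)
synth.controllers["pitch"] = 880.0  # controller changed during playback
block2 = synth.render(256)          # next block already uses the new pitch
```

The point is the separation: the engine-side synthesizer only renders samples from sources, while game scripts drive the controllers to implement the actual use cases.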
Two example implementations of synthesizer-driving scripts are included in the game engine distribution: dynamic music and announcers. Both use a simple synthesizer with a single chain source and are ready to use.
Dynamic music allows modifying the playing music by transitioning through music parts using switches. As a test example I used the dynamic music files from Stalker: Clear Sky since they are well suited for this test case. The included scripts load the dynamic music from a hand-written XML file. The file contains the music parts (sound files), the switches used by the game, and the transitions between music parts depending on switch states. All of it is implemented in simple game scripts so it can be altered and extended without limits.
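As an illustration of the idea, here is a hedged Python sketch of such a transition table. The XML element and attribute names (`part`, `switch`, `transition`) are hypothetical, not the actual schema used by the scripts:

```python
import xml.etree.ElementTree as ET

class DynamicMusic:
    def __init__(self, xml_text):
        root = ET.fromstring(xml_text)
        # music parts: id -> sound file
        self.parts = {p.get("id"): p.get("file") for p in root.findall("part")}
        # switches toggled by the game at run-time
        self.switches = {s.get("name"): False for s in root.findall("switch")}
        # transitions: (from part, required switches, next part)
        self.transitions = [
            (t.get("from"),
             frozenset((t.get("if") or "").split()),
             t.get("to"))
            for t in root.findall("transition")]

    def next_part(self, current):
        # first transition whose required switches are all on wins
        on = {name for name, state in self.switches.items() if state}
        for source, required, target in self.transitions:
            if source == current and required <= on:
                return target
        return current  # no transition matches: keep looping

music = DynamicMusic("""
<music>
  <part id="calm" file="calm.ogg"/>
  <part id="combat" file="combat.ogg"/>
  <switch name="enemyNearby"/>
  <transition from="calm" if="enemyNearby" to="combat"/>
  <transition from="combat" to="calm"/>
</music>""")

music.switches["enemyNearby"] = True
print(music.next_part("calm"))    # combat
music.switches["enemyNearby"] = False
print(music.next_part("combat"))  # calm
```

Each time the current music part finishes, the script queries the table with the current switch states and feeds the next part's sound file into the synthesizer's chain source.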
Announcers provide a simple way to build announcement systems, such as automated train announcements, using a list of recorded words. As a test example I used the VOX files from Half-Life 1 since they are well suited for this test case. The included scripts load the announcer from a hand-written XML file. The file defines where the word sound files are located. Once loaded, a sentence can be given to the script and it plays back the announcement.
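The word-lookup idea can be sketched in a few lines of Python; the directory layout, file names and word list below are made up for illustration:

```python
def build_announcement(sentence, known_words, word_dir="vox", ext=".ogg"):
    """Map each word of a sentence to its recorded sound file.

    Words without a recording are skipped so a missing file never
    breaks the whole announcement.
    """
    files = []
    for word in sentence.lower().split():
        if word in known_words:  # ignore words we have no recording for
            files.append(f"{word_dir}/{word}{ext}")
    return files

# hypothetical word list loaded from the announcer XML file
vox_words = {"attention", "train", "departing", "now"}
print(build_announcement("Attention train departing now", vox_words))
# ['vox/attention.ogg', 'vox/train.ogg', 'vox/departing.ogg', 'vox/now.ogg']
```

The resulting file list is then queued into the synthesizer's chain source, which plays the words back to back as one announcement.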
The video below shows these two scripts in action inside the test project. This project is included in the game engine distribution and serves as a sort of demo project to learn the ropes. The copyrighted material used for my implementation tests is obviously not included.
These are only two small examples which the game will build upon. Since they are scripts, they are simple to extend and improve. And now to something different.
The game project uses reusable world geometry a lot. For this reason material sounds are not as simple as assigning a sound type to an object. Material-to-material impact sounds in particular usually require a lot of work, with tons of sound samples to record. Since I have neither a sound engineer nor that level of equipment, I decided to cut down the number of sounds by using combined collision sounds. Instead of playing one sound for each individual material combination, impacts now play one sound for each material involved. This reduces the work a lot while allowing for more combinations. Material types now support a range of different sound events, from impacts to actor movement sounds. Sounds are either pre-recorded sound samples or possibly synthesizers. The former is used right now for ease of use, but the latter can be used for special tricks.
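The saving is easy to see: with N material types, per-combination impacts need on the order of N² samples, while combined collision sounds need only N. A small Python illustration with made-up material and file names:

```python
# one impact sample per material type instead of one per material pair
impact_sounds = {
    "metal": "impact_metal.ogg",
    "wood":  "impact_wood.ogg",
    "stone": "impact_stone.ogg",
}

def collision_sounds(material_a, material_b):
    """Sounds to layer for one impact: one per material involved.

    Materials without a registered sample are silently skipped.
    """
    return [impact_sounds[m] for m in (material_a, material_b)
            if m in impact_sounds]

print(collision_sounds("metal", "wood"))
# ['impact_metal.ogg', 'impact_wood.ogg']
```

A metal-on-wood impact thus layers the metal sample and the wood sample, and any new material type automatically combines with all existing ones.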
To support this, the physics system has also been improved to properly handle kinematic and dynamic collisions in a similar way. Collision shape properties are now used on all elements to link collision shapes to object materials. The world editor now supports properties on component textures, as seen in the screenshot below.
This allows assigning arbitrary properties to textures and redefining them in the editor. The game scripts use this to redefine the material type per texture, in addition to the types defined in element classes. The video below shows work in progress on adding more material sounds and getting all objects their appropriate sounds assigned.
With the synthesizer system in place I can now do this nifty little surprise I've been twiddling around in my head for a long time. I'm not going to say more for the time being :D .
Over the last month a lot of work went into AI development, script performance optimization and the usual feature improvements and bug fixing. Since this game is about detective investigation, it revolves around dynamic interaction with characters and the environment. This requires a powerful conversation system and powerful AI. The AI does not just stand around waiting for the player to blow it in the face, nor does it act as stupid cannon fodder. The AI system now features the required basic functionality.
Game AI usually consists of a behaviour tree or a state machine. This works for simple AI but is rigid and error-prone for complex AI. In this project, player and non-player actors are handled with as much shared code as possible (if the player can do something, so can the CPU, and vice versa). To achieve this the game uses a 3-stage AI behaviour:
Drag[en]gine provides path calculation for AI using the Navigation System (static obstacles). Collision avoidance (or steering) around dynamic obstacles along the path is the job of the scripts, since this is very game-specific. The improved collision avoidance deals with two typical scenarios: Move Around and Wait Behind.
A typical solution is a physical approach where actors and obstacles act as magnets repelling each other. This works, but the movement looks like what it is: two repelling magnets. This is unnatural and does not match the game's goals. Instead, collision avoidance is implemented the way real humans avoid collisions: approaching an obstacle, they change direction early rather than repelling like magnets. Actors test for collisions along their moving direction at regular intervals (1-2 s). If a future collision is detected, a deviation angle is calculated to wing around the obstacle. This deviation is applied for 1-2 s, then the actor heads for his intended goal again. The closer the actor is to the obstacle, the larger the deviation angle. Combined with testing at larger intervals, this leads to a pleasant avoidance behaviour. Waiting behind obstacles is required when actors pass through narrow spaces like doors: instead of winging around, actors adjust their speed if the obstacle ahead is slower. Actors also detect nearby walls, so if they pick the wall side of the obstacle and don't fit through, they pick the other side. Navigation space cost functions are used to prefer cheaper movement costs. This keeps actors inside virtual corridors, avoiding unfavourable terrain while avoiding collisions. The video below gives an impression of how this looks in action.
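The distance-dependent deviation can be sketched as a simple function; the probe range and maximum angle here are illustrative constants, not the values used by the game scripts:

```python
import math

def deviation_angle(distance, probe_range=5.0, max_angle=math.radians(60)):
    """Deviation (radians) to apply for the next 1-2 s interval.

    Returns 0 if no collision is predicted within the probe range;
    otherwise the deviation grows linearly as the obstacle gets closer.
    """
    if distance >= probe_range:
        return 0.0  # path is clear, head straight for the goal
    return max_angle * (1.0 - distance / probe_range)

print(round(math.degrees(deviation_angle(4.0)), 1))  # 12.0
print(round(math.degrees(deviation_angle(1.0)), 1))  # 48.0
print(math.degrees(deviation_angle(6.0)))            # 0.0
```

Applied only every 1-2 s, this gives the gradual "change direction early" curve instead of the jittery push of a repelling-magnets model.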
The AI has to react to dynamic changes in the world and behave intelligently in the static world. To achieve this, the AI supports notifications: objects can send notifications with localized hints to actors, which they can respect or ignore. This is used in a couple of situations:
The game defines a small set of unique characters to begin with. The majority of actors are generated from XML templates (modders can rejoice). You can modify appearance (random texture sets), starting inventory (random props), animations (any animator ever used) and more. As you progress with your investigation, the game (CM) converts selected generated characters into important actors, tying them into the story. So actors passing by can later become important, making each investigation unique.
The goal is to create a living world the player investigates. Interacting with actors in different situations and locations alters the flow and the outcome. Being at the right place at the right time is as important as asking the right questions. These AI behaviours have been implemented so far (more to follow).
Besides AI I've also started working on a game-engine-related topic. I think you can guess what it's all about from the image below. Readers of the previous news post (and the wiki) might recognize something ;)
Preparations have also been made for a special, unique feature of this game. I'm not going to talk about it now since that would spoil the fun, so look forward to it! But I'm sure you are going to like it (especially fellow indie devs and modders).
What also happened, for those interested in the details (as a text file to not bloat the news post):
This project is still in need of helping hands on the content production side. If you are a model artist (skilled in world props, buildings or humanoids) or a texture artist, you are welcome to get in contact with me. If you have other skills and want to help, don't be shy and send me a PM too.
You can now track the game engine and game project not only on the Drag[en]gine and Epsylon profiles but also on the EpsylonGame Twitter channel.
So if you want to support this project, tell your friends about the channel or retweet what you find there. Every bit of support helps.
These past months have been fully in the name of performance improvements, feature additions and especially documentation. I won't go into much text here since the documentation is where the big text lives, for those interested in both the inner engine workings and especially how to develop with this game engine. This produced lots of documentation, especially for the DragonScript language, in the form of API documentation and wiki documentation. Also included now is the first demo application for the new features; more will follow to get you started with the game engine. From the technical point of view, what ate the majority of the time was a tremendous performance workover touching many engine areas. Various improvements, including parallel processing and an advanced OpenGL render-thread system, consumed more time than expected, but the result is a total performance boost of up to 150% over this spring's baseline. There is still some room left, especially in foreign code not that much under my control. And as if this were not enough, the Drag[en]gine now also has a whole new Canvas System. So before the links to all the information, here is a little view across some of the parts touched during the past time.
As you have noticed, there is no big content update. Since I was fully occupied with all the tech and documentation, and I still have no helping hands besides myself, I could not also deal with content. So if you are from the artistic domain and want to do something other than what is usually around, you are welcome to step forward. No portfolio fillers, though; I've had enough bad experiences with those. So happy game-deving until next time.
This time around a lot of wrench work has been done in the engine core parts, which does not translate well into a news post. Nevertheless some parts of it are interesting, namely the beam particle simulation, particle lighting and various other improvements along the way. I'll keep the talk short and give more time to a video.
So far the particle and ribbon simulation modes had been fully working for particle emitters; beams had not been fully implemented, and this has changed now. Particle emitters with the beam simulation mode allow for a lot of possible effects. I'm not going to talk about them here but show some examples connected with the changes in the video. In a nutshell you get ribbon-like beam rendering with a physically simulated particle path. Animated paths are a topic I'll talk about in another news post. Sheet rendering for ribbons/beams has been improved and is now enabled by default. Particle light sources have been fixed and improved: every emissive particle, ribbon or beam automatically casts light with the same complex light shader as all other light sources. A beam curve for burst-beam lifetime control has been added, as well as a particle warm-up time so emitters start in full beauty when casting is enabled. Last but not least, a leak problem with particle emitters generated automatically on particle collision has been fixed.
So here is the video with some examples.
ModDB Video, YouTube Video
Other stuff is not worth a full section but is important nevertheless. SSS is now improved for transparent objects. This allows, for example, for properly damaged glass as in this test.
Furthermore, I upgraded to Bullet Physics 2.82 and at the same time fixed some long-standing issues with Bullet. I also reworked the DOF handling for constraints, adding support for implementing static and kinematic joint friction and breaking force on top of Bullet. This is not working yet since Bullet has some ideas of its own about certain things that I need to re-implement first.
In the IGDE editors, various cleanups I came across along the way have been applied to get the first release done quicker (as it is already).
Animation scaling support has been added; it had been broken in the export scripts. Scaling can be applied anywhere in an animator, but care should be taken since physics modules might have trouble with strangely scaled colliders. Here is an example shot.
In the OpenGL module I switched the code over to using Texture Sampler Objects. This helps keep complex rendering maintainable and saves state-change calls.
And to wrap it up, the curve editor in the particle emitter editor received some additional menu operations to speed up common tasks.
I've promised some editor usage videos. I'll make them next and present them in an article for those interested in learning how the editors work, as well as for those interested in giving early feedback to incorporate before the first release.