{{tag>graphic skin texture normal}}
<WRAP youarehere>
[[:start|Start Page]] >> [[:gamedev|Game Development with the Drag[en]gine]] >> [[gamedev:skinproperties|Skin Texture Properties]] >> [[gamedev:texturepropertylist|Texture Property Database]] >> **normal**
</WRAP>
  
====== Skin Texture Property: normal ======

Defines the surface normal in tangent space

|Expected Data Source|3 component image|
|Data Range|0 to 1 for all image components|
|Affected Modules|Graphic, Physics|
|Linked Properties|[[gamedev:textureproperties:normal_strength|normal.strength]]|
  
====== Description ======

The **normal** texture property adds high resolution lighting detail to objects that cannot be represented with a low resolution game model. Normal based lighting has long been a standard in game development. The basic idea is to use textures to store high resolution surface normals that are not present in the low resolution model. Such normal map textures are no ordinary color textures: they store normal vectors packed into 8-bit color values. Normal maps can be produced with various tools, but typically two main ways exist.
  
  
===== History =====

Normal based lighting has long been a standard in game development and cinema. A game without normal mapping tends to look flat because objects lack the dynamic interplay of light and shadow we know from the real world. Various tricks have been developed over the years to deal with this problem.
  
  
===== Normal Maps from Images =====

The quick and often dirty way is to use a normal map plugin for GIMP or other paint applications. These plugins take a color or grayscale image and convert it into a normal map. Different operators can be used to calculate the resulting normal map. They vary in crispness, strength and the breadth of pixels covered (in the sense of how much detail from neighboring pixels flows into the result). Results obtained with a normal map plugin can therefore vary drastically. There are tricks to improve the results, for example using Photo Normal Reconstruction or applications specialized in generating normal maps from images. This technique is well suited for Detail Normal Maps, that is bumpy surface details of small scale and depth like scratches, leather patterns or embossed text. It is also often used to create normal maps from real world images; for the latter, though, Photo Normal Reconstruction is recommended, but results vary. For Geometry Normals, as required for complex organic models, this technique is not well suited at all. Here the second way to create normal maps is the better choice.
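 
The following is a minimal sketch of the gradient based approach such plugins use, written in Python with NumPy purely for illustration. The function name and the simple finite difference operator are placeholders; real plugins offer more elaborate operators.

<code python>
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Convert a grayscale height map (2D array, values 0..1) into an
    encoded normal map using simple finite difference gradients."""
    # Slope of the height field; a larger strength exaggerates the relief.
    dz_dx = np.gradient(height, axis=1) * strength
    dz_dy = np.gradient(height, axis=0) * strength

    # The surface normal of z = h(x, y) is proportional to (-dh/dx, -dh/dy, 1).
    normal = np.dstack((-dz_dx, -dz_dy, np.ones_like(height)))
    normal /= np.linalg.norm(normal, axis=2, keepdims=True)

    # Pack the -1..1 components into the 0..1 color range.
    return normal * 0.5 + 0.5
</code>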
  
===== Normal Maps from High Resolution Model =====

The second way uses a normal map generation tool and a set of models. With this technique a high resolution model is used as the source of the normal information, which is mapped onto the texture of a low resolution model. This process is called "Baking". Applications like Blender3D can do this for you, as can external tools specialized in normal map generation from a high polygon model. Using this technique can be tricky, as various problems can occur.
  
  
===== Tangent Space Problem =====

A main problem is the splitting of normals and especially tangents across the low resolution model. The normals have to be transformed relative to a coordinate system located at the point of interest on the low resolution model. This coordinate system is typically called the "Tangent Space" and depends on the normal and tangent vectors at this very point in space. Normals as well as tangents are interpolated by the Drag[en]gine across the faces of a low resolution model. Without proper splitting, the normals and tangents of neighboring faces are averaged, resulting in singularities. Even without these singularities, normals and tangents can interpolate across the low resolution model in unexpected ways. Normal map generation tools take care of these peculiarities, but there is an annoying problem: these tools usually work with their own specific definition of tangent space and of how tangents are split across models. The definitions vary only slightly, but small differences can cause huge visible artifacts. It is thus crucial to ensure that the normal map generation tool produces output fully consistent with the Drag[en]gine normal definition.
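 
As an illustration of what tangent space means here, the following Python sketch (names and NumPy usage are purely illustrative, not engine code) decodes a normal map sample and transforms it with the interpolated tangent, bitangent and normal of the surface point:

<code python>
import numpy as np

def tangent_to_world(sample, tangent, bitangent, normal):
    """Transform a normal map sample from tangent space into the space of
    the surface point it was sampled at."""
    # Decode the 0..1 color back into a -1..1 vector.
    v = np.asarray(sample) * 2.0 - 1.0
    # Columns of the TBN matrix: the interpolated tangent, bitangent and
    # normal at the surface point. How these are built and split is exactly
    # what differs between tools and must match the Drag[en]gine definition.
    tbn = np.column_stack((tangent, bitangent, normal))
    world = tbn @ v
    return world / np.linalg.norm(world)
</code>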
  
===== Normal Map Encoding =====

Another problem is the encoding of normal maps. The components of normals are in the range -1 to 1, while those of colors are in the range 0 to 1. The most common solution is to map all normal components directly from -1..1 to 0..1, using one color component for each normal component. Given the choice of which normal component maps to which color component, as well as whether or not the result is flipped, there are many different ways normals can be mapped. The Drag[en]gine requires the normals to be encoded in the following way to work properly.
|Normal Component|-1 Normal|0 Normal|1 Normal|Color Component|
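 
The mapping from -1..1 to 0..1 itself boils down to a scale and offset. The following Python snippet only illustrates that generic mapping; the exact assignment of normal components to color components and any flipping follow the table above.

<code python>
def encode_normal(n):
    """Map normal components in -1..1 to color components in 0..1."""
    return tuple(c * 0.5 + 0.5 for c in n)

def decode_normal(color):
    """Map color components in 0..1 back to normal components in -1..1."""
    return tuple(c * 2.0 - 1.0 for c in color)

# The flat tangent space normal encodes to the typical light blue normal map color.
print(encode_normal((0.0, 0.0, 1.0)))  # (0.5, 0.5, 1.0)
print(decode_normal((0.5, 0.5, 1.0)))  # (0.0, 0.0, 1.0)
</code>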
  
===== Blender3D GBuffer NormGen Script =====

The Blender3D export scripts contain a GBuffer based normal generator script. This script converts normals generated by the Blender baking process into normals that work with the Drag[en]gine. Normal map generators in general have a problem with the middle value: a (0, 0, 1) normal converts to 127.5. Slight irregularities result in slightly different color values being chosen for nearly the same encoded normal. Not visible to the naked eye, they turn into artifacts while rendering. Texture compression in particular reacts badly to such small irregularities in normal maps.
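 
A small illustration of the middle value problem, assuming a straightforward quantization of the encoded value to 8 bits (the exact rounding a given tool uses may differ):

<code python>
# The x and y components of a flat (0, 0, 1) normal encode to 0.5, which is
# 127.5 on an 8 bit scale. Nearly identical normals therefore flip between
# the stored bytes 127 and 128, which shows up as rendering artifacts,
# especially after texture compression.
for x in (-0.0005, 0.0, 0.0005):          # nearly identical normal x components
    byte = round((x * 0.5 + 0.5) * 255)   # quantize the encoded value to 8 bits
    print(x, byte)                        # prints 127, 128 and 128
</code>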
  
  
====== Physics Module ======

The **normal** texture property can be used for **[[gamedev:particleemitters|particles]]** to influence their bouncing direction. This aligns the physical response of small particles bouncing off surfaces with the graphical representation of the material.
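 
As a rough sketch of the idea (not the Physics module's actual code), the bounce follows the standard reflection formula around the normal sampled from the material rather than the flat face normal; the sample is assumed to already be transformed out of tangent space as shown earlier:

<code python>
import numpy as np

def bounce_direction(velocity, surface_normal):
    """Reflect a particle velocity around the given surface normal. Using the
    normal map perturbed normal instead of the flat face normal scatters the
    bounces according to the visible surface detail."""
    n = np.asarray(surface_normal, dtype=float)
    n /= np.linalg.norm(n)
    v = np.asarray(velocity, dtype=float)
    # Standard reflection: r = v - 2 (v . n) n
    return v - 2.0 * np.dot(v, n) * n
</code>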
  
  
====== Examples ======

<WRAP column 45%>
<WRAP center box 450px>