Originally Posted By: Hummel
I thought the idea is to speed up directional lighting by calculating the lighting for "all" possible view space normals only once per frame, e.g. illuminating a "sphere" in view space which gives you the material/lighting map.


The idea of MatCap'ing is not the speedup, but rather to "bake" complex lighting and material properties, produced e.g. in 3DS Max. In the following picture, I assigned a concrete material to a sphere above a grass field, added a sun with a winterish light (cold blue), and a backlight with a warm feel. When projected onto the geometry, the complex lighting is preserved:



You can also do this in realtime by rendering a sphere, e.g. in front of the camera, and using that as a lookup for the ambient term, as I do here in a static fashion. -- The speedup is obvious, of course. The question is whether it looks fine, because the point/directional light and specular directions no longer correspond for the objects; that is why, if applied to everything, this is only suited for the ambient term. If you create local lightprobes, though, you could do a multitexture lookup and blend them according to the object's distance to each probe. But this might create some overhead if done in realtime, which I don't want to do.
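To make the two lookups concrete, here is a minimal CPU-side sketch; the names matcap_lookup/matcap_blend and the nearest-neighbor sampling are my own for illustration, not any engine API, and in a shader the same math would run per pixel with a texture sampler:

Code:
/* Minimal sketch: sample a baked lit-sphere ("MatCap") image with a
   view-space normal, and blend two local light probes by distance. */
typedef struct { float x, y, z; } vec3;
typedef struct { float r, g, b; } rgb;

/* Look up the baked lighting/material color for one view-space normal. */
static rgb matcap_lookup(const rgb *matcap, int w, int h, vec3 n_vs)
{
    float u = n_vs.x * 0.5f + 0.5f;       /* remap x from [-1,1] to [0,1]          */
    float v = 0.5f - n_vs.y * 0.5f;       /* remap y, flipped for top-left origin  */
    int px = (int)(u * (w - 1) + 0.5f);   /* nearest-neighbor sample               */
    int py = (int)(v * (h - 1) + 0.5f);
    return matcap[py * w + px];
}

/* Blend two probe lookups by the object's distance to each probe. */
static rgb matcap_blend(rgb a, rgb b, float dist_a, float dist_b)
{
    float t = dist_a / (dist_a + dist_b); /* 0 = at probe A, 1 = at probe B */
    rgb out = { a.r + (b.r - a.r) * t,
                a.g + (b.g - a.g) * t,
                a.b + (b.b - a.b) * t };
    return out;
}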
