A few months ago I read an article from Intel's research group called Morphological Antialiasing.
The technique was designed for the CPU, but with a few tricks and hacks we can use it on the GPU as well.
The algorithm consists of 3 main steps:
1. find discontinuities between pixels
2. identify predefined patterns
3. blend pixels in the neighborhood of the patterns found in step 2
1. Simple edge detection on the image, using depth or color differences, should do the job. Keep in mind that you should encode the edge type in your color channels so you can use it in step 2; let's say red marks horizontal edges and green marks vertical edges.
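A minimal CPU sketch of this pass, assuming a grayscale image as a 2D list of floats in [0, 1]; the threshold value and the function names are mine, not from the article:

```python
# Step 1 sketch: mark discontinuities into (red, green) channels.
# THRESHOLD is a hypothetical tuning parameter.
THRESHOLD = 0.1

def detect_edges(img):
    """Return an edges map of (red, green) per pixel: red marks a
    discontinuity with the pixel above (horizontal edge), green a
    discontinuity with the pixel to the left (vertical edge)."""
    h, w = len(img), len(img[0])
    edges = [[(0.0, 0.0)] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            red = 1.0 if y > 0 and abs(img[y][x] - img[y - 1][x]) > THRESHOLD else 0.0
            green = 1.0 if x > 0 and abs(img[y][x] - img[y][x - 1]) > THRESHOLD else 0.0
            edges[y][x] = (red, green)
    return edges

# tiny example: a 2x2 image with a hard horizontal boundary
img = [[0.0, 0.0],
       [1.0, 1.0]]
print(detect_edges(img)[1][0])  # (1.0, 0.0) -> horizontal edge below the boundary
```

In the real thing this would of course be a fragment shader writing to a render target, with the same red/green encoding.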
2. This is the tricky part. You basically need to identify a few shapes: Z, U and L.
See the image below (grabbed from the original Intel article):
Based on the article, you only need to identify L shapes, since Z and U shapes can be split into L shapes.
For more in-depth information please refer to the original article, which can be found here:
To identify L shapes there are a few tricks. A simple one is to just follow the edges you marked in step 1 and see if you get a match (a few loops for each side: left/right/top/bottom, and of course branching). If you do, you compute blend weights for those pixels and continue to the next edges.
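The "follow the edge" search can be sketched like this, using the (red, green) edges map from step 1. The helper names are mine, and only the horizontal direction is shown; the vertical case is symmetric:

```python
# Step 2 sketch: walk along a horizontal edge run, then check for a
# crossing vertical edge at either end, which indicates an L shape.

def horizontal_run(edges, x, y):
    """Return (start_x, end_x) of the horizontal edge run containing (x, y)."""
    w = len(edges[0])
    left = x
    while left > 0 and edges[y][left - 1][0] == 1.0:
        left -= 1
    right = x
    while right < w - 1 and edges[y][right + 1][0] == 1.0:
        right += 1
    return left, right

def crossing_at(edges, x, y):
    """True if a vertical edge (green channel) crosses at this pixel."""
    return edges[y][x][1] == 1.0

# a 3-pixel horizontal run with a vertical edge crossing at its left end
edges = [
    [(0.0, 0.0)] * 4,
    [(1.0, 1.0), (1.0, 0.0), (1.0, 0.0), (0.0, 0.0)],
    [(0.0, 0.0)] * 4,
]
print(horizontal_run(edges, 1, 1))   # (0, 2)
print(crossing_at(edges, 0, 1))      # True -> L shape at the left end
```

On the GPU these loops become the branching texture fetches mentioned above, one walk per side.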
At the end you end up with a blend weights texture, so you can blend the pixels to get the final image. You can use the trick described in GPU Pro 2: they encode the final weights in textures and sample them.
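As a worked example of where such weights come from, here is one common model (my sketch, not the exact GPU Pro 2 formulation): the revectorized silhouette is a line falling from the midpoint of the crossing edge (height 0.5) to the far end of an L shape, and each pixel's weight is the trapezoid area the line cuts out of it:

```python
# Blend-weight sketch: coverage area of pixel i (0-based) under a line
# going from height 0.5 at x = 0 down to 0.0 at x = length.

def blend_weight(i, length):
    h0 = 0.5 * (1.0 - i / length)        # line height entering the pixel
    h1 = 0.5 * (1.0 - (i + 1) / length)  # line height leaving the pixel
    return 0.5 * (h0 + h1)               # trapezoid area

# a run of length 2: the pixel nearer the crossing edge blends more
print([blend_weight(i, 2) for i in range(2)])  # [0.375, 0.125]
```

These are exactly the kinds of values you can precompute into a lookup texture indexed by (distance, run length), which is the GPU Pro 2 trick.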
btw: if you have an ATI HD 6850+ you get built-in driver support for MLAA, so no need to worry; for consoles you may want to worry a little ;)
This technique isn't simple to implement on the first shot; I tried a few algorithms and techniques before I got this thing working.
After seeing the demo from GPU Pro 2, I've got to say I was impressed by the speed of their implementation, so I put some tricks into mine as well to win back the missing cycles :)
Optimization tip: in the edge detection pass, use the stencil buffer to mark the edge pixels; then in the next step, use the stencil to apply your "massive" shapes/blend weights shader only on edge pixels. This way you won't waste power on irrelevant pixels.
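The payoff of this tip is easy to see with a CPU simulation of the masking, where the stencil test becomes a simple per-pixel check (names and numbers here are illustrative):

```python
# Stencil-trick sketch: run the expensive shader only where the edge
# mask is set, and count how many pixels actually get processed.

def run_masked(mask, expensive_shader):
    processed = 0
    for y, row in enumerate(mask):
        for x, flagged in enumerate(row):
            if flagged:
                expensive_shader(x, y)
                processed += 1
    return processed

mask = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(run_masked(mask, lambda x, y: None))  # 2 of 9 pixels touched
```

On real hardware the stencil test rejects the failing pixels before your shader even runs, which is what saves the cycles.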
Here are a few screenshots of the main steps and the result:
As you can see, this technique gives very good results.
Extra: a few other techniques you should check out:
* GPAA - shown at Humus's site
* FXAA - NVIDIA SDK 11 (looks pretty good)
cya until next time...