White Light equals Brown Noise

Here is a recent shower thought:

It is usually said that the colors of noise are inspired by the spectral distributions of the corresponding colors of light. For example, the ‘white’ in white noise is an allusion to white light, which is thought to have a (mostly) flat spectrum. But is this so? What does white light actually look like, as an electromagnetic wave?

So I made the following figure, which shows some power-law distributions over the visible range of wavelengths, colored by their theoretical appearance:

Fig 1: Spectral distributions with exponents from 0 to −4.

The color of the flat line is also known as Standard Illuminant E – or equal-energy white light. Compared to the D65 white background it has a rather pinkish appearance with a correlated color temperature of about 5500 K.
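As a side note that connects to the title: a spectrum that is flat per unit wavelength is not flat per unit frequency. The two densities are related by the standard change of variables

\[
S_\nu(\nu) \;=\; S_\lambda(\lambda)\,\left|\frac{d\lambda}{d\nu}\right| \;=\; S_\lambda(\lambda)\,\frac{c}{\nu^2},
\]

so equal-energy white light (exponent 0 over wavelength) corresponds to a \( \nu^{-2} \) falloff over frequency – the same exponent as brown (red) noise.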


Followup to Atmospheric Scattering – Part 1: Overview

This post is the first in a series to follow up on my 2012 GPU Pro 3 article about atmospheric scattering [11]. What I showed there was a full single-scattering solution for a planetary atmosphere running in a pixel shader, dynamic and in real time, without pre-computation or simplifying assumptions. The key to this achievement was a novel and efficient way to evaluate the Chapman function [2], hence the title. In the time since then I have improved on the algorithm and extended it to include aspects of multiple scattering. The latter causes horizontal diffusion (twilight situations) and vertical diffusion (deep atmospheres), and neither can be ignored for a general atmosphere renderer in a space game, for example.
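For readers who have never met it: the Chapman function is the ratio of slant-path to vertical optical depth in an exponentially stratified atmosphere. With the observer radius expressed in scale heights, x = r/H, and zenith angle χ, its textbook definition can be written as

\[
\operatorname{Ch}(x,\chi) \;=\; \int_0^\infty \exp\!\left(x - \sqrt{x^2 + 2xt\cos\chi + t^2}\,\right) dt ,
\]

which gives Ch(x, 0) = 1 for a vertical ray and grows to roughly \( \sqrt{\pi x/2} \) toward the horizon for large x. This is only the standard definition, given here for orientation; the article's contribution is an efficient way to evaluate it, which is not reproduced here.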

I have written a Shadertoy that reflects the current state of affairs. It's a mini flight simulator that also features clouds and other rendering goodies. A WebGL 2-capable browser is needed to run it. Under Windows, the ANGLE/Direct3D translator may take a long time to compile it (up to a minute is nothing unusual, but it runs fast afterwards). When successfully compiled it should look like this:

Elite Dangerous: Impressions of Deep Space Rendering

I am a backer of the upcoming Elite Dangerous game and have participated in their premium beta programme from the beginning, positively enjoying what was there at that early stage. ‘Premium beta’ sounds like an oxymoron, paying a premium for an unfinished game, but it is nothing more than purchasing the same backer status as that from the Kickstarter campaign.

I came into contact with the original Elite during Christmas 1985. Compared with the progress I made back then in just two days, my recent performance in ED is lousy; I think my combat rating now would be ‘competent’.

But this will not be a gameplay review; instead I'm going to share thoughts that were inspired while playing ED, mostly about graphics and shading – things like dynamic range, surface materials, phase curves, ‘real’ photometry, and so on. So, after I loaded the game and jumped through hyperspace for the first time (actually the second time), I was greeted by this screen-filling disk of hot plasma:



Journey into the Zone (Plates)

I have experimented recently with zone plates, which are the 2-D equivalent of a chirp. Zone plates make for excellent test images for detecting deficiencies in image-processing algorithms or in display and camera calibration. They have interesting properties: each point on a zone plate corresponds to a unique instantaneous wave vector, and, like a Gaussian, a zone plate is its own Fourier transform. A quick image search (Google, Bing) turns up many results, but I found all of them more or less unusable, so I made my own.

Zone Plates Done Right

I made the following two 256×256 zone plates, which I am releasing into the public domain so anyone can use them freely.

Cosine zone plate with contrast weighting (CC0)

Sine zone plate with contrast weighting (CC0)
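For orientation, here is a minimal sketch of how a plain cosine zone plate can be generated. It omits the contrast weighting of the released images, and the PGM output is my choice, not taken from the post.

// Plain cosine zone plate, written as an 8-bit PGM.
// Radial phase phi(r) = (pi/N) * r^2, so the instantaneous radial frequency
// is r/N cycles per pixel and reaches Nyquist (0.5) at r = N/2; the corners
// beyond that radius alias, as is usual for such test images.
#include <cmath>
#include <cstdio>

int main() {
    const int    N  = 256;
    const double pi = 3.14159265358979323846;
    std::FILE* f = std::fopen("zoneplate.pgm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P5\n%d %d\n255\n", N, N);
    for (int y = 0; y < N; ++y)
        for (int x = 0; x < N; ++x) {
            double dx = x - N / 2 + 0.5;   // pixel-center coordinates
            double dy = y - N / 2 + 0.5;
            double v  = 0.5 + 0.5 * std::cos(pi / N * (dx * dx + dy * dy));
            unsigned char p = (unsigned char)(v * 255.0 + 0.5);
            std::fwrite(&p, 1, 1, f);
        }
    std::fclose(f);
    return 0;
}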


Yes, sRGB is like µ‑law encoding

I vaguely remember someone making a comment in a discussion about sRGB that ran along the lines of

So then, is sRGB like µ‑law encoding?

This comment was not about the color space itself but about the specific pixel formats nowadays branded as ‘sRGB’. In this case, the answer should be yes. And while the technical details are not exactly the same, that analogy with the µ-law very much nails it.

When you think of sRGB pixel formats as nothing but a special encoding, it becomes clear that using such a format does not automatically make you “very picky of color reproduction”. This assumption was used by hardware vendors to rationalize the decision to limit the support of sRGB pixel formats to 8-bit precision, because people “would never want” sRGB support for anything less. Not true! I'm going to make a case for this later. But first things first.
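To make the analogy concrete, here is a small sketch comparing the standard sRGB transfer function with the µ = 255 µ-law curve; the function names and the comparison itself are mine, not from the original discussion.

// Compare the sRGB transfer function with mu-law companding: both spend more
// code values near zero, where relative sensitivity is highest.
#include <cmath>
#include <cstdio>

// sRGB encode: linear [0,1] -> nonlinear [0,1], piecewise linear/power.
double srgb_encode(double x) {
    return x <= 0.0031308 ? 12.92 * x
                          : 1.055 * std::pow(x, 1.0 / 2.4) - 0.055;
}

// mu-law compression for input in [0,1], mu = 255 as in G.711.
double mulaw_encode(double x, double mu = 255.0) {
    return std::log(1.0 + mu * x) / std::log(1.0 + mu);
}

int main() {
    for (double x = 0.0; x <= 1.0; x += 0.125)
        std::printf("linear %.3f   sRGB %.3f   mu-law %.3f\n",
                    x, srgb_encode(x), mulaw_encode(x));
    return 0;
}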


Followup: Normal Mapping Without Precomputed Tangents

This post is a follow-up to my 2006 ShaderX5 article [4] about normal mapping without a pre-computed tangent basis. In the time since then I have refined this technique with lessons learned in real life. For those unfamiliar with the topic, the motivation was to construct the tangent frame on the fly in the pixel shader, which ironically is the exact opposite of the motivation from [2]:

Since it is not 1997 anymore, doing the tangent space on-the-fly has some potential benefits, such as reduced complexity of asset tools, per-vertex bandwidth and storage, attribute interpolators, transform work for skinned meshes and, last but not least, the possibility to apply normal maps to any procedurally generated texture coordinates or non-linear deformations.
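For orientation only (the post refines this further): the usual starting point for an on-the-fly construction is the chain rule relating the screen-space derivatives of the surface position p and the texture coordinates (u, v) to the unknown tangent T and bitangent B,

\[
\frac{\partial p}{\partial x} = T\,\frac{\partial u}{\partial x} + B\,\frac{\partial v}{\partial x},
\qquad
\frac{\partial p}{\partial y} = T\,\frac{\partial u}{\partial y} + B\,\frac{\partial v}{\partial y},
\]

a 2×2 linear system that a pixel shader can solve per pixel from hardware derivatives, with the interpolated normal N completing the frame.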

Branchless Matrix to Quaternion Conversion

(Edit: This article is a more in-depth write-up of an algorithm that I developed around 2005, and first posted to Martin Baker's Euclidean Space website. That time was the height of the Intel NetBurst architecture, which was notorious for its deep pipeline and high branch misprediction penalty. Hence the motivation to develop a branch-free matrix to quaternion conversion routine. What follows is the complete derivation and analysis of this idea.)

The original routine to convert a matrix to a quaternion was given by Ken Shoemake [1] and is very branchy. There is a way to eliminate these branches and arrive at completely branch-free and highly parallelizable code. The trade-off is the introduction of 3 additional square roots. Jump to the analysis section at the end of this article, or continue first with the math bits.
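As a rough sketch of the general branch-free scheme (component magnitudes from the matrix diagonal, signs copied from off-diagonal differences) – the conventions below, a rotation matrix m[row][col] acting on column vectors and a quaternion returned with w ≥ 0, are my assumptions, and the article's own derivation may differ in detail:

// Branch-free matrix-to-quaternion sketch: four square roots, no branches.
#include <algorithm>
#include <cmath>

struct Quat { double w, x, y, z; };

Quat matrix_to_quat(const double m[3][3]) {
    Quat q;
    // Magnitudes of all four components, read off the diagonal.
    // max() guards against tiny negative values from round-off.
    q.w = 0.5 * std::sqrt(std::max(0.0, 1.0 + m[0][0] + m[1][1] + m[2][2]));
    q.x = 0.5 * std::sqrt(std::max(0.0, 1.0 + m[0][0] - m[1][1] - m[2][2]));
    q.y = 0.5 * std::sqrt(std::max(0.0, 1.0 - m[0][0] + m[1][1] - m[2][2]));
    q.z = 0.5 * std::sqrt(std::max(0.0, 1.0 - m[0][0] - m[1][1] + m[2][2]));
    // With w chosen nonnegative, each vector component carries the sign of
    // the corresponding off-diagonal difference (e.g. m[2][1] - m[1][2] = 4wx).
    q.x = std::copysign(q.x, m[2][1] - m[1][2]);
    q.y = std::copysign(q.y, m[0][2] - m[2][0]);
    q.z = std::copysign(q.z, m[1][0] - m[0][1]);
    return q;
}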


The Blinn-Phong Normalization Zoo

It is good to see how physically based shading is finally gaining momentum in real-time graphics and games. This is something I have been advocating for a long time. Developers are spreading the word. I was especially surprised to learn about Call of Duty: Black Ops joining the club [1]. Even a slick 60 Hz shooter with no cycles to spare can afford to do PBS today!

This leads me to the topic of this post: the normalization of the Blinn-Phong specular highlight. Why am I writing about it? It came to mind recently with the current batch of publications from people adopting physically based shading models. This got me checking the maths again, and I compiled a list of normalization factors for different shading models, given here in this post. I would also like to elaborate a little on the model that I wrote about in ShaderX7 [2]. Be aware that this post is a large brain dump.
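For reference, these are the basic hemispherical integrals from which such normalization factors derive (only the cosine-power lobe itself, not the full list of model-specific factors in the post):

\[
\int_\Omega \cos^n\theta \; d\omega = \frac{2\pi}{n+1},
\qquad
\int_\Omega \cos^n\theta \,\cos\theta \; d\omega = \frac{2\pi}{n+2},
\]

so, for example, a cosine-power lobe weighted by cos θ integrates to one exactly when it carries the familiar factor (n + 2)/(2π).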