Nov 21 2012

In the picture above I am visualizing the normals of each hit. It uses multisampling and a purely diffuse material, so some rays get scattered into space (the box is missing one side). Note that the back wall is missing: the rectangle had its points stored in the wrong order, so the normal pointed out the opposite side. The fixed back wall can be seen below.

Now that I can scatter rays and have material properties, I can do actual path tracing. The image below uses essentially Lambertian surfaces, hence the absorption is proportional to $\cos(\phi)$ and only diffuse scattering is used. I increased the number of samples from 20 to 50, which turns out to make the image darker since more rays end up in space without hits.

Nov 07 2012

A doubly inverted y axis caused confusion, as did a stubborn bug in the triangle intersection code that I only found after looking over the code four times: a plus should have been a minus!

But now adding a rectangle object is trivial (the screenshot uses multisampling with 10 samples).

Nov 07 2012

Some documentation of how my work progressed:

Initially, after setting up a window to draw in, my first attempt to render a sphere looked very interesting:

I had no clue what was wrong and suspected for a very long time that my math was off. But after rederiving everything and comparing to other sources I could not find any errors. So even six revisions later the picture had not changed much:

It turned out to be an issue with types: I was still using integers instead of floats in some places, so some calculations were rounded and some 8-bit integers regularly overflowed, creating these patterns. With this fixed, in r11 I was finally able to display a simple sphere.

At this point rendering was still quite slow, even with only one sphere to display. The culprit was a naive multithreaded architecture that passed every single pixel between threads. So in r14 everything looked the same, but the architecture was much improved.

After some more reengineering I implemented bounding boxes, which look quite boring when visualized:

Next came some simple shading based on the distance to the eye; notice how the top left sphere is closer to the screen. At this point the shading is just a hack and not physically based at all.

Next was implementing multisampling. Right now the samples are just uniformly distributed around the center of each pixel. Note that the shading is turned off again; before, it produced quite aliased pictures.

Rendering became much slower due to multisampling, so next up was a bounding volume hierarchy. The next scene renders 288 objects in the same time the previous scene rendered 3:

Very subtle mistakes are hard to find, but produce these interesting artifacts. Notice how the center of the previous picture is square, although the blue mass is composed of spheres. Taking one wrong branch in the BVH code caused this; with some other colors, the squares of spheres are now distinguishable and free of artifacts:

The partially rendered image can be displayed while still in progress, which combined with multithreading is much more visually pleasing:

Visualizing the reflection vector by shading based on the cosine between the hit direction and the normal:

The first attempt at recursive ray tracing introduced very interesting visual bugs. Here the light source is a sphere in the center of the screen, close to the eye in front of the image plane. At the time it was unclear to me what caused the square to appear; it turned out later that there was still a bug in the intersection code.

Finally implementing a different shape, the triangle, proved to be difficult. At some angles it completely fails to render, and the following scene is not what you would expect: the triangle on the left differs from the one on the right by only one vertex (the bottom left one is moved to the left). The others remain the same, yet it renders as if the whole triangle had been moved to the left.

A Blinn-Phong shader that helped to debug inverted reflection vectors:

Something is still odd with the recursive rays, though. The next picture shows the rays that hit an object after one reflection, again with a light source at the center between eye and image plane:

Earlier I had fixed a bug where reflected rays wrongly obeyed the same range limit as initial rays, but in that fix I also allowed recursive rays to go backwards, which is what caused these false hits in the negative direction.

Now, with a big light source in front of the image plane and a shader that again visualizes the cosine between hit direction and normal, we see the three spheres lit up to a cut-off point where the rays essentially go off to the sides of the image instead of being reflected back towards the image plane. That is what causes the black halos around the spheres.

Sep 23 2012

The biggest source of confusion when getting this to work was that you can only manipulate the UEFI boot loader when booted in UEFI mode. Since my existing Debian installation, as well as the installer CD, booted in BIOS emulation mode, I failed to set up the boot loader at first. The key is the EFI shell, which I can conveniently access on my ASUS motherboard by hitting Exit in EZ mode to get to Advanced mode, then hitting Exit again.


Update: For the proper kernel configuration, see https://www.kernel.org/doc/Documentation/x86/x86_64/uefi.txt. Starting with 3.7 I got no kernel output anymore without these settings.

Sep 05 2012

Install Go

http://golang.org/doc/install/source or http://code.google.com/p/go/downloads/list

Building from source is basically:

hg clone -u release https://code.google.com/p/go
cd go/src && ./all.bash
export GOROOT=.../go
# see http://stackoverflow.com/questions/7970390/what-should-be-the-values-of-gopath-and-goroot
export GOBIN=$GOROOT/bin


go get -u github.com/nsf/gocode


# make sure you have sdl library + headers installed
apt-get install libsdl1.2-dev libsdl-mixer1.2-dev libsdl-image1.2-dev libsdl-ttf2.0-dev
# install the go library
go get -v github.com/0xe2-0x9a-0x9b/Go-SDL/...




LaTeX test: $l(t) = s + t \cdot d$

Jul 24 2012

So in this day and age, where webmail clients are a common way to access email, why are some people still using a native email client? Maybe they are used to it, or maybe they like the different usability options these clients offer. After all, you can customize an application more easily than a website, where any change potentially impacts your entire user base.

But then how is it possible that these applications, which have been around for a long time, can't get right what seem to be basic options for the power user? I for one would like a sensible default sort order in Thunderbird: sort by the order in which email was received, which lets me quickly notice new emails, while also showing a threaded view to accommodate mailing list threads.

This is not an unusual configuration; Gmail, for example, works pretty much exactly like that. So how can this configuration be so difficult to get right in Thunderbird? There is a bug to this effect open since 2004! That is a sad state for desktop applications.


P.S. Yes, it is open source and we can all contribute, but anyone who can program knows that this is not so easy to do for such a large software project.