Sunday, October 7, 2012

ISMAR 2011

Having nice global illumination between real and virtual objects is not enough to create a plausible illusion. There are still some important issues that make virtual objects easy to distinguish from real objects:
  • The tracking of virtual objects often jitters or, worse, gets lost completely
  • The colors of virtual objects do not match the colors of the surrounding real objects
  • The virtual objects have a perfect appearance while the real objects don't
  • Camera artifacts distort the real objects, but normally not the virtual ones, as those are rendered on top of the video image
So in order to get a plausible illusion, we have to deal with these problems. In our paper "Adaptive Camera-Based Color Mapping For Mixed-Reality Applications" we tried to find an automatic method that applies the color mapping characteristics of the camera to the virtual objects.
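
To give a rough idea of what such a color adaptation can look like, here is a heavily simplified sketch: measure the per-channel color statistics of the camera frame and shift/scale the rendered virtual pixels to match them (a Reinhard-style transfer). This is only my illustration under assumptions, not the method from the paper, and all names (Image, transferColors, ...) are made up.

```cpp
// Simplified sketch only: global per-channel color transfer from the camera
// frame to the rendered virtual pixels. Not the actual method of the paper;
// all names here are illustrative.
#include <cmath>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<float> rgb; // interleaved RGB, size = width * height * 3
};

// Per-channel mean and standard deviation of an image.
static void meanStd(const Image& img, float mean[3], float stddev[3]) {
    const int n = img.width * img.height;
    for (int c = 0; c < 3; ++c) { mean[c] = 0.0f; stddev[c] = 0.0f; }
    for (int i = 0; i < n; ++i)
        for (int c = 0; c < 3; ++c) mean[c] += img.rgb[3 * i + c];
    for (int c = 0; c < 3; ++c) mean[c] /= n;
    for (int i = 0; i < n; ++i)
        for (int c = 0; c < 3; ++c) {
            const float d = img.rgb[3 * i + c] - mean[c];
            stddev[c] += d * d;
        }
    for (int c = 0; c < 3; ++c) stddev[c] = std::sqrt(stddev[c] / n);
}

// Shift and scale the rendered virtual pixels so that their per-channel
// statistics match those of the camera frame.
void transferColors(const Image& cameraFrame, Image& renderedVirtual) {
    float camMean[3], camStd[3], renMean[3], renStd[3];
    meanStd(cameraFrame, camMean, camStd);
    meanStd(renderedVirtual, renMean, renStd);
    const int n = renderedVirtual.width * renderedVirtual.height;
    for (int i = 0; i < n; ++i)
        for (int c = 0; c < 3; ++c) {
            const float scale = renStd[c] > 1e-6f ? camStd[c] / renStd[c] : 1.0f;
            float& v = renderedVirtual.rgb[3 * i + c];
            v = (v - renMean[c]) * scale + camMean[c];
        }
}
```

In practice one would of course restrict the statistics to suitable image regions and update them over time, but the core idea of matching the camera's color characteristics stays the same.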


New Zealand

I was lucky enough to spend an awesome time in Christchurch, New Zealand at the famous HITLabNZ. During my three-month stay we developed a small user study that analyzed the influence of our differential instant radiosity method on the execution of specific tasks in mixed reality scenarios. However, for the chosen tasks no significant differences could be measured.

A big thanks goes to all the people at the HITLabNZ for such a great time!!!

Writing more about New Zealand is pretty much useless - you should pack your stuff and go there: IT IS ABSOLUTELY AMAZING!!! :-)



ISMAR 2010

The first work we tried to publish was called "Differential Instant Radiosity for Mixed Reality". On this paper Christoph Traxler, Werner Purgathofer, Michael Wimmer and I worked on a real-time global illumination rendering system for mixed reality scenarios. My master thesis had already dealt with real-time global illumination using a method called instant radiosity, which basically places virtual point lights at the positions where light hits a surface.
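
To make that last sentence a bit more concrete, here is a minimal sketch of the instant radiosity idea: shoot random rays from the light source and place a virtual point light (VPL) wherever a ray hits a surface; the scene is later shaded by summing the contributions of all VPLs. This is only an illustration with made-up names (Ray, Hit, generateVPLs, ...), not our actual renderer, and the scene intersection is simply passed in as a callback.

```cpp
// Sketch of VPL generation for instant radiosity. Illustrative only; the
// intersection routine is supplied by the caller and all names are made up.
#include <cmath>
#include <functional>
#include <random>
#include <vector>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };
struct Hit  { Vec3 position, normal, albedo; };

struct VPL {
    Vec3 position, normal;
    Vec3 flux; // light power carried (and re-emitted) by this VPL
};

std::vector<VPL> generateVPLs(const Vec3& lightPos, const Vec3& lightPower,
                              int numRays,
                              const std::function<bool(const Ray&, Hit&)>& intersect) {
    std::vector<VPL> vpls;
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> u(-1.0f, 1.0f);
    for (int i = 0; i < numRays; ++i) {
        // Sample a random direction on the unit sphere (rejection sampling).
        Vec3 d; float lenSq;
        do {
            d = { u(rng), u(rng), u(rng) };
            lenSq = d.x * d.x + d.y * d.y + d.z * d.z;
        } while (lenSq > 1.0f || lenSq < 1e-6f);
        const float inv = 1.0f / std::sqrt(lenSq);
        d = { d.x * inv, d.y * inv, d.z * inv };

        Hit hit;
        if (!intersect({ lightPos, d }, hit)) continue; // ray left the scene

        // The VPL re-emits the reflected portion of the light it received,
        // averaged over the number of rays shot from the light source.
        VPL vpl;
        vpl.position = hit.position;
        vpl.normal   = hit.normal;
        vpl.flux     = { lightPower.x * hit.albedo.x / numRays,
                         lightPower.y * hit.albedo.y / numRays,
                         lightPower.z * hit.albedo.z / numRays };
        vpls.push_back(vpl);
    }
    return vpls; // each visible surface point is then lit by all of these VPLs
}
```
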
We submitted our work to the ISMAR 2010 conference held in Seoul, South Korea, and were happy to hear that it got accepted. We also presented our method at a demo booth and were very happy again about the very positive feedback from the people there. And then, to our total surprise, we won the "Best Paper Award" for our method - YEAH!!! :-)


Main Features:
  • Real and virtual objects cast shadows onto each other (a rough compositing sketch follows below)
  • Real and virtual objects may cause indirect illumination onto each other
  • Real and virtual spot light sources are supported
  • Multiple light bounces
Limitations:
  • Real geometry must be pre-modeled
  • Double shadowing artifacts may occur
  • Inconsistent color bleeding may occur
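
The compositing behind the features above follows the differential rendering idea: the modeled real scene is rendered once without and once with the virtual objects, and only the difference between the two global illumination solutions is added to the camera image. Here is a minimal per-pixel sketch under my own simplifying assumptions about the buffer layout; it is not the actual system.

```cpp
// Sketch of the differential rendering compositing step. Buffer layout and
// names are illustrative assumptions, not the actual implementation.
#include <algorithm>
#include <vector>

// All buffers are linear RGB, interleaved, and have the same resolution.
void compositeDifferential(const std::vector<float>& cameraFrame,
                           const std::vector<float>& renderedRealOnly,    // L_real
                           const std::vector<float>& renderedWithVirtual, // L_mixed
                           std::vector<float>& output) {
    output.resize(cameraFrame.size());
    for (std::size_t i = 0; i < cameraFrame.size(); ++i) {
        // Virtual objects only change the result where they add or block light
        // (their own pixels, shadows, color bleeding); elsewhere the difference
        // is close to zero and the camera pixel passes through untouched.
        const float delta = renderedWithVirtual[i] - renderedRealOnly[i];
        output[i] = std::clamp(cameraFrame[i] + delta, 0.0f, 1.0f);
    }
}
```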

What happened so far...

Okay, so my master thesis was done and the blog - while never really alive - went completely dead.

However, a lot has happened since 2009. One big change was the start of my PhD (doctoral) studies at the Vienna University of Technology. I got the chance to work on a very interesting project called RESHADE, which stands for "Reciprocal Shading for Mixed Reality". The main goal of this project is to develop methods that make virtual objects indistinguishable from real objects in mixed reality scenarios. You can find a more detailed description here: http://www.cg.tuwien.ac.at/projects/RESHADE/

Since I started working on this project I have also managed to publish some work, which I will present in follow-up posts...