Other research

Apart from fighting with geometry shaders, I work(ed) on several other research topics related to textures in virtual 3D city models.

Texture-based Ambient Occlusion

[Image: a nice sunset]

Our ambient occlusion implementation can produce such nice sunsets!

This piece of work started in cooperation with my former colleague Henrik Buchholz. The goal was an implementation of ambient occlusion suitable for 3D city models. Due to the unpredictable geometric quality of such models, a vertex-based implementation could not produce consistent quality. Instead, we had to bake light maps for the full model. Blender, Maya & Co. have been able to do this trick for a while. But we wanted to produce the light maps on the graphics card, without the help of a ray tracer.

In the end, we had to find solutions to a multitude of problems: evenly distributed sampling of a hemisphere, surface parameterization, texture atlas packing, robust shadow mapping, atlas mipmapping, and scene graph management. And since this was not enough, we added color and direct sunlight :-).
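The first of those problems, evenly distributed hemisphere sampling, can be sketched quite compactly. The snippet below uses cosine-weighted hemisphere sampling (Malley's method) and a Monte Carlo loop; the `occluded` visibility test is a hypothetical stand-in for whatever the renderer uses (in our case, shadow-map lookups on the GPU):

```python
import math
import random

def cosine_hemisphere_sample(u1, u2):
    """Map two uniform [0,1) numbers to a direction on the unit
    hemisphere around +Z, cosine-weighted: sample a disk uniformly,
    then project the point up onto the hemisphere."""
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    x = r * math.cos(phi)
    y = r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))
    return (x, y, z)

def ambient_occlusion(occluded, n_samples=256, rng=random.random):
    """Monte Carlo AO estimate for one surface point: the fraction of
    cosine-weighted hemisphere directions that are NOT blocked.
    `occluded` is a hypothetical visibility test (e.g. a shadow-map
    lookup in a GPU implementation)."""
    visible = 0
    for _ in range(n_samples):
        d = cosine_hemisphere_sample(rng(), rng())
        if not occluded(d):
            visible += 1
    return visible / n_samples
```

The cosine weighting matters because diffuse lighting already weights incoming directions by the cosine of the incidence angle; sampling that way keeps the estimator's variance low.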

Surface properties in CityGML

[Images: summer and winter themes for LoD1; summer and winter themes for LoD2]

CityGML can provide multiple themes for a single data set. Here, a "summer" (left) and a "winter" (right) theme have been defined. Within a theme, different representations (levels of detail) of a single object can have differing appearances.

CityGML is an international standard for the exchange and storage of rich virtual 3D city models. It has been issued by the Open Geospatial Consortium (OGC). I am the representative of my institute in the originating Special Interest Group 3D (SIG 3D) of the initiative Geodata Infrastructure North Rhine-Westphalia (GDI NRW) in Germany. Within this group, I worked on the development of the appearance module. It enables the use of textures and materials in CityGML.

The particular challenge wasn't so much supporting loads of features - CityGML intentionally does not support shaders, lights, or cameras - but the proper integration into the framework set by GML, the Geography Markup Language. The result benefits visualization, as textured city models can now be more than just geometry, and geoinformation, since city models can now be more than just color-mapped.
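To give a flavor of the theme mechanism described above, here is a heavily trimmed sketch of an appearance definition, parsed with Python's standard XML library. The element names follow the CityGML 1.0 appearance module, but treat the exact structure (and the `#wall_42` target) as illustrative only; consult the OGC schema for the real thing:

```python
import xml.etree.ElementTree as ET

# Hypothetical, trimmed-down appearance fragment: one "summer" theme
# with a parameterized texture targeting one wall surface.
APP = "http://www.opengis.net/citygml/appearance/1.0"
snippet = f"""
<Appearance xmlns="{APP}">
  <theme>summer</theme>
  <surfaceDataMember>
    <ParameterizedTexture>
      <imageURI>facade_summer.jpg</imageURI>
      <target uri="#wall_42"/>
    </ParameterizedTexture>
  </surfaceDataMember>
</Appearance>
"""

root = ET.fromstring(snippet)
theme = root.find(f"{{{APP}}}theme").text
targets = [t.get("uri") for t in root.iter(f"{{{APP}}}target")]
# A viewer would activate one theme at a time and look up, per
# surface, which texture (if any) the active theme assigns to it.
```

A second `Appearance` element with `<theme>winter</theme>` would carry the winter textures for the same surfaces, which is exactly the summer/winter pair shown above.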

Facade extraction from aerial imagery

[Image: mapped buildings]

One HRSC channel mapped to some buildings. Due to the rather small oblique angle, the texture quality becomes quite low.

A core question relating to textures in city models is where to get them from. Ortho images have become a commodity these days and can be bought by the square kilometer. But what if you want facade textures, and you don't want to walk through the streets taking pictures and rectifying them building by building? One idea is to use the same input as for ortho image creation, but keep the perspective pieces that normally get tossed. This is what I tried a few years back with High Resolution Stereo Camera (HRSC-AX) imagery. This multi-spectral camera uses several pushbroom image sensors. To use such images as projective textures, we needed to implement the pushbroom camera model.
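What makes the pushbroom model different from a pinhole camera is that each scan line is its own one-dimensional perspective image, while the second image axis comes from the sensor's motion. A minimal sketch of an idealized linear pushbroom projection, assuming a level flight along +X at constant speed (all parameter values are hypothetical):

```python
def linear_pushbroom_project(p, f=1.0, h=1000.0, v=1.0):
    """Project a 3D point with an idealized linear pushbroom camera
    flying along +X at height h with ground speed v.
    Along-track: orthographic -- the scan line index is simply the
    time at which the sensor line passes over the point.
    Cross-track: ordinary perspective within each scan line, with
    focal length f.
    Returns (scanline, pixel) image coordinates."""
    X, Y, Z = p
    depth = h - Z            # vertical distance below the camera
    if depth <= 0:
        raise ValueError("point at or above the flight path")
    scanline = X / v         # orthographic along the flight direction
    pixel = f * Y / depth    # perspective across the flight direction
    return scanline, pixel
```

The mixed orthographic/perspective nature is exactly why such images cannot be applied as ordinary projective textures without implementing this model (or a per-scan-line approximation of it) in the texturing pipeline.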

Multiperspective views of 3D city models

[Images: pedestrian's view deformation; bird's eye view deformation]

Our multi-perspective views fuse two perspectives seamlessly in one image. The intention is to provide depth-impression and a map-like view simultaneously. We present two flavors: the pedestrian's view deformation (left) and the bird's eye view deformation (right).

Ok. There is this nice, large city model of yours with you standing right in there. But all you see are the five closest buildings and lots of sky. Not very useful. A similar problem and its solution can be seen in current navigation systems: the screen can show only a limited portion of the map (or 3D city model), even when seen at an oblique angle. The solution is bending the terrain to keep the horizon in view at all times. Together with my colleagues Matthias Trapp and Markus Jobst, I implemented this effect and generalized it.

In our approach, we consider the terrain as two rigid pieces connected by a bendable zone. This provides the viewer with two seamlessly connected linear perspectives of the city model. We not only allow for bending down, which is useful for a bird's eye view, but also for bending up, which helps in a pedestrian's view setting. The resulting images then combine both depth impression and a map-like view.
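The two-rigid-pieces idea can be sketched as a simple 1D height deformation over viewing distance. In this hypothetical version (the distances, blend function, and bend angle are illustrative, not the parameters from our implementation), points nearer than `d0` stay on the flat near piece, points beyond `d1` lie on a second rigid piece tilted by `angle`, and the zone in between blends smoothly:

```python
import math

def bend_height(d, d0=100.0, d1=200.0, angle=math.radians(35.0)):
    """Height offset applied to a terrain vertex at viewing
    distance d.  Near piece (d <= d0): untouched.  Far piece
    (d >= d1): a rigid plane tilted by `angle`.  In between: a
    smoothstep blend across the bendable zone.  All parameter
    values here are hypothetical."""
    if d <= d0:
        return 0.0
    t = min(1.0, (d - d0) / (d1 - d0))
    s = t * t * (3.0 - 2.0 * t)       # smoothstep blend weight
    # Negative offset bends the far piece down (bird's eye view);
    # flipping the sign bends it up (pedestrian's view).
    return -math.tan(angle) * (d - d0) * s
```

In a GPU implementation this kind of deformation would run per vertex in a shader; the two linear pieces are what preserve the two connected linear perspectives in the final image.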

All images and content copyright 2005-2011 by Haik Lorenz. No reproduction without permission.
Contact: Haik Lorenz, contact@haik-lorenz.de