Year after year, Blender keeps growing and surprising us. The short film Tears of Steel proves this, largely thanks to one of Blender's newest features: camera tracking.
This article will show some tests with this technology in conjunction with the Python Photogrammetry Toolbox (PPT). First, we attempted to partially reconstruct a scene with PPT and match it with the footage.
Second, we used the tracked camera and imported another scene (a sphinx) so it follows the real camera movement.
Why use PPT instead of modeling over a picture? 1) Because reconstructing over a single picture is subjective and suffers from perspective distortion. 2) Because scanning a complex object can be easier than modeling it (think of a broken statue, or an asymmetric vase). 3) Because making the texture is easier when we use the reference pictures. 4) Because you can use frames of the footage itself to reconstruct the scene. 5) Because the lighting work can be easier, since the texture is already illuminated and the scene (background) is ready.
How can I use camera tracking in Blender? The process is easier than you might think. A good video tutorial can be found here. Once you have the scene tracked, you can do the reconstruction using PPT.
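Under the hood, both the camera solve and the photogrammetric reconstruction rest on the same geometric idea: a feature tracked in two frames defines two rays through the two camera positions, and the point where those rays (nearly) meet is the feature's 3D position. The sketch below is a toy illustration of that triangulation step, not Blender's or PPT's actual solver; the camera positions and the feature point are made-up numbers.

```python
# Toy triangulation sketch: recover a 3D point from two tracked views.
# A feature tracked in two frames defines two rays; their closest
# meeting point (midpoint method) is the reconstructed 3D position.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def add_scaled(p, d, t): return tuple(x + t * y for x, y in zip(p, d))

def triangulate(c1, d1, c2, d2):
    """Closest point between rays c1 + t1*d1 and c2 + t2*d2."""
    w0 = sub(c1, c2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # ~0 would mean parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = add_scaled(c1, d1, t1)    # closest point on ray 1
    p2 = add_scaled(c2, d2, t2)    # closest point on ray 2
    return tuple((x + y) / 2 for x, y in zip(p1, p2))

# Hypothetical example: two camera positions one unit apart, both
# tracking the same feature at world point (0.5, 0.5, 2.0).
point = (0.5, 0.5, 2.0)
cam1, cam2 = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
ray1, ray2 = sub(point, cam1), sub(point, cam2)   # tracked directions
print(triangulate(cam1, ray1, cam2, ray2))        # -> (0.5, 0.5, 2.0)
```

With real, noisy 2D tracks the two rays never meet exactly, which is why the solver reports a reprojection error and why more tracked frames give a more stable solve.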
With this, you get the cameras together with the pictures, which lets you project the textures onto the model.
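Projecting a photo onto the model works by asking, for each vertex, where it lands in the image taken by the solved camera; the color at that image coordinate becomes the vertex's texture. The pinhole sketch below shows just that mapping. It is an illustration of the idea under a simple camera model, not Blender's actual "project from view" code; the focal length and vertex positions are assumptions.

```python
# Toy pinhole projection: map a 3D vertex (in camera space, +Z forward)
# to 2D image coordinates. This is how a solved camera lets a reference
# photo be projected back onto the reconstructed model: each vertex
# samples the photo at its projected position.

def project(vertex, focal=1.0):
    """Project a camera-space point to normalized image coordinates."""
    x, y, z = vertex
    if z <= 0:
        return None                 # behind the camera: not visible
    return (focal * x / z, focal * y / z)

# Hypothetical vertices of the reconstructed model, in camera space.
for v in [(0.5, 0.5, 2.0), (-1.0, 0.25, 4.0)]:
    print(v, '->', project(v))
```

Because the tracked camera matches the real one, the projected texture lines up with the footage frame by frame.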
In archaeology, Blender tracking can be used, for example, to reconstruct ancient buildings over their present-day ruins.
The uses are many; your creativity is the only limit.
A big hug!