Commit cf23b009 authored by Danny Griffin

passive text revision

parent 24d2b662
Pipeline #35642 passed
@@ -6,14 +6,10 @@ nav_order: 1
# Computer vision
-## [Camera sensors]({{ '/topics/01_cameras.html' | relative_url }})
+## [Camera sensors]({{ '/1_cameras.html' | relative_url }})
Sensor basics
-## [Passive scanning]({{ '/topics/02_passive.html' | relative_url }})
+## [Passive scanning]({{ '/2_passive.html' | relative_url }})
Testing - Photogrammetry / Stereo Photogrammetry
-## [Active scanning]({{ '/topics/03_active.html' | relative_url }})
Testing - Mechanical Methods / Light / Lasers
+## [Active scanning]({{ '/3_active.html' | relative_url }})
topics/02_passive/images/brdf.png (84.1 KiB)
topics/02_passive/images/light_stage.png (522 KiB)
@@ -27,8 +27,9 @@ Photogrammetry is the collection and organization of reliable information about
<p><img src="images/house_scanning.jpg" alt="House"></p>
-# Stereo Matching and Photogrammetry
+# Stereo Matching
+<p>Stereo matching is also known as "disparity estimation", referring to the process of identifying which pixels in multiscopic views correspond to the same 3D point in a scene.</p>
+<p>It found early use in stereophotogrammetry, the estimation of 3D coordinates from measurements taken in two or more images through the identification of common points. This technique was used throughout the early 20th century for generating topographic maps.</p>
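To make the idea concrete, here is a minimal sketch of block-matching disparity estimation with OpenCV. The image filenames, disparity range, focal length, and baseline are placeholder assumptions for illustration, not values from this course.

```python
# Minimal block-matching disparity sketch (filenames and camera values are placeholders).
import cv2
import numpy as np

# Rectified left/right grayscale views of the same scene.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# StereoBM compares small windows along each scanline and, for every left-image
# pixel, picks the horizontal offset (disparity) that matches best in the right image.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # output is fixed-point, scaled by 16

# Depth is inversely proportional to disparity: Z = f * B / d,
# with focal length f (pixels) and camera baseline B (metres) -- assumed values here.
f, B = 700.0, 0.1
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * B / disparity[valid]
```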
@@ -105,3 +106,10 @@ Increasingly industry pairs vision systems for photogrammetry with laser systems
<p>Depth estimation on Light Field data is an active research domain. For now, algorithms are commonly tested on ideal, synthetic light fields such as this <a href="https://lightfield-analysis.uni-konstanz.de/">dataset</a>. Here is one example of a point cloud obtained with a stereo <a href="https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8478503">matching method</a>:</p>
<iframe title="4D light field - depth estimation" frameborder="0" allowfullscreen="" mozallowfullscreen="true" webkitallowfullscreen="true" allow="fullscreen; autoplay; vr" xr-spatial-tracking="" execution-while-out-of-viewport="" execution-while-not-rendered="" web-share="" src="https://sketchfab.com/models/b9edfdd28c154ecf995da7b8c6590da8/embed"> </iframe>
# Light Stage
<p>This <a href="http://www.pauldebevec.com/">impressive device</a> was built to capture the Bidirectional Reflectance Distribution Function (BRDF), which describes a material's optical response for any viewing direction and any illumination condition. Thanks to the linearity of light transport, the total illumination can be decomposed into contributions from each individual direction. The viewing angle also matters for reflective or special materials (e.g. iridescence).</p>
<p><img src="images/brdf.png" alt=""></p>
<p>In the most complex case, objects need to be captured from several locations and illuminated from as many directions as possible.</p>
<p><img src="images/light_stage.png" alt=""></p>