- Timestamp: 08/24/05 13:49:37 (19 years ago)
- File: 1 edited
trunk/VUT/doc/SciReport/preprocessing.tex
r246 → r247

… can however be chosen for urban scenes, where even small objects can
be more distracting for the user.
\end{itemize}

…

\subsection{From-object based visibility}

Our framework is based on the idea of sampling visibility by casting
rays through the scene and collecting their contributions. A
visibility sample is computed by casting a ray from an object towards
the view cells and computing the nearest intersection with the scene
objects. All view cells pierced by the ray segment can see the object,
and thus the object can be added to their PVS. If the ray is
terminated at another scene object, the PVS of the pierced view cells
can also be extended by this terminating object. Thus a single ray can
make a number of contributions to the progressively computed PVSs. A
ray sample piercing $n$ view cells which is bounded by two distinct
objects contributes at most $2n$ entries to the current PVSs. Apart
from this performance benefit there is also a benefit in terms of the
sampling density: assuming that the view cells are usually much larger
than the objects (which is typically the case), starting the sampling
deterministically from the objects increases the probability of small
objects being captured in the PVS.
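The bookkeeping described above can be sketched as follows. This is a
minimal illustrative sketch, not the report's actual implementation;
the function name `add_ray_contributions` and the dict-of-sets PVS
representation are assumptions made for the example.

```python
# Hypothetical sketch: a single ray sample, bounded by two objects,
# extends the PVS of every view cell it pierces by at most two entries,
# i.e. at most 2*n entries for n pierced view cells.

def add_ray_contributions(pierced_viewcells, source_object, hit_object, pvs):
    """Add the contributions of one ray sample to the PVSs.

    pierced_viewcells: view cells intersected by the ray segment
    source_object:     object the ray was cast from
    hit_object:        nearest object terminating the ray (or None)
    pvs:               dict mapping view cell -> set of visible objects
    Returns the number of new PVS entries (bounded by 2 * n).
    """
    contributions = 0
    for cell in pierced_viewcells:
        cell_pvs = pvs.setdefault(cell, set())
        # Each pierced view cell sees the object the ray started from ...
        if source_object not in cell_pvs:
            cell_pvs.add(source_object)
            contributions += 1
        # ... and, if the ray was terminated, the terminating object too.
        if hit_object is not None and hit_object not in cell_pvs:
            cell_pvs.add(hit_object)
            contributions += 1
    return contributions
```

A ray pierced through two empty view cells and bounded by two objects
would thus add four entries, matching the $2n$ bound.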
…

At this phase of the computation we not only start the samples from …

…

\subsection{Basic Randomized Sampling}

The first phase of the sampling works as follows: at every pass, the
algorithm visits the scene objects sequentially. For every scene
object we randomly choose a point on its surface. Then a ray is cast
from the selected point in a randomly chosen direction. We use a
uniform distribution of the ray directions with respect to the
halfspace given by the surface normal. Using this strategy the samples
are deterministically placed at every object, with a randomization of
the location on the object surface. The uniformly distributed
direction is a simple and fast strategy to gain initial visibility
information.

The described algorithm accounts for the irregular distribution of the
objects: more samples are placed at locations containing more
objects. Additionally, every object is sampled many times, depending
on the number of passes in which this sampling strategy is
applied. This increases the chance of even a small object being
captured in the PVS of the view cells from which it is visible.

\subsection{Accounting for View Cell Distribution}

The first modification of the basic algorithm accounts for the
irregular distribution of the view cells. Such a case is common, for
example, in urban scenes, where the view cells are mostly distributed
in a horizontal direction and more view cells are placed at denser
parts of the city. The modification involves replacing the uniformly
distributed ray direction by a direction distributed according to the
local view cell density. We select a random view cell which lies in
the halfspace given by the surface normal at the chosen point.
We pick a random point inside the view cell and cast a ray towards
this point.

\subsection{Accounting for Visibility Events}

…

\subsection{Exact Verifier}

The exact verifier computes exact mutual visibility between two
polyhedra in the scene. This is computed by testing visibility between
all pairs of potentially visible polygons of these polyhedra.
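The direction choice of the basic randomized sampling can be sketched
as below. This is an illustrative sketch under the assumption that
"uniform distribution with respect to the halfspace given by the
surface normal" means a direction drawn uniformly from the hemisphere
around the normal; the function name is invented for the example.

```python
# Hypothetical sketch: draw a unit direction uniformly over the
# hemisphere defined by the surface normal, as used when casting a
# sample ray from a randomly chosen point on an object's surface.

import math
import random

def uniform_hemisphere_direction(normal):
    """Uniformly distributed unit direction in the halfspace of `normal`.

    `normal` is assumed to be a unit-length 3-tuple.
    """
    # Sample a direction uniformly on the full unit sphere ...
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    d = (r * math.cos(phi), r * math.sin(phi), z)
    # ... and flip it into the hemisphere given by the normal.
    dot = sum(a * b for a, b in zip(d, normal))
    return d if dot >= 0.0 else tuple(-a for a in d)
```

The view-cell-driven modification described above would instead replace
the direction draw by picking a random view cell in the same halfspace
and aiming the ray at a random point inside it.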