source: trunk/VUT/doc/SciReport/sampling.tex @ 251

Revision 251, 7.6 KB checked in by mattausch, 19 years ago

added some optimizations for online culling and view cell generation

\chapter{Global Visibility Sampling Tool}


\section{Introduction}


The proposed visibility preprocessing framework consists of two major
steps.
\begin{itemize}
\item The first step is an aggressive visibility sampling which gives
an initial estimate of global visibility in the scene. The sampling
itself involves several strategies which will be described in
section~\ref{sec:sampling}. The important property of the aggressive
sampling step is that it provides a fast progressive solution to
global visibility and thus it can be easily integrated into the
game development cycle.

\item The second step is visibility verification. This step turns the
previous aggressive visibility solution into either an exact, a conservative,
or an error bound aggressive solution. The choice of the particular
verifier is left to the user in order to select the best one for a
particular scene, application context and time constraints. For
example, in scenes like a forest an error bound aggressive visibility
solution can be the best compromise between the resulting size of the PVS (and
framerate) and the visual quality. The exact or conservative algorithm
can however be chosen for urban scenes, where errors of even small objects
can be more distracting to the user.
\end{itemize}



In traditional visibility preprocessing the view space is
subdivided into view cells and for each view cell the set of visible
objects, the potentially visible set (PVS), is computed. This framework
has been used for conservative, aggressive and exact algorithms.

We propose a different strategy which has several advantages for
sampling based aggressive visibility preprocessing. The strategy is
based on the following fundamental ideas:
\begin{itemize}
\item Replace the roles of view cells and objects.
\item Compute progressive global visibility instead of sequential from-region visibility.
\end{itemize}

Both of these points are addressed below in more detail.
\subsection{From-object based visibility}

Our framework is based on the idea of sampling visibility by casting
rays through the scene and collecting their contributions. A
visibility sample is computed by casting a ray from an object towards
the view cells and computing the nearest intersection with the scene
objects. All view cells pierced by the ray segment can see the object and
thus the object can be added to their PVS. If the ray is terminated at
another scene object, the PVS of the pierced view cells can also be
extended by this terminating object. Thus a single ray can make a
number of contributions to the progressively computed PVSs. A ray
sample piercing $n$ view cells which is bounded by two distinct objects
contributes at most $2n$ entries to the current PVSs. Apart from
this performance benefit there is also a benefit in terms of the
sampling density: assuming that the view cells are usually much larger
than the objects (which is typically the case), starting the sampling
deterministically from the objects increases the probability of small
objects being captured in the PVS.
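The per-ray PVS update described above can be sketched as follows; this is a minimal illustration only, and the data model (a plain mapping from view cells to object sets) is our assumption, not the tool's actual data structures:

```python
def add_ray_contributions(source_obj, hit_obj, pierced_cells, pvs):
    """Extend the PVSs of all view cells pierced by one ray sample.

    The ray starts at source_obj and terminates at hit_obj (None if it
    leaves the scene); both become visible from every pierced cell, so
    the ray adds at most 2 * len(pierced_cells) new PVS entries.
    """
    new_entries = 0
    for cell in pierced_cells:
        for obj in (source_obj, hit_obj):
            if obj is not None and obj not in pvs[cell]:
                pvs[cell].add(obj)
                new_entries += 1
    return new_entries
```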

At this phase of the computation we not only start the samples from
the objects, but we also store the PVS information centered at the
objects. Instead of storing PVSs consisting of objects visible from
view cells, every object maintains a PVS consisting of potentially
visible view cells. While these representations contain exactly the
same information, as we shall see later the object centered PVS is
better suited for the importance sampling phase as well as the
visibility verification phase.
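The duality of the two representations can be made concrete with a small conversion routine (illustrative only; the mapping types are our assumption):

```python
def object_to_cell_pvs(object_pvs):
    """Convert an object-centered PVS (object -> set of visible view
    cells) into the equivalent cell-centered PVS (view cell -> set of
    visible objects); no information is lost in either direction."""
    cell_pvs = {}
    for obj, cells in object_pvs.items():
        for cell in cells:
            cell_pvs.setdefault(cell, set()).add(obj)
    return cell_pvs
```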


\subsection{Basic Randomized Sampling}


The first phase of the sampling works as follows: in every pass the
algorithm visits the scene objects sequentially. For every scene object we
randomly choose a point on its surface. Then a ray is cast from the
selected point in a randomly chosen direction. We use a
uniform distribution of the ray directions with respect to the
halfspace given by the surface normal. Using this strategy the samples
are deterministically placed at every object, with a randomization of
the location on the object surface. The uniformly distributed
direction is a simple and fast strategy to gain initial visibility
information.
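A direction uniformly distributed over the halfspace of the surface normal can be obtained, for instance, by rejection sampling on the unit sphere; this is one possible realization, and the actual tool may use a different sampling routine:

```python
import math
import random

def uniform_hemisphere_direction(normal):
    """Sample a unit direction uniformly distributed over the
    hemisphere around 'normal' by rejecting points outside the unit
    ball and directions below the surface's tangent plane."""
    while True:
        d = [random.uniform(-1.0, 1.0) for _ in range(3)]
        len2 = sum(c * c for c in d)
        if not 1e-12 < len2 <= 1.0:
            continue  # outside the unit ball (or degenerate): retry
        inv_len = 1.0 / math.sqrt(len2)
        d = [c * inv_len for c in d]
        if sum(a * b for a, b in zip(d, normal)) > 0.0:
            return d  # lies in the halfspace of the normal
```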


The described algorithm accounts for the irregular distribution of the
objects: more samples are placed at locations containing more
objects. Additionally, every object is sampled many times depending on
the number of passes in which this sampling strategy is applied. This
increases the chance of even a small object being captured in the PVS
of the view cells from which it is visible.


\subsection{Accounting for View Cell Distribution}

The first modification to the basic algorithm accounts for the irregular
distribution of the view cells. Such a case is common for example in
urban scenes, where the view cells are mostly distributed in a
horizontal direction and more view cells are placed at denser parts of
the city. The modification involves replacing the uniformly
distributed ray direction by directions distributed according to the
local view cell directional density. This means placing more samples in
directions where more view cells are located: we select a random
view cell which lies in the halfspace given by the surface normal at the
chosen point, pick a random point inside this view cell, and cast a
ray towards this point.
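This view-cell-driven direction sampling can be sketched as below; view cells are modeled as axis-aligned boxes purely for illustration, and the box representation is our assumption:

```python
import math
import random

def direction_toward_random_view_cell(point, normal, view_cells):
    """Pick a random view cell whose center lies in the positive
    halfspace of 'normal' at 'point', choose a random point inside it,
    and return the normalized direction towards that point.
    View cells are (min_corner, max_corner) axis-aligned boxes."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    candidates = []
    for lo, hi in view_cells:
        center = [(l + h) * 0.5 for l, h in zip(lo, hi)]
        offset = [c - p for c, p in zip(center, point)]
        if dot(offset, normal) > 0.0:
            candidates.append((lo, hi))
    if not candidates:
        return None  # no view cell in front of the surface element
    lo, hi = random.choice(candidates)
    target = [random.uniform(l, h) for l, h in zip(lo, hi)]
    d = [t - p for t, p in zip(target, point)]
    inv_len = 1.0 / math.sqrt(dot(d, d))
    return [c * inv_len for c in d]
```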


\subsection{Accounting for Visibility Events}

Visibility events correspond to the appearance and disappearance of
objects with respect to a moving view point. In polygonal scenes the
events are defined by event surfaces spanned by three distinct scene
edges. Depending on the edge configuration we distinguish between
vertex-edge (VE) events and triple edge (EEE) events. The VE surfaces
are planar, whereas the EEE surfaces are in general quadratic surfaces.

To account for these events we explicitly place samples passing by the
object edges which are directed to edges and/or vertices of other
objects.
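For the planar VE surfaces this amounts to generating rays that pass through a point on one object's edge and head towards a vertex of another object; a hypothetical minimal sketch:

```python
def vertex_edge_sample_rays(edge, vertex, num_samples):
    """Generate ray samples lying in the VE event surface spanned by
    'edge' (a pair of 3D endpoints) and 'vertex': each ray starts at a
    point on the edge and is aimed at the vertex."""
    a, b = edge
    rays = []
    for i in range(num_samples):
        t = (i + 0.5) / num_samples  # stratified parameter along the edge
        origin = tuple(pa + t * (pb - pa) for pa, pb in zip(a, b))
        direction = tuple(v - o for v, o in zip(vertex, origin))
        rays.append((origin, direction))
    return rays
```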

\subsection{View Cell Representation}

In order to efficiently use view cells with our sampling method, we require a view cell representation which is

\begin{itemize}
\item optimized for view cell--ray intersection.
\item flexible, i.e., it can represent arbitrary geometry.
\item naturally suited for a hierarchical approach. %(i.e., there is a root view cell containing all others)
\end{itemize}

We meet these requirements by using a view cell BSP tree, where the BSP leaves are associated with the view cells.
Using the BSP tree, we are able to find the initial view cells with only a few ray-plane intersections.
The hierarchical structure of the BSP tree can be exploited as a hierarchy of view cells. If necessary, we could further subdivide a BSP leaf view cell quite easily.
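Locating the view cell containing a point (e.g., a ray origin) then reduces to a simple descent of the tree; this is a sketch with an assumed node layout, and the actual tree stores additional data:

```python
def find_leaf(node, point):
    """Descend a view cell BSP tree to the leaf containing 'point'.
    Interior nodes are assumed to be dicts {"plane": (normal, d),
    "front": ..., "back": ...}; leaves are view cell identifiers
    (or None for leaves not associated with any input view cell)."""
    while isinstance(node, dict):
        normal, d = node["plane"]
        side = sum(n * p for n, p in zip(normal, point)) - d
        node = node["front"] if side >= 0.0 else node["back"]
    return node
```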

Currently we use two approaches to generate the initial BSP view cell tree.

\begin{itemize}
\item We use a number of dedicated input view cells. Any closed mesh can be applied as an input view cell. The only requirement
is that the view cells do not overlap. We insert one view cell after the other into the tree. The polygons of a view cell are filtered down the tree, guiding the insertion process. Once we reach a leaf and there are no more polygons left, we terminate
the tree subdivision. If we are on the inside of the last split plane (i.e., the leaf represents the inside of the view cell), we associate the leaf with the view cell (i.e., add a pointer to the view cell). Hence a number of leaves
can be associated with the same input view cell.
\item We apply the BSP tree subdivision to the scene geometry. When the subdivision terminates, the leaf nodes
also represent the view cells.
\end{itemize}
