
Visual Workflows for Oil and Gas Exploration - Thomas Höllt


The Energy Information Administration predicts that oil and gas will account for more than 50% of energy traded worldwide in 2035 [104]. Seismic surveys, resulting in large amounts of data of various types, are performed to understand the properties of the subsurface and to locate oil and gas reservoirs.

Contributions of this Thesis

Subsurface Modeling and Seismic Visualization

In addition, we present a framework for interactive analysis of the parameter space of the presented surface extraction technique (Chapter 4). We automatically sample the parameter space of the cost function used for horizon extraction to compute sets of horizon surfaces.

Visual Analysis of Uncertainties in Ocean Forecasts

We provide an interactive system that enables analysis and visualization of the extracted ensemble data and facilitates real-time exploration of the parameter space of this data. Finally, we present a new shading technique aimed at highlighting horizon structures in seismic volume data, as well as a technique for single-pass exploded images based on extracted horizons (Chapter 5).

Subsurface Modeling

Borehole data comes in three different types: (1) measured in spatial depth at the boreholes, (2) measured in spatial depth and converted to the time or depth migration domain, or (3) measured directly in the time domain. [22] explains the need for depth conversion of the seismic interpretation and why depth migration, even when performed, is not sufficient to yield a seismic cube in the depth domain that results in a good subsurface model.
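As a minimal illustration of the domain conversion involved (a hypothetical numpy sketch, not the velocity-modeling pipeline of the thesis), two-way travel times can be converted to depth by integrating an interval-velocity model over time:

    import numpy as np

    def time_to_depth(twt_s, interval_velocity_mps):
        # twt_s: two-way travel times in seconds, ascending
        # interval_velocity_mps: interval velocity at each time sample
        dt = np.diff(twt_s, prepend=twt_s[0])
        # divide by two because the travel time covers the down- and up-going path
        return np.cumsum(interval_velocity_mps * dt / 2.0)

    # example: constant 2000 m/s velocity, sampled every 4 ms down to 2 s
    twt = np.arange(0.0, 2.0, 0.004)
    depth_m = time_to_depth(twt, np.full_like(twt, 2000.0))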

Analysis of Ocean Forecasts

Section 2.1 provides an overview of applications, workflows, and techniques specific to subsurface modeling and seismic interpretation. The importance of oil and gas to today's society has resulted in a significant amount of previous work in seismic interpretation and visualization.

Volume Visualization for Seismic Data

Exploded views allow seeing inside the data by cutting away parts; instead of removing those parts entirely, context is maintained by displacing the cut-away objects. [5] created a cut plane that adapts to the volumetric data by acting as a deformable membrane.

Image and Volume Segmentation

A number of publications address the extension of Live Wire to 3D to create minimum cost surfaces. The shortest path (magenta in (b)) connecting two points on the two circles does not lie on the surface of the catenoid.
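As background, the 2D Live Wire formulation reduces boundary tracking to a minimum cost path search between two seed points. A generic Dijkstra search on a 4-connected cost image, a sketch rather than the implementation used in the thesis, could look as follows:

    import heapq
    import numpy as np

    def min_cost_path(cost, start, goal):
        # cost: 2D array of per-pixel costs; start, goal: (row, col) tuples
        h, w = cost.shape
        dist = np.full((h, w), np.inf)
        prev = {}
        dist[start] = cost[start]
        heap = [(cost[start], start)]
        while heap:
            d, (y, x) = heapq.heappop(heap)
            if (y, x) == goal:
                break
            if d > dist[y, x]:
                continue  # stale heap entry
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and d + cost[ny, nx] < dist[ny, nx]:
                    dist[ny, nx] = d + cost[ny, nx]
                    prev[(ny, nx)] = (y, x)
                    heapq.heappush(heap, (dist[ny, nx], (ny, nx)))
        path, node = [goal], goal      # walk back from the goal to recover the path
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1]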

Volume Deformation

This work formulates the minimum cost surface problem as an extension of the original 2D minimum cost path problem and shows that it can be solved by linear programming. This approach results in the true minimum cost surface and is also topology independent.

Ensemble Visualization

They reassemble the volume during deformation and render the deformed volume with standard volume rendering. [83] describe a system for the comparative analysis of 2D feature ensembles used in the development process of powertrain systems.

Uncertainty Visualization

Weaknesses and Proposed Enhancements

Therefore, we propose side-by-side views of the time and depth domains, where the depth domain view provides on-the-fly updates based on the intermediate results from the horizon extraction and velocity modeling. In addition, to facilitate the surface extraction itself, we propose the use of an interactive, user-controlled global optimization technique instead of local solvers.

Joint Time/Depth Domain Visualization

Rendering Pipeline

Invalid fragments, i.e., fragments belonging to triangles in the grid that are not yet covered by the interpretation, are discarded. In addition, we use a fragment shader to calculate several surface properties on the fly.
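The actual computation runs in a fragment shader; as a rough CPU-side analogue (a hypothetical numpy sketch with made-up argument names), the same idea of masking uncovered grid cells and deriving surface properties on the fly can be written as:

    import numpy as np

    def horizon_normals(height, covered):
        # height: 2D horizon depth values on the interpretation grid (float)
        # covered: boolean mask, True where an interpretation exists
        gy, gx = np.gradient(height)
        normals = np.stack((-gx, -gy, np.ones_like(height)), axis=-1)
        normals /= np.linalg.norm(normals, axis=-1, keepdims=True)
        normals[~covered] = np.nan     # analogue of discarding invalid fragments
        return normals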

On the Fly Deformation

In the ray caster setup (step 5), the extents of the boundary geometry must be adjusted to fit the deformed volume. The position of the sample in the warped space is used to look up the ID of the current layer.
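A minimal sketch of that lookup, assuming hypothetical array names and nearest-neighbour sampling (the thesis implementation performs this per sample on the GPU), might be:

    import numpy as np

    def deformed_sample(volume, layer_id_volume, layer_shift, x, y, z_warped):
        # layer_id_volume: integer layer ID per voxel, defined in warped space
        # layer_shift: vertical shift (in voxels) applied to each layer
        layer = layer_id_volume[x, y, z_warped]
        # undo the per-layer shift to find the sample in the undeformed volume
        z_orig = int(np.clip(round(z_warped - layer_shift[layer]),
                             0, volume.shape[2] - 1))
        return volume[x, y, z_orig]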

Performance

Since the performance mostly depends on the size of the surface, the computation times for a larger data set are of particular interest. Additionally, several extra texture lookups must be performed to obtain the layer ID of the current sample.

Prism-Based Workflow

User interaction in this approach is mostly limited to the 2D optimization on the sides of the prisms. To extend the horizon tracking to areas not covered by wells, additional points of interest (e.g., the corners of the volume) can be added to the triangulation.
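A hypothetical sketch of such a triangulation, assuming scipy is available and using made-up well coordinates, where the volume corners are added as extra points of interest:

    import numpy as np
    from scipy.spatial import Delaunay

    well_xy = np.array([[120.0, 80.0], [410.0, 95.0], [260.0, 300.0]])
    nx, ny = 512, 384                  # lateral extents of the seismic volume
    corners = np.array([[0, 0], [nx - 1, 0], [0, ny - 1], [nx - 1, ny - 1]], float)

    points = np.vstack((well_xy, corners))
    tri = Delaunay(points)
    print(tri.simplices)               # each row: vertex indices of one prism base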

Horizon Extraction using Global Optimization

Requirements for Horizon Extraction

Since only the local neighborhood is evaluated, it is not possible to guarantee that any point other than the seed point will be part of the final path or surface. The result of explicit boundary approaches is a curve or surface which encloses the segmented object.

Local Horizon Extraction: Tracing Diffusion Tensors

The purple circles in Figure 3.9b show another problem, caused by the local nature of the approach. In Figure 3.9b this can be seen on the left and right side of the slice, which correspond to the same well position.

Minimum Cost Circulation Network Flow

For this, the volume inside the prism is represented as a weighted, directed graph. The initial surface is a set of facets in this graph, bounded by the closed contour obtained from the prism sides.
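The full circulation formulation for surface extraction is considerably more involved, but its basic building block, a minimum cost flow on a weighted, directed graph, can be sketched with networkx (an assumption; the thesis does not use this library):

    import networkx as nx

    # toy weighted, directed graph; demand < 0 marks a source, > 0 a sink
    G = nx.DiGraph()
    G.add_node("s", demand=-2)
    G.add_node("t", demand=2)
    G.add_edge("s", "a", weight=1, capacity=2)
    G.add_edge("s", "b", weight=4, capacity=1)
    G.add_edge("a", "t", weight=2, capacity=1)
    G.add_edge("a", "b", weight=1, capacity=1)
    G.add_edge("b", "t", weight=1, capacity=2)

    flow = nx.min_cost_flow(G)             # flow[u][v] = units routed over (u, v)
    print(flow, nx.cost_of_flow(G, flow))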

Extracting the Surface from the Saturated Graph

The other three pieces lie on the sides of the prism, each bounded by the initial trace at the bottom, the cap slice at the top, and the edges of the prism at the sides. If there are two alternative surfaces bounded by the initial contour (the upper left part of the path in Figure 3.11d), both must be of the same cost and are thus equivalent solutions.

Cost Function

It is therefore defined by the similarity of the voxels belonging to the edge/facet in question. As described above, we define the similarity component by a combination of the scalar value as well as the local waveform.
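A hypothetical numpy sketch of such a cost between two neighbouring voxels, combining the scalar-value difference with the dissimilarity of their local waveforms (the window length and the weight alpha are made-up parameters):

    import numpy as np

    def edge_cost(trace_a, trace_b, value_a, value_b, alpha=0.5):
        # trace_a, trace_b: local waveform windows of equal length
        a = (trace_a - trace_a.mean()) / (trace_a.std() + 1e-9)
        b = (trace_b - trace_b.mean()) / (trace_b.std() + 1e-9)
        waveform_dissimilarity = 1.0 - np.dot(a, b) / len(a)   # 0 for identical shapes
        value_difference = abs(value_a - value_b)
        return alpha * value_difference + (1.0 - alpha) * waveform_dissimilarity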

Results

Evaluation

Using the Petrel software, the expert interpreted the data in ten-slice increments and was able to complete 18 slices (approximately 60% of the data set) within the 60-minute time limit.

Conclusion

In Section 4.2, we present the statistical analysis that forms the basis for the ensemble data visualization (Section 4.3). Since the surfaces for each parameter setting are computed independently of all others, the ensemble computation can easily be parallelized for faster ensemble data computation.
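Since the members are independent, any off-the-shelf worker pool can be used for this parallelization; a minimal sketch, assuming a hypothetical extract_horizon(params) function that wraps the surface extraction for one parameter setting:

    from itertools import product
    from multiprocessing import Pool

    def extract_horizon(params):
        # placeholder: run the surface extraction for one parameter setting
        alpha, beta = params
        return {"params": params, "surface": None}

    if __name__ == "__main__":
        alphas = [0.2, 0.4, 0.6, 0.8]
        betas = [1.0, 2.0, 4.0]
        with Pool() as pool:
            ensemble = pool.map(extract_horizon, product(alphas, betas))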

Statistical Analysis

Once starting points and constraints are defined, the user can define a range and sampling rate for each parameter to calculate the ensemble. For each parameter setting, the seed point and constraints, as well as the parameterized cost function, are given to the surface extraction algorithm.
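A small sketch of how such a sampling could be set up (parameter names, ranges, and the seed point are hypothetical):

    import numpy as np
    from itertools import product

    # (start, stop, step) per parameter, as chosen by the user
    parameter_ranges = {"alpha": (0.1, 0.9, 0.2), "beta": (1.0, 5.0, 1.0)}
    axes = {name: np.arange(lo, hi + 1e-9, step)
            for name, (lo, hi, step) in parameter_ranges.items()}

    seed_point = (128, 64, 300)        # user-picked (x, y, z) position in the volume
    constraints = []                   # e.g. well markers the surface must honour

    settings = [dict(zip(axes, values)) for values in product(*axes.values())]
    # each setting, together with seed_point and constraints, parameterizes the
    # cost function that is handed to the surface extraction algorithm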

Ensemble Visualization

While the maximum likelihood surface fits the ridgeline well for large parts of the surface, the front of the surface shows large differences in values. This does not require any data transfer to or from the GPU except for the surface ID in the render call.

Computation and Visualization Pipeline

Changes in the parameter range cause an update of the 3D histogram and subsequently of the representative surface and property structure. With the (x, y) portion of the resulting volume position, we can directly query the histogram and probability density distribution for this position.
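A simplified sketch of such a histogram and its per-position query, assuming the ensemble surfaces are available as depth maps (array names and shapes are hypothetical):

    import numpy as np

    def build_depth_histogram(surfaces, n_bins, z_range):
        # surfaces: (n_members, nx, ny) array of horizon depth values
        lo, hi = z_range
        bins = np.clip(((surfaces - lo) / (hi - lo) * n_bins).astype(int),
                       0, n_bins - 1)
        hist = np.zeros(surfaces.shape[1:] + (n_bins,), dtype=np.int32)
        ix = np.arange(hist.shape[0])[:, None]
        iy = np.arange(hist.shape[1])[None, :]
        for member in bins:                      # one pass per ensemble member
            np.add.at(hist, (ix, iy, member), 1)
        return hist

    # hist[x, y] is the (unnormalized) probability distribution over depth at
    # a picked (x, y) position; hist[x, y].argmax() gives the most likely bin.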

Results

Although the costs (color-coded in (a)) show little variation across the surface and thus reveal no problems in the segmentation, the standard deviation (b) clearly highlights an area in the center of the surface (compare the areas marked by the arrows). (c) shows a cut-out part of the surfaces; the blue line corresponds to the surface in (a), the green line to the maximum likelihood surface in (b). The variance is represented by color-coding the maximum likelihood surfaces of the respective parts of the ensemble.

Conclusion

Instead, we compute the inverse of the waveform-based component of the local cost on the fly during rendering and use it to modulate the opacity from the transfer function, in the same way the gradient magnitude is used for gradient-magnitude-modulated shading [53]. Using the non-linearly filtered waveform described by Equation 3.11 in Section 3.4.5, we can roughly estimate the distance of the current sample to a local minimum or maximum in the seismic trace.
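A minimal sketch of that modulation (the mapping of the cost to [0, 1] is an assumption; the thesis evaluates it per sample in the ray caster):

    import numpy as np

    def modulated_opacity(opacity_tf, waveform_cost):
        # opacity_tf: opacity from the transfer function per sample
        # waveform_cost: waveform-based local cost component, assumed in [0, 1],
        #   small near a local extremum of the seismic trace (i.e. near a horizon)
        modulation = 1.0 - np.clip(waveform_cost, 0.0, 1.0)   # inverse of the cost
        return opacity_tf * modulation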

Exploded Views

The exploded views allow inspection of the extracted horizon surfaces in context of the seismic data. The fragment shader for rendering the exploded view is only a slight modification of the shader described in Section 3.2.2.
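A hypothetical CPU-side sketch of the sample remapping behind such an exploded view, where the part of the volume below a horizon is displaced downwards by a fixed gap (the real technique performs this in a single rendering pass):

    import numpy as np

    def exploded_lookup(volume, horizon_depth, gap, x, y, z_exploded):
        # horizon_depth: 2D map of the extracted horizon (in voxel units)
        h = horizon_depth[x, y]
        if z_exploded <= h:                      # upper part: unchanged
            z = z_exploded
        elif z_exploded <= h + gap:              # inside the gap: empty space
            return 0.0
        else:                                    # lower part: shifted by the gap
            z = z_exploded - gap
        z = int(np.clip(z, 0, volume.shape[2] - 1))
        return volume[x, y, z]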

Application Framework

Statistical Analysis

We define and visualize a simple risk estimate as the percentage of ensemble members above the defined critical height. The risk value is not only dependent on the ensemble itself, but also on the critical sea level.
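Expressed directly on the ensemble data, the estimate is a simple thresholded mean over the members; a short numpy sketch with made-up array shapes:

    import numpy as np

    def risk(ensemble_sea_level, critical_level):
        # ensemble_sea_level: (n_members, ...) array of simulated sea levels
        # returns the fraction of members above the critical level, per position
        return (ensemble_sea_level > critical_level).mean(axis=0)

    members = np.random.normal(loc=0.3, scale=0.1, size=(50, 128, 128))  # toy data
    risk_map = risk(members, critical_level=0.45)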

Ensemble Visualization

Graphs are colored using the associated risk, i.e. the fraction of the distribution above the selected critical sea level. The risk can be easily estimated by looking at the fraction of the glyph that is above the critical sea level.

Computation and Visualization Pipeline

Performance

The columns titled "w/o" and "w" show the computation times for just this property, assuming all dependencies are already cached, and the computation time including all dependencies, respectively. With the dependencies precomputed, the computation of both properties is trivial, resulting in very short computation times.

Results

Placement Planning

In Figure 6.7b, the sea level of the mean surface for a single time step is mapped to the third dimension. By animating all time steps, the user can now get an overview of the average sea level at the selected area of interest, as well as the corresponding uncertainties.

Final Placement / Operational Phase

Moving the position slightly results in several time steps that definitely require a halt in operations (b). Assuming a color map as described, after loading the data the user can immediately identify safe and unsafe time steps by the color of the corresponding glyphs.

Conclusion

We present a framework for interactive visual analysis of ensemble sea level simulations.
