Commit 0dca07d

Add example files for gravity and petrophysics inversion (#9)
# Add vector-geology examples and update development logs

This PR adds two complete examples for vector-geology: gravity inversion and petrophysics inversion. These examples demonstrate how to use GemPy for probabilistic modeling in geological contexts.

## Changes:
- Added `.env.example` file with template for API tokens and folder paths
- Updated `.gitignore` to exclude `.nc` files and version file
- Updated development logs with more detailed task tracking for Vector examples
- Added complete example code for gravity inversion with Bayesian methods
- Added complete example code for petrophysics inversion using borehole data
- Added auxiliary functions for both examples in separate files
- Renamed model function in `model_examples.py` to be more descriptive
- Updated test to use the renamed model function
- Added subsurface-terra to dev requirements

The examples include visualization, data processing, and probabilistic modeling techniques that showcase GemPy's capabilities for geological modeling with uncertainty quantification.
2 parents 4f51677 + ba0166c commit 0dca07d

File tree

17 files changed (+1049 −9 lines)


.env.example

Lines changed: 2 additions & 0 deletions

@@ -0,0 +1,2 @@
+LIQUID_EARTH_API_TOKEN = "foobar-1499cbe49808c8cfef14303d2aaacXDx-w1vc1bD065Wj0ABi3HD2fzQ"
+BOREHOLES_SOUTH_FOLDER = "/mnt/d/OneDrive - .../boreholes_south"
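Both values above are placeholders for local settings. A minimal sketch, assuming the same python-dotenv workflow that `setup_geophysics` uses later in this commit, of how the template could be read at runtime (the printed output is illustrative only):

from dotenv import dotenv_values

# Parse KEY=VALUE pairs from a local .env file into a dict without touching os.environ.
config = dotenv_values(".env")
api_token = config.get("LIQUID_EARTH_API_TOKEN")
boreholes_folder = config.get("BOREHOLES_SOUTH_FOLDER")
print(f"Borehole folder configured: {boreholes_folder is not None}")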

.gitignore

Lines changed: 3 additions & 1 deletion

@@ -2,6 +2,7 @@
 __pycache__/
 *.py[cod]
 *$py.class
+*.nc

 # C extensions
 *.so
@@ -160,4 +161,5 @@ cython_debug/
 .idea/

 docs/build
-docs/source
+docs/source
+/gempy_probability/_version.py

docs/dev_logs/2025_May.md

Lines changed: 21 additions & 5 deletions

@@ -5,12 +5,17 @@
 - [ ] Keep extracting code into modules and functions
 - [ ] Posterior analysis for gempy
 - [ ] Investigate speed
-- [ ] Investigate segmentation function
+- [x] Investigate segmentation function
   https://chatgpt.com/share/68172f53-4f7c-8000-8f42-205a033ed899
 - [ ] Create a robust api to make sure we do not mess up assigning priors to gempy parameters
 - [ ] do we need to use index_put or just normal indexing?
 --- Vector examples
-- [ ] Bring the Vector example here...
+- [ ] Bring the Vector examples here...
+  - Stonepark
+  - Gravity
+  - Multidimensional likelihood
+  - Spremberg
+  - Categories
 - [ ] Using categories as likelihood function (maybe I need to check out with Sam)
 --- MinEye
 - [ ] Revive magnetics
@@ -20,11 +25,13 @@
 - [x] Compressing code
 - [x] GemPy posterior visualization
 - [x] Add proper thickness likelihood
-- [ ] Speed
+- [x] Segmentation function
+- [ ] @Look-Below: What would I want to have to do the StonePark example as smoothly as possible?
+- [ ] Speed: blocked by having a model that is not too big.
   - [ ] Time test for torch CPU
   - [ ] Time test for keops CPU
   - [ ] Time test for keops GPU
-- [ ] Options and config presets
+- [ ] Options and config presets

 ### Possible Design:
 - Geological Model
@@ -38,4 +45,13 @@

 ### Likelihood functions
 - ? Should I start to create a catalog of likelihood functions that goes with gempy?
-  - This would be useful to be able to visualize whatever the likelihood function represents independent of the probabilistic model.
+  - This would be useful to be able to visualize whatever the likelihood function represents independent of the probabilistic model.
+
+### What would I want to have to do the StonePark example as smoothly as possible?
+1. A way to save the model already with the correct **nugget effects** instead of having to "load nuggets" (TODO in gempy)
+2. Likelihood function plug and play
+3. Some sort of pre-analysis of whether the model looks good. Maybe some metrics about how long it takes to run the model.
+   1. Condition number
+   2. Acceptance rate of the posterior
+4. Easy pykeops setting using .env
+5. Some sort of easy set-up for having a reduced number of priors
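One open item in the log above asks whether `index_put` is needed or whether plain indexing is enough. A minimal, illustrative PyTorch sketch of the difference follows; the tensor names and values are invented and this is not code from the commit.

import torch

values = torch.zeros(6)
idx = torch.tensor([1, 4])
new = torch.tensor([0.5, 0.7])

# Plain advanced indexing: concise, mutates `values` in place.
values[idx] = new

# Tensor.index_put_ is the explicit in-place equivalent of the assignment above,
# while Tensor.index_put (no trailing underscore) is the out-of-place version
# that returns a new tensor and leaves the original untouched.
values2 = torch.zeros(6)
updated = values2.index_put((idx,), new)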

examples/examples/README.rst

Lines changed: 6 additions & 0 deletions

@@ -0,0 +1,6 @@
+Examples
+========
+
+These examples are intended to be a gallery of what other people have done with GemPy.
+Hopefully, with the help of the community, we can create a complete library of
+structures that can serve as templates for new users.

examples/examples/__init__.py

Whitespace-only changes.
Lines changed: 119 additions & 0 deletions

@@ -0,0 +1,119 @@
import numpy as np

import gempy as gp
import xarray as xr
from vector_geology.model_building_functions import optimize_nuggets_for_group


def initialize_geo_model(structural_elements: list[gp.data.StructuralElement], extent: list[float],
                         topography: xr.Dataset, load_nuggets: bool = False
                         ) -> gp.data.GeoModel:
    structural_group_red = gp.data.StructuralGroup(
        name="Red",
        elements=[structural_elements[i] for i in [0, 4, 8]],
        structural_relation=gp.data.StackRelationType.ERODE
    )

    # Probably we can decimate this one an extra notch
    structural_group_green = gp.data.StructuralGroup(
        name="Green",
        elements=[structural_elements[i] for i in [5]],
        structural_relation=gp.data.StackRelationType.ERODE
    )

    # Blue range 2 cov 4
    structural_group_blue = gp.data.StructuralGroup(
        name="Blue",
        elements=[structural_elements[i] for i in [2, 3]],
        structural_relation=gp.data.StackRelationType.ERODE
    )

    structural_group_intrusion = gp.data.StructuralGroup(
        name="Intrusion",
        elements=[structural_elements[i] for i in [1]],
        structural_relation=gp.data.StackRelationType.ERODE
    )

    structural_groups = [structural_group_intrusion, structural_group_green, structural_group_blue, structural_group_red]
    structural_frame = gp.data.StructuralFrame(
        structural_groups=structural_groups[2:],
        color_gen=gp.data.ColorsGenerator()
    )
    # TODO: If elements do not have a color, maybe loop over them in the StructuralFrame constructor?

    geo_model: gp.data.GeoModel = gp.create_geomodel(
        project_name='Tutorial_ch1_1_Basics',
        extent=extent,
        resolution=[20, 10, 20],
        refinement=5,  # * Number of octree levels. If octree levels are defined, the resolution is ignored.
        structural_frame=structural_frame
    )

    if topography is not None:
        gp.set_topography_from_arrays(
            grid=geo_model.grid,
            xyz_vertices=topography.vertex.values
        )

    if load_nuggets:
        import os

        project_root = os.getcwd()
        path_to_temp = os.path.join(project_root, "../temp")
        apply_optimized_nuggets(
            geo_model=geo_model,
            loaded_nuggets_red=(np.load(path_to_temp + "/nuggets_Red.npy")),
            loaded_nuggets_blue=(np.load(path_to_temp + "/nuggets_Blue.npy")),
            loaded_nuggets_green=(np.load(path_to_temp + "/nuggets_Green.npy"))
        )

    geo_model.structural_frame.get_element_by_name("KKR").color = "#A46283"
    geo_model.structural_frame.get_element_by_name("LGR").color = "#6394A4"
    geo_model.structural_frame.get_element_by_name("WAL").color = "#72A473"
    geo_model.structural_frame.get_element_by_name("ABL").color = "#1D3943"
    geo_model.structural_frame.basement_color = "#8B4220"

    geo_model.update_transform()

    return geo_model


def setup_geophysics(env_path: str, geo_model: gp.data.GeoModel):
    import pandas as pd
    from dotenv import dotenv_values
    config = dotenv_values()

    df = pd.read_csv(
        filepath_or_buffer=config.get(env_path),
        sep=',',
        header=0
    )

    # Keep only the rows with X < 565000
    df = df[df['X'] < 565000]

    interesting_columns = df[['X', 'Y', 'Bouguer_267_complete']]
    # %%
    device_location = interesting_columns[['X', 'Y']]
    # Add a constant Z = 0 for the device locations
    device_location['Z'] = 0

    gp.set_centered_grid(
        grid=geo_model.grid,
        centers=device_location,
        resolution=np.array([10, 10, 15]),
        radius=np.array([5000, 5000, 5000])
    )

    gravity_gradient = gp.calculate_gravity_gradient(geo_model.grid.centered_grid)

    # %%
    from gempy_engine.core.backend_tensor import BackendTensor
    geo_model.geophysics_input = gp.data.GeophysicsInput(
        tz=BackendTensor.t.array(gravity_gradient),
        densities=BackendTensor.t.array([2.61, 2.92, 3.1, 2.92, 2.61, 2.61]),
    )

    return interesting_columns
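The commit message frames the gravity example as a Bayesian inversion. Assuming the Pyro-based workflow used elsewhere in gempy_probability, here is a minimal sketch of how the Bouguer column returned by `setup_geophysics` could feed a likelihood; the prior name, the noise scale, and the `forward_gravity` stub are invented for illustration and are not code from this commit.

import pyro
import pyro.distributions as dist
import torch


def forward_gravity(density_scale: torch.Tensor) -> torch.Tensor:
    # Placeholder for the GemPy forward model: in the real example this would
    # recompute gravity on the centered grid with perturbed densities.
    return density_scale * torch.linspace(-1.0, 1.0, 10)


def gravity_model(observed_bouguer: torch.Tensor):
    # Prior on a single density scaling factor (name and values are invented).
    density_scale = pyro.sample("density_scale", dist.Normal(1.0, 0.1))
    simulated = forward_gravity(density_scale)
    # Gaussian likelihood comparing simulated gravity to the observed Bouguer anomaly.
    pyro.sample("obs", dist.Normal(simulated, 0.5), obs=observed_bouguer)


# The observations could come from the dataframe returned by setup_geophysics, e.g.:
# obs = torch.tensor(interesting_columns["Bouguer_267_complete"].values, dtype=torch.float32)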
Lines changed: 94 additions & 0 deletions

@@ -0,0 +1,94 @@
import os
import xarray as xr
import numpy as np
import gempy as gp
import subsurface


def process_file(filename, global_extent, color_generator: gp.data.ColorsGenerator):
    dataset: xr.Dataset = xr.open_dataset(filename)
    file_extent = _calculate_extent(dataset)
    global_extent = _update_global_extent(global_extent, file_extent)

    base, ext = os.path.splitext(filename)
    base = os.path.basename(base)
    structural_element = _extract_surface_points_and_orientations(dataset, base, color_generator)

    return structural_element, global_extent


# Function to calculate the extent (ROI) for a single dataset
def _calculate_extent(dataset):
    x_coord, y_coord, z_coord = dataset.vertex[:, 0], dataset.vertex[:, 1], dataset.vertex[:, 2]
    return [x_coord.min().values, x_coord.max().values,
            y_coord.min().values, y_coord.max().values,
            z_coord.min().values, z_coord.max().values]


# Function to extract surface points and orientations from the dataset
def _extract_surface_points_and_orientations(dataset, name, color_generator) -> gp.data.StructuralElement:
    name = name.replace("Stonepark_", "")

    # Extract surface points and orientations
    unstruct = subsurface.UnstructuredData(dataset)
    ts = subsurface.TriSurf(mesh=unstruct)
    triangulated_mesh = subsurface.visualization.to_pyvista_mesh(ts)
    # print(name)
    # subsurface.visualization.pv_plot([triangulated_mesh])

    # Decimate the mesh and compute cell normals
    decimated_mesh = triangulated_mesh.decimate_pro(0.95)
    normals = decimated_mesh.compute_normals(point_normals=False, cell_normals=True, consistent_normals=True)
    normals_array = np.array(normals.cell_data["Normals"])

    # Filter points: keep cells whose normal points upwards and whose Z component dominates X and Y
    vertex_sp = normals.cell_centers().points
    vertex_grads = normals_array
    positive_z = vertex_grads[:, 2] > 0
    largest_component_z = np.all(vertex_grads[:, 2:3] >= np.abs(vertex_grads[:, :2]), axis=1)
    filter_mask = np.logical_and(positive_z, largest_component_z)

    surface_points_xyz = vertex_sp[filter_mask]
    orientations_gxyz = vertex_grads[filter_mask]
    nuggets = np.ones(len(surface_points_xyz)) * 0.000001

    # Create SurfacePointsTable and OrientationsTable (subsampling every 10th point)
    every = 10
    surface_points = gp.data.SurfacePointsTable.from_arrays(
        x=surface_points_xyz[::every, 0],
        y=surface_points_xyz[::every, 1],
        z=surface_points_xyz[::every, 2],
        names=name,
        nugget=nuggets[::every]
    )

    every = 10
    orientations = gp.data.OrientationsTable.from_arrays(
        x=surface_points_xyz[::every, 0],
        y=surface_points_xyz[::every, 1],
        z=surface_points_xyz[::every, 2],
        G_x=orientations_gxyz[::every, 0],
        G_y=orientations_gxyz[::every, 1],
        G_z=orientations_gxyz[::every, 2],
        nugget=4,
        names=name
    )

    structural_element = gp.data.StructuralElement(
        name=name,
        surface_points=surface_points,
        orientations=orientations,
        color=next(color_generator)
    )

    return structural_element


# Function to update the global extent based on each file's extent
def _update_global_extent(global_extent, file_extent):
    if not global_extent:
        return file_extent
    else:
        return [min(global_extent[0], file_extent[0]), max(global_extent[1], file_extent[1]),
                min(global_extent[2], file_extent[2]), max(global_extent[3], file_extent[3]),
                min(global_extent[4], file_extent[4]), max(global_extent[5], file_extent[5])]
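A hedged end-to-end sketch of how `process_file` and `initialize_geo_model` from the previous file might be chained; the folder key, the `*.nc` glob, and the import locations are assumptions, since the commit does not show the calling script here.

import glob
import os

import gempy as gp
from dotenv import dotenv_values

# process_file and initialize_geo_model come from the two helper files in this
# commit; their module paths are not shown here, so adjust the imports as needed.

config = dotenv_values()
data_folder = config.get("BOREHOLES_SOUTH_FOLDER")  # key taken from .env.example above

structural_elements = []
global_extent = None
color_generator = gp.data.ColorsGenerator()

# Build one StructuralElement per .nc surface file and grow the shared extent.
for filename in sorted(glob.glob(os.path.join(data_folder, "*.nc"))):
    element, global_extent = process_file(filename, global_extent, color_generator)
    structural_elements.append(element)

# Hand the collected elements to the helper from the previous file
# (topography handling is omitted in this sketch).
geo_model = initialize_geo_model(
    structural_elements=structural_elements,
    extent=global_extent,
    topography=None,
    load_nuggets=False
)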
