
Commit b916ef3

reformatting README for better referencing
1 parent ca88cbf commit b916ef3

File tree: 3 files changed (+61, -67 lines)

README.md

Lines changed: 9 additions & 8 deletions
@@ -1,17 +1,18 @@
 # Optimal projection for parametric importance sampling in high dimensions
-by Maxime El Masri (ONERA and ISAE), Jérôme Morio (ONERA) and Florian Simatos (ISAE)
-
-Submitted to Computo
-
 
 [![build status](https://github.com/computorg/published-202402-elmasri-optimal/workflows/build/badge.svg)](https://github.com/computorg/published-202402-elmasri-optimal/)
 [![](https://img.shields.io/github/last-commit/computorg/published-202402-elmasri-optimal.svg)](https://github.com/computorg/published-202402-elmasri-optimal/commits/main)
 [![DOI:10.57750/jjza-6j82](https://img.shields.io/badge/DOI-10.57750/jjza-6j82.svg)](https://doi.org/10.57750/jjza-6j82)
-[![HTML](https://img.shields.io/badge/article-HTML-034E79)](https://computo.sfds.asso.fr/published-202402-elmasri-optimal/)
-[![PDF](https://img.shields.io/badge/article-PDF-034E79)](https://computo.sfds.asso.fr/published-202402-elmasri-optimal/published-202402-elmasri-optimal.pdf)
-[![review 1](https://img.shields.io/badge/review-report%201-blue)](https://github.com/computorg/published-202402-elmasri-optimal/issues/2)
-[![review 2](https://img.shields.io/badge/review-report%202-blue)](https://github.com/computorg/published-202402-elmasri-optimal/issues/3)
+[![reviews](https://img.shields.io/badge/review-report-blue)](https://github.com/computorg/published-202402-elmasri-optimal/issues?q=is%3Aopen+is%3Aissue+label%3Areview)
 [![SWH](https://archive.softwareheritage.org/badge/origin/https://github.com/computorg/published-202402-elmasri-optimal/)](https://archive.softwareheritage.org/browse/origin/?origin_url=https://github.com/computorg/published-202402-elmasri-optimal)
 [![Creative Commons License](https://i.creativecommons.org/l/by/4.0/80x15.png)](http://creativecommons.org/licenses/by/4.0/)
 
+Authors:
+
+- Maxime El Masri (ONERA and ISAE)
+- Jérôme Morio (ONERA)
+- Florian Simatos (ISAE)
+
+We propose a dimension reduction strategy in order to improve the performance of importance sampling in high dimensions. The idea is to estimate variance terms in a small number of suitably chosen directions. We first prove that the optimal directions, i.e., the ones that minimize the Kullback--Leibler divergence with the optimal auxiliary density, are the eigenvectors associated with extreme (small or large) eigenvalues of the optimal covariance matrix. We then perform extensive numerical experiments showing that as dimension increases, these directions give estimations which are very close to optimal. Moreover, we demonstrate that the estimation remains accurate even when a simple empirical estimator of the covariance matrix is used to compute these directions. The theoretical and numerical results open the way for different generalizations, in particular the incorporation of such ideas in adaptive importance sampling schemes.
+
 
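The abstract above describes the projection idea in words. The following is a minimal, illustrative Python sketch of that idea, not code from this repository: it assumes a standard Gaussian nominal density N(0, I_d), a hypothetical toy performance function phi, and illustrative variable names throughout, and it only shows how extreme-eigenvalue directions of an empirically estimated optimal covariance matrix could be selected and turned into a low-rank auxiliary covariance.

# Illustrative sketch only (not the authors' implementation).
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 100, 10_000, 2          # dimension, sample budget, retained directions
threshold = 3.0                   # toy rare-event threshold

def phi(x):
    # Hypothetical performance function: failure when the scaled mean is large.
    return x.mean(axis=-1) * np.sqrt(x.shape[-1])

# Sample from the nominal density N(0, I_d) and flag the rare event.
x = rng.standard_normal((n, d))
w = (phi(x) >= threshold).astype(float)

if w.sum() > 1:
    # Empirical mean and covariance of the optimal (conditional) density.
    mu_hat = (w[:, None] * x).sum(axis=0) / w.sum()
    xc = x - mu_hat
    sigma_hat = (w[:, None] * xc).T @ xc / w.sum()

    # Keep the k eigenvectors whose eigenvalues deviate most from 1, i.e. the
    # extreme directions relative to the identity covariance.
    eigval, eigvec = np.linalg.eigh(sigma_hat)
    idx = np.argsort(np.abs(eigval - 1.0))[-k:]
    V, lam = eigvec[:, idx], eigval[idx]

    # Low-rank auxiliary covariance: identity plus a correction along V only.
    sigma_proj = np.eye(d) + V @ np.diag(lam - 1.0) @ V.T

An importance sampling estimate would then draw from a Gaussian with mean mu_hat and covariance sigma_proj and reweight by the likelihood ratio. The sketch only illustrates the direction-selection step; the criterion for "extreme" eigenvalues follows the abstract's wording rather than the paper's precise statement.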

_quarto.yml

Lines changed: 52 additions & 4 deletions
@@ -1,9 +1,57 @@
 project:
-  title: "Optimal projection for parametric importance sampling in high dimensions"
-  type: website
   render:
     - published-202402-elmasri-optimal.qmd
 
+title: Optimal projection for parametric importance sampling in high dimensions
+author:
+  - name: Maxime El Masri
+    affiliation: '[ONERA/DTIS](https://www.onera.fr/), [ISAE-SUPAERO](https://www.isae-supaero.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
+    orcid: 0000-0002-9127-4503
+  - name: Jérôme Morio
+    url: 'https://www.onera.fr/en/staff/jerome-morio?destination=node/981'
+    affiliation: '[ONERA/DTIS](https://www.onera.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
+    orcid: 0000-0002-8811-8956
+  - name: Florian Simatos
+    url: 'https://pagespro.isae-supaero.fr/florian-simatos/'
+    affiliation: '[ISAE-SUPAERO](https://www.isae-supaero.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
+description: |
+  This document provides a dimension-reduction strategy in order to improve the performance of importance sampling in high dimensions.
+abstract: |
+  We propose a dimension reduction strategy in order to improve the performance of importance sampling in high dimensions. The idea is to estimate variance terms in a small number of suitably chosen directions. We first prove that the optimal directions, i.e., the ones that minimize the Kullback--Leibler divergence with the optimal auxiliary density, are the eigenvectors associated with extreme (small or large) eigenvalues of the optimal covariance matrix. We then perform extensive numerical experiments showing that as dimension increases, these directions give estimations which are very close to optimal. Moreover, we demonstrate that the estimation remains accurate even when a simple empirical estimator of the covariance matrix is used to compute these directions. The theoretical and numerical results open the way for different generalizations, in particular the incorporation of such ideas in adaptive importance sampling schemes.
+keywords:
+  - Rare event simulation
+  - Parameter estimation
+  - Importance sampling
+  - Dimension reduction
+  - Kullback--Leibler divergence
+  - Projection
+bibliography: references.bib
+github-user: computorg
+repo: published-202402-elmasri-optimal
+date: 03/11/2024
+date-modified: last-modified
+draft: false
+published: true
+google-scholar: true
+citation:
+  type: article-journal
+  container-title: "Computo"
+  doi: "10.57750/jjza-6j82"
+  publisher: "French Statistical Society"
+  issn: "2824-7795"
 format:
-  computo-html:
-    code-fold: true
+  computo-html: default
+  computo-pdf: default
+execute:
+  keep-ipynb: true
+jupyter:
+  jupytext:
+    text_representation:
+      extension: .qmd
+      format_name: quarto
+      format_version: '1.0'
+    jupytext_version: 1.14.2
+  kernelspec:
+    display_name: Python 3 (ipykernel)
+    language: python
+    name: python3
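For context, the jupyter/jupytext block added above pairs the .qmd source with a Python 3 notebook, and keep-ipynb: true asks Quarto to keep the executed notebook. A minimal sketch of exercising that pairing programmatically (assuming the jupytext package is installed, and the Quarto command-line tool that its .qmd support may rely on; this snippet is not part of the repository):

# Illustrative sketch only: round-tripping the article source through jupytext.
import jupytext

# Read the Quarto source as an nbformat notebook object.
nb = jupytext.read("published-202402-elmasri-optimal.qmd")

# Write it back out as the .ipynb that `keep-ipynb: true` also preserves.
jupytext.write(nb, "published-202402-elmasri-optimal.ipynb")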

published-202402-elmasri-optimal.qmd

Lines changed: 0 additions & 55 deletions
@@ -1,58 +1,3 @@
----
-title: Optimal projection for parametric importance sampling in high dimensions
-author:
-  - name: Maxime El Masri
-    affiliation: '[ONERA/DTIS](https://www.onera.fr/), [ISAE-SUPAERO](https://www.isae-supaero.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
-    orcid: 0000-0002-9127-4503
-  - name: Jérôme Morio
-    url: 'https://www.onera.fr/en/staff/jerome-morio?destination=node/981'
-    affiliation: '[ONERA/DTIS](https://www.onera.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
-    orcid: 0000-0002-8811-8956
-  - name: Florian Simatos
-    url: 'https://pagespro.isae-supaero.fr/florian-simatos/'
-    affiliation: '[ISAE-SUPAERO](https://www.isae-supaero.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
-description: |
-  This document provides a dimension-reduction strategy in order to improve the performance of importance sampling in high dimensions.
-abstract: |
-  We propose a dimension reduction strategy in order to improve the performance of importance sampling in high dimensions. The idea is to estimate variance terms in a small number of suitably chosen directions. We first prove that the optimal directions, i.e., the ones that minimize the Kullback--Leibler divergence with the optimal auxiliary density, are the eigenvectors associated with extreme (small or large) eigenvalues of the optimal covariance matrix. We then perform extensive numerical experiments showing that as dimension increases, these directions give estimations which are very close to optimal. Moreover, we demonstrate that the estimation remains accurate even when a simple empirical estimator of the covariance matrix is used to compute these directions. The theoretical and numerical results open the way for different generalizations, in particular the incorporation of such ideas in adaptive importance sampling schemes.
-keywords:
-  - Rare event simulation
-  - Parameter estimation
-  - Importance sampling
-  - Dimension reduction
-  - Kullback--Leibler divergence
-  - Projection
-bibliography: references.bib
-github-user: computorg
-repo: published-202402-elmasri-optimal
-date: 03/11/2024
-date-modified: last-modified
-draft: false
-published: true
-google-scholar: true
-citation:
-  type: article-journal
-  container-title: "Computo"
-  doi: "10.57750/jjza-6j82"
-  publisher: "French Statistical Society"
-  issn: "2824-7795"
-format:
-  computo-html: default
-  computo-pdf: default
-execute:
-  keep-ipynb: true
-jupyter:
-  jupytext:
-    text_representation:
-      extension: .qmd
-      format_name: quarto
-      format_version: '1.0'
-    jupytext_version: 1.14.2
-  kernelspec:
-    display_name: Python 3 (ipykernel)
-    language: python
-    name: python3
----
 
 # Introduction
 