 project:
-  title: "Optimal projection for parametric importance sampling in high dimensions"
-  type: website
   render:
     - published-202402-elmasri-optimal.qmd
 
+title: Optimal projection for parametric importance sampling in high dimensions
+author:
+  - name: Maxime El Masri
+    affiliation: '[ONERA/DTIS](https://www.onera.fr/), [ISAE-SUPAERO](https://www.isae-supaero.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
+    orcid: 0000-0002-9127-4503
+  - name: Jérôme Morio
+    url: 'https://www.onera.fr/en/staff/jerome-morio?destination=node/981'
+    affiliation: '[ONERA/DTIS](https://www.onera.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
+    orcid: 0000-0002-8811-8956
+  - name: Florian Simatos
+    url: 'https://pagespro.isae-supaero.fr/florian-simatos/'
+    affiliation: '[ISAE-SUPAERO](https://www.isae-supaero.fr/), [Université de Toulouse](https://www.univ-toulouse.fr/)'
+description: |
+  This document presents a dimension-reduction strategy to improve the performance of importance sampling in high dimensions.
+abstract: |
+  We propose a dimension-reduction strategy to improve the performance of importance sampling in high dimensions. The idea is to estimate variance terms in a small number of suitably chosen directions. We first prove that the optimal directions, i.e., the ones that minimize the Kullback--Leibler divergence with the optimal auxiliary density, are the eigenvectors associated with extreme (small or large) eigenvalues of the optimal covariance matrix. We then perform extensive numerical experiments showing that as dimension increases, these directions give estimates that are very close to optimal. Moreover, we demonstrate that the estimation remains accurate even when a simple empirical estimator of the covariance matrix is used to compute these directions. The theoretical and numerical results open the way for different generalizations, in particular the incorporation of such ideas in adaptive importance sampling schemes.
+keywords:
+  - Rare event simulation
+  - Parameter estimation
+  - Importance sampling
+  - Dimension reduction
+  - Kullback--Leibler divergence
+  - Projection
+bibliography: references.bib
+github-user: computorg
+repo: published-202402-elmasri-optimal
+date: 03/11/2024
+date-modified: last-modified
+draft: false
+published: true
+google-scholar: true
+citation:
+  type: article-journal
+  container-title: "Computo"
+  doi: "10.57750/jjza-6j82"
+  publisher: "French Statistical Society"
+  issn: "2824-7795"
 format:
-  computo-html:
-    code-fold: true
+  computo-html: default
+  computo-pdf: default
+execute:
+  keep-ipynb: true
+jupyter:
+  jupytext:
+    text_representation:
+      extension: .qmd
+      format_name: quarto
+      format_version: '1.0'
+    jupytext_version: 1.14.2
+  kernelspec:
+    display_name: Python 3 (ipykernel)
+    language: python
+    name: python3