lectures/navy_captain.md
This lecture follows up on ideas presented in the following lectures:

* {doc}`Exchangeability and Bayesian Updating <exchangeable>`
* {doc}`Likelihood Ratio Processes <likelihood_ratio_process>`

{doc}`A Problem that Stumped Milton Friedman <wald_friedman>` described a problem
that a Navy Captain presented to Milton Friedman during World War II.

The Navy had told the Captain to use a decision rule for quality control.

In particular, the Navy had ordered the Captain to use an instance of a **frequentist decision rule**.

The Captain doubted that the rule was a good one.

Milton Friedman recognized the Captain's conjecture as posing a challenging statistical problem that he and other members of the US Government's Statistical Research Group at Columbia University proceeded to try to solve.

A member of the group, the great mathematician and economist Abraham Wald, soon solved the problem.

A good way to formulate the problem is to use some ideas from Bayesian statistics that we describe in
the lecture {doc}`Exchangeability and Bayesian Updating <exchangeable>` and in the lecture
{doc}`Likelihood Ratio Processes <likelihood_ratio_process>`, which describes the link between Bayesian
updating and likelihood ratio processes.
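
That link can be sketched numerically. The snippet below is a minimal illustration, not code from this lecture: it assumes two hypothetical Beta densities for $f_0$ and $f_1$ and checks that recursively updating $\pi_t = \textrm{Prob}(f = f_0 \mid z^t)$ by Bayes' law gives the same answer as the closed-form expression in terms of the likelihood ratio process $l_t = \prod_{i} f_1(z_i)/f_0(z_i)$.

```python
import numpy as np
from scipy.stats import beta

# Hypothetical Beta densities for illustration only; the lecture
# sets its own parameters for f_0 and f_1.
f0, f1 = beta(1, 1), beta(3, 1.2)

rng = np.random.default_rng(1234)
z = f1.rvs(size=50, random_state=rng)    # suppose nature drew f_1

# Recursive Bayes update of pi_t = Prob(f = f_0 | z^t)
pi = 0.5                                 # prior pi*
for zi in z:
    num = pi * f0.pdf(zi)
    pi = num / (num + (1 - pi) * f1.pdf(zi))

# Same posterior via the likelihood ratio process l_t = prod f_1(z_i)/f_0(z_i)
l_t = np.exp(np.sum(f1.logpdf(z) - f0.logpdf(z)))
pi_via_l = 0.5 / (0.5 + 0.5 * l_t)

print(pi, pi_via_l)   # the two computations coincide up to rounding
```

With data generated by $f_1$, $\pi_t$ typically drifts toward zero, reflecting mounting evidence against $f_0$.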

The present lecture uses Python to generate simulations that evaluate expected losses under **frequentist** and **Bayesian** decision rules for an instance of the Navy Captain's decision problem.

The simulations confirm the Navy Captain's hunch that there is a better rule than the one the Navy had ordered him to use.

## Setup

To formalize the problem that had confronted the Navy Captain, we consider a setting with the following parts.

- Each period a decision maker draws a non-negative random variable
  $Z$. He knows that two probability distributions are possible,
  $f_{0}$ and $f_{1}$, and that whichever distribution it
  is remains fixed over time. The decision maker believes that before
  the beginning of time, nature once and for all had selected either
  $f_{0}$ or $f_1$ and that the probability that it
  selected $f_0$ is $\pi^{*}$.
- The decision maker observes a sample
  $\left\{ z_{i}\right\}_{i=0}^{t}$ from the distribution
  chosen by nature.

The decision maker wants to decide which distribution actually governs
$Z$.

He is worried about two types of errors and the losses that they will
impose on him.

- a loss $\bar L_{1}$ from a **type I error** that occurs if he decides that
  $f=f_{1}$ when actually $f=f_{0}$
- a loss $\bar L_{0}$ from a **type II error** that occurs if he decides that
  $f=f_{0}$ when actually $f=f_{1}$

The decision maker pays a cost $c$ for drawing
another $z$.
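
To preview the kind of simulation used below, here is a minimal Monte Carlo sketch with made-up placeholder values for $\bar L_0$, $\bar L_1$, $c$, $\pi^{*}$ and illustrative Beta densities (the lecture's own calibration differs): it estimates the expected loss of a rule that draws a fixed sample of size $t$ and then decides on the basis of a log likelihood ratio cutoff $d$.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)

# Placeholder values for illustration only -- not the lecture's calibration
f0, f1 = beta(1, 1), beta(3, 1.2)        # the two candidate densities
L0_bar, L1_bar = 25.0, 25.0              # losses from type II and type I errors
c, pi_star = 1.25, 0.5                   # sampling cost and prior Prob(f = f_0)

def expected_loss(t, d, n_sims=10_000):
    """Monte Carlo estimate of the expected loss of a rule that draws a
    sample of size t and decides f = f_1 when the summed log
    likelihood ratio log(f_1/f_0) exceeds the cutoff d."""
    loss = c * t                                      # cost of t draws
    for f, prob, is_f0 in ((f0, pi_star, True), (f1, 1 - pi_star, False)):
        z = f.rvs(size=(n_sims, t), random_state=rng)
        llr = (f1.logpdf(z) - f0.logpdf(z)).sum(axis=1)
        decide_f1 = llr > d
        if is_f0:
            loss += prob * L1_bar * decide_f1.mean()      # type I error rate
        else:
            loss += prob * L0_bar * (~decide_f1).mean()   # type II error rate
    return loss

print(expected_loss(t=10, d=0.0))
```

Sweeping such an estimate over $t$ and $d$ and minimizing is one way to search for the kind of better rule that the Captain suspected exists.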

We mainly borrow parameters from the quantecon lecture
{doc}`A Problem that Stumped Milton Friedman <wald_friedman>` except that we increase both $\bar L_{0}$

In particular, it gave him a decision rule that the Navy had designed by using
frequentist statistical theory to minimize an
expected loss function.

That decision rule is characterized by a sample size $t$ and a