I'm starting a series of issues to track some potential ideas to accelerate our Kálmán filter and also increase its numerical stability.
The idea is to factor the covariance as $P = U D U^\top$, where $U$ is unit upper triangular and $D$ is diagonal, and to propagate $U$ and $D$ instead of $P$. From an initial covariance matrix $P$, the factors can be computed like this:
```python
import numpy


def udu(P):
    """Factor a 6x6 covariance matrix as P = U @ D @ U.T,
    with U unit upper triangular and D diagonal."""
    U = numpy.zeros((6, 6))
    D = numpy.zeros((6, 6))
    D[5, 5] = P[5, 5]
    U[:, 5] = P[:, 5] / D[5, 5]
    for j in reversed(range(0, 5)):
        # D[j, j] = P[j, j] - sum_k D[k, k] * U[j, k] ** 2
        tmp = 0.0
        for k in range(j + 1, 6):
            tmp += D[k, k] * U[j, k] * U[j, k]
        D[j, j] = P[j, j] - tmp
        for i in reversed(range(0, j)):
            # U[i, j] = (P[i, j] - sum_k D[k, k] * U[i, k] * U[j, k]) / D[j, j]
            tmp = 0.0
            for k in range(j + 1, 6):
                tmp += D[k, k] * U[j, k] * U[i, k]
            U[i, j] = (P[i, j] - tmp) / D[j, j]
        U[j, j] = 1.0
    return U, D
```
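As a sanity check (not part of the original issue), SciPy already ships an LDLᵀ factorization, `scipy.linalg.ldl`; called with `lower=False` it returns an upper-triangular factor and a diagonal pivot matrix, i.e. the same $P = U D U^\top$ convention, so it can be used to validate a hand-rolled factorization against random positive-definite matrices. A minimal sketch, assuming SciPy is available:

```python
import numpy
from scipy.linalg import ldl

rng = numpy.random.default_rng(42)
A = rng.standard_normal((6, 6))
P = A @ A.T + 6 * numpy.eye(6)  # random symmetric positive-definite matrix

# lower=False requests the upper-triangular variant: P = lu @ d @ lu.T
lu, d, perm = ldl(P, lower=False)
assert numpy.allclose(lu @ d @ lu.T, P)
# for a positive-definite P every pivot is 1x1, so d is truly diagonal
assert numpy.allclose(d, numpy.diag(numpy.diag(d)))
```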
The Kálmán prediction step becomes slightly more complicated than before. Instead of simply updating the covariance matrix with the Jacobian, i.e. $P_{k+1} = F P_k F^\top + Q$, we have to find updated factors $U$ and $D$ such that $U D U^\top = F P F^\top + Q$, in a way that preserves the diagonality of $D$. This moves the logic of the `update_covariance` method to a new one that operates on the factors directly.
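The issue doesn't name an algorithm for this step, but the standard choice is Thornton's modified weighted Gram–Schmidt (MWGS) propagation: stack $W = [F U \;\; I]$ with weights $\mathrm{diag}(D, Q)$ and re-orthogonalize the rows. A sketch under that assumption (the function name `predict_ud` is mine, and $Q$ is assumed diagonal here; a non-diagonal $Q$ would need its own factorization first):

```python
import numpy


def predict_ud(Phi, U, D, Q):
    """Propagate UDU' factors so that the returned (Up, Dp) satisfy
    Up @ Dp @ Up.T = Phi @ U @ D @ U.T @ Phi.T + Q, with Q diagonal.
    Uses modified weighted Gram-Schmidt on the rows of W = [Phi @ U, I]."""
    n = U.shape[0]
    W = numpy.hstack([Phi @ U, numpy.eye(n)])
    dw = numpy.concatenate([numpy.diag(D), numpy.diag(Q)])  # row weights
    Up = numpy.eye(n)
    Dp = numpy.zeros((n, n))
    for j in reversed(range(n)):
        # weighted squared norm of the (already orthogonalized) j-th row
        Dp[j, j] = dw @ (W[j] * W[j])
        for i in range(j):
            # Gram-Schmidt coefficient, then remove the j-th component
            Up[i, j] = (dw @ (W[i] * W[j])) / Dp[j, j]
            W[i] -= Up[i, j] * W[j]
    return Up, Dp


# demo: propagate a random 4-state factorization, compare with the dense update
rng = numpy.random.default_rng(0)
n = 4
U = numpy.triu(rng.standard_normal((n, n)), 1) + numpy.eye(n)
D = numpy.diag(rng.uniform(0.5, 2.0, size=n))
Phi = rng.standard_normal((n, n))
Q = numpy.diag(rng.uniform(0.1, 1.0, size=n))
Up, Dp = predict_ud(Phi, U, D, Q)
```

The point of MWGS here is that it never forms the predicted covariance explicitly, so the result is a valid $U D U^\top$ factorization by construction.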
The Kálmán update step also changes, since the filtered covariance must now be computed in factored form. For this, we use the Agee–Turner algorithm.
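For reference, the Agee–Turner algorithm is a rank-one modification of the factors: given $U D U^\top + c\,a a^\top$ with $c \ge 0$, it produces updated factors in $O(n^2)$ without reconstructing the covariance. A sketch (the function name and signature are mine; a downdate with $c < 0$ can lose positive definiteness and needs separate care):

```python
import numpy


def agee_turner(U, D, c, a):
    """Rank-one update of UDU' factors: return (Ut, Dt) such that
    Ut @ Dt @ Ut.T = U @ D @ U.T + c * outer(a, a), for c >= 0."""
    n = U.shape[0]
    U, D = U.copy(), D.copy()
    a = numpy.asarray(a, dtype=float).copy()
    for j in reversed(range(1, n)):
        dj = D[j, j] + c * a[j] ** 2        # updated diagonal entry
        s = c * a[j] / dj
        for i in range(j):
            a[i] -= a[j] * U[i, j]          # fold the j-th component out of a
            U[i, j] += s * a[i]
        c *= D[j, j] / dj                   # shrink the remaining rank-one weight
        D[j, j] = dj
    D[0, 0] += c * a[0] ** 2
    return U, D


# demo: apply a random rank-one update and compare with the dense result
rng = numpy.random.default_rng(1)
n = 5
U = numpy.triu(rng.standard_normal((n, n)), 1) + numpy.eye(n)
D = numpy.diag(rng.uniform(0.5, 2.0, size=n))
a = rng.standard_normal(n)
c = 0.7
Ut, Dt = agee_turner(U, D, c, a)
```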
> [!CAUTION]
> I'm not currently sure if the
> [!NOTE]
> The diagonal matrix
Useful documents: