
Model reparameterization changing outcome probabilities #425

@juangmendoza19

Description


Calling `set_all_parameterizations` to convert a full TP model to a GLND model changes some outcome probabilities non-trivially.

To reproduce:

```python
from pygsti.modelpacks import smq1Q_XY as std

datagen_model = std.target_model("GLND")

# Arbitrary error where I observed the problem
error_vec = [0] * 48
error_vec[0] = 0.01
datagen_model.from_vector(error_vec)

design = std.create_gst_experiment_design(16)

# Circuit with the maximum difference
bad_circuit = design.all_circuits_needing_data[394]

# Round-trip the parameterization: GLND -> full TP -> GLND
datagen_model_copy = datagen_model.copy()
datagen_model_copy.set_all_parameterizations("full TP")
datagen_model_copy.set_all_parameterizations("GLND", ideal_model=std.target_model("GLND"))

# This difference should be zero, but is not
print(datagen_model.probabilities(bad_circuit)['0'] - datagen_model_copy.probabilities(bad_circuit)['0'])
```

Expected behavior
The probability difference should be zero (up to machine precision), since converting between parameterizations should not change the model's predictions. Instead, the code above outputs a probability difference of -1.406968064276981e-08. This is a substantial difference that is causing issues in my current project, which requires comparing gauge-equivalent models.
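
A minimal sketch of how the discrepancy can be surveyed across the full design, assuming the `datagen_model`, `datagen_model_copy`, and `design` objects from the snippet above:

```python
# Sketch: scan every circuit in the design and record the largest
# '0'-outcome probability discrepancy between the two nominally
# gauge-equivalent models.
max_diff, worst_idx = 0.0, None
for i, circ in enumerate(design.all_circuits_needing_data):
    diff = abs(datagen_model.probabilities(circ)['0']
               - datagen_model_copy.probabilities(circ)['0'])
    if diff > max_diff:
        max_diff, worst_idx = diff, i

print(f"max |p('0') difference| = {max_diff:.3e} at circuit index {worst_idx}")
```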

Environment:

  • pyGSTi version 0.9.12.1
  • python version 3.10.14
  • macOS Sonoma 14.4.1

Additional context
After an email exchange with Riley and Corey, Riley identified the problem as being in the state preparation: one of the vector entries deviates by 2.7057608985464707e-08 after conversion. This makes sense, considering the model only has errors in the state preparation.
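
A minimal sketch of how this deviation can be checked directly, assuming the models expose the state preparation under the label 'rho0' and that the prep objects support `to_dense()`:

```python
import numpy as np

# Sketch: compare the dense state-preparation vectors before and after the
# parameterization round trip (datagen_model and datagen_model_copy from the
# reproduction snippet above).
rho_before = datagen_model['rho0'].to_dense()
rho_after = datagen_model_copy['rho0'].to_dense()

print("max |entry deviation| =", np.max(np.abs(rho_before - rho_after)))
```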

I believe I have traced the issue to pygsti/modelmembers/states/__init__.py, line 269. The scipy optimization there returns exactly the number above, 2.7057608985464707e-08, as the error. I tried changing the tolerance of the optimization, but this did not seem to change the behavior.
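
To illustrate the general failure mode (this is a generic example, not pyGSTi's code): a scipy minimization run at its default tolerances typically terminates with a small but nonzero residual, and any such residual in the fitted state-preparation parameters would show up directly as an entry-wise deviation like the one above.

```python
import numpy as np
from scipy.optimize import minimize

# Generic illustration (not pyGSTi's code): fit a smoothly parameterized
# 4-vector to a target by minimizing a least-squares objective. Even though
# an exact solution exists, the optimizer stops at its default tolerances
# with a small nonzero per-entry residual, well above machine precision.
rng = np.random.default_rng(0)
target = rng.normal(size=4)

def vec(params):
    # a smooth, entrywise-invertible parameterization, so an exact fit exists
    return params + 0.05 * params**3

def objective(params):
    return np.linalg.norm(vec(params) - target)**2

result = minimize(objective, x0=np.zeros(4), method="L-BFGS-B")
print("max per-entry residual:", np.max(np.abs(vec(result.x) - target)))
```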
