Add NSGA2 and NRMSE #7


Merged · 5 commits · Jul 30, 2024

Conversation

@Domsall (Collaborator) commented May 21, 2024

To use NSGA-II, we need to add pymoo or a newer pygad-based solution.

I found a bug using the pygad-solution that I could only solve by changing the pygad code: ahmedfgad/GeneticAlgorithmPython#261

I also exposed many parameters on the command line so that values can be changed without editing the code.

Solves #10

@stepeos (Owner) commented Jul 8, 2024

LGTM! I'll do some testing and merge ASAP.

@stepeos (Owner) commented Jul 11, 2024

I'm having a bit of trouble getting nsde to work; I haven't tried the other ones yet.
Does

--action=calibrate ${workspaceFolder}/test_data/ ${workspaceFolder}/test_data/result/ nsde --population-size=200 --max-iter=10 --force-selection --param-keys=speedFactor,minGap,accel,decel,startupDelay,tau,delta,tpreview,tPersDrive,tPersEstimate,treaction,ccoolness,jerkmax,epsilonacc,taccmax,Mflatness,Mbegin --mop=distance,acceleration --gof=theils_u

work for you?

I think it's because of pymoo/core/problem.py:default_shape, see line 453:
F=(n, problem.n_obj),
We need to return the RMSE error in the vectorized target for each MOP; this change is not trivial.
So when using distance,acceleration with pop size 200, we need an output shape of (200, 2).
I will have to look at how we could rewrite measure_of_performance_factory to support that, but I think the simplest approach is to remove the scipy and pygad optimization options and rewrite the optimization factories.
How do you feel about removing those and solely depending on pymoo(de)?
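The shape requirement described above can be sketched with plain NumPy: pymoo's vectorized evaluation expects an objective matrix `F` of shape `(pop_size, n_obj)` for a population matrix of shape `(pop_size, n_var)`. The error functions below are stand-ins for illustration only, not the project's actual NRMSE implementations.

```python
import numpy as np

# Sketch of the vectorized objective evaluation pymoo expects: for a
# population of shape (pop_size, n_var), the objective matrix F must
# have shape (pop_size, n_obj) -- one column per measure of performance.
def evaluate_population(x, error_fns):
    return np.column_stack([f(x) for f in error_fns])

pop = np.random.rand(200, 17)  # pop size 200, 17 calibration parameters
error_fns = [
    lambda x: np.sqrt(np.mean(x**2, axis=1)),        # stand-in for NRMSE(distance)
    lambda x: np.sqrt(np.mean((x - 1)**2, axis=1)),  # stand-in for NRMSE(acceleration)
]
F = evaluate_population(pop, error_fns)
print(F.shape)  # (200, 2)
```

With distance and acceleration as the two MOPs and a population of 200, this yields exactly the `(200, 2)` shape the pymoo default expects.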

@Domsall (Collaborator, Author) commented Jul 12, 2024

It works perfectly with NRMSE. I only used that GoF because it performs best according to the literature.

Do you have an idea where to calculate the relative error of each parameter combination and put the results into the CSV files? I would like to have the error of distance, speed, and acceleration for each combination, to compare different GoFs and results.

@stepeos (Owner) commented Jul 28, 2024

@Domsall can you check whether 3f787fe works for you? I changed the MOP factory so that multiple measures of performance can either be summed or returned as a tuple, the latter being used for pymoode & pymoo. Those should now work with Theil's U and the others.

Do you mean that you want not only the summed error (Error(distance) + Error(acceleration)), but rather separate entries for each measure-of-performance error per iteration? So instead of

,iteration,weightedError,convergence,speedFactor,minGap,accel,decel,emergencyDecel,startupDelay,tau,delta,stepping,tpreview,tPersDrive,tPersEstimate,treaction,ccoolness,sigmaleader,sigmagap,sigmaerror,jerkmax,epsilonacc,actionStepLength,taccmax,Mflatness,Mbegin,leader,follower,recordingId,algorithm,pop-size,objectives,paramKeys,weights,gof
0,1,0.0788671198053691,-1,1.2855707451145828,1.318680612079782,1.5195624232108904,2.0885433641765334,15,0.2711937168822218,1.7876845439698485,1.5757631151187173,0.25,7.620127425618868,5.557696634958655,13.005749369333241,0.23046687465248375,0.3215290425864744,0.0001,0.0001,0.0001,4.446657923678764,2.286665232349671,0.0001,1.4450888727958722,1.877283706894736,1.080154097499501,122,148,1,nsde,200,"distance,acceleration","speedFactor,minGap,accel,decel,startupDelay,tau,delta,tpreview,tPersDrive,tPersEstimate,treaction,ccoolness,jerkmax,epsilonacc,taccmax,Mflatness,Mbegin",,theils_u

you want something like this:

,iteration,weightedError,ERROR(distance),ERROR(acceleration),convergence,speedFactor,minGap,accel,decel,emergencyDecel,startupDelay,tau,delta,stepping,tpreview,tPersDrive,tPersEstimate,treaction,ccoolness,sigmaleader,sigmagap,sigmaerror,jerkmax,epsilonacc,actionStepLength,taccmax,Mflatness,Mbegin,leader,follower,recordingId,algorithm,pop-size,objectives,paramKeys,weights,gof

If not, please make an example csv header and result for the desired entries.
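The summed-vs-tuple behavior of the changed MOP factory could look roughly like the sketch below. All names here (including `mop_factory` and the toy error functions) are illustrative placeholders, not the project's actual `measure_of_performance_factory` code.

```python
# Hypothetical sketch: a measure-of-performance factory that either sums
# the per-MOP errors (for single-objective solvers) or returns them as a
# tuple (for multi-objective solvers such as pymoo/pymoode).
def mop_factory(error_fns, vectorized=False):
    def combined(sim, ground_truth):
        errors = tuple(f(sim, ground_truth) for f in error_fns)
        return errors if vectorized else sum(errors)
    return combined

# Toy per-MOP error functions (stand-ins for the real GoF computations).
def distance_error(sim, gt):
    return abs(sim["distance"] - gt["distance"])

def accel_error(sim, gt):
    return abs(sim["accel"] - gt["accel"])

summed = mop_factory([distance_error, accel_error])
per_mop = mop_factory([distance_error, accel_error], vectorized=True)

sim = {"distance": 10.0, "accel": 1.5}
gt = {"distance": 12.0, "accel": 1.0}
print(summed(sim, gt))   # 2.5
print(per_mop(sim, gt))  # (2.0, 0.5)
```

The tuple form is what a multi-objective solver needs to fill one column of `F` per objective; the summed form preserves the old single-objective behavior.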

@Domsall (Collaborator, Author) commented Jul 29, 2024

To your second bullet point:
Yes, absolutely. And the error should be the relative error.

@Domsall (Collaborator, Author) commented Jul 29, 2024

The other changes look great and should work like that. I will test and compare after you have added the relative error and merged everything.

@stepeos (Owner) commented Jul 29, 2024

To your second bullet point: Yes, absolutely. And the error should be the relative Error.

So for NRMSE with distance and acceleration as measures of performance, you mean relative to their total error

rel_error_distance = NRMSE(distance) / (NRMSE(distance) + NRMSE(accel))

?
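The proposed normalization is a one-liner once the per-MOP NRMSE values are available. A minimal sketch, assuming the per-MOP errors arrive as a dict (the function name and input values are illustrative):

```python
# Normalize each measure-of-performance NRMSE by the total, so the
# relative errors sum to 1 across all MOPs.
def relative_errors(nrmse_by_mop):
    total = sum(nrmse_by_mop.values())
    return {mop: err / total for mop, err in nrmse_by_mop.items()}

rel = relative_errors({"distance": 0.06, "acceleration": 0.02})
print(rel)  # distance ~ 0.75, acceleration ~ 0.25
```

As Domsall notes below, if the per-MOP errors are already written out, this normalization can just as well be done after the optimization run.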

@Domsall
Copy link
Collaborator Author

Domsall commented Jul 29, 2024

Giving it more thought, I think the best idea is to simply always output the corresponding distance/speed/acceleration error of the chosen GoF next to the weightedError. That way, any relative error can be calculated after the optimization.

EDIT: These errors are already in "all_results", so there may be no need to change anything.

@Domsall Domsall merged commit fc58339 into main Jul 30, 2024
@Domsall Domsall deleted the nsga branch July 30, 2024 13:33