
Conversation

lindnemi (Collaborator)

Before:
power_limit_old
After:
power-limit-new

Description

I learned a nice trick (for some of you this is probably old news, but I didn't know it, and I think it is very helpful):

In PyPSA-DE we want to limit the total capacity of power imports. A relatively straightforward way to attempt this is to define a constraint for every snapshot t:

incoming_flows = n.model["Line-s"].loc[t, incoming_lines]
sum(incoming_flows) <= limit

However, "Line-s" may be both positive and negative. Hence this limit will only apply to the net import. In other words: If the model would like to import more from France, it can do so if it exports a bit to Switzerland. (This can get pretty bad, sometimes we observe gross imports that are twice as big as the limit.)

To avoid this behaviour, what you actually want to constrain are only the gross imports, i.e., the positive parts of the flows:

sum(max(0, incoming_flows)) <= limit

However, max is a nonlinear function. At first I thought it was impossible to achieve this in PyPSA-DE without converting all lines to links, or turning the LP into a MILP. But there is another way, and ChatGPT gave it away pretty quickly:

For every incoming flow i, introduce an auxiliary variable

p[i] >= 0
p[i] >= incoming_flows[i]

then add the constraint

sum(p[i]) <= limit

and voilà, the max has been linearized and the constraint applies to the gross import. Why this works: the two bounds imply p[i] >= max(0, incoming_flows[i]), and since p[i] appears nowhere else in the model, the solver can always set it to exactly that value, so bounding sum(p[i]) is equivalent to bounding the gross import. A sketch in linopy syntax follows below.
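In code, the trick could look roughly like the following sketch of a PyPSA extra_functionality hook. This is not the actual PyPSA-DE implementation: incoming_lines and import_limit are placeholder names, and I assume a sign convention where positive "Line-s" means flow into the importing region, so real code would also have to account for line orientation.

from functools import partial

def limit_gross_imports(n, snapshots, incoming_lines, import_limit):
    m = n.model
    # Flows on the selected cross-border lines; may be positive or negative
    flow = m["Line-s"].sel(Line=incoming_lines)
    # One auxiliary variable per snapshot and line, bounded below by zero
    p = m.add_variables(lower=0, coords=flow.coords, name="Line-gross-import")
    # Together with the lower bound of zero, this enforces p >= max(0, flow)
    m.add_constraints(p >= flow, name="Line-gross-import-def")
    # Cap the sum of the positive parts, i.e. the gross import, per snapshot
    m.add_constraints(p.sum("Line") <= import_limit, name="Line-gross-import-limit")

# Usage, e.g.:
# n.optimize(extra_functionality=partial(limit_gross_imports,
#                                        incoming_lines=lines, import_limit=limit))

Note that everything is vectorized over snapshots: one add_variables call and two add_constraints calls cover the whole horizon.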

Before asking for a review for this PR, make sure to complete the following checklist:

  • Workflow with target rule ariadne_all completes without errors
  • The logic of export_ariadne_variables has been adapted to the changes
  • One or several figures that validate the changes in the PR have been posted as a comment
  • A brief description of the changes has been added to Changelog.md
  • The latest main has been merged into the PR
  • The config has a new prefix of the format YYYYMMDDdescriptive_title

@lindnemi requested a review from JulianGeis on September 26, 2025 at 12:40
@JulianGeis (Contributor) left a comment

I reviewed the PR:

  • the workflow runs through at 365H resolution
  • import and export limits are met (15 MW in 2020 and 35 MW in 2045) [two validation figures attached]
  • code is well documented

Possible problems:

  • This PR creates many auxiliary variables; for the 365H run this did not significantly increase the solving time, but it should maybe be checked with a 3H run
  • When I import the networks now, there is a PerformanceWarning:
    /home/julian-geis/mambaforge/envs/p-de-public_de/lib/python3.12/site-packages/pypsa/io.py:442: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling frame.insert many times, which has poor performance. Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use newframe = frame.copy()
    I am not sure if this is directly related, but it seems likely (the pattern the warning refers to is sketched after this list)
  • I tried a few things, like creating all variables at once and assigning them, but it did not really work out; maybe just keep it as is
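For context on the PerformanceWarning above: it is generic pandas advice rather than anything PyPSA-specific, but the pattern it warns about and the fix it suggests look roughly like this (illustrative only, with made-up data):

import numpy as np
import pandas as pd

# Fragmenting pattern: adding columns one at a time calls frame.insert
# internally and eventually triggers the PerformanceWarning
df = pd.DataFrame(index=range(1000))
for i in range(200):
    df[f"col{i}"] = np.random.rand(1000)

# Pattern suggested by the warning: build all columns, then concatenate once
cols = {f"col{i}": pd.Series(np.random.rand(1000)) for i in range(200)}
df = pd.concat(cols, axis=1)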

@lindnemi (Collaborator, Author)

Thanks for the review! I rewrote the constraint with more Linopyesque syntax; in the low-resolution model the warnings are gone, and hopefully they are for the high-resolution run as well. We will see.
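For reference, this is the kind of rewrite meant by "more Linopyesque syntax" (an assumed illustration, not the actual diff, reusing m, p and limit from the sketch above): instead of adding one constraint per snapshot in a Python loop, keep the snapshot dimension and let linopy broadcast a single call.

# Loop form: one small constraint per snapshot, lots of Python overhead
for t in snapshots:
    m.add_constraints(p.sel(snapshot=t).sum() <= limit, name=f"import-limit-{t}")

# Vectorized form: one call that keeps the snapshot dimension
m.add_constraints(p.sum("Line") <= limit, name="import-limit")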
