[Dev Tools] @matterlabs/hardhat-zksync-node
slow performance in hardhat tests
#1039
-
Team or Project: Safe
ZK chain: Era
Environment: Mainnet
Select the Dev Tool you are using: Hardhat Plugins
Provide the version of the tool (if applicable): "@matterlabs/hardhat-zksync-node": "^1.5.1"
Provide a brief description of the functionality you're trying to implement and the issue you are running into: First of all, working with the node plugin has been a nightmare. Breaking changes were introduced under a minor version without warning, forcing us to spend significant time just getting our test setup to work. But even that pales in comparison to its performance issues relative to Hardhat's node. Running the same tests with Hardhat's node takes only 5 seconds on my machine; running them with the zksync node takes 11m 48s. Same test suite. I can reproduce this behaviour consistently across macOS, Linux, and WSL, so I don't believe machine specs are the issue, but let me know if they would be helpful.
Repo Link (Optional): https://github.com/safe-global/safe-smart-account
Additional Details: to reproduce, run the same test suite against both nodes and compare the run times.
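For context, the comparison above is between Hardhat's built-in network and the in-memory node provided by the plugin. Below is a minimal sketch of what such a setup typically looks like; it is not the Safe repo's actual configuration, and the plugin imports, compiler version, and zksolc settings are assumptions for illustration.

```ts
// hardhat.config.ts — a minimal sketch of the kind of setup being compared here,
// NOT the Safe repo's actual config; versions and zksolc settings are assumptions.
import { HardhatUserConfig } from "hardhat/config";
import "@matterlabs/hardhat-zksync-solc";
import "@matterlabs/hardhat-zksync-node"; // provides the in-memory zksync node used during tests

const config: HardhatUserConfig = {
  solidity: "0.8.24",
  zksolc: { version: "latest", settings: {} },
  networks: {
    // With zksync: true, `hardhat test` runs against the plugin's node;
    // removing the zksync plugins (or setting this to false) runs the same
    // suite against Hardhat's built-in network for a timing comparison.
    hardhat: { zksync: true },
  },
};

export default config;
```

Running npx hardhat test with and without the zksync plugins enabled is, presumably, how the 5 s versus 11 m 48 s numbers above can be reproduced.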
-
Thanks for posting, and apologies for the frustration endured using hardhat-zksync-node. If you were not aware, we delivered EVM equivalence in the v27 upgrade, so you can make use of L1 tooling (e.g. Hardhat) without needing any of the custom zksync Hardhat plugins. The EVM experience will also be further improved when ZKsyncOS lands.
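To make that suggestion concrete, a plugin-free Hardhat config pointed at an Era RPC endpoint might look like the sketch below; the network name, RPC URL, chain ID, and toolbox plugin are illustrative assumptions rather than a prescribed setup.

```ts
// hardhat.config.ts — a sketch of the plugin-free setup enabled by EVM equivalence;
// the toolbox plugin, Solidity version, network name, and RPC URL are illustrative assumptions.
import { HardhatUserConfig } from "hardhat/config";
import "@nomicfoundation/hardhat-toolbox";

const config: HardhatUserConfig = {
  solidity: "0.8.24",
  networks: {
    // Era is treated as just another EVM network: no zksolc, no hardhat-zksync-* plugins.
    zkSyncEra: {
      url: "https://mainnet.era.zksync.io",
      chainId: 324,
    },
  },
};

export default config;
```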
Okay, so I did some fixes that result in a ~1.8x speed-up of the hardhat-zksync-node hardhat test runner and properly set the --quiet param so we no longer spill tx summary output etc. to stdout. Once these 2 PRs (matter-labs/anvil-zksync#729, matter-labs/hardhat-zksync#1790) land and are released, you will see an improvement.
That being said, there is still one area where we can get an even bigger speed-up (batch closing), but it will require additional refactoring that has no immediate timetable, though it is on the radar.
I'm going to mark this as answered for now; feel free to re-open if there is more you'd like to resolve or discuss, or open a new discussion that will be triaged accordingly. Really appreciate y…