
Benchmark Workflow #162

@alecandido

Description

Ok, from the Git blame it seems that the problem described in #161 (comment) is my fault, though it is something I did a year ago.

This means that yadism has been approximately unmaintained, and almost no development has happened in the meantime...

However, we need to drop the asserts, since we never developed a suitable benchmark metric to apply automatically (see #83).
For this, I propose:

  • add a no-assert mode in banana, to be triggered by the workflow
  • add report extraction in banana, so that we can upload the reports as artifacts in the CI
    • the other sensible option is to upload the whole database, but I'd rather do both, since it is less obvious to people how to run the extraction themselves...
  • re-enable .github/workflows/benchmark.yml, which was deactivated through the content of its `on:` value in ef75c7a

Even though, once you have the db and the extraction method, doing this yourself should be trivial (if well documented), the workflow outcome should be intuitive to consume on its own.
It is just as trivial to run the conversion in the CI and upload both; see the sketch below.
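
A rough sketch of what the re-enabled workflow could look like, just to fix ideas: the `--no-assert` flag, the benchmark command, and the report/db paths are placeholders for illustration, not existing options.

```yaml
# Sketch of a re-enabled .github/workflows/benchmark.yml.
# Command, flag, and paths below are assumptions, not the actual interface.
name: benchmarks

on:
  push:
    branches: [master]
  workflow_dispatch:

jobs:
  bench:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - name: Install the package with benchmark dependencies
        run: pip install .   # plus whatever banana/benchmark extras are needed
      - name: Run benchmarks in no-assert mode
        run: python -m pytest benchmarks/ --no-assert   # hypothetical flag provided by banana
      - name: Upload extracted reports
        uses: actions/upload-artifact@v4
        with:
          name: benchmark-reports
          path: reports/        # assumed extraction output
      - name: Upload benchmark database
        uses: actions/upload-artifact@v4
        with:
          name: benchmark-db
          path: benchmarks.db   # assumed db location
```

Whether the extraction runs as a dedicated step or inside the benchmark run itself is a detail; the point is that both the reports and the db end up as downloadable artifacts.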

Metadata

Labels

benchmarks: Benchmark (or infrastructure) related
bug: Something isn't working
good first issue: Good for newcomers
refactor: Refactor code
