
Commit 1416a9c

chore: fix sql seq for benchmark report (#15030)

1 parent 2f64773 commit 1416a9c

File tree

2 files changed: +6 −6 lines changed

.github/workflows/reuse.benchmark.yml

4 additions & 1 deletion

@@ -104,6 +104,7 @@ jobs:
           - { dataset: hits, size: Large }
           - { dataset: tpch, size: Small }
           - { dataset: tpch, size: Large }
+          - { dataset: load, size: Small }
       fail-fast: true
       max-parallel: 1
     steps:
@@ -187,6 +188,8 @@ jobs:
         dataset:
           - "tpch"
           - "hits"
+          - "load"
+          - "internal"
     steps:
       - uses: actions/checkout@v4
       - name: Install Dependencies
@@ -196,7 +199,7 @@ jobs:
      - uses: actions/download-artifact@v4
        with:
          path: benchmark/clickbench/results
-          pattern: benchmark-*
+          pattern: benchmark-${{ matrix.dataset }}-*
          merge-multiple: true
      - name: Generate report and upload to R2
        working-directory: benchmark/clickbench
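The narrowed `pattern: benchmark-${{ matrix.dataset }}-*` means each per-dataset report job downloads only its own dataset's artifacts instead of every benchmark artifact in the run. A rough illustration of the filtering effect using Python's `fnmatch` (the artifact names below are hypothetical, not taken from the workflow):

```python
from fnmatch import fnmatch

# Hypothetical artifact names, as a workflow run might produce them.
artifacts = [
    "benchmark-tpch-small",
    "benchmark-tpch-large",
    "benchmark-hits-small",
    "benchmark-load-small",
]

# Old pattern: every dataset's artifacts land in every report job.
old_match = [a for a in artifacts if fnmatch(a, "benchmark-*")]

# New pattern, with matrix.dataset = "tpch": only tpch artifacts match.
new_match = [a for a in artifacts if fnmatch(a, "benchmark-tpch-*")]

print(old_match)  # all four names
print(new_match)  # ['benchmark-tpch-small', 'benchmark-tpch-large']
```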

benchmark/clickbench/update_results.py

2 additions & 5 deletions

@@ -16,17 +16,14 @@

 def update_results(dataset, title, url):
     queries = []
-    for query_file in glob.glob(f"{dataset}/queries/*.sql"):
+    for query_file in sorted(glob.glob(f"{dataset}/queries/*.sql")):
         with open(query_file, "r") as f:
             queries.append(f.read())
     results = []
     for result_file in glob.glob(f"results/{dataset}/**/*.json", recursive=True):
         logger.info(f"reading result: {result_file}...")
         with open(result_file, "r") as f:
-            result = json.load(f)
-            # if dataset == "tpch":
-            #     result["result"].insert(0, [0.01, 0.01, 0.01])
-            results.append(result)
+            results.append(json.load(f))

     logger.info("loading report template %s ...", TEMPLATE_FILE)
     templateLoader = FileSystemLoader(searchpath="./")
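The substantive fix is the `sorted()` wrapper: `glob.glob` makes no ordering guarantee (results come back in filesystem/`os.scandir` order), so without it the SQL queries could appear in the report in an arbitrary sequence. A minimal standalone sketch of the before/after behavior, using a throwaway temp directory rather than the repo's actual `queries/` layout:

```python
import glob
import os
import tempfile

# Build a queries directory with files created out of order,
# mimicking a dataset's queries/*.sql layout (paths are illustrative).
tmp = tempfile.mkdtemp()
qdir = os.path.join(tmp, "queries")
os.makedirs(qdir)
for name in ["q03.sql", "q01.sql", "q02.sql"]:
    open(os.path.join(qdir, name), "w").close()

# glob.glob alone may return these in any order; sorting makes the
# query sequence in the generated report deterministic.
files = sorted(glob.glob(f"{qdir}/*.sql"))
print([os.path.basename(f) for f in files])  # ['q01.sql', 'q02.sql', 'q03.sql']
```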
