
Performance questions #96

Open

Description

@alexprengere

Incredible work! I was trying to see what kind of overhead there is when calling a wasm function from Python.
I am using WSL2 on Windows, with a recent Fedora and CPython 3.10.
Re-using the exact gcd.wat from the examples, it looks like every call has a fixed cost of about 30μs.

The code I used is a simple timer to compare performance:

import wasmtime.loader  # installs an import hook so .wat/.wasm files can be imported directly
import time
from math import gcd as math_gcd
from gcd import gcd as wasm_gcd  # gcd.wat is compiled and instantiated by the loader

def python_gcd(x, y):
    while y:
        x, y = y, x % y
    return abs(x)

N = 1_000
for gcdf in math_gcd, python_gcd, wasm_gcd:
    start_time = time.perf_counter()
    for _ in range(N):
        g = gcdf(16516842, 154654684)
    total_time = time.perf_counter() - start_time
    print(f"{total_time / N * 1_000_000:8.3f}μs")  # average time per call

This returns about:

   0.152μs  # C code from math.gcd
   0.752μs  # Python code from python_gcd
  31.389μs  # gcd.wat

Note that I also tested this with an empty "hello world" function, and the ~30μs overhead is still there.
I am wondering about two things:

  • is this overhead inevitable and linked to the design, or are there ways to reduce it? (see the first sketch below)
  • I noticed in the gcd.wat code that the input parameters are expected to be i32, yet the call does not fail when the arguments exceed 2**32, so I was wondering how this works (see the second sketch below)
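On the first point, for context, here is roughly how I understand the same call could be made through the wasmtime API directly, without wasmtime.loader, to check whether the cost comes from the loader shim or from the Python↔wasm boundary itself. The exact API is my assumption from the docs and may differ between wasmtime-py versions (in older releases exports is a property and calls do not take a store):

import wasmtime

engine = wasmtime.Engine()
store = wasmtime.Store(engine)
module = wasmtime.Module.from_file(engine, "gcd.wat")  # same gcd.wat as in the examples
instance = wasmtime.Instance(store, module, [])
gcd_func = instance.exports(store)["gcd"]

# Each call still crosses the Python <-> wasm boundary once:
print(gcd_func(store, 16516842, 154654684))

Would calling the export this way avoid any of the per-call cost, or is the overhead the same regardless of how the function is reached?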
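On the second point, my guess is that the bindings reduce the Python int to 32 bits before handing it to the wasm function, so out-of-range values are silently wrapped rather than rejected. If that is the case (this is only an assumption on my part), the effective arguments could be reproduced in pure Python like this:

def wrap_i32(x):
    # Reduce an arbitrary Python int to a signed 32-bit value (two's complement).
    # This is only my guess at what happens to out-of-range arguments.
    x &= 0xFFFFFFFF
    return x - 2**32 if x >= 2**31 else x

print(wrap_i32(2**32 + 10))   # 10 -> the wasm gcd would see 10, not 2**32 + 10
print(wrap_i32(16516842))     # in-range values pass through unchanged

Is that what actually happens, or is there some other conversion going on?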
