[P] mlx-onnx: Run your MLX models in the browser using ONNX / WebGPU
Web Demo: https://skryl.github.io/mlx-ruby/demo/
Repo: https://github.com/skryl/mlx-onnx
What My Project Does
It converts MLX models to ONNX, so you can run them with onnxruntime, validate them against other backends, or hand them to downstream deployment tooling. The exported ONNX models can then run in the browser via WebGPU.
- Exports MLX callables directly to ONNX
- Supports both Python and native C++ interfaces
Target Audience
- Developers who want to run MLX-defined computations in ONNX tooling (e.g. ORT, WebGPU)
- Early adopters and contributors; it is usable and actively tested, but still evolving rapidly, so I'm not yet claiming drop-in production readiness for every model
Comparison
- vs staying MLX-only: keeps your authoring flow in MLX while giving an ONNX export path for broader runtime/tool compatibility.
- vs raw ONNX authoring: mlx-onnx avoids hand-building ONNX graphs by tracing/lowering from MLX computations.
submitted by /u/rut216