[D] Calling PyTorch models from Scala/Spark?

Hey everybody, I work on an engineering team at a firm that uses AWS. Historically we've used PySpark to deploy deep learning models that I've built, but since that adds a decent amount of overhead, I've been tasked with researching other ways to call the models for inference as the team transitions to a new mode of operation.

They are running a Spark cluster with around 300 nodes, and ultimately hope to perform inference either natively in Scala (preferred) or via some AWS service that could serve the results.
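One commonly cited pattern for this (a sketch, not something from the post) is to serialize the PyTorch model to TorchScript in Python, then load the resulting artifact on the JVM, e.g. via the Deep Java Library (DJL) or libtorch's Java bindings, so each Spark executor can run inference without a Python runtime. The export side might look like the following, where the `Sequential` model and the file name `model.pt` are placeholders:

```python
# Sketch: export a PyTorch model to TorchScript so a JVM runtime
# (e.g. DJL with the PyTorch engine) can load it from Scala.
# The model below is a stand-in for whatever model you actually serve.
import torch

model = torch.nn.Sequential(torch.nn.Linear(8, 4), torch.nn.ReLU())
model.eval()

# Trace with a representative input; torch.jit.script is the
# alternative for models with data-dependent control flow.
example = torch.randn(1, 8)
traced = torch.jit.trace(model, example)
traced.save("model.pt")

# Sanity check: the serialized artifact reloads and runs standalone.
reloaded = torch.jit.load("model.pt")
print(reloaded(example).shape)  # torch.Size([1, 4])
```

On the Scala side, the exported `model.pt` would typically be loaded once per executor (e.g. inside `mapPartitions`) rather than per row, to amortize model-loading cost across the partition.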

Anyone have experience with this? Thanks in advance.

submitted by /u/Annual-Minute-9391