> So don't do that. If you can run tensorflow on your device, you can call out to a local process.
Not from a webapp (without jumping through a dozen other hoops). With tensorflow.js, you can do (for example) pose estimation, or face detection, or audio recognition, right in the browser without sending data to a remote server.
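To make that concrete, here's a rough sketch of in-browser pose estimation using the `@tensorflow-models/pose-detection` package (model choice and the `video` element are assumptions for illustration; this only runs in a browser with a webcam stream attached):

```javascript
import '@tensorflow/tfjs-backend-webgl';
import * as poseDetection from '@tensorflow-models/pose-detection';

// The model weights are downloaded once (static files), then all
// inference runs locally on the user's GPU via WebGL.
// No video frames ever leave the machine.
const detector = await poseDetection.createDetector(
  poseDetection.SupportedModels.MoveNet
);

// Assumes a <video> element already playing a webcam stream.
const video = document.querySelector('video');
const poses = await detector.estimatePoses(video);
console.log(poses); // keypoints computed entirely client-side
```

Note the only server involvement is hosting the static weight files; the privacy-sensitive data (the camera feed) never crosses the network.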
> But that's not a good reason. It's just a reason.
Yes, of course it's a reason. The point is that it can be a good reason in many cases.
So now we're down to "I want to run a neural network exclusively in the browser" as the primary reason you'd want to use this.
OK, fine. That's a niche use-case for a domain where scale and performance matter so much that we're building specialized hardware to support it. 99.99% of developers would be better advised to find another way to solve their problem using more conventional tools.
There was a guy who built a life-sized house out of Lego once. It was a cool trick, but the difference between him and "modern Javascript" developers is that he didn't try to make anyone live in the house.
How can you run Python tensorflow locally? We're talking about web apps here.
Try to understand what is at play here. It's not all about raw performance; there are many more important things to consider, which GP explained extensively.
Yup, which is a perfectly valid reason.
> If calling out to a binary is a security problem for you
You miss the point. The security problem is sending the raw data from the client to the server.
> So again, this boils down to "I don't want to use Python and I'd prefer to use JS instead."
Which, once again, is a perfectly valid reason.
> Servers are expensive so hosting static weights on a server is cheaper?
No, running CPU/GPU-intensive tasks on a server is expensive, but hosting a few static weight files is cheap.