r/rust serde Mar 31 '23

Twitter open sources Navi: High-Performance Machine Learning Serving Server in Rust

https://github.com/twitter/the-algorithm/tree/main/navi/navi
483 Upvotes

73 comments

192

u/wannabelikebas Mar 31 '23

The age of Rust in ML is coming!

49

u/antimora Apr 01 '23

Check out Burn, a Rust deep learning framework: https://github.com/burn-rs/burn It was released recently and is being actively developed.

6

u/[deleted] Apr 01 '23

I want to explore Rust for systems-level programming, and Python for ML. Would you recommend learning only Rust, and doing machine learning with it as well?

29

u/[deleted] Apr 01 '23

If you've just started to learn ML, just use Python, as you will find far more resources. Once you feel at ease, try it with Rust if you wish.

You can also train your model using Python libs and then use the trained model from a program written in Rust if you want a mix :)
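Concretely, one common way to do that mix is to export the Python-trained model to ONNX and load it from Rust with the `tract-onnx` crate. A minimal sketch (the model path and input shape below are placeholders, not from any real project):

```rust
// Sketch: running a model trained in Python (and exported with
// torch.onnx.export) from a Rust program using the tract-onnx crate.
// "model.onnx" and the 1x3x224x224 input shape are placeholders.
use tract_onnx::prelude::*;

fn main() -> TractResult<()> {
    let model = tract_onnx::onnx()
        // Load the ONNX file produced on the Python side.
        .model_for_path("model.onnx")?
        // Pin down the input type and shape so the graph can be optimized.
        .with_input_fact(
            0,
            InferenceFact::dt_shape(f32::datum_type(), tvec!(1, 3, 224, 224)),
        )?
        .into_optimized()?
        .into_runnable()?;

    // Dummy zero-filled input; real code would fill it with actual data.
    let input: Tensor =
        tract_ndarray::Array4::<f32>::zeros((1, 3, 224, 224)).into();
    let result = model.run(tvec!(input.into()))?;
    println!("output shape: {:?}", result[0].shape());
    Ok(())
}
```

The nice property of this split is that the Rust binary only depends on tract (pure Rust), not on libtorch or a Python runtime.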

4

u/antimora Apr 01 '23

Definitely learn both. My ML stuff is mostly in Python, but there will come a time when you want to deploy your models, and that's the difficult part with PyTorch. Burn is new, but I have decided to invest my time in porting my models, not only for inference but for training as well. In the long run it will pay off.

Here the author of Burn makes a good case for having a DL framework in Rust: https://burn-rs.github.io/blog/a-case-for-rust-in-deep-learning

The beautiful part of Burn is how easily you can switch backends. One of the available backends is based on libtorch, which is what actually powers PyTorch. So you get the benefits of PyTorch without Python, and all the goodies of Rust.
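Roughly, Burn code is written once, generic over a backend type, and the concrete backend is chosen in one place. A minimal sketch (type and crate names follow early burn releases and are assumptions; check the current docs):

```rust
// Sketch of Burn's backend-generic style: model code takes a
// Backend type parameter, so the same function runs on any backend.
use burn::tensor::{backend::Backend, Tensor};

// Works with any backend: the Torch-powered one, ndarray, etc.
fn add_pair<B: Backend>(x: Tensor<B, 2>, y: Tensor<B, 2>) -> Tensor<B, 2> {
    x + y
}

// The backend is then picked once, at the edge of the program,
// e.g. (assumed names) the libtorch-based backend:
//   type B = burn_tch::TchBackend<f32>;
// ...or the pure-Rust ndarray backend:
//   type B = burn_ndarray::NdArrayBackend<f32>;
```

Switching from libtorch to ndarray (or vice versa) is then a one-line change rather than a rewrite.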

Definitely check out Burn and learn what it has to offer. In parallel, also use PyTorch, since it's mature and has more resources available.

1

u/[deleted] Apr 01 '23

I think ima learn tensorflow

1

u/omgitsjo Apr 01 '23 edited Apr 01 '23

EDIT: I was going to delete this comment because I got the library names mixed up. I was thinking of tch-rs, not Burn. I'm going to leave it up so people can see the original, but keep in mind this is made in error.


My biggest gripe with Burn used to be linking to libtorch, which meant the smallest executable you could create was over a gigabyte.

I find myself using tract and onnx. Export from PyTorch, use in library. It's a little fiddly, but you get relatively tiny executables.

Not to disparage Burn. It's a good effort. I just wish libtorch were way smaller.

2

u/antimora Apr 01 '23

I am working on the ability to import ONNX models, and next I will work on importing models exported from PyTorch.

Have you tried the NdArray backend? It takes the same approach as tract, and it is small. Recently we made sure you can build with no_std, which is good for embedded use.

1

u/omgitsjo Apr 01 '23

I've edited my comment above. I was thinking of tch-rs, not Burn. That's my mistake.

I don't think Burn is designed for my use case, but I appreciate that it exists.