
The machine is great! How is its performance for AI model training? A lot of libraries and tools are not built for M-series chips.


Poor. My M3 Max/128GB is about 20x slower than a 4090 for training. For inference it's much better: still slower than a 4090, but it enables working with much larger LLMs, albeit at ~10 t/s (in comparison, a Threadripper 2990WX/256GB does about 0.25 t/s). The M4 Max is likely going to be ~25% faster than the M3 Max, based on CPU performance and memory bandwidth.
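To put those figures in perspective, here's a quick back-of-envelope comparison using the rough numbers above (estimates from the comment, not measured benchmarks):

```python
# Rough throughput figures quoted above (approximate, not benchmarks)
m3_max_tps = 10.0        # M3 Max large-LLM inference, ~10 tokens/s
threadripper_tps = 0.25  # Threadripper 2990WX/256GB, ~0.25 tokens/s
m4_uplift = 1.25         # M4 Max estimated ~25% faster than M3 Max

# M3 Max vs Threadripper on large-LLM inference
print(m3_max_tps / threadripper_tps)  # → 40.0 (40x faster)

# Projected M4 Max throughput under the ~25% estimate
print(m3_max_tps * m4_uplift)         # → 12.5 t/s
```

So even at "only" ~10 t/s, the unified-memory Mac is roughly 40x faster than CPU-only inference on a machine with comparable RAM.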



