
M3 Max 128GB here and it’s mad impressive.

I’m spec’ing out a Mac Studio with 512GB of RAM because I can window-shop and wish, but I think the trend for local LLMs is getting really good.

Do we know WHY OpenAI even released them?



> Do we know WHY OpenAI even released them?

Regulations, and trying to win back the goodwill of developers using local LLMs, which had been slowly eroding since GPT-2 in 2019, the last time they released weights to the public.


If the new GPT-5 is actually better, then this open-weights version isn't really a threat to OpenAI's income stream, but it can be a threat to their competitors.


> Do we know WHY OpenAI even released them?

Enterprises can now deploy them on AWS and GCP.



