> An interesting question is "why do standards tend to run behind state-of-the-art"? There are so many examples of this in the past (coming from a graphics background, OpenCL vs. CUDA). Is this something we can fix?

It's not broken. Standards are, ideally, distillations and applications of proven experience in a generally applicable way. You don't want standards trying to ride the bleeding edge; you want them to apply what is proven.

(OTOH, it's good to have those working on the bleeding edge also open up for discussion how those experiments might be standardized while the experiments are still ongoing, and one vehicle for those discussions is draft standards or draft updates to existing standards.)
