
Then train the models on real-world data? Verify outputs until confidence is achieved.

The computer can do whatever it wants. People can do whatever they want. The question is what level of security access they will have.

The key difference today is people are really good at making rationalizations for individual decisions. Computers are not.

Some decisions are genuinely important; important decisions require trust, and if these models never provide a way of demonstrating success and earning that trust, they won't be adopted.



>> Then train the models on real-world data? Verify outputs until confidence is achieved.

And therein lies the rub. Do people actually want to know the truth? They invent methods to obtain it, but is that their aim, or is it to confirm their current understanding of the world? Unlike a human being, who can be forced into silence through coercion or manipulation, a computational model, once proven with certainty, is never going back into Pandora's box. This contradiction of pursuing truth while suppressing its inconvenient conclusions has somehow not disappeared despite our ever-greater knowledge of ourselves. Will it ever?


Both? I'm not 100% following. I think people want to know the truth, but change is slow, and admitting mistakes is often looked down on. Humanity made significant changes based on what was learned in the past 100 years. How this will be used, and who will reap its benefits, is unclear.


Perhaps I wasn't very clear. Would the general public accept the output of a data model that proved a previously controversial or unacceptable idea was factually correct, to the point it could not be denied by a reasonable person?

Several attempts at training models on real-world data have produced results that were attacked as "algorithmically biased". This wasn't because there was anything experimentally wrong with the dataset or the choice of model on its own, but because the results contradicted a certain worldview. While I agree with you that people come closer to the truth in the long term, in the short and medium term I expect certain people to pad their own data and quash contrary findings out of ideological motivation for some time.



