Google and Facebook Have Failed Us (theatlantic.com)
292 points by DLay on Oct 3, 2017 | 179 comments


Zeynep Tufekci had a couple great columns about this recently regarding FB and the US elections this year. While I recognize that there's always a concern about who the gatekeepers get to be in the marketplace of ideas, I think this is a considerably more complex issue than "let all the information go wherever it wants and let the people sort it out".

Putting 4Chan in a top news slot is not "allowing unedited and unfiltered access to information", it is algorithmically promoting a cesspool of disinformation into a spot that many users believe is fairly authoritative. There is no way to "not make a choice" here and let information be free. There is only figuring out ways to filter and sort the firehose of information that is now at all of our fingertips. (A situation, I should add, that human brains are not necessarily prepared for)

https://www.nytimes.com/2017/09/29/opinion/mark-zuckerberg-f...

https://www.nytimes.com/2017/09/23/opinion/sunday/facebook-a...


> into a spot that many users believe is fairly authoritative

I mean that's the real issue. You can't be unbiased while trying to be a source of truth. You can be impartial and return results based on relevancy or you can intervene in an attempt to return the truth. Google has ventured into dangerous territory by mixing the two -- especially with their automatic snippets that answer questions.

> There is only figuring out ways to filter and sort the firehose

But it's an unsolved problem how to put the control of this filtering into the users' hands. The best we have right now is to just return everything and let the users choose what to pay attention to.


> You can be impartial and return results based on relevancy

I don't think that's possible. Take a search for "Las Vegas shooting". What is the impartial process for deciding the most relevant result for that? Relevance is inherently editorial.


So first "top news spot" means that if you searched for the name of this person who got falsely accused on 4chan, you found the false accusation on 4chan (as the first result). That's all.

Calling this a "top news spot" ... I have no words. Perhaps we should say that this complaint is "fake news" ? I get that newspapers and even journalists have their existence threatened by Facebook and Google, but ... this is a low and very thin argument.

I don't understand what they want either. Censorship seems to be the real issue. The article is not so much taking offence at the information available, they're just angry that the information wasn't filtered through the "normal" channels. They were very unhappy various things didn't get suppressed. That Google had the gall to surface "unauthorized" information.

Never mind that if I was said falsely accused person, I would definitely have wanted that information surfaced. Including to a lawyer, and later to a judge. At issue are the person starting this rumour, and the publisher (ie. 4chan).

Their point is that in this case, things should have gotten suppressed, like naming a person as a culprit that had nothing to do with the incident. Now keep in mind that this is hyperbole: any searches related to the Las Vegas shooting never named anyone, but if you searched for that person's exact name it would surface a "fresh" result ... from 4chan, claiming he had something to do with the shooting.

I would argue that newspapers have done much worse than this. Much worse. Often repeatedly and in cases where there was an obvious financial or political motive for them to spread misinformation. I'd rather have all information, knowing that some may be wrong.

And the newspapers' "right" to filter all information I may see ? Screw that, if I want to see your filtered info, I'll go to your site. I may want to do that, for verification and because I think your filter is useful, but I do not want you or anyone filtering everything I see.


Absolutely right. 4chan is a discussion board, not a news site; it does not 'report' the news and is self-contradictory on virtually every issue (because it's a discussion board). Now observe the media try to pull a fast one by drafting off of Facebook and Google's algorithm issues. Their real beef is that heated discussion leads to a more organic narrative rather than the one they've loosely tried to script.

Also, it should not be news that a website largely devoted to conspiracy theorizing heats up in page rank after an event like this. Even if the discussion is bogus, let's face it: people deep down want to entertain the possibility of conspiracy after every man-made disaster, so it's relevant, especially in those first few hours before official sources have commented.


This is entirely a blown-up issue that is being used as a cover to start censoring and monitoring our communications on a far worse level.


Calm down! Nobody mentioned children yet!


We’re almost certainly witnessing the endgame to the 1990s boom of Silicon Valley. Google, Facebook and other content platforms will be regulated for impartiality, particularly around political speech. It doesn’t make sense to let Twitter, Facebook et al become private censors, and it’s clear that people retweet faster than they think.


> Take a search for "Las Vegas shooting". What is the impartial process for deciding the most relevant result for that? Relevance is inherently editorial.

In that case, the search should be done on the NYT site (or similar).

Or perhaps Google should add a "Use Edited Sites only" preference as a category of SafeSearch.


> In that case, the search should be done on the NYT site (or similar).

Huh? I'm saying the act of putting one site over another is itself a decision based on opinion.

Should machine learning be used to determine if the site has a summary of the situation...or is that even what the user was asking for? Should it just have those exact words in it? Should it be the most cited link? From the most cited source overall or on this particular topic? Should it instead be based on the clicks that links to that site gets? Was the most relevant site even in the sites originally in the list of sites to crawl for an index?
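The point that these ranking choices are editorial can be made concrete with a toy scorer. In this sketch (all titles, numbers, and weights are invented for illustration), the exact same three results come back in a different order depending on which signals the engine decides to weight:

```python
results = [
    # (title, citation_count, click_rate, term_match_score) -- made-up data
    ("Official police statement", 5, 0.10, 0.6),
    ("4chan speculation thread",  1, 0.55, 0.9),
    ("NYT background article",   40, 0.20, 0.5),
]

def rank(results, w_citations, w_clicks, w_match):
    # A weighted sum over signals; the weights ARE the editorial choice.
    return sorted(
        results,
        key=lambda r: w_citations * r[1] + w_clicks * r[2] + w_match * r[3],
        reverse=True,
    )

# Weight engagement heavily and the speculation thread wins the top slot:
by_engagement = rank(results, w_citations=0.0, w_clicks=1.0, w_match=0.5)
# Weight citations instead and the NYT piece wins:
by_authority = rank(results, w_citations=1.0, w_clicks=0.0, w_match=0.5)
```

Neither weighting is "impartial"; picking one over the other is exactly the editorial decision being discussed.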

Also, not sure if you were just making a rhetorical point, but you'll note that your

> on the NYT site (or similar).

just kicks the can down the road. Why NYT? What similar sites are a good match instead? How do you even determine that some piece of writing is an editorial? Does it have to call itself that to qualify?


Even if you manage to put the control of filtering in people's hands, it doesn't solve the problem. Social science convincingly shows people don't want to be told the truth if it runs counter to prior beliefs, and will use all the tools at their disposal to avoid or discredit it. In fact, forcing the truth on them will convince them even more of their false beliefs (see: the backfire effect). Give people more control over their filters, and they will create even tighter filter bubbles.

Maybe the solution to what news to let through the filter bubble is to apply a very simple answer: none at all. Google shouldn't show anything that's more recent than a few weeks.


The unfiltered results ARE the truth. 4chan is the truth. In the sense that "Someone posted this thing on 4chan." And that's the only truth there is in internet-land: Someone posted something.

Is it real? Is it true, what they posted? Or is it "fake news?" You will never know, and you probably shouldn't try to take it that far. Or if you do, at least be able to handle the cognitive dissonance of believing it and not-believing it at the same time. The only thing you can truly know is what you witness in person. Or perhaps what you hear from trusted parties, but even there, your trust may be misplaced, or they may report something false in good faith that was the result of their having been deceived or mistaken. And you can misperceive and twist reality even all by yourself in the presence of supposedly non-subjective and "real" stimuli.

Regardless, from the vantage point of "looking at a computer," all you know about anything you see there, is that someone put it there. And even the "someone" part isn't necessarily true... it could be AI-generated like some sports and financial news is now.

The good news is, that which you can't witness with your own eyes, rarely has any real effect on your life. I'm experimenting with that. The sun still comes up, and it's still time to get some shit done. Sure sure, of course there are myriad ways someone or something far away can have an influence on me. What I'm saying is, what if you ignore that? What if you construct a mental model of the world where it's actually not one big joyous tapestry of unity and interconnectedness, but just a fragmented patchwork and you have your own little square and that's it. Neither view is right per se, but they're both equally valid, which is to say, bullshit, yet incontrovertible to those who believe it. I'll say one thing though, take the fragmented view and suddenly you don't especially find yourself immersed in news of shootings and Google can't "fail you" because you don't rely on it as a window to the world - you only use it for what it's actually good at: looking for shit on the internet.


> Is it true, what they posted? Or is it "fake news?" You will never know

What nonsense. There are degrees of trust you can put in claims, based on a variety of factors. And you don't really believe it, either, as further on in your comment you say "The only thing you can truly know is what you witness in person". If you could know something is true from personally witnessing it you could know that a claim contradicting it is false.

> The only thing you can truly know is what you witness in person.

That's not true, either. Just because you "see" something doesn't mean what you perceive is an accurate picture of what is actually there. Your brain is doing a huge amount of inference, based on expectations, to build the details you perceive. This goes up from basic things like the shapes and colours there, to what objects you individuate, to recognition of things like the expression someone is making or their body language, or perceiving what activity someone is undertaking. On top of that, when you talk about what you've witnessed, usually this is what you remember that you witnessed, and memory is notoriously unreliable.

I'd put far more trust in an experimental "fact" that has been tested many times over in the lab, or in a body of knowledge that has been tested many times over in engineering (whose accuracy is "proved" by the fact that the machinery, such as a plane, actually works), than what someone thinks they saw for themselves a week ago.


I already said all that a lot more efficiently in my comment.


> The only thing you can truly know is what you witness in person.

> The good news is, that which you can't witness with your own eyes, rarely has any real effect on your life

Not only anti-rationalist and anti-Enlightenment, but you're even disagreeing with object permanence!


> Not only anti-rationalist and anti-Enlightenment, but you're even disagreeing with object permanence!

And not even "not only" that: it also veers into muddying the waters of responsible reporting with a discussion of "what is truth, really?". When "nothing is true", every venue from 4chan to Infowars is equal in its non-truth to AP/Reuters/WaPo/NYT/Atlantic/etc.

When all media is fake, listening to Alex Jones talk about water supplies making children gay is somehow legitimized.


Yeah no it doesn't. Just one more thing to ignore. Unless I personally saw someone turn gay from drinking water, but even if I thought I saw that, I wouldn't totally believe that either.

And I did mention trusted parties, which could be a reporter. You guys are just not in the mood for philosophising tonight I guess. You want to be certain of things. Disappointing response. Should've avoided the term "fake news." It's Pavlovian.


That seems to be the inevitable result of an uncritical embrace of nihilism.

If nothing matters, 4chan is great.

If things actually matter, 4chan is an existential threat.


> The only thing you can truly know is what you witness in person.

Why do you trust your eyes? Eyewitness accounts are notoriously inaccurate.

Why trust your brain? How do you even know that the world you think exists is real? Maybe you are just dreaming right now.

You went part-way down the rabbit hole, then stopped. Better to go all the way, then decide on a rational basis for belief (which is probably more complicated than only trusting things you see firsthand), and build from there.


"I trust my own lying eyes"


> The unfiltered results ARE the truth.

No.

Using your structure - those are not the truth.

They are the first result thrown up out of a search engine. Just as much as "someone posted this thing on 4chan".

This is not truth, any more than

"pull the trigger and kill Godot when he enters the room" are words in a sentence.

Truth is intrinsically linked to meaning.

You are implying that people are entering search terms into a search engine expecting the results to be a form of interpretive art.

That is not what people are searching for - in this day and age.

Maybe during the days of AltaVista or GeoCities, when most people knew the results were junk and you were an advanced tech user, that would hold more on a probabilistic basis.


It would be trivial for FB and Google to allow tailoring of feeds. Even keyword exclusions or subject filters would be a huge step-up from the gamified engagement-getting monster that we have now.
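As a minimal sketch of the kind of user-controlled keyword exclusion described here (all names and feed contents are hypothetical, and a real implementation would need tokenization, stemming, and per-topic classifiers rather than naive substring matching):

```python
def filter_feed(posts, excluded_keywords):
    """Drop any post whose text mentions an excluded keyword (case-insensitive)."""
    lowered = [k.lower() for k in excluded_keywords]
    return [
        post for post in posts
        if not any(keyword in post.lower() for keyword in lowered)
    ]

# Hypothetical feed items:
feed = [
    "Shooting in Las Vegas: what we know",
    "New JavaScript framework released",
    "Election polls tighten again",
]

filtered = filter_feed(feed, ["shooting", "election"])
# -> ["New JavaScript framework released"]
```

The mechanism really is trivial; the point of the comment is that the obstacle is incentives, not engineering.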


Yes - which is why they won't do it. Like out-of-order feeds, where engagement conflicts with usefulness they'll pick engagement and ad views every time.


I'm ready to walk away from all algorithmically sorted feeds.

I'm tired of the ad tech kool aid that causes PMs + execs to consistently put UX and users in last place.


This is more of a political challenge than a technical one. One that I'd say is worth fighting for.


This is a very good point! I would also add that those with the resources to control filtering (Google, Facebook, China, Russia, the US, etc) will be the ones who ultimately control it.


The problem comes in our choice of who should be the "gatekeepers of information". Certainly not The Atlantic, The New Yorker (as seen cited in this discussion) or any other traditional news outlet. They have shown time and time again that the people running these outlets not only have very biased stances when presenting the news (i.e. from MSNBC to Fox News and the rainbow of bias in between), but have also shown, for a very long time, that these same traditional news sources are easily bought and do not handle a 24-hour newscast well at all (look at the reports of something as simple as Tom Petty's death yesterday, a dozen hours or so before he actually died). While I certainly don't trust Google or Facebook to report the news, I have trouble relying on the "gatekeepers" (blech) of the news as well.


> They have shown time and time again that the people running these outlets not only have very biased stances when presenting the news

That’s a really sweeping equivalence claim to make without any proof, much less a way to account for the mountain of evidence contradicting your assertion. Dismissing real organizations with editors, fact checkers, a range of viewpoints, etc. because they aren’t run by dispassionate robots is simply an error — you might read a story in the NYT and expect them to favor a different side than the WSJ but most people know that both employ decent journalists, check sources, and print corrections when necessary.


I don't know if what I've read is actual proof and a lot of this seems to point to "opinion" pieces on news networks, I guess. I have not done the legwork, but here are some articles I've found going towards what I stated:

http://www.npr.org/2016/12/14/505547295/fake-news-expert-on-...

https://www.nytimes.com/2016/11/09/business/media/what-weve-...

http://www.politifact.com/truth-o-meter/statements/2011/jun/...

https://www.alternet.org/story/154875/the_science_of_fox_new...

In retrospect, though, I probably think it's more prevalent than it is in actuality. To be completely honest, the sensationalism by media outlets exposing misinformation by media outlets "on the other side" of the news gets to me, sometimes. As for corrections, though, mea culpas seem to be a mid-page item, at best =/


Asking the rich and powerful gatekeepers of the public's major news sources to filter content for us, in a way they have been relatively reticent to do, purely out of altruism, is asinine. I have no love for Facebook or Google, but the world wouldn't be a better place if they curated the news for us, at least not for long.


I think there's already too much filtering.

People need to make up their own minds regarding what is true and what isn't, what's worth their time and what's bullshit, which links to click on and which to avoid. If people can't handle this then we need to spend more money on education, or just accept that some people disagree with us.

I don't think aggressively filtering "untrustworthy" content will lead to a net benefit.


These platforms have hyper-optimized their products to show as much stuff as possible that people will engage with (share, click), so that they also engage with ads and stay active.

The problem is there is often some time when there is no news. No news isn't interesting, so it doesn't get shared as much as "news". Take the moments immediately following a disaster. Two types of news articles will show up:

"There was a shooting. We don't know anything yet, but we will keep you posted". Boring.

"We know who the killer is. Bobby Bobertson done it". Woah!

Now imagine you are a heavy social media user. You don't want to stay silent on such a big news story, so you want to share something showing you are engaged. Which story do you share?

(edit: removed duplicate "the problem is")


This comment is absolutely right. The algorithms are not optimized to inform people, they are optimized to maximize the number of ads people click on while consuming whatever content the algorithms promoted to the top.

This is at best a conflict of interest when it comes to content that ought to be viewed as news. But there is significant pressure these days to blend news with entertainment (which simply means engagement, which entails ad sales).


I was never so happy to stay off social media as I was last night. I was awake, but not online, for about 2 hours after the LV shooting. My phone had a headline about 2 people dead when I went to bed. My wife woke up at 5am and shared the news with me when I woke up. By the time I engaged with the news sites, it was 8am this morning and most of the initial bullshit had been filtered away. Saved myself from a lot of unnecessary anxiety.


I think at this stage it's like insisting that people filter their own water when there are organisations out there actively dumping toxic waste into it.

It's not just the Internet though, it's organisations like Infowars who are out there claiming that the Vegas shooting was some kind of "false flag", after their previous disinformation on the Sandy Hook incident.


I believe your metaphor falls short. Taken to the opposite extreme, censorship would be giving the people clean water? Or are we about to make claims about "absolute truth" in breaking news?


Absolute truth is hard to come by. Absolute falsehood is all over the place.

Let's not straw man by pretending that the existing system is entirely 100% censorship free, either.

(I don't have a clear proposal and it's a complex problem, but in the worst case - Rwanda - some of the people involved in hate radio were eventually charged with crimes against humanity)


You've summed up the age of facebook brilliantly.

> People need to make up their own minds regarding what is true and what isn't, what's worth their time and what's bullshit, which links to click on and which to avoid.

This is exactly what everyone is doing, and exactly the problem.

> we need to spend more money on education

And this is what many agree we need.

> or just accept that some people disagree with us.

And this is exactly what many have settled for since Trump took office.

> aggressively filtering "untrustworthy" content will lead to a net benefit.

It will lead to the Nightly News. But the people have spoken already. They prefer Facebook. They prefer making up their own minds regarding what is true and what isn't.


It's not cognitively possible for a user to simultaneously view every piece of content. Someone has to make a decision about what goes at the top.

Of course, users aren't passive, they could get their news elsewhere or change the filtering. But we know that defaults matter a lot. For many people, the order in which platforms present news is going to be profoundly influential.

The inescapability of choosing defaults settings, and the empirical fact that default settings do influence people a lot, taken together, imply an ethical responsibility on the part of platforms.

Making people smarter would be great - but you've got to be realistic about how effective that would be, and how long it would take.

What other usability problem would you solve by changing the people rather than the technology?


I don't know about spending more on education, but Americans were astoundingly unprepared to even make basic differentiation between the parties, let alone sort through "bullshit".


If we simply leave it up to people, we already know the outcome. Some, maybe even a lot of people will take fabricated lies as facts and proceed to engage other people with these facts, sometimes violently.

By doing that we are empowering the people who fabricate lies to move an agenda. It's ridiculously cheap to make fake news and the value gained is immense because people are having a hard time filtering it themselves. It is the social media equivalent of the Mirai Botnet.

Do we want to live in a world where the best liar wins?


We've always lived in a world where the best liar wins. The best liar is in the White House; we still live in that world today.

The preoccupation with objective truth is a relatively recent notion, and only of interest to what appears to be a minority of the population. The rest are happy to have their biases confirmed, it doesn't much matter whether it's true.

The difference between now and 30 years ago, is that now if someone actually cares about getting to the bottom of something, the tools for doing so are much more accessible, making the ongoing prevalence of bullshit that much more frustrating. We're not getting any stupider, but the gap between the informed and the disinformed or uninformed has widened substantially.

Even here on HN, when it comes to the handful of subjects I'm knowledgable in, it's pretty common to see some masterclass bullshitting holding top spot in a thread.


> We've always lived in a world where the best liar wins.

True, but we also seem to have collectively become a lot more skittish about excoriating bullshit because "everyone does it". That's definitely not the optimal solution.


As I posted elsewhere on this thread, journalism isn't exactly about the straight facts. It's always more about story and slant related to facts, sometimes more loosely than others.

For example, it will take a lot of time for investigators to get close to the truth of what happened but no one is going to wait for official reports before publishing their takes on the incident --and I'm not nor would expect that. But we also can't very well expect only objective truth to be present in news stories/articles.

Should Google and FB do a better job at filtering obviously bogus information? Sure. Should they only source from a "pool" of sources in the era of citizen journalism? I don't think I want to cede that to news orgs who want to perpetuate their aura as the only legitimate purveyors of news.


I wish there were a journalistic outlet that would not care about story. Just facts. Preferably in bullet points (narratives are misleading). I'd be a paying subscriber to that.


Which facts? The very selection of relevance itself tells a story.

However, you can sort of get this if you limit yourself to "agency" news, ie AP or Reuters. If you have a big chunk of money and news is important in your work, you'll have a Bloomberg terminal.


All of them? Maybe I'm naïve, but I don't buy the inherent importance of doing selection. If I could trust that the news source includes all relevant facts they encounter, it'll be enough; I can reach my own conclusions.

> However, you can sort of get this if you limit yourself to "agency" news, ie AP or Reuters.

I've heard something about "wire services" before, but I'm not sure if there are any accessible to individuals at prices a regular person like me could afford. Do you know of any?


How do news organisations get "facts"? Recycling https://news.ycombinator.com/item?id=15356836 :

Journalism is not about digging until you find some kind of underlying "fact", there simply isn't time and what you're asking for could potentially be an extraordinary amount of work; journalism is about reporting what the reporter sees and what people tell them. Obviously it's incumbent on them to evaluate their sources, but not usually to write next to the sources' words the journalist's own opinion of the situation.

So for example in the Vegas situation, there will be a large number of official statements put out by various official bodies; or the reporter might call the hospital or a contact there; or they might go there and interview some people standing around.

But remember that this stuff does not cross the bar into what science or history would consider a fact. It's a snapshot of who's available and summary of what they said. There are always more people or organizations you could call for a quote, more people you could interview. And of course not all of them can or should be reported directly - there may be whistleblowers or vulnerable people to protect with a layer of indirection.


> All of them? Maybe I'm naïve, but I don't buy the inherent importance of doing selection. If I could trust that the news source includes all relevant facts they encounter, it'll be enough; I can reach my own conclusions.

The problem with this thinking is right there in the word "relevant". What's relevant? The name of the shooter? Maybe he had a history with someone in the band? What high schools did they go to? What were their job histories? Where have they lived? What weapons did the shooter use? Are they easy to obtain? Were they legal? Why or why not? Guns can be more or less effective depending on the conditions. What was the weather like? Was it normal weather for Las Vegas at this time of year? Is it ever bad enough to be a concern?

Somebody always chooses what is and isn't relevant. There is no such thing as a complete, ""unbiased"" catalog of all the facts.


Wire services typically sell their feeds to newspapers; nowadays, though, the bigger services make their feeds available on Twitter:

https://twitter.com/AP

https://twitter.com/Reuters

https://twitter.com/AFP

(Associated Press, Reuters, and Agence France-Presse are three of the bigger ones)


Neither is perfect. Today exemplifies that better than most. Citizen journalists gave us amazing raw footage of LV and sent us chasing the wrong man for much of the morning. Professional journalists did better with LV coverage but reported Tom Petty dead and then retracted their stories hours before he finally passed.

I guess I would ask Google and Facebook, etc. to keep 4chan and reddit off the top of news results in favor of more reputable news orgs. At the same time, I recognize that sites like those can absolutely provide unique info that mainstream media cannot. However, we have to recognize that the info is raw and unfiltered. It cannot become canon without further confirmation.


You can't just decide what is true.

Think back to the US election: you can't just "decide" that the story of thousands of Ohio ballots pre-marked for Hillary Clinton was true. http://www.snopes.com/clinton-votes-found-in-warehouse/

You can't just decide that the story of a protester carrying a "Rape Melania" sign was true http://www.snopes.com/2016/11/14/melania-sign-at-anti-trump-...

You can't just decide for yourself that climate change isn't true.

The consequences of people deciding for themselves would be globally disastrous.


...so you outsource your Truth to Snopes, got it. I hope Snopes never lets you down.

People will always draw conclusions based on their own observations. Good thing Einstein didn't have Snopes to say "Actually, the Theory of Relativity is a myth: Pants on Fire rating!"


Snopes cites sources and has spent the last couple decades building a reputation for reliability. That doesn’t make them perfect, any more than the dictionary contains no errors, but the odds overwhelmingly favor them being right over some random internet commenter.


Snopes is apparently beyond reproach on HN.. as the downvotes prove, ha.


We have no hope of ever understanding even a fraction of the things in the world well enough to make an informed decision about them. For example, I am not a climate researcher, so I simply don't understand climate science well enough to make an informed decision about the topic of climate change. I have opinions about it, but I recognize those are based on faith of who I think has the best handle on the truth about this topic. Every one of us takes most things in our lives on faith to one degree or another.

And that's why it matters that google curates content. 99% of what they curate we cannot research in depth ourselves within our lifetime. We have to be able to trust that they've done the job of curation properly.


In other words:

We have free will, and the incidental effects of that exercise of free will have no other effects on our lives and larger society — i.e. the exercise of free will has no effect on the larger world itself, and so we need not look further.


> People need to make up their own minds regarding what is true and what isn't

We kind of tried that with religion and people still persist in believing all kinds of stuff


So?


The problem is when things are not as simple as black & white or good & bad. In reality, things can be fuzzy.


I don't know if I could disagree with the author more.

They didn't fail me. I don't want or need to know about shootings, crashes, floods, etc. Not instantly, anyway. I did not seek out news on Las Vegas: I avoided it.

If anything, the previous news gatekeepers got us used to believing that if we don't stop everything we are somehow uncaring. So plane crashes, shootings, etc. hold an "insta-pass" to the news. An all-hands-on-deck approach.

To me, the author's premise of FB and G as bad gatekeepers is wrong. Information is hardly ever right right away. Nobody I know expects that. Even with that, I believe people know the difference between a photo sharing site, an email and search/map site, and a news organization.


Well Google has a top-level subdomain news.google.com that serves a similar function to a collection of newspaper section front pages. I don't think it's unreasonable to suggest they have some responsibility for what they post. As engineers we are too quick to give them a pass because it's algorithmic, and human intervention wouldn't scale. However it's worth remembering that Google is not a startup, it's one of the biggest and most powerful companies in the world, we shouldn't treat them like an engineering org on a shoestring budget. You might not care personally or have any expectations, but that doesn't mean we are out of line demanding solutions to difficult problems from companies that make such profit off of our information.


I worked at Google News for 5 years, and completely agree w/Alexis. Google promoted 4Chan as a top result for people who wanted to know about Las Vegas; that's a fail, period. It is a fail. It should not have happened; it should not happen again; it should never happen.

Great, maybe you personally weren't fooled. That's entirely and totally irrelevant since there are 7 billion people on the planet and Google (and FB and everyone else) wants to serve all of them. Not all of whom are exactly like you.

Google fucked up, admitted it (kinda) and needs to do better. Period. Not hard to understand.


> that's a fail, period.

I disagree. If that's what people are viewing and if that's where the discussion of the news was happening, then Google should reflect that.

If 4chan were the only source Google was pushing, then fine. But as long as Google is posting a wide variety of "news", then anything, including 4chan, should be allowed.

Edit: People forget that TMZ and even the Enquirer used to be attacked too, but they have produced fine journalism from time to time. And establishment organizations like Rolling Stone, The Atlantic, the NYTimes, etc. have produced terrible journalism from time to time.

No entity has a complete monopoly on facts and news. We should have a diverse set of news/opinions/etc.


> discussion of the news

That's the problem. If I read the article right, Google listed results from 4chan in the news boxes that appear above search results, and that's a problem because in that place you'd expect trustworthy information, not just speculation.

Listing a discussion (even if it's on 4chan) within the regular search results is not a problem and not what the author is criticizing.


> If that's what people are viewing and if that's where the discussion of the news was happening, then Google should reflect that.

Nope. Nooooope. Nuh uh. Not at all. Could not disagree more profoundly.

Google has a responsibility to its users. Surfacing a 4Chan result in a top-results box is a fail for those users who see it. The Google search results in that case are worse than returning nothing at all.

Let me put it differently. There are some sites that should never be treated as authoritative on any subject under any circumstances. 4Chan is one of those sites. It's one thing for a user to be looking for something _on 4Chan_ but for anyone else, 4Chan results are garbage and an algorithm fail 100% of the time (at least, that's what I would argue; obviously I do not work at Google and cannot speak for them).


> Google promoted 4Chan as a top result for people who wanted to know about Las Vegas

Great Job!

But in all seriousness, 4chan and other sites tend to react very fast; was there any manipulation going on, or were they just getting picked up due to high initial activity?


I'm sure there's a lot of truth in the given explanation: the terms that were returning this 4Chan result had basically no search history beforehand, and all of a sudden this post was probably getting a lot of links, traffic, whatever. That's not exculpatory in the least, but it does make sense to me.
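That explanation (a brand-new post matching a query with almost no prior search history) can be sketched as a toy scoring function. To be clear, the weights, signal names, and blending formula below are all invented for illustration and bear no relation to Google's actual ranking:

```python
# Toy illustration only: how a "freshness" signal can dominate ranking when a
# query has essentially no search history. Everything here is made up.

def toy_rank_score(authority: float, freshness: float,
                   query_history_volume: float) -> float:
    """Blend authority and freshness; with sparse history, lean on freshness."""
    # Confidence in historical/authority signals grows with query volume.
    confidence = query_history_volume / (query_history_volume + 100.0)
    return confidence * authority + (1.0 - confidence) * freshness

# A never-before-searched name: history volume is zero for every source.
established = toy_rank_score(authority=0.9, freshness=0.2, query_history_volume=0)
fresh_forum = toy_rank_score(authority=0.1, freshness=0.95, query_history_volume=0)
print(fresh_forum > established)  # True: the fresh, low-authority post wins

# The same two sources on a well-established query: authority dominates again.
established = toy_rank_score(authority=0.9, freshness=0.2, query_history_volume=10_000)
fresh_forum = toy_rank_score(authority=0.1, freshness=0.95, query_history_volume=10_000)
print(established > fresh_forum)  # True: now the authoritative source wins
```

The point of the sketch: when there is no query history, there is no historical signal to weigh against, so whatever is newest and hottest fills the slot.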


> To me, the author's premise of FB and G as bad gatekeepers is wrong. Information is hardly ever right right away. Nobody I know expects that. Even with that, I believe people know the difference between a photo sharing site, an email and search/map site, and a news organization.

I think Facebook does a decent job of presenting a news feed driven by your "friends." Not that it's a good thing (it's certainly not in most cases) but by and large people are seeing the good and bad information that their friends are promoting, which is what they want, because they want to maintain their social awareness and connectedness more than they want to be informed. Facebook also does a pretty good job of indicating the social source of the news, so you can evaluate it in the light of your "friend"'s credibility. Facebook just lets people be people, with all the horrible implications of that.

Google straight-up presents articles as "news" and then disclaims any responsibility for it, on the implicit argument that a company with tens of thousands of employees, billions in market cap, and one of the most visible platforms in the world shouldn't be held to the same standard as a small city newspaper because they're using algorithms. They direct blame to their algorithms and then argue that algorithms are inherently blameless, a trick that we keep letting them get away with. Their excuse of "it's just algorithms" is as cheap and meaningless as "it's just for the lulz." They don't say "here's the news chosen by an algorithm which of course often makes horrific mistakes a human never would." They present "Google News" with layout and sections just like a professionally produced news site, until they screw up, and then they say, "Well, of course it's not journalism, don't make that mistake. It's just algorithms. Didn't you notice the Google colors and subdued graphic design? We're not trying to fool anybody."

Their take this time:

> Within hours, the 4chan story was algorithmically replaced by relevant results.

Ah, so everything worked as designed. No need for humans to take responsibility. Just stand back and let the system work. Again, a small city newspaper that left something like that up for hours would be condemned for callousness and suspected of intentionally promoting disinformation. Google seems to be on its way to arguing that its systems either cannot or should not be overridden by humans, even to correct horrific mistakes, if it's not there already.


> mistakes a human never would

A great line! Have to keep that top of mind when reading anything online.

But G has no Murrow-esque anything, and I think it's unrealistic and unfair to expect it can.

G is simply a live canvas where everybody splatters paint and people walk by the gallery at different times, seeing different things.


You overestimate the intelligence of the general American public.


general American public -> general public


I agree, and the author, Alexis Madrigal, seems to be ignorant of the history of the dissemination of misinformation. He commits the same error as other journalists complaining about Google/Facebook algorithms by not mentioning that "human judgement" within news orgs causes similar problems.

Let's look at how the news publications handled the "truthiness" of Iraq WMD:

- No skepticism that WMD exists: September 2002 article (WMD fever 6 months before the March 2003 invasion of Iraq)[1].

- WMD not found: January 2004 article (10 months after the invasion when soldiers found nothing.)[2]

Before March 2003, all the influential news orgs in the USA, such as the NYTimes, the Washington Post, and all four major TV networks, pandered to the Bush administration's "slam dunk" narrative for Iraq having WMD.

You can't make excuses for those USA news orgs and say they were tricked. That's because the news outlets in Europe (especially the journalists in France and Sweden) were more skeptical about WMD and wrote that attacking Iraq would be a big mistake.

It's fascinating how the journalists write as if they're on a perch of higher judgement over the mass of helpless readers when they themselves fall victims to the same scams of misinformation and spread that misinformation to audiences just like bad Google algorithms.

(See documentaries "War Made Easy"[3] and "Buying the War"[4] that show how journalists lack independent critical judgement and fall in lockstep to spread misinformation.)

>There’s no hiding behind algorithms anymore. The problems cannot be minimized. The machines have shown they are not up to the task of dealing with rare, breaking news events, and it is unlikely that they will be in the near future. More humans must be added to the decision-making process, and the sooner the better.

But humans and their judgement are not even up to the task of slow-moving news like the buildup to war. For the Iraq fiasco, maybe the better solution in 2003 is "more algorithms" in such that USA readers would end up in a 50/50 split with half believing Iraq had WMD and the other half remaining skeptical. Instead, the polls showed that majority of Americans believed Iraq had WMD.[5] How did they get that wrong belief?!? The mainstream news, that's how. (Facebook "fake news" didn't exist in 2003.)

Yes, Google needs better algorithms; but I'm not convinced that people like Alexis or government officials should be the ones regulating the algorithms. They have a proven track record of being as flawed as the readers they're trying to inform with the "truth". (Thought experiment: if Google search results in Feb 2003 had returned more results saying "no proof Iraq has WMD", American journalists and the government would have criticized Google's algorithm for spreading "fake news"!)

[1] http://www.nytimes.com/2002/09/08/world/threats-responses-ir...

[2] https://www.theatlantic.com/magazine/archive/2004/01/spies-l...

[3] https://www.youtube.com/watch?v=R9DjSg6l9Vs

[4] https://www.youtube.com/watch?v=0KzYL6e3sV0

[5] http://news.gallup.com/poll/8623/americans-still-think-iraq-...


Came here to say pretty much this :-)


The reason why those of us on this forum can tell the difference between good and bad information is because we received a good education and learned critical analysis skills.

Facebook, Google and the internet in general are an objectively bad source of knowledge. They are subjectively good sources of knowledge.

The vast majority of people do not have the skills to use the internet as a source for good knowledge.

The contents of the average library contain better knowledge than the internet; the contents of a library at a world-class university, even more so.

The internet is the most overrated institution of contemporary society when it comes to obtaining knowledge.

It is somewhat good for entertainment but probably not better than your average game of Dungeons and Dragons.


> The reason why those of us on this forum can tell the difference between good and bad information is because we received a good education and learned critical analysis skills.

We really need to stop congratulating ourselves on how smart we are and how we would never fall for such blatant lies because we are such highly-educated critical thinkers. There are a bunch of silly beliefs that we witness even amongst us, like Steve Jobs wanting to cure cancer with unorthodox dietary changes, the idea that having no gender parity in tech is both good and natural, or this medical doctor with a respected career who believes the earth is flat:

https://www.gizmodo.com.au/2017/03/the-men-who-believe-the-e...

We're all vulnerable to believe and defend the silliest things and then call ourselves critical thinkers while we do so. We can all be tricked. There are well-documented, effective methods to make us perceive and believe any number of things. Illusionists and magicians make a living out of doing this overtly; journalists and politicians do so less transparently.

Quite frankly, half the time I'm walking around slightly horrified that perhaps I'm believing and defending the stupidest lies and I think everyone else should feel the same way at least part of the time.


Do you believe Steve Jobs could not have told you the statistics on the standard treatment for the cancer he had? Or that he was not aware that most reputable literature on homeopathy is less than encouraging? Turning the tables, do you feel you have access to all the information necessary on his situation and logic to be able to accurately judge his actions?

Society as a whole is becoming increasingly arrogant. If an individual does not agree with "us", they must be somehow wrong, misinformed, or illogical. The reason for this is easy to understand: we all hold the beliefs we do because we hold them to be the most logical and justified. So somebody who doesn't hold such beliefs must be wrong, and we need to "fix" them. Yet of course the exact same is true of people who hold differing beliefs.

And this is in no way to suggest that both sides can be right on issues where there is indeed a clear answer. On many issues one side will be wrong, and one side will be right. And this is perfectly fine. It is critical in society that people are at liberty to make decisions that we may not consider logical, or even appropriate. Elon Musk deciding to spend all of his fortune attempting to start up an aerospace company and an electric car company is certainly something that very few would call a logical endeavor. It is missing the gravity of such risk to suggest it's only money. He gets emotional to this day when speaking of his time then - one can only speculate what would have happened had everything collapsed; there was certainly much more than money at risk there. But he succeeded. So we regard him as brilliant instead of a misguided fool, illustrating the absurdity of our labeling.

If you want something other than money, consider the history of penicillin. Now considered one of the most important medical discoveries of all time, the whole concept of injecting mold into one's body to inhibit bacteria is not exactly what most would call logical. And for years Fleming's discovery was completely disregarded by the public and by medical science. And that's fine - but it's also fine that people are free to pursue outlandish avenues and ideas without condemnation and judgement other than "Well, that's not what I'd think given the body of evidence available."


You're imagining Steve Jobs made some sort of rational choice. It sounds like he pretty much ignored the facts, as well as the advice of family and friends. Here's his biographer's quote to CBS:

"I think that he kind of felt that if you ignore something, if you don't want something to exist, you can have magical thinking. It'd worked for him in the past. He regretted it."


[flagged]


Science can't tell you if something is "natural" or good/bad. Science merely tells you what is/what we observe. (with some probability of being wrong, and subject to bias, etc).

Whether something is good/bad is up to us as a society to define, not science. What is considered good/bad depends on each of our individual goals/ethics/morals, which could differ from one another. This is why spoonfeeding people information saying "X is good" / "X is bad" is dangerous, and it becomes more dangerous when social media platforms allow such information to be amplified.


I would like to see that evidence, thanks.

Edit: specifically, I would like to see evidence 1) that a behavioral trait occurs in one gender identity and is universal across the majority of human cultures, to eliminate cultural bias; 2) that this trait has a causative effect on the career prospects of that person, again across the majority of human cultures, to avoid cultural bias (and also that when this trait is strong in a gender it is not normally correlated with, it causes the same effect).


> I would like to see that evidence, thanks.

You won't find it. This is the problem in the soft sciences. There are certain hypotheses that you just can't test and this is one of them. Choosing to believe that unbalanced gender representations in certain fields is unnatural is just as unfounded as the parent's opposite view. The best we can do is make conjectures based on the results of things we can test but it's incredibly difficult to determine how much the things that are possible to know matter at the population scale.


That’s a very pessimistic view of what science can do. It’s true that the softer fields are harder to experiment in but scientists have many tools for that – it’s not like people just gave up the first time they hit a problem harder to test than Mendel’s peas.

As a few examples, you could do broad reviews to see if different cultures have different outcomes (e.g. the former communist bloc countries having higher percentages of female engineers and scientists undercuts the arguments that this is dictated by biology), draw comparisons from other fields (e.g. the same justifications were used to excuse gender ratios in law, medicine, music, etc., but cultural changes brought greater equality on a timescale far faster than biology could change), or try to find underlying biological explanations — e.g. we didn't need a cultural change to explain why women aren't power lifting as much on average, because there's an underlying mechanism, and there are individuals who have outlier levels of testosterone & other factors, and those people generally perform as the biological mechanism would predict.

Most importantly, this could start by questioning the assumptions used to explain the status quo. For example, did the requisite skills change after the early 1980s when female CS participation declined? Lots of men like to excuse different participation rates with some argument based on mathematical or spatial skills, which could be tested to see how many successful people rely on those skills vs. more equal fields like math or chemistry.


The OP said: "There are a bunch of silly beliefs that we witness even amongst us ... that having no gender parity in tech is both good and natural."

I didn't make any statement on "good" because it was poorly defined. It could be defined as anything from the individual's opinion to measurable outcomes (society's productivity or reported happiness). I think "natural" in this debate tends to mean "biological factors" and not "social and cultural factors". I simply questioned the scientific basis for the OP's belief.

So, how can we know that the rates at which males & females choose to study and work in a given field reflects their "natural" inclination rather than social or cultural factors?

1) We can look at the rates cross-culturally. In search of the numbers relating to "tech", I found the Stack Overflow developer survey (admittedly a subset of "tech"). Stack Overflow references Quantcast visitor numbers in their developer survey[1], so I'll look at those numbers too, around the world. The stackoverflow.com visitors by gender: India: 94% male, 6% female[2]; USA: 88% male, 12% female[3]; UK: 91% male, 9% female[4]; China: 86% male, 14% female[5]; Germany: 95% male, 5% female[6]; Iran: 88% male, 12% female[7]. The numbers seem pretty consistent, suggesting that the imbalance isn't due to social/cultural factors.

2) We can look at differences in interest in newborns. There is much discussion about males on averaging being more interested in "things" vs females on average being more interested in people. Here's the abstract from a study I found on the topic: "Sexual dimorphism in sociability has been documented in humans. The present study aimed to ascertain whether the sexual dimorphism is a result of biological or socio-cultural differences between the two sexes. 102 human neonates, who by definition have not yet been influenced by social and cultural factors, were tested to see if there was a difference in looking time at a face (social object) and a mobile (physical-mechanical object). Results showed that the male infants showed a stronger interest in the physical-mechanical mobile while the female infants showed a stronger interest in the face. The results of this research clearly demonstrate that sex differences are in part biological in origin."[8]

Happy to be shown how my numbers & study are incorrect or insufficient.

Personally, I'd prefer to work in a gender-balanced environment, but I wouldn't want those of either gender to be forced into a role they don't want - I think that would lead to unfair (non-meritocratic) treatment of members of the minority and majority gender. I want everyone to be free to choose their field of study and work, and to be free to fulfil their potential in that field. If that naturally results in male- or female-dominated professions, I don't think it's something that needs to be rebalanced with policy, if that's even achievable.

P.S. Interesting that I was socially censored for this: "What evidence do you have that _______ is good and/or natural? The scientific evidence seems to suggest to me that widespread _______ in many fields is natural." I'd say that social censorship as opposed to free discussion plays into the formation of silly beliefs that the OP complains about. Bad ideas are destroyed with discussion. Bad ideas that can't be discussed can't be destroyed. I'm happy to be corrected on the science regarding the current state of affairs.

References: [1] https://insights.stackoverflow.com/survey/2017#developer-pro... [2] https://www.quantcast.com/stackoverflow.com?country=IN#/demo... [3] https://www.quantcast.com/stackoverflow.com?country=US#/demo... [4] https://www.quantcast.com/stackoverflow.com?country=GB#/demo... [5] https://www.quantcast.com/stackoverflow.com?country=CN#/demo... [6] https://www.quantcast.com/stackoverflow.com?country=DE#/demo... [7] https://www.quantcast.com/stackoverflow.com?country=IR#/demo... [8] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.627...


1) I asked for traits correlated with gender; 2) I asked for traits correlated with career prospects.

1 merely correlates gender with career prospects and assumes traits exist biologically that perform the causative effect.

2 merely correlates behavioral traits and assumes that there is a causative effect on career prospects 10, 20, 30 years down the line.

I do not want assumptions since they make a donkey out of you and me.


Sex differences in personality are exhibited in other primates as well. That rules out culture as the sole generator of them.


That argument still requires evidence that those personality differences exist, aren’t learned (other primates have learned behavior, too – see e.g. http://www.radiolab.org/story/91694-new-baboon/), and are significant to the very complex behaviors under discussion.

The other area it ignores is the degree to which humanity’s distinctive advantage is plasticity. We can learn to do things like be comfortable in an enclosed space full of strangers – imagine a subway full of baboons! – which doesn’t mean that biology can’t be a factor but does mean that it’s really important (and hard) to critically test those assumptions.


There are also experiments (Simon Baron-Cohen) with infants at one day of age showing the differences in interests (human faces vs things).

In that vein we can also include personality changes displayed by transgender people who undergo hormone replacement therapy.

Also studies on the personalities of separated twins.

Everything points to a mix of biology and socialization. The idea of parity in division of labor as the ideal is purely dogmatic.


Simon Baron-Cohen has been pushing that idea for a long time, but while his studies have gotten a lot of attention over his career, his conclusions are far from definitive. There are various papers contesting his conclusions, and review articles have failed to support the bold claims – e.g. https://software.rc.fas.harvard.edu/lds/wp-content/uploads/2...

That doesn’t mean that there’s nothing here but I find it telling that the neuroscientists I used to support were far more skeptical than, say, the Damore fans here. This topic came up a bit and there was a strong consensus that the science was still too early to say anything — there are low level biological differences (e.g. percentages of white matter) but that hadn’t been linked to high-level skills, and the few low-level differences were still being studied to identify the cause. This is all complicated by the plasticity of the brain based on usage so answering even simple-sounding questions is usually a significant percentage of multiple people’s research careers.



In NZ, the owner of an Onion-like satirical website called The Civilian published a piece that very much follows your arguments. It's a great read:

http://www.stuff.co.nz/entertainment/86799000/how-i-accident...


> The vast majority of people do not have the skills to use the internet as a source for good knowledge.

I'm no expert, but I see this too often: people who truly believe fake news and bot comments on Facebook; others clicking on ads because they can't differentiate them from real Google search results; many who don't have the patience to even think about verifying the information. Humans are going to do human things, and these companies know that very well.


"The vast majority of people do not have the skills to use the internet as a source for good knowledge."

And this is known how?

The aristocracy considers the people ignorant and prefers it that way, but history shows this is not always objectively so.

The ideas of freedom of information, of freedom of expression are repugnant to the aristocrat (who always considers they know best), but these are founding principles of so far the most innovative and productive societies history has known. As opposed to aristocracies, which are feudalism and stagnation.

“Scratch a conservative and you find someone who prefers the past over any future. Scratch a liberal and find a closet aristocrat. It’s true! Liberal governments always develop into aristocracies. The bureaucracies betray the true intent of people who form such governments. Right from the first, the little people who formed the governments which promised to equalize the social burdens found themselves suddenly in the hands of bureaucratic aristocracies. Of course, all bureaucracies follow this pattern, but what a hypocrisy to find this even under a communized banner. Ahhh, well, if patterns teach me anything it’s that patterns are repeated.” ― Frank Herbert, God Emperor of Dune


> The ideas of freedom of information, of freedom of expression are repugnant to the aristocrat (who always considers they know best), but these are founding principles of so far the most innovative and productive societies history has known.

I'm not entirely sure this is true. Before the Internet, what we had was mostly freedom of the press - which is to say, the freedom to spread information (or disinformation) by those with the financial means to do so. This is far more free than the preceding eras of widespread censorship, where the only one spreading information (or disinformation) was the government and its agents. But it's not quite a free-for-all. In effect, the ability to spread information on a large scale was still restricted to the "elite", defined in financial terms.

The Internet, and especially social networks with their graphs, changed that. A little money, and a carefully crafted story, go a long way in terms of dispersal and effect. Big money still gives you more opportunities, but even small players can make a difference. And we've yet to see the full effect that this is going to have on our societies.


So much information in the times before freedom of the press was spread without the press at all. The press made not only communication but also censorship feasible on an unprecedented scale.


It certainly was - but it was also spread much, much slower, and filtered (by those passing it along) on every step of that way.


It was filtered by individuals but coordinated filtering was much more difficult -- a crucial distinction.


It was like this in the time of every morning, midday, and evening newspaper as well. Still not a reason to introduce, let alone actively encourage, censorship. While this particular case may be real, those in power will just use this precedent as an example, more and more. It is like a tiny hole in a water barrier: at first it is small, then it grows larger as some people want to justify more "filtering", more and more, until the water destroys the barrier and full-featured censorship arrives. This is already the case, but it can probably still be repaired; we will see.


This is the old-school moderator answer from back in the day when the net was young.

I.e., the old-school binary censorship of "block / do not block": let people decide.

The update to this is that censorship is ALREADY happening without explicit actions by parties which want to do right.

The strategy is to saturate the pipe.

After time on the net, the concept of signal vs noise has been advanced enough that we can now just override the signal, by pushing enough noise.

You are talking about small holes, while you are already surrounded by a deluge.

Once twitter and facebook took off, this was a done deal.

The issue is frankly one of communication strategies (and not even meta-communication, this is from propaganda playbooks).

With enough automation, it's possible to saturate the pipe.

Humans are built with limitations, for example short-term memory, suited for 5-7 "objects".

You don't need to resort to explicit censorship anymore.


> The vast majority of people do not have the skills to use the internet as a source for good knowledge.

We (and others, this is not about us) are wary enough that its consequences are dampened. Shielded because we know, or know enough, what Google and Facebook are. Algorithms on top of data maximizing popularity metrics. And popularity is taken so seriously as to be seen as a measure of truth (see: time spent watching any entertainment series 4th rerun). Be it mind share, clicks, network reach or whatever other criteria that can be deemed to convey content value, "significance and truth" is not it. There is no analytic measure of significance and truth. Even less so one valid across 3.7 billion internet users (value from a quick search, not to be taken too seriously as not all networks are connected, not all content will reach everywhere).

Facebook is not a network of friends. It is a graph of connections whose content acts as bait to keep you navigating it. It preys, displaying notoriety as the utmost standard, and equating interaction with success. Reality pales against the interface of social media, that openly displays number of likes, comments, shares, friends, stats we only catch a glimpse of outside. Your social media past is an absolute, like your current weight or your current number of kidneys, there's nothing more to it. There is no wiggle space. It is a record of your path that allows no perspective. There is only that one photo, which you carefully picked. On that day you shared that link, and that was your day. That is what you are for the network, and nothing more.

As for Google, you are not navigating truth. At most, you are navigating utility, as measured by ranking. There is no truth in search order, nor should you expect to find it there. You still need to think. Why was 4chan first? Something to do with being first, being active, being inflammatory, being accessed often?

From the article:

> 4chan results, they said, had not shown up for general searches about Las Vegas, but only for the name of the misidentified shooter. The reason the 4chan forum post showed up was that it was “fresh” and there were relatively few searches for the falsely accused man. Basically, the algorithms controlling what to show didn’t have a lot to go on, and when something new popped up as searches for the name were ramping up, it was happy to slot it as the first result.

There. If you search for something that 4chan talked about, you get 4chan results. Never mind that you don't know what 4chan is, but it does, and it creates content! This, the article says, is a problem.

The internet is seen as a stream of facts, and when the common man discovers he was fed crap (in his deeply biased opinion, and if he ever notices at all), he complains to the stream provider! Instead of admitting: "Damn, I am stupid for falling for that, I should know better by now. What can I learn? Well, let me add 4chan to a filter, or learn what that place is so I can avoid it."

Google hiked for weeks to a remote location, found the densest of jungles and entered a small opening with a small swamp in the middle. It swam in the putrid water and found something akin to culture. There, in one of their rituals, it found your search term. It took a picture and brought it to you. And you expected it to conform to your ideals of truth. There, as in other places, fringe thoughts are expressed, both real and fake[1].

But the article writer wanted Google, the tap he let open in the comfort of his living room, right in the middle of his truth of existence, to tell him HIS truth. Instead he got doxxing gone bad, users posting in epic threads, and probably someone from /r/4chan asking to be included in the screenshot, as others scrambled to archive the thread, or derail it with Spiderman pics, or even pretend-child-porn so the mods would delete the thread so as not to call the attention of "normies" once the outside world learned about the latest ritual they found: blaming Geary Danley for what happened in Vegas. Most of them aren't even in Vegas, or don't care enough about shootings to see the bigger picture. But they found lulz to be had.

And sure enough, here it is, in the article's conclusion:

> There’s no hiding behind algorithms anymore. The problems cannot be minimized. The machines have shown they are not up to the task of dealing with rare, breaking news events, and it is unlikely that they will be in the near future. More humans must be added to the decision-making process, and the sooner the better.

"The problems". But what are the problems? "The machines (...) are not up to the task (...)" Ok. Now we know that. What next? "More humans must be added to the decision-making process".

No! A thousand times no! Don't touch information, don't shape the stream! 4chan exists, and is made of people! And you want PEOPLE (not the same ones, but also prone to their own rituals and biases) to moderate the stream to the outside world! There are a lot of voices in every controversial (and "controversial" varies a lot) source, and 4chan is one of them! And the solution is to make the stream as one-sided as possible, neutered, comfortable only to some?

We hold two things in our hands:

1) the mind can believe and say pretty much anything

2) internet connects everything: basements to penthouses to marketing departments to psyops operatives to deep swampy lands full of dark terrors to your children

Let's ignore that, pick a select few, and filter the stream. And guess what, there's a bit of swamp in all of us. [2]

After what seems like a troll comment here[0], here's a comeback from a concerned citizen:

> you're aware it's unwise to make those kinds of statements on the Web? Especially when you "Work" for a university?

This is how seriously people take the internet. And the solution is to keep out the weirdos. Or ask them to keep the weirdness inside. Good thing we (the writer of the article) are in charge. Let's just clean up google and social media. Because that is the full extent of the internet. The swamp will disappear as soon as we cleanup google, and the children are safe again.

Google and Facebook are doing just fine. We are failing to build defenses for this globalism of the mind.

[0]: https://techcrunch.com/2017/10/02/how-reports-from-4chan-on-...

[1]: https://www.reddit.com/r/OutOfTheLoop/comments/5xdacv/what_i...

[2]: https://www.goodreads.com/quotes/548087-no-tree-it-is-said-c...


What concrete actions are you proposing that we take? What defenses are there for the "globalism of the mind"?

When I see you write:

>> There is no truth in search order, nor should you expect to find it there.

It seems you decry action from a position of intellectual weakness, e.g., we cannot know what we do, therefore we cannot modify or temper our actions.


Intelligence does NOT prevent one from falling for junk news. I know way too many intelligent people who believe ridiculous stuff for this to be true in any way.


Yes, it takes skill to obtain information from the internet, just like it takes skill to obtain information from reading books in the library (you have to know how to read).

If you lack these skills, you are limited to the comic book section in the library, and to cat videos on the internet.

The internet is a much superior tool for obtaining information, obviously. Just because it takes skill to use it doesn't mean that it's worse.


First, it may be better for you and me. It may be much worse than a library for most other people. It also may be worse than a library at a world-class university for even you or me. It may only be more convenient than going to a library or walking the halls of a university.

I don't think the answer is obvious to anyone who takes the time to properly examine the matter.

Unfortunately for the both of us the medium itself is terrible for philosophical enquiry so we probably won't cover much ground in a discussion framed by popular opinion.

You will now set out to prove me wrong and hope to gather support from people who already agree with you. We will not at all engage in the cooperative quest for wisdom.

Such is life on the internet.


Google is more a conduit of news than a gatekeeper. FB is both a gatekeeper and a conduit to news, as well as a news producer (since it pays or underwrites some orgs to produce news for it).

That said, I think The Atlantic is wrong in placing blame on Google when they themselves engage in questionable journalism (clickbait as well as an often biased news source).

There is no way we're going to come up with something "objectively truthful" because news as it exists today is not about facts. It's journalism, and journalism has always been about putting a slant on things: quoting unauthoritative sources ("a man named Joe claims," etc.), loose fact-checking, and more. All to sell more news (packaged with ads or for subscriptions, where you don't want to alienate your subscriber "base").

This is an attempted land grab by establishment news orgs to corner what counts as news. They want to be the only sanctioned sources of news and punish alternative sources, using an obvious outlier to promote their POV.


> There is no way we're going to come up with something "objectively truthful" because news as it exists today is not about facts. It's journalism, and journalism has always been about putting a slant on things: quoting unauthoritative sources ("a man named Joe claims," etc.), loose fact-checking, and more. All to sell more news (packaged with ads or for subscriptions, where you don't want to alienate your subscriber "base").

I look at it much differently: Nothing is perfect, no news source, no politician, no text editor, no scientific theory. They all fail our ideals but some are a lot better than others, some are a little better, and as others have pointed out in this discussion, an essential skill is to differentiate between them and also to assume the uncertainties of imperfection and to read critically.

To throw together all journalism as 'imperfect' is like throwing together all science, from the theory of evolution to latest upload to arXiv, as all a failure because it's imperfect. Yes, read Darwin critically, but don't conflate his work with everyone else's. The more I accept inevitable human imperfection, the more I'm amazed by people like Darwin and by organizations like the NY Times, which cranks out a very good, very difficult product on an amazing schedule. That they and others accomplish all of that under the constraints of humanity is truly amazing to me.

> The Atlantic is wrong in placing blame on Google when they themselves ...

We're kind of stuck with ideas from hypocrites, at least until the great AI awakening - and if it's strong AI, hypocrisy might be the first sign that it's working.


I think The Atlantic wants to have its cake and eat it too:

>The problems with surfacing this man’s group to Facebook users is obvious to literally any human.

So it's obvious to any human, yet people aren't good enough to differentiate fake from real news.

>Most people who joined the group looking for information presumably don’t know that the founder is notorious for legal and informational hijinks

I mean, the author is picking and choosing anecdotes to make a point in order to further their agenda.

On the other hand I agree with you, but it's irrelevant to the author's point. The author, I would posit, is not really interested in sourcing better news (it's not impossible, for example, for 4chan to have the best info on a particular issue, via a contributor, while The Atlantic has none). They are arguing that 4chan should not be surfaced at all, not that 4chan should be better scrutinized and only surface accurate information.

It's an underhanded and opportunistic attempt to delegitimize non professional journos and build a moat around the "profession".


> That said, I think The Atlantic is wrong in placing blame on Google when they themselves engage in questionable journalism (clickbait as well as an often biased news source).

They're hypocritical but they're not wrong.


> That said, I think The Atlantic is wrong in placing blame on Google when they themselves engage in questionable journalism (clickbait as well as an often biased news source).

The Atlantic and the rest of the traditional media just want Google to link their clickbait stories. It's ultimately all about money.

> using an obvious outlier to promote their POV.

As per usual. I wonder how The Atlantic would like Google to use an outlier (the Rolling Stone UVA story) to ban all traditional media.

At the end of the day, this guy has to write something so that The Atlantic will pay him. So he just went with his usual story. The guy is no more informed than the rest of us; he just gets paid to write silly stories.


Google is all about filtering. It doesn't just show you a random selection of every page on the internet containing your search term. Instead it applies subjective criteria to present you with what Google thinks would be the "best" results for you.

This is why people use Google.

So I don't understand when people make the generic argument against filtering, that filtering is categorically bad and Google shouldn't be in the business of deciding what we see and don't see. That's the whole point of Google!


> This is why people use Google.

Wait, what? I use it despite that. Since 2000. Everybody seeing the same search results was never a problem, and sometimes even useful. There is a wide area between "just show random pages" (which probably not even the first prototype of any search engine ever did) and a "personal" filter you can't configure at all.


I don't mean the filter is personalized to you, I just mean that there is a filter! That is, Google doesn't believe every website is equally "good", they Rank Pages based on subjective criteria.

I just don't get why no one has a problem with Google trying to filter out spam (even though it can't be detected with 100% accuracy) but trying to prevent 4chan/GatewayPundit's blatant and obvious lies from being elevated is a place to take a principled stand.

Where are the tears for the spammers and scammers who have been so unfairly censored by Google for so many years?


> I just don't get why no one has a problem with Google trying to filter out spam (even though it can't be detected with 100% accuracy) but trying to prevent 4chan/GatewayPundit's blatant and obvious lies from being elevated is a place to take a principled stand.

What does this have to do with personal filters? Yes, there are filters, Google has to "make decisions", and parse language. That's a given. But that doesn't require showing different things to different people.

With a personalized algorithm behind the curtain, googling for "Pizza" might show you where to get Pizza near where you are, and let's say "Pizza in general" would give the results that just typing "Pizza" without a personal filter would give. So you have "Pizza / Pizza in general" vs. "Pizza shop in $city / Pizza". Big deal? What was gained here? You trade some convenience for the impossibility to get the search results everybody else is getting.

When I ask someone what they think is the X-est Y, and they first have a list of questions for me to answer first so they can tailor their response to me -- not to clarify the question, mind you, but so they can tell me something I might like to hear -- I won't be asking that person for their opinion again.

> they Rank Pages based on subjective criteria.

That's not how it works:

https://en.wikipedia.org/wiki/PageRank


I feel like we're talking past each other. Personalization of search results is orthogonal to this article and discussion so I'm going to pass on responding to your points there.

As for PageRank, I really appreciate the Wikipedia link. Super helpful. If you're arguing that PR is not subjective because it uses an algorithm, my response is: the algorithm encodes subjective judgements about how to evaluate "quality" in a web page. For example, the subjective judgement that a web page is likely to be of higher quality if there are a lot of other high quality web pages that link to it.
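To make the point concrete, here is a toy power-iteration version of the idea behind PageRank. The tiny link graph is invented for illustration; this is a sketch of the published algorithm's core idea, not Google's actual ranking code, which layers many more signals on top. Note how the subjective judgement ("a page is good if good pages link to it") is baked directly into the update rule:

```python
# Toy PageRank via power iteration. The graph below is made up.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Everyone starts each round with the "random surfer" baseline...
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # ...then each page passes its rank to the pages it links to.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

graph = {
    "hub": ["a", "b"],
    "a": ["hub"],
    "b": ["hub"],
}
ranks = pagerank(graph)
# "hub" ends up with the highest score purely because of link structure.
```

Nothing in that loop measures truth; the weighting scheme itself is the editorial judgement.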


For the downvoters, this may come as a shock to you, but you don't get to tell me the reality of my life, either. And there is more than "random matches" and "personal filter"; that's so banal it shouldn't even need saying, but apparently it does. So far you've only shown the desire to shut up what you cannot argue with.


It's rather bold for the Atlantic to complain about low quality news when they have clickbait garbage at the bottom of every article.


It's not either/or; many high quality news sources have clickbait rubbish stories to help pay the bills for the more expensive, higher quality reporting. It's either that, or paywalls (or both)


Not 100% surprising that journalists would view the machines that have wrested control of the news from them with hostility. Facebook definitely could use improvements, and it sounds like Google's search ranking in these rare kinds of situations could too, but the vitriol here — and the suggestion to replace machine ranking with ... surprise, journalists! — makes it seem like the "Us" who've been failed by ranking algorithms might specifically refer to The Atlantic.


You are right, but on the other hand, we pay Facebook dearly with our personal data and in return we get to do the work journalists should do, of navigating in a sea of information, most of it fact-free. There must be a more fair distribution of time and income.


Google and Facebook have failed the Atlantic which is seizing on tragedy to imply only sanctioned media (them of course) should be available through Internet searches.

However, most of the rest of us are not failed, as we already know results from 4chan probably aren't accurate.

Please stop failing all of us with this desperate bid to get back in control of information dispersal there The Atlantic. I'd much rather see a smattering of nonsense and be able to make that determination for myself rather than you deciding what should be available for viewing.


Why is this getting downvotes? It's spot on. The "mainstream media" have been failing us for years, yet now suddenly it's all Google's and Facebook's fault that people are getting inaccurate information fed to them?


+1


+1 There are plenty of brave journalists/bloggers but very few editors or corporate media vehicles prepared to publish their investigative reporting. I wouldn't dream of using FB as a go-to 'news' source, while Google is essentially an aggregator/curator of information sources I may or may not take seriously. The idea that these lightweight websites should somehow be the gatekeepers of 'the truth' is pretty offensive. I prefer to triangulate across multiple sources and make up my own mind based on the information I uncover. Example: https://lasvegassun.com/news/2017/oct/02/gunman-who-killed-5...

A lead story in the local Vegas newspaper states the alleged gunman was a multi millionaire property owner. Is this true? I have no way of proving it but it is certainly food for thought...


Has the average person heard of 4-chan? If they had, would they understand it as anything different to Facebook, which they seem to trust, or the comments section under a news story, ditto?

People don’t know what accurate and trustworthy looks like. Even when things are labeled as fiction, they can have an effect: Scottish independence became more popular when the film “Braveheart” was released, while Islamophobia went up after “American Sniper” was released. Or am I just repeating plausible but fictional urban legends that I’ve mistaken for reality?


Speculation on motive begins immediately when we learn what the target was and the weapons that were used. You can't censor that without censoring the entire incident.


That is as irrelevant as it is correct. Yes, humans speculate, but we shouldn’t be amplifying our known cognitive biases by presenting wild speculation (let alone 4chan) as if it were authoritative truth.


The article is placing the blame on Google and Facebook when it should be placed on us, the "consumers" of media. We need to take a look in the mirror and understand who we are and what we want.

Ask yourself. Really ask yourself:

- What kind of information am I looking for and why? Am I just a digital rubbernecker? Do I want to break this story at the watercooler? Or do I need to know if a loved one was hurt?

- Do I need to know shortly after an event occurs or am I able to tolerate a little bit of delay in my gratification?

Shortly after an event, there are many stories and they haven't yet been synthesized into one narrative. There are people who want to capitalize on tragedy, and their voices are part of the mix. Over time, the noise is stripped away and what we get is the official narrative, one we can "count" on. If you are willing and able to wait until the official narrative comes out, then a) congratulations, you have willpower unheard of in this day and age, and b) you get the benefit of skipping all of the intermediary chaos states.

If you understand why you need to know and when you need to know it, then you can control the quality of the news you expose yourself to.


These kinds of statements seem deep and meaningful, but ultimately they are vacuous.

The only reason any market entity exists is because people buy things from them; it's the buyer's fault.

The only reason corrupt governments exist is because the people elect them, or at the very least don't overthrow them; it's the governed's fault.

It's a stupid oversimplification that places all the blame on the giant, vague, and futile "everyone", while ignoring incentive structures and pretty much every ounce of nuance and subtlety.

Not to mention that it totally ignores how entities like the aforementioned, and amagoosoftbook, shape our preferences, both actively and unintentionally.

It's a hopeless putdown of basically any ability for anything to change, ever.


I am speaking specifically about this particular article and the topic it is covering.

Whatever assumptions you used to arrive at your conclusions were introduced by the story you wanted to tell. I'd rather you didn't extrapolate from what I wrote.

Ask me nicely and I will tell you why fixing Facebook and Google are not the answer.


The original replier doesn't sound as if they're extrapolating to me. You're also telling a story, and shouldn't be the only one in the conversation to do so.

> Over time, the noise is stripped away and what we get is the official narrative and that is one we can "count" on.

I'd ask why fixing Facebook or Google isn't the answer, but that would play into _your_ extrapolating narrative. Instead, why shouldn't Google and Facebook be responsible, and held accountable, for posting absolutely fake news automatically? It's not a "firehose" of "unsubstantiated" information; it's projected as "news".


You've simply restated the problem. Why not contribute to the discussion?

How would you fix Google and Facebook? How would you hold them responsible and accountable? And why?


> You've simply restated the problem. Why not contribute to the discussion?

I'm sorry, but I've already asked a direct question and do not consider it a restatement of the problem.


It really would be a better exercise for you to think through how you would actually hold them accountable.

What service do they provide, concretely? What are their SLAs? What does your contract with them say? Every service is a contract between a service provider and a paying customer. There is a specific set of parameters the service provider must adhere to. What would a contract with Facebook or Google and their non-paying customers look like? Should people pay for the service or should we expect Google to eat the cost?

What is "news"? How do you know it when you see it? How do they detect fake news? What about all of the opinion pieces that appear on the Google News front page? Should those be included or excluded? Are we talking about white listing websites? Who controls that list and how do you get on it? Or, should Google hire people to filter the news for you?

I am asking you to think about what you are proposing so that you can see that there are problems there, some of them big in scope. Refusing to think it through is not a solution.


I see you've asked another series of questions without answering the one posed. In many respects, this might be seen as changing the conversation to something within your realm instead of answering a question.

> It really would be a better exercise for you to think through how you would actually hold them accountable.

In short, the questions you've posed range from "bring your life experiences together" to "think through the specifics". I have, and my thoughts remain unchanged.

Posing that many questions across such a broad range of subjects and varying levels of specificity might be seen as a disingenuous way to continue the conversation.

I have posed a straightforward question. Instead of an answer, I've received only other "thought provoking" questions. In response, that's how I arrived at my question to you, two responses ago.

In the future, you might find better dialogue by answering questions posed.


Huh? Why would you blame your car or someone else's car for driving when someone slams into you from behind?

You blame the brakes when they stop braking. The car when it stops driving. Don't blame the car when the people are at fault.

It's hilarious seeing people blame Facebook as if it's some real person trying to trick everyone, when in fact it's just someone using the tool wrongly... hint: you can use nearly every tool wrong. See: 'Man kills person with knife that is only for cutting bread.'

When you start to blame things like Facebook or Google for doing their job and presenting popular/trending topics, you lose any grasp on the real problem... and that is much uglier, so I don't blame those that do.

Would you stop using bread knives because they cut bread fine, but kill people poorly?


There's a large difference between vehicles, which are wholly owned by a consumer, and services.

Running an open source news aggregator on my personal server is something I would own and be responsible for, similar to fully owning a car. If I selected a ride share and the driver wrecks, then a company is responsible for my personal injury as well as the service I didn't receive.

> When you start to blame things like Facebook or Google for doing their job and presenting popular/trending topics, you lose any grasp on the real problem

Suggesting someone who blames Facebook or Google has "lost sight" is a very confident statement that you have not backed up. If this question were settled then there would be no thread on HN.

I _consider_ looking into the matter more. Google and Facebook play in the news aggregation space, and getting it wrong has led to serious questions about their role.

I stand by looking into things further. Please feel free to clarify your claim that someone has "lost grasp on the real problem" when they start to blame Facebook and Google. Should either be responsible then blame should be expected.


>If this question were settled then there would be no thread on HN.

That contradicts your idea of "free personal news", which can be attributed to one name. (HN/Facebook..)

Blame those that do, not those that allow to do. Selling knives could be made illegal if people chose to use them badly. This is not solving any problem.


>Do I need to know shortly after an event occurs

The traditional media is pushing "minute-by-minute" news pretty hard as well, but you're right in that people would rather have information right now, instead of correct information in a little while. The result is misinformation, deliberate or not, and some times that flawed information will taint the debate afterwards.

I can understand people wanting to know if loved ones are safe, and Facebook is actually helping in that regard, but there needs to be a delay on information, long enough to get verification.

Believing that Google, YouTube and Facebook are to blame for low-quality information is missing the point. Twitter and Facebook, for instance, have often been the source of truthful and detailed information faster than the traditional media. Likewise, traditional media has also contributed to the spreading of false information. Often this isn't deliberate, but because journalists are getting increasingly worse at their jobs. What I believe is happening is that journalists now just sit at their desks, cruising the internet and covering stories that way, but they aren't capable of critically evaluating the incoming information they read there.


tl;dr: We don't think people are capable of exercising judgment, so we want the computers at Google and Facebook to do it for them. Good luck with that.


It's not that they are incapable of exercising judgement. It's that when they do exercise judgement, it is consistently towards bad content. False content.


This problem needs to be fixed with education. The article demonstrates an extremely profound misunderstanding of technology.


Any problem which ends with "education is needed to fix it" will fail.

Education is an unsolved problem, and you may as well say "society will fix it" to get a better approximation of what education entails.

Education on a subject is influenced and affected by

1) Student health, motivation, previous education, available time, available security, water, electricity, nutrition, parental nutrition, parental education, stability, age, and so on.

2) Education and subject matter are affected by the development of the country, politics of the country, cultural importance of the subject, job demands, payment/funds available for education, structure of educational systems, and more.

So in other words, it's not "fixable".

Do note that if you developed a way to transpose information, meaning, and expertise directly into someone's brain, à la The Matrix, you would not only "solve" education, but you would also be able to program legions of soldiers/attackers/drones with the same technology.


What kind of education do you propose? Who would it target? How much would it cost? Who would be responsible for implementing it? Who would pay for it? Who would be responsible for overseeing their implementation? By what metric would we determine success or failure? Over what time scale do you propose the education to last?

This is a problem now. Your solution has too many associated questions to be implemented immediately. Without the answers to at least some of the questions I posed, I cannot agree that your proposal is the remedy.


It's not that easy. To trust a piece of news requires time (for example, for cross-checking over multiple news outlets), and that's a luxury I (and many others) don't have.

We do (indirectly) pay Facebook and Google, and we do not get this filtering service at all.


> Worse, when I asked Google about this, and indicated why I thought it was a severe problem, they sent back boilerplate.

> Unfortunately, early this morning we were briefly surfacing an inaccurate 4chan website in our Search results for a small number of queries. Within hours, the 4chan story was algorithmically replaced by relevant results. This should not have appeared for any queries, and we’ll continue to make algorithmic improvements to prevent this from happening in the future.

I believe Google can be lax about the quality of results because it has no real competitor. These are the moments when I wish there were healthy competition among search engines, so that all of them would be on their toes to deliver the best results.

Having said that, did anyone try DDG or Bing during the initial period after the event? How was the quality of the news results on those sites?


The real crime is the assumption that companies with their own self-interests at the forefront are somehow benevolent arbiters of information on the Internet. If anything, it shows the naive crassness of the media.


Nailed it. Exactly on what timeline do these people think these megacorp clickfarms were constitutionally/divinely designated as arbiters of truth? They have always been run by trash-purveying ad companies and have always been corrosive [1].

[1] https://www.anxiety.org/social-media-causes-anxiety


It's companies fighting other companies for the mindshare of the public and profits. Your modern day "journalist" has no convictions anymore, they weave whatever tale their editors want to tell, whether it's true or not. This is our dire future, and I don't see a way out of it.


I agree with the article. I think more humans do need to be added to the "decision-making" process.

I mean, c'mon, promoting 4chan to the top spot in a news channel? Too many people in this country don't bother to check; they figure if it's the top article, it must be true.


> I agree with the article. I think more humans do need to be added to the "decision-making" process.

Not really useful when most journalists already demonstrate how even professionals can't check their sources.


Still needs the ron paul gif


Reddit and 4chan had more informative coverage, quicker, than all the news outlets.

They did not mislead me on anything about the incident. (They did have misleading comments that were easy to ignore)

Sorry, but just because there are dumb people in the world, why should my results on Google be censored?

It's also bullshit: 4chan is very well known as a poor source and looks nothing like a newspaper, so the idea that it would be mistaken for a fact-based news outlet says more about how dumb the writer and people with this belief are than anything else.

This writer is so superior they have to save us dummies from facebook groups and 4chan? I find it insulting.


Well, if you had been on a certain subreddit, you would be worried about false flag operations after a point.

I will agree that several networks can get information out faster than older systems.


If we're all such smart people/techies, can we spitball a better breaking-news feed algorithm into existence?

I'll kick it off -

1. Focus all news on Who? What? When? Where? Omit the "Why?".

2. When in doubt cite the primary news sources (police feeds/courtrooms/local authorities/opposition leaders/pictures/geo tagged and timestamped audio&video feeds with geo tags)

3. Secondary news sources like news orgs and journalist videos are second in importance

4. Quote tertiary news outlets like Reddit/HN/4chan only when they can be cross-verified with a primary source

More suggestions?
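Points 2-4 above amount to a tiered admission rule, which could be sketched like this. The domain lists, the `admit` function, and the cross-verification callback are all illustrative assumptions, not a description of any real system:

```python
# Hypothetical source tiers; the actual lists would be curated.
PRIMARY = {"police-scanner.example", "court-records.example"}
SECONDARY = {"reuters.com", "apnews.com"}
TERTIARY = {"reddit.com", "news.ycombinator.com", "4chan.org"}

def admit(item, verified_by_primary):
    """Decide whether a breaking-news item may enter the feed."""
    source = item["source"]
    if source in PRIMARY or source in SECONDARY:
        return True
    if source in TERTIARY:
        # Rule 4: discussion boards only with primary-source corroboration.
        return verified_by_primary(item)
    return False  # unknown sources stay out of breaking-news slots

# An uncorroborated rumor from a board is dropped:
rumor = {"source": "4chan.org", "claim": "misidentified shooter"}
admitted = admit(rumor, verified_by_primary=lambda item: False)
```

The hard part, of course, is implementing `verified_by_primary` honestly; the tier structure itself is the easy 10%.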


Just a comment:

Reputation is key.

Having studied game theory for systems of agents producing results and content, I feel fairly confident in this conclusion: the only reliable long-term design to incentivize beneficent, truthful behavior involves assigning good reputation in return for good behavior, and assigning future rewards/attention based on past reputation.

Based on this, I postulate that any good news feed algorithm must use reputation scores based on past history to choose what to believe and what to show when a new story comes out. Humans of course do this instinctively, which many modern news organizations will have discovered too late as they sell out their integrity, accuracy, and priorities.

Remember what von Neumann said about arithmetic attempts to generate random numbers? Similarly, anyone who attempts to algorithmically determine "truth" based on instantaneous metrics is living in a state of sin.
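The postulate above can be sketched concretely. The exponential-moving-average update rule, the source names, and the popularity numbers below are my own illustrative choices, not anything the comment specifies; the point is only that accumulated history, not instantaneous popularity, drives selection:

```python
def update_reputation(rep, source, was_accurate, alpha=0.1):
    """Nudge a source's score toward 1.0 or 0.0 after fact-checking."""
    current = rep.get(source, 0.5)  # unknown sources start neutral
    target = 1.0 if was_accurate else 0.0
    rep[source] = (1 - alpha) * current + alpha * target
    return rep

def score(story, rep):
    """Popularity alone can't rescue a source with a bad history."""
    return rep.get(story["source"], 0.5) * story["popularity"]

# Build up a track record over 20 fact-checked stories each:
rep = {}
for _ in range(20):
    update_reputation(rep, "careful-wire", was_accurate=True)
    update_reputation(rep, "rumor-mill", was_accurate=False)

stories = [
    {"source": "rumor-mill", "popularity": 100},
    {"source": "careful-wire", "popularity": 40},
]
best = max(stories, key=lambda s: score(s, rep))
```

Even at 2.5x the popularity, the source with a history of inaccuracy loses the slot, which is the behavior the instantaneous-metrics approach cannot provide.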


When it comes to sourcing information from discussion boards, each top thread is effectively the one and only time the algorithm will interact with that thread for that topic.

Unless each thread is monitored and updated by a small subset of power users whose reputation matters when weighting, it is likely that the people managing a thread are volunteers at the scene of breaking news, or have been (self-)co-opted into that role by circumstances. Such self-selected thread managers can safely be treated as one-time participants in the prisoner's dilemma game.

If this is true, the algorithm can safely assume that the other prisoner (the "thread") is going to betray them and so, betray them first by low-weighting them.

Of course, you'd use some sort of reputation for the source of the threads - for breaking tech news, a GitHub issue tracker is probably going to be a better source than an HN thread, which beats a general-purpose Reddit thread, which beats a 4chan thread.


> Of course, you use some sort of reputation for the source of the threads

Yes, this is what I mean. A news source should have a history of correctness before being trusted to report future news. The issues you're discussing are exactly the reason we need reputation -- otherwise some false news can arrive once, get a ton of attention, and do a lot of damage before going away forever (and we see this on e.g. reddit).


Reputation can be gamed and manipulated in much the same way that any other information can be. Case in point: which information sources get high-profile stories like this one accusing them of failing us and spreading dangerous rumours is almost entirely a function of the current preferred media narrative. I've seen plenty of reputable journalists start and spread viral bullshit on social media that becomes widely believed; that issue doesn't fit the narrative, but making a big deal about a Google search that would only ever be made by people who'd already come across the same false claim somewhere else does.


> Reputation can be gamed and manipulated in much the same way that any other information can be.

Yes, it's not foolproof, but it's still a vital piece of the puzzle. With a good reputation system (and I'm not saying such a thing is easy, or perhaps even doable) "reputable" journalists won't stay reputable for long if they behave as you mention.

For example, here's a very crude reputation system: politifact (to pick an example). Imagine if Google just cut from "News" results any site with a certain number of bad ratings from politifact in the past year. This is gameable and not sustainable, but it would be a decent start.
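That crude filter might look something like this. The threshold, the rating labels, and the record format are all made up for the sketch; I'm not assuming any real fact-checker API here:

```python
# Crude news-source filter: drop any outlet that accumulated too many
# "False"-style fact-check ratings in the past year. The threshold and
# the data shape are invented for illustration.

from datetime import datetime, timedelta

BAD_RATINGS = {"False", "Pants on Fire"}
THRESHOLD = 3  # strikes allowed per trailing year

def eligible_sources(ratings, now=None):
    """ratings: list of {"source": str, "rating": str, "date": datetime}."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=365)
    bad_counts = {}
    for r in ratings:
        if r["rating"] in BAD_RATINGS and r["date"] >= cutoff:
            bad_counts[r["source"]] = bad_counts.get(r["source"], 0) + 1
    all_sources = {r["source"] for r in ratings}
    return {s for s in all_sources if bad_counts.get(s, 0) < THRESHOLD}
```

The rolling one-year window is the part that makes it "not sustainable" as-is: a source can sit out its strikes and come back with no penalty.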


I've been toying with the idea of building a meta-news site of sorts for a while. Only one page per story (as opposed to the scatter-shot approach of normal news sites), stick just to facts, don't report death tolls while the bodies are still warm, anti-sensationalist and so on. Just curate a list of primary and secondary sources for a given story and present an overview of the story with opposing viewpoints. Very dry from a presentation perspective.


How do you make it sufficiently authoritative and trustworthy that the general population, including us, doesn't go out and seek hearsay and speculation to get their jollies?

I mean, what interest could the identity of the killer have for people while the shooting was still in progress? Why not just stop once the basic unbiased facts have been surfaced? Everything else can wait for the cold light of reason and time.


The difficulty is that curating involves editorial decision-making and some threshold for what you consider to be a fact. There's very little someone won't find fault with. But maybe if the site's editorial standards are also published, people could adjust for that themselves?


Since Google and co. were never given the responsibility of telling anyone the truth, this seems to be another naked power grab to spread FUD and scare stories so someone can have a 'monopoly on truth'.

Diversity of opinion and widespread rumour mongering is a natural state of affairs in human society. Those who are unsettled by this and seek some sort of uniformity and control betray a troubling megalomania. Elevating yourself over others to decide somehow only 'you' are able to discern the facts confirms it.

That leads not to Google or Facebook curating news but to 'one source of truth' controlled by the government and authoritarian entities, using the exact same logic. Some people may be experts at C and Go but betray an astonishing illiteracy when it comes to history, freedom and the evolution of human society.

Notions of truth have been debated for eons, so there is already a large body of knowledge. You need an educated, literate population, and to accept that people will have wildly differing views. Trying to protect the 'ignorant' not only turns your society into a tightly controlled cage but reflects a streak of authoritarianism and megalomania in the 'protectors,' now prevalent among many technical folks.


"In the crucial early hours after the Las Vegas mass shooting, it happened again: Hoaxes, completely unverified rumors, failed witch hunts, and blatant falsehoods spread across the internet."

It's the nature of the internet and social media. People rant, rave and gossip. It's not google or facebook's job to curate what people say, think or do.

I wish google and facebook and the social media companies would unite to fight back against traditional media. They've been attacked for the past couple of years relentlessly.

> their active role in damaging the quality of information reaching the public.

With all due respect: if google/fb/social media wanted to filter based on quality of information, they would ban theatlantic and the rest of the traditional media.

At the end of the day, this guy is just complaining that people are being people. Eventually things get sorted out.

Looking at the guy's stories list, it seems like all he does is whine about facebook and social media.

https://www.theatlantic.com/author/alexis-madrigal/

Who is alexis madrigal that we should even pay attention to him?


> Gabe Rivera, who runs a tech-news service called Techmeme that uses humans and algorithms to identify important stories ...

wait a second, you mean a 3rd party site can do a better job at filtering out fake news than Google/Facebook? and people can instead just go to a 3rd party site for high quality news?

but The Atlantic thinks Google/Facebook should have all the power and ace out the little 3rd party companies who offer a better product?

waaaaah?!


Some conservatives are concerned about liberal viewpoints on Facebook and consider it a conspiracy. Some liberals are concerned about conservative viewpoints on Facebook and consider it a conspiracy. Almost everyone is concerned about "trash" when it takes a viewpoint they oppose, less concerned about trash when it promotes a viewpoint they agree with.

Everyone thinks they know what "Fake News" is and most people who are vested think Google's algorithms should rank them higher and less deserving sites lower.

As with most human opinion, it's a matter of self-interest, and so is the parent article. Stripped of the holier-than-thou mantle, it's whining about page ranking. Common whine. Lots of us have it. But most of us aren't nasty enough to seize on tragedy in order to make the point. Which is just another piece of evidence about the nature of old media and why it shouldn't have a monopoly on information.

Personally I'm ok with semi-dumb pipes. I think in the aggregate it's better for all of us regardless of persuasion.


Google admits to scraping 4chan for news stories. This isn't just about "the algorithms" - a human decided that 4chan could potentially show up in that carousel.

https://arstechnica.com/information-technology/2017/10/googl...

Google News' statement claims that these false reports landed on the service's "Top Stories" feed due to a burst of activity for a name that had never received many search attempts. "When the fresh 4chan story broke, it triggered Top Stories, which unfortunately led to this inaccurate result," the statement reads.

"We use a number of signals to determine the ranking of results—this includes both the authoritativeness of a site as well as how fresh it is," the statement continues. "We're constantly working to improve the balance and, in this case, did not get it right."


Last I heard, 4chan founder Chris Poole has been employed by Google for about a year now. Shortly after he joined, people on various boards noticed /pol/ results etc. showing up in Google News. I'm not sure whether that timing is a coincidence or not :)

At any rate, yeah, someone had to approve including message boards as a potential source to scrape for news. Not a good decision in my mind -- 4Chan's reputation makes this decision even more questionable, sure, but I can't think of any general social media / message boards / comment sections / etc. that would qualify as "news" in my mind. Certainly this lowers the value of Google News to me.


The value of the systems built by Google and Facebook and the quality/legitimacy of news from these systems is pretty much dependent on the users. The feeds are built by what users share. A platform is only as good as the people contributing to it.

It is known that insanity in individuals is rare, but in groups, nations, and epochs (and also on the internet) it is the rule - so don't look to google/facebook news feeds as sources of valid information. If we are all smart and intelligent enough to use these systems wisely, the benefits are enormous. But they also allow trolls and people who want to create confusion to operate with freedom.

For no money you will obviously get poor-quality news. You need editors to verify sources and then put stories in print or online. Go pay for an online newspaper so its editors have jobs to do what you expect and don't need to publish click-bait articles just for ad money to keep the firm afloat.


If the results can't be trusted, don't use Google or (especially) Facebook as your news source. As I understand it, Google aggregates its news - it hasn't got someone picking the articles. Stick with an established news source - I'll pick the BBC because I live in the UK.


The reason fake news, clickbait and SEO techniques on poor content are able to prevail is the same reason spam mail is still a thing in 2017. Because it's an evolutionary race where each innovation is countered by a trick to avoid it.


"The problems with surfacing this man’s group to Facebook users is obvious to literally any human. But to Facebook’s algorithms, it’s just a fast-growing group with an engaged community."

"Most people who joined the group looking for information presumably don’t know that the founder is notorious for legal and informational hijinks."

Those two paragraphs appear back to back in the article and are completely contradictory. 5,000 people joined because it isn't "obvious to literally any human."

[edited to add quotation marks to make my comment clearer]


It isn't important for the general public to know accurate trivia about tragic events within hours of their occurrence. One would think a monthly magazine writer would know that...


Actually, I would tend to think net neutrality should apply to content hosts as well. Provided the content is not illegal, obviously.

The best thing to do to fight false info is to shame those who share it, just like we shame news outlets which do not verify their sources. It's probably the best way to make sure everybody double-checks what they post (plus, if citizens start to check their sources, that's a huge win).


If you get your news from your neighbor, friend, relative or a site that claims to be satire, you're probably confusing gossip with news.


Not to deny there's a problem, but this quote: "the problems in the system because there are computers in the decision loop"... Does the author expect hand-made search results?


It's at least a little bit ironic that on the same day, major news outlets announced the death of a celebrity before he had actually died, and the story stayed up for hours.


Google and Facebook were in it for the general good? ...I never knew.


People being too stupid to discern between 4chan and real news is not Facebook's nor Google's nor my problem. How about promoting critical thinking instead?


I think more of an issue, besides google and facebook, would be the way reporters used the information and also presented twitter as an unequivocal source of information.



