Having studied game theory for systems of agents producing results and content, I feel fairly confident in this conclusion: the only reliable long-term design to incentivize beneficent, truthful behavior involves assigning good reputation in return for good behavior, and assigning future rewards/attention based on past reputation.
Based on this, I postulate that any good news feed algorithm must use reputation scores, built from past history, to decide what to believe and what to show when a new story comes out. Humans of course do this instinctively, a fact many modern news organizations will discover too late, having sold out their integrity, accuracy, and priorities.
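A minimal sketch of what that could look like, assuming a simple exponential moving average over past accuracy; the decay constant, the neutral prior, and the ranking rule are all invented for illustration, not anyone's production algorithm:

```python
# Sketch of reputation-weighted ranking: reputation is an exponential
# moving average of a source's past accuracy, and a new story's rank is
# its engagement scaled by that earned reputation. All numbers (decay,
# priors) are illustrative assumptions.

def updated_reputation(old_rep: float, was_accurate: bool,
                       decay: float = 0.9) -> float:
    """Blend the newest accuracy observation into the running score."""
    return decay * old_rep + (1.0 - decay) * (1.0 if was_accurate else 0.0)

def story_rank(engagement: float, source_rep: float) -> float:
    """Attention a story earns scales with its source's earned
    reputation, not just with instantaneous engagement metrics."""
    return engagement * source_rep

rep = 0.5  # neutral prior for an unknown source
for accurate in (True, True, False, True):
    rep = updated_reputation(rep, accurate)

# A viral story (high engagement) from a low-reputation source can still
# rank below a modest story from a trusted one.
print(story_rank(1000.0, 0.2) < story_rank(400.0, 0.9))  # prints True
```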
Remember what von Neumann said about arithmetic attempts to generate random numbers? Similarly, anyone who attempts to algorithmically determine "truth" based on instantaneous metrics is living in a state of sin.
When it comes to sourcing information from discussion boards, each top thread is effectively the one and only time the algorithm will interact with that thread for that topic.
Unless each thread is monitored and updated by a small cadre of power users (in which case weighting by their reputation matters), the people managing a thread are likely volunteers who happened to be at the scene of breaking news, or were co-opted into that role by circumstances. Such self-selected thread managers can safely be treated as one-time participants in a prisoner's dilemma game.
If this is true, the algorithm can safely assume that the other prisoner (the "thread") is going to defect, and so should defect first by low-weighting it.
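The one-shot logic can be made concrete with the standard payoff table. The numbers are the usual textbook values, not anything specific to news feeds; mapping to this setting, "cooperate" means trusting the thread and "defect" means low-weighting it:

```python
# One-shot prisoner's dilemma: payoffs to the row player for
# (my_move, their_move), using standard textbook values.
PAYOFF = {
    ("cooperate", "cooperate"): 3,  # mutual trust
    ("cooperate", "defect"):    0,  # trusted a bad thread
    ("defect",    "cooperate"): 5,  # low-weighted a thread that was fine
    ("defect",    "defect"):    1,  # mutual suspicion
}

def best_response(their_move: str) -> str:
    """Pick the move maximizing my payoff against a fixed opposing move."""
    return max(("cooperate", "defect"), key=lambda m: PAYOFF[(m, their_move)])

# With no repeated interaction, defection strictly dominates:
print(best_response("cooperate"), best_response("defect"))  # defect defect
```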
Of course, you use some sort of reputation for the source of the threads - for breaking tech news, a GitHub issue tracker is probably going to be a better source than an HN thread, which in turn beats a general-purpose Reddit thread, which beats a 4chan thread.
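As a toy encoding of those venue-level priors (only the ordering follows what I said above; the numbers are invented, and a real system would learn them from track records rather than hard-code them):

```python
# Invented prior reputations encoding the venue ordering above.
SOURCE_PRIOR = {
    "github_issues": 0.9,
    "hacker_news":   0.7,
    "reddit":        0.5,
    "4chan":         0.2,
}

def thread_weight(venue: str, engagement: float) -> float:
    """Discount a thread's engagement by its venue's prior reputation;
    unknown venues get a harsh default of 0.1."""
    return SOURCE_PRIOR.get(venue, 0.1) * engagement
```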
> Of course, you use some sort of reputation for the source of the threads
Yes, this is what I mean. A news source should have a history of correctness before being trusted to report future news. The issues you're discussing are exactly the reason we need reputation -- otherwise some false news can arrive once, get a ton of attention, and do a lot of damage before going away forever (and we see this on e.g. reddit).
Reputation can be gamed and manipulated in much the same way that any other information can be. Case in point: which information sources get hit with high-profile stories like this one, accusing them of failing us and spreading dangerous rumours, is almost entirely a function of the current preferred media narrative. I've seen plenty of reputable journalists start and spread viral bullshit on social media that becomes widely believed, but that issue doesn't fit the narrative. Making a big deal about a Google search that would only ever be made by people who'd already come across the same false claim somewhere else apparently does.
> Reputation can be gamed and manipulated in much the same way that any other information can be.
Yes, it's not foolproof, but it's still a vital piece of the puzzle. With a good reputation system (and I'm not saying such a thing is easy, or perhaps even doable) "reputable" journalists won't stay reputable for long if they behave as you mention.
For example, here's a very crude reputation system: PolitiFact ratings (to pick an example). Imagine if Google simply cut from "News" results any site with a certain number of bad ratings from PolitiFact in the past year. This is gameable and not sustainable, but it would be a decent start.
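A toy version of that filter; the site names, failure counts, and cutoff are all made up for illustration, and nothing here reflects real PolitiFact data:

```python
# Hypothetical fact-check failure counts over the past year; sites,
# counts, and the cutoff are invented for illustration only.
bad_ratings_last_year = {
    "example-tabloid.com": 12,
    "example-paper.com":    1,
    "example-blog.net":     5,
}
CUTOFF = 3  # sites with this many bad ratings or more get dropped

def eligible_for_news(ratings: dict) -> list:
    """Keep only sites below the bad-rating cutoff."""
    return sorted(site for site, bad in ratings.items() if bad < CUTOFF)

print(eligible_for_news(bad_ratings_last_year))  # ['example-paper.com']
```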
Reputation is key.