I watched it last night. It's pretty good for a non-HN crowd that might not know this already. But I personally felt they left out the most important question: what pushed the algorithm to recommend what it's recommending? Why are YouTube's rabbit holes more likely to lead to conspiracy theories and/or inflammatory content? In the beginning, the algorithm was probably pushing this type of content and more educational content equally. Why can't we make science-related/educational content interesting enough for it to become a rabbit hole recommended by the algorithm?
A large portion of my recommended YouTube videos are educational, but I subscribe to a lot of nature docs, history channels, etc., so I don't think the algorithm dislikes this kind of content for me. So I guess the answer to your question is: we can?
I agree with that. I think the question left unanswered is: why can't we make educational/scientific/historical content that drives up engagement?
Probably because reality is complicated, confusing and boring. What drives up engagement are clear, coherent stories and narratives with twists and turns that leave us shocked or awed. If you want educational content to be engaging, you have to find a way to make a good narrative out of truth. That's what this documentary and all other successful documentaries attempt to do.
But sometimes (most of the time, really) the truth just doesn't fit this structure, so to make it engaging you have to deform and twist it a bit. And if reality really is mundane, there's not much you can do to make it engaging, whereas it's extremely easy to make engaging, convincing falsehoods about nearly any subject.