
First off, sorry about the late reply.

> The hardness of the n-body problem isn't necessarily an expression of fundamental randomness rather than technical uncertainty. But one way or another, aren't they both an expression of the same thing?

No, they are not. Randomness is randomness, and uncertainty is uncertainty. It's entirely possible to have bounded uncertainty about the amount of randomness in a contrived system.

> The more accurately I model these systems, the more their outcome (or rather the outcome of the abstract macro-system of which I become aware) becomes dependent on the few things I don't know

No. The accuracy of your model of a system has no effect on what influences the system.

The uncertainty in the outcome of your model depends on the uncertainty in the inputs, but that's axiomatic.
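To make the axiom concrete, here's a trivial sketch with a hypothetical one-step model f(x) = 3x (the names and numbers are mine, purely illustrative):

    # If the input x is known only to within ±e, the output f(x) is known
    # only to within ±3e: output uncertainty is just input uncertainty,
    # scaled by whatever the model does with it.
    def f(x):
        return 3 * x

    x, e = 10.0, 0.5
    print(f(x - e), f(x + e))  # prints 28.5 31.5 -- an interval of width 3 * 2e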

> to the point that simply the act of checking the accuracy of the prediction has an unpredictable effect.

If and only if your checking mechanism is part of the system. Now, every checking mechanism we're likely to deal with is part of the universe, but that doesn't make the universe nondeterministic, merely impossible to isolate.

> Modeling an n-body system in the physical universe exactly means modeling every piece of information in the universe. If you don't do that, the unknowns will multiply into significant divergence at some t, however distant.

Yes, modelling a deterministic system exactly requires modelling everything that influences it. If you model it except for the influence of some parts, your results will be what the system would have done in the absence of those parts.

If you don't model everything, you won't model everything. This is not an argument in favor of nondeterminism.
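As an aside, "unknowns multiplying into significant divergence at some t" is exactly what deterministic chaos looks like, and it says nothing about nondeterminism. A minimal sketch using the logistic map, a standard textbook example (parameters chosen for illustration):

    # The logistic map x' = r * x * (1 - x) is fully deterministic, yet
    # two trajectories differing by 1e-10 at t = 0 diverge completely.
    r = 3.9                  # a parameter value in the chaotic regime
    x1, x2 = 0.5, 0.5 + 1e-10

    for t in range(60):
        x1 = r * x1 * (1 - x1)
        x2 = r * x2 * (1 - x2)

    print(abs(x1 - x2))      # O(1): the tiny unknown has multiplied into
                             # total divergence, with no randomness anywhere

Unpredictability of this kind is a statement about our knowledge of the inputs, not about the system.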

> Quantum theory suggests that even if you did have a computer the size of the universe that didn't affect the universe, it is intrinsically impossible to make an accurate prediction.

That is one interpretation; there are a number of interpretations of quantum theory that are deterministic (de Broglie-Bohm and many-worlds, for example).

> At that point, it seems like splitting hairs to say "Yes, but it's still really deterministic." What "real" are you talking about? Certainly none that I have experience with.

Consider a system with only two values, which we'll call "N" and "time" for the sake of my sanity. We can imagine a purely deterministic system where N = time * 1000. We can also have a nondeterministic system where N = N_prev + (1000 ± 1). If our measurement apparatus can only measure N's value relative to the previous value with an uncertainty of 10, it will be unable to distinguish between the two systems. This does not imply that the systems are the same, and it is not splitting hairs to consider them different. The first is "really" deterministic, and the second is "really" nondeterministic.

If our measurement apparatus improved such that we could measure with an uncertainty of 0.1, we would be able to establish that the first system was consistent with both determinism and nondeterminism, and the second consistent only with nondeterminism. Nondeterminism can never be epistemologically ruled out; this does not mean we should conclude that it exists in reality.
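A sketch of those two toy systems, reporting per-step increments through a noisy apparatus (the helper names are mine, and random.uniform is one reading of "± 1" and of the measurement uncertainty):

    import random

    def deterministic_steps(n):
        # N = time * 1000: every increment is exactly 1000
        return [1000.0] * n

    def nondeterministic_steps(n):
        # N = N_prev + (1000 ± 1): each increment is genuinely random
        return [1000.0 + random.uniform(-1, 1) for _ in range(n)]

    def measure(increments, uncertainty):
        # the apparatus reads each increment with the given error bound
        return [round(dn + random.uniform(-uncertainty, uncertainty), 2)
                for dn in increments]

    # Uncertainty 10: both read as "1000, give or take" -- indistinguishable.
    print(measure(deterministic_steps(5), 10))
    print(measure(nondeterministic_steps(5), 10))

    # Uncertainty 0.1: the first stays within 1000 ± 0.1 (consistent with
    # either kind of system), while the second shows ±1 jumps the
    # apparatus can no longer hide.
    print(measure(deterministic_steps(5), 0.1))
    print(measure(nondeterministic_steps(5), 0.1))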

> So how does determinism become the default?

Because for all the cases where we have enough data and processing power to test, determinism has been shown to be consistent with the data. If that were not the case, it wouldn't be the default.

Besides which, earlier you asked about the 'fundamentally non-deterministic nature of the universe'. My point is merely that no such nature has been demonstrated, nor is there any evidence to suggest it. There could be: if things often seemed to happen without cause, we would have plenty of cases where determinism is inconsistent with the data. The closest we get is quantum measurements, and it's far from demonstrated that those are genuinely a case of non-determinism.


