This doesn't mention the most economically sound and complete solution to traffic: dynamic congestion pricing on roads.
Due to the effects described in the article, entering a road that's close to congested imposes a negative externality: you delay everyone behind you, and the cost is even higher if you push the road below its optimal throughput. Fold that externality into the price, and suddenly drivers will change their behavior in the desired fashion:
1. People will move their travel to less expensive times. Even if the only change is people waiting for prices to fall, the roads operate at much higher throughput because they never enter the region of diminishing throughput.
2. People will carpool/vanpool/take mass transit - no need for any special treatment for transit: a bus with 50+ people can simply outbid most cars on the road for space, even accounting for the extra road space the bus takes. With the economic incentive in place, you'd even expect private buses etc. to pop up spontaneously. Right now, it's rarely worth it to pool or take the bus - it adds extra time for you, while the benefit to the road is one you never see. With proper pricing, taking a car is still faster but a lot more expensive - and the carpool/bus is still probably faster than driving on congested roads would have been.
3. Similarly, the high prices will incentivize alternatives such as biking, subways, etc., and will generate very good information on exactly which routes are in high demand at which times, along with estimates of how much an improvement would be worth.
Naw, the most economic solution is to make bigger bumpers and let cars push each other forward.
Think about a hose. If you have it at a certain flow and then increase the flow, the water doesn't go out faster because it wants to. It flows faster because it's being pushed.
Same thing with cars: as more cars get onto the highway, you want them to go at a higher speed so that the throughput matches the on ramp. We just need to cut down the number of 4 lane highways so that we have space to put exit ramps on both sides of a 2 lane highway, but the increased speed will make up for it.
Yes, we should just make couplings so that a long string of cars can be attached together. The trailing cars could all follow the lead car. If you add some guide rails to the road you don't even need a fancy autonomous steering system. Swap the rubber tires for steel wheels on the guide rails and you reduce friction losses and eliminate flat tires. A centralized dispatch and signaling system could keep the system free flowing and enable high capacities.
At high demand times, you have to be very rich indeed to outbid a full bus without even thinking about it. There aren't enough people who can do that.
But say this does happen a lot - this means rich people pay enormous road use fees, which can then be used for road maintenance, construction, and improvement, as well as other transit infrastructure!
So, the rich willingly subsidize infrastructure for everyone? Seems like a win-win!
That's a nice pipe dream, but what would happen in reality is that all of the congestion fees would go to the rich (perhaps in the form of tax cuts), who would use it to buy more stock, bribe some politicians to ban buses, and then triple the congestion charge because fuck you.
The congestion fees would go to the government responsible for the roads. Of course, they could be captured by the rich, but most governments spend most of their money not on the rich.
You'd set the congestion charge, by law (at least on public roads), to the minimum required for efficient road use - not the revenue-maximizing price, which would likely be much higher due to monopoly.
Why do people insist on this tired, unimaginative trope? We have the past and present to look at. We know how these things work.
The rules will be crafted, the commas in the laws placed, the contracts handed out, to support those who supported the endeavor. If the plumbers' trade group agrees to support it, their vans will be exempt. If Palantir supports it, the RFP will be written to make it nigh on impossible not to buy their stuff. No matter how flagrant the badness of the system, if the tech industry makes even a cent, the comment section full of techies will engage in Olympic-level mental gymnastics - not just bending over backwards but doing full-on backflips - to justify the goodness of the system. If the bus drivers have such a comment section, they'll do it too.
This is how things were. This is how they are. This is how they will be. Well, right up until the point where the rest of society gets sick of our shit and leaves us in a big communal hole or gives us a free shower, or whatever the fashionable way to do that thing happens to be at that point in the future...
But I suppose maybe you're right and they'll throw a few pennies of tax cuts at it if they just need a little upper middle class support to drag it across the finish line.
Depends how you define "equal". One approach is simply to scale the charge by income -- effectively, convert time to money, and charge you a congestion tax of some amount of money-earning time. "6 minutes" is 1/10000 of your annual income -- $2 for someone earning $20,000 per year, $20 for someone earning $200,000 per year.
But does a vehicle with several people in it pay for the max, min, median, average, or the driver's time? I suspect "driver" is easiest; it seems like it might work, but I'll bet there are some screwy ways to game that rule, too.
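As a toy illustration of that scaling rule (the 1/10000-per-6-minutes rate and the "charge the driver's income" choice are just the assumptions from the comments above, not any real scheme):

    def congestion_charge(annual_income: float, delay_minutes: float) -> float:
        # Each 6 minutes of delay imposed costs 1/10000 of the payer's
        # annual income (the rate assumed above, not a real tariff).
        return annual_income * (delay_minutes / 6) / 10_000

    # The two drivers from the example, each imposing 6 minutes of delay:
    print(congestion_charge(20_000, 6))    # 2.0
    print(congestion_charge(200_000, 6))   # 20.0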
And also in essentially any relevant private market for goods and services where capacity is limited, especially when there are more and less desirable times.
How do you propose people find out the cost of traveling if the pricing is dynamic? People won’t check beforehand, and they’ll already be in their cars when they find out the cost.
Navigation/maps providers like Google/Apple Maps, etc., will incorporate price estimates as well as time estimates - they can even show multiple options if there are price-time tradeoffs available.
There's a formal equivalence between Markov chains and literally any system. The entire world can be viewed as a Markov chain. This doesn't tell you anything of interest, just that if you expand the state without bound you eventually get the Markov property (e.g., a process that depends on its last two steps is Markov once you take the pair of the last two steps as the state).
Why can't an LLM do backtracking? Not only within its multiple layers, but across tokens, as reasoning models already do.
You are a probabilistic generative model (if you object, consider that all of quantum mechanics is). I guess that means you can't do any reasoning!
In the theoretical section, they extrapolate assuming a polynomial from 40 to thousands of dimensions. Why do they trust a polynomial fit to extrapolate two orders of magnitude? Why do we even think it's polynomial instead of exponential in the first place? Most things like this increase exponentially with dimension.
In fact, I think we can do it in d=2k dimensions, if we're willing to have arbitrarily precise query vectors.
Embed our points as (sin(theta), cos(theta), sin(2 theta), cos(2 theta), ..., sin(k theta), cos(k theta)), with theta uniformly spaced around the circle, and we should be able to select any k of them.
Using a few more dimensions we can then ease the precision requirements on the query.
In practice you're actually hitting further problems because you don't have those synthetic top-k tasks but rather open-domain documents and queries to support.
And if you hope to get better than "just" having the top-k correct and instead get some sort of inclusion/exclusion boundary between what should be matched and what should not be matched, you'll hit the same bounds as apply to context length limitations for kq dimensionality in a transformer's attention layers, as I mentioned about 6 weeks ago: https://news.ycombinator.com/item?id=44570650
I'm not following your construction. In the k=2 case, how do you construct your 4-dimensional query vector so that the dot product is maximized for two different angles theta and phi, but lower for any other arbitrary angle?
Viewing our space as two complex points instead of four real:
Let H(z) = (z - exp(i theta)) (z - exp(i phi))
H(z) is zero only for our two points. We'll now take the norm^2, to get something that's minimized on only our two chosen points.
|H(exp(i x))|^2 = H(exp(i x)) conj(H(exp(i x))) = sum from j = -2 to 2 of c_j exp(i j x)
For some c_j (just multiply it out). Note that this is just a Fourier series with highest harmonic 2 (or k in general), so in fact our c_j tell us the four coefficients to use (the constant term c_0 shifts every score equally, so it can be dropped; negate to turn the minimum into a maximum, as with the moment curve below).
For a simpler, but numerically nastier construction, instead use (t, t^2, t^3, ...), the real moment curve. Then given k points, take the degree k poly with those zeros. Square it, negate, and we get a degree 2k polynomial that's maximized only on our selected k points. The coefficients of this poly are the coefficients of our query vector, as once expanded onto the moment curve we can evaluate polynomials by dot product with the coefficients.
The complex version is exactly the same idea but with z^k and z ranging on the unit circle instead.
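A quick numerical check of this construction (a NumPy sketch; n = 360 points, k = 2, and the chosen indices are arbitrary assumptions for the demo):

    import numpy as np

    n, k = 360, 2                        # n points on a circle, select k of them
    theta = 2 * np.pi * np.arange(n) / n

    # Embed each point as (sin t, cos t, sin 2t, cos 2t, ..., sin kt, cos kt).
    embed = np.column_stack([f(j * theta) for j in range(1, k + 1)
                             for f in (np.sin, np.cos)])

    targets = [17, 254]                  # indices of the k points we want on top
    # H(z) = prod(z - exp(i theta_t)); h[a] is the coefficient of z^a.
    h = np.polynomial.polynomial.polyfromroots(np.exp(1j * theta[targets]))

    # Fourier coefficients of |H(e^{ix})|^2: c_j = sum_a h[a+j] * conj(h[a]),
    # for j = -k..k, stored so that c[k + j] is c_j.
    c = np.correlate(h, h, mode='full')

    # Query such that query . embed(x) = c_0 - |H(e^{ix})|^2,
    # which is maximized exactly at the chosen roots.
    query = np.ravel([(2 * c[k + j].imag, -2 * c[k + j].real)
                      for j in range(1, k + 1)])

    scores = embed @ query
    print(sorted(np.argsort(scores)[-k:].tolist()))   # -> [17, 254]

The argsort check confirms the two chosen indices get strictly the highest scores; as the earlier comment notes, precision of the query is the thing that degrades as k grows.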
If you want heat, why bother converting sunlight to electricity first? You lose 80%. Is it that much more expensive to use mirrors to concentrate sunlight and capture near-100% of the energy as heat?
Because with a heliostat you have to move the energy as heat, which isn't as efficient, easy, or cheap as moving it as electricity.
Do the numbers. Making the top of the heliostat tower hot isn't what matters. What matters is making the inside of a pile of dirt that hot.
I broadly agree with you, but there is really a point here about land ownership.
Although developments of the land do improve its value, and thus land ownership has significant economic utility by incentivizing this, there isn't really an economic justification for the owner receiving value for the land itself - why should someone have exclusive rights to a piece of land they didn't create? They bought it, sure, but why did the previous owner have perpetual exclusive rights?
I'd advocate for a small property tax as a replacement for other taxes, because the component of it that taxes land value won't cause economic harm, whereas all of an income tax causes deadweight loss. (Note: a Land Value Tax is great in theory but practically impossible to assess - a property tax is good enough, and much harder to game!)
Note that in practice, the biggest abuser of land hoarding is local governments with extremely restrictive zoning that stops productive development of the land - from an economic perspective they partly own the land, having taken (or in reality, seized) some but not all of the rights from the 'landowner'. Although this can have advantages in helping with coordination problems, in practice it has caused enormous economic damage to many cities by preventing development. At its heart, it's a problem of land hoarding.
It's because going from 1 to 2 choices changes the expected worst-case load from asymptotically log n / log log n to asymptotically log log n, and further increases only change the constant factor.
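A quick simulation of that effect (a sketch: n balls into n bins, each ball going to the least loaded of d uniformly random bins):

    import random

    def max_load(n: int, d: int) -> int:
        # Throw n balls into n bins; each ball picks d bins uniformly at
        # random and goes into the least loaded of them. Return the max load.
        bins = [0] * n
        for _ in range(n):
            best = min(random.choices(range(n), k=d), key=bins.__getitem__)
            bins[best] += 1
        return max(bins)

    n = 1_000_000
    for d in (1, 2, 3):
        print(d, max_load(n, d))
    # Typical output: roughly 8 for d=1, 4 for d=2, 3 for d=3 - the big
    # drop is from one choice to two; extra choices only shave the constant.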
My mistake: R6RS has `syntax-rules`, not `syntax-case` as far as I can tell. However, `syntax-rules` and `syntax-case` are equivalent in power. [1]
It does not have the same power as `defmacro`: you cannot define general procedural macros with `syntax-rules`, as you are limited to the pattern-matching language to compute over and construct syntax objects.
I think you got your wires crossed. R5RS and R7RS only have `syntax-rules` macros. R6RS has both (`syntax-rules` can be trivially defined as a `syntax-case` macro).
R6 having `syntax-case` macros is one of the more controversial things about it; a surprising number of implementers don't care for them.
Those don't have to be the only two options. You can also criminalize hiding your mental illness as a pilot while incentivizing self-reporting via sufficiently generous disability insurance.
One can generalize this to k missing numbers the same way as we typically do for the addition case by using finite fields:
XOR is equivalent to addition over the finite field F_2^m. So, in this field, we're calculating the sum. If we have two numbers missing, we calculate the sum and sum of squares, so we know:
x + y
x^2 + y^2
From which we can solve for x and y. (Note all the multiplications are Galois Field multiplications, not integer!)
Similarly for k numbers we calculate sums of higher powers and get a higher order polynomial equation that gives our answer. Of course, the same solution works over the integers and I'd imagine modular arithmetic as well (I haven't checked though).
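A minimal sketch of the k=2 case over the integers (a hypothetical helper, just to make the recovery concrete):

    from math import isqrt

    def find_two_missing(full, partial):
        # Recover the two numbers missing from `partial` using the sum and
        # the sum of squares (integer version; over GF(2^m) you need odd
        # powers instead, as the reply below explains).
        s1 = sum(full) - sum(partial)                                # x + y
        s2 = sum(v * v for v in full) - sum(v * v for v in partial)  # x^2 + y^2
        prod = (s1 * s1 - s2) // 2                                   # xy
        disc = isqrt(s1 * s1 - 4 * prod)                             # |x - y|
        return (s1 - disc) // 2, (s1 + disc) // 2

    print(find_two_missing(list(range(1, 11)), [1, 2, 3, 4, 6, 7, 8, 10]))
    # -> (5, 9)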
This will depend on the field, and for F_2^m you want odd powers: sum(x), sum(x^3), sum(x^5) etc. Using sum(x^2) won't help because squaring over F_2^m is a field homomorphism, meaning that sum(x^2) = sum(x)^2.
This is also how BCH error-correction codes work (see https://en.wikipedia.org/wiki/BCH_code): a valid BCH codeword has sum(x^i where bit x is set in the codeword) = 0 for t odd powers i=1,3,5, ... Then if some bits get flipped, you will get a "syndrome" s_i := sum(x^i where bit x was flipped) for those odd powers. Solving from the syndrome to get the indices of the flipped bits is the same problem as here.
The general decoding algorithm is a bit involved, as you can see in the Wikipedia article, but it's not horribly difficult:
• First, extend the syndrome: it gives sum(x^i) for odd i, but you can compute the even powers s_2i = s_i^2.
• The syndrome is a sequence of field values s_i, but we can imagine it as a "syndrome polynomial" S(z) := sum(s_i z^i). This is only a conceptual step, not a computational one.
• We will find a polynomial L(z) which is zero at all errors z=x and nowhere else. This L is called a "locator" polynomial. It turns out (can be checked with some algebra) that L(z) satisfies a "key equation" where certain terms of L(z) * S(z) are zero. The key equation is (almost) linear: solve it with linear algebra (takes cubic time in the number of errors), or solve it faster with the Berlekamp-Massey algorithm (quadratic time instead, maybe subquadratic if you're fancy).
• Find the roots of L(z). There are tricks for this if its degree is low. If the degree is high then you usually just iterate over the field. This takes O(#errors * size of domain) time. It can be sped up by a constant factor using Chien's search algorithm, or by a logarithmic factor using an FFT or AFFT.
You can of course use a different error-correcting code if you prefer (e.g. binary Goppa codes).
Edit: bullets are hard.
Further edit just to note: the "^" in the above text refers to powers over the finite field, not the xor operator.
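To make this concrete, here's a minimal sketch of the two-missing-values case over GF(2^8), recovered from the odd-power syndromes s_1 and s_3 described above, with a brute-force root search standing in for Chien search or Berlekamp (the reduction polynomial 0x11B and the domain 1..255 are just assumptions for the demo):

    def gf_mul(a: int, b: int) -> int:
        # Carry-less multiplication mod x^8 + x^4 + x^3 + x + 1, i.e. GF(2^8).
        r = 0
        while b:
            if b & 1:
                r ^= a
            a <<= 1
            if a & 0x100:
                a ^= 0x11B
            b >>= 1
        return r

    def gf_pow(a: int, e: int) -> int:
        r = 1
        while e:
            if e & 1:
                r = gf_mul(r, a)
            a = gf_mul(a, a)
            e >>= 1
        return r

    def gf_inv(a: int) -> int:
        return gf_pow(a, 254)        # a^(2^8 - 2) = a^(-1)

    def find_two_missing_gf(present, domain=range(1, 256)):
        # Syndromes s_i = xor-sum of v^i over the missing values only:
        # xoring the whole domain against the present values cancels
        # everything except the two missing elements.
        s1 = s3 = 0
        for v in list(domain) + list(present):
            s1 ^= v
            s3 ^= gf_pow(v, 3)
        # x+y = s1 and x^3+y^3 = s1*(s1^2 + xy), so xy = s3/s1 + s1^2.
        xy = gf_mul(s3, gf_inv(s1)) ^ gf_mul(s1, s1)
        # Roots of the locator L(z) = z^2 + s1*z + xy, by brute force.
        return [z for z in domain if gf_mul(z, z) ^ gf_mul(s1, z) ^ xy == 0]

    present = [v for v in range(1, 256) if v not in (42, 200)]
    print(find_two_missing_gf(present))  # -> [42, 200]

The s3/s1 + s1^2 step is just the degree-2 locator worked out by hand; for more missing values you'd solve the key equation as described in the bullets above.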
Yesterday I linked to an implementation (with complexity quadratic in the number of errors) I helped to create in another comment in this thread.
> constant factor using Chien's search algorithm
Chien's search is only really reasonable for small field sizes... which I think doesn't really make sense in this application, where the list is long and the missing elements are relatively few.
Fortunately in characteristic 2 it's quite straightforward and fast to just factor the polynomial using the Berlekamp trace algorithm.
If you imagine a polynomial L(z) that's zero at all the missing numbers, you can expand the coefficients out. For example, with 2 missing numbers (x,y), you have:
L(z) = z^2 - (x+y)z + xy.
You already have x+y, but what's xy? You can compute it as ((x+y)^2 - (x^2 + y^2))/2. This technique generalizes to higher powers: you can generate the coefficients of L from the sums of powers with a recurrence (Newton's identities), as sketched below.
Then you solve for the roots of L, either using your finite field's variant of the quadratic formula, or e.g. just by trying everything in the field.
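For general k over the integers, that recurrence is Newton's identities; here's a sketch (exact arithmetic via Fraction, brute-force root search over the original list; the helper names are mine):

    from fractions import Fraction

    def elementary_from_power_sums(p):
        # Newton's identities: from power sums p[1..k] of the k missing
        # numbers, build the elementary symmetric polynomials e[0..k].
        k = len(p) - 1                   # p[0] is unused
        e = [Fraction(1)] + [Fraction(0)] * k
        for i in range(1, k + 1):
            e[i] = sum((-1) ** (j - 1) * e[i - j] * p[j]
                       for j in range(1, i + 1)) / i
        return e

    def find_missing(full, partial, k):
        # p[i] = sum of x^i over the missing numbers, from the two lists.
        p = [sum(v ** i for v in full) - sum(v ** i for v in partial)
             for i in range(k + 1)]
        e = elementary_from_power_sums(p)
        # L(z) = z^k - e1 z^(k-1) + e2 z^(k-2) - ...; the missing numbers
        # are exactly its roots.
        L = lambda z: sum((-1) ** i * e[i] * z ** (k - i) for i in range(k + 1))
        return [z for z in full if L(z) == 0]

    print(find_missing(list(range(1, 11)), [2, 3, 5, 7, 8, 9, 10], 3))
    # -> [1, 4, 6]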
*But wait, this doesn't actually work!*
Over fields of small characteristic, such as F_2^m, you need to modify the approach and use different powers. For example, in the equations above, I divided by 2. But over F_2^m you cannot divide by 2, since 2 = 0. In fact, you cannot solve for (x,y) at all with only x + y and x^2 + y^2, because squaring is a homomorphism in characteristic 2: (x+y)^2 = x^2 + y^2, so the second sum is fully determined by the first.
So having that second polynomial gives you no new information. You need to use other powers such as cubes (a BCH code), or some other technique (e.g. a Goppa code). My sibling comment to yours describes the BCH case.