Digital Equipment answers a user's complaint that the year 2000 should not be a leap year. (york.ac.uk)
73 points by bdfh42 on Oct 29, 2008 | 19 comments


FTA: "Although one can never be sure of what will happen at some future time, there is strong historical precedent for presuming that the present Gregorian calendar will still be in effect by the year 2000. Since we also hope that VMS will still be around by then, we have chosen to adhere to these precedents."

The writer proved remarkably prescient but, alas, Digital Equipment Corporation was not there to greet the new millennium.


"alas, Digital Equipment Corporation was not there to greet the new millennium."

Or even to (February) 2000.


Poor DEC! VMS made the trek, though.


I remember the day the Compaq takeover was announced.

Oh boy. I knew that day that the Alpha was doomed. Pity... Such a promising architecture.

Well... Back to my x86... I have work to do.


By the end, and even in the last few years, the Alpha was no longer as fast as competing processors for the vast majority of workloads. The x86 architecture had become RISC behind the scenes, and Intel and AMD had figured out ways to accelerate CISC to the point where it may have even been a net win for performance, or at least not a significant loss: fewer instructions had to traverse the slow RAM-to-CPU bus, and could then be "decompressed" into fast-executing but more verbose operations entirely within the CPU pipeline, where the data paths were dramatically faster. Raw clock speed had only just caught up to the Alpha, but performance was significantly better; by 2003, when the last Alpha ever shipped clocked at 1.3GHz, the Athlon 64 was released with clocks up to 2GHz. Others, like Sun and IBM, had made more progress on the parallelization and shared-memory fronts as well, making the Alpha less useful for scientific and other large-scale computing work.

Finally, the Alpha technology was sold off to Intel. The advantages of the architecture (and some of the engineers involved) have been assimilated into the CPU borg. It may be ugly, due to such a long and sordid history, but the x86 architecture is now wicked fast. The Alpha also had elephantine power and cooling requirements. My last company once attended an event where we set up our boxes side by side with a competitor's Alpha-based systems, and they had discreetly placed a box fan behind their biggest unit, because, without it, the ambient temperature of the room was too high for the Alpha box to operate reliably. Their box was much faster than ours, though. I asked if they included a box fan, or if that cost extra.

But don't let me stand in the way of nostalgia. I, too, remember staring in awe at a DEC that was running at 400MHz, when the Intel architectures were still stumbling along at 166, or something along those lines (and with a dramatically slower bus, and only dual CPU capability in the Pentium Pro).


Are you sure? I don't have the numbers handy, but I dimly remember that, in around 1998-1999, the Alpha 21264 ran circles around Intel and AMD CPUs of the same era, in integer and especially floating-point performance. I'm not talking about clock speed, but SPEC benchmark results.


Yes, I'm sure. The only date I specifically mentioned was 2003, which is when the last Alpha was released and the 64 bit x86 architecture became available. In 1998-1999 the Alpha was still probably in the lead on all counts except possibly price/performance. Things move fast in CPUs, and a significant lead one year can turn into a trailing position the very next.


Well... By 2003 the Alpha had been more or less abandoned in terms of R&D expenditure. It's impossible to estimate how much of its perceived lack of relative performance was due to progress in x86 designs and how much was simply due to a lack of investment in advancing the architecture.


That sort of response, well before Wikipedia, is quite impressive!


Years divisible by 100 are normally not leap years, UNLESS they are divisible by 400.

http://en.wikipedia.org/wiki/Leap_year#Algorithm

I knew about the 100 rule, but didn't realize the 400 rule until I looked into it. The response is comical, but you can understand where the user got confused.
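
For reference, a minimal sketch of that rule in Python (the function name is just illustrative, not anything from DEC's reply):

    def is_leap_year(year: int) -> bool:
        # Gregorian rule: divisible by 4, except century years,
        # unless the year is also divisible by 400.
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    assert is_leap_year(2000)       # divisible by 400, so DEC was right
    assert not is_leap_year(1900)   # divisible by 100 but not by 400
    assert is_leap_year(2008)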


Well, that was certainly entertaining to read... I can only imagine the user on the other end of this being like... =O touché... I also learned about how the calendar came to be because of it! So, another plus.

Does anyone else think it's interesting that throughout history there have been numerous... "this isn't working for me, let's just skip some days and it'll be okay!" moments? Heck, this may not even really be the year 2008.


They knew that a lunation (the time from one full moon to the next) was 29 1/2 days long, so their lunar year had a duration of 364 days. This fell short of the solar year by about 11 days.

Isn't something wrong here?


Yeah. Twelve lunations of 29.5 days come to 354 days, not 364, which is why it falls about 11 days short of the solar year; the modern Islamic calendar runs about 354 days for the same reason.


"By imperial decree, one year was made 445 days long to bring the calendar back in step with the seasons."

Er, is that right?



Another interesting tidbit for the comments:

The year 1582 didn't have Oct. 5th through Oct. 14th (inclusive). They just skipped from the fourth of October to the fifteenth!
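
If you want to convince yourself the skip is purely notational, the standard civil-date-to-Julian-Day-Number conversion formulas show the two calendars meeting edge to edge (a quick sketch; the function names are mine, not from the article):

    def jdn_julian(y, m, d):
        # Julian-calendar date -> Julian Day Number (integer arithmetic)
        a = (14 - m) // 12
        yy = y + 4800 - a
        mm = m + 12 * a - 3
        return d + (153 * mm + 2) // 5 + 365 * yy + yy // 4 - 32083

    def jdn_gregorian(y, m, d):
        # Gregorian-calendar date -> Julian Day Number
        a = (14 - m) // 12
        yy = y + 4800 - a
        mm = m + 12 * a - 3
        return (d + (153 * mm + 2) // 5 + 365 * yy
                + yy // 4 - yy // 100 + yy // 400 - 32045)

    # Oct 4, 1582 (Julian) and Oct 15, 1582 (Gregorian) are consecutive days.
    assert jdn_gregorian(1582, 10, 15) - jdn_julian(1582, 10, 4) == 1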


Which resulted in widespread violence when landlords tried to make their tenants pay a full month's rent.


What I'd like to read is the customer's reply to the reply :)


If that were me, my reply would be to go out and buy another DEC :)



