Not a second more
November 4, 2011 3:51 AM   Subscribe

 
Oh man, please don't change it. Calculating astronomical stuff precisely is hard enough already, we don't need a(nother) discontinuity to account for.
posted by DU at 4:21 AM on November 4, 2011 [1 favorite]


Time is an illusion.

Lunchtime doubly so.
posted by loquacious at 4:23 AM on November 4, 2011 [7 favorites]


Just to give a historical perspective: in ye olde times, the motion of celestial bodies was acknowledged to be the most perfect form of motion, and God, who with divine love moved the heav'nly spheres, was thought to be a "pretty OK kind of guy".

But, thanks to all your so-called "science", the Earth is wobbling around like a drunken and lecherous wedding-guest, lurching from bridesmaid to vomit-hole. The result is that time itself is all outta whack, and scientists are now thinking of abolishing it altogether.

In that context, shouldn't we turn back to Universal Jesus Time ("UJT") and stop the cosmological rot, before it's too late?

You people can all watch a cesium atom bounce around like a fucking maniac if you like, but I know it's lunchtime when the LORD says it is. And the LORD is telling me, "take a loooong lunch today, my child". And the LORD is suggesting I take that lunch at Denny's, and maybe have the cheeseburger. And the LORD is saying, "just take some money out of the company pension fund - no one will miss it". And the LORD wants me to transfer that money to a bank account in Switzerland, and then take a flight to Switzerland and preach the gospel there. With the pension fund money. So let's not question the LORD on this one.
posted by the quidnunc kid at 4:28 AM on November 4, 2011 [35 favorites]


I want two times:
  • My machines (and all of the machines in the world that my machines connect to) need to know machine time, something universal and unshifting, maybe just TAI. Machines don't need leap seconds. They all just need to agree on exactly the same time, all the time, forever, and it has nothing to do with the earth or the sun. It's just a counter.
  • I (and the people around me) need to know (from wall clocks, watches, train schedules, telephones, etc.) the local human time, which doesn't have to be exact at all (doesn't have to precisely correspond to machine time). And if you're going to adjust human time to help humans cope with the seasons, let the local human time adjust itself gradually (a few seconds forward or back every day) so you never feel those abrupt fall-back-and-spring-ahead jolts.
posted by pracowity at 4:33 AM on November 4, 2011 [8 favorites]
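pracowity's gradual adjustment can be sketched with simple arithmetic: spread the one-hour seasonal shift over a few months at a few seconds per day. The 180-day window below is an invented example, not a proposal from the thread.

```c
#include <assert.h>

/* Sketch: replace the one-hour DST jump with a slow drift.  Shifting
   3600 seconds over 180 days means local time moves 20 s/day --
   imperceptible from one day to the next. */
double gradual_offset(int day) {              /* day 0..180 of the transition */
    const double total_shift = 3600.0;        /* one hour, in seconds */
    const double period_days = 180.0;         /* invented transition length */
    return total_shift * day / period_days;   /* seconds of accumulated shift */
}
```

At day 90 the clock has drifted half an hour; no one ever feels a jolt.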


I say we abolish the leap second and bet it all on civilization collapsing before the need for a "leap hour" becomes pressing. It won't be a big deal; in the broken AfterWorld, the sky is a mottled, rotten black all day and all night; nothing grows, nothing turns its face to the sun, there is no summer, no winter; no quotation marks or commas; nothing lives but cannibal gangs squabbling over scrapings of moss and limbs of obscene provenance.

Plus, if I'm wrong and 2200 is actually a computerized iUtopia, then we can just declare the leap hour "Do As You Please Time." The spectacle of respectable android businessmen gorging themselves on treacle pie right from the tray, children voting, etc. is sure to spark a revolution against the Pleasure Overlords and restore the soul of humanity.
posted by No-sword at 4:38 AM on November 4, 2011 [16 favorites]


My machines (and all of the machines in the world that my machines connect to) need to know machine time, something universal and unshifting...

seconds-since-1970
posted by DU at 4:39 AM on November 4, 2011 [7 favorites]
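One wrinkle with seconds-since-1970 as the universal machine counter: POSIX time is defined against UTC with every day exactly 86,400 seconds long, so a leap second makes the same timestamp value occur twice. The day/second split it assumes looks like this (a sketch, not a full calendar conversion):

```c
#include <stdint.h>

/* POSIX time_t arithmetic: every day is exactly 86400 s, leap seconds
   notwithstanding, so day and second-of-day fall out of plain division. */
int64_t posix_day(int64_t t)           { return t / 86400; }  /* days since 1970-01-01 */
int64_t posix_second_of_day(int64_t t) { return t % 86400; }
```

For instance, 1320393600 splits into day 15282 and second 28800, i.e. 08:00 UTC on the day this thread was posted.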


I don't get one point... is the complaint that adding one second would be an expensive/inaccurate/dangerous procedure for machines that rely on atomic clocks? If so, let's keep the machines on international atomic clock accuracy and human beings on daylight/nighttime with adjustments.
posted by elpapacito at 4:39 AM on November 4, 2011 [2 favorites]


We could just ask Flava Flav.
posted by The Ultimate Olympian at 4:42 AM on November 4, 2011 [5 favorites]


It's hard to imagine diverging times are going to be easier to cope with than leap seconds. If we can handle time zones, daylight saving, and leap years... well, for it to even be an issue, I feel like there must be more to it than the article covers.

Related: Google hacked their NTP servers to account for the 2008 leap second a bit more gracefully. It's an interesting read. To computers, a second is a terribly long time. I work for a far, far smaller IT organisation than Google, and a second is still long enough for us to do several thousand transactions.
posted by adamt at 5:00 AM on November 4, 2011 [1 favorite]
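The trick adamt mentions is usually called a "leap smear": rather than serving a 61st second, the NTP servers tell a slowly growing lie in the hours before midnight, so clients never see a discontinuity. A linear version is sketched below; Google's actual implementation used a cosine-shaped ramp, and the window length here is illustrative.

```c
#include <assert.h>

/* Linear leap smear: in the final `window` seconds before the leap,
   reported time slips behind true time by a lie that ramps from 0 to
   the full inserted second. */
double smear_lie(double elapsed_in_window, double window) {
    if (elapsed_in_window <= 0)      return 0.0;   /* smear not started */
    if (elapsed_in_window >= window) return 1.0;   /* full second absorbed */
    return elapsed_in_window / window;             /* fraction of a second */
}
```

By midnight the servers have quietly absorbed the whole extra second, and no client ever sees 23:59:60.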


Pracowity, we basically have that already. TAI, a French acronym for International Atomic Time (Temps Atomique International), is an absolute count, one that accumulates based on an average of a bunch of atomic clocks all over the world. UTC (Coordinated Universal Time) is an offset from TAI that accounts for the actual rotation of the Earth, in all its wobbly, sloshy inaccuracy. It's adjusted to always stay within a second of UT1, which is time as defined by the Earth's rotation. Presently, TAI runs 34 seconds ahead of UTC... we've accumulated a half-minute of adjustments since we started tracking time well enough to notice.

I don't think abolishing UTC is going to solve anything. It's just an offset. For applications that don't like leap seconds, just use TAI instead. And do it in GMT, so that it's obviously not wall clock time anywhere but Greenwich.
posted by Malor at 5:01 AM on November 4, 2011 [3 favorites]
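Malor's point in code form: UTC differs from TAI by a whole number of seconds, so conversion is a constant offset between leap-second announcements. The constant below was correct from January 2009 until the next announced leap second; real code would consult a dated table.

```c
/* TAI - UTC = 34 s as of 2011 (since 2009-01-01).  A production
   converter would look the offset up by date rather than hard-code it. */
const long TAI_MINUS_UTC = 34;

long utc_to_tai(long utc_seconds) { return utc_seconds + TAI_MINUS_UTC; }
long tai_to_utc(long tai_seconds) { return tai_seconds - TAI_MINUS_UTC; }
```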


so that it's obviously not wall clock time anywhere but Greenwich

Oh yes, Greenwich. It's so weird when you're on the DLR and the driver has to announce the time so that everyone can set their watches. Of course, if you're not getting off in Greenwich, you might as well just leave it, because you're only going to have to set your watch back again at Deptford Bridge.
posted by le morte de bea arthur at 5:10 AM on November 4, 2011 [1 favorite]


Ugh. So it's governments (i.e. their business interests) voting on this, not individual scientists? I guess it's goodbye leap second, then. I wonder if they'll eliminate the year 2100 problem by just voting away the Gregorian calendar. GRAR
posted by gubo at 5:14 AM on November 4, 2011


Why don't we just move the planet faster?
posted by blue_beetle at 5:15 AM on November 4, 2011 [2 favorites]


Well, if time is money, can't we just fix the global financial crisis by messing with the clocks? There is no overtime or weekends in the Eternal Now....
posted by GenjiandProust at 5:27 AM on November 4, 2011


We need leap seconds. We just don't need to use them everywhere.

Code that manipulates dates and times is often buggy, largely because every method that has ever been devised for keeping track of these things assumes that dates and times are fundamentally the same kind of thing when in fact they are only loosely related.

But dates are not times, and times are not dates. Let's stop pretending they're the same thing.

What should happen is that computers stop using 32-bit Unix time (which addresses leap seconds in a way that requires horrible hacks to work around and will overflow in 2038) or 64-bit Windows NT time (which pretends that leap seconds don't even matter because hell, who cares, but at least won't overflow for another 58,000 years) and start keeping track of two things:

(1) current time: Barycentric Coordinate Time expressed as a signed 128-bit integer, where a time interval of 1 corresponds to a single period of the same caesium radiation that defines the SI second, and time value 0 (epoch) is TCB instant 1977-01-01T00:00:32.184. I have not bothered to work out when this will overflow but would be surprised to find it failing to outlast the Universe; if I'm wrong, make it 256 bits and express it in Planck intervals. I favour BCT over TAI because once we start mining asteroids we'll be changing the Earth's mass and f*ck going through all this cutover pain again.

(2) current date: a signed 64-bit integer where a date interval of 86,400 is one historical day and 0 means midnight at the beginning of the day 1-Jan-1601 Gregorian (for the same reason Cutler picked that epoch for Windows).

The midnights that delimit dates occurring after the invention of UTC would be UTC midnights, complete with leap seconds. The 86,400 "solar seconds" that make up a day should be perfectly adequate for political and/or accounting purposes even though they represent only approximately equal time intervals. "Local time" adjustments such as time zones and daylight saving would apply only to dates.

Software that needs to synchronize events to high precision, or scientific applications that need accurate (as opposed to politically convenient) interval measurements, would use the BCT-derived timestamp. Extensions to NTP and GPS could supply timestamps in that format.

I don't imagine there would be a pressing practical need for precise BCT specification for the date-delimiting midnights before the invention of UTC; at some point these would necessarily wander beyond best-guess into nonsense in any case.
posted by flabdablet at 5:59 AM on November 4, 2011 [11 favorites]
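flabdablet's scheme boils down to two independent counters. A rough sketch of the shape of it (type and function names are invented here, and the 128-bit physics counter is just shown as a two-word struct):

```c
#include <stdint.h>

/* Sketch of the proposal above: one counter for physics, one for civics. */
typedef struct { int64_t hi; uint64_t lo; } tcb_time;  /* caesium periods since the TCB epoch */
typedef int64_t civil_date;   /* 86400 "solar second" units per historical day */

/* Day number and solar second within the day.  Valid for non-negative
   stamps (dates after the 1601 epoch); earlier dates need floor division. */
int64_t civil_day(civil_date d)    { return d / 86400; }
int32_t civil_second(civil_date d) { return (int32_t)(d % 86400); }
```

Time zones and daylight saving would apply only to the civil_date side; the tcb_time side just counts.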


This is the first time I've heard that the year 2100 is a 'problem'. I would think that people could figure out how to use a well-defined system that's been in place for over 400 years. (But then, I thought people would have known that the year 2000 was part of the 20th century, so I should have guessed.) Then again, by 2100 the idiocracy should be firmly established.
posted by MtDewd at 6:00 AM on November 4, 2011 [1 favorite]


2100 is not going to be a problem. 2000 was a problem for two reasons, neither of which obtains in 2100.

1) A lot of existing code used 2-digit years. That code has been beaten out of running systems by now, and with no shortage of storage space anymore, nobody tends to create new code like it.

2) 2000 was an odd duck from a leap year perspective, because it was a third level case that only comes up every 400 years. 2100 does not meet that case. We could have a Year 2400 problem, though. Presumably in 400 years a lot of generations of software will have recreated the ignorance that made 2000 a problem (to the extent that it was).
posted by DU at 6:10 AM on November 4, 2011
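The three-level rule DU describes, for reference: 2000 passes the every-400-years test; 2100 fails at the century level.

```c
/* Gregorian leap years: divisible by 4, except centuries, except
   centuries divisible by 400.  Hence 2000 leaps; 2100, 2200, 2300 don't. */
int is_leap_year(int y) {
    return (y % 4 == 0 && y % 100 != 0) || (y % 400 == 0);
}
```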


Who cares? The world ends in 2012 anyway...
posted by wittgenstein at 6:11 AM on November 4, 2011 [1 favorite]


I hope that the leap second isn't abolished before this WWVB-synchronized clock I have designed and built gets a chance to display one.

On the other hand, it sure does seem simpler not to track leap seconds. I don't think there's any need to deal with things like leap hours either, except in the far, far future (way beyond Y3K, when the first hypothetical leap hour would be introduced). Children will have to learn things like "the sun is highest at about 11 AM on the equinoxes," and that fact will stay true for essentially your whole life. In the far, far future, the length of the day will become radically different, and nobody will learn this silly idea that looking at the clock tells you anything about the position of the sun. Gradually, as the drifting day brings each region's UTC offset toward zero, the world can abolish local UTC offsets, and as an added bonus we can be rid of time zones altogether around year 20k. (Astronomers do care about where the sun and other astronomical objects appear in the Earth's sky at any time, but they have much more advanced systems for accounting for this…)

But by the same argument, we might as well not track the length of the year any better than "1 year = 365 days" for civil activities, and certainly not any better than "4 years = 1461 days" (the simplest "every 4 years" leap year rule); again, on the scale of a human life, it won't change from "I'm dreaming of a white Christmas" to "I'm dreaming of a fall Christmas," but maybe your grandchildren will grow up not expecting ever to see snow at Christmas. In fact, in other cultures the holidays move freely relative to the solar and modern civil year (I'm thinking specifically of Ramadan, which moves around the civil calendar once every 34 years according to Wikipedia), and this doesn't cause any problem I'm aware of. (Farmers do care about things like "last freeze date," but they can simply consult almanacs or online calculators for this purpose; these would be calculated from the same data the astronomers use.)

So let's either select the radically simpler system that is perfectly adequate for day-to-day use: 1 day = 86400 SI seconds, 1 year = 365 days, or let's keep our beautiful but baroque system of unforeseeable leap seconds (1 day = 86400±1 seconds) and a leap year cycle that makes 400 years = 146097 days.

As for year 2100 problems, I think there will be some minor ones. We'll have a whole crop of software written starting in about 2010-2020 (whenever new programmers think of y2k as something that happened to their parents, and 2100 as something impossibly far away) that simply isn't tested for this sort of thing. Raise your hand if you write software and have written any test for correct behavior for dates past 2099. (I know that my earlier-mentioned WWVB receiver gets post-2099 wrong because even in 2011 the WWVB time signal does not transmit the century; as a result, my software assumes that dates are always in the range 2000..2099 and I don't see what I could do differently)
posted by jepler at 6:55 AM on November 4, 2011
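jepler's 146,097 figure checks out: a 400-year Gregorian cycle contains 97 leap years (100 years divisible by 4, minus the 4 centuries, plus the one century divisible by 400).

```c
/* One full Gregorian cycle: 400 years containing 97 leap days. */
int gregorian_cycle_days(void) {
    int leap_years = 400 / 4 - 400 / 100 + 400 / 400;   /* 100 - 4 + 1 = 97 */
    return 400 * 365 + leap_years;                      /* 146097 */
}
```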


By the way, here's a website I've read while trying to keep up to date about the future of leap seconds: http://www.ucolick.org/~sla/leapsecs/
posted by jepler at 6:58 AM on November 4, 2011


There was a great article about this issue in Harper's a few years back: Clash of the Time Lords. (For non-subscribers, it seems to be up on scribd here.)
posted by whir at 7:33 AM on November 4, 2011


I read somewhere that earthquakes cause the earth's rotation to speed up and slow down.

So why not explode a few atomic bombs now and then as needed to induce earthquakes to bring the earth's rotation back in line with our cesium clocks?
posted by ZenMasterThis at 8:07 AM on November 4, 2011 [1 favorite]


my software assumes that dates are always in the range 2000..2099 and I don't see what I could do differently

Keep current_year in EEPROM or clock-chip CMOS RAM; a 16-bit int is plenty. Arrange to set it to 2011 before the clock is first switched on and/or give the user a way to adjust it.

Each time you get a two-digit year clock_yy from your current time source (time server or backup internal RTC), process it as follows:

const int MAX_LEAP_FORWARD  = 98;
const int MAX_LEAP_BACKWARD = MAX_LEAP_FORWARD - 99;   /* = -1 */

int8_t year_diff = clock_yy - (current_year % 100);
if (year_diff > MAX_LEAP_FORWARD)  year_diff -= 100;
if (year_diff < MAX_LEAP_BACKWARD) year_diff += 100;
current_year += year_diff;

MAX_LEAP_FORWARD is the maximum number of years you can leave your clock powered down and still have it work correctly on startup; MAX_LEAP_BACKWARD is the worst allowable disagreement between the years from your various clock sources. It's unlikely that these would ever legitimately disagree by more than 1, but if they could, reduce MAX_LEAP_FORWARD some.
posted by flabdablet at 8:13 AM on November 4, 2011
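Packaging the update rule above as a function makes the wrap-around easy to check (a direct translation of flabdablet's pseudocode, not actual firmware from the thread):

```c
/* flabdablet's century-tracking rule: interpret a two-digit year as the
   candidate nearest current_year, allowing jumps of up to 98 years
   forward and 1 year backward. */
enum { MAX_LEAP_FORWARD = 98, MAX_LEAP_BACKWARD = MAX_LEAP_FORWARD - 99 };

int update_year(int current_year, int clock_yy) {
    int year_diff = clock_yy - current_year % 100;
    if (year_diff > MAX_LEAP_FORWARD)  year_diff -= 100;
    if (year_diff < MAX_LEAP_BACKWARD) year_diff += 100;
    return current_year + year_diff;
}
```

A clock built in 1999 and first powered on in 2001 now lands correctly: update_year(1999, 1) gives 2001.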


nothing grows, nothing turns its face to the sun, there is no summer, no winter; no quotation marks or commas

::snicker::
posted by FatherDagon at 8:27 AM on November 4, 2011


Also, thinking further about it, applying Dave Cutler's reasoning to picking an epoch date would make 1-Jan-2001 a better choice. As well as being the first day of the current millennium which is cute, it's the first day of the current Gregorian 400-year cycle (as was 1-Jan-1601 when Windows NT was being designed) which is convenient.
posted by flabdablet at 8:28 AM on November 4, 2011


flabdablet, with that scheme, an incorrectly-received 2-digit-year risks pushing the clock a century into the future, forever. For instance, I think an error in the "50s years" bit (WWVB is in BCD) will do this, since it would e.g., take the year from 2011 -> 2061 and then when the correct 50s bit is received it would go from 2061 -> 2111. Though if MAX_LEAP_FORWARD ≤ 49 then I don't think this can ever happen due to a single-bit error…

In my software, which is strictly hobbyist-level, the "epoch year" is configurable at build time; I think it's much more reasonable to assume a new device or a firmware upgrade at least once every 100 years. Heck, I'd be tempted to bet that there will not be a WWVB signal compatible with current receivers and decoders in 2100 anyway.

(yes, I know that these are just the kinds of crappy rationalizations that lead people to these 2-digit-year solutions I mention above and I do try to appreciate the irony of that…)
posted by jepler at 8:43 AM on November 4, 2011


OK, so add some logic such that if year_diff comes out > 1 or < -1 you require ten successive clock_yy readings to have matched each other before you allow the current_year += year_diff line to run.
posted by flabdablet at 8:58 AM on November 4, 2011 [1 favorite]
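One way to sketch that confirmation rule: only accept a century-sized jump once N identical consecutive readings have arrived (N = 10 per the suggestion above; the helper below is invented for illustration).

```c
/* Returns 1 once `needed` identical consecutive readings appear in the
   stream; any differing reading resets the run.  -1 is a sentinel that
   no valid two-digit year (0..99) can match. */
int confirmed_reading(const int *readings, int n, int needed) {
    int pending = -1, count = 0;
    for (int i = 0; i < n; i++) {
        if (readings[i] == pending) count++;
        else { pending = readings[i]; count = 1; }
        if (count >= needed) return 1;
    }
    return 0;
}
```

A single glitched reading in the middle of the run starts the count over, so a transient bit error can never ratchet the century.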


And of course you'd be rejecting any clock_yy that was not valid BCD to begin with.
posted by flabdablet at 9:05 AM on November 4, 2011
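Validity-checking a packed-BCD byte is just a pair of nibble tests (helper names invented here):

```c
/* Packed BCD: low nibble holds the 1s digit, high nibble the 10s digit.
   Either nibble above 9 means the byte is not valid BCD. */
int is_valid_bcd(unsigned char b) {
    return (b & 0x0F) <= 9 && (b >> 4) <= 9;
}
int bcd_to_binary(unsigned char b) {
    return (b >> 4) * 10 + (b & 0x0F);
}
```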


What does Pope Gregory say?
posted by fallingbadgers at 9:12 AM on November 4, 2011 [1 favorite]


Also, packed BCD doesn't have a 50s bit; it has 1s, 2s, 4s, 8s, 10s, 20s, 40s and 80s bits.
posted by flabdablet at 9:13 AM on November 4, 2011


As a few people said above, dealing with leap seconds is small potatoes compared to the twice-yearly daylight saving/standard time switch. Let's all just go on permanent daylight saving time. Russia, Belarus, and Ukraine all decided to do just that this year. I think it makes a lot of sense.
posted by zsazsa at 9:34 AM on November 4, 2011


As one exhibiting delayed sleep phase syndrome, I really dislike the idea of making my permanent jetlag a permanent hour worse.
posted by flabdablet at 9:49 AM on November 4, 2011


if MAX_LEAP_FORWARD ≤ 49 then I don't think this can ever happen due to a single-bit error

With MAX_LEAP_FORWARD set to exactly 49, I have not yet found a case where temporarily erroneous (though valid BCD) clock_yy readings can cause a permanently incorrect century. Setting it to less than 49 would probably allow the century to ratchet backwards given suitable erroneous clock_yy data.

Time-related logic is a place bugs love to hide, that is for sure.
posted by flabdablet at 10:00 AM on November 4, 2011


By the way, my first attempt looked like this:

current_yy = current_year % 100;
current_century = current_year - current_yy;
if (current_yy == 99 && clock_yy == 0) current_year += 100;
else if (current_yy == 0 && clock_yy == 99) current_year -= 100;
else current_year = current_century + clock_yy;

This looks simple and correct, but it fails for clocks built in 1999 and not switched on until 2001.
posted by flabdablet at 10:17 AM on November 4, 2011


Whoops - those += and -= amounts should be 1, not 100.
posted by flabdablet at 10:18 AM on November 4, 2011


thanks for thinking that all through for me, flabdablet. "50s bit" indeed :-/
posted by jepler at 1:09 PM on November 4, 2011


I always thought the leap second was too audacious and showy. It was bound to stir up controversy and conspiracy theories.

So yes, let's abolish and replace it with the siesta second. You've seen how a spinning skater speeds up by pulling in her arms. With 7 billion people on the planet, we can speed up the rotation by having everyone who isn't already sleeping lie down and take a siesta - or at least adjust the lazyboy to a reclining position. This will be MANDATORY worldwide to keep the astronomers happy.

Everyone on the planet will get more rest. Fewer wars, less pollution, fewer antidepressants. There just has to be one person assigned to stay awake and watch the two clocks at the start of the (daily, weekly, hourly, whatever's required) siesta second. Every now and then he lifts the donkey balls, and when the clocks are in sync, the siesta's over.
posted by Twang at 10:26 PM on November 4, 2011


Thank fuck I've been keeping on Swatch time. This whole 'leap-second' fiasco finally makes up for never having made a meeting in thirteen years...
posted by pompomtom at 4:23 AM on November 5, 2011 [1 favorite]


> I want two times

But how do you set the "human" clocks? If you're asking cellphones and computers to keep track of NTP and cell-network and GPS time internally, but to show you "human" time that's still offset to match the sun's transit, you're asking for a bunch of bugs. Look at all the bugs revealed at the last daylight saving change, or the ones that pop up when you set an appointment for a different time zone and fly there.
posted by morganw at 6:56 AM on November 18, 2011




This thread has been archived and is closed to new comments