The Unix timestamp will begin with 17 this Tuesday
unixtimestamp.com
The hectomegaseconds fly by so fast
Edit: here was the front page of the New York Times at 1600000034,
https://web.archive.org/web/20200913122714/https://www.nytim...
and here's 1500000301 and 1400000634, and 1300007806
https://web.archive.org/web/20170714024501/http://www.nytime...
https://web.archive.org/web/20140513170354/http://www.nytime...
https://web.archive.org/web/20110313091646/http://www.nytime...
3.17 years.
Unix timestamp 1 600 000 000 was not too long ago. That was on 2020-09-13 12:26:40 UTC. Discussed on HN back then here: https://news.ycombinator.com/item?id=24452885
My own blog post here commemorating the event: https://susam.net/maze/unix-timestamp-1600000000.html
Given that 100 000 000 seconds is approximately 3 years 2 months, we are going to see an event like this every few years.
I believe the most spectacular event is going to be the Unix timestamp 2 000 000 000 which is still 9½ years away: 2033-05-18 03:33:20 UTC. Such an event occurs only once every 31 years 8 months approximately!
By the way, here's 1700000000 on Python:
$ python3 -q
>>> from datetime import datetime
>>> datetime.utcfromtimestamp(1_700_000_000)
datetime.datetime(2023, 11, 14, 22, 13, 20)
>>>
GNU date (Linux): $ date -ud @1700000000
Tue Nov 14 22:13:20 UTC 2023
BSD date (macOS, FreeBSD, OpenBSD, etc.): $ date -ur 1700000000
Tue 14 Nov 2023 22:13:20 UTC
> I believe the most spectacular event is going to be the Unix timestamp 2 000 000 000 which is still 9½ years away: 2033-05-18 03:33:20 UTC. Such an event occurs only once every 31 years 8 months approximately!
Egads! 31 years! I spent my late '90s mudding[0] and for some reason we had a lot of save files named by their epoch timestamp. When I ended up responsible for parts of the code base, I spent a lot of time dealing with those files, and they were all in the 800- or 900-million range. At some point I was pretty much able to tell at a glance roughly what date any number in that range corresponded to, within perhaps a few weeks.
Weird environments foster weird super powers.
The only reference point I have is that the millennium was very roughly around 1000000000. And that's only because of an agonising C/Prince pun in NTK [1]:
"Tonight I'm gonna party like it's (time_t) 1E9"
I went to an EFF billion second party 2001-09-08 in golden gate park and then the next week was 9/11.
I don't know whether to thank or revile you for sharing that link. I've never heard of NTK before and I was at once amazed at something that tickled me so and also realizing it's no longer around. Now I feel sad.
It was so good. But then, the web was young, and everything was. You are quite right to feel sad. I know this doesn't help.
That pun is horrendous beyond belief, I love it.
The spectacular event is in Jan 2038, when it reaches the nice round number of 2,147,483,648
Or it doesn't reach...
I hope to arrange an extended tour of Iron Mountain for early 2038.
Unix time will reach it, even if 32 bit signed representations tick over.
I think 2222222222 is more special: Sat Jun 02 2040 03:57:02 GMT+0000
I might even get to experience 3333333333 if I am lucky. What a day, what a day, yes indeed!
I hope you have enough bits!
I would like to argue, the next and final special date is 2^32=4294967296 which is Sun Feb 7 06:28:16 AM UTC 2106.
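Both cut-offs being debated here are easy to check, assuming an environment (like Python's `datetime`) that isn't itself limited to 32-bit timestamps:

```python
from datetime import datetime, timezone

# Where a signed 32-bit time_t overflows, vs. where an unsigned one would:
print(datetime.fromtimestamp(2**31, tz=timezone.utc))  # 2038-01-19 03:14:08+00:00
print(datetime.fromtimestamp(2**32, tz=timezone.utc))  # 2106-02-07 06:28:16+00:00
```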
You're using unsigned timestamps?
I used to follow your blog about 15 years ago, what a blast of nostalgia! I’ll add it to my RSS feed. Keep it up.
Perfect time to fire up https://datetime.store/ and try your luck on the perfect shirt!
That website gonna crash so hard on Tuesday…
“Boss! We’re being dee dossed!”
“No, son, it’s Tuesday”
You know how you can use a "death clock" to figure out when you should statistically expire?
I wonder if you can find a shirt that would print that
By the time you receive the shirt the answer would have changed right ?
This is quite clever. I'm going to get one.
This reminds me of an idea I had - the 1 BTC coffee mug - back during the first rise to $20k.
internet is really wonderful sometimes
I remember staying up late to see the tick over from 999,999,999 to 1 billion, thinking "I'll remember this week my whole life". Little did I realise how 60 hours later the whole world would remember.
Timestamp 1000000000 (Sat 2001-09-08 18:46:40 PDT) triggered a bug in the bug reporting system (Remedy) we were using at the time.
The system stored timestamps as a string representing seconds since the epoch, but it assumed it would fit in 9 digits. At 1000000000, it started dropping the last digit, so it went back to Sat 1973-03-03 01:46:40 PST, advancing at 10% of real time. It was fixed fairly quickly.
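The failure mode described above can be sketched in a few lines of Python (`remedy_read` is a hypothetical stand-in for the buggy string storage, not actual Remedy code):

```python
from datetime import datetime, timezone

def remedy_read(ts: int) -> int:
    # Store the timestamp as a decimal string, but keep only the first
    # 9 digits -- the assumption that broke at 1000000000.
    return int(str(ts)[:9])

# The billionth second reads back as 100 million ...
print(datetime.fromtimestamp(remedy_read(1_000_000_000), tz=timezone.utc))
# ... i.e. 1973-03-03 09:46:40 UTC, and 10 real seconds now advance the
# stored clock by just 1, so time runs at 10% speed, as described.
print(remedy_read(1_000_000_010) - remedy_read(1_000_000_000))  # 1
```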
Wow, that’s a really stupid bug, especially given the general level of awareness of the Y2K problem at the time the system probably was being built. Whoever the hell looked at a 9 digit timestamp that started with a 9 and said “nah, we’re never gonna need more than 9 digits to store that” should have their programming license revoked.
I didn't remember it was so close but those were the days I obsessively read Slashdot, which helped during 9/11, and which certainly covered the epoch event.
please explain
> 2001-09-09T01:46:40.000+00:00
2 days later...
999999999 was on Sept 9, 2001
9/9, a remarkable coincidence
If only it were year 1001 (binary 9)
What's even more remarkable is that 999999999, 9/9 has 11 9s.
2001-09-09 01:46:40 UTC
September 11th
I went to a 1234567890 "gathering" in a hotel lobby in Boston in 2009
I was doing the late shift on a trading floor at a big bank.
The head of the derivatives tech support team pointed out it was about to hit so we opened up a shell and did a "watch" command + outputting the "date" command in epoch seconds and watched it happen.
Then we went back to working.
I remember that moment! I was out at a bar or something at the time but I was prepared and had my laptop with me haha. I was mashing the up arrow and enter to make sure I didn’t miss it.
One of my favorite bits of Vinge's A Deepness in the Sky is the use of base-10 time: ksec, Msec, etc. There is a nice time log scale with Earth time to base-10 time conversions.
Yes! It is as a direct result of that book that I now know without having to look it up that a ksec is about a quarter hour and a Msec is on the order of a fortnight, which comes in handy when doing back-of-envelope estimation more often than you'd expect. (I'd already known that a Gsec was about a third of a century thanks to Tom Duff's observation.[0]) I don't see us moving to such a system anytime soon in general (tying to the circadian cycle is just too convenient) but I'm a little surprised I don't see it more often in discussions of humans in space.
[0] "How many seconds are there in a year? If I tell you there are 3.155 x 10^7, you won't even try to remember it. On the other hand, who could forget that, to within half a percent, pi seconds is a nanocentury." --Tom Duff
Though the software archaeologists in the book mistakenly thought the time_t epoch marked the moon landing, not just 1970. :)
I also like the Emergents. Liberal progressives creating a utopia. 100% employment, palaces made out of diamond for everyone.
The Emergent "utopia" is both horrifying and eerily believable. I've known some grad students and some tech workers who are way too close already.
But it's really strange to try to map the Emergent political structure onto any modern political axis. It's not "liberal progressive" or "traditional conservative" or "libertarian". Or any other popular political ideology. It's certainly authoritarian, but uniquely so. It's almost a dystopia run by project managers and exploiting specialists.
Also a fun bit: The traders in the book base their epoch on the first moon landing, but if you pay attention, the lowest levels of software count from a different epoch.
They merely got confused... thousands of years from now, they assume unix epoch time is based on the moon landing. And it's only a few months off anyway. Not much is left of Earth as far as they know, to be able to properly understand that those were two different events.
The Emergents are what they are, because they are, at the most fundamental level, busybodies who want to control others. Sometime in their own history, they found an excuse (the Emergency) to do that, and they never stopped doing it even after the crisis was over. In this way they map to most other authoritarian regimes in reality, but especially to the leftist authoritarian regimes. They hate "peddlers" after all, who sell things to others at fair prices and of their own free will. Not unlike the mutterings you see all over social media concerning capitalism.
That sounds more like a "The Giver" style dystopia
Spoiling the joke, but the Emergents are very clearly dystopian and NoMoreNicksLeft knows it ;)
Definitely give the book a read! One of my favorites.
5,148 days left until January 19, 2038.
Assuming I live that long, the next day will be my 65th birthday. Just in time for digital Armageddon.
It makes more sense to celebrate when a (relatively) high order bit changes from 0 to 1, not when the decimal representation changes.
I like the unit of 100 million seconds. Longer than a year, shorter than a decade. The era of 1.6e9 was the pandemic. What will 1.7e9 bring?
I love that others get excited about this. UNIX Timeval Aficionados should try out this tf tool [1]. I used my buddy's C/Lex/Yacc one daily for 1.5 decades, then ported it to Golang + Homebrew to share the love:
[1] https://github.com/neomantra/tf
brew tap neomantra/homebrew-tap
brew install tf
Printing out these round ones. `tf` auto-detects at 10 digits, so I started there in the `seq`.
> for TV in $(seq -f %.f 1000000000 100000000 2000000000); do echo $TV $TV | tf -d ; done
2001-09-08 18:46:40 1000000000
2004-11-09 03:33:20 1100000000
2008-01-10 13:20:00 1200000000
2011-03-12 23:06:40 1300000000
2014-05-13 09:53:20 1400000000
2017-07-13 19:40:00 1500000000
2020-09-13 05:26:40 1600000000
2023-11-14 14:13:20 1700000000
2027-01-15 00:00:00 1800000000
2030-03-17 10:46:40 1900000000
2033-05-17 20:33:20 2000000000
Some funny dates. -g detects multiple on a line, -d includes the date:
> echo 1234567890 __ 3141592653 | tf -gd
2009-02-13 15:31:30 __ 2069-07-20 17:37:33
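For anyone without `tf` installed, a rough Python equivalent of the loop above (printing UTC rather than the local times shown in the listing):

```python
from datetime import datetime, timezone

# Every hundred-millionth second from 1e9 to 2e9, in UTC.
for ts in range(1_000_000_000, 2_000_000_001, 100_000_000):
    print(datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S"), ts)
```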
Enjoy... may it save you time figuring out time!
In relation to UNIX time: the 20000th UNIX day is at 2024-10-04 (the 4th of October).
It's a special day, since the next round UNIX day is 30000, at 2052-02-20.
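Both round-day dates are easy to verify (a sketch; day 0 is taken to start at the epoch):

```python
from datetime import datetime, timedelta, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
# The calendar date on which each round UNIX day begins.
for n in (20_000, 30_000):
    print(n, (epoch + timedelta(days=n)).date())  # 2024-10-04, then 2052-02-20
```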
Reminds me of sdate for the Eternal September epoch. 10000 Sep 1993 was 2021-01-16.
http://www.df7cb.de/projects/sdate/
one commit message for the QDBs:
Sep 17 2001 (1000684800) is a special date from git-format-patch. Its significance is lost to time.
From 14df411817feda9decf9dd8a6cd555d71f199730 Mon Sep 17 00:00:00 2001
From: Christoph Berg <myon@debian.org>
Date: Thu, 4 Jun 2020 20:05:49 +0200
Subject: [PATCH] Fix long --covid option
 scripts/sdate.in | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
The next one lands on a nice round hour, since it'll be at exactly 3:00:00 AM.
$ date -d '@1800000000'
Fri Jan 15 03:00:00 AM EST 2027
The Unix timestamp inspired me to throw a birthday party on the day I turned a billion seconds old: 31.7 years :)
Instant bookmark for me. I've always loved the idea of measuring time in computers by a single integer like the timestamp does, but it always seems like such a pain to work with outside of that.
This might interest you, if you haven't already seen it: Unix timestamp clock, in hex! https://retr0.id/stuff/2038/
FWIW these bash shortcuts are handy if you do this a lot:
tss() { date -Is -u -d @$1 ; }
tsm() { date -Ins -u -d @$( echo "scale=3; $1 / 1000" | bc) | sed -E -e 's/[0-9]{6}\+/\+/' -e 's/,/./' ; }
tsn() { date -Ins -u -d @$( echo "scale=9; $1 / 1000000000" | bc) | sed 's/,/./' ; }
$ tss 1700000000
2023-11-14T22:13:20+00:00
$ tsm 1700000000000
2023-11-14T22:13:20.000+00:00
$ tsn 1700000000000000000
2023-11-14T22:13:20.000000000+00:00
Because the bases are all wrong. Common number bases are 10, 16, maybe 8 if you live in the 70s, and 2.
Except for the utterly unwieldy binary, none of those bases adapt well to the bases used in representing time, which are mostly the (partially related) bases 60, 12, and, annoyingly, thirty-ish.
So you always end up doing opaque arithmetic instead of “just looking at the digits” (which you still can do in decimal for century vs years for example, because we defined centuries to be exactly that).
> I've always loved the idea of measuring time in computers by a single integer like the timestamp does
Why?
When did Unix time start being used?
Was it being used in 1970 and actually started at 0?
Or did they just pick a date to start it and if so what was the initial Unix time when it was first used?
>The Unix epoch is midnight on January 1, 1970. It's important to remember that this isn't Unix's "birthday" -- rough versions of the operating system were around in the 1960s. Instead, the date was programmed into the system sometime in the early 70s only because it was convenient to do so, according to Dennis Ritchie, one of the engineers who worked on Unix at Bell Labs at its inception.
>"At the time we didn't have tapes and we had a couple of file-systems running and we kept changing the origin of time," he said. "So finally we said, 'Let's pick one thing that's not going to overflow for a while.' 1970 seemed to be as good as any."
Starting Tue Nov 14 2023 22:13:20 GMT+0000 to be exact!
Yesterday, I was digging into some stuff in the database and saw some events scheduled for 17*. My initial reaction was that it was some far-off date. Then I realized ... nope, not far away at all.
I'm more interested in such events when these coincide with a beginning of a year, month or week but it's a little too early to work out the math now
It's now!
When I first used Unix it started with 6. I feel old.
... which happens roughly every three years.
Or roughly every 136 years.
How so?
See for yourself
> date -d '@1600000000'
> date -d '@1700000000'
One year is 31,557,600 seconds, so roughly a third of 100 million seconds. 1.7 billion seconds since the epoch is the next big rollover since 2020, and the fifth-last before 31 bits are no longer enough to hold the seconds since the epoch.
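Both "roughly every ..." figures in this subthread check out (quick arithmetic, again using the Julian year):

```python
YEAR = 31_557_600  # Julian year in seconds

print(100_000_000 / YEAR)  # ~3.17 years between hundred-million rollovers
print(2**32 / YEAR)        # ~136 years for the full unsigned 32-bit range
```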
There's a lot of epoch love in the comments. For me, it's never "clicked". I assumed that after seeing a ton of timestamps I'd have a Neo-seeing-the-Matrix moment with them, but it just hasn't happened. Can you all easily decode them?
Is there talk anywhere of using a human-readable timestamp instead? e.g. YYYYMMddHHmmssSSSSZ
Sure there is. But since it is not a contiguous run of digits, there are fixed separators (--T::.) between the parts. It is the JavaScript time format, which is a subset of the RFC 3339 and ISO 8601 time formats. The separators at least allow for a variable number of sub-second digits.
The hyphen and colon separators are optional though, so YYYYMMDDThhmmss.ssssZ is a valid ISO8601 format.
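Both shapes side by side, for timestamp 1700000000 (a sketch using Python's strftime):

```python
from datetime import datetime, timezone

dt = datetime.fromtimestamp(1_700_000_000, tz=timezone.utc)
# Extended ISO 8601 form, with separators (the shape JavaScript's
# Date toISOString emits, minus sub-second digits):
print(dt.strftime("%Y-%m-%dT%H:%M:%SZ"))  # 2023-11-14T22:13:20Z
# Basic form, separators dropped -- still valid ISO 8601:
print(dt.strftime("%Y%m%dT%H%M%SZ"))      # 20231114T221320Z
```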