[–] Rusty@lemmy.ca 95 points 4 days ago (9 children)

I don't think the year 10000 is a problem. There is a real "year 2038 problem" that affects systems storing Unix time in a signed int32, but it's mostly solved already. The next problem will be in year 33000 or something like that.
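Here's a quick sketch of where that 2038 date comes from, just interpreting the largest signed 32-bit value as a Unix timestamp:

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # largest value a signed 32-bit integer can hold

# Read it as seconds since the Unix epoch (1970-01-01 00:00:00 UTC)
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
```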

[–] gnutrino@programming.dev 51 points 4 days ago (2 children)

There are so many problems that there is an entire Wikipedia page dedicated to them.

[–] marcos@lemmy.world 14 points 4 days ago

Yes, there are random systems using every kind of smart or brain-dead option out there.

But the 2038 problem impacts the previous standard, and the current one will take ages to fail. (No, it's not 33000, unless you're using some variant of the standard that counts nanoseconds instead of seconds. Those usually get more bits nowadays, but some odd older systems pack them into the same 64 bits as the standard.)

I'm pretty certain most of my work inevitably ends up being related to some time issue.
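To illustrate the nanosecond case, a sketch assuming a signed 64-bit counter of nanoseconds since the Unix epoch:

```python
from datetime import datetime, timedelta, timezone

INT64_MAX = 2**63 - 1  # largest signed 64-bit value

# Nanoseconds since the epoch, truncated to microseconds for timedelta
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(epoch + timedelta(microseconds=INT64_MAX // 1000))
# 2262-04-11 23:47:16.854775+00:00 -- much sooner than 33000
```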

[–] Ephera@lemmy.ml 22 points 4 days ago (2 children)

Well, I looked at a Year 10000 problem less than 2 hours ago. We're parsing logs to extract the timestamp, and for that we're using a regex which starts with:

\d{4}-\d{2}-\d{2}

So, we assume there to always be 4 digits for the year. It can't cope if you live in the year 10000 or beyond, nor in the year 999 or before.
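A quick reproduction of the failure mode (the surrounding log text is made up; only the \d{4}-\d{2}-\d{2} prefix is from the actual regex):

```python
import re

ts = re.compile(r"\d{4}-\d{2}-\d{2}")  # four-digit year assumed

print(ts.match("2024-12-18 03:14:07 INFO ok"))     # matches '2024-12-18'
print(ts.match("10000-01-01 00:00:00 INFO oops"))  # None: 5-digit year
print(ts.match("999-12-31 00:00:00 INFO oops"))    # None: 3-digit year
```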

[–] Frozengyro@lemmy.world 12 points 4 days ago

Just start over at year 0000 AT (after ten thousand)

[–] itslilith@lemmy.blahaj.zone 6 points 4 days ago (1 children)

The ISO time standard will certainly need to be redone

[–] Ephera@lemmy.ml 5 points 4 days ago (2 children)

Do you think so? Surely it's able to handle dates before the year 999 correctly, so I'd also expect it to handle years beyond 10000. The \d{4} is just our bodged assumption, because, well, I've actually never seen a log line with a year that wasn't 4 digits...

[–] itslilith@lemmy.blahaj.zone 8 points 4 days ago (1 children)

Kinda?

Each date and time value has a fixed number of digits that must be padded with leading zeros.

To represent years before 0000 or after 9999, the standard also permits the expansion of the year representation but only by prior agreement between the sender and the receiver.[21] An expanded year representation [±YYYYY] must have an agreed-upon number of extra year digits beyond the four-digit minimum, and it must be prefixed with a + or − sign[22] instead of the more common AD/BC (or CE/BCE) notation; by convention 1 BC is labelled +0000, 2 BC is labelled −0001, and so on.[23]
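As a sketch of that expanded representation (the five-digit width is just an example of such an agreement, and `iso_expanded_year` is a made-up helper name):

```python
def iso_expanded_year(year: int, digits: int = 5) -> str:
    """Format a year per ISO 8601 expanded representation:
    explicit +/- sign, agreed fixed width, astronomical numbering."""
    sign = "-" if year < 0 else "+"
    return f"{sign}{abs(year):0{digits}d}"

print(iso_expanded_year(10000))  # +10000
print(iso_expanded_year(0))      # +00000 (1 BC)
print(iso_expanded_year(-1))     # -00001 (2 BC)
```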

[–] Ephera@lemmy.ml 5 points 3 days ago

Oh wow, I really expected the standard to just say that however many digits you need are fine, because, you know, maths. But I guess this simplifies handling all kinds of edge cases in the roughly 7975 years we've still got.

[–] pennomi@lemmy.world 12 points 4 days ago

It’s a UX problem rather than a date format problem at that point. Many form fields require exactly 4 digits.

[–] GissaMittJobb@lemmy.ml 7 points 4 days ago (1 children)

It's going to be significantly more than the year 33000 before we run out of 64-bit epoch timestamps.

The max value for signed 64-bit epoch values is more than 292 billion years away, or about 20 times the age of the universe itself.

So yeah, we're basically solid forever with 64-bit.
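The back-of-the-envelope version:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~31.6 million seconds

years = (2**63 - 1) / SECONDS_PER_YEAR
print(f"{years:.3e}")  # ~2.924e+11, i.e. ~292 billion years
print(years / 13.8e9)  # ~21 ages of the universe
```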

[–] frezik@midwest.social 1 points 3 days ago* (last edited 3 days ago)

33,000 would come from other programs that store the year as a 16-bit signed int. Year 32,768, to be precise.
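A sketch of that overflow, simulating a 16-bit signed year field with `ctypes`:

```python
import ctypes

print(ctypes.c_int16(32767).value)      # 32767: last representable year
print(ctypes.c_int16(32767 + 1).value)  # -32768: year 32,768 wraps negative
```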

[–] toddestan@lemm.ee 3 points 3 days ago* (last edited 3 days ago)

I've been curious about that myself. On one hand, it still seems far away. On the other hand, it's a bit over 13 years away now and I have gear actively in use that's older than that today.

[–] kevincox@lemmy.ml 5 points 3 days ago

it’s mostly solved already

I wish I believed this. Or I guess I agree that it is solved in most software, but there is lots of commonly used software where it isn't. One broken bit of software can fairly easily take down a whole site or OS.

Try to create an event in 2040 in your favourite calendar. There is a decent chance it isn't supported. I would say most calendar servers support it, but the frontends often don't, or vice versa.

[–] Diplomjodler3@lemmy.world 5 points 4 days ago

Luckily I'll be retired by then.

[–] JackbyDev@programming.dev 1 points 3 days ago

I don't think it will be a problem because it's 8,000 years away lol, but people do store time in ISO 8601 strings.
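One concrete way string-stored dates go wrong past four digits (a sketch; lexicographic order only matches chronological order while every year has the same width):

```python
dates = ["0999-06-01", "2024-12-18", "10000-01-01"]
print(sorted(dates))
# ['0999-06-01', '10000-01-01', '2024-12-18'] -- string sort breaks
```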

[–] HKPiax@lemmy.world 3 points 3 days ago (2 children)
[–] frezik@midwest.social 13 points 3 days ago* (last edited 3 days ago) (2 children)

A common method of storing dates is the number of seconds since midnight on Jan 1, 1970 (which was somewhat arbitrarily chosen).

A 32-bit signed integer means it can store numbers from -2^31^ through 2^31^ - 1 (the asymmetry comes from zero taking up one of the spots on the non-negative side). 2^31^ - 1 seconds added to Jan 1, 1970 gets you to Jan 19, 2038.

The solution is to jump to 64-bit integers, but as with Y2K, there are a lot of old systems that need to be updated to 64-bit integers (and no, they don't necessarily have to have 64-bit CPUs to make that work). For the most part, this has been done already. That would put the date out to 292,277,026,596 CE, which is orders of magnitude past the time for the sun to turn into a red giant.
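To make the overflow concrete, here's a sketch simulating the 32-bit wraparound (real systems fail in assorted ways; this is the textbook case):

```python
from datetime import datetime, timezone
import ctypes

t = 2**31 - 1                          # 2038-01-19 03:14:07 UTC
wrapped = ctypes.c_int32(t + 1).value  # one more second overflows to -2**31
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```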

[–] pfm@scribe.disroot.org 2 points 3 days ago

Maybe it's not LI5, but I certainly enjoyed your explanation for including several important facts and context. I respect your skill and knowledge, dear internet stranger.

midnight on Jan 1, 1970 (which was somewhat arbitrarily chosen).

Well, not so much: as far as I remember, the first end-user computers became available in 1971 or 1972 or something, and the internet also underwent some rapid developments around that time, so the date has a certain reasoning to it.

[–] teije9@lemmy.blahaj.zone 8 points 3 days ago

Unix computers store time as the number of seconds that have passed since January 1st, 1970. Once there have been too many seconds since 1970, it starts breaking. 'Signed' is a way to store negative numbers in binary. The basics of it are: when the leftmost bit is a 1, it's a negative number (and then you do some other things to the rest of the number so that it acts like a negative number). So when the counter is at 01111111 in binary, one more second makes it 10000000, which a computer sees as a negative number.
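In 8 bits, that flip looks like this (a sketch using `ctypes` to mimic a fixed-width signed counter):

```python
import ctypes

print(ctypes.c_int8(0b01111111).value)      # 127: largest 8-bit signed value
print(ctypes.c_int8(0b01111111 + 1).value)  # -128: sign bit flips on
```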