• Rusty@lemmy.ca · 91 points · 4 days ago

    I don’t think the year 10000 is a problem. There is a real “year 2038 problem” that affects systems storing Unix time in a signed int32, but it’s mostly solved already. The next problem will be in the year 33000 or something like that.
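
    A minimal C sketch of what that int32 rollover looks like (illustrative only; the cast through unsigned just avoids signed-overflow undefined behavior in C):

        #include <stdio.h>
        #include <stdint.h>

        int main(void) {
            int32_t t = INT32_MAX;  /* 2147483647 s = 03:14:07 UTC, Jan 19 2038 */
            printf("last valid timestamp: %d\n", t);
            /* One more second wraps to the most negative value, i.e. Dec 1901. */
            int32_t wrapped = (int32_t)((uint32_t)t + 1u);
            printf("one second later:     %d\n", wrapped);
            return 0;
        }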

      • marcos@lemmy.world · 13 points · 4 days ago

        Yes, there are random systems using every kind of smart or brain-dead option out there.

        But the 2038 problem impacts the previous standard, and the current one will take ages to fail. (No, it’s not 33000, unless you are using some variant of the standard that counts nanoseconds instead of seconds. Those usually have more bits nowadays, but some odd older systems do it in the same 64 bits as the standard.)
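
        A quick back-of-envelope in C shows why the nanosecond variants are the ones to watch (using an average Gregorian year, so approximate, not calendar-exact):

            #include <stdio.h>
            #include <stdint.h>

            int main(void) {
                /* Lifetime of a signed 64-bit counter that ticks in nanoseconds. */
                const double secs_per_year = 365.2425 * 24.0 * 3600.0;
                double years = (double)INT64_MAX / 1e9 / secs_per_year;
                printf("~%.0f years -> fails around the year %.0f\n",
                       years, 1970.0 + years);
                return 0;
            }

        That lands around the year 2262, nowhere near 33000.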

    • Pennomi@lemmy.world · 12 points · 4 days ago

      It’s a UX problem rather than a date format problem at that point. Many form fields require exactly 4 digits.
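
      A tiny C illustration of that kind of fixed-width assumption (a hypothetical parser, but the pattern is everywhere):

          #include <stdio.h>

          int main(void) {
              /* A parser that assumes years have exactly four digits. */
              int y = 0, m = 0, d = 0;
              const char *input = "10000-01-01";
              /* %4d stops after four digits: the year reads as 1000 and the
                 leftover '0' derails the rest of the parse. */
              int n = sscanf(input, "%4d-%2d-%2d", &y, &m, &d);
              printf("parsed %d field(s), year=%d\n", n, y);
              return 0;
          }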

    • toddestan@lemm.ee · 3 points · 3 days ago (edited)

      I’ve been curious about that myself. On one hand, it still seems far away. On the other hand, it’s a bit over 13 years away now and I have gear actively in use that’s older than that today.

    • JackbyDev@programming.dev · 1 point · 3 days ago

      I don’t think it will be a problem because it’s 8,000 years away lol, but people do store time in ISO 8601 strings.
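
      That is the one place year 10000 genuinely bites: ISO 8601 strings only sort chronologically while every year has the same number of digits. A small demo:

          #include <stdio.h>
          #include <string.h>

          int main(void) {
              const char *a = "9999-12-31";
              const char *b = "10000-01-01";
              /* Byte-wise comparison: '9' > '1', so 9999 sorts AFTER 10000. */
              printf("strcmp says %s comes %s 10000-01-01\n", a,
                     strcmp(a, b) > 0 ? "after" : "before");
              return 0;
          }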

      • frezik@midwest.social · 13 points · 4 days ago (edited)

        A common method of storing dates is the number of seconds since midnight on Jan 1, 1970 (which was somewhat arbitrarily chosen).

        A 32-bit signed integer can store numbers from −2³¹ through 2³¹ − 1 (the “minus one” comes from zero taking up one of the non-negative slots). 2³¹ − 1 seconds added to Jan 1, 1970 gets you to Jan 19, 2038.
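
        You can check that endpoint with the C library itself (assuming a platform whose time_t reaches 2038, which any modern one does):

            #include <stdio.h>
            #include <stdint.h>
            #include <time.h>

            int main(void) {
                /* Where does 2^31 - 1 seconds past the epoch land? */
                time_t limit = INT32_MAX;   /* 2147483647 */
                char buf[64];
                strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&limit));
                printf("%s\n", buf);        /* 2038-01-19 03:14:07 UTC */
                return 0;
            }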

        The solution is to jump to 64-bit integers, but as with Y2K, there are a lot of old systems that need to be updated to 64-bit integers (and no, they don’t necessarily need 64-bit CPUs to make that work). For the most part, this has been done already. That pushes the rollover out to the year 292,277,026,596 CE, which is orders of magnitude past the time for the sun to turn into a red giant.
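
        The same back-of-envelope arithmetic, this time for a signed 64-bit counter of seconds:

            #include <stdio.h>
            #include <stdint.h>

            int main(void) {
                /* Average Gregorian year; approximate on purpose. */
                const double secs_per_year = 365.2425 * 24.0 * 3600.0;
                printf("~%.2e years of range\n", (double)INT64_MAX / secs_per_year);
                return 0;
            }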

        • pfm@scribe.disroot.org · 2 points · 4 days ago

          Maybe it’s not LI5, but I certainly enjoyed your explanation; it packs in several important facts and context. I respect your skill and knowledge, dear internet stranger.

        • gandalf_der_12te@discuss.tchncs.de · 1 point · 4 days ago

          > midnight on Jan 1, 1970 (which was somewhat arbitrarily chosen).

          Well, not so much; as far as I remember, the first end-user computers became available around 1971 or 1972, and the internet also underwent some rapid development around that time, so the date has a certain reasoning to it.

      • teije9@lemmy.blahaj.zone · 8 points · 4 days ago

        Unix computers store time as the number of seconds that have passed since January 1st, 1970. Once there have been too many seconds since 1970, it starts breaking. “Signed” is a way to store negative numbers in binary. The basics of it are: when the leftmost bit is a 1, it’s a negative number (and then you do some other things to the rest of the number so that it acts like a negative number). So once the counter reaches 01111111… (a 0 followed by all 1s), one more second turns it into 10000000…, which a computer sees as a large negative number.
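
        The same flip in miniature, with 8 bits instead of 32:

            #include <stdio.h>
            #include <stdint.h>

            int main(void) {
                uint8_t bits = 0x7F;   /* 01111111 = 127, the biggest positive value */
                printf("%4d (01111111)\n", (int8_t)bits);
                bits += 1;             /* one more second: 10000000 */
                printf("%4d (10000000)\n", (int8_t)bits);  /* read as signed: -128 */
                return 0;
            }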