• Aceticon@lemmy.world

      Dates were stored with the year as two digits only (say, 1995 was stored as “95”). That worked fine for things like comparisons (“is the year in entry A before or after the year in entry B?”), which were done numerically: 98 > 95, so a date with a year ending in 98 comes after one with a year ending in 95. It worked, that is, until the year 2000, when the stored year would become “00” and all those assumptions would break, as would all the arithmetic done on two-digit years. For example, a loan taken out in 1995 would in 1998 be in its 98 - 95 = 3rd year, but in 2000 it would be in its 00 - 95 = -95th year (so negative), which would further break the maths downstream, with interesting results like the computer telling the bank it had to give money to the borrower to close out the loan.
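
      As a minimal sketch of that failure mode (in C, with hypothetical names; the systems of the era were mostly COBOL, but the arithmetic is the same):

      ```c
      #include <stdio.h>

      /* Two-digit year storage: only the last two digits of the year
         are kept, so there is no century information at all. */
      static int loan_age(int current_yy, int origination_yy) {
          return current_yy - origination_yy;
      }

      int main(void) {
          /* Loan taken out in 1995, stored as 95. */
          printf("Age in 1998: %d\n", loan_age(98, 95)); /* 3: correct   */
          printf("Age in 2000: %d\n", loan_age( 0, 95)); /* -95: nonsense */
          return 0;
      }
      ```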

      Ultimately a lot of work was done (I myself worked on some of that stuff), and very few important things blew up or started producing erroneous numbers when the year 2000 came.

      • stebo02@sopuli.xyz

        I wonder why they didn’t design computers and clocks to count past 99 in the first place? Did they not expect to ever reach the year 2000?

        • danque@lemmy.world

          The year is 2038, and nothing happened. Seems like it was all a lot of fuss over nothing. (Meanwhile, behind the scenes, developers are happy they prevented a major problem.)
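
          For anyone wondering what 2038 refers to: classic Unix systems store time as a signed 32-bit count of seconds since 1970-01-01 UTC, which overflows on 2038-01-19. A minimal sketch of the wraparound (the increment is done in unsigned arithmetic, since signed overflow is undefined behaviour in C):

          ```c
          #include <stdio.h>
          #include <stdint.h>

          int main(void) {
              /* 2038-01-19 03:14:07 UTC: the last second a signed
                 32-bit timestamp can represent. */
              int32_t t = INT32_MAX;
              printf("Last 32-bit timestamp: %d\n", t);

              /* One second later the value wraps to a large negative
                 number, which naive code decodes as a date back in
                 December 1901. */
              int32_t wrapped = (int32_t)((uint32_t)t + 1u);
              printf("One second later:      %d\n", wrapped);
              return 0;
          }
          ```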