•  4am   ( @4am@lemm.ee ) 
      15 points · 7 months ago

      “Lol Elon rocket go boom, science isn’t real” is also happening

      Stupid people just think they’re the smartest ones in the room now

      •  Scrof   ( @Scrof@sopuli.xyz ) 
        8 points · 7 months ago

        Well, considering the Elon situation, I wouldn’t blame anyone for making fun of his idiotic ventures. Also, Starship is actually dumb, and saying “we expected it to blow up” is something no real scientist would have said unless they were making a bomb.

        •  CybranM   ( @CybranM@feddit.nu ) 
          7 points · 7 months ago

          How is Starship dumb, exactly? Making a new thing at any extreme of our current capability is going to be hard, and it’s not unexpected when something goes wrong. What would be dumb is if they put human lives on the line.

    •  neidu2   ( @neidu2@feddit.nl ) 
      10 points · edited · 7 months ago

      I wasn’t working in the IT field back then, as I was only 16, but since I knew it would most likely be my field one day (yup, I was right), I followed this closely out of interest and applied patches accordingly.

      Everything kept working fine except this one modem I had.

    • Y2K specifically makes no sense though. Any reasonable way of storing a year would use a binary integer of some length (especially when you want to use as little memory as possible). The same goes for manipulations: they are faster, more memory-efficient, and easier to implement in binary. With an 8-bit signed integer counting from 1900, the concerning overflows would occur in 2028, not 2000. A base-10 representation would require at least 8 bits to store a two-digit number anyway; there is no advantage to a base-10 representation, and there never has been.

      For Y2K to have been anything more significant than a text-formatting issue, a whole lot of programmers would have had to go out of their way to be really, really bad at their jobs. Also, usage of dates beyond 2000 would have increased gradually for decades leading up to it, so the idea that it would be any sort of sudden catastrophe is absurd.
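
      A minimal sketch of the overflow claim above, assuming the hypothetical signed 8-bit offset from 1900 that the comment describes (not the layout of any real system):

      ```c
      #include <stdio.h>

      int main(void) {
          /* Hypothetical layout: year = 1900 + signed 8-bit offset */
          signed char offset = 127;            /* 1900 + 127 = 2027, the last representable year */
          printf("last good year: %d\n", 1900 + offset);

          offset = (signed char)(offset + 1);  /* 2028 does not fit; wraps (typically to -128) */
          printf("after overflow: %d\n", 1900 + offset);  /* 1772 on two's-complement machines */
          return 0;
      }
      ```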

      • The issue wasn’t using dates beyond 2000. The issue was the computer believing the current date was one of those dates.

        I’m going to assume you aren’t old enough to remember, but the “only two digits to represent the year” issue predates computers. Lots of paper forms just gave two digits. And a lot of early computer work was just digitising paper forms.

      •  GoodEye8   ( @GoodEye8@lemm.ee ) 
        25 points · 7 months ago

        You’re thinking of the problem with modern solutions in mind. Y2K originates from punch cards, where everything was stored as characters. To save space, only the last two digits of the year were stored, because back then you didn’t need the “19” of year 19xx. The technique of storing data stayed the same for a long time, even as technology advanced beyond punch cards. The assumption that it’s always 19xx caused the Y2K bug: once the year rolls over to 00, the system doesn’t know whether it’s 1900 or 2000.
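
        A rough illustration of that baked-in 19xx assumption; the record layout and names are invented for the example:

        ```c
        #include <stdio.h>
        #include <stdlib.h>

        /* Punch-card style record: the year kept as two decimal characters */
        struct record {
            char yy[3];                    /* "99" for 1999; the "19" was never stored */
        };

        int full_year(const struct record *r) {
            return 1900 + atoi(r->yy);     /* the baked-in assumption: every year is 19xx */
        }

        int main(void) {
            struct record a = { "99" };    /* written in 1999 */
            struct record b = { "00" };    /* written in 2000 */
            printf("%d\n", full_year(&a)); /* prints 1999: fine */
            printf("%d\n", full_year(&b)); /* prints 1900: the Y2K bug */
            return 0;
        }
        ```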

      •  frezik   ( @frezik@midwest.social ) 
        15 points · 7 months ago

        > With an 8-bit signed integer counting from 1900…

        Some of the computers in question predate standardizing on 8 bits to the byte. You’ve got a whole post here of bad assumptions about how things worked.

      •  Matombo   ( @Matombo@feddit.de ) 
        6 points · 7 months ago

        Oh boy, you heavily underestimate the amount and level of bad decisions in legacy protocols. Read up on the topic: the date was, for a long time, stored as six decimal digits.
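
        For instance (a minimal sketch, assuming the six-digit YYMMDD layout mentioned above), sorting such dates as plain strings goes wrong the moment the century rolls over:

        ```c
        #include <stdio.h>
        #include <string.h>

        int main(void) {
            /* YYMMDD: six decimal digits, no century */
            const char *dec31_1999 = "991231";
            const char *jan01_2000 = "000101";

            /* Lexical comparison works within 19xx but breaks across 2000 */
            if (strcmp(jan01_2000, dec31_1999) < 0)
                printf("2000-01-01 sorts before 1999-12-31\n");   /* this branch is taken */
            return 0;
        }
        ```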