• This is less an issue of “smartness” and more that analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place. HDMI hits kind of a weird spot because it’s a digital protocol based on analog scanlines; if the signal gets disrupted for 0.02 ms, it might only affect the upper half of a frame and perhaps shift the bits for the lower half. Digital is more contextual and resynchronizes at least every frame, so this kind of degradation is also unstable.

    • analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place

      Not really. Digital signals travel over analog media, and it’s up to the receiver to decide how much degradation is too much. Mitigations like error correction are intended to reduce the final errors to zero, but it’s up to the device to decide whether it shows/plays something with some errors, and how many it tolerates, or whether it switches to a “signal lost” mode.
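
      A toy sketch of that receiver-side tradeoff (the code and thresholds here are made up): a 3x repetition code corrects single bit flips per triple by majority vote, and the receiver uses the ratio of damaged triples to decide between presenting the result, errors and all, or declaring the signal lost.

        def decode_repetition(bits):
            """Majority-vote each triple of repeated bits; also count how
            many triples showed any disagreement (i.e. took damage)."""
            data, damaged = [], 0
            for i in range(0, len(bits) - 2, 3):
                triple = bits[i:i + 3]
                data.append(1 if sum(triple) >= 2 else 0)
                if len(set(triple)) > 1:
                    damaged += 1
            return data, damaged

        def receive(bits, max_damage_ratio=0.2):
            data, damaged = decode_repetition(bits)
            if damaged > max_damage_ratio * max(len(data), 1):
                return None      # too degraded: switch to "signal lost" mode
            return data          # present it, possibly with residual errors

        def encode_repetition(data):
            return [b for bit in data for b in (bit, bit, bit)]

        tx = encode_repetition([1, 0, 1, 1, 0, 0, 1, 0])
        tx[4] ^= 1                # one bit flipped in transit
        print(receive(tx))        # corrected back to [1, 0, 1, 1, 0, 0, 1, 0]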

      For example, compressed digital video has a relatively high level of graceful degradation: full key frames come every Nth frame and are further subdivided into blocks, and each block can fail to decode on its own without impacting the rest. Intermediate frames only encode per-block changes, so as long as the decoder manages to locate the header of a key frame, it can show a partial image that gets progressively more garbled until the next key frame. Even if it misses a key frame, it can freeze the output until it manages to locate another one.
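
      A toy sketch of that kind of decoder (not any real codec; the frame size, key-frame interval, and loss rate are invented): key frames refresh every block, delta frames touch only a few, and a block that fails to decode is simply skipped, so the stale block stays on screen and the picture gets more garbled until the next key frame repaints it.

        import random

        random.seed(1)
        W, H = 4, 3         # frame size in blocks (hypothetical)
        KEY_INTERVAL = 4    # a full key frame every Nth frame (hypothetical)
        LOSS = 0.3          # chance that any given block fails to decode

        def make_stream(n_frames):
            """Key frames carry all blocks; delta frames carry a few changes."""
            for n in range(n_frames):
                if n % KEY_INTERVAL == 0:
                    yield {(y, x): f"k{n}" for y in range(H) for x in range(W)}
                else:
                    yield {(random.randrange(H), random.randrange(W)): f"d{n}"
                           for _ in range(3)}

        def decode(stream):
            canvas = [["?" for _ in range(W)] for _ in range(H)]
            for frame in stream:
                for (y, x), block in frame.items():
                    if random.random() < LOSS:
                        continue          # corrupted block: keep the stale one
                    canvas[y][x] = block  # healthy block: update in place
                print("  ".join(" ".join(f"{c:>3}" for c in row) for row in canvas))

        decode(make_stream(8))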

      Digital audio is more sensitive to uncorrected errors, which can cause high-frequency, high-volume screeches. Those need more mitigations, like filtering toward a normalized volume and frequency distribution based on the preceding blocks, but they still allow a level of graceful degradation.
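
      A minimal sketch of such a mitigation (the block size, sample rate, and fade shape are made up; real concealment is far more sophisticated): when a block is lost, play a faded-out repeat of the previous good block instead of the corrupted samples, which keeps the output bounded in volume and roughly matched in spectrum to what came before.

        import math

        BLOCK = 128    # samples per block (hypothetical)
        RATE = 8000    # sample rate in Hz (hypothetical)

        def conceal(blocks):
            """Replace each lost block (None) with a fade-out of the last
            good block rather than emitting garbage samples."""
            last_good = [0.0] * BLOCK
            out = []
            for blk in blocks:
                if blk is None:
                    blk = [s * (1.0 - i / BLOCK) for i, s in enumerate(last_good)]
                else:
                    last_good = blk
                out.extend(blk)
            return out

        # A 440 Hz tone with the middle block dropped: output stays bounded.
        tone = [math.sin(2 * math.pi * 440 * n / RATE) for n in range(BLOCK * 5)]
        blocks = [tone[i:i + BLOCK] for i in range(0, len(tone), BLOCK)]
        blocks[2] = None
        pcm = conceal(blocks)
        print(len(pcm), max(abs(s) for s in pcm))  # 640 samples, peak <= 1.0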