• radostin04@pawb.social
    1 year ago

    Inaccurate meme - the white and red RCAs in composite typically don’t actually carry the left and right channels. Usually the white one is L+R, meaning both channels combined into one signal, and the red one is L-R, the difference between the left and right channels.

    This is done so that a mono television, which only has a yellow and a white port, can still reproduce both audio channels instead of missing out on one of them entirely.
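
    The matrixing described above is just sum/difference (mid/side) encoding. A minimal Python sketch of the arithmetic, purely for illustration (nothing here is specific to RCA hardware):

    ```python
    # Sum/difference matrixing: encode L and R into L+R and L-R,
    # then recover the original channels on the receiving end.
    def encode(left, right):
        return left + right, left - right   # (L+R, L-R)

    def decode(sum_sig, diff_sig):
        left = (sum_sig + diff_sig) / 2     # ((L+R) + (L-R)) / 2 = L
        right = (sum_sig - diff_sig) / 2    # ((L+R) - (L-R)) / 2 = R
        return left, right

    # A mono set that only sees the white (L+R) plug still hears both channels.
    l, r = decode(*encode(0.75, -0.25))
    print(l, r)  # 0.75 -0.25
    ```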

    • rektifier@sh.itjust.works
      1 year ago

      This must be BS or a regional thing. All the RCA ports I’ve seen in North America are labeled L and R, not L+R and L-R.

      • radostin04@pawb.social
        1 year ago

        It might only be a thing in PAL regions - I’d test it, but I don’t have anything that uses composite to try it with anymore.

        • OADINC@feddit.nl
          1 year ago

          I can confirm that everything with composite and L/R connections that I’ve used in my life (born in 2001 in the Netherlands, so PAL) has separate L and R channels. I’ve checked this with a multimeter before.

  • BudgetBandit@sh.itjust.works
    1 year ago

    laughs in european

    I present to you: the SCART.

    Our gaming consoles came with it.

    We were clueless the first time we hooked up our N64 at gran-gran’s, since the old TV did not have a SCART connector, but we figured out that the SCART’s colored cables went in there.

    • Blackmist@feddit.uk
      1 year ago

      SCART was amazing. RGB, composite, component, audio. All in one cable. Granted, that cable and connector were enormous, but one cable nonetheless.

      • BorgDrone@lemmy.one
        1 year ago

        SCART was terrible.

        Theoretically it had all that in one cable; in practice it never did. You’d usually have 3-4 SCART ports on a TV, but not all ports accepted or output the same signals. There was no way to tell from the outside what a given SCART port would input or output, so you either had to try different port combinations or look it up in the manual (if you had one). Most TVs had one port that accepted S-Video, one that accepted RGB, and composite was usually accepted on all ports.

        Worse, not all cables had all 21 connections. If you were lucky you could tell because not all pins on the connector would be there (but that wasn’t necessarily the case).

        Usually there was also one port on the TV that output the video from the tuner. This was used for analog pay-TV decoders: you hooked the decoder up to that SCART port, it received the scrambled video from the TV, and it returned the descrambled video over the same port.

        Also, due to the size and design of the connector it was almost impossible to insert one blindly. Plugging one into the back of one of those enormous CRT televisions was always a challenge.

  • argv_minus_one@beehaw.org
    1 year ago

    VGA was so much better.

    The composite video output commonly seen on 1980s microcomputers couldn’t display high-resolution text without severe distortion making the text unreadable. This could be seen on the IBM PCjr, for example, where the digital RGB display it came with could display 80×25 text mode just fine, but if you connected a composite video display (i.e. a TV) instead, 80×25 text was a blurry, illegible mess. The digital video output was severely limited in color depth, however; it could display only a fixed palette of 16 colors, whereas the distortion in the composite video could be used to create many more colors, albeit at very low resolution.
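
    As a rough back-of-envelope check of why that happens, using approximate NTSC figures (about 52.6 µs of active line and roughly 4.2 MHz of luminance bandwidth - assumptions of mine, not numbers from above):

    ```python
    # Why 80-column text turns to mush over composite: the video bandwidth it needs
    # far exceeds what an NTSC composite signal can carry (approximate figures).
    active_line_us = 52.6          # visible part of one NTSC scan line, in microseconds
    pixels_per_line = 80 * 8       # 80 text columns x 8-pixel glyphs = 640 pixels
    luma_bandwidth_mhz = 4.2       # rough NTSC luminance bandwidth
    chroma_subcarrier_mhz = 3.58   # fine detail near this frequency decodes as color

    pixel_rate_mhz = pixels_per_line / active_line_us    # ~12.2 MHz
    needed_bandwidth_mhz = pixel_rate_mhz / 2             # one cycle spans two pixels, ~6.1 MHz

    print(f"needs ~{needed_bandwidth_mhz:.1f} MHz, composite luma carries ~{luma_bandwidth_mhz} MHz")
    print(f"detail near {chroma_subcarrier_mhz} MHz comes out as artifact colors instead")
    ```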

    Then along came the VGA video signal format. This was a bit of a peculiarity: analog RGB video. Unlike digital RGB of the time, it was not limited in color depth, and could represent an image with 24-bit color, no problem. Unlike composite video, it had separate signal lines for each primary color, so any color within the gamut was equally representable, and it had enough bandwidth on each of those lines to cleanly transmit a 640×480 image at 60Hz with pretty much perfect fidelity.
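
    For a sense of the numbers, assuming the standard 800×525 total VGA timing (visible area plus blanking - an assumption, though it is the commonly cited figure), the per-channel pixel rate works out to:

    ```python
    # Estimate the VGA pixel clock for 640x480 at 60 Hz. Each of the R, G and B lines
    # carries this full rate, since every pixel gets its own analog voltage level.
    h_total, v_total = 800, 525    # visible 640x480 plus horizontal/vertical blanking
    refresh_hz = 60
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    print(f"pixel clock ~ {pixel_clock_mhz:.1f} MHz")  # ~25.2 MHz (spec value: 25.175 MHz)
    ```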

    However, someone at IBM was apparently a bit of a perfectionist: a VGA cable can carry an image of up to 2048×1536 at 85 Hz, or refresh rates of 100 Hz or more at lower resolutions, all with 24-bit color depth - far beyond what the original VGA graphics chips and the associated IBM 85xx-series displays could handle.
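
    Repeating the same estimate at that upper limit (with a rough 25% blanking allowance assumed, since exact timings vary) shows how much headroom the cable had:

    ```python
    # Rough pixel clock for 2048x1536 at 85 Hz over the same three analog lines.
    visible_pixels = 2048 * 1536
    refresh_hz = 85
    blanking_overhead = 1.25       # crude assumption; real timings differ
    pixel_clock_mhz = visible_pixels * refresh_hz * blanking_overhead / 1e6
    print(f"pixel clock ~ {pixel_clock_mhz:.0f} MHz")  # well over 300 MHz per color channel
    ```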

    Also, VGA bundled every signal line into a single cable and connector, so there was no more figuring out which cable plugged in where, and it was so future-proof that for pretty much the entire '90s you could buy any old computer display, plug it into any old computer, and it would just work.

    Pretty impressive for an analog video signal/cable/connector designed in 1987.