These days, kids identify them by the aspect ratio.

  • jet@hackertalks.com · ↑56 · 1 year ago

    And video quality. Watching some historical videos from my childhood, like TV shows on YouTube… the quality is pure potato. Either the archiving is terrible, or we just accepted much worse quality back then.

    • Hypersapien@lemmy.world (OP) · ↑26 · 1 year ago

      People always said that Betamax was better quality than VHS. What never gets mentioned is that regular consumer TVs at the time weren’t capable of displaying the difference in quality. To the average person they were the same.

      • jeffw@lemmy.world · ↑6 ↓3 · 1 year ago

        You kinda can tell, though. CRTs didn’t really use pixels, so it’s not like watching on today’s video equipment.

        • NuPNuA@lemm.ee · ↑4 ↓2 · 1 year ago

          CRT screens definitely used pixels, but they updated per horizontal scan line rather than per pixel. That’s part of why earlier flatscreen LCDs were worse than CRTs in a lot of ways: they had much more motion blur, because “sample and hold” meant each pixel stayed lit at the same level for the whole frame instead of flashing and fading like a phosphor. CRTs gave you a fresh image each frame regardless.

          • Psyduck_world@lemmy.world · ↑0 · 1 year ago

            I’ve heard that the pixels in CRTs are round and LCD/LED ones are square, and that’s why aliasing isn’t very noticeable on CRTs. Is that true, or just more internet BS?

            • NuPNuA@lemm.ee · ↑4 · 1 year ago

              They’re not round per se, but they aren’t as sharp, so the light bleeds from one into another, giving a natural anti-aliasing effect. This is why some old games, where the art was designed to account for that blurring, look wrong when played on pixel-perfect modern TVs.
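
              As a rough illustration (just a sketch using Pillow, not how a real tube works; the filename and blur radius are placeholders): upscale pixel art with nearest-neighbour for the “pixel perfect” look, then blur it slightly to stand in for phosphor bleed.

              ```python
              # Rough approximation of CRT phosphor bleed: hard pixel edges
              # smear into their neighbours, which softens aliasing.
              from PIL import Image, ImageFilter

              art = Image.open("sprite.png").convert("RGB")  # placeholder pixel-art image

              # Nearest-neighbour upscale = the sharp "pixel perfect" modern look.
              sharp = art.resize((art.width * 4, art.height * 4), Image.NEAREST)

              # A slight blur stands in for phosphor bleed / natural anti-aliasing.
              soft = sharp.filter(ImageFilter.GaussianBlur(radius=1.5))

              sharp.save("sprite_pixel_perfect.png")
              soft.save("sprite_crt_ish.png")
              ```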

    • Capt. Wolf@lemmy.world · ↑19 · 1 year ago

      There’s a lot of archival video that is just terrible. Digital compression has damaged a lot of old footage that’s been shared around over the years; YouTube’s encoders in particular will straight up murder a video to save bandwidth. There’s also a lot of stuff that just doesn’t look great when it’s upscaled from magnetic media that was 320×240 at best.

      However, there’s also a lot of stuff that was bad to begin with and just took advantage of things like scanlines and dithering to make up for poor video quality. Take old games: plenty of developers leaned on CRT TVs to create shading, smoothing, and the illusion of a higher resolution than the console was actually capable of (a rough sketch of the scanline effect is below). There’s a lot of contention in the retro gaming community over whether those games looked better with scanlines or look better now without them.

      For example.

      Personally, I prefer them without. I like the crisp pixelly edges, but I was also lucky enough to play most of my games on a high quality monitor instead of a TV back then. Then emulators, upscaling, and pixel smoothing became a thing…
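
      Here’s the minimal scanline sketch mentioned above, assuming Pillow and NumPy; “frame.png” is just a placeholder screenshot and the darkening factor is arbitrary. It only mimics the look, it’s not how a CRT actually drew the image.

      ```python
      # Fake scanlines: upscale a frame, then darken one row out of every three.
      import numpy as np
      from PIL import Image

      img = Image.open("frame.png").convert("RGB")          # placeholder screenshot
      img = img.resize((img.width * 3, img.height * 3), Image.NEAREST)

      px = np.asarray(img, dtype=np.float32)
      px[1::3, :, :] *= 0.55                                # dim every third row
      Image.fromarray(px.clip(0, 255).astype(np.uint8)).save("frame_scanlines.png")
      ```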

    • Dandroid@dandroid.app · ↑6 · 1 year ago

      I watch a lot of hockey, and games from the 2000s are full-on potato. I don’t remember them looking that bad back then.

        • NuPNuA@lemm.ee · ↑3 · 1 year ago

          All sports broadcasts from back then are like that. Faster-refresh LCDs have helped too, since those early flatscreens blurred a lot.

    • Carighan Maconar@lemmy.world · ↑20 · 1 year ago

      I noticed when watching Good Omens on Amazon Prime that they offer a language option “Original + Dialogue Boost”.

      It works wonders. It almost feels like back in the day, when TV shows wanted dialogue to be understood.

      • Send_me_nude_girls@feddit.de · ↑14 · 1 year ago

        Sure, microphones got better, but there’s more to it. One huge factor is that the mixing is done for cinemas and not for home theaters, or worse, TV speakers.

        • AggressivelyPassive@feddit.de · ↑4 · 1 year ago

          No, the video actually goes into that. Directors think mumbled dialogue is “more real”, but they seem to take that to mean “more mumble = more good”.

          • CeruleanRuin@lemmy.world · ↑2 · 1 year ago (edited)

            It’s a combination of both. Studios will typically mix the final result for the highest-end sound setups, which most people don’t actually have. If you’re lucky enough to have a full surround setup with the ability to properly dial in the equalizer and other settings, you probably won’t have trouble hearing the dialogue even when it’s mumbled. But on conventional TV speakers it can easily get lost in the mix.
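
            For a rough feel of what a “Dialogue Boost” track does (a toy sketch, not Amazon’s or anyone’s actual processing; channel names and coefficients are just the common textbook downmix values): fold 5.1 down to stereo and raise the centre channel, which carries most of the dialogue.

            ```python
            # Toy 5.1 -> stereo downmix with an optional centre-channel (dialogue) boost.
            import numpy as np

            def downmix_51_to_stereo(fl, fr, c, lfe, sl, sr, dialogue_gain=1.0):
                """Each argument is a 1-D float array of samples for one 5.1 channel."""
                centre = c * 0.707 * dialogue_gain    # centre folded in at ~-3 dB, optionally boosted
                left = fl + centre + 0.707 * sl
                right = fr + centre + 0.707 * sr      # the LFE channel is simply dropped here
                peak = max(np.max(np.abs(left)), np.max(np.abs(right)), 1.0)
                return left / peak, right / peak      # normalise so the boost can't clip

            # dialogue_gain > 1.0 pushes the centre channel up in the mix, which is
            # roughly the effect a "Dialogue Boost" track has on plain TV speakers.
            ```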

      • barnsbauer@lemmy.world · ↑6 · 1 year ago

        This video was exactly what first came to mind when I read “badly understandable dialogues”! It bothers me that as mics got better, actors became less intelligible rather than the other way around, as you’d expect.

    • drz@lemmy.ca · ↑10 · 1 year ago

      I think most people have given up and just leave subtitles on all the time.

    • Kiosade@lemmy.ca · ↑2 · 1 year ago

      I hear this all the time, and maybe I just don’t watch THAT many shows/movies, but I haven’t come across anything where the actors sound like they’re mumbling. Do you have a few examples I could look up?

    • CeruleanRuin@lemmy.world · ↑1 · 1 year ago

      I’ve used subtitles for most of my adult life, ever since having kids. First it was so I could watch without waking the baby, and then it was so I could follow along over all the noise in the house. And I never went back. So as sound mixing changed and got muddier, I guess I didn’t notice, because I was already used to not being able to hear half the dialogue anyway.

  • 🇨🅾️🇰🅰️N🇪@lemmy.world · ↑21 · 1 year ago

    When I was a kid I used to think black and white meant the TV show or whatever used to be in color, but it turned black and white as it got old. My thought process was that they changed color just like old people’s hair turns grey… This was 35 years ago, before the internet.

  • balance_sheet@lemmy.world · ↑18 · 1 year ago

    I was a private tutor a few years ago, teaching a 16-year-old. I was 22.

    I still can’t forget his face, looking at me like I was a living fossil as I talked about how crazy it was to get a touchscreen phone for the first time…

    • Grimlo9ic@kbin.social · ↑6 · 1 year ago

      That’s such a trip. Only a six-year difference between the two of you, yet you experienced the dawn of something and they didn’t, and it shapes both of your perspectives so much.

      Even though it technically applies to transistors, Moore’s Law has been a good barometer for the growth in complexity and capability of technology in general. And now, because of your comment, I’m thinking that since that law seems to be nearing the end of its run, either tech will stagnate in the next decade (possible, but I think unlikely), or we’re due for another leapfrog into a higher level of sophistication (more likely).

  • NuPNuA@lemm.ee · ↑18 · 1 year ago

    Even early 16:9 stuff looks pretty dated now if it hasn’t been remastered to 1080p/4K.

  • PhiAU@lemmy.world · ↑12 · 1 year ago

    Re-watching Buffy the Vampire Slayer with my kids in the new hi-def version, and you can easily spot the stunt doubles now, and the SFX look really dated when you can see them clearly.

    It’s amazing what old CRTs would let you get away with.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · ↑9 · 1 year ago (edited)

    A lot of old shows get reformatted just to get the wider frame, since they were still filmed at higher res for movies, or just because. So if something is still only in 4:3, it’s not just an indication of age; it’s an indication of thrift, or of a general lack of giving a shit about the future.

  • rm_dash_r_star@lemm.ee · ↑5 · 1 year ago

    You can always tell when a show is in 4:3. Recently I’ve noticed some modern TV shows adopting the theatrical aspect ratios of flat (1.85:1) or scope (2.4:1), which I think is pretty cool. The last episode of Strange New Worlds I watched was in scope; that’s some high-end filming.
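
    Part of why it’s so easy to tell is just geometry; here’s the quick arithmetic (a small sketch, nothing more) for how much of a 16:9 screen turns into black bars at each ratio:

    ```python
    # How much of a 16:9 screen ends up as black bars for a given content ratio.
    def bar_fraction(content_ratio, screen_ratio=16 / 9):
        if content_ratio >= screen_ratio:            # wider than the screen -> letterbox
            return 1 - screen_ratio / content_ratio
        return 1 - content_ratio / screen_ratio      # narrower -> pillarbox

    for name, ratio in [("4:3", 4 / 3), ("flat 1.85:1", 1.85), ("scope 2.4:1", 2.4)]:
        print(f"{name}: about {bar_fraction(ratio):.0%} of the screen is bars")
    ```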

  • some_guy@lemmy.sdf.org · ↑4 · 1 year ago

    I identified them by awkward haircuts and clothing styles. I knew something was off / wrong, but it wasn’t until adulthood that I was able to piece it together.

  • perviouslyiner@lemm.ee · ↑4 · 1 year ago

    Asteroid City switched between aspect ratios, as well as between black & white and color, as it swapped between the TV story and the ‘real’/cinema story.