• HelloThere@sh.itjust.works · 1 year ago (edited)

    This is literally all about what you’re used to.

    From my “UX” I know that:

    • < -10 = I can literally feel the heat being sucked from my bones
    • < 0 = fucking cold, big coat and gloves
    • < 10 = big coat, maybe gloves if you’re feeling soft
    • < 15 = light jacket, especially if sitting around a lot
    • < 20 = t-shirt weather
    • < 25 = t-shirt and a beer garden
    • < 30 = absolute murder, unless you’re on holiday when it’s great
    • 30+ = kill me now
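
    (Sketched as a quick threshold lookup, if you like - the cutoffs and labels are just the ones above, and the function name is made up:)

    ```python
    # Rough sketch of the bands above as a threshold table; the cutoffs and
    # labels are the subjective ones from this comment, function name made up.
    def what_to_wear(temp_c: float) -> str:
        bands = [
            (-10, "heat being sucked from my bones"),
            (0, "fucking cold, big coat and gloves"),
            (10, "big coat, maybe gloves"),
            (15, "light jacket"),
            (20, "t-shirt weather"),
            (25, "t-shirt and a beer garden"),
            (30, "absolute murder"),
        ]
        for cutoff, label in bands:
            if temp_c < cutoff:
                return label
        return "kill me now"

    print(what_to_wear(21.1))  # "t-shirt and a beer garden" - that's 70 F
    ```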

    I have no idea what 70F is - because I’ve never used it.

    • Draces@lemm.ee · 1 year ago

      Except you do. It’s 70% of the way between really cold and really hot. You’re saying percentages are confusing? Come on.

      • HelloThere@sh.itjust.works · 1 year ago (edited)

        70% of what?! What I consider hot, living in the north of England, is very different to what someone in Spain, or Nigeria, would consider hot.

        Temperature isn’t volume; no one can conceptualise a 70% reduction in temperature, because that’s literally not how anyone, nor any scale other than Kelvin, treats it.

        You can’t, like, grab heat and go “oh yea, there’s less here”.

        Absolute clown shoes.

        Edit: typos, various shit.

        • Draces@lemm.ee · 1 year ago

          Of reasonable temperatures outside of extreme climates. Temperatures above 100 F enter dangerous zones, just like temperatures below 0 do. 70 F is then easy to understand as 70% of the way between the two (please don’t make me explain linear interpolation to you). It’s intuitive. 21.1 C’s relationship to boiling water is meaningless when I need to know if I need a coat or need to stay inside. Fahrenheit has a better use case here, and in a few other niche cases. Keep pretending I didn’t make a point and you’re not the clown here.
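
          (For the avoidance of doubt, a minimal sketch of the interpolation I mean - the 0 F and 100 F bounds are the ones above, and the function name is made up:)

          ```python
          # Linear interpolation: how far temp_f sits between low and high, as a fraction.
          # The 0 F / 100 F defaults are the bounds from this comment; name is illustrative.
          def percent_of_range(temp_f: float, low: float = 0.0, high: float = 100.0) -> float:
              return (temp_f - low) / (high - low)

          print(percent_of_range(70.0))  # 0.7, i.e. 70% of the way from really cold to really hot
          ```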

          • HelloThere@sh.itjust.works · 1 year ago

            Temperatures above 100 F enter dangerous zones, just like temperatures below 0 do.

            Except that’s not true; the deadly zones start earlier than that.

            A heat-period is defined as day(s) on which a Level 3 Heat Health Alert is issued and/or day(s) when the mean Central England Temperature is greater than 20°C; between June and August 2022, there were five heat-periods that met this criterion.

            https://www.ons.gov.uk/peoplepopulationandcommunity/birthsdeathsandmarriages/deaths/articles/excessmortalityduringheatperiods/englandandwales1juneto31august2022

            20 C is 68 F, 30 C is 86 F.
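
            (Those figures are just the standard conversion, F = C × 9/5 + 32; a quick sketch to check them, with a made-up function name:)

            ```python
            # Standard Celsius-to-Fahrenheit conversion: F = C * 9/5 + 32.
            def c_to_f(celsius: float) -> float:
                return celsius * 9 / 5 + 32

            print(c_to_f(20))  # 68.0
            print(c_to_f(30))  # 86.0
            ```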

            While the report does make clear that these were vulnerable people who were already expected to die soon - within weeks or maybe months - it was the heatwave that pushed them over the edge.

            In the UK, with our brick houses built to absorb and retain heat, and an absence of AC, average temps above 20 C/68 F do kill.

            Similarly, it’s reported that two thirds of the deaths in the 2021 Texas storm were due to hypothermia, in a state where houses are also built to shed heat. For the majority of the state, as seen in the article below, the temps were below 0 C but above 0 F. I also think it’s fair to suggest a good many of these people were likely already vulnerable.

            https://www.bbc.co.uk/news/world-us-canada-56095479

            I absolutely agree that temps of 0 F and below are even colder, and even more deadly, but to suggest that that is where it starts to be deadly is wrong.

            Ultimately, how humans experience and deal with temperature has nothing to do with the scale we use to measure it, but with what it is compared to what we are used to, and how prepared we are to protect ourselves against being “too hot” or “too cold”. It’s pretty much a perfect example of subjectivity.

            If you prefer to use F to C, or K, or any other method, then go for it. But to try and argue that either method is inherently better or superior based solely on subjectivity is a fool’s errand.

            Everything in metric is defined around distilled fresh water. The temp scale runs 0–100 between solid/ice and gas/steam, and because water is almost incompressible, weight, quantity and volume all interact as well (1 kg of water = 1 litre, 1 metre³ = 1000 kg = 1000 litres).

            Is that easier? I bake a lot, so not having to measure volume for water, and instead being able to use weight as a 1:1 conversion, sure makes things easier when hydrating mixtures - but my oven being at 200 C or ~400 F makes no practical difference. Again, it’s just what we’re used to.
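
            (A rough sketch of that 1:1 shortcut in baker’s-percentage terms - hydration as water weight over flour weight is the usual convention, and the function name is made up:)

            ```python
            # Baker's percentage: water mass as a fraction of flour mass.
            # Because 1 g of water is ~1 ml, the scale reading doubles as the volume.
            def water_for_hydration(flour_g: float, hydration_pct: float) -> float:
                return flour_g * hydration_pct / 100

            print(water_for_hydration(500, 70))  # 350.0 g of water, i.e. ~350 ml - no jug needed
            ```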

            That said, and I get why they were invented, using cups, and thus volume, for compressible ingredients like flour honestly makes no sense. But now we’re wildly off topic.

            • Draces@lemm.ee · 1 year ago

              Do you think that maybe, if you have to write an essay to prove Celsius is more user-friendly, it might not be?

              Why are you talking about baking? Obviously Celsius is better for baking.

              Temperatures sure seem to get more dangerous around 100 F than around 100 C. You’re being intentionally pedantic.

              You’re trying to use extreme weather events to pretend I said it never goes above 100 F, or that I denied exceptional weather can have exceptional effects. Could you be more disingenuous?

              And it’s overwhelmingly obvious you’re from England and haven’t experienced various climates. The US (ya know, the ones still using F) has a hell of a lot more varied climates, which can generally swing between 0 F in winter and 100 F in summer (I said generally. Don’t bother with another “acchtually” response).

              A normal person going to work doesn’t care when distilled water boils or freezes. Read my comment again. I’m talking about a niche situation outside of a scientific context. I am not, and never did, argue that F is better than C except for a very narrow set of use cases. I am 100% for the adoption of C in the States, but you need to understand the pros and cons and why people would be reluctant. Keep your head in the sand, I guess. I really don’t care to prove a very simple point to you if you still haven’t got it.

              • HelloThere@sh.itjust.works · 1 year ago

                Ultimately, how humans experience and deal with temperature has nothing to do with the scale we use to measure it, but with what it is compared to what we are used to, and how prepared we are to protect ourselves against being “too hot” or “too cold”. It’s pretty much a perfect example of subjectivity.

                If you prefer to use F to C, or K, or any other method, then go for it. But to try and argue that either method is inherently better or superior based solely on subjectivity is a fool’s errand.

                As covered in my essay.