K = 273.15 + 5(F - 32)/9 is even worse
Why use kelvin when R = F + 459.67?
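For anyone who wants to sanity-check those formulas, here's a minimal Python sketch of the standard conversions (the function names are just illustrative, nothing official):

```python
def f_to_c(f):
    """Fahrenheit to Celsius: C = 5/9 * (F - 32)."""
    return (f - 32) * 5 / 9

def f_to_k(f):
    """Fahrenheit to kelvin: K = 5/9 * (F - 32) + 273.15."""
    return f_to_c(f) + 273.15

def f_to_r(f):
    """Fahrenheit to Rankine: R = F + 459.67."""
    return f + 459.67

print(f_to_k(98.6))  # ~310.15 K (normal body temperature)
print(f_to_r(32.0))  # ~491.67 °R (freezing point of water)
```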
Sure, if one were to use Fahrenheit as the go-to unit…
“How hot is it, honey?”
“Not a lot, just 298.15 K”
Yeah but C makes more sense. 0-10 is cold but not freezing, 10-20 is cool, 20-30 is warm, 30-40 is hot, 40+ is “you’re gonna die of heat exposure! Get inside, what are you doing?!” increasing in urgency with the number. If it’s in the negatives, it’s the same as the 40+ except “cold exposure”.
It makes more sense in terms of our perception. But from a science perspective, Klevin makes more sense, since you can't go lower than 0 K and negative temperature doesn't really make sense; it'd mean something like negative energy.
Who's Klevin?
A lord.
A mistake plus Klevin gets you home by 7
Negative absolute temperature is a thing. Lasers exhibit negative temperatures when active, i.e. the lasing medium has a negative temperature expressed in kelvin. Adding more energy doesn't increase its entropy; it just turns into more laser light. Any system whose accessible energy states are bounded above, so that entropy eventually decreases as energy is added, can have a negative thermodynamic temperature.
I had a suspicion there was going to be a response like this. Never heard of it but sounds very interesting.
I doubt I’ll properly understand it without a good YouTube video. I shall embark on a search
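For the curious, the usual one-line way to see it (a sketch, not a full derivation): thermodynamic temperature is defined from how entropy changes with energy,

```latex
% Microcanonical definition of temperature
\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{N,V}
% If the accessible energy levels are bounded above (e.g. a population-inverted
% lasing medium), S(E) eventually decreases as E grows, so
% \partial S / \partial E < 0 and therefore T < 0.
```

So "negative" here is about the slope of entropy versus energy, not about having less than zero energy.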
Yeah but from an everyday perspective you've basically got 273 or so units down there for no reason.
C++ is better ofc /s
Do you have a moment to talk about our lord and saviour, Rust?
NO SOLICITING!
F has that too. Below 0, f it's cold. Above 100, f it's hot. 0-25 winter sports baby, 25-50 bleh it's wet and nasty, 50-75 chef's kiss, 75-100 let's hit the beach.
I wouldn't say the negative side corresponds to the positive side like that. -20 is already a lot more dangerous than 40.
Depends where you live, I guess. I don’t really consider -20 to be dangerous but I live in Canada so I know how to dress for cold weather.
Canada is dry. Humidity is the other killing factor there.
I've always thought it's like: 0 and below is freezing, 10 is cold, 20 is OK, 30 is hot, 40+ is hell
Above 25 is hot, 30 is hell, 40+ is dead for me
In summer we normally hit 40+ (in some places even 45+) in Spain. I can confirm it's hellish.
Also 0 is freezing so it can’t snow unless the temperature is below zero.
It very well can, it just won’t stick around.
And 100 is boiling, so cooking is easier
298.15 - 273.15, so it's 25 °C? I'd argue that is a lot. But I may just be heat sensitive
Edit: fixed typo. Edit 2: fixed another typo. I gotta start proofreading before sending
VERY generally speaking, 20s are warm, 30s are hot. Humidity changes this a lot. And yes, personal sensitivity to heat plays a role. I live in a dry climate, and I feel rather comfortable until we're close to 30 °C. I remember reading something like the ideal room temperature for humans was around 20-22 °C.
For those using F, this is, more or less, the scale of C:
Below 0: freezing (0 °C being the freezing point of water, duh!)
0 to 10: cold (don't go out without a coat)
10s: cool (a sweatshirt or light coat may do)
20s: warm
30s: hot
40s: uncomfortably hot (stay in the shade and hydrate)
50s: you're dead (or you wish you were. Unsafe for humans)
Hot is still relative. Are you talking about soup, a cup of coffee/tea or outside temperature? People would probably answer differently in each instance.
Well… I said VERY generally speaking. And as I'm defining a gradient of temperatures (clearly 30 °C is not the same as 38 °C), I'm also defining a gradient of "hot" sensations, from feeling a bit of heat in your body to feeling like an oven. That's the thing with generalizations. I'm not trying to be precise here, just give a general idea to those that are not used to Celsius (I've seen the same being done with Fahrenheit and found it useful). Cheers.
Eh, as a weirdo who uses Celsius a lot but lives in Buffalo, NY…
-20s is cold. Coat, gloves, scarf, & hat. Long underwear. Not too much evaporation from the lake since it can freeze, so not much snow.
-10s is chilly. Coat, probably zip it up towards the lower end of the range. Decent chance of apocalyptic snow.
0-10s is cool. Wear a sweater.
10s is nice. Maybe consider long sleeves & pants if it gets a bit cooler.
20s is shorts & t-shirt weather.
30s is all AC, all the time. Uncomfortably hot not too far into the range.
40s is “the humidity is now so high the air is soup, filled with mosquitoes”.
Not a lot, to be honest.
Well into winter?!
Yes. It used to be around 27°C, but you know, climate change and stuff. The temperature in my hometown is around 35 °C.
That’s not a lot IMHO. It’s quite warm, but not hot.
I'm from the UK and part of the forgotten generation, so I was pretty much brought up on both systems. The "cheaper" way to do F from C is to double it and add thirty. It works reasonably enough. Of course the reverse is subtract thirty, then halve.
I’m definitely more native in deg C, but am fine with deg F too. Yes, I added deg to stop the coding jokes.
F = C + 32 + ‘ish’.
That’s not right, it should be F = 2C + 32 + ‘ish’
Yeah, SCIENCE, bitch!
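If anyone wants to see how far the "double it and add thirty" shortcut drifts from the exact F = 1.8C + 32, here's a quick Python sketch (purely illustrative):

```python
def c_to_f_exact(c):
    """Exact conversion: F = 1.8 * C + 32."""
    return 1.8 * c + 32

def c_to_f_shortcut(c):
    """The 'double it and add thirty' mental shortcut."""
    return 2 * c + 30

for c in (0, 10, 16, 25, 37):
    exact = c_to_f_exact(c)
    rough = c_to_f_shortcut(c)
    print(f"{c:>3} °C -> exact {exact:5.1f} °F, shortcut {rough:5.1f} °F, off by {rough - exact:+.1f}")
```

The shortcut is dead-on at 10 °C and drifts by about 0.2 °F per extra degree, which is why the comments below end up adding less than 30 at higher temperatures.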
For easier mental C -> F conversion, take (2C) - (2C/10) + 32, where C = temp in °C.
16 °C × 2 = 32
32/10 = 3.2
32 - 3.2 = 28.8
28.8+32 = 60.8, which is exactly correct.
Or if you just want a rough estimate for weather, 2C+(22 to 32) gets you close enough with easier mental math
16x2+30 = 62, just 1 degree off.
37x2+30 = 104 is like 6 degrees off, though. The further from zero you are, the smaller the number you should add.
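And for what it's worth, the (2C) - (2C/10) + 32 trick a few comments up isn't an approximation at all; subtracting a tenth of the doubled value is just another way of multiplying by 1.8:

```latex
2C - \frac{2C}{10} + 32 \;=\; 2C\left(1 - \tfrac{1}{10}\right) + 32 \;=\; 1.8\,C + 32 \;=\; \tfrac{9}{5}C + 32 \;=\; F
```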
Rankine gang rise up
You’re not supposed to memorize the formulas, you’re supposed to draw it out on a graph from which the formula is derived.
It’s literally just a basic linear graph like from Algebra class. You can just trace your finger along the line and find the right temperature, no Egyptian hieroglyphics required.
Fahrenheit is the way to go.
Taking this opportunity to argue why Fahrenheit over Celsius makes way more sense. Fahrenheit is essentially based on a scale of 0 to 100, which is a scale we use for most things in life. Yes we can go below 0 or above 100, but it’s such a perfect scale to understand exactly HOW hot or cold something is.
It’s 70 degrees outside? Wow, we’re at 70% heat. I bet that feels really nice and not too hot. Oh no it’s 100 degrees so we’re at 100% heat? Probably want to stay inside today. Water freezes once we get to 32%? Makes sense, that’s pretty cold. I’ll need to wear a winter coat. 0% outside? No way I want to go out in that.
In terms of understanding how hot or cold it is outside, looking at the temperature as a number out of 100% is a perfect way of understanding that and having a frame of reference.
Celsius is so dumb. A 70 degree day in summer sounds great. 70% heat, not too hot, not too cold, just right. But convert that to Celsius? It’s 21 degrees in the summer? What does that even mean? Stupid.
Also because of the way the math works, the scale for Celsius makes no sense. It’s 0 degrees out in the U.S., that’s -18 Celsius. But if it’s 100 in the U.S., that’s only 38 Celsius? What kind of stupid scale runs from -18 to 38? 0 to 100 is the way to go.
Imagine if test scores ran from -18 to 38. Would you support this nonsensical scale then?
To be clear, I’m on board with the metric system and I definitely don’t think the U.S. does everything right. But Celsius is trash.
Except that my winters are -10% temperature and my summers are 115% temperature.
With the current climate policy, soon the winters will be at -10 °C and the summers at 115 °C, but…
That just means it’s time to move
It means your temperatures are not compatible with human life. There is no confusion that 120% hot would be "really fucking hot".
Both of those temperatures mean you’re outside the human scale and should limit your time outside.
It's pretty unclear that I can go for a walk outside at 40 °C, but if I do that at 48 °C I might die.
Not unclear for someone familiar with the system
To a European like me, 90 '%' or 20 '%' human comfort would be very confusing. You could probably guess that one is hot and the other's not, but I'd have no point of reference until I convert it to Celsius. I think the numbers that someone grows up with will always make more sense, no matter what.
I mean we use scales from 0 to 100 in every field. -18 to 38 is a scale used absolutely nowhere.
But I agree, humans can get used to pretty much anything, and once they do, it's all they will prefer over the unknown.
So 0°F was defined as the freezing temperature of a solution of brine made from a mixture of water, ice, and ammonium chloride (why in the world?). Originally, 90°F was set as human body temperature, which was later changed first to 96°F, and now it’s about 98.6°F.
Celsius is just: 0°C is the freezing temperature of water, 100°C is the boiling temperature of water.
Nobody uses a scale between -18 and 38. People in countries using Celsius just learned as a child that body temperature is 38°C, that’s all. -18°C has no special meaning to us.
At 0°C outside it’s freezing (32°F). 10°C is quite cool (50°F), you’ll need a jacket. 20°C is a comfortable temperature for me, if it’s sunny (68°F). 30°C is getting rather warm (86°F). 40°C is hell outside, or a bad fever (104°F). To boil water, heat it to 100°C (212°F).
I get that this seems confusing at first when you’re used to completely different orientation points, but for people who are used to C, it’s very intuitive.
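A tiny Python sketch (mine, just for cross-checking) that reproduces those landmark conversions:

```python
def c_to_f(c):
    """Celsius to Fahrenheit: F = 9/5 * C + 32."""
    return c * 9 / 5 + 32

landmarks = {
    0: "water freezes",
    10: "quite cool, jacket weather",
    20: "comfortable if sunny",
    30: "getting rather warm",
    40: "hell outside, or a bad fever",
    100: "water boils",
}

for c, label in landmarks.items():
    print(f"{c:>3} °C = {c_to_f(c):5.0f} °F  ({label})")
```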
A Canadian would call 25 C a very hot day. As a Brazilian I’d be laughing and considering using warmer clothing.
Claiming that one scale works better for human perception of temperature is quite literally wrong, by definition, as your scale can’t account for how extremely subjective this is.
With Celsius, however, the subjectivity is gone: everybody knows what a fucking ice cube is, everybody has boiled water, everybody knows roughly how warm a body should be. It’s super easy.
Also, what the fuck are you talking about scales that “only go to 38” and comparing it to test scores? Celsius can go to temperatures in the -200 range and essentially infinitely up. A soldering iron is hotter than 38 C, I can guarantee you that.
I get where you're coming from, 'cause you're used to F, so comparing it to C naturally keeps you thinking of F as the baseline (it would be the same for me the other way around). But saying that C ain't on a 0-100 scale is just objectively wrong. At 0 C water freezes, at 100 C water boils. It's still a 0-100 scale, just based on something different. When it comes to metric vs imperial, it's an easy conclusion to me; when it comes to temperature, I think it's more nuanced. I don't have a better explanation of the difference other than "F is human focused" while "C is science focused" (I know, doesn't quite cover it, just the best I got)
But then again, it really comes down to what you’re used to feeling “right”. For example, I could make similar conversions from C to F instead, and get weird numbers (water freezes at 32, and boils at 212? That makes no sense)
Never had too strong of an opinion on what temperature unit to use, but I will say this. C is a lot closer to K (kelvin) which is what is used for science, while converting F to K is a mess. So the one benefit C gives is a slightly easier time to get into that
I do need to let the nerd in me get this out too: F used to also be defined by water freezing/boiling. Meaning that technically C is a 0-100 scale, while F is a 32-212 scale. (Nowadays they're both defined by K, making this point kinda irrelevant.)
But why do I care when Water Boils at Sea level? What am I to do with that knowledge in my day to day? The 0-100 is irrelevant to me.
I’m dead long before water boils. And I’m very uncomfortable below water freezing but it won’t kill me quickly.
Fahrenheit has one advantage here: You’re used to it. If you’re used to Celsius, you know that 25° is warm and 5° is cold and don’t give a shit about it not being a 0-100 scale for that particular use case.
The 0-100 thing is pretty much the only argument I’ve ever heard in favor of Fahrenheit btw. Again, if you’re used to one of them, that’s the one that will make the most sense.
Being used to Celsius has the advantage of automatically being used to Kelvin. For example, if you ever want to calculate anything to do with the energy required to heat something to a certain temperature, you will have a way better time with Kelvin. Being used to and measuring in Celsius helps a lot here.
But sure, I get that you’re used to Fahrenheit. It’s just that the whole world has decided to use Celsius. Honestly, for good reason.
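A concrete example of that last point (a rough sketch; the 4186 J/(kg·K) value for water's specific heat is my assumption, not something from the thread): heating energy depends only on the temperature difference, and a difference of 1 °C is exactly 1 K, so Celsius readings drop straight into the formula.

```python
# Energy to heat 1 kg of water from 20 °C to 100 °C: Q = m * c * dT.
m = 1.0        # kg of water
c = 4186.0     # J/(kg*K), approximate specific heat of water (assumed value)
dT = 100 - 20  # an 80 °C difference is exactly an 80 K difference
Q = m * c * dT
print(f"{Q / 1000:.0f} kJ")  # ~335 kJ

# Starting from Fahrenheit you'd first have to convert the endpoints
# (or scale the difference by 5/9) before the numbers mean anything in SI.
```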
Knowing when it's snowing and freezing outside is very helpful for places where that happens. Having that point at zero is nice.
I’m dead long before water boils
No sauna for you I guess
Why does this matter? I have no idea what 0% heat means, but I know what -18° or 38° mean. For me, 32% heat doesn't mean anything.
How hot it feels can change. In summer, 10°C feels cold, but in winter it feels hot
A lot more people use °C, so more people would have to relearn how temperature is measured
I don't think either of them is better than the other; it's just that you got used to which numbers mean it's hot or cool
You are entitled to your opinion and obviously have your own preference and your way of explaining the scale is very good as well, but saying that Celsius is objectively worse is just wrong.
The argument for Fahrenheit based on a perceived “0 to 100 scale” representing a percentage of heat can be critiqued as it misunderstands how temperature scales work. Temperature is not a percentage system; it’s a measure of thermal energy. The notion that 70 degrees Fahrenheit represents “70% heat” is not scientifically accurate as it implies that temperature is a linear scale capped at 100, which it is not. The Fahrenheit scale was actually based on arbitrary points: the freezing point of brine (0°F) and the average human body temperature (96°F at the time, which has since been adjusted to 98.6°F).
Celsius, on the other hand, is based on the freezing and boiling points of water at 0°C and 100°C respectively, under standard atmospheric conditions. This makes it a decimal and scientifically consistent system that is easier to relate to the states of water, an essential reference in science and daily life.
Comparing temperatures to percentages, like test scores, is a flawed analogy because temperature doesn't have an upper limit "score" and is not designed to be read as a proportion. The -18 to 38 range is just the 0-100 °F window re-expressed; the Celsius scale itself is anchored to the physical properties of water, which is logical for scientific purposes.
Moreover, many argue that Celsius is intuitive enough for everyday weather-related use outside of the U.S. (a one-degree change in Celsius is still quite noticeable) and aligns well with the metric system, which is used globally for scientific measurement.