Fahrenheit Gang or Celsius Clan?


"The lower defining point, 0 °F, was established as the freezing temperature of a solution of brine made from equal parts of ice, water and a salt (ammonium chloride). Further limits were established as the melting point of ice (32 °F) and his best estimate of the average human body temperature (96 °F, about 2.6 °F less than the modern value due to a later redefinition of the scale)."

Simplicity at its best: choose three different references for one scale.

Team Kelvin / Celsius. SI always wins ;)

51 minutes ago, Romain_L said:

"The lower defining point, 0 °F, was established as the freezing temperature of a solution of brine made from equal parts of ice, water and a salt (ammonium chloride). Further limits were established as the melting point of ice (32 °F) and his best estimate of the average human body temperature (96 °F, about 2.6 °F less than the modern value due to a later redefinition of the scale)."

Simplicity at its best: choose three different references for one scale.

Team Kelvin / Celsius. SI always wins ;)

That way of defining things is seriously messed up. It's almost as if this person did not want ordinary people to be able to verify it. Unless ammonium chloride was a common household item back then? Now, I get that defining physical measures was a difficult task, but the not-so-well-done attempts should be dropped when better options become available, and most of the human race did just that. I also understand that it is hard to let go of the units you were brought up with.

14 minutes ago, Gurgel said:

That way of defining things is seriously messed up. It's almost as if this person did not want ordinary people to be able to verify it. Unless ammonium chloride was a common household item back then? Now, I get that defining physical measures was a difficult task, but the not-so-well-done attempts should be dropped when better options become available, and most of the human race did just that. I also understand that it is hard to let go of the units you were brought up with.

The Fahrenheit scale was proposed in 1724. I don't think scientists back then were concerned with what was available in the everyday household, considering most people were dirt poor.

 

4 hours ago, NurdRage said:

While not a Fahrenheit-or-Celsius issue itself, NASA did once lose an expensive Mars orbiter because imperial and metric units got mixed up.

https://en.wikipedia.org/wiki/Mars_Climate_Orbiter

Bottom line, stick to something consistent.

That one was a true highlight of stupidity and incompetence. Sometimes I ask myself how people who screw these things up get onto NASA teams. On the other hand, the clock issue on the recent Boeing space screw-up was also pretty impressive.

On 12/30/2019 at 8:28 PM, auron471 said:

Which one are you?

Fahrenheit Forever!

Fahrenheit is great for "people-friendly" answers, as most human-acceptable temperatures are between 0 and 100, with the most comfortable being around 70 to 80.  Easy -- especially if, like me, you're in a scholastically challenged country such as the USA.

Celsius is great for easy sciency stuff.  Water freezes at 0 and boils at 100.  Nice and easy -- unless you start talking about other room-temperature liquids.  They don't get such special treatment; methyl alcohol, for example, boils at 64.7 °C.  However, since you know that water boils at 100 °C, it's very easy to say, "Oh, methyl alcohol boils at a lower temperature than water does."

Kelvin is the real temperature system.  It starts at 0, the coldest anything can get, and only goes up from there.  The complete absence of thermal energy is, by definition, 0 kelvin (the kelvin takes no "degrees").  Each step is "spaced" the same as on the Celsius scale, so if you're familiar with that, it kinda makes sense.  Except that now water freezes at 273.15, making the average person do some mental gymnastics to figure out what the heck you're talking about.  Then again, it makes it possible to talk about the temperature of the dark side of Neptune and the center of the sun using the same number system without those pesky negative signs showing up to confuse your math.  So.. bonus?
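
Those fixed offsets are easy to check; a minimal Python sketch (the function names are my own):

```python
def c_to_k(celsius):
    """Celsius to kelvin: same step size, shifted by 273.15."""
    return celsius + 273.15

def c_to_f(celsius):
    """Celsius to Fahrenheit: 1.8 F degrees per C degree, offset by 32."""
    return celsius * 9 / 5 + 32

print(c_to_k(0))        # water freezes at 273.15 K
print(c_to_f(100))      # water boils at 212.0 F
print(c_to_k(-273.15))  # absolute zero: 0.0 K
```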

Anyway, to answer the OP...  I suppose that when talking about people-related temperatures, I prefer Fahrenheit.  When doing math that uses temperatures, I generally end up using Celsius -- for example, thermal energy calculations in ONI.  That way, when I share my equations, the average sciency individual understands what is going on.

21 hours ago, NurdRage said:

Bottom line, stick to something consistent.

And that's why I use Celsius for temperature math or any time I'm writing papers.  When I speak with people about "normal" temperatures, I use Fahrenheit.

What's the advantage of "60-70 °F is fine for humans" over "15-21 °C is fine for humans"? Right: there is none.

If you learn it the "right way" from the beginning, it does not matter at all, and everything else is easier later. Nearly everyone uses metric, because it's international and easy to compare. What happens when everyone uses metric but Lockheed Martin uses imperial? We saw that when NASA lost the Mars Climate Orbiter. Every unit was supposed to be SI, but L.M. didn't use them, because AMERICA! A multi-million-dollar scrap mission. Yeah!

In fact, Fahrenheit does not make any sense at all unless you talk to people from the US (and a few other countries). Over here in Europe, it's utterly useless to even know about it. Nobody uses it, nobody needs it, nobody even wants it.

2 hours ago, KittenIsAGeek said:

Fahrenheit is great for "people-friendly" answers, as most human-acceptable temperatures are between 0 and 100, with the most comfortable being around 70 to 80.  Easy -- especially if, like me, you're in a scholastically challenged country such as the USA.

Actually, no. I am in Europe, and nobody has any issue with Celsius, regardless of education or smarts. Sure, people with lower levels of education cannot actually do _calculations_ with it, but nobody has any issues judging whether a specific temperature is comfortable, cold or hot.

I agree on Kelvin, but there is a reason SI accepts Celsius as a unit. For most purposes, Celsius is fine in scientific or engineering calculations.

21 hours ago, Gurgel said:

Actually, no. I am in Europe, and nobody has any issue with Celsius, regardless of education or smarts. Sure, people with lower levels of education cannot actually do _calculations_ with it, but nobody has any issues judging whether a specific temperature is comfortable, cold or hot.

Allow me to elaborate just a bit on my statement.  First off, both temperature scales (Celsius and Fahrenheit) were developed in the northern hemisphere, north of the tropics and south of the arctic.  So let's use that as our example.

Zero degrees C is 32 degrees Fahrenheit.  Winter temperatures between the arctic circle and the tropic of cancer generally drop below this point, or we wouldn't have snow.  They sometimes drop below 0 degrees Fahrenheit as well, but generally only in the high mountains or the far north.  So let's choose 0 °F (about -18 °C) as our 'low' point for generally habitable temperatures.  In the same region during the summer, temperatures can rise to around 100 °F.  Certainly there are warmer temperatures, but unless you're in Death Valley, higher readings are outliers that don't last very long.  So let's set 100 °F (about 38 °C) as our 'high' point for generally habitable temperatures.

This gives us about 100 degrees to play with when dealing with the "normal" temperatures humans face on a daily basis.  Meanwhile, in Celsius, we're using only about 56 degrees.  The average person deals in whole numbers rather than fractions, so Fahrenheit gives more resolution over the same temperature range.  Additionally, the average individual using Fahrenheit will only rarely deal with negative numbers, whereas about a third of the range in Celsius is negative.
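
The range arithmetic above is easy to verify; a quick Python sketch (the helper name is mine):

```python
def f_to_c(fahrenheit):
    """Standard Fahrenheit-to-Celsius conversion."""
    return (fahrenheit - 32) * 5 / 9

low_c = f_to_c(0)     # about -17.8
high_c = f_to_c(100)  # about 37.8
print(round(low_c), round(high_c))  # -18 38
print(round(high_c - low_c))        # ~56 C degrees for a 100 F-degree span
```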

I'm not saying that it's confusing for people.  Clearly, individuals living in countries that have adopted the Celsius scale as standard have no problem with it.  What I _am_ saying is that Fahrenheit was specifically designed to give a wide range of non-negative, easily usable numbers over the temperatures humans generally deal with daily.  That is one reason why it was so widespread outside of academic circles until a decade or two ago.

Celsius, on the other hand, was designed to use the properties of water at standard pressure as the basis of the scale.  You have a 100-degree range between freezing and boiling.  There is absolutely nothing wrong with this at all.  This particular scale is nice because it is easy to calibrate and test your equipment: if your pot of water is boiling and your thermometer says something other than 100 °C, you clearly have a problem.  Meanwhile, in Fahrenheit, you have to remember the number 212.  Again, there is nothing wrong with this, and obviously countries like the USA have no trouble with the number 212.  However, this shows the design differences that went into developing the two temperature systems.  One favors human habitability, the other scientific inquiry.

Finally.. the primary reason I use Celsius over Fahrenheit when doing thermal energy math is simply that the SI standardization chose Celsius.  If I were to use Fahrenheit, I would have to convert temperatures before doing math on them, then convert them back to compare with other results.  It's a lot of extra work.  The reason I prefer Fahrenheit over Celsius when talking with people is that, in my opinion, it's easier to say, "Goodness it's hot out today -- I bet it's over 100!" than to say "I bet it's over 38!"

On 1/5/2020 at 3:25 AM, KittenIsAGeek said:

The reason I prefer Fahrenheit over Celsius when talking with people is that, in my opinion, it's easier to say, "Goodness it's hot out today -- I bet it's over 100!" than to say "I bet it's over 38!"

Sure, historical units are kept alive because many people have some perception of them being "easier". If you grow up with SI, there really is no issue anywhere.

4 minutes ago, Gurgel said:

Sure, historical units are kept alive because many people have some perception of them being "easier". If you grow up with SI, there really is no issue anywhere.

If by "historical" you mean "both scales were developed at about the same time, but Celsius came second," then sure.  They both date to the 1700s, but Anders Celsius preferred a centigrade scale and used the freezing and boiling points of water to calibrate 0 and 100.  He developed his scale after Daniel Fahrenheit revealed how he calibrated his.  So .. sure.  Celsius is newer.  Barely.

19 minutes ago, SharraShimada said:

F&C may be close in time, but it's not just temperature that's weird in imperial measurement. Even the users of imperial units are not united: 1 liter is 0.264172 US gallons, but 0.219969 British ones.

That's an entirely different (although related) topic.  Teaspoon, tablespoon, cup, pint, quart, gallon, barrel... Yeah, I once wrote a research paper where I intentionally put all my volumetric measurements in such units.  It got the reaction I was expecting, but it isn't something I'm going to repeat.  It's important to note that US volumetric units have the same names as Imperial units but are quite different: 1 pint is 16 fl oz in the US but 20 fl oz elsewhere.  This can be especially entertaining when you live on the border between Canada and the US.

I'm Canadian and I'm under 50, so Celsius is the only one I understand intuitively.

The idea of placing the point where rain turns to snow anywhere other than zero, in a system that has negative numbers and uses them for temperatures found in Earth's climate (so, not you, Kelvin), baffles and outrages me. I can see the advantage of the Fahrenheit scale's smaller degrees for measuring weather or other aspects of a human environment, like pool water, without a lot of decimal places; I just object to the location of the zero point.

10 hours ago, KittenIsAGeek said:

If by "historical" you mean "both scales were developed at about the same time, but Celsius came second," then sure.  They both date to the 1700s, but Anders Celsius preferred a centigrade scale and used the freezing and boiling points of water to calibrate 0 and 100.  He developed his scale after Daniel Fahrenheit revealed how he calibrated his.  So .. sure.  Celsius is newer.  Barely.

Celsius is a current SI unit (strictly speaking an "SI derived unit", putting it on the same level as Hz, W, V, and N, for example). That makes it very much non-historic, as SI was a complete revision and reform of the unit system. You could even argue that, with the 2019 redefinition of the base units, Celsius is now vintage 2019.

15 minutes ago, Gurgel said:

Celsius is a current SI unit. That makes it very much non-historic.

OK, if we're going to go that route... I was arguing that there is no need to call either scale historic, since both are still in use and both were developed at about the same time, meaning they are contemporary with each other.  Many units of measurement we use in the modern world date back into the annals of history.  Our concept of hours, minutes, and seconds for time has been found in ancient Babylonian writings.  Angle measurements in degrees, minutes, and seconds as we use them today have been found in ancient Sumerian writings.  I don't think anyone would call either method "historic," despite both dating to the beginnings of recorded history.

That said... Kelvin is the SI unit for temperature, not Celsius.  Since the centigrade scale is shared between the two, conversion is a simple addition or subtraction, so they often are conflated.  But only kelvin is listed in the SI tables.

Celsius originally proposed his scale with 100 as the freezing point and 0 as the boiling point, for the exact reason of avoiding negative numbers in weather and day-to-day life.  That being said, I don't buy that Fahrenheit is more intuitive for day-to-day life.  If standard room temperature were some nicely centered number like 50, 0, or 100, then okay, that would make sense as a human-centric value.  It isn't.  Standard room temperature is around 70 °F, so we've already got a pretty arbitrary value as our baseline.

There isn't anything particularly special about 0 °F or 100 °F over 20 °F or 90 °F; both pairs are uncomfortably cold and hot for most people, respectively.  Outright dangerous temperatures are around -18 °F and 106 °F.  Regardless of which system you use, you'll have to memorize the breakpoints for clothing and danger.  Celsius's two major intuitive values are ones we use in day-to-day life, for telling when it will snow and when our cooking boils.  These aren't huge advantages, but they are pretty concrete values.

24 minutes ago, KittenIsAGeek said:

That said... Kelvin is the SI unit for temperature, not Celsius.  Since the centigrade scale is shared between the two, conversion is a simple addition or subtraction, so they often are conflated.  But only kelvin is listed in the SI tables.

Nope. Celsius is an "SI derived" unit for which SI defines "°C" as the unit symbol. "SI derived units" are SI units. They are not "SI base units", but they are SI units.

Here is an overview: https://en.wikipedia.org/wiki/International_System_of_Units

On 12/31/2019 at 11:35 AM, Gurgel said:

Some holdouts stick to historical units because they believe they are somehow better

Some non-SI units are convenient for particular purposes. Degrees of angle remain in common use in spite of the SI radian, for example, likely because 360 = 2³·3²·5 has lots of small divisors.
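
That factorization is why 360 splits so cleanly; a quick brute-force check in Python:

```python
# 360 = 2**3 * 3**2 * 5, so it has (3+1) * (2+1) * (1+1) = 24 divisors
assert 2**3 * 3**2 * 5 == 360

divisors = [d for d in range(1, 361) if 360 % d == 0]
print(len(divisors))  # 24
print(divisors[:12])  # [1, 2, 3, 4, 5, 6, 8, 9, 10, 12, 15, 18]
```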

So where's the Rankine temperature scale option? Kelvin is absolute Celsius, Rankine is absolute Fahrenheit.

10 minutes ago, Derringer said:

Some non-SI units are convenient for particular purposes. Degrees of angle remain in common use in spite of the SI radian, for example, likely because 360 = 2³·3²·5 has lots of small divisors.

True. And you can do degrees in your head, while radians require some calculation aid if you want reasonable accuracy.

@KittenIsAGeek: Look, I have absolutely no problem with your preferences, and you seem to have the use of these units well in hand. It is just that things significantly more complicated than needed (i.e. using two different units for temperature without very good reason) offend me on some deep level, and sometimes that shows through. I think I may just be too much of an engineer (and _all_ good engineers deeply believe in KISS) to see this in a more relaxed way.

12 hours ago, Gurgel said:

True. And you can do degrees in your head, while radians require some calculation aid if you want reasonable accuracy.

@KittenIsAGeek: Look, I have absolutely no problem with your preferences, and you seem to have the use of these units well in hand. It is just that things significantly more complicated than needed (i.e. using two different units for temperature without very good reason) offend me on some deep level, and sometimes that shows through. I think I may just be too much of an engineer (and _all_ good engineers deeply believe in KISS) to see this in a more relaxed way.

I get where you're coming from.  I'm an engineer as well, but my personal paradigm is "use what makes sense for the situation."  So when I'm doing thermal energy calculations, I use Celsius (or kelvin), because it makes sense: I don't have to do lots of conversions or use extra lookup tables.  However, when dealing with humans in my area, Fahrenheit makes more sense.  It's what they're used to, and I can be more precise while still using only whole numbers: 20 °C is 68 °F, and 21 °C is 69.8 °F -- almost, but not quite, a 2-degree change in Fahrenheit.
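
Those two conversions are just the standard formula; a minimal Python check:

```python
def c_to_f(celsius):
    """Standard Celsius-to-Fahrenheit conversion."""
    return celsius * 9 / 5 + 32

print(c_to_f(20))  # 68.0
print(c_to_f(21))  # about 69.8, i.e. 1.8 F degrees higher
```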

I don't have a problem with you using only Celsius, but it would add extra complications to my life if I did so.  I would continually have to explain that "No, 20 °C isn't below freezing; it's close to room temperature" when someone asked me about the weather.  Keeping it simple, then, means using Fahrenheit when talking with people.

I have similar issues with meters and feet.  The foot, in my opinion, is a good unit for people to use.  My palm to my elbow is almost precisely one foot.  My fingertip to the first knuckle is almost exactly an inch, so I can do a lot of quick-and-dirty measurements with my arms and hands.  Miles, on the other hand, are somewhat clunky -- 5280 feet.. what?  However, when I'm doing precision design, I definitely prefer the metric system because the math is a lot cleaner: a tenth of a cm is a mm, and so on, whereas it takes 12 inches to make one foot.  As long as my units are clearly labeled, there's never any problem.  I don't have any body parts that come out to easy metric units, so "eyeballing" something is more difficult.  I can walk the length of a room and get a remarkably accurate distance in feet, but I can't do the same in meters because I have no easy reference for comparison.

And as long as I'm off topic.. has anyone else played around with base-12 number systems and noticed how incredibly easy most fractions become?  Why did we have to settle on the decimal system and turn simple fractions into sometimes-infinite digit strings?  =^.^=
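
For anyone curious, the base-12 effect is easy to demonstrate; a small Python sketch (the `expand` helper is mine):

```python
from fractions import Fraction

def expand(value, base, max_digits=8):
    """Fractional digits of 0 < value < 1 in the given base; stops if it terminates."""
    digits = []
    value = Fraction(value)
    for _ in range(max_digits):
        value *= base
        d = int(value)
        digits.append(d)
        value -= d
        if value == 0:
            break
    return digits

print(expand(Fraction(1, 3), 10))  # [3, 3, 3, 3, 3, 3, 3, 3] -- never terminates
print(expand(Fraction(1, 3), 12))  # [4] -- 1/3 is exactly 0.4 in base 12
print(expand(Fraction(1, 6), 12))  # [2] -- 1/6 is exactly 0.2 in base 12
```

Because 12 has the prime factors 2 and 3, every fraction whose denominator is built from 2s and 3s terminates in base 12, while base 10 only handles 2s and 5s.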

4 minutes ago, KittenIsAGeek said:

And as long as I'm off topic.. has anyone else played around with base-12 number systems and noticed how incredibly easy most fractions become?  Why did we have to settle on the decimal system and turn simple fractions into sometimes-infinite digit strings?  =^.^=

Simple: Too few people with 12 fingers. If we were dupes, we would be using base 8.

Archived

This topic is now archived and is closed to further replies.
