Can’t take the heat anymore? Here’s how cooling technology has evolved to solve that

Updated on 03-Jun-2020
HIGHLIGHTS

This summer, find out more about the technology that helps keep you and your things cool even when it's boiling outside

Summer is almost here. In fact, many of you might already be complaining about the sweltering heat. With it begins an annual pastime of looking up air conditioner prices, stocking cold water in the refrigerator, ordering ice cream and finding ways to keep your vehicle from overheating, especially on long drives. All of these comforts are powered by multiple types of cooling technology that have evolved over the years into their current state. While the summer makes its presence felt, let’s take a journey through the history of cooling technology.

Since ancient times

Home cooling is quite an old practice. Dating back to Ancient Rome, the architectural use of aqueducts to carry heat away from buildings was one of the first ways it was done. However, the earliest known form of refrigeration predates even that: the harvesting of ice and snow for ice cellars and cold storage, mainly to cool beverages, goes as far back as 1000 BC in China and was also seen in Ancient Rome, Greece and Egypt.

Aqueducts were used to carry heat away in Ancient Rome

Fans, curtains, evaporative cooling and architecture played a major role in keeping houses cool until large-scale air conditioning efforts were reintroduced in the West in the 1800s. Wind towers strategically designed to channel breeze into buildings can still be seen in Middle Eastern architecture. Ostentatious methods that consumed large amounts of ice aside, the next major push in air conditioning came with the spread of electricity: Tesla’s alternating current motors made oscillating fans possible.

Experimental artificial refrigeration, however, existed before that. As far back as 1755, William Cullen had designed a small refrigerating machine that boiled diethyl ether under a partial vacuum. Around 1758, Benjamin Franklin and John Hadley confirmed that the evaporation of highly volatile liquids, such as alcohol and ether, could be used to drive the temperature of an object below the freezing point of water. Subsequently, many tried their hand at vapour-compression refrigeration, but none found commercial success until James Harrison in the 1850s. Even then, household refrigeration relied on iceboxes until as late as the early 20th century.

Benjamin Franklin worked on evaporative cooling

The 20th century

In 1902, Willis Carrier, a 25-year-old engineer from New York, invented the first modern air conditioning system. It sent air through water-cooled coils and was aimed at controlling humidity at a printing plant in Brooklyn. One of the first private homes fitted with central air conditioning goes as far back as 1914, in Minneapolis. Carrier followed up his work in 1922 with the centrifugal chiller, which used a central compressor to bring down the size of the unit. Since its introduction at a movie theatre in 1925, people have been flocking to cinemas for the air conditioning as much as for the films. In 1933, the Carrier Air Conditioning Company of America developed an air conditioner using a belt-driven condensing unit, an associated blower, mechanical controls and an evaporator coil; this device became the model for the growing U.S. market for air-cooling systems. In 1945, Robert Sherman of Lynn, Massachusetts invented a portable, in-window air conditioner that cooled, heated, humidified, dehumidified and filtered the air.

Willis Carrier invented the modern air conditioning system back in 1902

General Electric had developed a gas-powered refrigeration unit in 1911. Using gas removed the need for an electric compressor and made the unit smaller. GE also made an electric version in 1927, called the Monitor Top. In 1930, Freon was introduced by Frigidaire, a major competitor of GE. Based on chlorofluorocarbons (CFCs), it made refrigeration safe and accessible to the general populace, and led to refrigerators that were smaller and lighter. The average price also dropped from $275 to $154, bringing refrigeration within reach of most American households of the time. It was only in the 1970s that CFCs were discovered to be harmful to the ozone layer.

By 1957, the first rotary compressor had been developed, making air conditioning units smaller and more efficient. The introduction of heat pumps in the 1970s led to devices that could cool during the summer as well as heat during the winter. In 1987, the Montreal Protocol was signed to protect the earth’s ozone layer, establishing international cooperation on the phase-out of ozone-depleting substances, including the chlorofluorocarbon (CFC) refrigerants used in HVAC equipment. By 1995, CFC manufacturing had ended in the USA.

A new coolant

Back when refrigerators and air conditioning units ran on ammonia, methyl chloride or propane, their toxicity and flammability posed a constant danger of fatal accidents. Understandably, Freon was quite a revolution when it was invented in the late 1920s. Multiple types of CFC-based coolants have been used since then, such as R-11, R-12, R-22 and R-134a. The number in each name encodes the molecular composition of the compound. The variant most commonly used in homes is chlorodifluoromethane (R-22).
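
To make that naming convention concrete, here is a minimal sketch of the standard R-numbering rule for simple methane- and ethane-series refrigerants: the digits encode the molecule as (carbons − 1)(hydrogens + 1)(fluorines), with any remaining bonds filled by chlorine. The helper name refrigerant_number below is our own illustration, not a library function.

```python
# A quick sketch of the R-numbering rule for simple methane- and
# ethane-series refrigerants: R-(C-1)(H+1)(F), where a leading zero
# is dropped and the molecule's remaining bonds are filled by chlorine.
# refrigerant_number is an illustrative helper, not a standard API.

def refrigerant_number(carbons: int, hydrogens: int, fluorines: int) -> str:
    """Return the R-number for a simple halocarbon refrigerant."""
    digits = f"{carbons - 1}{hydrogens + 1}{fluorines}"
    return "R-" + digits.lstrip("0")  # R-022 is written as R-22

# Chlorodifluoromethane (CHClF2): 1 carbon, 1 hydrogen, 2 fluorines
print(refrigerant_number(1, 1, 2))  # R-22
# Dichlorodifluoromethane (CCl2F2): 1 carbon, 0 hydrogens, 2 fluorines
print(refrigerant_number(1, 0, 2))  # R-12
# Tetrafluoroethane (C2H2F4); the "a" in R-134a marks a specific isomer
print(refrigerant_number(2, 2, 4))  # R-134
```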

R-134a became the coolant of choice because of R-12's ozone-depleting nature

Most designs switched over to R-134a due to the ozone-depleting potential of R-12. For a long time, many companies denied that an alternative to HCFCs and HFCs was possible. In 1992, the environmental organization Greenpeace funded a former East German refrigerator company to research an alternative ozone- and climate-safe refrigerant. The company came up with a hydrocarbon mix of isopentane and isobutane but could not patent it as part of its agreement with Greenpeace, so the technology could be freely adopted by other firms. Subsequent activist marketing led to the adoption of the new product in most parts of the world.

Automotive cooling

Prior to World War II, water was the primary cooling agent used in car radiators. However, the development of high-powered engines, particularly for military aircraft, led to instances of the water boiling and damaging the engine. Antifreeze, which was used in winters to keep engines from freezing over, was also effective in raising the boiling point of water and was put to use for this purpose. Over time, other additives were mixed into the water and antifreeze, such as corrosion inhibitors, which extend the life of a metal or alloy by decreasing the rate at which it corrodes, preventing the coolant from damaging the engine.

Ethylene glycol was widely used as antifreeze during World War II. In addition to mixing readily with water in any proportion, it has a lower freezing point and a higher boiling point than water; a common 50/50 glycol-water mix, for instance, stays liquid down to roughly -37°C and boils near 106°C at atmospheric pressure. After the war, it remained the dominant antifreeze technology for nearly the rest of the 20th century. The future might see glycerol taking over as antifreeze, as technological advances are making it cheaper to produce and it is non-toxic.
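
For readers curious why a dissolved additive lowers the freezing point at all, the classic colligative relation ΔTf = Kf × m gives a rough feel for the effect. The short sketch below applies it to a dilute ethylene glycol solution; the function name is our own, and real 50/50 automotive blends deviate from this idealized model, so treat the output as indicative only.

```python
# Idealized colligative estimate of freezing-point depression:
# Delta_Tf = Kf * molality. Real glycol-water blends at high
# concentrations deviate from this simple model; illustration only.

KF_WATER = 1.86            # cryoscopic constant of water, degC*kg/mol
GLYCOL_MOLAR_MASS = 62.07  # ethylene glycol (C2H6O2), g/mol

def freezing_point_drop(glycol_grams: float, water_kg: float) -> float:
    """Estimated freezing-point depression in degC for a dilute mix."""
    molality = (glycol_grams / GLYCOL_MOLAR_MASS) / water_kg
    return KF_WATER * molality

# ~100 g of glycol per kilogram of water lowers freezing by ~3 degC
print(f"{freezing_point_drop(100, 1.0):.1f} degC")  # prints: 3.0 degC
```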
