The Interesting History of the Kilowatt-Hour

By far the most commonly used unit for billing electricity around the world today is the kilowatt-hour. Surprisingly, however, very few people actually know what gave rise to this unit of measure. In this blog, we’ll be exploring the interesting history of the kilowatt-hour, from its inception in the late Victorian era to its spread as the near-universal basis of electricity metering.

What is a kilowatt-hour?

Before we get started, it helps to have a little science lesson on what the term “kilowatt-hour” actually means. Many people are confused by the different units of measure used for electricity: the volt, the ampere, the kilowatt, and the kilowatt-hour.

Distinguishing between them is easy with the help of an analogy. Think of electricity as water flowing through a pipe. The ampere corresponds to the rate at which water flows through the pipe, and the volt to the water pressure within it. The kilowatt is then the amount of power the water delivers (flow rate × pressure, or amps × volts, with one kilowatt equal to 1,000 watts). A kilowatt-hour is simply one kilowatt of power sustained for one hour.
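To make the arithmetic concrete, here is a minimal sketch in Python. The function name and the appliance figures are purely illustrative, not drawn from any real tariff or device:

```python
def energy_kwh(power_watts: float, hours: float) -> float:
    """Energy in kilowatt-hours: power (in kW) multiplied by time (in hours)."""
    return (power_watts / 1000) * hours

# A 2,000 W heater running for 3 hours consumes 6 kWh.
print(energy_kwh(2000, 3))  # 6.0

# Ten 60 W lamps lit for 5 hours consume 3 kWh in total.
print(energy_kwh(10 * 60, 5))  # 3.0
```

This is exactly the quantity a modern electricity meter accumulates on its dials or display.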

The DC Era

The need for metering first emerged in the 1880s, as the commercial use of electricity grew. Initially, energy providers billed their customers a fixed amount per month based on the number of electric lamps they had installed. This proved inefficient and difficult to scale. Inspiration was found in existing gas meters, and soon a number of experimental metering systems were being invented for electricity billing.

Among the first of these was a device developed by the famous American inventor and businessman Thomas Edison. It used an electrolytic cell to measure electricity: a small current would continually pass through the cell and sustain an electrochemical reaction, depositing mass on its plates. At regular intervals, the plates would be removed and weighed to measure the customer’s consumption. Edison’s meter proved too impractical to be well received by the public, and attention turned to other inventors for a better solution.

Across the pond, in the United Kingdom, a better metering device was conceived. Called the ‘Reason’ meter, it consisted of a vertically mounted glass tube with a mercury reservoir at the top. As current was drawn from the supply, electrochemical action slowly transferred the mercury to a column at the bottom of the tube. Once the mercury had been completely transferred, the meter became an open circuit, and the customer had to pay the energy provider to unlock and invert the meter to restore supply.

However, these early electric meters measured consumption in ampere-hours and thus didn’t take changes in voltage into account. A new unit was needed to give a more accurate reading of energy consumption. At the Fifty-Second Congress of the British Association for the Advancement of Science, the watt was proposed as a unit of power, named after the famous 18th-century inventor James Watt.

The first accurate electric meter to measure in watts was invented by the German engineer Hermann Aron. It contained two mechanical clocks with coils around their pendulum bobs. When current flowed, one pendulum accelerated while the other slowed down. A differential mechanism measured the difference in speed between the two clocks and displayed the result on a series of dials, giving a reading of energy consumption.

The Switch to AC

Aron’s meter worked well on DC supply, but alternating current (AC) was steadily rising in popularity, and the search was on for a metering device to match. The famed Hungarian engineer Ottó Bláthy finally came up with a practical model, and a specimen was presented at the Frankfurt Fair in 1889; within the same year, his invention entered mass production. Also in 1889, in America, Elihu Thomson came up with a recording wattmeter based on a coreless commutator motor, which could read consumption on both DC and AC.

Five years later, in 1894, the American engineer Oliver Shallenberger gave the electric meter its modern form. An induction disk, whose rotational speed was proportional to the power being consumed, drove a gear train, which in turn drove a register mechanism that counted the revolutions on a series of dials. Although Shallenberger’s meter worked only on AC, it was far simpler and less costly than Thomson’s earlier design.

Even then, the kilowatt-hour was still not an official billing unit in many of the world’s jurisdictions. This began to change after the International Conference on Electric Units and Standards was held in London in 1908. The watt was adopted as the official standard unit of power, which in turn helped popularize the kilowatt-hour for measuring energy consumption in homes and offices.

Lower Your Energy Bills

Did you know that business energy companies price electricity according to customers’ consumption profiles? Businesses are not all charged the same rate for every kilowatt-hour consumed: rates can vary with the season, Time of Use (TOU), the amount of power consumed, and your peak demand. EnergyKillbill is here to help. Using our service, you can compare the rates charged by energy providers across the UK and find the lowest cost for business electricity. Switching energy providers with EnergyKillbill is quick and hassle-free, with no brokers or hidden fees involved, just simple, easy switching. For further information, visit our FAQs or call us on 0333 050 8419.
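As a rough illustration of how Time of Use pricing shapes a bill, the sketch below applies a hypothetical peak/off-peak tariff. The rates and consumption figures are made up for demonstration only and are not real supplier prices or EnergyKillbill quotes:

```python
# Hypothetical Time of Use (TOU) tariff: these rates are
# illustrative only, not actual supplier prices.
PEAK_RATE = 0.34      # £ per kWh, e.g. during 16:00-19:00
OFF_PEAK_RATE = 0.18  # £ per kWh, all other hours

def monthly_bill(peak_kwh: float, off_peak_kwh: float) -> float:
    """Total cost in pounds for a month's consumption, split by period."""
    return peak_kwh * PEAK_RATE + off_peak_kwh * OFF_PEAK_RATE

# A business using 300 kWh at peak and 700 kWh off-peak:
print(f"£{monthly_bill(300, 700):.2f}")  # £228.00
```

Shifting consumption out of the peak window, or comparing suppliers whose peak windows suit your operating hours, is where the savings come from.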
