
Why 230 V or 240 V?

The first person to use electricity to produce light was Sir Humphry Davy (1778 - 1829) who made an electric arc light in about 1802. (He needed more than a thousand cells (“batteries”) to make it work!) Arc lights are very bright indeed - more than a hundred and forty years later they were still being used, for example in anti-aircraft searchlights in the Second World War. But they need several thousand volts to make them work, they give out dangerous levels of ultraviolet light, and they are far too bright to be used in people’s homes.

In the 1880s Westinghouse's systems were being used to light New York streets, but several workers were killed by electric shocks; death by electric shock became known as being "Westinghoused", a name promoted by his rival Edison.

Edison was a very famous inventor, but he was also a very astute businessman, and he saw a need for a simple and safe way of using electricity to light ordinary homes. In about 1880 he made the first practical incandescent electric lamp, in which an electric current flowing in a thin wire inside a glass bulb gets so hot that it glows white-hot. It remained the most widely used form of home lighting until low-energy lamps such as CFLs and LEDs replaced it in the 21st century.

In the 1870s and 80s there were no plastics, and electric cables and equipment were not always properly insulated, so there were many accidents. Edison used 100 V (DC) for his light bulbs because he thought that this was the highest voltage that was safe to put into people's homes, and at that time he was probably right. Alternating current, developed by Westinghouse using Tesla's patents, came soon after, and within its first year Westinghouse's AC generators accounted for more than 30% of all electricity used in the USA. By this time improvements in the design and manufacture of electrical equipment and insulating materials meant that Westinghouse considered it safe to put 240 V into ordinary people's homes. Electrical safety is discussed on its own Page.

Over the following years some countries used 220 V, some 230 V and some 240 V. (What happened in the USA is described on the next Page.)

Equipment made before the 1970s often had a little knob at the back which you could adjust with a screwdriver to set the correct voltage for where you lived.

[Photograph: RevoxCP2.JPEG - the voltage selector on the back of a Revox unit]

At one time people thought that it was important to standardise voltage and frequency throughout the world. Today, however, adapters that allow you to use equipment on almost any voltage and frequency are so readily obtainable and so easy to use that this is no longer considered necessary, and most equipment marketed in any country is already set up for that country and fitted with the right plug.

In most countries power suppliers are legally permitted a variation of up to about 10% in the voltage supplied to a home, to allow for variations in demand, which affect the voltage drop along the cables. In Britain the nominal voltage is 230 V. The wires carrying the electricity from the sub-station to your home have some resistance, so there is a voltage drop along them which depends upon how much electricity you and your neighbours are using. The voltage at the sub-station is therefore continuously and automatically adjusted to take this into account: at times of high demand the first house in the road may get up to 240 V so that the last house can still get at least 220 V. (Your electricity meter also takes the actual voltage into account - you only pay for what you get!)
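The voltage drop described above is just Ohm's Law, V = I x R: the drop along the cable is the current drawn multiplied by the cable's resistance. As a rough sketch (the cable resistance and load current below are made-up illustrative figures, not values from this article):

```python
def voltage_at_house(substation_volts, cable_resistance_ohms, load_current_amps):
    """Voltage left at the house after the drop along the supply cable (V = I x R)."""
    drop = load_current_amps * cable_resistance_ohms
    return substation_volts - drop

# Example: 240 V at the sub-station, 0.2 ohm of cable, and the street
# drawing 100 A leaves 220 V at the far end of the road.
print(voltage_at_house(240.0, 0.2, 100.0))  # 220.0

# With nobody drawing current there is no drop at all.
print(voltage_at_house(240.0, 0.2, 0.0))  # 240.0
```

This is why the sub-station raises its output at times of high demand: the larger the current being drawn along the street, the larger the drop, so the far end of the cable would otherwise fall below the permitted minimum.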


© Barry Gray October 2017