Alternating Current

Before we look at rectifier circuits, let's take a brief look at alternating current.

Alternating Current (AC)

Until now, we have only worked with direct current (DC), and we will continue to do so. To better understand rectifiers, however, we should take a brief look at alternating current (AC). While we know DC from working with the Arduino and building low voltage circuits, most of the power grid and the wall sockets in our houses use AC. In this tutorial we will discover the reasons for this and get to know the basics of AC.

Disclaimer
Please remember that electricity is dangerous. Keep yourself safe and do not experiment with mains voltage. We will build a low voltage AC source for safe experiments with rectifiers in an upcoming project series.

Let's start at the beginning. What is AC, and how does it differ from DC? In DC circuits, the current supplied by the power source constantly flows in one direction. This direction is determined by the power source's fixed polarity. This is exactly what we know from batteries, USB, lab bench supplies and many other power supplies.

An AC power supply behaves differently: it has no fixed polarity. If we take a look at Schuko plugs or Europlugs, it's easy to notice this fact. The plugs fit into the socket in either direction, and it doesn't matter which direction you choose; the connected device still works. The same holds true for the old unpolarized American NEMA connectors, which have luckily been largely replaced by the safer polarized plugs. The direction of the current flow from an AC source is constantly alternating, just as the name suggests. When we speak about AC, this usually means that the current and voltage curves follow the form of a sine wave. This sine wave is generated, for example, by the synchronous generators in power plants.

The picture below shows the voltage curve of 230 V AC with a frequency of 50 Hz, which is what is used in European households. In the US, a voltage of 120 V (often still called 110 V) with a frequency of 60 Hz is used. One thing that might strike you when looking at this graph is that the voltage reaches a maximum of 325 V. This is not an error, but intentional: 230 V AC is not the peak voltage, but the so-called root-mean-square voltage.

Voltage curve for a 230 V, 50 Hz AC voltage source
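
In formula form, the curve shown above is simply a sine wave with a peak amplitude of 325 V:
\(v(t) = V_p \cdot \sin(2 \pi f t) = 325 V \cdot \sin(2 \pi \cdot 50 Hz \cdot t)\)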

An AC signal can be characterized by the following parameters:

  • Period Length
    Time needed until the voltage curve repeats
    \(T = 20 ms\)
  • Frequency
    Number of cycles per second
    \(f = {1 \over T} = 50 Hz\)
  • Peak Voltage
    Maximum voltage reached
    \(V_p = 325 V\)
  • Peak-to-Peak Voltage
    Voltage difference between the lowest and the highest voltage
    \(V_{pp} = 650 V\)
  • Root Mean Square Voltage
    The DC voltage that would deliver the same amount of electric power to a load with a fixed resistance
    \(V_{RMS} = {V_p \over \sqrt 2} \approx 230 V\)
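
Since these parameters are all related, the remaining values can be derived from the peak voltage and the frequency alone. Here is a minimal Python sketch using the values from the graph above:

```python
import math

V_P = 325.0   # peak voltage in volts (taken from the graph)
F = 50.0      # frequency in hertz

T = 1 / F                    # period length: 0.02 s = 20 ms
V_PP = 2 * V_P               # peak-to-peak voltage: 650 V
V_RMS = V_P / math.sqrt(2)   # RMS voltage of a sine wave: ~230 V

print(f"T = {T * 1000:.0f} ms, Vpp = {V_PP:.0f} V, Vrms = {V_RMS:.0f} V")
```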

While most of these measures should be easy to understand, the root-mean-square (or short: RMS) voltage might need a bit more explanation. As you have probably noticed, this is the voltage value we commonly use. It is also the voltage stated on plugs and devices. But why not simply use the peak voltage? Well, while the peak value might be interesting when ensuring that the maximum ratings of the used components are not exceeded, it is generally not that important. The peak voltage is only supplied for a very short amount of time in each cycle. At the instant the voltage crosses zero, on the other hand, no current flows at all and no work is done. If we want to calculate the average power consumption of an AC device, we thus cannot simply multiply current and voltage. Neither of them is a fixed value anymore. One way to solve this issue is using integrals. The use of the RMS voltage, however, makes this a lot easier. The RMS voltage is defined as the DC voltage that is required to deliver the same amount of power to a load with a fixed resistance. For a sine wave this corresponds to the peak voltage divided by \(\sqrt 2\). The RMS current \(I_{RMS}\) can be calculated in the same way.
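
To see that the shortcut \(V_p \over \sqrt 2\) really matches the definition, here is a minimal numerical check (a sketch using NumPy): we sample one full cycle, square the samples, take the mean, and then the square root - hence the name root mean square.

```python
import numpy as np

V_P = 325.0   # peak voltage in volts
F = 50.0      # frequency in hertz

# Sample one full cycle of the sine wave
t = np.linspace(0, 1 / F, 100_000, endpoint=False)
v = V_P * np.sin(2 * np.pi * F * t)

# Root mean square: the square root of the mean of the squares
v_rms_numeric = np.sqrt(np.mean(v ** 2))
print(f"numeric: {v_rms_numeric:.2f} V")     # ~229.81 V
print(f"formula: {V_P / np.sqrt(2):.2f} V")  # ~229.81 V
```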

If we have a load with a fixed resistance of \(R = 1 kΩ\), the power consumption can be calculated just as in DC circuits. For this, we combine Ohm's law with the formula for power consumption to calculate the power directly from resistance and voltage:
\(R = {V \over I}\)
\(P = V \cdot I = V \cdot {V \over R} = {V^2 \over R}\)

For our AC circuit, we then simply use the RMS voltage as the voltage:
\(P = {{V_{RMS}}^2 \over R} = {{230 V}^2 \over {1 kΩ}} \approx 53 W\)

Of course, we could also use Ohm's law to first calculate the RMS current that flows through the resistance and then calculate the power consumption in a second step by multiplying RMS voltage and RMS current. The RMS values can be used in these calculations just as if they were DC values. Very convenient.
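
A tiny sketch of both routes, using the values from above:

```python
V_RMS = 230.0   # RMS voltage in volts
R = 1_000.0     # load resistance in ohms

# One step: P = V^2 / R
p_direct = V_RMS ** 2 / R

# Two steps: Ohm's law for the RMS current, then P = V * I
i_rms = V_RMS / R            # 0.23 A
p_two_step = V_RMS * i_rms

print(f"{p_direct:.1f} W vs {p_two_step:.1f} W")  # both ~52.9 W
```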

By the way, the RMS voltage is often written as AC 230 V or 230 VAC. This makes it clear that this is not a constant voltage, as we know it from DC circuits, but the RMS voltage of an AC source.

DC or AC?

Why do we use DC in almost all projects and tutorials? Why are there circuits and devices that use AC? Well, these are questions that can't be answered in a single sentence. There are plenty of reasons for choosing one technology or the other, and each has advantages and disadvantages. In the end, it all boils down to the question of which properties of AC or DC a specific circuit or device needs. So, let's do a short comparison and look at the historic background. The latter explains why we use AC in our houses today.

The War of the Currents

Whether AC or DC is better has been debated for a long time. The debate culminated in the so-called war of the currents in the late 19th century. This war was luckily only a commercial one. Thomas Edison's Edison Electric Light Company favored DC in conjunction with its 110 V incandescent lamps. The entrepreneur George Westinghouse, with his Westinghouse Electric Company, favored an AC system.

The war of the currents was also a war over who would get the lucrative contracts to provide lighting for the large cities. Street lights in the late 19th century often used so-called arc lamps, in which an arc was generated between two carbon electrodes, making them light up brightly. These lamps had a very limited lifetime, however, and they posed a notable fire hazard. In short, you wouldn't want such a lamp in your house. Edison's incandescent lamps, on the other hand, allowed for relatively safe indoor lighting. In the following years they mostly superseded the old arc lamps.

While Edison's incandescent lamps were a big success, the prevailing 110 V DC system had a big issue: the transmission range was less than a mile. The primary problem is the power loss caused by the resistance of the wires. This meant that power plants were needed near the consumers. Westinghouse had a solution for this issue: his AC system simply used a higher voltage. This reduced the losses in the wires and allowed for much longer transmission ranges. An advantage that paid off and led to the success of AC systems over DC systems.

All that would, however, not have been possible without the invention of the transformer, which allowed for easy conversion between different AC voltages. This made it possible to use a high voltage for the power grid and then convert it to the safer 110 V used for the indoor incandescent lamps. Of course, Edison was not very happy that the DC system, which brought a lot of income to his company, was slowly being displaced by AC systems. An ugly campaign began in which Edison tried to show that Westinghouse's AC system was too dangerous to be used at all. Nevertheless, AC won this war, and today's power grids mostly use AC.

Outside the domain of power grids and big electrical devices, DC is still the standard. The reason for this is simple: AC has the major drawback that no electrical work is done at the instant the current direction reverses. For motors that possess enough inertia, this is no problem. A lot of other circuits and devices can't cope with that, however. This is especially true for digital circuits and microcontrollers, which need a stable, continuous DC power supply.

To deliver DC inside devices, rectifier circuits are used to convert AC to DC. We will talk about these in the next tutorial. To reduce the mains voltage to a more suitable AC voltage, a transformer is used. The low voltage AC signal is then rectified, filtered and converted into the required operating voltage. Nowadays, this last step is usually done by a voltage regulator.

Losses in Power Lines

The losses in power lines are to a large extent resistive. As we learned in the tutorial on conductivity and resistance, the overall resistance increases with the length of the wire. To compensate for this, one could increase the diameter of the wires, but this is expensive and impractical at a large scale. It is a lot easier to just use a higher voltage.

To understand this, we have to look at how the power loss in the wire can be calculated. Our end goal is to provide a certain amount of electrical power \(P\) to the target location. Depending on the resistance of the load, a current \(I\) will flow through it. This is also the current that needs to flow through our power line. If we know the resistance \(R\) of the power line, we can use this information to calculate the power loss in it: we first calculate the voltage drop across the power line using Ohm's law and then use our classic formula to calculate the power dissipation. We can also combine this into a single formula:
\(P = I^2 \cdot R\)

It is easy to see that the power loss increases quadratically with the current. What can we do about this? If we increase the voltage \(V\), we can reduce the current while delivering the same amount of power. This is easy to see if we take a look at our classic formula for power:
\(P = V \cdot I\)

If we double the voltage, we only need half the current. This will in turn cut down the losses in the wire by a factor of 4.
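
A short sketch illustrates the effect with made-up but plausible numbers: we deliver the same 10 kW over a line with an assumed total resistance of 0.5 Ω at different transmission voltages.

```python
P_LOAD = 10_000.0   # power to deliver, in watts
R_LINE = 0.5        # total line resistance, in ohms (assumed)

for v in (230.0, 460.0, 11_000.0):
    i = P_LOAD / v              # required current: I = P / V
    p_loss = i ** 2 * R_LINE    # line loss: P = I^2 * R
    print(f"{v:>7.0f} V -> {i:6.2f} A, loss {p_loss:8.2f} W")
```

Going from 230 V to 460 V indeed cuts the loss from roughly 945 W to 236 W, a factor of 4, and at grid-level voltages the resistive loss all but disappears.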

This is of course also true for DC systems. But efficient DC converters for high voltages and the matching power electronics were not available until the late 20th century. In recent years, high voltage DC (HVDC) power lines have become a viable alternative to AC power lines, especially for reducing the power loss over long distances. This is possible because a number of effects that increase the losses only occur with AC and not with DC, most notably the skin effect and capacitive losses.

Is AC dangerous?

Edison certainly pushed it too far, but his concern about the dangers of AC is a valid one. At this point it is, however, important to note that DC is not safe per se. What matters in the first place is how much current flows through the body. A high voltage allows a significant amount of current to flow even through a comparatively high resistance like the human body (around 500 Ω to 1 kΩ, although it can be a lot higher with dry skin). Even currents in the range of several milliamperes can become dangerous. But there is in fact a difference between the danger of AC and DC. There have been more than enough experiments and studies around this topic. The results show that the let-go current - the maximum current at which you can still open your hand and let go of a wire - is around 4 to 5 times higher for DC than for AC. So AC is indeed more dangerous than DC.

But up to what voltage can we experiment without the risk of killing ourselves? There are guidelines on this, like DIN VDE 0100-410, that define up to which voltage no protection against touching the wires is required. A touch voltage that should also be safe for children is 60 V for DC and 25 V for AC. In environments with high humidity, lower values are required. There is, however, no definitive guarantee that you won't receive a shock at these voltages. It all depends on how exactly you touch the wire, how wet your skin is and which way the current travels through your body. So it's good to be careful regardless of whether it's AC or DC.
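
To put these numbers into perspective, we can apply Ohm's law to the worst-case body resistance mentioned above:
\(I = {V \over R} = {230 V \over 1 kΩ} = 230 mA \qquad {25 V \over 1 kΩ} = 25 mA\)
At mains voltage the current can easily reach a very dangerous level, while at 25 V it stays near the let-go threshold, and with the much higher resistance of dry skin it drops well below it.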

A Short Summary

So what about the comparison? Well, such a comparison doesn't really make sense, as the applications for DC and AC are very different. But now that we know the basics, as well as the historic background, we can list some advantages and disadvantages for both technologies.

DC

  • Needed for most low voltage circuits and especially digital ones
  • Voltage conversion is a lot more difficult, though possible with DC-DC converters and other voltage regulators
  • Less dangerous

AC

  • Easy voltage conversion using transformers
  • Only usable by specific devices (light bulbs, AC motors, heating elements, ...)
  • AC devices often operate at voltages that are not suited for experiments by non-professionals
  • More dangerous

As you can see, there are good reasons why we use DC in almost all of our projects. Audio circuits are an exception to this: they are a typical example of analogue circuits that work not only with DC but also with AC signals.
