Basic Grid Overview, Part One

 

I am certainly not an authority on electrical generation, demand, and the grid, but here are some basics relevant to the discussion.

First, there is electrical current (amps). Second, there is voltage (volts). We are going to use the analogy of water flowing in a river to help understand the difference between the two. You can think of current as the total amount of water moving down the river: more water equals more current. Voltage, by contrast, is the amount of energy that a given volume of water carries. Say you have one cubic meter of water moving at 2 kilometers per hour and another moving at 20 kilometers per hour. The faster-moving water has more energy (in this analogy, more voltage). So a wide, slow river has a lot of current (a lot of water) but low voltage (not much speed per cubic meter of water), while a narrow, fast river has less current (less water) but more voltage (lots of speed). You can increase the electrical energy delivered by increasing the current OR the voltage.
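To put a number on that last sentence: electrical power is voltage times current (watts = volts x amps). Here is a quick Python sketch (the specific numbers are made up purely for illustration) showing the same power delivered either way, just like the wide slow river versus the narrow fast one.

    # Electrical power is voltage times current: watts = volts * amps
    def power_watts(volts, amps):
        return volts * amps

    # "Wide, slow river": lots of current, not much voltage
    print(power_watts(volts=12, amps=100))   # 1200 W

    # "Narrow, fast river": lots of voltage, not much current
    print(power_watts(volts=1200, amps=1))   # 1200 W -- same power either way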

The crazy thing to realize is that the amount of electricity the utilities generate has to meet customer demand at all times, on a moment-to-moment basis. Yes, literally on a second-to-second basis! Why is this? It is because the grid stores essentially no electricity. It is possible to store electricity, of course (in batteries, for example), but because of the costs involved it simply isn't done at any meaningful scale. Therefore, with no buffer (i.e., storage), the electricity being made has to equal the electricity being used at every moment. What happens if it doesn't? I want to keep it simple, so I'll just discuss one thing that can happen: a change in voltage. All of the electronics in your home require a specific voltage in order to work correctly (in the US the nominal household voltage is 120 volts, though it's often called 110). If the voltage is too low they don't work right, and if the voltage is too high they get fried.

How does this work? Let's say a tiny little power plant is making a certain amount of power, say 100 kW (don't worry about the units), and this power plant is supplying 5 little houses, each of which is using 20 kW. 5 x 20 = 100, so generation equals demand, and because the "push" equals the "pull" the voltage stays at a perfect constant 120 volts. What happens if suddenly one of the little houses stops using any electricity? Generation is still 100 but now demand is only 80. The energy going into the grid is the same (100) but the energy going out (80) is now lower -- there is an excess of 20. We know from physics that energy is conserved, so where does that extra 20 go? That extra energy shows up as an increased voltage (remember from the river analogy that voltage carries energy). When the voltage goes up at the other 4 houses, circuit breakers and those power strips that protect electronics trip, and everything turns off. In other words, voltage that is too high is unacceptable.
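Here is that first scenario as a toy calculation in Python. To be clear, the proportional "voltage" formula below is my own cartoon for illustration, not how real grid physics works -- the point is only that excess generation pushes the voltage above nominal.

    NOMINAL_VOLTS = 120   # nominal US household voltage

    def toy_voltage(generation_kw, demand_kw, nominal=NOMINAL_VOLTS):
        # Cartoon model: voltage swings in proportion to the supply/demand
        # mismatch. Purely illustrative -- not real grid physics.
        imbalance = (generation_kw - demand_kw) / demand_kw
        return nominal * (1 + imbalance)

    generation = 100      # the little plant keeps making 100 kW
    demand = 4 * 20       # one of the five 20 kW houses drops off the grid
    print(toy_voltage(generation, demand))   # 150.0 -- way above 120, breakers trip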

OK, now let's go through the example again but in the other direction. Again we have a power plant making 100 kW and 5 houses each using 20 kW. This time one of the houses turns on several electric space heaters all at once and doubles its demand from 20 to 40 kW. Generation is 100 but now demand is 120. What happens this time? Give it a guess! You're right -- the voltage decreases. A sustained drop in voltage is called a "brownout." Most electronics are not happy and cease to function, so once again, a voltage that is too low is unacceptable.
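And here is the space-heater scenario run through the same made-up cartoon model (again, the numbers are only illustrative):

    NOMINAL_VOLTS = 120   # same cartoon model as before

    def toy_voltage(generation_kw, demand_kw, nominal=NOMINAL_VOLTS):
        imbalance = (generation_kw - demand_kw) / demand_kw
        return nominal * (1 + imbalance)

    generation = 100          # still making 100 kW
    demand = 4 * 20 + 40      # one house doubles from 20 to 40 kW -> 120 kW total
    print(round(toy_voltage(generation, demand)))   # 100 -- well below 120, a brownout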

Now imagine the nerves of our hypothetical 100 kW power plant operator! Every time someone turns a hair dryer on or off, he or she must be hovering directly over the controls, ready to respond literally within seconds, or there are going to be 5 angry phone calls! No coffee breaks allowed! Of course, it's not just 5 houses and 1 poor power plant operator. Now that we have this down, in the next post I'll review the basics of grid operation.