As Margulius says, this doesn't account for how much energy is used by the US
computer infrastructure aside from internet data centers.
Here is an abstract from a report issued in 2000 by the US DOE:
The major consumer electronics in U.S. homes accounted for over 10 percent of U.S. residential electricity consumption, which is comparable to the electricity consumed by refrigerators or lighting. We attribute 3.6 percent to video products, 3.3 percent to home office equipment, and 1.8 percent to audio products. Televisions use more energy than any other single product category, but computer energy use now ranks second and is likely to continue growing. In all, consumer electronics consumed 110 TWh in the U.S. in 1999, over 60 percent of which was consumed while the products were not in use.
The energystar website gives some interesting factoids...
The article "How Small Devices are Having a Big Impact on U.S. Utility Bills" focuses on what the increasing integration of electronics does to energy consumption.
In 2005 "Electronics products accounted for about 13% [0.585 quads] of total home electric consumption [4.5 quads]; almost three times the level in 1980"
By 2015 "Electronics products alone will account for 18% [0.972 quads] of total home electric consumption [5.4 quads]."
Yikes! Just under one fifth of all home electricity consumption will go to running home electronics.
(1 quad = 1 quadrillion British Thermal Units [Btu] = 1.06 exajoules [1×10^18 joules].)
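With that definition in hand, the percentages quoted above can be sanity-checked with a couple of lines of arithmetic (a sketch; the quad and kWh conversion factors are standard, and the quad figures come from the quoted article):

```python
# Sanity check of the quoted figures.
QUAD_J = 1.06e18   # joules per quad, per the definition above
KWH_J = 3.6e6      # joules per kilowatt-hour

# One quad works out to roughly 294 billion kWh.
quad_in_kwh = QUAD_J / KWH_J
print(quad_in_kwh / 1e9)   # ~294 (billion kWh per quad)

# 2005: electronics (0.585 quads) vs. total home consumption (4.5 quads)
print(0.585 / 4.5)   # 0.13, matching the quoted 13%
# 2015 projection: 0.972 quads of 5.4 quads
print(0.972 / 5.4)   # 0.18, matching the quoted 18%
```

So the percentages and the quad figures in the article are internally consistent.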
So what's the total run up here?
Back to the Chronicle:
"According to the study, servers and the infrastructure used to maintain these machines use about 45 billion kilowatt hours a year. The United States consumed 3,661 billion kilowatt hours, or nearly $300 billion, in 2005."

Home electronics consumed about 0.585 quads...
OK, now to do a lot of conversion, and then I wonder: what about office electronics?
It seems like this is still a big number, though the point of the Chronicle article is that it is not as high as anticipated.