This report, sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity, claims that IT needs 1,500 terawatt-hours of electricity per year - or about 10% of global electricity production. See The Cloud Begins With Coal – Big Data, Big Networks, Big Infrastructure, and Big Power (August 2013).
This figure is a headline grabber, but the title is slightly misleading: the "cloud" here isn't just cloud computing. The report also counts the energy cost of manufacturing the hardware, the energy required to keep networks humming, and so on. However, The Register notes that the report:
ignores the data centers the video is served out of, and tablet charging.
I recall this 2007 report by the EPA, which estimated that US data centers consumed 1.5% of US electricity production and projected that this would rise to 3% by 2011.
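To get a feel for what those percentages mean in absolute terms, here is a quick back-of-envelope sketch. The total-production figure below is a rough round-number assumption for illustration only, not a value taken from the EPA report:

```python
# Back-of-envelope check of the percentages quoted above.
# US_PRODUCTION_TWH is an assumed round figure for annual US
# electricity production, used only for illustration.

US_PRODUCTION_TWH = 4_000        # assumed US electricity production, TWh/yr
SHARE_2007 = 0.015               # 1.5% share from the 2007 EPA report
SHARE_2011_PROJECTED = 0.03      # projected 3% share by 2011

consumption_2007 = US_PRODUCTION_TWH * SHARE_2007
consumption_2011 = US_PRODUCTION_TWH * SHARE_2011_PROJECTED

print(f"Implied data center use in 2007: {consumption_2007:.0f} TWh/yr")
print(f"Implied data center use at 3%:   {consumption_2011:.0f} TWh/yr")
```

Under that assumed total, the 1.5% share works out to roughly 60 TWh/yr, doubling to about 120 TWh/yr at the projected 3% - still far below the 1,500 TWh/yr the coal-sponsored report attributes to IT worldwide, which underlines how much of that headline number comes from hardware manufacturing and networks rather than data centers alone.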
There is a strong possibility of a major shift in IC power design that could radically change these figures. Cambridge-based ARM has begun its foray into server chips, and Interworx (who may have a bias!) thinks it is significant: What ARM In The Data Center Means For Hosting Companies.