Simply put, productivity growth refers to the growth in economic output per worker or, more precisely, per hour of work. When this growth slows, the potential for real wage increases diminishes, since wage growth typically reflects workers' ability to produce more output per unit of time.
To the ostensibly naive observer, the following idea may seem like a plausible explanation: Higher-cost energy inputs into the production of goods and services reduce productivity growth because the economic output per dollar of energy consumed declines. And though energy inputs aren’t the only thing to consider, they are important. The high energy prices of the last decade or so may be, in part, responsible for low productivity growth. (Conversely, low energy costs would imply more output per dollar of energy consumed.)
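A quick back-of-the-envelope illustration makes the arithmetic concrete. The numbers below are entirely hypothetical: the same physical output and the same physical energy use, priced at two different energy costs.

```python
# Hypothetical numbers for illustration only: identical physical output and
# energy use, but a higher energy price means less output per dollar spent
# on energy.
output_dollars = 10_000   # value of goods produced (assumed)
energy_units = 100        # physical energy consumed (assumed)

for price_per_unit in (1.00, 2.00):   # energy price scenarios (assumed)
    energy_cost = energy_units * price_per_unit
    print(f"price ${price_per_unit:.2f}/unit -> "
          f"${output_dollars / energy_cost:,.0f} of output per energy dollar")
# price $1.00/unit -> $100 of output per energy dollar
# price $2.00/unit -> $50 of output per energy dollar
```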
But strangely, almost all economic models for productivity consider only so-called “tangible” factors, that is, labor and capital. In the bizarro world of modern economics, energy and materials are not considered “tangible.”
Now, the portion of productivity growth attributable to “technological advances” is typically calculated by adding up the contributions to productivity growth from labor and capital (machines, buildings, vehicles, tools of any kind) and then subtracting this sum from the known total productivity growth. What is left over is the so-called “residual,” which is presumed to result from “technological advances” arising from increases in human knowledge. These advances and the increases in capital per worker are assumed to be the drivers of productivity growth.
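For readers who want the mechanics spelled out, here is a minimal sketch of that growth-accounting exercise, stated in terms of total output growth and assuming a Cobb-Douglas production function. The capital share (alpha) and all growth rates below are hypothetical, chosen only to show how the “residual” falls out of the subtraction.

```python
# A minimal sketch of standard growth accounting (the "Solow residual"),
# assuming a Cobb-Douglas production function; alpha and all growth rates
# are hypothetical illustrations, not measured values.
alpha = 0.3             # capital's share of income (assumption)
output_growth = 0.025   # total output growth, 2.5% per year (hypothetical)
capital_growth = 0.030  # growth of the capital stock (hypothetical)
labor_growth = 0.010    # growth of hours worked (hypothetical)

# Contributions of the "tangible" factors: capital and labor
tangible_contribution = alpha * capital_growth + (1 - alpha) * labor_growth

# Whatever growth is left unexplained gets labeled "technological advance"
residual = output_growth - tangible_contribution
print(f"tangible factors explain: {tangible_contribution:.3%}")
print(f"residual ('technology'):  {residual:.3%}")
# tangible factors explain: 1.600%
# residual ('technology'):  0.900%
```

Notice that energy and materials appear nowhere in this calculation; whatever influence they have ends up folded into the residual along with everything else not explicitly measured.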