Processor: Intel(R) Core(TM) i5-2520M CPU @ 2.50GHz
Hard Drive: Seagate Momentus 7200 250GB (ST9250412AS)
Video: Intel(R) Sandybridge Mobile (GT2+)
Network: Intel Corporation 82579LM Gigabit Network Connection, Intel Corporation 6000 Series Gen2 (uses 1 watt)
OS: Fuduntu 2013.1
Desktop: GNOME 2
Display Driver: Intel 2.20.10
OpenGL: Mesa 8.0.4
Compiler: GCC 4.6.3
But if there's a clear, objective regression with the defaults, is that not a bug? If power usage increases, but performance does not, is that not a bug, even if you don't see it in a custom configuration?
It should not be hard to bisect; maybe I can do it sometime over the weekend. It only takes a few dozen full kernel builds.
By this sort of logic, kernel 3.7 using more power than 1.0 would also be a "regression", when it would just be due to the additional work the kernel needs to do to support new functionality.
It is completely silly to perform these sorts of tests; they benefit no one except Mr. Larabel.
I find it really hard to believe 3.6 enables some new functionality that takes 6-8 watts.
Now, can we get back on topic?
Btw, links posted so far: