Please note that I only quoted a couple of the grossly erroneous comments.
Originally Posted by curaga
BTW, why do you assume that the nVidia *Linux* binary drivers do not use binary-detection hacks, just like their Windows counterparts? (Given the fact that they share the same code-base.)
... And whether nVidia employs binary name hacks or not, why should I, as an end user, care?
DEV: Intel S2600C0, 2xE52658V2, 32GB, 4x2TB, GTX680, F20/x86_64, Dell U2711.
SRV: Intel S5520SC, 2xX5680, 36GB, 4x2TB, GTX550, F20/x86_64, Dell U2412.
BACK: Tyan Tempest i5400XT, 2xE5335, 8GB, 3x1.5TB, 9800GTX, F20/x86_64.
LAP: ASUS N56VJ, i7-3630QM, 16GB, 1TB, 635M, F20/x86_64.