Please note that I only quoted a couple of the grossly erroneous comments.
Originally Posted by curaga
BTW, why do you assume that nVidia's *Linux* binary drivers do not use binary-detection hacks, just like their Windows counterparts? (Given the fact that they share the same code base.)
... And whether nVidia employs binary name hacks or not, why should I, as an end user, care?
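For anyone unsure what a "binary-detection hack" would even look like, here's a minimal sketch of how a user-space driver library *could* sniff the name of the application that loaded it on Linux. This is purely illustrative and assumes a GCC-style constructor; the "glxgears" match target is a placeholder I picked, not anything from actual nVidia code.

/* Illustrative sketch only: how a shared library could detect the
 * name of the binary that loaded it, to enable per-application
 * code paths. Not taken from any real driver source. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <limits.h>

static void detect_host_binary(void)
{
    char path[PATH_MAX] = {0};

    /* /proc/self/exe is a symlink to the executable that mapped us. */
    ssize_t n = readlink("/proc/self/exe", path, sizeof(path) - 1);
    if (n <= 0)
        return;

    /* Strip the directory part, keeping only the binary name. */
    const char *name = strrchr(path, '/');
    name = name ? name + 1 : path;

    /* A driver could key app-specific tweaks off the name; "glxgears"
     * is a hypothetical placeholder target. */
    if (strcmp(name, "glxgears") == 0)
        fprintf(stderr, "host binary matched: enabling special path\n");
}

/* Constructor runs automatically when the library is loaded. */
__attribute__((constructor))
static void on_load(void)
{
    detect_host_binary();
}

Build it with "gcc -shared -fPIC detect.c -o libdetect.so" and load it into any program via LD_PRELOAD to see the detection fire; a driver's own .so could do the same thing at load time, invisibly to the user.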
DEV-NG: Intel S2600C0, 2xE5-2658V2, 32GB, 4x2TB, GTX680, F20/x86_64, Dell U2711.
DEV: Intel S5520SC, 2xX5680, 36GB, 5x320GB, GTX550, F20/x86_64, Dell U2711 (^).
SRV: Tyan Tempest i5400XT, 2xE5335, 8GB, 4x2TB, 9800GTX, F20/x86_64, Dell U2412.
LAP: ASUS N56VJ, i7-3630QM, 16GB, 1TB, 635M, F20/x86_64.