Please note that I only quoted a couple of grossly erroneous comments.
Originally Posted by curaga
BTW, why do you assume that the nVidia *Linux* binary drivers do not use binary-detection hacks, just like their Windows counterparts? (Given that they share the same code base.)
... And whether nVidia employs binary-name hacks or not, why should I, as an end user, care?