Quote Originally Posted by oneman View Post
Wow, this is so exciting. BZZZZZT slowaris is long dead. It's a cool name, and it was way rad in the '90s, but it's over; Linux ripped its ass open.
What? Don't you know Linux is a piece of shit compared to a real enterprise Unix like Solaris? Every serious sysadmin knows that Linux has severe problems with stability, scalability, and whatnot. You want to see some links?

http://kerneltrap.org/Linux/Active_Merge_Windows
"The [Linux source code] tree breaks every day, and it's becoming an extremely non-fun environment to work in.

We need to slow down the merging, we need to review things more, we need people to test their f--king changes!"


Linus Torvalds says Linux is bloated and huge:
http://www.theregister.co.uk/2009/09..._bloated_huge/

"Citing an internal Intel Corp study that tracked kernel releases, Bottomley said Linux performance had dropped about two percentage points at every release, for a cumulative drop of about 12 per cent over the last ten releases. "Is this a problem?" he asked.

"We're getting bloated and huge. Yes, it's a problem," said Torvalds."


As Linux kernel developer Andrew Morton says:

http://lwn.net/Articles/285088/
"I used to think [code quality] was in decline, and I think that I might think that it still is. I see so many regressions which we never fix.
...
it would help if people's patches were less buggy."


Linux sucks as a file server.
http://www.enterprisestorageforum.com/sans/features/article.php/3749926

"Go mkfs a 500 TB ext-3/4 or other Linux file system, fill it up with multiple streams of data, add/remove files for a few months with, say, 20 GB/sec of bandwidth from a single large SMP server and crash the system and fsck it and tell me how long it takes. Does the I/O performance stay consistent during that few months of adding and removing files? Does the file system perform well with 1 million files in a single directory and 100 million files in the file system?

My guess is the exercise would prove my point: Linux file systems have scaling issues that need to be addressed before 100 TB environments become commonplace. Addressing them now without rancor just might make Linux everything its proponents have hoped for."
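You can even try a scaled-down version of that directory test yourself. This is my own sketch, not from the article, and the file count here is toy-sized compared to the 1 million files per directory and 100 million files per filesystem the author talks about; a real run would point it at millions of files on the filesystem under test, not a tmpdir:

```python
import os
import tempfile
import time

def stress_directory(n_files):
    """Create n_files empty files in one directory, then time a full listing.

    A toy version of the article's exercise: does directory performance
    hold up as the file count grows? Run with increasing n_files and
    compare the timings.
    """
    with tempfile.TemporaryDirectory() as d:
        for i in range(n_files):
            open(os.path.join(d, f"f{i:07d}"), "w").close()
        start = time.perf_counter()
        names = os.listdir(d)
        elapsed = time.perf_counter() - start
        return len(names), elapsed

count, secs = stress_directory(10_000)
print(f"listed {count} files in {secs:.4f}s")
```

Crank n_files up a few orders of magnitude and watch whether the listing time stays proportional; that is exactly the kind of consistency the article is asking about.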


Linux has scaling problems. Sure, Linux runs on supercomputers in the Top500 (which are just a bunch of PCs on a fast network) or on a 1024-core SGI Altix (which is just some blades on a fast switch), but that is not the same thing as running one large machine. Linux always runs on networks, not on a single large computer.


I have lots of similar links, you want to see them? Even the Linux kernel devs themselves say that the Linux code quality is bad. So it seems that Linux has code quality problems, is buggy and bloated, and has scaling problems. Don't you agree?