DansGuardian needs some work and a front-end for management. Content filtering is required in many schools and libraries, and such filters are readily available on Windows. I wrote some utilities with bash scripts and dialog, but something more polished is needed. There's a Webmin module, but it's only slightly better than a text editor.
There are some GUI front-ends available, but they have many limitations: no support for multiple filter groups, specific proxy requirements (like tinyproxy), and no easy way to enable proxy anti-bypass firewall rules or network bulletin boards (pages informing users of what they are doing wrong when attempting to bypass the proxy).
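For illustration, the anti-bypass rules such a front-end might generate could look like the sketch below. The interface name, proxy port, and the LOG/REJECT policy are all assumptions for a hypothetical LAN, not anything DansGuardian actually emits:

```shell
#!/bin/sh
# Sketch: firewall rules a filtering-proxy front-end could emit to stop
# users from going around the proxy. LAN_IF and PROXY_PORT are hypothetical.
LAN_IF=eth1
PROXY_PORT=8080

emit_rules() {
    cat <<EOF
iptables -t nat -A PREROUTING -i $LAN_IF -p tcp --dport 80 -j REDIRECT --to-port $PROXY_PORT
iptables -A FORWARD -i $LAN_IF -p tcp --dport 443 -j LOG --log-prefix "proxy-bypass: "
iptables -A FORWARD -i $LAN_IF -p tcp --dport 443 -j REJECT
EOF
}

emit_rules   # review the output, then pipe into "sh" as root to apply
```

The LOG rule is what could feed the "bulletin board" idea: the logged bypass attempts tell you which users to show the explanation page to.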
Last edited by jhansonxi; 06-10-2012 at 12:43 PM.
The mentioned problems are getting fixed all the time; the problem is that I see no end in sight. Besides, if something doesn't work in Linux, people won't f*cking care whose fault it is: "It works in Windows/MacOS/whatever - Linux sucks" - and they are right.
Originally Posted by EvilTwin
Problems with configuration
Without a doubt, I think the number one problem for many things Linux is configuration of software and hardware. For anything from PostgreSQL to MySQL to SSD tuning to mdadm formatting, the defaults are not exactly ideal, the options are many, and finding the right ones is a pain.
When you install a distro on an SSD, it should know to use trim (the discard mount option), and the distro should have considered whether it's better to stick with the default commit interval, commit=10, or even commit=600 (suggested in some Debian docs). When you create an array with mdadm, should the user really need to figure out the minutiae of parity alignment, or realize they need to manually set the chunk size to something sane? With MySQL and PostgreSQL there are slews of cache, memory, and performance settings you might need to tweak.
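To make the mdadm complaint concrete, here's the arithmetic an installer could do for the user instead. This is a sketch assuming a 4-disk RAID5 with a 512 KiB chunk and 4 KiB ext4 blocks; the device name and the fstab line are placeholders:

```shell
#!/bin/sh
# stride       = chunk size / filesystem block size (fs blocks per chunk)
# stripe-width = stride * number of data disks (fs blocks per full stripe)
CHUNK_KB=512
BLOCK_KB=4
DISKS=4
PARITY=1    # RAID5 devotes one disk's worth of each stripe to parity

STRIDE=$((CHUNK_KB / BLOCK_KB))
STRIPE_WIDTH=$((STRIDE * (DISKS - PARITY)))

echo "mkfs.ext4 -E stride=$STRIDE,stripe-width=$STRIPE_WIDTH /dev/md0"
# The SSD case: enable trim and pick a commit interval in /etc/fstab
echo "UUID=...  /  ext4  defaults,discard,commit=600  0 1"
```

With the values above this prints stride=128 and stripe-width=384; an installer that knows the array geometry could fill these in automatically.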
Name a large or low level piece of Linux and the distro defaults probably fall into the category of 'will work, but kind of stupid.'
Of course there are some things that fall into the category of hassle... For example, ever try to monitor your temps under linux?
You'll probably need at least three commands: something like aticonfig --odgt, sensors, and hddtemp (which you'll have either set up as a daemon or invoked several times). Even worse, the output from sensors will likely be mostly wrong unless you dig through it and figure out that what it reports as fan 1 is really my PWM case fan, fan 3 is my CPU fan 2, none of their numbers match what's in the BIOS, and in some cases you'll even get readings that make no sense at all. And good luck figuring out which temperature belongs to what.
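A small wrapper script is the usual workaround for the multiple-commands problem. A rough sketch, using the tool names from above (GPU readout omitted since aticonfig is vendor-specific), skipping anything that isn't installed:

```shell
#!/bin/sh
# Gather the temperature sources in one pass. Each tool that is missing
# gets a "not installed" marker instead of silently vanishing.
report=""
for tool in sensors hddtemp; do
    if command -v "$tool" >/dev/null 2>&1; then
        report="$report== $tool ==
$("$tool" 2>/dev/null)
"
    else
        report="$report== $tool: not installed ==
"
    fi
done
printf '%s' "$report"
```

It doesn't fix the mislabeled fan inputs, of course - that still takes hand-editing the lm-sensors configuration.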
Now none of these things are insurmountable. They're just stupid hassles that the end user ought to be able to not have to figure out.
I'm barely scratching the surface here, but the gist of it is that default configurations need to be contextual and way smarter. It might even make sense to, horror of horrors, have a wizard or something now and again.
Also, just a shoutout to that section of linked articles that mentions the sound system... I haven't been able to figure out what the deal is exactly, but which audio port is which sure doesn't seem to match the mobo manual. Or something. In general, though, even when it's working correctly, there's a huge variation in volume normalization and output between programs. If I leave system volume at 50%, I'll often have to turn my speakers down to play music and up to watch a movie. No idea why.
Last edited by sloggerKhan; 06-10-2012 at 01:15 PM.
From the sounds of it, I don't suppose you like the Windows or Mac desktops either. And sysadmin work under Windows is the biggest f***ing pain I've ever come across - automating anything is extremely difficult, and every single driver or piece of software you install gets in front of your face asking something or other.
Originally Posted by garegin
No, what linux needs is an easy way for developers to reliably release their apps to linux distros. If for example you look at the humble bundle games, many of them need a bit of tweaking to get sound to work. I've no idea how well Ubuntu software centre works, but if it's Ubuntu-only, it's not enough. Linux is fragmented into many distributions because distribution is what linux does best. That, and a constantly moving target, make "just making things work" harder than with MS or Mac, but not impossible!
Caving to the lowest common denominator, following bullshit ui/market trends.
* Developers need to have the mantra of making Linux/GNU the best: latest versions, quality, quality, and more quality. Speed, size, and functionality.
* Developers and groups wasting time, to Linux's detriment, porting to other operating systems that already command the scene. "Microsoft/Windows can go to Hell" needs to be the mantra; "'F' Azure", another.
* Lack of updates. For example, slow repository updates or limited project scope.
* Lack of a shunning system for hardware makers that don't support Linux/GNU. For example, Canon should be shunned; buyers should be warned not to buy Canon products.
* Developers should also consider GUI along with CLI.
* Naming of concepts, projects, and programs. Some real weird names or acronyms out there and no clear explanations.
Last edited by e8hffff; 06-10-2012 at 01:36 PM.
One thing I do like is that there is progress. Every six months or so many distributions take a few steps forward.
Unfortunately they also take a few steps back. Taking the most recent Ubuntu as an example, the following happened to me:
- Nvidia binary driver kept crashing
- Nouveau kept getting confused, putting the wrong content in places
- Compiz MoveWindow extension is necessary (otherwise everything ends up at 0x0) but broken (keeps moving windows it shouldn't, breaks full screen windows)
- Machines with wifi take 2 minutes to boot waiting for network
- I had to delete a whole bunch of gconf type directories to get a normal desktop (eg volume controls would be missing)
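That config cleanup is safer as a backup-and-move than a delete. A minimal sketch (the paths in the usage comment are the usual per-user gconf trees, which vary by release):

```shell
#!/bin/sh
# Move each existing config tree aside instead of deleting it, so the old
# desktop state can be restored later by renaming the .bak copy back.
backup_cfg() {
    for d in "$@"; do
        [ -e "$d" ] && mv "$d" "$d.bak"
    done
    return 0
}

# Usage against the gconf trees mentioned above (run while logged out
# of the desktop session):
# backup_cfg ~/.gconf ~/.gconfd
```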
So every upgrade I do involves the following steps:
- Download new packages. This is pretty quick.
- Wait for packages to install. This takes a really long time, because on ext3 filesystems an fsync effectively behaves like a full sync, and it's called during each package installation
- Immediately notice several things that are broken, search bug tracking, websites etc for discussion and fixes as I'll never be the only one
- Put in fixes which work some of the time, rinse and repeat
- Then notice more broken things that aren't in your face and go through the same search and fix process.
The net effect is that an upgrade takes several weeks before things are stable again. The underlying cause is pretty simple - virtually no effort is put into preventing or fixing this sort of thing. The Ubuntu bug tracker is a joke - items go in there, commenters have fixes and several years later some robot asks if it still happens.
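For what it's worth, the fsync slowdown during package installs has a well-known (and risky) stopgap: the eatmydata wrapper, which LD_PRELOADs a library that turns fsync() into a no-op for the wrapped command. This is a command fragment, not a recommendation - a crash mid-upgrade can leave the package database corrupt:

```shell
# Debian/Ubuntu package name; install once, then wrap the slow operation.
sudo apt-get install eatmydata
# fsync() calls made by dpkg during the upgrade become no-ops:
sudo eatmydata apt-get dist-upgrade
```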
IMHO, the biggest problem of desktop Linux is a lack of management.
In Linux kernel development there is Linus Torvalds, who decides which patches get adopted. What about desktop Linux? Is there any formal management organization to lead desktop Linux's evolution? freedesktop.org or the LSB (Linux Standard Base)? No - both of them are inefficient compared with what the JCP is to Java or the kernel team is to the Linux kernel. Consequently, no Linux distro offers a consistent user experience yet. There's fragmentation too: too many choices, but none of them good enough. As a platform it lacks enough attraction for third-party developers, and even companies, to build apps for it. Why? It lacks well-organized, up-to-date documentation and toolkits, stable APIs, and a relatively unified package format that would let them easily and quickly start developing apps on this platform. Because of the lack of apps, users gradually lose interest in Linux. We need to satisfy and retain both users and developers. In my opinion, all of these problems come down to a lack of management. We really, really need to work together to do just one big thing: move desktop Linux forward.
In addition, Linux graphics sucks compared with Windows and OS X. Yep, Wayland is our hope; I wish it would come sooner.
Back when I was using Ubuntu 8, I would have said everything. Ease of use, dependency on Terminal, and a UI only a nerd could love. With Ubuntu 12 the situation is very different.
#1 Graphics drivers. The open source drivers can't compete with the proprietary ones, and the proprietary drivers aren't very good at all. As always, Nvidia is the way to go if you're using Linux, and that needs to change.
#2 Windows software compatibility - in other words, WINE. Despite what many people want to believe about having native Linux software, Windows has been around for a long time and has a lot of software that Linux may never get. WINE compatibility and speed have to be tolerable enough for Joe Six-Pack to use. It's the bridge that will bring end users, and with them, developers.
#3 Games, and lots of them. Valve bringing Steam is a wonderful thing, and could be a huge turn around for Linux. Open source games are nice but we need more commercial games. Gaming has always been a huge strength for Windows, and a huge strength for iOS devices. These platforms are currently doing very well. We need to make it more attractive for developers to bring their games over.
An example of poor quality I've experienced recently...
The developers working on porting (non-Android) Linux to ARM devices - the Rhombus-Tech project, for example. Many are building against old kernel versions. Yeah, they may work, but the objective should be to hit release with the most current kernel and OS. This is the lack of a quality mantra among developers. Code against Ubuntu 12.04, not 10.04: if you find a problem in 12.04, you've advanced the 12.04 project early, surfacing a problem that would still need fixing later anyway. And 12.04 has 1001 improvements for ARM over 10.04. It's head-banging moments like that that make us sad.
Sometimes in life you need to make great things, not mediocre ones. A hobby means it's fun, but you can still make something great for posterity.