
Thread: Preview: Ubuntu's Performance Over The Past Two Years

  1. #21
    Join Date
    Jan 2011
    Posts
    1,287

    Default

    Quote Originally Posted by TheOne View Post
    With newer versions of apt-get a new command 'source' was introduced, which basically downloads the original sources used to generate the .deb files on the repository servers. If you as a user have time to spare for building the software with custom CFLAGS, etc., then you can basically

    apt-get source packagename
    cd packagename-vblah

    and recompile as many packages as you want.

    Also, some users insinuated that you can't compile software on Ubuntu (Debian-based) systems, but that just sounds insane; you can always

    apt-get install build-essential
    wget sources.tar.gz
    tar -xvzf sources.tar.gz
    cd sources
    ./configure --prefix=/home/myuser/mysoftware
    make && make install

    But it is pretty easy to just apt-get source a package, modify as needed and compile. Too much science for the gurus out there willing to spend some time?

    Anyway, I use Xubuntu for desktops and Debian for servers so I can focus on the real work. It doesn't matter if my system is 500 milliseconds slower.
    Or you could just use apt-build.
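    For reference, a minimal apt-build session looks something like the sketch below, assuming the stock apt-build commands (packagename is a placeholder):

    sudo apt-get install apt-build       # also pulls in build-essential
    sudo apt-build update                # refresh the source package lists
    sudo apt-build install packagename   # fetch source, compile with your flags, install the .deb

    apt-build reads its optimization level and extra compiler flags from its config (typically /etc/apt/apt-build.conf), so the custom-CFLAGS use case above is covered without manual configure/make steps.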

    Quote Originally Posted by JS987 View Post
    If you install software with make install, you will break your machine sooner or later.
    Meh, that's mostly a myth; I did it for a long time without a single breakage.

  2. #22
    Join Date
    Sep 2012
    Posts
    343

    Default

    Quote Originally Posted by mrugiero View Post
    Meh, that's mostly a myth; I did it for a long time without a single breakage.
    You shouldn't break the system if you always install into a new directory, i.e. software compiled with ./configure --prefix=/usr/local/software-123
    You can easily break the system if you try to replace software that was installed from a .deb package with a newer version compiled with ./configure --prefix=/
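    To make the safe variant concrete, here is a sketch that stays out of dpkg's way by keeping everything under a versioned prefix (paths are illustrative):

    ./configure --prefix=/usr/local/software-123    # self-contained, versioned install directory
    make
    sudo make install                               # writes only below the prefix
    export PATH=/usr/local/software-123/bin:$PATH   # prefer the new build in this shell

    Since nothing under /usr or /bin is touched, removing the software later is a plain rm -rf of the prefix directory.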

  3. #23
    Join Date
    Jan 2011
    Posts
    1,287

    Default

    Quote Originally Posted by JS987 View Post
    You shouldn't break the system if you always install into a new directory, i.e. software compiled with ./configure --prefix=/usr/local/software-123
    You can easily break the system if you try to replace software that was installed from a .deb package with a newer version compiled with ./configure --prefix=/
    I'm aware, but it's really simple to avoid those problems. make install will surely not break your system if you do things right, and since doing them right is still considerably easier than packaging...
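    A middle ground between a bare make install and full packaging on Debian/Ubuntu is checkinstall, which wraps the install step in a quick-and-dirty .deb so dpkg at least knows which files belong to it (a sketch; checkinstall prompts interactively for the package metadata):

    sudo apt-get install checkinstall
    ./configure --prefix=/usr/local
    make
    sudo checkinstall    # runs make install, records the installed files and builds/installs a .deb

    The resulting package can later be removed cleanly with dpkg -r like any other.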

  4. #24
    Join Date
    Sep 2012
    Posts
    343

    Default

    Quote Originally Posted by mrugiero View Post
    I'm aware, but it's really simple to avoid those problems. make install will surely not break your system if you do things right, and since doing them right is still considerably easier than packaging...
    I was talking about upgrading/replacing software which is already installed from a .deb package. It isn't safe even if you create your own package. make install is dangerous in that case.
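    Before overwriting anything, it is easy to check whether a file is owned by a .deb, which is exactly the conflict being described here (a sketch; foo and /usr/bin/foo are placeholders):

    dpkg -S /usr/bin/foo    # which installed package owns this file, if any?
    dpkg -l foo             # is a packaged version of foo installed?

    If dpkg -S reports an owner, a make install over that path will be clobbered again by the next upgrade of the owning package.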

  5. #25
    Join Date
    Mar 2011
    Posts
    339

    Default

    Quote Originally Posted by frign View Post
    .....then even Arch unfortunately sets limits.
    Careful with that, it's a bit misleading. Yes, Arch is a binary distro, and while it does let you include only the things you want, it isn't as flexible as compiling your own... However, there is nothing stopping you from making your own packages or using the AUR. Though I guess you could do some of that stuff in Ubuntu as well... God I love Linux.
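    For the record, the Arch build workflow is short; a sketch assuming an AUR snapshot tarball has already been downloaded (packagename is a placeholder):

    tar -xzf packagename.tar.gz    # unpack the AUR snapshot containing the PKGBUILD
    cd packagename
    makepkg -si                    # -s pulls in build dependencies, -i installs the built package

    Global build options and compiler flags for makepkg live in /etc/makepkg.conf, so a degree of system-wide customization is possible on Arch too.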

  6. #26
    Join Date
    Oct 2012
    Location
    Cologne, Germany
    Posts
    308

    Careful with that

    Quote Originally Posted by nightmarex View Post
    Careful with that, it's a bit misleading. Yes, Arch is a binary distro, and while it does let you include only the things you want, it isn't as flexible as compiling your own... However, there is nothing stopping you from making your own packages or using the AUR. Though I guess you could do some of that stuff in Ubuntu as well... God I love Linux.
    Yes, no one stops me from recompiling software manually with just the features I want and then packaging it in my own repo.
    Theoretically it's possible, and it works in most cases.

    Gentoo, however, does that for me automatically and allows me to set _global_ USE flags, which is a tremendous simplification!

    If I wanted to strip PAM from all packages, I would first have to identify all packages depending on it (to be fair, that's still quite easy on Arch and Debian).
    Then I would have to get source tarballs for the respective programs and compile them without PAM. Packaging would need to be done, and the packages would have to be explicitly declared independent of the PAM libs to prevent aptitude from pulling the library back in accidentally.
    So there I go. If everything works, I can install the new packages and remove the PAM components manually.
    In Gentoo, I just need to add "-pam" to $USE (in /etc/portage/make.conf) and the rest is done automatically.
    Removing PAM is just a trivial example. What about libpng? What about removing all traces of ConsoleKit?
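    Concretely, the whole PAM example boils down to two steps on Gentoo; a sketch using standard Portage commands:

    # in /etc/portage/make.conf
    USE="-pam"

    # rebuild every installed package whose USE flags changed
    emerge --ask --changed-use --deep @world

    Portage recalculates the dependency graph with the new flag set, so nothing has to be tracked down by hand.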

    To be fair, we are talking about a binary distribution, and you have your freedoms by being able to package your own stuff. But if there is a library that many programs pull in as a dependency unnecessarily (which can be a security problem and is certainly a performance problem), then you can't get around a source-based distribution like Gentoo, as repackaging is a waste of time when it involves many packages.

    I like to put it this way: on Gentoo, the easy stuff is complex, but the more complex the tasks get, the easier they become. Overall, you don't have to worry about the easy stuff once you've taken care of it.

    Overall, I love GNU/Linux for being that flexible. This would never be possible with Windows or Mac OS.
    Last edited by frign; 07-13-2013 at 12:57 PM.

  7. #27
    Join Date
    Oct 2012
    Location
    Cologne, Germany
    Posts
    308

    QFT

    Quote Originally Posted by BO$$ View Post
    Not to mention unnecessary.
    Quote Originally Posted by BO$$
    Especially since it's unnecessary.
    qft.

    I don't think you are capable of perceiving the potential of these things (as I already told you).

  8. #28
    Join Date
    Mar 2013
    Posts
    63

    Default

    Quote Originally Posted by frign View Post
    And now compare it with Gentoo ...

    Ubuntu may have become faster, but it is still horrible bloatware.
    I am interested in Gentoo, but I couldn't find benchmarks comparing it to recent versions of binary-based distros such as Debian, Ubuntu, Fedora...

    Do you know how much performance you gain by going from a generic amd64 build to compiling with optimizations? The Gentoo site itself recommends the distro for owners of 8-core machines, but does not give any detail on why.

  9. #29
    Join Date
    Sep 2012
    Posts
    343

    Default

    Quote Originally Posted by BO$$ View Post
    Yes I am. It's just masturbation. There is no need to compile locally, just to show off maybe.
    It will likely contain less malware: no additional patches made by the NSA.

  10. #30
    Join Date
    Oct 2012
    Location
    Cologne, Germany
    Posts
    308

    Go for it!

    Quote Originally Posted by juanrga View Post
    I am interested in Gentoo, but I couldn't find benchmarks comparing it to recent versions of binary-based distros such as Debian, Ubuntu, Fedora...

    Do you know how much performance you gain by going from a generic amd64 build to compiling with optimizations? The Gentoo site itself recommends the distro for owners of 8-core machines, but does not give any detail on why.
    You should really go for it! I started with Gentoo in December on a quad-core Mac mini, which effectively presents eight logical cores with hyperthreading enabled. But you can use Gentoo even on a single-core machine, as compiling doesn't need to be supervised, and for very large programs binary versions are available in case of problems.

    Installing it for the first time is a _real_ challenge. I don't want to play it down in any way: my first installation took over 8 hours and the additional setup took days, but it was definitely worth it.
    You can't compare it to normal GNU/Linux distributions: my system boots in 3 seconds and you can literally tweak anything.

    It's not about extreme compiler flags (optimizations and the like), but more about what you compile into your software (shared libraries, generally speaking: dependencies).
    If you use a binary distribution and install GIMP, for instance, it pulls everything in: support for udev, image libraries, ACLs and so on.
    You don't need most of it, and compiling your own version of a given program can definitely yield positive results in regard to memory consumption, performance and startup speed.
    On top of that come a tremendous package manager (Portage), great documentation and an awesome community.
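    For what it's worth, the compiler-flag side of a typical make.conf is only a few lines; the USE flags do the heavy lifting (values here are illustrative, not a recommendation):

    # /etc/portage/make.conf
    CFLAGS="-march=native -O2 -pipe"   # tune for the local CPU
    CXXFLAGS="${CFLAGS}"
    MAKEOPTS="-j8"                     # parallel build jobs, roughly the number of cores
    USE="-pam -consolekit"             # global feature selection, as discussed above

    This is also roughly where the gain over a generic amd64 build comes from: -march=native plus trimmed dependencies, rather than any exotic optimization.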

    I reinstalled Gentoo a few months ago (don't get me wrong, one setup can literally be used for decades) and knew a lot about the system by then. I was finished pretty quickly, as I could easily move all the relevant configuration files to the new setup.

    All in all, the steep learning curve is worth it. Go for it!
