
Thread: X.Org Server 1.11.3 Officially Released

  1. #1
    Join Date
    Jan 2007
    Posts
    15,098

    Default X.Org Server 1.11.3 Officially Released

    Phoronix: X.Org Server 1.11.3 Officially Released

    X.Org Server 1.11.3 was released by Apple's Jeremy Huddleston prior to starting the weekend...

    http://www.phoronix.com/vr.php?view=MTAyOTU

  2. #2
    Join Date
    Aug 2009
    Location
    south east
    Posts
    342

    Default xorg is slowing down on old hardware

    I often wonder what test hardware the Xorg developers actually use.
    Intel HD10000, ATI 9999, Nvidia 974 ZT:
    I'm exaggerating the model numbers, but it's hard to tell the premium parts from the heavy-duty ones.
    I've just been noticing Xorg slowing down.

    My oldest machine -- a P3 with a Trident chip -- chokes on just X and fluxbox.
    Everything is configured for optimal settings. Even tricks like disabling PAE and compiling for the exact chipset don't yield any better performance.
    Then I wondered whether glibc had some sort of emulation for CPU extensions.
    I tried a second machine -- a P4 Celeron with a PCI Nvidia 6200 -- and it chokes on a lot it shouldn't.

    So what I did was take an old distribution -- Slack 12 / 2.6.21 -- and compile a newer kernel for it (2.6.36). I didn't see any regressions in Xorg performance.
    So that rules out the kernel.
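    A quick sketch of that before/after comparison: run the same x11perf benchmark under each kernel, save the output, and compare the ops/sec figures. The file names, result lines, and numbers below are made up for illustration; in practice you'd capture real `x11perf -aa10text` output under each kernel.

```shell
# Hypothetical kernel-swap test: same x11perf run saved under two kernels.
# Sample result lines stand in for real saved output; numbers are invented.
old=/tmp/x11perf-2.6.21.txt
new=/tmp/x11perf-2.6.36.txt
echo '2000000 trep @   0.0005 msec (2000000.0/sec): Char in 80-char aa line' > "$old"
echo '1980000 trep @   0.0005 msec (1980000.0/sec): Char in 80-char aa line' > "$new"

# Pull the ops/sec figure out of an x11perf result line.
rate() { grep -o '([0-9.]*/sec)' "$1" | tr -d '(/sec)'; }

echo "2.6.21 kernel: $(rate "$old") ops/sec"
echo "2.6.36 kernel: $(rate "$new") ops/sec"
```

    If the rates stay within noise of each other across kernels, the kernel is ruled out as the source of the slowdown, which is the conclusion drawn above.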

    It may well be that nobody uses these ancient systems anymore. I've just lost touch with where OSS is going in terms of the general population of hardware.
    My core-duo/i945 is starting to show its age.


    The Matrox G200 still ships in servers sold by Dell. 1600x900 isn't going to happen -- maybe at 8bpp, but I've no proof.
    The vesa driver is still usable, but only at reduced resolutions, and even then I've found it unbearable.


    Just some random thoughts I had.
    I noticed nobody has posted to this thread yet.

  3. #3
    Join Date
    Apr 2011
    Posts
    405

    Default

    I think most people have updated their hardware since 1999...

  4. #4
    Join Date
    Feb 2008
    Location
    California
    Posts
    79

    Default

    Quote Originally Posted by squirrl View Post
    I often wonder what test hardware the Xorg developers actually use.
    ...
    It may well be that nobody uses these ancient systems anymore. I've just lost touch with where OSS is going in terms of the general population of hardware.
    You're a victim of the commercial success of open source. When it was just people hacking for a hobby, they paid a lot of attention to the systems they could afford - older, cheaper hardware.

    But today, most of the work on X.Org is done by employees of vendors — either hardware vendors like Intel, AMD, and nvidia, whose employees also have to ensure the new hardware coming out is supported, so are often using it; or OS/distro vendors like Red Hat, SuSE, and Oracle, whose employees are paid to ensure it works on the hardware used by the customers paying for the enterprise support, which are typically companies that have discarded the old machines as fully depreciated and costing more (either directly or in lost productivity) to continue to maintain than replace.

    A lot of the old video card drivers for X.Org are kept minimally building by the devs, in hopes that's useful for someone, but few of those ancient cards are tested, either for functionality or performance, and no community members who care have stepped up to help out with them, so they mostly sit and bit-rot.

  5. #5
    Join Date
    Apr 2011
    Posts
    405

    Default

    Quote Originally Posted by alanc View Post
    You're a victim of the commercial success of open source. When it was just people hacking for a hobby, they paid a lot of attention to the systems they could afford - older, cheaper hardware.

    But today, most of the work on X.Org is done by employees of vendors — either hardware vendors like Intel, AMD, and nvidia, whose employees also have to ensure the new hardware coming out is supported, so are often using it; or OS/distro vendors like Red Hat, SuSE, and Oracle, whose employees are paid to ensure it works on the hardware used by the customers paying for the enterprise support, which are typically companies that have discarded the old machines as fully depreciated and costing more (either directly or in lost productivity) to continue to maintain than replace.

    A lot of the old video card drivers for X.Org are kept minimally building by the devs, in hopes that's useful for someone, but few of those ancient cards are tested, either for functionality or performance, and no community members who care have stepped up to help out with them, so they mostly sit and bit-rot.
    You mean I can buy a new system and it will probably work, but your ancient junk from 1999 might not work as well as it did in 1999 (provided it even worked with Linux in 1999, since most off-the-shelf PC hardware back then was designed completely around whatever Microsoft was doing, unlike the minimal standards we have now).

    The horror.

  6. #6
    Join Date
    Aug 2007
    Posts
    6,641

    Default

    The problem with old systems is that you cannot buy a new gfx card for them, as they usually did not use PCI-E. The few "new" AGP cards are basically a dead end the day you buy them; similarly with PCI cards, which might work with newer systems, but often even the onboard/chip solutions are faster. So in most cases the "solution" will be to replace the system, even if it would still be fast enough to surf the web (maybe without youtube).

    Another problem is that RAM is really expensive for old systems, unless you are lucky enough to get used RAM for free. The minimum needed to buy a system that could be upgraded is about 100 with an outdated AMD board (sometimes with an NV chipset) + basic CPU + RAM. This is sometimes even cheaper than getting a new AGP card...

  7. #7
    Join Date
    Apr 2011
    Posts
    405

    Default

    Quote Originally Posted by Kano View Post
    The problem with old systems is that you cannot buy a new gfx card for them, as they usually did not use PCI-E. The few "new" AGP cards are basically a dead end the day you buy them; similarly with PCI cards, which might work with newer systems, but often even the onboard/chip solutions are faster. So in most cases the "solution" will be to replace the system, even if it would still be fast enough to surf the web (maybe without youtube). Another problem is that RAM is really expensive for old systems, unless you are lucky enough to get used RAM for free. The minimum needed to buy a system that could be upgraded is about 100 with an outdated AMD board (sometimes with an NV chipset) + basic CPU + RAM. This is sometimes even cheaper than getting a new AGP card...
    Anyone who thought AGP had a future had a few screws loose.

    The number of old systems still working, or in the hands of people capable of supporting them, is so small that the plan for supporting ancient graphics cards, which will go into effect next year, is to use them only for the VGA port and the 2d engine, and to use llvmpipe for 3d.

    Since a lot of those cards are OpenGL 1.5 and DirectX 7 at best, I struggle to think about what you could still be doing with them that the 3d engine is suited for. Join the 21st century, we have cake.

  8. #8
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,182

    Default

    Why yes, let's have the P2s and P3s run llvmpipe, which is only about 100x slower than their GL1.5 cards. It can't even play OA properly on the latest six- and eight-core beasts, so obviously it must be fine on CPUs several generations older.

  9. #9
    Join Date
    Nov 2008
    Location
    Madison, WI, USA
    Posts
    877

    Default

    Quote Originally Posted by Kano View Post
    The problem with old systems is that you cannot buy a new gfx card for them, as they usually did not use PCI-E. The few "new" AGP cards are basically a dead end the day you buy them; similarly with PCI cards, which might work with newer systems, but often even the onboard/chip solutions are faster. So in most cases the "solution" will be to replace the system, even if it would still be fast enough to surf the web (maybe without youtube). Another problem is that RAM is really expensive for old systems, unless you are lucky enough to get used RAM for free. The minimum needed to buy a system that could be upgraded is about 100 with an outdated AMD board (sometimes with an NV chipset) + basic CPU + RAM. This is sometimes even cheaper than getting a new AGP card...
    I know they're not exactly high-end, but you can still buy fairly new PCI graphics cards... Newegg is currently listing a Radeon 5450 PCI, and they've also got a GeForce GT 430. I just installed a PCI GeForce 8400GS in an HTPC that I built for the in-laws (Nvidia blob + VDPAU = 1080p live TV on a P4 Celeron in MythTV), and it works just fine.

    My recommendation if you have a system that can't handle modern 3D software is this: stick to the version of the software that performs the best for your hardware.

    I've got a P4 laptop with an i820 chipset. It ran like crap on Ubuntu 11.10 (no KMS support = VESA driver), so I downgraded to 10.04 LTS and everything works much better. I'll have to stick with PPAs for Firefox and other updates, but the machine performs just fine with 10.04.
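    A quick way to confirm that kind of VESA fallback is to grep the Xorg log for which driver modules were actually loaded. The snippet below stands in for a real /var/log/Xorg.0.log; the timestamps, module names, and failure line are invented for illustration.

```shell
# Hypothetical check of which driver Xorg ended up loading.
# A made-up log snippet stands in for the real /var/log/Xorg.0.log.
log=/tmp/xorg-sample.log
cat > "$log" <<'EOF'
[    12.345] (II) LoadModule: "intel"
[    12.350] (EE) Failed to load module "intel" (module does not exist, 0)
[    12.360] (II) LoadModule: "vesa"
[    12.365] (II) VESA(0): initializing int10
EOF

# List the driver load attempts; (EE) lines mark failures, so the last
# LoadModule without a following failure is the driver actually in use.
grep 'LoadModule' "$log"
```

    In the sample, the native driver failed to load and the server fell back to "vesa", which is exactly the no-KMS situation described above.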

  10. #10
    Join Date
    Apr 2011
    Posts
    405

    Default

    Quote Originally Posted by curaga View Post
    Why yes, let's have the P2's and P3's run llvmpipe, which is only about 100x slower than their GL1.5 cards. It can't even play OA on the latest six and eight core beasts properly, so it must be ok on cpus several gens older obviously.
    The Playstation 2 situation is different. I'm unaware of any major new games out on it, the most recent stuff seems to be stripped down backports of engines that ran on the Wii.

    The Playstation 2 doesn't use DirectX or OpenGL. It has its own 3d libraries, as did the PS1, as does the PS3. In each case it's because the PS1 and PS2 are too weak to handle OpenGL 1, and even though the PS3 could handle OpenGL 3.3 on that Nvidia card, the last thing Sony wants is to encourage developers to port off the Playstation 3, so instead of "Open" they went with a proprietary API.

    Sure, some of it is documented; it has to be, because if it weren't, nobody could develop for it. It's all under NDA though, so if the general public ever sees any of it, it's because someone out there has violated their non-disclosure agreement with Sony. It also means there's a lag between API updates and when public information about them becomes available. And since it's basically tied to the design of the PS3, it wouldn't do anything useful for people who would theoretically want to implement it elsewhere.

    Getting a title onto the PS3, XBOX 360, and Windows is an unholy mess. The console market is so fragmented that developers often have no choice but to rewrite their engine's 3d API backend just to get it to run on the PS3. DirectX on the XBOX 360 and Windows XP is basically stuck at version 9, which is one reason why even new PC games like Skyrim target an 11-year-old Microsoft API, even though Microsoft declared it obsolete in 2006 with the release of Windows Vista. No developer sees the Windows PC as enough sales anymore to make it worth extending their engine to use features of newer DirectX versions, when something like half of all Windows users are still on XP and they have the obsolete garbage GPU in the XBOX 360 to deal with, which is essentially a pre-HD Radeon.

    So you get a DirectX9 target for Windows (all versions) and XBOX 360. A Sony PS3 proprietary API rewrite for the PS3. Maybe (if you can scale down your engine further still and want to target Nintendo's 3d API) you get a Wii port. You've already written a major chunk of your engine three times by this point.

    It amuses me that people wonder why the PC gaming market is weak. It's because consoles like the XBOX 360 and PS3 suck up all the developer resources, holding them to a lowest-common-denominator API (D3D9) on the XBOX 360 and forcing them to rewrite a chunk of their engine all over again to get it on the PS3. After the rewrite, the PS3 version of the game doesn't look any better. The goal wasn't for Sony to make a better API; the goal was for Sony to throw its weight around and force developers to decide whether the XBOX 360 and the PC are worth supporting too. Sony doesn't have the lead they were hoping to maintain over Microsoft; they've each sold about 55 million consoles this generation.

    Then you wander into the issue that Microsoft and Sony both want to milk their current console designs as long as they can, so neither of them officially has a new console in the works as they continue to push hardware from 2005. Then again, they could just be trying not to scare customers away from buying the current console, to see if they can squeeze one more Christmas season out of what they have. I wouldn't put it past them; it's been done before.

    -------

    Going back to tie this all up with X.org, Mesa, and the Linux kernel, which Sony banned on the PS3 and then bribed a US federal judge to dismiss the class action lawsuit over it... If you want to run accelerated OpenGL on the PS3, you'd need to crack the firmware, which might not be legal, then install a Linux distribution on it, and then hell if I know whether Nouveau will ever run on the old Nvidia card + customizations that Sony used in that console... Current information says it won't, which means you are in fact stuck with llvmpipe for OpenGL support. The Cell hardware itself was targeted at one point, but the driver has been unmaintained and broken for some time now due to lack of interest. When Sony officially supported "OtherOS", the hypervisor would not allow Linux access to the GPU at all, so the only graphics support you had was the vesa driver, which normally (on a real PC anyway) is a "WTF!? How did I get here!?" situation.

    Sony is not very... open. Microsoft's DirectX is actually more open than the 3d APIs Sony uses. So there you have it: OpenGL on llvmpipe can be yours for the time and effort of cracking your way through a DRM-laden PS3 firmware, voiding the warranty, and installing a Linux distribution. Sony has never allowed OpenGL on their hardware in *any* official context. It's not part of their SDK, it's not supported in the PS3's OS at all, it wasn't allowed when they let Linux run restricted by their hypervisor, and it only works with llvmpipe after you illegally mod your console and run Linux on it.
    Last edited by DaemonFC; 12-19-2011 at 08:20 PM.
