
Thread: AMD's opensource lies exposed

  1. #51
    Join Date
    Nov 2009
    Location
    Italy
    Posts
    934


    Quote Originally Posted by glxextxexlg View Post
    Did you check your cpu usage?
    Yes, no more than 50%.

  2. #52
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,584


    Quote Originally Posted by Wyatt View Post
    I distinctly remember the 17" monitor I got in 2001 supporting 2560x1920.
    Got a make and model number for that? I have my doubts.

  3. #53
    Join Date
    Oct 2010
    Posts
    145


    Why do you say AMD/ATI are lying? Who do you think are the developers of the free ATI driver?
    Good question. AMD is lying because they will never deliver open-source OGL 3.x/4.x and video acceleration support to their customers. AMD dropped support for r300-r500 hardware in early 2009 and left development of the open drivers to independent developers like Marek Olsak and Corbin Simpson. It took them a very long time and a tremendous amount of unpaid hard work to make r300g an OK driver, and it is still not ready for mainstream use.
    And yes, they will drop support for their "older" r600/r700 hardware. And people won't be able to play new games released for Linux (Oil Rush etc.) that use OGL 3.x/4.x, or do the magic things available with HTML5 WebM and WebGL, for a very long time on Linux, because r600/r700 development will be shifted to independent devs and that hardware is orders of magnitude more complex than r300-r500 hardware.

    The result is people saying:
    Quote Originally Posted by people
    this Linux is not a good OS coz it doesn't even run a native game developed for it.

  4. #54
    Join Date
    Jun 2009
    Posts
    2,929


    Quote Originally Posted by glxextxexlg View Post
    Good question. AMD is lying because they will never deliver open-source OGL 3.x/4.x and video acceleration support to their customers.
    They never claimed they would, so they can't be lying, by definition.

    AMD dropped support for r300-r500 hardware in early 2009 and left development of the open drivers to independent developers like Marek Olsak and Corbin Simpson. It took them a very long time and a tremendous amount of unpaid hard work to make r300g an OK driver, and it is still not ready for mainstream use.
    It took them a long time, because the Gallium stack and KMS were not ready yet, and this is clearly not Marek's or Corbin's fault.

    And yes, they will drop support for their "older" r600/r700 hardware. And people won't be able to play new games released for Linux (Oil Rush etc.) that use OGL 3.x/4.x, or do the magic things available with HTML5 WebM and WebGL, for a very long time on Linux, because r600/r700 development will be shifted to independent devs and that hardware is orders of magnitude more complex than r300-r500 hardware.
    This is all lots of ifs and maybes, 3 years from now.

    Most of the OpenGL 3 infrastructure in Mesa is almost ready, and r600/r700 can only do OpenGL 3, no more. So actually, we're well on our way.

    But this is another thing that needs to be sorted out in Mesa first, before the HW-specific drivers can support it. I would also like to see more corporate funding of Mesa (Intel and VMware are doing most of the funding now), but AMD is still doing more for Mesa than Nvidia.

  5. #55
    Join Date
    Jan 2010
    Location
    Portugal
    Posts
    945


    Quote Originally Posted by Wyatt View Post
    I distinctly remember the 17" monitor I got in 2001 supporting 2560x1920. I kept it at 1600x1200 so I could get the 100Hz refresh rate. Not 1280x1024. Not 1280x720. Not 1920x1200. Do you know how many displays (of any size) have been made in the past five years that can touch that?
    Note: not to be taken seriously. Since this thread is pretty much doomed anyway, here's my contribution to the flame war:

    What??? What kind of super-duper-nuclear-fusion-powered 17" monitor was that?? I have an Eizo T960 21" CRT that tops out at 2048x1536@75Hz (default modes only), and everything is just so small at that resolution that I would never even consider using it a whole day. Oh, and it has a maximum 115 kHz horizontal scan frequency, which is pretty high when it comes to CRTs, and it can "only" go up to 92 Hz at 1600x1200 (the resolution I use). The monitor you describe would need at least a 125 kHz horizontal scan frequency, which wouldn't make much sense on a 17" monitor. And the only way to get those kinds of high-resolution video modes would be to create them manually in X.org or with xrandr.
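    devius's scan-rate figure can be sanity-checked with a little arithmetic: the horizontal scan rate is roughly the number of scan lines per frame (active lines plus vertical blanking) times the refresh rate. A minimal Python sketch, assuming ~5% of each frame is vertical blanking (a typical GTF-era ballpark, assumed here rather than taken from the thread):

```python
def horizontal_scan_khz(active_lines, refresh_hz, v_blank_fraction=0.05):
    """Rough horizontal scan rate a CRT needs for a given mode, in kHz.

    Assumes ~5% of the frame time is vertical blanking; real modelines
    vary, so treat the results as ballpark figures only.
    """
    total_lines = active_lines / (1.0 - v_blank_fraction)
    return total_lines * refresh_hz / 1000.0

# The disputed 17" mode: 2560x1920 (1920 active lines)
print(round(horizontal_scan_khz(1920, 60), 1))   # 121.3 kHz, already at a flickery 60 Hz
print(round(horizontal_scan_khz(1920, 75), 1))   # 151.6 kHz at a CRT-comfortable 75 Hz

# devius's Eizo T960 at 1600x1200@92Hz, for comparison
print(round(horizontal_scan_khz(1200, 92), 1))   # 116.2 kHz, right at its ~115 kHz limit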

  6. #56
    Join Date
    Jul 2008
    Posts
    359


    Quote Originally Posted by glxextxexlg View Post
    Good question. AMD is lying because they will never deliver open-source OGL 3.x/4.x and video acceleration support to their customers. AMD dropped support for r300-r500 hardware in early 2009 and left development of the open drivers to independent developers like Marek Olsak and Corbin Simpson. It took them a very long time and a tremendous amount of unpaid hard work to make r300g an OK driver, and it is still not ready for mainstream use.
    And yes, they will drop support for their "older" r600/r700 hardware. And people won't be able to play new games released for Linux (Oil Rush etc.) that use OGL 3.x/4.x, or do the magic things available with HTML5 WebM and WebGL, for a very long time on Linux, because r600/r700 development will be shifted to independent devs and that hardware is orders of magnitude more complex than r300-r500 hardware.

    The result is people saying:
    http://www.techeye.net/hardware/amd-...dows-and-linux

    I seriously doubt that an r500 would be able to play Oil Rush anyway.

  7. #57
    Join Date
    Oct 2010
    Posts
    145


    It took them a long time, because the Gallium stack and KMS were not ready yet, and this is clearly not Marek's or Corbin's fault.
    I never implied that it's Marek's or Corbin's fault, and I have the utmost respect for their efforts. It's AMD's fault as the manufacturer: AMD didn't help the X and Mesa developers implement the Gallium3D API and KMS+DRI2. Ironically, it was a rival, Intel, who helped them! My r580 uses the Intel GLSL bits as it renders 3D applications, hohoho. The same thing is happening with the Mesa OpenGL 3.x/4.x bits right now.

  8. #58
    Join Date
    Jun 2009
    Posts
    2,929


    Intel does do more for the open source stack than AMD, and I respect them for it.

    How much did Nvidia help with Mesa and kernel DRM? How much did they help the nouveau hackers?

    Banging on AMD while praising Nvidia is hypocritical.

    All of us would be happier if AMD hired another 3-4 full-time developers and the drivers progressed faster. But the facts still are that:

    - they have opened specs and continue to do so
    - they employ two full-time OSS developers who contribute to the kernel and Mesa, as well as the X drivers
    - all of their GPUs (save a couple of the most advanced ones from the HD6xxx line) are supported by OSS drivers and provide a good desktop experience. Neither Intel nor Nvidia support all of their chips under Linux!

    That's very commendable and worthy of support, in my opinion. So far, I'm happy with the development in the last two years.

    Now would be a great time to throw some cash into Mesa and video decoding, and I'd be very glad if AMD did this. However, they didn't pledge to do this.

  9. #59
    Join Date
    Nov 2009
    Location
    Italy
    Posts
    934


    Quote Originally Posted by pingufunkybeat View Post
    - they employ two full-time OSS developers who contribute to the kernel and Mesa, as well as the X drivers
    Three. They were hiring a while ago, but I still don't know who the third developer is.

  10. #60
    Join Date
    May 2007
    Location
    Third Rock from the Sun
    Posts
    6,584


    Quote Originally Posted by devius View Post
    Note: not to be taken seriously. Since this thread is pretty much doomed anyway, here's my contribution to the flame war:

    What??? What kind of super-duper-nuclear-fusion-powered 17" monitor was that?? I have an Eizo T960 21" CRT that tops out at 2048x1536@75Hz (default modes only), and everything is just so small at that resolution that I would never even consider using it a whole day. Oh, and it has a maximum 115 kHz horizontal scan frequency, which is pretty high when it comes to CRTs, and it can "only" go up to 92 Hz at 1600x1200 (the resolution I use). The monitor you describe would need at least a 125 kHz horizontal scan frequency, which wouldn't make much sense on a 17" monitor. And the only way to get those kinds of high-resolution video modes would be to create them manually in X.org or with xrandr.
    Ya, the bullshit-o-meter went to 11 (on a scale of 1 to 10) on that one.
