
Thread: Cabbages and Kings

  1. #1
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,514

    Default Cabbages and Kings

    This is a continuation of a thread which sorta started here :

    http://www.phoronix.com/forums/showt...t=16740&page=4

    re: effort required to keep fglrx running on new distro versions :
    Quote Originally Posted by Davy View Post
    I don't say that. We volunteer for this job, but this can be done with open source drivers only, because it can be done before the releases. And that's not what he said. He was talking about X server 1.6 support for fglrx legacy driver.
    That's what I was talking about too. I think we're all talking about the same thing -- I read your original post as suggesting that we not add new features but just keep adding support for new distro versions (kernel, xorg, dri, etc.); my response was that if it were as trivial as you suggest we would already be doing it.

    Quote Originally Posted by Davy View Post
    As Nanonyme said too, we are a little off topic and we should continue this interesting discussion (at least for me) somewhere else. Sorry for the noise guys.
    And here we are. I'm open to suggestions for a better thread title.

    re: the fact that we're not adding new OS support for pre-6xx on any OS, not just Linux :
    Quote Originally Posted by Davy View Post
    I didn't know that. Does it mean that it won't be possible to use Win$ 7 on those GPUs?
    MS is ensuring backward compatibility with the existing Vista drivers, although there are some performance and efficiency gains you don't get without supporting the Win7 native driver model.

    re: waiting for AIGLX
    Quote Originally Posted by Davy View Post
    I was waiting for compiz to work with fglrx and AIGLX (it was missing something like the texture_from_pixmap extension). And yes, you're right, what I wanted to see too is Redirected Direct Rendering. I've never been able to see RDR on my desktop, because the cooker has been using X server 1.6 for months (in fact it was 1.5.99.3, since December 30th).
    Ahh, if you updated in December then you probably never saw it working. If you get bored you might want to give it a try on one of the supported distros.

    re: 5xx and earlier parts not having dedicated video decode hardware other than the MPEG-2 IDCT engine :
    Quote Originally Posted by Davy View Post
    One more bad news. Perhaps, in the future, there will be a way to use the GPU power for video offloading through the VA API but in a more generic way as we are now talking about GPGPU
    This shouldn't be bad news; it's not like we broke into your house and stole the video decoder out of your chip.

    All the info required to accelerate video decoding with shaders has been out there for a year at least; if nobody else adds the support to the open drivers I'll ask one of our guys to do it after we get a bit further on 6xx-7xx 3D.

    re: the cost of supporting open source driver development :
    Quote Originally Posted by Davy View Post
    You are talking about legal issues for IP or something more technical?
    Writing the documentation, writing the first sample code, IP review, developer support... they all cost time and money. It's not like the docs and open source drivers are sitting around and we just have to stop being stupid.

    re: murdering fglrx if the open source driver performance and features become high enough to be viable in the workstation market :
    Quote Originally Posted by Davy View Post
    Well, let's say 95%, if you prefer. I don't think 5% more would be significant. Perhaps it could even be faster than the fglrx. Who knows?
    At 95% it might work, but we lose business over a single percentage point of performance. Even the devs don't think they'll get anywhere near 95% on complex apps, though...
    Last edited by bridgman; 05-04-2009 at 12:07 AM.

  2. #2
    Join Date
    Nov 2007
    Posts
    1,353

    Default

    Quote Originally Posted by bridgman View Post
    At 95% it might work, but we lose business over a single percentage point of performance. Even the devs don't think they'll get anywhere near 95% on complex apps, though...
    With how horribly fglrx performs compared to Windows, I wouldn't be surprised to see the open source drivers surpass it after a while. Right now the developers are working on implementing basic functionality. That work will be done sooner or later. Of course it is never ending, and soon a new chip or architecture will be released that requires the same effort every past generation required. In that sense the open source drivers may never get the same level of tuning that fglrx gets, but at the same time new functionality and features are being implemented at a much faster rate. It's only logical to assume that the open drivers will catch up to the same level of supported functionality. For example, once OpenGL 3.0 is implemented and stable on the open drivers, I wouldn't be at all surprised to see it at 85% of the performance of fglrx, but on the other hand I wouldn't be at all surprised to see it at 125% either, considering the open driver consistently outperforms fglrx in most metrics, on most loads, in most benchmarks, even right now while it is still sorely lacking in major functionality.

    The problem is that you can't do a direct 1-to-1 comparison between the open drivers and fglrx. The open drivers don't have the same level of functionality yet. And with fglrx, more than half of the "supported" functionality is so buggy that it isn't at all usable. Even though 2D support isn't fully implemented, the open driver still outperforms fglrx. Even though video decoding isn't supported at all, it still outperforms fglrx. Even though modesetting is still a bit buggy, dual monitors at least work with the open drivers, while they are completely impossible with fglrx due to massive show-stopping bugs.

    As far as the truth is concerned, the only performance advantage that fglrx has over the open driver is 3D support. Most of that advantage will disappear once OpenGL 3 comes along... and what little advantage it gets from tuning will be completely worthless, considering the open driver actually works and fglrx doesn't.

    So I guess the question is simply this:

    What would ATi do if the open driver comes out on top and outperforms fglrx in 3d tasks, just like it does now on everything else?

  3. #3
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,514

    Default

    I suspect we're talking about different things here. I'm talking about 3D, where the code is very similar across all the OSes. Sounds like you are talking about 2D and video, where the code is Linux-specific since the APIs are totally different. 3D performance with fglrx on Linux should be pretty much identical to Windows.

    2D and video acceleration are a few thousand lines of code each; 3D acceleration is a few million lines of code.

    If the open drivers ended up more functional and faster than fglrx in 3D we would be happy. Surprised, but happy. 2D and video are a different story; we expect the open drivers to move faster there since we're talking about Linux-specific APIs.

  4. #4
    Join Date
    Aug 2008
    Location
    Finland
    Posts
    1,675

    Default

    Quote Originally Posted by bridgman View Post
    If the open drivers ended up more functional and faster than fglrx in 3D we would be happy. Surprised, but happy. 2D and video are a different story; we expect the open drivers to move faster there since we're talking about Linux-specific APIs.
    I wonder if there would be any correlation between a decreased need to develop fglrx and an increased number of in-house open source developers, if that theoretical thing ever happened. The thing is, unless there are enough people coding the open source drivers in-house, we probably won't ever reach the point where cards actually work full-featured with open source drivers right when they are released. (Not that I know what the sufficient number of coders would be.)
    Edit: What I meant was "works full-featured but potentially not with the newest and shiniest platforms". As in, the in-house developers wouldn't need to worry about things like porting it to new X server versions etc.
    Last edited by nanonyme; 05-04-2009 at 05:53 AM.

  5. #5
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,514

    Default

    It's hard to say when we're talking about hypotheticals. Can we assume that the open source 3D driver is miraculously running faster and better on workstation apps without any performance tuning or per-application bug fixing ?

    Right now the 3D code is shared across multiple OSes, so there isn't that much Linux-specific work required. Unless this hypothetical new 3D driver were pretty much maintenance free it could easily take more Linux-specific effort to maintain than we spend on fglrx today -- since we would not be leveraging work done for other OSes.

    Are you assuming that the portion of the development community interested in 3D performance becomes much larger and takes care of these issues, or are you assuming a brilliant "write it once and it takes care of itself from there" breakthrough in driver design ?
    Last edited by bridgman; 05-04-2009 at 09:04 AM.

  6. #6
    Join Date
    May 2009
    Location
    France
    Posts
    9

    Default

    Quote Originally Posted by bridgman View Post
    This is a continuation of a thread which sorta started here :

    http://www.phoronix.com/forums/showt...t=16740&page=4

    re: effort required to keep fglrx running on new distro versions :


    That's what I was talking about too. I think we're all talking about the same thing -- I read your original post as suggesting that we not add new features but just keep adding support for new distro versions (kernel, xorg, dri, etc.); my response was that if it were as trivial as you suggest we would already be doing it.
    I must admit that I have no idea of the work needed to support X server 1.6 in the legacy fglrx driver.
    I just feel frustrated to see my hardware not well supported by any new distro at this very moment.
    But I keep hoping for a better future (soon).

    Quote Originally Posted by bridgman View Post
    And here we are. I'm open to suggestions for a better thread title.
    Here are some proposals:
    - "Brave New World" (I read this one in its French translation)
    - "Desperation & Hope" (very versatile, but the order is important )
    - "Renaissance"


    Quote Originally Posted by bridgman View Post
    re: waiting for AIGLX

    Ahh, if you updated in December then you probably never saw it working. If you get bored you might want to give it a try on one of the supported distros.
    Well, it is a possibility, but I'd rather just downgrade my X server to the one used in the previous Mandriva release.
    In the meantime, I'd like to try and help debug the new free driver, and I don't much like the idea of running two different systems at the same time.
    I update my system almost every day, as I use the Mandriva cooker...
    Don't tell me that I shouldn't run the cooker. I know the rules and the breakages and I accept them.
    The problem is that the brand new stable release has this problem too.
    I understand that you don't care about Mandriva, but Ubuntu 9.04 should have the same problem too.


    Quote Originally Posted by bridgman View Post
    re: 5xx and earlier parts not having dedicated video decode hardware other than the MPEG-2 IDCT engine :


    This shouldn't be bad news; it's not like we broke into your house and stole the video decoder out of your chip.
    kinda

    Quote Originally Posted by bridgman View Post
    All the info required to accelerate video decoding with shaders has been out there for a year at least; if nobody else adds the support to the open drivers I'll ask one of our guys to do it after we get a bit further on 6xx-7xx 3D.
    This is good news. Thank you! But we agree that there are more important things to finish first.

    Quote Originally Posted by bridgman View Post
    re: the cost of supporting open source driver development :


    Writing the documentation, writing the first sample code, IP review, developer support... they all cost time and money. It's not like the docs and open source drivers are sitting around and we just have to stop being stupid.
    Do you mean that there was no documentation before you took the decision to go the OSS way?

    Quote Originally Posted by bridgman View Post
    re: murdering fglrx if the open source driver performance and features become high enough to be viable in the workstation market :


    At 95% it might work, but we lose business over a single percentage point of performance. Even the devs don't think they'll get anywhere near 95% on complex apps, though...
    Really?! People are crazy: even 2% or 3% is not significant!
    It sounds like: mine is bigger than yours

    P.S:
    It's very nice of you to take the necessary time to answer our questions. I guess you have to repeat the same things every time, and staying calm with angry people must sometimes be hard. Do you take any medication to keep cool?
    I really appreciate it, and I'd like to thank you.

  7. #7
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,514

    Default

    Quote Originally Posted by Davy View Post
    Here are some proposals:
    - "Brave New World" (I read this one in its French translation)
    - "Desperation & Hope" (very versatile, but the order is important )
    - "Renaissance"
    Tough choice; they all apply depending on your point of view.

    Quote Originally Posted by Davy View Post
    Do you mean that there was no documentation before you took the decision to go the OSS way?
    Not this kind of documentation. Each new GPU design involves thousands of pages of hardware and software design documents (probably tens of thousands; I've never added them all up), but we co-locate the hardware and software design teams so driver code and hardware can be developed collaboratively. Since we use a unified driver, a lot of the "how to program the chip" knowledge is inherited from pre-existing code, so there was no need for self-contained "here's how to program XYZ" documentation before we started the open source project.

    Quote Originally Posted by Davy View Post
    Really?! People are crazy: even 2% or 3% is not significant! It sounds like: mine is bigger than yours
    The issue is that the major buying decisions are made by OEM product design teams, not individuals, so to some extent they have to decide based on specs.

    Quote Originally Posted by Davy View Post
    P.S:
    It's very nice of you to take the necessary time to answer our questions. I guess you have to repeat the same things every time, and staying calm with angry people must sometimes be hard. Do you take any medication to keep cool?
    No, but I live in the middle of a 70-foot pine forest, so just going outside and walking around for a while helps a lot.

    The forest isn't great for satellite internet, of course. Dial-up sucks.

  8. #8
    Join Date
    Nov 2007
    Posts
    1,353

    Default

    Quote Originally Posted by bridgman View Post
    It's hard to say when we're talking about hypotheticals. Can we assume that the open source 3D driver is miraculously running faster and better on workstation apps without any performance tuning or per-application bug fixing ?
    I think that is the very reason why fglrx sucks so horribly right now. Per-application bug fixes? Are you kidding me? So what you're saying is that rather than writing the driver to meet the standards set forth in the API, you'll just do what you want and then make it "work" later?

    How about this as a novel idea: develop the driver to meet the standards set forth in the API, and let the developers of the application do the same. The biggest benefit of an open source model, in my opinion, is that the application developers can see exactly how the driver works, and how best to optimize for it. And of course vice versa.

    The reason the open driver won't be tuned (even though the tuning potential is far greater) is that the developers you could be paying to do it are being wasted on tuning fglrx. Don't try telling me that they would be doing it anyway because of the shared code base. You just said hand tuning on a per-application basis, and since we're talking about Linux, we're talking about Linux applications and therefore Linux programmers.

    Why not let them develop on a rewarding project that actually works?

  9. #9
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,514

    Default

    Quote Originally Posted by duby229 View Post
    I think that is the very reason why fglrx sucks so horribly right now. Per-application bug fixes? Are you kidding me? So what you're saying is that rather than writing the driver to meet the standards set forth in the API, you'll just do what you want and then make it "work" later?
    No, I'm saying that after writing the driver to meet the standards set forth in the API, and after passing the compliance tests, when the application *still* doesn't work the customers expect the driver developers to "make it work", no matter where the real problem lies. We all have dev rel teams to work with the application developers, but the reality is that once an application ships you have to live with what was shipped.

    Quote Originally Posted by duby229 View Post
    How about this as a novel idea: develop the driver to meet the standards set forth in the API, and let the developers of the application do the same. The biggest benefit of an open source model, in my opinion, is that the application developers can see exactly how the driver works, and how best to optimize for it. And of course vice versa.
    In a perfect world with perfect standards all correct implementations would be interchangeable. Let me know if you find either.

    Quote Originally Posted by duby229 View Post
    The reason the open driver won't be tuned (even though the tuning potential is far greater) is that the developers you could be paying to do it are being wasted on tuning fglrx. Don't try telling me that they would be doing it anyway because of the shared code base.
    The 3D code base is common across multiple OSes, and tuning is shared across the OSes. I know you don't want me to say that but that's just the way it is.

    Quote Originally Posted by duby229 View Post
    You just said hand tuning on a per-application basis, and since we're talking about Linux, we're talking about Linux applications and therefore Linux programmers.
    No, I said per application bug fixing, not per application hand tuning. Please check the text you quoted.
    Last edited by bridgman; 05-04-2009 at 08:08 PM.

  10. #10
    Join Date
    Nov 2007
    Posts
    1,353

    Default

    Quote Originally Posted by bridgman View Post
    No, I'm saying that after writing the driver to meet the standards set forth in the API, and after passing the compliance tests, when the application *still* doesn't work the customers expect the driver developers to "make it work", no matter where the real problem lies. We all have dev rel teams to work with the application developers, but the reality is that once an application ships you have to live with what was shipped.
    And that is exactly why fglrx sucks so badly. If that train of thought actually worked, why is fglrx so terrible across the board? It doesn't work with most applications. It isn't standards compliant. It doesn't work with most APIs. I could perfectly understand your argument if what you say were true, but it isn't. That methodology clearly doesn't work, as proven by fglrx.

    In a perfect world with perfect standards all correct implementations would be interchangeable. Let me know if you find either.
    Consider the Linux kernel. I consider the kernel to be very well developed and stable. The API does tend to change and break compatibility between versions, but within a version everything works perfectly. And this is just the way it is, whether you like it or not. The Linux kernel maintainers are not going to change their methodology now or any time soon. The only way that kernel compatibility can be guaranteed between versions is to get the code into the kernel. That's just the way it is, and you can't do that with a closed driver. The same is true for most other open source projects, the X server being another good example: the DDX drivers that ship with a given version work with that version. ALSA, CUPS, D-Bus, GStreamer, lm-sensors, etc.: the drivers and libraries that ship with a version work with that version. This is how open source works, and it --IS-- a much better methodology. We don't have to maintain years-old legacy compatibility with broken and incomplete APIs because of it. If an API is broken in the last version, we actually have the ability to fix it in the next version. It truly is a miracle of modern technology that simply is --not-- possible with proprietary software that expects and needs the API to stay the same no matter what. Proprietary software requires stagnation. Open source software eliminates that requirement and makes possible true and genuine innovation.

    The 3D code base is common across multiple OSes, and tuning is shared across the OSes. I know you don't want me to say that but that's just the way it is.
    And as I already said, that is exactly what the problem is. I know that's not what you want me to say, but that's just the way it is...

    No, I said per application bug fixing, not per application hand tuning. Please check the text you quoted.
    Which still requires Linux programmers wasting their time on a dead-end project that doesn't work and never will. Which brings me back to my question... Why not let them develop on a rewarding project that actually works?

    (edited for clarity)
    Last edited by duby229; 05-04-2009 at 09:46 PM.
