
Thread: Radeon Driver To Support ATI R500/600

  1. #21
    Join Date
    Oct 2007
    Posts
    178

    Default

    Quote Originally Posted by libv View Post
    Compared to modesetting, proper modesetting, these bits are tiny. So it makes no technical sense to hack avivo-style support into an already huge and hard-to-maintain driver, while porting 2D acceleration over to radeonhd is comparatively little work and very manageable.
    Sounds to me as if you developers should devote some time to this, then, so that you quickly get a group of users with already-supported cards who can help test it in parallel with your modesetting work.

  2. #22
    Join Date
    Jul 2007
    Posts
    404

    Default

    When you say 2d acceleration, do you mean XAA? It seems rather counter-productive to work on XAA these days, considering that it provides essentially no performance benefit and only takes a slight load off the CPU. I think moving directly to work on EXA is more productive.

    It is also more difficult, because EXA uses parts of the 3d engine, needs DRM support, etc., but it would be a much more effective form of acceleration in the long run.

    As for overlays, I think the issue is rather similar. I would expect at some point a TexturedVideo-style implementation, using the 3d engine to do overlays.

    Of course the big gotcha here is that we still need specs for the 3d engine, but hopefully we'll have those in a month or two.

    In that vein, I'd say yes, the X11 driver work needed at this point is not huge, but a lot of work will need to be done on DRM and DRI. I'm hoping the 3d driver ends up being a Gallium rather than a Mesa driver, because that seems to be where the future of GFX drivers for linux is.
    Last edited by TechMage89; 11-23-2007 at 05:59 PM.

  3. #23
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,463

    Default

    Quote Originally Posted by TechMage89 View Post
    When you say 2d acceleration, do you mean XAA? It seems rather counter-productive to work on XAA these days, considering that it provides essentially no performance benefit and only takes a slight load off the CPU. I think moving directly to work on EXA is more productive. It is also more difficult, because EXA uses parts of the 3d engine, needs DRM support, etc., but it would be a much more effective form of acceleration in the long run.
    My understanding is that we need both -- some apps (I'm told) require XAA, while everyone agrees that EXA is the superior approach but needs more work (and probably integration with TTM) before it's really going to shine.

    My non-technical impression is that everyone agrees EXA is far superior, even though all the available EXA implementations seem to be dog-slow.

    I think we're all on the same page here. We need to get the bottom-level infrastructure (DRM etc.) in place anyway, not spend much time on XAA, and do everything we can to help EXA evolve and improve to the point where it works for everyone.

    Quote Originally Posted by TechMage89 View Post
    As for overlays, I think the issue is rather similar. I would expect at some point a TexturedVideo-style implementation, using the 3d engine to do overlays. Of course the big gotcha here is that we still need specs for the 3d engine, but hopefully we'll have those in a month or two.
    Yep. We're going to need pretty much the same processing whether the final result is blitted into a window or pushed through an overlay, and all that processing is going to need DRM + memory management + 3d processing. The only interesting point is that we don't necessarily need the full GL stack for simple video processing, so in principle the work could happen in parallel with creating a final GL driver.

    Either way, we're going to get the info out there, see what happens, and help where we can.

  4. #24
    Join Date
    Jul 2007
    Posts
    404

    Default

    It also occurred to me that with the r600+ everything-is-3d approach, this would be an ideal place to start work on Glucose, because once 3d is working, that fits the paradigm of the card and should offer superior performance.

    Regarding EXA, it seems the problem is that the DRM for many drivers was never designed to handle it, and the implementations have been fairly poor. Lots of changes need to happen in a lot of drivers. The benefit here is that we have the opportunity to make a fairly fresh start.

    This is quite exciting, because it means not only stable, fast ATI drivers in the not-so-distant future, but also the opportunity to try new technologies.

  5. #25
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,463

    Default

    Yep. One of the things I have heard a lot since hanging out in the Linux world is "you can't rely on having 3d". The 3d folks say "you can't rely on having DRM", which then seems to lead to "you can't rely on having the kernel module installed and configured properly" and so on.

    I'm hearing a lot more people starting to say "you have to be able to rely on 3d", which is going to enable a lot of good things further up the software stack.

  6. #26
    Join Date
    Oct 2007
    Posts
    50

    Default

    Quote Originally Posted by bridgman View Post
    Yep. One of the things I have heard a lot since hanging out in the Linux world is "you can't rely on having 3d". The 3d folks say "you can't rely on having DRM", which then seems to lead to "you can't rely on having the kernel module installed and configured properly" and so on.

    I'm hearing a lot more people starting to say "you have to be able to rely on 3d", which is going to enable a lot of good things further up the software stack.
    first: good to see AMD on this forum

    then: what direction is AMD moving in with respect to its fglrx driver and the open source effort: is it actively trying to help the Gallium3D and TTM efforts, or is it on a different path? This seems quite relevant since G3D and TTM seem to be gaining a lot of traction lately, and TTM even seems to be bound for kernel inclusion.

    also: I do not understand why AMD insists on keeping its own fglrx driver development alive. OK, maybe for the parts that require certain 'IP' thingies, but then again: why not come up with a 'plugin' architecture for the open source drivers (in conjunction with their developers)? This, I think, is a much more efficient use of effort!

    any ETA on the 3D documentation?

  7. #27
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,463

    Default

    We will be actively supporting open source developers working on Gallium and TTM, and would like to see open source drivers for AMD graphics adopting new technologies as quickly as possible.

    The fglrx driver was originally aimed at workstation users, where performance and ISV certification are critical but the apps and distro versions are tightly constrained. I believe proprietary drivers will remain the solution of choice for workstations, as well as for gaming and possibly for high-end video. Open source is ideal when you need to deal with a broad range of distros and apps, particularly things which are either still in development or which just popped out a week or three ago, as long as the open source driver is kept fairly simple so that it *can* be fixed quickly.

    The only problem I see with a plug-in (typically people talk about proprietary 3d/video on open source 2d/kernel) is that if we ever want to play protected video we're going to need a proprietary driver right down to the hardware in order to meet copy protection aka Digital Rights Management requirements. I know nobody cares about DRM today but I also hear everyone wanting Linux desktop usage to grow dramatically, and I don't think you can have broad desktop acceptance without the ability to play DVD and HD legally. You won't see OEMs embrace Linux on the desktop without legal DVD/BR/HD-DVD and you need OEM SKUs if you ever want to see serious market adoption. Even display drivers have to be proprietary if you are serious about DRM, since the content has to be protected all the way to the frame buffer.

    I think open source drivers will be the norm for most out-of-box desktop users (except for workstation), while anyone serious about getting the most gaming performance or video capabilities will upgrade to the proprietary driver. OEMs selling desktop consumer systems with Linux pre-installed will generally go for the proprietary drivers in order to get all the bells and whistles, but there will be cases where the OEM works closely with a major distro for support and in that case open source drivers can work out fine.

    Anyone wanting to do development or testing with the latest unreleased (or just released last week) kernels or distros will also want to stay with an open source driver, since there's a good chance something will need to be tweaked to line up with the latest tweaks in the OS.

    We are trying to achieve two things with open source drivers:

    - empower the distributions to provide a complete and high quality "out of box" end user experience

    - ensure that drivers for our graphics products can always keep pace with changes in the overall OS/desktop/app environment (i.e. "no user left behind")

    "Official" schedule for initial 3d is 1Q08, but I want to see developers starting to work on 3d before the end of 2007 if at all possible. The 690 integrated part should be able to use existing code largely unchanged, but more work will be needed for 5xx and 6xx discrete parts.
    Last edited by bridgman; 11-24-2007 at 03:38 PM.

  8. #28
    Join Date
    Oct 2007
    Posts
    50

    Default

    thanks for the very quick answer!

    Quote Originally Posted by bridgman View Post
    We will be actively supporting open source developers working on Gallium and TTM, and would like to see open source drivers for AMD graphics adopting new technologies as quickly as possible.
    very good to hear! I sure hope TTM lands in the kernel very soon

    Quote Originally Posted by bridgman View Post
    The fglrx driver was originally aimed at workstation users, where performance and ISV certification are critical but the apps and distro versions are tightly constrained. I believe proprietary drivers will remain the solution of choice for workstations, as well as for gaming and possibly for high-end video. Open source is ideal when you need to deal with a broad range of distros and apps, particularly things which are either still in development or which just popped out a week or three ago, as long as the open source driver is kept fairly simple so that it *can* be fixed quickly.
    I don't see this argument the same way. For workstations and other 'flavors' you can still have different branches of the driver, all based on the same codebase: compare a git master (latest and greatest) with a git branch (certified driver branch).
    I also do not see what is so different between a normal driver and a gaming driver or a high-end video driver. Also, performance: I bet you a truckload of money that once 3D works, the community will start tuning and tweaking until it has squeezed every last bit of performance out of the hardware and driver...

    The driver will almost automatically be much simpler when G3D is used.

    Quote Originally Posted by bridgman View Post
    The only problem I see with a plug-in (typically people talk about proprietary 3d/video on open source 2d/kernel) is that if we ever want to play protected video we're going to need a proprietary driver right down to the hardware in order to meet copy protection aka Digital Rights Management requirements. I know nobody cares about DRM today but I also hear everyone wanting Linux desktop usage to grow dramatically, and I don't think you can have broad desktop acceptance without the ability to play DVD and HD legally. You won't see OEMs embrace Linux on the desktop without legal DVD/BR/HD-DVD and you need OEM SKUs if you ever want to see serious market adoption. Even display drivers have to be proprietary if you are serious about DRM, since the content has to be protected all the way to the frame buffer.
    Yes, this is about the only argument of yours I agree with as a reason to maintain fglrx. However...
    Once the community knows how the hardware works, and since the protected content goes through the framebuffer, it would be fairly simple to just grab the buffer for every frame, thus defeating the protection.

    We all know DRM is pointless (BD+ cracked before it was even released, etc. etc.), so in the long run it will be extinct. From an engineering standpoint, any encryption scheme that needs to be fast to decrypt is weak and can be easily attacked; any good engineer knows that. Besides, it's not 'Rights' but 'Restrictions' ;-)

    Quote Originally Posted by bridgman View Post
    I think open source drivers will be the norm for most out-of-box desktop users (except for workstation), while anyone serious about getting the most gaming performance or video capabilities will upgrade to the proprietary driver. OEMs selling desktop consumer systems with Linux pre-installed will generally go for the proprietary drivers in order to get all the bells and whistles, but there will be cases where the OEM works closely with a major distro for support and in that case open source drivers can work out fine.
    I do not agree. I think it is MUCH more important that things 'Just Work'. Bells and whistles are nice, but this is not what people REALLY want. To take an example from personal experience: I'm an engineer and LOVE gadgets and computers, yet I did NOT buy a media center; instead I bought a DVR that Just Works. I did not want yet another machine to maintain...
    Machines should make our lives easier, not turn everybody into a sysadmin. I see this attitude around me more and more...

    Quote Originally Posted by bridgman View Post
    Anyone wanting to do development or testing with the latest unreleased (or just released last week) kernels or distros will also want to stay with an open source driver, since there's a good chance something will need to be tweaked to line up with the latest tweaks in the OS.
    I'm personally tracking the radeonhd and radeon drivers closely :-)

    Quote Originally Posted by bridgman View Post
    We are trying to achieve two things with open source drivers :

    - empower the distributions to provide a complete and high quality "out of box" end user experience

    - ensure that drivers for our graphics products can always keep pace with changes in the overall OS/desktop/app environment (ie "no user left behind" )
    and a sound move considering 'Fusion' !!!

    Quote Originally Posted by bridgman View Post
    "Official" schedule for initial 3d is 1Q08, but I want to see developers starting to work on 3d before the end of 2007 if at all possible. The 690 integrated part should be able to use existing code largely unchanged, but more work will be needed for 5xx and 6xx discrete parts.
    thank you!!

  9. #29
    Join Date
    Jul 2007
    Posts
    404

    Default

    So does this mean that some 3d engine specs might be available before the end of the year? I hope so.

    There seems to be quite a bit of enthusiasm behind this, and the RadeonHD devs seem to be doing a good job of making sure everything works right. If there could be a stable driver with EXA and AIGLX by the end of 2008, that would be an enormous achievement. I am willing to help in any way I can, but my understanding of how graphics hardware works is extremely limited (I keep running into new acronyms I need to look up, like TTM, LVDS, GART, etc.). I can code, though, so if there is any way to learn about this and contribute to driver development, I'd like to do it.

  10. #30
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,463

    Default

    Quote Originally Posted by fhuberts View Post
    I also do not see what is so different from a normal driver and a gaming driver or a high end video driver. Also, performance: I bet you a truckload of money that once 3D works that the community will start tuning and tweaking until it has every last bit of performance from the hardware and driver that it can obtain...
    None of the developers I work with plan to spend time analyzing new games and optimizing the driver for them on an ongoing basis. It's fun the first time, but realistically only a commercial team is going to keep optimizing as each new generation of apps comes out.

    What I think you will see from open source 3d is a clean, elegant implementation that runs pretty well with everything and has maybe 50-70% of the proprietary driver performance on the hottest apps. There's a big difference between what the community *can* do and what they *will* do.

    Now... will the open source driver be good enough for most people ? Absolutely. I expect it will also need less tweaking and bug fixing as new apps come out, but if you want to match the performance of the Windows driver I don't see any alternative to a proprietary driver. Maybe I'll be wrong... nobody knows for sure.

    Quote Originally Posted by fhuberts View Post
    Once the community knows how the hardware works and when the protected content goes through the framebuffer it would be fairly simple to just grab the buffer for every frame, thus defeating the protection.
    That's the whole point of having proprietary drivers right down to the hardware, including the display driver. All the buffers have to be protected (yes, including the frame buffer) so other processes can't even map them let alone access them, and the OS has to provide the necessary support or it's as easy to break as you say.

    It sucks, I know.
    Last edited by bridgman; 11-24-2007 at 04:52 PM.
