
Thread: Mesa 9.0 Officially Released, Supports OpenGL 3.1

  1. #11
    Join Date
    Feb 2012
    Posts
    423

    Default

    Quote Originally Posted by przemoli View Post
If codecs are patented then they will not land in Mesa.
    How do you explain mpeg2 being there? It's also patented.

    This isn't about patents, it's about one simple thing: GPUs aren't actually suitable for video decoding. A lot of effort required for not much gain. See this post for example, there's also a bit more discussion later in the thread.

  2. #12
    Join Date
    Mar 2011
    Posts
    90

    Default

I do think that Xvid and x264 are patented. Besides, I would rather disagree that there is not much to gain with hardware decoding. There are plenty of devices which cope with decoding of Full HD material only if it is done by the graphics driver, an example of which would be many low-powered devices, including the AMD E-350, of which I am a happy owner.

IMHO it would be a great gain for Linux in general if one could use a device like that to its full extent. Therefore I believe that Mesa should extend its VDPAU implementation, or add a way to use the external VA-API that the Intel guys have done so well.

  3. #13
    Join Date
    Sep 2012
    Posts
    279

    Default OpenGL 3.1 yes, but not for Radeons

The title says it all. If a multimillion-dollar company can't make a decent driver, then I guess I'll buy an NVIDIA card next time. Yes, their driver is closed source, but at least it works and is environmentally friendly.

  4. #14

    Default

So what would I need to start playing with OpenCL and Clover? I guess I'd need a newer card than an HD 3650. Is Clover complete enough to compile GEGL and have an OpenCL-enabled GIMP?

  5. #15

    Default *sigh*

    Quote Originally Posted by wargames View Post
The title says it all. If a multimillion-dollar company can't make a decent driver, then I guess I'll buy an NVIDIA card next time. Yes, their driver is closed source, but at least it works and is environmentally friendly.
    End the F.U.D.

FGLRX supports OGL 3.1. FGLRX supports up into the OGL 4.x range. You're mixing things that shouldn't be mixed.

We're talking about the open-source drivers here, with which AMD has in a lot of ways started over. Who knows when, but the OSS driver will eventually reach feature parity with AMD's closed driver.

  6. #16
    Join Date
    Nov 2008
    Location
    Old Europe
    Posts
    904

    Default

    Quote Originally Posted by ssam View Post
So what would I need to start playing with OpenCL and Clover? I guess I'd need a newer card than an HD 3650. Is Clover complete enough to compile GEGL and have an OpenCL-enabled GIMP?
    Not sure how complete it is, but you'll need a Radeon HD 5000 class or higher.

  7. #17
    Join Date
    Jul 2010
    Location
    Melbourne, Australia
    Posts
    115

    Default

    Quote Originally Posted by ryszardzonk View Post
I do think that Xvid and x264 are patented. Besides, I would rather disagree that there is not much to gain with hardware decoding. There are plenty of devices which cope with decoding of Full HD material only if it is done by the graphics driver, an example of which would be many low-powered devices, including the AMD E-350, of which I am a happy owner.

IMHO it would be a great gain for Linux in general if one could use a device like that to its full extent. Therefore I believe that Mesa should extend its VDPAU implementation, or add a way to use the external VA-API that the Intel guys have done so well.
I think (please someone correct me if I'm wrong :P) that what is being talked about here is shader-based acceleration, not the video decoding unit in AMD/NVIDIA GPUs (for the OSS drivers). So an E-350 would have a much harder time decoding than a desktop-grade GPU with more shaders.

  8. #18
    Join Date
    Mar 2011
    Posts
    90

    Default

    Quote Originally Posted by zeealpal View Post
I think (please someone correct me if I'm wrong :P) that what is being talked about here is shader-based acceleration, not the video decoding unit in AMD/NVIDIA GPUs (for the OSS drivers). So an E-350 would have a much harder time decoding than a desktop-grade GPU with more shaders.
Yes, you are right. I tend to confuse those too, as I am looking at things more from the perspective of a user who occasionally does some bug reports and translation work, not a developer. Programming seems too technical for me. But what I am getting at is that, one way or another, users in general would welcome hardware/GPU decoding in the open-source drivers to the extent it is done in the closed-source counterparts, and if I am not mistaken there is no shader or any other acceleration for the r600 driver that would let it decode 1080p sources.

  9. #19
    Join Date
    Feb 2012
    Posts
    423

    Default

    Quote Originally Posted by ryszardzonk View Post
I do think that Xvid and x264 are patented.
Yeah, they're patented. But that's irrelevant; mpeg2 is patented too, and yet Gallium has a decoder for it.

    Quote Originally Posted by ryszardzonk View Post
Besides, I would rather disagree that there is not much to gain with hardware decoding. There are plenty of devices which cope with decoding of Full HD material only if it is done by the graphics driver, an example of which would be many low-powered devices, including the AMD E-350, of which I am a happy owner.
    Your E-350 decodes Full HD using a dedicated hardware decoder (UVD) that the fglrx driver has access to. But I wasn't talking about a dedicated decoder, I was talking about the GPU. The GPU is bad at decoding.

    Quote Originally Posted by ryszardzonk View Post
IMHO it would be a great gain for Linux in general if one could use a device like that to its full extent. Therefore I believe that Mesa should extend its VDPAU implementation, or add a way to use the external VA-API that the Intel guys have done so well.
Adding VA-API to Gallium wouldn't change anything when it comes to AMD. It's not about the API, it's about getting access to UVD. We don't have the documentation to do that, so Gallium can only use the GPU (shaders) for decoding. And writing a shader-based H.264 decoder would be a lot of effort for little gain; it's not worth it.
    Last edited by Gusar; 10-09-2012 at 11:08 AM.

  10. #20
    Join Date
    Dec 2007
    Posts
    2,329

    Default

    Quote Originally Posted by dogsleg View Post
What is the state of radeonsi? When is it planned to be ready?
    It currently runs lots of 3D demos and basic games, including piglit. I think the biggest thing left is shaking out the remaining bugs in the flow control code in the shader compiler. Once that's working properly we can enable more glamor features and more advanced games should start working.

    For the current todo list see:
    http://dri.freedesktop.org/wiki/RadeonsiToDo
