
Thread: AMD's UVD2-based XvBA Finally Does Something On Linux

  1. #651
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411


    Quote Originally Posted by Kano
    @Q

    You can be sure I don't use fully outdated systems - I really do compile a lot. But using CPU decode for HD video is not useful - the same goes for using OpenCL for it. OpenCL would use the normal graphics cores but NOT the dedicated decode chip that is already in there. Do you call that good? I don't. The combined power consumption will always be higher than using the right chip for the job. It's similar to using the OSS drivers, which have no power management, for normal work: you might "feel" better about not using the bad binary stack, but you basically need much more energy than you would with the other driver - especially on mid/high-end cards. So the only one who laughs is your energy provider - you have to pay the bill, don't forget. OK, the latest high-end NV Fermi cards have too high an idle consumption, but what you propose using is definitely not better.
    "You can be sure i don't use fully outdated systems "

    you use intel systems... intel means outdatet...
    try a IBM Power7! or a Opteron6000 48 core system!
    Intel can't compete against these 2 CPUs!

    "But using cpu decode on hd video is not usefull"

    because you use outdatet non multicore software ?


    "The combined power consumption will be always higher than using the right chip for that."

    you are just wrong phoronix test this! CPU do need less power consuming!
    CPU=win GPU=LOSE
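
    (For what it's worth, this is something you can check instead of arguing about: XvBA is exposed to applications through a VA-API backend, so a small C probe can ask the driver which decode profiles it claims to hand to the UVD block. A sketch only - it assumes libva with X11 support plus a working backend such as the xvba-video wrapper this thread is about, and the file name is made up. Build with: gcc vaprobe.c -o vaprobe -lva -lva-x11 -lX11)

    Code:
    #include <stdio.h>
    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <va/va.h>
    #include <va/va_x11.h>

    int main(void)
    {
        Display *x11;
        VADisplay va;
        VAProfile *profiles;
        int major, minor, max, num, i;

        /* VA-API on X11: open the display and hand it to libva. */
        x11 = XOpenDisplay(NULL);
        if (!x11) {
            fprintf(stderr, "cannot open X display\n");
            return 1;
        }
        va = vaGetDisplay(x11);
        if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS) {
            fprintf(stderr, "no usable VA-API backend found\n");
            return 1;
        }
        printf("VA-API %d.%d, vendor: %s\n", major, minor,
               vaQueryVendorString(va));

        /* Every profile listed here is a codec the backend claims
           to hand off to the hardware decoder. */
        max = vaMaxNumProfiles(va);
        profiles = malloc(max * sizeof(*profiles));
        if (profiles &&
            vaQueryConfigProfiles(va, profiles, &num) == VA_STATUS_SUCCESS) {
            for (i = 0; i < num; i++)
                printf("supported profile id: %d\n", profiles[i]);
        }

        free(profiles);
        vaTerminate(va);
        XCloseDisplay(x11);
        return 0;
    }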

    "You might "feel" better not using the bad binary stack"

    its not just a feeling nvidia is watching you... (now i know why you write good things abaut nvidia because they hack your system via the clousedsource nvidia driver and highjack your phoronix password)

    "Ok the latest highend nv fermi have too high idle consumption,"

    Fermi=80idle with good idle energy saver
    my hd4350=5watt and my 4670=16watt

    i call you nvidia fanboy a tree murderer!

  2. #652
    Join Date
    Dec 2008
    Posts
    980


    Quote Originally Posted by bridgman
    I understand that from your perspective video functionality is more important than some of the other features the devs work on, but in fairness we do have other customers and we do have to deliver features that *they* want as well.
    In all fairness, the Catalyst release notes mention three supported distributions:

    • Red Hat Enterprise Linux suite
    • Novell/SuSE product suite
    • Ubuntu


    Arguably Ubuntu is the most home-user-oriented of the three, and Ubuntu users are the most likely to use their OS to watch videos. The default and only officially supported video player on Ubuntu is Totem, using the GStreamer framework. There are no GStreamer GL output plugins in the Ubuntu repositories, so users must either watch videos with tearing or install an unsupported player and use GL output. Hardly an ideal situation for an OS that's supposed to be supported by Catalyst.

    But if adding vsync for Xv to Catalyst is too daunting a task, I'm sure the GStreamer developers would welcome any GL/XvBA patches from ATI/AMD.
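
    In the meantime, here is what that GL output path looks like from application code. A minimal sketch, assuming GStreamer 0.10 with gst-plugins-gl installed - the file URI is a placeholder. Build with: gcc gl-play.c -o gl-play $(pkg-config --cflags --libs gstreamer-0.10)

    Code:
    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        GstElement *play, *sink;
        GstBus *bus;
        GstMessage *msg;

        gst_init(&argc, &argv);

        /* playbin2 handles demux/decode; glimagesink replaces the
           default Xv sink so output goes through GL (vsynced on
           most driver stacks). */
        play = gst_element_factory_make("playbin2", "player");
        sink = gst_element_factory_make("glimagesink", "glsink");
        if (!play || !sink) {
            g_printerr("playbin2 or glimagesink not available\n");
            return 1;
        }

        g_object_set(play,
                     "uri", "file:///home/user/video.mkv", /* placeholder */
                     "video-sink", sink,
                     NULL);

        gst_element_set_state(play, GST_STATE_PLAYING);

        /* Block until an error occurs or the stream ends. */
        bus = gst_element_get_bus(play);
        msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                                         GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
        if (msg)
            gst_message_unref(msg);

        gst_element_set_state(play, GST_STATE_NULL);
        gst_object_unref(bus);
        gst_object_unref(play);
        return 0;
    }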

  3. #653
    Join Date
    Aug 2007
    Posts
    6,598


    First of all, you have to compare the ATI 5870 to the GTX 480 and the 5850 to the GTX 470. There is also a huge difference between driving only one display and driving two. A GT 220 beats a 4550 easily, so better compare that one to your low-end cards.

    Also, I prefer desktop systems - those have only one CPU. Server systems with more than one CPU - four, in your case - must be loud as hell. Currently I use a relatively low-noise Esprimo system with an E8400 CPU for everyday work, a Q930 for testing ATI, and an X3380 for compiling kernels and (only rarely) gaming. I have no need for multi-CPU systems at home, as I am not deaf - unlike you, apparently. The main goal for my systems is low noise until pure power is needed; then they can get a bit louder. That's why power management has to work for both the CPU and the GPU.

  4. #654
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411


    Quote Originally Posted by Kano
    First of all, you have to compare the ATI 5870 to the GTX 480 and the 5850 to the GTX 470. There is also a huge difference between driving only one display and driving two. A GT 220 beats a 4550 easily, so better compare that one to your low-end cards.

    Also, I prefer desktop systems - those have only one CPU. Server systems with more than one CPU - four, in your case - must be loud as hell. Currently I use a relatively low-noise Esprimo system with an E8400 CPU for everyday work, a Q930 for testing ATI, and an X3380 for compiling kernels and (only rarely) gaming. I have no need for multi-CPU systems at home, as I am not deaf - unlike you, apparently. The main goal for my systems is low noise until pure power is needed; then they can get a bit louder. That's why power management has to work for both the CPU and the GPU.
    There are single-socket AMD G34 Opteron 6000 boards!

    80 W TDP... means... you can cool it passively!

    "A GT 220 beats a" 4350 with its 5 W TDP???

    In your dreams!

  5. #655
    Join Date
    Aug 2007
    Posts
    6,598


    Learn to read: 4550! 80 W cooled passively - that is of course something only Q manages.

  6. #656
    Join Date
    Oct 2008
    Posts
    2,909


    Q's point that modern systems don't really need video acceleration except to save power is a good one. So is Kano's point that fglrx sucks for home users. I'm not sure even AMD would dispute that; it's the whole reason they are supporting the OSS drivers.

  7. #657
    Join Date
    Oct 2008
    Posts
    2,909


    Oh, and I never understood AMD's insistence on monthly driver releases either. This isn't just about Linux, but also the Windows drivers. Why not do them quarterly, with perhaps an extra release now and then when new hardware comes out or something major changes? It doesn't seem to be a problem for any other hardware company on earth, except AMD, which insists terrible things would happen if it changed its process. It makes you wonder how much money and how many resources they spend on QA and on shipping so many releases, and what they could have done in the driver instead if they had refocused some of that energy.

  8. #658
    Join Date
    Jan 2010
    Posts
    68


    Quote Originally Posted by smitty3268
    Q's point that modern systems don't really need video acceleration except to save power is a good one. So is Kano's point that fglrx sucks for home users. I'm not sure even AMD would dispute that; it's the whole reason they are supporting the OSS drivers.
    I disagree. I think the reason AMD supports the OSS drivers is to encourage open-source development and innovation, and to get one up on Nvidia. And Nvidia's open-source driver is... where?

  9. #659
    Join Date
    Aug 2007
    Posts
    6,598


    Nvidia does have legacy drivers; it's just that the oldest series, 71.xx, gets no support for new X servers. OK, that's not nice, but those cards are at least 10 years old. ATI's legacy cards are R300-R500, and those were still being sold in new systems at the time they went legacy - and the OSS driver was not really in good shape when those cards were dropped from fglrx. Certainly NV could do more for the OSS drivers, and it is a shame that Fermi cards will not even get modesetting support in nv, but over the next 5 years these cards will most likely be better supported than anything from the other companies. As long as you don't have one in a laptop, a replacement should be relatively easy until PCI-e is outdated - so you are not really dependent on long-term support.

  10. #660
    Join Date
    Nov 2008
    Location
    Germany
    Posts
    5,411


    Quote Originally Posted by Kano
    Learn to read: 4550!
    You should read better yourself: my HD 4670 idles at 16 W. Burn your GT 220 Nvidia shit card in hell!

    A 5670 and a 5750 go green at 75 W TDP - burn your card in hell!

    Fermi = 80 W idle.
    HD 5870 = 38 W idle!

    You can't read!

    Nvidia also has no GPU in the 5 W TDP class! There is no HD 4350 equivalent at Nvidia!

    Only tree-killer cards! Nvidia fanboys kill kids... OK, not kids, but trees!

    Quote Originally Posted by Kano
    80 W cooled passively - that is of course something only Q manages.
    TDP means you don't burn the full 80 W all the time, only in rare situations.

    Call it semi-passive!

    There are two or three air coolers that can handle an 80 W TDP passively!

    Mine, for example, is 1 kg of pure metal: the Scythe Mugen 2 CPU cooler. It works as a passive cooler, with room for up to four optional 120x120 mm fans.

