
Thread: RadeonHD Driver Gets HDMI Support

  1. #1
    Join Date
    Jan 2007
    Posts
    14,369

    Default RadeonHD Driver Gets HDMI Support

    Phoronix: RadeonHD Driver Gets HDMI Support

    The xf86-video-radeonhd driver has today received support for handling HDMI (High-Definition Multimedia Interface) connectors. If you've used a DVI-to-HDMI dongle with the RadeonHD driver, it would have worked already (as we shared in our recent ATI HDMI Linux article); this new support is for those with integrated HDMI connectors.

    http://www.phoronix.com/vr.php?view=NjI1Mg

  2. #2
    Join Date
    Jul 2007
    Posts
    429

    Default But presumably no support for higher resolutions?

    Quote Originally Posted by phoronix View Post
    Phoronix: RadeonHD Driver Gets HDMI Support

    The xf86-video-radeonhd driver has today received support for handling HDMI (High-Definition Multimedia Interface) connectors.
    If RadeonHD doesn't allow you to use the card's full resolution over an HDMI connector then what has been gained?

  3. #3
    Join Date
    Sep 2006
    Location
    PL
    Posts
    909

    Default

    When I saw the commit I thought to myself, "gee, Phoronix is going to make a big deal out of it" :]

  4. #4
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,386

    Default

    Quote Originally Posted by chrisr View Post
    If RadeonHD doesn't allow you to use the card's full resolution over an HDMI connector then what has been gained?
    It's probably not a question of high-resolution support, just EDID funnies and finding the right way to override EDID. This is just a guess (I haven't spoken with Michael or the SuSE folks about it), but many consumer devices only expose a 1280x720 resolution (aka 720p) even if their native resolution is higher. Not sure why, but it seems to happen. In those cases you need to force a higher resolution, as Michael was trying, but during the addition of RandR support to RadeonHD the mechanisms for overriding EDID changed a bit, AFAIK.

    There is also a debate going on about the best way to override in the first place -- one view is that garbage in the X config file causes so many problems that reliable information from EDID should override the config entries; another is that EDID isn't always right and if the user mucks up the config it's their own fault. I had to unsubscribe from the lists to keep my sanity, so I'm not sure what the current status is.
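
    For readers who want to experiment with forcing a mode past a limited EDID, a typical xorg.conf fragment looks something like the sketch below. This is an illustration only -- the exact option names and their behaviour vary between driver versions (and, per the post above, were in flux during the RandR work), so treat it as a starting point, not a recipe:

    ```
    Section "Monitor"
        Identifier "HDMI-Display"
        # Hypothetical override: declare a 1080p modeline ourselves because
        # the display's EDID only advertises 1280x720. The timings below are
        # the standard CEA-861 1920x1080@60Hz mode.
        Modeline "1920x1080" 148.50 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
        Option   "PreferredMode" "1920x1080"
    EndSection
    ```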

  5. #5
    Join Date
    Jul 2007
    Posts
    429

    Default What about HDCP support?

    Quote Originally Posted by bridgman View Post
    It's probably not a question of high-resolution support, just EDID funnies and finding the right way to override EDID.
    I was under the impression that hardware with HDMI ports was designed not to provide its highest and best resolutions unless some kind of "handshake" was made via HDCP. Are you saying that HDCP is optional for ATI cards?

  6. #6
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,386

    Default

    As far as I know this is a policy implemented by player applications as part of their agreement with high-def content providers. The player app looks at the content (e.g. an HD DVD or Blu-ray disc) and the output resolution of the display, and decides whether or not output protection is required. If protection cannot be enabled then the app may choose to downscale the resolution as a compromise.

    The drivers and hardware just provide a robust mechanism to inform the player app and to implement its decisions -- I don't think there are resolution-dependent rules built into the hardware. I will ask around and try to confirm this.

    As far as I know HDCP is optional for all hardware, not just ATI/AMD products -- it's just that without HDCP a player app may choose not to play protected content at high resolution. I vaguely remember hearing about a couple of cases where HDCP was not optional -- either a specific display required it or a specific driver turned it on by default -- but that's it.
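
    The player-side policy described above can be sketched roughly as follows. The function name, its arguments, and the halve-the-resolution fallback are all assumptions chosen for illustration -- no real player is implemented this way; the point is only that the decision lives in the application, not the hardware:

    ```python
    # Sketch of the policy described above: the driver/hardware merely
    # reports whether HDCP could be enabled on the output; the player app
    # decides what to do based on the content's protection rules.

    def choose_output(content_requires_hdcp, hdcp_available, native_res):
        """Return the resolution a (hypothetical) player would output at."""
        if not content_requires_hdcp:
            return native_res              # unprotected content: full resolution
        if hdcp_available:
            return native_res              # handshake possible: full resolution
        w, h = native_res
        return (w // 2, h // 2)            # compromise: downscale rather than refuse

    print(choose_output(True, False, (1920, 1080)))
    ```
    
    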

  7. #7
    Join Date
    Jul 2007
    Posts
    429

    Default But if HDCP is optional, why bother with it at all?

    Quote Originally Posted by bridgman View Post
    As far as I know HDCP is optional for all hardware, not just ATI/AMD products -- it's just that without HDCP a player app may choose not to play protected content at high resolution.
    I'm just trying to imagine circumstances where a player might choose to use HDCP... and I am drawing a blank. E.g.

    "Shall we f**k our users over?" (Y/N)

    Tough one...

  8. #8
    Join Date
    Oct 2007
    Location
    Toronto-ish
    Posts
    7,386

    Default

    Seriously, the decision is made by the content providers not the player developers. When the player dev signs a license agreement to play HD content, one of the conditions is that the player app will honour the protection rules which accompany the content.

    My understanding is that today none of the HD DVD/Blu-ray discs have included a rule requiring HDCP above a certain resolution, but AFAIK this can be turned on at any time on a disc-by-disc basis. I believe the rules can either prohibit playback without protection or require "constriction", i.e. scaling to a lower resolution and then upscaling again if required to fit the window.
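
    The "constriction" idea can be sketched as follows. The 520,000-pixel cap is a figure sometimes cited for constrained analog output and is used here only as an assumption -- the actual cap would come from the disc's protection rules, not from the code:

    ```python
    # Sketch of "constriction": protected content is downscaled to a capped
    # pixel count (preserving aspect ratio), then upscaled again by the
    # player to fit the window. Only the downscaling step is shown here.

    def constrict(width, height, max_pixels=520_000):
        """Return the constrained intermediate resolution before upscaling."""
        pixels = width * height
        if pixels <= max_pixels:
            return width, height           # already under the cap: untouched
        scale = (max_pixels / pixels) ** 0.5
        return int(width * scale), int(height * scale)

    # 1080p would be scaled down before being stretched back to window size.
    print(constrict(1920, 1080))
    ```
    
    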

    There are some good articles about this on the web -- I'll see if I can dig up a couple and post some links.
