
Thread: Where is RRCreateMode handled?

  1. #1
    Join Date
    Jul 2007
    Location
    Stockholm, Sweden
    Posts
    49

    Where is RRCreateMode handled?

    Hi.

    I'm trying to find out why I can't get a PAL mode that works on my TV with xrandr.

    I can set a nice 16:9-adapted PAL modeline in Windows PowerStrip with
    Code:
    modeline "960x576" 20.348 960 1050 1098 1294 576 592 600 627 interlace +hsync +vsync
    However,
    Code:
    xrandr --newmode "960x576_50i" 20.348 960 1050 1098 1294 576 592 600 627 +hsync +vsync interlace
    xrandr --addmode DVI-1 960x576_50i
    xrandr --output DVI-1 --mode 960x576_50i
    just results in a black screen on my TV (no sync). Changing +hsync to -hsync gives me horizontal sync (a vertically rolling picture), but -vsync does not give me vertical sync. Neither -csync nor +csync works; neither gives any sync at all on the TV.


    I'm trying to trace the xrandr call through X to see what happens when xrandr --addmode is called. The xrandr application is implemented in xrandr.c in the xrandr-1.3.2 package on my machine. This code calls XRRCreateMode, which is implemented in XrrMode.c in the libXrandr-1.3.0 package.

    Here's where I'm losing the trail: XRRCreateMode() calls GetReq(RRCreateMode, req), fills in the request structure req, then calls Data() and _XReply(). I'm guessing control is handed over to the X server here? Where can I find the code where the request is handled/implemented?

    BTW, I found a probable bug at xrandr.c:2025: the return type of check_strtod() is declared as int, but should probably be double:
    Code:
    static int
    check_strtod(char *s)
    {
        char *endptr;
        double result = strtod(s, &endptr);
        if (s == endptr)
            usage();
        return result;
    }
    As is, this declaration discards the fraction from the pixel clock value in MHz.

  2. #2
    Join Date
    Jul 2007
    Location
    Stockholm, Sweden
    Posts
    49


    I found some time to dig out my old oscilloscope and hook it up to pins 13 (H/C sync) & 14 (V sync) on the VGA output from my graphics card.

    I tried different combinations of sync options with xrandr:
    Code:
    ./xrandr --newmode "960x576@50i" 18.479 960 991 1078 1183 576 592 601 625 interlace -hsync -vsync
    replacing "-hsync -vsync" with "-csync", "+csync", "csync", and "+hsync +vsync" in turn. All csync options give positive, separate hsync and vsync signals (i.e. the output is equivalent to that of "+hsync +vsync"). "-hsync -vsync" gives separate negative sync signals, as expected.

    So no wonder I cannot get vertical sync on my TV from this signal.

    I would like to fix this. I'm not asking for a bug fix, but rather I'm looking at this as an opportunity to learn something of the inner workings of X.

    XRRCreateMode() is called to create the new mode, but I cannot step into this function with gdb, and I haven't been able to find out where it is implemented.

    If some nice developer could explain the code path so I know where to look, I would appreciate it a lot!

    I'm running X.Org X Server 1.6.5 with xf86-video-ati-6.12.4, if it matters.

  3. #3
    Join Date
    Oct 2008
    Location
    Sweden
    Posts
    983


    Another topic that's better suited for discussion on the xorg or xorg-devel mailing lists.

  4. #4
    Join Date
    Jul 2007
    Location
    Stockholm, Sweden
    Posts
    49


    Ok, thanks for the advice!
