Quote Originally Posted by frign View Post
I am not completely into the Wayland-spec, but I am certain this is part of it. How did the devs put it? Every frame is perfect, and judging from my tests with GL-applications (like glgears), this works well.
Yes, every frame is perfect because the clients control what's in the buffers. If there's something wrong in the buffers, then they (or the graphics drivers) fucked up. All Wayland does is take pointers and buffers and display their contents. How they got there and what's in them doesn't matter to the protocol (though WHO put what in there does matter: Wayland keeps close tabs on buffer security).
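To make that "pointers and buffers" idea concrete, here's a toy Python sketch of the attach/commit handoff. This is NOT the real libwayland API; the names (`ToyCompositor`, `attach`, `commit`, `scanout`) are invented for illustration. The point is that the compositor tracks which client supplied each buffer but never inspects or modifies the contents, and a surface only updates atomically on commit:

```python
# Toy model of Wayland's attach/commit handoff. NOT the real libwayland
# API -- just an illustration of a compositor that only tracks buffer
# ownership and displays contents as-is.

class ToyCompositor:
    def __init__(self):
        self.pending = {}    # surface id -> attached buffer (not yet shown)
        self.displayed = {}  # surface id -> buffer currently on screen
        self.owners = {}     # buffer id -> client ("who put what in there")

    def attach(self, client, surface, buffer):
        # Security bookkeeping: remember which client supplied the buffer...
        self.owners[id(buffer)] = client
        self.pending[surface] = buffer

    def commit(self, surface):
        # ...but never inspect or alter its contents. The whole buffer
        # flips at once on commit, which is why every frame is "perfect":
        # no tearing, no half-drawn state visible on screen.
        if surface in self.pending:
            self.displayed[surface] = self.pending.pop(surface)

    def scanout(self, surface):
        # Display whatever the client rendered -- garbage in, garbage out.
        return self.displayed.get(surface)


comp = ToyCompositor()
frame = bytearray(b"client-rendered pixels")  # client fills this however it likes
comp.attach("glxgears", "surface-1", frame)
assert comp.scanout("surface-1") is None      # nothing visible until commit
comp.commit("surface-1")
assert comp.scanout("surface-1") is frame     # shown untouched, same object
```

If the client renders garbage into the buffer, the compositor happily displays garbage; that's the client's (or the driver's) problem, never the protocol's.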

And X is complex because its designers wanted it to be as platform-independent as possible; they were effectively writing an operating system ON TOP OF an existing operating system (whatever flavor of Unix you ran). That complexity is a bad thing. Wayland has the right idea: the parts that can never break (Wayland itself) have to be minimal, so that one mistake doesn't impact a trillion other things. Wayland is made to get out of the way, and anything "complex" (such as multiple GPUs) is "a client problem."

If we ever hit another big change-up in the way we do graphics (like Optimus) in the future, this will help ensure that the protocol isn't the problem. With X + Optimus the protocol WAS, and to an extent still IS, the problem. With Wayland, instead of cluttering up the protocol, we just introduce new libraries and new clients, and they handle the changes. All Wayland wants is pointers, buffers, and a display to shove their contents onto.