
Thread: Going Indepth With Wayland Sub-Surfaces

  1. #11
    Join Date
    Jan 2011
    Posts
    1,287

    Default

    Well, it seems it still needs explaining.

    Does Wayland include a specific protocol for minimizing? Nope.
    Does it need to? No, it doesn't. The compositor should take care of that.
    But WHYYYYYYYYYY???! :'(
    Because the policy for minimizing usually depends on what the compositor's makers want it to look like, i.e., it is compositor-implementation dependent. Thus, it doesn't really make sense to include it in the core protocol if it would just be bypassed by the compositors anyway. Remember, window management was always handled by the window managers.

    But why aren't there compositors capable of that? That's not true, AFAIK. IIRC, there is already KWin, and Martin says it is usable on a day-to-day basis, so I bet it has the ability to minimize windows. AFAIK, E18 can too. Only Weston, intended as a test bed, doesn't include something so trivial. Weston shouldn't be used in production, as that is not its aim, although some people want to, and I guess those people will end up writing a plugin that lets Weston minimize windows.

  2. #12
    Join Date
    Mar 2012
    Posts
    240

    Default

    Quote Originally Posted by mrugiero View Post
    Well, it seems it still needs explaining.

    Does Wayland include a specific protocol for minimizing? Nope.
    Does it need to? No, it doesn't. The compositor should take care of that.
    But WHYYYYYYYYYY???! :'(
    Because the policy for minimizing usually depends on what the compositor's makers want it to look like, i.e., it is compositor-implementation dependent. Thus, it doesn't really make sense to include it in the core protocol if it would just be bypassed by the compositors anyway. Remember, window management was always handled by the window managers.

    But why aren't there compositors capable of that? That's not true, AFAIK. IIRC, there is already KWin, and Martin says it is usable on a day-to-day basis, so I bet it has the ability to minimize windows. AFAIK, E18 can too. Only Weston, intended as a test bed, doesn't include something so trivial. Weston shouldn't be used in production, as that is not its aim, although some people want to, and I guess those people will end up writing a plugin that lets Weston minimize windows.
    That means applications will not be able to figure out their current state if the minimization is compositor-specific. Is that right?

  3. #13
    Join Date
    Jan 2011
    Posts
    1,287

    Default

    Quote Originally Posted by Alex Sarmiento View Post
    That means applications will not be able to figure out their current state if the minimization is compositor-specific. Is that right?
    AFAIK, they will use hints, which already exist for X applications. Refer to ICCCM and EWMH. I think Wayland adheres to those as a whole.

  4. #14
    Join Date
    Apr 2010
    Posts
    819

    Default

    Quote Originally Posted by Alex Sarmiento View Post
    That means applications will not be able to figure out their current state if the minimization is compositor-specific. Is that right?
    Kind of, but the point is that "minimised" isn't a state; it's a user operation that may be implemented in any number of ways (or not implemented at all). The application shouldn't care about whether it's minimised; it should care about whether it should be rendering or not, and depending on how a given desktop implements minimising, the answer to that question will vary (e.g. should it stop rendering, keep rendering, or render to a smaller thumbnail surface?).
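The decision described above can be sketched as a toy event loop (hypothetical names, not the real Wayland API): the compositor decides per surface what, if anything, the client should draw, and the client only does the work it is asked for. Each `policy` value stands in for one of the implementation choices mentioned.

```python
# Toy model of compositor-driven rendering (NOT the real Wayland API).
# The compositor decides, per surface, whether the client should draw:
# a visible surface gets a full-size frame request, while a minimized
# one may get a thumbnail request or nothing at all.

class Compositor:
    def __init__(self, policy):
        self.policy = policy  # how this compositor implements "minimised"

    def frame_request(self, surface):
        """Return what the client should render this frame, or None."""
        if not surface["minimized"]:
            return ("full", surface["width"], surface["height"])
        if self.policy == "thumbnails":
            return ("thumbnail", 160, 90)  # keep rendering, but small
        return None  # policy "hide": stop rendering entirely

def client_tick(compositor, surface):
    req = compositor.frame_request(surface)
    if req is None:
        return "skipped"          # no request, so save the work
    mode, w, h = req
    return f"drew {mode} {w}x{h}"

surface = {"width": 1280, "height": 720, "minimized": True}
print(client_tick(Compositor("hide"), surface))        # skipped
print(client_tick(Compositor("thumbnails"), surface))  # drew thumbnail 160x90
```

The point of the sketch is that the client never asks "am I minimised?"; it only reacts to what the compositor asks of it, which is why each compositor can implement minimising differently.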

  5. #15
    Join Date
    Mar 2012
    Posts
    240

    Default

    Quote Originally Posted by Delgarde View Post
    Kind of, but the point is that "minimised" isn't a state; it's a user operation that may be implemented in any number of ways (or not implemented at all). The application shouldn't care about whether it's minimised; it should care about whether it should be rendering or not, and depending on how a given desktop implements minimising, the answer to that question will vary (e.g. should it stop rendering, keep rendering, or render to a smaller thumbnail surface?).
    Uh, the application should NOT care about the desktop, nor about the particular compositor. From a user's point of view, minimizing only means "get out of my sight, quickly". Neither the protocol nor the compositor should tell the application how it must respond to that kind of demand from the user. Maybe I misunderstood, but you seem to imply that the concept of minimizing depends on each compositor. But that's crazy, so I hope that is not what you meant.

  6. #16
    Join Date
    Jan 2011
    Posts
    1,287

    Default

    Quote Originally Posted by Alex Sarmiento View Post
    Uh, the application should NOT care about the desktop, nor about the particular compositor. From a user's point of view, minimizing only means "get out of my sight, quickly". Neither the protocol nor the compositor should tell the application how it must respond to that kind of demand from the user. Maybe I misunderstood, but you seem to imply that the concept of minimizing depends on each compositor. But that's crazy, so I hope that is not what you meant.
    Not the concept, but the implementation. Some just want a "get out of my sight" minimize, some want a fancy animation while it gets out, some want it to keep rendering (for example, to render thumbnails as suggested), while others will want it to stop rendering while it is not in sight, as that would just be wasted energy if you don't care about what's happening. Thus, it depends on the compositor, and thus the protocol doesn't mandate how it should work. As the only party that cares about minimizing is the compositor, this feature belongs in it, not in the protocol. The only thing the app might care about (though I don't see a practical way to enforce it) is whether it should keep rendering or can skip it, and for that you can use simple hints between compositor and app, allowing the application to skip frames when the hint says so.
    Another reasonable question is why you would use a hint instead of putting this messaging mechanism in the core protocol, and my answer is that a hint makes more sense, as this should be optional for both the compositor and the application: there are good reasons to want the app to keep rendering, and good reasons not to want that.
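The optional hint mechanism described above can be sketched like this (a toy model with hypothetical names, not an existing Wayland interface): the compositor may send a render hint, and the client may honor it; neither side is required to participate, which is exactly why it fits better as a hint than as a mandatory part of the core protocol.

```python
# Toy sketch of an optional render hint (hypothetical, not a real
# Wayland interface). The compositor may send the hint; the client may
# honor or ignore it, so neither side depends on the other's support.

class Client:
    def __init__(self, honors_hints=True):
        self.honors_hints = honors_hints
        self.render_hint = True   # default: keep rendering

    def on_hint(self, should_render):
        """Called when the compositor sends a hint; a no-op in effect
        for clients that don't honor hints."""
        self.render_hint = should_render

    def next_frame(self):
        if self.honors_hints and not self.render_hint:
            return "frame skipped"    # save the wasted work
        return "frame rendered"

app = Client(honors_hints=True)
app.on_hint(False)         # compositor: window is out of sight
print(app.next_frame())    # frame skipped

legacy = Client(honors_hints=False)
legacy.on_hint(False)      # hint arrives but is ignored
print(legacy.next_frame()) # frame rendered
```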

  7. #17
    Join Date
    Aug 2013
    Posts
    40

    Default

    The point is that since all Wayland apps currently have client-side decorations, a protocol is needed to let the client ask the compositor to minimize it when the user clicks the minimize button. That is not in the wl_shell_surface protocol, so you currently can't minimize any Wayland app on any compositor (again, via a click on the minimize button in the decoration; a compositor can still minimize a window if the operation is started by the compositor itself, e.g. by clicking a "minimize" entry in the task bar's context menu). The xdg_shell protocol that is being worked on will have that feature.
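The kind of request being described would look something along these lines in Wayland's protocol XML; this is only a sketch in the style of the unstable drafts, and the exact interface and request names in the final xdg_shell may differ.

```xml
<!-- Sketch only: interface/request names follow the unstable drafts
     and may differ in the final xdg_shell protocol. -->
<interface name="xdg_surface" version="1">
  <request name="set_minimized">
    <description summary="ask the compositor to minimize this surface">
      The client (e.g. the minimize button in its client-side
      decoration) asks the compositor to minimize the window; the
      compositor is free to implement this however its policy dictates.
    </description>
  </request>
</interface>
```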

  8. #18
    Join Date
    Mar 2012
    Posts
    240

    Default

    Quote Originally Posted by mrugiero View Post
    Not the concept, but the implementation. Some just want a "get out of my sight" minimize, some want a fancy animation while it gets out, some want it to keep rendering (for example, to render thumbnails as suggested), while others will want it to stop rendering while it is not in sight, as that would just be wasted energy if you don't care about what's happening. Thus, it depends on the compositor, and thus the protocol doesn't mandate how it should work. As the only party that cares about minimizing is the compositor, this feature belongs in it, not in the protocol. The only thing the app might care about (though I don't see a practical way to enforce it) is whether it should keep rendering or can skip it, and for that you can use simple hints between compositor and app, allowing the application to skip frames when the hint says so.
    Another reasonable question is why you would use a hint instead of putting this messaging mechanism in the core protocol, and my answer is that a hint makes more sense, as this should be optional for both the compositor and the application: there are good reasons to want the app to keep rendering, and good reasons not to want that.

    No, my question is: is there any way for applications to know when a user demands "minimization" under Wayland? Obviously, how the system renders the minimization, like a fancy zoom or genie effect, is irrelevant to the protocol and to the application itself. For X or Y reasons that have nothing to do with rendering, an application might like to know when it is being minimized.

  9. #19
    Join Date
    Aug 2013
    Posts
    40

    Default

    Not yet, but the xdg_shell protocol will have that.

  10. #20
    Join Date
    Feb 2011
    Posts
    1,310

    Default

    Quote Originally Posted by Alex Sarmiento View Post
    For X or Y reasons that have nothing to do with rendering, an application might like to know when it is being minimized.
    Such as?
