Why so serious?
Nor does it need a "minimize-to-taskbar" or "minimize-to-tray" action in the protocol; this is handled entirely by the compositor. In fact, it could be handled without problems without even needing a "minimize" event. As a compositor, simply don't display the window surface, and place an icon in either the tray or the taskbar: it's completely up to you.
Personally, I don't see a need for anything but a "window size changed" event. Why does an app need to know it's minimized? I think it should just carry on as normal. This would make window previews "just work", without any hacks.
For example, imagine you minimize some apps, and then press Alt+Tab and want to select from them. If you send anything to the app to inform it that it has been minimized, it might do something that makes the preview look bad. The window preview is just the app's surface buffer displayed at another size.
But then again, it's entirely up to the compositor whether it wants to send applications the "minimize" event. If it thinks it might make them look bad, just don't send it. I guess it doesn't harm anything to put the event in there.
On windows, minimized programs can display progress bars.
Not sure any of that has to do with the display server, or if it has to be defined in an optional display protocol. But maybe these are the kinds of possibilities they are considering.
They can on KDE too, and on Unity. They are API features of the desktop toolkit, and have nothing to do with the display server. In practice, probably the best way to do it is to have the compositor "capture" a mid-resolution copy of the last frame of an application going "hidden", to display as a preview where necessary (window scrolling, hover previews, Alt+Tab, etc.). The app can then run as if it has nothing to draw, since it's hidden, and the compositor can worry about the rest of the world.
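A minimal sketch of that capture step, assuming a plain ARGB pixel buffer; the helper name `capture_preview` is invented for illustration and not part of any real compositor API. It just takes a nearest-neighbour downscaled copy of the last frame:

```c
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical helper: grab a downscaled (nearest-neighbour) copy of a
 * surface's last ARGB buffer, so the compositor can show it as a preview
 * (Alt+Tab, hover thumbnails) while the hidden client stops drawing.
 * src is sw x sh pixels; the returned copy is dw x dh pixels. */
static uint32_t *capture_preview(const uint32_t *src, int sw, int sh,
                                 int dw, int dh)
{
    uint32_t *dst = malloc((size_t)dw * (size_t)dh * sizeof *dst);
    if (!dst)
        return NULL;
    for (int y = 0; y < dh; y++)
        for (int x = 0; x < dw; x++)
            /* map each destination pixel back to its source pixel */
            dst[y * dw + x] = src[(y * sh / dh) * sw + (x * sw / dw)];
    return dst;
}
```

The compositor would keep the returned copy around for previews, and the client never needs to know it was captured.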
The way you want it, in theory, is for it to be minimally invasive for both the protocol (which should be generic) and the applications (which should be naive to their environment) and the compositor should be handling the intermediary.
If I had to choose a compositor, I'd choose the one that did that. I mean, what are the advantages to suspending buffer updates? Seriously, why stop gnome-terminal from sending buffer updates to its window? To save the minuscule system power required to print text on a black surface? In the case of laptops, a "power-saving mode" might be useful, where the compositor does just this to save power. Though I'm not sure how much power it'd actually save.
Honestly, it's up to the application whether or not it wants to continue drawing when it's hidden/off-screen. It probably isn't a big resource burner either way. What needs to be defined is whether or not the application needs to continue notifying the compositor when it (the application) has updated its buffer. Everything else is app/compositor specific (what the compositor does with the buffer when it receives an update, etc.). It might be a good idea to implement a low-memory state, where the buffer is garbage collected when the window is hidden and re-created when it's restored, but again, that's mostly implementation specific.
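That low-memory idea could look roughly like this. The struct and function names are hypothetical, just to show the buffer being dropped on hide and lazily recreated on restore:

```c
#include <assert.h>
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical low-memory policy: free a hidden window's buffer and
 * recreate it on restore. Names are illustrative only. */
struct window {
    uint32_t *buffer;
    size_t size; /* buffer size in bytes */
};

static void window_hide(struct window *w)
{
    /* drop the client buffer; the compositor keeps only its own preview */
    free(w->buffer);
    w->buffer = NULL;
}

static int window_restore(struct window *w)
{
    /* lazily reallocate; the client then redraws into the fresh buffer */
    if (!w->buffer)
        w->buffer = calloc(1, w->size);
    return w->buffer != NULL;
}
```

Whether this is worth the redraw cost on restore is exactly the implementation-specific trade-off described above.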
I'd imagine the biggest concern with regards to resource consumption would be the scaling that would have to occur to have those live thumbnails, but that'd only be a problem when the thumbnail is visible, and from what I understand that's not a big issue on modern GPUs anyway. I could be wrong, of course.
Maybe have window states that say: full-screen, windowed, and hidden (minimized).
Specifically chosen to give the best hints as to how to manage resources for the operating system, drivers, other software, and hardware.
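Those three coarse states could be sketched as a simple enum. The names here are illustrative and not part of any real Wayland protocol; the point is that the compositor and resource manager only need to know whether a buffer must be drawn at all:

```c
#include <assert.h>

/* Sketch of the three coarse surface states proposed above.
 * Hypothetical names, not from any real protocol. */
enum surface_state {
    SURFACE_FULLSCREEN, /* draw; may take over an output */
    SURFACE_WINDOWED,   /* draw at its windowed geometry */
    SURFACE_HIDDEN      /* no need to draw; resources may be dropped */
};

/* The only question the resource manager actually has to answer. */
static int needs_drawing(enum surface_state s)
{
    return s != SURFACE_HIDDEN;
}
```

Anything more subjective, like what "minimized" looks like, would stay out of this layer.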
Another thing might be a scaled window view: a version of the window rendered at a non-1:1 size. This could be enlarged, for example, by screen-reading helpers with magnification, or shrunk into a mini version for previews, some kind of thumbnail.
Whether minimized means the surface is invisible to the user or not is a big hint that this is not the way to define these lower-level protocols.

Quote:
Further, the term minimize is relatively subjective and defined by the implementation. Clients should not expect that minimized means the surface will be invisible to the user. There are several use cases where displaying minimized surfaces will be useful. Clients might want to change input handling or pause when minimized, but nothing should change with regards to submitting surface buffer updates.
The resource manager needs to know whether it needs to draw something or not. Therefore, see my post above about full-screen, windowed, and hidden.
If minimized means still visible, that's not very useful for lower-level resource management. You just have to define a way to say what to draw or not.
Implementing a highly ambiguous term like "minimize", which can mean multiple things, is not good. This should be handled by other protocols, and maybe even by desktop engines. Remember, the job of the low-level display stack is not to define high-level behaviour. To work optimally, it must say how resources are used, not define some ambiguous behaviour. If minimize means some kind of windowed mode, then it's not useful to define it in Wayland or Weston.
It should belong in another FreeDesktop.org standard.