GNOME Shell & Mutter Just Landed More Wayland Improvements

  • GNOME Shell & Mutter Just Landed More Wayland Improvements

    Phoronix: GNOME Shell & Mutter Just Landed More Wayland Improvements

    GNOME Shell and Mutter didn't see new 3.17.3 releases in time for last week's GNOME 3.17.3 development release, but today the new package versions were tagged...

  • #2
    Hmm I wonder if I will be able to use my graphics tablet in Krita under Wayland in September? Probably not, but it would be great ...

    • #3
      Originally posted by CrystalGamma View Post
      Hmm I wonder if I will be able to use my graphics tablet in Krita under Wayland in September? Probably not, but it would be great ...
      Summary of the talk: A brief explanation of the parts that needed to be worked on in order to get tablet support in Weston (libinput, the Wayland protocol, and...


      All opinions are my own not those of my employer if you know who they are.

      • #4
        Originally posted by Ericg View Post

        Summary of the talk: A brief explanation of the parts that needed to be worked on in order to get tablet support in Weston (libinput, the Wayland protocol, and...


        http://www.phoronix.com/scan.php?pag...tem&px=MTcyOTE
        Yes, I know that tablet support is in Wayland (first-class input device even) and is planned for GNOME 3.18 ...
        I'm rather worried that, because Krita (AFAIK) runs under XWayland, there would be complications.
        Or is there a working XWayland translation of tablet input? (Sorry, don't have the time to watch the video right now)

        • #5
          Originally posted by CrystalGamma View Post

          Yes, I know that tablet support is in Wayland (first-class input device even) and is planned for GNOME 3.18 ...
          I'm rather worried that, because Krita (AFAIK) runs under XWayland, there would be complications.
          Or is there a working XWayland translation of tablet input? (Sorry, don't have the time to watch the video right now)
          I am not sure of the details, but I thought libinput also worked under X11?

          • #6
            Originally posted by TheBlackCat View Post

            I am not sure of the details, but I thought libinput also worked under X11?
            Yes. It is now upstream.
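
            For the curious: X.Org consumes libinput through the xf86-input-libinput driver, and devices get matched to it with an ordinary InputClass section. A minimal sketch of what that looks like (the driver already ships a similar default config, so this is illustration rather than something you normally have to write by hand):

                Section "InputClass"
                        Identifier "libinput tablet catchall"
                        MatchIsTablet "on"
                        MatchDevicePath "/dev/input/event*"
                        Driver "libinput"
                EndSection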

            • #7
              Unfortunately the thing I'm waiting on has been pushed back (touchpad gestures). Touchpad gestures are a sad story on Linux today.

              • #8
                Originally posted by Pajn View Post
                Unfortunately the thing I'm waiting on has been pushed back (touchpad gestures). Touchpad gestures are a sad story on Linux today.
                Sadly true. I remember the multipointer and multitouch demos: https://www.youtube.com/watch?v=TBZtSf3sgeQ
                I think touch gestures are mostly vendor specific, and libinput is mainly a cleanup of the legacy input drivers.
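
                Assuming a libinput build that includes the gesture API (swipe and pinch events), reading those events straight from libinput looks roughly like the sketch below. It is a standalone illustration against libinput and libudev, not how GNOME wires it up:

                  #include <errno.h>
                  #include <fcntl.h>
                  #include <poll.h>
                  #include <stdio.h>
                  #include <unistd.h>
                  #include <libinput.h>
                  #include <libudev.h>

                  /* libinput asks the caller to open/close the evdev nodes on its behalf */
                  static int open_restricted(const char *path, int flags, void *user_data)
                  {
                      int fd = open(path, flags);
                      return fd < 0 ? -errno : fd;
                  }

                  static void close_restricted(int fd, void *user_data)
                  {
                      close(fd);
                  }

                  static const struct libinput_interface iface = {
                      .open_restricted = open_restricted,
                      .close_restricted = close_restricted,
                  };

                  int main(void)
                  {
                      struct udev *udev = udev_new();
                      struct libinput *li = libinput_udev_create_context(&iface, NULL, udev);
                      if (!udev || !li || libinput_udev_assign_seat(li, "seat0") != 0)
                          return 1;

                      /* wait for events on libinput's fd, then drain its event queue */
                      struct pollfd fds = { .fd = libinput_get_fd(li), .events = POLLIN };
                      while (poll(&fds, 1, -1) > 0) {
                          libinput_dispatch(li);
                          struct libinput_event *ev;
                          while ((ev = libinput_get_event(li)) != NULL) {
                              struct libinput_event_gesture *g;
                              switch (libinput_event_get_type(ev)) {
                              case LIBINPUT_EVENT_GESTURE_SWIPE_UPDATE:
                                  g = libinput_event_get_gesture_event(ev);
                                  printf("swipe %d fingers dx=%.2f dy=%.2f\n",
                                         libinput_event_gesture_get_finger_count(g),
                                         libinput_event_gesture_get_dx(g),
                                         libinput_event_gesture_get_dy(g));
                                  break;
                              case LIBINPUT_EVENT_GESTURE_PINCH_UPDATE:
                                  g = libinput_event_get_gesture_event(ev);
                                  printf("pinch %d fingers scale=%.2f\n",
                                         libinput_event_gesture_get_finger_count(g),
                                         libinput_event_gesture_get_scale(g));
                                  break;
                              default:
                                  break;
                              }
                              libinput_event_destroy(ev);
                          }
                      }

                      libinput_unref(li);
                      udev_unref(udev);
                      return 0;
                  }

                Build with something like gcc gestures.c $(pkg-config --cflags --libs libinput libudev); running it usually needs root or membership in the input group to open /dev/input.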

                • #9
                  I have absolutely no clue why they thought it was a good idea to separate mouse, touch and tablet input. It is all essentially the same, and having three different APIs means developers have to work three times harder and will likely only implement one.

                  There should just be one "input" event system based off the touch code, with an enum for the device type: whether it is a pen, a mouse or a finger. That way, if you do not care what the user used to click a button, it will always still work. But when you do want to separate pressure, or the top and bottom of a pen, you still can, without having to implement a new set of Wayland proxies and callbacks.

                  Android does this and it makes much more sense for developers.

                  • #10
                    Originally posted by Waylandtester View Post
                    I have absolutely no clue why they thought it was a good idea to separate mouse, touch and tablet input. It is all essentially the same, and having three different APIs means developers have to work three times harder and will likely only implement one.

                    There should just be one "input" event system based off the touch code, with an enum for the device type: whether it is a pen, a mouse or a finger. That way, if you do not care what the user used to click a button, it will always still work. But when you do want to separate pressure, or the top and bottom of a pen, you still can, without having to implement a new set of Wayland proxies and callbacks.

                    Android does this and it makes much more sense for developers.
                    First, they really aren't the same at all (on touch, finger down and move is a drag operation; on tablet/touchpad, it's not). Secondly, they really, really aren't the same. Thirdly, Android doesn't actually have the exact same API for them. Fourthly, the touch code would be the wrong place to start. Fifthly, we did that for XI2's multitouch API (where pointer and touch are mostly mashed into one), and it was a disaster, because again, they aren't all the same.
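
                    To make the "new set of Wayland proxies and callbacks" point concrete, here is a rough, incomplete sketch of what a client does just to listen for pointer and touch events from one wl_seat: each device class is its own protocol object with its own listener, and tablet support adds yet another family of objects in the same style. This is not a real application; without creating a surface it will never gain focus or receive any of these events.

                      #include <stdio.h>
                      #include <string.h>
                      #include <wayland-client.h>

                      /* wl_pointer: its own proxy, its own set of events */
                      static void on_pointer_enter(void *d, struct wl_pointer *p, uint32_t serial,
                                                   struct wl_surface *s, wl_fixed_t x, wl_fixed_t y) { }
                      static void on_pointer_leave(void *d, struct wl_pointer *p, uint32_t serial,
                                                   struct wl_surface *s) { }
                      static void on_pointer_motion(void *d, struct wl_pointer *p, uint32_t time,
                                                    wl_fixed_t x, wl_fixed_t y)
                      {
                          printf("pointer %.1f,%.1f\n", wl_fixed_to_double(x), wl_fixed_to_double(y));
                      }
                      static void on_pointer_button(void *d, struct wl_pointer *p, uint32_t serial,
                                                    uint32_t time, uint32_t button, uint32_t state) { }
                      static void on_pointer_axis(void *d, struct wl_pointer *p, uint32_t time,
                                                  uint32_t axis, wl_fixed_t value) { }
                      static const struct wl_pointer_listener pointer_listener = {
                          .enter = on_pointer_enter, .leave = on_pointer_leave,
                          .motion = on_pointer_motion, .button = on_pointer_button,
                          .axis = on_pointer_axis,
                      };

                      /* wl_touch: a different proxy with a different event vocabulary */
                      static void on_touch_down(void *d, struct wl_touch *t, uint32_t serial, uint32_t time,
                                                struct wl_surface *s, int32_t id, wl_fixed_t x, wl_fixed_t y)
                      {
                          printf("touch down id=%d\n", id);
                      }
                      static void on_touch_up(void *d, struct wl_touch *t, uint32_t serial,
                                              uint32_t time, int32_t id) { }
                      static void on_touch_motion(void *d, struct wl_touch *t, uint32_t time,
                                                  int32_t id, wl_fixed_t x, wl_fixed_t y) { }
                      static void on_touch_frame(void *d, struct wl_touch *t) { }
                      static void on_touch_cancel(void *d, struct wl_touch *t) { }
                      static const struct wl_touch_listener touch_listener = {
                          .down = on_touch_down, .up = on_touch_up, .motion = on_touch_motion,
                          .frame = on_touch_frame, .cancel = on_touch_cancel,
                      };

                      /* one seat, but pointer and touch are requested and listened to separately */
                      static void on_seat_caps(void *d, struct wl_seat *seat, uint32_t caps)
                      {
                          if (caps & WL_SEAT_CAPABILITY_POINTER)
                              wl_pointer_add_listener(wl_seat_get_pointer(seat), &pointer_listener, NULL);
                          if (caps & WL_SEAT_CAPABILITY_TOUCH)
                              wl_touch_add_listener(wl_seat_get_touch(seat), &touch_listener, NULL);
                      }
                      static void on_seat_name(void *d, struct wl_seat *seat, const char *name) { }
                      static const struct wl_seat_listener seat_listener = {
                          .capabilities = on_seat_caps, .name = on_seat_name,
                      };

                      static void on_global(void *d, struct wl_registry *reg, uint32_t name,
                                            const char *iface, uint32_t version)
                      {
                          if (strcmp(iface, "wl_seat") == 0) {
                              struct wl_seat *seat = wl_registry_bind(reg, name, &wl_seat_interface, 1);
                              wl_seat_add_listener(seat, &seat_listener, NULL);
                          }
                      }
                      static void on_global_remove(void *d, struct wl_registry *reg, uint32_t name) { }
                      static const struct wl_registry_listener registry_listener = {
                          .global = on_global, .global_remove = on_global_remove,
                      };

                      int main(void)
                      {
                          struct wl_display *dpy = wl_display_connect(NULL);
                          if (!dpy)
                              return 1;
                          struct wl_registry *reg = wl_display_get_registry(dpy);
                          wl_registry_add_listener(reg, &registry_listener, NULL);
                          wl_display_roundtrip(dpy);  /* bind the seat advertised by the compositor */
                          while (wl_display_dispatch(dpy) != -1)
                              ;  /* events would only arrive while one of our surfaces had focus */
                          return 0;
                      }

                    Build with something like gcc input.c $(pkg-config --cflags --libs wayland-client). Those per-class listeners are exactly the proxies and callbacks being argued about above.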
