Vertex packing is an old trick: you can create any packing scheme you like, however complex, as long as you can also unpack it in a vertex shader.
Are you sure it will actually be a benefit, though?
How can a vertex shader look at more than one vertex at a time? With delta compression, every vertex depends on the previous one, so the GPU could not unpack them in parallel. However, multiple delta-compressed sequences could be unpacked in parallel. So maybe each input vertex could be the start of a delta-compressed run, with a supplementary buffer providing the deltas? But a vertex shader is restricted to producing exactly one output vertex per input vertex, and what I really want is a way to generate vertices in shader code.

Bezier rendering is adaptive: you generate more vertices where the curve bends tightly and fewer where it is close to straight, so it's hard to predict the total vertex count in advance. With delta-compressed sequences the total number of output vertices can be known in advance, but it is still a vertex-generation problem, because each vertex must be visited in order to apply the delta from its predecessor. Isn't this kind of problem exactly what geometry shaders are for?
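To make the parallelism argument concrete, here is a minimal CPU-side sketch (plain Python, not shader code; the layout of the sequences and the delta buffer is my own assumption, not something from this thread). Within one sequence, decoding is inherently serial because each vertex is the previous vertex plus a delta; but separate sequences share no state, so a GPU could assign one sequence per invocation and decode them all in parallel:

```python
def unpack_sequence(start, deltas):
    """Decode one delta-compressed run.

    Each vertex is the previous vertex plus a (dx, dy, dz) delta,
    so decoding within a run is inherently serial.
    """
    verts = [start]
    for d in deltas:
        prev = verts[-1]
        verts.append(tuple(p + q for p, q in zip(prev, d)))
    return verts

# Hypothetical supplementary buffer: one (start_vertex, deltas) entry
# per sequence. The sequences are independent of each other, which is
# what would let a GPU unpack them in parallel, one per invocation.
sequences = [
    ((0.0, 0.0, 0.0), [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]),
    ((5.0, 5.0, 5.0), [(-1.0, 0.0, 0.5)]),
]

unpacked = [unpack_sequence(start, deltas) for start, deltas in sequences]
```

Note that the total output count here (length of each run plus one) is known up front, which matches the point above: the problem is not that the count is unpredictable, but that each run must be walked in order.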