As long as they don't set -Werror by default. I can't tell you how many codebases I have to compile on a regular basis that work fine in practice but throw all kinds of warnings that I can't be arsed to fix (and neither can the upstream maintainers, apparently). It's gotten to the point where it's something of a badge of honor among FOSS developers to throw as many warnings as you can while still producing well-working code.
Newer versions of GCC have had issues where trying to compile older codebases results in errors that can be silenced with -fpermissive. Yes, yes, I know; undefined behavior sucks and all that. But if it works, I just want you to STFU and build my program - raagh! I don't have time to make everything 100% correct per the language spec. I'm not a language lawyer; I'm a practical programmer. Shut up and give me a binary. If my undefined behavior leads to problems, I'll see them, and I'll fix them. Nyeh.
> It's gotten to the point where it's something of a badge of honor among FOSS developers to throw as many warnings as you can while still producing well-working code.
I know you're joking, but I think that in practice, most of the warnings come from GCC's increasing strictness - code compiles cleanly on one version of GCC, spits out warnings on the next version, and starts throwing compile errors on the version after that. It might have been bad code to start with, but it was bad code that the compiler used to accept without a fuss.
From that point of view, increasing numbers of warnings are a *good* thing.