Virtual by default or not?

I've been skimming through “Hardcore Java”, and I came across a section on the use of 'final' in Java. In it, one of Simmons' comments (okay, I'm not sure it's his comment, because he's listed as editor and not author, but that sounds better than “whoever wrote this section”) is (and I'm paraphrasing here):

Don't mark methods as final unless you're certain they should never be overridden

He then goes on to say that this is because you never know who might want to extend your class.
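For context, Java instance methods are overridable (“virtual”) unless you mark them final. A minimal sketch (the class names are mine, not from the book):

```java
// In Java, instance methods can be overridden unless they're marked final.
class Account {
    // Left non-final: any subclass may override this.
    double interestRate() {
        return 0.02;
    }

    // Marked final: the compiler rejects any attempt to override it.
    final void audit() {
        System.out.println("auditing");
    }
}

class SavingsAccount extends Account {
    @Override
    double interestRate() {
        return 0.05; // fine: the method was left overridable
    }

    // @Override
    // void audit() { }  // would not compile: cannot override a final method
}
```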

There are really two viewpoints on this issue. They are:

“Make it virtual in case somebody wants to extend it”

and

“Don't make it virtual unless you're sure somebody wants to extend it”

I'd like to expand on the first viewpoint, which I've sometimes labelled “speculative virtuality”. I don't like it.

My reasons have everything to do with predictability and robustness. If you can't conceive of a user extending your class in a specific way but choose to provide such extensibility anyway, then it's clear that you don't have an extension scenario in mind, which means that neither your code nor your tests are likely to ensure that extension actually works, especially across versions. Sure, it may work in the current version, and it may even continue to work in future versions, but I wouldn't call it a supported scenario. I'd rather use classes where I know what I'm doing is supported.
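A well-known illustration of how unplanned extension breaks (adapted from Joshua Bloch's “Effective Java”, not from “Hardcore Java”): subclass HashSet to count insertions, and your count silently depends on whether addAll happens to call add internally.

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;

// Tries to count how many elements have ever been added to the set.
class InstrumentedHashSet<E> extends HashSet<E> {
    private int addCount = 0;

    @Override
    public boolean add(E e) {
        addCount++;
        return super.add(e);
    }

    @Override
    public boolean addAll(Collection<? extends E> c) {
        addCount += c.size();
        // HashSet's addAll (inherited from AbstractCollection) happens to
        // call add() once per element, so everything is counted twice.
        return super.addAll(c);
    }

    public int getAddCount() {
        return addCount;
    }

    public static void main(String[] args) {
        InstrumentedHashSet<String> s = new InstrumentedHashSet<>();
        s.addAll(Arrays.asList("a", "b", "c"));
        System.out.println(s.getAddCount()); // prints 6, not 3
    }
}
```

Whether addAll delegates to add() is an implementation detail that's free to change in any future version, which is exactly the predictability problem.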

The second issue is understandability. If I walk up to a class (in the metaphorical sense, of course; you can't really “walk up” to a class) and it has 29 virtual methods, it's hard for me to tell whether the author *intended* me to extend the class through a certain method or set of methods, or whether they just left them virtual “just in case”. Whereas if a class has only one virtual method (or a small number of them), that's a good indication that the designer wants me to use them (i.e., they're part of a supported scenario).
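To make the contrast concrete, here's a hypothetical sketch (the names are mine) of a class that advertises exactly one supported extension point and seals the rest:

```java
// One abstract method is the designed extension point; the rest is final,
// which tells readers "override this, and only this."
abstract class ReportWriter {
    // The supported scenario: subclasses decide how a record is formatted.
    protected abstract String formatLine(String record);

    // Final: the surrounding algorithm isn't meant to be replaced.
    public final String write(Iterable<String> records) {
        StringBuilder out = new StringBuilder();
        for (String record : records) {
            out.append(formatLine(record)).append('\n');
        }
        return out.toString();
    }
}

class CsvReportWriter extends ReportWriter {
    @Override
    protected String formatLine(String record) {
        return '"' + record.replace("\"", "\"\"") + '"';
    }
}
```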