In programming circles, when the question of quality versus quantity comes up, quality is often cited as the clear winner. The argument is that focusing on quantity only ends up hurting us in the long run. Sacrificing quality usually means taking so-called shortcuts, which can lead to headaches in the future. When the shortcuts turn into dead ends, we end up having to take detours to get around them.
But does focusing completely on quality necessarily improve it? We justify spending more time on quality by saying that once something is done the right way, the problem is less likely to resurface in other forms. In other words, we spend more time on it now so that we don't have to later. But what does it actually mean to concentrate on the quality of an application? Anticipating future uses of the code, removing all duplication, or documenting what I'm doing doesn't necessarily mean the code is of higher quality. Even if I eliminate known bugs, I haven't necessarily raised the bar: I could have introduced other defects as side effects, or new security or performance issues. I can spend more time writing tests, but tests only verify behavior; they don't by themselves make the code any better.
I won't attempt to define software quality myself. I only wanted to point out that it's not completely obvious, and that efforts to improve it, refactoring included, carry their own risks. Instead, I'll summarize an interesting story I came across here.
In a ceramics class, half of the students were graded solely on the quantity of work they produced. The other half were graded solely on quality. You'd think the quantity students would churn out tons of subpar work, while each quality student would produce one amazing piece. At least, that's what I thought.
I was wrong. The quantity group in fact ended up producing the higher-quality work. They were able to learn from their own mistakes. The quality group, for all their sound theorizing about what makes a perfect piece, failed to deliver.
This sounds a lot like programming. And unless you're this guy, it usually takes a few iterations to get something right.
We tend to learn more effectively from our own mistakes. Just as parents let their children make their own mistakes, programmers learn to avoid pitfalls by first falling into them. The better programmers are the ones who write more code.