It would be a good thing, as Alistair suggests, to be able to model the effect of refactoring on ExtremeProgramming speed. But we don't have to model it to know that we must refactor to be effective.
In the context of the other ExtremeProgramming rules, refactoring is necessary to ensure that the code meets quality standards. It also helps development go quickly.
If you use YouArentGonnaNeedIt, JustInTimeProgramming, DoTheSimplestThingThatCouldPossiblyWork, and the other ExtremeProgramming practices, the emergent property is that you are doing iterative development. You consciously choose to add functionality as fast as possible, in order to learn as rapidly as possible what it is you are trying to build.
As DoomSayers have suggested elsewhere, these practices, unchecked, would lead to hideous spaghetti code, redundant and conflicting implementations, code bloat, unreliability, and nuclear winter. The DoomSayers are correct: you need something to balance quality against the pace of learning.
To ensure that these evils don't occur, ExtremeProgramming explicitly includes the rule to RefactorMercilessly. We interpret DoTheSimplestThingThatCouldPossiblyWork to mean not only picking the simplest solution, but also writing the simplest (least redundant, most communicative) code to implement it. It is this refinement process that reduces the chaos that a rapid evolutionary process like XP's would otherwise cause.
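For instance, here is a minimal Java sketch (class and method names are invented for illustration, not code from any XP project) of what "least redundant, most communicative" looks like in the small: the same discount arithmetic stated twice is extracted into one named method.

  // Before: the discount arithmetic appears twice.
  class InvoiceBefore {
      double total;

      double retailPrice()    { return total - total * 0.05; }
      double wholesalePrice() { return total - total * 0.15; }
  }

  // After refactoring mercilessly: the shared idea is stated once and named.
  class InvoiceAfter {
      double total;

      double retailPrice()    { return discountedBy(0.05); }
      double wholesalePrice() { return discountedBy(0.15); }

      private double discountedBy(double rate) {
          return total - total * rate;
      }
  }

The behavior is unchanged; only the redundancy is gone, so the next change to the discount rule happens in one place.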
If we were to apply the other practices without concentrating on refactoring, the code would become increasingly hard to maintain and progress would slow. This isn't just a belief; the C3 project experienced the problem. See GoFasterWithRefactoring. --RonJeffries
This page mirrored in WikiPagesAboutRefactoring as of July 17, 2004