Steve defines stratification as "trying to keep the levels of decomposition stratified so that you can view the system at any single level and get a consistent view". Later, in his example, he describes a common scenario: when people try to build a "modern" system on top of "a lot of older, poorly designed code", it is better to design "a layer of the new system that's responsible for interfacing with the old code".
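A minimal sketch of what such an interfacing layer might look like (all names here are hypothetical, invented purely for illustration): the legacy code exposes an awkward interface, and a thin adapter presents a clean, modern one, so everything above it sees a consistent level of abstraction.

```python
class LegacyBilling:
    """Stands in for the old, poorly designed code we cannot change."""

    def CALC(self, amt_cents, flag):
        # Cryptic legacy behavior: flag == 1 means "apply 10% tax",
        # and all amounts are integers in cents.
        return amt_cents + (amt_cents // 10 if flag == 1 else 0)


class BillingAdapter:
    """The new system's layer responsible for talking to the old code."""

    def __init__(self, legacy):
        self._legacy = legacy

    def total_with_tax(self, amount_dollars: float) -> float:
        # Translate the modern, readable call into the legacy dialect.
        cents = round(amount_dollars * 100)
        return self._legacy.CALC(cents, 1) / 100


adapter = BillingAdapter(LegacyBilling())
print(adapter.total_with_tax(10.00))  # callers never see CALC or its flags
```

The rest of the new system depends only on `BillingAdapter`, so each level of the design stays consistent and the legacy quirks are confined to a single layer.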
Theoretically, this is good and reasonable advice. But in the real world, it can be misinformed. The point is that your "modern" system will eventually become outdated, and your successors may consider it poorly designed when they build the next system on top of it, so an additional layer gets added between your system and theirs. Eventually the system becomes a sandwich of many layers, each built on the assumption that its base is poorly designed and therefore not well stratified.
This raises two questions:
1. How do we prevent our "modern" system from becoming, someday, the poorly designed base that the next layer is built to stratify over?
2. When should we stop piling up the sandwich and build the system from the ground up?
In one of my previous projects, I found code whose history traced back to the 1980s, along with some strange layers created to stratify the legacy code away from the new codebase. That strategy worked, but I don't know whether we would have saved a lot of money and time had we built the system from the ground up. (The whole project lasted three years and cost more than 100 million dollars.)
Another interesting story: after revolutionary changes to its codebase, Firefox (previously called Firebird) was reborn from the ashes of Netscape Navigator.
So it is a real challenge to decide when it is time to abandon your legacy code instead of reusing it.