Assume that both formats have their limitations.
Further, assume that when a component "breaks in", the way that the 1s and 0s are picked up, transmitted to the D/A, converted, and fed to the analog output all change as the tiny pieces adjust to having current flowing through them.
I believe (as most do) that the bandwidth and the amount of information conveyable on CD is less than what is possible via SACD. Therefore, it is possible that a cable might sound fine in your system while listening to CD, because CD's inherent limitations could mask the limitations of the cable. Once you put an SACD player in there, you might notice that your cable isn't giving you that last degree of detail or finesse, which went totally unnoticed (and was therefore unnecessary) with plain CD; now that you've got that truer source, it's as plain as day.

From there, it isn't much of a stretch to suggest that the burn-in process will "peak" faster with the poorer source, since many of the changes that happen while a component breaks in could fall outside of what is noticeable, because of the limitations of CD. You might think that your CD/SACD player is burned in while listening to CD, since you stopped hearing a difference. I bet that with some crap wire, a really poor preamp, or even a reference system fed by an 8-track as a source, you might never notice the sound changing at all.

So: you're listening to CD, the sound changes, continues to change, then stops changing. You might assume that the break-in process is 100% complete. But switch to a more accurate, wider-bandwidth source, and those changes that were inaudible, outside of the CD's bandwidth, or masked by its sonic limitations (only so many samples for a given frequency range), are now placed under a microscope, since SACD samples roughly 64 times as often as CD. The part of the break-in that was still happening all along, but was unnoticeable, is now noticeable.
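Just to put numbers on that "64 times" figure: as I read it, the ratio comes from SACD's DSD sample rate versus CD's, not from total information, since in raw bits the gap is much smaller. A quick back-of-the-envelope sketch (my own illustration, not part of the original point):

```python
# Published format specs:
cd_rate = 44_100        # CD: 44.1 kHz PCM, 16 bits per sample
sacd_rate = 2_822_400   # SACD: 2.8224 MHz DSD, 1 bit per sample

# Sample-rate ratio: where the "64x" figure comes from.
print(sacd_rate / cd_rate)                       # 64.0

# Raw bit-rate ratio (stereo): a much more modest gap.
cd_bits = cd_rate * 16 * 2
sacd_bits = sacd_rate * 1 * 2
print(sacd_bits / cd_bits)                       # 4.0
```

So SACD takes 64 times as many samples per second, but because each DSD sample is a single bit, the raw data rate is only about 4 times CD's.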
Just a theory.