I'd characterize the answer a bit differently from those above and note that there is no universal "right" answer to the question.
First, note that the 3 dB down point is meaningless - unless it's stated against a reference SPL at a given level of distortion. Example: F3 from 90 dB @ 10% THD is XYZ Hz.
If you hold the room's overall SPL constant (in this case 90 dB) and add a second sub, you will typically see F3 move down significantly - AT THE STATED DISTORTION LEVEL (i.e. 10% THD). Just how significantly will depend on the particular room, the reference level you've chosen for the test, and the specific subwoofer(s) in question.
The key point is that adding the second sub allows each individual sub to play at a 3 to 5 dB lower SPL while maintaining the same overall SPL (90 dB, in our case) in the room. Thus, each sub exhibits less distortion at every frequency. The question is: How much less distortion and - more specifically - how much less distortion right at F3 at the test's given SPL?
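That 3 to 5 dB range falls out of how SPLs from two sources sum: incoherent (power) summation of two equal sources adds 3 dB, coherent (in-phase, pressure) summation adds 6 dB, and real rooms land somewhere between. A quick sketch of the arithmetic (the `combined_spl` helper is my own illustration, not from any particular library):

```python
import math

def combined_spl(levels_db, coherent=False):
    """Combine SPLs from multiple sources.
    Coherent (in-phase, effectively co-located) sources sum on pressure;
    incoherent sources sum on power."""
    if coherent:
        # Pressure sum: convert to pressure ratios, add, convert back.
        return 20 * math.log10(sum(10 ** (l / 20) for l in levels_db))
    # Power sum: convert to power ratios, add, convert back.
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

# One sub at 90 dB vs. two subs sharing the load:
print(round(combined_spl([87, 87]), 1))        # incoherent: each sub runs 3 dB quieter
print(round(combined_spl([84, 84], True), 1))  # coherent: each sub runs 6 dB quieter
```

Both cases hit roughly 90 dB in the room, which is why each sub individually gets to back off by several dB - and why its distortion drops.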
If you examine the performance of most (though not ALL) home subwoofers below 50-ish Hz, you will see that they pretty quickly reach a point at which THD begins to increase almost exponentially with decreasing frequency - the distortion graph goes almost vertical as frequency continues to fall - provided a reasonably high test SPL. Let's call that point where performance goes to hell in a hurry the "critical" frequency. Since adding the second subwoofer effectively reduces the SPL at which each sub is playing by 3 to 5 dB, this may reduce distortion a TON, depending on where F3 sits vis-a-vis the test sub's critical frequency. OTOH, it may reduce it just a tiny bit if the single test sub is already behaving well (10% THD) at F3 at the stated SPL.
IME, this happy result is pretty unlikely, unless you have a very small room and/or a monster sub and/or a very high stated THD and/or a very low reference level for testing F3. For context, high power, small cabinet subs (1 cu ft) like the Velo SLP or Sunfire often show 30+% THD at 90 dB at 50 Hz.
Hope that makes sense.
Marty