You can and you will lose 3 to 6 dB. All that said, try it and see if your ear can tell the difference on your favorite stations and then on some fringe stations too.
Depending on the quality of the splitter used, you can end up with a loss of 3 - 10 dB. Try to use a splitter rated for the widest bandwidth possible; the higher the frequency a splitter is rated for, the more likely it is a low-loss design. I would also suggest using a splitter with only the number of taps that you need, i.e. if you need two but think that you may need more later, DO NOT buy a four-way and let the others go to waste. Even though you aren't using the other taps, they will drop the signal down even further than I described.
As far as the antenna being mounted in your attic, this antenna is VERY sensitive to what is around it. Mount the antenna up as high as possible and away from any other metal objects. Sean
Sean is right about not getting a splitter with extra unused outputs. You lose some gain even on the unused outputs.
Just noting that there are splitters that do not split evenly. Say a two-way splitter loses 3.5 dB. There are some three-way splitters that, instead of losing 5.83 dB on every output, have one output that still loses only 3.5 dB, while the other leg is split again so the remaining two outputs lose 7.0 dB each.
I use one like this where the cable comes into the house. The broadband internet cable modem is on the output that still loses only 3.5 dB, and the TV signals are on the two 7.0 dB outputs. Holland Electronics is one maker of splitters like this. The regular even 3-way, Holland calls a "balanced splitter".
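For what it's worth, the ideal power-division math behind those figures can be sketched in a few lines of Python. The 3.5 and 7.0 dB numbers quoted above include real-world insertion loss on top of these theoretical minimums, so the sketch is illustrative, not a spec:

```python
import math

def ideal_split_loss_db(n_ways: int) -> float:
    """Theoretical loss from dividing power evenly n ways (excludes insertion loss)."""
    return 10 * math.log10(n_ways)

# Even splits: a 2-way ideally loses ~3.0 dB per output, a 3-way ~4.8 dB.
print(round(ideal_split_loss_db(2), 2))  # 3.01
print(round(ideal_split_loss_db(3), 2))  # 4.77

# Unbalanced 3-way: a 2-way split, with one of the two legs split again.
favored_leg = ideal_split_loss_db(2)        # ~3.0 dB on the favored output
other_legs = 2 * ideal_split_loss_db(2)     # ~6.0 dB on each of the other two
```

Real parts typically add a half dB or more of insertion loss per leg, which is how the ideal 3.0/6.0 dB figures become the 3.5/7.0 dB numbers on the splitter's label.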
I tried this once and have come to the conclusion that it is not the way to go. The signal was severely compromised, and the result was not enjoyable.
I ended up buying two identical MD ST-2's and running one to each tuner. It is WELL worth the extra $100.
Sugar is right on about the coax, too. Definitely use high-quality RG-6.
Modern FM tuners will "capture" any signal that is above a fairly low threshold, and as long as your signal stays above that capture level, a stronger signal (i.e. less gain loss) doesn't matter. A splitter incurs a gain loss, but the S/N ratio is unchanged. This is why signal booster amplifiers generally don't help: signal-to-noise ratio is usually more important than raw signal strength, and audiophile tuners generally have a front end superior to what you find in a booster amp. One possible exception is when the antenna is located a long way from the tuner and the signal needs to be boosted so that it is not degraded by pickup in the lead-in wire. In the days of unshielded twinlead, and in fringe reception areas (where I live), this was often the case, but nowadays shielded coax is the usual lead-in. I have about 100 feet of shielded 300 ohm twinlead, and it proved superior to a remote booster amp mounted at the antenna.
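The point that a splitter leaves S/N untouched can be sketched with hypothetical numbers. All the dBm figures below are made up purely for illustration; real values depend on your antenna, location, and tuner:

```python
def snr_db(signal_dbm: float, noise_dbm: float) -> float:
    """Signal-to-noise ratio in dB, given signal and noise power in dBm."""
    return signal_dbm - noise_dbm

# Hypothetical figures: -60 dBm signal off the antenna, -110 dBm off-air noise.
sig, noise = -60.0, -110.0
split_loss = 3.5  # a typical 2-way splitter loss

# A passive splitter attenuates signal and antenna noise by the same amount,
# so the S/N ratio presented to the tuner does not change:
snr_before = snr_db(sig, noise)
snr_after = snr_db(sig - split_loss, noise - split_loss)
print(snr_before == snr_after)  # True

# Trouble starts only if the split signal drops below the tuner's
# capture/limiting threshold (a hypothetical -95 dBm here):
threshold = -95.0
print(sig - split_loss > threshold)  # True: this station survives the split
```

That is also why a booster amp can't improve a weak station's S/N: it raises signal and noise together, and adds a little noise of its own on top.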
To your question: a splitter may degrade reception of stations that are already so weak that they are unpleasant to listen to. For the stronger stations - the ones you are apt to listen to - there probably will be no degradation. Keep the antenna away from metallic stuff and wires. You may need to move the antenna around to find the best location and orientation by experiment. Maybe you don't remember "rabbit ears" TV antennas, but twisting those things around to optimize the TV picture was a fine art.