Ask Dr. SETI ®
It seems to me that if we are looking for a beacon, and if a civilization were to engineer one, it would probably take advantage of low absorption frequencies and of the antenna gain available from a dish at high frequencies. But if a civilization had no beacon, at which frequency should we look? We emit huge amounts of power at very low frequencies. Large commercial radio stations run 50,000 watts, and military VLF transmitters put out much more (on the order of megawatts).
I realize that with a lower frequency, antenna gain and angular resolution go way down. But the power available at low frequencies is so huge, I wonder if searching those frequencies might give us the first hint that a civilization exists, even if we would not have the resolution to know where it was coming from.
Or does the antenna gain of the big dishes make up for the higher power in the lower radio frequencies? Are the low frequency radio stations less detectable, even with their high power?
Dick H., via email
The Doctor Responds:
The problem with very low frequency SETI is that the ionosphere effectively blocks most of the signal energy which might otherwise reach Earth. That is one of the reasons that Grote Reber moved to Tasmania (Australia's southernmost state) -- he was doing low frequency radio astronomy, and thought he could see through a hole in the ionosphere there. (A better solution would be to observe from space, but the costs are prohibitive!)
Yes, it's true that additional antenna gain helps, and this seems to favor the higher frequencies. In fact, given a fixed antenna size, such gain dominates the link analysis, more than compensating for power differences with increasing frequency. Here's my back-of-the-envelope calculation:
Assume that, on average, available power decreases linearly with increasing frequency. That means every time you double the frequency, you have 3 dB less signal available. But according to the Inverse Square Law (which Congress forgot to repeal when they cancelled the NASA SETI program), free space isotropic path loss increases with the square of frequency. That means that every time you double frequency, you have 6 dB more signal loss en route. Combining these two effects, the net penalty for increasing frequency by a factor of two is going to be about 9 dB.
But, for a given physical antenna size, antenna gain increases with the square of frequency. That means if you double frequency, you've picked up 6 dB in the antenna (all else remaining equal).
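That square-law relationship between gain and frequency follows from the standard parabolic dish gain formula, G = η(πD/λ)². Here is a minimal sketch of that calculation; the 10 m dish, the 1.42 GHz starting frequency, and the 0.55 aperture efficiency are illustrative assumptions, not values from the column:

```python
import math

def dish_gain_db(diameter_m, freq_hz, efficiency=0.55):
    """Gain of a parabolic dish of fixed physical size, in dB.

    Uses G = efficiency * (pi * D / wavelength)^2; since wavelength
    is inversely proportional to frequency, gain grows as f squared.
    """
    c = 299_792_458.0              # speed of light, m/s
    wavelength = c / freq_hz
    gain = efficiency * (math.pi * diameter_m / wavelength) ** 2
    return 10 * math.log10(gain)

# Doubling frequency for the same dish adds ~6 dB of gain:
g1 = dish_gain_db(10, 1.42e9)      # a 10 m dish at 1.42 GHz
g2 = dish_gain_db(10, 2.84e9)      # same dish, twice the frequency
print(round(g2 - g1, 2))           # 6.02
```

The 6 dB figure is independent of the particular dish size or efficiency chosen, since both cancel when the two gains are differenced.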
So, netting those two effects against each other, it sounds as though the penalty for doubling frequency is 3 dB. Only, remember that in any communications path there are two antennas (one for transmit, and the other for receive). Since each antenna experiences a gain increase of 6 dB for a doubling of frequency, the overall result is that doubling frequency gives you a total system improvement of 3 dB (assuming fixed antenna size at both ends) -- even neglecting ionospheric effects.
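The whole back-of-the-envelope budget can be tallied in a few lines. This sketch just restates the column's four per-octave terms in dB form, under the same assumptions (power falling linearly with frequency, fixed dish sizes at both ends):

```python
import math

def octave_budget_db():
    """Net change in received signal, in dB, per doubling of frequency."""
    tx_power   = -10 * math.log10(2)   # -3 dB: available power assumed to halve
    path_loss  = -20 * math.log10(2)   # -6 dB: isotropic path loss grows as f^2
    tx_antenna = +20 * math.log10(2)   # +6 dB: transmit dish gain grows as f^2
    rx_antenna = +20 * math.log10(2)   # +6 dB: receive dish gain grows as f^2
    return tx_power + path_loss + tx_antenna + rx_antenna

print(round(octave_budget_db(), 2))    # 3.01 -> ~3 dB net improvement per octave
```

The positive result confirms the Doctor's conclusion: with fixed-size dishes at both ends, each doubling of frequency buys about 3 dB.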
These are only rough calculations, to be sure. But they tend to underscore why higher frequencies are favored for space communications in general, and point-to-point interstellar contact in particular. Even factoring in power and path loss considerations, the higher the frequency the better (up to a point, of course!).
What that point might be has yet to be determined.
entire website copyright © The SETI League, Inc.
this page last updated 27 September 2003