This module describes line-of-sight communication.

Long-distance transmission over either kind of channel encounters attenuation problems. Losses in wireline channels are explored in the Circuit Models module, where repeaters can extend the distance between transmitter and receiver beyond what the passive losses of the wireline channel impose. In wireless channels, not only does radiation loss occur, but also one antenna may not "see" another because of the earth's curvature.

Two antennae are shown, each having the same height. Line-of-sight transmission means the transmitting and receiving antennae can "see" each other as shown. The maximum distance at which they can see each other, $d_{LOS}$, occurs when the sighting line just grazes the earth's surface.

At the usual radio frequencies, propagating electromagnetic energy does not follow the earth's surface. Line-of-sight communication has the transmitter and receiver antennas in visual contact with each other. Assuming both antennas have height h above the earth's surface, the maximum line-of-sight distance is

$$d_{LOS} = 2\sqrt{2hR + h^2} \approx 2\sqrt{2Rh}$$
where $R$ is the earth's radius ($6.38 \times 10^6$ m).
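This formula is easy to check numerically. Below is a minimal Python sketch (not part of the original text) that evaluates both the exact expression and the approximation; the example height of 1.8 m, roughly eye level, is an assumed value chosen only for illustration.

import math

R = 6.38e6  # earth's radius in meters, as given above

def d_los(h):
    # exact expression: 2 * sqrt(2hR + h^2), both antennas at height h
    return 2 * math.sqrt(2 * h * R + h ** 2)

def d_los_approx(h):
    # approximation 2 * sqrt(2Rh), valid because R >> h
    return 2 * math.sqrt(2 * R * h)

h = 1.8  # assumed antenna height in meters (roughly eye level)
print(d_los(h), d_los_approx(h))  # both print about 9.6e3 m, i.e. 9.6 km

The two results agree to within a fraction of a meter, which is the point of the approximation: for any realistic antenna height, $h^2$ is negligible next to $2hR$.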

Derive the expression of line-of-sight distance using only the Pythagorean Theorem. Generalize it to the case where the antennas have different heights (as is the case with commercial radio and cellular telephone). What is the range of cellular telephone where the handset antenna has essentially zero height?

Use the Pythagorean Theorem, $(h + R)^2 = R^2 + d^2$, where $h$ is the antenna height, $d$ is the distance from the top of the antenna to a tangency point with the earth's surface, and $R$ the earth's radius. The line-of-sight distance between two earth-based antennae equals

$$d_{LOS} = \sqrt{2 h_1 R + h_1^2} + \sqrt{2 h_2 R + h_2^2}$$
As the earth's radius is much larger than the antenna height, we have to a good approximation that $d_{LOS} \approx \sqrt{2 h_1 R} + \sqrt{2 h_2 R}$. If one antenna is at ground elevation, say $h_2 = 0$, the other antenna's range is $\sqrt{2 h_1 R}$.
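To make the cellular-telephone case concrete, here is a short Python sketch (an illustration, not from the original text) of the generalized formula; the 30 m base-station height is an assumed figure, not one given in the text.

import math

R = 6.38e6  # earth's radius in meters

def d_los(h1, h2):
    # line-of-sight distance for antennas of unequal heights, using the approximation R >> h
    return math.sqrt(2 * h1 * R) + math.sqrt(2 * h2 * R)

# Handset at essentially zero height, base-station antenna at an assumed 30 m
print(d_los(30, 0))  # about 1.96e4 m, i.e. roughly 20 km

So a single base-station antenna of modest height covers a radius of only about 20 km, which is one reason cellular systems need many base stations.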


Can you imagine a situation wherein global wireless communication is possible with only one transmitting antenna? In particular, what happens to wavelength when carrier frequency decreases?

As frequency decreases, wavelength increases and can approach the distance between the earth's surface and the ionosphere. Assuming a distance between the two of 80 km, the relation $\lambda = c/f$ gives a corresponding frequency of 3.75 kHz. Such low carrier frequencies would be limited to low-bandwidth analog communication and to low data rate digital communications. The US Navy did use such a communication scheme to reach all of its submarines at once.
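The 3.75 kHz figure follows directly from $\lambda = c/f$. A short Python check (illustrative only; the 80 km wavelength is the assumption stated above):

c = 3e8               # speed of light in m/s
wavelength = 80e3     # assumed earth-ionosphere distance in meters
print(c / wavelength) # 3750.0 Hz, i.e. 3.75 kHz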


Using a 100 m antenna would provide line-of-sight transmission over a distance of 71.4 km. Using such very tall antennas would provide wireless communication within a town or between closely spaced population centers. Consequently, networks of antennas sprinkle the countryside (each located on the highest hill possible) to provide long-distance wireless communications: each antenna receives energy from one antenna and retransmits to another. This kind of network is known as a relay network.
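Here is a rough Python sketch (not from the original text) of how the 71.4 km figure constrains a relay network; the 500 km end-to-end distance is an assumed example, not a value from the text.

import math

R = 6.38e6  # earth's radius in meters

def d_los(h):
    # exact line-of-sight distance for two antennas of equal height h
    return 2 * math.sqrt(2 * h * R + h ** 2)

hop = d_los(100)              # about 7.14e4 m, i.e. 71.4 km per hop
total = 500e3                 # assumed end-to-end distance of 500 km
hops = math.ceil(total / hop)
print(hop, hops)              # 7 hops, so 6 intermediate relay antennas between the endpoints

Each intermediate antenna in the chain receives from the previous hop and retransmits toward the next, exactly the relay-network behavior described above.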

Source:  OpenStax, Fundamentals of electrical engineering i. OpenStax CNX. Aug 06, 2008 Download for free at http://legacy.cnx.org/content/col10040/1.9