Wireless technology seems like magic to many: the ability to connect to a network and get all of the same functionality as a wired connection, with the added bonus of walking around without some cord keeping you tethered down, sounds like something out of a J.K. Rowling book. To many, wireless technology is a new and potentially scary invention associated with WiFi and cell phones; however, the underlying concepts have been around for over 100 years.
Wireless technology, whether it is the latest WiFi 6 standard, the newest 5G standard, or plain old GSM cellular, relies on the same basic principles of radio developed back in 1895. While new protocols and standards define different wireless technologies and have vastly improved speed and efficiency in the wireless space, they all still rely on the fundamental principles of electromagnetic radio waves. In this post I want to talk about those fundamental principles in an attempt to demystify wireless technology in general.
So what does wireless technology use to transmit data from point A to point B? The short answer is radio waves. But what are radio waves? Radio is the lowest band of frequencies in the electromagnetic spectrum. Depending on who you talk to, the radio band either includes or excludes the microwave frequencies. Most people I've talked to include microwaves in the overall radio band; however, as we will discuss later, it makes sense to separate the band into radio and microwaves simply because different technologies fit neatly into one or the other.
As this figure shows, the electromagnetic spectrum starts at radio, works its way up through visible light, and ends at X-rays and gamma rays. All of the technologies listed under each category share electromagnetism as a common medium. What separates the categories is frequency (or, equivalently, wavelength).
When talking about the dangers of electromagnetism, there is a line located roughly where visible light ends and ultraviolet light begins. It's the point where the wavelength becomes so short (and the frequency so high) that the radiation can penetrate the skin and damage cells. This is why we block UV rays with sunblock to keep from getting sunburnt, and why technicians have to take precautions when working with X-ray machines. Beyond this point the radiation is considered ionizing. As you can see, radio and microwaves are both very far away from the ionizing zone, so they are considered non-ionizing and, from a frequency standpoint, safe.
Frequency vs amplitude
As stated above, the different categories of electromagnetism can be defined by their frequency range, called a band. Most modern wireless digital communication operates in the microwave frequency range, while broadcast radio and analog television operate in the radio frequency range. For the remainder of this post my focus will be on the microwave spectrum, with a few mentions of broadcast radio to help illustrate concepts.
How do we measure the underlying waveform of a wireless signal? Two properties come into play: the radio wave's frequency, and its amplitude, or the height of the wave. Think of these two concepts in terms of an ocean wave: wavelength is the distance between successive waves (and frequency is how many waves pass per second), while amplitude is how tall each wave is. So when we measure a radio wave we consider its frequency (measured in hertz (Hz)) and its amplitude (commonly expressed in decibels (dB)).
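Frequency and wavelength are two sides of the same coin: multiplied together they always equal the speed of light. A quick sketch in Python makes the relationship concrete (the 2.4 GHz and 1000 kHz figures are just illustrative values, not taken from any specific device):

```python
# Speed of light in meters per second.
C = 299_792_458

def wavelength_m(frequency_hz: float) -> float:
    """Wavelength in meters of an electromagnetic wave: c divided by frequency."""
    return C / frequency_hz

# A 2.4 GHz WiFi signal has a wavelength of roughly 12.5 cm.
print(f"2.4 GHz WiFi: {wavelength_m(2.4e9) * 100:.1f} cm")

# An AM broadcast at 1000 kHz has a wavelength of roughly 300 m.
print(f"1000 kHz AM:  {wavelength_m(1.0e6):.0f} m")
```

Notice how much longer the AM wave is; that difference comes up again below when we talk about range.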
In general, when comparing two frequencies at the same amplitude, the longer wavelength will travel farther; and when comparing two amplitudes on the same frequency, the greater amplitude will travel farther. Have you ever wondered why, on a road trip, AM stations seem to reach you over much greater distances than FM stations? It's because AM stations broadcast at lower frequencies (longer wavelengths) than FM stations, so their signals travel farther.
One key property of electromagnetic waves is that the power you receive drops off rapidly as you move away from the source; in free space it follows the inverse-square law, falling with the square of the distance. This phenomenon is called attenuation.
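The inverse-square law can be sketched in a few lines: an idealized point source spreads its transmit power over the surface of an ever-growing sphere. This is a simplification (real antennas are directional and real environments add extra loss), and the 250 mW figure is just an illustrative number:

```python
import math

def power_density(tx_power_watts: float, distance_m: float) -> float:
    """Free-space power density (W/m^2) at a given distance from an
    idealized isotropic source: transmit power spread over a sphere's surface."""
    return tx_power_watts / (4 * math.pi * distance_m ** 2)

# Doubling the distance cuts the received power density to a quarter.
at_10m = power_density(0.25, 10)   # a 250 mW radio, 10 m away
at_20m = power_density(0.25, 20)   # the same radio, 20 m away
print(f"{at_10m / at_20m:.0f}x weaker at double the distance")  # 4x
```

That quadratic fall-off is why moving a few rooms away from a WiFi access point can cost you so much signal.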
When you think about amplitude, think of it as the energy the wave carries. Going back to the ocean wave analogy: a small lapping wave doesn't hurt, but a huge tsunami can kill. In that same sense, higher-amplitude radio waves can harm us.
Consider a microwave oven: the frequency of the waves it generates overlaps with the 2.4 GHz range used by WiFi, but its wattage output is around 1000 to 1200 watts, while a 2.4 GHz WiFi radio is usually in the neighborhood of a quarter of a watt. It's the same frequency; the difference is amplitude. One can cook a whole chicken, the other is considered harmless.
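Radio engineers usually express power differences like this on the logarithmic dBm scale (decibels relative to one milliwatt), since the ratios get huge fast. Here is a rough sketch using the wattage figures from the paragraph above:

```python
import math

def watts_to_dbm(power_watts: float) -> float:
    """Convert power in watts to dBm (decibels relative to 1 milliwatt)."""
    return 10 * math.log10(power_watts * 1000)

oven_dbm = watts_to_dbm(1000)   # ~1000 W microwave oven
wifi_dbm = watts_to_dbm(0.25)   # ~250 mW WiFi radio
print(f"Oven: {oven_dbm:.0f} dBm, WiFi: {wifi_dbm:.0f} dBm")
print(f"Difference: {oven_dbm - wifi_dbm:.0f} dB")
```

A 36 dB gap means the oven puts out roughly 4000 times the power of the WiFi radio, even though both operate near 2.4 GHz.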
Because of attenuation, we can place radios relatively close to humans without worrying about being cooked alive. If you ever get the chance to be close to a large radio tower, you might see a warning sign like the one pictured. Above the marked point the energy level exceeds the legal limit for human exposure; below it, the level is considered safe. This is why radios with output power high enough to be a concern are placed on tall towers: by the time the signal reaches the ground, it has attenuated to a safe level.
Bringing it together
Now that we have gone over the fundamentals of electromagnetism, let's relate them to modern wireless technology. As I stated earlier, digital wireless technologies such as WiFi, 5G, and standard GSM cellular fall under the microwave band. In the US, the Federal Communications Commission (FCC) dictates which technologies can use which frequencies, in order to minimize interference between different protocols. Have you ever been listening to your car radio while between two stations broadcasting on the same frequency? I'm sure you were annoyed by hearing music from both stations at once; it's the same concept with digital wireless technology. Different technologies get assigned different channels so that they don't interfere with one another.
There is a pretty cool, although by now a little dated, poster that shows how the FCC has carved up the radio/microwave spectrum into different frequency allocations. You can check it out here.
Because the FCC fixes each wireless technology to a frequency band, wireless engineers have to manipulate the amplitude of the radio waves to get their desired outcome. Increasing the amplitude (within legal power limits) makes it easier for the signal to travel through walls and reach areas it previously struggled with, and a stronger signal generally means a better signal-to-noise ratio, which allows faster data rates.
Hopefully you got something out of this blog post and, at least at a high level, I helped demystify wireless technology. I kept this post high level on purpose; I didn't want to go into the math, as it can get to be a bit much. I plan on posting a follow-up about optimizing WiFi, where we'll dive a little deeper into these concepts, so make sure to follow this blog.