Question: Can (or will) a Wi-Fi client show up on both the 2.4 GHz and 5 GHz bands with the same MAC address?

kaladorn

I was looking at my router/access point's list of clients. I switched a client device between the 2.4 GHz and 5.0 GHz bands, refreshed, and saw the same client MAC address both times. I wasn't expecting that.

Part of me assumed that there were different radios in play (one for each band) - there are in the WAP, but perhaps not in the client? - and thus that I should see a second MAC address associated with that client. Instead, I saw the same MAC address in the router's client table on either band.

I hypothesize that my client (a Nexus 4) has one Wi-Fi radio that supports multiple bands, and that this is why I see a single MAC address.

Is this generally the case with mobile clients, or is it unusual? Are there mobile clients that I should expect to have a different MAC address for each frequency band, or would that be very unlikely/unusual?
 
In general there is only one radio chip in end devices, and it cannot run on 2.4 GHz and 5 GHz at the same time. Years ago you needed different chips to run on different radio bands; modern equipment can be switched between bands via software. Most new routers use two identical radio chips, one set to run on 2.4 GHz and the other on 5 GHz.

Still, the MAC address is a software thing. Although there is a default burned into the hardware, the software can send frames with any source address it wants. This is how you change the MAC on, say, your Ethernet card: it does not actually change the MAC address in the hardware, it just sets the value the driver uses. There are virtualization applications that run multiple different MAC addresses on a single Ethernet port.
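To make that concrete, here's a minimal sketch - assuming a Linux box, a hypothetical interface named wlan0, and root privileges for the override step, none of which comes from this thread - showing that the address the OS transmits is just a value the driver uses, readable and replaceable without touching the hardware:

```python
import subprocess
from pathlib import Path

IFACE = "wlan0"  # hypothetical interface name

def current_mac(iface: str) -> str:
    # sysfs exposes the address the driver is currently using,
    # which may differ from the one burned into the NIC.
    return Path(f"/sys/class/net/{iface}/address").read_text().strip()

def override_mac(iface: str, new_mac: str) -> None:
    # Overriding only changes what the driver puts in outgoing frames;
    # the factory-programmed address in the hardware is untouched.
    subprocess.run(["ip", "link", "set", "dev", iface, "down"], check=True)
    subprocess.run(["ip", "link", "set", "dev", iface, "address", new_mac], check=True)
    subprocess.run(["ip", "link", "set", "dev", iface, "up"], check=True)

if __name__ == "__main__":
    print("MAC the driver is using:", current_mac(IFACE))
    # override_mac(IFACE, "02:00:00:12:34:56")  # example locally-administered address
```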

Wi-Fi is pretty locked down, because the FCC does not want people messing with things like radio transmit power and other options you can set in radio chips. Still, if the vendor wanted to, they could send different MAC addresses on 2.4 GHz and 5 GHz.

Many Wi-Fi drivers let the end user set the MAC address; last time I checked, Windows disables this ability... like a real hacker uses Windows in the first place :)
 

kaladorn

In general there is only one radio chip in end devices, and it cannot run on 2.4 GHz and 5 GHz at the same time. Years ago you needed different chips to run on different radio bands; modern equipment can be switched between bands via software. Most new routers use two identical radio chips, one set to run on 2.4 GHz and the other on 5 GHz.

Still, the MAC address is a software thing. Although there is a default burned into the hardware, the software can send frames with any source address it wants. This is how you change the MAC on, say, your Ethernet card: it does not actually change the MAC address in the hardware, it just sets the value the driver uses. There are virtualization applications that run multiple different MAC addresses on a single Ethernet port.

Wi-Fi is pretty locked down, because the FCC does not want people messing with things like radio transmit power and other options you can set in radio chips. Still, if the vendor wanted to, they could send different MAC addresses on 2.4 GHz and 5 GHz.

Many Wi-Fi drivers let the end user set the MAC address; last time I checked, Windows disables this ability... like a real hacker uses Windows in the first place :)

The MAC is link layer (in the OSI model), and that's down in driver-land all right. My understanding was that all NICs were built with a built-in ID and that unique ID was the MAC. Now, that said, I know you can put the cards (driver permitting) into promiscuous mode to pull in all traffic (network sniffing), and it doesn't surprise me at all (because I've heard of MAC address spoofing) that some drivers let you override any built-in info in what is sent out on the wire.
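As an aside (my addition, not something raised above): you can often spot an overridden or randomized address because the "locally administered" bit in the first octet is set, whereas vendor-assigned, burned-in addresses have it clear. A tiny check, with made-up example addresses:

```python
def is_locally_administered(mac: str) -> bool:
    # The U/L bit is 0x02 in the first octet: set for locally administered
    # (spoofed/randomized) addresses, clear for vendor-assigned ones.
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)

print(is_locally_administered("02:00:00:12:34:56"))  # True  (locally administered)
print(is_locally_administered("3c:5a:b4:12:34:56"))  # False (vendor-assigned)
```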

I had just assumed every radio band in use implied a separate hardware chip. I figured Bluetooth had its own radio, and in a sense maybe NFC too. I can see Wi-Fi getting away with one chip, given that most end-user devices make an either-or choice between 2.4 GHz and 5.0 GHz connectivity at any one time.

Now you've made me curious, though: if the radio chip is the same for both bands, and the physics of antennas say there are particular lengths (virtual or actual) for best performance at each frequency, does a cell handset use one Wi-Fi chip and two antennas (one tuned to 2.4 GHz, the other to 5.0 GHz), or does it just compromise on the antenna so it can support both bands? Or are they using one of these dual-band antennas I've heard of? (When I did some work with the RCMP, they said at the time that getting three frequencies into one antenna was a technical hurdle, but that was back in the late 90s.)

Thanks for the information. I appreciate your time and your explanation.
 
They likely use an "average" antenna. The optimum antenna size is a simple fraction or multiple of the wavelength. With microwave antennas the wavelengths are very, very small, and you can likely come up with a length that is close enough to a good match for both bands. Not like ham radio antennas, where the wavelengths are measured in many meters.
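For a rough sense of scale - a back-of-the-envelope calculation assuming simple quarter-wave elements, not the real PCB antenna designs - the two Wi-Fi bands only differ by a couple of centimetres:

```python
# Quarter-wave element lengths for the two Wi-Fi bands (illustrative only).
C = 299_792_458  # speed of light, m/s

for freq_ghz in (2.4, 5.0):
    wavelength_cm = C / (freq_ghz * 1e9) * 100
    quarter_wave_cm = wavelength_cm / 4
    print(f"{freq_ghz} GHz: wavelength ~ {wavelength_cm:.1f} cm, "
          f"quarter-wave element ~ {quarter_wave_cm:.1f} cm")
# 2.4 GHz: ~12.5 cm wavelength, ~3.1 cm element
# 5.0 GHz: ~ 6.0 cm wavelength, ~1.5 cm element
```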

The antennas on cell phones are extremely tiny, and in most cases you can barely tell they are antennas; they are just traces on the printed circuit boards. Some cellular bands sit at much lower frequencies, which need larger antennas, and that is why Apple, for example, built one into the sides of the case and people were shorting it out with their hands.

I used to work for a large cell equipment provider, and I sat in meetings with people who had degrees in RF engineering. All they did all day long was plan antenna patterns for cell towers. Way too complex for me, and I have a master's degree in EE.
 

kaladorn

They likely use an "average" antenna. The optimum antenna size is a simple fraction or multiple of the wavelength. With microwave antennas the wavelengths are very, very small, and you can likely come up with a length that is close enough to a good match for both bands. Not like ham radio antennas, where the wavelengths are measured in many meters.

I remember that, and some of the material about the relationship between a moving radio and the number of fades you'd get. It was 25 years ago that I studied that and built AM and FM transmitters/receivers in college.

And with short wavelengths (i.e. higher frequencies), as you point out, a second frequency that is roughly double the first doesn't change the required antenna length by much in absolute terms...

The antennas on cell phones are extremely tiny, and in most cases you can barely tell they are antennas; they are just traces on the printed circuit boards. Some cellular bands sit at much lower frequencies, which need larger antennas, and that is why Apple, for example, built one into the sides of the case and people were shorting it out with their hands.

Yes, some of the early cell phones (and even the more ancient 'man-portable' phones) used lower frequency ranges, and that tended to dictate more noticeable antennas.

I used to work for a large cell equipment provider, and I sat in meetings with people who had degrees in RF engineering. All they did all day long was plan antenna patterns for cell towers. Way too complex for me, and I have a master's degree in EE.

Some years back, I worked in mobile policing using a system from Bell called Ardis (or something like that) at 9600 bps - the maximum channel bandwidth on any given frequency you had a card for. Practically, the actual throughput was more like 1200 to 2400 bps, and they had issues when everyone on shift fired up their laptops. Even with those limits, the laptops were a new and wonderful tool, and they evolved into things capable of shipping files around, handling pictures, and so on.

Back then, the radio API sat on top of the network layer (on some radio networks - every vendor's gear had a different implementation, but at least the API tried to harmonize the calls), so I had to write the guaranteed-delivery layer myself (packet sequencing, retransmission protocols, etc.). Low data rates meant collisions and channel contention when everyone was near the tower by the detachment (until they threw in two or three more channel cards).
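For illustration only - this is not the actual system I worked on, and the send/receive callables, timeout, and retry count are all hypothetical - a guaranteed-delivery layer of that sort boils down to something like stop-and-wait retransmission:

```python
import time

def send_reliable(send, recv_ack, payload: bytes, seq: int,
                  timeout: float = 2.0, max_retries: int = 5) -> bool:
    # Send one sequenced frame and wait for its ACK, retransmitting on timeout.
    frame = seq.to_bytes(2, "big") + payload
    for _ in range(max_retries):
        send(frame)
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            ack = recv_ack()      # returns an ACKed sequence number, or None
            if ack == seq:
                return True       # delivered; the caller moves on to seq + 1
        # no ACK before the deadline: fall through and retransmit
    return False                  # give up after max_retries attempts
```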

I learned that higher frequencies made higher data rates possible, but at the cost of a tower's coverage footprint. Faster networks, higher frequencies -> smaller coverage.
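That trade-off shows up directly in the standard free-space path loss formula (my addition here, not something from the original conversation): for the same distance, a higher carrier frequency loses more signal, so the usable cell radius shrinks.

```python
from math import log10

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    # Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * log10(distance_km) + 20 * log10(freq_mhz) + 32.44

# Same 5 km link, two carrier frequencies: the higher band loses ~8 dB more.
print(f"{fspl_db(5, 850):.1f} dB at 850 MHz")    # ~105.0 dB
print(f"{fspl_db(5, 2100):.1f} dB at 2100 MHz")  # ~112.9 dB
```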

I also learned that some cell providers, in addition to the $300K towers they were putting up (not counting real estate, permits, or installation), had a 'base station in a box' that could be used for testing (smaller footprint, of course), and that these were sometimes used to shore up localized dead spots. About $75K for one of those boxes, but cheaper than a full tower, and you can move it. They're good for disaster relief and getting basic phone service up in disaster areas if they're hooked to a battery bank.

In big cities, the antennas are broken into sectors, and of course you have a lot of towers (yours and other providers') around a metropolitan area, so aligning the sectors to give coverage without interfering with one another was a job for some pretty smart folks.

In places like Vancouver, you need lots of antennas - skyscrapers and hills all over. In Red Deer, I think three towers spaced around the outskirts provided coverage for the town and the rural areas around it. Geography is everything when it comes to transmission line of sight.

More recently, I worked on cell phone policy enforcement software (accounting, auditing, authentication) and on adding some SNMP network monitoring capabilities. The company I worked for had a good product, but there were so many tiers - the cell phone provider, the policy enforcement framework (itself N-tier and distributed), and then possibly third parties at various places in the chain - that it was a wonder to me we could ever make a cell phone call at all. It worked, and that was a tribute to a lot of smart software engineers.

Thanks again for your insights.
 
