Qualcomm Reveals First 802.11ay Wi-Fi Chipsets

  • Thread starter: Guest
How do they expect this to work practically at all? A router in every room? Deadly power microwave rays?

https://phorgyphynance.wordpress.com/2013/01/21/60-ghz-wireless-a-reality-check/
 

I don't think the enhanced localization is their primary reason for doing it, but I do think we need to ask whether FB can be trusted to use this capability responsibly.
 



Your link is useful, pointing out the major line-of-sight issue that 60 GHz has, but "deadly"? Every square meter/yard of matter, be it your wall or another person, emits about 400 watts of 300 GHz–400 THz radiation. And you're worried about 1 watt of 60 GHz?
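The ~400 W figure checks out as a rough blackbody estimate. A minimal sanity-check sketch using the Stefan-Boltzmann law, assuming an ideal blackbody at roughly room/skin temperature (~290 K, my assumption):

```python
import math

# Stefan-Boltzmann law: radiated power per unit area P = sigma * T^4
# for an ideal blackbody. ~290 K is an assumed everyday temperature.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)
T = 290.0               # kelvin (assumption)

power_per_m2 = SIGMA * T ** 4
print(f"{power_per_m2:.0f} W/m^2")  # roughly 400 W per square meter
```

That lands right around 400 W/m², so the number in the post is in the right ballpark for thermal emission, even though (as noted below) thermal infrared and 60 GHz microwaves interact with tissue very differently.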
 


Why bother asking? All of us already know the answer to that question.

 


As I understand it, 802.11ay is like a MIMO version of 802.11ad (WiGig), so 60 GHz does work. Still, I think the answer to both of your questions is: yes. The "infrastructure and fixed wireless access" chipsets will likely have high power (though probably not deadly, LOL), and the "mobile applications" chipsets will require an access point in every room.
 

Pretty sure he's referring to visible light
 

I think he was (facetiously) referring to the possibility of cranking up the wifi signal power to dangerous levels to try to overcome the reduced penetration that 60 GHz gets.

Comparing (mostly infrared) blackbody radiation to microwave radiation doesn't make sense in this context, IMO; different wavelengths interact differently with the body.
 
So, for anyone who is wondering (like me) what the difference is between 802.11ay and the older 802.11ad: 802.11ay is based on 802.11ad but adds improvements to increase the bandwidth, so it's essentially an improved version of ad.

Upcoming standards:
802.11ax: Uses the same 2.4 GHz & 5 GHz bands, with future expandability to a wider range of 1 GHz to 7 GHz. Called Wi-Fi 6, it should replace the current b, n & ac and provide higher bandwidth; CES 2018 demos showed a maximum of 11 Gbps. Estimated public release is 2019.

802.11ah: Uses the lower 900 MHz band. Named Wi-Fi HaLow, it's designed for extended range, better penetration of obstacles, and low power, making it compete against Bluetooth. It's aimed mainly at IoT, sensors, and smart-home appliances. Maximum bandwidth is about 348 Mbps. The specification was published in 2017, but there are still no commercial chipsets that support it.

802.11af: Uses even lower bands: the licensed UHF and VHF TV bands. The main goal is again more penetration. It offers more bandwidth than 802.11ah, with a maximum of 569 Mbps. There's still no available product, as it also depends on the 802.11ah chipsets. Range is very long, up to 1 km. Due to its use of licensed bands, there's very strict regulation around its adoption and use.

802.11ad: The first standard based on the 60 GHz band. It allows higher bandwidth of 6.7 Gbps per stream, but at much, much shorter distance. It has very low penetration, even with an open line of sight; full bandwidth is only possible at a 3.3 m (10 ft) range. These two factors limited the adoption of this standard, even though it was finalized back in 2012.

802.11ay: An improvement on 802.11ad, using the same 60 GHz band with additional technologies to allow higher bandwidth of 20 Gbps per stream and longer range: indoor range is 10 m (33 ft) and outdoor range is 100 m (328 ft). Estimated release was 2017 but was pushed to 2019.
 


I think the answer is "applications we don't typically use Wi-Fi for now." Advertisers might want, say, video kiosk ads (as are starting to pop up at malls, public transit stations, etc.) with an .11ay router built in that enables a very fast download of a "mobile experience" ad to your phone. At that point you could expect the user to be within a close radius, but I'd still worry about someone holding their phone a certain way, or in a case.

A second application might be wireless VR headsets, where high bandwidth and low latency are key, but there's obviously a lot of other things that would need to happen for that to work.

A third might be wireless streaming to TVs and projectors in office/home settings - potentially much faster and more convenient than a Chromecast, and with lower latency.

Device-to-device communication might be another application, though with a ~10 foot range, even that's tricky (think streaming video from a security cam to a hub).
 


Maybe it's because 'they' plan to install a tower at around every fifth house. I'm not sure what was known in 2013 compared to today.
 


short range, duh?
 

Of course they want you to have a router in every room. Maybe two, to avoid line of sight issues. Just think of all the wireless chipsets they can sell!
 

That's the right frequency range, but what's this about 400 W?
 

I think the question was regarding Facebook's proposed usage. Perhaps big antenna arrays could support MU-MIMO implementations with greater range.