Before I get a bunch of responses questioning why I would use 160 MHz bandwidth: I am one of the few people who 1) do not have neighbors that would cause interference and 2) actually need the throughput offered by the wider channel.
However, I am curious about the behavior of a 160 MHz signal, which inherently spans into the DFS range, when a radar event restricts the use of the DFS portion of that signal. I know that with 80 MHz channels the AP will shift to a non-DFS range (say channel 36, for example), but that is not really an option at 160 MHz because the channel must span into DFS (I do not have a split-channel option). Does the AP force itself down to 80 MHz? Does this cause a momentary lapse in connection for clients? I have no way of testing this, and I don't think I would have issues in the DFS range, but I am not willing to reduce reliability if that is the case. If the specific AP matters, I am in the Omada ecosystem.
If the AP is compliant with federal law, the behavior would be the same as in any other situation involving DFS.
When starting to use any spectrum in the radar band, the AP must listen for 60 seconds (the Channel Availability Check) before transmitting any signal at all. If a radar signal is then detected during operation, the AP must vacate and change to a different channel. If that new channel also uses radar-band spectrum, it starts another 60-second listening period before transmitting.
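To make the rules above concrete, here is a minimal sketch of the channel-selection logic in Python. This is purely illustrative, not any vendor's actual firmware: the channel lists follow the common 5 GHz plan, the 60-second CAC and 30-minute non-occupancy period are the usual regulatory values, and the function names are my own.

```python
# Illustrative sketch of DFS channel handling; assumptions, not real AP code.
CAC_SECONDS = 60                  # Channel Availability Check: listen-only period
NON_OCCUPANCY_SECONDS = 30 * 60   # channel is off-limits after a radar hit

# DFS-required 5 GHz channels (U-NII-2A: 52-64, U-NII-2C: 100-144)
DFS_CHANNELS = set(range(52, 65, 4)) | set(range(100, 145, 4))

def requires_cac(channel):
    """A DFS channel needs a fresh 60 s listen before any transmission."""
    return channel in DFS_CHANNELS

def next_channel_after_radar(current, candidates, blocked):
    """Pick the first candidate channel not under a non-occupancy timer.

    `blocked` holds channels where radar was recently detected; they stay
    unusable for NON_OCCUPANCY_SECONDS under the usual rules.
    """
    for ch in candidates:
        if ch != current and ch not in blocked:
            return ch
    return None  # no legal channel available
```

For example, after a radar hit on channel 100, `next_channel_after_radar(100, [36, 100, 149], {100})` would return 36, a non-DFS channel that can be used immediately; had it returned a DFS channel instead, `requires_cac` would signal another 60-second quiet period first.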