Good comment Yung Yi, thanks. We typically only recommend increasing the default BSS minrate in a dense AP environment, or when APs are too close together, and usually only with multiple APs under controller management. Before raising the BSS minrate (which shrinks the zone clients must be in to talk to the AP), we'd usually recommend OFDM-only with a 12 Mbps BSS minrate as a lower-end "comfort zone" for clients, who ultimately make the roaming decisions.
But Arranda, I can't tell if you have more than one R710 that you're comparing to another brand? If so, note that when our APs are rebooted with the default SmartSelect/Background Scanning behavior, each AP scans the air as it comes up and chooses what *should* be the best 2.4 GHz and 5 GHz channels at that moment, based on the other SSIDs and interference it sees at that time.
If you are operating the R710 on Solo/Standalone AP firmware, you can see a rolling two-minute RF/client-connection "snapshot" in the Support Info (text) file, which you can retrieve periodically from the Maintenance::Support Info page of the AP WebUI.
Find the ### Athstats Radio 0 ### section to view 2.4 GHz radio interference, under "Histogram of PHY errors since clearing all stats".
That shows how many packets the AP radio saw whose Layer 1 OFDM header was unreadable.
You want to see all the numbers under 5K, and be aware of anything at 10K or above, which is considered high.
Grab another Support Info snapshot at a different time of day and see whether the numbers change much. I don't know your environment, but sometimes it's a microwave in the kitchen running when people see poor performance, etc.
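If you save a few of those Support Info files, a small script can pull the histogram out and flag the high buckets for you. This is only a sketch: the section headings come from the description above, but the exact line layout of the histogram (and the bucket labels used in the example) are assumptions, so adjust the parsing to match what your AP actually prints.

```python
import re

def phy_error_buckets(text):
    """Return {bucket_label: count} from the Athstats Radio 0 section
    of a saved Support Info text file (layout assumed: "label : count")."""
    # Narrow down to the radio 0 section, then to the PHY-error histogram.
    section = text.split("### Athstats Radio 0 ###", 1)[-1]
    hist = section.split("Histogram of PHY errors", 1)[-1]
    buckets = {}
    for label, count in re.findall(r"^\s*(\S+)\s*:\s*(\d+)\s*$", hist, re.MULTILINE):
        buckets[label] = int(count)
    return buckets

def flag_high(buckets, warn=5000, high=10000):
    """Split buckets by the thresholds above: under 5K is fine,
    10K and above is considered high."""
    return {
        "ok":    {k: v for k, v in buckets.items() if v < warn},
        "watch": {k: v for k, v in buckets.items() if warn <= v < high},
        "high":  {k: v for k, v in buckets.items() if v >= high},
    }
```

Run it against snapshots taken at different times (say, lunchtime vs. evening) and compare the "high" buckets; a bucket that spikes only at certain hours points at an intermittent interferer like that microwave.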
Otherwise, if you can hard-code the channels on both APs, run your test on channels 1, 6, and 11 on each; that might give you another perspective on your results. Good luck and best regards.