It isn’t uncommon to have 10+ devices in a house, and with more and more devices being “smart connected,” that number is only going to rise.
This is why more robust airtime handling like ATF is needed now, rather than later.
You can also notice how performance degrades when different people are watching different cams’ feeds: one person is fine, the other gets told to switch to 360p mode. That shouldn’t happen with ATF enabled.
This is one of the better explanations I have come across.
Guys, think of Airtime Fairness as a toggle that decides whether the amount of data or the amount of time should be fair among actively receiving clients.
w/ airtime fairness (ATF), each device gets the same amount of time, so a faster client (better modulation) can get more data in its time, and a slower client gets less.
w/o, all devices get equal throughput: a fast device will get 2kb, then a slow client will get 2kb, and so on back and forth.
If you usually have just 1 or 2 devices with good signal needing data, then turning ATF off can help maximize throughput, because data is basically just sent out of the buffers FIFO, which works pretty well. ATF on will only hurt a little here, so this is probably too specific a case for most people, who have a couple of phones, tablets, laptops, etc.
I like ATF on because it makes wifi predictable. I know my AP is handling 7 active devices, so I know my high-modulation devices get 1/7th of their modulation rate in the worst case. For instance, right now my MacBook Air is at 92% signal, modulating at 300Mbps. There are 7 devices, so worst case with ATF on, I’ll get ~43Mbps no matter what the other devices modulate at. 60% of that is ~26Mbps, which is my ‘rule of thumb’ for actual throughput based on modulation. It’s unlikely that all of these devices are active all the time, so I’m probably getting higher throughput most of the time, but I have higher confidence of good performance in the worst-case scenario.
Without ATF, the device at 35% signal modulates at ~20Mbps. That’s only 7% of the 300Mbps client. If they get the same amount of data from the AP, then the slow client needs 300:20 = 15:1 of the airtime! So that 300Mbps client’s performance can collapse down to roughly 20Mbps! When you hear people say that wifi can be limited to the slowest client’s modulation, it’s shorthand for this ratio. It’s not precisely true, but it holds up in worst-case scenarios. And these worst cases are actually pretty likely, because software typically requests data as fast as possible: a 20Mbps client connection (less ~40% for wifi overhead) will get hammered whenever it pulls down a web page or streams a song, because it tries to pre-buffer.
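The arithmetic above is easy to sanity-check. Here’s a minimal Python sketch, assuming an idealized scheduler with no retries or protocol overhead (the 7-client rate mix is made up for illustration):

```python
# Compare per-client throughput under the two scheduling models.
# Rates are modulation (PHY) rates in Mbps.

def equal_airtime(rates_mbps):
    """ATF on: each of N active clients gets 1/N of the airtime,
    so each client's throughput is its own rate divided by N."""
    n = len(rates_mbps)
    return [r / n for r in rates_mbps]

def equal_throughput(rates_mbps):
    """ATF off (idealized equal-data scheduling): every client gets the
    same throughput T. Airtime fractions must sum to 1:
        sum(T / r) = 1  =>  T = 1 / sum(1 / r)"""
    t = 1 / sum(1 / r for r in rates_mbps)
    return [t] * len(rates_mbps)

# The 300Mbps MacBook sharing the AP with the 35%-signal, 20Mbps client:
rates = [300, 20]
print(equal_airtime(rates))                  # [150.0, 10.0]
print(round(equal_throughput(rates)[0], 2))  # 18.75 -- the fast client collapses

# Seven active clients (hypothetical mix): the ~43Mbps worst case with ATF on.
seven = [300, 20, 20, 20, 20, 20, 20]
print(round(equal_airtime(seven)[0], 1))     # 42.9
```

Note how equal-data scheduling drags everyone down toward the slowest client’s rate (18.75Mbps here), which is the same thing as the 15:1 airtime ratio above.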
In a nutshell, enable Airtime Fairness unless you have a specific use case, like min-RSSI set to force high modulation, and you want devices to have first-come/first-served data flow.