No, not entirely true. Roughly speaking, a 40MHz channel means each AP delivers twice the bandwidth/throughput of a 20MHz channel width. Of course, in the real world it's not actually 2x, but it's close. That usually translates to needing half as many APs, or being able to cover slightly more range. For example, if your goal is 10 Mbit/s everywhere, you might find that where a 20MHz AP delivers 10 Mbit/s, a 40MHz AP delivers 18 Mbit/s, so you can expand your 10 Mbit/s coverage radius somewhat.
Of course, this all starts to break down if you are in an overlapping-coverage situation, like a busy stadium or convention center where you really need 12 or more overlapping APs to cover the same area. Then you might actually run out of 40MHz channels and find it advantageous to use 20MHz channels to increase density.
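To make the "running out of channels" point concrete, here's a back-of-the-envelope sketch. The ~500 MHz figure for usable 5 GHz spectrum is a loose assumption (it varies by regulatory domain and how much DFS spectrum you're willing to use), but the arithmetic shows why wider channels exhaust the channel plan faster:

```python
# Rough count of non-overlapping channels per channel width.
# ~500 MHz of usable 5 GHz spectrum is an illustrative assumption
# (actual availability depends on regulatory domain and DFS rules).
usable_5ghz_mhz = 500

for width_mhz in (20, 40, 80):
    count = usable_5ghz_mhz // width_mhz
    print(f"{width_mhz} MHz wide: about {count} non-overlapping channels")
```

So with ~12 non-overlapping 40MHz channels, a deployment needing 12+ overlapping APs has no reuse headroom, whereas dropping to 20MHz roughly doubles the number of distinct channels you can hand out.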
EDIT: Long story short, it doesn't change the coverage *range*, but it roughly doubles the data rate delivered at each distance. And most of the time when people say range, they really mean the range at which the AP can still deliver some threshold data rate, like 5 Mbps or whatever.
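You can see why it's "close to 2x but not quite" with a simple Shannon-capacity sketch: doubling the bandwidth also doubles the noise power (+3 dB), so the rate gain is a bit under 2x, and the radius at which you still clear a threshold rate grows modestly. All the numbers below (TX power, path-loss exponent, noise floor) are illustrative assumptions, not measurements of any real AP:

```python
import math

def shannon_rate_mbps(bw_mhz, dist_m, tx_dbm=20, noise_dbm_per_mhz=-114,
                      pl_exp=3.0, pl_1m_db=40.0):
    """Idealized achievable rate at a distance, via Shannon capacity.

    Uses a log-distance path-loss model:
        PL(d) = pl_1m_db + 10 * pl_exp * log10(d)
    Doubling bw_mhz doubles the noise power too, so the rate
    is somewhat less than 2x -- all parameters are assumptions.
    """
    path_loss_db = pl_1m_db + 10 * pl_exp * math.log10(dist_m)
    rx_dbm = tx_dbm - path_loss_db
    noise_dbm = noise_dbm_per_mhz + 10 * math.log10(bw_mhz)
    snr_linear = 10 ** ((rx_dbm - noise_dbm) / 10)
    return bw_mhz * math.log2(1 + snr_linear)  # MHz * bit/s/Hz = Mbit/s

def range_at_rate_m(bw_mhz, target_mbps):
    """Largest distance (1 m steps) still meeting the target rate."""
    d = 1
    while shannon_rate_mbps(bw_mhz, d + 1) >= target_mbps:
        d += 1
    return d

# 40 MHz delivers a bit under 2x the rate at the same spot...
print(shannon_rate_mbps(40, 30) / shannon_rate_mbps(20, 30))
# ...and stretches the radius that meets a 10 Mbit/s threshold.
print(range_at_rate_m(20, 10), range_at_rate_m(40, 10))
```

The absolute distances are wildly optimistic (Shannon capacity ignores real modulation limits, interference, and walls), but the *relative* behavior matches the point above: same raw reach, higher rate at every distance, slightly bigger threshold-rate radius.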
Right. Sean's scenario above primarily focuses on client density being the limiting factor / bottleneck for coverage and AP density.
The other scenario is the lower-density but higher-throughput one, which probably also needs to be considered, especially since David brings up a 1080p video streaming scenario. In that case, another factor is: at what range can the AP deliver a data rate capable of sustaining the required speeds?
What actually brought me to Ruckus is that I have special business needs to deliver a minimum 100 Mbit/s data rate in a 1500 sq ft dwelling. And even though I only have 10 or so possible clients that could need that rate, it took me two 802.11ac APs running 80MHz channels to attain my goal in a relatively small area. Of course, now I can probably fit a convention center's worth of people in my house and give them Facebook-quality wifi...
The modern classroom setting is probably somewhere between these two extremes.