Interesting, but probably just coincidental, because we perceive frequency on a logarithmic scale - every increase of one octave doubles the frequency, so each successive octave covers twice the absolute bandwidth of the one below it. For example, 1 kHz to 2 kHz spans 1000 Hz, but 2 kHz to 4 kHz spans 2000 Hz, and so on.
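To make the doubling concrete, here's a tiny sketch (just my own illustration, starting from an arbitrary 1 kHz base) that prints the absolute bandwidth of a few successive octaves:

```python
# Sanity check of the octave/bandwidth relationship described above.
base = 1000.0  # Hz, arbitrary starting frequency
for octave in range(4):
    low = base * 2 ** octave
    high = base * 2 ** (octave + 1)
    print(f"Octave {octave + 1}: {low:.0f} Hz -> {high:.0f} Hz "
          f"(bandwidth {high - low:.0f} Hz)")
# Bandwidths come out as 1000, 2000, 4000, 8000 Hz: each octave spans
# twice the absolute frequency range of the previous one.
```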
I'm not an expert on cochleas, but it seems like that would give us an inordinate number of hair cells devoted to frequencies we almost never use or need (above 12 kHz).
If you have ever looked at an "equal loudness curve" - we actually hear best in roughly the 500 Hz to 6 kHz range (which makes sense, since that's where most speech energy sits). I would guess that we have a disproportionately large number of hair cells covering those regions.
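For anyone curious, here's a rough sketch (my own, not from any particular text) that evaluates the standard A-weighting formula, which approximates the inverse of an equal-loudness contour at moderate listening levels. Values near 0 dB mean the ear is close to its most sensitive:

```python
import math

def a_weighting_db(f):
    """IEC 61672 A-weighting in dB relative to 1 kHz - a rough proxy for
    how sensitive hearing is at frequency f at moderate loudness."""
    f2 = f * f
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.00  # +2.00 normalizes to 0 dB at 1 kHz

for f in (100, 500, 1000, 3000, 6000, 12000):
    print(f"{f:>6} Hz: {a_weighting_db(f):+6.1f} dB")
# Sensitivity peaks around 1-4 kHz and rolls off at both extremes,
# which lines up with the speech-range observation above.
```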
Then again, I'm just a guy on the internet speculating about stuff I really don't know about, so don't put too much stock in that.