- May 27, 2020
- 556
- Tinnitus Since
- 2007
- Cause of Tinnitus
- Loud music/headphones/concerts - Hyperacusis from motorbike
Over the last couple of days, it occurred to me that two seemingly polar-opposite processes can resolve hyperacusis, so I thought I'd start an open discussion to see if we can find out anything more about the underlying pathology. I'll start by sharing some of my own thoughts.
There have been some anecdotal reports of people with hyperacusis seeing their symptoms improve after receiving a cochlear implant, the deduction being that increased input must somehow reverse the maladaptive plasticity that has occurred. Conversely, I've seen several cases on this forum and others where hyperacusis has resolved after sufferers have experienced (further) hearing loss, the deduction being that decreased input has also reversed this same maladaptive plasticity. Why is this?
As many of us already know, research has suggested that the maladaptive plasticity is driven by the sensitisation of type II afferents through ATP leakage and/or an increase in synapses on the OHCs (in conjunction with a loss of synapses on the IHCs). But if leaking OHCs are indeed the issue here, how would the increased input from a cochlear implant help with what is seemingly a molecular/biological process? Surely the OHCs are compromised, CI or no CI. Equally, if hearing loss occurs, one would expect further ATP leakage from newly compromised OHCs, leading to further sensitisation - unless of course this hearing loss is the result of already compromised OHCs dying off completely, thereby eliminating their ATP leakage altogether - as well as a further increase in the number of OHC synapses.
I can understand how a drug like FX-322 might work for hyperacusis, because new, structurally sound OHCs would replace the dead/damaged ones and thereby decrease ATP leakage. But a cochlear implant simply boosts overall input to the existing pool of OHCs and IHCs, which makes me wonder: if we know type I afferents are the ones responsible for transmitting and processing sound meaningfully, and given that it is the IHCs that are predominantly innervated by type I afferents, is it possible that something else is going on somewhere along the auditory pathway?
Conversely, I also recall the knock-out mouse study, where they genetically engineered mice to be deaf but were still able to induce hyperacusis in said mice, effectively suggesting that pain can be experienced even in the absence of input - although I wouldn't infer from that that restoring input couldn't alleviate the hyperacusis.
I suppose I am still none the wiser after writing all this lol, but I welcome anyone else's thoughts on this topic.