A great book on spatial simulation is The Art of Mixing by David Gibson. Older but forever relevant
Just yesterday I was watching Territory season 1, where the characters have an intense, suspenseful, almost whispering "serious voice" conversation while standing next to a running helicopter, without even raising their voices, which took me out of the scene.
So the question is: do viewers actually want it, or do know-it-all producers just say people do and put it in?
I'm having problems watching movies at all, there are so many things breaking my immersion. :-)
Even more than that, they will notice if you don't do it the "wrong" way that they've come to expect. This is called The Coconut Effect: https://tvtropes.org/pmwiki/pmwiki.php/Main/TheCoconutEffect
My clearest memory of that is of me as a kid watching a Bond movie where a sports car makes a screeching sound while driving down a sandy beach. I turned off the TV and I don't think I've ever watched a full Bond movie since.
The list on the page you linked has one thing that isn't totally correct, though:
>The very specific (but entirely unrealistic) echoing thud that is heard when all the lights are turned on in a large space.
That sound is realistic if it's an old building with the heavy type of power relays, or whatever they are called. They do make that sort of sound if the acoustics are right. They could be set up with timers so the lights don't all switch on at exactly the same moment, to prevent overloading the fuses.
The closest analogue I can think of is how, through practice, anyone can now close their eyes and imagine typing entire essays because they know exactly where the keys are. Try it.
Then you have directional localisation based on the delay between the ears, the difference in volume, and the properties of reverberations. Sounds to the sides arrive at each ear at a slightly different moment. Add the source of the first echo and you have confirmation that a sound is coming from either the right or the left. The more directly to the side the sound is, the bigger the delay between the ears, so you get an approximate angle (a rough numerical sketch of this is below).
Now consider sound muffling, caused by the shape of our head and ears. Things in front are going to sound clearer in the opposite ear than sounds from behind.
The same principle is used for detecting height: things below get muffled, things above sound clearer. In reality, feeling sounds with the whole body also helps in source localisation, which can't be emulated with headphones.
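To put a rough number on that delay-to-angle relationship, here is a small sketch of the simple two-point-receiver model (my own illustration, not from the article; the 18 cm ear spacing and the clean far-field geometry are assumptions):

    import math

    SPEED_OF_SOUND = 343.0   # m/s in air at ~20 C
    EAR_SPACING = 0.18       # m, assumed distance between the ears

    def itd_for_angle(angle_deg):
        """Interaural time difference for a source angle_deg off centre
        (0 = straight ahead, 90 = directly to one side), far-field model."""
        return EAR_SPACING * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND

    def angle_for_itd(itd_s):
        """Invert the model: approximate bearing from a measured delay."""
        x = itd_s * SPEED_OF_SOUND / EAR_SPACING
        return math.degrees(math.asin(max(-1.0, min(1.0, x))))

    for a in (0, 15, 45, 90):
        itd = itd_for_angle(a)
        print(f"{a:3d} deg -> ITD {itd * 1e6:5.0f} us -> recovered {angle_for_itd(itd):5.1f} deg")

Even in this crude model the whole usable range is only about half a millisecond, which is why the volume and muffling cues above matter so much.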
Echolocation is finding out the distance to objects (not sound sources!) by sending a sound wave in a direction and listening for the echoes that bounce back. Hence echolocation. (A toy numerical sketch of the distance estimate is at the end of this comment.)
The only sound source is you.
It's a form of active sensing: literally how a submarine sonar works (or radar, for that matter). Bats do it, too.
This has very little to do with "locating things in headphones", as that is entirely missing the active part in the first place.
Then, locating sound sources using binaural hearing is not the same as analyzing the scattered echoes when the sound source is you (relative to yourself, you know where you are already!).
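To illustrate the active part: the distance follows from the round-trip time of the echo, d = c * Δt / 2, and the delay can be recovered by cross-correlating the recording with the emitted click. This is only a toy sketch with synthetic data and assumed numbers (44.1 kHz sample rate, a 3 kHz click, an object 3.4 m away), not anything from the article:

    import numpy as np

    FS = 44_100   # sample rate, Hz (assumed)
    C = 343.0     # speed of sound in air, m/s

    # A short synthetic "click": a damped 3 kHz burst
    t = np.arange(0, 0.002, 1 / FS)
    click = np.sin(2 * np.pi * 3000 * t) * np.exp(-t * 2000)

    # Simulate a recording: the outgoing click plus a fainter echo from 3.4 m away
    true_distance = 3.4
    delay = int(round(2 * true_distance / C * FS))   # round-trip delay in samples
    recording = np.zeros(delay + len(click) + 100)
    recording[:len(click)] += click                        # direct sound
    recording[delay:delay + len(click)] += 0.3 * click     # attenuated echo

    # Cross-correlate with the emitted click and ignore the direct-sound peak
    corr = np.correlate(recording, click, mode="valid")
    corr[:len(click)] = 0                                  # mask the outgoing click
    echo_delay = np.argmax(corr) / FS
    print(f"estimated distance: {C * echo_delay / 2:.2f} m (true {true_distance} m)")

The brain obviously isn't running an explicit cross-correlation, but the information is the same: you know what you sent and when, so everything that comes back later carries distance.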
It's interesting that this is currently the top comment. I wonder how many people read the article before engaging in this discussion.
And dolphins and whales, no need to go to submarines.
Fascinating to find out that the scientific community had this kind of bias as well.
Whether it's audible on a train depends on how insulated the train is.
I get the OP's point, but indeed this probably wasn't the best example.
Manfred Spitzer once wrote that he thinks there are two groups of people on this planet who really have good audio location capabilities: blind people and conductors. Conductors because they need to be able to listen to a particular performer, to isolate them from the rest of the orchestra. And blind people, because we use the ear to navigate the world.
Now, I actually use everything around me as a source of sound. Tapping with the cane is one of them. However, if I want to "scan" my environment, I usually make a clicking noise with my tongue.
But those are only a small part of the game. The rest of the noises I use come from outside. Just a small example, before I lose myself in this comment: I can hear poles and trees on the sidewalk. Not because they emit so much sound, but because they eat it up. If a car drives along the street behind the pole, I can actually hear the point where the external sound doesn't reach me, inferring that there must be a pole or a tree. Echolocation is not always about what you send. It's more about learning how the sound waves around you behave. Sometimes, but this is getting borderline esoteric, I can hear the materials involved. Walking towards a wooden wall sounds distinctly different from walking towards a concrete wall...
I was listening to a podcast and realized I could hear the speaker turning pages under the microphone by the way it affected their voice in the microphone rather than by the rustle of the page. It was pretty wild. I could ‘see’ it before I recognized what was going on.
Kinda off topic, but I’m on a brand new phone (not logged in and no history) and the next video suggested by YouTube is a French fascist promoting (actual) Nazi policies. Why would YouTube do that?! It has absolutely zero connection with audio topics. I just have my OS language set to French. That’s so worrying for the youth, being exposed to pure hate for no reason.
Maybe a simple mechanical clicker device like those used for dog training could be a useful tool.
Another route would be to mix the ultrasound with another sound closer to the ear; then there is no need for an electronic ear at any point. The interference between the two sounds can cause the inaudible frequencies to become audible.
Perhaps it could be arranged so that the ultrasound warbles while the interfering sound does not (or vice versa), which would also make the sources easier to distinguish. (A tiny numerical sketch of the difference-tone idea is below.)
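For what it's worth, here is a small numerical sketch of that idea (my own illustration with made-up frequencies, not something proposed in the thread): two inaudible ultrasonic tones passed through any nonlinearity produce a component at their difference frequency, which can land in the audible range.

    import numpy as np

    FS = 192_000                    # sample rate high enough to represent ultrasound
    t = np.arange(0, 0.1, 1 / FS)

    f1, f2 = 40_000.0, 41_500.0     # two inaudible ultrasonic tones (made-up values)
    signal = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    # Any nonlinearity (here simply squaring) mixes the tones and creates
    # sum and difference frequencies; the difference (1.5 kHz) is audible.
    mixed = signal ** 2

    spectrum = np.abs(np.fft.rfft(mixed))
    freqs = np.fft.rfftfreq(len(mixed), 1 / FS)
    audible = (freqs > 100) & (freqs < 20_000)     # audible band, skipping DC
    peak = freqs[audible][np.argmax(spectrum[audible])]
    print(f"strongest audible component: {peak:.0f} Hz")   # ~1500 Hz

In practice the nonlinearity would have to come from the ear or from the air itself (as in parametric speakers), so whether the effect is strong enough for this use is a separate question.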
It appears that the hardest part of echolocation for humans is the "produce a directed, crisp click" part. The "process the sound" part is readily handled by our brains with a relatively mild learning curve.
For some interesting context, here is a description of dolphin echolocation:
https://www.britannica.com/animal/cetacean/
"The amount of information obtained by an echolocating dolphin is similar to that obtained with the eyes of a sighted human. ...
Toothed whales use extremely high frequencies, on the order of 150 kilohertz, for refining spatial resolution from their echoes. They are capable of “seeing” into and through most soft objects such as other dolphins, though the effectiveness of toothed whale echolocation drops off at distances greater than about 100 metres."
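One way to put that 150 kilohertz figure in perspective (my own back-of-the-envelope arithmetic, not from the quote; the ~3 kHz human click frequency is an assumption): the wavelength roughly sets the smallest detail an echo can resolve.

    SPEED_IN_SEAWATER = 1500.0   # m/s, approximate
    SPEED_IN_AIR = 343.0         # m/s

    dolphin = SPEED_IN_SEAWATER / 150_000   # wavelength of a 150 kHz click
    human = SPEED_IN_AIR / 3_000            # wavelength of a ~3 kHz tongue click (assumed)

    print(f"dolphin: ~{dolphin * 100:.0f} cm, human click: ~{human * 100:.0f} cm")

So even with perfect processing, a human click can only resolve features roughly an order of magnitude coarser than a dolphin's.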
Maybe with enough practice ...
E- chocolate
Don’t get it.
Oh..!
- Humans Can Learn to Echolocate (Livescience, 2015) https://news.ycombinator.com/item?id=10699105
- How humans echolocate 'like bats' (BBC, 2018) https://news.ycombinator.com/item?id=16782557
- Humans Can Learn How to 'Echolocate' in 10 Weeks, Experiment Shows (Sciencealert, 2021) https://news.ycombinator.com/item?id=27404132
- Teach yourself to echolocate (106 comments) https://news.ycombinator.com/item?id=18208334
Just one more thing to add to my bag of tricks.
My anecdotal experience is that we are so out of touch with our bodies these days that we routinely underestimate just how adaptable we truly are if we have the will or need to learn. So I get frustrated when very useful things like echolocation are suppressed by ignorant and cynical scientists who are unaware of their blind spots because they think they studied hard and read a bunch of papers.
Our realities are shaped by our own experiences but what is sad is when people then shape other people's realities based on their own skewed realities.
I'm glad that the internet is so good at spreading disparate, niche and folky knowledge and forcing scientists to reconsider their priors more often.