Reminds me of this relatively new device in the space though: https://store.humanware.com/hca/monarch-the-1st-dynamic-tact...
Would you be able to "perceive" a picture if it were engraved on a surface?
A while ago I read a biography of Louis Braille. He created his system to replace an older one in which people were taught to feel the shapes of letters carved into wooden blocks. Braille's system won out because it was much faster to read, but it was never meant to be used for something like a picture.
I'd also be interested to know whether something like a tactile floor plan [0] would even be useful to someone blind from birth. From what I've heard, they don't think about navigating spaces the same way, so a floor plan might be far from the mental models they actually use.
[0]: https://evengrounds.com/services/tactile-3d-printed-models-f...
Linear text works perfectly for me for documentation, teaching/learning, etc.
But systems also seem to be easier to digest as spatial representations. I've met a lot of CS people who fantasized about displaying all the files of a codebase in a VR-like environment augmented with visual cues (UML), and I must admit I would find this unnecessary but comfortable. I can imagine applications in other domains too: imagine a teacher arranging the whole of Kant's philosophy as a spatial-visual system referencing positions, comments, etc.

Eyes are cool because you can focus on something while knowing that other information is available around the zone you're focusing on. In a sense, so is the hand, locally. But I imagine (I don't know) that it would take a superhuman level of braille reading to jump back and forth between different fingers while reading, so here's another probably stupid question for the blind crowd of HN: are you able to do this?