Black Hole Vision simulates the gravitational lensing effects of a black hole and applies these effects to the video feeds from an iPhone's cameras. The application implements the lensing equations derived from general relativity (see https://arxiv.org/abs/1910.12881 if you are interested in the details) to create a physically accurate effect.
The app can either put a black hole in front of the main camera to show your environment as lensed by a black hole, or it can be used in "selfie" mode with the black hole in front of the front-facing camera to show you a lensed version of yourself.
Next, you can select the "Kerr black hole" mode, which adds rotation (spin) to the black hole. You can also adjust the black hole's rotational speed (its spin, labeled "a" and given as a percentage of the maximal spin).
When the user selects the "Static black hole" mode, the lensing texture is computed on the GPU and cached. The "Kerr black hole" textures, however, were precomputed in Mathematica, because they require double-precision floating-point math, which is not natively available in Apple's Metal shading language.
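For the curious, the core quantity behind any such lensing texture is the bending angle of a light ray. Below is a minimal sketch, in Python/SciPy rather than the app's actual Metal or Mathematica code, of the exact Schwarzschild bending-angle integral evaluated in double precision; the weak-field value 4M/b gives a quick sanity check.

    # Minimal sketch (not the app's code): exact Schwarzschild bending
    # angle by numerical quadrature, in geometric units G = c = 1.
    import numpy as np
    from scipy.integrate import quad

    def bending_angle(r0, M=1.0):
        """Deflection of a light ray with turning point r0 > 3M."""
        u0 = 1.0 / r0
        b2 = r0**2 / (1.0 - 2.0 * M / r0)        # impact parameter squared
        f = lambda u: 1.0 / b2 - u**2 + 2.0 * M * u**3
        # Substituting u = u0 - t^2 removes the integrable 1/sqrt
        # singularity of the orbit integral at the turning point.
        integrand = lambda t: 2.0 * t / np.sqrt(f(u0 - t**2))
        half_sweep, _ = quad(integrand, 0.0, np.sqrt(u0))
        return 2.0 * half_sweep - np.pi

    print(bending_angle(100.0))  # ~0.040 rad, vs. weak-field 4M/b ~ 0.0396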
The source code, including the Mathematica notebook, can be found here https://github.com/graveltr/BlackHoleVision.
The code was written at Vanderbilt University by Trevor Gravely with input from Dr. Roman Berens and Prof. Alex Lupsasca. This project was supported by CAREER award PHY-2340457 and grant AST-2307888 from the National Science Foundation.
License: This app includes a port of the GNU Scientific Library's (GSL) implementation of Jacobi elliptic functions and the elliptic integrals to Metal. It is licensed under the GNU General Public License v3.0 (GPL-3.0). You can view the full license and obtain a copy of the source code at: https://github.com/graveltr/BlackHoleVision.
It works very well (and in a browser!) but is limited to a non-rotating (Schwarzschild) black hole; we really wanted to include black hole spin (the Kerr case). As we note on the GitHub page, talking with Dominic about his implementation was very useful, and we hope to get a paper explaining both codes out before the end of the year.
(And he was also my PhD advisor.)
It looks like nothing actually disappears. I expected a black hole to not just affect what an area looked like, but also to “disappear” some part of what was there.
If so, then my question is wouldn’t some light be lost to the black hole? Shouldn’t a substantial portion of the light coming at me from the other side of the black hole disappear into the black hole, making what does lens around dimmer?
If you approach the event horizon closely and then return to where you started, you will find that a lot of time has passed at your origin, while by your own clock the trip might have been short.
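To put rough numbers on that, here is a minimal sketch of the time-dilation factor for an observer hovering at radius r in the Schwarzschild geometry (it ignores the velocity-dependent part of a real trip):

    # d(tau)/dt for a static observer at r = x * r_s (Schwarzschild);
    # the kinematic time dilation of the journey itself is ignored.
    import math

    def dtau_dt(x):
        """Proper time per unit far-away time while hovering at r = x * r_s."""
        return math.sqrt(1.0 - 1.0 / x)

    for x in (10.0, 2.0, 1.1, 1.001):
        print(f"hovering at {x:>6} r_s: 1 hour locally = {1 / dtau_dt(x):5.1f} hours far away")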
The shape of the shadow is also wrong for Kerr, though; this is what Kerr looks like:
Do black holes have hair?
Where is the Hawking radiation in these models? Does it diffuse through the boundary and the outer system?
What about black hole jets?
What about vortices, with Gross-Pitaevskii and SQR (Superfluid Quantum Relativity)?
https://westurner.github.io/hnlog/ Ctrl-F Fedi, Bernoulli, Gross-Pitaevskii:
> "Gravity as a fluid dynamic phenomenon in a superfluid quantum space. Fluid quantum gravity and relativity." (2015) https://hal.science/hal-01248015/ :
> FWIU: also rejects a hard singularity boundary, describes curl and vorticity in fluids (with Gross-Pitaevskii), and rejects antimatter.
Actual observations of black holes:
"This image shows the observed image of M87's black hole (left) the simulation obtained with a General Relativistic Magnetohydrodynamics model, blurred to the resolution of the Event Horizon Telescope [...]" https://www.reddit.com/r/space/comments/bd59mp/this_image_sh...
"Stars orbiting the black hole at the heart of the Milky Way" ESO. https://youtube.com/watch?v=TF8THY5spmo&
"Motion of stars around Sagittarius A*" Keck/UCLA. https://youtube.com/shorts/A2jcVusR54E
/? M87* time lapse
/? Sagittarius A* time lapse
/? black hole vortex dynamics
"FORGE’d in FIRE: Resolving the End of Star Formation and Structure of AGN Accretion Disks from Cosmological Initial Conditions" (2024) https://astro.theoj.org/article/94757-forge-d-in-fire-resolv...
STARFORGE
GIZMO: http://www.tapir.caltech.edu/~phopkins/Site/GIZMO.html .. MPI+OpenMP .. Src: https://github.com/pfhopkins/gizmo-public :
> This is GIZMO: a flexible, multi-method multi-physics code. The code solves the fluid using Lagrangian mesh-free finite-volume Godunov methods (or SPH, or fixed-grid Eulerian methods), and self-gravity with fast hybrid PM-Tree methods and fully-adaptive resolution. Other physics include: magnetic fields (ideal and non-ideal), radiation-hydrodynamics, anisotropic conduction and viscosity, sub-grid turbulent diffusion, radiative cooling, cosmological integration, sink particles, dust-gas mixtures, cosmic rays, degenerate equations of state, galaxy/star/black hole formation and feedback, self-interacting and scalar-field dark matter, on-the-fly structure finding, and more.
I say that because there's an idea to play with for a v1.1 that would give it staying power for me:
Do you have enough processing power on an iPhone to combine this with Augmented Reality? That is to say: can you explore "pinning" a singularity in a fixed region of space so I can essentially walk around it using the phone?
Assuming that's possible, you could continue evolving this into a very modest revenue generating app (like 2 bucks per year, see where it goes?) by allowing for people to pin singularities, neutron stars, etc. around their world and selectively sharing those with others who pass by. I'd have fun seeing someone else's pinned singularity next to the Washington monument, for instance. Or generally being able to play with gravity effects on light via AR.
The geosharing augmented reality thing mentioned by the parent comment is very very cool too, I'd pay a few bucks for that! Maybe make it social by letting black holes that people drop somewhere IRL merge, etc...
Reach out to me if you eventually would like to spin up a cheap bit of infrastructure to host the data of where people dropped their black holes, and need some help with that!
Example: set the mass of the black hole to 1e12 metric tons, or about 100,000 Great Pyramids.
This has a Schwarzschild radius of 1485 femtometers (1 femtometer is around the size of a proton).
Nominal luminosity is 356 watts. You could power your computer! Lifetime is 1e12 gigayears.
An interesting thing happens with gravity. Gravity at the Schwarzschild radius for this mass is 3e28 m/s^2, but that is at a smaller-than-an-atom radius.
If you put your hand within a foot of it, gravity would be 700,000 m/s^2.
You would need to be at a distance of 270 ft to experience gravity from it comparable to Earth's (9.8 m/s^2).
I tried plugging in some other numbers and, at first confusingly, found that the luminosity goes up at lower masses?! But of course it radiates from its outer shell, and the Hawking temperature scales as 1/M, so smaller holes are hotter and brighter despite their smaller area.
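Here is a minimal Python sketch that reproduces these numbers from the standard Schwarzschild and Hawking formulas (SI units; the "gravity" figures use the Newtonian GM/r^2, as above):

    # Reproducing the numbers above from the standard formulas (SI units).
    import math

    G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34
    M = 1e15                                             # 1e12 metric tons, in kg

    r_s = 2 * G * M / c**2                               # Schwarzschild radius
    P = hbar * c**6 / (15360 * math.pi * G**2 * M**2)    # Hawking luminosity ~ 1/M^2
    t_ev = 5120 * math.pi * G**2 * M**3 / (hbar * c**4)  # evaporation time ~ M^3
    g = lambda r: G * M / r**2                           # Newtonian gravity

    print(f"r_s        = {r_s * 1e15:.0f} fm")           # ~1485 fm
    print(f"luminosity = {P:.0f} W")                     # ~356 W
    print(f"lifetime   = {t_ev / 3.156e16:.1e} Gyr")     # ~2.7e+12 Gyr
    print(f"g at r_s   = {g(r_s):.1e} m/s^2")            # ~3.0e+28 m/s^2
    print(f"g at 1 ft  = {g(0.3048):.1e} m/s^2")         # ~7.2e+05 m/s^2
    print(f"Earth-like g at r = {math.sqrt(G * M / 9.8):.0f} m")  # ~82 m, i.e. ~270 ft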
Wonderful tool, imagine playing with those parameters in AR
- value can be generated, but not captured (generally good-natured humans do this constantly with those in their communities), and
- value can also be captured, but not generated (i.e. stolen; most of the largest corporations do this in one way or another via e.g. monopolization, political corruption, union busting, resource exploitation, real estate speculation, etc.).
Let me give you an example of how backwards that is:
Are you telling me that, for example, Linus Torvalds (or any major contributor to Linux) has generated less long-term human value than a congressperson like Rick Scott or Mark Warner?
Linux runs on machines that literally keep people alive, as well as machines used to create and display works of art.
Just trying to guess at what they could be is costing me random time...
Quick edit: I did exactly that and now it works fine. On the first boot, it seemed to get stuck when asking for permission to use the camera.
> The developer does not collect any data from this app.
Well, duuh, nothing can escape the black hole, not even information!
2. Neutron stars I think
TLDR: redshift depends not only on the position of the source, but also on its velocity.
If you want the details, they're too long for this comment, but essentially what I mean is that the r -> infinity limit of the redshift factor in Eq. (B22) of this paper is unity: https://arxiv.org/abs/2211.07469
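As a toy illustration of the positional part only (not Eq. (B22) itself, which also carries the source's velocity): for a static emitter in Schwarzschild the redshift factor is sqrt(1 - 2M/r), which indeed tends to unity far from the hole.

    # Toy check (static emitter in Schwarzschild, G = c = 1): the
    # positional part of the redshift tends to unity as r -> infinity.
    import sympy as sp

    r, M = sp.symbols("r M", positive=True)
    g_static = sp.sqrt(1 - 2 * M / r)
    print(sp.limit(g_static, r, sp.oo))   # 1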
Really cool app btw!
I once saw a video of Kip Thorne explaining that the black hole visual effects in Interstellar came from an actual physical simulation. I wouldn't have thought it feasible to run on an iPhone.
As a physicist with a modest background in computing, I was also surprised by how powerful the iPhone GPU is. It can indeed lens the input from the camera at high resolution and in real time with high FPS.
Thanks for the question!
In the meantime, check out this code developed by Dominic Chang (grad student at Harvard) that implements lensing by a non-rotating (Schwarzschild) black hole in your browser: https://dominic-chang.com/bhi-filter/
Thanks for pointing that out.
I’m no astrophysicist but it all looks doable with the camera API, canvas API, and WebGL or WebGPU shaders. That actually sounds like a lot of fun.
Still, kinda fun; reminds me of playing around with different blur / Liquify filters in Photoshop back in the day.
If you want to read more about what it's going to do, I wrote a blog post about it on the mission website: https://www.blackholeexplorer.org/bhex-blog/lupsasca-stateme...
5x is actually a lot: we'll be able to resolve the "photon ring" of orbiting light around M87* and Sgr A* (the two black holes previously imaged by EHT at lower resolution) and likely see the "shadows" of another 6-8 black holes, with the possibility of estimating the mass of another ~20-30 sources.
Honestly it's not so far-fetched (to me) that in a few years someone will have GRRMHD simulations running in real time on a portable device.
Are you familiar with A Slower Speed of Light? It's a game which has some nice special-relativistic effects.
I think we're still a ways off from real time GRMHD sims, but CK Chan from UArizona had a working VR simulation (on the Oculus iirc, but now deprecated) that allowed you to explore a pre-existing GRMHD simulation in real time and in 3D. I think he might be working on a new version of this.
(Just for clarity, the second R in GRRMHD is for radiation. I know it's typical to just push some photons through the GRMHD results to produce renders, but since I'm dreaming, let's treat the radiation self-consistently.)