I feel hardware technology can improve further to allow under-the-LED-display cameras .... so that we can actually look at both the camera and the screen at the same time.
(There are fingerprint sensors under mobile screens now, and I think even some front-facing cameras are being built in without a punch hole or sacrificed pixels. There is scope to make this better and seamless, so we could have multiple cameras behind a typical laptop screen or desktop monitor if we wanted.)
This would make for a genuine look-at-the-camera video whether we are looking at other attendees in a meeting or reading off our slide notes (teleprompter style).
There would be no need to fake it.
More philosophically --
I don't quite like the normalization of AI tampering with actual videos and photos casually -- on mobile phone cameras or elsewhere. Cameras are supposed to capture reality by default. I know there is already heavy noise reduction, color correction, auto exposure etc ... but no need to use that to justify more tampering with individual facial features and expressions.
Videos are, and will continue to be, used to record humans as they are. The capturing of their genuine features and expressions should be valued more. Video should help people bond as people, with body language as genuine as possible. Videos will be used as memories of people who are gone. Videos will be used as forensic or crime scene evidence.
Let us protect the current state of video capture. All AI enhancements should be marketed separately under a different name, not silently added into existing cameras.
With an unfiltered camera, it looks like I'm making eye contact with you when I'm actually looking directly at my camera, and likewise it looks like I'm staring off to the side when I'm looking directly at your image in my screen.
A camera centered behind my screen might be marginally better in that regard, but it still wouldn't look quite right.
What I'd really like to see is a filter for video conferencing that is aware of the position of your image on my screen, and modifies the angle of my face and eyes to more closely match what you would actually see from that perspective (e.g. it would look like I'm making direct eye contact when I'm looking at/near the position of your eyes on my screen).
You could imagine this working even for multiple users, where I might be paying attention to one participant or another, and each of their views of me would be updated so that the one I'm paying attention to can tell I'm looking directly at them, and the others know I'm not looking directly at them in that moment.
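A rough sketch of the geometry such a filter would need: given where the camera sits and where a participant's tile is on my screen, the filter has to rotate my apparent gaze by the angle between the two, as seen from my eyes. (Python; all names, units, and numbers here are illustrative, not any product's API.)

```python
import math

def gaze_correction_angles(camera_xy, tile_xy, viewer_distance_cm):
    """Angular offset (yaw, pitch in degrees) a gaze filter would have to
    apply so that looking at a participant's tile reads as eye contact.

    camera_xy and tile_xy are positions on the screen plane in cm;
    viewer_distance_cm is the eye-to-screen distance. Single viewer,
    flat screen -- all simplifying assumptions.
    """
    dx = tile_xy[0] - camera_xy[0]   # horizontal camera-to-tile offset
    dy = tile_xy[1] - camera_xy[1]   # vertical camera-to-tile offset
    yaw = math.degrees(math.atan2(dx, viewer_distance_cm))
    pitch = math.degrees(math.atan2(dy, viewer_distance_cm))
    return yaw, pitch

# Camera at top-center (0, 0); a tile 15 cm right and 12 cm down;
# viewer sitting 60 cm from the screen.
yaw, pitch = gaze_correction_angles((0.0, 0.0), (15.0, 12.0), 60.0)
# yaw ≈ 14.0°, pitch ≈ 11.3° -- the rotation the filter must undo.
```

With multiple participants you'd run this once per tile and render a separately corrected stream for each remote viewer, which is exactly the multi-view behavior the comment imagines.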
Jokes aside, I think you're absolutely right. Online interactions have dynamic geometry, so mounting a camera behind a screen just won't cut it, unless the entire screen is a camera. Also, some people might prefer projecting/receiving no eye contact at all, at times or in certain situations. And vice versa.
The philosophical stance here is purely traditionalist; it decides on behalf of people. What people would like to use is what should exist. "Videos are and will" is a strange claim, given that its claimer has neither control over it nor any affirmation that it is going to be true.
There will be an array of cameras covering say every 2x2 inch square of your screen.
Just see how many cameras are on today's phones. The same can happen with new camera tech too.
Also there will be a huge commercial driver to put multiple cameras under the screen -- all apps and marketers can track your precise gaze. Ads will pause unless you are actually watching them. I will hate it but it feels inevitable.
It's generally considered rude, or an act of intimidation, to maintain eye contact with people in Japan, for example. Not nodding occasionally while someone is talking is also seen as a sign that you're not paying attention. Are we going to modify videos to nod automatically too? Or maybe we can stop trying to fake social interactions and stop enforcing local customs on the world.
Critically, the enumerated processing steps are global transformations, while tampering is inherently a local, "contentful" transformation.
This is a brilliant way to examine / explain the distinction.
I don't actually want the person I'm talking to to appear to be looking directly into my eyes, because it's weird - it means they're looking at the camera and not at me on the screen talking to them.
Eye contact is a subtle and important dynamic in human interaction (to the point where it has been suggested that we have white sclera, while our closest ape cousins do not, as an adaptation in support of easily detecting eye contact.) In a meeting, that includes third parties seeing who is making eye contact with whom.
The systems being discussed here are too simple to restore this natural dynamic, and it is not clear to me that always-on eye contact correction[1] is free of unintended and undesirable consequences - for example, in some circumstances, it might ramp up the tension in a discussion, or it might help someone who is dissembling.
[1] Even with random look-aways, I suspect - in actual conversation, look-aways are often correlated with what's going on in the discussion.
She's taking this as an autistic adaptation NT people are less likely to make, like my gestures are practiced and tailored for the sake of the other, not my own sake. I want to "look in her eyes" to make a point, because that's one of the ways you show people you're making an important point, not to see how she's responding to what I'm saying.
I haven't done any of it on purpose. It's apparently just how I've adapted to the weird communication space of having a gap between actually looking at someone's eyes and being seen to be looking at someone's eyes.
Understanding camera eyelines counts as autistic now?
You're fine doing that. Sorry, but that comment she made really sent me.
Reminds me of how the film department forced the digital artists to take a cinematography and lighting class IRL so their final project renders would improve.
I think maybe I have "trauma masquerading as ASD," because the symptoms are subjectively improving as I learn to down-regulate my nervous system, but then I don't much care what label gets put on why I'm weird. I'm much more interested in figuring out what to do with the different ways I'm weird. I'm old enough that I can't think of ways formal diagnosis would help me, so I'd rather assume each challenge is treatable until I find out that it isn't.
I don't get many opportunities to express my exasperation with the paradigm of the youtube content creator's thousand video cuts per spoken sentence, but hell, in the same way, I think it's just $#@%ing weird.
Have you ever looked at a group of friends and thought "ONE OF YOU IS BLINKING"? No. Yet it's quite common to have a photo where at least one person is mid-blink. The 30-year lifespan of that photo includes the milliseconds they were blinking. Is it untrue to have a picture where two people were not blinking and standing side by side? They did in real life, in those same poses, but fractions of a second apart. Is it a failure to capture reality by having a picture of them with their eyes open? Maybe - or maybe the blending of several moments is more true to the original situation than any specific snapshot could be.
> I feel hardware technology can improve further to allow under-the-LED-display cameras .... so that we can actually look at both the camera and the screen at the same time.
That doesn't fully solve the problem, because you'd be looking at the middle of the screen, not at the person talking to you in a group.
> Video should help people bond as people with as genuine body language as possible.
I agree, but having people be able to actually look at each other is surely part of this.
Isn't a "moment" a very narrow snapshot in time by definition?
Repeating my comment on a sibling ...
Once we have technology to put a camera under a screen without sacrificing display quality ... we will not stop at one camera. There will be an array of cameras covering say every 2x2 inch square of your screen.
Just see how many cameras are on today's phones. The same can happen with new camera tech too.
Also there will be a huge commercial driver to put multiple cameras under the screen -- all apps and marketers can track your precise gaze. Ads will pause unless you are actually watching them. I will hate it but it feels inevitable.
Well, I would have expected any court to have stopped accepting audio and video as evidence by now.
https://www.theverge.com/2024/10/15/24271083/youtube-c2pa-ca...
https://support.google.com/youtube/answer/15446725?hl=en
> Limitations
> “Captured with a camera” only appears if a creator opts to use C2PA technology during filming. If it’s missing, it doesn’t mean the content has modified audio or visuals.
> Note: This feature is separate from our existing altered and synthetic disclosures.
> The metadata that leads to a “Captured with a camera” disclosure is made by a 3rd party (for example, a camera manufacturer). This means there is some risk that someone could take a photo of another screen showing synthetic content. Because the other screen shows an image that has been modified, it wouldn’t be eligible for the “Captured with a camera” disclosure. This issue is called “air-gapping.” Camera manufacturers will continue to develop detection measures to prevent “air-gapping,” but the sophistication of those detection measures may vary in the near term.
https://blog.google/technology/ai/google-gen-ai-content-tran...
I agree. This is one of the things that I actively worry about.
Everything but your smartphone is big enough that you'd have to sprinkle your entire screen area with sensors to get the sense of me looking at you. And that won't be cheap.
Say my laptop had a sensor dead center and I was in a group chat. Only the person dead center would see me looking at the camera.
This is better done in software.
It's been half a decade already since I first noticed iPhones can't capture a red world when wildfires are messing up the air quality; I had to break out an ILC (a DSLR without the SLR) to capture the world more congruently with how I see it.
s/iPhones/the iPhone Camera.app/
Apps like Halide and Pro Camera have no trouble handing over control of white balance to you. I've captured both faint aurora borealis and the red/brown hue when sand and dust are brought over to inland Europe by scirocco winds, with great success.
If people laugh with their mouths open, wouldn’t a camera placed below the LED display capture the inside of their mouths, and the rest of the time just point straight up their noses?
I think some mobile phones have already done this...where they are able to put a camera behind the pixels.
What a society! Processed food, plastics in their blood, processed sensor data. Ugh, we have strayed so far from natural interactions.
Philosophically, we have abandoned being mindful of where we are and just being our natural selves, instead becoming slaves to what some computer tells us we should be seeing.
Controversial stance, but for the same reason I reject wearing makeup.
Girls, you are beautiful as you are! No need to fake it! Most guys don't do that either and everybody is perfectly fine with that too.
Isn't it okay to feel good about looking good? Sure (I love dressing up and doing my hair for occasions)! But obviously that can turn very problematic very fast. Honestly, I wish I knew where to draw the line in the sand. Is it makeup? Piercings? Nice clothes? Surgery?
Just a parent with two daughters who has more questions than answers.
Messing with hair in our youth is fun and it grows back. No worries.
Modest piercings that society does not frown on. No tattoos, and especially none on the face, hands, etc.
We had boys and girls and it went OK. Not too much complaining, and when they became adults, we handed them the keys, wished them well, and we help where and how we can.
Maybe our experiences help with understanding yours.
On closer inspection it turns out it was actually smoothing my hair and boosting the contrast so I looked like I had dyed "highlights", along with airbrushing my cheeks a flat orangey coloured skin tone with a rosy center, as if I were wearing foundation and blusher!
It's optional on Discord. Besides, it's conceivable that you might create a similar effect with a nice audio hardware setup.
If the problem is that society (in bubble X, Y or Z) teaches us our value is judged solely based on our appearance, then we should address the lessons we teach. I feel it is unproductive to play whac-a-mole with the emergent symptoms of such an underlying problem.
Otherwise, it is just another way humans choose to dress their external appearance for their own pleasure, fulfilment and social intentions. It's not as if it's hard to tell when someone is wearing makeup - that is, at least when you're close enough to be able to inspect their imperfections at all.
It seems to me that this idea about makeup being 'fake' stems from heteronormative dating, where a man may feel he is unable to properly assess a woman's beauty (and her attractiveness to him) if her face has been changed in arbitrary ways. But personally, I don't think we should optimize all human encounters for dating efficiency. More broadly, there is no social contract which stipulates you must wield your natural appearance at all times. I think we need not add more social expectations to an already long list.
I said, or intended to convey, that it is a personal preference. It really need not matter to others how a person, be it man or woman, chooses to dress their appearance (though of course there is a line, for example most places would encourage that you wear clothes in public.)
I don't believe that the asymmetry between men and women changes this. There may exist:
- an asymmetry between the beauty standards which are applied to men and women, or;
- an asymmetry between the pressure that men and women experience to enhance their natural appearance, or;
- any other difference of expectation between men and women.
I personally feel many of these expectations are harmful overall. However, this need not invalidate a person's choice to dress their face with makeup, and throwing more expectations (even unpopular ones) into the mix will certainly not alleviate this asymmetry.
edit: perhaps I could make it a little more clear that I have a lot of distaste for the way the world works for women. I agree with you largely and I think it's unfair. I just think there's a very large jump from 'women should not feel the need to wear makeup' to 'women should not wear makeup.'
Fellow women should feel encouraged to reject this pressure.
It's exactly those unnatural expectations of looks that are put on women, starting at a really young age, that are the issue here. Not boys, just girls. It skews expectations and boom, everybody feels like they have to do it. It's very sad. I'm not saying don't shower, don't cut or even brush your hair, etc. All fine. But the full-on makeup you see walking through a random city in the morning, geez, what are we doing to ourselves. And what are the guys doing? Nothing close to it, but spend a lot of time justifying it.
[0]: https://www.health.harvard.edu/blog/showering-daily-is-it-ne...
I'd equate perfuming over it with make-up, not showering.
It's also quite sad that the statement "we should put less make-up on" immediately drifts into a discussion about not showering. Way to ridicule a viewpoint.
First off, I know this is long. It is long because I am having to deconstruct the way we think about showering and make-up. I do think the deconstruction is interesting, but I wouldn't blame you for deciding not to read this.
Though ndndjdjdn's comment was phrased pretty snarkily, I do think it makes a good point. I'll try to explain why, addressing the first and last points, followed by the middle point:
I grew up on a farm. It can smell "bad", especially right after a rain storm. However, since I lived there my entire childhood, it mostly just smelled different rather than bad. I wasn't bothered by it like visitors were, though if I had to pick, I'd prefer the non-after-rainstorm way it smelled to the after-rainstorm smell.
If everybody stopped showering, then everybody would start smelling "bad". I'm sure people would adapt quickly enough, though they would still probably prefer the smell of people who regularly shower to some extent. Thus people tend towards the "showered" state to be more appealing to other people, even though people would probably get used to it if we all eliminated regular showering from our habits.
Similarly, I have seen both IRL and online where men think women who aren't wearing make-up are sickly or ugly, and that women who are wearing natural make-up aren't actually wearing make-up. If a woman stopped wearing make-up, then men would suddenly find that woman to be less appealing, though men would probably get somewhat used to it if all women stopped using make-up. Thus women are pressured to be in the "make-up" state, even though men would probably get used to it if all women eliminated make-up use from their habits.
I would like to note that the logic of the last two paragraphs is the same. Thus, my rebuttal to your point that the body would stink "uncomfortably" is that it probably wouldn't be uncomfortable if everyone stopped showering.
Now to address the middle point. I don't think there is a meaningful difference between showering and showering + wearing perfume (which I will call "perfumed" from here on). People being in the "showered" state is considered normal, and from that point of normalcy, being in the "unshowered" state is bad and the "perfumed" state is good. However, if we are trying to figure out the best way for the world to be, I don't think what is currently "normal" should matter at all.
Now let's lay out the states people can be in:
Smell wise, people can be in the "unshowered", "showered", and "perfumed" states. As laid out in the comment you are replying to, "unshowered", or at least not daily showered, is the healthiest of these states.
Sight wise, women can be in the "no make-up" or "make-up" states. As you point out, health wise the "no make-up" state is the healthiest of these states.
Thus, if we are prioritizing health above cosmetic appeal, everybody should be in the "unshowered" and "no make-up" state. As I have argued earlier, everyone would probably get used to this eventually, but there would always be pressure for people to shower and wear make-up. Thus I think it is inconsistent to want people to be in the "showered" and "no make-up" state if you are arguing health is the reason. My personal take is that we should just let people make whatever choices they want based on their own values, and not mine or yours.
Thank you for coming to my TED talk. I hope you at least found that interesting. If you did read all the way through, I'd be interested to hear your response.
I interpreted ndndjdjdn's comment as sarcasm (due to the use of the phrase "next up"). That is, I think he was saying that if you take sadcherry's logic to its limit, then people wouldn't shower, or would shower less. sadcherry's logic is that people shouldn't wear make-up because it is cosmetic and not beneficial to health. Thus I think ndndjdjdn was talking about the fact that people use showers for cosmetic reasons, and believes sadcherry probably doesn't actually want people to shower less, and so should probably rethink his views about make-up.
You then posted your comment, saying that the health benefits of showers justify them even if they do have cosmetic benefits.
I then commented, saying that I shower in a way that is bad for my health for cosmetic reasons. I wanted to imply that a lot of people shower like this, and therefore the fact that moderate showers might have some health benefits is irrelevant, because the way many/most people shower is actually unhealthy. I probably should have been more explicit about the fact that I thought many/most other people shower in unhealthy ways.
As an aside, I don't actually know of any concrete health benefits besides making sure open wounds don't get infected. I tried to search the web for other benefits, and the only additional ones I found were exfoliation (which is cosmetic) and relaxation (but relaxing things aren't generally classified as "healthy"). With that in mind, I tend to believe the health benefits of showers are probably pretty over-hyped (though not non-existent), and more like a cultural fiction to keep people showering than true knowledge.
I'd be interested to hear if you have a different take.
To your aside of health benefits of showers: I also tried to research this, but other than getting rid of contamination (hazardous elements e.g. during construction or demolishing, or just dirt on wounds) I couldn't find any serious claim that washing the skin is beneficial for health (outside of making sure hands are clean before touching food or mucous membranes), I just assumed there should be one.
I take my confident stance on this back...
…We may talk all day how bad and unfair that is, but none of that changes the reality for an average person out there.
There's already a feature that does this called HR
You can switch to another tab, use a miniplayer, or, in some apps, focus on one person's screen; if you choose someone who has a static avatar up, you'll barely see other people's faces.
The nuclear option is to install PowerToys [0] and put something always on top (I'm a fan of the hotkey Win+Space to toggle always-on-top on and off) in the exact position of the other video feeds. Notepad or something.
[1] https://answers.microsoft.com/en-us/msteams/forum/all/featur...
The model tries to copy the blinks of the original video so it's possible that in other conditions, you'd notice less of this.
Fun to see this feedback though, definitely something worth improving :)
Application error: a client-side exception has occurred (see the browser console for more information).
https://en.wikipedia.org/wiki/File:This_shows_a_recording_of...
Or the occasional look away? (There appears to be a feature for that.)
> Look Away: enable_look_away helps create a more natural look by allowing the eyes to look away randomly from the camera when speaking
I expect both to be different: while saccades do happen when occasionally looking _away_ from a person, they also happen when looking _directly at_ one person because we don't constantly stare at a very specific unique and precise point on their face.
For the demo video, try enable_look_away = true, look_away_offset_max = 10, look_away_interval_min = 1 and look_away_interval_range = 1 (then submit), which from the result I got should really be the default for a more natural result.
I have been working for a company which allowed full remote work without any qualms since before COVID and nobody did video calls back then. Since we end up on site in secure environments we also just get told to disable the camera in the BIOS as part of our laptop hardening.
For things like bi-annual meetings with your manager you would go into your local office.
I'm rarely on calls without video, but when I am I find it jarring when voices just appear out of the ether with only a little flashing icon to indicate who it is I'm listening to.
To each their own!
My company was also 100% remote from its start, even before Covid.
As others have stated, it's also just unnerving to have people making nonstop eye contact or staring at any part of your body at all, even if it isn't your eyes. Maybe 90s Los Angeles was an abnormally shitty place to grow up because of the gang activity, but this is the kind of thing kids started fights over all the time. Robert DeNiro's most famous movie scene is about how threatening it feels to have someone looking at you.
This isn't even unique to humans. When you regularly interact with animals, you're taught to look away and not hold direct eye contact because they'll see it as a challenge or threat. I've learned to do this with my own cats to make them more at ease. You learn to blink, narrow your eyes, look to the side.
It's from January 2023, so I don't know if they've improved it further since then.
The video conferencing software providers have been way too slow to put whoever is speaking top-center (near where the camera typically is).
https://www.sievedata.com/blog/eye-contact-correction-gaze-c...
Newer models have come out that allow the same thing to be done and control even more than the eyes.
See here: https://github.com/KwaiVGI/LivePortrait/blob/main/assets/doc...
For web-conferencing, local use is great so NVIDIA's tools are what we recommend in that case.
- Signed, everyone who's currently trying to cheat on interviews with people who think that forcing videos on does anything at all to keep them honest.
The normal thing is not to uninterruptedly look at a person (which the camera is supposed to be). For example when you make a gesture of trying to remember something by looking somewhere else.
https://publish.purewow.net/wp-content/uploads/sites/2/2024/...
he turns his head a little and his eyes look wrong
And if they do, do you like it?
> For the demo video, try enable_look_away = true, look_away_offset_max = 10, look_away_interval_min = 1 and look_away_interval_range = 1 (then submit), which from the result I got should really be the default for a more natural result.
I think you misunderstand the role of "Look Away": it's not like it looks completely sideways, inventing behaviour that does not exist; instead it looks "away" _from the fixed point that would be dead-on camera center_ (that results in this "I'm gonna pierce through your skull with laser eyes" look), substituting it with "when looking - not aiming/scrutinizing - at something, even continuously, human eyes have saccades"
The whole premise of such software (which has already been implemented by Apple in FaceTime with great success) is to _restore_ the reality which is "I'm looking at you but the mechanical offset between camera and window-on-screen destroys the information that I'm in fact looking at you", not invent something that is not real.
Ideally it would even:
- notice actual saccades and reproduce them, only cancelling the offset (super tough, so the next best thing is to fake it, but since these are small, uncontrolled, random-ish movements the approximation is quite sufficient)
- take into account video window position relative to the camera so that if I'm looking away from the window then it stops compensating.
But hey, first implementations are often naive. I give them credit for implementing Look Away because that's one step beyond the naive implementation. I guess it's not the default + tuneables are there because it's still early.
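The window-aware gating in the second bullet could be as simple as a bounds check on the estimated gaze point — a minimal sketch, assuming the gaze estimate and the video window rectangle are available in the same screen coordinates (names and units are illustrative):

```python
def should_compensate(gaze_xy, window_rect, margin_cm=1.0):
    """Only cancel the camera/window offset while the viewer's gaze point
    actually falls on the remote participant's video window; otherwise
    pass the real gaze through so looking away stays looking away.

    window_rect is (left, top, right, bottom) in screen cm; margin_cm
    adds a little slack so the correction doesn't flicker at the edges.
    """
    x, y = gaze_xy
    left, top, right, bottom = window_rect
    return (left - margin_cm <= x <= right + margin_cm
            and top - margin_cm <= y <= bottom + margin_cm)

# A 20x15 cm window with top-left at (5, 5): gaze inside it gets
# corrected, gaze well off to the side does not.
window = (5.0, 5.0, 25.0, 20.0)
```

A real implementation would need hysteresis and smoothing on top of this so the correction fades in and out rather than snapping, but the decision itself is just geometry.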
A whole lot. Even if they have varying facial expressions, not looking away is creepy as hell because looking away during conversations is actually an important aspect of the communication. Not looking away is sending a nonverbal message, and none of the usual ways that's interpreted are positive.
That said, plenty of people don't make eye contact with the camera much at all :)
Building further on this idea, I wonder if instead of changing the image to look at the camera, we could change the "camera" to be where we're looking.
In other words we could simulate a virtual camera somewhere in the screen, perhaps over the eyes of the person talking.
We could simulate a virtual camera by using the image of the real camera (or cameras), constructing a 3D image of ourselves and re-rendering it from the virtual camera location.
I think this would be really cool. It would be like there was a camera in the centre of our screen. We could stop worrying about looking at the camera and look at the person talking.
Of course this is all very tricky, but does feel possible right now. I think the Apple Vision Pro might do something similar already?
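The re-rendering step is, at its core, projecting the reconstructed 3D points through a pinhole camera placed at the virtual location. A bare-bones sketch of that projection, ignoring rotation, lens distortion, and everything that makes it hard in practice (all values illustrative):

```python
import numpy as np

def project(points, cam_pos, focal=1.0):
    """Pinhole-project Nx3 world points into a virtual camera at cam_pos.

    The camera is assumed axis-aligned, looking along +z; a real system
    would also need the camera's rotation and intrinsics, plus an actual
    3D reconstruction of the face to feed in.
    """
    p = points - cam_pos                  # move points into the camera frame
    return focal * p[:, :2] / p[:, 2:3]   # perspective divide

# One point 50 cm in front of the real camera. Moving the virtual camera
# 10 cm down (toward screen centre) shifts where that point lands.
pts = np.array([[0.0, 0.0, 50.0]])
from_real = project(pts, np.array([0.0, 0.0, 0.0]))
from_virtual = project(pts, np.array([0.0, -10.0, 0.0]))
```

The hard part, of course, is getting a dense, accurate 3D reconstruction from one or two webcams in real time, not the projection itself.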
In order for this to work for gaze correction, you'd probably need to take into consideration the location of the camera relative to the location of the eyes of the person on the screen, and then correct for how the other person is holding the phone; it would probably only work for one-on-one calls. You'd also need to know the geometry of the phone (camera parameters, screen size, position of the camera relative to the phone).
Would be amazing, not sure how realistic it is.
I sometimes use an Elgato Prompter to better enable eye contact during meetings. The camera and lens are mounted behind the screen, so looking at the screen is also looking at the participants. The downside is that the screen is tiny, and leaning forward to read, say, a document does not look that great on camera. So either you have to zoom it substantially or read it on another screen, thus looking away from the participants. In that case, though, you are not looking at the participants, and faking eye contact would be kind of weird.
One thing I've always wondered is if this could be made to work for group video chats - depending on the tile you are looking at, that person would know, so you could tell who is paying attention to you, or even exchange a furtive glance with a colleague in reaction to something someone else said, like IRL. Even harder, but also cool, would be updating the gaze dynamically so you could tell what they were looking at in your scene - say you have a whiteboard behind you and you can tell when the person is making eye contact with you vs looking at something you drew on the board.
Original dev, make it so! :)
I think it's great that this is labelled as "correction" as in a means of optional postprocessing when it's convenient. Nvidia implying that it's something we should enable by default rubs me the wrong way, but then again, I don't spend my day stuck in virtual meetings.
Just a heads up – your main website is showing an error. You might want to fix it since your post is gaining traction. Here's the link: https://www.sievedata.com/
The error message reads: 'Application error: a client-side exception has occurred (check the browser console for more details).'
Sorry for duplicate post. Also this feature is enabled by default, but causes issues with several sites.
But the resultant video has a tad bit of uncanny valley going on.
I'd rather learn from the guy on the right.
Would recommend trying it on other videos, it is surprisingly good. Although there definitely are areas to improve.
Anyway, I’d much prefer if Apple didn’t silently alter the eye direction of people calling me.
Technically cool; however, I'd prefer some semi-transparent mirror setup.
Such a setup keeps the eyes alive.
> Limitations
> Works best with frontal face views and moderate head rotations.
> Extreme head poses or gaze directions may produce less accurate results.
There it is. To use this, I'd like to see an example showing it stop adjusting when "extreme" (aka normal) head poses are used. If it can handle real behavior and improve eye contact in the optimal case, seamlessly adjusting or not adjusting as someone moves around, that would be a good product.
I do appreciate that this is a problem worth solving though, and I spent a lot of my time during COVID worrying about the negative impact that normalising loss of eye contact would have on the social interactions of our younger generations.
Back in 2021, I took one of those £50 teleprompter mirrors that YouTubers use, put a 7in Raspberry Pi display in the slot where you're meant to put your phone, and made it my 'work calls display' for a couple of days. The interesting thing is that the only people who noticed without me pointing it out were completely non-technical, and when they did, they complimented me on the quality of my webcam rather than the fact that I was looking straight at them; they could tell something was better, but couldn't quite put their finger on it. Which is funny, because I'm sure being stuck behind a cheap perspex one-way mirror made my actual camera quality a bit worse.
I remember I got to the point where I started playing with cv2 trying to do realtime facial landmark detection on the incoming feed and having a helper process shift the incoming video window around the little screen so that it would keep the bridge of the other person's nose (the point I naturally made eye contact with) pinned to the bit of the screen that was directly in front of the webcam lens. Then one morning I walked into my office, saw this monstrosity on my desk, realised I was nerd sniping myself and gave up.
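For the curious, the core of that hack is simple geometry: given where the landmark detector finds the other person's nose bridge within the incoming frame, move the window so that pixel lands on the screen point directly in front of the lens. A minimal sketch follows; `detect_nose_bridge` is a hypothetical stand-in for a real landmark detector (e.g. dlib's 68-point model, where point 27 sits at the bridge of the nose), and only the windowing math here is concrete.

```python
def window_topleft_for(nose_in_frame, lens_point):
    """Top-left screen position for the video window such that the
    landmark at nose_in_frame (pixel coords within the frame) lands
    exactly on lens_point (the screen pixel in front of the lens)."""
    return (lens_point[0] - nose_in_frame[0],
            lens_point[1] - nose_in_frame[1])

def track_loop(capture, detect_nose_bridge, lens_point, win="call"):
    """Helper-process loop: find the nose bridge in each incoming
    frame and nudge the window so that point stays under the lens.
    `capture` is a cv2.VideoCapture-like source of the incoming feed;
    `detect_nose_bridge(frame) -> (x, y) or None` is assumed."""
    import cv2
    cv2.namedWindow(win)
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        nose = detect_nose_bridge(frame)
        if nose is not None:
            x, y = window_topleft_for(nose, lens_point)
            cv2.moveWindow(win, x, y)
        cv2.imshow(win, frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
```

In practice you'd also want to smooth the offset over a few frames, since raw landmark jitter makes the window twitch around annoyingly.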
One thing I do remember though is how odd it felt looking at yourself in a mirror without your image being mirrored. Not sure my brain was ready for that one after thousands of years of looking at itself in mirrored surfaces.
Bit of a weird pic but the only one I can find: https://pasteboard.co/BXE6zhbpOD7E.jpg
Feynman has a good explanation for that: https://www.youtube.com/watch?v=msN87y-iEx0
But it doesn't go deeper as to why we're perceiving ourselves that way, for that we have to dive into biology, neurology, bilateral symmetry, and the fundamentals as to how, as bilaterally symmetric beings, we're able to orient ourselves in a 3D world.
(I recall reading a paper or watching some video about that, but can't find it anymore)
I'm autistic. I do not make eye contact easily. That's part of who I am, and a useful signal for the people I interact with that I might be neurodiverse.
Something like this could mask or hide that autistic trait, and make me appear allistic. And, for some autistic folks, maybe that's desirable. However, I find the traits that I have around communication to be a deep part of me. Not making eye contact is nothing to be ashamed of, because being autistic is nothing to be ashamed of. The introduction of tools like this could lead to pressure to use them to conform to some standard of "normalcy" that people expect.
While the technical achievement here is neat (and other commenters pointed out places where it's struggling to look good), there's a sort of meta impact that it can have on people who fall outside of normal ranges.
Several years ago during the pandemic, I enlisted a job coach to get me hired. One of her paramount concerns was my eye-contact with the camera. She said it's so important. Am I paying attention? Am I an honorable man who maintains eye contact when I'm in a conversation? If I look away, am I collecting my thoughts, or prevaricating?
Many supervisors, managers, and teachers will judge their employees by whether they can pay attention during meetings, or if they're distracted, buried in their phone's screen, looking at the keyboard, glancing off at children or a spouse. Even more important, if you're meeting your wife and she can't even maintain your attention, what kind of husband are you?
If you employ a gadget to lie about this, then I hope they fire you and find someone who'll be honest. I hope your wife sends you to sleep on the sofa.
This is especially true for my setup, where I have two screens side-by-side with the camera placed right between them. I just stare at the camera because otherwise it looks like I’m looking way off to the left or right. If I do look at the people who are talking, what they see is me looking off at “something else.” That’s a lie! :)
I contend that it's unproductive to train consumers otherwise. Yeah, we could look at the screen and have software correct it. Or, we may eventually integrate lenses into screens so that they're placed exactly right. But it seems kludgy to do this software fix. Just train people to look in the right place. (I hate iPhones and I'm unable/unwilling to do Facetime with them. Please use Meet or Teams.)
I'm gradually building skills that let me be aware of what's on the screen without having to stare into it. Having a relaxed, wide field of vision helps with many things. Glasses are counterproductive here.
Another example, though, would be vocalists in a video; usually they'll be singing right at the viewer and making a connection there, unless they're just too cool and aloof.
Perhaps other people didn't think about it as deeply as I did and maybe it did have the intended effect, but I remember I didn't see him or anyone else doing the same thing in any future all-hands.
This has been enabled on iPhones, by default, for like 5 years now. You never even noticed.
Their implementation only does a small adjustment, which works so well that most people don't even know it's being done.
I have seen three cameras in use in nearly a decade, and they were all in interviews. I'm not avoiding opportunities, either; I'm legitimately in meetings 4+ hours a day.
Might be fair to say not many cared to see/be seen
Furthermore, if this corrects only someone who's looking directly at the screen, it'd be tolerable. But does it also correct eyes looking at a keyboard, eyes looking at a smartphone screen, eyes looking at a wayward toddler? That's worse.
Also... ten cents per minute? That's highway robbery!
The camera is the eye. Anyone seeing video of me is seeing me through the eyes of a camera. Therefore, to "make eye contact" I look into the camera, not into arbitrary pixels. In videoconferencing, it's wholly irrelevant where my audience's eyes are located, whether they're even visible. In videoconferencing, our cameras are the eyes, and that's how to make eye contact, because when I see you on the screen looking into the camera, your eyes are directed towards mine seeing the screen.
For over a hundred years, any subject of a camera has known that if you look into that camera lens, then your gaze will be perceived as "eye contact" to any viewers. Where do you look when you're taking a selfie? Or a wedding photographer is taking your photo? Do you look in the photographer's eyes? Do you stare at his flashbulb? That's fucking nuts!
Why is this so hard to understand?
If AI is directed to help us lie about a particular, very human, interaction cue, then is it any surprise we're a world full of autists and Asperger babies?
All attempts to subvert people's freedom to direct their attention where they want are tyrannical in nature. If you can't detect it's happening, it effectively did not have a negative externality. The tree did not make a sound if no one heard it.
This is the same thought that is used to justify not letting cashiers sit while they bag groceries. Those who think this love the taste of boots in their mouth.
I hope that they fire those who refuse to get with the times on AI and embrace ludditism, and I hope your wife considers her future with you after the economic ruin that such practices will bring upon your family.
So if you enjoy freedoms like ignoring your boss or zoning out during meetings where you should be paying attention, or missing a lecture by your instructor, and you believe there aren't any negative externalities from your failure to pay attention, then I don't know what to tell you.
Now the WFH revolution is already horrifying managers, because it is much more difficult to determine when employees are engaged and productive, vs. when they're trying to fake it, or tuned out. If this AI filter wants to remove one of those cues, that's going to continue horrifying businesses everywhere, and they'll double-down on RTO calls. I've also heard horror stories of hiring remote workers, who will fake interviews, rent their identities, deepfake their video, consult AI offscreen to answer interview questions, subcontract to their illegal buddies, and generally use every trick in the book to hoodwink corporations who make the mistake of not having an in-person relationship with their workforce.
My job coach taught me the value of eye contact, and by extension, the value of paying attention to another human being who is engaged in a discussion with me. That is extremely important. In any online interaction, due to reduced cues and limited feedback, any human cue we can maintain is a valuable one.
My lack of eye contact, I believe, is mostly because it can unnerve me to have someone looking intently at me, and I look away from them in order to collect my thoughts, and maintain my train of thought. It's a habit but it's not necessarily effective. It turns out that most of us can indeed carry on a conversation, and not get distracted, when we're looking into someone's eyes.
And the value to the other party is that they know that they have our attention! That is a gift! I have no idea how tasting boots is relevant here. Every job I've had, has been a mutual gift, and a pleasure to serve my employer, and I've always felt valued for that service, despite the unequal power differential.
It is so weird that you want us to "get with the times on AI" when eye contact is such a basic, very human, and valuable habit of successful people. If AI could facilitate a human connection, I'd be all ears, but in this case, for this article, AI is subverting the signal, encouraging laziness, and simply lying, to "save face", as it were.
Also zoning out during the meetings where your presence is required but unnecessary. If you don't pay attention to a university lecture, that's a skill issue on your part.
-------
The role of the worker is to extract as much value from their employer as possible; any productivity is a secondary byproduct.
Because I was in exactly that position as a teaching assistant, and we dealt with students all the time who had cameras turned off, AFK, distracted, lost.
If AI is going to mask those important feedback signals and lie to the leadership, then the leadership will become ever more ineffective, and the workers will all pay the price. Good job.