I keep thinking we really need real-time facial tracking that morphs video-conference images so that participants actually look at each other. Without it, we constantly stare not-quite-at-each-other, giving teleconferences a feeling of disinterest.
By knowing where the two participants are in 3D space (which should be doable with facial tracking, even from a monocular camera), we could digitally shift each participant's pupil position slightly so they appear to look into the other's eyes. You likely only want to do this when a participant is actually looking at the other participant's face, and allow natural glances away otherwise (or you get creepy staring looks).
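The "only correct when actually looking at the face" part could be a simple angular threshold: compare the tracked gaze direction against the direction toward the other participant's on-screen face, and blend the pupil retargeting in only when they are close. A minimal geometric sketch, assuming the face tracker already gives us unit-ish 3D direction vectors (the function names and the 10-degree threshold are illustrative, not from any particular library):

```python
import math

def gaze_offset_deg(gaze_dir, target_dir):
    """Angle in degrees between the tracked gaze direction and the
    direction from the eye to the other participant's on-screen face."""
    dot = sum(g * t for g, t in zip(gaze_dir, target_dir))
    norm_g = math.sqrt(sum(g * g for g in gaze_dir))
    norm_t = math.sqrt(sum(t * t for t in target_dir))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_angle = max(-1.0, min(1.0, dot / (norm_g * norm_t)))
    return math.degrees(math.acos(cos_angle))

def pupil_correction_weight(gaze_dir, target_dir, max_offset_deg=10.0):
    """Blend weight in [0, 1]: 1.0 = fully retarget pupils toward the
    other participant's eyes, 0.0 = leave the glance untouched.
    A soft linear ramp avoids a hard snap between 'eye contact'
    and 'looking away', which would itself look creepy."""
    offset = gaze_offset_deg(gaze_dir, target_dir)
    if offset >= max_offset_deg:
        return 0.0
    return 1.0 - offset / max_offset_deg
```

So a glance straight at the face gets full correction, a glance off to the side gets none, and small wanderings in between are smoothly attenuated.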
Update 2019: https://arstechnica.com/gadgets/2019/07/facetime-feature-in-ios-13-feigns-eye-contact-during-video-calls
Ha! Good to see someone is moving ahead with this!