MacBook – How to correct dead MacBook Air camera sensor pixels BEFORE other applications use it

camera, firmware, hardware, macbook pro

My MacBook Air's camera has a handful of broken sensor pixels clumped together near the middle of the image. They're dead, always black. When I'm on a video conference or a FaceTime call, they tend to make it look like I have a gap between my teeth or some small black thing stuck to my chin. It's irritating, and a replacement would be expensive. I know it's possible to do dead pixel correction in post-production for images or videos, but I can't find a solution that feeds the sensor output through pixel correction before it gets used by other apps (e.g., FaceTime, Skype, Google Hangouts, or a Slack call).

How can I find, adapt, or build a software solution that does dead pixel correction on the sensor's output BEFORE it gets used by other applications?
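To make it concrete, here's the kind of per-frame correction I have in mind: a minimal Python sketch that replaces each known dead pixel with the median of its live neighbors. The coordinates below are made up; I'd map my real ones by filming a plain white card and looking for pixels that stay black.

```python
import numpy as np

# Coordinates (row, col) of the dead pixels. These are made-up values;
# substitute the ones found on your own sensor.
DEAD_PIXELS = [(244, 318), (244, 319), (245, 318), (245, 319)]

def correct_dead_pixels(frame: np.ndarray) -> np.ndarray:
    """Replace each dead pixel with the per-channel median of its 5x5
    neighborhood, skipping the other dead pixels."""
    fixed = frame.copy()
    dead = set(DEAD_PIXELS)
    h, w = frame.shape[:2]
    for y, x in DEAD_PIXELS:
        neighbors = [
            frame[ny, nx]
            for ny in range(max(y - 2, 0), min(y + 3, h))
            for nx in range(max(x - 2, 0), min(x + 3, w))
            if (ny, nx) not in dead
        ]
        fixed[y, x] = np.median(neighbors, axis=0)
    return fixed
```

The correction itself is easy; the part I'm stuck on is getting it to run between the sensor and every other app.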

Best Answer

I'm not sure what mechanism you would use to inject your code into the camera's firmware and alter the data the system sends directly to another app. Similarly, modifying the OS would be tricky.

For the time and engineering you would put in, it would be far cheaper and quicker to just add a USB camera and have a good signal in the first place. Depending on where you ran the correction code, you might even save power and battery by having a good camera send good data rather than processing real-time video corrections.

That being said, you might be able to use virtual-camera software to process the stream and present the corrected feed as a camera device that receptive apps can select. I don't think you'll have luck with FaceTime or Skype/Teams unless they have a plug-in or filter that already does the processing you want before sending the video out.
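If you do go the virtual-camera route, a rough sketch of the pipeline might look like the following. It's Python with OpenCV and the pyvirtualcam package (which on macOS drives the OBS virtual camera, so OBS must be installed), and it reuses the correct_dead_pixels() sketch from the question. The camera index and frame rate are assumptions.

```python
import cv2
import pyvirtualcam

# Assumptions: the built-in camera is device 0, 30 fps is acceptable, and
# OBS is installed so pyvirtualcam has a virtual camera to drive.
# correct_dead_pixels() is the sketch from the question above.
cap = cv2.VideoCapture(0)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

with pyvirtualcam.Camera(width=width, height=height, fps=30) as cam:
    print(f"Serving corrected feed as: {cam.device}")
    while True:
        ok, frame = cap.read()  # BGR frame straight off the real sensor
        if not ok:
            break
        frame = correct_dead_pixels(frame)
        # pyvirtualcam expects RGB; OpenCV delivers BGR
        cam.send(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        cam.sleep_until_next_frame()
```

Apps that let you pick a camera source (Zoom, Slack, Hangouts in a browser) can then select the virtual device; as noted above, FaceTime may not list it.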