
DrV

macrumors 6502
Sep 25, 2007
272
511
Northern Europe
The distance from the sensor to the lens is, I would think, a matter of physics, but I'm open to being corrected if that's not the right term.

The smallest commercially available 1080p sensor/lens array I'm aware of is almost twice as thick as the MB camera and costs a small fortune (in webcam terms).

I for one would welcome a return to the external iSight camera for those who really "need" better cameras.

I usually avoid using my background as an argument, but half of my PhD was optics, and I have both designed optical systems and lectured optics. From that perspective, I would like to try to shed some light on what physics and engineering says about image quality and camera size.

Physics sets two hard limits:

1. The amount of light entering the system depends on the physical aperture (lens size in millimetres). The number of photons entering the system sets a lower limit to the quantum noise (shot noise, "graininess") of the image due to the quantum nature of light.

2. Image sharpness depends on the physical aperture ("diffraction limit") due to the wave nature of light.

Modern camera systems are diffraction-limited, i.e. they are limited by the second limit (not engineering). Modern camera elements are also quite close to the theoretical maximum efficiency (a photon hitting a pixel very likely produces a charge carrier), but there is more room for improvement there. Not easy, and not an order of magnitude by any means.
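To make the first limit concrete: photon arrivals follow Poisson statistics, so the best achievable signal-to-noise ratio per pixel is the square root of the collected photon count. A minimal sketch (the photon counts below are illustrative assumptions, not measurements):

```python
import math

def shot_noise_snr(photons: float) -> float:
    """Poisson-limited SNR: signal N over noise sqrt(N), i.e. sqrt(N)."""
    return math.sqrt(photons)

# Illustrative per-pixel photon counts (assumed values, not measured):
for n in (100, 400, 10_000):
    print(f"{n:>6} photons/pixel -> SNR limit ~ {shot_noise_snr(n):.0f}")
```

No amount of processing can push the SNR above this limit; it can only trade noise against detail.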

Let's see some comparisons between the diffraction limit and the MBA camera module. I am afraid I have to guess most specifications for the module, but these are my rather optimistic guesstimates:

focal length: 1.5 mm
f-number: f/2
number of pixels: 1280 x 720
field of view: 50°

With the given f-number the diffraction limit on the sensor is approximately 2.5 um (Airy disc first minimum, green light.)
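That figure follows from the Airy-disc formula, d = 2.44 λN for the first-minimum diameter; the exact number depends on the wavelength one picks for "green". A quick sketch:

```python
# Airy disc first-minimum diameter on the sensor: d = 2.44 * wavelength * N
wavelength_um = 0.55   # green light in micrometres (assumed value)
f_number = 2.0         # the guessed f/2 from above

airy_diameter_um = 2.44 * wavelength_um * f_number
print(f"diffraction limit ~ {airy_diameter_um:.1f} um")  # ~2.7 um with these inputs
```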

On the other hand, the image on the sensor is approximately 1.3 mm x 0.7 mm (from the FOV and focal length). When this is divided by the number of pixels, the pixel size should be around 1.0 um. This seems reasonable; the smallest pixel sizes in any camera sensors are right in that range.
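The geometry behind those numbers is simply w = 2 f tan(FOV/2); here I treat the 50° as the horizontal field of view (the original estimate may use a slightly different convention, e.g. diagonal FOV):

```python
import math

focal_length_mm = 1.5    # guessed focal length from above
fov_deg = 50.0           # guessed FOV, treated as horizontal here
pixels_across = 1280

# Image width on the sensor: w = 2 * f * tan(FOV / 2)
width_mm = 2 * focal_length_mm * math.tan(math.radians(fov_deg / 2))
pixel_pitch_um = width_mm / pixels_across * 1000

print(f"image width ~ {width_mm:.2f} mm, pixel pitch ~ {pixel_pitch_um:.2f} um")
```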

Note that the pixel size is actually well below the diffraction limit. Smaller pixels would not give any more information, just more noise. No point adding pixels with such a small aperture (physical aperture 1.5 mm / 2 = 0.75 mm).

Not coincidentally, the same root cause (the small physical aperture) also produces image noise, as very few photons find their way into the optical system. So, we could try to keep the focal length intact and increase the aperture. This both collects more light (less noise) and gives a sharper image.

Unfortunately, this is easier said than done, and here we enter the engineering part. With classical lens-based optics it is possible to get up to f/1 apertures depending on the field of view. However, doing that while maintaining the physical size of the optical system is very hard, and even with larger lenses the image quality usually suffers. Apple has worked extremely hard with the f/1.6 lens in iPhone 12PM.

In theory, it should be possible to use non-classical optics or to combine images from a large number of sensors. Non-classical solutions (based on diffractive optics or even on negative-refractive-index metamaterials) are theoretically possible but years and years from being useful in this application. Combining images from several sensors is an interesting possibility, but there would need to be a lot of them, and that would cause a lot of other problems.

It might be interesting to compare different setups. As I hate the MB webcam image "quality", I often use a Logitech StreamCam as a quick replacement. The image quality difference is day-and-night. The Logitech lens is specified as f=3.7 mm, f/2 (1.85 mm aperture). As the focal length is 3.7/1.5 = 2.5-fold and aperture number similar, the lens collects approximately 6 times (2.5^2) as much light as the built-in webcam, and the optical resolution is approximately 2.5 times better. And that shows.

Sometimes I have tried to use EpocCam and an iPhone. My iPhone XS Max seems to have an f=4.25 mm, f/1.8 lens (2.4 mm aperture), which is again somewhat better than the StreamCam (more than the aperture number would indicate, but that comes from other factors). But when I really need decent image quality, I use a D7500 DSLR with a zoom lens. The lens is not a fast one, but at f=35 mm, f/5.6 the physical aperture is 35/5.6 = 6.25 mm. The light collecting area is thus (6.25 mm / 0.75 mm)^2 ≈ 69-fold compared to the built-in webcam. That is a huge difference, and with the roughly 8-fold increase in sharpness as well, it shows.
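The comparison can be recomputed from the quoted specs: the physical aperture is the focal length divided by the f-number, and light gathering scales with its square. The figures below use the spec values as quoted in this post (the built-in module's specs remain guesses):

```python
# (focal length in mm, f-number), as quoted above; the MBA entry is a guess
cameras = {
    "MBA built-in (guess)": (1.5, 2.0),
    "Logitech StreamCam":   (3.7, 2.0),
    "iPhone XS Max":        (4.25, 1.8),
    "Nikon D7500 @ 35 mm":  (35.0, 5.6),
}

ref_aperture_mm = 1.5 / 2.0  # built-in webcam, 0.75 mm

for name, (f_mm, n) in cameras.items():
    aperture_mm = f_mm / n
    light_ratio = (aperture_mm / ref_aperture_mm) ** 2
    print(f"{name:22s} aperture {aperture_mm:5.2f} mm, light x{light_ratio:5.1f}")
```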

The sensor in D7500 produces a 2160p image, but I downscale it to 720p. Crisp, sharp, well-illuminated 720p is good for anything short of creating HD videos.

Now, there are things that can be done in image processing, and the M1 MBA utilises those. The image quality may become visually more pleasing, but there is no more information in the image. If someone really wants 1080p, the image can be super-scaled by using super-resolution algorithms, but it won't look any better. And noise is poison to those algorithms.

So, due to physics and known engineering limits, you can do the following:

1. Get enough light from the right direction. This makes the image tolerable.
2. Get a good webcam. Plus, of course, keep the lighting good.
3. Get a real camera.

Apple could make the lid thicker and use a thicker module.

From my point of view there are two completely missed easy opportunities. Apple really should make it so that an iPhone could be used as a webcam without unstable third-party solutions. And DSLR and compact camera manufacturers should support the USB Video Class (UVC) protocol over USB on their cameras; then any camera could be plugged in as a webcam. A $300/300€ compact camera would be a fabulous webcam with zoom and aperture control.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,764
Even better:
Let ur iPhone take a photo of you, send it to MacBook. MacBook then will use the neural engine to generate a deep fake face of you, mimicking mouth movement etc.
Next time someone wants to make a “video call”, what they see is not you, but a computer generated face that is the best of your look. Apple can then tout their image processing power non-stop forever. Win-win.
Who needs a webcam anyways?
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
Even better:
Let ur iPhone take a photo of you, send it to MacBook. MacBook then will use the neural engine to generate a deep fake face of you, mimicking mouth movement etc.
Next time someone wants to make a “video call”, what they see is not you, but a computer generated face that is the best of your look. Apple can then tout their image processing power non-stop forever. Win-win.
Who needs a webcam anyways?
I'm sure this is sarcasm because isn't that the Memoji?
 

sean+mac

macrumors regular
Aug 13, 2020
103
139
Canada
Apple really should make it so that an iPhone could be used as a webcam without unstable third-party solutions.

First off, DrV, I just wanted to thank you so much for taking the time to write that post. I almost un-subbed from this thread a few times recently (it has gotten pretty cyclical in nature), and your post was just an excellent read. Now, I wasn't somebody who needed convincing, but I still learned a few things from it and overall just appreciated the time, expertise and effort that went into it.

So, again, thanks!

Regarding iPhone as a webcam ... 100%. This is such an "obvious" (for Apple to do) suggestion, especially when you think of the larger ecosystem of Handoff-type functionality they've been trying to build up. If my watch can see my phone camera's stream to help take a picture, just make it so my computer can as well.

I have bought EpocCam and it mostly works on days where I want prime video quality for a recorded live stream, but the app and cross-device experience isn't quite as reliable or polished as it could be. Still, it helps solve a "problem" in a way that uses existing products already in my house, helping avoid the need to buy yet another higher resolution camera for a niche use case.

(The iMac's camera is better, so I've been using the iPhone less often in this way. But the iPhone is still the higher quality camera option.)
 

AppliedMicro

macrumors 68030
Aug 17, 2008
2,831
3,723
As far as the bump, BRING IT ON
A camera bump would look absolutely hideous in a notebook form factor.

Does the camera bump look great on an iPhone? No. But there is good reason to prioritise function over form on such a handheld device.
Does the camera bump look great on an iPad? No. But there's still enough reason to prioritise function over form on such a handheld device.
Does the bump look good on the iPhone smart battery case? No. But again...

Is there enough reason to make notebooks thicker, and thus heavier, for a webcam? There isn't, IMO, because there are enough alternatives around for high-quality video-conferencing. Among them professional (stationary) business conferencing solutions for office use. Or, who would have guessed, iPhones and iPads.
Also, there is a good reason not to create camera bumps on notebooks: they are often transported in cases or backpacks. Third-party camera covers have been associated with cracks in displays, due to the uneven pressure applied to the screen. A camera bump on the back side of the display is prone to do something similar.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,764
I'm sure this is sarcasm because isn't that the Memoji?
No, not Memoji. Just a virtual face looking strikingly similar to your own face thanks to AI. It can even reach the level of tricking Face ID into believing it is you. That's what I'm talking about.
 

solussd

macrumors newbie
Feb 18, 2011
17
6
It's a feature. Cams are 18 inches from the face? Most of us don't want high def.
We don’t even like people who take pictures of their kids at the School Xmas show with an iPad. Do we want them doing it with a laptop?
I’ve got the Logitech 4K webcam that attaches to the XDR display. Trust me, it makes you look better than everyone else’s grainy video with poor low-light performance.
 

Shirasaki

macrumors P6
May 16, 2015
16,263
11,764
And that's exactly what a Memoji is.
I would not mistake a clearly animated face for a real human's face. And I cannot use Memoji to unlock my device using Face ID. It has to be so similar that everyone seeing that face would think it is you.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
I usually avoid using my background as an argument, but half of my PhD was optics, and I have both designed optical systems and lectured optics. From that perspective, I would like to try to shed some light on what physics and engineering says about image quality and camera size.

Physics sets two hard limits:

1. The amount of light entering the system depends on the physical aperture (lens size in millimetres). The number of photons entering the system sets a lower limit to the quantum noise (shot noise, "graininess") of the image due to the quantum nature of light.

2. Image sharpness depends on the physical aperture ("diffraction limit") due to the wave nature of light.

Modern camera systems are diffraction-limited, i.e. they are limited by the second limit (not engineering)....
I was surprised to hear you say that what limits the sharpness of modern camera systems is the diffraction limit, rather than the engineering (which would include the engineering of the lens). It is my understanding that lens quality still does matter when it comes to sharpness (especially away from the center of the lens, which is important when you are shooting at wider apertures), and that diffraction-limiting only comes into play when you stop down to sufficiently small apertures. See, for instance:


Also, it seems the key engineering constraint that limits picture quality for these webcams is sensor size (=image size), which is in turn limited by focal length (i.e., lens-sensor distance). Could you please give the formula for image size as a function of focal length, FOV, aperture, and aspect ratio you used in your calculations? It would be fun to play with it.
 

sean+mac

macrumors regular
Aug 13, 2020
103
139
Canada
The Mac does do that with the iPhone.

Thank you for your helpful post, which pointed me to neither an Apple Support article nor any other "how to" guide explaining how to do this, for somebody who is ignorant of the feature. It didn't even give the name of the feature, so I'd know what to search for on Google.

If you mean the Continuity Camera feature for scanning a document or taking a photo from your iPhone/iPad within a note on macOS ... sure, I guess that sort of meets one definition of what I stated. That workflow, by the way, requires you to go interact with the external device, take the photo/scan from its display, and confirm it before seeing it embedded back in the note where you started. Hardly a live stream of the camera to the Mac for use as a webcam. Nor is it even streaming what the camera sees to the device (unlike my Apple Watch example, which does do this).

But literally nobody has been discussing this sort of use of the camera the whole thread. We're here debating and discussing video conferencing and recording use cases.

Maybe you mean how you can use Quicktime to record your iPhone's screen? Of course, that requires the rest of the OS's overlays to show and doesn't actually make it show in something like FaceTime, Photo Booth or Zoom as an option. Yes, in Zoom I could share the Quicktime window as a screen share so long as somebody else isn't screen sharing.

If you can send an example along, that would be greatly appreciated. I had referenced EpocCam as a third-party solution for the feature proposed: using an iPhone as a higher-quality webcam. (EpocCam was actually mentioned first by the other poster, who initially suggested Apple could build a solution for this use case.)
 

Maconplasma

Cancelled
Sep 15, 2020
2,489
2,215
Thank you for your helpful post, which pointed me to neither an Apple Support article nor any other "how to" guide explaining how to do this, for somebody who is ignorant of the feature. It didn't even give the name of the feature, so I'd know what to search for on Google.

If you mean the Continuity Camera feature for scanning a document or taking a photo from your iPhone/iPad within a note on macOS ... sure, I guess that sort of meets one definition of what I stated. That workflow, by the way, requires you to go interact with the external device, take the photo/scan from its display, and confirm it before seeing it embedded back in the note where you started. Hardly a live stream of the camera to the Mac for use as a webcam. Nor is it even streaming what the camera sees to the device (unlike my Apple Watch example, which does do this).

But literally nobody has been discussing this sort of use of the camera the whole thread. We're here debating and discussing video conferencing and recording use cases.

Maybe you mean how you can use Quicktime to record your iPhone's screen? Of course, that requires the rest of the OS's overlays to show and doesn't actually make it show in something like FaceTime, Photo Booth or Zoom as an option. Yes, in Zoom I could share the Quicktime window as a screen share so long as somebody else isn't screen sharing.

If you can send an example along, that would be greatly appreciated. I had referenced EpocCam as a third-party solution for the feature proposed: using an iPhone as a higher-quality webcam. (EpocCam was actually mentioned first by the other poster, who initially suggested Apple could build a solution for this use case.)
I meant the Continuity Camera. I know you're referring to a live webcam, but I don't think there are any PC manufacturers that make phones that are used as live cams for their PCs. Seeing as Microsoft no longer makes phones and Google only has Chromebooks, I don't think Apple is out of place here in not turning iPhones into webcams for the Mac. Also, it's more than just creating the feature: latency needs perfecting, and it needs to be a feature that more than just niche users would want. I tried EpocCam once when I wanted to use my iPhone during a Zoom call. That software was so flaky, and it had a lot of upset customers waiting for updated drivers, especially for macOS Catalina. I couldn't even remove it; I had to do a full erase and reformat of my Mac.
 

827538

Cancelled
Jul 3, 2013
2,322
2,833
Edit: I spent a while crafting this reply. In the meantime a few others chimed in with some very similar replies, such as this one:


I just wanted to call out that I wasn't trying to "beat a dead horse" with a very repetitive reply that homed in on a few of the same points (like glass/aluminum contributing to the limited thickness). My message was just in the midst of being crafted while a few others were also posted.

Original message:

Thank you for your balanced request. I thought this thread had (mostly) settled on body thickness being a constraint around pages 5-6, then suddenly it blew up again with a ton of crying that Apple cheaped out on this design by not upgrading the cam beyond the driver/processing improvements.

Almost certainly they would have had to redesign the body completely to thicken the display, even if just for a camera bump (and a space for that bump, if it bumped in towards the body). Like it or not, they chose not to update the bodies with this round of M1 upgrades. If I had to guess, this would be, in part, due to a ton of R&D money already going into the design, testing and manufacturing of a brand new chip which is a generational leap over existing laptop CPUs. Also, I suspect it helps communicate to the less tech-savvy customers that "it is just another MacBook Air, don't worry about it" if that's all they're in the shop for.

Anyway, thanks for being reasonable in saying "bring on the bump" (paraphrased.)



This goes back to the need for a bump. I am not 100% sure on the display itself, but the thinnest part of the Air's total height specs is 4.1 mm – including the keyboard/computer half. So let's say it is 2.5-3 mm of display thickness, generously. This also aligns with a quick calliper measurement I tried to perform on my 13" MacBook Pro's display, though honestly I was worried about scratching it, so I only gently tried to get a ballpark measure. This thickness includes the glass and aluminum body. While true of all devices, obviously as a percentage of total available depth the housing would represent a much higher proportion of available space in a thinner device. So even 1-3 mm added could go entirely to housing thicker components and wiring.

As a comparison, the iPad Pro is, from what I could tell, the thinnest device to have a Face ID module. It is 5.9 mm.

Unfortunately I could not find any measurements of the Face ID module itself. When shown in teardowns, though, it does seem to take up a fair bit of the depth of the devices it is in, especially with the two little lenses protruding out of it.

I wasted more time than I'd care to admit trying to find a teardown of a modern MacBook's display that shows the camera module. Most newer teardowns don't seem to even bother disassembling the whole display – likely as it gets replaced as one unit in any sort of repair. Here's a 2012 teardown which claims the display back then was 3-7 mm. I hope the ruler shown for the sake of a photo wasn't how they actually measured that ;). But it was also a 720p camera back then, so that's something.

The thickness of the display is tricky to go off of because MacBooks have some roundedness/taper to the body. The taper aligns closely with the LCD bezel, making the display thinnest at the extreme edges. Arguably, that tapering covers the camera's location, so a more squared-off redesign could possibly add a little extra room where the camera is placed above the LCD.


I don't think people are saying a better camera wouldn't be an appreciated improvement, if one could be added without other form-factor tradeoffs. They are just skeptical that it is even feasible to make a 1080p camera module that would fit the current dimensions without introducing noise and other camera-quality tradeoffs.

This brings us back to introducing a bump, a body shape redesign to thicken things up or some other more radical redesign/investment into the overall camera/sensor design.



I will say I agree that I much prefer my iMac's 1080p camera to my (Intel) 13" MacBook Pro's. It definitely is a quality improvement, and the new camera picks up a lot more light and a wider field of view. Appreciated! But I also understand the iMac has a lot more space to work with. If anything, the outrage should be that the iMacs didn't upgrade a few years sooner. I am a little disappointed they didn't bring Face ID into this year's model, given it has the T2 module now and there is space to spare.

You don't need to make the sensor thicker to improve the camera. You can make the sensor taller and wider to accommodate more (and bigger) pixels. The thickness can remain exactly what it is now. I think you're thinking about changes to the lens.
 

deevey

macrumors 65816
Dec 4, 2004
1,417
1,494
Maybe you mean how you can use Quicktime to record your iPhone's screen? Of course, that requires the rest of the OS's overlays to show and doesn't actually make it show in something like FaceTime, Photo Booth or Zoom as an option. Yes, in Zoom I could share the Quicktime window as a screen share so long as somebody else isn't screen sharing.
I meant the Continuity Camera. I know you're referring to a live webcam, but I don't think there are any PC manufacturers that make phones that are used as live cams for their PCs. Seeing as Microsoft no longer makes phones and Google only has Chromebooks, I don't think Apple is out of place here in not turning iPhones into webcams for the Mac. Also, it's more than just creating the feature: latency needs perfecting, and it needs to be a feature that more than just niche users would want. I tried EpocCam once when I wanted to use my iPhone during a Zoom call. That software was so flaky, and it had a lot of upset customers waiting for updated drivers, especially for macOS Catalina. I couldn't even remove it; I had to do a full erase and reformat of my Mac.
Use OBS on the Mac and plug in the Lightning cable to stream. Airmix Solo or the Fullscreen Camera app running on the phone will output a clean video image to stream.

The combo works extremely well (with loads of extra options, e.g. zoom, focus, color, etc.).

But yeah, Apple should really just allow the iPhone to be used as a webcam rather than just as a capture source.

I usually avoid using my background as an argument, but half of my PhD was optics, and I have both designed optical systems and lectured optics. From that perspective, I would like to try to shed some light on what physics and engineering says about image quality and camera size.

Physics sets two hard limits:

1. The amount of light entering the system depends on the physical aperture (lens size in millimetres). The number of photons entering the system sets a lower limit to the quantum noise (shot noise, "graininess") of the image due to the quantum nature of light.

2. Image sharpness depends on the physical aperture ("diffraction limit") due to the wave nature of light.

Modern camera systems are diffraction-limited, i.e. they are limited by the second limit (not engineering). Modern camera elements are also quite close to the theoretical maximum efficiency (a photon hitting a pixel very likely produces a charge carrier), but there is more room for improvement there. Not easy, and not an order of magnitude by any means.

Let's see some comparisons between the diffraction limit and the MBA camera module. I am afraid I have to guess most specifications for the module, but these are my rather optimistic guesstimates:

focal length: 1.5 mm
f-number: f/2
number of pixels: 1280 x 720
field of view: 50°

With the given f-number the diffraction limit on the sensor is approximately 2.5 um (Airy disc first minimum, green light.)

On the other hand, the image on the sensor is approximately 1.3 mm x 0.7 mm (from the FOV and focal length). When this is divided by the number of pixels, the pixel size should be around 1.0 um. This seems reasonable; the smallest pixel sizes in any camera sensors are right in that range.

Note that the pixel size is actually well below the diffraction limit. Smaller pixels would not give any more information, just more noise. No point adding pixels with such a small aperture (physical aperture 1.5 mm / 2 = 0.75 mm).

Not coincidentally, the same root cause (the small physical aperture) also produces image noise, as very few photons find their way into the optical system. So, we could try to keep the focal length intact and increase the aperture. This both collects more light (less noise) and gives a sharper image.

Unfortunately, this is easier said than done, and here we enter the engineering part. With classical lens-based optics it is possible to get up to f/1 apertures depending on the field of view. However, doing that while maintaining the physical size of the optical system is very hard, and even with larger lenses the image quality usually suffers. Apple has worked extremely hard with the f/1.6 lens in iPhone 12PM.

In theory, it should be possible to use non-classical optics or to combine images from a large number of sensors. Non-classical solutions (based on diffractive optics or even on negative-refractive-index metamaterials) are theoretically possible but years and years from being useful in this application. Combining images from several sensors is an interesting possibility, but there would need to be a lot of them, and that would cause a lot of other problems.

It might be interesting to compare different setups. As I hate the MB webcam image "quality", I often use a Logitech StreamCam as a quick replacement. The image quality difference is day-and-night. The Logitech lens is specified as f=3.7 mm, f/2 (1.85 mm aperture). As the focal length is 3.7/1.5 = 2.5-fold and aperture number similar, the lens collects approximately 6 times (2.5^2) as much light as the built-in webcam, and the optical resolution is approximately 2.5 times better. And that shows.

Sometimes I have tried to use EpocCam and an iPhone. My iPhone XS Max seems to have an f=4.25 mm, f/1.8 lens (2.4 mm aperture), which is again somewhat better than the StreamCam (more than the aperture number would indicate, but that comes from other factors). But when I really need decent image quality, I use a D7500 DSLR with a zoom lens. The lens is not a fast one, but at f=35 mm, f/5.6 the physical aperture is 35/5.6 = 6.25 mm. The light collecting area is thus (6.25 mm / 0.75 mm)^2 ≈ 69-fold compared to the built-in webcam. That is a huge difference, and with the roughly 8-fold increase in sharpness as well, it shows.

The sensor in D7500 produces a 2160p image, but I downscale it to 720p. Crisp, sharp, well-illuminated 720p is good for anything short of creating HD videos.

Now, there are things that can be done in image processing, and the M1 MBA utilises those. The image quality may become visually more pleasing, but there is no more information in the image. If someone really wants 1080p, the image can be super-scaled by using super-resolution algorithms, but it won't look any better. And noise is poison to those algorithms.

So, due to physics and known engineering limits, you can do the following:

1. Get enough light from the right direction. This makes the image tolerable.
2. Get a good webcam. Plus, of course, keep the lighting good.
3. Get a real camera.

Apple could make the lid thicker and use a thicker module.

From my point of view there are two completely missed easy opportunities. Apple really should make it so that an iPhone could be used as a webcam without unstable third-party solutions. And DSLR and compact camera manufacturers should support the USB Video Class (UVC) protocol over USB on their cameras; then any camera could be plugged in as a webcam. A $300/300€ compact camera would be a fabulous webcam with zoom and aperture control.
Just want to say, fantastic post :)
 

AutomaticApple

Suspended
Nov 28, 2018
7,401
3,378
Massachusetts
Apple could make the lid thicker and use a thicker module.
That would be a step backwards for a premium product, especially coming from Apple.
Even better:
Let ur iPhone take a photo of you, send it to MacBook. MacBook then will use the neural engine to generate a deep fake face of you, mimicking mouth movement etc.
Next time someone wants to make a “video call”, what they see is not you, but a computer generated face that is the best of your look. Apple can then tout their image processing power non-stop forever. Win-win.
Who needs a webcam anyways?
Or just tape a photo of your face onto your webcam. ;)
I'm sure this is sarcasm because isn't that the Memoji?
You cannot send it to the MacBook though.
First off, DrV, I just wanted to thank you so much for taking the time to write that post. I almost un-subbed from this thread a few times recently (it has gotten pretty cyclical in nature), and your post was just an excellent read. Now, I wasn't somebody who needed convincing, but I still learned a few things from it and overall just appreciated the time, expertise and effort that went into it.

So, again, thanks!

Regarding iPhone as a webcam ... 100%. This is such an "obvious" (for Apple to do) suggestion, especially when you think of the larger ecosystem of Handoff-type functionality they've been trying to build up. If my watch can see my phone camera's stream to help take a picture, just make it so my computer can as well.

I have bought EpocCam and it mostly works on days where I want prime video quality for a recorded live stream, but the app and cross-device experience isn't quite as reliable or polished as it could be. Still, it helps solve a "problem" in a way that uses existing products already in my house, helping avoid the need to buy yet another higher resolution camera for a niche use case.

(The iMac's camera is better, so I've been using the iPhone less often in this way. But the iPhone is still the higher quality camera option.)
I like the idea, but latency would be a major problem.
A camera bump would look absolutely hideous in a notebook form factor.

Does the camera bump look great on an iPhone? No. But there is good reason to prioritise function over form on such a handheld device.
Does the camera bump look great on an iPad? No. But there's still enough reason to prioritise function over form on such a handheld device.
Does the bump look good on the iPhone smart battery case? No. But again...

Is there enough reason to make notebooks thicker, and thus heavier, for a webcam? There isn't, IMO, because there are enough alternatives around for high-quality video-conferencing. Among them professional (stationary) business conferencing solutions for office use. Or, who would have guessed, iPhones and iPads.
Also, there is a good reason not to create camera bumps on notebooks: they are often transported in cases or backpacks. Third-party camera covers have been associated with cracks in displays, due to the uneven pressure applied to the screen. A camera bump on the back side of the display is prone to do something similar.
Camera bumps breaking the display? Sounds like a PR nightmare.
No, not Memoji. Just a virtual face looking strikingly similar to your own face thanks to AI. It can even reach the level of tricking Face ID into believing it is you. That's what I'm talking about.
That sounds a bit too complicated, doesn't it?
And that's exactly what a Memoji is.
As far as I'm concerned, AI has little to nothing to do with Memoji.
I’ve got the Logitech 4K webcam that attaches to the XDR display. Trust me, it makes you look better than everyone else’s grainy video with poor low-light performance.
Yes, on their Windows laptops. ;)
I would not mistake a clearly animated face with a real human’s face. And I cannot use Memoji to unlock my device using Face ID. It has to be so similar that everyone seeing that face would think that is you.
Wouldn't that result in a number of security concerns?
I was surprised to hear you say that what limits the sharpness of modern camera systems is the diffraction limit, rather than the engineering (which would include the engineering of the lens). It is my understanding that lens quality still does matter when it comes to sharpness (especially away from the center of the lens, which is important when you are shooting at wider apertures), and that diffraction-limiting only comes into play when you stop down to sufficiently small apertures. See, for instance:


It seems the key engineering constraint that limits picture quality for these webcams is sensor size (=image size), which is in turn limited by focal length (i.e., lens-sensor distance). Could you please give the formula for image size as a function of focal length, FOV, aperture, and aspect ratio?
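For reference, the relation follows from basic pinhole geometry: the horizontal extent of the image is w = 2·f·tan(FOV/2), the aspect ratio fixes the height, and the aperture doesn't enter at all. A quick sketch, assuming the FOV quoted earlier in the thread is the horizontal one (`sensor_size_mm` is just an illustrative helper, not anything from a camera spec sheet):

```python
import math

def sensor_size_mm(focal_length_mm, fov_deg, aspect=(16, 9)):
    """Sensor (image) dimensions for a given focal length and horizontal FOV.

    Pinhole geometry: the horizontal extent of the image is
    w = 2 * f * tan(FOV / 2); the height follows from the aspect ratio.
    """
    w = 2.0 * focal_length_mm * math.tan(math.radians(fov_deg) / 2.0)
    h = w * aspect[1] / aspect[0]
    return w, h

# Plugging in the guessed MBA specs from earlier in the thread
w, h = sensor_size_mm(1.5, 50)
print(f"sensor ~ {w:.2f} mm x {h:.2f} mm")  # ~ 1.40 mm x 0.79 mm
```

That sub-1.5 mm sensor width is exactly why the thickness of the lid (via the focal length) caps the usable sensor area.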
This thread is turning into a science lecture. :p
Thank you for your helpful post which sent me to the Apple Support article or other "how to" guide which explains how to do this as somebody who is ignorant of the feature. Or even the name of the feature so I know what to search for on Google.

If you mean the Continuity Camera feature for scanning a document or taking a photo from your iPhone/iPad within a note on macOS ... sure, I guess that sort of meets one definition of what I stated. That workflow, by the way, requires you to go interact with your external device, take the photo/scan from its display and confirm it's ready, before then seeing it embedded back in the note where you started. Hardly a live stream of the cam to the Mac to use as a webcam. Nor is it even streaming what the camera sees to the device (unlike my Apple Watch example, which does do this.)

But literally nobody has been discussing that sort of use of the camera in this whole thread. We're here debating and discussing video conferencing and recording use cases.

Maybe you mean how you can use QuickTime to record your iPhone's screen? Of course, that shows the rest of the OS's overlays and doesn't actually make it appear as an option in something like FaceTime, Photo Booth or Zoom. Yes, in Zoom I could share the QuickTime window as a screen share, so long as somebody else isn't screen sharing.

If you can send an example along, that would be greatly appreciated. I had referenced EpocCam as a third-party solution for the feature proposed, i.e. using an iPhone as a higher-quality webcam. (EpocCam was actually mentioned first by the other poster, who initially suggested a solution could be built for this use case.)
I often use QuickTime for that purpose. Serves me well in most situations.
I meant Continuity Camera. I know you're referring to a live webcam, but I don't think there are any PC manufacturers whose phones can be used as live cams for their PCs. Seeing as Microsoft no longer makes phones and Google only has Chromebooks, I don't think Apple is out of place here in not turning iPhones into webcams for the Mac.

Also, it's more than just creating the feature: latency is something that needs perfecting, and it also needs to be something more than just niche users would want. I tried EpocCam once when I wanted to use my iPhone during a Zoom call. That software was so flaky, and it had a lot of upset customers waiting for updated drivers, especially for macOS Catalina. I couldn't even remove it; I had to do a full erase and reformat of my Mac.
Microsoft makes the Surface Duo and Chrome OS is on a variety of other devices.

You don't need to make the sensor thicker to improve the camera. You can make the sensor taller and wider to accommodate more (and bigger) pixels. The thickness can remain exactly what it is now. I think you're thinking about changes to the lens.
At this point, I'm lost.
OBS on the Mac: plug in the cable and use Airmix Solo or the Fullscreen Camera app on the phone to output a clean video image. Works very well.

But yeah, Apple should really just allow the Phone as a webcam rather than just as a capture source.
Use a 3rd party tool in the meantime.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
You don't need to make the sensor thicker to improve the camera. You can make the sensor taller and wider to accommodate more (and bigger) pixels. The thickness can remain exactly what it is now. I think you're thinking about changes to the lens.
Yes, you don't need to make the sensor itself thicker. But I believe that, everything else being equal, a larger sensor requires a longer focal length (lens-sensor distance), which in turn increases the thickness of the camera module.
 

s66

Suspended
Dec 12, 2016
472
661
It isn't just that.

I use a 5Ds R or 7D Mark II through FaceTime and it's night and day, regardless of whether I'm facing the morning sun on a cloudless day or have large soft studio lights.

I always get asked how come my video looks so sharp, vibrant and awesome.
Yeah, just imagine the camera bulge if Apple were to incorporate something the size of a Canon 5Ds R or 7D Mark II + a decent lens into the lid of a laptop.

The essence of why such a setup is better is simple: size. Both for allowing more light into the lens, as it's larger, and for allowing a lens that's much more advanced (the lenses alone cost as much as a laptop, if not many times more). And then the sensor: in one case it's full frame, 36x24 mm; in the other it needs to fit inside the lid of an ultra-portable laptop. And you cannot shrink pixel size without dramatically affecting image quality, as the individual pixels pick up more noise as they get smaller (that's the physics of how sensors work).

To get back to the OP: yes y'all are exaggerating big time.
There simply is no way to put a much better camera into the extremely tiny amount of space available in the lid of the laptops. Even an iPhone is many times thicker than the lid of the laptop, and its camera + tiny lens simply will not fit without a serious camera bump. And who'd want that on their laptop - or a thicker lid - or a camera shooting from below, letting everyone have a look up your nose.

I just hope Apple's leadership and engineering are smart enough not to listen to the whining.

If you want better quality conference calling:
- set the light on your face - lots of it
- do not add more light to the background than onto your face
- get an Internet connection with a significant amount of upstream bandwidth

If that's still not good enough: hook up an external camera. You'll notice it'll cost as much as the new laptop (if not a lot more) and will not fit anywhere inside a bag that snugly fits the laptop...
 
Last edited:
  • Like
Reactions: sean+mac

Hexley

Suspended
Jun 10, 2009
1,641
505
Yeah, just imagine the camera bulge if Apple were to incorporate something the size of a Canon 5Ds R or 7D Mark II + a decent lens into the lid of a laptop.

The essence of why such a setup is better is simple: size. Both for allowing more light into the lens, as it's larger, and for allowing a lens that's much more advanced (the lenses alone cost as much as a laptop, if not many times more). And then the sensor: in one case it's full frame, 36x24 mm; in the other it needs to fit inside the lid of an ultra-portable laptop. And you cannot shrink pixel size without dramatically affecting image quality, as the individual pixels pick up more noise as they get smaller (that's the physics of how sensors work).

To get back to the OP: yes y'all are exaggerating big time.
There simply is no way to put a much better camera into the extremely tiny amount of space available in the lid of the laptops. Even an iPhone is many times thicker than the lid of the laptop, and its camera + tiny lens simply will not fit without a serious camera bump. And who'd want that on their laptop - or a thicker lid - or a camera shooting from below, letting everyone have a look up your nose.

I just hope Apple's leadership and engineering are smart enough not to listen to the whining.

If you want better quality conference calling:
- set the light on your face - lots of it
- do not add more light to the background than onto your face
- get an Internet connection with a significant amount of upstream bandwidth

If that's still not good enough: hook up an external camera. You'll notice it'll cost as much as the new laptop (if not a lot more) and will not fit anywhere inside a bag that snugly fits the laptop...
Have you even read my reply? Never have I stated that Apple should put a larger camera into the lid of the MacBook.

What I was pointing out was that camera makers have released apps to let their cameras serve as webcams. If you're inclined to do so, check whether your point & shoot, DSLR or mirrorless is included.

Unless of course your point is that you want to complain, or to point out that my supposed idea is wrong. Which is crazy, because I never said to put a full frame sensor or an L lens into a MacBook lid.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
There simply is no way to put a much better camera into the extremely tiny amount of space available in the lid of the laptops. Even an iPhone is many times thicker than the lid of the laptop, and its camera + tiny lens simply will not fit without a serious camera bump. And who'd want that on their laptop - or a thicker lid - or a camera shooting from below, letting everyone have a look up your nose.
Sure there is, you just need to be a bit creative about it.

For instance, Apple could use a triple-lens/triple-sensor camera module. Specifically, they could use a module that has three of the current-sized 720p webcams lined up in a row and combine the signals from the three digitally. That would reduce noise and improve dynamic range, which would give a better picture when the lighting is non-ideal.

It might not even cost that much, since those 720p webcams are probably inexpensive; though it would add complexity and increase possible points of failure.
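The noise-reduction claim is easy to sanity-check: assuming the three sensors have independent noise of equal strength, averaging their readings should cut the per-pixel noise by √3 (about 42%). A toy simulation of one pixel (the numbers here are arbitrary, just for illustration):

```python
import random
import statistics

random.seed(42)  # deterministic run

def noisy_pixel(signal, noise_sigma):
    """One sensor's reading of a pixel: true signal plus Gaussian noise."""
    return signal + random.gauss(0.0, noise_sigma)

signal, sigma, frames = 100.0, 10.0, 10_000

# One sensor per frame
single = [noisy_pixel(signal, sigma) for _ in range(frames)]

# Three independent sensors per frame, averaged digitally
tripled = [statistics.mean(noisy_pixel(signal, sigma) for _ in range(3))
           for _ in range(frames)]

print(statistics.stdev(single))   # ~ 10   (one sensor)
print(statistics.stdev(tripled))  # ~ 5.8  (10 / sqrt(3), three combined)
```

The same √N argument is why multi-frame "night modes" on phones work: N independent samples of the same scene, whether from extra sensors or extra exposures, shrink the random noise by √N.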

Not saying they necessarily should do this; just pointing out that there are engineering options out there to improve camera quality while staying within the current form factor.
 
Last edited:

aaronhead14

macrumors 65816
Mar 9, 2009
1,246
5,327
Okay then. Give us solid reasons. Out of curiosity, do you have nothing else to complain about? Personally, if I didn't, I would get mad over this too. What else would I have to pointlessly get mad about? ;)
Um, the “solid reason” is that it’s a 720p webcam. In 2020.
 

Migranya

macrumors member
Apr 13, 2020
69
79
I don't know why people are upset with a 720p webcam when all of the videoconferencing services cap the resolution because of bandwidth.

BUT, we know that Apple has a 1080p webcam ready, because it's included in the iMac Pro. So please: maybe we don't need the 1080p webcam, but it's OK to have it for the future.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
I don't know why people are upset with a 720p webcam when all of the videoconferencing services cap the resolution because of bandwidth.

BUT, we know that Apple has a 1080p webcam ready, because it's included in the iMac Pro. So please: maybe we don't need the 1080p webcam, but it's OK to have it for the future.
I think it's more that people don't like the quality of the Mac laptop webcam, and they are assuming that 1080p would improve things (because they know 1080p cameras, like the one on the new iMac, typically do look better), when what's really needed is more total sensor area rather than 1080p per se. Thus, for instance, you would look better-lit and less grainy during a Zoom conference using the 1080p iMac webcam, even if the Zoom stream is 720p, because the 1080p iMac camera has a better sensor, giving less noise and more dynamic range. So the question is how you could increase sensor size for the Mac laptop cameras without increasing focal length, which is limited by the thickness of the lid.
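To put a rough number on the sensor-area point: in the shot-noise limit, SNR scales with the square root of the number of collected photons, and hence (at fixed f-number, exposure and efficiency) with the square root of sensor area. A minimal sketch, with a made-up 4x area ratio purely for illustration:

```python
import math

def snr_gain(area_ratio):
    """Shot-noise-limited SNR scales as sqrt(photons) ~ sqrt(sensor area),
    everything else (f-number, exposure, quantum efficiency) being equal."""
    return math.sqrt(area_ratio)

# Hypothetical: a sensor with 4x the area collects 4x the photons,
# so shot-noise SNR doubles -- even at the same output resolution.
print(snr_gain(4.0))  # 2.0
```

This is why a physically larger 720p sensor can beat a cramped 1080p one in low light: the grain people complain about is mostly photon noise, not pixel count.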
 