
joedan76
macrumors member · Original poster
My wife and I both have XRs; I am running the latest iOS 13 public beta.

I noticed portrait photos in the stock Camera app look slightly better, with the blur better defined and less hit and miss around the edges and surrounding objects.

This is nothing scientific, but I believe there have been some improvements to the portrait effects matte.

I took a number of near-identical photos on each phone and used Halide to check out the portrait effects matte. (Lenses were cleaned on both phones.)

iOS 13 appeared to consistently have...
A more defined outline of the person
Better quality around the hairline
A more accurate outline of the person that excludes nearby objects

Took at least half a dozen photos and am positive they have improved the algorithms.
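
For anyone who wants to inspect the matte themselves without Halide, here's a rough sketch of pulling the portrait effects matte out of a saved portrait photo with ImageIO and AVFoundation (assuming the file still embeds its auxiliary data; the file path and function name here are just placeholders):

```swift
import AVFoundation
import CoreVideo
import ImageIO

// Rough sketch: read the embedded portrait effects matte from a portrait photo on disk.
// Assumes the file was shot in Portrait mode and still carries its auxiliary data.
func portraitMatte(at url: URL) -> AVPortraitEffectsMatte? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any]
    else { return nil }
    return try? AVPortraitEffectsMatte(fromDictionaryRepresentation: auxInfo)
}

// Usage (hypothetical path): check the matte's resolution, or render mattingImage
// as a grayscale image to compare edge quality between the two phones.
if let matte = portraitMatte(at: URL(fileURLWithPath: "/path/to/portrait.heic")) {
    let buffer = matte.mattingImage
    print("Matte size:", CVPixelBufferGetWidth(buffer), "x", CVPixelBufferGetHeight(buffer))
}
```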

Anyone else seeing this?
Camera angles were not 100% exact, but the results are reproducible. See if you can tell which is iOS 13.
 

Attachments

  • 1B2A3EFB-07FD-47D8-AE23-F9E787CF9712.jpeg
  • 9B09A4E9-8C60-47CE-A25E-C2CD13648996.jpeg
I do see a difference, especially the pirate sword missing in the second person’s head.

Had to say it lol
 
Lol, that’s a fan, and in iOS 13 it was correctly recognised as background.
 
Great, but it would be really cool if Apple unlocked shooting portrait photos of inanimate objects too!
Like in Halide and Focos :)
 
That would be great.

Apple are unlocking semantic segmentation mattes as part of the A12 Neural Engine capabilities in iOS 13, allowing recognition of hair, skin and teeth in addition to the portrait matte.

Will be interested to see if this improves portrait mode even further.

Can't see anything in the WWDC tech videos that would suggest they are unlocking the ability to detect non-people subjects, though.

ref. https://developer.apple.com/videos/play/wwdc2019/260/
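
For anyone curious, on an iOS 13 device you should be able to pull those new mattes out of a photo with something like this (just a sketch, and it assumes the photo was captured with segmentation mattes enabled; the function name is mine):

```swift
import AVFoundation
import ImageIO

// The three semantic segmentation matte types iOS 13 exposes alongside the portrait matte.
let segmentationAuxTypes: [CFString] = [
    kCGImageAuxiliaryDataTypeSemanticSegmentationHairMatte,
    kCGImageAuxiliaryDataTypeSemanticSegmentationSkinMatte,
    kCGImageAuxiliaryDataTypeSemanticSegmentationTeethMatte
]

// Sketch: collect whichever segmentation mattes a saved photo actually contains.
func segmentationMattes(at url: URL) -> [AVSemanticSegmentationMatte] {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return [] }
    return segmentationAuxTypes.compactMap { auxType -> AVSemanticSegmentationMatte? in
        guard let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, auxType)
                as? [AnyHashable: Any] else { return nil }
        return try? AVSemanticSegmentationMatte(fromImageSourceAuxiliaryDataType: auxType,
                                                dictionaryRepresentation: auxInfo)
    }
}
```

On the capture side you have to opt in first (e.g. photoOutput.enabledSemanticSegmentationMatteTypes = photoOutput.availableSemanticSegmentationMatteTypes) before the mattes get embedded at all.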

Also found this article quite interesting on how portrait mode for non-people subjects is done in Halide on the XR.

https://blog.halide.cam/iphone-xr-a-deep-dive-into-depth-47d36ae69a81

Looks like it uses Focus Pixels, but the depth info is at a much lower resolution than from a dual-lens camera, which explains why some of my photos can be hit and miss, as they don't include a portrait matte.

Looks like the next XR with a dual lens may solve this issue somewhat with a normal-resolution depth map from the telephoto lens. I still live in hope that, with all iPhones coming this year having at least dual cameras, the ability to generate a non-human matte will eventually be added to portrait photos on all phones, greatly improving the outlines in non-people photos.
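
If you want to see that resolution gap for yourself, the embedded depth/disparity map can be read the same way as the mattes; here's a sketch (the path is a placeholder, and since the XR seems to store disparity rather than depth it tries both auxiliary-data keys):

```swift
import AVFoundation
import CoreVideo
import ImageIO

// Sketch: read whatever depth/disparity map a photo embeds and print its resolution,
// to compare the XR's Focus Pixels output against a dual-lens depth map.
func embeddedDepth(at url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    for auxType in [kCGImageAuxiliaryDataTypeDisparity, kCGImageAuxiliaryDataTypeDepth] {
        if let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, auxType)
                as? [AnyHashable: Any],
           let depth = try? AVDepthData(fromDictionaryRepresentation: auxInfo) {
            return depth
        }
    }
    return nil
}

// Usage (hypothetical path):
if let depth = embeddedDepth(at: URL(fileURLWithPath: "/path/to/photo.heic")) {
    let map = depth.depthDataMap
    print("Depth map:", CVPixelBufferGetWidth(map), "x", CVPixelBufferGetHeight(map))
}
```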
 
Do you guys think the next XS will allow us to do portraits on the “regular” lens like the XR does right now, but with “real” DoF thanks to the added third lens? Very likely, right?
 
I would put money on it. The wide angle is used to provide depth data for the portrait telephoto lens. Can't see why the ultra-wide can't provide depth data for the standard wide lens.

I also doubt Apple would introduce the ability for the current XS/XS Max flagships to take portraits on the wide-angle lens, as the XR's portrait photos are not perfect with non-humans, relying on lower-resolution depth data from only the Focus Pixels.

That's my 2 cents, though.
 
Roger, I think exactly the same.
 

Actually, the latest update to the Focos app pretty much enables single-lens portrait mode on all iPhones.
 