
kenoh

macrumors 604
Agreed. I do architecture, often with perspective control lenses, so my needs probably aren't typical, nor are they well served by the iPhone. In the shot below I used tilt to move the plane of focus parallel with the facade and shift to prevent keystoning. Even with the tilt I had to stop down a fair bit.

[Image: tilt-shift shot of a building facade]
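
For anyone curious how much a small tilt actually moves things, the Scheimpflug "hinge rule" gives a rough feel: the plane of sharp focus pivots about a line a distance J = f / sin(tilt) from the lens. A minimal sketch, assuming a 24 mm tilt-shift lens purely for illustration (not necessarily the lens used here):

import Foundation

// Hinge rule for a tilted lens: the plane of sharp focus pivots about a line
// a distance J = f / sin(tilt) from the lens. The 24 mm focal length is only
// an assumption for illustration, not necessarily the lens used above.
let focalLength = 24.0                                  // mm
for tilt in [1.0, 2.0, 3.0, 5.0] {                      // degrees of tilt
    let j = focalLength / sin(tilt * Double.pi / 180)   // hinge distance, mm
    print(String(format: "%.0f° tilt -> hinge line %.2f m from the lens", tilt, j / 1000))
}

A couple of degrees of tilt already puts the hinge line well under a metre from the camera, which is why small tilts are enough to swing the focus plane onto a facade.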


:)

[Image]
Oh yes, that's what I am talking about.
 

ApfelKuchen

macrumors 601
I was just looking at this TechCrunch review of the new DoF effect included in the iOS beta, and the effect looks all wrong to me. The photos that have a foreground (the hand holding a strawberry, the chicken decal, the paint cans) don't look right.

I finally realized it's because there is no foreground blur. With optical DoF you end up with a certain range of distances that is in focus. From what I can see, the iPhone 7 Plus appears to apply blur only to the background, so everything from the subject forward is in focus, which in my opinion looks odd. Kind of like how 48 or 60 fps movies just don't 'look right' compared to 24 or 30 fps.
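
For a rough sanity check on that point, the standard thin-lens DoF formulas do put sharpness in a band both in front of and behind the focus plane. A minimal sketch, assuming an arbitrary 50 mm f/1.8 focused at 1 m purely for illustration (nothing to do with the actual test shots):

import Foundation

// Thin-lens depth-of-field limits. All distances in millimetres.
// The 50 mm f/1.8 focused at 1 m is an illustrative assumption,
// not a guess at what the TechCrunch shots used.
func depthOfField(focalLength f: Double, fNumber n: Double,
                  focusDistance s: Double, circleOfConfusion c: Double)
    -> (near: Double, far: Double) {
    let h = (f * f) / (n * c) + f                 // hyperfocal distance
    let near = s * (h - f) / (h + s - 2 * f)      // nearest sharp point
    let far = s < h ? s * (h - f) / (h - s) : Double.infinity
    return (near, far)
}

let (near, far) = depthOfField(focalLength: 50, fNumber: 1.8,
                               focusDistance: 1_000, circleOfConfusion: 0.03)
print(String(format: "Sharp from %.0f mm to %.0f mm", near, far))
// Prints roughly "Sharp from 980 mm to 1021 mm": a couple of centimetres
// in front of the subject are already blurred, not just the background.

With a real lens, anything closer than the near limit goes soft too, which is exactly the foreground blur the simulated effect seems to leave out.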

Did anyone else notice this, or am I missing something?

Best,
-Rick

"You're being too picky" isn't quite the right way to say it, but of the sample shots with significant foreground, I don't see any where having the foreground out of focus would be all that desirable (to me). Now, it would be "natural" to have some foreground out of focus - the aesthetic to which we're accustomed, an aesthetic determined in part by the traditional limitations of the medium.

The thing is, whether it's conventional bokeh or Apple's effect, both are unnatural compared to the way our eyes and minds see the world. Since our minds are programmed to ignore unnecessary details, it takes a conscious effort to simulate photographic DOF with our own vision. Our focus of interest is in focus, our peripheral vision is out of focus, and we generally ignore the periphery. I just "added" an out-of-focus foreground (my hands and keyboard) to my current scene (my iMac's display) by extending my awareness to my peripheral vision. Usually, I'm not aware of my hands and keyboard when I type, as they'd just be a distraction. (Consider how often badly flawed photos are taken because the photographer's mind ignored unwanted elements in the shot.)

I'd consider this effect to be akin to HDR, only it's compositing the characteristics of both a wide-angle and a tele. Unlike with HDR, I've yet to see examples that seem truly unnatural (and, of course, "unnatural" is not always a bad thing when it comes to creative expression). Compared to tilt-shift or "antique" filters, it seems less susceptible to over-use or inappropriate use. Considering the times I've cursed because I couldn't get enough DOF in the foreground... it would be nice to have this tool in the kit.

Whether it's an entire camera or a specific lens, we're constantly selecting tools based on our needs and vision. My iPhone can create an in-camera panorama; my "good" camera can't. My iPhone can shoot slo-mo; my "good" camera can't. But my good camera has a much larger sensor, and I can slap a long tele onto it when I need to shoot wildlife or sports. I have far more options when I have both cameras at my disposal.
 