I mean iOS. The one that runs on iPhone, iPad, etc. macOS at least can be “hacked” somewhat, but no such luck for iOS devices unless I jailbreak.

When you install the app in macOS Big Sur it shows you how to give it the permission it needs.
If this ‘feature’ is so important to you, I’d recommend switching to one of the plethora of Android phones that, I’m sure you know, have said feature. If you want to stick with an iPhone, as I said, there are a ton of third-party apps on the App Store.
Someone is getting nit-picky and I fail to understand your motivation.
For some of mine I use a tripod, for others I do not. Mostly I do not. The issue exists for both, and I have posted both.
It originally came to light a few betas ago in 14.5. It was first observed in low-light photos. The suspected cause is a change to the post-picture processing in stock iOS.
Not a tripod. Both handheld but very stable.
Not portrait mode. Just normal 2.5x.
The focus was on the tree in both
I’m getting nit-picky because (1) you and the other guy are using extreme language to describe a nearly imperceptible difference in the photos (“watercolors”), and (2) you are doing so (at least he or she is) without controlled conditions. Shooting handheld in low light is virtually guaranteed to generate motion blur. But sometimes you get lucky. Using a tripod eliminates that source of possible image “contamination”.
Bottom line: I do not like blanket assertions made without adequate experimental evidence.
I'm pretty sure it's not motion blur. It was the iOS camera app using the 1x lens instead of the 2.5x lens and applying digital zoom, which can easily be verified in the EXIF data.
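As a minimal sketch of that EXIF check: the aperture and focal-length fields reveal the physical lens. The f/1.6 wide and f/2.2 2.5x telephoto figures below are the published iPhone 12 Pro Max specs, but exactly how iOS populates `FocalLengthIn35mmFilm` under digital zoom is an assumption here; check your own files.

```python
# Sketch: classify which lens actually took a shot from two EXIF fields.
# Assumed hardware: iPhone 12 Pro Max wide (f/1.6) and 2.5x tele (f/2.2).

def lens_used(exif):
    """Classify the lens from FNumber and FocalLengthIn35mmFilm."""
    f_number = exif["FNumber"]
    focal_35mm = exif["FocalLengthIn35mmFilm"]
    if abs(f_number - 2.2) < 0.05 and focal_35mm >= 60:
        return "2.5x telephoto"
    if abs(f_number - 1.6) < 0.05:
        # Wide lens: a ~65 mm equivalent reading means the app cropped
        # (digitally zoomed) the 1x frame rather than switching lenses.
        return "1x wide + digital zoom" if focal_35mm >= 60 else "1x wide"
    return "unknown"

# A "2.5x" shot that was really taken on the wide lens shows f/1.6:
print(lens_used({"FNumber": 1.6, "FocalLengthIn35mmFilm": 65}))
# -> 1x wide + digital zoom
```

Any EXIF viewer (or Pillow's `Image.getexif()`) will give you these values for a real photo.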
I wanted to share a thought, now that you mention that digital zoom is basically an interpolation. Based on that idea, couldn’t Apple improve this digital zoom using AI thanks to the 16 cores of the A14 Neural Engine? Couldn’t we see better digital zoom in future iOS releases, especially on iPhones with more Neural Engine cores, like those with A12-A13 SoCs and even more on A14 devices?

That makes sense too. Interpolation (which is what digital “zoom” is: basically guessing) will cause that effect also.
All of this goes to show that the Apple apps are good for basics but if you want better results, go with a product like Halide.
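To make the interpolation point concrete, here is a toy 1-D example: digital zoom adds no optical information, so every new pixel is a guess computed from its neighbours. (An AI upscaler would replace this fixed rule with learned guesses; this sketch is purely illustrative, not how iOS does it.)

```python
# Toy digital "zoom": upscale a row of pixel values by linear
# interpolation. The in-between pixels are invented, not captured.

def zoom_1d(row, factor):
    """Upscale a row of pixel values by an integer factor."""
    n = (len(row) - 1) * factor + 1
    out = []
    for i in range(n):
        pos = i / factor              # position in the original row
        lo = int(pos)
        hi = min(lo + 1, len(row) - 1)
        t = pos - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

print(zoom_1d([10, 20, 40], 2))   # -> [10.0, 15.0, 20.0, 30.0, 40.0]
```

The interpolated values (15.0 and 30.0 here) smooth over whatever detail fell between the real samples, which is exactly the soft, painted look people describe.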
It has nothing to do with pic post processing in the stock iOS app. Simply put, sometimes the stock app doesn't actually use the 2.5x lens when you switch to 2.5x zoom. The logic behind it is that if the camera app believes the lighting is too low for the 2.5x lens to take a reliable picture (without motion blur etc), it uses the 1x lens. For the average user it usually produces a better result.
If you want to force the iPhone to use the 2.5x lens, you need to use a third party app. It has always been like that. It could be that iOS 14.5 tweaked the logic a bit and uses the 2.5x lens more conservatively as there may have been user complaints about blurred pictures in the dark with the 2.5x lens.
So no, nothing to do with pic post processing. It's just the logic deciding which lens to use that may have changed a bit.
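The decision logic described above might look something like the following speculative sketch. Apple's actual heuristic, thresholds, and light estimation are not public; the lux cutoff and function names here are made up for illustration.

```python
# Speculative sketch of the lens-selection heuristic described above.
# The threshold is hypothetical; Apple's real logic is not public.

LOW_LIGHT_LUX = 100   # hypothetical cutoff for "too dark for the tele"

def pick_lens(zoom_request, estimated_lux):
    """Return (lens, digital_zoom_factor) for a requested zoom level."""
    if zoom_request >= 2.5 and estimated_lux >= LOW_LIGHT_LUX:
        return ("2.5x telephoto", 1.0)
    # Fall back to the faster f/1.6 wide lens and crop to fake the zoom.
    return ("1x wide", zoom_request)

print(pick_lens(2.5, 300))  # bright scene -> ('2.5x telephoto', 1.0)
print(pick_lens(2.5, 30))   # dim scene   -> ('1x wide', 2.5)
```

If 14.5 simply raised that kind of threshold, the wide-plus-crop path would fire in scenes that previously used the real telephoto, which matches what people are reporting.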
So when 14.5 b1 came out this issue did not exist for normal to bright light. Somewhere around beta 4 it became an issue; I would have to go back to see when I caught it. You can force the use of the 2.5x lens in Camera and the issue still exists. The default to the primary lens vs. the 2.5x lens was supposed to be for low-light photos, however in the last few betas this is now occurring even for well-lit pics.
Whether I use the primary or 2.5x lens, I still see the same issue, which is why I feel it is more than just lens selection.
ManuCH is right here; the iPhone has a very different opinion of what 'well lit' means, and it uses the main camera and digitally zooms in because it believes that will be a better picture. It's not a bug per se, just Apple being aggressive in their usual philosophy of 'knowing what's best' for the consumer. The main camera apertures for all the iPhone 12s got widened to f/1.6, which should allow for 27% more light, and the software seems to be biased towards it because of this. So third-party apps are your best bet to force what you want.
Even with the right camera, iPhones' computational photography tends to smudge out finer details in an attempt to reduce noise. This is noticeable when you zoom in, but likely looks 'cleaner' if you're just scrolling through social media (which is the primary audience Apple is adjusting for).
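That noise-vs-detail trade-off can be shown with a toy filter: a plain 3-tap average (a crude stand-in for the far more sophisticated processing iPhones actually do) suppresses noise but also flattens a genuine one-pixel detail.

```python
# Toy illustration of noise reduction smudging fine detail: a 3-tap
# averaging filter flattens a sharp single-pixel highlight.

def smooth(row):
    """Replace each pixel with the mean of itself and its neighbours."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - 1): i + 2]
        out.append(sum(window) / len(window))
    return out

row = [50, 50, 200, 50, 50]   # a sharp single-pixel highlight
print(smooth(row))            # -> [50.0, 100.0, 100.0, 100.0, 50.0]
```

The 200-value highlight gets smeared into three 100-value pixels: less "noisy", but the fine detail is gone, which is roughly the smudging being described.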
If you're shooting in the dark, it's dark unless the flashlight is on. Can you see in the dark? If I can't use the iPhone flashlight to get around in the dark, I have to hold two things: a flashlight and the iPhone.
Why are people trying to complicate such a simple feature? The light is already on the phone. I would like the option to use it like a flashlight while taking a picture.
My problem has been finding a good camera app that does the SAME as the stock camera app: photo, panoramic, square, and then also video with the same slo-mo and time-lapse options. Is there a good app that does all that, paid or not?

Save yourself the agita and use a different app. There are lots of good choices available, and they bring other advantages as well. Once I adopted a third-party camera app, I haven't looked back.
I encountered your use case just yesterday, while trying to take a manual-focus macro shot of the tiny, poorly contrasted text on an item. Lighting the subject first, instead of relying on a flash pulse, allowed a properly focused shot to be taken without having to take multiples with the flash and then pick the best. This was in a room lit with sunlight; it wasn't even dark.
That's something most users won't encounter, so it's not likely to be a capability that the stock app will include.
Alright, at this point I’m gonna pronounce that this bug has been fixed.

Don’t want to jinx it, but so far all of my iMessage links have shown up properly…
And to me, the entirety of iOS 13 was a dumpster fire and the first beta of iOS 14 was amazing. Different experiences for everyone, it seems.

iOS 14 is a cluster****. This is the first time that I'm really waiting for the next iOS before the summer starts. It's also the first time I regret upgrading to an iOS version.
Haven’t had a single link show up incorrectly on Beta 7.
I believe this problem began before the 14.5 betas. I’m using iOS 14.4.2 on my iPhone 12 Pro Max. I selected the 2.5x lens with either the stock Camera app or ProCam 8 and took photos of the same object indoors with incandescent lighting. ProCam used the f/2.2 lens but Camera used the f/1.6 lens.
Haha, of course not.

Finally! Did Apple respond to any of your feedback messages saying they made changes that might have fixed it?