I think Apple decided to leave some room for post-sharpening with the iPhone 17 series. It reminds me of my Nikon D700 images, which could look much better after a three-step sharpen: first a slight pass at about a 1.2 px radius, then a modest 0.6 px pass, and finally a strong pass at 0.1 (or 0.2) px. Older DSLRs had an anti-aliasing filter in front of the sensor that softened the image slightly to avoid moiré with sharp lenses, since moiré is difficult or impossible to correct in post.
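If you'd rather script this than click through an editor, here's a minimal sketch of that three-pass approach using Python and Pillow's UnsharpMask filter. Only the radii come from the description above; the percent (amount) and threshold values are my own placeholders for "slight", "modest" and "strong", and the filename is hypothetical.

```python
# Three-pass unsharp mask: wide/slight, medium/modest, fine/strong.
from PIL import Image, ImageFilter

def three_pass_sharpen(img: Image.Image) -> Image.Image:
    # Pass 1: slight amount at a ~1.2 px radius to restore overall bite
    img = img.filter(ImageFilter.UnsharpMask(radius=1.2, percent=50, threshold=2))
    # Pass 2: modest amount at 0.6 px for mid-scale detail
    img = img.filter(ImageFilter.UnsharpMask(radius=0.6, percent=100, threshold=2))
    # Pass 3: strong amount at a very small 0.1 px radius for fine texture
    img = img.filter(ImageFilter.UnsharpMask(radius=0.1, percent=200, threshold=0))
    return img

if __name__ == "__main__":
    out = three_pass_sharpen(Image.open("d700_shot.jpg"))  # hypothetical file
    out.save("d700_shot_sharpened.jpg", quality=95)
```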
The same applies to iPhone 17 Pro JPEG images: the 48 MP photos may appear somewhat blurry (which is normal; it's not a full-frame Sony), yet they look fantastic after resizing to about 16-20 MP and sharpening carefully. I haven't tried any DNG files yet; they may be problematic, as mentioned. It's unlikely to be a lens defect, since a defect usually shows up as non-uniform patches of softness, loss of local contrast, and coma. This looks more like an intentional result of the processing algorithm. For example, the iPhone's 13 mm ultra-wide is inherently soft at the edges, but even that can be managed by selecting the edge area with roughly a 200 px feather and applying a coarse 5 px sharpen until it matches the central area. I hope this becomes a standard part of UWA image processing.
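In the same spirit, here is a minimal sketch of the downsize-then-sharpen step plus the feathered edge sharpen, again with Pillow rather than a GUI editor. The 5 px radius and the ~200 px feather come from the description above; the resize factor, the elliptical centre mask, the percent values and the filenames are assumptions for illustration.

```python
# Downsize a 48 MP JPEG into the 16-20 MP range, sharpen it, then apply a
# coarse 5 px sharpen only near the frame edges through a feathered mask.
from PIL import Image, ImageDraw, ImageFilter

def downsize_and_sharpen(img: Image.Image, scale: float = 0.6) -> Image.Image:
    # 48 MP * 0.6^2 is roughly 17 MP, inside the 16-20 MP range mentioned above
    w, h = img.size
    small = img.resize((int(w * scale), int(h * scale)), Image.LANCZOS)
    return small.filter(ImageFilter.UnsharpMask(radius=0.8, percent=120, threshold=2))

def sharpen_edges(img: Image.Image, feather_px: int = 200) -> Image.Image:
    # Coarse sharpen to be blended in only towards the soft corners/edges
    coarse = img.filter(ImageFilter.UnsharpMask(radius=5, percent=150, threshold=2))

    # Mask: black (keep original) inside a central ellipse, white (use the
    # coarse version) towards the borders, softened by a wide Gaussian feather.
    w, h = img.size
    mask = Image.new("L", img.size, 255)
    draw = ImageDraw.Draw(mask)
    draw.ellipse([int(w * 0.15), int(h * 0.15), int(w * 0.85), int(h * 0.85)], fill=0)
    mask = mask.filter(ImageFilter.GaussianBlur(radius=feather_px / 2))

    # Where the mask is white, take the coarse-sharpened pixels
    return Image.composite(coarse, img, mask)

if __name__ == "__main__":
    photo = Image.open("iphone17_48mp.jpg")   # hypothetical file
    photo = downsize_and_sharpen(photo)
    photo = sharpen_edges(photo)
    photo.save("iphone17_processed.jpg", quality=95)
```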
If an image holds detail, it can always be sharpened later. But if it was harshly demosaiced and oversharpened to the point of a sandpaper look at 100%, with halos around high-contrast edges (as most Android phones do), it becomes much harder to turn into a usable photo.