Excellent...grabbed the new DEB!
Can you explain something that I don't get? If 1920 is roughly 60% of the sensor's horizontal 3264, then why isn't the horizontal FoV 60% of your full-frame video? The native video looks cropped, but not by 60%. It looks like it's cropped because the IS needs some buffer to be able to shift the image to hold it in place. So how does the native video work? They can't double the 1920 to 3840, as that would exceed the sensor width, so I'm confused.
Frankly, not being an Apple developer working on their camera APIs, I don't know the exact algorithm. It's a black box to me and I can only guess based on the output (and extensive knowledge of everything stills / video recording / encoding / sensor tech).
1. As you pointed out, 3264/1920 = 1.7, which is far bigger than the horizontal FoV decrease, which is "only" about 36/30 (mm) = 1.2. This means they can't just be using the center area of the sensor, but must employ some kind of line skipping or even pixel combining.
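To make the mismatch above concrete, here's a minimal sketch of the arithmetic (the 30mm / 36mm equivalent focal lengths are the approximate figures I quoted, not exact Apple specs):

```python
# Sketch of the crop-factor mismatch described above.
sensor_width_px = 3264   # full horizontal resolution of the sensor
video_width_px = 1920    # 1080p video width

# A naive center crop to 1920 px would shrink the FoV by this factor:
pixel_crop_factor = sensor_width_px / video_width_px    # = 1.7

# But the observed FoV loss (approximate 35mm-equivalent focal lengths):
stills_equiv_mm = 30     # stills
video_equiv_mm = 36      # video (narrower)
fov_crop_factor = video_equiv_mm / stills_equiv_mm      # = 1.2

# The readout region in sensor pixels implied by the observed FoV change:
readout_width_px = sensor_width_px / fov_crop_factor    # = 2720 px

# 2720 px still has to be mapped down to 1920 px, so some line skipping
# or pixel combining must happen inside that readout region:
downscale_factor = readout_width_px / video_width_px    # ~ 1.42
print(pixel_crop_factor, fov_crop_factor, readout_width_px, downscale_factor)
```

That is, the sensor seems to read out a roughly 2720-pixel-wide region (wider than a 1:1 crop), which can't reach 1920 without combining or skipping pixels.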
That Apple's algorithm is pretty advanced is also shown by its digital zoom (as of iOS 7+) being as excellent as Nokia's famous PureView zoom. I've done a lot of well-controlled tests (see my camera app test series; for example
THIS and, even more importantly,
THIS) and found it excellent. Which, again, means Apple uses an advanced algorithm.
However, no matter how advanced their algorithm is, it surely isn't doing true, full upsampling the way Nokia does, not even in the already-cropped region. This is certainly proved by, say, the iPhone 5, which just wouldn't be able to deliver steady 1080p30 if the cropped video area were upsampled. (The 5s would be able to: it seems to deliver excellent, frame-drop-less 1664x1224p30 with full(!) sensor upsampling.) Also, the poor low-light performance, compared to both stills and my upsampler tweak, clearly shows they just can't be upsampling; upsampling would result in significantly better low-light performance even in Apple's default video mode.
Interestingly, in spite of Apple's not making use of every single pixel, the output doesn't show the artifacts of traditional line skipping: there's no aliasing in Apple's video, yet it delivers full 1080p resolution, meaning Apple also didn't just apply a simple low-pass filter to get rid of the aliasing artifacts (that would have cost resolution).
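To illustrate why plain line/column skipping aliases while pixel combining doesn't, here's a toy 1-D sketch (my own illustration, nothing to do with Apple's actual pipeline): a fine alternating black/white pattern is downscaled 2:1 by skipping vs. by averaging ("binning").

```python
# Toy 2:1 downscale of one pixel row: skipping vs. binning.
row = [0, 255] * 8   # fine pixel-level black/white detail, 16 samples

# Line/column skipping: keep every 2nd pixel.
skipped = row[::2]

# 2:1 binning: average adjacent pixel pairs.
binned = [(row[i] + row[i + 1]) // 2 for i in range(0, len(row), 2)]

print(skipped)  # [0, 0, 0, 0, 0, 0, 0, 0]  -> false solid black (aliasing)
print(binned)   # [127, 127, ...]           -> detail averaged to gray, no false pattern
```

Skipping turns the fine pattern into a flat false tone (and in 2-D, into moiré on real detail), while binning merely averages it away; since Apple's video shows neither moiré nor the softness of a blunt low-pass filter, they're presumably doing something smarter than either.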
2. The cropped-out area (I'm speaking only of the horizontal areas on the left and right; the vertical areas are further cropped because of the additional 4:3 → 16:9 crop) isn't used only for image stabilization. This is why the video is still cropped even if you shoot with a video recorder app that has IS disabled: disabling IS does somewhat increase the FoV, but in no way to the degree of my upsampler tweak. I've also proved this in this article:
https://forums.macrumors.com/threads/1600908/
This is also a major problem with Apple's implementation. Read: if you don't upsample with my tweak (only available jailbroken, of course, and on all non-iPhone 5s devices it severely decreases the framerate) or don't switch back to 640x480 recording, there's no way to get as wide a FoV as possible, not even after getting rid of the additional sensor area used by all kinds of electronic image stabilization, including Apple's.