Martin Wilson
Jun 28, 2022
When you point your iPhone 11, 11 Pro, or 11 Pro Max at a subject, in either portrait or landscape orientation, you'll see additional detail drawn from the lens with the next-widest field of view. To use this over-capture capability, open the Camera settings in iOS 13 and turn on the Photos Capture Outside the Frame option.
To begin, go to Settings > Camera, find the section titled "Composition," and enable Photos Capture Outside the Frame to activate the over-capture feature. Apple ships this option disabled by design, even though the corresponding option for videos is on by default. (There is a likely explanation for this, which I will discuss later in this post.)
This occurs regardless of whether you are using the 1x mode or the 2x mode. The extra content appears to the left and right of the frame when shooting in landscape, and above and below it when shooting in portrait. The Camera app displays the captured content beyond the frame at a slightly dimmed, less distinct level of clarity. That additional imagery comes from the next-wider camera on the phone, scaled to match the main image. No detail is lost from the main image, but the additional data is downsampled, meaning its effective pixel density is lower. In 1x mode, the wide-angle lens captures the main picture and the ultra-wide-angle lens, the next lens "down," supplies the surrounding content; when shooting at 2x on a Pro model, the telephoto lens captures the main picture and the wide-angle camera supplies the surrounding content.
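To get a feel for why the outside-the-frame content has lower pixel density, consider a rough back-of-the-envelope sketch. Both cameras use 12-megapixel sensors, but the wider lens spreads those pixels over a larger field of view. The field-of-view figures below are illustrative assumptions, not Apple's published specifications, and the flat pixels-per-degree model ignores lens projection, but the proportions convey the idea:

```python
# Rough sketch of why outside-the-frame content is lower resolution.
# FOV values are illustrative assumptions, not Apple's official specs.

SENSOR_W_PX = 4032          # horizontal pixels of a 12 MP sensor (both cameras)
WIDE_HFOV_DEG = 69          # assumed horizontal FOV of the wide lens
ULTRA_HFOV_DEG = 120        # assumed horizontal FOV of the ultra-wide lens

# Angular pixel density: how many sensor pixels cover each degree of view.
wide_density = SENSOR_W_PX / WIDE_HFOV_DEG
ultra_density = SENSOR_W_PX / ULTRA_HFOV_DEG

# When the ultra-wide image is scaled so its central region lines up with
# the wide frame, the surrounding content carries only this fraction of
# the main image's linear detail.
relative_detail = ultra_density / wide_density

print(f"wide: {wide_density:.0f} px/deg, ultra-wide: {ultra_density:.0f} px/deg")
print(f"outside-frame detail ~ {relative_detail:.0%} of the main image")
```

Under these assumed numbers, the border content ends up with a little over half the linear pixel density of the main photo, which matches the slightly softer look of the dimmed region in the Camera preview.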
When there is insufficient light for the next-wider lens to function properly, the image area outside the frame will not be visible. Because the ultra-wide-angle lens gathers significantly less light than the wide-angle lens, it cannot contribute to the image in dim conditions. Even when I tried it in fairly dark indoor settings, I could still see the picture outside the frame; I had to find a quite dim location for it to disappear.
After shooting, you can access this extra detail inside Photos. Images containing data beyond the visible frame are marked with a distinctive badge in the top-right corner of the screen, but only when viewing the full image, not its thumbnail preview. The badge takes the form of a square with a dotted border and a star in its upper-right corner. It's easy to miss.
One thing that could be puzzling: when you open the Crop tool, it may immediately recognize that the picture needs to be straightened. When you tap Crop, it uses information captured in the background alongside the photo. If this happens, you will see a short animation of the adjustment, and a label will appear at the top of the image: a square with a dashed yellow border, with the word AUTO to its right. If you wish to override that adjustment, tap AUTO as if it were a button (because it is!), and all of the changes will be reverted.
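The extra captured margin is what lets auto-straighten level a photo without shrinking the visible frame: rotating a W-by-H rectangle by a small angle only requires a thin band of surrounding pixels. The geometry below is the standard rotated-bounding-box formula; the photo dimensions and tilt angle are just illustrative numbers:

```python
import math

def margin_needed(w: int, h: int, theta_deg: float) -> tuple[float, float]:
    """Extra width and height of source imagery needed to straighten a
    w-by-h photo tilted by theta degrees without cropping into the frame.

    The straightened crop must fit inside the axis-aligned bounding box
    of the rotated rectangle: (w*cos + h*sin) by (w*sin + h*cos).
    """
    t = math.radians(abs(theta_deg))
    bb_w = w * math.cos(t) + h * math.sin(t)
    bb_h = w * math.sin(t) + h * math.cos(t)
    return bb_w - w, bb_h - h

# Illustrative example: a 2-degree tilt on a 4032x3024 photo needs only
# a margin of roughly a hundred pixels on each axis.
dw, dh = margin_needed(4032, 3024, 2.0)
print(f"extra pixels needed: {dw:.0f} wide, {dh:.0f} tall")
```

Since the over-captured region from the wider lens is far larger than that, the Crop tool can level typical handheld tilts and still return a full-size, uncropped-looking result.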
One word of caution about the feature: the planned Deep Fusion machine-learning update to the Camera app will not work if you have Capture Outside the Frame switched on. Deep Fusion, arriving in iOS 13.2, uses machine-learning algorithms to produce photographs with greater detail and tonal depth than even the Smart HDR feature that is already available.