
Is DEPTH / DISPARITY information captured in non-portrait photos on iPhone X in iOS 13?

I am running some code that extracts the depth / disparity information from photos on an iPhone X running iOS 13.
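For reference, the check is roughly the following (a minimal sketch using ImageIO and AVFoundation; the function name `depthData(at:)` and the file URL are my own, not from any Apple sample):

```swift
import ImageIO
import AVFoundation

// Sketch: inspect an image file on disk for embedded depth or disparity data.
// Returns nil when the file has no auxiliary depth/disparity image.
func depthData(at url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
        return nil
    }
    // Try disparity first, then depth; captures store one or the other.
    let aux = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeDisparity)
        ?? CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeDepth)
    guard let info = aux as? [AnyHashable: Any] else { return nil }
    return try? AVDepthData(fromDictionaryRepresentation: info)
}
```

For non-portrait photos this returns nil, while iOS 12 Portrait photos yield an `AVDepthData` object.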

For photos I took recently outside of Portrait mode, no depth information is returned. However, Portrait photos taken with iOS 12 do contain depth information.

In a couple of WWDC videos, the demonstrator appears to be using photos that are NOT Portrait-mode photos (pictures of flowers, trees, and even lakes), yet is still able to access their depth information.

Thanks for any clarification.
