3D Scanning with the iPhone 12 Pro LiDAR


As you’ve probably heard, the latest generation of iPhones has a built-in rear-facing LiDAR sensor. I’ve been waiting for this day ever since Apple acquired PrimeSense (the company behind the original Kinect’s depth technology) in 2013. LiDAR allows point-and-shoot 3D capture, and bringing it to the iPhone means that literally anyone can have a 3D scanner in their pocket.

The feature first arrived in the back facing camera last year, and then in the iPad Pro. We’ve seen a number of apps taking advantage of the new capabilities, and I’ve been playing with pretty much all of them.

Here is an overview of the ones that already feature direct share to Sketchfab.

3D Scanner App

3D Scanner App was, I think, one of the first to release an iPhone version with direct share to Sketchfab. The app has a lot of settings you can play with, like range, resolution, masking, etc. The editor is pretty robust as well. I find it great for smaller or more detailed things. Check out the tag page and download the 3D Scanner App.


Polycam

Polycam recently added direct share to Sketchfab. The default mode works pretty well out of the box, and I find it great for rooms, spaces, and scenes. Check out the tag page and download Polycam.


Scaniverse

Scaniverse works well out of the box, lets you crop your models, and publishes straight to Sketchfab. I like how the app manages to close some of the holes coming from missing parts. Check out the tag page and download Scaniverse.


Record3D

Record3D lets you record volumetric videos, aka 4D content. It generates one point cloud per frame, and the result is quite magical! Check out the tag page and download Record3D.


Scandy

Scandy was one of the first apps to leverage the front-facing TrueDepth camera of the previous generation of iPhones, using it for things like 3D selfies. They even made a 3D printed mirror attachment to let users point the front-facing depth camera outward like a rear camera. Check out the tag page and download Scandy.


SiteScape

SiteScape is great for capturing large point clouds of anything, and I typically use it to scan complex geometry that would not work out too well with a mesh approach. Check out the tag page and download SiteScape.


EveryPoint

EveryPoint is also great for capturing large point clouds, and similarly handles complex geometry that a mesh approach would struggle with. Check out the tag page and download EveryPoint.


For tips on how to get the most out of your LiDAR captures, check out my Twitter thread.

About the author

Alban Denoyel

Co-founder and CEO of Sketchfab.


  • Another option would be SiteScape (sitescape.ai)

  • Excellent article!
    I’m doing an ongoing evaluation of all LIDAR capable apps in a VR room on Spatial, and would love to meet you there some day in VR to share experiences. Overall, things are happening fast in this space now that the iPhone has a LIDAR scanner and the apps that previously were updated once a quarter are now seeing multiple updates weekly. The main thing, IMO is that as-is the LIDAR scanner is built for occlusion mapping, not 3D geometry. Scanning with a LIDAR is akin to running around with a brush overfilled with paint, you may get a good scan the first time you try, but should you touch that area with your scanner you will ruin the first scan. Results such as you see in some of these examples, like the pillow in the sofa, are EXTREMELY hard to achieve. I’ll eat my hat if they did not do about 20 tries to get that result. And the bathroom mural example is definitely sus. Looking at the lack of incorrect geometry, not to mention the straight edges of the floor around the mural, I think this model has probably been post-processed before being uploaded. This is not cheating, but the future of LIDAR 3D scanning – some companies are working on 3D post processing of scans directly in the app. Like Shapr3D for instance, who are (the rumors say) working on LIDAR-assisted CAD modeling. Where the app sees for instance a wall, creates a correct 3D geometry for that wall, and then stops scanning that area of the model. If and when that happens, the Apple LIDAR will be a fantastic tool to capture 3D. But it is, at the time of this comment, still in the future.

  • You really need to look into some more software. I’ve been 3D modeling for over 20 years, including the use of terrestrial scanners and UAVs, and the software is definitely there. There’s nothing special about a laser scanning sensor, and this payload has been used for years in aerial mapping. It’s definitely not developed by Apple, so calling it “Apple LiDAR” is a little misleading. The only limiting factor here is the power output, in that the iPhone is only going to shoot about 10m and the iPad might get a little more than that. Of course it’s going to take some people some practice to get good results consistently, but if you know how a laser scanner works and how to approach the subject, then you are going to be surprised by what you see. Wait till we start merging these models with UAV scans and photogrammetry…


  • Can this 3D system be used with a 3D printer? I do a lot of 3D printing.

  • Bart Veldhuizen says:

    Probably not; scans often contain ‘holes’ that you’d need to fill before you can print them. Also, when 3D printing you need to take material properties like thickness into account, and all of these depend on the printer type that you’re using. You can definitely start with a scan but you’ll have to do quite a bit of work before a model becomes printable.

  • Bart V. has a valid point that you probably won’t be able to print from the raw data and instead will have to go through a little file workflow to get the data in the right format for the printer. This would be the process that creates the mesh from the point cloud. We can print terrestrial and aerial scans by converting the LAS to a DEM and then the DEM to an STL. As of now the Apple system is the roadblock, as we are going to need a way to export whatever goofy file Apple has decided to use to an industry-recognized file type. This will surely be improved, but it does need to be known that point clouds are massive amounts of data and can easily overrun a computer or 3D printer if they are not prepared.
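To make the DEM → STL step of the workflow above concrete, here is a minimal pure-Python sketch. The tiny height grid, the cell size, and the choice to leave facet normals zeroed (most slicers recompute them) are illustrative assumptions; a real pipeline would first rasterize the LAS point cloud into the height grid.

```python
import struct

def dem_to_stl(heights, cell=1.0):
    """Convert a height grid (DEM) — rows of equal length — into a
    binary STL byte string, emitting two triangles per grid cell."""
    rows, cols = len(heights), len(heights[0])
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            # Corner vertices of this grid cell.
            v00 = (c * cell, r * cell, heights[r][c])
            v10 = ((c + 1) * cell, r * cell, heights[r][c + 1])
            v01 = (c * cell, (r + 1) * cell, heights[r + 1][c])
            v11 = ((c + 1) * cell, (r + 1) * cell, heights[r + 1][c + 1])
            tris.append((v00, v10, v11))
            tris.append((v00, v11, v01))
    out = bytearray(80)                     # 80-byte STL header
    out += struct.pack('<I', len(tris))     # triangle count
    for tri in tris:
        out += struct.pack('<3f', 0, 0, 0)  # normal (left for the slicer)
        for v in tri:
            out += struct.pack('<3f', *v)
        out += b'\x00\x00'                  # attribute byte count
    return bytes(out)

dem = [[0, 0], [0, 1]]                      # tiny 2x2 height grid
stl = dem_to_stl(dem)
print(len(stl))                             # 80 + 4 + 2*50 = 184 bytes
```

Each binary STL facet record is exactly 50 bytes (12 for the normal, 36 for three vertices, 2 for the attribute count), which is why the size is easy to check by hand.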

  • Tim says:

    Yes. The 3D surface file will need to be cleaned up to make it solid, then exported into a printable file format such as .stl and/or a textured .obj, and could be printed in color on a Stratasys.
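As a small illustration of the “clean up to make it solid” step mentioned above: a mesh is printable only if it is watertight, and one standard check (a sketch, assuming the mesh is given as an indexed triangle list) is that every edge is shared by exactly two faces.

```python
from collections import Counter

def is_watertight(triangles):
    """Return True if every edge is shared by exactly two triangles.
    `triangles` is a list of 3-tuples of vertex indices."""
    edges = Counter()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (c, a)):
            edges[tuple(sorted(e))] += 1  # undirected edge key
    return all(n == 2 for n in edges.values())

# A tetrahedron is closed; removing one face leaves a hole.
tet = [(0, 1, 2), (0, 3, 1), (1, 3, 2), (2, 3, 0)]
print(is_watertight(tet))      # True
print(is_watertight(tet[:3]))  # False
```

Scanning apps typically run much more sophisticated hole-filling than this, but the edge-count invariant is what “solid” means to a slicer.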

  • Tim says:

    Also, other phones have LiDAR. My Samsung Note 10+ does, but there are no good apps yet. The apps are pretty rough and crude and NOT there yet: ToF Viewer (Time of Flight) and 3D Scanner on the Google Play store. This is just the beginning for phones. As Michael L said, this isn’t anything new and Apple is NOT the pioneer of LiDAR. But, like the fraudulent election, people will never know better. Proper credit will never be given but stolen.

  • I just realized after watching that video again that in the sharing step they showed a blip on the screen with some export formats, and STL and OBJ were on the list. Our use would be for small structures, areas around structures, and building interiors during construction that the photogrammetry from our UAVs just can’t do. I wonder what limitations we are going to find in their exports…

  • Fantastic article! What kind of mesh-size resolution can you get with the iphone LiDAR eg at 1m?

  • Gary Priester says:

    This technology reminds me of the very first digital cameras that could take photos at around 300 x 200 pixels. The results currently are very rough but I suspect in another few years the things you will be able to do will be astounding.

    I have created stereographic images for print (magazines, books, and commissions) for over 20 years and either create my depth images by hand or purchase 3D models for the depth images. It will be useful from a commercial standpoint to be able to scan a product and use the depth image to create a hidden image stereogram.

  • @emily I am hearing a 16ft (5m) range with the best resolution being one inch (2.5cm) so it is going to be tough for it to capture really small objects and details.

    @Gary Haha, I remember the Sony Mavica with the 3.5″ floppy and 0.3mp… We had three cases full of disks before the first compact flash model came out!
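To connect the two figures in this thread: for a fixed angular sampling grid, point spacing scales linearly with distance, so the ~2.5 cm at 5 m quoted above (a figure from the comment, not an official spec) would imply roughly 5 mm spacing at 1 m. A quick sketch of that proportionality:

```python
# Assumed figure from the comment above: ~2.5 cm point spacing at 5 m.
ANGULAR_SPACING_RAD = 0.025 / 5.0  # about 5 mrad per depth sample

def point_spacing_m(distance_m: float) -> float:
    """Approximate spacing between neighbouring depth samples,
    assuming a fixed angular grid (spacing grows with distance)."""
    return ANGULAR_SPACING_RAD * distance_m

print(round(point_spacing_m(1.0) * 1000, 2))  # mm of spacing at 1 m
```

This is only a first-order estimate; real resolution also depends on how the app fuses LiDAR depth with camera imagery over many frames.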

  • Desh says:

    @Raymond If Apple allows an industry standard output file, you can use an online app like https://skanect.occipital.com/ to fill in any holes in the mesh. It’s what we’ve used along with the Kinect camera for over 5 years.

  • Will says:

    Fascinating, thank you. A question. What do you mean by “The feature first arrived in the back facing camera LAST YEAR”? Are you saying there was already some LiDAR tech in 2019’s iPhone 11 Pro, or is that a typo that should say it arrived first on the iPad Pro last year or something? Thanks!

  • I think he was talking about his Galaxy Note, but it has a ToF (Time of Flight) sensor, which is not LiDAR. It is similar, but not the same, and was actually put on a phone several years ago by Qualcomm. As far as I know, the iPad Pro was the first modern mobile device with LiDAR.

  • Will says:

    Ah, got you, thanks!

