Here is more progress on 3D laser scanning since my previous post on the subject. This device is based on high school geometry, both in understanding the relationship between line laser offset and distance from the camera, and in using the quadratic equation to fit measured offsets to that distance. The two laser lines shift left and right as features of the rotating object move closer to or farther from the camera. I record images of the laser lines (in a dark room) and determine their corresponding points in space. From the generated point cloud, I produce a mesh. The mesh can be 3D printed.
I have significantly improved my laser scanning by building a solid platform using 8020. Did I mention that I love 8020? It’s like Tinker Toys for grown-ups! I carefully calibrated the position and angle of three line lasers. One is mounted to the camera itself, showing the pointing direction of the camera. The other two line lasers are offset so that they project lines at +/- 30 degrees from the center line. By geometry, one can see that the left and right offset of these two laser lines corresponds to displacement toward the camera. I call this the Y direction. The Z direction is up, and X is toward the right from the rotating platform center.
I am using 3D printed parts to hold the line lasers and camera. My goal is to accompany my now very reliable 3D printer with reliable 3D scanning.
In my first attempt, I assumed an incident angle for the lasers and determined Y linearly by simply multiplying by the tangent of the angle. This has two problems. First, the right and left laser lines will not be exactly at +/- 30 degrees, try as I might to align them. Second, the camera has a lens, and therefore magnification: moving closer to the camera also makes the lines appear farther apart, on top of the triangle relationship.
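For reference, that naive linear model can be sketched in a few lines of Python. The function name and the fixed pixels-per-millimeter scale here are my own illustration, not the actual code:

```python
import math

def depth_from_offset_linear(offset_px, px_per_mm, angle_deg=30.0):
    """Naive linear model: convert a laser-line pixel offset to depth in mm.

    Assumes the laser sits exactly angle_deg from the camera axis and that
    px_per_mm is a constant -- exactly the two assumptions that break down.
    """
    offset_mm = offset_px / px_per_mm
    # For incidence angle theta, a depth change d shifts the line laterally
    # by s = d * tan(theta), so d = s / tan(theta).
    return offset_mm / math.tan(math.radians(angle_deg))
```

Because the true angle is never exactly 30 degrees and the pixel scale changes with distance, this model is what the calibration procedure replaces.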
This time I projected the line lasers and center laser on a card and moved the camera closer to platform center using a linear translation stage. This way I had known steps (0.32 mm), and could capture images of the laser lines at each point.
Once I had my calibration, I could remove the linear stage, since it is not part of the scanning process, and be left with just a few inexpensive parts. I can produce this scanner very inexpensively. I am using a 3-axis stepper motor controller from eBay, one stepper motor, three line lasers (also from eBay), some 8020 parts, and some 1/4-20 threaded rod, in addition to the mounts that I 3D printed myself. Add some wire and power supplies for the stepper motor controller and line lasers, and my total cost in materials is low. My time, of course, is not so inexpensive, but this is what I love doing!
Alignment is very important. Here is an image of the printed CAD drawing I used to verify incident angles, along with the screen I projected onto in order to perform calibration.
Here is a video of the calibration process.
I first wrote a program to scan through each image, line by line, and look for peaks corresponding to where the laser lines strike the target object. Once I had collected these points, I recorded their positions as a function of movement toward the platform. As expected, the plots are slightly nonlinear due to the lens effect. The linear term relating offset of the beam to position moving toward the platform is close to 2 (1.925 for the left beam and -2.007 for the right beam). This comes from the fact that the cosine of 60 degrees is 0.5, and it also includes the calibration of pixels to millimeters. Pixels to millimeters will change with every step due to lens magnification. I could try to calculate this, but it is better to measure and calibrate.
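The per-row peak search can be sketched roughly like this. This is a minimal single-peak version with a made-up brightness threshold; my actual program separates the left and right laser lines and runs on the real camera frames:

```python
import numpy as np

def laser_peaks(image, threshold=50):
    """Scan a grayscale image row by row and return the (row, col) of the
    brightest pixel in each row, skipping rows with no laser light.

    'threshold' (an illustrative value) rejects rows that contain only
    dark-room background noise.
    """
    peaks = []
    for row_idx, row in enumerate(np.asarray(image)):
        col = int(np.argmax(row))          # brightest pixel in this row
        if row[col] > threshold:           # keep only rows the laser hit
            peaks.append((row_idx, col))
    return peaks
```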
Once I had a quadratic fit to pixel offset versus millimeters toward the camera from the platform center, I used the quadratic formula to solve for millimeters with pixel location as the input. Here is a screenshot of the Excel file.
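A sketch of that fit-and-invert step, using NumPy instead of Excel. The calibration data below is synthetic stand-in data (the coefficients are made up to resemble the measured linear term, not my actual calibration):

```python
import numpy as np

# Synthetic calibration data: pixel offset measured at known positions
# (mm toward the camera from platform center, in 0.32 mm stage steps).
y_mm = np.arange(0.0, 16.0, 0.32)
offset_px = 0.002 * y_mm**2 + 1.925 * y_mm + 3.0  # stand-in for measurements

# Fit offset = a*y^2 + b*y + c
a, b, c = np.polyfit(y_mm, offset_px, 2)

def mm_from_pixels(p):
    """Invert the fit via the quadratic formula: a*y^2 + b*y + (c - p) = 0."""
    disc = b * b - 4.0 * a * (c - p)
    return (-b + np.sqrt(disc)) / (2.0 * a)
```

Only the + root is physical here; the other root of the quadratic lands far outside the scan volume.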
I programmed these two equations into my image analysis program. Next I scanned an object. Here is a video showing the line lasers as I rotate the scanned object.
Here is a gallery showing the 3D scanner.
And here are the results of the scan analysis. There are lots of tricky little details to work out. My DIY scanner only needs to rotate the object once, unlike the recently offered commercial unit from MakerBot, which always rotates twice. I purchased focusable line lasers. You can see that I used glue to lock in their positions once I set them appropriately. Using the more liquid form gives a nicer result since it does not leave external clear goo.
Note that the mesh feature at the bottom center comes from the tag attached to the little dog. The dog is also holding a tag in its right hand. These are real features.
I uploaded the mesh to Thingiverse.
Here is an animation of the rendering from all angles. You can almost read “Canada” on the dog’s little belly!
My calibration of Z pixels versus millimeters is currently based on a single image of a ruler near the edge of the platform. If I were making quantitative scanning measurements, I would also need to account for camera magnification. For now I am pretty happy with the result of this machine that I built from scratch and the programs that I wrote myself. As I describe in previous posts, I use Python to control the system movement and free software to capture images. I wrote the image analysis programs myself, then process the resulting point clouds using MeshLab and Blender (also free). I hope to write scripts to run the whole process and have a software package that could be available to others.
I improved my recent 3D scan. I changed the analysis software and reprocessed all of the images that I had collected previously. I changed the process in three ways. 1: I properly scaled Z of each point based on camera lens magnification, which I calibrated. 2: I manually cleaned up outlier points in the point cloud using MeshLab. 3: I resampled the point cloud from over 300k points down to 50k points. This sacrifices some resolution, but also smooths out the areas where the line lasers missed due to occlusion.
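I did the resampling in MeshLab, but the idea can be sketched as a simple voxel-grid average. This is an illustrative stand-in, not the algorithm MeshLab uses:

```python
import numpy as np

def voxel_downsample(points, voxel_mm=1.0):
    """Reduce an (N, 3) point cloud by averaging the points in each voxel.

    Trades resolution for a smaller, smoother cloud -- the same trade-off
    as resampling 300k points down to 50k.
    """
    points = np.asarray(points, dtype=float)
    # Assign each point to an integer voxel cell.
    keys = np.floor(points / voxel_mm).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse)
    # Average the coordinates of all points that share a voxel.
    out = np.zeros((counts.size, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out
```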
I will describe the calibration of lens magnification. For now, please view the improved scan. The much-improved, smaller STL file is available for download at Thingiverse.
I 3D printed the resulting STL file. Here is a gallery of the result.
You can download the mount for the stepper motor at another one of my Thingiverse posts.