A 3D laser scanner RECIPE

ingredients:

  • 1 line laser (eBay)
  • 1 Arduino (eBay)
  • 1 stepper motor (eBay, longs-motor)
  • 1 stepper motor driver, up to 12800 steps/rev (eBay, longs-motor)
  • 1 12V power supply (salvaged from a Chinese DVD player)
  • 1 infrared LED (salvaged from a remote control)
  • 1 digital SLR (a Nikon D7000, another Nikon SLR, or any infrared remote-controlled SLR)
  • 1 piece of wood (plywood, plexiglass, metal, or whatever you prefer)
  • 1 ball head (salvaged from a cheap 5€ tripod)
  • Matlab (or Octave)

 

The ChatGPT concept of a "3D scanner recipe"

Step 1: holder arm for laser and SLR

Assemble the camera and the laser ball head onto an arm roughly 40 to 50 cm long, using your preferred method such as hot glue or welding. Once the three parts (laser head, arm, and camera) are connected, find the approximate centre of gravity of the assembly and drill a hole at that point to accommodate the stepper motor shaft. Ideally, a panoramic head would be more suitable, since it enables rotation around the lens' nodal point; however, that requires a more robust system than a simple piece of chipboard with a shaft fitted into a hole can provide.

Ensure that the hole's axis, the camera's vertical axis, and the laser plane are all parallel to each other. To achieve this, it is best to drill the motor shaft hole on a drill press; if you have access to a lathe, consider fabricating a collar to guarantee that the motor axis is orthogonal to the arm. The laser plane should also be perpendicular to the plane of the arm, and the camera must be positioned horizontally and levelled accurately.

When fully assembled, your setup should resemble the system shown in the accompanying picture. I am eager to see your own creative adaptations of this setup! Feel free to share them with me by sending your photos to scanner@metaingegneria.com.

Step 2: rotation and camera synchronization

For the second step, initiate rotation and synchronize the camera. The most efficient method I discovered for this synchronization involves utilizing an Arduino to replicate the infrared remote control’s function and to manage the stepper motor.

Connect the infrared LED to pin 13 (as well as to the ground). The three inputs of the driver, namely pul, dir, and enable, should be connected to pins 10, 11, and 12, respectively. 

The Arduino code I used is:

/*
 * Stepper motor control + camera shutter control.
 * Sends a Nikon IR remote code after each rotation increment.
 */
int pinIRLED = 13;   // infrared LED (the other leg goes to ground)
int pul  = 10;       // driver PUL (step) input
int dir  = 11;       // driver DIR input
int enbl = 12;       // driver ENABLE input
int passi = 4;       // steps per photo, out of 12800 steps/rev
int intervallo = 3;  // delay between step pulses, in milliseconds

void setup() {
  pinMode(pinIRLED, OUTPUT);
  pinMode(pul, OUTPUT);
  pinMode(dir, OUTPUT);
  pinMode(enbl, OUTPUT);
  digitalWrite(enbl, HIGH);
}

// Modulate the IR LED at ~38 kHz for pulseTime microseconds.
void pulseON(int pulseTime) {
  unsigned long endPulse = micros() + pulseTime;  // when to stop pulsing
  while (micros() < endPulse) {
    digitalWrite(pinIRLED, HIGH);  // turn IR on
    delayMicroseconds(13);         // half of a 38 kHz cycle (period ~26.3 us)
    digitalWrite(pinIRLED, LOW);   // turn IR off
    delayMicroseconds(13);         // other half of the cycle
  }
}

// Keep the IR LED off for startDelay microseconds.
void pulseOFF(unsigned long startDelay) {
  unsigned long endDelay = micros() + startDelay;  // when the gap ends
  while (micros() < endDelay);
}

// Nikon infrared remote code, looped twice.
void takePicture() {
  for (int i = 0; i < 2; i++) {
    pulseON(2000);     // pulse for 2000 us (microseconds)
    pulseOFF(27850);   // pause for 27850 us
    pulseON(390);      // and so on
    pulseOFF(1580);
    pulseON(410);
    pulseOFF(3580);
    pulseON(400);
    pulseOFF(63200);
  }
}

void loop() {
  // send 'passi' step pulses to the stepper driver
  for (int ii = 0; ii < passi; ii++) {
    digitalWrite(pul, HIGH);
    delay(intervallo);
    digitalWrite(pul, LOW);
    delay(intervallo);
  }
  delay(1600);      // wait for the arm to stop oscillating before shooting
  takePicture();    // take the picture
  delay(200);
}

Now the assembly is ready and it should work more or less like this:

In this process, the arm rotates between each photograph by a specific angle determined by the number of steps. With our setup, the angle is 360°/12800 × passi = 0.1125°, with passi = 4. Relying on such fine microstepping increments can reduce positioning accuracy, so it is advisable to use a motor with an integrated gear reduction for better precision.
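As a quick sanity check on this arithmetic, the per-photo angle and the number of photos in a full turn can be computed directly (an illustrative Python snippet, not part of the original toolchain):

```python
# Per-photo rotation with this build's values: a driver configured for
# 12800 steps per revolution and passi = 4 steps between photos.
STEPS_PER_REV = 12800
PASSI = 4

angle_per_step  = 360.0 / STEPS_PER_REV    # 0.028125 degrees per step
angle_per_photo = angle_per_step * PASSI   # 0.1125 degrees per photo
photos_per_turn = 360.0 / angle_per_photo  # 3200 photos for a full turn
```

So a complete revolution of the arm produces 3200 images at this setting.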

Remember, aligning the laser with the camera is crucial. Both the roll and pitch axes of these components must be orthogonal to the axis of motor rotation.

The captured images should predominantly feature the laser line, and the laser's brightness must not saturate the camera's CMOS sensor. To remove extraneous lights or objects from the image, and so make it easier for the software to detect the centre of the laser track, I implemented a filter. It is controlled by a parameter called "taglio" (cut): every pixel whose red channel value falls below this threshold is set to zero.
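The taglio cut takes only a few lines. This is an illustrative pure-Python version (the actual implementation lives in the Matlab script), with pixels represented as plain (r, g, b) tuples:

```python
def apply_taglio(image, taglio=10):
    """Zero every pixel whose red value is below the threshold, so that
    only the laser stripe survives. `image` is a list of rows; each
    pixel is an (r, g, b) tuple. Illustrative sketch, not the author's code."""
    return [[px if px[0] >= taglio else (0, 0, 0) for px in row]
            for row in image]
```

In Matlab the same effect is a one-liner on the red channel of the image array.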

For the clearest possible images, use the lowest ISO setting your camera allows and set the focus manually. This prevents the camera from refocusing between shots, which would alter the focal length. For my setup, with the object about 600mm from the laser and using a 100mW laser, the optimal exposure settings were t = 1/250 and f11. If you’re using a less bright laser, settings like 1/125 at f8 might be more appropriate. The aperture should be narrow due to the subject’s proximity and the need for a substantial depth of field. Conduct the photography in dim lighting. The resulting images should resemble the one shown here:

 

Step 3: geometry definition

I utilized a fixed focal length lens (50mm f1.4) for this project. If you’re using a zoom lens, ensure the zoom setting remains constant once set. To secure this, you can use tape to prevent the zoom from rotating.

At this stage, you need to ascertain four key measurements:

  1. The distance “H” between the laser plane and the lens’ nodal point.
  2. The focus distance, which is the distance where the laser intersects the image’s vertical plane of symmetry.
  3. The offsets in the X and Y directions between the arm’s axis of rotation and the lens’ nodal point.

Note on determining the nodal point: for a fixed 50mm lens like the one I used, a good initial estimate is that the nodal point lies 50mm in front of the camera's sensor plane, which is marked on the camera body by a symbol resembling a line through a circle. Online, you can find various tables listing nodal point positions for different lenses. Keep in mind that as the focus distance changes, the position of the nodal point may also shift.
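These measurements feed the triangulation. The following is my own reconstruction of the geometry, not the author's Matlab code: assume the laser plane is vertical, sits at perpendicular distance H from the nodal point, and crosses the optical axis at the distance called distanza_focale (D), so a point imaged exactly on the vertical plane of symmetry lies at depth D. The depth of any lit point then follows from similar triangles:

```python
import math

def depth_from_pixel(u_mm, f_mm, H_mm, D_mm):
    """Depth Z (along the optical axis) of the surface point lit by the
    laser. u_mm: horizontal position of the laser track on the sensor,
    in mm from the image centre; f_mm: focal length; H_mm: perpendicular
    distance from nodal point to laser plane; D_mm: distanza_focale.
    My sketch of the geometry, not the original script."""
    s = H_mm / D_mm              # sine of the laser plane's tilt vs the axis
    c = math.sqrt(1.0 - s * s)   # cosine of the same angle
    return s * D_mm / (s - u_mm * c / f_mm)

def point_from_pixel(u_mm, v_mm, f_mm, H_mm, D_mm):
    """Back-project sensor coordinates (u, v) to 3D camera coordinates."""
    z = depth_from_pixel(u_mm, f_mm, H_mm, D_mm)
    return (z * u_mm / f_mm, z * v_mm / f_mm, z)
```

At u = 0 this returns exactly D, consistent with the definition of distanza_focale above.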

 


Step 4: image processing

I developed a script in Matlab for processing the images, employing a straightforward approach. Matlab is great for focusing on the core algorithm, leaving the more intricate aspects of C++ to those who are more adept with it. While Matlab might be slower, it allows for achieving desired results with minimal coding. This script should be easily adaptable for use with Octave, though I haven’t verified this.

To use the script, run it in the same directory where the images are stored. Within the script, you’ll find several parameters that need to be adjusted:

  1. passo_angolare (Angular Step): This is the degree of rotation between each image, corresponding to the setting on the Arduino.
  2. taglio (Cut): This sets the threshold for pixel intensity. Pixels below this value are turned to zero. In my case, a value of 10 works well.
  3. parametro (Parameter): This controls the smoothing: 1 means almost no smoothing (the noisiest result), while values down to 0.0001 give progressively stronger smoothing.
  4. H: The distance between the laser beam and the lens’ nodal point, as previously measured.
  5. distanza_focale (Focal Distance): The distance at which the laser intersects the vertical plane of symmetry in the image.
  6. focale (Focal Length): The focal length of the lens used.
  7. shift: The offset between the lens’ nodal point and the motor’s axis. In my setup, the Y-shift is zero due to my lens’ characteristics.
  8. larg_sensore (Sensor Width): The width of the camera sensor, in millimetres (23.5mm for the D7000).
  9. alt_sensore (Sensor Height): The height of the camera sensor, in millimetres (15.6mm for the D7000).

These parameters are crucial for the script to function correctly and process the images as intended.
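To make the parameters concrete, here is a small hypothetical Python sketch of two operations the script must perform: finding the sub-pixel centre of the laser track in each image row (after the taglio cut), and rotating the triangulated points around the motor axis by the accumulated passo_angolare. The function names are mine, not the script's; pixels are plain (r, g, b) tuples, and the Y-shift is taken as zero as in the author's setup:

```python
import math

def laser_column(row, taglio=10):
    """Sub-pixel column of the laser track in one image row: the
    intensity-weighted centroid of the red values surviving the cut.
    Returns None if no pixel passes the threshold."""
    num = den = 0.0
    for col, (r, g, b) in enumerate(row):
        if r >= taglio:
            num += col * r
            den += r
    return num / den if den else None

def rotate_about_motor_axis(x, z, angle_deg, shift_x=0.0):
    """Rotate a point in the horizontal plane about the motor axis,
    which is offset from the lens' nodal point by shift_x."""
    a = math.radians(angle_deg)
    xs, zs = x - shift_x, z
    return (xs * math.cos(a) - zs * math.sin(a) + shift_x,
            xs * math.sin(a) + zs * math.cos(a))
```

Each image k contributes one profile, rotated by k × passo_angolare before all points are written to the .asc file.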


Step 5: meshing

The Matlab script saves an ASCII file named test.asc, which contains the point coordinates. To import the point cloud and mesh it, I use MeshLab, a great tool that is powerful and free, developed by VCG at CNR, Italy. Note: The Arduino is also the result of the work of another excellent Italian group.

The meshing process involves two distinct steps. It starts with point decimation, executed through Poisson-disk sampling; this is followed by the actual mesh generation, which employs the ball-pivoting algorithm.

I recommend setting the Poisson-disk sampling radius to slightly more than half the usual distance between two lines of points, and the ball-pivoting radius to double the Poisson-disk radius. The quality of the resulting mesh is influenced by several factors, such as the thickness of the laser line, surface scattering, oscillations of the arm, alignment of the axes, and so on.
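Those two rules of thumb can be captured in a trivial helper (illustrative only; in practice the radii are set interactively in MeshLab's filter dialogs, and "slightly more than half" is taken here as a factor of 0.55, which is my choice, not the author's):

```python
def suggested_radii(line_spacing):
    """Starting radii from the rules of thumb in the text: Poisson-disk
    radius a bit over half the typical spacing between two lines of
    points; ball-pivoting radius double the Poisson-disk one."""
    poisson = 0.55 * line_spacing  # "slightly more than half" (assumed 0.55)
    ball = 2.0 * poisson           # double the Poisson-disk radius
    return poisson, ball
```

For example, points spaced about 2mm between lines would suggest starting around a 1.1mm Poisson-disk radius and a 2.2mm pivoting ball.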

The completed mesh should appear similar to this:

 


Naturally, reconstructing the full model necessitates multiple scans, which must be aligned with each other in MeshLab. This alignment is essential for creating a comprehensive mesh that accurately represents the volume. Here is the original fish: