Wednesday, January 26, 2011

Details of the Laser Range Finder

    So, I would guess that at least 20 individuals asked me how to make a laser range finder at the 2003 fire fighting contest. By Sunday I was getting pretty tired of answering the same question, so I decided to make this web site. Since in my college it seems that every single project is done in groups, I have become used to sharing just about everything I come up with, so you are in luck. I hope that whoever reads this overview will be inspired to make their own system for a fire fighting robot and advance the contest.
Parts
The following are the parts that I used to make the ranger. Substitutions are always possible.

The basic principle of this ranger is to measure distance through triangulation. A laser is used to illuminate the object you would like to measure to, and a linear image sensor or other sensor is used to measure the angle to the spot. The laser and image sensor are mounted a known distance apart, pointed in the same direction.

Because the average hobbyist doesn't have access to a wide variety of lenses, lasers, or image sensors, many calculations are irrelevant because you couldn't get parts to fit what you calculate. The parts I used were what I could get, but some things should still be kept in mind if a choice exists.

The more resolution your image sensor has, the better the accuracy. The image sensor I have has 1024 pixels and provides decent resolution and accuracy. If you were to use something easy like a cheap camera that may have only 300 pixels in a line, your resolution will be severely affected. More is better, but be careful of the price. I am not very much into optics and I pretty much experimented with the lens to get it to work. I know it has to be about as wide as the active area of your sensor, with a focal length of about 1 cm. Other than that, you can just mess with the sensor's position under the lens and the focus until you get something that works.

The microcontroller I used to drive the sensor has internal analog-to-digital converters, but I chose to use an external IC. Even at the fastest I could configure the integrated ADC to run, it took about 75 microseconds per sample at low resolution. The TI part can make a conversion in about 5 microseconds, allowing for at least 100 complete distance measurements per second.
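To put those numbers in rough perspective (assuming the 1024-pixel sensor described above): at 75 microseconds per sample, a full frame would take about 1024 x 75 us = 77 ms, or roughly 13 distance measurements per second, while at 5 microseconds per conversion the ADC stops being the bottleneck and the pixel clock described below sets the frame rate.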

I chose a green laser for three reasons. The Trinity gym is very red, leaving the possibility open to put a filter on the sensor. It's really bright, so the ranger can see pretty far without strange amplification. And it's really cool. You could probably use cheaper red lasers just fine. You may have to amplify the video signal, though, if not enough light is reflected. In fact, I would probably try amplifying the signal instead of using a green laser again; it is way too bright to use regularly.

Electronics

Download the schematics

The video signal with a short-range object. Top is video, bottom is sync.

Interfacing to the image sensor is specific to the actual sensor you have. I will discuss the PVS sensor, although others will be similar. This sensor has several inputs, some for clocks and some to configure the operating mode of the device. I hard-wired the mode to be "DPR Mode, previous pixel reset," which basically means that each pixel integrates for one frame, is read, and is reset to integrate again. Because the clock speed applied to the sensor's input is important for obtaining valid data, I could not have the clock start and stop whenever convenient; it had to be free running. I used a 4047 multivibrator for the job, set to 100-130 kHz. Its speed is not horribly important, which is good, because it is not a horribly accurate device. Depending on the temperature of the capacitor in its RC circuit, the frequency varies by around 5 kHz. The speed is limited to about 130 kHz because of the speed of the ADC, which usually takes 5 microseconds to make a conversion but could take 8 or 10 at times. The way I see it, 100 readings per second is much faster than most other sensors available to hobbyists, and is much more data than the robot's software can deal with anyway.
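As a rough check on that figure: at a 100 kHz pixel clock a 1024-pixel frame takes about 10.2 ms, or roughly 98 frames per second, and at 130 kHz about 7.9 ms, or roughly 127 frames per second, which lines up with the "at least 100 measurements per second" claim above.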

Sampling the signal 70% through. Top is pixel clock, bottom is ADC clock.

The output of the sensor is an analog voltage relative to the intensity of the light striking a given pixel. Each time the clock input is cycled, a new voltage is output representing the next pixel. The SYNC line is asserted to signal the first pixel. The data sheet recommends sampling the output at least 60% of the way through the clock cycle to avoid noise caused by the next pixel being selected. I used two NAND gates and both outputs of the oscillator to create some combinational logic that samples about 70% through. That signal is fed to the analog-to-digital converter, which then starts the conversion and asserts its INT line when done, usually within 5 microseconds.

At the start of each frame the sensor asserts its SYNC line for one clock cycle. When the microcontroller wants to acquire a new measurement, it waits for this signal. Once asserted, it watches for the ADC to signal that a conversion is complete. Once complete, the microcontroller loads in the data through the parallel connection. The microcontroller keeps track of the first pixel over a certain threshold and the last pixel over that threshold, to be used in calculations later.
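As a very rough illustration, here is a minimal sketch of that acquisition loop in C. Everything in it is hypothetical: the I/O helpers and the threshold value stand in for whatever your particular microcontroller and wiring actually provide.

    /* Hypothetical sketch of acquiring one frame and tracking the peak edges.
       read_sync(), adc_done() and adc_read() are placeholders for your own
       I/O routines; THRESHOLD is whatever intensity reliably marks the spot. */

    #define NUM_PIXELS 1024
    #define THRESHOLD  128

    extern int read_sync(void);   /* 1 while the sensor's SYNC line is asserted */
    extern int adc_done(void);    /* 1 while the ADC's INT line is asserted     */
    extern int adc_read(void);    /* read the result off the parallel bus       */

    void acquire_frame(int *first, int *last)
    {
        *first = -1;
        *last  = -1;

        while (!read_sync())                 /* wait for the start of a frame */
            ;

        for (int pixel = 0; pixel < NUM_PIXELS; pixel++) {
            while (!adc_done())              /* wait for this pixel's conversion */
                ;
            int value = adc_read();

            if (value > THRESHOLD) {
                if (*first < 0)
                    *first = pixel;          /* first pixel over the threshold   */
                *last = pixel;               /* keeps updating; ends on the last */
            }
        }
    }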

A closeup of the video signal. Notice the steps, which are the individual pixels.

Once the edges of the peak are found, we can begin calculating the real distance. From the edges, the center of the point is calculated. I just picked the point in the middle of the edges, but could have played around with a weighted average.
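In code this step is tiny. A sketch, reusing the first/last edge indices from the acquisition sketch above:

    /* Midpoint of the peak edges. An intensity-weighted average (centroid)
       over the pixels between first and last would be the fancier option. */
    float peak_center(int first, int last)
    {
        return (first + last) / 2.0f;
    }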

There are a few things you have to know to convert a pixel to a distance. First, you need to know the base side of your right triangle: measure from the center of the laser to the center of your image sensor's lens. Next, you need to know how many pixels are spanned per degree of incoming light. To find this you can play with some right triangles: measure the distance to an object and do the trig to find the angle at the sensor, then move the object and do it again. Take the difference in raw readings and divide by the difference in degrees.

You also need to know the smallest raw reading you can get, representing the closest distance you can measure, and the angle at that distance. Just play with holding things up to the sensor until it can no longer see them, then measure that distance and work out the angle.
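A minimal sketch of that calibration arithmetic, assuming you take two reference measurements (a tape-measured distance plus the raw pixel reading the sensor reports at that distance). The 5 cm base matches the figure mentioned in the improvements section below; everything else is a made-up name for illustration.

    #include <math.h>

    #define PI      3.14159265358979323846
    #define BASE_CM 5.0    /* center of laser to center of the sensor's lens */

    /* Angle at the sensor for a spot at a known distance, from the right
       triangle formed by the base and the distance (in degrees). */
    double angle_deg(double distance_cm)
    {
        return atan(distance_cm / BASE_CM) * 180.0 / PI;
    }

    /* Pixels per degree from two reference points (dist1_cm, raw1) and (dist2_cm, raw2). */
    double pixels_per_degree(double dist1_cm, int raw1, double dist2_cm, int raw2)
    {
        return (raw2 - raw1) / (angle_deg(dist2_cm) - angle_deg(dist1_cm));
    }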

Basically: distance = base * tan(angle)
To convert the raw reading to an angle I used: angle = ((raw reading - smallest raw reading) / pixels per degree) + smallest angle
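Put as code, and continuing the constants from the calibration sketch above, the whole conversion is a couple of lines. RAW_MIN, ANGLE_MIN_DEG and PIXELS_PER_DEG are placeholder values; yours come out of the calibration and tweaking described here.

    /* Placeholder calibration constants -- replace with your own measured values. */
    #define RAW_MIN        40.0    /* smallest raw reading (closest visible spot) */
    #define ANGLE_MIN_DEG  70.0    /* angle worked out at that closest distance   */
    #define PIXELS_PER_DEG 25.0    /* from the two-reference-point measurement    */

    double raw_to_distance_cm(double raw)
    {
        double angle = (raw - RAW_MIN) / PIXELS_PER_DEG + ANGLE_MIN_DEG;
        return BASE_CM * tan(angle * PI / 180.0);
    }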

Simple as that... download this spreadsheet to try your results. I tweaked the constants with this sheet to minimize overall error. Pixels per degree is the most important number and only needs to be changed by a few thousandths to affect accuracy.

Troubleshooting

A lot of things can cause your sensor to not work, or not work well. Frankly, I don't remember most of the things I had to troubleshoot because that was back in November, at the beginning of the very long construction of this robot. But I can tell you to watch out for noise. I didn't implement the low-pass filter the PVS data sheet suggests because I was not too worried about the accuracy of the pixel data, as long as it was close. But a few percent of the readings are incorrect, and detecting false peaks is probably to blame. Also, make sure the system is optically as accurate as possible. If the laser does not point perpendicular to the base of your sensor, the error will vary wildly with range. Also make sure that the linear sensor is properly aligned. Trying to get a microscopic point of light to stay on the extremely thin line of pixels throughout the entire range is very difficult. I didn't come up with any good ways to make that easy; I watched the signal on the oscilloscope while holding the sensor with one hand and a hot glue gun in the other. Once I was happy I loaded the thing up with glue and hoped it stayed, real scientific.

There are probably a few things that can be improved on, too. You should try adjusting the base size to suit the resolution you want. A small base will make for very high resolution close up but very poor resolution far away, and vice versa. For my sensor I could only make the base about 5 cm because of the need to mount it vertically and the height limit of the competition. If your sensor doesn't need high speeds, you could probably use a microcontroller's integrated ADC to reduce the part count. You could also try some kind of amplification and filtering of the video signal so that lower-power and cheaper lasers can be used. From what I tried, a red laser pointer showed up pretty well.
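For a rough feel of that tradeoff, using the distance = base * tan(angle) relation above: with a 5 cm base, moving an object from 150 cm to 160 cm only changes the angle at the sensor by about 0.12 degrees, while with a 10 cm base the same move changes it by about 0.24 degrees, so doubling the base roughly doubles the number of pixels the spot moves at long range.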

That's It!
Good luck with your project. I hope my experiences inspire others to be creative and show up with new ideas for 2004. It's about time some people shifted their efforts from being the fastest to achieve one scripted task to elegantly achieving many more general tasks.

If I forgot anything or completely confused anyone, be sure to email me so I can update this page. Oh, and sorry for the confusing and meaningless sentences; I'm an engineer.
The setup!
