N150 Sonar Restoration

This robot, the Nomadic Technologies N150, is about 30 years old. I restored functionality to its array of 16 ultrasonic rangefinders by integrating modern microcontrollers. I wrote a basic GUI for it, then navigated blindly around a room using only the sonar data.

Test Run Video

In this video, I show how I reverse engineered the sonar array electronics to interface them with modern microcontrollers. I also demonstrate some manual navigation using the sonar.

I noticed the transducers on the robot looked exactly like an old-school camera rangefinder. Turns out they're literally the same part. After a little research, I found a modern replacement (blue PCB in the right image) for the board that contains the driving electronics for the sonar transducer. It has the same layout as the older version inside the robot (tan PCB in the right image). This sonar driver board is mounted above a multiplexing board, which is essentially an array of solid-state relays that connects one transducer at a time to the driving board. This means that multiple transducers cannot operate simultaneously, but only one set of driving electronics is required for all 16 transducers.

The modern PCB I found is all surface mount instead of through hole, but its datasheet lists the connector pinouts, which match the older board. My old part is still functional and I didn't order a replacement, but that datasheet was extremely helpful for figuring out how to integrate the data into my microcontrollers. A set of 4 digital input pins selects which transducer to activate; then there's a trigger pin and an echo pin for measuring the time it takes for the ping to return.
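To make the select-line logic concrete, here's a minimal sketch of how a channel index maps onto the 4 select pins. The bit ordering here is my assumption for illustration; the actual pin assignment comes from the datasheet.

```cpp
#include <array>
#include <cstdint>

// Decompose a transducer index (0-15) into the four select-line
// levels the multiplexing board reads. Bit 0 is treated as the least
// significant select pin (an assumption; the real board may order
// the pins differently).
std::array<bool, 4> selectLines(uint8_t channel) {
    std::array<bool, 4> lines{};
    for (int bit = 0; bit < 4; ++bit)
        lines[bit] = (channel >> bit) & 1;
    return lines;
}
```

On the Arduino side, each of these four levels would be written out with a digital output pin before raising the trigger line.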

I beeped out the black multiplexing board to determine what each pin on the input ribbon cable does. Then I tested a single transducer with an Arduino: it energizes the associated relay, then sends a trigger pulse and waits for the echo, calculating distance from the speed of sound. The IC on my protoboard connected to the Arduino is just an IO expander; I needed it because the sonar board pulls a surprising amount of current through its signal pins, more than the Arduino pins alone could supply.

I soldered the sonar Arduino into a protoboard which breaks out the required connections (left). The middle image shows my original breakout board for the Arduino which interfaces with the motors; I added some connections to read the motor encoders for dead reckoning. I kept these two Arduinos separate because both were doing timing-critical tasks that required delays and interrupts, so it made sense to keep the sonar tasks apart from the encoder tasks. Lastly, I added an ESP32 (right) which lives outside the Faraday-cage robot body. Its only job is to communicate data over WiFi to my PC. The three microcontrollers communicate with each other bidirectionally over I2C.
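As an example of what the I2C traffic could look like, here's a hypothetical packing scheme for shipping all 16 sonar ranges in one transaction; the actual message format in my code may differ. Keeping the payload at 32 bytes matters on AVR Arduinos, where the Wire library buffers at most 32 bytes per transfer.

```cpp
#include <array>
#include <cstdint>

// Hypothetical wire format: 16 ranges, each a 16-bit value in
// millimeters, little-endian, 32 bytes total.
std::array<uint8_t, 32> packRanges(const std::array<uint16_t, 16>& mm) {
    std::array<uint8_t, 32> buf{};
    for (int i = 0; i < 16; ++i) {
        buf[2 * i]     = mm[i] & 0xFF;                  // low byte
        buf[2 * i + 1] = (mm[i] >> 8) & 0xFF;           // high byte
    }
    return buf;
}

std::array<uint16_t, 16> unpackRanges(const std::array<uint8_t, 32>& buf) {
    std::array<uint16_t, 16> mm{};
    for (int i = 0; i < 16; ++i)
        mm[i] = buf[2 * i] | (uint16_t(buf[2 * i + 1]) << 8);
    return mm;
}
```

The sonar Arduino would fill and send a buffer like this, and the ESP32 would unpack it before forwarding the readings over WiFi.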

The left image above shows my motor control board, my sonar interface board, and the original sonar multiplexer mounted inside the turret of the robot. An I2C hub is mounted between my two protoboards. The right image is the robot's original mainboard, which I am no longer using; it once lived below the multiplexing board. I mounted all of the boards in place with 3D printed brackets that use existing threaded holes in the robot. The ESP32 is mounted outside the aluminium body, on top, in a white 3D printed box so it can receive a WiFi signal.

The original tires were mostly disintegrated and missing large chunks. I printed new ones out of TPU before my driving tests to keep my dead reckoning as accurate as possible: missing chunks and an irregular wheel diameter would make the conversion between encoder pulses and linear travel inaccurate.

I wrote this basic GUI in Processing, then used it to control the robot. The black circle in the middle represents the robot's position, with the green line pointing in its steering direction. The red lines all around it show the readings from each of the 16 ultrasonic rangefinders, with the rays scaled to match the measured distances. The "X" and "Y" values in the upper left corner are the dead-reckoned position in Cartesian feet, where the origin is the point at which the robot was turned on. The four buttons at the bottom represent arrow keys and turn green when the corresponding key is pressed on the keyboard. The numbers on the left side of the screen are just printed values for debugging purposes.
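The X and Y values come from standard differential-drive dead reckoning: average the two wheel travels for forward motion, and take their difference over the track width for rotation. A sketch of one update step, with a placeholder track width rather than the N150's measured value:

```cpp
#include <cmath>

struct Pose { double x, y, theta; }; // feet, feet, radians

// One dead-reckoning update from incremental left/right wheel travel.
// Integrating at the midpoint heading reduces error on curved paths.
Pose deadReckon(Pose p, double leftFt, double rightFt,
                double trackWidthFt = 1.5) {
    double forward = (leftFt + rightFt) / 2.0;
    double dTheta  = (rightFt - leftFt) / trackWidthFt;
    p.x += forward * std::cos(p.theta + dTheta / 2.0);
    p.y += forward * std::sin(p.theta + dTheta / 2.0);
    p.theta += dTheta;
    return p;
}
```

Driving straight from the origin, equal wheel travel moves the pose purely along X, which matches the GUI's origin-at-power-on convention.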

Here are some images of me driving the robot around, using its sonar to avoid obstacles. There is a camera mounted on top, facing parallel to sonar 0 (the blue line). The turret rotates under a PID loop so that sonar 0 always faces the steering direction. I was driving blind; the camera was just there to capture footage to review after the fact. The mechanical sonar transducers have a narrower beam width than I expected and could resolve a narrow doorway fairly easily.

I plan to keep adding functionality to this robot in the near future. I’ll upgrade to a more powerful main microcontroller that can run ROS2 and experiment with some SLAM packages to create detailed maps of rooms and navigate more autonomously.

Click the box above to view the code I wrote for this project on GitHub. This includes the code running on each of the 3 microcontrollers, plus the Processing GUI.
