
How to Use Augmented Reality (AR) to Control Your CNC Machine

(Augmented Reality Joystick Part 1)

Prerequisites

To get the most benefit from this blog you should have some background in the following areas: computer vision applications with OpenCV and marker tracking, and C#/C++ programming.

Introduction

In this article I will demonstrate how to develop an Augmented Reality (AR) application to move your CNC machine. I call the application the AR Joystick. The AR Joystick interfaces with a camera and displays the video. It operates by searching for markers on the machine. When it detects the markers, the app draws a green box around each marker and draws lines for the X and Y axes; if no box is drawn, the marker was not detected. The markers determine where the machine can move.

The initial idea came from the book “Arduino Computer Vision Programming” by Ozen Ozkaya, Chapter 9, “Building a Click-To-Go Robot”. The chapter explains how to write a video application that moves a robot from a video window using the mouse.

About this blog subject

This blog could easily generate at least 10 additional blogs. To keep this post readable, I will treat it as an overview of a series and spend most of the time on the software; the follow-up posts will cover the rest in detail.

The Setup for the AR Joystick

Here are the parts used:

  1. PC running Windows 10 x64
  2. OpenCV 3.2.0 – ArUco (augmented reality library from the University of Córdoba)
  3. Visual Studio 2015 Community Edition
  4. An old CNC machine
  5. Dynomotion KFLOP motion controller
  6. Dynomotion Snap Amp DC servo drive
  7. Logitech webcam
  8. 2 laser markers from Harbor Freight
  9. Xbox joystick
  10. 3 AMT102 encoders
  11. 24 VDC power supply

Refurbishing an old CNC machine

This is the third machine I have worked on. Refurbishing it went faster than expected because most of the parts still worked. Only the DC servo motors were outdated: they used analog tachometers, which I replaced with encoders from CUI, Inc.

I used the Dynomotion KFLOP motion controller again for this project. What is different between this machine and the previous one? I drove the servo motors with the Dynomotion Snap Amp instead of the Gecko drives; the Snap Amp was easier to use.

Writing the software for the AR Joystick

The AR Joystick uses 2 programs: the CNC Machine Client, written in C#, and the CNC Video Server, written in C++. The server tracks the markers to set up the origin and the X and Y axes, and tells the client where to move.

CNC Machine Client program with the Xbox Game Controller.

The CNC Machine client software uses the Xbox Game Controller to move the machine.

The client moves the machine and displays the XYZ location. When the client is connected to the server, the server tells it where to move; when it is not connected, the Xbox joystick controls the machine. To connect the client to the server, hit the “Connect Pipe” button.

This is what the client looks like.

CNC Machine client

The CNC Video Server Program.

This is where the fun begins.  This is where we apply Augmented Reality and Robot Computer Vision to the project.

The “CNC Video Server” shows 2 images of the same scene. The image on the right is the perspective view; the image on the left is the 2D view. The server acquires the video shown on the right and transforms it into the 2D view using the OpenCV warpPerspective function.

The image on the left is where the user controls the machine movements. All the user has to do is click the mouse in the video and the machine moves!

CNC Video Server

Augmented Reality ARUCO markers to move the machine

The main purpose of the server is to track 4 ArUco markers and set up a machine coordinate system based on their orientation. Each marker has a specific purpose:

  • Marker 1 is the origin.
  • Marker 3 specifies the X-axis.
  • Marker 2 specifies the Y-axis.
  • Marker 4 is optional.

The green lines in the video are the X and Y axes. The red lines are projected from the laser markers mounted to the machine; they show the actual machine position in the live video.
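
The marker tracking itself is not shown later in the code snippets, so here is a minimal sketch of how the server could find the markers and draw the green boxes with the OpenCV ArUco module. The dictionary choice and the names are my assumptions, not the author's exact code.

#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>

// Detect the ArUco markers in a frame, draw the green boxes, and return
// the center of each marker. DICT_4X4_50 is an assumed dictionary; the
// detected ids tell you which center belongs to marker 1, 2, 3, or 4.
void detectMarkerCenters(cv::Mat& frame, std::vector<cv::Point2f>& centers)
{
    cv::Ptr<cv::aruco::Dictionary> dict =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);

    std::vector<int> ids;
    std::vector<std::vector<cv::Point2f>> corners;
    cv::aruco::detectMarkers(frame, dict, corners, ids);

    // A marker's center is the average of its 4 detected corners.
    for (size_t i = 0; i < corners.size(); i++)
    {
        cv::Point2f c(0, 0);
        for (const cv::Point2f& p : corners[i])
            c += p * 0.25f;
        centers.push_back(c);
    }

    // Draws the green outlines and ids seen in the video window.
    cv::aruco::drawDetectedMarkers(frame, corners, ids);
}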

Video Server 3D Perspective View

 

Video Server 2D View

 

The server aligns the perspective image into a 2D image. The distance between the markers is known to the server; it defines the scaling (pixels/mm) for each axis.

When the user clicks the mouse in the 2D window, the server detects the XY pixel location and converts the pixels into inches. The program then sends the XY values to the CNC client, which moves the machine to the specified XY coordinates.
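
As a concrete illustration, here is a hedged sketch of that conversion. The travel distances and the sendMoveToClient helper are my own illustrative assumptions, not the author's actual values; the 80–500 and 80–400 pixel ranges come from the pts_dst corners shown later.

#include <cstdio>

// Hypothetical helper: in the real program this would write to the named pipe.
void sendMoveToClient(double x, double y)
{
    printf("MOVE X%.3f Y%.3f\n", x, y);
}

// Convert a clicked pixel in the 2D window into machine coordinates.
void onMouseClick(int px, int py)
{
    const double xTravelInches = 12.0;   // assumed X travel of the machine
    const double yTravelInches = 9.0;    // assumed Y travel of the machine

    // Scale the click from the 2D-view pixel range into inches.
    double x = (px - 80.0) * xTravelInches / (500.0 - 80.0);
    double y = (py - 80.0) * yTravelInches / (400.0 - 80.0);

    sendMoveToClient(x, y);
}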

Applying a Perspective Transform and Warping the Live Video

The OpenCV server displays 2 images of the same scene: one window shows the perspective view, the other the 2D view. Here is the OpenCV snippet that transforms the video.

The vector pts_corners holds the 4 center points of the AR markers in the perspective view. (“Vector” here refers to the C++ Standard Template Library container.)

The vector pts_dst holds the same 4 center points, but in the 2D view. These two vectors are used to find the homography matrix, which maps the perspective image onto a 2D image.

// Marker centers as seen in the perspective (camera) view.
pts_corners.push_back(Point2f(average[0]));
pts_corners.push_back(Point2f(average[1]));
pts_corners.push_back(Point2f(average[2]));
pts_corners.push_back(Point2f(average[3]));

// Where those centers should land in the flattened 2D view.
pts_dst.push_back(Point2f(80, 400));
pts_dst.push_back(Point2f(80, 80));
pts_dst.push_back(Point2f(500, 400));
pts_dst.push_back(Point2f(500, 80));

// Compute the homography and warp the live frame into the 2D view.
Mat h = findHomography(pts_corners, pts_dst);
warpPerspective(imageCopy, im_out, h, imageCopy.size());
imshow("XYView", im_out);

Handling Mouse Events in OpenCV

OpenCV handles mouse events with a callback, which is a pointer to a function. When the user clicks the mouse in the 2D window, the server generates an event for the callback function to process, and the callback receives the location of the mouse. This pattern is common in other blogs and articles; the code looks like the following snippet.

setMouseCallback("XYView", CallBackFunc, &stagePt);

The callback function will look something like:

void CallBackFunc(int event, int x, int y, int flags, void* ptr)
{
    if (event == EVENT_LBUTTONDOWN)
    {
        // x and y are the clicked pixel; ptr is the user data
        // passed to setMouseCallback (&stagePt above).
    }
}

 

Details that I am leaving out for now

I am leaving out a lot of details. They will be covered in the next blogs if someone wants more information; otherwise this blog would be too long.

How to Use the Omron EE-SX671 Limit Switches (Part 2)

A follow-up article will explain how to use these switches.

 

Creating ARUCO Markers for the Coordinate System (Part 3)

You will need to create the 3 or 4 markers for the coordinate system. For more information, refer to OpenCV Basics 21 – Aruco Marker Tracking on YouTube.
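
If it helps in the meantime, here is a minimal sketch for generating the marker images with the OpenCV ArUco module; the dictionary and image size are assumptions on my part.

#include <string>
#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>

// Generate marker images 1 through 4 as PNG files for printing.
// The DICT_4X4_50 dictionary and 200-pixel size are assumptions.
int main()
{
    cv::Ptr<cv::aruco::Dictionary> dict =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);

    for (int id = 1; id <= 4; id++)
    {
        cv::Mat marker;
        cv::aruco::drawMarker(dict, id, 200, marker);
        cv::imwrite("marker" + std::to_string(id) + ".png", marker);
    }
    return 0;
}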

 

Camera Calibration (Part 4)

Calibration is very important for this software; without it, the machine movements would not be as accurate. For more information, refer to OpenCV Basics 14 – Camera Calibration on YouTube.
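
For reference, the heart of that process is a single call to cv::calibrateCamera. Here is a compact sketch, assuming a 9x6 chessboard with 25 mm squares and image corners already collected with cv::findChessboardCorners.

#include <opencv2/opencv.hpp>

// imagePoints holds the detected chessboard corners from several frames.
// Board dimensions and square size below are assumptions for illustration.
void calibrate(const std::vector<std::vector<cv::Point2f>>& imagePoints,
               cv::Size imageSize)
{
    // One set of 3D board corners, reused for every captured view.
    std::vector<cv::Point3f> board;
    for (int r = 0; r < 6; r++)
        for (int c = 0; c < 9; c++)
            board.push_back(cv::Point3f(c * 25.0f, r * 25.0f, 0.0f));
    std::vector<std::vector<cv::Point3f>> objectPoints(imagePoints.size(), board);

    cv::Mat cameraMatrix, distCoeffs;
    std::vector<cv::Mat> rvecs, tvecs;
    cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                        cameraMatrix, distCoeffs, rvecs, tvecs);
    // cameraMatrix and distCoeffs can then be passed to cv::undistort.
}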

 

Updating an Old CNC Machine:  Servo Tuning for KFLOP and Snap Amp (Part 5)

If anyone is interested in a blog about this subject, let me know.

 

Video Camera Controller

The camera is a simple Logitech 1080p webcam that costs about $70. To write software to control the camera, refer to OpenCV Basics 12 – Webcam & Video Capture on YouTube.
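
A minimal capture loop along those lines might look like this sketch; the device index is an assumption.

#include <opencv2/opencv.hpp>

// Open the first attached camera and show live frames until Esc is pressed.
int main()
{
    cv::VideoCapture cap(0);   // device index 0 is an assumption
    if (!cap.isOpened())
        return -1;

    cv::Mat frame;
    while (cap.read(frame))
    {
        cv::imshow("Webcam", frame);
        if (cv::waitKey(30) == 27)   // Esc quits
            break;
    }
    return 0;
}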

 

Using Pipes to Communicate Between the Programs

Named pipes are used for the client and the server to talk to each other.
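
For what it is worth, here is a rough sketch of the server side of such a pipe using the Win32 API. The pipe name and message format are illustrative; the actual protocol between the two programs may differ.

#include <windows.h>
#include <cstdio>

// Create an outbound named pipe, wait for the client, and send one move.
int main()
{
    HANDLE pipe = CreateNamedPipeA(
        "\\\\.\\pipe\\CncJoystick",        // assumed pipe name
        PIPE_ACCESS_OUTBOUND,
        PIPE_TYPE_MESSAGE | PIPE_WAIT,
        1, 512, 512, 0, NULL);
    if (pipe == INVALID_HANDLE_VALUE)
        return -1;

    // Blocks until the CNC Machine Client presses "Connect Pipe".
    if (ConnectNamedPipe(pipe, NULL))
    {
        char msg[64];
        int len = sprintf_s(msg, "MOVE %.3f %.3f", 1.250, 0.500);
        DWORD written;
        WriteFile(pipe, msg, (DWORD)len, &written, NULL);
    }
    CloseHandle(pipe);
    return 0;
}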

 

Limitations:

I need to emphasize: THE SOFTWARE IS NOT PRODUCTION CODE. I would not put this code in mission-critical applications. The software is only a prototype, written to prove a simple concept. Also, the accuracy of the stage movements is not great.

Credits

AMT102 Encoders from CUI Inc.

George Lecakes – OpenCV Tutorial on YouTube. I highly recommend these videos. There are 21 videos, each about 10 to 15 minutes long.

OpenCV Basics 01 – Getting Setup in Visual Studio 2015 for Windows Desktop.

OpenCV Basics 11 – Building OpenCV Contribute with CMake.

OpenCV Basics 12 – Webcam & Video Capture.

OpenCV Basics 14 – Camera Calibration, Parts 1–4.

OpenCV Basics 21 – Aruco Marker Tracking.

“Click-to-Go” Robot – Arduino Computer Vision Programming Chapter 9.

Software for the Xbox Game Controller.

* An event-driven C# wrapper for XInput devices.

* Author: Jean-Philippe Steinmetz <caskater47@gmail.com>

* Homepage: http://www.caskater4.com/engineering/xgamepad

Drawing maps with Robots, OpenCV and Raspberry Pi.  Chris Anderson

https://medium.com/@cadanderson/drawing-maps-with-robots-opencv-and-raspberry-pi-3389fa05b90f

 

Video of the AR Joystick


DIY CNC Engraving with the KFLOP and CAMBAM

Hello, I am back again. I was side-tracked working on an open-source PLC, which I will cover in another post. This post is about engraving patterns using your DIY CNC mill with the KFLOP and CAMBAM. CAMBAM makes engraving patterns easy: you simply open a bitmap in CAMBAM. It is so easy, even a caveman can do it. When you open a bitmap, CAMBAM converts the edges in the image to polylines. I have provided the instructions and a video for the engravings.

How to open an Image in CAMBAM.

In this section I used the Boston Marathon logo as an example. The logo looked like a fun project, and I am a runner; my wife and I ran the marathon in 2013. Here are the steps.

Step 1. Open the image from the “HeightMap Generator” under the CAMBAM “Plugins” menu, as shown below.

Step 2. Open the image into CAMBAM.

Step 3. Set your “HeightMap Generator” options.

Be aware that the XStep and YStep sizes are important. A smaller step produces a better engraved image, but the machine takes longer to carve the pattern. A 4″ x 4″ piece of wood with a 0.04″ step size and a 6 in/min feed rate takes about 2 hours. With a 0.02″ step size, the same part takes over 3 hours. This blog will show the difference between the engravings.
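
As a rough sanity check on those times: a 4″ wide part at a 0.04″ step is about 4 ÷ 0.04 = 100 scan lines, each roughly 4″ long, or about 400″ of cutting. At 6 in/min that is roughly 67 minutes of feed, and the Z moves and repositioning between lines push the total toward the 2 hours quoted. Halving the step to 0.02″ doubles the line count, which matches the 3+ hour runs.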

 

Step 4.  Generate the HeightMap

Step 5.  Generate the toolpaths and G-Code.

Video of CAMBAM generating G-Code from a Bitmap.

The video below will show you how to generate G-Code from the “HeightMap Generator” in CAMBAM.

 

 

Video of the CNC Mill engraving the Image on wood.

This video shows the machine engraving the Boston Marathon logo. The CNC machine uses the Dynomotion KFLOP to perform the engraving. CAMBAM generated the G-Code from a bitmap.

Final Products.

Before I engraved the Boston Marathon logo, I started with a business card to test the first engraving. I took a photo of the business card and converted it to a black-and-white bitmap, which seemed to work better than color.

The business card is from Run 26, a running store in Mill Creek, Washington. If you want to get in shape for long-distance running, this is the store. They can help you select the right gear to start running, and they cater to all runners: fast and slow, old and young. The owner is an awesome and knowledgeable coach.

Engraving with different Step sizes.

The step size determines the quality of the engraving. The image on the left was engraved with the step sizes set to 0.04″ for both X and Y. The image on the right used 0.02″ for both X and Y; that engraving was stopped after 3 hours and would have taken another hour to finish.

If you have any questions or comments please contact us.

Also, best of luck to the runners in the Boston Marathon tomorrow, April 17, 2017.