Posted in CNC Router

How to connect the Omron EE-SX671 Limit Switch to the KFLOP

Augmented Reality JoyStick Part 2

Limit switches are very important for machine safety and longevity.  The switches prevent the machine from moving past its mechanical limits.  When the machine reaches a limit, the switch signals the motion control system to disable the motor.  This is especially critical when the stage is moving at high speed.  Without the switches, parts can fly off the machine and potentially injure the operator.  This project uses the Omron EE-SX671, which is an optical switch.

The Basic Circuit

Electrically, the EE-SX671 behaves like any other switch.  It is connected to a digital input on the Dynomotion Snap Amp with a pull-up resistor.  Here is a simplified version of the circuit.

The EE-SX671 is an optical limit switch

The EE-SX671 is an optical switch.  The sensor operates by sending a beam of light across a slot; one side is the emitter, the other is the receiver.  When the light between the emitter and receiver is blocked, the switch changes state.  The advantage of an optical switch is that it is immune to electrical noise.  The disadvantage is that it is affected by external light.

Know the configuration of your switch: PNP or NPN

Since the part number is EE-SX671, with no P or R suffix, the part is NPN.  It's very important to know the difference.  If the part is hooked up wrong, it could be damaged.

Here is the PNP configuration.  We are not going to use it, but we will keep it for future reference.

Some of the specifications of the switch

The important specifications to know are the operating voltage and the maximum current the switch can handle.  The switch operates from 5 to 24 VDC and can handle 35 mA max.

Connecting the switch to the Snap Amp

The following circuit is the DIO signal to the Snap Amp.  We connect the 5 volts of the switch to the OPTO_PLUS signal.  With the 470 ohm resistor, the switch draws about 10 mA, well under the 35 mA limit.

Next we connect the OPTO_NEG signal to the OUT signal of the switch.

Each axis requires two switches: one for the min position and one for the max.  Only one 5 volt power supply is needed for all of the limit switches.

The limit switches are connected to:

Axis    Signal on Switch    Signal on Snap Amp    Pin    Bit (SnapAmp #0)
X min   +5V                 OPTO_PLUS_0           32     72
        OUT                 OPTO_NEG_0            31
X max   +5V                 OPTO_PLUS_1           34     73
        OUT                 OPTO_NEG_1            33
Y min   +5V                 OPTO_PLUS_2           36     74
        OUT                 OPTO_NEG_2            35
Y max   +5V                 OPTO_PLUS_3           38     75
        OUT                 OPTO_NEG_3            37
Z min   +5V                 OPTO_PLUS_4           36     76
        OUT                 OPTO_NEG_4            35
Z max   +5V                 OPTO_PLUS_5           38     77
        OUT                 OPTO_NEG_5            37

Defining the Limit Switches in the KFLOP.

How do you use this information in the KFLOP?  You came to the right place.

Step 1:  Run KMotion.exe and Load the C Program for your setup.

You should see something like this dialog.

Step 2:  Open the “Config & Flash”

This is where you set the limit switch parameters.

Step 3:

Open the “Digital IO” dialog for the Snap Amp.  Make sure you connect the 5 volts to your limit switches.  You should see the following dialog.

Place a piece of paper in the slot of the limit switch.  You will see the state in the check box change from UNCHECKED to CHECKED.  Initially, I was not sure whether the “Unchecked” state was HIGH or LOW; it is LOW.  You might want to verify this yourself.

The C program I am using seems to work.  The KFLOP definitely stopped motion when I moved into the limits.

Best of luck setting up the limit switches.  Don’t forget to export the settings and save your program.

Posted in Computer Vision

How to use (AR) Augmented Reality to Control your CNC Machine

(Augmented Reality JoyStick Part 1)

Prerequisites

To get the most benefit from this blog you should have some background in the following areas: computer vision applications with OpenCV, marker tracking, and C#/C++ programming.

Introduction

In this article I will demonstrate how to develop an Augmented Reality (AR) application to move your CNC machine.   I call the application the AR Joystick.   The AR Joystick interfaces with a camera and displays the video.  It operates by searching for markers on the machine.   When it detects the markers, the app draws a green box around them and draws lines for the X and Y axes.  If no box is drawn, the marker has not been detected.  The markers determine where the machine can move.

The initial idea came from the book “Arduino Computer Vision Programming”, Chapter 9, “Building a Click-To-Go Robot”, by Ozen Ozkaya.   The chapter explains how to write a video application that moves a robot from a video window using the mouse.

About this blog subject

This topic could easily generate at least 10 additional posts.  I will make this post an overview and cover the details in a follow-up series, spending most of the time here on the software.  Otherwise the post would be too long for anyone to read.

The Setup for the AR Joystick

Here are the parts used.

  1. PC running Windows 10 x64
  2. OpenCV 3.2.0 – Aruco (Augmented Reality Library from the University of Cordoba)
  3. Visual Studio 2015 Community Edition
  4. An old CNC machine.
  5. Dynomotion KFLOP motion controller.
  6. Dynomotion Snap Amp DC Servo drive.
  7. Logitech WebCam
  8. 2 Laser Markers from Harbor Freight
  9. XBox Joystick.
  10. 3 AMT102 encoders.
  11. Power supply 24 VDC.

Refurbishing an old CNC machine

This is the 3rd machine that I have worked on.  Refurbishing the machine went faster than expected because most of the parts were still working.  Only the DC servo motors were outdated: they used analog tachometers, which were replaced with encoders from CUI, Inc.

I used the Dynomotion KFLOP Motion Controller again for this project.  What is different between this machine and the previous CNC machine?  This time I used the Dynomotion Snap Amp to drive the servo motors instead of the Gecko drives.  The Snap Amp was easier to use.

Writing the software for the AR Joystick

The AR Joystick uses two programs: the CNC Machine Client and the CNC Video Server.  The client is written in C#; the server is written in C++.  The server tracks the markers to set up the origin and the X and Y axes, and tells the client where to move.

CNC Machine Client program with the Xbox Game Controller.

The CNC Machine client software uses the Xbox Game Controller to move the machine.

The client moves the machine and displays the XYZ location.   When the client is connected to the server, the server tells it where to move.  When it is not connected, the Xbox joystick controls the client.  To connect the client to the server, hit the “Connect Pipe” button.

This is what the client looks like.

CNC Machine client

The CNC Video Server Program.

This is where the fun begins.  This is where we apply Augmented Reality and Robot Computer Vision to the project.

The “CNC Video Server” shows two images of the same scene.  The image on the right is the perspective view; the image on the left is the 2D view.  The server acquires the video shown on the right and transforms the image into 2D using the OpenCV warpPerspective function.

The image on the left is where the user controls the machine movements.  All the user has to do is click the mouse in the video and the machine moves!!

CNC Video Server

Augmented Reality ARUCO markers to move the machine

The main purpose of the server is to track 4 ARUCO markers and set up a machine coordinate system based on their orientation.  Each marker has a specific purpose:

  • Marker 1 is the origin.
  • Marker 3 specifies the X-axis.
  • Marker 2 specifies the Y-axis.
  • Marker 4 is optional.

The green lines in the video are the X and Y axes.  The red lines are projected from the laser markers mounted on the machine; they show the actual machine position in the live video.

Video Server 3D Perspective View

Video Server 2D View

The server aligns the perspective image into a 2D image.  The distance between the markers is known to the server, and it defines the scaling, in pixels/mm, for each axis.

When the user clicks the mouse in the 2D window, the server detects the pixel XY location and converts the pixels into inches.  Next, the program sends the XY values to the CNC client.  When the client receives the XY values, it moves the machine to the specified XY coordinates.

Applying a Perspective Transform and Warping the Live Video.

The OpenCV server displays two images of the same scene: one window shows the perspective view, the other shows a 2D view.  Here is the OpenCV snippet that transforms the video.

The vector pts_corners contains the 4 center points of the AR markers in the perspective view.  The term “vector” refers to the C++ Standard Template Library container.

The vector pts_dst contains the 4 center points of the AR markers in the 2D view.  Both vectors are used to find the homography matrix, which maps the perspective image onto a 2D image.

// Center points of the four detected markers in the perspective view
pts_corners.push_back(Point2f(average[0]));
pts_corners.push_back(Point2f(average[1]));
pts_corners.push_back(Point2f(average[2]));
pts_corners.push_back(Point2f(average[3]));

// Fixed destination points for the same markers in the 2D view
pts_dst.push_back(Point2f(80, 400));
pts_dst.push_back(Point2f(80, 80));
pts_dst.push_back(Point2f(500, 400));
pts_dst.push_back(Point2f(500, 80));

// Compute the homography and warp the live image into the 2D view
Mat h = findHomography(pts_corners, pts_dst);
warpPerspective(imageCopy, im_out, h, imageCopy.size());
imshow("XYView", im_out);

Handling Mouse Events in OpenCV

Mouse events in OpenCV are handled with a callback, which is a pointer to a function.  When the user clicks the mouse in the 2D window, the server generates an event for the callback function to process.  The callback function returns the location of the mouse.  This code is very common in other blogs and articles; it will look like the following snippet.

setMouseCallback("XYView", CallBackFunc, &stagePt);

The callback function will look something like:

void CallBackFunc(int event, int x, int y, int flags, void* ptr)
{
    if (event == EVENT_LBUTTONDOWN)
    {
        // Do something
    }
}

Details that I am leaving out for now

I am leaving out a lot of details.  The details will be covered in the next posts if someone wants more information.  Otherwise this post would be too long.

How to Use the Omron EE-SX671 Limit Switches (Part 2)

This article explains how to use these switches.

Creating ARUCO Markers for the Coordinate System (Part 3)

You will need to create the 3 or 4 markers for the coordinate system.  For more information, refer to OpenCV Basics 21 – Aruco Marker Tracking on YouTube.

Camera Calibration (Part 4)

Calibration is very important for the software.  Without calibration, the machine movements would not be as accurate.  For more information, refer to OpenCV Basics 14 – Camera Calibration on YouTube.

Updating an Old CNC Machine:  Servo Tuning for KFLOP and Snap Amp (Part 5)

If anyone is interested in a blog about this subject, let me know.

Video camera Controller

The camera is a simple Logitech 1080p webcam.  It costs about $70.  To write software to control the camera, refer to OpenCV Basics 12 – Webcam & Video Capture on YouTube.

Using Pipes to communicate between the programs.

Named pipes were used for the client and server to talk to each other.

 

Limitations:

I need to emphasize: THE SOFTWARE IS NOT PRODUCTION CODE.  I would not put the code in mission-critical applications.  The software is only a prototype and was written only to prove a simple concept.  Also, the accuracy of the stage movements is not great.

Credits

AMT102 Encoders from CUI Inc.

George Lecakes – OpenCV Tutorial on YouTube.  I highly recommend these videos.  There are 21 videos each one is about 10 to 15 minutes long.

OpenCV Basics 01- Getting setup in Visual Studio 2015 for Windows Desktop.

OpenCV Basics 11- Building OpenCV Contribute to CMAKE.

OpenCV Basics 12- Webcam & Video Capture.

OpenCV Basics 14-Camera Calibration Part 1, Part 2, Part 3, Part 4

OpenCV Basics 21-Aruco Marker Tracking.

“Click-to-Go” Robot – Arduino Computer Vision Programming Chapter 9.

Software for the Xbox Game Controller.

* An event-driven C# wrapper for XInput devices.

* Author: Jean-Philippe Steinmetz <caskater47@gmail.com>

* Homepage: http://www.caskater4.com/engineering/xgamepad

Drawing maps with Robots, OpenCV and Raspberry Pi.  Chris Anderson

https://medium.com/@cadanderson/drawing-maps-with-robots-opencv-and-raspberry-pi-3389fa05b90f

Video of the AR Joystick