NerdKits - electronics education for a digital generation


Project Help and Ideas » Real time in Windows OS

July 17, 2013
by sask55

This issue concerns using a Windows-based PC to communicate with and control a micro, and any other devices connected to the micro, in real time. The topic is not directly related to the NerdKits or to microprocessors but is more of a PC operating system issue, and for that reason may not be well suited for discussion on this forum. Nevertheless, I have come to value the information I have seen here and I am hoping someone has some insight or ideas. The problem has slowed progress on my mill project. I believe this issue could also be a problem on a temperature cable project I am considering, perhaps even more so, as real-time intervals of short duration appear to be very critical in that project. It may be best to try to do all the short, very time-sensitive processing on the micro, but I think there may still be some problems.

This is an issue that has become apparent to me over the last few months: it is not possible to get good, reliable, accurate real-time communication to or from a Windows-based computer. I have researched this issue on a number of other forums and the general consensus is that it cannot be done while using a Windows OS. A much simplified explanation, as I understand it, is that because the OS is threading a number of tasks, it cannot be guaranteed that the processor will be available to carry out any given task at any given time. There are more or less random delays on the order of several milliseconds that cannot be avoided.

Because of this, it seems that the standard software timers available in Windows-based C# and C are typically set in milliseconds, not microseconds. Timers that are intended to trigger any faster than about every 15 ms do not work as expected in C#. Microsecond timer functions are available, but any I have experimented with are very inefficient and very CPU heavy: basically the CPU is too busy spinning in a short loop inside the timer to do much of anything else at the same time.
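To make the "CPU heavy" point concrete, the usual workaround for sub-millisecond timing on Windows is a busy-wait on the high-resolution performance counter, which keeps a core spinning for the whole interval. A rough sketch in plain Win32 C (rather than C#; the 250 us figure is only an example, not a value from the project):

```c
#include <windows.h>
#include <stdio.h>

/* Spin until roughly 'microseconds' have elapsed. Accurate to a few
 * microseconds on most machines, but the core does nothing else while
 * it waits - which is why a 500 us period can bog down a single-core PC. */
static void busy_wait_us(long long microseconds)
{
    LARGE_INTEGER freq, start, now;
    QueryPerformanceFrequency(&freq);              /* counter ticks per second */
    QueryPerformanceCounter(&start);
    long long target = microseconds * freq.QuadPart / 1000000LL;
    do {
        QueryPerformanceCounter(&now);
    } while (now.QuadPart - start.QuadPart < target);
}

int main(void)
{
    /* Pretend to send a command byte every 250 us (4 kHz). */
    for (int i = 0; i < 4000; i++) {
        /* ... build and send the next command byte here ... */
        busy_wait_us(250);
    }
    puts("done");
    return 0;
}
```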

From trial and error I have found that on a fast quad-core PC I am able to set a C# micro timer to short intervals, down to say 100 us, and get a satisfactory result. On a single-processor PC, even a 500 us interval on the micro timer will more or less freeze the machine for doing much else.

For my CNC mill project I intended to have the command and control bytes originate from the PC. Each 8-bit command byte sent from the PC to the master micro contains two bits that "address" the other six bits of data. Using the address bits, the master determines where the six data bits are intended to be sent: the six control bits can go to the X, Y, or Z axis, or to power control pins on the master itself. Each of the three slave micros directly controls the input pins on a motor controller chip. In this way I am able to set the direction, motor torque level, and motor controller clock pin of any of the three motors with one 8-bit command byte.

In order to turn a motor one single step, the motor controller chip's clock pin must be toggled through one cycle, off and on. I have calculated that in order to advance the mill quickly from one position to another it would be nice to be able to toggle the controller clock pin at about 2 kHz. Since that requires two bytes per clock cycle, one to set it high and a second to clear it, I would like to send a command byte at a rate of about 4 kHz. That rate is well within the UART's capability. The issue I continue to struggle with is finding a timer that can be set and adjusted in the PC software down to intervals as short as 250 us, and as long as may be required for very slow movements, perhaps less than 1 byte/sec. There is also going to be some additional processing done between timer ticks to build the next byte to be sent out on the UART. By controlling the timing of the motor controller clock pin changes as well as the motor direction pin state and torque level pin states, the PC will have very fast, very fine control of the mill movements.
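For illustration, the command byte described above could be packed something like this; which two bits carry the address is my assumption, only the two-address-bit / six-data-bit split is fixed by the design:

```c
#include <stdint.h>

/* Two address bits select X, Y, Z or the master's own control pins;
 * the remaining six bits are the data. Here the address is assumed to
 * occupy the top two bits of the byte. */
enum target { TARGET_X = 0, TARGET_Y = 1, TARGET_Z = 2, TARGET_MASTER = 3 };

static uint8_t make_command(enum target t, uint8_t data6)
{
    return (uint8_t)(((t & 0x03) << 6) | (data6 & 0x3F));
}

/* One full step on the X axis is then two bytes, clock high then clock low
 * (assuming, purely for the example, that the clock pin is data bit 0):
 *
 *   uint8_t hi = make_command(TARGET_X, dir_and_torque_bits | 0x01);
 *   uint8_t lo = make_command(TARGET_X, dir_and_torque_bits & ~0x01);
 */
```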

Perhaps my entire approach is deeply flawed, and I may have to reconsider how I control the mill movements. I would like to stay with the Windows OS because there are a number of other related software products that I was intending to make use of for controlling the mill.

I am not at all certain that any of this is clear enough for anyone to comment on. I don't know what I am going to do; I may just use a very fast multi-core PC and dedicate it to the job, stripped down to a very clean boot, if I can make that work. I was somewhat surprised that the bottleneck in this idea would be the PC, where the processor is running more than 100 times faster than the micro, but all the overhead that comes with the Windows OS seems to be giving me grief. I would certainly not be disappointed to be proven wrong here.

Darryl


July 17, 2013
by pcbolt

Darryl -

I was actually reading something about this the other day in an article about general purpose operating systems (GPOS, like Windows, Mac, Linux) vs. real-time operating systems (RTOS, like ??????). There is a company called Interval Zero that comes highly recommended as the best at transforming Windows into an RTOS. I took a quick look through the site and could not find any pricing (and we all know what that means), but they do have older versions they might be able to sell relatively cheaply.

But before you go that route, you have to ask whether the MCUs can handle the job or not, since they are real time (though not strictly operating systems). In the scenarios you described above, it seems more like a buffering problem than anything else. If you can queue up a bunch of commands for the MCU to act on, and in free moments tell Windows to send more commands when buffer space is available, timing should not be an issue. Now if Windows has to react to input from sensors and quickly output a command to the MCU, you will need an RTOS solution. I'm sure there are open source Linux solutions to this as well, but if you are comfortable with Windows I'd see if I could get an older, bare minimum package from Interval Zero or the like.
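A rough sketch of the queuing idea on the AVR side, assuming the NerdKits ATmega168 at 14.7456 MHz and 115200 baud (register and vector names differ on other parts): the UART receive interrupt stores incoming command bytes in a small ring buffer, and a timer or main-loop routine drains them at a steady rate, so Windows only has to keep the buffer topped up rather than hit exact deadlines.

```c
#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

#define BUF_SIZE 64                        /* must be a power of two */

static volatile uint8_t buf[BUF_SIZE];
static volatile uint8_t head, tail;

/* UART receive interrupt: queue each command byte as it arrives. */
ISR(USART_RX_vect)
{
    uint8_t next = (head + 1) & (BUF_SIZE - 1);
    uint8_t data = UDR0;                   /* always read UDR0 to clear the flag */
    if (next != tail) {                    /* drop the byte if the queue is full */
        buf[head] = data;
        head = next;
    }
}

/* Called at the desired step rate (from a timer interrupt or the main loop);
 * returns 1 and fills *out if a command byte was waiting, 0 otherwise. */
static uint8_t pop_command(uint8_t *out)
{
    if (head == tail)
        return 0;
    *out = buf[tail];
    tail = (tail + 1) & (BUF_SIZE - 1);
    return 1;
}

int main(void)
{
    UBRR0L = 7;                            /* 115200 baud at 14.7456 MHz */
    UCSR0B = (1 << RXEN0) | (1 << RXCIE0); /* enable receiver + RX interrupt */
    sei();
    for (;;) {
        uint8_t cmd;
        if (pop_command(&cmd)) {
            /* act on the command byte (forward it to a slave, etc.) */
        }
    }
}
```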

July 18, 2013
by sask55

I will be taking a look at Interval Zero.

To a great extent you are correct; this issue is also related to buffering in the UART and other places. I believe I have some solutions for some of that.

For my mill project I can put up with short delays on the order of, say, <100 us, but the software on the PC must have reasonably real-time control of the motor controllers. There will be digital caliper position readings coming back from the mill to the PC from time to time as the readings are transmitted, and the software on the PC may be reacting to these readings at times. Also, if the intention is to advance the mill head one step every 500 us for some period of time at a more or less steady rate, then that is what I want, not moving in random sets of faster jumps as the PC finds time to send a buffered set of instructions at some interval that is out of the control of the software. The steady movement of the mill head is totally dependent on a set of control bytes arriving at the motor controller chips in real time. Since the PC controls the rate of movement of the head, this timing will change drastically depending on how fast the movement is intended to be.

As for the temperature cable project, I think I would like very tight timing control, if possible, to do any actual communication with the chips on the cable within a few microseconds in real time.

July 18, 2013
by scootergarrett

I think I had a similar problem where I wanted to use some type of 'real time interrupt' within a C program, and the #include <sys/timeb.h> resolution was only good to about 15 ms. It keeps good time over a long period but it doesn't update often enough, so I was getting results like this (I don't remember the time axis units but you get the idea), if that makes any sense. Anyway, could you use your MCU to send a ready-to-receive signal at set intervals, then just have your computer waiting?
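On the PC side, "just have your computer waiting" could look something like this rough Win32 C sketch: block on the serial port until the micro sends a ready byte, then reply with the next command, so the micro rather than Windows sets the pace. The COM port name, baud rate, and the 'R' ready byte are illustrative assumptions.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE port = CreateFileA("\\\\.\\COM3", GENERIC_READ | GENERIC_WRITE,
                              0, NULL, OPEN_EXISTING, 0, NULL);
    if (port == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "could not open COM3\n");
        return 1;
    }

    DCB dcb = { 0 };
    dcb.DCBlength = sizeof dcb;
    GetCommState(port, &dcb);
    dcb.BaudRate = CBR_115200;
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(port, &dcb);

    COMMTIMEOUTS to = { 0 };              /* all zeros = block until a byte arrives */
    SetCommTimeouts(port, &to);

    unsigned char next_command = 0x00;    /* built by the rest of the program */
    for (;;) {
        unsigned char ready;
        DWORD n = 0;
        if (!ReadFile(port, &ready, 1, &n, NULL) || n != 1)
            break;                        /* port error or closed */
        if (ready != 'R')                 /* micro signals readiness with 'R' */
            continue;
        DWORD written = 0;
        WriteFile(port, &next_command, 1, &written, NULL);
        /* ... build the next command byte for the following step here ... */
    }

    CloseHandle(port);
    return 0;
}
```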

July 18, 2013
by sask55

I have been trying to dream up a different approach to this by handing over more of the timing control to the micros. The problem I have is that nothing will remain constant for long periods of time. To make this work as I have planned, the control byte data stream coming from the PC to the mill will be a more or less constant stream of control bytes, spaced out in real time at a frequency that dictates the mill feed rate at that time. The mill control will change the frequency, or the delay between control bytes, to achieve whatever feed rate is required at any given time.

I have found that about 15 ms is the usable lower limit for timer-type interrupts available in C#. If I use that limit as the fastest rate at which I can send command bytes (i.e. one byte every 15 ms), that would produce a feed rate on the mill of about 1 inch/minute. That would be much too slow for most jobs. I actually have things working at that slow pace. In order to be practical for moving from one point to another, setup, etc., I would like to be able to send command bytes at least ten times, preferably twenty or thirty times, as often as that. The UART, the master and slave micros, and the motor controller chips have no problem handling a much faster rate of command bytes. I just can't seem to figure out a way to send the command bytes from a PC at a controlled real-time rate faster than that.
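(Working backwards from those numbers: one byte per 15 ms is about 67 bytes per second, or roughly 33 step-clock cycles per second since each step needs two bytes. If that works out to about 1 inch/minute, the mill is presumably geared for somewhere around 2000 steps per inch, so 10 to 30 inches/minute means one command byte roughly every 150 to 500 us, which is exactly the range the Windows timers cannot reliably hit.)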

In any event, this is not a new problem for me. I may have to make some major changes to my basic design, which I have been avoiding for some time now.

Thanks for the comments

July 18, 2013
by sask55

I don’t know why I didn’t think of this much earlier.

There is no reason I cannot carry out the control byte timing on the master MCU. I should be able to use timer interrupts on the MCU to send the control bytes to the appropriate slave with the correct timing. All the MCU needs to know is how long to delay before sending the next control byte on to the appropriate slave. I will need to set up a system to have the PC inform the master MCU each time the speed of the movement changes. This could be quite frequent as the mill changes back and forth between cutting and positioning movements.
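A rough sketch of what that could look like on the master, assuming the NerdKits ATmega168 at 14.7456 MHz (the slave link and the PC protocol for updating the feed rate are stand-ins, not the project's actual code): Timer1 runs in CTC mode so the compare-match interrupt fires once per control byte, and the PC only needs to send a new period value when the feed rate changes.

```c
#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

#define F_CPU 14745600UL

/* Placeholder: in the real project this forwards the six data bits to the
 * X, Y or Z slave over whatever link the master uses. */
static void send_to_slave(uint8_t command) { (void)command; }

/* Placeholder: pops the next control byte queued up by the PC
 * (e.g. the ring buffer sketched earlier in the thread). */
static uint8_t pop_command(uint8_t *out) { (void)out; return 0; }

/* Set the interval between control bytes, in microseconds. With a /8
 * prescaler one timer tick is 8/F_CPU seconds, so the 16-bit compare
 * register covers intervals up to about 35 ms; slower feed rates would
 * need a larger prescaler or a software counter on top (not shown). */
static void set_byte_interval_us(uint32_t us)
{
    uint32_t ticks = (uint32_t)(((uint64_t)us * (F_CPU / 8)) / 1000000UL);
    if (ticks < 1) ticks = 1;
    if (ticks > 65536UL) ticks = 65536UL;
    OCR1A = (uint16_t)(ticks - 1);
}

ISR(TIMER1_COMPA_vect)
{
    uint8_t cmd;
    if (pop_command(&cmd))               /* next control byte, if one is queued */
        send_to_slave(cmd);
}

int main(void)
{
    TCCR1B = (1 << WGM12) | (1 << CS11); /* CTC mode, clk/8 */
    set_byte_interval_us(250);           /* ~4 kHz byte rate to start */
    TIMSK1 = (1 << OCIE1A);              /* enable compare-match interrupt */
    sei();
    for (;;) {
        /* receive bytes from the PC here: queue control bytes, and call
         * set_byte_interval_us() whenever the PC changes the feed rate */
    }
}
```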

I am thinking this will work. What it boils down to is that I don't have to worry about the timing of sending the control bytes to the master; I only have to be concerned about the timing of the master-to-slave communication.

Anyway, for now I am going to give up on trying to time the motor controller clock pin changes directly from the PC. It should be much more sensible to send the timing data to the master and do the timing there.
