Determinism

Determinism, when associated with PLCs, refers to the predictability of a control system’s response time. One of the big arguments against using PC-based systems is that they are “non-deterministic.” The term is probably somewhat misapplied, but it has stuck pretty well, and the argument is routinely leveled against PC-based systems by the manufacturers of PLCs.

This same argument was used for many years by Allen-Bradley against Ethernet networks versus their proprietary (and often very slow) networks, such as “Data Highway.”

I will readily acknowledge that “determinism” is a potential problem with PC-based systems, and with Ethernet as a network for industrial controls. Poorly written software can make a PC-based control system a complete disaster. The same can be said for PLCs, but the PLC will probably still execute the poorly written program in a somewhat “deterministic” manner. So the PLC doesn’t get the black eye for determinism that the PC does.

The dirty little secret of PLCs is how difficult it is to write good code for one. PLC programming lacks any real structure and has few, if any, real guidelines for producing quality code. I would argue that “ladder logic” almost encourages bad programming. It is much easier to write bad PLC code than good PLC code (assuming there is such a thing as good PLC code).

Let’s look at the Ethernet determinism argument. PLC vendors (Allen-Bradley, to name one) spoke out vocally for many years against using Ethernet as a communication network in control systems. At the time they were bashing Ethernet, they were selling network products based around very slow serial solutions such as Data Highway. Many Data Highway implementations ran at top speeds of under 150k baud; most ran in the 57k baud range. Ethernet was running at 10 megabits per second at that time. Although Data Highway was slow, it was “deterministic,” according to AB; Ethernet was not. One can’t deny AB’s basic argument, but given the raw speed (and cost) difference between the two networks, digging a little deeper seems appropriate.

Nominally, a 10BASE-T Ethernet network is roughly 175 times faster than a 57k Data Highway network, which means that under optimal conditions it could transfer roughly 175 times the information. We know that isn’t going to happen, but we also know that, on average, the Ethernet network will almost certainly outperform the Data Highway network. The issue becomes how often, if ever, the variable latency of the faster network causes a transaction to arrive later than the worst-case Data Highway message (and whether it matters).
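The raw arithmetic behind that ratio is easy to sketch. Here is a minimal back-of-the-envelope calculation; the 100 byte message size and the 10-bits-per-byte serial framing are illustrative assumptions, not measured protocol figures:

```python
# Back-of-the-envelope transfer-time comparison. The 100 byte message
# size and the 10-bits-per-byte serial framing are illustrative
# assumptions, not measured protocol figures.

MESSAGE_BITS = 100 * 10        # 100 bytes, ~10 bits/byte on a serial link
DH_BPS = 57_600                # typical Data Highway rate (bits/sec)
ENET_BPS = 10_000_000          # 10BASE-T raw rate (bits/sec)

dh_ms = MESSAGE_BITS / DH_BPS * 1000
enet_ms = MESSAGE_BITS / ENET_BPS * 1000

print(f"Data Highway: {dh_ms:.2f} ms per message")    # ~17.4 ms
print(f"10BASE-T:     {enet_ms:.2f} ms per message")  # ~0.10 ms
print(f"Raw speed ratio: {ENET_BPS / DH_BPS:.0f}x")   # ~174x
```

Even if the Ethernet protocol stack eats an order of magnitude of that advantage, there is an enormous amount of headroom left over.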

We have installed dozens of systems using Ethernet-based automation equipment running at 10 millisecond transaction times, and haven’t had problems with slow transactions becoming noticeable. That is not to say that we were pushing the network hard, but the point is that it is possible to get very reliable operation at 10 ms scan rates from standard Ethernet equipment in control systems.
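Putting a number like “reliable at 10 ms” on an installation is simply a matter of measuring it. Below is a minimal sketch of the kind of test one might run: poll a device over UDP and record the latency distribution, worst case included. The address, port, and poll payload are hypothetical placeholders, not a real device protocol:

```python
# Minimal sketch: characterize transaction latency to an Ethernet device
# by polling it over UDP and recording the distribution, worst case
# included. The address, port, and payload are hypothetical placeholders,
# not a real device protocol.
import socket
import time

TARGET = ("192.168.1.50", 5020)    # hypothetical remote I/O adapter
SAMPLES = 1000

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(0.5)

latencies = []
for i in range(SAMPLES):
    t0 = time.perf_counter()
    sock.sendto(b"poll", TARGET)
    try:
        sock.recvfrom(64)          # wait for the device's reply
        latencies.append((time.perf_counter() - t0) * 1000)
    except socket.timeout:
        print(f"sample {i}: no reply within 500 ms")

if latencies:
    latencies.sort()
    print(f"median: {latencies[len(latencies) // 2]:.2f} ms")
    print(f"worst:  {latencies[-1]:.2f} ms")
```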

Now that most network equipment is at least 100BASE-T, the argument becomes even less meaningful.

PC “Determinism”

The issue of determinism (or non-determinism) in PC-based controls is another discussion. There are timing issues with Windows-based computers that can’t be avoided completely. However, the issues can be well understood and dealt with very effectively in most cases. It is important to have a good idea of what latency is really tolerable in a control system. PLC manufacturers boast of highly deterministic systems, and that may be true. But don’t get the idea that the scan rate of every PLC system out there is down in the milliseconds. (And don’t get the idea that it has to be.)

I have seen SLC 500 systems running with scan rates of hundreds of milliseconds without causing noticeable problems in the overall picture. This is not to say that this performance level could be tolerated in all cases, but it is to say that scan rates of tens of milliseconds or less are not necessary in all cases. There are many systems in the industrial automation world that are controllable with loop times that stretch into the tenths of seconds. I would further argue that in most control applications, there are times when seconds could elapse without a response from the control system.
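Those Windows timing issues are easy to see for yourself. The sketch below asks the operating system for a fixed 10 ms tick and records how far past each scheduled tick the process actually wakes up; the spread of those overshoots is the “non-determinism” in question. (This is a generic measurement loop, not tied to any particular control package.)

```python
# Minimal sketch: measure wakeup jitter on a general-purpose OS. We ask
# for a fixed 10 ms tick and record how far past each scheduled tick the
# process actually wakes up. The spread of these overshoots is the
# "non-determinism" in question.
import time

PERIOD_S = 0.010    # 10 ms target tick
SAMPLES = 1000

overshoots = []
next_tick = time.perf_counter()
for _ in range(SAMPLES):
    next_tick += PERIOD_S
    delay = next_tick - time.perf_counter()
    if delay > 0:
        time.sleep(delay)
    overshoots.append((time.perf_counter() - next_tick) * 1000)

overshoots.sort()
print(f"median overshoot: {overshoots[len(overshoots) // 2]:.3f} ms")
print(f"99th percentile:  {overshoots[int(len(overshoots) * 0.99)]:.3f} ms")
print(f"worst overshoot:  {overshoots[-1]:.3f} ms")
```

Run on a loaded Windows box, the median will typically look fine; it is the tail of the distribution that tells you what you actually have to design around.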

The point is, not all systems need extremely fast response one hundred percent of the time.  So, if we want to use a PC as a control, we need to understand how to get the level of performance required by the application when the application needs it.

If PLCs were equal to PCs in all ways, and they had completely deterministic behavior 100% of the time, there would be no reason to consider PCs as controls. But PLCs are not equal to PCs in all ways. That is why we must evaluate whether the advantages offered by a PC offset the lack of determinism. To seriously consider a PC, we must have a high degree of certainty that the PC-based solution will always respond in a way that maintains real-time control of the application.

To make this determination, we must have a good understanding of what we can expect from a PC, and of what level of timing the application needs. As a rule of thumb, I use 20 milliseconds as “real-time” to a human. I don’t have science to back this, but I do have a lot of experience to back it. If a control system responds to a human input within 20 milliseconds, most people don’t sense the latency. When the latency approaches 50 milliseconds, most people can sense it; it is not annoying, but it is noticeable. When the latency starts to exceed 100 milliseconds, it becomes very noticeable, and can be annoying.

So, to satisfy the human response, I consider 20 milliseconds or better completely satisfactory. I also consider occasional latency of several tenths of a second acceptable in most cases.

The other piece of the equation is what level of performance is required to make the equipment function properly and, more importantly, safely. This analysis can be more difficult, but we can infer that if a PLC would not be questioned as a solution, then a response time of several tens of milliseconds is probably acceptable. (It is important to recognize when this is not fast enough, even when using a PLC as the control!)

If there is any kind of high-speed motion that relies upon the control system to stop it before damage is done, further analysis should be done. (The design of the whole system should probably be questioned, too!) In cases where light curtains are being used to protect people, it is important to completely understand how the system is going to respond. If the light curtains are properly applied, all motion should stop due to hardware interlocking, and there should be assured safe distances in the design. (The point being, control system response time shouldn’t factor into safety in a properly designed system!)

With properly written software, it is not difficult to get consistent responses in the tens of milliseconds with a PC. It is also not difficult to keep maximum latency down in the tenths of a second. So, with proper evaluation of the application, and with well-written software, it is practical to consider PCs as controls in many circumstances.
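Structurally, “properly written software” mostly means a scan loop that paces itself and watches its own latency. Here is a minimal sketch of that pattern; read_inputs, solve_logic, write_outputs, fail_safe, and the 200 ms bound are hypothetical placeholders for the application’s real I/O and tolerances:

```python
# Minimal sketch of a self-pacing PC scan loop that watches its own
# latency. read_inputs/solve_logic/write_outputs/fail_safe and the
# 200 ms bound are hypothetical placeholders for the application's
# real I/O and tolerances.
import time

SCAN_S = 0.010          # 10 ms target scan period
MAX_OVERRUN_MS = 200    # worst latency the application can tolerate

def read_inputs():          return {}
def solve_logic(inputs):    return {}
def write_outputs(outputs): pass
def fail_safe():            print("latency bound exceeded; outputs to safe state")

next_scan = time.perf_counter()
while True:
    write_outputs(solve_logic(read_inputs()))    # one scan of the logic

    next_scan += SCAN_S
    overrun_ms = (time.perf_counter() - next_scan) * 1000
    if overrun_ms > MAX_OVERRUN_MS:
        fail_safe()            # excursion detected; don't absorb it silently
        break
    if overrun_ms < 0:
        time.sleep(-overrun_ms / 1000)           # pace to the next scan
```

The point of the pattern is that the loop measures its own overrun on every scan, so a latency excursion is detected and acted upon instead of being silently absorbed.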
