CAN 12 - XNet Signal Min Max With Reset

So the previous CAN Blog Part 11 was a simple and quick one to write.  It had a very specific use and a very simple example, and didn't take weeks of preparation, drafting, and coding.  I'm going to continue that trend of quick and simple examples with this post, which describes a way of getting a signal's running minimum and maximum using XNet.  In the last post we talked about Signal Single Point, how it is probably the most common XNet session type, and the issue it has when communication drops out, causing reads to return the same previous value.  Well I have another commonly requested feature to add on top of Single Point reads, and that is a running Minimum and Maximum for a signal.

Description

So in the following example I'm going to show some code that displays CAN signals in a table.  This table has the N selected signals as rows, with three columns.  The first column shows the last read value for each signal, which has a hold feature, and the second and third columns show each signal's running minimum and maximum.  This pretty simple UI has a reset button, so that the minimums and maximums can all be reset, clearing the history.  The last value has a hold of 2 seconds.  This means if the data for a CAN signal is lost for less than 2 seconds the table will look normal.  Losing the signal for more than 2 seconds sets all three columns to NaN (not a number).  If the signal comes back the last read value will show the new data, but the minimum and maximum will still show NaN until the next reset.
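To make that hold rule concrete, here is a minimal Python sketch of just the 2 second timeout; the function name and arguments are hypothetical stand-ins for the timestamp bookkeeping the LabVIEW code does, not part of the actual VIs.

```python
import time

HOLD_SECONDS = 2.0  # how long the last value is held after data stops arriving


def held_value(last_value: float, last_rx_time: float, now: float | None = None) -> float:
    """Return what the 'last value' column should display for one signal.

    The most recent reading is held for HOLD_SECONDS after data stops
    arriving; past that window the column reads NaN.
    """
    now = time.monotonic() if now is None else now
    return last_value if (now - last_rx_time) <= HOLD_SECONDS else float("nan")
```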

When you would use this

Oftentimes a user of some software wants to see the current value of a signal.  But they can't be expected to sit there and watch a long test to see if the data does anything weird.  In most cases monitoring the minimum and maximum values a signal has ever reached is good enough.  Adding the drop-out detection can also be useful because it can tell us if communication was lost at some point and then restored.

How it's done

It's pretty simple really.  Using XNet we read each signal with a Signal Input XY session.  Unlike Single Point, this returns every value a signal has had since the previous read, not just the latest one.  Using this we figure out the running minimum and maximum and store them somewhere.  In my case I threw them into a feedback node within a subVI, storing the history in a type of lookup table using Variant Attributes.  It could be done several other ways.
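Since the diagrams below are images, here is a hedged Python sketch of the same idea.  A plain dictionary stands in for the variant-attribute lookup table, and the samples list stands in for the array of values an XY read returns; none of these names come from the actual VIs.

```python
import math


class MinMaxHistory:
    """Running min/max per signal name -- a plain dict standing in for the
    variant-attribute lookup table kept in the subVI's feedback node."""

    def __init__(self):
        self._table: dict[str, tuple[float, float]] = {}

    def update(self, name: str, samples: list[float]) -> tuple[float, float]:
        """Fold one XY read (every sample since the previous read) into the
        running min/max for this signal and return the current pair."""
        lo, hi = self._table.get(name, (math.inf, -math.inf))
        if not math.isnan(lo):  # once NaN, it sticks until reset
            for x in samples:
                lo, hi = min(lo, x), max(hi, x)
        self._table[name] = (lo, hi)
        return lo, hi

    def mark_lost(self, name: str):
        """Called when a signal has been silent past the hold window; the
        min and max latch to NaN until the next reset."""
        self._table[name] = (math.nan, math.nan)

    def reset(self):
        """The reset button: forget all minimums and maximums."""
        self._table.clear()
```

Pressing reset just empties the table, so the next read of each signal starts its minimum and maximum from scratch.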

Here is the main VI:


And here is the subVI that handles keeping the history:


These two VIs have been saved in LabVIEW 2017 and can be downloaded here.

Part 13...

The next topic, covered in Part 13, is Seeds and Keys: a method of locking out unauthorized users from accessing features or memory locations.