If you have, for example, a set of measurement values with strong noise and you want to know whether a trend is becoming observable, the RMS trend function is a helpful tool. It is typically used during calibration to check whether the values of the reference and the unit under test (UUT) are stable.
But what is behind this function?
It is based on the method of least squares (see Wikipedia), a standard approach in regression analysis, specifically linear regression.
Mathematical model of least squares (linear least squares)
Applied in the RMS
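As a sketch of the underlying mathematics, using the standard textbook form of linear least squares (generic notation; the RMS itself may present it differently): the trend line is y = a + b·x, and its gain b and intercept a are

b = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad a = \bar{y} - b\,\bar{x}

where \bar{x} and \bar{y} are the averages of the time values x_i and of the measurement values y_i over the considered duration, and n is the number of measurements.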
Interval
- Measurement interval -> the time difference between two consecutive measurement values
Duration
- Setting of the measurement point trend (arithmetic calculation) -> the time span considered for the trend calculation
- A minimum of one interval has to be selected
- A duration of one interval covers 2 measurements; a duration of two intervals covers 3 measurements, and so on… (see the formula below)
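Written as a small formula (just a restatement of the rule above, with duration and interval in the same time unit):

n = \frac{\text{duration}}{\text{interval}} + 1 \quad \text{measurements}

For example, a duration of 10 min with an interval of 1 min gives 10/1 + 1 = 11 measurements, which is exactly the case in the example below.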
Example
- A duration of 10 min with an interval of 1 min means:
- Index i: 1…11
- Index n: 11
- xi at the last index n: 10 min
- x_average of the considered time: 5 min
- yi at index 11: the current measurement value
- y_average of all considered measurement values (the 11 values of the last 10 minutes)
- Gain of the calculated trend line (change of the calculated trend per minute), based on all measurement values within the duration
- ! The trend value in the RMS is given per minute, even for intervals of 10 or 30 sec. !
- Tip: Use the SLOPE() function of MS Excel to reproduce the trend.
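If you prefer to check the calculation outside of Excel, here is a minimal Python sketch of the same least-squares slope; the function name and the measurement values are made up for illustration and are not part of the RMS.

# Minimal sketch (not the RMS implementation): reproduce the trend value
# as an ordinary least-squares slope, the same calculation as Excel's SLOPE().

def trend_per_minute(values, interval_minutes):
    """Least-squares slope of the values over time, in units per minute.

    values           -- measurement values, oldest first
    interval_minutes -- time between two consecutive values, in minutes
    """
    n = len(values)
    x = [i * interval_minutes for i in range(n)]   # 0, 1, 2, ... minutes
    x_avg = sum(x) / n
    y_avg = sum(values) / n
    numerator = sum((xi - x_avg) * (yi - y_avg) for xi, yi in zip(x, values))
    denominator = sum((xi - x_avg) ** 2 for xi in x)
    return numerator / denominator

# Example: 11 hypothetical values over the last 10 minutes (interval = 1 min)
measurements = [20.01, 20.03, 19.99, 20.05, 20.04, 20.06,
                20.08, 20.07, 20.10, 20.09, 20.12]
print(round(trend_per_minute(measurements, 1.0), 4))   # gain in units per minute

Because the time values x are expressed in minutes, the result is always a change per minute; for a 30 sec interval you would simply pass interval_minutes = 0.5, and the trend is still reported per minute, matching the note above.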