Make Analysis trigger on different inputs when timestamps of input tags are equal
If an Analysis is triggered on "Any Input", and the inputs have the same timestamp but arrive one after the other, the Analysis is triggered only for the first input.
For example: we have a simple Analysis which summarizes two tags.
The first tag comes in at 14:00, with timestamp 14:00. This triggers the calculation. The second tag comes in at 14:02, also with timestamp 14:00. Now the calculation is not triggered anymore.
We've also discussed this with Techsupport, and they came with the following statement: "With Event Triggered, the first value to come into the PI DA is used by the Analysis Service. Other events with the same timestamp are ignored."
We would expect the Analysis to be triggered again if the timestamp is greater than or equal to that of the last incoming event.
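The difference between the current behavior and the expected one can be illustrated with a small sketch (hypothetical, not the actual Analysis Service code): a strictly-greater comparison against the last triggering timestamp skips the second event at the same timestamp, while a greater-than-or-equal comparison fires on both.

```python
# Illustrative sketch only -- not the actual PI Analysis Service logic.
# Each incoming event carries a timestamp; the service remembers the
# timestamp of the last event that triggered the analysis.

def count_triggers(event_timestamps, allow_equal):
    """Count how many events would trigger, given a comparison rule."""
    last_triggered = None
    triggers = 0
    for ts in event_timestamps:
        if last_triggered is None:
            fire = True
        elif allow_equal:
            fire = ts >= last_triggered   # requested "greater or equal" rule
        else:
            fire = ts > last_triggered    # current "strictly greater" rule
        if fire:
            triggers += 1
            last_triggered = ts
    return triggers

# Two events arrive minutes apart but both carry timestamp 14:00.
events = ["14:00", "14:00"]
print(count_triggers(events, allow_equal=False))  # 1 -- second event ignored
print(count_triggers(events, allow_equal=True))   # 2 -- both events trigger
```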
Unfortunately due to unforeseen circumstances, this has not been completed. It needs to be re-prioritized for a later release.
Jim Nilsson commented
What happened to this?
I'm experiencing the same issue with the OPC A&E interface.
When two inputs with the same timestamp are coming in, only one triggers the analysis.
Are these alarms coming into the PI system via the OPC A&E interface?
When a flood of data comes into a tag like in an alarm situation, sometimes the EF analytics don't trigger as some of the data comes in at the same second.
This is exactly my use case. I also want analyses to be triggered by events on each Input-TAG on the same timestamp (or when all Input-TAGs have the same timestamp)
In response to Roger Palmen, "In cases where there is specific busines..."
@Arie - I agree with your use case - when the analysis triggers again for the same timestamp, potentially with new input data, it should evaluate again and replace the previously written result. We would handle this when we implement support for auto-recalculation.
@Roger - That's a good point. Just to add to it, I think there is a difference between the following two use cases:
1. Evaluate analyses as soon as input data is retrieved. When more/new input data is available (say, with a delay), automatically recalculate, replacing previously written results. In this approach, consumers of calculation results have to expect that in some cases results may be incorrect, but they would eventually be correct once all input data is available. The advantage is that calculation results are available immediately (using the best available information for the inputs at the time), which could be useful or even required in some cases.
2. Use some sort of validation logic (like you described) to only write outputs when all input data is available. In this case, consumers of calculation results can be confident that the results are correct. However, this approach requires handling cases where validation may never pass or could take a very long time (e.g. one of the inputs never arrives, or arrives a day or a week later).
I can see both being useful in different situations. The Analytics implementation allows you to do (1) [when auto-recalculation is supported]. While (2) may be possible using the approach you described, it can get a bit tedious since it's not that well supported out of the box.
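Approach (1) above, evaluate immediately and replace on recalculation, can be sketched as writes keyed by event timestamp, so a later recalculation for the same timestamp simply overwrites the provisional value (a hypothetical model, not the actual Analysis Service implementation):

```python
# Hypothetical sketch of approach (1): write a result immediately using the
# best available inputs, then replace it when late input data arrives.
# The archive is modeled as a dict keyed by event timestamp, so a
# recalculation for the same timestamp overwrites the earlier value.

archive = {}

def evaluate(inputs):
    """Summarize whatever inputs are available at evaluation time."""
    return sum(v for v in inputs if v is not None)

def on_event(timestamp, inputs):
    # Writing under the same timestamp replaces any previous result.
    archive[timestamp] = evaluate(inputs)

on_event("14:00", [10, None])   # first input arrives: provisional result 10
on_event("14:00", [10, 5])      # second input arrives late: result replaced
print(archive["14:00"])         # 15 -- the eventually-correct value
```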
Arie van Boven commented
In response to Nitin Agarwal, "Hi Arie - As Roger Palmen mentioned belo..."
The 5 seconds latency is no problem; it's actually fine. But our second event (with the same timestamp) comes in 1 or 2 minutes after the first event, and we also have cases where the delay is about 10 minutes. So I would expect the Analysis to be triggered again. But it seems that the Analysis is only triggered if the timestamp of the latest event is greater than that of the previous event.
In my humble opinion it should be greater than or equal to the previous timestamp, or the latency should be adjustable (like in ACE).
In response to Arie van Boven, "Hi Nitin, What you describe here, is e..."
Hi Arie - As Roger Palmen mentioned below, there is an inbuilt delay (default=5 seconds), such that multiple events (from different attributes) with same timestamp would not trigger the analysis more than once. Currently this wait time cannot be configured per analysis, but even if it could be, there could always be cases where some events would be received with delay. We intend to handle these use cases more generally when we implement automatic recalculation for handling late arriving or out-of-order events. The use case you described appears to be a special case of the same issue, and we would keep it in mind.
Roger Palmen commented
In response to Arie van Boven, "Hi Nitin, The 5 seconds latency is no ..."
In cases where specific business logic is required to determine when the data for a calculation is 'valid', I would recommend using a 2-stage approach:
1. Have a naturally scheduled analysis that evaluates whether all data for a specific evaluation is present. If true, send a value to a trigger PI Point; else, NoOutput().
2. Perform your actual calculation as usual, but have it triggered only by the PI Point from the previous analysis.
Be aware that you can only evaluate the values at the trigger time, as you can only write values at the trigger time.
For calculations that typically run every X period, you can create a periodic trigger with an offset for the last X periods, e.g. daily KPI calculations triggered 2 hours after the start of the day.
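The 2-stage pattern described above can be sketched as follows (hypothetical function names; the real setup would be two PI analyses, with stage 1 writing to a trigger PI Point and returning NoOutput() otherwise):

```python
# Hypothetical sketch of the 2-stage validation pattern described above.
# Stage 1 runs on a schedule and checks that every input has a good value
# at the trigger time; only then does it write to the trigger point
# (returning None here stands in for NoOutput()).
# Stage 2 is event-triggered by that trigger point and runs the actual
# calculation, knowing all inputs are present.

def stage1_validate(inputs_at_trigger_time):
    """Write 1 to the trigger PI Point if all inputs are good, else NoOutput."""
    if all(v is not None for v in inputs_at_trigger_time):
        return 1        # value sent to the trigger PI Point
    return None         # NoOutput(): the trigger point stays silent

def stage2_calculate(inputs_at_trigger_time):
    """Actual calculation, only ever run when stage 1 fired."""
    return sum(inputs_at_trigger_time)

inputs = [10, 5]
if stage1_validate(inputs) is not None:
    print(stage2_calculate(inputs))  # 15
```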
Roger Palmen commented
The fact that two trigger values have the same timestamp does not mean they are known at the same time. How would you know you have received the last event?
There is of course the delay in the evaluation of analyses, but that is a global setting, whereas it would need to be configurable for each individual analysis.
Hello Arie - I am trying to understand your use case. Does the analysis summarize the two inputs independently and just write the results to two different outputs, or does your final result depend on both summaries? In the first case, you could possibly write these as two independent analyses which would trigger independently. The second case is more interesting. If you do in fact require both inputs to produce the final output, then when it evaluated at 14:00 the first time, the result is essentially incorrect, as the analysis has not yet received the data for the other input (which arrives late at 14:02). If we were to evaluate the analysis again (when the second input arrives at 14:02), would you expect the new (correct) output value to replace the one that was previously written?
Arie van Boven commented
In response to Nitin Agarwal, "Hello Arie - I am trying to understand y..."
What you describe here, is exactly what I expect: " If we were to evaluate the analysis again (when the second input arrives at 14:02), would you expect the new (correct) output value to replace the one that was previously written?"