As a process engineer, I would like to be able to integrate complex, specific calculations, such as those from a 3rd-party analytics engine, so that I can use Asset Analytics and its context, with all the available data and scheduling, to run them without needing heavy custom development.
Reopening this suggestion to collect additional feedback.
Now that Matlab is off the table, any new directions for extensibility of Analytics?
I appreciate the step back! While Matlab provides scalability, performance, and capabilities on the high end of the spectrum, there is still a demand on the lower end.
I do understand the complexities and pitfalls involved (remember Custom DRs), but there should/could be a middle ground, just as PI Vision seems to strike one with its extensibility. For example: leave the data delivery to the custom function up to the analysis service, but provide the capability to implement custom functions, e.g. in C#, within that limited frame. After all, we don't want to go back to the days where we open, render, and parse Excel files in an ACE module...
Diego Curras commented
Customer would like to have the following possibility:
During the installation of a new Analysis Service kit pointing to a PI AF server where an analysis service is already configured, he would like to prevent the current PI Analysis Service from stopping during the installation, so that no real-time calculation data is lost.
He suggested this because he would like to keep his current analysis service running rather than being forced to stop it for the new installation.
Brent Bregenzer commented
I've worked with customers who have expressed the same desire. They want to do some basic pre-calculations on many of the incoming data streams across the AF database, e.g., quality checks like badval or range limits, and/or some basic pre-processing like a moving average with a time window defined in an attribute. It can get tedious having to copy/paste all of these lines of expressions from one template to another for many attributes. It would be easier to build a predefined function of their own where they just need to add the attribute name and a few parameters like a time range.
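To make the "predefined pre-processing function" idea concrete, here is a minimal, library-free Python sketch of a time-windowed moving average that skips bad values. All names (including the `BAD` marker) are illustrative, not part of any PI API:

```python
from datetime import datetime, timedelta

BAD = object()  # illustrative marker for a bad/questionable value


def moving_average(points, window):
    """Time-windowed moving average over (timestamp, value) pairs.

    points: list of (datetime, value) pairs sorted by time; BAD marks bad values.
    window: timedelta defining the averaging window ending at each timestamp.
    Returns a list of (timestamp, average) pairs, one per input timestamp
    that has at least one good value in its window.
    """
    out = []
    for ts, _ in points:
        lo = ts - window
        vals = [v for t, v in points if lo < t <= ts and v is not BAD]
        if vals:
            out.append((ts, sum(vals) / len(vals)))
    return out
```

In a real implementation the window would come from an AF attribute and the quality check from the point's status, but the shape of the reusable function is the same.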
There have been requests both for user-defined functions that only use a set of the included Analysis functions (this might be a good first step) and for custom functions written in some programming language, similar to what was done in ACE.
A quick solution could be providing an Analytics function that triggers an external script from a script folder. By default, when the script is triggered, it should receive the Asset Context and Time Context; it should also have additional placeholders for passing custom information to the script. Then write your script in PowerShell, Python, or R, and use the AF SDK, Web API, or SQL DA, whichever you are most comfortable with.
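A hypothetical sketch of such an external script in Python. The calling convention (`--asset`/`--start`/`--end`/`--extra` flags) is an assumption for illustration only, not an existing analysis service feature:

```python
import argparse
from datetime import datetime


def parse_context(argv):
    """Parse the asset/time context a scheduler could pass on the command line.

    All flag names are hypothetical; they stand in for the Asset Context,
    Time Context, and custom-information placeholder described above.
    """
    parser = argparse.ArgumentParser()
    parser.add_argument("--asset", required=True)  # asset (element) path
    parser.add_argument("--start", required=True)  # ISO 8601 start time
    parser.add_argument("--end", required=True)    # ISO 8601 end time
    parser.add_argument("--extra", default="")     # placeholder for custom info
    args = parser.parse_args(argv)
    return {
        "asset": args.asset,
        "start": datetime.fromisoformat(args.start),
        "end": datetime.fromisoformat(args.end),
        "extra": args.extra,
    }


if __name__ == "__main__":
    import sys

    ctx = parse_context(sys.argv[1:])
    # Here one would fetch data (AF SDK, Web API, SQL) for ctx["asset"]
    # over ctx["start"]..ctx["end"] and write results back.
    print(ctx)
```

The point of the sketch is the contract: the service owns scheduling and context delivery, the script owns the business logic.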
Stephen Reynolds commented
I liken this to custom function builds in Excel. Simple code defines the inputs and calculations, and the function then appears in the library and can be used in analyses.
This would be supplemental to using templates, allowing more flexibility as to where / when the function could be used.
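To illustrate the Excel-style "function library" idea, here is a minimal Python sketch of a registry where simple code defines a function once and an expression engine can then call it by name. All names are hypothetical; AF Analytics exposes no such API today:

```python
# Hypothetical user-defined function library, in the spirit of Excel
# custom functions: register once, then call by name from an expression.
FUNCTION_LIBRARY = {}


def analysis_function(name):
    """Decorator that registers a function in the library under a name."""
    def register(fn):
        FUNCTION_LIBRARY[name] = fn
        return fn
    return register


@analysis_function("RangeCheck")
def range_check(value, lo, hi):
    """Return value if within [lo, hi], else None (i.e. treat as bad)."""
    return value if lo <= value <= hi else None


def call(name, *args):
    """Evaluate a registered function by name, as an expression engine might."""
    return FUNCTION_LIBRARY[name](*args)
```

Once registered, `RangeCheck` would behave like any built-in: usable wherever the expression syntax allows a function call, independent of any one template.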
One main concern here: with periodic calculations running every minute and around 5k AF analyses, some calculations are currently evaluated with a delay.
During this delay (generally 5-7 seconds) the value comes back as Calc Failed.
So we need to take that into account as well.
Let me add a current example where we use a custom C# calculation engine.
For each batch, and during the batch at every change of the batch phase, we need to calculate the operating window for a number of key process parameters. We trigger the engine based on signups to BatchID and PhaseID. On every change, we look up some process data and an AF table, calculate our operating window limits (data series using future data), and write those back to PI Points.
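A simplified Python sketch of that trigger pattern, with plain callables standing in for the data signups, the AF table lookup, and the PI Point writes (all names are illustrative, and the limit calculation is deliberately trivial):

```python
def on_phase_change(phase_id, lookup_data, limits_table, write_point):
    """Callback fired on each PhaseID change.

    phase_id:     the new phase value from the signup
    lookup_data:  callable returning recent process values for the phase
    limits_table: dict mapping phase -> (low_factor, high_factor),
                  standing in for an AF table lookup
    write_point:  callable(name, value) standing in for a PI Point write
    """
    values = lookup_data(phase_id)
    mean = sum(values) / len(values)
    low_f, high_f = limits_table[phase_id]
    write_point("OperatingWindow.Low", mean * low_f)
    write_point("OperatingWindow.High", mean * high_f)
```

The plumbing around this callback (subscribing to value changes, security, retries) is exactly the part the request asks the platform to own.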
I think this request is more specific to the external calculation: https://feedback.osisoft.com/forums/555148-pi-server/suggestions/17219900-extensible-analytics
I'm a big fan of AF, yet on every decent project I needed to create my own external calculation engines for more complex business logic. I've used SQL Server, PI OLEDB Enterprise with linked AF tables, or custom C# engines.
In all cases I would like to at least get rid of all the technical plumbing of management, scheduling, security, etc., just as the ACE scheduler did.
So: mechanisms to trigger custom logic, supply the data and provide access, and send the results back to AF.
If needed, I can supply specific examples I've built for customers.
This request could go a long way, so maybe this option solves the request in a broader sense:
Arie van Boven commented
Because the ACE wizard isn't working anymore in VS2015, it would be good to get away from ACE, and make use of the AF Analysis scheduler.
Michael Halhead commented
This overlaps with the advanced analytics/extensible analytics. However, I see a slight difference. What I have in mind is that we would be able to create a function similar to the pre-built functions that can be used in Analytics. I would like to see this in two forms:
1) take a set of AF Analytics and expose this as a function. Basically you would build an Analytics template in PSE that is then exposed as a function in the Element Analysis tab.
2) build a function using your favourite programming tool (C#, F#, ....) which is then available in AF Analytics like a normal function.
David Pugal commented
Would you please expand on this idea a little bit? Do you mean something like custom/predefined Expression functions, or rather full programmatic access?
As a developer, I would like the analysis service to support custom functions that I write using AF SDK.
As an analytics admin, I would like to be able to add more complex code such as loops, arrays, and other advanced logic to AF Analytics calculations so that I can perform more complex and custom calculations while leveraging the scheduling and other features of AF Analytics.