How can we improve the PI Server?

Custom functions for the analysis service

As a developer, I would like the analysis service to support custom functions that I write using AF SDK.

105 votes

Keith Fong (Team Lead, AF & Analytics, OSIsoft) shared this idea

Update: This may not be relevant to all subscribers of this idea. We previously informed you via a status update that native MATLAB integration was available in the 2018 release. That integration will no longer be included with the PI Server starting with the PI Server 2018 SP2 release. Please see the status on https://feedback.osisoft.com/forums/555148-pi-server/suggestions/17742490-custom-functions-for-the-analysis-service for more details.

10 comments

  • rpalmen commented

    I appreciate the step back! While MATLAB provides scalability, performance, and capability at the high end of the spectrum, there is still demand at the lower end.
    I do understand the complexities and pitfalls involved (remember custom DRs), but there should / could be a middle way, just as PI Vision seems to strike one with its extensibility. E.g., leave the delivery of data to the custom function up to the analysis service, but provide the ability to use, say, C# to implement custom functions within that limited frame. After all, we don't want to go back to the days when we opened, rendered, and parsed Excel files in an ACE module...
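
To make the "limited frame" idea above a bit more concrete, here is a minimal C# sketch of what such a contract could look like. Every name in it (ICustomAnalysisFunction, FunctionContext, RangeViolationCount) is hypothetical and not part of AF SDK or the analysis service today; the point is only that the service would resolve the asset and time context and hand values to the user's code.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical contract: the analysis service would resolve attribute data
    // for the configured time range and pass plain values in, so user code
    // never touches data access directly. None of these types exist in AF SDK.
    public interface ICustomAnalysisFunction
    {
        string Name { get; }                       // name exposed in the expression editor
        object Evaluate(FunctionContext context);  // called by the analysis engine
    }

    // Hypothetical container for what the service already knows when it triggers.
    public sealed class FunctionContext
    {
        public string ElementPath { get; set; }                                 // asset context
        public DateTime StartTime { get; set; }                                 // time context
        public DateTime EndTime { get; set; }
        public IDictionary<string, IReadOnlyList<double>> Inputs { get; set; }  // resolved attribute values
    }

    // Example implementation: a simple range check over one delivered input.
    public sealed class RangeViolationCount : ICustomAnalysisFunction
    {
        public string Name => "RangeViolationCount";

        public object Evaluate(FunctionContext context)
        {
            var values = context.Inputs["Input1"];
            return values.Count(v => v < 0.0 || v > 100.0);  // hard-coded limits for illustration
        }
    }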

  • Diego Curras commented

    A customer would like to have the following capability:

    When installing a new Analysis Service kit that points to a PI AF server where an analysis service is already configured, he would like to prevent the existing PI Analysis Service from being stopped during the installation, so that no real-time calculation data is lost.

    He suggested this because he would like to keep his current analysis service running rather than being forced to stop it for the new installation.

  • Brent Bregenzer commented

    I've worked with customers who have expressed the same desire. They want to run some basic pre-calculations on many of the incoming data streams across the AF database, e.g. quality checks such as BadVal or range limits, and/or some basic pre-processing such as a moving average with a time window defined in an attribute. It can get tedious having to copy and paste all of these expression lines from one template to another for many attributes. It would be easier to build a predefined function of their own where they only need to supply the attribute name and a few parameters such as the time range.
    There have been requests both for user-defined functions that only use a set of the included Analysis functions (this might be a good first step) and for custom functions written in some programming language, similar to what was done in ACE.
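
As an illustration of the kind of reusable pre-processing described above, here is a small C# sketch of a bad-value filter plus time-windowed moving average. The Sample type and the helper itself are invented for illustration; in a real analysis the window length would come from an AF attribute and the values from the analysis engine.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public static class PreProcessing
    {
        // Hypothetical sample type standing in for an archive event
        // (timestamp, value, quality flag).
        public readonly record struct Sample(DateTime Timestamp, double Value, bool IsGood);

        // The reusable pre-processing described above: drop bad values, then
        // average everything inside a trailing time window whose length would
        // be configured per attribute in the real setup.
        public static double? MovingAverage(IEnumerable<Sample> samples,
                                            DateTime evaluationTime, TimeSpan window)
        {
            var inWindow = samples
                .Where(s => s.IsGood &&
                            s.Timestamp > evaluationTime - window &&
                            s.Timestamp <= evaluationTime)
                .Select(s => s.Value)
                .ToList();

            return inWindow.Count > 0 ? (double?)inWindow.Average() : null;  // null ~ "no output"
        }
    }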

  • uniqueshanu commented

    A quick solution could be to provide an Analytics function that triggers an external script from a script folder. By default, when the script is triggered, it should pass the asset context and time context to the script, and it should also have an additional placeholder for passing custom information to the script. You could then write the script in PowerShell, Python, or R and use AF SDK, Web API, or SQL DA, whichever you are most comfortable with.
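
A rough sketch of how such a trigger might be wired up, using plain .NET process invocation: the script path, the parameter names (-Element, -StartTime, -EndTime, -Custom), and the custom-information payload are all assumptions, not an existing Analytics feature.

    using System;
    using System.Diagnostics;

    public static class ScriptTrigger
    {
        // Hypothetical helper: how an "external script" analysis function might
        // hand the asset context and time context to a script in a script folder.
        public static int RunScript(string scriptPath, string elementPath,
                                    DateTime startTime, DateTime endTime, string customInfo)
        {
            var startInfo = new ProcessStartInfo
            {
                FileName = "powershell.exe",  // could equally be python.exe or Rscript.exe
                Arguments = $"-File \"{scriptPath}\" " +
                            $"-Element \"{elementPath}\" " +
                            $"-StartTime \"{startTime:o}\" -EndTime \"{endTime:o}\" " +
                            $"-Custom \"{customInfo}\"",
                UseShellExecute = false,
                RedirectStandardOutput = true
            };

            using var process = Process.Start(startInfo);
            Console.WriteLine(process.StandardOutput.ReadToEnd());  // script output, e.g. a value to write back
            process.WaitForExit();
            return process.ExitCode;                                // non-zero could mark the analysis in error
        }
    }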

  • Stephen Reynolds commented

    I liken this to building custom functions in Excel. Simple code defines the inputs and the calculation. The function appears in the library and can be used in analytics, e.g.

    customfunction('attribute1','attribute2','time1','time2')

    This would be supplemental to using templates, allowing more flexibility as to where and when the function could be used.

  • rpalmen commented

    Let me add a current example of what we use a custom C# calculation engine for.
    For each batch, and during the batch at every change of batch phase, we need to calculate the operating window for a number of key process parameters. We trigger the engine based on sign-ups to BatchID and PhaseID. On every change, we look up some process data and an AF table, calculate our operating window limits (a data series using future data), and write those back to PI points.
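
For readers unfamiliar with this kind of workaround, a stripped-down sketch of the orchestration described above follows. All types and method names are invented placeholders; in practice the change sign-ups and the writes back to PI points would go through AF SDK, which is exactly the plumbing this idea asks the analysis service to take care of.

    using System;

    // Hypothetical orchestration: react to a change in BatchID or PhaseID,
    // gather context, compute operating window limits, and write them back.
    public sealed class OperatingWindowEngine
    {
        private readonly Func<string, string, ProcessContext> _lookupContext;  // process data + AF table lookup
        private readonly Action<string, double, double> _writeLimits;          // write back to PI points

        public OperatingWindowEngine(Func<string, string, ProcessContext> lookupContext,
                                     Action<string, double, double> writeLimits)
        {
            _lookupContext = lookupContext;
            _writeLimits = writeLimits;
        }

        // Called whenever a signed-up BatchID or PhaseID value changes.
        public void OnBatchOrPhaseChanged(string batchId, string phaseId)
        {
            var context = _lookupContext(batchId, phaseId);

            // Placeholder calculation: limits derived from a target and tolerance
            // that would really come from the process data and AF table lookup.
            double lowLimit = context.Target - context.Tolerance;
            double highLimit = context.Target + context.Tolerance;

            _writeLimits(context.ParameterPointPath, lowLimit, highLimit);
        }
    }

    public sealed class ProcessContext
    {
        public string ParameterPointPath { get; set; }
        public double Target { get; set; }
        public double Tolerance { get; set; }
    }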

  • Michael Halhead commented

    This overlaps with the advanced analytics/extensible analytics idea. However, I see a slight difference. What I have in mind is that we would be able to create a function, similar to the pre-built functions, that can be used in Analytics. I would like to see this in two forms:
    1) Take a set of AF analyses and expose them as a function. Basically you would build an analysis template in PSE that is then exposed as a function on the Element Analysis tab.
    2) Build a function using your favourite programming tool (C#, F#, ...) which is then available in AF Analytics like a normal function.

  • David Pugal commented

    Would you please expand on this idea a little bit? Do you mean something like custom/predefined Expression functions, or rather full programmatic access?
