41 votes · RESEARCHING / EVALUATING · 12 comments · PI Server » Asset Framework (AF)
In response to Mike Greene, "I thought you were in my presentation la..."
Sorry Mike, I didn't make the connection from seeing just the name of the poster on PI Square. Now I know.
PI System Explorer is a tool for configuring objects in AF. Are you using PI System Explorer as a visualization tool?
As to hyperlinks, you can configure an extended property of an element with a hyperlink. You can also configure an attribute of an element with a hyperlink.
Having said that, I'm curious as to what kind of hyperlink you would put on an element? Where would the hyperlink take you?
In response to Sean Jost, "So we used AF SDK to export an attribute..."
Sean, are you able to provide additional information on your use case(s), with concrete examples? I am guessing a bit here, but do you have assets, like a pump, that you want to add to the AF hierarchy, and you want to use an ID from a foreign system as a unique way to track that asset through its lifetime?
While we're on this, David previously mentioned that elements can have extended properties, which can hold key-value pairs that you can search on. Perhaps that does what you need?
Although your question was around the AF SDK, have you considered using the XML import/export option to preserve unique IDs?
Are you interested in just the release dates?
It is currently possible for an admin to undo other users' checked out changes. Simply go to the undo checkout dialog box and select all users.
Checking in other users' changes is not currently possible.
Sounds like you want to track users and usage. Do you have a trust per user? I'm asking because you mentioned resetting the counter per trust every year on Jan. 1st.
Pt Created accurately reflects the fact that the point has been created but no values have been received. We should not write a value when there isn't one; a zero is a value.
This is existing capability. You can decide who can read/write analyses by making the appropriate changes using something like PI System Explorer.
Please see the attached screenshot.
Did you use the LiveLibrary online documentation or the PI System Explorer Help?
Sorry, I meant PI Builder user guide, not PI System Explorer user guide.
In response to Bruce McCamant, "Chris - give us the option to just bring..."
That's existing capability.
This is the LiveLibrary URL to Formula Data Reference:
This will help you learn how to create an attribute that is configured to be a Formula Data Reference. Once you have this attribute, you can use it as an input to an Analysis.
I want to emphasize the performance aspect of this, per my previous post. Please take great care in ensuring you're not overwhelming the system by calculating an excessive amount of data with the Formula Data Reference attribute.
For Example 1, there is a way to do that with the existing shipping product. Create another attribute (no PI Point, just an attribute) and configure it as a Formula Data Reference, with the Formula set to a*b, where a=FIT-001 and b=DIT-002. Now you can create an Analysis Expression with TagTot('Formula_Attribute', '*-1w', '*') to get the weekly total. Things to keep in mind:
1) The Formula Data Reference is computed on demand; in this case, it's computed when the TagTot is triggered. When the Formula DR is computed, it retrieves all the values from FIT-001 and all the values from DIT-002 for the entire time range, then does the computation client side. Therefore you should test for and be aware of potential performance issues, both from the Formula being computed client side and from the cost of retrieving all the values from the server (network latency).
2) As a way to test this, create this Formula DR attribute in PI System Explorer, then you can perform a "Time Series Data" or "Trend" (via right-click on the attribute) to see what it does.
3) With your Analysis Expression, you should save the output to a PI Point; in this case it's only one PI Point.
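To make point 1 concrete, here is a minimal sketch in plain Python (not the AF SDK, and not real PE syntax) of what evaluating a Formula DR like a*b client side and then totaling it looks like. The event values are invented, and the total is a naive sum rather than a true time-weighted TagTot:

```python
# Sketch: a Formula DR such as a*b is evaluated client side over the whole
# time range before an expression like TagTot('Formula_Attribute', '*-1w', '*')
# can total it. Every event from BOTH source tags must be retrieved first.

def formula_product(fit_values, dit_values):
    """Multiply two aligned series of (timestamp, value) pairs.
    Assumes both lists share the same timestamps, for simplicity."""
    return [(t, a * b) for (t, a), (_, b) in zip(fit_values, dit_values)]

def tag_tot(values):
    """Naive total: plain sum of values (a real TagTot is time-weighted)."""
    return sum(v for _, v in values)

fit = [(0, 2.0), (1, 3.0), (2, 4.0)]   # hypothetical FIT-001 events
dit = [(0, 1.0), (1, 2.0), (2, 0.5)]   # hypothetical DIT-002 events

product = formula_product(fit, dit)    # all events retrieved, computed locally
print(tag_tot(product))                # 10.0
```

The point of the sketch is the data movement: both full event streams cross the network before any arithmetic happens, which is why the caveat about time range and event density matters.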
In general, AF supports on-demand calculations via the Formula Data Reference and the Analysis Data Reference, and streaming (scheduled) calculations via Analyses. There are pros and cons to both, so you need to decide which works best for you.
As to Example 2's conditional expressions, you can try the same approach as above by creating a Formula DR attribute with a=x, b=y, c=z, then the Formula a AND b AND c. Make sure you have a corresponding AF Enumeration Set configured for the Formula DR attribute, and configure the attribute to be "stepped". Once you have that Formula DR attribute, you can configure an Analysis Expression with TimeEq. As a reminder, the same caveats as in Example 1 apply here: check that the performance is adequate for your needs.
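For intuition on what TimeEq totals over a stepped condition, here is a small sketch in plain Python (not AF expression syntax; timestamps, values, and function names are invented). A stepped value holds until the next event, so time-in-state is the sum of the intervals where the condition is true:

```python
# Sketch: totaling the time a stepped condition (e.g. a AND b AND c) is True
# over a range, the way TimeEq totals time-in-state.

def time_in_state(events, range_end, state=True):
    """events: sorted (timestamp, value) pairs for a stepped signal.
    Each value holds until the next event (step behavior)."""
    total = 0.0
    for (t, v), (t_next, _) in zip(events, events[1:] + [(range_end, None)]):
        if v == state:
            total += t_next - t
    return total

# hypothetical stepped result of a AND b AND c (timestamps in seconds)
combined = [(0, False), (10, True), (25, False), (40, True)]
print(time_in_state(combined, range_end=60))  # 15 + 20 = 35.0
```

This is also why the "stepped" setting matters: without step behavior, interpolation between states would make the time-in-state accounting meaningless for a boolean condition.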
Let me know if you need additional information.
I understand what you're trying to do. However, please keep in mind that AF/Analytics was designed to be a streaming calculation engine. We designed it assuming that users would use it as a way to perform calculations with a schedule, e.g. every time there is a new trigger event for the calculation or based on a clock schedule. What you're describing seems to be a manageability need that may not happen very often, i.e. not in a streaming fashion. Therefore, with the information that you have provided, this request would not be high priority for us.
Does the information in the InstrumentTag attribute change with time?
Can you tell me what you would use this information for?
Would you please create a new suggestion in the PI Vision forum? This particular suggestion originally refers to PI System Explorer as the client. Since different people monitor different forums, your suggestion in the PI Vision forum would be most helpful to the relevant OSIsoft personnel.
Are you referring to PI System Explorer or some other display client?
86 votes · RESEARCHING / EVALUATING · 14 comments · PI Server » Analytics & Calculations
When creating predefined searches within the Management plugin in PI System Explorer, you can create search criteria using the element name with wildcards, e.g. Elem*. I just tried this and it seems to work.
Perhaps I'm misinterpreting your question?
In response to Asle Frantzen, "I have a 2018 SP2 installation, and I'm ..."
Service Status comes from PI Analysis Service. The remaining filters come from AF. To do what you described, we have to do some work combining these two sources. We just have not gotten to implementing this.
Would you expect the sorting to affect only the page that you're looking at in the Management plugin, or all the pages? The reason I'm asking: imagine you have 100,000 analyses. To improve performance of the Management plugin, we implemented paging, which means we get back one page at a time from the server and display it as soon as possible to keep the user interface responsive. Consequently, if you expect the sorting to affect all the pages together as one, it's much more impactful to performance and overall responsiveness for users with many analyses, since we potentially have to retrieve everything before sorting, as opposed to sorting only one page.
With the 2017 release, you can create your own customized search/filtering of analyses in the management plugin. Please provide feedback on whether this suits your needs.
There are difficulties in allowing users to sort a very large list. The Management plugin loads analyses into the grid by pages. This dramatically improves performance, as we bring back one page of data at a time from SQL Server and display it immediately. With additional sorting options, we would need to find a way to sort potentially millions of analyses and then display them, which makes paging much more difficult. In addition, what we have seen is that many elements share the same name, for example "pump". In a very large hierarchy, if you were to sort by "pump", you would end up with hundreds, if not thousands, of "pump" entries. I'm not sure that's all that useful.
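The paging-versus-global-sort tradeoff described above can be sketched in a few lines of plain Python (the page size, names, and `fetch_page` helper are all invented for illustration):

```python
# Sketch: sorting one displayed page is cheap; a sort across all pages
# forces the client to fetch every page from the server first.

PAGE_SIZE = 3
analyses = ["pump", "valve", "fan", "pump", "mixer", "pump", "heater"]

def fetch_page(n):
    """Simulate one round trip to the server for page n."""
    return analyses[n * PAGE_SIZE:(n + 1) * PAGE_SIZE]

# Cheap: sort only the page currently on screen (one round trip).
page0_sorted = sorted(fetch_page(0))

# Expensive: a global sort must retrieve every page before sorting.
all_items, n = [], 0
while True:
    page = fetch_page(n)
    if not page:
        break
    all_items.extend(page)
    n += 1
globally_sorted = sorted(all_items)
print(page0_sorted)       # ['fan', 'pump', 'valve']
print(globally_sorted)    # note the run of identical 'pump' entries
```

Note how the global sort also surfaces the duplicate-name problem: identical names cluster together, which is exactly the "thousands of pump" situation mentioned above.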
Your feedback is welcome.
Please try out the 2017 release as the management plug-in has been enhanced to cover many of these features.
In response to Stephen Kwan, "We're investigating why NumOfChange is b..."
Ok, so I know what's going on here. The NumOfChange function uses the user-provided StartTime and EndTime and calls RecordedValues with the "Inside" mode on the underlying PI Point. As a result, we get all the individual events within the StartTime and EndTime, then we sort through them and figure out how many changes occurred. Consequently, we don't take into consideration the value at the start time (a boundary condition). There are other complications depending on whether the underlying PI Point is configured with Step = 1 or not. In the case of a PI Point configured with Step = 0, we need to figure out whether we should interpolate at the start time boundary or extrapolate.
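A minimal sketch in plain Python (not the actual server implementation; events and values are invented) shows how counting changes only among events strictly inside the range misses a change across the start boundary:

```python
# Sketch: counting value changes from events retrieved "Inside" a range.
# The value holding at the start boundary is never seen, so a change that
# straddles the boundary is not counted.

def num_of_change(events, start, end):
    """events: sorted (timestamp, value) pairs. Counts changes using only
    events strictly inside (start, end), like RecordedValues 'Inside'."""
    inside = [v for t, v in events if start < t < end]
    return sum(1 for prev, cur in zip(inside, inside[1:]) if cur != prev)

events = [(0, "A"), (5, "B"), (12, "B"), (20, "C")]

# Over (4, 25): inside events are B, B, C -> 1 change counted.
# The boundary value A (holding since t=0) is invisible, so the A -> B
# transition effectively at the start of the range is not reflected.
print(num_of_change(events, 4, 25))   # 1

# Widening the range to include the t=0 event makes both changes visible.
print(num_of_change(events, -1, 25))  # 2
```

This is only the Step = 1 flavor of the problem; as noted above, for Step = 0 points there is the further question of interpolating versus extrapolating at the boundary.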
I've created a backlog item to correct this. At this time, I do not have an estimate as to when we would be able to work on this. It needs to be prioritized with all the other backlog items.
Please continue to use your workaround for this issue.
In response to Jürgen Schmidt, "Yes it is."
We're investigating why NumOfChange is behaving this way. Will respond back with more info.
Is your PI Point configured with "Step" on?
In response to Jürgen Schmidt, "Our Controller Monitoring by now is base..."
Is this a digital tag? If it is not a digital tag, is it configured with "Step" = 1?
Ok, so you really need to count only if there are changes.
Do you wish to do this counting on an ad hoc basis (reporting, or basically an on-demand query) or on a continuous basis?
In response to Jürgen Schmidt, "I would need a possibility to count dist..."
I think there is a simpler way to accomplish what you need.
Variable1: FilterData(Setpointvalues, $val = "AUTO")
Variable2: ArrayLength(Variable1)
The first row sets your array with the values that are "AUTO". The second row gives you the count. The ArrayLength function would handle the different array lengths.
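As a quick illustration of the filter-then-count pattern above, here is the same idea in plain Python (not PE syntax; the sample values are invented):

```python
# Sketch: keep only the values in a given state, then count them.

setpoint_values = ["AUTO", "MANUAL", "AUTO", "AUTO", "CASCADE"]

# Row 1: keep values equal to "AUTO" (like FilterData with $val = "AUTO")
variable1 = [v for v in setpoint_values if v == "AUTO"]

# Row 2: the count is the length of the filtered array (like ArrayLength)
variable2 = len(variable1)
print(variable2)  # 3
```

Because the count is derived from whatever the filter returns, it adapts automatically when the number of matching values varies from one evaluation to the next.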
Asset Analytics was designed as a streaming calculation engine, so it supports very well calculations that execute on a schedule, in a streaming fashion. These suggestions on linear interpolation and polynomial calculations seem more suitable for "ad hoc" calculations, ones that run only when needed. Can you provide additional details on your specific use cases? It would help me if you could give a high-level description of the problems you're trying to solve. Thanks.
We're having trouble understanding what you're doing. How many different missing Enumeration Sets do you have? Can you provide some examples of your use case, i.e. a step-by-step account of what you're doing?
Please help me understand how I would differentiate the most-future value. There's no way for me to know whether another value further into the future will arrive some time later. In addition, what if you have multiple future values and they arrive out of order? Should we trigger when the most-future value arrives, even though it may not be the last one to arrive? Thanks for any input you may have.
In response to Kenneth Barber, "I don't do this often, and when I do, I ..."
Thank you for your feedback.
Can you help me understand how often you do this and how important this is to your work? Trying to gauge relative priority compared to everything else in the current backlog.
OK, I understand what you're looking to do. Unfortunately the current architecture makes this very awkward to do. In essence you're looking for a way to be alerted to a data quality issue. We'll keep this request in the backlog and prioritize it with the rest of the backlog items. I do not know if/when we would be able to get to this.
Are you describing a condition whereby the "errors" did not exist when you initially created the analyses and then subsequently these "errors" occurred? Or were these "errors" present when you initially created the analyses?