
Adventures in Groovy – Part 8: Customizing Data Maps and Smart Pushes

Introduction

Groovy has the ability to execute Data Maps and Smart Pushes.  Data Maps are objects that define the data movement between applications and are accessible globally.  Smart Pushes are Data Maps that are attached to a Data Form and can override the dimension settings.  They are largely the same thing, but defined separately inside of PBCS.

So, why execute these in Groovy rather than assign to a form and call it a day?

  1. The data push can be customized to push only data that has changed, improving performance.
  2. When a form is saved, all calculations are executed first, and the Smart Pushes are executed after all calculations are finished.  If Groovy is used and the data push is done inside a Business Rule, the order of operations can alternate between the two, for example (see the sketch after this list):
    1. calculation
    2. data push
    3. calculation
    4. data push
    5. etc
  3. Since the Smart Push has a global limit on the memory that can be used, and multiple people are running them through form saves, it is critical to make each push as small as possible to reduce the probability of hitting that limit and having Smart Pushes fail.
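
To illustrate the second point, here is a minimal sketch of that interleaved pattern inside a single Groovy rule.  The cube name, Data Map names, and Fix members below are made up for illustration; substitute your own.

// Hypothetical names throughout: "Plan1", "Push Labor", "Push Summary", and the Fix members
def cube = operation.application.getCube("Plan1")

// 1. First calculation
cube.executeCalcScript("""
Fix("OEP_Plan", "OEP_Working")
 CALC DIM("Account");
EndFix;""")

// 2. First data push (true clears the target region before pushing)
operation.application.getDataMap("Push Labor").execute(true)

// 3. Second calculation
cube.executeCalcScript("""
Fix("OEP_Plan", "OEP_Working")
 AGG("Entity");
EndFix;""")

// 4. Second data push, and so on
operation.application.getDataMap("Push Summary").execute(true)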

To date, I see the advantage of running a Smart Push (a Data Map attached to a form) in that most of the dimensions are already overridden, so fewer overrides are required in the Groovy logic.  There is no difference in performance, or in the size of what can be pushed, between the two when executed from a Groovy calculation.  The advantage of using a generic Data Map is that there is less effort in defining the form-level Smart Pushes, and once one Groovy calculation is written to override all dimensions from a Data Map, it can be replicated easily to all required forms.

To understand the memory issues and explanation of how it differs from Data Maps and Smart Pushes, see PBCS Data Map / Smart Push Has Data volume Limits.

Data Map

Executing a Data Map is very simple and can be done in one line of code.

operation.application.getDataMap("Data Map Name").execute(true)

The line above executes the named Data Map (with no customization) and clears the target location before pushing data.  Changing the true to false, or removing the argument altogether, skips the clear and leaves the existing data as is.
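
For example, using the same hypothetical Data Map name, either of the following runs the push without clearing the target first:

// Push without clearing the target region
operation.application.getDataMap("Data Map Name").execute(false)

// Equivalent: omitting the argument also skips the clear
operation.application.getDataMap("Data Map Name").execute()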

To override the dimension settings and increase or decrease the scope of what is used, the member names need to be further defined.  Every dimension's scope can be changed, or only the ones that require a change in scope can be altered.  The following changes the scope for the Account, Scenario, Version, and Entity dimensions.

operation.application.getDataMap("Data Map Name").execute(["Account":"Net Income, Income, Expense", "Scenario":"Budget", "Version":"Working", "Entity":"East Region"], true)

Smart Push

A Smart Push works exactly the same way, except that the object referenced is a Smart Push and is obtained through the grid object rather than the application.  Since a Smart Push attached to a form is likely already scoped more tightly, fewer dimensional overrides are typically needed, as many of them change based on the POV selected.

operation.grid.getSmartPush("Smart Push Name").execute(["Account":"Min Bonus, Min Salary"])

One additional option is to define a Smart Push from a Data Map in the Groovy script.

operation.application.getDataMap("Data Map Name").createSmartPush().execute(["Account":"Min Bonus, Min Salary"])

Error Trapping

When these rules are written, the Smart Pushes and Data Maps likely already exist, but one extra step to ensure the calculation doesn't fail is to verify their existence.  For Smart Pushes, verify that the Smart Push is attached to the form.

//Data Map Example
if(operation.application.hasDataMap("Data Map Name"))
  operation.application.getDataMap("Data Map Name").execute(true)
//Smart Push Example
if(operation.grid.hasSmartPush("Smart Push Name"))
  operation.grid.getSmartPush("Smart Push Name").execute()

Conclusion

By creating variables that hold only the rows and columns that have been edited, and using them to change the scope, the calculation would look like the following.  This is where I see the biggest bang for your buck.  To understand more about using the grid iterator, read Adventures in Groovy Part 3: Acting On Edited Cells.  When a grid has hundreds of rows, pushing only the data that has been edited can make a huge difference in performance.

// Setup the variables to store the list of edited vendors and periods
def lstVendors = []
def lstPeriods = []
// Iterate through only the cells that were changed and create a list of changed vendors and periods
operation.grid.dataCellIterator({DataCell cell -> cell.edited}).each{ 
 lstVendors.add(it.getMemberName("Vendor"))
 lstPeriods.add(it.getMemberName("Period"))
}
// Convert the lists to a comma delimited string with quotes surrounding the members
String strVendors = """\"${lstVendors.unique().join('","')}\""""
String strPeriods = """\"${lstPeriods.unique().join('","')}\""""
// Execute the Smart Push with a change in scope to Vendor and Period
if(operation.grid.hasSmartPush("GP_SmartPush") && lstVendors)
 operation.grid.getSmartPush("GP_SmartPush").execute(["Vendor":strVendors,"Period":strPeriods]

With any luck, you will see significant improvement, unless they change every month of every row!




Updating EPM Automate Just Got Easier

Introduction

One of the challenges with EPM Automate has been eliminated this month.  Although it was a minor issue, the need to update EPM Automate regularly was something that had to be considered monthly.  Administrators of PBCS do not always have access to the on-premise footprint, like a Windows VM, that runs the automation.  More often than not, access to the production VM is only available to IT staff, so updating that environment is more controlled and has to be scheduled.  That schedule doesn't always sync up with the changes in PBCS.

Update Command

As of the 02.18 release, a new command is available.  The "upgrade" command will automatically download, and silently install, the newest version of the EPM Automate utility.  Once logged in, execute the following command.

epmautomate upgrade

If you are a frequent visitor, you know I am a fan of PowerShell.  All the automation I do with EPM Automate in the Windows environment utilizes this free scripting tool.  This command has been added to all my new projects so there is no manual effort in keeping the utility current.  It also eliminates any incompatibility issues with PBCS that pop up when the utility falls behind.

In my reusable scripts, this new function has been added.

function EPMA_Upgrade{
  $CmdLine = "upgrade"
  $ReturnCode = Start-Process "$EPMAutomate_Path\epmautomate.bat" $CmdLine -Wait -PassThru -WindowStyle $ShowDosWindow
  LogResult "EPM Automate has been updated" $ReturnCode.ExitCode
}

During a nightly process, the function is referenced.  If the request fails, the administrators are emailed.

$ReturnResult = EPMA_Upgrade
if( $ReturnResult -eq 1){Send_Email_Error;Exit}

Summary

This is a welcome addition.  Now, administrators and application owners don't have to worry about using new features or keeping EPM Automate in sync with the active version of PBCS.  As the great Forrest Gump would say, "One less thing."




Supercharge PBCS with PowerShell

Last year I presented an in-depth overview on PowerShell and how it can be utilized in the Hyperion environment.  I have been asked many times to share it.  It is a technical presentation meant to provide a strong introductory-level foundation for anybody that wants to start using PowerShell to automate repetitive tasks.  I have built a large library of shared functions that can be used to automate PBCS and ePBCS, and I plan to share pieces of this in future posts.

For now, if you are interested in learning PowerShell, or have used it and don't know why some things work and others don't, this might prove to be a valuable resource.





Ohio Valley OAUG – Getting Groovy in Louisville

I have been selected to speak at the OVOAUG on February 16, 2018.  I have been there before, and it is a very nice group of people to engage with.  If you are in the area, or would like to hear more about how Groovy in PBCS can change the landscape of performance, user interaction, improvement of data input, and reduced user frustration, please go to http://ohio.communities.oaug.org/ and register.  I would love to see as many of you there as possible.

Here is the agenda.  It is going to be a functional overview, but we will touch on how to start writing Groovy, and if you show up, I will be more than happy to talk before and/or after the session.




PBCS Data Map / Smart Push Has Data volume Limits

Introduction

When moving data in PBCS with Data Maps or Smart Pushes, there are limits on how much data can be moved.  The amount of data moved can be seen in the logs and looks something like this.

Failure

Exporting data…
Exported data file(s) size is: 207.1 MB.
Push Data failed. Error: Exported data size of data map that is being executed from groovy is more than permissible amount: 100 MB.

Success

Exported data file(s) size is: 464.7 MB.
EXPORT elapsed time: 39584
IMPORTING – AppName: AreakFin
TRANSFORM elapsed time: 63634
IMPORTING elapsed time: 21166
TOTAL elapsed time: 124553

Prior to the Feb 2018 release, the following did not always hold true.  If you are or were seeing inconsistencies, see Bug Report: Push Data failed.  That post also includes information about how the data cap works, as it differs between Data Maps and Smart Pushes, and is worth reading.

Data Movement Limits Identified

I got the following information from Oracle, and it is useful if you are using the data movement functionality.  When these are developed, it is a good idea to evaluate the size and plan for growth.  If the production data movements are nearing the thresholds, it is recommended to be proactive and try to reduce the POV that is used to move the data.  If it can’t be reduced, one option is to split it into multiple pushes which can be done with Smart Pushes on the Data Form save, or with Groovy.  Groovy also allows you to further condense the POV by dynamically changing the POV based on the cells edited, which is the most productive and efficient way to handle these.
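
As a rough sketch of that last option, a Groovy calculation could split one large push into several smaller ones so each export stays under the limit.  The Data Map and Entity member names below are made up for illustration.

// Push one region at a time instead of all regions in a single execution
["East Region", "West Region", "Central Region"].each { String region ->
  operation.application.getDataMap("Push To Reporting").execute(["Entity": region], false)
}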

So, here is what was documented.  The data limits imposed on the movement methods are below.

  • There is not a cap when running a Data Map
  • When executing the following, there is a cap of 100MB
    • Smart Push on a Data Form
    • Smart Push via a Groovy Calculation
    • Data Map via a Groovy Calculation

Summary

If you are not seeing this, I would recommend opening a ticket with Oracle to resolve.  I will be writing a post explaining how to execute and override POVs in Smart Pushes and Data Maps with a Groovy Calculation in the near future, so look for an article in my Adventures in Groovy series.





Adventures in Groovy – Part 7: Validating Run Time Prompts

Introduction

When developing applications in PBCS, there are times when we need to request information from users.  This is especially useful in workforce applications, but certainly not limited to them.  With Groovy, we can take validation to the next level.  We can compare inputs to each other, we can limit input, and we can now stop the execution of the calculation until the inputs are validated.  The difference now is that we have the ability to keep the prompts open and force the user to either enter valid data or cancel the request.

Validating Hire Date / Termination Date Use Case

Let’s start with a very simple example.  Let us assume we are adding a new employee, and we are asking for a hire and termination date.  A real-life example would require more options, like a name, title, union, and other required inputs.  To simplify this tutorial, we are only entering the hire and termination dates to prove out the validation and show functionality of the Groovy RTP validation logic.

When a user enters a termination date after a hire date and launches the rule, it validates and executes the rule.

When the job is opened in the job console, we see the RTPs and the values entered, and when the Job Status is selected, the log shows the values returned to Essbase.


When a user enters a termination date prior to a hire date and launches the rule, an error is returned and the RTP window doesn't go away.  At this point, the user has to correct the error or cancel the execution of the business rule.

In this case, the log shows the business rule failed.

Code

There are multiple objects that are used to accomplish RTP Validation.  The code that processed the above action is the following.

/*RTPS: {RTP_HireDate} {RTP_TermDate}*/
def mbUs = messageBundle(["validation.InvalidDate":"The termination date must be after the hire date."])
def mbl = messageBundleLoader(["en" : mbUs])

// Validate the Rtp values
if(rtps.RTP_HireDate.getEssbaseValue() > rtps.RTP_TermDate.getEssbaseValue())
   throwVetoException(mbl, "validation.InvalidDate", rtps.RTP_HireDate)

// Print the results to the log
println "Hire Date: " + rtps.RTP_HireDate.getEssbaseValue()
println "Term Date: " + rtps.RTP_TermDate.getEssbaseValue()

rtps object

Creating RTPs in Groovy was covered in the previous article.  If you haven't read that, now would be a good time to take a look, as it explains the basics of this object.  Expanding on its use, we are using some additional methods.  This object has many, including ones that return the input as a boolean, double, string, date, member, or Smart List, to name a few.  In this example, we are using getEssbaseValue, which returns the value sent to Essbase and stored.  If there were a need to compare date ranges, we could have used getDate and expanded on this with Groovy date functions to get the days, months, or years between the entered values.  In this simple example, we just want to make sure the termination date is after the hire date.
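
As a quick sketch of that date-based alternative, assuming the same RTP names as above and that getDate returns a java.util.Date, the gap between the two prompts can be calculated directly:

// Groovy date subtraction returns the number of days between two dates
Date hireDate = rtps.RTP_HireDate.getDate()
Date termDate = rtps.RTP_TermDate.getDate()
int daysEmployed = termDate - hireDate
println "Days between hire and termination: " + daysEmployed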

messageBundle

The first thing that is required is to create a messageBundle and a messageBundleLoader.  These two objects work together to hold the errors, the error messages, and multiple languages, if required.

The messageBundle is a map that holds all the errors (name and description).  In this example, we only have one error, but more can be added and separated by commas.  The messageBundleLoader holds all the messageBundle objects that represent the different languages.
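
For example, adding a second language is just another messageBundle and another entry in the loader map; the French text below is only illustrative.

def mbUs = messageBundle(["validation.InvalidDate":"The termination date must be after the hire date."])
def mbFrench = messageBundle(["validation.InvalidDate":"La date de fin doit être postérieure à la date d'embauche."])
def mbl = messageBundleLoader(["en" : mbUs, "fr" : mbFrench])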

throwVetoException

When an exception is found, executing this method will initiate an error and cause the RTP validations to fail.  This method requires the messageBundleLoader, and the error to be returned to the user.

Other Use Cases

By now you are probably already thinking of other uses of this.  I can see limiting growth rates, confirming combinations of RTPs (like not allowing salaried people in a union), ensuring that a new employee doesn’t have a hire date prior to the current date, and probably hundreds of other ways to use this.

If you would like to share an idea, please post a comment!

Conclusion

Being able to validate user input prior to executing a calculation and returning the input request to the user is a huge step forward, and just another benefit of Groovy calculations.  We can reduce the number of user errors and improve the user experience.




Bug Report: Push Data failed. Error: Exported data size violates permissible amount: 100 MB

Introduction

Data Map Error:

Push Data failed. Error: Exported data size of data map that is being executed from groovy is more than permissible amount: 100 MB.

If you are confused, join the club.  The results are inconsistent, as some data pushes over the 100MB limit are successful.  So, why the following error?

Exporting data…
Exported data file(s) size is: 207.1 MB.
Push Data failed. Error: Exported data size of data map that is being executed from groovy is more than permissible amount: 100 MB.

Clarification

A point of clarification for those of you who are new to data maps and smart pushes. If you think they are the same thing, here is the clarification from Oracle, in my words.

  • A Data Map is any data map executed from the Data Map area, whether it is through the UI, EPM Automate, or the REST API.
  • A Smart Push is essentially any Data Map executed from a Data Form.

Although they seem like the same function, they have different logical areas in execution.  My understanding is that a Data Map should never hit a cap on memory.  A Smart Push does have a cap.  Not only that, the way it was explained to me is that there is a hard cap on how much memory Smart Pushes can consume, and this is a global limit, not a limit per Smart Push.  So, the reason you are experiencing inconsistent results with Smart Pushes is quite simple.  The more Smart Pushes that are executed in a time window, the more memory is used.  You may never have a problem in Test, or at night, but during UAT or in Prod, successful execution may be intermittent.  When these are run periodically, that limit may never be reached; when they are run multiple times by multiple people in a short duration, the limit gets consumed.

This bug only applies to Data Maps.

The Problem

The same Data Map executed results in two different outcomes.

Failure

Exporting data…
Exported data file(s) size is: 207.1 MB.
Push Data failed. Error: Exported data size of data map that is being executed from groovy is more than permissible amount: 100 MB.

Success

Exported data file(s) size is: 464.7 MB.
EXPORT elapsed time: 39584
IMPORTING – AppName: AreakFin
TRANSFORM elapsed time: 63634
IMPORTING elapsed time: 21166
TOTAL elapsed time: 124553

So, if there is a cap at 100MB, what gives?  If you have seen the error above and wondered why the same Data Map sometimes runs and sometimes fails, it is related to Bug 27161430.

The Fix

Although support was difficult to navigate, I was lucky enough to be at an Oracle session in Virginia and talked to a developer.  He immediately requested the ticket number and said flat out, this is a problem.  I don’t want to name names, so a huge thank you to an unidentified developer at Oracle for giving me a few minutes and helping, because I don’t believe it would have been escalated to the development team otherwise.

The ticket was updated yesterday, and the fix is slated to be released in February. Although this is an internal bug, here are the details.

Bug 27161430 – PBCS: EXPORTED DATA SIZE OF DATA MAP THAT IS BEING EXECUTED FROM GROOVY IS MORE




Adventures in Groovy – Part 6: Converting a POV into a Fix

Introduction

One of the fundamental features of a Groovy calculation is the ability to dynamically grab aspects of a grid, and getting the POV is required to dynamically generate an Essbase calculation.  There are times when the entire POV is required, times when only members from specific dimensions are needed, and situations where only the rows and columns of the edited cells are used to construct effective Fix statements.  All three of these situations are discussed below.

Use Case (Pulling POV Members)

Many times, the Fix statement of a calculation being built includes all members in a data grid POV.

Code

List<String> povMemberNames = operation.grid.pov*.essbaseMbrName 
String calcScript = """   
Fix("${povMemberNames.join('", "')}")
 [Calculation]
EndFix;"""

Breaking It Down

The operation.grid.pov provides access to all the dimensions in the POV.  From that, dimension names, member names, and other useful properties can be accessed.  When followed by a *, it returns all the dimensions in the POV as a list.  Using essbaseMbrName instructs the function to return every member in the POV.  The povMemberNames variable stores a list of all those values.

When building the calcScript variable, ${povMemberNames.join('", "')} returns the list, delimited with quotes and commas.  This would produce something like membername1", "membername2", "membername3, which is obviously missing the outer quotes; that is why the expression is embedded inside quotes in the Fix statement.
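
A quick illustration with made-up member names shows what the join produces and why the surrounding quotes are needed:

// Illustration only: three hypothetical POV members
List<String> povMemberNames = ["Budget", "Working", "FY18"]
println povMemberNames.join('", "')                   // Budget", "Working", "FY18
println """Fix("${povMemberNames.join('", "')}")"""   // Fix("Budget", "Working", "FY18")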

Use Case (Pulling Selective Dimension Members From The POV)

Pulling one or all of the dimension values in a POV into individual variables for later use provides the ultimate flexibility.  The following pulls the members and stores each in its own variable, so any dimension in the POV can be accessed.  An easy way to accomplish this is to use the find method of a List object to filter the elements.

Code

List<String> povmbrs = operation.grid.pov
String curYear = povmbrs.find {it.dimName =='Years'}.essbaseMbrName
String curCompany = povmbrs.find {it.dimName =='Company'}.essbaseMbrName
String calcScript = """
Fix(${fixValues(curYear, curCompany)})
 [Calculation]
EndFix;"""

Breaking It Down

The first line stores the entire POV collection.  That variable can be accessed and used like any Groovy collection.  By using the find method, items in the collection can be filtered.  povmbrs.find {it.dimName == 'Years'} will return the Years dimension object, and one of the properties of that object is essbaseMbrName.  Using it will return an Essbase-friendly member name.

The “fixValues” method converts strings to “Fix friendly” strings that can be used in an Essbase calc script.  Any of the following objects can be inserted into this method.

  • AttributeDimension
  • AttributeMember
  • Dimension
  • Member
  • PeriodDimension
  • RtpValue
  • YearDimension

The result is the dimension member name inside a fix statement.

Rows/Columns

Some of the real efficiencies with Groovy stem from the fact that the edited cells can now be determined and a Fix statement can dynamically be generated to isolate longer running calculation on only the rows and columns that have changed.  In the following example, there are two methods to get the members.

Code

Set<String> periods = []  
Set<String> products = [] 
operation.grid.dataCellIterator({DataCell cell -> cell.edited}, MemberNameType.ESSBASE_NAME).each { DataCell cell ->         
  periods << cell.periodName       
  products << cell.getMemberName("Product")     
  } 

String calcScript = """   
Fix("${periods.join('", "')}", "${products.join('", "')}")
 [Calculation]
EndFix;"""

Breaking It Down

If you are unclear about how to iterate through a grid, read Adventures in Groovy Part 3: Acting On Edited Cells.  Inside the dataCellIterator, the example populates two sets with the respective members.  Since this only touches the cells that have been edited, only the periods and products that have been updated will be stored in the respective variables.

You may be wondering if this is truly the most isolated combination of data.  Theoretically, the same product may not have the same edited months.  You could further condense changes in the Fix statement by looping through the rows and creating a fix for every row.  In this example, Period is dense, so doing this would not change the number of blocks.  Depending on the application, taking this extra step might prove more efficient.  Rather than complicate this further, we are assuming this isn’t required.
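
For completeness, a sketch of that per-row variant could group the edited periods by product and build one Fix per product; the variable and dimension names mirror the example above.

// One Fix per edited product, scoped to only that product's edited periods
Map<String, Set<String>> periodsByProduct = [:].withDefault { [] as Set }
operation.grid.dataCellIterator({DataCell cell -> cell.edited}, MemberNameType.ESSBASE_NAME).each { DataCell cell ->
  periodsByProduct[cell.getMemberName("Product")] << cell.periodName
}
String calcScript = periodsByProduct.collect { product, prds ->
  """Fix("${product}", "${prds.join('", "')}")
 [Calculation]
EndFix;"""
}.join("\n")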

Every required dimension (Account, Entity, Period, Scenario, Version, and Year) has its own method. cell.periodName returns the Period member of the cell.  All the dimensions have the same naming convention.

For custom dimensions, getMemberName can be used with the required dimension passed as a parameter.  If you want consistency, this method can also retrieve the six required dimensions.  cell.getMemberName("DimensionName"), where DimensionName is an actual dimension, returns the respective member in that dimension.

Just like the previous example, add the variable to the Fix statement with a join and it returns the delimited list of members that have been edited.

  • "${periods.join('", "')}" returns the list of Periods that have edited cells, and
  • "${products.join('", "')}" returns the rows (or products) that have been edited.

Wrapping Up

One last step that can be added, and should be, is the check to see if any cells have been modified by evaluating the size of one of the lists that is created during the grid iteration.  After the iteration is completed, the following can be added to accomplish this.  If there are no cells edited, the calculation is stopped at the return line and nothing is sent back to Planning/Essbase to execute.

if(products.size() == 0) {
    println("No edited cells found!")
    return
}

Joining the examples above, the fix would look like this.

String calcScript = """  
 Fix("${povMemberNames.join('", "')}", "${periods.join('", "')}", "${products.join('", "')}", ${fixValues(curYear, curCompany))
   [Calculation]
EndFix;"""
return calcScript

Conclusion

If you implement this, you will likely see huge performance improvements.  In the client production applications where I have implemented this, I see at least a 95% improvement in performance.  This same logic can be used for Data Maps and data movements from ASO to BSO (which we will cover later).





Adventures in Groovy – Part 5: Accessing Substitution Variables

Introduction

Accessing Substitution Variables is critical in most calculations, and accessing them in Groovy is a little more complex than it needs to be because there is no API to get them.  Since the SubstitutionVariable class is not available, there are a couple of ways to get them.  The precursor to this post is three-fold.

  1. Read the Bug Report: Groovy SubstitutionVariable Class Not Functioning post on Jan 8, 2018 regarding the SubstitutionVariable class availability.
  2. Thanks to Abhi for providing a great alternative.
  3. It may be helpful to read Adventures in Groovy Part 4: Run Time Prompts to understand how to access RTPs in a Groovy calculation.

In my bug report above, I suggested grabbing them via a hidden column or row on a form.  A reader suggested another way to do this, and I think it is a better approach.  Rather than grabbing the substitution variable by adding it to the form and hiding the column/row from the user, Abhi provided a much cleaner workaround to not having access to the SubstitutionVariable class: hidden RTPs.

Create Run Time Prompts to Access Substitution Variables

Assume the following 3 variables are required in business rules.  Create a new RTP for each.  The naming convention is irrelevant, but should be considered and be consistent for easy reference in the business rules.  In this read, I have assumed there isn’t an existing RTP with the defaults set to a substitution variable.  Even if there is, it might be beneficial to create ones specifically for this need so future changes don’t impact the values.

Name: subVar_CurMonth
Type: Member
Dimension: Period
Default Value: &v_CurMonth
RTP Text: N/A

Name: subVar_CurYear
Type: Member
Dimension: Years
Default Value: &v_CurYear
RTP Text: N/A

Name: subVar_BudYear
Type: Member
Dimension: Years
Default Value: &v_BudYear
RTP Text: N/A

Business Rule Inclusion

Inside the business rule, the following convention is required to add the variables.

/*RTPS: {subVar_CurMonth subVar_CurYear subVar_BudYear}*/

Set all the RTPs in the Variables tab to hidden so the user isn't prompted for them.  Now the substitution variables can be referenced.

def varCurMonth = rtps.subVar_CurMonth.toString()
def varCurYear = rtps.subVar_CurYear.toString()
def varBudYear = rtps.subVar_BudYear.toString()

Conclusion

Since these are likely to be used in many rules, it would be beneficial to add them to a script and embed that script into the rules that need access to them.  Any new variable that needs to be included can be added to the script, and all the business rules would then have access to it.  There are a number of ways to do this with Groovy calculations, but the simplest is to embed it like a non-Groovy business rule.  This can be dragged from the left pane or entered manually.  The syntax is

%Script(name:="script name",application:="application Name",plantype:="plantype name")
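
With hypothetical names filled in, the reference would look something like this:

%Script(name:="Get Substitution Variables",application:="Finance",plantype:="Plan1")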

If and when Oracle releases the class that provides direct access to sub vars, expect it to be documented here.




Easy Way to Randomize Data in PBCS

Introduction

When an application is used to demonstrate sensitive information, when a training class includes people that shouldn't see live data, or when security is being tested, using real data is often not an option.  I have written PowerShell scripts and .NET applications to randomize data from Essbase exports, as well as Custom Defined Functions to randomize with calculations.  PBCS just made it much easier.  Using one EPM Automate command, all the data in the application can be randomized.  We don't have the ability to control what is changed, which would be a nice addition; I always excluded anything greater than 0 and less than or equal to 1 so that rates and things like boolean true/false flags or headcount were left alone.  But the ability to do it in one command is mighty appreciated!

Masking Data in PBCS

In a recent update to EPM Automate, Oracle introduced a new command, maskData.  It can't get any easier to use.  Once you log in to EPM Automate, simply run maskData.

epmautomate maskData

When this is executed, it will ask you to confirm that you want to randomize the data in the application.  If you want to bypass the confirmation, add -f after maskData and it will force the command to run without confirmation.
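
For example, to run it without the confirmation prompt:

epmautomate maskData -f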

Thank You Mr. Obvious

Be very careful using this!  Make sure you are logged on to test, and not production!