
Tuesday, June 18, 2024

Data Extract to File using Data Management - Data Exchange

As we know, Oracle updates its cloud applications with the latest features regularly. In the same way, Data Extract to File using Data Management has also been updated. Here you can find the new steps.

Go to Navigate >> Application tab >> Select Data Exchange

On the Data Exchange tab, click the Actions drop-down and select Applications. You can refer to the screenshot below.

Click the + sign to create source or target applications.


Select Data Export, and in the list below choose Data Export to File, since here we are exporting data to a file.
Next, browse and select the file format in which you want to export the data, such as Account, Account Alias, Customer, Customer Alias, and so on.

Now go to Data Management and, after creating the import format and location, create a data load rule. Update the mappings as per your requirement, and then you are ready to extract data from the PBCS application to a file.

Once the data load rule execution is complete, you can find the file in the location below.
Navigate >> Application >> Overview >> Actions >> Inbox/Outbox Folder

You can download the file to your local machine to verify the data using Excel.
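The same download can also be scripted with EPM Automate. Below is a minimal dry-run sketch: the epmautomate binary is stubbed with echo so it only prints the commands, and the user, password, URL, and file name are assumptions, not values from this post.

```shell
# Dry-run by default: the stub prints each command instead of executing it.
# Point EPMAUTOMATE at the real binary to run against your environment.
EPMAUTOMATE="${EPMAUTOMATE:-echo epmautomate}"

OUT="$(
  $EPMAUTOMATE login admin MyPassword https://example-pbcs.oraclecloud.com
  $EPMAUTOMATE listfiles                             # shows Inbox/Outbox contents
  $EPMAUTOMATE downloadfile "outbox/DataExport.csv"  # file name is an assumption
  $EPMAUTOMATE logout
)"
echo "$OUT"
```

With the stub in place, the block simply prints the four commands, which makes it easy to review the sequence before pointing it at a real environment.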

Thanks,
Mady

Sunday, June 2, 2024

PBCS - Application metadata build and data load automation


We can approach EPM automation using two methods, as we have explored:

 1. EPM Automate

  - Needs a server to stage files and to install EPM Automate
  - Create an automation script to execute the metadata and data load tasks
  - Schedule and execute the tasks as per requirement
  - The script can send an email with its status

     For example: We have created a script using EPM Automate which performs the tasks below:

    1. Log in to the URL and set up log files to record script activity; send an email if the login fails.
    2. Execute a business rule to set variables.
    3. Execute Data Management rules to load metadata for 4 dimensions.
    4. Refresh the database.
    5. Execute Data Management rules to load data.
    6. Execute a business ruleset.
    7. Upload the log and error files to the outbox folder.
    8. Send an email to recipients with the attachments, along with a timestamp.
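The eight steps above can be sketched as a shell wrapper around EPM Automate. This is a minimal dry-run sketch, not the actual script: the epmautomate binary is stubbed with echo so it only prints each command, and the URL, credentials, rule names, and periods are all placeholders.

```shell
#!/bin/bash
# Minimal sketch of the nightly script. Dry-run by default: the stub below
# prints each command instead of executing it. Point EPMAUTOMATE at the
# real binary to run it. Credentials, URL, and rule names are placeholders.
EPMAUTOMATE="${EPMAUTOMATE:-echo epmautomate}"
LOG=nightly_load.log
: > "$LOG"

run() {
  # Run one EPM Automate step, logging it; on failure, note it and stop.
  if ! $EPMAUTOMATE "$@" >> "$LOG" 2>&1; then
    echo "Step failed: $*" >> "$LOG"
    # a mail utility (e.g. mailx) would email the failure notice here
    exit 1
  fi
}

run login admin MyPassword https://example-pbcs.oraclecloud.com  # 1. login
run runbusinessrule SetVariables                                 # 2. set variables
for dim in Account Entity Product CostCenter; do                 # 3. metadata loads
  run rundatarule "Load_${dim}_Metadata" Jan-24 Jan-24 REPLACE STORE_DATA
done
run refreshcube                                                  # 4. refresh database
run rundatarule Load_Actuals Jan-24 Jan-24 REPLACE STORE_DATA    # 5. data load
run runruleset PostLoadRuleset                                   # 6. business ruleset
run uploadfile "$LOG"                                            # 7. upload log file
run logout
# 8. emailing recipients the log with a timestamp would follow here
```

The `run` wrapper mirrors the behaviour described in step 1: every command's output lands in one log file, and any failing step stops the script so a notification can be sent.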

2. Automation without a Server (using OIC)

  - Instead of a server, we can use OIC (Oracle Integration Cloud) to move files to the cloud
  - We can create a Groovy script to run the metadata and data load tasks, and execute it using OIC
  - Schedule and execute the OIC integration as per requirement
  - OIC can send an email with the integration status; the advantage here is that each task can be set to abort on error and send an email if it fails
For example: We have created a few integrations; let me explain one of them.
  - To load Subledger Revenue data into the system:
  1. The Subledger Revenue data file is pulled from the ERP system, so we created an integration to pull the data from ERP and place it into our DM inbox folder. No server is used here; the file lands directly in EPM from ERP.
  2. We created a Data Management load rule to load the Subledger Revenue data.
  3. We created a Groovy business rule script to execute the Data Management rule with the help of substitution variables, for either one month or multiple months.
  4. We created a job to execute that Groovy script.
  5. In OIC, we have an integration that pulls the file from ERP into the EPM inbox folder and then executes the pre-dataload business rule, the data load, and the post-dataload rules. At each step it waits until the process completes; if any step fails, it aborts the integration and sends an email to the recipients.
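The pre-dataload / data load / post-dataload sequence above can be sketched with the equivalent EPM Automate calls. This is a hedged dry-run sketch: rule names, credentials, URL, and periods are placeholders, and in the real flow the period range comes from substitution variables read by the Groovy rule rather than from hard-coded values.

```shell
# Dry-run by default; point EPMAUTOMATE at the real binary to execute.
# Rule names, credentials, URL, and periods below are placeholders.
EPMAUTOMATE="${EPMAUTOMATE:-echo epmautomate}"

START_PER="Jul-24"   # in the real flow these come from substitution variables
END_PER="Sep-24"

OUT="$(
  $EPMAUTOMATE login intg_user MyPassword https://example-pbcs.oraclecloud.com
  $EPMAUTOMATE runbusinessrule PreDataLoad       # pre-dataload rule
  $EPMAUTOMATE rundatarule SubledgerRevenue "$START_PER" "$END_PER" REPLACE STORE_DATA
  $EPMAUTOMATE runbusinessrule PostDataLoad      # post-dataload rule
  $EPMAUTOMATE logout
)"
echo "$OUT"
```

Passing a start and end period to `rundatarule` is what lets one invocation cover either a single month or a multi-month range, which is the same choice the Groovy rule makes via substitution variables.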


Thanks,

Mady

Friday, May 17, 2024

Oracle Data Management - Target Options

We can create different types of targets in Data Exchange:


EPM Local: To make the current application the target, use this method.

EPM Cloud: To connect to any other cloud application, use this method; here we can update the URL, username, password, domain, etc.

Data Source: Has many built-in adapters that can be used to create a target.

Data Export: To export data from the application, use this option.

Dimensions: To build dimensions, use this method; the system provides 6 different dimension types along with a Custom option.


Thanks,
Mady





Oracle Cloud EPM Planning - Load metadata using Data Management

 Loading metadata is also one of the major tasks in automating the process.

Create the source application as File and the target application as Dimensions.


Once Dimensions is selected, we have to select the application name and a prefix. When you click OK, the system automatically creates 6 types of EPM dimension targets:

ACCOUNT, ENTITY, SMARTLIST, VERSION, SCENARIO, and CUSTOM

We have to use Custom as our target to build dimensions other than those mentioned above.

The import format used to build the dimension is:


We have selected only Member Name, Parent, Alias, and Data Storage.

As we know, create the location and period mappings as per the requirement. Create the data rule and update the mappings accordingly. If any rows have blanks in any column of the file, we can use the logic below in the mapping.
#SQL
CASE
  WHEN ICP IS NULL THEN ' '
  ELSE ICP
END

Finally, the metadata load will complete with 3 gold fish in the workbench. :)



Thanks,

Mady

Tuesday, April 16, 2024

Oracle Integration Cloud - Business Rules in Planning Cloud

 If you want to automate business rules to execute in sequence or as part of a process, you can use Oracle Integration Cloud.

For example: suppose you want to execute a business rule to load data every month, based on a substitution variable, and also want to run a calculation using another business rule. You can do all these steps using OIC: pull the substitution variable, execute the business rules, and, best of all, receive an email notification if any step fails, which is not available in the general scheduling options.

I will try to document OIC automation in upcoming posts; this applies mainly to PBCS and FCCS.

Thanks,
Mady


Friday, September 29, 2023

Event Scripts in FDMEE


Event Scripts, located in the Script Editor within FDMEE, are used to execute a task when certain events are carried out during the data load process. Each of the Event Scripts is customizable – users can use the scripts to perform a wide variety of tasks, such as sending emails, running calculations, or executing a consolidation. There are nine different events, and users can run scripts before or after each event is executed, making a total of eighteen Event Scripts. The nine types of events are: Import, Calculate, ProcLogicGrp, ProcMap, Validate, ExportToDat, Load, Consolidate, and Check, with a Bef and Aft for each event. To create an Event Script, select the type of event, choose whether the script will run before or after that event, and click ‘New’.

Here are descriptions of each of the Event Scripts:

BefImport – This script runs before the Location is processed.

AftImport – Runs after the Location is processed and the data table is populated.

BefCalculate – Only for a Validation run and takes place before the Validation.

AftCalculate – Only for a Validation run and takes place after the Validation.

BefProcLogicGrp – Runs before Logic Accounts are processed.

AftProcLogicGrp – Runs after Logic Accounts are processed.

BefProcMap – Called before the mapping process for the data table begins.

AftProcMap – Called after the mapping process for the data table has finished.

BefValidate – Checks if the mapping process has been completed.

AftValidate – Runs after the BefValidate Script.

BefExportToDat – Called before the data file is created for Export.

AftExportToDat – Called after the data file is created for Export.

BefLoad – Called before the data file is loaded to the Target Application.

AftLoad – Called after the data file is loaded to the Target Application.

BefConsolidate – Called when a check rule is included in the location that is being processed. Can only be used if the Target Application is Hyperion Financial Management or Essbase.

AftConsolidate – Runs after the BefConsolidate Script.

BefCheck – Runs before the Check Rule.

AftCheck – Runs after the Check Rule.

One final thing, when creating an Event Script, make sure that there is a Script Folder within the Application Root Folder. Event Scripts are application specific, so they need to be kept in the Script Folder for the corresponding Application’s folder system.


Collected from https://hollandparker.com/blog/2016/04/21/event-scripts-in-fdmee/



Thanks,

Mady

Saturday, August 26, 2023

Data Management - same combination of all dimensions showing different data

 In Data Management, we found one issue today.

For the same combination of all dimensions, we were getting different numbers. For example, Dec-23 for one combination was returning multiple records with different data. When we checked other months with the same combination, we got only one record.

We verified the source file records, cleared the data, and loaded again; still the same issue.

We loaded only one record for Dec-23, yet still got 12-15 records with different data.

We didn't understand the issue, so we performed the same steps in another environment with the same source, and there we got only one record.

So, finally, after checking all the steps one by one, we found the cause.

We checked the period mapping, where the prior-period key had been entered as 30-11-2022 (by mistake) instead of 30-11-2023.

Once we updated the prior-period key, we got the expected result: only one record.


Thanks,

Mady