

Wednesday, January 28, 2026

OIC - Pull data using BIP Report into Data Management with Substitution variables

      How to pull data using BIP Report into Data Management with Substitution variables

  • Go to Data Exchange > Applications, select Data Source, and choose EPM (Custom).
  • In File, select the BIP Report file from which you are going to pull data. This column structure is what the system needs to pull the data in correctly.
  • Update the report details as shown below: select the ERP connection name.
  • Select the Execution Method as BIP Report, and enter the report name along with its path.
  • In Parameter List, if none is mentioned, select a unique value in the columns, such as Period=Jan, or a dummy column with Y as shown below.

BIP Report
If we plan to execute a report data load for a period like Jan-26 using Data Management, we can mention the following as an argument so that the Period is picked up from the POV of the Data Load Rule.

argument1=$START_PERIODKEY[MMM-YY]$;
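To make the token concrete, here is a small sketch of how a period key could be rendered with the MMM-YY pattern from $START_PERIODKEY[MMM-YY]$. Mapping MMM-YY onto Python's %b-%y date format is an assumption for illustration, not part of the Data Management token syntax itself.

```python
from datetime import date

# Sketch: rendering a Data Management period key with an MMM-YY
# pattern, as in $START_PERIODKEY[MMM-YY]$. The mapping of MMM-YY
# to strftime's %b-%y is an assumption for illustration.
def format_period_key(period_key: date) -> str:
    return period_key.strftime("%b-%y")

print(format_period_key(date(2026, 1, 1)))  # Jan-26
```

So a January 2026 POV would be passed to the BIP Report as Jan-26.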

If we plan to use substitution variables to pass arguments as in the above method, it will not work, because this report connects to ERP, while our variable needs to be picked up from EPM.

However, when using a substitution variable in OIC to execute the data load rule, OIC picks the substitution variable from EPM to execute the Data Load Rule and passes the same value through to the BIP Report as well. In that case the substitution variable works.

Generally, to bring substitution variables from the EPM Planning application into OIC to run the Data Load Rule for that period, we can use the following:
'{&CurrentMonth#&CurrentYear}'
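As a sketch, the token above can be resolved in OIC-style logic against substitution-variable values fetched from Planning (for example via GET .../substitutionvariables). The variable names, the "-" join separator, and the sample values are assumptions for illustration.

```python
import re

# Sketch: resolving a token like '{&CurrentMonth#&CurrentYear}'
# against substitution-variable values already fetched from the
# Planning application. The "-" separator used to join the values
# is an assumption for illustration.
def resolve_subvar_token(token: str, subvars: dict) -> str:
    # Pull every &Name reference out of the token, in order.
    names = re.findall(r"&(\w+)", token)
    return "-".join(subvars[name] for name in names)

pov = resolve_subvar_token("{&CurrentMonth#&CurrentYear}",
                           {"CurrentMonth": "Jan", "CurrentYear": "26"})
print(pov)  # Jan-26
```

The resolved value can then be supplied as the period when OIC triggers the Data Load Rule.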

Thanks,
Mady

Oracle EPM - Data Management error tracing

                 Tips and Suggestions on Data Management Errors

In Data Management, when tracing errors we sometimes get confused or forget where to check. I am recording a few of those places here.

  • First of all, check the Process Details log, where you can see the process statements immediately after the process completes.
  • In a data load, if you get an error with [None] in the Currency dimension, check the Location where you selected the Import Format, and replace [None] with the functional currency.
  • Invalid period means that, based on the period mapping, the system is unable to match the period with the file or source inputs.
  • While exporting and importing period mappings, you may face a date-format issue where Excel changes the values into a number format or a different date format; use Excel's Custom format option to restore the DM date format, then import the file.
  • If the period is present in the file, use the Explicit Period option in the Source Mapping to pick the period from the file and match it with the period mapping.
Thanks,
Mady

Tuesday, January 27, 2026

EPM - How to load Attributes into PBCS application

                 How to load Attributes into PBCS application

To load attributes, prepare a file like this:

Employee Number1, Alias, Permanent, Monthly

Employee Number2, Alias, contract, Hourly

Here, the 3rd and 4th columns are attributes, which I am going to load along with the employee numbers and alias metadata using Data Management/Data Exchange.


While creating the application, select the dimensions and, if needed, use a prefix.

As this creates the dimensions, if you are going to load a custom dimension, select Custom, add the attribute name (the parent member of the attributes you created in the application), and assign UD1/UD2/UD3 depending on the number of attribute columns you are loading.

Then create an import format, where you select the source and target dimensions and the attributes.

Note: If you are getting blanks for a few attributes, we would generally write SQL in the mapping to ignore the blanks, but here is the catch: if you ignore an attribute, the complete line is ignored, which means the metadata will not build either. So, if you want to load metadata along with attributes, you need one member in each attribute dimension to assign to the blanks in the attribute list. Attributes also cannot be duplicated, so use members like NoAttribute1, NoAttribute2, etc.

While loading metadata, blanks can be ignored for attributes but not for metadata members, so the above tip is helpful.
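The blank-filling idea above can also be done as a pre-processing step on the file before the load. This is a sketch, not the in-mapping SQL: the column positions and the NoAttribute placeholder names follow the example file earlier in this post and are assumptions.

```python
import csv
import io

# Sketch: pre-filling blank attribute columns with placeholder
# members (NoAttribute1, NoAttribute2) before the Data Management
# load, so rows with blank attributes are not dropped. Column
# positions 2 and 3 match the example file in this post.
def fill_blank_attributes(rows, attr_cols=(2, 3)):
    filled = []
    for row in rows:
        row = list(row)
        for n, col in enumerate(attr_cols, start=1):
            if not row[col].strip():
                row[col] = f"NoAttribute{n}"
        filled.append(row)
    return filled

src = "E1001,Alias1,Permanent,Monthly\nE1002,Alias2,,\n"
rows = fill_blank_attributes(csv.reader(io.StringIO(src)))
print(rows[1])  # ['E1002', 'Alias2', 'NoAttribute1', 'NoAttribute2']
```

With the placeholders in place, both the metadata and the attribute assignments build, and nothing is silently skipped.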

Load and test once it's completed!

Thanks,
Mady

Friday, June 27, 2025

OIC - To place file in Amazon S3 Bucket

To place a file in an Amazon S3 bucket, we can use the method below:

Use the Put Object operation to place the file into the S3 bucket.


Update the bucket name in the screen below.

Check Summary and click Finish.


In the mapping, update the details like this. You can mention the file name directly, or, if needed, include the folder name along with it, as shown in the screenshot below.
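The mapping step above boils down to composing the Put Object parameters: a bucket name plus an object key that may carry a folder prefix. This is a small sketch with illustrative names; with boto3, for example, the same dictionary could be passed to s3.put_object(...) along with the file body.

```python
# Sketch: composing Put Object parameters the way the mapping
# does -- bucket name, plus an object key that may include a
# folder prefix. Bucket, folder, and file names are illustrative
# assumptions.
def build_put_object_params(bucket: str, filename: str, folder: str = "") -> dict:
    key = f"{folder}/{filename}" if folder else filename
    return {"Bucket": bucket, "Key": key}

params = build_put_object_params("my-epm-bucket", "export.csv", "outbound")
print(params)  # {'Bucket': 'my-epm-bucket', 'Key': 'outbound/export.csv'}
```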



After executing, check the S3 bucket to verify whether the file was placed.


Thanks,

Mady




OIC - Upload file

        How to upload a file into the EPM inbox/outbox folder or into the EPM - DM - inbox folder


We can use the REST API below:


/interop/rest/11.1.2.3.600/applicationsnapshots/{applicationSnapshotName}/contents

Use the POST method with a Binary request payload. The response looks like:

{
  "status" : "",
  "details" : ""
}
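From OIC's side, the call above is just the snapshot name URL-encoded into the path and the file body sent as raw binary via POST. This sketch only builds the URL; the base host and file name are illustrative assumptions.

```python
from urllib.parse import quote

# Sketch: building the upload URL for the interop REST resource
# shown above. The snapshot (file) name is URL-encoded into the
# path; base URL and file name are illustrative assumptions.
def build_upload_url(base_url: str, snapshot_name: str) -> str:
    path = (f"/interop/rest/11.1.2.3.600/applicationsnapshots/"
            f"{quote(snapshot_name)}/contents")
    return base_url.rstrip("/") + path

url = build_upload_url("https://epm.example.com", "data.csv")
print(url)
```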


Thanks,

Mady


Wednesday, June 4, 2025

OIC - To get file from Amazon S3 Bucket

                                                    How to get file from AWS bucket


You can use the Get Object operation to pull a file from the AWS bucket.

Specify the bucket name and object name, then update the same details in the mapping.

If you add an upload-file step, the file will be pushed from the AWS bucket to the target location.
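The Get Object side mirrors the Put Object mapping: the same bucket/key details, plus the local target path the downloaded file ends up at. A small sketch with illustrative names:

```python
import os

# Sketch: the Get Object parameters and the local target path the
# downloaded file would be written to. Bucket, key, and stage
# directory names are illustrative assumptions.
def build_get_object_params(bucket: str, key: str) -> dict:
    return {"Bucket": bucket, "Key": key}

def target_path(stage_dir: str, key: str) -> str:
    # keep only the file name from the object key
    return os.path.join(stage_dir, os.path.basename(key))

print(build_get_object_params("my-epm-bucket", "inbound/actuals.csv"))
print(target_path("/tmp/stage", "inbound/actuals.csv"))  # /tmp/stage/actuals.csv
```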



Thanks,
Mady




OIC - Application Admin Mode

                 How to place the application into Admin Mode

Post:

/HyperionPlanning/rest/v3/applications/NiccoFPR/jobs

Request:

{
  "jobType" : "Administration Mode",
  "jobName" : "AppAdminJob",
  "parameters" : {
    "loginLevel" : "Administrators"
  }
}

In Mapping:


To put the application into Admin Mode, use the parameter value Administrators; to release the application to all users, mention All Users.
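The toggle above can be sketched as a small payload builder: the same job body from this post, with loginLevel switched between "Administrators" and "All Users". The job name is the one shown in the request; everything else follows the post.

```python
import json

# Sketch: building the Administration Mode job body from this
# post, with loginLevel toggled between "Administrators" (admin
# mode) and "All Users" (release to everyone).
def admin_mode_payload(login_level: str) -> str:
    return json.dumps({
        "jobType": "Administration Mode",
        "jobName": "AppAdminJob",
        "parameters": {"loginLevel": login_level},
    })

print(admin_mode_payload("Administrators"))
```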


Thanks,

Mady

OIC - Import Data

 How to import data: we can use the REST API below, executed from OIC.

Post:

/HyperionPlanning/rest/v3/applications/NiccoFPR/jobs


Request:

{
  "jobType" : "IMPORT_METADATA",
  "jobName" : "ImportMetaDataJob",
  "parameters" : {
    "errorFile" : "ImportMetaDataErrorFile.zip",
    "importZipFileName" : "myMetaDataDailyJob.zip"
  }
}

Response:

{
  "status" : 0,
  "details" : "",
  "jobId" : 224,
  "jobName" : ""
}

Get Status:

/HyperionPlanning/rest/v3/applications/NiccoFPR/jobs/{jobIdentifier}/details

Import metadata status:

Response:

{
  "items" : [ {
    "recordsRead" : 0,
    "recordsRejected" : 0,
    "recordsProcessed" : 0,
    "dimensionName" : "Departments",
    "loadType" : "Metadata Import"
  } ]
}
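The submit-then-poll cycle above can be sketched as follows: POST the IMPORT_METADATA body, read jobId from the response, then GET the details endpoint until the job finishes. The HTTP layer itself is left out (in OIC the connection handles it); the paths and payload follow this post.

```python
# Sketch of the submit-then-poll cycle above. The HTTP layer is
# omitted; only the request body and status URL construction are
# shown. Paths and payload values follow the post.
BASE = "/HyperionPlanning/rest/v3/applications/NiccoFPR/jobs"

def import_metadata_request() -> dict:
    return {
        "jobType": "IMPORT_METADATA",
        "jobName": "ImportMetaDataJob",
        "parameters": {
            "errorFile": "ImportMetaDataErrorFile.zip",
            "importZipFileName": "myMetaDataDailyJob.zip",
        },
    }

def status_url(job_id: int) -> str:
    # GET this until the job status shows completion.
    return f"{BASE}/{job_id}/details"

print(status_url(224))  # /HyperionPlanning/rest/v3/applications/NiccoFPR/jobs/224/details
```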


Thanks,

Mady