Extracting Data from Oracle Fusion SaaS using Data Replicator in Oracle Analytics Cloud

If you were keeping up to date with the goings-on at Oracle OpenWorld last week, then you may have caught Larry Ellison announcing the ‘Fusion Analytics Data Warehouse‘.

Whilst the specific features that were showcased are unlikely to be available in the very near future, we have been exploring some of the functionality made available in the latest releases of Fusion SaaS and Oracle Analytics Cloud that likely underpins his demo. The key components here are the BI Cloud Connector Console (BICC) in Fusion SaaS and the Data Replicator functionality in Oracle Analytics Cloud (OAC). BICC allows you to extract data from Fusion SaaS as .CSV files and push it to either the built-in Universal Content Management (UCM) server or an Oracle Cloud Storage instance. Data Replicator allows you to orchestrate this process from within OAC and manages the transfer of data from the .CSV files into an Oracle Database – it even creates the tables for you!

The aim of this post is to walk you through the end-to-end process of bringing your Fusion SaaS data into an Oracle Analytics Cloud environment. The instructions provided below assume you have access to the following:

– Oracle Analytics Cloud (OAC) instance

– Oracle DBaaS instance

– Oracle Storage Cloud instance

– Oracle Cloud ERP/HCM (Fusion SaaS) instance

Create a BICC Admin Role in Fusion SaaS

The first thing we need to do is create a role that has the necessary privileges to schedule extract jobs in BICC. We can do this by creating an abstract role that inherits the relevant privileges from existing roles. Navigate to the Security Console page in Fusion and select Create Role, then provide the following details:

– Role Name: BIACM_ADMIN

– Role Code: BIACM_ADMIN

– Role Category: BI – Abstract Roles
Navigate to the Role Hierarchy page, select Add Role and add the following roles:

– EssAdmin

– ORA_ASM_APPLICATION_IMPLEMENTATION_ADMIN_ABSTRACT

 
Skip through the wizard to the Users page, select Add User and add the relevant user(s) to the role:

Click Next to review your changes on the Summary and Impact Report page, and then Save and Close:
Run the Retrieve Latest LDAP Changes Process

Whenever we make changes in the Security Console, we need to run a process to propagate those changes throughout the rest of the application. Navigate to Tools > Scheduled Processes and select Schedule New Process:
Search for Retrieve Latest LDAP Changes and submit the process:
Ensure the process has completed before you try to run a Replication job.

Configure External Storage

The next step in the process is to configure the BI Cloud Connector to push the .CSV files to our cloud storage service. Log in to the Business Intelligence Administration Console (BIACM) within Fusion SaaS and navigate to the Configure External Storage page:
Under the Configure External Storage tab, select Cloud Storage Service as the storage type and then fill in the rest of the details under Storage Service Configuration to match your Cloud Storage instance:

Do the same under the Configure OAC External Storage tab:
You can test the connection details to make sure they are configured correctly:
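If you prefer, the connection test can also be reproduced by hand. Below is a minimal Python sketch that builds the Swift-style authentication request that the classic Oracle Storage Cloud REST endpoint expects; the identity domain, username, and URL pattern here are assumptions, so substitute the values from your own Storage Service Configuration page:

```python
# Sketch of a manual Oracle Storage Cloud connection check (Swift-style
# TempAuth). All credentials and the identity domain below are placeholders.
import urllib.request


def build_auth_request(identity_domain: str, username: str, password: str):
    """Build the auth URL and headers for a classic Storage Cloud instance."""
    url = f"https://{identity_domain}.storage.oraclecloud.com/auth/v1.0"
    headers = {
        "X-Storage-User": f"Storage-{identity_domain}:{username}",
        "X-Storage-Pass": password,
    }
    return url, headers


if __name__ == "__main__":
    url, headers = build_auth_request("mydomain", "cloud.admin", "secret")
    req = urllib.request.Request(url, headers=headers)
    # A 200 response carrying an X-Auth-Token header means the credentials
    # work. Uncomment to actually hit the endpoint:
    # urllib.request.urlopen(req)
```

A successful response here should line up with a green "Test Connection" result in BIACM.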
That’s it for setup within Fusion SaaS – from here on we will configure and run the extract definition from Oracle Analytics Cloud.

Set Up and Run the Cloud Replicator Extract Definition

Log in to Oracle Analytics Cloud and select Create > Replication Connection:
Select Oracle Fusion Application Storage:
Enter the details for your Cloud Storage service and Fusion SaaS application – the Host Name field should contain your Fusion SaaS base URL appended with /bi/ess/esswebservice:
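Deriving that Host Name value is simple string concatenation; the sketch below shows it in Python, with a made-up example instance URL:

```python
# Helper for building the Host Name value from a Fusion SaaS base URL.
# The example instance URL is illustrative, not a real environment.
def ess_endpoint(base_url: str) -> str:
    """Append the ESS web service path to a Fusion SaaS base URL."""
    return base_url.rstrip("/") + "/bi/ess/esswebservice"


print(ess_endpoint("https://myinstance.fa.em2.oraclecloud.com"))
# https://myinstance.fa.em2.oraclecloud.com/bi/ess/esswebservice
```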
Save the connection and then create another Replication Connection, this time selecting Oracle Database as the connection type:
Enter the connection details for your Oracle DBaaS Service:
  Save the connection and then select Create > Data Replication:
Select the Fusion Application Storage Connection that we created earlier and click OK:
Now we can get on with defining our Replication definition – here we can select the Offering you want to extract from (you can only choose one per Replication) and the PVO objects within that offering that you are interested in. You can find more detail on all of the available PVOs and their associated content at https://docs.oracle.com/en/cloud/saas/business-intelligence/index.html. For this example, we will select the Financials offering:
Once you’ve selected an offering and chosen the PVOs that you are interested in, you will need to choose the columns you want to include in your replication and configure the table that the data will end up in. As you can see below, we’ve done a search on ‘journal’ and selected the JournalLinesPVO. If we expand that object we can see that we’ve got a load of fields selected by default, some of which do not have meaningful target names, and a Target Object Name that doesn’t make a whole lot of sense:
Let’s choose a more meaningful target table name, de-select some of the fields that we aren’t interested in and rename some of the columns that have auto-generated names:
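To illustrate the kind of clean-up being done in the UI here, the sketch below models the column selection as a simple rename map. All of the column names are invented examples, not the real JournalLinesPVO field list:

```python
# Model of the column clean-up step: keep a rename map from auto-generated
# PVO names to friendly target names, and a list of fields to drop.
# Every name here is a hypothetical example.
selected_columns = {
    "JrnlLinesJournalLineAccountedDr": "ACCOUNTED_DR",
    "JrnlLinesJournalLineAccountedCr": "ACCOUNTED_CR",
    "JrnlLinesJournalLineDescription": "LINE_DESCRIPTION",
}
dropped_columns = ["ExtnAttribute1", "ExtnAttribute2"]


def target_columns(source_columns):
    """Return the renamed columns to keep, in source order."""
    return [
        selected_columns[c]
        for c in source_columns
        if c in selected_columns and c not in dropped_columns
    ]
```

The point is simply that anything not in the map (or explicitly dropped) never reaches the target table, which mirrors what de-selecting a field in the Replication Definition does.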
Once we are happy with our Replication Definition, we can select our Replication Target. Here we will select the DBaaS Connection that we created earlier:
Make sure you are loading into the correct Schema:
When you are happy with your replication definition, save it with an appropriate name:
Now we are ready to run the Replication Flow:
Once our replication has completed, we should see the successful run under the Data > Data Replications page:
And that’s it! You should now have a table in your DBaaS instance full of data that’s been extracted from your Fusion SaaS instance:
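If you want to confirm the load outside of OAC, a quick row count against the DBaaS target does the trick. This Python sketch assumes the cx_Oracle driver and uses placeholder schema/table names and connection details:

```python
# Sanity-check sketch for the replicated table in DBaaS. Schema, table,
# and connection details are placeholders for your own environment;
# cx_Oracle is only imported when the check is actually run.
def count_query(schema: str, table: str) -> str:
    """Build a row-count query for the replicated table."""
    return f"SELECT COUNT(*) FROM {schema}.{table}"


def replicated_row_count(user, password, dsn, schema, table):
    import cx_Oracle  # pip install cx_Oracle

    with cx_Oracle.connect(user, password, dsn) as conn:
        cur = conn.cursor()
        cur.execute(count_query(schema, table))
        return cur.fetchone()[0]
```

A non-zero count after the Replication Flow completes confirms that the rows made it across.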
  From here we can use this data to create visualisations in OAC – either by including it in a Data Set in Data Visualizer (DV), or mapping the table to a subject area in the RPD and using Answers:
