
Welcome to the Beyond Blog

As you'd expect from the winners of the Specialized Partner of the Year: Business Analytics award at the Oracle UKI Specialized Partner Awards 2014, Beyond work with leading-edge BI Applications, primarily within the UK Public Sector. We intend to share some of our ideas and discoveries via our blog and hopefully enrich the wider discussion surrounding Oracle Business Intelligence and driving improved insight for customers.


Unless you've been hiding under a stone for the past few years, you'll know that the cloud is the big thing at Oracle. There are fewer and fewer on-premise installations for greenfield projects, and with the new pricing structure it is easy to see why more and more organizations are considering cloud services for their new developments. An easy first venture for a client new to the cloud might be, say, a reporting suite developed in APEX, utilizing data from their source ERP system. The big question then, of course, is how do you transfer your data to the cloud securely? There are many products out there to facilitate this, such as Oracle Data Integrator (ODI), Oracle DataSync, or custom processes with file transfers over SFTP. However, I want to show a really easy way to do this via an SSH tunnel.

There are a number of steps to work through. Some are optional (such as the TNS names entries) and you can work without them; however, I've written the post as I would prefer to set it up, so you may choose to do things differently. I am using E-Business Suite R12.1.3 Vision as the source system, but the principle applies equally to others.

Source System Configuration

First we create a read-only user on the source system and grant SELECT on the objects we wish to expose. We then create synonyms as that user to make querying easier (and to insulate against future changes).
As SYS

VIS121 r121@ebs121-vm ~ $ sqlplus / as sysdba

SQL*Plus: Release 11.1.0.7.0 - Production on Tue Dec 12 16:00:40 2017

Copyright (c) 1982, 2008, Oracle.  All rights reserved.


Connected to:
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

SQL> create user apps_cl_query identified by apps_cl_query;

User created.

SQL> grant connect, resource to apps_cl_query;

Grant succeeded.

SQL> conn apps/apps
Connected.
SQL> grant select on per_all_people_f to apps_cl_query;

Grant succeeded.

SQL> conn apps_cl_query/apps_cl_query
Connected.
SQL> create synonym per_all_people_f for apps.per_all_people_f;

Synonym created.

SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
VIS121 r121@ebs121-vm ~ $
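With the read-only schema in place on the source, the tunnel itself is a single command. The sketch below is illustrative only - the cloud hostname, OS user, port and key path are all hypothetical - and uses a remote port forward so the cloud node can reach the on-premise listener through the tunnel:

```shell
# Hypothetical hostnames/user/key - adjust for your environment.
# -N : no remote command, just forwarding
# -R : forward port 1529 on the cloud node back through the tunnel
#      to the EBS listener on ebs121-vm port 1521
ssh -i ~/.ssh/oracle_cloud_key -N \
    -R 1529:ebs121-vm:1521 \
    opc@my-dbcs-node.oraclecloud.com
```

The cloud database can then reach the source system via a TNS entry or database link pointing at localhost:1529 on the cloud node.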

Tagged in: Cloud DBaaS ETL

If you haven't already started using Oracle Cloud services, then what better way than to sign up for $300 of free credits on a pay-as-you-go subscription (valid for one month)? Simply visit https://cloud.oracle.com/tryit and click Sign Up. I did it earlier - it's really easy, and I was set up within an hour. To try it out, I provisioned myself a Standard Edition Database service - again, pretty easy. You can choose either a predefined set of options, or customise it to suit your needs (in terms of CPUs, memory, database version etc.). You'll see the service in your console with a status of "Creating service...".

Creating Service

After about half an hour the service provisioning is complete. You may notice that provisioning a Database service actually gives you a couple of supporting services too - you can of course provision these standalone if required.

Services

Starting/Stopping a particular service is as easy as this:

Starting and Stopping Services

Next I wanted to get access via SSH. To do that you will need to generate a key pair on your client machine using ssh-keygen, and then copy your public key up to the cloud service.
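The key generation step can be sketched like this (the file name is just an example; it is the contents of the .pub file that you supply to the cloud service, while the private key stays on your client machine):

```shell
# Generate an RSA key pair with no passphrase (file name is illustrative)
ssh-keygen -t rsa -b 2048 -N "" -q -f ./oracle_cloud_key

# oracle_cloud_key.pub is the public key to upload to the cloud service;
# oracle_cloud_key is the private key and should never leave your machine
ls -l oracle_cloud_key oracle_cloud_key.pub
```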

Tagged in: Cloud Database 12.2

I know there are already a good number of blogs/guides out there describing various methods of automating the backup of APEX applications; however, I thought I'd share the method I recently implemented internally, as it uses a remote Subversion repository. This gives rise to a number of benefits, such as holding a full version history, low storage overheads and resilience to local hardware failure. Why might you want to do this? Well, aside from the obvious catastrophes, acts of God, malicious deletion, accidental corruption etc., it's sometimes simply useful to be able to take your application as of a particular point in time, regardless of your database flashback settings.
Anyway, here is the process we will follow.

  1. Export all our APEX applications from the workspace.
  2. Add any new applications that we've not seen before to the svn repository.
  3. Commit any changes to svn

First we need to create a working copy of our repository on the APEX database server. Note that I have already added all of our applications to this repository; this is not necessary, however. I chose to check out a specific directory only, rather than the root. For this step you will of course need the svn client software installed on your server - it's free and easy to install, and not worth explaining here.

[oracle@localhost tmp]$ svn checkout https://mysvnrepo/folder/subfolder/etc svn
A    svn/f101.sql
A    svn/f10100.sql
A    svn/f110.sql
A    svn/f10200.sql
A    svn/f20100.sql
A    svn/f10210.sql
A    svn/f20200.sql
A    svn/f20300.sql
Checked out revision 1079.

To export we can use the APEXExport Java utility. This is called in the following way.

java -cp $CLASSPATH oracle.apex.APEXExport -db <database connection> -user <database user> -password <database password> -workspaceid <workspace id>

This will generate a set of .sql files in the format f<application_id>.sql in the current directory, which we can then copy into our working directory. The issue here is that an APEX export file contains a line representing the date and time at which the export was done. This will then be considered a change by svn. To avoid that, I strip out that line using the sed utility.

sed -i '/--   Date and Time:/d' f*.sql
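To see the effect, here is a tiny end-to-end demonstration; the file contents below are invented for illustration (a real export file is much larger), but the sed command is the same one used above:

```shell
# Fabricate a miniature export file containing the volatile timestamp line
printf -- '--   Date and Time:   Tue Dec 12, 2017 16:00\nprompt  Set Credentials...\n' > f101.sql

# Strip the timestamp so repeated exports of an unchanged app diff cleanly in svn
sed -i '/--   Date and Time:/d' f101.sql

cat f101.sql
```

After the sed run only the real export content remains, so an unchanged application produces no svn diff.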
Tagged in: APEX

Oracle APEX Exploitation - Part 3

This is the third in my series of short posts about methods that can be used to exploit your Oracle APEX applications. The first two posts concentrated on URL Injection, which is relatively easy to protect against; this third post focuses on something that is a bit more difficult to stop, and not quite as obvious an issue. I am going to call it Select List Injection.

Select List Injection

This exploit relies on the application having a select list that has been filtered somehow for the user. For example, a select list may show the list of employees that report to the current user - in reality the list of employees on the base table is a superset of these.

Mechanism of Attack

A simple example is a page which contains a select of employees reporting to the current user and displays a report based on the selected value. The select list only contains the employees visible to the user. We can set up a simple example as follows.
Select List LOV Code:

select ename, empno from emp
where mgr=7566
order by ename


Report Code:

select ename, empno, hiredate, sal from emp
where empno=:p5_emp_id

Report
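One straightforward mitigation (sketched below against the same EMP example, with the manager id hard-coded just as it is in the LOV) is to re-apply the LOV's restriction in the report query itself, so an injected empno from outside the user's team simply returns no rows:

```
select ename, empno, hiredate, sal from emp
where empno = :p5_emp_id
and   mgr   = 7566  -- re-apply the same restriction as the LOV
```

In a real application the restriction would of course come from session state or a secured view rather than a literal.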

Tagged in: APEX

Oracle Data Visualizer has been out for a couple of years now and is already on version 4. I'm a big fan and have been digging deep into the latest release, which has brought in a substantial number of changes. They are all available here, but I think the most exciting inclusions are around the Explain capability and the new algorithms focused on Sentiment Analysis and Machine Learning, as well as the opportunity to load up your own custom scripts.

As an example, let us perform some Sentiment Analysis. I have created some sample data by means of some short reviews of three fictitious restaurants.

b2ap3_thumbnail_dv1.png

Two look pretty good to me and one somewhat less so. Let's push this through the sentiment analyzer and see what results we get. First, I navigate to the new super-dynamic Home Page in Data Visualizer v4 and select the Data tab on the left-hand side.

b2ap3_thumbnail_dv10.png

As in previous versions, we can upload the data - it can of course come from many types of source, but for this example we're just uploading my small review spreadsheet.

b2ap3_thumbnail_dv3.png

Now we have the data file, we can go to the Data Flows section and create a new data flow. Here we start the flow with the source restaurant review data file.

b2ap3_thumbnail_dv4.png

Note that there are a substantial number of Machine Learning models now available to use in the flow and we will be covering examples of these in further posts.

b2ap3_thumbnail_dv5.png

So, let us add a Sentiment Analysis step as the next part of the flow. We will tell Data Visualizer to use the Review column as the source of the analysis and to write out the sentiment to a new column called Emotion.

b2ap3_thumbnail_dv6.png

Let us now add the final storage step to hold the output of the flow. If you look at the table below you can see that the Sentiment Analysis has already done its job, creating what I think are pretty accurate results.

b2ap3_thumbnail_dv7.png

We will now save and run the data flow - which is instant - and then we can look at the results by creating a simple Project and a visualisation with a bit of colour.

b2ap3_thumbnail_dv8.png

Personally, I think we can now really see the investment in the product coming through: not only is it getting so much more powerful, it importantly remains intuitive to use and is a great tool to augment "traditional" BI.
