John's Blog

This is my blog of various topics, from things that I think might be useful to others, to things that I just find interesting personally. If you have any comments or questions on any of my posts then please do ask - any positive contribution is very much welcomed.

The views expressed in this blog are completely my own and do not reflect those of Beyond Systems Ltd. All content is provided for informational purposes only, and you are solely responsible for how you use it on your own or others' systems. We accept no liability for any losses or damages caused.

John Keymer


My primary area of interest and expertise is Oracle E-Business Suite, particularly Foundation, Human Resources, Payroll, Time and Labor, Warehouse Management and Inventory. Whilst I prefer to focus on functional aspects, I have a keen interest in technical areas too, such as SQL tuning and the optimizer, Application Express (APEX), Unix (particularly Linux) and general development.


I'm quite a fan of row generators. If I know I need a row for every day of the year for example, I generally avoid holding (and maintaining) a table of these dates, and instead will generate them. This is by no means a universal rule, so take each case on its merits.

select trunc(sysdate,'YYYY') + level -1 dt
  from dual
connect by level <= add_months(trunc(sysdate,'YYYY'),12) - trunc(sysdate,'YYYY');

You can see a few methods (of many) of generating rows in one of my previous posts. The issue with generating rows in this manner is that the optimizer estimates cardinalities based on the number of rows in the table. Which for dual is... 1.
So when we look at the execution plan for such a query, we see this.

SQL_ID  16r2my83pj187, child number 0
-------------------------------------
select /*+gather_plan_statistics*/ trunc(sysdate,'YYYY') + level -1 dt
 from dual connect by level <= add_months(trunc(sysdate,'YYYY'),12) -
trunc(sysdate,'YYYY')

Plan hash value: 1236776825

-----------------------------------------------------------------------------------------------------------------
| Id  | Operation                    | Name | Starts | E-Rows | A-Rows |   A-Time   |  OMem |  1Mem | Used-Mem |
-----------------------------------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT             |      |      1 |        |    365 |00:00:00.01 |       |       |          |
|   1 |  CONNECT BY WITHOUT FILTERING|      |      1 |        |    365 |00:00:00.01 |  2048 |  2048 | 2048  (0)|
|   2 |   FAST DUAL                  |      |      1 |      1 |      1 |00:00:00.01 |       |       |          |
-----------------------------------------------------------------------------------------------------------------

The optimizer estimated just 1 row (E-Rows), but we in fact got 365 (A-Rows). Once you start joining out to other tables, the impact of that misestimate soon becomes much more prominent. Let's say I want to generate a row for each employee who is a manager, for each day of the year.
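One common workaround, sketched below, is to wrap the generator in an inline view and tell the optimizer how many rows it will produce, so that any joins against real tables are costed sensibly. Note the cardinality hint is undocumented, so treat this as an illustration rather than a recommendation:

```sql
select /*+ cardinality(gen 365) */ gen.dt
  from (select trunc(sysdate,'YYYY') + level - 1 dt
          from dual
        connect by level <= add_months(trunc(sysdate,'YYYY'),12)
                            - trunc(sysdate,'YYYY')) gen;
```

With the hint in place, the generator's E-Rows reflects the 365 rows it actually returns, which gives the optimizer a fighting chance when we join out to the employee tables.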

Tagged in: sql Tuning
in Technical

If you haven't already used Live SQL then you're missing out. This free (Enterprise Edition) database instance provided by Oracle as a sandpit for trying things out and learning new stuff is a great resource. More importantly, it tends to be kept in line with the latest "hot off the press" database release.

So it came as no real surprise last week when it was upgraded to the latest 18c version (which is really just the next iteration of the 12c codebase, now under Oracle's new calendar-year naming convention). I know I'm a bit late here; I've only just got around to writing this post. :)

Oracle Database 18c Enterprise Edition Release - Production

Check out the new features (Polymorphic table functions, private temporary tables etc) - various Oracle guys have already uploaded a bunch of sample scripts to get you started.
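As a quick taste, private temporary tables can be tried out straight away in Live SQL. A minimal sketch (the table name after the mandatory ORA$PTT_ prefix is my own):

```sql
-- 18c private temporary tables are visible only to the creating session;
-- the name must start with the private_temp_table_prefix (ORA$PTT_ by default)
create private temporary table ora$ptt_scratch (
  id   number,
  note varchar2(100)
) on commit drop definition;

insert into ora$ptt_scratch values (1, 'only this session can see me');

select count(*) from ora$ptt_scratch;
```

With ON COMMIT DROP DEFINITION, both the data and the table definition itself disappear at commit, so there is nothing to clean up afterwards.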

in Technical

This issue arose at a customer today, and I've seen it happen in the past, so I thought it worthwhile making a quick note.
A situation had arisen which caused the overnight ETL execution to be delayed. Once it had completed, the users complained that the dashboard was missing some data. This was tracked down to a shared filter which contained a restriction using the variable LAST_REFRESH_DT. When we checked the value of this variable, we found it was two days behind.

This had happened because of the way repository variables are refreshed in OBIEE. They are not refreshed as part of the ETL; they are refreshed on the schedule defined against the initialization block.


So in this case, at midnight every night. That clearly is no good if our ETL starts anything later than midnight as we miss the update on w_day_d.
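To illustrate, an initialization block is just a query run on a schedule against a connection pool; something along these lines populates the variable (the exact table and column names vary by BI Applications version, so treat these as illustrative):

```sql
-- Illustrative only: derive the LAST_REFRESH_DT repository variable
-- from the last load timestamp on the warehouse date dimension
select max(w_update_dt)
  from w_day_d;
```

Whenever the block fires before the ETL has touched w_day_d, the variable simply picks up the previous load's timestamp, which is exactly the symptom we saw.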
We need to set it to something driven by the latest expected finish time of the ETL and the latest time the end users are willing to wait for the refresh. As the latter always has to be after the expected finish time, we can use that. Let's say it's 8am. We should therefore change the time on the schedule to 8am, ensuring the refresh of the variable (not just this one, but in theory any repository variable) is done after the warehouse refresh has completed. If there are any exceptional circumstances, we need to be aware of these and deal with them accordingly.

Changed Refresh Time

There is of course the option of changing this to an hourly refresh instead; however, this similarly needs a change from the default. It's more about being aware of the issue and knowing that there needs to be some planning in place that is a function of your ETL schedule and end-user expectations.

That's all - short and sweet! :)

Tagged in: BI Applications
in Business Intelligence

With machine learning being one of the big things at the moment, I thought I'd cast my mind back to my first ever C programming assignment at university: write the game of Pangolins. The game is based on the 20 Questions game, whereby the user thinks of an object and the machine aims to guess that object by asking simple yes/no style questions, ideally in fewer than 20. The system starts off knowing about only a single object: a small ant-eating mammal called a pangolin.
Each time a user thinks of something the system isn't aware of, it learns from it. The internal implementation is just a simple set of nodes, each of which can be either a question or an object. A question node has two pointers, one to a yes node and one to a no node. It's probably easiest to illustrate with a walkthrough. I created a little demo app, which can be accessed here. The sample code to create it can be found at the bottom of this post.

We start off with a single entry - and we are therefore asked "are you thinking of a Pangolin"?

Step 1

Assume we were actually thinking of a pencil, so we say no. The system then asks us what we were actually thinking of. Let's tell it so.

Step 2

Next we are asked to give a yes/no question that will distinguish between a pencil and a pangolin.

Step 3

And clearly the answer for that is No.

Step 4
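The learning step in this walkthrough maps onto a very small amount of code. Here is a minimal C sketch of the node structure and the "learn" operation (my own illustration, not the original assignment code):

```c
#include <stdlib.h>
#include <string.h>

/* A node is either a question (yes/no children set) or an object (a leaf). */
typedef struct node {
    char text[128];
    struct node *yes;
    struct node *no;
} node;

static node *make_node(const char *text, node *yes, node *no) {
    node *n = malloc(sizeof *n);
    strncpy(n->text, text, sizeof n->text - 1);
    n->text[sizeof n->text - 1] = '\0';
    n->yes = yes;
    n->no = no;
    return n;
}

/* Learning: turn a leaf into a question whose "no" branch keeps the old
   object and whose "yes" branch holds the newly learned object. */
static void learn(node *leaf, const char *question, const char *new_object) {
    node *old = make_node(leaf->text, NULL, NULL);   /* copy before overwrite */
    node *fresh = make_node(new_object, NULL, NULL);
    strncpy(leaf->text, question, sizeof leaf->text - 1);
    leaf->text[sizeof leaf->text - 1] = '\0';
    leaf->yes = fresh;
    leaf->no = old;
}
```

Playing a game is then just a walk from the root, following the yes or no pointer at each question until a leaf (an object guess) is reached; which branch of the new question holds the new object depends on how the user answers it for that object.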

Tagged in: APEX Machine Learning
in Technical

Unless you've been hiding under a stone for the past few years, you'll know that the cloud is the big thing at Oracle. There are fewer and fewer on-premise installations for greenfield projects, and with the new pricing structure it is easy to see why more and more organizations are considering cloud services for their new developments. An easy venture for a client new to cloud may be, say, a reporting suite, developed in APEX, utilizing data from their source ERP system. The big question then, of course, is how do you transfer your data to the cloud securely? There are many products out there to facilitate this, such as Oracle Data Integrator (ODI), Oracle DataSync, custom processes with file transfers over sFTP etc. However, I want to show a really easy way to do this via an SSH tunnel.

There are a number of steps that need to be done; some are optional (such as the TNS Names entries) and you can work without them, however I've written the post as I would prefer to set it up, and you may choose otherwise. I am using E-Business Suite R12.1.3 Vision as the source system, however the principle applies equally to others.

Source System Configuration

First we create a read-only user on the source system and grant it access to the objects we wish to expose. We then create synonyms as that user to make querying easier (and to protect against change in the future).

VIS121 r121@ebs121-vm ~ $ sqlplus / as sysdba

SQL*Plus: Release - Production on Tue Dec 12 16:00:40 2017

Copyright (c) 1982, 2008, Oracle.  All rights reserved.

Connected to:
Oracle Database 11g Enterprise Edition Release - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

SQL> create user apps_cl_query identified by apps_cl_query;

User created.

SQL> grant connect, resource to apps_cl_query;

Grant succeeded.

SQL> conn apps/apps
SQL> grant select on per_all_people_f to apps_cl_query;

Grant succeeded.

SQL> conn apps_cl_query/apps_cl_query
SQL> create synonym per_all_people_f for apps.per_all_people_f;

Synonym created.

SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
VIS121 r121@ebs121-vm ~ $
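With the source user in place, the tunnel itself is a single command. A hedged sketch, where the cloud hostname, user and ports are illustrative rather than taken from a real configuration:

```shell
# Reverse tunnel: expose the on-premise listener (localhost:1521 from the
# EBS server's point of view) as port 1521 on the cloud host's loopback.
# -N means no remote command; the session exists purely to carry the tunnel.
ssh -N -R 1521:localhost:1521 opc@mycloudhost.example.com
```

Once the tunnel is up, the cloud database can reach the E-Business Suite database as localhost:1521 on its own side, for example via a database link connecting as the apps_cl_query user we created above.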

Tagged in: Cloud DBaaS ETL
in Technical