
Welcome to the Beyond Blog

As you'd expect from the winners of the Specialized Partner of the Year: Business Analytics award at the Oracle UKI Specialized Partner Awards 2014, Beyond work with leading-edge BI Applications, primarily within the UK public sector. We intend to share some of our ideas and discoveries via this blog and hopefully enrich the wider discussion around Oracle Business Intelligence and driving improved insight for customers.


If you haven't already used Live SQL then you're missing out. This free Enterprise Edition database instance, provided by Oracle as a sandpit for trying things out and learning new features, is a great resource. More importantly, it tends to be kept in line with the latest "hot off the press" database release.

So it came as no real surprise last week when it was upgraded to the latest 18c version (which is really just 12.2.0.2, but is now on Oracle's new calendar-year naming convention). I know I'm a bit late here, however I've only just got around to writing this post. :)

Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production

Check out the new features (polymorphic table functions, private temporary tables, etc.); various Oracle folks have already uploaded a bunch of sample scripts to get you started.
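As a quick taster, here's a minimal private temporary table example of the kind you can run straight in Live SQL. Note the ORA$PTT_ prefix, which is the default required naming convention for these tables; the table and column names themselves are just my own illustration:

```sql
-- Visible only to this session; the definition itself is dropped at commit
CREATE PRIVATE TEMPORARY TABLE ora$ptt_scratch (
  id   NUMBER,
  note VARCHAR2(50)
) ON COMMIT DROP DEFINITION;

INSERT INTO ora$ptt_scratch VALUES (1, 'only this session can see this');
SELECT * FROM ora$ptt_scratch;
```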


This issue arose at a customer today, and I've seen it happen in the past, so I thought it worthwhile making a quick note.
A situation had arisen which caused the overnight ETL execution to be delayed. Once it had completed, the users complained that the dashboard was missing some data. This was tracked down to a shared filter which contained a restriction using the variable LAST_REFRESH_DT. When we checked the value of this variable we found it was two days behind.

This had happened because of the way repository variables are refreshed in OBIEE. They are not refreshed as part of the ETL; they are refreshed on the schedule defined against the initialization block.
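For reference, the initialization block behind this kind of variable typically runs a simple query of this shape against the warehouse. This is purely illustrative; the exact table and column depend on your BI Applications version and configuration:

```sql
-- Hypothetical init block query populating LAST_REFRESH_DT
SELECT MAX(w_update_dt)
FROM   w_day_d;
```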

Refresh

So in this case, at midnight every night. That is clearly no good if our ETL runs any later than midnight, as we miss the update on w_day_d.
We need to set it to something that is a function of the latest expected finish time of the ETL, and the latest time the end users are willing to wait for the refresh. As the latter always has to be after the expected finish time, we can use that. Let's say it's 8am. We should therefore change the time on the schedule to 8am to ensure the refresh of the variable (not just this one, but in theory any repository variable) happens after the warehouse refresh has completed. If there are any exceptional circumstances we need to be aware of these and deal with them accordingly.

Changed Refresh Time

There is of course the option of changing this to an hourly refresh instead, however that similarly needs a change from the default. It's more about being aware of the issue and knowing that there needs to be some planning in place that is a function of your ETL schedule and end-user expectations.

That's all - short and sweet! :)


With machine learning being one of the big things at the moment, I thought I'd cast my mind back to my first ever C programming assignment at university: write the game of Pangolins. The game is based on the 20 Questions game, whereby the user thinks of an object and the machine aims to guess that object by asking simple yes/no questions, ideally fewer than 20. The system starts off knowing about only a single object, a small ant-eating mammal called a pangolin.
Each time a user thinks of something the system isn't aware of, it learns from this. The internal implementation is just a simple set of nodes, each of which can be either a question or an object. A question node has two pointers, one to a yes node and one to a no node. It's probably easiest to illustrate with a walkthrough. I created a little demo app which can be accessed here on apex.oracle.com. The sample code to create it can be found at the bottom of this post.

We start off with a single entry - and we are therefore asked "are you thinking of a Pangolin"?

Step 1

Assume we were actually thinking of a pencil, so we say no. The system then asks us what we were actually thinking of. Let's tell it.

Step 2

Next we are asked to give a yes/no question that will distinguish between a pencil and a pangolin.

Step 3

And clearly the answer for that is No.

Step 4
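Under the covers, the node structure can be as simple as a single self-referencing table. The sketch below is my own illustration of the idea, not the actual sample code from the bottom of this post; all table and column names are invented:

```sql
CREATE TABLE nodes (
  id        NUMBER PRIMARY KEY,
  node_type VARCHAR2(1) NOT NULL
            CHECK (node_type IN ('Q','O')),  -- 'Q' = question, 'O' = object
  node_text VARCHAR2(200) NOT NULL,
  yes_node  NUMBER REFERENCES nodes (id),    -- only populated for question nodes
  no_node   NUMBER REFERENCES nodes (id)
);

-- The tree starts life with a single object node
INSERT INTO nodes (id, node_type, node_text) VALUES (1, 'O', 'a pangolin');
```

When a guess is wrong, the learning step simply replaces the failed object node with a new question node whose yes and no pointers reference the new object and the old one, which is exactly what happens in the pencil walkthrough above.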


Data Flows in v4 of Oracle Data Visualization (in the new OAC as well as Desktop) are much improved, so let's look at creating a flow to:

  • Join together two datasets
  • Filter the columns
  • Create some bins
  • Add a new calculated column
  • Save the results as a single data source that we can then analyze.
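For anyone who thinks in SQL, the flow above is conceptually equivalent to a single CREATE TABLE AS SELECT. All table and column names below are invented purely to illustrate the shape of the transformation, not taken from the actual datasets:

```sql
CREATE TABLE sales_analysis AS
SELECT p.sales_person,                       -- the SELECT list is the column filter
       c.car,                                -- joined in from the second dataset
       p.mileage,
       p.salary + p.expenses AS total_cost,  -- new calculated column
       CASE
         WHEN p.mileage < 5000  THEN 'Low'
         WHEN p.mileage < 20000 THEN 'Medium'
         ELSE                        'High'
       END                   AS mileage_bin  -- binning
FROM   sales_people p
JOIN   sales_cars   c                        -- join the two datasets
  ON   c.sales_person = p.sales_person;
```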

Our flow will eventually look like this .....

b2ap3_thumbnail_img1.png

We will start with one data set I have created: a spreadsheet of fictitious sales people and their travel and remuneration details.

b2ap3_thumbnail_img2.png

The second data set is a sheet of the same sales people with the cars that they drive.

b2ap3_thumbnail_img3.png

So let's get the basics out of the way and load them both up as data sets ....


Oracle Data Visualization Desktop has a lot of useful features that you might not know about on first inspection, so in this blog I will run through what Data Actions are, and the clever ways they can be used in your own projects.

A data action enables you to link a visualization to a URL, move to another page in your project, or move to another project altogether. Data Actions can be enabled for all visualizations, or restricted to specific ones. Filters can also be passed through a Data Action from one canvas to another.

URL Data Actions

Create a visualization, select the Canvas Settings icon in the top right and select Data Actions.

 b2ap3_thumbnail_Data-Actions-1.png

Click the + icon in the Data Actions menu, set the type to URL, and enter the URL you want to link your visualization to.

b2ap3_thumbnail_Data-Actions-2.png

Now if you right click onto your visualization, you’ll see your data action on the menu. Click this and you’ll be brought to your desired URL.

b2ap3_thumbnail_Data-Actions-3.png
