ODE – The start of a journey
There must be a better way!
About two years ago we started a journey into the world of AgileBI. It all started out of frustration (as most journeys do), frustration about the magic sauce.
The magic sauce was the reason one data warehouse project would be a raving success and the next a ‘meh’. It was often the reason that, after delivering a raving success and returning 12 months later, I would find the data warehouse had become a ‘meh’. By that I mean not updated, not well managed, and not delivering to the stakeholders’ expectations.
My view was that often the delivery team had somebody who had built many data warehouses and just intrinsically knew what to do to be successful. They could also take the team with them on that journey to make sure the entire project was successful.
Ask them how they did it and you would either get a smirk or a shrug.
So we started a journey to see how we could make the process repeatable and able to be delivered by our entire team, and more importantly, by our customers once we had finished the initial delivery.
Welcome to the world of Agile
That led us to the world of Agile. Agile is a great and mature approach, but Agile for data warehousing is not, which led us to invest time in defining an AgileBI approach.
The first thing we did was understand what we needed to deliver this approach, and we ended up with our 5 AgileBI circles.
Nothing revolutionary there, but it gave us a roadmap of what we were looking for.
We were lucky enough to find the BEAM* approach from Lawrence Corr, which gave us the ability to gather data driven business requirements with agility. It also gave us a portion of the data modelling approach we needed.
But what about the data?
BEAM* is great, but we were still stuck with the problem of how to model and integrate the data we required without a big upfront design phase followed by months and months of unique ETL development.
We understand that while a lot of the business context (e.g. Customer, Policy, Cover) is unique to an industry, and a lot of business rules are unique to a customer, there are always some things we build the same way for every project (need a date dimension, anyone?).
You can have the best data integration team around, but you will find that they all have their own slightly different way of coding around the same problem. And for real fun try getting them to agree what naming standards should be used!
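To make the date dimension point concrete, here is a minimal sketch (not our actual code, and the column names are made up for illustration) of the kind of boilerplate every project ends up rebuilding:

```python
from datetime import date, timedelta

def build_date_dimension(start, end):
    """Generate one row per calendar day between start and end (inclusive)."""
    rows = []
    current = start
    while current <= end:
        rows.append({
            "date_key": int(current.strftime("%Y%m%d")),  # surrogate key, e.g. 20150101
            "full_date": current.isoformat(),
            "year": current.year,
            "quarter": (current.month - 1) // 3 + 1,
            "month": current.month,
            "day_of_week": current.strftime("%A"),
            "is_weekend": current.weekday() >= 5,  # Saturday or Sunday
        })
        current += timedelta(days=1)
    return rows

dim = build_date_dimension(date(2015, 1, 1), date(2015, 12, 31))
```

Trivial stuff, but every team writes it slightly differently, which is exactly the problem.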
Enter the Data Vault
We came across an approach called Data Vault. It's an approach to structuring data based on ensemble modelling.
It has a bunch of benefits that I will outline in a future post, but one of the major benefits for us was the ability to have relatively simple code that could be used to automate a lot of the dross work we always did as part of our data warehouse builds.
The other benefit was there were quite a few smart people around the world who were doing some heavy thinking on how to improve the approach.
So we decided to do what we always do when we find something new, exciting and promising. We would give it a go.
We hired Brian who had spent time building a Data Vault for Xero and was a proven guru, and we sent a couple of our team off to training and certification in Australia.
Just use an ETL tool, right?
Now we had a team who knew what it was, and how to build one. Let the coding begin!
We tell our customers it's better to buy than to build, so we spent some time looking for software we could use to automate the building of the vaults.
There are not many options out there, and the ones we found were either standard ETL (or ELT) tools used in a particular way to deliver the vault structures and data, or Data Vault-specific tools that focussed on automating the data loading rather than applying the business rules that were needed.
We were not enamoured with either approach.
So we did what all New Zealand companies do in this situation: bring out the number 8 fencing wire and roll our own.
Research It, Build it, Prove it, Rinse and Repeat
We have learnt that embarking on a massive project to build these types of products is asking for a hiding, and is far from Agile. We have also learnt that a customer priority will always arise that means we have to halt development for a while and pick it up later.
So we have become very good at chunking the work down into bits that we can build and use to prove each capability or component. This also helps us invest in research upfront each time we approach a new area we have not tackled before. We have found that this research-it, then build-it approach results in a much higher success rate, as well as the ability to stop when we hit a gnarly problem that would just suck effort with little chance of success.
Hell, that's the art of Agile, right?
We have also found that implementing each bit in anger on real projects helps us harden the product and focus on the next piece of development that will provide the highest value.
So we are now at the stage we have a base of pretty cool code that automates parts of the data vault process. We have also proven it works within projects.
We have designed a cool architecture for the product which means we can deploy it on multiple technology platforms (Microsoft, Oracle, SAS, R etc) while still retaining a core design and code base.
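One way to read "core design and code base across platforms" is a dialect layer: the load patterns stay the same and only the platform-specific SQL fragments vary. A made-up illustration (these dialect entries are assumptions for the sketch, not ODE's actual implementation):

```python
# Hypothetical dialect table: platform-specific fragments, one core pattern.
DIALECTS = {
    "microsoft": {"now": "GETDATE()", "quote": "[{}]"},
    "oracle":    {"now": "SYSTIMESTAMP", "quote": '"{}"'},
}

def render_insert(platform, table, columns):
    """Render the same core INSERT pattern in the target platform's SQL dialect."""
    d = DIALECTS[platform]
    quoted = ", ".join(d["quote"].format(c) for c in columns)
    return (
        f"INSERT INTO {d['quote'].format(table)} ({quoted}, load_date) "
        f"VALUES (..., {d['now']})"
    )

stmt = render_insert("oracle", "h_customer", ["customer_id"])
```

Adding a new platform then becomes a matter of adding a dialect entry, not rewriting the engine.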
Don’t get me wrong, we still have a long road ahead before it does everything we need, let alone everything we want.
Let's make the world a better place
Then we reached the stage where we had to decide how to turn this into a production-ready product, and that meant deciding on our go-to-market approach.
Our choices are as always:
- Commercial Licensed Product
- Software as a Service offering
- Open Source
- Some weird arse alternative
I love WordPress for so many reasons. One is their ability to produce a full open source product and still have a commercial backbone that makes sure it is constantly enhanced. They do this without resorting to the n-1 or hold-out-the-enterprise-features approach that all the other Commercial Open Source vendors spin.
Another reason is that the WordPress community adds so many cool features and add-ons to the product that it really does grow at a rate of knots, far faster than the core WordPress team could deliver alone.
Data Vault and DW automation have been around for a long time, but for some reason Data Vault is still not a widely adopted approach. I believe one of the reasons is that there is no readily available software to help you adopt the approach easily.
So we have decided to open source our product and see if we can help make the world a better place (or data warehouse delivery easier, faster and more successful at least).
Say welcome to Optimal Data Engine. We pronounce it ODE as in the lyrical stanza.
(Those who have known me for too long know I love Steve Jobs' power of three, and I also love post-rationalisation of a decision, not to mention characterisation of products; ODE covers so many of those it isn't funny!)
And so the journey begins
The journey so far has been far from smooth, and we know it's only going to get bumpier.
So I have decided to blog each week to record the things we find, good or bad.
Buckle up baby, and let's get started!