
News Items

Australia: the data-driven “coup capital of the world”

Author: Iain Kiernan



At Data Agility we make a lot of fuss about applying reliable data to key decisions. One group who may have mastered this is Australia’s politicians, who recently changed Australia’s Prime Minister without an election.

Following Malcolm Turnbull’s recent appointment as Prime Minister, the BBC called Australia the ‘coup capital of the world’. This seemed to make sense, as Turnbull was Australia’s fourth Prime Minister in just 27 months (Gillard, Rudd, Abbott, Turnbull): coup capital indeed. Well, maybe. But consider this: perhaps Australia’s politicians are mirroring the community and business, applying reliable data more quickly than ever.

The 2013 election resulted in Tony Abbott becoming Prime Minister. However, while elected with a large majority and a clear mandate, Abbott was never personally popular, and his Government (the Coalition), as measured by the data gathered through opinion polls, quickly and increasingly became unpopular. The Newspoll data below tells the story.

[Chart: Newspoll federal government polling, 2015]

After 30 adverse polls, Turnbull challenged for the leadership of the Liberal Party (and therefore the Prime Ministership). Turnbull cited the Liberals’ poor performance in the polls as the trigger for the challenge. A ballot was called, Turnbull won, and the following day he was sworn in as Prime Minister.

While there was public angst about the change, the decision appears to have been driven by analysis of data viewed by the decision makers (those who elect the party leader) as very relevant and very reliable. In their judgement the data showed that without change the Government would lose power, and individual members, possibly lots of them, would lose their seats and therefore their jobs.

It’s long been said that the only poll that matters is the election itself. In Australia this is demonstrably no longer the case. As the new Prime Minister made clear, ‘Nobody looks at opinion polls… with more attention than politicians’. Today Australia’s politicians are mirroring the way that business and the community operate. They are accessing data they regard as reliable and applying it in far shorter timeframes than electoral cycles.

Whether Australia remains the ‘coup capital’ remains to be seen. What seems certain is that its politicians will be applying data (increasingly data created through digital transformation) to their most important decisions. We are not making a party-political point here, but this reminds us, and warns us, that maintaining and applying reliable data is critical in the decision-making process.



How leveraging metadata is enabling better decisions

Author: Iain Kiernan


At Data Agility we’ve always felt we’ve done our fair share of work around metadata management, but over the past 18 months or so we’ve experienced a real surge of interest. We reckon three things are driving this.

Firstly, more than ever before, metadata has become part of the Australian conversation. Metadata had for many been a rather obscure topic (I recall being asked about five years ago by a CEO in an open forum, ‘what is metadata?’), but the discussion around the Federal Government’s upcoming access to previously private telephony metadata has made it a lead story.

Secondly, the proliferation of smartphones has provided an often informal education in metadata, as we put the metadata embedded in digital photographs to almost limitless social and business uses. We no longer look at a photograph on a piece of paper and wonder when and where it was taken. Today we apply agreed metadata standards to capture the exact time and place of a digital photograph. Further, we can share the photograph across countless electronic systems that apply the same metadata standards, enabling us to view it and understand the data captured with it.

The third driver is that data reliability is much more widely understood, and metadata has an important part to play in applying reliable data to decision making, as we apply common data definitions across systems.

The information and technology community has recognised that device-driven digital transformation underpins these changes, and that applied metadata is crucial to making decisions using the huge volumes of data being generated by smartphones, blood-pressure monitors, parcel trackers, smart meters, RFID tags, SCADA tags, location services and so on. The community has also recognised that decision making based on reliable data underpins a successful digital transformation.

Our work hasn’t always been called metadata management; amongst other things it’s been described as repositories, registries, data dictionaries, business glossaries and catalogues. Whatever the label, the work we’ve been doing has enabled structured definitions to be created, stored and easily applied to a wide variety of operational, transactional and analytical systems. We’ve also focused on delivering metadata solutions that are easy to use and provide sophisticated search capabilities.
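To make the idea concrete, here is a minimal sketch of what a business glossary with search can look like. The term names, fields and example entry are our own illustration, not drawn from any client system or specific product:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Term:
    """One entry in a business glossary / data dictionary."""
    name: str
    definition: str
    systems: List[str]                                  # systems applying this definition
    synonyms: List[str] = field(default_factory=list)   # alternative labels in use


class Glossary:
    """A tiny metadata registry: store structured definitions, search them."""

    def __init__(self):
        self._terms = {}

    def add(self, term: Term) -> None:
        self._terms[term.name.lower()] = term

    def search(self, query: str) -> List[Term]:
        """Keyword search across names, synonyms and definitions."""
        q = query.lower()
        return [
            t for t in self._terms.values()
            if q in t.name.lower()
            or any(q in s.lower() for s in t.synonyms)
            or q in t.definition.lower()
        ]


g = Glossary()
g.add(Term(
    name="customer_id",
    definition="Unique identifier assigned to a customer at first contact.",
    systems=["CRM", "billing", "data warehouse"],
    synonyms=["client number", "cust_no"],
))

# Searching on a synonym still finds the agreed definition.
print([t.name for t in g.search("client")])
```

A real solution adds stewardship, versioning and lineage on top, but the core pattern is this: one agreed definition, findable under any of the names the business actually uses.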

If you’d like to know more about how our clients in the public and private sectors are using metadata to make better decisions in customer service, operations, production and sales, and to reduce cost and become more efficient, please get in touch.




The reliability of Australian data—2011 to 2015

Author: Iain Kiernan


Data Agility has published an update of its continuing analysis of the causes and consequences of unreliable data for Australian organisations.

First conducted in 2011, this update highlights the pursuit of data quality, along with the continuing themes of multiple systems, information silos, multiple channels and a lack of common data definitions.

To download this summary report, complete your details below.




Data Agility congratulates Eastern Health

Author: John Neville


At 7am this morning, the Victorian Minister for Health, the Hon David Davis, wheeled the first patient into the new Box Hill Hospital.

Today was the culmination of three years of planning, design and construction, with the finished building a credit to all concerned.

For the past six months, Data Agility has had the pleasure of working with the Box Hill Hospital Redevelopment team to ensure all the technologies and plans were in place for this important day.



Customer centricity and Big Data

Author: Iain Kiernan


Last week the CPA held its Retail conference, and I was a member of a panel discussing customer centricity and Big Data. Data Agility research shows that the most significant business trigger for improving the application of data is being in the midst of transformation, and the Retail sector is undergoing massive change.

My fellow panellists, one from Harley-Davidson and the other a former marketing director of Target, were right on their game, and our facilitator, flawlessly hipster-cool and as sharp as a pin, got us quickly into our stride. The early discussion took us through what it actually means to be customer centric, disentangling a romanticised view from one focused on delivering services and products as effectively and efficiently as possible. While customer intimacy at Harley-Davidson can get very intimate (the second most popular tattoo in the world after ‘Mum’, I’m told), for most brands that level of connection isn’t possible (know anyone with a bank brand as a tattoo?). Because the pressures of change are enormous, efficiency matters hugely in Retail, and this means having the right data at the right time. The view was that the ‘right’ data is still the stuff we have been improving for years: product data, sales data, customer data, customer interaction data and so on.

The discussion validated another Data Agility research finding: most organisations are dealing with increased volumes of data, but the key challenge is the ability to analyse that data across channels. And Retail, just like everywhere else, has seen huge online, and therefore cross-channel, growth.

So what does that finding mean for Big Data? Well, by and large, the conclusion is that, at least for the next few years, getting the most out of high-quality analytics solutions and building internal analytical skills is where the benefits are to be gained. Certainly, in-memory technologies will enable a level of processing speed that will provide advantage. But for most Australian organisations there is still competitive benefit to be had from materially improving the reliability of your data, leveraging existing analytics capabilities and building the skills of analysts such that they are genuinely able to identify what makes a difference in commerce and IT.

As the former Target guru generously suggested, specialist service providers such as Data Agility, who already have the skills, methods and capability available, have a significant part to play in delivering early benefits and getting you on the road to success.

So while Big Data is coming, it’s not a panacea for today’s challenges in a sector experiencing such transformation. Focusing on the customer, the data and cross-channel analysis is where the opportunity is today.



Australian productivity and unreliable data

Author: Iain Kiernan


In the early part of 2014 we’ve seen increasing media attention on Australia’s productivity. As the resources boom plays out, the Aussie dollar remains high and the manufacturing industry endures terrible pain, that focus is only intensifying.

The USA is usually set as the benchmark, and in round numbers Americans are about 15% more productive than we are. Not only do we have a high-value currency but, like our northern European peers and unlike much of the USA, we have a relatively high minimum wage and high wages overall.

So what can be done to improve our productivity?

Much of Australian industry is now in the services sector, and the proportion of knowledge workers whose job it is to report accurately or drive insight out of data is growing daily. When I say ‘whose job it is’ I choose my words carefully, because when you look at what people in roles such as Data Analyst are actually doing, it’s quite different. Many of them spend much of their time doing very basic data quality work on a data set they have extracted from a source system of some description.

This is a crazily expensive way to do what is a largely automatable, low-cost task. But it gets worse. There are many, many examples where the same data set is extracted from a source system by different parts of a business, found to be unreliable, and then cleansed by high-quality, high-priced resources in several places. It’s just crazily expensive, multiple times, with the same data! Argh… the pain, the pain.
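That kind of basic cleansing really is automatable. As a rough sketch (the field names and rules below are illustrative examples of common clean-up work, not from any client engagement), a handful of reusable checks defined once can replace the same manual scrubbing being repeated across teams:

```python
import re


def check_record(record: dict) -> list:
    """Return the data quality issues found in one extracted record.

    The rules (non-empty identifier, ISO date format, non-negative
    amount) are illustrative of typical manual cleansing work.
    """
    issues = []
    if not record.get("customer_id"):
        issues.append("missing customer_id")
    date = record.get("order_date", "")
    if not re.fullmatch(r"\d{4}-\d{2}-\d{2}", date):
        issues.append(f"bad date format: {date!r}")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        issues.append(f"invalid amount: {amount!r}")
    return issues


def profile(records: list) -> dict:
    """Summarise quality across a data set: row index -> issues found."""
    report = {}
    for i, record in enumerate(records):
        issues = check_record(record)
        if issues:
            report[i] = issues
    return report


rows = [
    {"customer_id": "C1", "order_date": "2014-03-01", "amount": 99.0},
    {"customer_id": "",   "order_date": "01/03/2014", "amount": -5},
]

# Only the second row fails, and the report says exactly why.
print(profile(rows))
```

Run once, centrally, against the source system, a profile like this means every downstream analyst inherits data that is already known-good, instead of re-discovering and re-fixing the same problems.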

We have a massive tertiary education sector and a lot of smart, well-educated people. If we are to be economically successful and close the productivity gap, then one of the best things we can do is create and manage reliable data sets in a cost-effective way that allows our analyst community to drive out insight that makes a difference.

Let’s get this done and shift our productivity.

Iain Kiernan

Director



The key to making decisions in emergencies

Author: Iain Kiernan


One of my colleagues recently asked me why some of the Information, Communication and Technology strategies we had seen lately were struggling to make an impact on business performance. My response was that there is often too much emphasis on the ‘C’ and ‘T’ components of the strategies and too little on the ‘I’. The result is that organisations often aren’t making better decisions.

Within a day or two of this conversation I attended a workshop hosted by CSIRO. Its title was ‘Building a system of systems for disaster management’, and it was attended by about 70 senior representatives of State and Commonwealth emergency management and response organisations, along with scientists from universities and organisations such as Geoscience Australia and the Bureau of Meteorology. Motorola and IBM were also on hand to demonstrate their in-field command centre capabilities.

As indicated by the attendees, the current disaster-management landscape in Australia is characterised by many players with many systems and a lot (yes a lot) of data.

Right from the off the focus wasn’t on systems but on information.  In the very first presentation, the Director of the Federal Crisis Coordination Centre, Andrew Grace, made it clear that the key to disaster management is ‘information that can be relied upon’ as this was the platform for emergency services making good decisions.

From that point the focus of the workshop shifted from systems to data and information. It emerged that while there are many, many systems, the challenge isn’t to be met by an enormous systems integration exercise; it will be met by enabling timely access to high-quality data. And the data isn’t required only at the time of an emergency; it is a crucial input to planning.

My role at the event was to lead a workstream, and our group concluded that interoperability wouldn’t be achieved by creating a system of systems. Rather, it would be achieved by raising data quality, defining and applying data standards, and establishing a governance mechanism for emergency management data. We also concluded that communication across the sector is a key component.

All in all, by the end of the event it became clear that the ‘C’ and the ‘T’ are very important in disaster management, but more concentration on the ‘I’ is going to lead to better planning and better decision making.

Iain Kiernan is a Director of Data Agility. His focus is the effective business application of information, and he leads the Information Management capability. For more information on how to improve unreliable data in your organisation, contact Iain today.