Business Intelligence

The Latest

25 January 2022: IBM has announced its acquisition of Sydney-based data analytics software company Envizi. According to the official press release, the acquisition is intended to boost IBM’s capability to provide environmental, social and governance (ESG) analytics, an emerging specialised field.

Envizi will be integrated with IBM’s existing package of manufacturing and supply chain solutions, such as IBM Maximo, IBM Sterling, IBM Environmental Intelligence Suite (EIS) and IBM Turbonomic, to support feedback automation across customers’ operations and corporate environmental initiatives.

Why it’s Important

IBRS has observed increased activity by large vendors acquiring small, local Australian enterprises that specialise in data analytics. Some of these include the following:

  • Fujitsu’s acquisition of Melbourne-based data and analytics firm Versor in 2021
  • Cognizant’s 2021 purchase of Sydney-based Servian, a data analytics and AI vendor
  • Healthcare tech firm Beamtree’s acquisition of New South Wales-based comparative analytics enterprise Potential(x) in 2021
  • Accenture’s 2019 purchase of Australian big data and analytics consultancy Analytics8, followed by a series of acquisitions of overseas advanced analytics firms such as Bridgei2i and Byte Prophecy in India, Novetta Solutions and End-to-End Analytics in the United States, and PRAGSIS BIDOOP in Spain

Beyond Australia, acquisitions of data analytics startups have also become prominent across the industry, such as Capgemini’s purchase of Sweden-based Advectas, Genpact’s of Enquero, and Infogain’s of Absolutdata, all formalised in 2020.

IBRS believes that while it is beneficial for the industry to have vendors expand their analytics capabilities, customers and enterprise partners need to constantly assess the likely impact on their existing service contracts with analytics vendors. Critical areas include terms and conditions, possible pricing changes, future services, contracted support, and personnel changes.

Who’s impacted

  • CIO
  • Development team leads
  • Business analysts

What’s Next?

Organisations need to be prepared for their analytics partners to be the next targets for acquisition. As part of their strategy, organisations must remain vigilant and engaged with their analytics vendor partners regarding any acquisitions and the potential impact on services and costs. This includes assessing the scenarios most likely to occur, as well as the risks and opportunities involved in adjusting to any changes to existing services. Some potential risks or challenges that must be reviewed by the organisation’s legal and procurement teams can be found on this checklist.

Finally, organisations need to be cautious about assurances that are critical to their operations if these have not yet been put into a written agreement. Becoming more pragmatic about the new vendor will minimise service disruptions in the future.

Related IBRS Advisory

  1. Mergers & acquisitions require federated service providers governance
  2. Mergers and Acquisitions - Devising the Right Strategy for IT

Artificial intelligence (AI) is an emerging technology that can be applied across business lines and yield significant results when aligned with business priorities. Assessing the AI maturity of your organisation can assist in providing AI roadmaps and aid in developing strategies and business cases.

The purpose of this presentation kit is to provide an AI maturity model in the analytics space. The proposed maturity model can be applied to any type of industry. Log in and click the PDF above to download the 'Analytics Artificial Intelligence Maturity Model' presentation kit and discover:

  • An IBRS AI maturity model that provides the foundation to apply the existing AI technology where it matters to the business
  • Guidelines for evolving into a future where only limited data is available to make informed decisions
  • Next steps for your organisation

ICT executives and data analytics specialists are facing ever-increasing demands from business stakeholders. Driven by vendors’ promises of agile, self-service analytics and instant access to big data, business stakeholders expect the world, while concerns of governance and data quality are often overlooked.

In this webinar replay, IBRS explores the growing tension between business stakeholders’ expectations and the ICT group’s ability to provide appropriate guardrails for analytics.

The video explores:

  • How the concerns of business stakeholders differ from those of ICT
  • The four operating models of business intelligence
  • The emergence of data mesh architecture, and the potential impact
  • Using data literacy maturity to drive an evolving and practical data strategy

Download the presentation kit:  Business-First_Analytics_Webinar.pdf


As self-service data analytics and visualisation becomes mainstream – due in no small part to Microsoft’s Power BI strategy – traditional data teams within IT groups need to reconsider traditional business intelligence architectures and plan a migration to a new environment. Underpinning the new architecture must be a sharper focus on tools and practices to support data governance, which is not a strength of Microsoft’s portfolio.

Download the 'Power BI is Driving Data Democratisation: Prepare Now' presentation kit and discover:

  • The key areas of business intelligence to inform your Power BI strategy
  • Next steps for your organisation

Regardless of their digital strategies, many organisations have not been positioned to properly leverage the digital and data assets available to them. A Chief Data Officer (CDO) role can improve this situation by advancing an organisation’s data portfolio, curating and making appropriate data visible and actionable.

Log in and click the PDF link above to download 'The New CDO Agenda' presentation kit and discover:

  • 4 pivotal points of the CDO agenda
  • A sample framework on how to understand the ownership of a data domain
  • Next steps for your organisation

Too often, information communications technology (ICT) and business analytics groups focus on business intelligence and analytics architectures and do not explore the organisational behaviours that are required to take full advantage of such solutions. There is a growing recognition that data literacy (a subset of digital workforce maturity) is just as important, if not more important, than the solutions being deployed. This is especially true for organisations embracing self-service analytics.

The trend is to give self-service analytics platforms to managers who are making critical business decisions. However, this trend also requires managers to be trained not just in the tools and platforms, but in understanding how to ask meaningful questions, select appropriate data (avoiding bias and cherry-picking), and apply the principles of scientific thinking to analysis.

Download the PDF now.



The Latest

26 June 2021: Zoho briefed IBRS on Zoho DataPrep, its new business-user-focused data preparation tool, which is being included in its existing Zoho Analytics product as well as being available separately as a tool to clean, transform and migrate data. DataPrep is in beta, and will be officially launched on 13 July 2021.

Why it’s Important

Traditionally, cleaning and transforming data for use in analytics platforms has involved scripting and complex ETL (extract, transform and load) processes. This was a barrier to business stakeholders taking advantage of analytics. However, several analytics vendors (most notably Microsoft, Tableau, Qlik, Snowflake and Domo) have pioneered powerful, drag-and-drop, low-code ETL within their products.
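To ground the contrast, the following is a minimal, illustrative sketch (in Python, using only the standard library) of the kind of hand-scripted ETL pass that these low-code, drag-and-drop tools now replace. The source data, table and column names are invented for the example.

```python
# A minimal sketch of traditional scripted ETL. All data is illustrative.
import csv
import io
import sqlite3

raw_csv = """order_id,region,amount
1,nsw , 120.50
2,VIC,not available
3,nsw,80.00
"""

# Extract: read rows from a CSV source.
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: normalise region codes and drop rows with unusable amounts.
clean = []
for row in rows:
    try:
        amount = float(row["amount"])
    except ValueError:
        continue  # discard rows that fail validation
    clean.append((int(row["order_id"]), row["region"].strip().upper(), amount))

# Load: push the cleaned rows into a reporting database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 200.5
```

Even a trivial pipeline like this requires scripting skills and validation logic; low-code tools expose the same extract, transform and load steps through a visual interface.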

Zoho, which is better known for its CRM, has an existing data analytics platform with Cloud storage, visualisation, reports and dashboards. While the product is not as sophisticated as its top-drawer rivals, it can be considered ‘good enough’ for many business users’ needs. Most significantly, Zoho Analytics benefits from attractive licensing, including the ability to share reports and interactive dashboards both within an organisation and externally.

However, Zoho Analytics lacked a business-user-friendly, low-code ETL environment, instead relying on SQL scripting. Zoho DataPrep fills this gap by providing a dedicated, AI-enabled platform for extracting data from a variety of sources, applying data cleaning and transformations, and pushing the results into another database, a data warehouse or Zoho Analytics.

All existing Zoho Analytics clients will receive Zoho DataPrep with no change to licensing.

However, what is interesting here is Zoho’s decision to offer its DataPrep platform independently of its Analytics platform. This allows business stakeholders to use the platform as a tool to solve migration and data cleaning problems, not just analytics.

IBRS’s initial tests of Zoho DataPrep suggest that it has some way to go before it can compete with the ready-made integration capabilities of Tableau, Power BI, Qlik, and others. In addition, it offers less complex ETL than its better-established rivals. But that may not be an issue for organisations where staff have limited data literacy maturity, or where analytics requirements are relatively straightforward.

Who’s impacted

  • CIO
  • Development team leads
  • Business analysts

What’s Next?

The bigger takeaway from Zoho’s announcement is that ETL, along with all other aspects of business intelligence and analytics, will be low-code, business-user friendly and Cloud-resident. ICT departments seeking to create ‘best of breed’ business intelligence architectures that demand highly specialised skills will simply be bypassed, due to their lack of agility. While there will be a role for highly skilled statisticians, data scientists, and machine learning professionals, the days of needing ICT staff who specialise in specific reporting and data warehousing products are passing.

Related IBRS Advisory

  1. Snowflake Gets PROTECTED Status Security Tick by Aussie Auditor
  2. IBRSiQ: Power BI vs Tableau
  3. Business-First Data Analytics
  4. AWS Accelerates Cloud Analytics with Custom Hardware
  5. IBRSiQ AIS and Power BI Initiatives
  6. Trends in Data Catalogues
  7. When Does Power BI Deliver Power to the People?
  8. Staff need data literacy – Here’s how to help them get it

IBRSiQ is a database of Client inquiries and is designed to get you talking to our advisors about these topics in the context of your organisation in order to provide tailored advice for your needs.

The Latest

May 2021: Talend, a vendor of data and analytics tools, released its Data Health Survey Report that claims 36% of executives skip data when making decisions, and instead go “with their gut”. At the same time, the report claims that 64% of executives “work with data everyday”. On the surface, these two figures seem at odds. However, the report goes on to claim 78% of executives “have challenges in making data drive decisions”, and this is largely due to data quality issues. However, the most interesting finding from the report is “those who produce and those who analyse data live in alternative data realities”.

Why it’s Important

At its core, this report highlights the issue of data literacy. The report was compiled from 529 responses from companies with over USD10 million in sales. A quarter of respondents were from the Asia Pacific region. However, IBRS cautions against drawing Australia-specific inferences, given that different markets have differing levels of data literacy maturity. No details were given for industry, which is also likely to impact data literacy maturity. In fairness, any more detailed analysis of a country or industry would not be feasible, given the sample size.

The above concerns aside, the report does highlight the importance of data literacy: investments in big data tools are useless unless executives are well versed in the key concepts of applying analytical thinking to business decisions. IBRS notes that without data literacy, the most common use of new self-service visualisation tools such as Power BI, Looker, Domo, Tableau, Qlik, Zoho and others is to ‘prove’ executives’ gut feelings. In short, too often visualisation tools are used to reinforce the ‘current ways of thinking’ rather than to seek areas for improvement.

The report’s statement that “those who produce and those who analyse data live in alternative data realities” frequently underpins IBRS inquiries into why business intelligence and analysis programs fail to produce the expected business benefits.

Who’s impacted

  • Business intelligence/analytics teams
  • Senior line-of-business executives
  • Human resources/training teams

What’s Next?

ICT teams responsible for providing business intelligence and analytics services need to cease focusing solely on the tools, technologies and ‘getting data curated’, and spend time exploring which business decisions would most benefit from the application of analytical thinking. However, ICT teams cannot do this alone. They need to be involved in uplifting data literacy among line-of-business executives and work closely with them to identify the decisions that not only can be addressed with data, but would make the biggest difference to organisational outcomes. This does not mean that all aspects of a data scientist’s role need to be explained to business executives. Rather, it is more important to train executives in the principles of using data to inquire into issues or disprove current ways of doing things.

Related IBRS Advisory

  1. Staff need data literacy – Here’s how to help them get it
  2. When Does Power BI Deliver Power to the People?
  3. The critical link between data literacy and customer experience

IBRS interviews Dr Kevin McIsaac, a data scientist who frequently works with board-level executives to identify and prototype powerful data-driven decision support solutions.

Dr McIsaac discusses why so many 'big data' efforts fail, the role ICT plays (or rather, should not play) and the business-first data mindset.


The growing maturity of data handling and analytics is driving interest in data catalogues. Over the past two years, most of the major vendors in the data analytics field have either introduced or are rapidly evolving their products to include data cataloguing.

Data catalogues help data users identify and manage their data for processing and analytics. Leading data cataloguing tools leverage machine learning (ML) and other search techniques to expose and link data sets in a manner that improves access and consumability.

However, a data catalogue is only beneficial when the organisation already has a sufficient level of maturity in how it manages data and analytics. Data literacy (the skills and core concepts that support data analytics) must also be established in the organisation’s user base to leverage full benefits from the proposed data catalogue.

Organisations considering data catalogues must have a clear picture of how to use this new architecture, and be realistic about how ready they are to leverage the technology. Furthermore, different organisations have unique and dynamic data attributes, so there is no one-size-fits-all data catalogue in the marketplace.
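At its simplest, the core function described above, helping users identify and consume data sets, can be thought of as a searchable registry of data set metadata. The following Python sketch is purely illustrative (all data set names, owners and tags are invented); commercial catalogues layer ML-driven linking, lineage and access controls on top of this.

```python
# An illustrative sketch of a data catalogue as a searchable metadata registry.
# All data set names, owners and tags are invented for the example.
from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    name: str
    owner: str
    description: str
    tags: set = field(default_factory=set)

catalogue = [
    CatalogueEntry("sales_orders", "finance", "Daily order lines", {"sales", "revenue"}),
    CatalogueEntry("web_sessions", "marketing", "Site visit logs", {"web", "traffic"}),
    CatalogueEntry("revenue_forecast", "finance", "Quarterly projections", {"revenue", "forecast"}),
]

def search(catalogue, keyword):
    """Return entries whose name, description or tags match the keyword."""
    keyword = keyword.lower()
    return [e for e in catalogue
            if keyword in e.name.lower()
            or keyword in e.description.lower()
            or keyword in {t.lower() for t in e.tags}]

print([e.name for e in search(catalogue, "revenue")])  # ['sales_orders', 'revenue_forecast']
```

The value of even this toy registry depends on the metadata being curated and kept current, which is why catalogue benefits track an organisation's data management maturity.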

The Latest

29 April 2021: Cloud-based analytics platform vendor Snowflake has received ‘PROTECTED’ status under IRAP (Australian Information Security Registered Assessors Program).  

Why it’s Important

As IBRS has previously reported, Cloud-based analytics has reached a point in cost of operation and sophistication where it should be considered the de facto choice for future investments in reporting and analytics. However, IBRS does call out that there are sensitive data sets that need to be governed and secured to a higher standard. Often, such data sets are the reason organisations decide to keep their analytics on-premises, even if the cost analysis does not stack up against IaaS or SaaS solutions.

The irony here is that IT professionals now accept that even without PROTECTED status, Cloud infrastructure provides a higher security benchmark than most organisations’ on-premises environments.

However, security must not be overlooked in the analytics space. Data lakes and data warehouses are incredibly valuable targets, especially as they can hold private information that is then contextualised with other data sets.

By demonstrating IRAP certification, Snowflake effectively opens the door to working with Australian Government agencies. But it also signals that hyper-scale Cloud-based analytics platforms can not only offer a bigger bang for your buck, but greatly improve an organisation's security stance.

Who’s impacted

  • CDO
  • Data architecture teams
  • Business intelligence/analytics teams
  • CISO
  • Public sector tech strategists

What’s Next?

Review the security certifications and stance of any Cloud-based analytics tools in use, including those embedded within core business systems and those that have crept into the organisation via shadow IT (we are looking at you, Microsoft Power BI!). Match these against compliance requirements for the datasets being used and determine if remediation is required.

When planning for an upgraded analytics platform, put security certification front and centre, but also recognise that like any Cloud storage, the most likely security breach will occur from poor configuration or excess permissions.

Related IBRS Advisory

  1. Key lessons from the executive roundtable on data, analytics and business value
  2. VENDORiQ: AWS Accelerates Cloud Analytics with Custom Hardware
  3. IBRSiQ: AIS and Power BI Initiatives
  4. VENDORiQ: Snowflakes New Services Flip The Analytics Model

The Latest

7 May 2021: Analytics vendor Qlik has released its mobile client Qlik Sense Mobile for SaaS. During the announcement, Qlik outlined how the new client enables both online and offline analytics and alerting. The goal is to bring data-driven decision-making to an ‘anywhere, anytime, any device’ model. 

Why it’s Important

While IBRS accepts that mobile decision support solutions will be of huge value to organisations, this needs to be tempered with an understanding that not all decisions should be made in all contexts. There is a very real danger that, in the hype surrounding analytics, people will start making decisions in less than ideal contexts. Putting decision support algorithms (i.e. agents), KPI dashboards and simple modelling tools on mobile devices will likely be the next wave of analytics. In short, mobile big data/AI-driven solutions that support specific, narrow mobile work tasks will be a very big deal in the near future.

However, creating and diving into data (that is, data exploration) is, or should be, a process rooted in deep, careful, considered scientific thinking. That is a cognitive task not well suited to a mobile device experience. This is not just due to the form factor, but also the working context: such deep thinking requires a focus that a mobile work context does not provide.

As organisations embrace self-service analytics and more staff are engaged in creating and consuming visualisations and reports, data maturity will become an increasingly important consideration. However, data literacy is not just a set of skills to learn: it requires a change in culture and demands staff become familiar with rigorous models of thinking. It also requires honest reflection, both of the organisation’s activities and individually. 

While mobile analytics will be a growing area of interest, it will fail without a well-structured program to grow data literacy within the organisation and without granting staff the time and appropriate work spaces to reflect, explore and challenge their assumptions using data.

Who’s impacted

  • CDO
  • HR directors
  • Business intelligence groups

What’s Next?

Organisations should honestly assess staff data literacy maturity at a departmental and whole-of-organisation level. Armed with this information, a program to grow data literacy maturity can be developed. The deployment of data analytics tools, and indeed data sets, should coincide with the evolution of data literacy within the organisation.

Related IBRS Advisory

  1. Staff need data literacy – Here’s how to help them get it
  2. When Does Power BI Deliver Power to the People?
  3. The critical link between data literacy and customer experience

The Latest

28 April 2021:  AWS has introduced AQUA (Advanced Query Accelerator) for Amazon Redshift, a distributed and hardware-accelerated cache that, according to AWS, “delivers up to ten times better query performance than other enterprise Cloud data warehouses”.

Why it’s Important

AWS is not the only vendor that offers distributed analytics computing. Architectures from Domo and Snowflake both make use of elastic, distributed computing resources (often referred to as nodes) to enable analytics over massive data sets. These architectures not only speed up the analytics of data, but also provide massively parallel ingestion of data. 

By introducing AQUA, AWS has added a layer of specialised, massively parallel and scalable cache over its Redshift analytics platform. This new layer comes at a cost, but initial calculations suggest it is a fraction of the cost of deploying and maintaining traditional big data analytics architecture, such as specialised BI hyperconverged appliances and databases.

Given the rapid growth in self-service data analytics (aka citizen analytics), organisations will face increasing demands to provide analytics services over growing volumes of both highly curated data and ‘other’ data of varied quality. In addition, organisations need to plan for the rise in unstructured data.

As with email, we have reached a tipping point in the demands of performance, complexity and cost where Cloud delivered analytics outstrip on-premises in most scenarios. The question now becomes one of Cloud architecture, data governance and, most important of all, how to mature data literacy across your organisation.

Who’s impacted

  • Business intelligence / analytics team leads
  • Enterprise architects
  • Cloud architects

What’s Next?

Organisations should reflect honestly on the way they are currently supporting business intelligence capabilities, and develop scenarios for Cloud-based analytics services. 

This should include a re-evaluation of how adherence to compliance and regulations can be met with Cloud services, how data could be democratised, and the potential impact on the organisation. BAU cost should be considered, not just for the as-is state, but also for potential future states. While savings are likely, they should not be the overriding factor: new capabilities and enabling self-service analytics are just as important.

Organisations should also evaluate data literacy maturity among staff and, if needed (which is likely), put in place a program to improve staff’s use of data.

Related IBRS Advisory

  1. IBRSiQ: AIS and Power BI Initiatives
  2. Workforce transformation: The four operating models of business intelligence
  3. Staff need data literacy – Here’s how to help them get it
  4. The critical link between data literacy and customer experience
  5. VENDORiQ: Fujitsu Buys into Australian Big Data with Versor Acquisition


The Latest

09 April 2021: During its advisor business update, Fujitsu discussed its rationale for acquiring Versor, an Australian data and analytics specialist. Versor provides managed services for data management, reporting and analytics, as well as consulting services, including data science, to help organisations deploy big data solutions.

Why it’s Important

Versor has 70 data and analytics specialists with strong multi-Cloud knowledge. Fujitsu’s interest in acquiring Versor lies primarily in tapping Versor’s consulting expertise in Edge Computing, Azure, AWS and Databricks. In addition, Versor’s staff have direct industry experience with some key Australian accounts, including public sector, utilities and retail, which are all target sectors for Fujitsu. Finally, Versor has expanded into Asia and is seeing strong growth.

From Fujitsu’s perspective, then, the acquisition is a quick way to bolster its credentials in digital transformation and to open doors to new clients.

This acquisition clearly demonstrates Fujitsu’s strategy to grow in the ANZ market by increasing investment in consulting and special industry verticals.  

Who’s impacted

  • CIO
  • Development team leads
  • Business analysts

What’s Next?

Given its experienced staff, Versor is expected to lead many of Fujitsu’s digital transformation engagements with prospects and clients. Fujitsu’s well-established ‘innovation design engagements’ are used to explore opportunities with clients and leverage concepts of user-centred design. Adding specialist big data skills to this mix makes for an attractive pre-sales consulting combination.

Related IBRS Advisory

  1. The new CDO agenda
  2. Workforce transformation: The four operating models of business intelligence
  3. VENDORiQ: Defence Department Targets Fujitsu for Overhaul


The decision to integrate machine learning (ML) into systems and operations is not one that is made lightly. Aside from the costs of acquiring the technology tools, there are added considerations such as staff training and the expertise required to improve ML operations (MLOps) capabilities.

An understanding of the ML cycle before deployment is key. Once requirements and vision are defined, the appropriate tools are acquired. ML specialists then analyse the data and perform feature engineering, model design, training, testing and deployment. This is also known as the dev loop. At the implementation stage, the ML model is deployed and the application is subsequently refined and enhanced. The next stage is monitoring and improving, where the organisation refines the model and evaluates the ROI of its data science efforts. Monitoring for data drift at this stage triggers retraining of the model.
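As an illustration of the retraining trigger described above, the following Python sketch flags data drift when a feature's live mean shifts well outside what was seen at training time. The threshold and feature values are illustrative assumptions only; production MLOps platforms apply more robust statistical tests across many features.

```python
# A minimal, illustrative data drift check of the kind that can trigger
# model retraining. Threshold and values are assumptions, not recommendations.
import statistics

def drift_detected(training_values, live_values, threshold=2.0):
    """Flag drift when the live mean shifts more than `threshold`
    training standard deviations away from the training mean."""
    mean = statistics.mean(training_values)
    stdev = statistics.stdev(training_values)
    live_mean = statistics.mean(live_values)
    return abs(live_mean - mean) > threshold * stdev

# Feature distribution seen at training time vs. two production samples.
training = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]
stable_live = [10.0, 10.1, 9.9]
shifted_live = [14.2, 14.6, 14.1]

print(drift_detected(training, stable_live))   # False
print(drift_detected(training, shifted_live))  # True
```

When the check fires, the monitoring stage loops back into the dev loop: the model is retrained on recent data and redeployed.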




For enterprises and small to medium businesses (SMBs), Artificial Intelligence (AI) opportunities are widespread and industry-specific. Each industry will grapple with conversations to understand how AI can:

  1. Create competitive advantage.
  2. Complement existing business.
  3. Disrupt, or even destroy the business model that exists today.

What businesses need to plan for is that AI engineering and AI ops are destined to be the essential umbrella to govern AI in the coming decade. Hyper-automation (HA) of business processes will see some business models fail whilst others thrive into the 2030s.

Conclusion: Regardless of their digital strategies, many organisations have not been positioned to properly leverage the digital and data assets available to them. A Chief Data Officer (CDO) role can improve this situation by advancing an organisation’s data portfolio, curating and making appropriate data visible and actionable.

The CDO position is appropriate for all larger organisations, and small-to-large organisations focused on data-driven decision-making and innovation. These organisations benefit from a point person overseeing data management, data quality, and data strategy. CDOs are also responsible for developing a culture that supports data analytics and business intelligence, and the process of drawing valuable insights from data. In summary, they are responsible for improving data literacy within the organisation.

Conclusion: Agility to respond to change has become essential. In previous years, CIOs were expected to produce results over longer periods of time; now expectations have become much higher, with stakeholders expecting results as soon as possible. With the trend geared towards increasing dependence on technology, the pressure to deliver results has therefore increased for CIOs and IT leaders.

Part of this new set of expectations is improved efficiency and productivity, which in most cases requires a thorough evaluation of business processes to identify potential inefficiencies. One of the primary tools organisations have at their disposal is the enterprise resource planning (ERP) system. Eventually, it all boils down to whether or not the migration to S/4 HANA can be justified in terms of value-added services. Implementation effort and run costs are only a part of the business case, not the whole.

The Latest

19 Nov 2020: During its annual summit, Snowflake announced a series of new capabilities: a development environment called Snowpark, support for unstructured media, row-level security for improved data governance, and a data marketplace.

Why it’s Important

Of Snowflake’s recent announcements, Snowpark clearly reveals the vendor’s strategy to leverage its Cloud analytics platform to enable the development of data-intensive applications. Snowpark allows developers to write applications in their preferred languages to access information in the Snowflake data platform.

This represents an inversion of how business intelligence / analytics teams have traditionally viewed the role of a data warehouse. The rise of data warehouses was driven by limitations in computing performance: heavy analytical workloads were shifted to a dedicated platform so that application performance would not be impacted by limits of database, storage and compute power. With Cloud-native data platform architectures that remove these limitations, it is now possible to leverage the data warehouse (or at least, the analogue of what the data warehouse has become) to service applications.

Who’s impacted

  • Development teams
  • Business intelligence/analytics architects

What’s Next?

Snowflake's strategy is evidence of a seismic shift in data analytics architecture. Along with Domo, AWS, Microsoft Azure, Google and other Cloud-based data platforms that take advantage of highly scalable, federated architectures, Snowflake is empowering a flip in how data can be leveraged. To take advantage of this flip, organisations should rethink the structure and roles within BI/analytics teams. IBRS has noted that many organisations continue to invest heavily in building their BI/analytics architecture with individual best-of-breed solutions (storage, databases, warehouses, analytics tools, etc.), while placing less focus on the data scientists and business domain experts. With access to elastic Cloud platforms, organisations can reverse this focus, putting the business specialists and data scientists in the lead.

Related IBRS Advisory
  • Workforce transformation: The four operating models of business intelligence
  • Key lessons from the executive roundtable on data, analytics and business value

Conclusion: One of the misconceptions in business intelligence (BI) is that the goal is to capture and report upon all available data. This misses an essential business maxim: data is only useful when it is applied deliberately and with a clear goal in mind.

Too often, an organisation’s focus on BI quickly moves from aspirational principles of ‘being a data-driven business’ to discussions of technology architecture and data governance. However, it is dangerous to focus on simply hoarding data in the hope it will be useful in the future. What extracts value from data are steps taken after collection. And to define those steps, an organisation must first define the purpose to which the data will be applied.

IBRS has identified four increasingly sophisticated business models for how data can be applied: business reporting, data visualisation, key performance dashboard and predictive decision support.

Conclusion: A common pitfall experienced by service-orientated organisations is the disconnect between its digital efforts and its marketing program. In good practice, marketing efforts should underpin your digital strategy. This can be achieved by unifying marketing’s focus on customer and staff engagement, communications and promotion with the leveraging of digital channels to conduct these activities.

Conclusion: As Australia’s use of consultancy services continues to grow, so too does the need for businesses to obtain value from these engagements quickly and effectively. Key to obtaining this value is the organisation’s ability to easily and rapidly provide consultants and contractors with the specific context of your business, your customers and your unique challenges.

By providing the organisational context quickly, you can mitigate time, scope and budget creep, improve the quality of outputs developed by consultants and ensure that consequent plans are actionable and genuinely valuable for your business.

However, the ability to provide the needed organisational context quickly and effectively to consultants remains a common organisational challenge, and therefore a pitfall for successful vendor engagement. This paper covers how you can overcome this pitfall.

Conclusion: Dropbox’s announcement of a new interface may seem trivial, but its repositioning of ‘folders’ heralds the next disruptive phase of information management. By changing folders from being an approach for hierarchical organisation of information to being a ‘digital workspace’ for collaboration, Dropbox is leading the charge to drop the ‘paper metaphor’ in favour of collaboration. The impact on traditional information management lifecycles and information management will be both significant and challenging.

Conclusion: While current artificial intelligence (AI) initiatives are data-driven, there are instances where the available data is insufficient to predict the future. For example, answering the following questions might be challenging if the available data is purely historical and therefore irrelevant for forecasting purposes:

  • Q1: What will be the effect on sales if the price is increased by 10% as of the next quarter?
  • Q2: What would have happened to sales had we increased the price by 10% six months ago?

The purpose of this note is to provide a framework that can be used to derive sales principles to answer the above questions. The same approach can be used to derive principles for other business processes such as procurement, customer service and client complaints tracking.
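
As a minimal sketch of what an answer to Q1 could look like, the example below fits a constant-elasticity model (sales = a × price^elasticity) to hypothetical historical observations and projects the effect of a 10% price rise. All data here is synthetic, and the approach only works when prices have actually varied in the history; it does not resolve the causal caveats the note raises.

```python
import math
import random
import statistics

def estimate_elasticity(prices, sales):
    """Least-squares slope of log(sales) on log(price)."""
    lp = [math.log(p) for p in prices]
    ls = [math.log(s) for s in sales]
    mp, ms = statistics.mean(lp), statistics.mean(ls)
    num = sum((x - mp) * (y - ms) for x, y in zip(lp, ls))
    den = sum((x - mp) ** 2 for x in lp)
    return num / den

def project_sales_change(elasticity, price_change=0.10):
    """Relative sales change implied by a given relative price change."""
    return (1 + price_change) ** elasticity - 1

# Synthetic history with a true elasticity of -1.5 plus noise
random.seed(0)
prices = [10 + random.uniform(-2, 2) for _ in range(50)]
sales = [1000 * p ** -1.5 * math.exp(random.gauss(0, 0.05)) for p in prices]

e = estimate_elasticity(prices, sales)
print(f"estimated elasticity: {e:.2f}")
print(f"projected sales change for +10% price: {project_sales_change(e):+.1%}")
```

Q2, the counterfactual question, is harder still: it requires causal assumptions that no amount of curve fitting on observational data can substitute for.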

Related Articles:

"Analytics artificial intelligence maturity model" IBRS, 3 December 2018


Conclusion: Artificial intelligence technologies are available in various places such as robotic process automation (RPA), virtual agents and analytics. The purpose of this paper is to provide an AI maturity model in the analytics space. The proposed maturity model can be applied to any type of industry. It provides a roadmap to help improve business performance in the following areas:

  • Running the business (RTB): Provides executives with sufficient information to make informed decisions about running the business and staying competitive.
  • Growing the business (GTB): Provides information about growing the business in various geographies without changing the current services and products.
  • Transforming the business (TTB): Provides information to develop and release new products and services ahead of competitors.

IBRS iQ is a database of Client inquiries and is designed to get you talking to our Advisors about these topics in the context of your organisation in order to provide tailored advice for your needs.

Conclusion: Automation is understood to facilitate repetitive but essentially simple tasks. In conjunction with general purpose machine intelligence, virtual personal assistants and technologies leveraging artificial intelligence, automation will expand into more operational roles.

As the technologies improve, the potential applications will expand and play a larger marketing role.


Conclusion: Organisations continue to emphasise their competitive differentiation based on the data they hold, and the insights gained from analysing this valuable resource. The rate at which organisations are shifting from traditional process-based to insight-oriented differentiation is being further accelerated by the adoption of Cloud-based data analytics services.

The combined result is an increasing portion of enterprise project activity that can be classified as extract, transform and load (ETL).

Despite ETL being the mainstay of data integration for decades, the cost of specialised skills and significant manual effort expended on integrating disparate data sources is now coming into sharp focus. In response, organisations are rightly seeking lower-cost solutions for data integration.

Although ETL exists in the form of at least one tool in almost every enterprise, the cost of ETL as a proportion of data analytics projects means organisations must decrease reliance on traditional ETL tools in favour of automated solutions that exploit machine learning techniques to reduce the need for ETL developers.
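
One small piece of the manual effort such automation targets is source-to-target column mapping. The toy sketch below (not any particular product's feature) proposes mappings by expanding common warehouse abbreviations and fuzzy-matching names; the column names are invented for illustration.

```python
import difflib

# Hypothetical source and target schemas
source_cols = ["cust_nm", "cust_addr_1", "ord_dt", "ord_total_amt"]
target_cols = ["customer_name", "customer_address", "order_date", "order_total"]

def normalise(name):
    # Expand common warehouse abbreviations before fuzzy matching
    subs = {"cust": "customer", "nm": "name", "addr": "address",
            "ord": "order", "dt": "date", "amt": "amount"}
    return " ".join(subs.get(p, p) for p in name.lower().split("_"))

def propose_mappings(source, target):
    norm_targets = [normalise(t) for t in target]
    mapping = {}
    for s in source:
        hits = difflib.get_close_matches(normalise(s), norm_targets,
                                         n=1, cutoff=0.5)
        if hits:
            mapping[s] = target[norm_targets.index(hits[0])]
    return mapping

print(propose_mappings(source_cols, target_cols))
```

Commercial ML-assisted integration tools go much further (profiling values, learning from accepted mappings), but the principle is the same: propose, let a human confirm, and reduce hand-written ETL.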


Conclusion: The return on investment in big data and associated analytics projects has been generally positive. It is more likely that returns over the longer term will grow too, provided strategic aims are established. The promise of big data hinges on information analysis, and therefore organisations must be clear as to the use and application of the insight.

Conclusion: Machines are becoming increasingly proficient at tasks that, in the past, required human intelligence. Virtually all human domain expertise can be encoded in digital data with the right knowledge engineering tools. The bottleneck in understanding between humans and software is shaped by the ambiguities inherent in human communication, not by the challenge of developing machine intelligence. To benefit from big data, organisations need to articulate knowledge in the language of data, i.e. in a format that is not only understandable by humans but also actionable by machines.

Data scientists are in hot demand. In October 2012 the Harvard Business Review featured an article titled “Data Scientist: The Sexiest Job of the 21st Century”. International online job boards and LinkedIn have many thousands of openings asking for big data skills, and a growing number of openings for data scientists. What is all the hype about?

Conclusion: Cloud infrastructure and platforms have started to alter the landscape of data storage and data processing. Software as a Service (SaaS) Customer Relationship Management (CRM) functionality is now considered best of breed, and even traditional vendors such as SAP are transitioning customers to SaaS solutions. The recent disclosure of global Cloud data mining by the US National Security Agency (NSA) has further fuelled concerns about industrial espionage in Europe and has significantly raised citizen awareness with respect to privacy and data custodianship. Any realistic attempt to address these concerns requires radical changes in data architectures and legislation.

Conclusion: Government agencies are slow in implementing open public sector information in line with freedom of information requirements. Agencies are challenged in terms of awareness of related government policies, in terms of cross-disciplinary collaboration, and in terms of obtaining funding for open data initiatives. The implications are not limited to government, but also affect the ability of Australian businesses to develop innovative products that derive value from Big Data in the public domain.

Many organisations are seeing growing demand and discussion around mobility and mobile applications, in particular in the Networks Group. In theory, mobility can enable significant business innovation and optimisation of business processes. However, few organisations have been able to clarify the benefits of mobility in terms that are aligned to their organisational goals and vision statements. This challenge is exacerbated by the rapid innovation and changes underway in the mobility market.

What is needed to address these problems is a consistent, repeatable process that embeds mobility into the organisation's overall IT Strategy. At the same time, mobility needs to be treated slightly differently to many traditional projects of work: most mobility initiatives are smaller, with shorter delivery times, than large system deployments, but are often intimately interconnected with, and enabled by, the traditional larger back-end systems.

To meet this challenge, IBRS developed its Mobility Strategy Methodology, which provides a formal framework and process.

Conclusion: The maturity of information management practices in an organisation has a direct effect on the ability to achieve business goals related to supply chain optimisation, the quality of financial decisions, productivity, and quality of service. The exponential growth of unstructured information is no replacement for structured information. Quite the opposite: a stream of unstructured Big Data can only be turned into tangible value once it is channelled through a distillery that extracts highly structured information accessible to human decision makers, and that can be used to provide a service to the public or to drive a commercial business model. The transformation of unstructured data into knowledge and actionable insights involves several stages of distillation, the quality of which determines the overall performance of the organisation.

Conclusion: There are many links between the story of data warehousing and the story of SAP adoption, going all the way back to 1997, when SAP started developing a “Reporting Server”. Over the following decade SAP firmed up its dominant position as a provider of Enterprise Resource Planning functionality, creating countless business intelligence initiatives in the wake of SAP ERP implementation projects. Up to 80% of data warehouses have become white elephants, some completely abandoned, and others have been subjected to one or more resuscitation attempts. Big data can either be the last nail in the coffin, or it can be the vaccine that turns the colour of the data warehousing elephant into a healthy grey.

Conclusion: Unless you have a definition of the key data items for your enterprise, you will not be able to manage your data effectively. Astute CIOs have an understanding of the key data items that their organisation relies on for effective decision-making.

An enterprise data model documents the data in your organisation. It is a key enterprise architecture asset that enables more effective data management as well as offering the CIO the ability to reduce duplication and provide a higher level of service to the organisation.

Conclusion: Location, or geospatial information, is a central but significantly under-utilised element of the volume of data created and leveraged by organisations. Location information is simply presented and leveraged as text, e.g. an address. But location information is not just about where an asset or activity is located, but rather, where it is located in relation to other assets or activity. That relationship is best presented visually.

Conclusion: Pattern-based and repeatable processes, such as gathering operational data, validating data, and assessing data quality, offer potential for automation. The Web and software-as-a-service technologies offer powerful tools that facilitate automation beyond the simple mechanical pumping of data from one system to the next. Operational management tasks that focus on administration and control can and should be automated, so that managers have time to think about the organisation as a system, and can focus on continuous improvement.
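
The kind of repeatable data-validation step the conclusion above refers to lends itself naturally to automation when expressed as declarative rules. The records and rules below are hypothetical, purely to illustrate the shape of such a check.

```python
# Hypothetical operational records to validate
records = [
    {"id": 1, "email": "a@example.com", "amount": 120.0},
    {"id": 2, "email": "", "amount": -5.0},
    {"id": 3, "email": "c@example.com", "amount": 40.0},
]

# Declarative data-quality rules: name -> predicate over one record
rules = {
    "email present": lambda r: bool(r["email"]),
    "amount non-negative": lambda r: r["amount"] >= 0,
}

def validate(records, rules):
    """Return (record id, rule name) for every failed check."""
    failures = []
    for r in records:
        for name, check in rules.items():
            if not check(r):
                failures.append((r["id"], name))
    return failures

print(validate(records, rules))
# [(2, 'email present'), (2, 'amount non-negative')]
```

Because the rules are data rather than code paths, adding a new quality check is a one-line change, which is precisely what makes this class of task a good automation candidate.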

Conclusion: A competency centre for Business Intelligence (BI) must have an active mandate and involvement from the senior executive to sustain optimised delivery of the organisational BI strategy. This leadership is a key factor in the ability to successfully deliver the initial benefits of the competency centre within a three month development period, establishing long term benefits.

Conclusion: The implementation of, and ongoing investment in, Business Intelligence (BI) solutions have regularly failed to meet organisational expectations, both in terms of business outcomes and cost. Further highlighting this, the most recent Gartner predictions for 2012 highlight that by 2015, more than 85% of Fortune 500 companies will have failed to effectively exploit Big Data (and by extension BI) to achieve competitive advantage. As an outcome, establishing a BI Competency Centre is relevant for a large number of organisations that wish to improve BI outcomes. The establishment of the centre of excellence can align resources, focus capabilities and ensure that lessons from projects and processes are shared across the organisation.

Conclusion: Over the last decade, the volume of data that governments and private corporations collect from citizens has been eclipsed by the data produced by individuals, as photos, videos, and messages on online social platforms, and also the data produced by large scale networks of sensors that monitor traffic, weather, and industrial systems. Web users are increasingly recognising the risks of handing over data-mining rights to a very small group of organisations, whilst getting very little in return. The pressure is on to develop robust solutions that not only deliver value, but also address concerns about data ownership, privacy, and the threat of data theft and abuse.

Conclusion: The implementation of Business Intelligence is critical to the optimised operation of even the most basic business functions. When executed well it provides quantifiable competitive advantage for private sector organisations, and improved service delivery outcomes for the public sector.

IT has a significant opportunity to enhance its business relevance by ensuring that Business Intelligence best practice is active and transparent across the organisation. Organisations without a comprehensive investment and capability in Business Intelligence will struggle to compete and will operate below their potential.

Conclusion: IBRS has identified three broad approaches to Microsoft Office upgrades. In this research, we examine the benefits and challenges of each approach, and key considerations for planning. Organisations with more than 750 seats should avoid ad hoc Office deployments and take time to get their migration strategy in place, or risk creating a “demand feedback loop” that will result in higher costs and dissatisfaction with the IT department.

Conclusion: Business intelligence has traditionally served as an after-the-fact reporting and analysis capability that drifts weeks or months behind current events. Modern enterprises demand timelier access to integrated information. This demand cannot be met by conventional business intelligence approaches and requires a variety of new techniques targeted at the immediacy of the information required.

Conclusion: We are living in the Knowledge Age, and the operations of many organisations are critically dependent on the use of software-intensive systems. The value of operational data is well recognised, and the power struggle between the Internet superpowers such as Google, Amazon, and Facebook is largely about control over data. Knowledge however, is much more than raw data, and can be defined as the capability to transform data into valuable products and services. Today vast amounts of knowledge are expressed in the form of program source code and related data structure definitions. Most of this knowledge is not nearly as easily accessible and modifiable as we would like it to be. Techniques for knowledge reconstruction are becoming highly relevant, and organisations are well advised to up-skill Enterprise Architects and Business Analysts in this new discipline.

Conclusion: Operational data is the heart of a business in the information age. Without operational data the organisation would cease to function, irrespective of the software and hardware infrastructure that is in place. Hence the quality of data is a logical starting point for identifying opportunities to improve business operations. When used in combination with top-down value chain analysis, a quality assessment and categorisation of data can be used to identify essential system functionality, to identify pockets of obsolete functionality, and to discover sets of unreliable or redundant data.

Creating a single source of truth is impossible in the modern enterprise. Today's complex environment of custom, COTS and Cloud solutions means that redundant and inconsistent information will persist for a long time. Instead, drive consistency and currency of data across systems using Master Data Management (MDM). Addressing underlying data quality issues will remain a harder task.

Conclusion: Automated software and system testing will never be the testing silver bullet. One of its components though, the automated generation of test data, is one of the powerful weapons in the software testing arsenal, and its deployment can provide a strategic advantage in the testing battle. The key is when and how to automate test data generation and which of its features are most effective when deployed. Two of its most useful benefits are reducing risk by protecting personal details and lowering costs by significantly reducing the number of tests required.
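
A minimal sketch of the personal-details benefit: generating wholly synthetic customer records so that no real names, emails or dates of birth ever reach a test environment. The record shape and field names are invented for illustration.

```python
import random
import string

random.seed(42)  # reproducible test data runs

def synthetic_customer(i):
    """Build one fully synthetic customer record (no real PII)."""
    name = "".join(random.choices(string.ascii_lowercase, k=8))
    return {
        "id": i,
        "name": name.title(),
        "email": f"{name}@test.invalid",  # .invalid is reserved (RFC 2606)
        "dob": (f"19{random.randint(50, 99)}-"
                f"{random.randint(1, 12):02d}-"
                f"{random.randint(1, 28):02d}"),
    }

customers = [synthetic_customer(i) for i in range(1, 4)]
for c in customers:
    print(c)
```

Production-grade tools add realistic distributions, referential integrity across tables and locale-aware formats, but the privacy principle is the same: test data that merely looks real.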

Conclusion: Organisations are drowning in complexity and information overload. At the same time, saving costs is at the top of the agenda. The only realistic path forward lies in tackling complexity head-on by deploying analytical techniques that help identify spurious complexity and confirm intrinsic complexity. Subsequently, spurious complexity can be removed by surgical intervention, one step at a time.

Recently Wired magazine featured an interview with the CEO of Facebook where Mark Zuckerberg claims that Facebook does not regard other online networking platforms as competition, but that Google is the real competitor.

Conclusion: Privacy and data protection laws in Australia and NZ hold organisations, rather than their subcontractors, responsible for the activities of their subcontractors. Before committing to outsourcing any corporate data to a cloud computing vendor, an organisation must ensure that all relevant legal constraints are agreed and in place so as to avoid any subsequent litigation. Ensuring and monitoring this may not be easy.

Conclusion: In our experience many Business Intelligence (BI) initiatives end up well short of their original objectives. But all is not lost. Sometimes it helps to learn from the experiences of others. For those:

  • Intending to embark on a new BI initiative
  • Working to remediate a BI strategy that had lost momentum.

This research paper examines some case studies and examples which reference breakthrough approaches and reflect the sometimes arduous travails involved in dealing with the many challenges presented by BI projects.

Conclusion: To gain insight into C-level executive intentions with information management, Accenture carried out a global survey in 2007. Whilst the majority of respondents had well-developed views on the power of Business Intelligence (BI) as a strategic differentiator, the report unearthed an underlying frustration in achieving their vision of an organisation-wide BI capability. This echoes our experiences in the ANZ market in which we observe many CIOs struggling to bring their complete BI visions to reality.

Conclusion: In 2008, corporate databases reached unprecedented sizes. Yet despite the abundance and diversity of data, many organisations remain challenged by Business Intelligence (BI) initiatives. They buy on vendor promise, but many have difficulty fulfilling it. Against this backdrop, and in a confusing post-acquisition market, BI vendors continue to release increasingly sophisticated and capable products.

Conclusion: Building a business case for Unified Communications is currently more of an art than a science. Traditional Return on Investment (ROI) models are now inapplicable unless arbitrary values are placed on intangible benefits. However, the difficulty of building a business case for UC does not mean that there is none – just that we need to view (and measure) UC’s benefits in accordance with the stage of maturity of the technology’s adoption. Paradoxically, as UC evolves past its current human-to-human model over the next decade, we will be able to switch back to using formal ROI models.

Conclusion: Too often, corporate decision making is not a rational and well-structured process. The team charged with making the decision often accumulates a lot of information, probably biased by their own values, then goes into a room and emerges with a decision. This approach is a poor basis for making complex or important decisions.

Conclusion: BI technology platforms have been available in various guises for over two decades. Indeed, certain BI terms, such as ‘drill down’ have become embedded into business parlance.

The technology itself is mature and capable and many organisations have harnessed it to their advantage. However, some of our recent dealings with both IT and business executives reveal an underlying dissatisfaction with their BI implementations. Complaints include costly implementations, poor acceptance of the technology, particularly by middle management, and concerns with data quality and integrity. 

Conclusion: Due to their scale of operation and the massive databases they need to manage, Australia’s major banks often act as a bellwether for other IT users. This is certainly the case at present as a number of banks commit to Master Data Management (MDM) in an effort to bring their management reporting into order.

Conclusion: Knowledge Management (KM) is often thought of as a dark art. It’s not. Many organisations can benefit in tangible ways (e.g. quick access to a problem database in a Help Desk context) by harvesting the knowledge that already exists within them.

The last article on KM concerned explicit knowledge management, being knowledge that has already been articulated in some form within an organisation. This article focuses on tacit or implicit knowledge, which concerns the experiences of individuals.

Conclusion: With the increasing sophistication of application software, it seems inconceivable in 2005 for any organisation to have data quality problems. Yet it is a problem that does occur more frequently than many recognise.

The worldwide recession in IT spending is, by most accounts, about to end. However, our discussions with technology buyers show that the demands to ‘squeeze more out of less’ are still common. With most IT budgets forecast to show percentage growth only in the single digit range, demonstrable ROI from new IT initiatives is essential.

Conclusion: Understanding the future would obviously give everyone a real competitive edge, or at the very least reduce wastage and efforts in the wrong direction. Forecasting is intended to reduce risk but the quality of forecasts is the key to getting something useful from them. That statement may seem simplistic but many forecasts do not use standard methodologies, or even methodologies that are clear to an outside observer. For anyone using forecasts to build plans and investments, the forecast should meet two conditions:

  1. Use a clear and transparent methodology with data that is verifiable and from known sources, and

  2. Use a forecast model that contains more than one outcome, because a range of possible outcomes within the confines of the forecast may be more realistic given the variable forces operating in a market.

Unless a forecast meets the two conditions outlined above, what ought to be a powerful instrument with which to organise strategy is just a scrap of paper.

You are the only person in the elevator on the ground floor at 7.15am. Just as the doors are about to close, you see the CEO hurrying to catch the elevator and press the ‘Open Doors’ button so she can join you. She says breathlessly, “Thanks. Our meeting is timely. I read your report on our business information management dilemma last evening and raised it with the Chairman before I left. He just contacted me on the mobile phone and said he wants to talk about it when I get out on the 20th floor.