The Latest

25 January 2022: IBM has announced its acquisition of Sydney-based data analytics software company Envizi. According to the official press release, the acquisition is intended to boost IBM’s capabilities in environmental, social and governance (ESG) analytics, an emerging specialised field.

Envizi will be integrated with IBM’s existing package of manufacturing and supply chain solutions, such as IBM Maximo, IBM Sterling, IBM Environmental Intelligence Suite (EIS) and IBM Turbonomic, to support feedback automation across operations and corporate environmental initiatives.

Why it’s Important

IBRS has observed increased activity by large vendors acquiring small, local Australian enterprises that specialise in data analytics. Some of these include the following:

  • Fujitsu’s acquisition of Melbourne-based data and analytics firm Versor in 2021
  • Cognizant’s 2021 purchase of Sydney-based Servian, a data analytics and AI vendor
  • Healthcare tech firm Beamtree’s acquisition of New South Wales-based comparative analytics enterprise Potential(x) in 2021
  • Accenture’s 2019 purchase of Australian big data and analytics consultancy Analytics8, followed by a series of overseas acquisitions of advanced analytics firms: Bridgei2i and Byte Prophecy in India, Novetta Solutions and End-to-End Analytics in the United States, and PRAGSIS BIDOOP in Spain.

Beyond Australia, acquisitions of data analytics startups have also become prominent, with Capgemini acquiring Sweden-based Advectas, Genpact acquiring Enquero, and Infogain acquiring Absolutdata, all formalised in 2020.

IBRS believes that while it is beneficial for the industry to have vendors expand their analytics capabilities, customers and enterprise partners need to continually assess the likely impact on their existing service contracts with analytics vendors. Critical areas include terms and conditions, possible pricing changes, future services, contracted support and personnel changes.

Who’s impacted

  • CIO
  • Development team leads
  • Business analysts

What’s Next?

Organisations need to be prepared for their analytics partners to be the next acquisition targets. As part of their strategy, organisations must remain vigilant and engaged with their analytics vendor partners regarding any acquisitions and the potential impact on services and costs. This includes assessing the implications of the scenarios most likely to occur, as well as the risks or opportunities involved in adjusting to any changes to the existing service. Some potential risks or challenges that must be reviewed by the organisation’s legal and procurement teams can be found on this checklist.

Finally, organisations need to be cautious about assurances that are critical to their operations if these have not yet been put into a written agreement. Taking a more pragmatic view of the new vendor will minimise service disruptions in the future.

Related IBRS Advisory

  1. Mergers & acquisitions require federated service providers governance
  2. Mergers and Acquisitions - Devising the Right Strategy for IT


The Latest

09 November 2021: Amazon Web Services (AWS) announced the availability of Babelfish for Amazon Aurora. Babelfish enables the PostgreSQL-compatible edition of its hyperscale Aurora relational database service to understand commands written for Microsoft SQL Server. This allows customers to run applications written for Microsoft SQL Server directly on Amazon Aurora with minimal modifications to the code.

Why it’s Important

This new feature in Amazon Aurora means enterprises with legacy applications can migrate to the Cloud without the time, effort and huge costs involved in rewriting application code. In addition, using Babelfish benefits organisations through:

  • Reduced migration costs and no expensive lock-in licensing terms, unlike in commercial-grade databases
  • No interruption in existing Microsoft SQL Server database use since Babelfish can handle the TDS network protocol
  • Availability of the open-source version of Babelfish for PostgreSQL on GitHub under the permissive Apache 2.0 and PostgreSQL licenses 
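Under the hood, Babelfish translates T-SQL statements and the TDS wire protocol into PostgreSQL equivalents on the fly. The sketch below is a deliberately simplified illustration of that idea only; the rewrite rules shown are invented for the example and bear no relation to Babelfish’s actual, far more comprehensive translation layer.

```python
import re

# Toy illustration of SQL dialect translation: a few common T-SQL
# constructs rewritten to their PostgreSQL equivalents. The rules
# below are illustrative only, not Babelfish's implementation.
RULES = [
    # T-SQL "SELECT TOP n ..." becomes PostgreSQL "SELECT ... LIMIT n"
    (re.compile(r"\bSELECT\s+TOP\s+(\d+)\b(.*)", re.IGNORECASE | re.DOTALL),
     lambda m: f"SELECT{m.group(2)} LIMIT {m.group(1)}"),
    # T-SQL GETDATE() becomes PostgreSQL now()
    (re.compile(r"\bGETDATE\(\)", re.IGNORECASE), lambda m: "now()"),
    # T-SQL ISNULL(a, b) becomes COALESCE(a, b)
    (re.compile(r"\bISNULL\(", re.IGNORECASE), lambda m: "COALESCE("),
]

def translate_tsql(sql: str) -> str:
    """Apply each rewrite rule in turn to a T-SQL statement."""
    for pattern, repl in RULES:
        sql = pattern.sub(repl, sql)
    return sql

print(translate_tsql("SELECT TOP 5 name, GETDATE() FROM orders"))
# SELECT name, now() FROM orders LIMIT 5
```

The real value proposition is that this translation happens transparently at the wire-protocol level, so the application’s existing SQL Server driver keeps working unchanged.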

Who’s impacted

  • CIO
  • Development team leads
  • Business analysts

What’s Next?

More general availability of hyperscale Cloud computing to support scalability and high-performance needs is expected from major vendors in the coming months. The most successful offerings will be those that require minimal changes to enterprises’ existing SQL Server application code, migrate quickly, and make it easy to switch to other tools post-migration.

Related IBRS Advisory

  1. VENDORiQ: Google Next: Data - PostgreSQL Spanning the Globe
  2. VENDORiQ: Google introduces Database Migration Service

The Latest

02 November 2021: Snowflake has released the Snowflake Media Data Cloud, which allows access to real-time, ready-to-query data products and services from more than 175 data providers. The data-sharing company announced that its product can combine consumer data across sectors to reduce data latency and improve accuracy.

Why it’s Important

More Australian organisations now recognise that access to external data enables enterprises to create one-to-one or one-to-many relationships for more reliable insights. Because it is difficult for businesses to make sense of data they do not generate themselves, sharing information between internal business units or between outside organisations has narrowed insight gaps while lowering the cost of data collection and research. Recent developments in this area include the following institutions that have extended their data sharing:

  • In 2014, Coles revealed that its online shoppers using Flybuys would have their personal information shared with 30 companies under the same Coles umbrella as well as with third parties in more than 23 countries.
  • Woolworths first started granting its suppliers access to its consumer shopping behaviour data in 2017 to support collaborative, customer-centric decision-making. However, it has resisted disclosing all the companies that handle its data, declining to do so when asked to submit comments during the 2021 review of the Privacy Act.
  • In June 2021, Bunnings announced an upgrade of its tech platform to capture customer information to improve buyer experience. Its privacy policy page explicitly discusses how information is shared with third party businesses such as financial searches, security providers, market research firms, and payment collectors.
  • Likewise, Target Australia discloses customer information to its service providers based overseas and to external call centres, recruitment companies and external fulfilment businesses. 

Ensuring the rights of consumers whose data is being shared can be an issue, and apprehensions about maintaining privacy and confidentiality are often raised. The government introduced open banking across the country to give consumers greater control over their personal data, and over who it is shared with, when applying for banking services.

Enterprises in the data-sharing environment must also find ways to ensure fair and equitable advantage of the information by accessing the same level of data insights as their competitors do. 

Who’s impacted

  • CIO
  • Development team leads
  • Business analysts

What’s Next?

Enterprises need to address the challenges of sharing large-scale datasets: adherence to legislative and ethical frameworks, the use of personally identifiable information (PII) for testing, defining the critical role of service providers and their limitations, and improving the overall context of each shared data environment. This can be achieved if policies, procedures and standards on data privacy and security are aligned with data ethics that engender trust among the many direct and indirect actors involved in data sharing. Whatever the goals (such as developing innovative ancillary products with business partners, or improving customer care by analysing real-time dashboards for rapid issue resolution), making the best use of opportunities in the field needs to be secure, lawful, just and ethical, so that collaboration leads to better decision-making when building upon the work of others, and fosters a culture of trust.
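On the PII point, one common technical control before sharing a dataset is to pseudonymise direct identifiers. The sketch below is minimal and illustrative: the key and record are hypothetical, and keyed hashing alone is not full de-identification, so real programs still need the governance controls described above.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this would live in a vault
# and be rotated under the organisation's key-management policy.
SECRET_KEY = b"rotate-and-store-this-in-a-vault"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a keyed, irreversible token.
    The same input always maps to the same token, so records can still
    be joined across shared datasets without exposing the raw value."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# A hypothetical record prepared for sharing: the email is tokenised,
# the non-identifying attribute is passed through unchanged.
record = {"email": "jane@example.com", "basket_total": 84.50}
shared = {**record, "email": pseudonymise(record["email"])}
```

Because the token is deterministic, two datasets pseudonymised with the same key can still be joined on the tokenised column, which is what makes the approach useful in shared data environments.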

Related IBRS Advisory

  1. Beyond privacy to trust: The need for enterprise data ethics
  2. Three ways to turn employee engagement results into actionable and achievable plans
  3. Data loss by the back door, slipping away unnoticed
  4. How Australia must use the PageUp data breach to become stronger - AFR - 18th June 2018

Artificial intelligence (AI) is an emerging technology that can be applied across business lines and yield significant results when aligned with business priorities. Assessing the AI maturity of your organisation can assist in providing AI roadmaps and aid in developing strategies and business cases.

The purpose of this presentation kit is to provide an AI maturity model in the analytics space. The proposed maturity model can be applied to any type of industry. Log in and click the PDF above to download the 'Analytics Artificial Intelligence Maturity Model' presentation kit and discover:

  • An IBRS AI maturity model that provides the foundation to apply the existing AI technology where it matters to the business
  • Guidelines to evolve into the future, whereby only limited data is available to make informed decisions
  • Next steps for your organisation

Regardless of their digital strategies, many organisations have not been positioned to properly leverage the digital and data assets that are available to them. A Chief Data Officer (CDO) role can improve this situation by advancing an organisation’s data portfolio, curating and making appropriate data visible and actionable.

Log in and click the PDF link above to download 'The New CDO Agenda' presentation kit and discover:

  • 4 pivotal points of the CDO agenda
  • A sample framework on how to understand the ownership of a data domain
  • Next steps for your organisation

Too often, information communications technology (ICT) and business analytics groups focus on business intelligence and analytics architectures and do not explore the organisational behaviours that are required to take full advantage of such solutions. There is a growing recognition that data literacy (a subset of digital workforce maturity) is just as important, if not more important, than the solutions being deployed. This is especially true for organisations embracing self-service analytics.

The trend is to give self-service analytics platforms to management that are making critical business decisions. However, this trend also requires managers to be trained in not just the tools and platforms, but in understanding how to ask meaningful questions, select appropriate data (avoiding bias and cherry-picking), and how to apply the principles of scientific thinking to analysis.
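The cherry-picking risk can be made concrete with a toy example: the same dataset supports opposite headlines depending on which slice a manager selects. The figures below are invented for illustration.

```python
from statistics import mean

# Monthly change in a customer satisfaction score (invented data).
monthly_change = [-2, -1, -3, 4, 5, -2, -1, -4, -2, 3, -1, -2]

# Cherry-picked view: quote only the months that improved.
improved = [x for x in monthly_change if x > 0]

full_view = mean(monthly_change)   # the whole year: satisfaction fell
cherry_picked = mean(improved)     # "satisfaction is up 4 points!"

print(full_view, cherry_picked)
# -0.5 4
```

Both numbers are arithmetically correct; only the first answers the question actually being asked, which is why training in data selection matters as much as training in the tools.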

Download the PDF now.



The Latest

26 June 2021: Zoho briefed IBRS on Zoho DataPrep, its new business-user-focused data preparation tool, which is being included in the existing Zoho Analytics product as well as being available separately as a tool to clean, transform and migrate data. DataPrep is in beta and will be officially launched on 13 July 2021.

Why it’s Important

Traditionally, cleaning and transforming data for use in analytics platforms has involved scripting and complex ETL (extract, transform and load) processes. This was a barrier to business stakeholders taking advantage of analytics. However, several analytics vendors (most notably Microsoft, Tableau, Qlik, Snowflake and Domo) have built powerful, drag-and-drop, low-code ETL into their products.
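What the drag-and-drop tools wrap in a visual interface is, underneath, the familiar extract-transform-load sequence. A hand-rolled sketch for contrast, with data and cleaning rules invented for illustration:

```python
# Extract: raw rows as they might arrive from a source system.
raw = [
    {"customer": " Acme Pty Ltd ", "spend": "1,200"},
    {"customer": "Beta Co", "spend": "300"},
    {"customer": "", "spend": "50"},   # missing name: reject this row
]

def clean(row):
    """Transform: trim whitespace, normalise numerics, drop bad rows."""
    name = row["customer"].strip()
    if not name:
        return None
    return {"customer": name, "spend": float(row["spend"].replace(",", ""))}

# Load: in a real pipeline this step would write to a warehouse table;
# here the cleaned rows are simply collected in a list.
warehouse = [r for r in (clean(row) for row in raw) if r is not None]
```

Low-code tools let business users express exactly these steps (trim, convert, reject) as visual rules instead of code, which is the barrier Zoho DataPrep is removing.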

Zoho, which is better known for its CRM, has an existing data analytics platform with Cloud storage, visualisations, reports and dashboards. While the product is not as sophisticated as its top-drawer rivals, it can be considered ‘good enough’ for many business users’ needs. Most significantly, Zoho Analytics benefits from attractive licensing, including the ability to share reports and interactive dashboards both within an organisation and externally.

However, Zoho Analytics lacked a business-user-friendly, low-code ETL environment, instead relying on SQL scripting. Zoho DataPrep fills this gap by providing a dedicated, AI-enabled platform for extracting data from a variety of sources and applying data cleaning and transformations, with the results pushed into another database, a data warehouse or Zoho Analytics.

All existing Zoho Analytics clients will receive Zoho DataPrep with no change to licensing.

However, what is interesting here is Zoho’s decision to offer its DataPrep platform independently of its Analytics platform. This allows business stakeholders to use the platform to solve migration and data cleaning problems, not just analytics.

IBRS’s initial tests of Zoho DataPrep suggest that it has some way to go before it can compete with the ready-made integration capabilities of Tableau, Power BI, Qlik and others. In addition, it offers less complex ETL than its better-established rivals. But that may not be an issue for organisations where staff have limited data literacy maturity, or where analytics requirements are relatively straightforward.

Who’s impacted

  • CIO
  • Development team leads
  • Business analysts

What’s Next?

The bigger takeaway from Zoho’s announcement is that ETL, along with all other aspects of business intelligence and analytics, will become low-code, business-user friendly and Cloud-resident. ICT departments seeking to create ‘best of breed’ business intelligence architectures that demand highly specialised skills will simply be bypassed, due to their lack of agility. While there will be a role for highly skilled statisticians, data scientists and machine learning professionals, the days of needing ICT staff who specialise in specific reporting and data warehousing products are passing.

Related IBRS Advisory

  1. Snowflake Gets PROTECTED Status Security Tick by Aussie Auditor
  2. IBRSiQ: Power BI vs Tableau
  3. Business-First Data Analytics
  4. AWS Accelerates Cloud Analytics with Custom Hardware
  5. IBRSiQ AIS and Power BI Initiatives
  6. Trends in Data Catalogues
  7. When Does Power BI Deliver Power to the People?
  8. Staff need data literacy – Here’s how to help them get it

The Latest

26 May 2021: Google has introduced Datastream, which the vendor defines as a “change data capture and replication service”. In short, the service allows changes in one data source to be replicated to other data sources in near real time. The service currently connects with Oracle and MySQL databases and a slew of Google Cloud services, including BigQuery, Cloud SQL, Cloud Storage and Spanner.

Uses for such a service include updating a data lake or similar repository with data being added to a production database, keeping disparate databases of different types in sync, and consolidating global organisational information back to a central repository.

Datastream is based on Cloud functions - or serverless - architecture. This is significant, as it allows for scale-independent integration.
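Change data capture of this kind can be pictured as applying an ordered stream of change events from a source database to a replica. The toy sketch below illustrates the idea only; it is not the Datastream API, and the event shape is invented for the example.

```python
# Toy change-data-capture: apply an ordered stream of change events
# to a replica keyed by primary key. Datastream does this at scale
# over the network; the event format here is illustrative only.
replica = {}

def apply_event(store, event):
    """Apply one insert/update/delete event to the replica."""
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        store[key] = event["row"]
    elif op == "delete":
        store.pop(key, None)

events = [
    {"op": "insert", "key": 1, "row": {"name": "Ada"}},
    {"op": "update", "key": 1, "row": {"name": "Ada L."}},
    {"op": "insert", "key": 2, "row": {"name": "Grace"}},
    {"op": "delete", "key": 2},
]

for e in events:
    apply_event(replica, e)

# replica now mirrors the source: {1: {"name": "Ada L."}}
```

Because each event is small and independent, this apply step maps naturally onto serverless functions: one invocation per event (or per batch), with no capacity to provision, which is what makes the architecture scale-independent.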

Why it’s Important

Ingesting data at scale into Cloud-based data lakes is a challenge and can be costly. Even simple ingestion, where data requires little in the way of transformation, can be costly when run through a full ETL service. By leveraging serverless functions, Datastream has the potential to significantly lower the cost and improve the performance of bringing large volumes of rapidly changing data into a data lake (or an SQL database being used as a pseudo data lake).

Using serverless to improve the performance and economics of large-scale data ingestion is not a new approach. In 2017, IBRS interviewed the architect of a major global streaming service about how the service moved from an integration platform to AWS Kinesis data pipelines and hand-coded serverless functions, to achieve much the same thing that Google Datastream now provides.

As organisations migrate to Cloud analytics, the need to rapidly replicate large data sets will grow. Serverless architecture will emerge as an important pattern.

Who’s impacted

  • Analytics architecture leads
  • Integration teams
  • Enterprise architecture teams

What’s Next?

Become familiar with the potential to use serverless / Cloud functions as ‘glue’ within your organisation’s Cloud architecture.

Look for opportunities to leverage serverless when designing your organisation’s next analytics platform.

Related IBRS Advisory

  1. Serverless Programming: Should your software development teams be exploring it?
  2. VENDORiQ: Google introduces Database Migration Service



Conclusion: Regardless of their digital strategies, many organisations have not been positioned to properly leverage the digital and data assets that are available to them. A Chief Data Officer (CDO) role can improve this situation by advancing an organisation’s data portfolio, curating and making appropriate data visible and actionable.

The CDO position is appropriate for all larger organisations, and small-to-large organisations focused on data-driven decision-making and innovation. These organisations benefit from a point person overseeing data management, data quality, and data strategy. CDOs are also responsible for developing a culture that supports data analytics and business intelligence, and the process of drawing valuable insights from data. In summary, they are responsible for improving data literacy within the organisation.

Conclusion: Most organisations have vast pools of data (a.k.a. information assets) lying underutilised, as many IT and business professionals are unsure where it is stored and are unaware of its value. To turn the situation around, organisations must strive for data mastery, which is the ability to embed data into products and services to increase efficiency, revenue growth and customer engagement.

The Latest

19 Nov 2020: During its annual summit, Snowflake announced a series of new capabilities: a development environment called Snowpark, support for unstructured media, row-level security for improved data governance, and a data marketplace.

Why it’s Important

Of Snowflake’s recent announcements, Snowpark clearly reveals the vendor’s strategy to leverage its Cloud analytics platform to enable the development of data-intensive applications. Snowpark allows developers to write applications in their preferred languages to access information in the Snowflake data platform.

This represents an inversion of how business intelligence / analytics teams have traditionally viewed the role of a data warehouse. The rise of data warehouses was driven by limitations in computing performance: heavy analytical workloads were shifted to a dedicated platform so that application performance would not be impacted by limits of database, storage and compute power. With Cloud-native data platform architectures that remove these limitations, it is now possible to leverage the data warehouse (or at least, the analogue of what the data warehouse has become) to service applications.

Who’s impacted

  • Development teams
  • Business intelligence / analytics architects

What’s Next?

Snowflake's strategy is evidence of a seismic shift in data analytics architecture. Along with Domo, AWS, Microsoft Azure, Google and other Cloud-based data platforms that take advantage of highly scalable, federated architectures, Snowflake is empowering a flip in how data can be leveraged. To take advantage of this flip, organisations should rethink the structure and roles within BI / analytics teams. IBRS has noted that many organisations continue to invest heavily in building their BI / analytics architecture with individual best-of-breed solutions (storage, databases, warehouse, analytics tools, etc), while placing less focus on the data scientists and business domain experts. With access to elastic Cloud platforms, organisations can reverse this focus - putting the business specialists and data scientists in the lead. 

Related IBRS Advisory

  1. Workforce transformation: The four operating models of business intelligence
  2. Key lessons from the executive roundtable on data, analytics and business value

Conclusion: ‘Voice of the customer’ (VoC) programs often involve the collection and analysis of data through feedback, research and analytics. This can provide an organisation with a strong view of customer desires, pain points, improvement opportunities and new product opportunities. However, this approach does not provide insight into whether these desires, pain points and ideas are shared by your employees. It also does not tell you whether these ideas are easy to implement or if they are achievable. In part, these are the reasons why only 24% of large firms think they are good at making changes to the business based on insights captured through their VoC programs.

Many organisations invest in employee engagement programs and initiatives without realising the full benefit (i.e. action) of this investment. This paper explores how, by capturing the voice of your staff as a component of your VoC program, organisations can increase the practical value of insights collected, expedite the road to implementation and focus on targeted, achievable action.

Conclusion: A common pitfall experienced by service-orientated organisations is the disconnect between its digital efforts and its marketing program. In good practice, marketing efforts should underpin your digital strategy. This can be achieved by unifying marketing’s focus on customer and staff engagement, communications and promotion with the leveraging of digital channels to conduct these activities.

Conclusion: As Australia’s use of consultancy services continues to grow, so too does the need for businesses to obtain value from these engagements quickly and effectively. Key to obtaining this value is the organisation’s ability to easily and rapidly provide consultants and contractors with the specific context of your business, your customers and your unique challenges.

By providing the organisational context quickly, you can mitigate time, scope and budget creep, improve the quality of outputs developed by consultants and ensure that consequent plans are actionable and genuinely valuable for your business.

However, the ability to provide the needed organisational context quickly and effectively to consultants remains a common organisational challenge, and therefore a pitfall for successful vendor engagement. This paper covers how you can overcome this pitfall.

Conclusion: The IT organisation in most enterprises suffers from the “Cobbler’s Children” syndrome – they give great advice but do not practise what they preach. A prime example is when IT does not apply Enterprise Architecture approaches and capabilities to the business of IT itself, yet expects other departments to apply such principles. Sadly, a new deficiency is emerging as the role of analytics is increasingly democratised across the business: a lack of data analytics capability within IT itself.

As organisations embrace data science, artificial intelligence and machine learning to generate increasingly sophisticated insights for performance improvement, IT must not let itself be left behind. This means ensuring that within a contemporary IT-as-a-Service operating model, space is created for the role of IT Data Analyst. This should be an inward-facing function with primary responsibility for the generation and curation of the IT organisation’s own core information assets in the form of data relating to the portfolio of IT assets, services and initiatives, including curation of operating data from Cloud providers and other partners.

Conclusion: While current artificial intelligence (AI) initiatives are data-driven, there are instances where the available data is insufficient to predict the future. For example, answering the following questions might be challenging if the available data is only historical in nature and irrelevant for forecasting purposes:

  • Q1: What will be the effect on sales if the price is increased by 10% as of the next quarter?
  • Q2: What would have happened to sales had we increased the price by 10% six months ago?

The purpose of this note is to provide a framework that can be used to derive sales principles to answer the above questions. The same approach can be used to derive principles for other business processes such as procurement, customer service and client complaints tracking.
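As an illustration of how Q1 might be framed, one simple approach is a constant-elasticity demand model. The elasticity value below is a hypothetical input that would in practice be estimated from experiments or historical price variation, which is precisely the data gap the note describes.

```python
# A constant-elasticity demand model as one simple framing for Q1.
# The elasticity of -1.5 is a hypothetical assumption, not an
# estimate; deriving a defensible value is the hard part in practice.
def projected_sales(current_units: float, price_change: float,
                    elasticity: float = -1.5) -> float:
    """Project unit sales after a relative price change, assuming
    constant price elasticity of demand."""
    return current_units * (1 + price_change) ** elasticity

# Q1: a 10% price rise applied to 10,000 units per quarter.
units = projected_sales(10_000, 0.10)   # roughly 8,668 units
```

Q2 is the harder, counterfactual question: answering it requires causal assumptions (or a past experiment), not just a model fitted to historical sales.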

Related Articles:

“Analytics artificial intelligence maturity model”, IBRS, 3 December 2018


Conclusion: Artificial intelligence technologies are available in various places such as robotic process automation (RPA), virtual agents and analytics. The purpose of this paper is to provide an AI maturity model in the analytics space. The proposed maturity model can be applied to any type of industry. It provides a roadmap to help improve business performance in the following areas:

  • Running the business (RTB): Provides executives with sufficient information to make informed decisions about running the business and staying competitive.
  • Growing the business (GTB): Provides information about growing the business in various geographies without changing the current services and products.
  • Transforming the business (TTB): Provides information to develop and release new products and services ahead of competitors.

Conclusion: Automation is understood to facilitate repetitive but essentially simple tasks. In conjunction with general purpose machine intelligence, virtual personal assistants and technologies leveraging artificial intelligence, automation will expand into more operational roles.

As the technologies improve, the potential applications will expand and play a larger marketing role.

Conclusion: The return on investment in big data and associated analytics projects has been generally positive, and returns are likely to grow over the longer term, provided strategic aims are established. The promise of big data hinges on information analysis, and therefore organisations must be clear as to the use and application of the insight.

Many IT organisations are trying to change their perceived image from high-cost/low-quality to value-added service providers. However, many of the adopted approaches revolve around improving just a few processes (e.g. problem management). While these processes are important, they are insufficient for IT groups to deliver value-added services.

In this IBRS Master Advisory Presentation (MAP), IBRS outlines the high-level issues surrounding running IT as a service, from both business and technology viewpoints. This MAP is designed to guide and stimulate discussions between business and technology groups and point the way for more detailed activity. It also provides links to further reading to support these follow-up activities.

The MAP is provided as a set of presentation slides, and as a script and executive briefing document.

Conclusion: Machines are becoming increasingly proficient at tasks that, in the past, required human intelligence. Virtually all human domain expertise can be encoded in digital data with the right knowledge engineering tools. The bottleneck in understanding between humans and software is shaped by the ambiguities inherent in human communication, not by the challenge of developing machine intelligence. To benefit from big data, organisations need to articulate knowledge in the language of data, i.e. in a format that is not only understandable by humans but also actionable by machines.

Data scientists are in hot demand. In December 2012 the Harvard Business Review featured an article titled “Data Scientist: The Sexiest Job of the 21st Century”. International online job boards and LinkedIn have many thousands of openings asking for big data skills, and a growing number of openings for data scientists. What is all the hype about?

Conclusion: Government agencies are slow in implementing open public sector information in line with freedom of information requirements. Agencies are challenged in terms of awareness of related government policies, in terms of cross-disciplinary collaboration, and in terms of obtaining funding for open data initiatives. The implications are not limited to government, but also affect the ability of Australian businesses to develop innovative products that derive value from Big Data in the public domain.

The topic of Big Data has been propelled from the engine room of the Web 2.0 giants into the mainstream press. Over the last decade, the volume of data that governments and financial institutions collect from citizens has been eclipsed by the data produced by individuals in terms of photos, videos, messages, as well as geolocation data on online social platforms and mobile phones, and also the data produced by large-scale networks of sensors that monitor traffic, weather, and industrial systems.

IBRS has always recognised data as the key to value creation, and has built up an extensive body of research on the latest trends and the shift from enterprise data to “big data” that is currently unfolding. This white paper addresses the scale and the business implications of this shift.

Many organisations are seeing growing demand and discussion around mobility and mobile applications, in particular in the Networks Group. In theory, mobility can enable significant business innovation and optimisation of business processes. However, few organisations have been able to clarify the benefits of mobility in terms that are aligned to their organisational goals and vision statements. This challenge is exacerbated by the rapid innovation and changes underway in the mobility market.

What is needed to address these problems is a consistent, repeatable process that embeds mobility into the organisation’s overall IT strategy. At the same time, mobility needs to be treated slightly differently to many traditional projects of work, as most mobility initiatives are smaller, with shorter delivery times, than large system deployments, but are often intimately interconnected with, and enabled by, the traditional larger back-end systems.

To meet this challenge, IBRS developed its Mobility Strategy Methodology, which provides a formal framework and process.

Conclusion: The maturity of information management practices in an organisation has a direct effect on the ability to achieve business goals related to supply chain optimisation, the quality of financial decisions, productivity, and quality of service. The exponential growth of unstructured information is no replacement for structured information. Quite the opposite: a stream of unstructured Big Data can only be turned into tangible value once it is channelled through a distillery that extracts highly structured information accessible to human decision makers, and that can be used to provide a service to the public or to drive a commercial business model. The transformation of unstructured data into knowledge and actionable insights involves several stages of distillation, the quality of which determine the overall performance of the organisation.

Conclusion: Effective data science requires a cross-disciplinary team of highly skilled experts, as well as data in sufficient quantity and quality. These requirements imply a level of maturity in information management that is beyond the capability of most organisations today. An information management maturity assessment can help determine whether an organisation is ready to embark on a big data initiative, and to identify any concrete deficits that need to be addressed.

Conclusion: There are many links between the story of data warehousing and the story of SAP adoption, going all the way back to 1997, when SAP started developing a “Reporting Server”. Over the following decade SAP firmed up its dominant position as a provider of Enterprise Resource Planning functionality, creating countless business intelligence initiatives in the wake of SAP ERP implementation projects. Up to 80% of data warehouses have become white elephants, some completely abandoned, and others have been subjected to one or more resuscitation attempts. Big data can either be the last nail in the coffin, or it can be the vaccine that turns the colour of the data warehousing elephant into a healthy grey.

Conclusion: Big data not only refers to the growing amounts of netizen-generated online data, it also refers to customer expectations related to the data services provided by corporations and government departments. Increasingly, corporate and individual service users expect not only a basic service, but also access to advanced tooling for data transformation, representation, and integration into other systems. In the future, the level of maturity and professionalism of an organisation will increasingly be determined by data-related quality of service characteristics. It is time for organisations to grow up, and to treat information services as a core product line.

Conclusion: Location, or geospatial, information is a central but significantly under-utilised element of the volume of data created and leveraged by organisations. Too often, location information is simply presented and leveraged as text, e.g. an address. But location information is not just about where an asset or activity is located; rather, it is about where it is located in relation to other assets or activities. That relationship is best presented visually.
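As a small illustration of location as relationship rather than text, the standard haversine formula computes the great-circle distance between two coordinate pairs. The Sydney and Melbourne CBD coordinates below are approximate and used only as an example:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points,
    using a mean Earth radius of 6371 km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Approximate CBD coordinates: Sydney vs. Melbourne.
d = haversine_km(-33.8688, 151.2093, -37.8136, 144.9631)
```

Computing such pairwise relationships is the first step; plotting them on a map, as the conclusion argues, is what makes them intelligible.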

Conclusion: The implementation of, and ongoing investment in, Business Intelligence (BI) solutions have regularly failed to meet organisational expectations, both in terms of business outcomes and cost1. Further highlighting this, Gartner’s most recent predictions for 2012 suggest that by 2015, more than 85% of Fortune 500 companies will have failed to effectively exploit Big Data (and by extension BI) to achieve competitive advantage2. Consequently, establishing a BI Competency Centre is relevant for many organisations that wish to improve the outcomes of BI. Such a centre of excellence can align resources, focus capabilities, and ensure that lessons from projects and processes are shared across the organisation.

Conclusion: Over the last decade, the volume of data that governments and private corporations collect from citizens has been eclipsed by the data produced by individuals, as photos, videos, and messages on online social platforms, and also the data produced by large scale networks of sensors that monitor traffic, weather, and industrial systems. Web users are increasingly recognising the risks of handing over data-mining rights to a very small group of organisations, whilst getting very little in return. The pressure is on to develop robust solutions that not only deliver value, but also address concerns about data ownership, privacy, and the threat of data theft and abuse.

Conclusion: Adding analytics is essential to any social media strategic initiative, whether it is well organised or merely experimental. Without analytics, an organisation is blind to market interaction and therefore cannot understand how, or whether, to modify its tactics. However, avoid simply trusting the data alone to provide the answers and set directions. Gaining the most benefit from such analytics tools requires skills in interpretation and analysis, and judgement about when to implement actions or revisions.

Conclusion: Organisations are drowning in complexity and information overload. At the same time, saving costs is at the top of the agenda. The only realistic path forward lies in tackling complexity head-on by deploying analytical techniques that help identify spurious complexity and confirm intrinsic complexity. Subsequently, spurious complexity can be removed by surgical intervention, one step at a time.

Conclusion: Web analytics tools are so pervasive and widely used that it hardly seems necessary to consider their capabilities and implementation. Yet businesses and other organisations may under-use their Web analytics software, in which case they are not obtaining the value they expected.

The evidence from both measured and anecdotal sources is that organisations that achieve the greatest gains through Web analytics have used a process to select the right tool for their needs, then integrated it well, and trained their staff to use the system to segment visitors, understand their engagement, and quantify the effectiveness of the website.
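The segmentation and engagement measurement described above can be sketched as follows. The visit records, with their `referrer`, `pages` and `converted` fields, are hypothetical; real analytics tools expose far richer data, but the aggregation logic is the same in principle:

```python
from statistics import mean

def engagement_by_segment(visits):
    """Group visit records by referrer segment, then report average
    pages per visit and conversion rate for each segment."""
    segments = {}
    for v in visits:
        segments.setdefault(v["referrer"], []).append(v)
    return {
        seg: {
            "avg_pages": mean(v["pages"] for v in vs),
            "conversion_rate": sum(v["converted"] for v in vs) / len(vs),
        }
        for seg, vs in segments.items()
    }

report = engagement_by_segment([
    {"referrer": "search", "pages": 5, "converted": True},
    {"referrer": "search", "pages": 3, "converted": False},
    {"referrer": "social", "pages": 2, "converted": False},
])
```

Comparing segments in this way is what lets an organisation quantify which channels drive engaged visitors rather than raw traffic.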

Conclusion: In our experience many Business Intelligence (BI) initiatives end up well short of their original objectives. But all is not lost. Sometimes it helps to learn from the experiences of others. For those:

  • Intending to embark on a new BI initiative
  • Working to remediate a BI strategy that has lost momentum.

This research paper examines some case studies and examples which reference breakthrough approaches and reflect the sometimes arduous travails involved in dealing with the many challenges presented by BI projects.

Conclusion: To gain insight into C-level executive intentions with information management, Accenture carried out a global survey2 in 2007. Whilst the majority of respondents had well-developed views on the power of Business Intelligence (BI) as a strategic differentiator, the report unearthed an underlying frustration in achieving their vision of an organisation-wide BI capability. This echoes our experiences in the ANZ market in which we observe many CIOs struggling to bring their complete BI visions to reality.

Conclusion: In 2008, corporate databases reached unprecedented sizes. Yet despite the abundance and diversity of data, many organisations remain challenged by Business Intelligence (BI) initiatives. They buy on vendor promise, but many have difficulty fulfilling it. Against this backdrop, and in a confusing post-acquisition market, BI vendors continue to release increasingly sophisticated and capable products.

Conclusion: Too often, corporate decision making is not a rational and well-structured process. The team charged with making the decision often accumulates a lot of information, probably biased by their own values, then goes into a room and emerges with a decision. This approach is a poor basis for making complex or important decisions.

Conclusion: Through various channels of the media, the news that the first wave of Baby Boomers is retiring implies some uncertainty. While it is true that people aged 60 are retiring, the actual numbers are quite small and the flow-on effect to the economy is not large – just yet.

Population, like the planet, is something accepted as a basic fact, but like the initiatives to reverse global warming and operate in a sustainable way, significant changes are happening to the composition of the population that alter sixty years of accepted facts.

Organisations cannot create a single strategy to deal with demography but the effects of demographic change must be catered for in the next decade. In the broadest terms, with fewer young people and more older people, different approaches to training and skills, working arrangements and communication with the market are likely. Organisations that have seen and planned ahead may not only find a competitive advantage but an easier transition to the changes that will ensue.

The worldwide recession in IT spending is, by most accounts, about to end. However, our discussions with technology buyers show that the demands to ‘squeeze more out of less’ are still common. With most IT budgets forecast to show percentage growth only in the single digit range, demonstrable ROI from new IT initiatives is essential.

Conclusion: Understanding the future would obviously give everyone a real competitive edge, or at the very least reduce wastage and effort in the wrong direction. Forecasting is intended to reduce risk, but the quality of forecasts is the key to getting something useful from them. That statement may seem simplistic, but many forecasts do not use standard methodologies, or even methodologies that are clear to an outside observer. For anyone using forecasts to build plans and investments, the forecast should meet two conditions:

  1. Use a clear and transparent methodology, with data that is verifiable and from known sources; and

  2. Contain more than one outcome, because a range of possible outcomes within the confines of the forecast may be more realistic, given the variable forces operating in a market.

Unless a forecast meets the two conditions outlined above, what ought to be a powerful instrument with which to organise strategy is just a scrap of paper.
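The second condition, a forecast model that contains more than one outcome, can be sketched as a simple scenario projection. The growth rates and scenario labels below are purely illustrative, not a real forecast:

```python
def forecast_range(base, growth_scenarios, years):
    """Project a base value under several named annual growth-rate
    scenarios, returning one trajectory per scenario rather than a
    single number."""
    return {
        name: [round(base * (1 + rate) ** y, 1) for y in range(1, years + 1)]
        for name, rate in growth_scenarios.items()
    }

outcomes = forecast_range(
    base=100.0,
    growth_scenarios={"pessimistic": 0.00, "expected": 0.03, "optimistic": 0.06},
    years=3,
)
```

Presenting the pessimistic-to-optimistic band, rather than only the "expected" line, makes the variable forces operating in a market explicit to whoever builds plans on the forecast.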

Conclusion: For medium-sized companies there is no opportunity to fail at the planning stage; otherwise they burn investment capital without even addressing the big ROI issues.

Whatever web analytics software is selected, it ought to be accountable against four key criteria:

  1. Enterprise Resources

  2. Capital Invested

  3. Human Capital

  4. Web Productivity

This is a progressive evaluation of a system, and is therefore robust enough to assess return on investment across many dimensions.
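One way to make such a multi-criteria evaluation concrete is a weighted scorecard over the four criteria listed above. The scores and weights below are purely illustrative; each organisation would set weights to reflect its own priorities:

```python
def scorecard(scores, weights):
    """Combine per-criterion scores (0-10) into a single weighted
    rating. Weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[c] * w for c, w in weights.items())

rating = scorecard(
    scores={"enterprise_resources": 7, "capital_invested": 5,
            "human_capital": 6, "web_productivity": 8},
    weights={"enterprise_resources": 0.2, "capital_invested": 0.3,
             "human_capital": 0.2, "web_productivity": 0.3},
)
```

Scoring each criterion separately before combining them preserves the "many dimensions" of the evaluation, so a weak area (say, human capital) is visible rather than hidden inside a single ROI figure.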