Sam Higgins is an IBRS Advisor with over 20 years of tactical and strategic experience in applying information and communications technology (ICT) to achieve business outcomes in large, complex organisations. Through previous roles as a leading ICT executive, strategist, architect, industry analyst, program consultant and advisor, Sam has developed extensive knowledge of key markets including as-a-Service (Cloud) computing, enterprise architecture (including service-orientation and information management), enterprise applications and development, and business intelligence, along with ICT management and governance practices such as ICT planning, strategic sourcing, and portfolio and project management. Sam’s knowledge of service-oriented architecture and associated business models is widely recognised; he was a contributing author on Paul Allen’s book Service-orientation: Winning Strategies and Best Practices, published in 2006 by Cambridge University Press. As the former Research Director for Longhaus, he undertook the first in-depth research into the implications of cloud computing and other “as-a-Service” ICT offerings on the Australian and near-shore markets. The 2010 report, Defining cloud computing highlights provider gaps in the Australian ICT market, was widely reported in both the online ICT industry press and mainstream media.
- IBRS iQ
31 December 2017
Conclusion: Organisations seeking to ride the new wave of AI-enabled transformation are facing a clear choice when it comes to the adoption of supporting AI capabilities such as machine learning or speech recognition, either:
- DIY (Do It Yourself) – By adopting AI early as stand-alone services; or
- MODIFY (Make Others Do It For You) – By waiting for AI functionality to be embedded in existing solutions.
Deciding which path to take requires that organisations reflect honestly on their current solution-building maturity. Only organisations that can demonstrate full development lifecycle capabilities, supported by contemporary development tools and frameworks, should expect anything more than proof-of-concept success with DIY approaches to AI solutions.
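As a rough illustration of the DIY path, the sketch below implements a tiny k-nearest-neighbour classifier in plain Python. It is a toy stand-in, not an IBRS example, for the kind of stand-alone machine learning capability a team with full lifecycle skills would build and operate itself.

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote of its k nearest labelled points.

    `train` is a list of ((x, y), label) tuples; distance is squared
    Euclidean. A toy stand-in for a stand-alone ML capability.
    """
    by_distance = sorted(
        train,
        key=lambda item: (item[0][0] - query[0]) ** 2
                       + (item[0][1] - query[1]) ** 2,
    )
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Labelled examples: two small clusters.
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]

print(knn_predict(train, (0.5, 0.5)))  # a query near the first cluster
```

Even a toy like this hints at why maturity matters: the real effort sits in data preparation, evaluation and ongoing retraining, not the model call itself.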
05 July 2018
Conclusion: In seeking to achieve their vision, goals and objectives, organisations constantly evaluate internal and external factors in order to take action. Although tuned to the unique needs of each enterprise, there have been identifiable waves of factors and responding actions that have occurred since 2000 in the form of business and digital transformation.
Business transformation addressed the changing nature of markets in a connected and globalised world by focusing on delivering cost savings through new models of operation, while the subsequent wave of digital transformation sought to employ technology and exploit pervasive connectivity to increase the efficiency of internal processes and customer-facing interactions.
IBRS has identified a new wave we call “artificial intelligence-enabled (AI-enabled) transformation”, which is focused on optimising business operations through the use of emerging technologies that leverage “self-learning” algorithms to make predictions, respond to real-world objects and events, and possess user interfaces that mimic how humans communicate.
However, in order to successfully exploit this new wave of transformation, organisations must first understand what exactly AI is and how AI-enabled transformation differs from the waves that have come before it.
- Governance & Planning
01 June 2018
Conclusion: After years of tactical software deployments in response to urgent digital transformation uplifts, organisations have created a jungle of business intelligence (BI) technologies deployed in the absence of a well-described, comprehensive approach to the challenges faced; challenges that will only increase with the shift to AI-enabled transformation.
Instead, the majority of solution paradigms have centred on the application of emerging technologies, with little articulation of a coherent architecture traceable to the underlying functional and non-functional requirements needed to support a well-governed, long-lived data analytics platform. As a result, each new trend in reporting and analytics, e.g. big data, produces a litany of partial solutions.
Enter Data Vault 2.0 (DV2.0), the first well-described architecture, methodology and modelling approach to emerge from the BI community in the last five years. DV2.0 provides a solid basis for organisations wishing to avoid the data sins of the past, and its adoption should be a top consideration for the inevitable expansion of BI that flows from business application transformation, and as part of a clear DataOps strategy.
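For readers unfamiliar with the Data Vault style, the sketch below shows the flavour of its core constructs: a hub carries the immutable business key (with a derived hash key, as DV2.0 favours), while a satellite carries descriptive attributes and their load history. Table and column names are illustrative assumptions, not a canonical DV2.0 schema.

```python
import hashlib
import sqlite3

def hash_key(*business_keys):
    # DV2.0 commonly derives surrogate hash keys from the business key(s).
    return hashlib.md5("||".join(business_keys).encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (          -- one row per unique business key
    customer_hk   TEXT PRIMARY KEY,
    customer_bk   TEXT NOT NULL,
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL
);
CREATE TABLE sat_customer_details (  -- descriptive attributes; history via load_date
    customer_hk   TEXT NOT NULL,
    load_date     TEXT NOT NULL,
    name          TEXT,
    city          TEXT,
    record_source TEXT NOT NULL,
    PRIMARY KEY (customer_hk, load_date)
);
""")

hk = hash_key("CUST-001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)",
             (hk, "CUST-001", "2018-06-01", "crm"))
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?, ?)",
             (hk, "2018-06-01", "Acme Pty Ltd", "Brisbane", "crm"))

row = conn.execute("""
    SELECT h.customer_bk, s.name
    FROM hub_customer h JOIN sat_customer_details s USING (customer_hk)
""").fetchone()
print(row)
```

Separating the stable business key from its changing attributes is what makes the model resilient to new sources: a new system simply adds rows, never restructures existing tables.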
05 May 2018
Conclusion: The development of AI-based solutions is heavily dependent on various types of data input in the form of either:
- Large data sets used to conduct experiments to develop models and algorithms for predictive analytics, optimisation and decision recommendations; or
- Enriched and tagged corpuses of images, audio, video and unstructured text used to train neural networks using deep learning techniques.
While AI-based solution development might initially leverage data scientists and their existing business intelligence platforms to exploit these types of data, the lifecycle management needs of AI developers will quickly expand beyond the boundary of the traditional enterprise data warehouse.
Therefore, like the source code and configuration data underpinning transactional business applications, the raw data and algorithms of AI solutions must be managed by evolving DevOps practices towards a comprehensive “DataOps” model.
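One concrete element of such a DataOps model is treating dataset snapshots like code commits. The following is a minimal sketch, assuming JSON-serialisable records; the registry and hash scheme are illustrative, not a reference to any specific tool.

```python
import hashlib
import json

def dataset_version(records):
    """Derive a deterministic content hash for a dataset snapshot.

    Treating raw training data like source code means every model run
    can be traced back to an immutable dataset version.
    """
    canonical = json.dumps(records, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

registry = {}  # version -> records, analogous to tagged commits

batch = [{"id": 1, "label": "churn"}, {"id": 2, "label": "stay"}]
v1 = dataset_version(batch)
registry[v1] = batch

# Re-hashing unchanged data yields the same version; any edit yields a new one.
assert dataset_version(batch) == v1
```

Recording this version identifier alongside the algorithm's source commit gives the reproducibility that DevOps already provides for application code.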
01 April 2018
Conclusion: Although online digital platforms are in ready supply, organisations remain unable to avoid the receipt of critical information in the form of paper documents or scanned images. Whether from government, suppliers or clients, organisations are faced with written correspondence, typed material, completed forms or signed documents that must be consumed. For a variety of reasons, it may be unreasonable or impractical to expect this information to be sent in machine-readable form.
However, machine-readable content from incoming information, both past and future, is emerging as a prerequisite to exploit artificial intelligence and machine learning as part of digital transformation. Therefore, organisations need to re-examine their data ingestion strategies and move proactively to the use of optical character recognition on incoming paper- and scanned image-based information.
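The routing step of such an ingestion strategy can be sketched very simply. The extensions and route names below are illustrative assumptions, and a production pipeline would inspect content (e.g. a PDF text layer) rather than trust file extensions.

```python
MACHINE_READABLE = {".xml", ".json", ".csv"}
NEEDS_OCR = {".tif", ".png", ".jpg", ".pdf"}  # a PDF may still carry a text layer

def ingestion_route(filename):
    """Route an incoming document: pass machine-readable formats straight
    to parsing, and queue image-based formats for optical character
    recognition. Unknown formats fall back to manual review."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext in MACHINE_READABLE:
        return "parse"
    if ext in NEEDS_OCR:
        return "ocr"
    return "manual-review"

print(ingestion_route("signed_form.tif"))
```

The point of a proactive strategy is that everything leaving the "ocr" queue lands in the same machine-readable store as born-digital content, ready for downstream machine learning.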
06 March 2018
Conclusion: Organisations continue to emphasise their competitive differentiation based on the data they hold, and the insights gained from analysing this valuable resource. The rate at which organisations are shifting from traditional process-based to insight-oriented differentiation is being further accelerated by the adoption of Cloud-based data analytics services.
The combined result is an increasing portion of enterprise project activity that can be classified as extract, transform and load (ETL).
Despite ETL being the mainstay of data integration for decades, the cost of specialised skills and significant manual effort expended on integrating disparate data sources is now coming into sharp focus. In response, organisations are rightly seeking lower-cost solutions for data integration.
Although ETL exists in the form of at least one tool in almost every enterprise, the cost of ETL as a proportion of data analytics projects means organisations must decrease reliance on traditional ETL tools in favour of automated solutions that exploit machine learning techniques to reduce the need for ETL developers.
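To make the idea concrete, the sketch below suggests source-to-target column mappings using simple string similarity from Python's standard library. It is a deliberately crude stand-in for the learned schema-matching that automated data-integration products apply, and all column names are invented.

```python
from difflib import get_close_matches

def suggest_mappings(source_cols, target_cols, cutoff=0.6):
    """Suggest source-to-target column mappings by string similarity.

    A trivial proxy for ML-based schema matching: for each source
    column, pick the most similar target column above `cutoff`.
    """
    targets = [t.lower() for t in target_cols]
    mappings = {}
    for col in source_cols:
        match = get_close_matches(col.lower(), targets, n=1, cutoff=cutoff)
        if match:
            mappings[col] = match[0]
    return mappings

print(suggest_mappings(
    ["cust_name", "CustEmail", "join_dt"],
    ["customer_name", "customer_email", "join_date"],
))
```

Even this naive heuristic removes a slice of hand-written mapping code; commercial tools extend the idea with models trained on data values and historical mappings, not just names.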
02 February 2018
- IBRS iQ
01 February 2018
Conclusion: The release of Amazon’s Echo in 2014 heralded the first of a series of “ambient” technologies. These new devices are unobtrusive, multi-purpose and capable of responding to conversational input through integration with virtual digital assistants (VDAs) such as Amazon’s Alexa.
A key enabler of these platforms is the ability to implement “skills” or expand the platform’s capability to interpret and respond with appropriate conversational content beyond the basic function of the device itself.
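In spirit, a skill is little more than a mapping from an interpreted intent to conversational content. The sketch below is loosely modelled on that pattern; the intent names and replies are invented, and a real platform such as Alexa delivers a structured request after performing the speech recognition and intent matching itself.

```python
# Hypothetical intent names and replies; a real VDA platform supplies a
# structured request once it has matched the user's utterance to an intent.
CONTENT = {
    "OpeningHoursIntent": "We are open 9am to 5pm, Monday to Friday.",
    "LocationIntent": "Our office is at 123 Example Street, Brisbane.",
}

def handle_request(intent_name, slots=None):
    """Return conversational content for an interpreted intent.

    The skill supplies only content and dialogue logic; the ambient
    device handles wake words, audio capture and text-to-speech.
    """
    reply = CONTENT.get(intent_name)
    if reply is None:
        return "Sorry, I can't help with that yet."
    return reply

print(handle_request("OpeningHoursIntent"))
```

Note that the replies here are hard-coded; the argument of this research note is precisely that such content should instead be drawn from the same managed source as every other channel.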
The consistency of information required by organisations under omni-channel delivery models, combined with under-resourced editorial teams, means organisations must prepare for conversational channels by transforming existing content sooner rather than later.
Failure to do so will see history repeat itself through short-term replication of content to support new channels only to have that content and channel functionality merged back into increasingly sophisticated content management platforms at significant cost.
04 January 2018
Conclusion: Technologists consistently underestimate the growth of data volumes. The result is tactical action aimed at increasing capacity, achieved by adding on-premises storage using traditional bulk storage solutions or by moving technical workloads, such as back-up or disaster recovery, to Cloud-based Storage-as-a-Service offerings. This reflects a decades-old mantra of “disk is cheap, buy more disk”.
When the lack of predictability of data volume growth is combined with the need to capture then distribute data from new sources as well as control the hidden cost of data movement across networks, these tactical responses fail to deliver transformational value to end users.
To deliver effective and efficient data storage solutions, IT infrastructure architects must collaborate with their information and data management colleagues to identify the demographics of the data being managed; they must then select storage solutions that optimise data capture, storage, distribution and access based on these characteristics, not simply on volume.
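One way to act on data demographics is a simple tiering policy driven by size, access frequency and recency. The thresholds and tier names below are illustrative assumptions, not IBRS guidance.

```python
def choose_tier(size_gb, reads_per_day, days_since_last_access):
    """Pick a storage tier from simple data 'demographics'.

    Hot, latency-sensitive data goes to fast storage; bulky but warm
    data to capacity-optimised storage; rarely touched data to archive.
    """
    if days_since_last_access > 180:
        return "archive"         # cold: cheapest per GB, slow retrieval
    if reads_per_day >= 100:
        return "ssd"             # hot: frequently accessed
    if size_gb > 1000:
        return "object-store"    # warm but bulky
    return "standard-disk"

for name, profile in {
    "transaction-db": (200, 5000, 0),
    "video-archive": (8000, 1, 400),
    "survey-exports": (50, 2, 30),
}.items():
    print(name, "->", choose_tier(*profile))
```

The inputs are exactly the characteristics that infrastructure teams cannot derive alone, which is why the collaboration with data management colleagues matters.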
03 December 2017