Sam Higgins is an IBRS Advisor with over 20 years of tactical and strategic experience in applying information and communications technology (ICT) to achieve business outcomes in large, complex organisations. Through previous roles as a leading ICT executive, strategist, architect, industry analyst, program consultant and advisor, Sam has developed extensive knowledge of key markets including as-a-service (cloud) computing, enterprise architecture (including service orientation and information management), enterprise applications and development, and business intelligence, along with ICT management and governance practices such as ICT planning, strategic sourcing, and portfolio and project management. Sam’s knowledge of service-oriented architecture and associated business models is widely recognised; he was a contributing author on Paul Allen’s book Service-orientation: Winning Strategies and Best Practices, released in 2006 by Cambridge University Press. As the former Research Director for Longhaus he undertook the first in-depth research into the implications of cloud computing and other “as-a-service” ICT offerings on the Australian and near-shore markets. The resulting 2010 report, Defining cloud computing highlights provider gaps in the Australian ICT market, was widely covered in both the online ICT industry press and mainstream media.
- IBRS iQ
31 December 2017
Conclusion: Years of tactical software deployments in response to urgent digital transformation uplifts have left organisations with a jungle of business intelligence (BI) technologies, deployed in the absence of a well-described and comprehensive approach to the challenges faced; challenges that will continue to increase with the shift to AI-enabled transformation.
Instead, the majority of solution paradigms have centred on the application of emerging technologies, with little articulation of a coherent architecture traceable to the underlying functional and non-functional requirements needed to support a well-governed and long-lived data analytics platform. As a result, each new trend in reporting and analytics, e.g. big data, produces a litany of partial solutions.
Enter Data Vault 2.0 (DV2.0), the first well-described architecture, methodology and modelling approach to emerge from the BI community in the last five years. DV2.0 provides a solid basis for organisations wishing to avoid the data sins of the past, and its adoption should be a top consideration for the inevitable expansion of BI that flows from business application transformation, and as part of a clear DataOps strategy.
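To make the modelling approach concrete, the sketch below shows the two core DV2.0 constructs, a hub (business key only) and a satellite (descriptive attributes with a hashdiff for change detection), using hash keys as DV2.0 prescribes. Table and column names are illustrative, not from any particular implementation.

```python
import hashlib
from datetime import datetime, timezone

def hash_key(*business_keys: str) -> str:
    """DV2.0-style surrogate hash key derived from the business key(s)."""
    joined = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def hub_row(business_key: str, record_source: str) -> dict:
    """A hub stores only the business key plus load metadata."""
    return {
        "hub_customer_hk": hash_key(business_key),
        "customer_bk": business_key,
        "load_date": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }

def satellite_row(business_key: str, attrs: dict, record_source: str) -> dict:
    """A satellite hangs descriptive attributes off the hub's hash key;
    the hashdiff makes change detection between loads a single compare."""
    hashdiff = hash_key(*(f"{k}={v}" for k, v in sorted(attrs.items())))
    return {
        "hub_customer_hk": hash_key(business_key),
        "hashdiff": hashdiff,
        **attrs,
        "load_date": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }
```

Because hubs never carry descriptive attributes, new sources and new attributes arrive as additional satellites rather than as changes to existing tables, which is what makes the model long-lived and well governed.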
05 May 2018
Conclusion: The development of AI-based solutions is heavily dependent on various types of data input in the form of either:
- Large data sets used to conduct experiments to develop models and algorithms for predictive analytics, optimisation and decision recommendations; or
- Enriched and tagged corpuses of images, audio, video and unstructured text used to train neural networks using deep learning techniques.
While at first the data management needs of AI-based solution development might leverage both data scientists and their existing business intelligence platforms to exploit these types of data, the actual lifecycle management needs of AI developers will expand quickly beyond the boundary of the traditional enterprise data warehouse.
Therefore, like the source code and configuration data underpinning transactional business applications, the raw data and algorithms of AI solutions must be managed by evolving DevOps practices towards a comprehensive “DataOps” model.
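One way to treat raw data like source code, as the DataOps model suggests, is to pin data sets by content hash so a training run can record exactly which bytes it consumed, much as a build records a git commit. A minimal sketch, with file names and the manifest format as assumptions:

```python
import hashlib
import json
from pathlib import Path

def dataset_version(path: Path, chunk_size: int = 1 << 20) -> str:
    """Content hash of a raw data file: identical bytes always yield the
    same version id, so a pipeline can pin data exactly as it pins code."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()[:12]

def pin_inputs(manifest_path: Path, data_files: list) -> dict:
    """Record the exact data versions a training run consumed."""
    manifest = {str(p): dataset_version(p) for p in data_files}
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest
```

Dedicated tools exist for this, but the principle is the same: the manifest is committed alongside the algorithm, so model outputs become reproducible from versioned inputs.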
01 April 2018
Conclusion: Although online digital platforms are in ready supply, organisations remain unable to avoid the receipt of critical information in the form of paper documents or scanned images. Whether from government, suppliers or clients, organisations are faced with written correspondence, typed material, completed forms or signed documents that must be consumed. For a variety of reasons, it may be unreasonable or impractical to expect this information to be sent in machine-readable form.
However, machine-readable content from incoming information, both past and future, is emerging as a prerequisite for exploiting artificial intelligence and machine learning as part of digital transformation. Therefore, organisations need to re-examine their data ingestion strategies and move proactively to the use of optical character recognition on incoming paper- and scanned image-based information.
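An ingestion strategy of this kind amounts to a routing decision: already machine-readable content passes straight through, while scanned images are sent to an OCR step first. The sketch below shows that routing shape only; the OCR engine itself (in practice something like Tesseract or a cloud OCR API) is passed in as a function, and the suffix lists are illustrative assumptions.

```python
from pathlib import Path
from typing import Callable

# Formats assumed to arrive machine-readable vs. those needing OCR first.
TEXT_SUFFIXES = {".txt", ".csv", ".json", ".xml"}
IMAGE_SUFFIXES = {".png", ".jpg", ".jpeg", ".tif", ".tiff"}

def ingest(path: Path, ocr: Callable[[Path], str]) -> str:
    """Route an incoming document: machine-readable text passes straight
    through, scanned images go via the supplied OCR function."""
    suffix = path.suffix.lower()
    if suffix in TEXT_SUFFIXES:
        return path.read_text(encoding="utf-8", errors="replace")
    if suffix in IMAGE_SUFFIXES:
        return ocr(path)
    raise ValueError(f"No ingestion rule for {suffix!r}")
```

Keeping the OCR engine behind a function boundary means the engine can be swapped (or re-run over historical scans at higher accuracy) without touching the ingestion pipeline.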
06 March 2018
Conclusion: Organisations continue to emphasise their competitive differentiation based on the data they hold, and the insights gained from analysing this valuable resource. The rate at which organisations are shifting from traditional process-based to insight-oriented differentiation is being further accelerated by the adoption of Cloud-based data analytics services.
The combined result is an increasing portion of enterprise project activity that can be classified as extract, transform and load (ETL).
Despite ETL being the mainstay of data integration for decades, the cost of specialised skills and significant manual effort expended on integrating disparate data sources is now coming into sharp focus. In response, organisations are rightly seeking lower-cost solutions for data integration.
Although almost every enterprise has at least one ETL tool in place, the cost of ETL as a proportion of data analytics projects means organisations must decrease their reliance on traditional ETL tools in favour of automated solutions that exploit machine learning techniques to reduce the need for ETL developers.
02 February 2018
- IBRS iQ
01 February 2018
Conclusion: The release of Amazon’s Echo in 2014 heralded the first of a series of “ambient” technologies. These new devices are unobtrusive, multi-purpose and capable of responding to conversational input through integration with virtual digital assistants (VDAs) such as Amazon’s Alexa.
A key enabler of these platforms is the ability to implement “skills” or expand the platform’s capability to interpret and respond with appropriate conversational content beyond the basic function of the device itself.
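The shape of such a “skill” is essentially an intent-to-handler mapping that returns conversational content. The sketch below is a vendor-neutral illustration of that shape; the class, intent and slot names are invented for the example and do not reflect any specific VDA platform’s API.

```python
from typing import Callable

class Skill:
    """Minimal sketch of a conversational skill: named intents map to
    handler functions that turn slot values into a spoken response."""

    def __init__(self, name: str):
        self.name = name
        self._handlers = {}

    def intent(self, intent_name: str):
        """Decorator registering a handler for an intent."""
        def register(fn: Callable[[dict], str]):
            self._handlers[intent_name] = fn
            return fn
        return register

    def handle(self, intent_name: str, slots: dict) -> str:
        handler = self._handlers.get(intent_name)
        if handler is None:
            return "Sorry, I can't help with that yet."
        return handler(slots)

opening_hours = Skill("OpeningHours")

@opening_hours.intent("GetHours")
def get_hours(slots: dict) -> str:
    branch = slots.get("branch", "our main office")
    return f"{branch} is open 9am to 5pm weekdays."
```

Note that the handler's response is content, which is exactly why the conversational channel puts pressure on content management: every answer a skill can give must be authored, approved and kept consistent with the other channels.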
The consistency of information required by organisations under omni-channel delivery models, combined with under-resourced editorial teams, means organisations must prepare for conversational channels by transforming existing content sooner rather than later.
Failure to do so will see history repeat itself through short-term replication of content to support new channels only to have that content and channel functionality merged back into increasingly sophisticated content management platforms at significant cost.
04 January 2018
Conclusion: Technologists consistently underestimate the growth of data volumes. The result is tactical actions aimed at increasing capacity, either by adding on-premises storage using traditional bulk storage solutions or by moving technical workloads, such as backup or disaster recovery, to Cloud-based Storage-as-a-Service offerings. This reflects a decades-old mantra of “disk is cheap, buy more disk”.
When the lack of predictability of data volume growth is combined with the need to capture then distribute data from new sources as well as control the hidden cost of data movement across networks, these tactical responses fail to deliver transformational value to end users.
To deliver effective and efficient data storage solutions, IT infrastructure architects must collaborate with their information and data management colleagues to identify the demographics of the data being managed; they must then select storage solutions that optimise data capture, storage, distribution and access based on these characteristics, not simply on volume.
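To show what "demographics, not volume" means in practice, the sketch below profiles a data set by access pattern and age, and only then recommends a tier. The profile fields, tier names and thresholds are illustrative assumptions, not a prescribed policy.

```python
from dataclasses import dataclass

@dataclass
class DataProfile:
    """Simplified demographics of a data set: how it is used and how
    fresh it is, not just how big it is."""
    name: str
    size_gb: float
    reads_per_day: float
    days_since_last_write: int

def recommend_tier(p: DataProfile) -> str:
    """Illustrative tiering policy: placement follows access pattern
    and age first; raw volume never drives the decision on its own."""
    if p.reads_per_day >= 100:
        return "high-performance block/flash"
    if p.days_since_last_write > 365 and p.reads_per_day < 1:
        return "archive/cold object storage"
    if p.reads_per_day < 10:
        return "standard object storage"
    return "general-purpose file storage"
```

Under a policy like this, a small but heavily read data set lands on flash while a multi-terabyte but dormant archive lands on cold storage, which is the opposite of what a volume-only rule would choose.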
03 December 2017
Conclusion: Despite repeated audits pointing to failures by IT to deliver expected outcomes, organisations continue to publish IT plans that do not adequately address the fundamental dimensions of IT planning, namely the IT Business Plan, IT Strategies and the IT Program of Work.
These elements are often developed as a single composite document, but this approach fails to recognise that each dimension:
- requires a different method of creation
- is owned by different stakeholder groups
- has a different purpose and audience
- requires renewal on different cycles.
Failure to ensure that all dimensions are addressed presents risks to implementation both in terms of effective up-front investment selection as well as ongoing IT governance arising from gaps in critical decision-making information.
To avoid these risks, organisations should maintain the content of each IT planning element as a separate deliverable, even if the desire, or requirement, is to produce an annually updated composite document.
- Governance & Planning
03 November 2017
IBRS iQ is a database of client inquiries, designed to get you talking to our Advisors about these topics in the context of your organisation so that they can provide tailored advice for your needs.
- IBRS iQ
27 October 2017