Conclusion: For most corporate IT departments, concepts such as Cloud Computing seem light years away from current day-to-day reality. Yet the number of commercial providers of such services is growing fast, and even more far-fetched ideas such as global software service supply chains are emerging on the horizon. The distance between innovators and late adopters of modern techniques and technologies is growing. In this scenario it is essential to know when not to remain amongst the late adopters, to avoid being left in the dust and struggling with evaporating profit margins.

This article is the first in a series of three on technologies and techniques that are leading to fundamental changes in the architectures used to construct software applications and software intensive devices. First examples of these changes are already visible today, and over the next five years, many of the current rules for architecting business applications will be re-written.

Conclusion: IT managers planning business-to-business integration, or needing to couple old-school EDI (Electronic Data Interchange) and legacy ERP (Enterprise Resource Planning) systems with modern web-based architectures, must look towards a uniform message-based middleware infrastructure. If the organisation is already moving down the .Net deployment path, BizTalk R2 is now a contender along with more traditional products such as Tuxedo and Tibco.
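The integration pattern recommended here can be illustrated with a toy in-process sketch: legacy EDI and modern web systems publish a uniform message envelope to a broker, and the ERP consumes those messages without knowing which system sent them. All names and the envelope format below are illustrative assumptions; real middleware such as BizTalk, Tuxedo or Tibco provides durable, distributed queues rather than an in-memory one.

```python
# Toy sketch of message-based integration: multiple producers, one uniform
# envelope, one consumer. A stdlib queue stands in for the middleware bus.
import json
import queue

broker = queue.Queue()  # stand-in for the middleware's message bus

def edi_adapter_publish(order_id, amount):
    # Legacy EDI system wraps its data in the shared message envelope.
    broker.put(json.dumps({"type": "purchase_order", "source": "edi",
                           "order_id": order_id, "amount": amount}))

def web_adapter_publish(order_id, amount):
    # Modern web front end publishes the same envelope format.
    broker.put(json.dumps({"type": "purchase_order", "source": "web",
                           "order_id": order_id, "amount": amount}))

def erp_consume():
    # The ERP drains the queue, indifferent to each message's origin.
    processed = []
    while not broker.empty():
        msg = json.loads(broker.get())
        processed.append((msg["source"], msg["order_id"], msg["amount"]))
    return processed

edi_adapter_publish("PO-1001", 250.0)
web_adapter_publish("PO-1002", 99.5)
print(erp_consume())
```

The point of the pattern is that adding a new source system means adding one adapter that emits the shared envelope, rather than building point-to-point links between every pair of systems.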

Conclusion: While wikis are certainly an important new approach to information management, they should not be considered a replacement for enterprise content management systems (CMS). Instead, wikis should be considered an adjunct to content management, providing added flexibility and collaboration where needed. Understanding the differences between CMS and wikis is vital.

Conclusion: Legal firms acting on behalf of copyright holders of images are sending out thousands of copyright infringement notices globally, seeking hefty fees from companies whose web sites contain copyrighted images. CIOs need to understand some of the legal nuances and be directly involved in planning and implementing measures to minimise the financial risks of web content. 

Utility Computing is gaining higher levels of customer interest and acceptance, amid significant investments by systems vendors to build their individual "brands" (e.g., Adaptive Enterprise, On Demand, Real Time Infrastructure) and the resulting cacophony of terms, definitions and strategic directions. However, several crucial "missing technology links" remain in the evolutionary chain; these will be addressed by traditional software and systems vendors as well as by Open Source providers.

Widely available, relatively cheap technology is catching up with the long-standing desire of end-users and businesses to use and pay for technology as demand arises, rather than being forced to buy entire software packages or infrastructure and then use just a small percentage of the overall capability. At the same time, user business environments have become more nimble, requiring more flexibility in IT delivery and usage, in licensing and payment structures, and in vendor business models. By being aware of new provisioning models, users will be able to gain long-sought improvements in costs and service.

''On-demand'' (or ''adaptive,'' ''agile,'' or whichever term a major vendor uses) encompasses the combination of business models, processes and operations that are enabled by, and which in turn require, these IT resources. The concept is therefore more of an overall business strategy, including not just the availability of IT resources ''on-demand,'' but the ability to build, change, adapt and manage business operations by leveraging the ready availability and variable capabilities of utility computing.

The next ''Big Thing'' in business computing is more about new IT deployment, sourcing and management models than about the evolution of technology itself. Vendors such as IBM and Hewlett-Packard have continued to fine-tune products and services that were announced in 2001 to a point where they are now viable choices for technology buyers. Whether labelled ''On Demand'' by some, or the Adaptive or Agile Enterprise by others, these offerings can all be considered part of the Utility Computing paradigm.

Conclusion: IT organisations wishing to select quality services at competitive prices should rate themselves against an IT procurement maturity model to leverage economies of scale. This will enable them to reduce costs while meeting business needs in a timely manner.

Conclusion: It is no longer viable for telecommunication providers to simply offer Session Initiation Protocol (SIP) trunks for voice connectivity or Multi-Protocol Label Switching (MPLS) links to connect office and data centre locations; nor does such an offering make good business sense for the telco or for the customer.

The modern architectures of Cloud and Software-as-a-Service (SaaS), mixed with the need to keep critical elements on-premises, are key components that support most digital strategies. Using older telecommunications architectures with fixed connections and physical infrastructure for routing and switching can be costly, and can stifle agility and therefore productivity.

However, modern telecommunication architectures bring the ability to virtualise connections and network switching. Abstracting these capabilities allows dynamic management of the services, delivering substantial agility as well as potential productivity gains and cost savings to the customer.


As-a-Service machine learning (ML) is increasingly affordable and easily accessible, and with the introduction of self-learning capabilities that automatically build and test multiple models, it can now be leveraged by non-specialists.

As more data moves into Cloud-based storage – either as part of migrating core systems to the Cloud or through the use of Cloud data lakes and data warehouses – the use of ML as-a-Service (MLaaS) will grow sharply.

This paper summarises options from four leading Cloud MLaaS providers: IBM, Microsoft, Google and Amazon.
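The "self-learning" capability described above can be sketched in miniature: fit several candidate models, score each by cross-validation, and keep the best. The sketch below uses scikit-learn and a bundled dataset purely for illustration; the actual APIs of the four providers' MLaaS offerings differ, and the candidate models chosen here are assumptions, not what any provider uses internally.

```python
# Minimal sketch of automated model building and testing: try multiple
# candidate models, score each with 5-fold cross-validation, keep the best.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Mean cross-validated accuracy for each candidate.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}

# Select the best-scoring candidate and refit it on all the data.
best_name = max(scores, key=scores.get)
best_model = candidates[best_name].fit(X, y)

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
print(f"selected: {best_name}")
```

Commercial AutoML services extend this loop with feature engineering, hyperparameter search and larger model zoos, but the select-by-validation-score core is the same idea.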


DevOps, business intelligence (BI) and data, machine learning (ML) and artificial intelligence (AI) are all driving rapid change within IT departments. The challenge will be finding Cloud-certified people to meet the rising demand.

Leaders have two main choices: upskill their existing teams, or embark on a recruitment campaign that brings in Cloud-certified professionals to manage Cloud migration and provide the ongoing support and optimisation needed to bring the full value of Cloud to IT operations.

For organisations that suddenly realise how far behind the Cloud value curve they are, pressure will mount to deliver results quickly. Make sure staff are certified and ready to address your hybrid or multi-Cloud environments.


If you lack confidence in the direction of your Cloud migration team (CMT), you are not alone. Chances are the Cloud migration was led by one or two enthusiastic champions from enterprise architecture, infrastructure or apps development who were comfortable taking high risks to advance their careers. Too often these efforts yield only partial results and sporadic adoption, leaving many senior executives questioning the value of such untamed Cloud engagements.

What is needed now is a structured approach to Cloud skills development and team selection that culminates in a CMT able to manage both business needs and the underlying IT-as-a-Service (ITaaS) that delivers on those needs. Complementary skills across the team will ensure a more robust analysis of business needs, and a selection and rightsizing of solutions that continues to flex to meet changes in business and customer requirements. A post-migration Cloud framework should also apply, containing a continuous improvement register that is examined and updated as the Cloud evolves, and governance programs that identify opportunities to maximise any hybrid or multi-Cloud solution.