Conclusion: In spite of some benefits in security, remote access and speed of deployment, VDI has remained a niche product. This has largely been due to the higher complexity and much greater capital cost compared with a Full Desktop. However, as VDI infrastructure innovations continue to close the gap, the adoption of VDI will increase beyond this small base. Due to the risks and costs of switching from a well understood model to a relative unknown model, the adoption will increase at a moderate rate and there never will be a “year of VDI”.

Circa 1960: The “Hard theory of platforms”

In the early days of information technology, hardware was THE platform. Companies such as IBM and DEC provided the big iron. Business software was THE application. In those days, even software was as hard as stone. The term “application platform” was unheard of.

Conclusion: No, and there never will be a “year of VDI”. However, now that the capital cost of VDI is close to that of a Full Desktop, the adoption of VDI will begin to increase beyond its current small niche. The large capital cost and complexity of replacing the existing desktop fleet, the perceived risks in moving to a new desktop approach, and a general lack of experienced staff will ensure that adoption of VDI proceeds slowly.

For the next 5-7 years, organisations will continue to use a range of desktop deployment techniques (such as Full Desktop, Laptop, and Remote Desktop Services, aka Terminal Server), with VDI being just one of many.

Conclusion: The foundation of any BYO device initiative is a robust BYO device policy. The policy must set the boundaries for acceptable use, costs and security. Ensure device security is driven by business stakeholders and based on pragmatic risk analysis, rather than on technical concerns from IT staff or FUD from vendors anxious to sell their wares.

Robust policy, strong corporate culture and proper training can be more effective than technology in securing corporate data and controlling costs and risk. Use policy, culture and training to drive compliance, minimising the need for complex and expensive technological controls.

Conclusion: The forecast growth of data transmission over the Internet in the next decade means the importance of content distribution networks will probably grow. As demands on bandwidth increase, efficient management of online data will be at the centre of many organisations’ online delivery strategies.

While it may seem that improved broadband and the arrival of the NBN (when that occurs) will solve the issues of speed, they will not, because more users, richer media and more applications will fill the available bandwidth. Consequently, a content distribution network (CDN) strategy ought to be part of any organisation’s online planning.

Conclusion: The idea of the Bring-Your-Own (BYO) Laptop has been bandied about for the last seven years, but it is not as common as the press implies. Few ANZ organisations have BYO laptops; however, some have implemented BYO smartphones, and many intend to do so in the next 18 months.

The driver of BYO devices in the organisation is not avoidance of capital costs but rather the need to accommodate users’ expectations of technology, which have been raised significantly by the consumerisation of IT, driven largely by the iPhone and iPad.

Conclusion: Oracle will continue to excel in the Application, Middleware and Database markets, but it also intends to radically transform and simplify IT infrastructure. Oracle’s strategy is to eliminate complexity, create significantly greater business value and reduce infrastructure costs using an Integrated Systems approach. The objective is to enable customers to focus on applications instead of infrastructure, in the hope that they will consume more Oracle software.

IT executives should keep abreast of Oracle’s infrastructure innovations, as well as its competitors’, and be prepared to rethink their existing infrastructure approach if an Integrated System can create a significant new opportunity for the business.

Conclusion: The discipline of Enterprise Architecture has evolved from the need to articulate and maintain a big picture overview of how an organisation works, covering organisational structure, processes, and systems. Whilst Enterprise Architecture can assist in implementing industry best practices, several-fold improvements in productivity and quality are only possible if the organisation makes a conscious effort to attract and retain top-level subject matter experts, and if it commits to a so-called Domain Engineering / Software Product Line approach to the strategic analysis of market needs and the design of products and services.

Conclusion: Poor quality and incomplete requirements continue to be a leading cause of IT project failure. While the more widespread use of iterative project management techniques is minimising the impact of bad requirements, it is still not addressing the underlying cause. Accountability for improving the quality of requirements remains elusive. Enterprise architects must take a stronger role in the validation of requirements, and be prepared to intervene when necessary.

Observations: The saying goes that you cannot create a symphony by asking a hundred people to give you ten notes each. This is an apt description of the way requirements may be developed on large IT projects. The result is often a disjointed set of wishful ideas, concepts and assumed solutions, without any intrinsic integrated design or consistent rationale. Given this profoundly flawed starting point, it is not surprising that subsequent implementation activities, which rely on correct and consistent requirements, are inherently challenged.

Challenges in defining requirements: Understanding of the term “requirement” differs among stakeholders. Requirements can variously be perceived as user wish-lists, detailed product feature sets, or complex business process descriptions. The language used to express them is often loose and ambiguous (“the system must be fast”) rather than forming concise, testable statements of conformance (“search results must be returned within two seconds for 95% of queries”). Requirements also often focus on functional behaviour and ignore important non-functional aspects such as performance, security and operational concerns.

The task of establishing a set of requirements is commonly, and somewhat blithely, described as “requirements gathering”, which implies that the requirements already exist ready-formed in perfect shape and merely need to be harvested, like picking cherries from a tree. Such a perception is a dangerous attitude – especially among senior executives.

The reality is that high-quality requirements are difficult to create. Unless there is a very clear and concrete understanding of the objectives of the system, and ready access to explicit and accurate supporting information about all relevant dependencies, the process of defining requirements can become a messy and imprecise affair. Common challenges include:

  • conflicting understanding of the underlying business problems between stakeholders

  • limited access to key subject matter experts

  • organisational politics that hinder contribution and create divergent objectives

  • changing circumstances that render requirements obsolete

  • time pressures that cause analysis to be incomplete and poorly formed

Dealing with poor quality requirements: Delivery pressures tend to force poor requirements to be accepted unchallenged. In the face of impending (or missed) deadlines, there is acute pressure to have requirements ‘signed-off’ regardless of their quality. Project governance checkpoints tend to measure whether a project milestone has been completed, not the quality of the actual work products. If requirements are identified as deficient, that finding can be ignored, or dismissed as rework that can occur in later project phases.

The best way to guard against poor quality requirements is to have them validated early and often. Requirements can be quickly tested against some very simple heuristics to gauge the quality and completeness of their definition; a sketch of how such checks might be automated follows the list below. Simple tests include:

  • Cohesive – does the requirement address a single, simple business function?

  • Precise – is the requirement completely unambiguous and stated using concise, simple, plain language?

  • Verifiable – can conformance with this requirement be easily proven in the testing phase?

  • Traceable – are all requirements linked back to a clear business need or objective, and are all business needs covered by a comprehensive set of requirements?
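
To make these heuristics concrete, here is a minimal, illustrative sketch in Python of how the four tests above might be encoded as an automated review checklist. The Requirement fields, the AMBIGUOUS_TERMS list and the warning rules are assumptions made for this example, not part of any established method; a script like this can only flag candidates for closer human inspection.

from dataclasses import dataclass, field

# Words that commonly signal vague, untestable requirements (illustrative list).
AMBIGUOUS_TERMS = {"fast", "easy", "user-friendly", "flexible",
                   "robust", "efficient", "appropriate"}

@dataclass
class Requirement:
    ident: str
    text: str
    business_objective: str = ""              # link back to a business need
    acceptance_criteria: list = field(default_factory=list)

def review(req):
    """Return a list of heuristic warnings for a single requirement."""
    warnings = []
    # Cohesive: conjunctions often signal a requirement doing several jobs.
    if " and " in req.text.lower() or " or " in req.text.lower():
        warnings.append("possibly not cohesive: may bundle multiple functions")
    # Precise: flag loosely worded, ambiguous language.
    vague = [t for t in AMBIGUOUS_TERMS if t in req.text.lower()]
    if vague:
        warnings.append("imprecise wording: " + ", ".join(vague))
    # Verifiable: without acceptance criteria, conformance cannot be tested.
    if not req.acceptance_criteria:
        warnings.append("not verifiable: no acceptance criteria defined")
    # Traceable: every requirement should link to a business objective.
    if not req.business_objective:
        warnings.append("not traceable: no business objective linked")
    return warnings

if __name__ == "__main__":
    r = Requirement("R-17", "The system should be fast and easy to use")
    for w in review(r):
        print(r.ident + ": " + w)

Judging whether a flagged requirement genuinely traces to a business need, or whether its acceptance criteria are adequate, remains a matter for the architect and business analyst.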

The rise of agile delivery techniques has cut the time between requirements definition and requirements testing. This means faulty requirements can be identified faster, and at a smaller scale, than under traditional waterfall techniques. However, agile delivery methods are still not pervasively used, and the very large programs of work found in the government and financial sectors still rely heavily on waterfall techniques.

The role of the architect in requirements validation: Requirements elicitation and definition are commonly the domain of the business analyst. Architects tend to be engaged in the earlier conceptual phases of a project to make key decisions about platforms and technologies based on high-level business needs. Then, in parallel with the detailed definition of business requirements, architects focus on activities such as:

  • defining system context and external integration points

  • identifying system components and interactions

  • understanding information structures and flows

  • analysing performance, security and capacity

The risk here is that while architects are focused on all things architecture, they remain isolated and disconnected from detailed requirements definition and validation. Yet architects are the best-placed people to perform requirements validation: they are the experts who should hold the end-to-end system knowledge and enterprise context, coupled with a clear understanding of the business needs and desired benefits, required to critically and authoritatively validate the quality of requirements.

Despite protestations from architects that requirements validation is unwanted QA of business analyst artefacts, or unnecessary detail, or the role of the project manager, architectural validation of detailed requirements must be performed. And project managers must be held accountable for ensuring that any deficiencies identified in architectural review are acted upon.

If poor quality requirements are identified by architects, and not addressed by project teams, architects are obligated to escalate the issue for executive attention. Architectural intervention over poor quality requirements is perhaps one of the most important actions that can be taken to improve the chances of project success.

Next steps:

  1. Examine how the quality of requirements is assured on projects within the enterprise.

  2. Check whether architects have appropriate review and oversight of requirement quality, or whether this is left as a project manager’s responsibility.

  3. Make architects accountable for requirement validation as a mandated governance checkpoint.

  4. Ensure an appropriate escalation path exists for architectural intervention if necessary.


Conclusion: Leading IT organisations now recognise that selecting and integrating a mix of best-of-breed servers, storage and networks no longer adds value to their organisation. Instead, they are purchasing Integrated Systems from a single vendor, which eliminates the cost and complexity of integrating these components, lowers the integration and support risks, and reduces the time to deliver a working solution.

To make this paradigm shift, most organisations will need to change the kind of relationship they have with their infrastructure vendors, from purely transactional supplier to long-term strategic partner. For many IT and vendor staff, this will be a difficult and traumatic transition.

In the News

New data breach notification scheme will be a barometer for business maturity - AFR - 12 March 2018

Do not mistake cyber security for being merely a technical discussion about IT problems to be fixed. Cyber security is now, and always has been, purely a response to risk. The risks have changed...

The Future of Work: The Role of People - Adobe - 31 January 2018

Foreword by Joseph Sweeney, IBRS Advisor. For the past 30 years, organisations have applied technology to people to make the workplace more productive. But...

Businesses unprepared for new data breach notification laws - AFR - 29 January 2018

Thousands of Australian small businesses remain woefully unprepared for the introduction of new laws that will require them to publicly disclose if their customers' data is breached by hackers or...

Intel chip meltdown flaw shows new vulnerability - AFR - 5 January 2018

Cyber security experts have warned the long-term implications of chip vulnerabilities nicknamed Spectre and Meltdown discovered by researchers this week are still unknown, despite it appearing that...

Business experience should help parents keep kids safe online - AFR - 28 November 2017

The adults in the lives of young people need to know more about security and safety in an online world, and they could be learning this at work. The Office of the eSafety Commissioner deals with some...
