Kevin McIsaac

Conclusion: The prototyping of user interfaces is moving beyond traditional low-fidelity static wireframes and embracing more sophisticated approaches that simulate the entire application experience. Modern solution visualisation tools can deliver a more accurate and dynamic rendition of complex enterprise applications before a line of code is cut. This can significantly reduce delivery risk and improve development productivity. Beware, though, the diminishing returns from overly complex solution visualisation tools that absorb the very cost and risk they should be offsetting.

Observations: Prototyping has long been a means to visualise, early in the system development lifecycle, how a system should look and behave. This allows a system design to be validated as early as possible by subject matter experts. While text-based requirements remain the primary contractual specification for system development, they are often hard to fully comprehend without the tangible aid of a visual prototype.

A good prototype should put abstract and generalised requirements into the context of real-life business processes and user interactions, creating a rich dialogue between system designers and users around the system’s expected behaviour. Using a prototype to assist in harvesting correct, concise and consistent requirements can pay off considerably in the long term. Studies show that addressing defects in the requirements phase can be 100 times cheaper than fixing them as post-implementation defects.

Traditional prototyping: Low-tech prototyping tools such as butcher’s paper, whiteboard markers and post-it notes can be immensely valuable design tools when combined with adept facilitators and knowledgeable subject matter experts. Rapid storyboarding of anticipated system behaviour from a variety of stakeholder perspectives can generate a robust, high-level shared view of system functionality. This can help establish, or challenge, the important assumptions and principles that will guide detailed design work.

The use of productivity applications (PowerPoint, Visio, OmniGraffle) to generate basic user interface mockups, or of RAD development tools (such as Dreamweaver and Visual Studio) to create HTML or native-app click-models, has become common. With RAD development tools the lurking danger is that users or (even more worryingly) executives begin to see the prototype as almost the finished system. This can lead to pressure to “put the prototype into production” without the required underlying architectural support, or set false expectations about the considerable extra effort required to implement complex integration and business rule support. To avoid this situation a common rule is to implement a “throw away” prototype using a technology that cannot be deployed into production.

Solution visualisation: New prototyping tools (such as iRise, Axure and SketchFlow) have emerged that support the end-to-end simulation of applications without the overhead of custom development. These “solution visualisation” tools enable the definition of rich visual user interfaces with advanced interactive features such as data entry, simulated business rules and complex navigation logic. Supporting these design elements are collaborative features that enable applications to be reviewed, annotated and critically evaluated in a fashion similar to the “mark-up” features of modern word processors. These prototype applications can often be packaged for easy distribution by email, facilitating analysis and review without the overhead of maintaining access to prototype infrastructure.

While solution visualisation tools can quickly deliver rich prototypes that flesh out high-fidelity representations of desired systems, the cost of purchasing such software and the effort of maintaining complex prototype models can undermine the core benefits of prototyping. The benefit of common tools such as HTML editors or office productivity applications lies in their ubiquitous availability and universal accessibility. Complex solution visualisation tools may involve expensive up-front licence costs and/or proprietary file formats that require specific reader software to be installed to view the applications. They may also require centralised repositories to be installed into server environments. These tradeoffs need to be carefully weighed before embracing a solution visualisation toolset.

Multiple device simulation: The enterprise application landscape is changing. Touch phones and tablets have established a need for mobile “finger-driven” apps alongside traditional “mouse-driven” desktop apps. Corporate applications increasingly demand dual-mode solutions that support finger-operated Android and iOS access as well as web-based or rich native clients. This adds an extra burden on prototyping efforts, which must now cover multiple target platforms. Solution visualisation tools can help manage consistent requirements across this multitude of platforms.

ALM tool integration: The key outcome of solution visualisation or prototyping efforts is a consistent and complete set of requirements. In the modern software world requirements are often managed by dedicated tools or as part of application lifecycle management (ALM) suites. Effective solution visualisation tools need to integrate with the requirements and design management tools that support the end-to-end development lifecycle. This allows solution visualisation to be institutionalised in the core process of application delivery, rather than sidelined as a niche user experience activity.

Next steps:

  1. Organisations should carefully evaluate the use of prototyping in supporting custom application development. Is it effectively contributing to the quality of application design before moving into the development phase?

  2. If basic prototyping tools are not being used effectively, start with the low-fidelity approach and embed their use into upstream requirements and design activities.

  3. If basic prototyping has reached a ceiling of either design richness or stakeholder reach, consider solution visualisation toolsets. Consider carefully the Total Cost of Ownership of such tools against the anticipated benefits.

McConnell, Steve (2004). Code Complete (2nd ed.). Microsoft Press. ISBN 0-7356-1967-0.

 

Conclusion: Running a robust, cost efficient data centre requires a scale of operations and capital expenditure that is beyond most ANZ organisations. Organisations that host equipment in their own facilities have a higher business risk. Business management is unlikely to be aware of these risks, and has not signed off on them, leaving the IT organisation exposed.

Business management should ask for a business impact assessment to expose these risks to an appropriate decision making level. Management can either sign-off on these risks or request a mitigation plan. For many organisations, moving to a commercial Tier-2/3 data centre reduces risk without substantially changing the total cost. SMEs should consider migrating to a cloud environment (IaaS and/or SaaS) and get out of the business of owning and running their own IT infrastructure.




Successful IT architecture is largely about choosing the optimum systems and technologies that enable organisations to achieve their strategic objectives. The right way to choose between architecture options is through an open, timely, visible process that incorporates key stakeholder input, is based on credible evidence and is measured against alignment with organisational needs and priorities. Poor architecture decision making leads to confusion, waste and delay.




Conclusion: Demand for storage capacity continues to grow at 60%+ per annum, requiring ongoing capital investments in incremental capacity upgrades or, worse, a capital-intensive rip-and-replace upgrade every 3-4 years. Since “cloud” is the current IT buzzword, IT organisations are being asked to look at how cloud storage can reduce cost and transform lumpy capital expenditure into a more uniform “pay as you go” operational cost.
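The arithmetic behind that upgrade cycle is easy to sketch. The 60% growth rate comes from the note above; the 100 TB starting capacity is a purely illustrative assumption:

```python
# Illustrative sketch of compound storage growth at 60% per annum.
# The growth rate is from the note above; the 100 TB starting
# capacity is a hypothetical figure chosen for illustration only.

GROWTH_RATE = 0.60  # 60% per annum

def capacity_needed(years, start_tb=100.0):
    """Capacity required after `years` of compound 60% p.a. growth."""
    return start_tb * (1 + GROWTH_RATE) ** years

for year in range(5):
    print(f"Year {year}: {capacity_needed(year):8.1f} TB")
```

At 60% per annum, demand roughly quadruples every three years (1.6^3 ≈ 4.1), which is one way to see why an array sized for today’s load faces a rip-and-replace upgrade within 3-4 years.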




Conclusion: For many organisations the question of thin vs. full desktops is highly polarised and usually framed as a mutually exclusive, “winner takes all” choice. Recent advances in desktop deployment methods enable the question to be constructively reframed as a benefit analysis focused on who, what and where. This approach ensures the appropriate device is used in each scenario, enhancing desktop agility and improving the user’s desktop experience.




2010 has seen many high-profile IT failures in well-run Australian companies:

  • Westpac’s online banking system was down for about nine hours in August, due to a “software patch”.

  • Virgin Blue had a complete outage of its online reservation system in September, which lasted for about 21 hours. This was caused by data corruption on a solid state disk that appears to have then corrupted the recovery environment. Virgin said this created severe interruption to its business for 11 days and estimated the cost as between $15 million and $20 million.

  • NAB’s core banking system was impacted by the “upload of a corrupted file” in November. This prevented many of its customers from receiving payments or withdrawing funds. The impact of this failure was still being felt some weeks after the initial incident.

  • CBA had processing problems that affected about five per cent of customers, whose accounts showed a zero balance at ATMs.

  • Vodafone customers experienced poor reception and slow download speeds for over a month after a “software upgrade caused instability” in its system.

In five months Australia has experienced five high-profile failures at five brand-name companies. How is this possible? Each of these companies has a large, well-funded IT organisation.




Conclusion: As Windows 7 celebrates its first birthday many organisations are contemplating a desktop upgrade. Most desktops were designed more than seven years ago and there are many new technologies and approaches that need to be considered.

For most staff the desktop is a personal experience, making the upgrade a high-profile project. Treating this as just a technical refresh risks creating a technically successful solution that is considered an expensive failure by the business, or of marginal value. To avoid a career-limiting move, approach the desktop upgrade as a business project that has strong links to key business drivers, and structure the implementation to ensure it quickly delivers tangible business benefits.




Conclusion: In recent months several Tier-1 Australian and New Zealand vendors have announced, and in some cases delivered, locally hosted Infrastructure as a Service (IaaS). These announcements will reduce business and IT executives’ perception of the risk of adopting IaaS, and result in greater interest in using cloud as a “lower cost alternative” to in-house infrastructure.

While the cloud is often assumed to be an inexhaustible supply of low-cost virtual machines available on a flexible pay-as-you-go model, organisations that have looked beyond the hype have found it is not as cheap, or as flexible, as they might think.




A spate of poor Service Oriented Architecture (SOA) initiatives has left some thinking that SOA is yesterday’s silver bullet. However, an effective SOA remains an essential foundation for the evolution of enterprise systems. Organisations disillusioned by the promise of SOA should revisit their experiences to understand why business value was not realised. With the right insight into the critical conditions for SOA success, those organisations can realign, and if necessary reactivate, SOA efforts as an integral part of their IT strategy.



The right starting point for Enterprise Architecture (EA) is a clear picture of your organisation’s strategic objectives and desired operational model. If this picture is well formed, an EA can effectively mould the necessary structure and behaviour of business and IT assets over time to maximise business performance. For organisations just beginning the process of developing an EA, it is better to start by concisely and simply documenting an EA vision, rather than attempting to create a detailed EA strategy with complex, formalised frameworks.




Conclusion: With the release of View 4.5, VMware has failed to move beyond the limitations of a centralised, virtualised desktop (aka VDI) to a robustly managed Dynamic Desktop that supports Full, Virtual and Published Desktops. VMware claims to have eliminated the capital cost barriers to VDI adoption and has introduced a management framework concept called the Modular Desktop that, in the long run, will enable VMware to expand out of its desktop niche.

VMware will continue to be challenged by Citrix, which has much greater experience in the desktop market and has delivered a Dynamic Desktop for over 12 months. Microsoft also has the capability to deliver a Dynamic Desktop, but has yet to articulate it in a robust or compelling way.




Conclusion: Being acquired by Oracle is a good thing for Sun technologies. However, the long acquisition period, followed by weak marketing of the benefits and poor communication of the product roadmaps, has left many customers unsure about their strategic investments in Sun technologies.

Oracle has a clear plan for Sun, with detailed product roadmaps, but customers will have to dig deep to get this information.




Creating a single source of truth is impossible in the modern enterprise. Today’s complex environment of custom, COTS and cloud solutions means that redundant and inconsistent information will persist for a long time. Instead, drive consistency and currency of data across systems using Master Data Management (MDM). Addressing underlying data quality issues will remain a harder task.



Conclusion: Defining the Cloud is proving elusive, because vendors are trying to define Cloud neatly around their own products and services. This creates competing, product-based definitions for what is actually an aspiration to create a “better IT environment”.

Viewed as an aspiration, the Cloud becomes a journey to create an IT environment with specific characteristics, such as cost transparency, utility pricing, capacity on demand, commodity pricing, self-service, location and device independence. Like all journeys there are many different paths which depend on where you start and where you want to go.

