Kevin McIsaac


Conclusion: Storage capacity growth rates have accelerated from a historical average of 30% to more than 50%, ensuring that data management and storage remain key IT infrastructure issues through 2012. However, as the total storage capacity for unstructured data rapidly overtakes that for structured data, simply adding more capacity in an ad hoc fashion will no longer work. Instead, IT organisations must create new data storage strategies that can deal with the more rapid growth (60%-200%) of unstructured data.
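To see why the jump from 30% to 50%+ annual growth matters, it helps to compound the rates over a planning horizon. The sketch below is illustrative only (the function name and five-year horizon are my own, not from the article) and simply applies the compound-growth formula to the quoted rates:

```python
# Illustrative sketch: compounding the annual storage growth rates
# quoted in the article (30% historical vs 50%+ current).
def capacity_after(years, annual_growth, start=1.0):
    """Capacity after `years` of compounding at `annual_growth` (0.5 = 50%)."""
    return start * (1 + annual_growth) ** years

# At the historical 30% rate, capacity grows roughly 3.7x in 5 years...
print(round(capacity_after(5, 0.30), 1))
# ...while at 50% it grows roughly 7.6x over the same period,
# which is why ad hoc capacity additions stop being workable.
print(round(capacity_after(5, 0.50), 1))
```

The same arithmetic applied to the 60%-200% unstructured-data rates gives even starker multiples, which is the article's underlying point.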




Conclusion: One of the fundamental drivers of the Windows desktop Total Cost of Ownership (TCO) is the tightly coupled “application installation” model used by the Windows operating system. Application virtualisation can eliminate many of the problems associated with this model, significantly reducing the time and effort to install and maintain applications.

Since implementing application virtualisation requires significant changes to the desktop image, the ideal time to introduce this technology is when deploying a new image across the desktop fleet. With Windows Vista being a significant trigger for deploying a new desktop image in the next 24 months, IT organisations should consider adding application virtualisation to the migration project as a way to derive a stronger and quicker return from a Vista upgrade project.




Web 2.0 has emerged as one of the top buzzwords of the last 12 months. But is it hype, a passing fad, or an important emerging trend that IT executives must know more about? Like the major IT paradigm shifts before it, Web 2.0 is a little bit of each!

The difficulty with Web 2.0 is the lack of a clear definition, so it can suffer from being “all things to all people”. According to Wikipedia, “Web 2.0 is a term often applied to a perceived ongoing transition of the World Wide Web from a collection of websites to a full-fledged computing platform serving web applications to end users.”




Conclusion: In 2006 the adoption of x86 server virtualisation moved from “fast follower” to “mainstream”, with over half of IT organisations using or piloting it. In that year VMware established a clear technology and market share lead, while today Microsoft is still 12 months away from releasing its first competitive (hypervisor-based) product.

As the market grows dramatically through 2007, VMware’s technical and market leadership, coupled with a lack of credible alternatives, will cement a dominant position in the enterprise market that Microsoft will fail to overcome until at least 2012. In the same period, Microsoft’s product and channel strength in the SMB market, together with that segment’s currently low take-up of server virtualisation, will allow Microsoft to establish a strong base of Intel server virtualisation in SMBs.




Conclusions: The volume of digital data created and stored by organisations continues to grow exponentially, typically anywhere from 30% to 60% per annum. For most organisations this level of digital data growth is not new; what is different is that the growth is now being driven by unstructured data.

Over the last 20 years organisations have made significant investments to deal with structured data, resulting in well-managed structured information that supports and drives the business. On the other hand, few organisations have invested similarly in unstructured data (e.g., email, faxes, and documents), and many are now finding that its growth is a significant business problem.

To cope with the data growth over the next 10 years, organisations must learn to deal as effectively with unstructured data as they do today with structured data.




Conclusion: With storage capacities typically growing at 35% or more per annum, most organisations find they must routinely add capacity. To avoid creating silos of storage that cannot be shared and optimised, or building a storage infrastructure that becomes increasingly complex and costly to manage, IT organisations must properly plan and execute storage acquisitions.




Conclusion: Over the last 10 years the IT applications and infrastructure in many organisations have rapidly evolved, forcing IT departments to implement a variety of new technologies. In many cases this has resulted in technology silos that are complex, difficult to maintain and costly to extend. To support the business through the next 10 years, IT organisations must transform this complex legacy into an agile infrastructure that enables change.

A starting point for this journey is the development of an infrastructure architecture based on reusable, end-to-end infrastructure design patterns that leverage internal and external best practices, skills and technologies.




Welcome to the annual review of the most important IT trends for 2007. The IBRS Advisors have gazed hard into their crystal balls and predict the following trends will emerge in the next 12 months.




Conclusion: Linux on the IBM mainframe (z/Linux) has been available since 2000 but is not widely adopted. As IBM devotes more resources to promoting z/Linux in Asia-Pacific, it is an idea that will be raised more frequently in this region. While recent advances in z/Linux (e.g., a 64-bit implementation) make it a powerful and technically viable platform, with some organisations reporting significant benefits in specific circumstances, z/Linux will remain a niche solution rather than a common alternative to Lintel and Wintel.

The factors involved in deciding to migrate Linux from Intel to System z are extensive and complex, ensuring that the adoption of z/Linux in Asia-Pacific will remain slow and usage will stay very low through 2010.




Oracle has long been a major contributor to and supporter of Linux, beginning in 1998 with the first release of the Oracle database on Linux and later with the release of Oracle applications and middleware. Oracle has made significant contributions to the Linux kernel in recent years (e.g., the Oracle Cluster File System), developing considerable Linux expertise in the process.




I recently facilitated three panel discussions at Storage Networking World. Using interactive response keypads I was able to run a series of polls with the 80+ delegates. The results are very interesting and cause for some industry reflection.

When asked their “most important business issue”, delegates said “reliability and availability”, closely followed by “business continuity”, with “cost containment” a distant third. When asked their “biggest storage related challenge”, the top response was “managing growth and meeting capacity”.




Conclusion: With mobile device shipments predicted to grow in excess of 50% per annum through to 2010, IT organisations must learn how to deal with this trend. The key is to ignore the glamour and glitz associated with the mobile device and focus on the benefits that mobility brings to a core business process. Mobility projects have many of the same characteristics as ERP or CRM projects, and IT organisations should apply the lessons learned from implementing such applications.

To avoid becoming a “solution looking for a problem”, organisations must resist the technology hype from telecommunications carriers and take a top-down approach focused on business process improvement when considering whether to invest in mobility projects.




Conclusions: As virtualisation of distributed systems rapidly matures, IT organisations should evaluate their current virtualisation capability level and determine which level best supports the business’s needs. The right level of virtualisation capability will reduce complexity and increase infrastructure agility, enabling better alignment of IT infrastructure to application requirements and hence to business strategy.




Conclusion: A recent survey by Ideas International showed that two-thirds of IT organisations use virtual servers at least occasionally in their data centres, with one-third using them frequently. While server virtualisation is commonly associated with server consolidation and with lowering hardware costs, early adopters are using virtual servers to create an agile, utility-like infrastructure that is better aligned to the business’s needs.

Leading IT organisations use virtualised infrastructure, based on virtual x86 servers (generally VMware) and networked storage, to decouple workloads and data from the underlying hardware. This enables new infrastructure capabilities that better support the business’s availability and capacity requirements. Unlike the earlier, over-hyped promises of utility infrastructure, infrastructure virtualisation is a simple, practical technique that is delivering measurable business and IT benefits now.

