Disaster Recovery Planning (DRP)

Conclusion: In recessionary economies, as in war, values and behaviours change in response to the times. Formerly valued business success factors may no longer apply; management thinking once considered outmoded may now have new relevance. At an organisational level, focus is likely to be on the lower strata of Maslow’s hierarchy of needs1. Indeed, C-level executives will be appraised on their ability to contribute to meeting these needs.

Conclusion: The likelihood of funding significant website enhancements in 2009 is low. Therefore, developing ingenious ways to improve old website properties is necessary. Evaluating and testing the website is a wise strategy for refreshing content and enhancing contact with site users.

A testing strategy should set out the business case, including the logic by which it will be conducted and the return on investment that may be expected. This focus on process will help to ensure that the testing program can achieve results and that other stakeholders within the organisation understand the objectives and purpose of such a testing program.

Conclusion: It is now increasingly recognised that small (domain specific) modelling languages hold the key to improving productivity and quality in software design, development, configuration, interoperability, and operation. Little custom-built languages can be introduced and exploited without necessitating any changes in architectural frameworks or run-time technologies – a characteristic that is painfully lacking in the vast majority of software products and tools. One of the first steps in getting started with domain specific modelling is the selection of an appropriate DIY tool kit to build software power tools based on little languages. Currently there are three mature tool kits in the market that are worth considering, and the number of contenders is increasing.
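
To make the "little languages" idea concrete, here is a minimal sketch in Python: a hypothetical, invented rule notation for order discounts, interpreted by a few lines of ordinary code so that no new framework or run-time technology is required. The notation and domain are illustrative only, not taken from any of the tool kits mentioned above.

```python
# A minimal sketch of a "little language": a hypothetical domain-specific
# rule notation for order discounts, interpreted in plain Python so no
# new run-time technology is required.

RULES = """
gold_customer and order_over_1000 -> discount 12
gold_customer                     -> discount 8
order_over_1000                   -> discount 5
"""

def evaluate(rules: str, facts: set) -> int:
    """Return the discount of the first rule whose conditions all hold."""
    for line in rules.strip().splitlines():
        conditions, action = line.split("->")
        terms = [t for t in conditions.split() if t != "and"]
        if all(term in facts for term in terms):
            return int(action.split()[1])
    return 0

print(evaluate(RULES, {"gold_customer", "order_over_1000"}))  # 12
print(evaluate(RULES, {"order_over_1000"}))                   # 5
```

The point of the sketch is that the rules read like the domain, while the interpreter stays a trivial, conventional piece of software.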

Conclusion: The choice of technology for a website involves a selection process with several factors. The process must consider adequacy of the technology, future business needs, and organisational resources, both current and future. Clarity in the choice of products will reduce risk and offer better resource allocation.

The best way to decide the preferred technology option is to use a decision template that assists in the selection process, providing a rational, transparent background to choices. This method can serve an organisation into the future, regardless of changes in personnel.
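
As an illustration of what such a decision template might look like, the sketch below scores technology options against weighted criteria. The criteria, weights and scores are hypothetical placeholders; a real template would use the organisation's own factors.

```python
# A sketch of a decision template as a weighted scoring matrix.
# Criteria, weights, and scores (1-5) are hypothetical placeholders.

criteria = {                      # weight of each criterion, summing to 1.0
    "fit_to_requirements": 0.4,
    "future_business_needs": 0.3,
    "in_house_skills": 0.2,
    "vendor_viability": 0.1,
}

options = {                       # score per criterion for each option
    "Product A": {"fit_to_requirements": 4, "future_business_needs": 3,
                  "in_house_skills": 5, "vendor_viability": 4},
    "Product B": {"fit_to_requirements": 5, "future_business_needs": 4,
                  "in_house_skills": 2, "vendor_viability": 3},
}

for name, scores in options.items():
    total = sum(weight * scores[c] for c, weight in criteria.items())
    print(f"{name}: {total:.2f}")
```

Because the weights and scores are written down, the rationale behind a choice remains transparent and repeatable no matter who runs the evaluation.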

Conclusion: Economic downturns alter organisational dynamics and can herald changes in the executive power hierarchy. IT can be particularly vulnerable if seen as a cost centre and order taker. As economic forecasts darken, a common scenario is for the balance of power to swing to the CFO. Then, an economic austerity agenda is usually pursued, characterised by a program of across-the-board cost cuts that have Chief Executive imprimatur.

Conclusion: The importance of web site usability has higher recognition now than it did a few years ago, but there are still several gaps in achieving an effective usability evaluation process. In order to improve site usability for end users, combining technology with survey research will help considerably.

There have tended to be two paths to examining website usability. The first is the use of Web analytics data, and other technology tools generally, to improve a site’s functionality. The second path employs consultants’ expertise in conjunction with research focus groups to address the usability and functionality of web properties. Integrating these two methods, on a case-by-case basis, would be more effective.

Conclusion: Exploring better content management solutions to remain competitive and to raise the value of online investments is a wise policy to adopt now. With much slower economic prospects ahead, gaining greater efficiency or reaching users in better ways is going to be necessary.

For commercial websites the criteria for implementing content management should be underpinned by usage – that is, click rates, content access and so on. Websites that create dynamic – and personalised – online environments are more likely to outperform stale ones. A better content management process may also use resources more efficiently and help align an organisation’s objectives to the new business conditions.
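
As a rough sketch of what usage-underpinned criteria could look like in practice, the following ranks pages by the proportion of visitors who engage with their content. The field names and figures are hypothetical stand-ins for a CMS or analytics export.

```python
# A sketch of usage-driven content triage: rank pages by click rate so
# refresh effort targets stale but visible content. All data here is
# hypothetical; real numbers would come from a CMS or analytics export.

pages = [
    {"url": "/products", "visits": 12000, "clicks_on_content": 3100},
    {"url": "/news",     "visits":  9000, "clicks_on_content":  400},
    {"url": "/about",    "visits":  1500, "clicks_on_content":  600},
]

for page in pages:
    page["click_rate"] = page["clicks_on_content"] / page["visits"]

# High traffic with a low click rate suggests stale or misplaced content.
for page in sorted(pages, key=lambda p: p["click_rate"]):
    print(f'{page["url"]}: {page["click_rate"]:.1%} of visitors engage')
```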

Conclusion: The balance of information power is skewed in favour of knowledge intensive organisations, to the detriment of information-poor organisations and individuals. Reliable, high quality information distilled from Software as a Service users is evolving into a powerful currency that can be translated into financial profit via the sale of ad space and other techniques.

Conclusion: To get the most from their IT vendors, buying organisations must understand the underlying importance of each of their vendors to the organisation, and their potential to work with the organisation to help achieve business goals. A structured approach to building a vendor portfolio will allow key vendors to be identified and for the process of building strategic, partnership type relationships to be initiated.

The financial press has begun using the term GFC as a short form for the Global Financial Crisis. Whilst it is outside the scope of this paper to speculate on the length and socio-economic effects of the GFC, there is no doubt that its impact will be experienced widely across business sectors and indeed within government. As consumer confidence recedes, corporate earnings shrink and revenue forecasts are revised downward, nothing is more certain than IT budgets being trimmed in 2009.

Conclusion: The International Organization for Standardization (ISO) has just released a new International Standard that focuses on Disaster Planning for IT1. This new standard reflects the changed/outsourced IT world. It provides guidelines for information technology disaster recovery services as part of business continuity management that apply to both “in-house” and “outsourced” ICT environments. This new approach for Disaster Recovery (DR) Standards should stimulate organisations to re-examine their IT DR plans to ensure that they meet current best practice and that the processes they are using to maintain their DR planning are satisfactory.

Conclusion: Web analytic tools are so pervasive and widely used it hardly seems necessary to consider their capabilities and implementation. Yet businesses and other organisations may under-use their Web analytics software, in which case they are not obtaining the value they expected.

The evidence from both measured and anecdotal sources is that organisations that achieve the greatest gains through Web analytics have used a process to select the right tool for their needs, then integrated it well, and trained their staff to use the system to segment visitors, understand their engagement, and quantify the effectiveness of the website.
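
A minimal sketch of the segmentation step might look like the following, assuming hypothetical visit records exported from a Web analytics tool; it groups visits by acquisition source and compares depth of engagement and conversion.

```python
# A sketch of visitor segmentation: group visits by acquisition source
# and compare engagement. The records are hypothetical stand-ins for a
# Web analytics export.

from collections import defaultdict

visits = [
    {"source": "search",   "pages": 6, "converted": True},
    {"source": "search",   "pages": 2, "converted": False},
    {"source": "campaign", "pages": 1, "converted": False},
    {"source": "direct",   "pages": 4, "converted": True},
]

segments = defaultdict(list)
for v in visits:
    segments[v["source"]].append(v)

for source, group in segments.items():
    depth = sum(v["pages"] for v in group) / len(group)
    conversion = sum(v["converted"] for v in group) / len(group)
    print(f"{source}: {depth:.1f} pages/visit, {conversion:.0%} converted")
```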

Last month’s issue of the Communications of the Association for Computing Machinery (ACM) contained a timely article on the role of formal methods in the design and construction of software systems. The article drives home the point that much of software development today still amounts to "radical design" when viewed from the perspective of established engineering disciplines and that, to date, there are only a limited number of areas for which established "normalised software designs" exist. But this picture is slowly starting to change, as model-driven approaches offer economically attractive ways of packaging deep domain knowledge as reusable "normalised designs".

Conclusion: In the current credit and liquidity market investors demand more transparency, and accurate and timely product and market information, yet most legacy banking systems are not up to the job. There is a strong business case for replacing legacy banking systems to restore organisational agility, and to improve the quality of service offered to customers.

Conclusion: A new age for business applications is unfolding. Arguably, in 2008 applications are at a tipping point akin to that experienced in the early to mid-1990s, which was marked by the emergence of mature ERP technology and subsequent explosive sales growth. CIOs are urged to put applications firmly on their radar and begin acting upon their application portfolios as well as the methodologies and governance approaches that underpin them.

Conclusion: The usefulness of Web based applications is not limited to the provision of Web-enabled front-ends to traditional business software. The Web also allows the design of applications that are capable of putting powerful human intelligence at our fingertips. Tapping into that intelligence to solve truly hard problems possibly constitutes the next disruptive innovation. Intelligence has never been cheaper!

Conclusion: The recent strong media attention on Green IT, coupled with aggressive vendor marketing, has left the impression that many IT organisations have made significant progress in reducing their environmental impact. In recent conversations with our clients it seems this media and vendor attention has raised concerns with some organisations that they have fallen behind their peers in this area.

To help clarify the status of Green IT in ANZ we recently undertook a survey that indicates most organisations are still in the earliest stages of reducing the environmental impact of IT. While there is great interest in Green IT, and the majority of organisations have a mandate from the executive to reduce their environmental impact, there is a strong disconnect with the IT organisation’s ability to effect change, due to a lack of budget and formal programs of work.

Conclusion: Manually re-implementing application functionality in a new technology every few years is highly uneconomical. Model driven automation offers the potential to eliminate software design degradation and to minimise the cost of technology churn. Yet the model driven approach only works if conceptual models of questionable quality are discarded, and if deep knowledge about the business is used to develop elegant, compact, and tailored specification languages for domain knowledge.

This article is the final in a series of three on technologies and techniques that are leading to fundamental changes in the architectures used to construct software applications and software intensive devices.

Conclusion: Over the next 7 years the typical commodity IT infrastructure1 will be ‘reinvented’ from today’s network of independent servers and storage into a unified computing resource that looks and behaves remarkably like the old mainframe. This new infrastructure will blend the best attributes from each architecture to create a highly agile, robust and cost effective environment that is based on commodity components.

While the key technologies are available today, due to the inertia of the existing environment and the cultural barriers in IT and the business, this journey will take most organisations 5-7 years to complete. IT organisations can hasten the journey by breaking down the siloed, hardware-centric cultures that exist in their organisations. To succeed, the commodity IT infrastructure must be reinterpreted as a unified, shared resource, where a server is a mere component, rather than as a loose network of servers owned and managed by individuals or groups.

A number of vendors and thinkers have been promoting the idea of putting the entire data centre in a shipping container. These pre-assembled mobile data centres can come complete with pre-installed networking, power and cooling systems, and can be transported by truck and quickly made operational. The approach is touted for situations where temporary, mobile data centres need to be set up, such as disaster recovery. It is also being promoted as viable as organisations struggle to provide adequate IT resources for their IT-dependent operations.

Conclusion: Authentication is arguably one of the biggest stumbling blocks on the road towards massive use of Software as a Service and Cloud Computing. Enabling authentication via the traditional login dialogue between individual systems and users does not scale anymore, and home-grown single sign-on architectures are largely limited to the corporate boundary. OpenID addresses the issue of establishing trust and credentials head-on, and makes use of a process pattern that is well established in the domain of financial transactions.
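
The pattern can be sketched in a few lines: the relying party never handles the user's password, it only verifies a signed assertion from the identity provider. The sketch below uses an HMAC over a shared secret to stand in for that assertion; the secret and message layout are illustrative simplifications, not the real OpenID wire format.

```python
# A simplified sketch of the redirect-and-assertion pattern behind
# OpenID: the relying party never sees the password, it only verifies a
# signed assertion from the identity provider. The shared secret and
# message layout here are illustrative, not the real OpenID protocol.

import hashlib
import hmac

SHARED_SECRET = b"established-during-association"   # hypothetical

def provider_sign_assertion(identity: str) -> tuple[str, str]:
    """Identity provider authenticates the user, then signs an assertion."""
    signature = hmac.new(SHARED_SECRET, identity.encode(), hashlib.sha256)
    return identity, signature.hexdigest()

def relying_party_verify(identity: str, signature: str) -> bool:
    """Relying party checks the assertion instead of holding credentials."""
    expected = hmac.new(SHARED_SECRET, identity.encode(), hashlib.sha256)
    return hmac.compare_digest(expected.hexdigest(), signature)

identity, sig = provider_sign_assertion("https://alice.example.openid")
print(relying_party_verify(identity, sig))   # True
```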

Conclusion: It is well known that the cost to rectify a defect increases significantly the later the stage in the systems development life cycle it is discovered. At the same time it is well known that software requirements can only be reliably uncovered when an iterative process of validating software under construction is used. Taking full advantage of iterative requirements validation while minimising the costs associated with late defect discovery requires a 360 degree perspective on requirements and testing that goes beyond the scope of individual projects, as well as a realistic perspective regarding the (in)ability to foresee future requirements.

Conclusion: Open Source software is demonstrably successful in specific business and government areas (think of all those Linux and Apache servers humming away!). However, apart from these, MySQL, and a few exceptions from, for example, the Mozilla stable, Open Source systems have not yet had a significant impact in the mainstream world of business IT. The notable exception is government, where there is a world-wide trend to espouse Open Source software. Government agencies are reporting what they see as compelling arguments for adopting Open Source. Those in the government arena who have not yet considered this option may wish to re-examine their stand.

Conclusion: Multi-channel CRM is an area in which technology has to serve people, and when it does not, the disappointment and failure is measurable. In call centres, and in the growing, though underutilised, implementation of Web-based contact, CRM can be tuned to customer needs, if an organisation has made adequate plans based on customer behaviour and usage.

If CRM solutions such as Interactive Voice Response technology are chosen, organisations may find a solution that is cost-effective over time, but one that short-changes them in their ongoing relationship with customers.

Conclusion: Implementing a web service oriented architecture leads to more maintainable application systems that are cheaper to operate – if you can afford to wait three years or longer, without resorting to cutting corners, or even pulling the plug. Reduction of risk exposure is the real and immediate reason why consumption and creation of services should be an essential part of renovating and evolving the enterprise application landscape of a software intensive business.

Conclusion: With a degree of inevitability the Web 2.0 bandwagon has reached CRM. And the reason is obvious: with so many Web 2.0 applications and media moving users to engage in some way – by posting to blogs or video sites, or by creating their own mashups – the engagement is interpreted as a sign of reinvigorated possibilities through CRM channels. Such diverse and disparate potential is, however, unlikely to be pursued by most organisations, although new approaches in communications are worth examining.

While for some businesses the mania over Web 2.0 applications is of value, in the main the hype is just that, and for one important reason: users may engage with some sites and corporations as friends might, but not with others, where the relationship might be described as nothing more than transactional. In other words, it is not an equal opportunity for all players, and it would be unwise to pursue the apparent Web 2.0 opportunity regardless of business sector and an organisation’s habitual relationship with its customers.

A recent discussion of software development methodologies with a colleague ended in the joint conclusion that the way software is developed today apparently has a lot to do with process elements that are best described as “rituals”. Often these rituals work as expected, but sometimes they don’t.

Conclusion: In the last seven years Domain Specific Modeling and Model-Driven Software Development have emerged as fundamentally new paradigms for software development. Upon closer examination however, there is a familiar pattern at work. The new approaches represent a shift to a higher level of abstraction, not unlike the shift from assembly language to higher-level languages thirty years ago.

Conclusion: While there is now an increasing emphasis on Business Continuity Management (BCM), many organisations still focus on disaster recovery planning. Unwisely, they restrict their focus to restoring IT infrastructure, giving only a “cursory nod” towards a more holistic business orientation that focuses on all critical business operations. Some create an artificial air of confidence by developing their business continuity plans and then never testing them. Others have little appreciation of the quality of their Business Continuity Plans (BCP) and whether or not they meet good practice. In all these cases there can be no assurance that the BCPs will be of any practical use if and when they are needed. The outcome will be, at least, serious and could be catastrophic.

Conclusion: Quasi-pervasive web connectivity in combination with more sophisticated software services that cope gracefully with short-term loss of connection are changing the landscape in which software product vendors operate. The shift in brand-awareness and power in recent years from traditional IT giants Microsoft, IBM, and Oracle towards web-based brands such as Google is one of several observable indicators of the transition of the web from a primarily static information repository to a highly dynamic ultra-large scale system.

In 2003, the UK Office of Government Commerce (OGC) established and coordinated “Proof of Concept” trials of Open Source Software (OSS) in a range of public bodies in conjunction with IBM and Sun Microsystems. The OGC report that summarised the key findings from the trials also included information obtained from other public sector activity in OSS planning and deployment in the UK and elsewhere in Europe.

Conclusion: We live in the age of personalised and mass customisable products, and this has significant implications for the software systems that enable such products or services. If configurability is added to software as an afterthought, the results are not pretty. In contrast, products or services that are personalised and configured based on intelligent interpretation of user feedback constitute a genuine improvement in quality, typically reducing the complexity that users have to deal with.

Conclusion: The amount of information that software intensive businesses store in their databases continues to increase from year to year, fuelled by demands for regulatory compliance (for example SOX1), by increasing complexity of products, and the quest for a deeper understanding of customer behaviour. Yet, in the next few years, it is likely that the increasing use of web services will lead to smaller and more modularised database schemas.

Conclusion: Web 2.0 ideas and technologies are still evolving rapidly, but it is possible to identify likely dimensions along which further innovation can be expected. The most mature aspect of Web 2.0 arguably consists of simple/elegant web based community tools. Investing in this area is worthwhile, but the effort should best be channelled into the one or two most relevant platforms.

Conclusion: The speed, quality, and cost with which IT solutions are built and IT services are delivered depend on a large number of variables. Understanding and managing these variables can lead to order of magnitude improvements – neglecting them can lead to serious inefficiencies.

Conclusion: Voice over IP is having, and will continue to have, a significant impact on the telephony and PBX market. Telephony is becoming increasingly software based and is taking advantage of software-related economics. The popularity of services such as Skype1 and Babble2, and of Session Initiation Protocol (SIP)3, 4 VoIP applications, demonstrates this. Microsoft’s recent release5 of its SIP based Office Communications Server (OCS 2007) will have a major impact on the telephony expectations and applications of organisations, irrespective of their size.
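
Part of what makes SIP amenable to software economics is that its signalling is plain text. The sketch below assembles a simplified INVITE request in Python; a real INVITE carries additional mandatory detail, and the addresses shown are placeholders.

```python
# A sketch of why SIP lends itself to software-based telephony: session
# signalling is just structured text. This is a simplified INVITE with
# placeholder addresses; a real request carries more mandatory detail.

headers = {
    "Via": "SIP/2.0/UDP client.example.com;branch=z9hG4bK776asdhds",
    "From": "<sip:alice@example.com>;tag=1928301774",
    "To": "<sip:bob@example.com>",
    "Call-ID": "a84b4c76e66710@client.example.com",
    "CSeq": "314159 INVITE",
    "Content-Length": "0",
}

invite = "INVITE sip:bob@example.com SIP/2.0\r\n"
invite += "".join(f"{name}: {value}\r\n" for name, value in headers.items())
invite += "\r\n"
print(invite)
```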

Conclusion: Even when one has settled on implementing an iterative software development process, there is still a large number of approaches and process frameworks to choose from. Instead of another instance of a "method war", it is much more productive to concentrate on techniques from different methods, which can be strung together into a working approach that is suitable for a specific context.

Conclusion: Until recently, Digital Rights Management (DRM) has been viewed primarily as an antipiracy technology. The recent advances in DRM and digital media management technology now mean that DRM is able to provide significant and new revenue generating opportunities for suppliers of existing or new digital products.

Conclusion: All too often scalability considerations are limited to a technical discussion of implementation technology combinations, and other aspects of scalability are ignored. Organisational scalability is only achievable if not only software architecture, but also knowledge management and software portfolio management are part of the equation.

Conclusion: Tesco’s move to sell its own brand of software has been perceived as a direct threat to Microsoft, but the UK retailing giant signals several broader effects for IT vendors and suppliers, for users, and an opportunity for other companies.

As the world’s fourth largest retailer, Tesco’s market share influences markets and by selling its branded software, it may, over the next two to three years, be a catalyst for change in the consumer software industry.

Conclusion: It has been five or more years since many organisations built their current websites and increasingly many of them are examining ways to improve the design and content. While planning for the future is sensible it is apparent that many legacy issues remain. These legacies are evident in the available technology being used and, probably more importantly, in the planning and thinking of the site’s next phase of development.

How the web is developing and its overlap with other digital media are significant factors in a web strategy and its execution through to a content management strategy. Organisations should recognise and adapt to user behaviour if they want their next generation websites to be effective.

Conclusion: Much has been written about the benefits of iterative, incremental software development. There is virtually no software development or integration project that could not benefit from an iterative approach. Yet many large, high-risk software projects are still managed according to the “good old” waterfall approach. And in those cases where projects are run in accordance with some iterative methodology, often the benefits of the approach are not fully realised. The risks of getting it wrong in transitioning to an iterative mode can be minimised by adhering to a few basic guidelines and by seeking assistance from an expert practitioner during the first project.

Conclusion: With mobile device shipments predicted to grow in excess of 50% per annum through to 2010, IT organisations must learn how to deal with this trend. The key is to ignore the glamour and glitz associated with the mobile device and focus on the benefits that mobility brings to a core business process. Mobility projects have many of the same characteristics as ERP or CRM projects and IT organisations should apply the lessons learned from implementing such applications.

To avoid becoming a “solution looking for a problem”, organisations must resist the technology hype from telecommunications carriers and use a top down, business process improvement focused approach when considering whether to invest in mobility projects.

Conclusion: Estimating the cost of software development projects is notoriously difficult. The simple “thumb suck” technique still enjoys significant popularity, and although attempts to introduce a more rigorous estimation process usually lend a scientific touch to the process, any numbers that are not based on historic metrics tend to collapse like a house of cards. Obtaining useful metrics is the hard part. The only way it can be achieved in a realistically short time frame is by adopting an iterative, incremental approach based on timeboxed iterations of constant duration.
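
A sketch of what metrics-based estimation looks like once timeboxed iterations of constant duration are in place: completed work per iteration becomes the historic metric (velocity) from which completion can be projected. All numbers below are hypothetical.

```python
# A sketch of metrics-based estimation: with timeboxed iterations of
# constant length, completed work per iteration (velocity) becomes the
# historic metric. All numbers are hypothetical.

import math

completed_points = [21, 18, 24, 19, 22]   # velocity history per iteration
remaining_points = 160                    # current backlog estimate
weeks_per_iteration = 2                   # constant, timeboxed duration

velocity = sum(completed_points) / len(completed_points)
iterations_left = math.ceil(remaining_points / velocity)

print(f"velocity {velocity:.1f} points/iteration; "
      f"~{iterations_left} iterations "
      f"({iterations_left * weeks_per_iteration} weeks) remain")
```

The estimate is only as good as the history behind it, which is exactly the point: after a handful of constant-length iterations, the numbers are grounded rather than plucked from the air.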

Conclusion: Watching and interpreting Google has become a strategic game, one of judging whether its various plays may lead to gladiatorial combat with one or several large corporations. For most users of IT products and services, consumers and organisations alike, the alliances and struggles between Google and its competitors are either mysterious or irrelevant. Yet the rise and rise of Google in so many sectors of the IT industry will present both users and vendors with challenges that are only just emerging.

Much as most consumers enjoy using a dominant search engine, few people would like to be restricted to just one service in other areas of work. The probability of Google dominating all the other information service delivery sectors such as news or TV media is remote. But it is conceivable that it may invade sectors in which it is not present today. Rather than attempt to understand what Google is doing it is timely for organisations to assess how they might deal with that situation.

Three months ago two Google researchers unveiled a project which has wide implications but attracted little attention. They proposed using ambient-audio identification technology to capture TV sound with a laptop PC, identify the TV programme that is the source of the sound, and use that information to deliver personalised Internet content to the PC. The researchers call this technique Mass Personalisation because it brings TV and the Web together, harnessing large audiences whose members are addressed over the Web as individuals.

Conclusion: Good enough, like the expression ‘common sense’, is tacitly understood but lacks precision, so it is relatively easy to criticise products and organisations for failing to deliver a gold standard. But such criticism may be mistaken: the perception, or attitude, that a perfect product is the goal is simplistic. Price and value are important factors in how products and services are created and delivered, not an idealised ambition.

Rather than pursue an ideal, managers should make efficient and effective use of the real skills, resources, and investments available to them to provide competitive services in the marketplace. To do so requires a good understanding of one’s own organisation and the market in which it operates.

Conclusion: In the mid-nineties online marketing and e-commerce pioneer, Dave Carlick would often say that the Internet offered “a new bargain between consumers and marketers”. He meant that information was more readily accessible and that made prices easier to compare. Being able to obtain more information about goods and prices is one of the benefits of the Web to consumers.

In a channel environment where mainstream media is strong in securing audience reach, the Web seems to offer no additional advantages. Yet, coverage of an audience market is one part of the communications effort; something a stalwart of mass media, the chairman of Australia’s second largest media-buying company, affirmed recently.1

Whether used commercially, or in transacting ideas, the art of persuasion is critical for success. But it’s here that Dave Carlick’s observation is acute because, in part, information can be compared easily. The editorial control is solely with the publisher; the cost of distributing the message is miniscule; and further contact and action may be initiated.

Conclusion: Compared to the consumer market, the enterprise market is more conservative about letting an external service provider store and manage its critical business information remotely, via the web. But in the face of spiralling internal IT operational costs, many companies are likely to significantly expand their use of Software as a Service (SaaS), previously known as the Application Service Provider (ASP) model, over the next five years.

Conclusion: Knowledge Management (KM) is often thought of as a dark art. It’s not. Many organisations can benefit in tangible ways (e.g. quick access to a problem database in a Help Desk context) by harvesting the knowledge that already exists within them.

The last article on KM concerned explicit knowledge management, being knowledge that has already been articulated in some form within an organisation. This article is focused on tacit or implicit knowledge which is concerned with the experiences of individuals.

Conclusion: The increase in IT related standards since the invention of the Web in 1989 can be seen as an indication of maturity of the IT industry. Today, all kinds of devices that contain software provide interfaces that allow them to communicate with other devices. Similarly, in the realm of enterprise software, today’s applications are typically interconnected across organisational boundaries and across a range of implementation technologies. But adoption and implementation of standards comes at a price. Which standards an organisation should embrace depends heavily on the nature of the business.

Conclusion: In the early 1990s software vendors spoke lyrically of capturing the corporate memory through use of the new products they were launching into the then emerging Knowledge Management (KM) market. Fuelled by success with document and image management solutions, then later by collaboration software such as Lotus Notes, vendors considered KM as the next blockbuster application.

Conclusion: It is time for a major stock-take of model driven software development approaches within software intensive industries. Progress in the last few years in terms of developing interoperability standards for model driven tooling has not been spectacular. The term "Model Driven Architecture" has gone through the usual hype cycle, and the dust is in the process of settling. Model Driven Software Development is about breaking the 1-fits-all approach to implementation languages when needed, and entails the use of small, domain specific languages. Only in some cases can domain specific languages be bought off-the-shelf. Model driven approaches have come a long way, and enable the incremental creation of strategic software assets that can be used across a large number of applications.

Conclusion: Five years ago portals were essential to any worthy online strategy; without one, an organisation was not serious about the Web. That sentiment dissipated as portals were seen as symptomatic of the cyber land grab that failed. In the last two years they have quietly reasserted themselves – or perhaps never went away – which begs the question: how does an organisation plot a course for its portal?

A portal ought to be created with a business objective, not because it is fashionable, or competitors have one, or because a large amount of content is underutilised elsewhere in the organisation. From inception the portal should be planned to provide value to users, and in this context focusing on their requirements will produce a portal users keep using.

Conclusion: Open Source Software Development Tools are becoming mainstream. In the Java space, the number of available tools is mind-boggling, and keeping up with the latest developments is becoming more and more a matter of being well-connected to the Open Source community and receiving tips and suggestions from trusted colleagues about the best and latest tools. It is no longer true that it is sufficient to keep monitoring the developments of the five largest tool vendors. Amongst the best tools some are being produced by small teams and individuals. The use of Open Source tools is also becoming established practice within traditionally conservative organisations in industries such as banking1.

In February 2006 Oracle announced its intention to buy California-based Sleepycat Software, a leading provider of embedded open source database products, for an undisclosed sum. While this acquisition marks another signpost on Oracle's broader acquisition binge, it also signals a deeper and more aggressive strategy to leverage and co-opt the growing open source movement to its advantage.

Conclusion: Within the software engineering community only a few people fully understand the difference between the traditional use of models in software engineering and newer, so called "model-driven", approaches. In particular, the discipline of Enterprise Architecture makes extensive use of modeling techniques, and mainstream practice has not yet caught up with the model-driven approaches that are possible with today's leading edge software tools.

The problem is largely educational, and is compounded by a reluctance to step out of the comfort zone and rise to the challenge of producing precise and unambiguous models that can be used to power a highly automated software production facility. In model-driven approaches, models and model transformations take on the same role as traditional source code – requiring a mind-shift comparable to the one that was required in the transition from assembler programming to modern 3rd generation programming languages.
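
The mind-shift can be illustrated with a deliberately tiny example: a model, precise enough to be executed, is mechanically transformed into source code. The model schema and generated class below are hypothetical, standing in for what a real model-driven tool chain would do at much larger scale.

```python
# A sketch of "models as source code": a small, precise model is
# transformed mechanically into implementation code. The model schema
# is hypothetical.

model = {
    "entity": "Customer",
    "fields": [("name", "str"), ("email", "str"), ("credit_limit", "float")],
}

def transform(entity_model: dict) -> str:
    """Model-to-text transformation: emit a Python class from the model."""
    lines = [f"class {entity_model['entity']}:"]
    params = ", ".join(f"{n}: {t}" for n, t in entity_model["fields"])
    lines.append(f"    def __init__(self, {params}):")
    lines += [f"        self.{n} = {n}" for n, _ in entity_model["fields"]]
    return "\n".join(lines)

print(transform(model))   # the generated artefact is ordinary source code
```

In this world the model, not the generated class, is the asset that is versioned, reviewed and maintained.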

The media in November was replete with reports that Telstra would be replacing and decommissioning approximately 1200 legacy systems. For instance, Mike Sainsbury in The Australian’s business section on 17 November compared the systems to a ‘bowl of spaghetti’, on the assumption that they were entwined and it would take a mammoth job to untangle them.

Conclusion: The rising popularity of online business networking platforms, sometimes also referred to as social software, is the first sign that the traditional CRM paradigm that equates to "one CRM system instance for each organisation" has reached the limits of its usefulness. The players that shape a new, complementary CRM paradigm exploit pervasive use of broadband and wireless internet connections. Many of them are dynamic and small organisations, some still operating in start-up mode. Over the next two years the CRM landscape will undergo significant change, and it may be worthwhile to wait for the dust to settle before investing in expensive CRM solutions that may soon be obsolete.

The growth of utility computing (UC) and utility infrastructure (UI) is both driving and being driven by open source software adoption. Leading utility IT vendors show that open source-based technologies and applications are now being considered or used to fill important product line gaps. At the same time, feedback from our customers indicates that utility infrastructure partially based on open source-based software will deliver more value to the enterprise than would utility infrastructure purely based on proprietary technologies.

Conclusion: A website that is underperforming, certainly in terms of the expectations that an organisation may have had for it, could be called loss-making. Not all websites generate revenue in one way or another but many organisations want their sites to demonstrate returns, whether that is in contact with site users, public relations, or just awareness of the site itself.

There are many tools and techniques to measure, and thereby improve, the performance of a site: one of these is usability. These techniques deal with the execution of site structure and content. As valuable as they may be, if a site is loss-making the root cause may lie in the initial planning, i.e. the site does not match the strategy or the expectations that were in place when it was developed.

Organisations with an underperforming website must adopt business process re-engineering principles: redesign the site and redefine its aims. Return to the strategic plan – if one exists – and reset objectives. Do this planning once the site has been measured and assessed, to obtain a thorough overview of the site’s current performance.

Conclusion: With the increasing sophistication of application software, it seems inconceivable in 2005 for any organisation to have data quality problems. Yet it is a problem that occurs more frequently than many recognise.

Conclusion: Practical experience shows that software development initiatives usually entail high risks for the customer and the software developer. In anticipation of the risks, both parties attempt to mitigate the impact: the customer often insists on a fixed price, and the software developer consequently builds contingency into the fixed price. This simplistic mitigation strategy rarely works. Successful application development requires intensive interaction with end user representatives and stakeholders, and the ability to take into account new insights into user needs, which are gained while the application is being built.

Conclusion: The Federal Government is proud of its achievements in getting community acceptance of e-government. It has connected more citizens and customers over the last 2-3 years, and e-government is proving to be an efficient means of processing transactions and disseminating information.

The Government is on the brink of taking its activities online to a new phase, which will not result in a radical overhaul of current practice, but rather a consolidation of extant systems and delivery.

Yet despite the government’s success there is considerable criticism from SMEs of what it can offer them. The magazine CRN covered this story last year. The SMEs’ dispute with government was concisely summed up by one ICT executive as hinging on the perceived higher financial risk, or lower stability, of an SME. In addition, the level of liability insurance a supplier must carry, as required by government – which for many SMEs inhibits their engagement – means the prospect of winning government business is remote.

In the context of the challenges for SMEs, the imminent reappraisal of e-Government policy should take into account pragmatic methods to permit this sector of the market to compete with larger enterprises. It might do this by setting an objective for SMEs to win a percentage of business within a specified timeframe. This is not a new idea: it has been legislated in the US since 1953 under the Small Business Act, and the ICT sector in France has formed the Richelieu Committee to implement a similar agenda.

Alternatively, government might modify some engagement conditions, depending on circumstances, for certain types of services. In developing policy allowance must be made for the large difference between hardware, software and intellectual consulting engagements; that is, between the risks associated with one service versus another. Such practical measures may improve the competitive field for enterprises competing for government business and be a boon for SMEs who can use government contracts to develop their businesses in other markets and overseas.

Conclusion: It is easy for software development teams to be preoccupied with, and to get lost in low level design. The simplest preventative measure to curb spurious complexity, without being prescriptive at the micro-level, is to consistently make use of a nested subsystem structure within the system architecture. The result is an architecture with fewer point-to-point interfaces. This strategy improves maintainability of code, and it works regardless of the quality of design at the micro-level.
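
The arithmetic behind this strategy is worth making explicit. With n peer subsystems, up to n*(n-1)/2 point-to-point interfaces are possible; grouping subsystems behind nested facades keeps the count far lower. The sketch below compares the two, with the group size chosen arbitrarily for illustration.

```python
# A worked illustration of why nesting curbs interface growth: n peer
# subsystems allow up to n*(n-1)/2 point-to-point interfaces, while
# grouping them behind subsystem facades keeps the count near-linear.
# The group size of 4 is an arbitrary illustrative choice.

def flat_interfaces(n: int) -> int:
    return n * (n - 1) // 2

def nested_interfaces(n: int, group_size: int) -> int:
    groups = -(-n // group_size)                   # ceiling division
    inside = groups * flat_interfaces(group_size)  # links within groups
    across = flat_interfaces(groups)               # links between facades
    return inside + across

for n in (8, 16, 32):
    print(f"{n} subsystems: flat {flat_interfaces(n)}, "
          f"nested {nested_interfaces(n, 4)}")
```

At 32 subsystems the flat structure admits 496 possible interfaces versus 76 for the nested one, which is the maintainability gain the strategy relies on.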

Conclusion: The mid-1990s was a gold rush for ERP vendors. The ERP concept of integrating disparate corporate applications was right for its time and was superbly promoted by major players such as SAP who ran saturation campaigns directed at CXOs. I experienced it first hand, being one of them at the time! The gold rush continued through the decade. Drivers included the market momentum the ERP storm had generated, coupled with corporate anxiety about legacy systems’ likely inability to meet Year 2000 compliance requirements.

Conclusion: In May 2005, Google released Desktop Search for Enterprise, allowing organisations to use Google search without using the public toolbar. The enterprise search market is well served, and there are a number of tools which serve particular functions depending on the requirements of the organisation. Google’s objective is to capture the enterprise search business and consolidate its dominance of the entire search function market.

As the Google Press Release disingenuously phrased the announcement:

“With the addition of Google Desktop Search for Enterprise, businesses of all sizes can offer their employees one-stop Google search for the desktop, intranet, or web.” Google’s aim is to enlist everyone to use its brand and so aggregate every possible eyeball in support of its main advertising business.

One-stop use of a single brand is one of the reasons for the stock market’s sentiment in pushing the Google share price to nearly US$300.

Google’s intention is to encircle individual consumers and businesses so they use Google automatically when they need information. As an information intermediary Google can capture information about users – although it is unclear exactly how – and connect it with businesses.

To avoid reliance on Google, examine how many of its products are used within the organisation. While there is nothing sinister in what Google is offering, managers must ensure security and privacy concerns are assessed and addressed now. Guidelines for employees using Google applications should be clarified to minimise potential risks.

The ongoing war for market share in the server platform market will continue to present vendor-sponsored programs for migration to their favoured operating system platform. While these programs offer real assistance in evaluating new options for your organisation, they require thorough evaluation and a significant allocation of resources. If tempted by a vendor offer of a migration evaluation, it is essential that the scope and limitations of the model the vendor uses to build its conclusions include meaningful measures of business benefit rather than just infrastructure cost savings.

Conclusion: Organisations that take the approach that they are just buying a software product rather than a solution to a business problem are under-estimating the complexities of technological change. Occasionally organisations find it tempting to take this approach at the end of the financial year when they may have a small budget available and a compelling interest in ensuring they achieve perfect budget performance.

PC Virtualisation technology could make it possible to achieve effective desktop lockdown without sacrificing user freedoms. The result could be total cost of ownership savings for enterprises that have struggled to implement full desktop lockdown, as well as clearer definitions of IT responsibilities. This technology can be considered for desktop deployment where PCs are not currently locked down; however, be sure to understand the additional complexity and costs involved before proceeding with testing and implementation.

Conclusion: Organisations should not use old style systems development methodologies when implementing off-the-shelf packages. Package implementations need to tailor business processes to meet the operational characteristics of the selected package. Old style systems development methodologies assume a green field approach where the objective is to tailor the system to meet exact business requirements.

Heavily tailoring an off-the-shelf package will significantly increase ownership costs (by up to 10 times), while reducing the organisation’s ability to adopt future offerings that have been designed to work with the standard package.

Implementing a new package creates an opportunity to improve the way the organisation operates. Trying to fit a new package into old ways of working is an expensive exercise that invariably fails to take advantage of new business opportunities.

Business professionals must be involved in redesigning / changing business processes. As obvious as this sounds it is not hard to find example after example where it isn’t done.

Conclusion: BPM solutions essentially separate the business logic and activity flow from transaction management. The latest generation of software applications operate through two key components, which are:

Conclusion: Contact update services attract users with the promise of instantly updating their contact details with almost no effort. Often free or available at very low cost, their value proposition is the potential to reduce everyday contact management effort. These services effectively distribute the labour of updating contact details from the user to their contacts. The concept is attractive and the service easy to use.

Survey results are now reinforcing anecdotal evidence and supporting vendor marketing claims by showing that, when it comes to choice of an operating system platform for an ERP system, there is still a degree of choice. This is in marked contrast to the reduced choices available in ERP systems themselves. Linux will provide a very viable alternative to the entrenched Unix variants and the ubiquitous Windows. This has to be good for end users – particularly those in the medium size business category.

Conclusion: The arrival of weblogs over the last two years has opened new opportunities for communication, and the blog is a well known marketing device aimed at audiences outside an organisation. The principles of a blog – direct contact and debate – are applied at online message-boards where company executives answer questions and take advice on new software from users.

An intranet blog used, however, as another form of internal public relations, with comments posted by executives aimed at employees, may only serve to uncover the assertive self-promoters within an organisation.

Within an organisation blogs may be used to disseminate information to groups of staff and replace group emails. A blog to share expertise among staff may be more productive and useful because the volume and flow of information in companies is large and an electronic noticeboard in a blog offers a medium to manage the information.

In considering blogs for staff use, it must be clear what medium the blog will replace – to some extent, email – and, consequently, what rules will govern its use. Select a small group to trial its introduction, and use the feedback from that experience to expand the use of blogs to other relevant groups.

Conclusion: Delivering real business improvement in Workforce Automation & Management practices has proven elusive for many organisations. Two principal factors seem to have been at play. Firstly, a piecemeal approach seems to have been taken, with a focus on rostering rather than on the entire process chain. Secondly, the organisational change management effort seems to have been underestimated. With so few opportunities available to businesses to deliver bottom line savings from application software initiatives, it is now timely to revisit this area. Further, increasing safety-awareness in sectors such as mining, construction and transportation has highlighted the need to achieve success with WAM initiatives, in some cases driven by the need to comply with fatigue management standards for rostered staff.

Conclusion: The predicted business software consolidation marches forward. Eighteen months after initial advances, U.S. Department of Justice (DOJ) approval has finally paved the way for Oracle to complete its hostile takeover of PeopleSoft.

Conclusion: Help Desk systems are a mature technology, but capabilities between systems differ widely. Ironically, IT departments are even more likely than other business units to choose systems based on unrealistic and misunderstood business requirements. Often their technology savvy leads them to overestimate the ability of technology to overcome the difficulties of people and process management. In theory, help desk systems are the obvious choice. However, if the motivation behind the implementation of a new system is to "manage" an undisciplined, unhappy or untrained team, then a help desk system is unlikely to help.

Conclusion: The Vision Foundation of Australia estimates that 18% of the population, or nearly four million people, have a disability, and a large percentage of those people have impaired sight. Based on a straw-poll survey of government and private enterprise websites, those disabled people will not, or may not, be able to fully access most sites.

For organisations which have not made their sites compliant with the law, a casual attitude is not acceptable. Some access problems are straightforward to remedy, such as providing text alternatives for graphics. Adaptation to allow for different browsers, as well as voice output and Braille browsers, is also necessary.

Elements of a site, including branding devices such as pack shots and logos, as well as site functionality, may also need to be overhauled, which may mean revising the architecture and/or design so that the objectives of the website remain intact.

Conclusion: Despite increasingly affordable prices, basic workflow management systems are still not widely used in Australian businesses. E-commerce, intranets, websites, corporate email and other desktop automation trends appear to have obscured the basic usefulness and wide application of one of the most powerful technologies brought to market in the last twenty years.

Conclusion: The pay as you grow benefits of Application Service Providers* (ASPs) are finally approaching critical mass in Australia as a result of greater penetration of broadband technologies (cable internet, DSL, satellite and wireless) and more substantive vendor offerings.

Conclusion: Channel strategy has become more complex with a range of options and better means of reaching various markets. The mobile phone is already a powerful tool and with the onset of more applications could emerge as the next, most discrete, channel in relationship management.

Changes will not occur overnight, but managers ought to look ahead to the next twelve months for options that are suitable to their organisation. This channel may be an effective one, replacing mail and leaflets, for both business and government, including local councils.

A plan to consider the mobile as a CRM tool should, at this stage, examine:

  1. Current channels and options, and the cost of delivering them, to see whether they can or ought to be changed.
  2. The aptness of emerging applications and how they might be adapted.

With the market reach of the mobile phone a strategy should be identified soon to take advantage of the channel.

Conclusion: Search engine marketing is increasingly the critical edge of online marketing. With the predominance of Google as the preferred search engine around the world – some estimates assert that up to 80% of all searches are via Google – this single conduit has great power to determine the marketing position of a company. Obtaining the top results in a search has inspired strong competition among Web marketers. As Google is a fixture of online marketing, avoiding or ignoring it altogether is unrealistic.

What makes the problem of Google's ‘gateway’ for Web searches perplexing for managers responsible for the content on the company's website is how Google affects the potential value of other marketing and promotional activities.

Managers can instigate minor but effective modifications to their websites and tactical promotions in the following two ways:

1. Change the site so that it is receptive to Google's criteria.
2. Re-examine, and if necessary change, the links and connections with other sites so that they boost the popularity of your own site.

These small changes may help improve website rankings and produce a marginal improvement in overall return from online marketing activity by attracting greater numbers to your site.

In most businesses, regardless of size or industry, formal business continuity and/or disaster recovery planning is consistently under-funded and generally neglected by management. The business risks associated with this attitude can be very high, but are not understood. Those plans that are in place often simply don’t work. This is not surprising, since disaster recovery hasn’t been given sufficient consideration, meaning that plans are rarely tested (if ever) and equally rarely updated to reflect changes in process, technology or applications. In an emergency, there are many continuity requirements within an organisation’s business and services, covering processes, facilities, and personnel. IT and a range of business units across the whole organisation must work together, both in planning for continuity and in its execution.

Conclusion: The failure of the two main parties to use their websites to reach voters, as indicated by visitor levels just prior to the ballot, suggests important lessons for organisations marketing online.

The first is that content and information cannot retain audience interest, especially if the content is static and unchanged over several months.

The second lesson is that interactive communication between an organisation and its audience through a website can be a strong tactic to galvanise a market. Evidence from the US demonstrates that the effective use of the Web, in conjunction with the mainstream media, builds momentum.

Many organisations’ websites do not change over long periods, and the value of the site to the organisation, and to the marketplace, drops. Refreshed content, promotions, or other gimmicks may not be right, but techniques to reach and gather individuals, such as an online conference or chat room, can give renewed purpose to an otherwise static website.

IM represents a new tool for business communication and collaboration – additional to the traditional forms of communication, particularly e-mail and voice. Paradoxically, IM is the personal communication of the impersonal digital world, and as such it requires its own unique set of corporate IM guidelines.

Conclusion: SMS has proved more versatile and effective in business-related communications than simply a means of chatting with text. By reducing costs and simplifying the process of communication, SMS is proving to be effective for firms dealing with their suppliers, customers and staff.

Firms of all sizes – and Government departments – can likewise benefit from SMS in two key ways. Firstly, it is a marketing communication channel which can be used for product promotions and secondly, it has proved its worth as an operational communications tool which can be used for channel management within an organisation.

Experience shows firms can cut costs and increase efficiency by using SMS to deliver timely and useful information to stakeholders. Having said this, it is important that the ease with which messages can be delivered should not be equated with permission to flood mobile phones with frequent and irrelevant messages.

You are the only person in the elevator on the ground floor at 7.15am. Just as the doors are about to close, you see the CEO hurrying to catch the elevator and press the ‘Open Doors’ button so she can join you. She says breathlessly, “Thanks. Our meeting is timely. I read your report on our business information management dilemma last evening and raised it with the Chairman before I left. He just contacted me on the mobile phone and said he wants to talk about it when I get out on the 20th floor.”

There are about 1.24 million Freephone and Local Rate Numbers (FLRNs) not yet released by the Australian Communications Authority (ACA). FLRNs are the 1800, 13 or 1300 numbers and are necessary marketing channels for many organisations. An initiative in the May Federal Budget will make access to FLRNs more equitable, which is a boon for businesses as demand for these numbers is increasing.

Conclusion: Despite the recent hype over search engine marketing the basic elements remain simple. They are:

  1. Ensure the listing of the website is clear, complete and comprehensive;

  2. Review it in 6 months, just as all marketing channel investment is periodically reviewed;

  3. Don’t spend more time and effort than is due to the task relative to the investment committed.

With these three guidelines the quality of the listing on search engines ought to be competitive.

Conclusion: The two components of any online advertising campaign to get right are the communication objectives and the exposure frequency; the latter determines the number of times the audience will be exposed to the ads.

The communication objectives comprise the market or sales position of the brand and, in tandem with levels of frequency, are the two major variables in accomplishing any advertising objective.

In many campaigns frequency is ignored, both in general and as it relates to particular creative executions. Media planning must integrate both facets, each complementing the other, to maximise the marketing investment.
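
The underlying arithmetic is simple, as the following sketch shows: required impressions follow from the audience to be reached and the frequency the creative needs, and budget follows from the impressions. All figures are hypothetical.

```python
# A sketch of basic reach-and-frequency arithmetic: the impressions an
# online campaign must buy follow from the audience to be reached and
# the exposure frequency needed for the creative to work.
# All figures are hypothetical.

target_audience = 500_000   # people the campaign must reach
effective_frequency = 4     # exposures needed per person
cpm = 12.50                 # cost per thousand impressions, in dollars

impressions = target_audience * effective_frequency
budget = impressions / 1000 * cpm
print(f"{impressions:,} impressions, costing ${budget:,.0f}")
```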

Conclusion: Investment attraction is the main business driver of local government Smart City projects and planning, followed by automation and internal productivity improvement.

Trophy Smart City projects based on entirely new cities are rare, but new towns, city centres, technology parks, recreation precincts and showcase suburbs are common and benefit from the same principles.

Every existing municipal service should be reviewed as a candidate for support and improvement using digital techniques.

Current and emerging technologies can routinely deliver Smart City services such as smart waste management, parking, transport, street lighting and facilitating community formation. Imagination is initially the resource in shortest supply.

The Mayor’s support for Smart City projects and programs is essential (because of their novelty and the political courage required) in any region of the world. Always.