The standardized PC served its purpose, but IT changes have rendered the approach obsolete

26.01.2015
I work in an international organization with around 10,000 employees and offices in nearly 100 countries. We started rolling out PCs to our staff when they first emerged as corporate tools in the early 1990s. As in many organizations, over time there was a drive to standardize those tools, for several reasons:

- PC hardware varied wildly by location. Purchasing a PC in Africa would cost two to three times as much as in Europe. We therefore began purchasing PCs at our Italy-based headquarters and shipping them to the field.

- Configuring PCs was far from plug and play, and most configuration was done via scripting languages that required specialist knowledge.

- The emergence of corporate software required specific operating system configurations. WordPerfect was the first such package in our organization, and it was quickly followed by a stream of other in-house or shrink-wrapped packages.

- The emergence of computer viruses created the need for a controlled environment to protect against them.

Through this standardization, IT has also been able to reduce costs and realize genuine efficiencies. But it also led to the IT mindset that a good PC is a standardized, locked-down machine, and that the role of IT is to guarantee that all applications work all of the time. Conversely, non-standard PCs and non-standard applications are risks to be avoided at all costs.

This mindset has been ingrained for nearly two decades, but it needs to be challenged today because major changes in IT have fundamentally rendered this paradigm obsolete.

From a technology perspective, the Web browser is now the de facto platform for delivering corporate applications, and today it is the only feasible way of delivering cloud-based applications. Similarly, applications that used to be installed locally on each PC, such as Microsoft and Adobe productivity products, are now moving to the cloud. These changes have broken the long-standing critical dependency on the underlying operating system.

And then there is the people perspective. The workplace is beginning to fill with tech-savvy people who are familiar with IT and have their own ways of working, often driven by social networks. These people are capable of using IT effectively if they are allowed to. They are also used to having multiple IT devices, from MacBooks through to ever-more capable tablets and smartphones, and they want to use these devices both for work and in their private lives.

To cope with these new realities we have made some changes: we introduced a formal BYOD policy for mobile phones last year, principally as a cost-saving measure, and we have recently enabled Wi-Fi access for personal devices.

While the latter has been helpful in enabling access from a wider range of devices, it still ignores the elephant in the room: the 10,000 traditional desktop PCs that are now considered second-class devices, running outdated versions of software (and usually slowly), and limiting user productivity because they are locked down.

This is where we need the paradigm shift. Here's how we hope to make it happen:

- We will set up the legal and administrative framework to encourage staff and consultants to use their preferred devices for work.

- We will set up a virtualized desktop infrastructure, but in the long term this is intended only as a fallback option for those who are unwilling or unable to use their preferred devices.

- We will change the way we build our information systems to make them as platform-independent as possible, with a particular emphasis on usability across different device types.

- We will change the focus of information security: instead of attempting to protect every device in every location equally, we will concentrate on protecting the core corporate applications.

To move this forward, we have set up a pilot to roll out in the IT Division in the first quarter of 2015, purchasing hardware and licenses to expand an existing virtualized application infrastructure. The intention is then, in the second quarter, to seek funding for start-up costs and to flag expected areas of savings. By 2016, we aim to be well under way in implementing this change.

As the first major United Nations agency to launch such an initiative, we will look to share our findings with our sister agencies in the United Nations system, which may then be interested in observing our progress or launching similar initiatives of their own.

Paul Whimpenny is the Senior Officer for IT Architecture in the IT Division of the Food and Agriculture Organization of the United Nations (FAO). The views expressed here are those of the author and do not necessarily reflect those of FAO.

(www.networkworld.com)
