
of use that once froze the company’s whole computer network (cited by Macken, 1999). In 2002, it was revealed that one-quarter of all sackings in UK companies resulted from employees surfing the Internet without permission, and nearly 70 per cent of these were for accessing pornographic websites (AFP, 2002b).

The uptake and spread of new technologies at work is irreversible. However, few researchers have yet got to grips with what their impact on organizational leadership and people management is likely to be over the next two decades. Nevertheless, we can begin to see the emergence of a new paradigm of leadership and people management that will be able, conceptually and practically, to address the effects of these new technologies on employees and organizations. This has profound implications for traditional approaches to leadership and people management, which are still heavily influenced by a second wave mode of thinking: a mindset whose fundamental assumptions have changed little since the 1950s, and which is desperately trying to play catch-up with the profound changes now occurring in organizations.

Despite all the hype about virtual organizations, it is important to emphasize that successful people management, even in high-tech and virtual organizations, will continue to be based on many leadership practices that have stood the test of time, although there will inevitably be a greater emphasis on self-management, self-organization and empowerment among employees in these kinds of organizations. So, while the context in which leadership takes place will continue to evolve, many of the basic principles of people management remain the same, and will do so for some time yet. This is because leadership is fundamentally about showing people a road, way or path to travel down and, in fast-changing and uncertain times, this capability becomes even more important and highly valued. Having said this, there is little doubt that new technologies will not only continue to accelerate the pace of change in organizations, but will also start to do more ‘thinking’ and ‘managing’ for us in the very near future.

With these thoughts in mind, here are a few tips for helping your organization and its employees get the most out of current and emergent technologies. First, human beings are essentially highly evolved apes, with a lot of high-tech tools at their disposal. Almost all humans are genetically hard-wired to be social and to interact with others face-to-face. Hence they still need a combination of high-tech and high-touch leadership/management if they are to retain a sense of identity with the organizations that employ them, and it is still extremely difficult to get team collaboration going in virtual environments. For the foreseeable future, employees will still need physical interaction with each other, and all organizational leaders need to ensure that they have opportunities to do this, even in high-tech or virtual organizations.

Second, the dotcom collapse of April 2000 showed us that technology alone cannot create a successful business. Any new enterprise must have a clear vision, a great business plan, and a well thought-out commercial strategy, before thinking about its technology requirements. For the moment, the Internet, e-commerce and virtual reality are just evolutionary add-ons and portals to the way that business has been done for decades. When all the hyperbole over the Web and the Internet is taken away, we can see that their primary job is to assist in the delivery of the right product/service/content to customers at the right price and in good time – nothing more or less. All current Web/Internet technologies are essentially passive, and it still requires creative and innovative employees to make the best use of these. Companies that derive the most benefit from new technologies understand this important principle, and do not rely on technology to solve basic business problems. As Jim Collins has observed, ‘Good-to-great companies think differently about technology. They never use technology as the primary means of igniting a transformation. Yet, paradoxically, they are pioneers in the application of carefully selected technologies. We learnt that technology by itself is never a primary root cause of either greatness or decline’ (Collins, 2001: 14).

Third, don’t believe the hype. Critically evaluate whether you really need the latest technological gizmo. Will it really add value to your business or organization? For example, one survey revealed that more than 500 million PCs used by organizations around the world in June 2002 had less than one-third of the computing power available in the most up-to-date models of that time. During 2000–2003, many organizations took strategic decisions not to upgrade their PC systems until it was clear that there would be productivity advantages to be gained by doing so. For the first time since 1985, sales of PCs fell by 4 per cent (compared with an average increase of 15 per cent a year from 1985 to 2000). World-wide, many companies expressed scepticism about the purported productivity gains to be had from upgrading their computer systems (cited by Gottliebsen, 2002b). In this context, it’s worth remembering the old adage, ‘Buy in haste – repent at leisure’.

Fourth, use only those technologies that enhance your core competencies and businesses. If non-core functions can be outsourced more cheaply and effectively, then use that option. This has the added benefit of allowing you to focus on bringing emergent technologies into a smaller number of core operational and/or business and/or service areas. According to Gartner, business process outsourcing is expected to grow at an average rate of 12.3 per cent a year up to 2010, faster than overall IT outsourcing, as more companies seek ongoing cost reductions while focusing their energies on developing their core businesses and their ability to cope with fast-changing market conditions (The Australian Special Edition on Outsourcing, 2002: 2). Having said this, it’s important to note that the business landscape of the USA, Europe and Australasia is littered with failed private and public sector ‘whole-of-organization’ IT outsourcing deals, characterized by huge cost overruns and systems that have routinely failed to live up to suppliers’ hyperbole.

Fifth, many businesses remained locked into inappropriate and expensive contracts during the 1990s and early 2000s and, because they had sacked their own IT people in the name of cost savings in the mid-to-late 1990s, lost the capability to take back control of their IT operations. The market for new large-scale outsourcing projects ground to a halt in early 2002, and the trend today is towards servicing IT needs not through a single source but via multiple partners (also known as ‘selective sourcing’). Through this, organizations can choose which parts of their knowledge and information infrastructures to outsource and which core IT assets to keep in-house, based on careful assessments of their current business requirements. Many big outsourcing providers, such as EDS, IBM and HP, now offer a much broader range of individual services in specific areas such as desktop management, knowledge management software, network support, mid-range management, application maintenance and mainframe management (Riley, 2003).

Sixth, the second wave of e-commerce is rapidly becoming the universal and essential method of reaching customers and clients, either through direct relationships or through third-party transactions. If your company has still not embraced the Internet, its future competitiveness will suffer and it will be seriously disadvantaged in its market. Furthermore, global virtual networking requires a holistic overview of all business processes in organizations, not just an ad hoc or patchwork approach. In turn, this must be driven by the organization’s vision, goals, strategy and culture, never the other way round. All electronic communication systems must be embedded within knowledge management systems that encourage employees to make effective use of them (as described in Chapter 10).


Time is not on our side. Technology, in both its hard and soft forms, has already brought unparalleled flexibility to the workplace, and will continue to change the way that employees work and to increase their productivity levels. While the IT sector was in a slump during 2001–3, there were indications that it would bounce back by 2005. This means that all business leaders must remain techno-savvy and up-to-date with developments in emergent technologies and, of equal importance, keep one eye on what may be coming onto the market two or three years ahead. Last, be prepared for the arrival of even more radical technologies that will not only continue to revolutionize our professions and organizations, but may well shape the next stage of human evolution in our grandchildren’s lifetimes. These are described in the next section.

The companies that spend the most on technology are not winning. The companies that are spending smartest on technology are. There has been a lot of technology investment that has been squandered, ad hoc and random, and has not delivered results. [However], companies across the world that are faster at what they do are typically more successful. They are gaining market share, they are more profitable and they are growing faster. We are talking fast not just in technology, but also in management decisions, how fast you hire people and how fast you can train people. You really need to drive these changes through business, and this is the catalyst for getting the most out of technology.

(Bob Hayward, Senior Vice-President of the Gartner Group (Asia-Pacific and Japan), in a talk delivered at The Need for Speed: Driving the Real Time Enterprise Conference, Sydney, 12 November; abridged from Foreshew, 2002)

The future: the potential impact of emergent technologies on humanity during the 21st century

Forecasting is very difficult, especially if it’s about the future. He who lives by the crystal ball soon learns to eat ground glass. (Edgar Fiedler, 1993)

The further backward you look, the further forward you can see. (Winston Churchill, 1945)

I never think about the future. It comes soon enough. (Albert Einstein, 1940)

While we may all be unsure about the immediate future, our predecessors at the dawn of the 20th century also encountered what, to them, was an equally strange, uncertain and fast-changing new world, with the arrival of the telephone, electricity, the telegraph, flying machines, modern vaccines, the automobile, the radio and many other new inventions and innovations. Then, as now, they were also curious about what the world would look like in the future. In 1899, a group of Victorian futurologists gathered to speculate about what life would be like in 2000. They got a few things right, but they were way off the mark with many of their predictions. They thought that there would be flying cars, armies of robots to take care of all our needs, time travel and teleportation. They also believed that work would become obsolete and aging a thing of the past. They predicted that there would be a boom in the popularity of leisure activities, such as air-tennis and underwater fish racing. They failed to foresee the arrival of X-rays, radar, television, nuclear energy, transistors, lasers and computers. In 1902, even the visionary science fiction writer H.G. Wells was mistaken in predicting that heavier-than-air flying machines would be possible ‘in about 1950’ and that submarines would never be able to do more than suffocate their crews. A few commentators of the time did get things right. For example, John Maynard Keynes, perhaps anticipating the globalization of consumer markets, made these comments in 1900: ‘The inhabitant of London could order by telephone, sipping his morning tea in bed, the various products of the whole earth and by the same means adventure his wealth in the natural resources and new enterprises of any quarter of the world without exertion or even trouble’ (Margolis, 2001: 118).

What about these earlier predictions from Joseph Glanvill, a philosopher–theologian and chaplain to Charles II, in the 17th century? ‘To them that come after us it may be as ordinary to put on a pair of wings to fly to the remotest regions, as now a pair of boots to ride a journey; and to confer at the distance of the Indies by sympathetic conveyances, may be as usual in the future as literary correspondence. It may be that in ages hence, a voyage to the Southern tracts, yea possibly to the moon, will not be more strange than to America. The restoration of grey hairs to juvenility and the renewing of exhausted marrow may at length be elicited without a miracle’ (Margolis, 2001: 47). Perhaps the most prescient of all the futurecasters of the late 19th century was John Elfreth Watkins, writing in the Ladies Home Journal of 1900. Having consulted the most learned experts of the day, he predicted, amongst other things, the arrival of international telephone services, colour photography, frozen dinners, school gyms, snowmobiles, the tapping of energy from the wind, the sun and ocean waves, and medicine applied through skin patches (Margolis, 2001). More recently, Gordon Moore (the creator of ‘Moore’s Law’) predicted that ‘integrated circuits will lead to such wonders as home computers, automatic controls for automobiles, and personal portable communications equipment’. He made these predictions in 1965 (cited by Schlender, 2002: 55).

While predicting the future can be an unreliable business, there are two assertions that can be made with confidence. First, as noted earlier, the pace of technological innovation is speeding up year by year. Second, the impact of emergent technologies in the next 100 years will far exceed anything that the human race has experienced up to this point in time. For example, you may have seen the 1997 sci-fi movie Gattaca. In this, the principal character, played by Ethan Hawke, is condemned to live his life as one of a genetic under-caste of ‘invalids’. Born into a society where genetic engineering is the norm, and physical appearance, intelligence and personality can all be genetically enhanced before birth, he finds himself playing an increasingly dangerous game of deception in order to avoid detection by the police as he pursues his dream of becoming an astronaut. To achieve this, he is forced to use the skin, blood, hair cells and urine of a ‘valid’ who was crippled in an accident. While the story has a happy ending, Gattaca symbolizes the unease that some people feel about the wild roller-coaster pace of biotechnological evolution we have now embarked upon (as have other sci-fi films such as I, Robot, Minority Report, The Matrix series and AI: Artificial Intelligence). In a similar vein, Matthew Reilly’s fourth best-seller, Area 7, featured a plot that revolved around the defeat of a renegade general who had gained control of a genetically engineered virus that could be used to eliminate specific racial groups. It is now possible to create such a virus. And, in 2002, Michael Crichton’s blockbuster, Prey, featured the nightmare scenario of swarms of artificial nano-organisms escaping from an isolated laboratory, evolving uncontrollably by the minute and turning themselves into replicas of their creators. Those humans they didn’t use, they consumed as food. In the introduction to the book, Crichton suggests that, unless we establish strict controls over these technologies, this could happen some time during the 21st century (Crichton, 2002).

The most surprising fact about the futuristic communication technologies unveiled in Minority Report (set in 2054) is that not one was the product of Steven Spielberg’s imagination. All were extrapolations from technological developments under way in research laboratories around the world. Researchers are already working on augmented reality systems, where computer images can be projected onto transparent wrap-around glass and data and objects can be ‘moved’ by hand. They are working on computers that can understand and translate speech, smell odours and taste substances, feel textures, interpret human gestures, sense human emotions and, maybe, even understand our thoughts. For example, haptic (touch) technologies are routinely being built into new cars. Instead of a steering wheel and a dashboard covered with dials and switches, the 2003 BMW Seven Series prototype featured a universal controller in the form of a joystick, and all commands were voice-activated (for example, ‘Fan on cool’; ‘Windscreen wiper on slow’; ‘Left indicator on’).
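To make the idea concrete, here is a minimal sketch of how spoken commands like those quoted above might be mapped to vehicle functions in software. It is an illustration only: the phrases come from the examples above, but the handler names and dispatch logic are assumptions, not BMW’s actual command system.

```python
# Minimal sketch of a voice-command dispatcher for in-car controls.
# The phrases and handler names are illustrative only; they are not
# BMW's actual command set or API.

def set_fan(level: str) -> str:
    return f"Fan set to {level}"

def set_wipers(speed: str) -> str:
    return f"Windscreen wipers set to {speed}"

def set_indicator(side: str) -> str:
    return f"{side.capitalize()} indicator on"

# Map recognized phrases to (handler, argument) pairs.
COMMANDS = {
    "fan on cool": (set_fan, "cool"),
    "windscreen wiper on slow": (set_wipers, "slow"),
    "left indicator on": (set_indicator, "left"),
}

def handle_command(phrase: str) -> str:
    """Dispatch a recognized voice phrase to the matching vehicle function."""
    key = phrase.strip().lower()
    if key not in COMMANDS:
        return f"Unrecognized command: {phrase!r}"
    handler, argument = COMMANDS[key]
    return handler(argument)

if __name__ == "__main__":
    for phrase in ["Fan on cool", "Windscreen wiper on slow", "Left indicator on"]:
        print(handle_command(phrase))
```

In a real system the dictionary lookup would be replaced by a speech-recognition front end, but the basic pattern of mapping recognized phrases to discrete vehicle functions is the same.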


The scenario presented in Gattaca, too, may well become a scientific reality in the near future. The human genome has been mapped: not only the 130 000 genes that make up a human being, but the sequences of their constituent parts. Significantly, it was originally anticipated in the early 1980s that this would not be achieved until 2050. That estimate was revised to 2010 in 1999, and then to 2005 in 2000; in the event, the genome was mapped almost 50 years ahead of the original schedule. What began with the ‘discovery’ of DNA by Crick and Watson in the 1950s, the first successful in-vitro fertilization two decades later and the cloning of Dolly the sheep in 1997 is now speeding along a seemingly unstoppable path towards pre-conception implantation, genetic screening, the manipulation of human embryos and, possibly, full-scale human cloning (Stock, 2002). As a result, we may be able to slow down and perhaps even halt aging, manipulate intelligence, personality attributes, height, physical appearance, and musical and creative abilities, and even create ‘designer children’.

It is quite possible that future employment may depend on the willingness of employees to undergo ‘enhancements’ that will increase their brains’ processing speed or their memory capacities (Stock, 2002; Kurzweil, 1999). Companies in the USA have been making promotion, hiring and firing decisions on the basis of appearance, fitness, health and personal lifestyles for nearly a decade. This selection process may extend even further in the not too distant future. If genetic information is not protected, employers could use it to make hiring, firing and promotion decisions. Insurers may discriminate against people, or refuse them insurance policies, on the basis of their genetic profiles. In February 2000, then US President Bill Clinton signed a five-year executive order to prevent US government agencies from using genetic information in hiring and promotion decisions for the federal government’s 2.8 million employees nationwide (cited in The Weekend Australian, 12–13 February 2002). From July 2001, the US Copyright Agency began to offer a DNA Protection Service to high-flyers and celebrities, because any body part, even a single hair, could potentially be used to clone a human being.

However, there have been some early setbacks in bioengineering. It was announced on 29 April 2002 that all of the world’s cloned animals were suffering from genetic and physical defects, indicating that cloned humans could also be vulnerable to such defects. Matilda, the first sheep cloned in Australia (in April 2000), died prematurely in early February 2003, and Dolly, the most famous of these cloned animals, developed advanced arthritis at five years of age and was put down on 14 February 2003. While human cloning was banned in most industrialized countries during 2001–2, many scientists working in this field believe that biotechnology is a juggernaut that neither governments nor religious authorities will be able to halt. For example, if you could make use of a safe technology to enhance your children’s physical prowess, intelligence, height, appearance, well-being and happiness, and ensure that they had long, productive and healthy lives, would you use it? Put this way, many of the ethical reservations that people have about genetic engineering may be trampled underfoot in the stampede to access these technologies.

The increasing and enthusiastic uptake of injections to remove wrinkles, the growing popularity of cosmetic surgery, and the use of mood-enhancing drugs such as Prozac and Ritalin all indicate that we will embrace rather than reject the promises of genetic engineering. However, a technology with the potential to reshape what we are as human beings also has potentially malign consequences that we cannot even begin to imagine. And hanging over these debates is the ugly spectre of eugenics, taken to nightmare heights by the Nazis during World War II. These fears were heightened when it was announced on 28 November 2002 that the first cloned human was going to be born in January 2003 (created by the maverick Italian embryologist, Severino Antinori). As events transpired, Antinori was ‘beaten’ by the religious cult Clonaid, which claimed that it had produced the first cloned human, a girl named Eve, on 26 December 2002. While neither of these claims was ever verified, it was confirmed that Korean scientists had cloned human embryos for the first time on 14 February 2004 (Dayton, 2004; Hickman and Karvelas, 2002).

Alongside these biotechnologies, at least 12 American companies, including Johnson & Johnson, Pfizer, Merck and Glaxo-SmithKline, were in the final stages of developing memory-enhancing drugs in 2003, meaning that memory loss could become a thing of the past in the near future. Dubbed ‘viagra for the brain’, these drugs are designed to slow down or even halt memory loss in middle-aged and elderly people. By identifying memory genes, it is now possible to target and strengthen specific neural connections, thereby enhancing their longevity and reducing memory loss. With the rapid aging of the populations of all industrial nations during the first half of the 21st century, the companies that can develop these drugs stand to make billions. In the USA alone, more than 76 million people complain of forgetfulness. At the time, experts also predicted that designer drugs could be developed to enhance good memories and block out bad ones, reminiscent of the mood-altering drug ‘soma’ in Aldous Huxley’s visionary 1930s sci-fi novel Brave New World (Winnett, 2002).

In addition to these developments, we are now on the threshold of even more powerful computing technologies, which may lead to the creation of the first artificially intelligent entities (‘artilects’). Bob Clark, Professor of Experimental Physics at the University of New South Wales (Australia), has predicted that the world’s first quantum computer could be up and running by the end of this decade. A quantum computer will be 100 million times faster at processing information than the most powerful of the current generation of supercomputers. These developments will enable second-generation self-learning entities to be created within ten years, as they begin to match the processing power of the 23 billion neurons in the human brain. In the future, people will be able to delegate more mundane tasks to these intelligent machines, which will be able to use their ‘initiative’, offer suggestions and make decisions. They will also be capable of interpreting and responding to human emotions. Emotionally intelligent computers have been in development at MIT’s Media Lab and by the Siemens Human–Machine Research Group since the late 1990s. The MIT Media Lab has already succeeded in creating a machine that can sense human emotions (Kurzweil, 1999; The Sunday Times, UK, website, 24 November 1998).

Computers will evolve to an even higher level of complexity and sophistication, as the age-old distinction between technological and biological systems starts to disappear, and both start to operate in tandem at the molecular level. A second-generation artilect, the cellular automata machine (CAM), with circuitry based on ten billion neurons, may be built by 2007. A third-generation CAM with a trillion neurons could take only a few more years to construct. A brain-building machine constructed by Genobyte in the USA has been making the world’s first neural circuits for an artificial brain since 2001. This machine can embed thousands of microscopic modules of artificial neurons on silicon chips. These are the electronic equivalent of the neural networks that control our brains and body functions. In a Darwinian-like process, the bad ones are discarded but the efficient ones thrive and are linked to other promising modules. This occurs at astonishing speeds, far faster than random biological evolution, with tens of thousands of circuits growing and dying in less than a second. Scientists at Cornell University and Harvard University in the USA have also created the first transistor made from a single atom. In theory, this means that a computer could be built that would fit on the full stop at the end of this sentence (Henderson, 2002b; Devine, 2000).
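The ‘Darwinian-like process’ described here is, in essence, an evolutionary algorithm: score candidate modules, discard the weak, and recombine the strong. The sketch below illustrates that general principle on a toy problem; it is not Genobyte’s actual circuit-growing system, and the population size, mutation rate and fitness measure are arbitrary assumptions.

```python
# Toy evolutionary loop illustrating the selection principle described
# above: score candidate 'modules', discard the weak, keep and recombine
# the strong. A generic sketch only, not Genobyte's actual system.
import random

MODULE_SIZE = 16     # bits per candidate module (arbitrary)
POPULATION = 50      # candidate modules per generation (arbitrary)
GENERATIONS = 30

def random_module() -> list[int]:
    return [random.randint(0, 1) for _ in range(MODULE_SIZE)]

def fitness(module: list[int]) -> int:
    # Toy fitness: the count of 1-bits stands in for 'circuit efficiency'.
    return sum(module)

def crossover(a: list[int], b: list[int]) -> list[int]:
    # Link two promising modules by splicing them at a random point.
    cut = random.randint(1, MODULE_SIZE - 1)
    return a[:cut] + b[cut:]

def mutate(module: list[int], rate: float = 0.02) -> list[int]:
    # Occasionally flip a bit, mimicking random variation.
    return [bit ^ 1 if random.random() < rate else bit for bit in module]

population = [random_module() for _ in range(POPULATION)]
for generation in range(GENERATIONS):
    # Keep the most efficient half; discard the rest.
    survivors = sorted(population, key=fitness, reverse=True)[: POPULATION // 2]
    # Refill the population by recombining and mutating survivors.
    children = [
        mutate(crossover(random.choice(survivors), random.choice(survivors)))
        for _ in range(POPULATION - len(survivors))
    ]
    population = survivors + children

print("Best fitness after evolution:", fitness(max(population, key=fitness)))
```

The point of the sketch is simply that selection plus recombination drives the population towards better-scoring modules far faster than random search, which is the principle the hardware described above exploits at much greater speed and scale.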

‘Knowbots’ are being developed. These too are self-learning entities, whose processing systems are based on biological neural networks linked to quantum computing systems built around chips cooled to –269°C (4 degrees above absolute zero), which will enable these entities to store information on single atoms. In 1998, it was announced in the UK that British scientists had taken the first real steps towards creating an artificial nervous system that will lead to self-reliant, thinking robots. These are being built around electronic neural processors, constructed from sodium and potassium ion channels, similar to those in the human brain. We also have a new generation of ‘neuromorphic engineers’ who are now replicating brain structures on analog-based (that is, self-learning) systems. On 2 February 1999, Dr Craig Venter at the University of Pennsylvania in the USA announced the advent of the first truly artificial organism. This was soon followed by an announcement on 24 January 2000 that scientists at the University of Texas had made the world’s first synthetic DNA. This means that the world’s first artificial life forms may be created soon and, eventually, may lead to the emergence of ‘Chromo Sapiens’ (see below).

The next stage of development is to further miniaturize computer hardware through the use of nanotechnologies (machines built of individual atoms) which, until very recently, were considered to be in the realm of science fiction. Anything with dimensions of less than 100 nanometres (that is, as small as a flu virus and 1000 times smaller than the width of a human hair) is considered to be nanotechnology (Takahashi, 2002). Under the umbrella of the US National Nanotechnology Initiative, more than 200 US companies are currently involved in nanotechnology research. In the second half of 2003, Intel started manufacturing chips with transistors just 90 nanometres (or 90 billionths of a metre) in width. Combined with new materials, such as silicon germanium, this will lead to the development of nanospheres, nanowires, nanorods and other nanostructures. These will make possible the creation of precise atomic arrangements for smaller, faster and smarter semiconductors, computers and many other electronic devices. In the future, molecular-sized nano-machines may even be programmed to assemble machines atom by atom, creating micro-electromechanical systems (MEMS). The potential uses of MEMS are infinite (Kurzweil, 1999).
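As a quick check of the scales quoted above (the figure of roughly 100 micrometres for the width of a human hair is an assumed typical value, not one taken from the text):

```python
# Quick arithmetic check of the nanoscale figures quoted above.
# The 100-micrometre hair width is an assumed typical value.
NM_PER_METRE = 1_000_000_000

nano_threshold_nm = 100        # the 'nanotechnology' threshold quoted above
intel_transistor_nm = 90       # Intel's 2003 transistor width
hair_width_nm = 100 * 1_000    # ~100 micrometres, expressed in nanometres

print(f"90 nm as a fraction of a metre: {intel_transistor_nm / NM_PER_METRE:.0e}")
print(f"Hair width / 100 nm threshold: {hair_width_nm / nano_threshold_nm:.0f}x")
```

Running this confirms that 90 nanometres is indeed 9 × 10⁻⁸ of a metre (90 billionths), and that a 100-micrometre hair is about 1000 times wider than the 100-nanometre threshold.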

Another innovative field of research and development, biomimetics, has emerged, which mimics natural animal and plant systems at the molecular level, resulting in the creation of novel advanced structures, materials and nano-devices. Nano-sized materials are being developed for application in polymers, pharmaceuticals, drug-delivery systems, cosmetics, sunscreens, paint, inks and textiles (reported in The Australian, IT Section, 1 October 2002). With the aid of a $US50 million grant from the US Army, the Institute for Soldier Nanotechnologies (ISN) at the Massachusetts Institute of Technology has been developing smart uniforms engineered at the molecular level. These combine new materials, such as MIThril (a wordplay on the magical armour used by Frodo Baggins in The Lord of the Rings) to protect soldiers from bullets or biological and chemical agents, and