
English for Electrical Engineers


Before you read text 2

1.Charles McCabe once said: «Any clod can have the facts, but having opinions is an art». Discuss the pros and cons of this statement.

2.Now read the following text and see whether its author is of the same opinion. What other problems are discussed in this article?

Text 2.WHAT CAN COMPUTERS DO?

Neville Holmes, University of Tasmania

Marty Leisner answers his own question «Do Computers Make Us Fools?» with the statement: «It seems that computers make people incapable of independent thought». On the other hand, he concludes that «reliance on them … might make us fools», and this, together with many of his other comments, answers quite a different question and answers it well. But it seems to me that neither question is the basic question.

So what is the real question? What is the basic problem? The context is that computers are seen as underpinning social change. The mistake is that computers are seen as causing social change. Let me illustrate one relevant social change.

Computer as Scapegoat

In 1970 I returned to Australia after living for a while in the Hudson River Valley in America, where there was a fairly widespread use of computers. The state of New York had a very simple and effective drivers' license system based on stub cards, which required only that you send back the stub with your payment each year; the remainder of the card was your license.

When I went to get a license in Canberra, I was given a three-part form. The form not only asked for many more personal details than New York ever required, it required them to be written three times. When I mildly criticized the form design at the counter, I was solemnly informed that the design was as it was because of The Computer. I left it at that, but my later inquiries revealed that the department had neither a computer nor any plans to get one.

This incident alerted me to the most important social role of the computer, then as now: universal scapegoat. I have seen nothing since to change my mind on this, and indeed I have seen much to confirm it. The social change here is that people seem to be eager to use computers to avoid personal responsibility. Computers are being used to replace personal values with impersonal ones, like the ultimate abstraction — money.

Computer as Tool

Computers are merely tools. They are not members of society; they are not even pseudomembers, like corporations and governments. They are not independent agents. Like cars and telephones, they only do things if and when someone uses them. They can neither be blamed for what they do (are used for), nor can they be given credit for what they do (are used for). If there is blame or credit then it belongs to the users, or to the owners, or to the designers, or to the manufacturers, or to the researchers, or to the financiers, never to the computer itself.

Computers cannot make us fools — they can only allow us to be foolish faster. And they can be used by others to make fools of us, for profit or power.

This is not understood by everyone because the computer industry and the computing profession seem to be saying otherwise. We seem to be saying that computers are like people; that they have memory, intelligence, understanding, and knowledge; that they are even friendly. How ignorant! How impressive! How profitable!


Attitudes to Computers

Those in the industry who warned against anthropomorphic language have been ignored. The people who put together the first standard vocabularies for the industry urged people to call the devices where data are put «stores» or «storage», not «memories». To suggest there is any likeness between the computer storage and the memories a human might reconstruct is farcical, if not insulting.

Those in the industry who urged that people be distinguished from machines have been ignored. The people who put together the first standard vocabulary for the industry installed such a distinction in its very first two definitions. In brief, they defined «data» as representations of facts or ideas, and they defined «information» as the meaning that people give to data. Only people can process information; machines can process only data. Embodying this fundamental distinction in the definition of the two most basic computing terms was a complete waste of ink.

As long as we allow people to think of computers as anything other than machines to be owned and used, powerful people and institutions will be able to use computers as scapegoats and avoid blame for the social inequities they are able to bring about for their own benefit by using computers.

After you read

Answer the questions:

1.What is the author's reason for choosing such a preface to his article?

2.What connection does Neville Holmes see between computers and social inequities?

3.What, in your opinion, is the social role of computers?

4.Why does the author stick to the idea that «computers cannot make us fools»?

5.Neville Holmes distinguishes between such terms as «storage» and «memory», «data» and «information». Why?

Text 3. ROBOTS IN SPACE

Generations of Space Robots

Robotic explorers have flown in space for decades bringing a set of eyes and ears to places human beings could never dream of visiting. The robotic space pioneers included America's Mariner, Viking and Voyager and Russia's Venera and Lunokhod. These celestial emissaries sent back the first close-up pictures of other worlds. They changed forever the way we see the Universe. But the early probes had little onboard intelligence. At that time computers were bulky and slow. Almost all spacecraft actions were controlled from the Earth.

The next generation of space robots is smaller and smarter. They have enough brains on board to take orders without second-to-second supervision. Mars «Pathfinder» was the first of a series of missions that NASA is sending to Mars roughly every two years. «Pathfinder» carried the «Sojourner» rover, the first rover to operate on the surface of another planet. Intended as a proof-of-concept vehicle, «Sojourner» landed on Mars on July 4, 1997, thus becoming humanity's first semi-intelligent emissary to another world. At NASA's Jet Propulsion Laboratory, «Rocky 7» is making its way across a rock-for-rock recreation of a Mars landing site. This rover is a great technological leap beyond «Sojourner». Packed full of sensors and downsized computing power, «Rocky» is built to do much of its work without constant human guidance. A NASA controller sends «Rocky» a message saying, «Take a look at those five rocks and get a soil sample». «Rocky» works out for itself how to get to the rocks and what to do when it arrives. At the end of each mission it will download the data to Earth and then wait for its next assignment. Eventually fleets of small, inexpensive rovers could roam Mars and the Moon, paving the way for human pilgrims.


The «Ranger» Space Robot

The Space Systems Laboratory (SSL) in College Park, Maryland, is building the «Ranger» space robot for NASA under conditions that reflect the realities of government budget cuts. «Ranger» is being largely designed and flown by a small band of enthusiastic scientists and graduate students at the University of Maryland. Its parts were bought at hardware shops or handmade by the students themselves. Here the team is placing «Ranger» into a water tank for a practice round of a mission it will soon fly in outer space. Serving as a kind of remotely operated handyman, «Ranger» will float outside the International Space Station performing tasks such as unscrewing panels and replacing parts. It is designed to reduce the number of hours astronauts have to spend on dangerous spacewalks.

Robots in Space

Robots have a number of different roles in space exploration. Probably the most significant one is to act as a precursor for an eventual human presence, especially on planetary surfaces. The idea is to send a robot, for example, to the surface of Mars to survey the chemical composition of the soil and to see whether there is a way to process that soil to obtain life-sustaining chemicals for humans to use when they eventually get there. Later on, robots can be used to survey potential human landing sites, or even to prepare a human habitat, so that a crew arriving 10, 15 or even 20 years later will have their house already built, sitting on the surface of Mars and waiting for them just to open the front door, move in and live.

After you read

A.How would you answer the questions?

1.What is the most significant role of robots in space exploration?

2.There are about four functions of robots mentioned in the texts. What are they?

B.Discuss the following statements:

1.Robots will have much wider application for space exploration.

2.Robots are dangerous.

3.Russia should make greater efforts to keep up with the development of robots.

Before you read text 4

1.Comment on Henry Ford's saying: «Had I worked fifty or ten or even five years before, I would have failed. So it is with every new thing. Progress happens when all the factors that make for it are ready, and then it is inevitable».

2.Now read the text and then say if it contains any new information for you about the Microsoft empire.

Text 4. BILL GATES'S VISION

It must be remembered that the future of the Microsoft empire depends heavily on the accuracy of Bill Gates's vision. If his thoughts occasionally sound mundane or less than original, it is because they are the result of a selection process: a person in his position has a legion of experts at his beck and call, plenty of whom generate ideas as fast as he does. His job is to sort out the ideas worth staking a piece of the company's future on. For that, an idea does not have to be original, or even all that good, but it does have to fit his vision: a computer-filled world in which Microsoft writes the best-selling software.

Early in 1975, Gates, by then a sophomore at Harvard University, and Allen, who was working as a programmer in Boston, set out to overtake the revolution. Their first goal was to write a version of Basic to run on the Altair. (Altair 8800 was the world's first truly personal computer).

Although they didn't own an Altair — and indeed had never even seen one — Allen wrote a program on a Harvard mainframe to simulate the new computer. So equipped, working virtually nonstop in his dorm room, often losing track of night and day and routinely falling asleep at his desk or on the floor, the 19-year-old Gates needed just five weeks to complete the task. Later that spring, the pair formed the world's first microcomputer software company, eventually naming it Microsoft.

Like Ford before him, Gates invented nothing: no computer, no peripheral, no programming language. He certainly didn't invent microchips. What he did was probably inevitable, once the components became available. He may, however, have been the very first to see how the 8080 chip (unlike the 8008) could be used to place significant computing power at the disposal of Everyman. He didn't know what would be done with it, and he certainly didn't foresee (as Ford didn't foresee freeways) that offices, not homes, would house most of the early PCs. Gates and Allen only knew that, if priced within reason, the products they offered — DOS and Microsoft Basic — would sell.

Gates is eager to distinguish between the services performed by the present generation of home computers and those to be expected in the future from a station on an information highway. The current Internet, he insists, is only a pale imitation of the highway to come. In time, most of the world's information will be available to almost anyone in it. His investigations have convinced him, however, that current satellite technology will never supply the requisite bandwidth (channel capacity). The transmission of so much information will require that private homes be connected to the outside world by underground fiber optic cables, just as they are now connected by existing sewerage, water, electric power, cable TV, and telephone conduits. The required cable will be installed in due time, he predicts, and will be no more costly than current networks.

When the powerful computers of the future are connected to the information highway, you will be able to stay in touch with anyone, anywhere, who wants to stay in touch with you; to browse through thousands of libraries, by day or by night; and to retrieve the answers to varied questions.

You will also be able to watch almost any movie ever made, at any time of day or night, interrupted only upon request. The instructions for assembling your latest purchase will be interactive. Shopping channels will show you only what you ask to see, and the people with whom you talk by telephone will see a well-groomed likeness of yourself responding to their jokes and flirtations, even if you are actually dripping wet from the shower.

After you read

Answer the questions:

1.What does the future of the Microsoft empire depend on?

2.How is Gates's job characterized in the article?

3.If you worked at Microsoft would you try to come up with any original ideas?

4.What is Gates's vision?

5.How long did it take Allen and Gates to form the world's first microcomputer software company?

6.Gates did not invent anything special. What do you think made him so famous?

7.What was the only thing that stimulated Gates's activities?


8.In what way will most of the world's information be available to almost anyone in it?

9.What benefits does the information highway provide?

10.What else do you know about the Microsoft empire and its founder?

11.Give your arguments for and against the statement: «Scientists achieve success when they come down from the heights of science to the level of an ordinary man».

Text 5. TOOLS AND WORK

From the Global Positioning System to electric power generation, electrical engineers have contributed to the development of a wide range of technologies. They design, develop, test and supervise the deployment of electrical systems and electronic devices. For example, they may work on the design of telecommunication systems, the operation of electric power stations, the lighting and wiring of buildings, the design of household appliances or the electrical control of industrial machinery.

Fundamental to the discipline are the sciences of physics and mathematics as these help to obtain both a qualitative and quantitative description of how such systems will work. Today most engineering work involves the use of computers and it is commonplace to use computer-aided design programs when designing electrical systems. Nevertheless the ability to sketch ideas is still invaluable for quickly communicating with others.

Although most electrical engineers will understand basic circuit theory (that is the interactions of elements such as resistors, capacitors, diodes, transistors and inductors in a circuit), the theories employed by engineers generally depend upon the work they do. For example, quantum mechanics and solid state physics might be relevant to an engineer working on VLSI (the design of integrated circuits), but are largely irrelevant to engineers working with macroscopic electrical systems. Even circuit theory may not be relevant to a person designing telecommunication systems that use off-the-shelf components. Perhaps the most important technical skills for electrical engineers are reflected in university programs, which emphasize strong numerical skills, computer literacy and the ability to understand the technical language and concepts that relate to electrical engineering.

For many engineers, technical work accounts for only a fraction of the work they do. A lot of time may also be spent on tasks such as discussing proposals with clients, preparing budgets and determining project schedules. Many senior engineers manage a team of technicians or other engineers and for this reason project management skills are important. Most engineering projects involve some form of documentation and strong written communication skills are therefore very important.

The workplaces of electrical engineers are just as varied as the types of work they do. Electrical engineers may be found in the pristine lab environment of a fabrication plant, the offices of a consulting firm or on site at a mine. During their working life, electrical engineers may find themselves supervising a wide range of individuals including scientists, electricians, computer programmers and other engineers.

After you read

1.Summarize the content of the text. Begin with:

The paper reports on...


Text 6. THE WORLD WIDE WEB

Until the appearance of the World Wide Web (WWW), the Internet was mainly used by people who had some computer expertise. File transfer protocol (FTP) was the standard method by which data could be stored on or removed from a server, and if a document that had been transmitted had references to other documents then it was not straightforward to access them. In other words, FTP does not link separate documents together.

In 1990, Tim Berners-Lee, working at Europe's high-energy physics research centre in Switzerland, wrote the first browser program, which used a protocol called hypertext transfer protocol (HTTP). This operates as follows:

When a client requests a Web server to send a document, the request is sent using HTTP (rather than FTP). The Web server finds the document in its memory and transmits it along with extra information. It is this extra information that distinguishes a Web server from an Internet server. The extra information transmitted is composed of two main parts:

control codes, using hypertext markup language (HTML), by which the client computer screen can display the document, i.e. the layout, headings, bordering, etc. Images can be transmitted as separate files and incorporated on the visible page by HTML code.

links to other documents. These links are specific words or phrases in the text of the transmitted document that will allow related documents to be accessed.

When the mouse pointer of the client computer is moved over the document on the screen, the arrow changes to a hand with a pointing finger whenever it falls on any hypertext. If the user clicks on this link, the browser will automatically set up the link address and request the appropriate Web server to transmit the new document to the client. When this new document arrives, it is displayed on the screen.
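As an illustration of the exchange described above, here is a minimal Python sketch, using only the standard library, that behaves like a very simple browser: it sends an HTTP GET request to a Web server and then scans the returned HTML for links to other documents. The host name example.org is just a placeholder.

```python
# A very simple "browser": request a document over HTTP and list its links.
# Uses only the Python standard library; example.org is just a placeholder host.
import http.client
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the target addresses of <a href="..."> links in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# 1. The client asks the Web server for a document using HTTP.
connection = http.client.HTTPConnection("example.org")
connection.request("GET", "/")
response = connection.getresponse()
html_text = response.read().decode("utf-8", errors="replace")
connection.close()

# 2. The extra information (HTML control codes) tells the client how to display
#    the page; the href attributes are the links to other documents.
collector = LinkCollector()
collector.feed(html_text)
print("Status:", response.status)
print("Links found in the document:", collector.links)
```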

A browser, therefore, is a program, stored in the client's computer, that is able to read hypertext. While the Internet is the huge collection of computer networks and databases connected by backbone cable and optic fibre, the WWW is essentially a browsing and searching system. It allows users with virtually no expertise to access the information stored at certain sites on the Internet.

After you read

From memory if you can, fill in the missing prepositions.

1) until the appearance ......... the World Wide Web

2) the method ......... which data could be stored

3) the data could be stored on or removed ......... a server

4) references ......... other documents

5) the server transmits the document ...... ...... extra information

6) the information is composed ...... two parts

7) the extra information provides links ...... other documents

Use words and phrases from the text to rewrite the words in bold

1.Accessing web pages is easy and simple, and people with almost no expertise use the web.

2.The browser contacts the right server to transmit the document.

3.The WWW is in its basic character a search system.

4.The information added to documents makes web servers different from Internet servers.

5.Years ago, the Internet was mostly used by experts.

As you read

Translate the following text into English.


Text 7. THE INTERNET AS A NETWORK OF NETWORKS

The Internet can be described as an enormous digital highway – a system linking millions of computers connected to thousands of networks all over the world. Its colourful past goes back to the Cold War era, the late 1960s and early 1970s. Officially, the Internet in its modern sense can be considered to have been born on 2 January 1969, the year in which work on the Arpanet project began.

Initially these developments were funded by the US government, and the network that became the predecessor of the Internet was deliberately designed to keep communication going between government nodes even if some part of the network were put out of action by a nuclear attack. The algorithm it used to control the transfer of information (the internetworking protocol) was designed so that computers of all kinds could share network facilities and interact with one another directly as a single, effectively integrated computer network.

(«Информационные технологии», www.information-technology.ru)

As you read

Render the following text in English.

Text 8. BLUETOOTH TECHNOLOGY

Bluetooth is a rapidly developing technology for transmitting data by radio, whose development was initiated by market leaders in data communications and the computer industry.

The name Bluetooth was given in honour of the tenth-century Danish king Harald Blatand.

In the tenth century the Danish king Harald II Blatand («Blatand» is Danish for «Blue Tooth») became famous for his ability to find a common language with his vassal princes. A thousand years later, the name Bluetooth was chosen for a technology that provides wireless communication between dissimilar devices. The initiator of the Bluetooth project was the Swedish company Ericsson, which recommended this name.

This advanced technology allows devices, including laptops, PDAs and mobile phones, as well as numerous desktop and other devices, to communicate by radio automatically with nearby devices in order to exchange information, commands and so on. One device can communicate with several Bluetooth devices (up to 7 at a time), while the rest remain in standby mode. Bluetooth provides data rates of up to 721 Kbit/s within a range of 10–20 metres, depending on the chipset and the transmission power.

(«Мир беспроводных технологий», www.asusrouter.ru)

As you read

Render the following text in English.

Text 9. THE OSI MODEL

From the fact that a protocol is an agreement adopted by two interacting entities – in this case, two computers operating in a network – it does not at all follow that the protocol is necessarily a standard one. In practice, however, when networks are implemented, there is a strong preference for standard protocols. These may be proprietary, national or international standards.

In the early 1980s a number of international standardization organizations – ISO, ITU-T and some others – developed a model that played a significant role in the development of networks. This model is called the Open Systems Interconnection model, or the OSI model. The OSI model defines the various layers of interaction between systems, gives them standard names and specifies what functions each layer should perform. The OSI model was developed on the basis of the extensive experience gained in building computer networks, mainly wide-area networks, in the 1970s. A complete description of the model takes up more than 1000 pages of text.

In the OSI model the means of interaction are divided into seven layers: application, presentation, session, transport, network, data link and physical. Each layer deals with one specific aspect of the interaction of network devices.

(Олифер В.Г., Олифер Н.А. «Компьютерные сети. Принципы, технологии, протоколы»).
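The layered idea described in Text 9 can be pictured as successive envelopes wrapped around the user's data. The following Python fragment is purely illustrative (the bracketed «headers» are invented for the example and are not real protocol formats): on the sending side each of the seven layers adds its own header, and on the receiving side the headers are removed in the opposite order.

```python
# Illustrative only: wrap a message in one "header" per OSI layer on the way
# down the stack, then unwrap it on the way up. The header contents are
# invented for the example and are not real protocol formats.
OSI_LAYERS = [
    "application", "presentation", "session",
    "transport", "network", "data link", "physical",
]

def encapsulate(message: str) -> str:
    """Sending side, top-down: each layer adds its own header in front of the data."""
    data = message
    for layer in OSI_LAYERS:
        data = f"[{layer}]{data}"
    return data

def decapsulate(frame: str) -> str:
    """Receiving side, bottom-up: each layer strips the header added by its peer."""
    data = frame
    for layer in reversed(OSI_LAYERS):
        prefix = f"[{layer}]"
        assert data.startswith(prefix), f"missing {layer} header"
        data = data[len(prefix):]
    return data

frame = encapsulate("hello")
print(frame)               # [physical][data link]...[application]hello
print(decapsulate(frame))  # hello
```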

The goal of science is to make the wonderful and complex understandable and simple—but not less wonderful.

—Herb Simon, Sciences of the Artificial

Before you read text 10

1. How do you understand the quotation?

Text 10. INDUSTRIAL SAFETY SYSTEM

1.Industrial control system (ICS) is a general term that encompasses several types of control systems used in industrial production, including supervisory control and data acquisition (SCADA) systems, distributed control systems (DCS), and other smaller control system configurations such as programmable logic controllers (PLC) often found in the industrial sectors and critical infrastructures.

2.ICSs are typically used in industries such as electrical, water, oil, gas and data. Based on data received from remote stations, automated or operator-driven supervisory commands can be pushed to remote station control devices, which are often referred to as field devices. Field devices control local operations such as opening and closing valves and breakers, collecting data from sensor systems, and monitoring the local environment for alarm conditions.
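To make the control path in the previous paragraph more concrete, here is a minimal Python sketch. The FieldDevice class, the device name and the numeric limit are invented for illustration and do not come from any real SCADA product; the sketch only shows a supervisory command being pushed to a field device, which performs a local operation and checks a sensor reading against an alarm condition.

```python
# Illustrative sketch of the supervisory control path: a command from the
# control centre is pushed to a field device, which acts on local equipment.
# The class, device name and limit are invented for the example.
from dataclasses import dataclass

@dataclass
class SupervisoryCommand:
    device_id: str   # which field device the command is addressed to
    action: str      # e.g. "open_valve" or "close_valve"

class FieldDevice:
    """A remote-station device that controls a valve and watches a sensor."""
    def __init__(self, device_id: str, alarm_limit: float):
        self.device_id = device_id
        self.alarm_limit = alarm_limit
        self.valve_open = False

    def execute(self, command: SupervisoryCommand) -> None:
        # Carry out the local operation requested by the control centre.
        if command.action == "open_valve":
            self.valve_open = True
        elif command.action == "close_valve":
            self.valve_open = False
        print(f"{self.device_id}: valve_open={self.valve_open}")

    def check_alarm(self, sensor_reading: float) -> bool:
        # Monitor the local environment for an alarm condition.
        alarm = sensor_reading > self.alarm_limit
        if alarm:
            print(f"{self.device_id}: ALARM, reading {sensor_reading}")
        return alarm

device = FieldDevice("valve-station-7", alarm_limit=8.5)
device.execute(SupervisoryCommand("valve-station-7", "open_valve"))
device.check_alarm(sensor_reading=9.2)   # exceeds the limit, raises an alarm
```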

3.An industrial safety system (ISS) is a countermeasure crucial in any hazardous plants such as oil and gas plants and nuclear plants. They are used to protect human, plant, and environment in case the process goes beyond the control margins. As the name suggests, these systems are not intended for controlling the process itself but rather protection. Process control is performed by means of process control systems (PCS) and is interlocked by the safety systems so that immediate actions are taken should the process control systems fail.
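The division of labour between the process control system (PCS) and the safety system can be sketched in a few lines. The numbers and function names below are invented for illustration: the safety function watches the same measurement independently of the ordinary control loop and forces a trip as soon as the process goes beyond its margin.

```python
# Illustrative sketch: the safety system does not control the process, it only
# trips it to a safe state when the process leaves its allowed margins.
# All numbers here are invented for the example.
TRIP_LIMIT_BAR = 12.0   # pressure above which the safety function must act

def process_control(setpoint: float, measured: float) -> float:
    """Ordinary process control (PCS): a simple proportional correction."""
    return 0.5 * (setpoint - measured)

def safety_interlock(measured: float) -> bool:
    """Safety function (ISS): independent check of the same measurement."""
    return measured > TRIP_LIMIT_BAR

for pressure in (9.0, 11.5, 12.8):          # simulated pressure readings
    if safety_interlock(pressure):
        print(f"{pressure} bar: TRIP, valves driven to the safe position")
        break                                # immediate action, PCS bypassed
    correction = process_control(setpoint=10.0, measured=pressure)
    print(f"{pressure} bar: normal control, correction {correction:+.1f}")
```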

4. Process control and safety systems are usually merged under one system, called an Integrated Control and Safety System (ICSS). Industrial safety systems typically use dedicated systems that are SIL 2 certified at minimum, whereas control systems can start with SIL 1. SIL applies to both hardware and software requirements such as cards, processor redundancy and voting functions.

http://en.wikipedia.org/wiki/Industrial_safety_system http://en.wikipedia.org/wiki/Industrial_control_system

After you read

1.Choose the key words which can describe the ICS and ISS.

2.Which paragraph contains the information about ICSS?

3.Give a translation of paragraphs 2 and 4.

Before you read text 11

1. How do you understand the title of the text?

Text 11. CLOUD COMPUTING DRIVING DATA CENTER AUTOMATION

The dynamic nature of cloud computing has pushed data center workload, server, and even hardware automation to whole new levels. Now, any data center provider looking to get into cloud computing must look at some form of automation to help them be as agile as possible in the cloud world.

New technologies are forcing data center providers to adopt new methods to increase efficiency, scalability and redundancy. Let’s face facts: there are numerous big trends which have driven the increased use of data center facilities. These trends include:

More users

More devices

More cloud

More workloads

A lot more data

As infrastructure improves, more companies have looked towards the data center provider to offload a big part of their IT infrastructure. With better cost structures and even better incentives in moving towards a data center environment, organizations of all sizes are looking at colocation as an option for their IT environment.

With that, data center administrators are teaming with networking, infrastructure and cloud architects to create an even more efficient environment. This means creating intelligent systems from the hardware to the software layer. This growth in data center dependency has resulted in direct growth around automation and orchestration technologies.

Now, organizations can granularly control resources, both internally and in the cloud. This type of automation can be seen at both the software layer and the hardware layer. Vendors like BMC, ServiceNow, and Microsoft SCCM/SCOM are working towards unifying massive systems under one management engine to provide a single pane of glass into the data center workload environment.

Furthermore, technologies like the Cisco UCS platform allow administrators to virtualize the hardware layer and create completely automated hardware profiles for new blades and servers. This hardware automation can then be tied into software-based automation tools like SCCM. Already we’re seeing direct integration between software management tools and the hardware layer.

Finally, from a cloud layer, platforms like CloudStack and OpenStack allow organizations to create orchestrated and automated fluid cloud environments capable of very dynamic scalability. Still, when a physical server or hardware component breaks, we need a person to swap out that blade.

To break it down, it’s important to understand what layers of automation and orchestration are available now – and what might be available in the future.


The automation and orchestration layers

Server layer. Server and hardware automation have come a long way. As mentioned earlier, there are systems now available which take almost all of the configuration pieces out of deploying a server. Administrators only need to deploy one server profile and allow new servers to pick up those settings. More data centers are trying to get into the cloud business. This means deploying high-density, fast-provisioned, servers and blades. With the on-demand nature of the cloud, being able to quickly deploy fully configured servers is a big plus for staying agile and very proactive.

Software layer. Entire applications can be automated and provisioned based on usage and resource utilization. Using the latest load-balancing tools, administrators are able to set thresholds for key applications running within the environment. If a load-balancer, a NetScaler for example, sees that a certain type of application is receiving too many connections, it can set off a process that will allow the administrator to provision another instance of the application or a new server which will host the app.
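The threshold mechanism described in the software-layer paragraph can be reduced to a simple decision rule. The following Python sketch is hypothetical: the threshold value and the provision_new_instance function stand in for whatever the load balancer and provisioning tool actually expose, and the sketch only shows the logic of scaling out when the per-instance connection count crosses a limit.

```python
# Hypothetical sketch of software-layer automation: when the number of
# connections per running instance crosses a threshold, provision another
# instance of the application. The names and numbers are illustrative only.
MAX_CONNECTIONS_PER_INSTANCE = 200

def provision_new_instance(app_name: str, instances: int) -> int:
    """Stand-in for the real provisioning call made by the automation tool."""
    print(f"Provisioning instance {instances + 1} of {app_name}")
    return instances + 1

def rebalance(app_name: str, instances: int, total_connections: int) -> int:
    """Scale out while the per-instance load stays above the threshold."""
    while total_connections / instances > MAX_CONNECTIONS_PER_INSTANCE:
        instances = provision_new_instance(app_name, instances)
    return instances

# Example: 2 instances are serving 900 connections, so 3 more are provisioned
# until each instance handles at most 200 connections (900 / 5 = 180).
running = rebalance("web-shop", instances=2, total_connections=900)
print("Instances now running:", running)
```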

Virtual layer. The modern data center is now full of virtualization and virtual machines. In using solutions like Citrix’s Provisioning Server or Unidesk’s layering software technologies, administrators are able to take workload provisioning to a whole new level. Imagine being able to set a process that will kick-start the creation of a new virtual server when one starts to get over-utilized. Now, administrators can create truly automated virtual machine environments where each workload is monitored, managed and controlled.

Cloud layer. This is a new and still emerging field. Still, some very large organizations are already deploying technologies like CloudStack, OpenStack, and even OpenNebula. Furthermore, they’re tying these platforms in with big data management solutions like MapR and Hadoop. What’s happening now is true cloud-layer automation. Organizations can deploy distributed data centers and have the entire cloud layer managed by a cloud-control software platform. Engineers are able to monitor workloads, how data is being distributed, and the health of the cloud infrastructure. The great part about these technologies is that organizations can deploy a true private cloud, with as much control and redundancy as a public cloud instance.

Data center layer. Although entire data center automation technologies aren’t quite here yet, we are seeing more robotics appear within the data center environment. Robotic arms already control massive tape libraries for Google, and robotics automation is a thoroughly discussed concept among other large data center providers. In a recent article, we discussed the concept of a “lights-out” data center in the future. Many experts agree that data center automation and robotics will eventually make their way into the data center of tomorrow. For now, automation at the physical data center layer is only a developing concept.

The need to deploy more advanced cloud solutions is only going to grow. More organizations of all verticals and sizes are seeing the benefits of moving towards a cloud platform. At the end of the day, all of these resources, workloads and applications have to reside somewhere. That somewhere is always the data center.

In working with modern data center technologies, administrators strive to be as efficient and agile as possible. This means deploying new types of automation solutions which span the entire technology stack. Over the next couple of years, automation and orchestration technologies will continue to grow in popularity as the data center becomes an even more core piece of any organization.

After you read

Make up a table classifying the automation and orchestration layers according to their types, including their names and some characteristics.

Before you read text 12

How can you describe the process of cell planning?

Are there any difficulties we may face when planning a cell?
