
Answer the questions

1. What files is e-mail activity based on?

2. How can you protect your mail from aliens?

3. Why do Web-browser makers have to include e-mail encryption capabilities into their software?

4. How can a recipient know that you are truly the sender?

5. What are the main advantages of e-mail activity?

6. What can stop an e-mail delivery?

7. How can a user be identified in the net?

8. How many copies of the letter can a user send?

9. How do filters attached in the e-mail software function?

10. What is the simplest phishing attack?

Text 3. Phishing

Computer spam, or unwanted advertising, is becoming more and more dangerous. A new study by the Gartner R&D centre found that the number of online scams known as “phishing attacks” has increased in the last year and that online consumers are frequently tricked by criminals into divulging sensitive information.

The study surveyed 5,000 adult Internet users and found that around 3 percent of those surveyed reported giving personal or financial information after being drawn into a phishing scam. Phishing scams use e-mail messages and Web pages designed to look like correspondence from legitimate online businesses. The results suggest that as many as 30 million adults have experienced a phishing attack and that 1.78 million adults could have fallen victim to the scams, Gartner says.

Phishing attacks typically begin with e-mail messages purporting to come from established companies such as eBay, Best Buy, Citigroup, and others. Within the e-mail messages, Web page links direct recipients to Web sites disguised as official company Web pages, where the recipient is asked to enter personal information such as his or her social security number, account number, password, or credit card information.
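The mismatch described above, a link whose visible text names one company while its target URL points somewhere else, can be sketched as a simple check. This is only an illustration; the domain names below are invented, not real phishing addresses:

```python
from urllib.parse import urlparse

def looks_suspicious(link_text: str, href: str) -> bool:
    """Flag a link whose visible text names one domain but whose
    actual target URL points to a different one."""
    if "://" not in link_text:
        link_text = "http://" + link_text  # let urlparse find the hostname
    shown = urlparse(link_text).hostname
    actual = urlparse(href).hostname
    if shown is None or actual is None:
        return False  # not enough information to compare
    return shown.lower() != actual.lower()

# A link that displays an eBay address but leads to a disguised site:
print(looks_suspicious("www.ebay.com", "http://ebay.example-fake.net/login"))  # True
print(looks_suspicious("www.ebay.com", "http://www.ebay.com/signin"))          # False
```

Real phishing filters are far more elaborate, but the core idea of comparing what the user sees with where the link actually goes is the same.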

The U.S. federal authorities and leading Internet service providers have taken an aggressive stance on the scams. The U.S. Federal Trade Commission and the U.S. Department of Justice are trying to stop phishing scams that tricked hundreds of Internet users into giving credit card and bank account numbers to Web sites that looked like those of AOL and PayPal, part of eBay.

Thus, the FTC has charged Zachary Keith Hill of Houston with deceptive and unfair practices in that case and the DOJ named Hill as a defendant in a criminal case filed in Virginia.

A success rate of 3 percent is plenty to encourage further attacks, Gartner R&D centre says.

Answer the questions

1. What is a phishing attack?

2. What are online consumers tricked into?

3. What did the Gartner survey find?

4. What are phishing scams designed for?

5. How many adults have experienced a phishing attack?

6. How do phishing attacks typically start?

7. What is the recipient asked to do during a phishing attack?

8. What sites are the users redirected to?

9. What rate is enough to encourage further attacks?

10. Who was charged by the FTC with deceptive practices?

Text 4

A New Spam Campaign Against Financial Transfer Systems

A new spam campaign is targeting a financial transfer system that handles trillions of dollars in transactions annually and has proved to be a fertile target for online fraudsters. The spam messages pretend to come from the National Automated Clearing House Association (NACHA), a U.S. nonprofit association that oversees the Automated Clearing House system (ACH).

Over the last few months, many businesses lost money through ACH fraud, primarily when fraudsters obtained the authentication credentials required to transfer money. In many cases, significant portions of the fraudulent transfers were never recovered.

NACHA has no direct involvement in the processing of the payments, but criminals have managed to launch a spam campaign with messages purporting to be from the organisation, saying that an ACH payment has been rejected. The spam messages have a link to a fake Web site that looks like NACHA's. The site asks the victim to download a PDF (portable document format) file, which is actually an executable. If launched, the executable will install Zbot, also known as Zeus, an advanced piece of banking malware that can harvest the authentication details required to initiate an ACH transaction.

There are a number of versions of the Zeus malware, which is periodically re-engineered in order to evade detection by antivirus software. A new version of Zeus being spammed was detected by only 16 of 41 antivirus suites, wrote Gary Warner, director of research in computer forensics at the University of Alabama's computer and information sciences department. Antivirus software is the first line of defence against malware like Zeus. However, malware writers need only modify the file to make it undetectable for a while, until the security companies see a sample and create a signature for it. It may take a few days before different security suites can detect it. By that time, the money may be gone.
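The cat-and-mouse game described above can be illustrated with a toy scanner: a signature matches only the exact byte pattern the vendor has already seen, so changing even one byte lets a re-engineered sample slip through. The byte patterns here are invented for the sketch, not real malware signatures:

```python
# Signature database: name -> byte pattern known to the scanner.
SIGNATURES = {
    "zeus_v1": b"\x90\x90\xeb\x1f",  # hypothetical pattern, for illustration only
}

def scan(sample: bytes) -> list:
    """Return the names of all known signatures found in the sample."""
    return [name for name, pattern in SIGNATURES.items() if pattern in sample]

old_sample = b"header" + b"\x90\x90\xeb\x1f" + b"payload"
new_sample = b"header" + b"\x91\x90\xeb\x1f" + b"payload"  # one byte changed

print(scan(old_sample))  # ['zeus_v1'] -- known variant detected
print(scan(new_sample))  # []          -- modified variant evades the scanner
```

Until a sample of the modified file reaches the vendor and a new entry is added to the database, the scanner stays blind, which is exactly the window the text describes.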

Answer the questions

1.What is a new spam campaign targeting?

2. How do fraudsters obtain the authentication credentials?

3. What messages do spammers send?

4. What link do the spam messages have?

5. What is the victim asked to do by the fraudsters?

6. What kind of malware is used in the fraud?

7. What is required to initiate an ACH transaction?

8. Why is the Zeus malware periodically re-engineered?

9. What is the first line of defence against malware like Zeus?

10. Why do malware writers have to modify the file?

Module VI. New Technologies and Search Systems

Text 1. The World Wide Web Service and HTTP

People have dreamt of a universal information database for a long time. However, only recently, new technologies have made such systems possible. The most popular system currently in use is the World Wide Web. The Internet is having a dramatic effect on the way the Web works. Not long ago, a great Web site was one that had nicely formatted texts and information on some subjects.

However, the situation has changed with the appearance of Hypertext Transfer Protocol (HTTP) – the most frequently used protocol on the Internet today. It grew out of the need for a universal protocol to simplify the way users get access to Internet information. HTTP is a generic, stateless, object-oriented protocol. It allows systems to be built independently of the data transferred. HTTP is a client/server protocol. This means that the client and server interact to perform a particular task. For example, a user may click a link on a Hypertext Markup Language (HTML) page. This causes the page to be replaced with a new one. The client browser uses HTTP commands to communicate with the HTTP server. A connection is established from the client to the server through the default TCP port 80. Once the connection has been made to the server, the request message is sent. The requests are typically for a resource file consisting of images, audio, animation, video or other hypertext documents. After that, the server sends a response message to the client with the requested data. The server ordinarily closes the connection, unless the client’s browser has a “keep-alive” option configured.
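The request/response exchange described above can be demonstrated with Python's standard library. To keep the sketch self-contained, a tiny local server stands in for a real Web server (real traffic would use TCP port 80; here the operating system picks a free local port):

```python
import http.client
import http.server
import threading

# A minimal HTTP server that answers every GET with one small page.
class Page(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        content = b"<html><body>hello</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(content)))
        self.end_headers()
        self.wfile.write(content)

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), Page)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side: open a connection, send the request message,
# then read the server's response message with the requested data.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/index.html")
response = conn.getresponse()
body = response.read()
print(response.status, body)  # 200 b'<html><body>hello</body></html>'
conn.close()                  # the connection is closed after the exchange
server.shutdown()
```

Each request/response pair is independent, which is what "stateless" means in the paragraph above: the server keeps no memory of earlier requests.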

The nature of the World Wide Web provides a way to interconnect computers running different operating systems and display information created in a variety of existing media formats.

In short, the possibilities for hypertext in the worldwide environment are endless. With the computer industry growing at today’s pace, no one knows what awaits us in the future.

Answer the questions

1. What have people dreamt of for a long time?

2. What was a great Web site several decades ago?

3. What is Hypertext Transfer Protocol used for?

4. Why can we say that HTTP is a client/server protocol?

5. How can the client browser communicate with the HTTP server?

6. How is connection from the client to the server established?

7. What does the typical file consist of?

8. When does the server close the connection?

9. What type of computers can interconnect on the Web?

10. What is growing rapidly nowadays?

Text 2.The Opera browser

In 1994, two Norwegians, Jon S. von Tetzchner and Geir Ivarsøy, developed a Web browser while working for the Norwegian telecom company Telenor. When Telenor decided not to use the program, they left to start Opera Software in 1995, and introduced the Opera browser as shareware for the Windows platform in 1996. In 1998, in an effort to expand their market, Opera Software began a project to port the browser to many different platforms. In 2000, the project succeeded, as the Opera browser was selected for use as the embedded browser for the Ericsson YS210 Cordless Screen Phone and for Psion and Screen Media information appliances. At the end of 2000, Opera made their pay-for-play browser available for free download, but in a version that included integrated banner ads.

The Opera browser has lagged a step behind Firefox and IE in both features, such as searching from the address bar, and support for advanced standards such as XML, CSS, or Java. As most Web developers work with either IE or Firefox or both, there used to be compatibility problems between Opera and complex Web pages, a problem exacerbated by the often non-standard ways in which IE and Firefox implemented these features. Thus, Opera 5.11 did not fully implement CSS or JavaScript on several Web pages. However, Opera and its supporters say that their new versions have closed the compatibility gap with IE and Firefox.

The Internet browser company Opera Software has added features for tighter security and the ability to surf the Web with voice commands in its browser, Opera 8 for Windows and Linux. Opera sees the security issue as one it can use to carve into Microsoft’s dominance of the browser market with its Internet Explorer. The desktop browser gives extra information about the identity of Web sites, automatically activating an information field that gives a level of security from 1 to 3 and listing the certificate owner of the site when the user visits a secure Web site. The browser can also identify the origins of pop-up Web sites.

Answer the questions

1. When was the Opera browser developed?

2. Who developed the browser?

3. What was introduced as the shareware for the Windows platform?

4. What project did Opera Software start to expand their market?

5. What use was the Opera browser selected for in 2000?

6. What version of the browser was available for free downloading at the end of 2000?

7. What problems were there between Opera 5 and complex web pages?

8. What features has the Internet browser company Opera Software added to the browser?

9. How does Opera see the security issue?

10. What extra information does the desktop browser give?

Text 3. Cloud Computing

The cloud and cloud computing are among the buzzwords nowadays. The big players are moving into this area in a big way.

Google can already run your email and host your documents, and its App Engine lets users run custom applications. At the same time, the concept of cloud computing is far from new.

In cloud computing systems, instead of installing a suite of software on each computer, you only have to load one application. That application allows workers to log into a Web-based service which hosts all the programs the user needs for his or her job. Remote machines owned by another company can run everything from e-mail to word processing and complex data analysis programs. In fact, cloud computing can change the entire computer industry.

In a cloud computing system, there's a significant workload shift. Local computers no longer have to do all the heavy lifting when it comes to running applications. The network of computers that make up the cloud handles them instead. Hardware and software demands on the user's side decrease. The only thing the user's computer needs to be able to run is the cloud computing system's interface software, which can be as simple as a Web browser, and the cloud's network takes care of the rest.

When talking about a cloud computing system, it's helpful to divide it into two sections: the front end and the back end. They connect to each other through a network, usually the Internet. The front end is the side the computer user, or client, sees. The back end is the "cloud" section of the system.

The front end includes the client's computer or computer network and the application required to access the cloud computing system. On the back end of the system there are the various computers, servers and data storage systems that create the "cloud" of computing services. In theory, a cloud computing system could include practically any computer program you can imagine, from data processing to video games. Usually, each application will have its own dedicated server.

A central server administers the system, monitoring traffic and client demands to ensure everything runs smoothly. It follows a set of rules called protocols and uses a special kind of software called middleware which allows networked computers to communicate with each other.

Answer the questions

1. What system does Google use to run your e-mail and host your documents?

2. What do you have to do to get into the cloud system?

3. What is the main role of remote machines in the cloud?

4. Why do hardware and software demands on the user's side decrease?

5. What type of software does the user need to get into cloud?

6. What sections does the cloud system consist of?

7. What type of connection does the user need to get into the cloud system?

8. What is located at the back of the cloud?

9. How does a central server administer the system?

10. What type of software does a central server use?

Text 4. Planet-scale Grid

Scientists have started smashing protons and ions together in a multinational experiment to understand what the universe looked like a second after the Big Bang. The particle accelerator used in this test can release a vast flood of data on a scale unlike anything seen before, and for that scientists need a computing grid of equally great capability. As part of this effort, which costs about 5 billion euros ($6.3 billion U.S.), scientists have built a grid using 100,000 CPUs, mostly PCs and workstations, available at universities and research labs in the U.S., Europe, Japan, Taiwan and other locations. Scientists need to harness raw computing power to meet computational demands and to give researchers a single view of this disparate data.

Researchers believe that improving the ability of the grid to handle petabyte-scale data split up among multiple sites will benefit not only the scientific community but also mainstream commercial enterprises. They expect that corporations will one day need a similar ability to harness computing resources globally as their data requirements grow. It is important to prove that they can maintain the processes for an extended period almost without human attendance. This means that network interconnections are tuned and synchronised and that there's sufficient security and monitoring, as well as staffing and automation, at the respective data gathering sites.

A more difficult aspect is how to provide simultaneous access to the data for 1,000 physicists working around the world. Another limiting factor, for the approximately 100 developers working on the grid, is the capabilities of the resource brokers, that is, the middleware that submits the jobs and distributes the work. If the processing isn't effectively routed, databases can crash under heavy loads. There is also a need to ensure that the system has no single point of failure. This involves keeping track of the data. The data could be in one place, while the CPU resource capable of processing it is in another. Metadata, which describes what the data is about, will play a critical role.

Answer the questions

1. Why have scientists started smashing protons and ions together?

2. What can the particle accelerator release?

3. What grid have scientists built?

4. How much does this project cost?

5. What do researchers believe in?

6. What do scientists need to harness?

7. What will commercial enterprises try to harness one day?

8. What is important for scientists to prove?

9. What factor attracts a lot of attention?

10. What can happen if the processing is not effectively routed?

Text 5. Keyloggers

Security experts have praised Sumitomo Mitsui Banking Corporation for admitting that it was the target of a failed $424 million hacking attempt. According to media reports, the UK’s National High Tech Crime Unit (NHTCU) has issued a warning to large banks to guard against keylogging, the method adopted by the would-be thieves in an attack on the Japanese bank’s London systems. The intruders tried to transfer money out of the bank via 10 accounts around the world.

Keyloggers record every keystroke made on a computer. They are commonly used to steal passwords. American games developer Valve had the source code to its latest version of Half-Life stolen after a virus had delivered a keystroke recorder program into Valve’s founder’s computer. Keyloggers are becoming more sophisticated, moving away from software forms to sniffer-type hardware devices. They have now got little hardware loggers that are like a dongle that you place between the keyboard connection and the base unit. A cleaner can come in and pop one of these things in. No one ever looks around the back of their PCs.

That type of operation also means that an organisation’s level of encryption or firewall strength could become irrelevant. Hacker sites offer keylogging software for free. Keystroke recorders are also sold on seemingly legitimate Web sites, purportedly for employers to keep an eye on what the staff is doing at their computers. Attacks on individuals’ machines are an everyday occurrence and users must remain vigilant. We see from 15 to 20 new pieces of malware a day, and they are worms and Trojans that do keylogging. Individuals probably don’t even know about it. The malware doesn’t display a skull and crossbones or play “A Hard Day's Night” or "Yellow Submarine" over your speakers to announce its presence. Users are urged to update antivirus software, possibly several times a day, and not to forget to install Microsoft patches and a firewall.

Answer the questions

1. Who has praised Sumitomo Mitsui Banking Corporation and what for?

2. What warning has the UK's National High Tech Crime Unit (NHTCU) issued?

3. How did the intruders try to transfer money out of the bank?

4. What are keyloggers commonly used for?

5. How was the latest version of Half-Life stolen?

6. What new forms do keyloggers get?

7. What are hardware loggers and where can one place them?

8. What would this type of operation mean?

9. What kind of software do hacker sites offer?

10. How can users protect themselves from keylogging?

Module VII. Networks

Text 1. A Brief History of Local Area Nets (LANs)

A local area network is a system which allows computers to share information and resources within a limited, local area, generally less than one mile from the server to a workstation. In other words, a LAN is a communication network used by a single organisation. Although companies managed to implement LANs only with the arrival of microcomputers, the concept itself is not new. The first computers in the 1950s were mainframes. Large, expensive, and reserved for very few select users, these monsters occupied entire buildings. Costing hundreds of thousands of dollars, these large computers were not able to run the newer, more sophisticated business programs that were coming out for IBM PCs and their compatibles. In the middle of the 1980s, thousands of employees began bringing their own personal computers to work in order to use the new business software written for PCs. As employees began exchanging floppy disks and keeping their own databases, companies met serious problems with maintaining the integrity of their data. LANs offered a solution to such problems.

LANs represent a logical development and evolution of computer technology. A network consists of two main elements – the physical structure that links the equipment, and the software that allows communication. The physical distribution of nodes is a network topology, while the rules, which determine the formats by which the information may be exchanged, are known as protocols.

The first LANs were relatively primitive. Faced with a serious shortage of software designed for more than one user, the first LANs used file locking, which allowed only one user to run a program at a time. Gradually however, the software industry has become more sophisticated and today’s LANs offer powerful complex accounting and productivity programs to several users simultaneously. Each computer attached to the network retains its ability to work as an independent personal computer running its own software.

Answer the questions

1. What does a local area network allow computers to do?

2. When did companies manage to implement LANs?

3. How did the first mainframes function?

4. Could mainframes of the 50s run new sophisticated programs?

5. Why did office workers begin bringing personal computers to their workplaces?

6. Why did companies get problems with maintaining the integrity of their data?

7. What does a typical local network consist of?

8. What is a network topology?

9. Where was file locking used?

10. What can today’s LAN offer?

Text 2. Types of Physical Configuration for LANs

There are different ways a local area network can operate. Keep in mind that the form of the LAN does not limit the media of transmission. One of the oldest types of network is a star, which uses the same approach to sending and receiving messages as a telephone system. It means that all messages in a LAN star topology must go through a central computer that controls the flow of data. This makes it easy to add new workstations to the LAN and allows the network administrator to give certain nodes higher status than others. The major weakness of the star architecture is that the entire LAN fails if anything happens to the central computer.

Another major network topology is a bus. In many such networks, the workstations check whether a message is coming down the highway before sending their own message. Because all workstations share the same bus, all messages pass other workstations on the way to their destination. Many low-cost LANs use the bus architecture. An advantage of the bus topology is that the failure of a single workstation does not cripple the rest of the network. However, too many messages can slow down the network speed.

A ring topology consists of several nodes joined together to form a circle where all workstations must have equal access to the network. In a ring LAN, a data packet, known as a token, is sent from the transmitting workstation through the network. The token contains the address of the sender and the address of the node to receive the message. If the monitoring node fails, the network remains operative. The network may withstand the failure of various workstations. Additional ring networks can be linked together through bridges that switch data from one ring to another.
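The token's journey around the ring can be sketched as a small simulation: a packet carrying the sender's and the destination's addresses is passed node by node until it reaches its target. The node addresses and packet fields below are illustrative only:

```python
from dataclasses import dataclass

@dataclass
class Token:
    sender: int       # address of the transmitting workstation
    destination: int  # address of the node to receive the message
    payload: str      # the message itself

def deliver(ring: list, token: Token) -> list:
    """Pass the token around the ring, starting at the node after the
    sender, and return the path it travels to the destination."""
    start = ring.index(token.sender)
    path = []
    for step in range(1, len(ring) + 1):
        node = ring[(start + step) % len(ring)]  # next node around the circle
        path.append(node)
        if node == token.destination:
            break  # message delivered
    return path

ring = [10, 20, 30, 40, 50]  # node addresses around the circle
print(deliver(ring, Token(sender=10, destination=40, payload="hi")))  # [20, 30, 40]
```

Because every node sits on the circle, the token always reaches its destination in at most one full trip around the ring, which is why each workstation gets equal access.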

To provide some level of uniformity among network vendors, the International Standards Organisation has developed Open Systems Interconnection standards. Different computers networked together need to know in what form they will receive information. The Open Systems Interconnection standards consist of a seven-layer model that ensures efficient communication within a LAN and among different networks.
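The seven-layer model mentioned above can be listed as a quick reference. The layer names and their order come from the OSI model itself; the snippet only prints them from top to bottom:

```python
# The seven OSI layers, numbered from the bottom (1) up to the top (7).
OSI_LAYERS = {
    1: "Physical",
    2: "Data Link",
    3: "Network",
    4: "Transport",
    5: "Session",
    6: "Presentation",
    7: "Application",
}

# Print the stack top-down, as it is usually drawn.
for number in sorted(OSI_LAYERS, reverse=True):
    print(f"Layer {number}: {OSI_LAYERS[number]}")
```

Each layer only needs to agree with its peer on the format of the data they exchange, which is how the model lets different computers on a network understand each other.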

Answer the questions

1. How do local area networks operate?

2. Can the LAN form limit the media of transmission?

3. What is the oldest type of the net connection?

4. What is the major weakness of the star architecture?

5. What is the main advantage of the bus topology?

6. How does a ring topology operate?

7. What does the token contain?

8. Can a ring LAN withstand the failure of several workstations?

9. What has the International Standards Organisation developed?

10. How are the Open Systems Interconnection standards formed?

Text 3. Benefits of Wireless Technology

In a coffee shop, a library or a hotel, people usually use WiFi to connect to the Internet. In the near future, wireless networking may become so widespread that you can access the Internet just anywhere and any time, without using wires. WiFi has a lot of advantages. Wireless networks are easy to set up. They are inexpensive. A wireless network uses radio waves, just like cell phones, televisions and radios do.

In fact, communication across a wireless network is a lot like two-way radio communication. A computer's wireless adapter translates data into a radio signal and transmits it using an antenna. A wireless router receives the signal and decodes it. Then it sends information to the Internet using a physical, wired Ethernet connection. The process also works in reverse, with the router receiving information from the Internet, translating it into a radio signal and sending it to the computer's wireless adapter.

However, WiFi radios have a few notable differences from other radios. Their frequency is considerably higher than the frequencies used for cell phones, walkie-talkies and televisions. The higher frequency allows the signal to carry more data. This connection is convenient, virtually invisible and fairly reliable. However, if the router fails or if too many people are trying to use high-bandwidth applications at the same time, users can experience interference or lose their connections. Once you've installed your wireless adapter and the drivers that allow it to operate, your computer should be able to automatically discover existing networks.

Being able to connect to the Internet in public hotspots is extremely convenient. Wireless home networks are convenient as well. They allow you to connect multiple computers and to move them from place to place without disconnecting and reconnecting wires. Once you plug in your router, it should start working at its default settings. Most routers let you use a Web interface to change your settings.

Security is an important part of a home wireless network. If you set your router to create an open hotspot, anyone who has a wireless card will be able to use your signal. Most people would rather keep strangers out of their network, though. Doing so requires you to take some security precautions and to make sure they are kept current.

Answer the questions

1. Where can people get a wireless connection to the Internet?

2. What are the main advantages of WiFi?

3. How does WiFi function?

4. What type of physical connection does the router use sending information to the Internet?

5. How do WiFi radios differ from those we use in cellphones?

6. What can happen if your router fails?

7. Why are wireless home networks convenient?

8. How can users change their settings?

9. What function is very important in home wireless networks?

10. What can happen if the router has created an open hotspot?

Text 4

Worldwide Interoperability for Microwave Access

In some developing countries teledensity is so low that expanding a wired network to cover the entire population is very expensive. The result is that they can bypass an old technology and move straight to a national wireless network to provide broadband and voice (VoIP) services. In such countries Wi-Max, or Worldwide Interoperability for Microwave Access, has already made a huge impact. It delivers high-speed access wirelessly, enabling fixed and mobile broadband services over large coverage areas. It is an IP-based system which comes in fixed and mobile versions. Fixed Wi-Max is suited for delivering wireless access for fixed broadband services similar to DSL. Mobile Wi-Max supports both fixed and mobile applications with improved performance and capacity.

On the other hand, 3G is a well-established wireless network in developed countries. 3G technology has evolved from the voice-centric telecoms world. It is able to deliver voice and high-speed broadband access. Recent years have seen the growth of huge 3G networks in the developed world.

In the longer term scientists are starting to see the convergence of Wi-Max and 3G. The fact is that Wi-Max has broadened to become more mobile and capable for media services, while 3G cellular is becoming increasingly broadband. What is more, both Wi-Max and 3G are driven to use the same core sets of technologies.

At the moment, developing countries are still having a choice between those two systems. Factually, they can use both Wi-Max and 3G platforms.

If two technologies cooperate rather than compete, then the future of broadband and voice services will be more promising.

Answer the questions

1. Why are some developing countries not developing their wired networks?

2. What technology is widespread in developing countries?

3. How can developing countries bypass an old wired technology?

4. What services does Wi-Max provide?

5. What does mobile Wi-Max support?

6. What wireless technology is established in developed countries?

7. How has 3G technology evolved?

8. What can happen to Wi-Max and 3G in the future?

9. What are the signs of practical convergence of Wi-Max and 3G technologies?

10. What platforms are equally suitable for developing countries?

Module VIII. Application of Information Technology

Text 1. Systems Integration Service in Russia

When companies merge, rapidly expand, or implement a major new software application, it usually has great implications for IT strategy. Companies typically find a service supplier to manage all elements of the IT project for them. One of the largest segments of the Russian IT market is systems integration service. It includes planning, design, implementation and project management of a solution to address a customer’s technical or business needs. Although it is difficult to set a minimum dollar limit, local systems integration projects typically exceed $50,000. Consolidation of many large Russian companies into large groups and holding companies supports demand for systems integration services, as IT infrastructures also have to be integrated.

The largest part of the market is comprised of computers based on industry-standard Intel microprocessors that typically run a Microsoft operating system. However, they are also able to run other systems such as Novell NetWare or Linux. For small networks it is possible to buy an inexpensive server backed by a vendor warranty guaranteeing on-site support. Skills to install, maintain and support these servers are relatively plentiful and thus cheap, so Russian customers tend to favour this kind of solution more than do buyers in comparable Central European markets.

Servers based on alternative processor architectures (usually Reduced Instruction Set Computer, or RISC) mostly run a variant of the UNIX operating system. These tend to be the choice of larger and richer customer organisations that are prepared to invest the necessary resources to gain the benefits of the greater reliability and scalability these more expensive servers can offer.

With huge increases in local Internet use, demand for smaller entry-level servers is growing very quickly. On smaller networks, companies often start using a commodity desktop computer as a server. When it can no longer cope, or after the first major failure, they opt for purpose-built hardware.