
- •2. Central Processing Unit. 1155
- •3. Input-Output Environment. 1188
- •4. Printers. 1381
- •Digital – Analog Technologies. 1434
- •8. The First Programmer. 1281
- •Age of Thinking Machines. 1261
- •14. What Are Newsgroups? Part 1. 1246
- •15. What Are Newsgroups? Part 2. 1307
- •16. Input Devices. 1400
- •17. Mobile Software Development. 1186
- •18. Integrated Services Digital Network. 1085
- •19. Mobile Web. 1347
- •20. Modem. 1239
- •21. Voice Over Internet Protocol. 1134
- •22. A Computer System. 1173
- •23. Cable Modem. 1124
- •24. Limitations of Mobile Internet. 1243
- •25. Will the computers think instead of us? 1110
- •26. How does the Net work? 1308
- •27. How does e-mail work? 1488
- •28. Stored Program Architecture. 1406
- •30. Electronic Computer Memory. Part 2. 1149
Digital – Analog Technologies. 1434
Digital information can be transmitted faster and more clearly than analog signals, because the electrical impulses only need to correspond to two digits and not to the full range of qualities that compose the original message, such as the pitch and volume of a human voice.
Large parts of the telephone system are now digital, but the link between your handset and the exchange is still analog. A telephone conversation starts out as an analog signal, is converted along the way to a digital signal, and is finally converted back to analog. Both digital data and analog signals may go through a number of back-and-forth conversions during their journeys.
The modem is not the only solution to digital-to-analog and analog-to-digital conversion. There are many devices and methods, often incorporating specialized codec (compression/decompression) algorithms. One of the reasons for going digital is the benefit of data compression.
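As a toy illustration of the compression benefit mentioned above, here is a minimal run-length encoder in Python. The function names and the bit string are illustrative assumptions, not anything from the text; real codecs are far more sophisticated.

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Compress a bit string into (symbol, run-length) pairs."""
    runs: list[tuple[str, int]] = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)   # extend the current run
        else:
            runs.append((b, 1))               # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, run-length) pairs back into the bit string."""
    return "".join(b * n for b, n in runs)

signal = "0000001111100000"
runs = rle_encode(signal)
print(runs)                          # [('0', 6), ('1', 5), ('0', 5)]
assert rle_decode(runs) == signal    # lossless round trip
```

Sixteen bits collapse to three pairs; digital signals with long uniform runs compress especially well under schemes like this.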
Computer-controlled industrial processes rely on a number of inputs, such as temperature, pressure, position and process rate, measured by analog devices that typically convert their respective signals to a variable current. That current, in turn, is converted to digital data readable by a controlling computer, which processes the information and sends instructions back to the machines.
There's nothing new about that; in precomputer days it was all done with mechanical detection and control systems.
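The measure-and-convert step described above can be sketched in Python. The voltage range, bit width, and sensor value below are illustrative assumptions; a real analog-to-digital converter does this in hardware.

```python
def adc_sample(voltage: float, v_ref: float = 5.0, bits: int = 8) -> int:
    """Quantize an analog voltage in [0, v_ref] to an n-bit digital code,
    as an analog-to-digital converter would."""
    levels = 2 ** bits
    voltage = min(max(voltage, 0.0), v_ref)      # clamp to the input range
    return int(voltage / v_ref * (levels - 1))

def dac_output(code: int, v_ref: float = 5.0, bits: int = 8) -> float:
    """Convert an n-bit code back to an approximate analog voltage."""
    return code / (2 ** bits - 1) * v_ref

sensor_v = 2.5                  # e.g. a temperature sensor reading, in volts
code = adc_sample(sensor_v)
print(code)                     # 127
print(round(dac_output(code), 3))
```

Note that the reconstructed voltage is only approximate: quantization discards information, which is the price of a digital representation.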
8. The First Programmer. 1281
The first computers with stored programs and a central processor that executed instructions provided by users were built in the late 1940s by a team led by John von Neumann. “Real” programming could be said to date from these machines.
Yet there were many previous machines that could be “programmed” in the sense that data could be supplied, usually in the form of cards or paper tapes, that would affect what the machine did.
The first machine of this type devoted entirely to computation was invented by Charles Babbage in the 1830s. “Programs” for his Analytical Engine consisted of a sequence of cards with data and operations.
In 1991 Babbage’s Difference Engine, a simpler machine than the Analytical Engine but one that was likewise never completed in his lifetime, was constructed at the National Museum of Science in London from drawings he left. The success of this project indicates that the Analytical Engine, had it been built, would probably have worked, 100 years before its electronic counterpart was invented.
Although only parts of the machine were ever built, several examples of the computations it could perform were developed by Ada Augusta, a daughter of Lord Byron. For this reason she is considered to be the first programmer, and the programming language Ada was named after her.
9. Machine Languages. 1163
Part 1. FORTRAN.
With the advent of general-purpose digital computers with stored programs in the early 1950s, the task of programming became a significant challenge. The five units of the computer (input, output, memory, arithmetic-logic, and control) must communicate with each other. They do so by means of a machine language, which uses a code composed of combinations of electric pulses.
This soon gave way to assembly languages, which use symbols and mnemonics to express the underlying machine codes. However, assembly languages are highly machine dependent and are written using a syntax very unlike natural language. They are sometimes referred to as “low-level” languages.
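The idea of mnemonics standing for machine codes can be sketched with a toy assembler in Python. The mnemonics, opcode values, and one-byte instruction format below are invented for illustration; real assemblers handle labels, addressing modes, and much larger instruction sets.

```python
# A toy machine: each mnemonic stands for a numeric opcode byte,
# which is the translation a real assembler performs at far larger scale.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source: list[str]) -> bytes:
    """Translate 'MNEMONIC operand' lines into raw machine bytes."""
    program = bytearray()
    for line in source:
        parts = line.split()
        program.append(OPCODES[parts[0]])      # opcode byte
        if len(parts) > 1:
            program.append(int(parts[1]))      # operand byte
    return bytes(program)

code = assemble(["LOAD 10", "ADD 20", "STORE 30", "HALT"])
print(code.hex())   # 010a0214031eff
```

The symbolic source is readable to a human; the hex string is what the machine actually consumes, which is exactly the gap assembly languages bridge.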
The first high-level language was FORTRAN, developed between 1954 and 1957 by a team at IBM led by John Backus. It was designed for scientific and computational programming, as its name implies (FORmula TRANslation), and its descendants are still significant in scientific applications today.
However, it has also been used for general-purpose programming, and many new features taken from other languages have been added through the years (FORTRAN II, FORTRAN IV, FORTRAN 66, FORTRAN 77, FORTRAN 90).
10. Machine Languages. 1177
Part 2. COBOL.
COBOL (Common Business-Oriented Language) was developed for the U.S. Department of Defense (1959–1960) by a team led by Grace Hopper of the Navy.
This language was quickly adopted by banks and corporations for large-scale record-keeping and other business applications. It is perhaps still the most widely used programming language, but it has been largely ignored by the academic community. (Business schools often offer courses on COBOL programming, but computer science departments generally do not.)
This is partially due to the extreme wordiness of the language. (The design was supposed to permit nonprogrammers to read and understand programs, but it only complicated the syntax without providing true readability.)
Complex algorithms are also extremely difficult to program in COBOL, and the language contributed only a few new features to language design. However, those features are significant:
- the record structure for organizing data,
- the separation of data structures from the execution section of a program,
- versatile formatting for output using “pictures”, or examples of the desired format (still used in some database languages today).
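The “picture” idea, a format described by example, can be sketched in Python. The function below is my own simplification and handles only the digit symbol `9` and a decimal point; real COBOL PICTURE clauses support many more symbols (signs, currency, editing characters).

```python
def format_picture(value: float, picture: str) -> str:
    """Format a number according to a COBOL-style picture such as '999.99':
    each 9 is one digit position, '.' marks the decimal point.
    (A toy sketch: real pictures are far richer than this.)"""
    if "." in picture:
        int_pic, frac_pic = picture.split(".")
        frac_digits = len(frac_pic)
    else:
        int_pic, frac_digits = picture, 0
    text = f"{value:.{frac_digits}f}"
    int_part, _, frac_part = text.partition(".")
    int_part = int_part.zfill(len(int_pic))        # pad with leading zeros
    return int_part + ("." + frac_part if frac_digits else "")

print(format_picture(7.5, "999.99"))   # 007.50
print(format_picture(42, "9999"))      # 0042
```

The appeal is that the picture shows the desired output shape directly, which is why the device survives in some database and report-writing languages.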
11. Machine Languages. 1251
Part 3. Algol.
Algol (ALGOrithmic Language) was developed by a committee (1958—1960) to provide a general, expressive language for describing algorithms, both in research and in practical applications.
It is hard to overestimate the influence and importance of this language for future language development. Most of the current imperative languages are derivatives of Algol, including Pascal, C, and Ada.
Research papers today still often use Algol or Algol-like syntax to describe algorithms. It achieved widespread practical acceptance in Europe for general programming tasks, but was rarely used outside of academic circles in the United States.
Algol 60 introduced many concepts into programming, including free-format source text, structured statements, begin-end blocks, type declarations for variables, recursion, and pass-by-value parameters. It also implicitly introduced the stack-based runtime environment for block-structured languages, which is still the major method for implementing such languages today.
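The stack-based runtime can be sketched in Python by making the call stack explicit. The frame dictionary and the recursive factorial below are illustrative assumptions; a real implementation keeps activation records in machine memory, not in a Python list.

```python
# Each call pushes a frame (an "activation record") holding its locals;
# each return pops it. This is the discipline Algol-style runtimes use.
call_stack: list[dict] = []

def factorial(n: int) -> int:
    frame = {"n": n}                 # this call's activation record
    call_stack.append(frame)
    print("enter:", [f["n"] for f in call_stack])   # watch the stack grow
    result = 1 if n <= 1 else n * factorial(n - 1)
    call_stack.pop()                 # frame discarded on return
    return result

print(factorial(4))   # 24
```

Because frames are created and destroyed in strictly last-in, first-out order, a stack is exactly the right structure, and it is what makes recursion cheap to support.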
At the same time that these three languages were created, based on the standard von Neumann architecture of computers, other languages were being developed based on the mathematical concept of function. Two major examples of such languages are LISP and APL.