Computer
A computer is a machine designed for manipulating data according to a list of instructions known as a program.
Computers are versatile. In fact, they are universal information processing machines. According to the Church-Turing thesis, a computer with a certain minimum threshold capability is in principle capable of performing the tasks of any other computer, from those of a personal digital assistant to a supercomputer, as long as time and memory capacity are not considerations. Therefore, the same computer designs may be adapted for tasks ranging from processing company payrolls to controlling unmanned spaceflights. Due to technological advancement, modern electronic computers are exponentially more capable than those of preceding generations (a phenomenon partially described by Moore's Law).
History of computing
ENIAC — a milestone in computing history
Originally, the term "computer" referred to a person who performed numerical calculations, possibly with the aid of a mechanical calculating device. Examples of early calculating devices, the first ancestors of the computer, included the abacus, sometimes described as the earliest computer because it served as a calculating aid, and the Antikythera mechanism, an ancient Greek device for calculating the movements of planets, dating from about 87 BC.[1] The end of the Middle Ages saw a reinvigoration of European mathematics and engineering, and Wilhelm Schickard was the first of a number of European engineers to construct a mechanical calculator, building his device in 1623.[2]
Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, the Analytical Engine, in the 1830s, but due to a combination of the limits of the technology of the time, limited finance, and an inability to resist tinkering with his design, the device was never actually constructed in his lifetime. A number of technologies that would later prove useful in computing, such as the punch card and the vacuum tube, had appeared by the end of the 19th century, and large-scale automated data processing using punch cards was performed by tabulating machines designed by Herman Hollerith.
How computers work: the stored program architecture
While the technologies used in computers have changed dramatically since the first electronic, general-purpose computers of the 1940s, most still use the stored program architecture (sometimes called the von Neumann architecture, although its primary inventors were arguably the ENIAC designers J. Presper Eckert and John William Mauchly). The design made the universal computer a practical reality.
The architecture describes a computer with four main sections: the arithmetic and logic unit (ALU), the control circuitry, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by bundles of wires (called "buses" when the same bundle supports more than one data path) and are usually driven by a timer or clock (although other events could drive the control circuitry).
Conceptually, a computer's memory can be viewed as a list of cells. Each cell has a numbered "address" and can store a small, fixed amount of information. This information can either be an instruction, telling the computer what to do, or data, the information which the computer is to process using the instructions that have been placed in the memory. In principle, any cell can be used to store either instructions or data.
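As a concrete illustration of this model, here is a minimal sketch in Python of a toy stored-program machine: memory is a list of numbered cells, some holding instructions and some holding data, and a program counter steps through them. The instruction names (LOAD, ADD, STORE, HALT) and their layout are invented purely for illustration and do not correspond to any real machine.

    # Toy stored-program machine (hypothetical instruction set, for illustration only).
    # Memory is a single list of numbered cells; cells 0-3 hold instructions,
    # cells 7-9 hold data. In a real machine both would simply be bit patterns.
    memory = [
        ("LOAD", 7),     # cell 0: copy the value in cell 7 into the accumulator
        ("ADD", 8),      # cell 1: add the value in cell 8 to the accumulator
        ("STORE", 9),    # cell 2: write the accumulator back into cell 9
        ("HALT", None),  # cell 3: stop
        None, None, None,
        5,               # cell 7: data
        37,              # cell 8: data
        0,               # cell 9: the result will be stored here
    ]

    accumulator = 0
    pc = 0  # program counter: the address of the next instruction to fetch

    while True:
        op, addr = memory[pc]  # fetch the instruction stored at address pc
        pc += 1
        if op == "LOAD":
            accumulator = memory[addr]
        elif op == "ADD":
            accumulator += memory[addr]
        elif op == "STORE":
            memory[addr] = accumulator
        elif op == "HALT":
            break

    print(memory[9])  # prints 42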
Instructions, like data, are represented within the computer as binary code, a base-two system of counting. For example, the code for one kind of "copy" operation in the Intel x86 line of microprocessors is 10110000. The particular instruction set that a specific computer supports is known as that computer's machine language. Using an already-popular machine language makes it much easier to run existing software on a new machine; consequently, in markets where commercial software availability is important, suppliers have converged on one or a very small number of distinct machine languages.
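For instance, 10110000 is the byte 0xB0, which in real x86 machine code means "copy the byte that follows into the AL register" (MOV AL, imm8). The short Python sketch below decodes one such two-byte instruction; it handles only this single opcode and is not a general x86 decoder.

    # Decode the two-byte x86 instruction B0 2A: opcode 0xB0 (binary 10110000)
    # followed by the immediate data byte 0x2A (decimal 42).
    instruction = bytes([0b10110000, 42])

    registers = {"AL": 0}

    opcode, operand = instruction[0], instruction[1]
    if opcode == 0xB0:              # MOV AL, imm8: copy the operand into AL
        registers["AL"] = operand
    else:
        raise ValueError("only the MOV AL, imm8 opcode is handled here")

    print(registers)  # {'AL': 42}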
Larger computers, such as some minicomputers, mainframe computers, and servers, differ from the model above in one significant aspect: rather than one CPU they often have several. Supercomputers often have highly unusual architectures significantly different from the basic stored-program architecture, sometimes featuring thousands of CPUs, but such designs tend to be useful only for specialized tasks.
Libraries and operating systems
Soon after the development of the computer, it was discovered that certain tasks were required in many different programs; an early example was computing some of the standard mathematical functions. For the purposes of efficiency, standard versions of these were collected in libraries and made available to all who required them. A particularly common set of tasks related to handling the gritty details of "talking" to the various I/O devices, so libraries for these were quickly developed. By the 1960s, with computers in wide industrial use for many purposes, it became common for them to be used for many different jobs within an organization; the software that managed the hardware and scheduled these jobs came to be known as the operating system.
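The same idea survives today: instead of reimplementing common routines, a program simply calls a library. A minimal Python sketch using the standard math module:

    import math

    # Standard mathematical functions come from a shared library rather than
    # being rewritten in every program.
    print(math.sqrt(2.0))         # 1.4142135623730951
    print(math.sin(math.pi / 2))  # 1.0
    print(math.log(8, 2))         # 3.0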
Computer applications
Computer-controlled robots are now common in industrial manufacture.
Computer-generated imagery (CGI) is a central ingredient in motion picture visual effects. The seawater creature in The Abyss (1989) marked the acceptance of CGI in the visual effects industry.
Furby: many modern, mass-produced toys would not be possible without low-cost embedded computers.
The first digital computers, with their large size and cost, mainly performed scientific calculations, often to support military objectives. The ENIAC was originally designed to calculate ballistics firing tables for artillery, but it was also used to calculate neutron cross-sectional densities to help in the design of the hydrogen bomb,[9][10] significantly speeding up its development. (Many of the most powerful supercomputers available today are also used for nuclear weapons simulations.) The CSIR Mk I, the first Australian stored-program computer, was used, amongst many other tasks, for the evaluation of rainfall patterns for the catchment area of the Snowy Mountains Scheme, a large hydroelectric generation project.[11] Others were used in cryptanalysis, for example the first programmable (though not general-purpose) digital electronic computer, Colossus, built in 1943 during World War II. Despite this early focus on scientific and military applications, computers were quickly put to use in other areas.
Networking and the Internet
Computers have been used to coordinate information in multiple locations since the 1950s, with the US military's SAGE system being the first large-scale example; it led to a number of special-purpose commercial systems such as Sabre.
In the 1970s, computer engineers at research institutions throughout the US began to link their computers together using telecommunications technology. This effort was funded by ARPA, and the computer network that it produced was called the ARPANET.
Computing professions and disciplines
In the developed world, virtually every profession makes use of computers. However, certain professional and academic disciplines have evolved that specialize in techniques to construct, program, and use computers. Terminology for different professional disciplines is still somewhat fluid and new fields emerge from time to time; however, some of the major groupings are as follows:
• Computer engineering is the branch of electrical engineering that focuses on both hardware and software design, and on the interaction between the two.
• Computer science is an academic study of the processes related to computation, such as developing efficient algorithms to perform specific tasks. It tackles questions as to whether problems can be solved at all using a computer, how efficiently they can be solved, and how to construct efficient programs to compute solutions. A huge array of specialties has developed within computer science to investigate different classes of problems.
• Software engineering concentrates on methodologies and practices to allow the development of reliable software systems while minimizing, and reliably estimating, costs and timelines.
• Information systems concentrates on the use and deployment of computer systems in a wider organizational (usually business) context.
• Many disciplines have developed at the intersection of computers with other professions; one of many examples is experts in geographical information systems who apply computer technology to problems of managing geographical information.
The Internet
The Internet is a global system of interconnected computer networks that interchange data by packet switching using the standardized Internet Protocol Suite (TCP/IP). It is a "network of networks" that consists of millions of private, public, academic, business, and government networks of local to global scope, linked by copper wires, fiber-optic cables, wireless connections, and other technologies.
The Internet carries various information resources and services, such as electronic mail, online chat, file transfer and file sharing, online gaming, and the inter-linked hypertext documents and other resources of the World Wide Web (WWW).
Terminology
The terms "Internet" and "World Wide Web" are often used in everyday speech without much distinction. However, the Internet and the World Wide Web are not one and the same. The Internet is a global data communications system: a hardware and software infrastructure that provides connectivity between computers. The Web, in contrast, is one of the services communicated via the Internet, a collection of interconnected documents and other resources linked by hyperlinks and URLs.
History
Creation
A 1946 comic science-fiction story, A Logic Named Joe, by Murray Leinster anticipated the Internet and many of its strengths and weaknesses. However, it took more than a decade before reality began to catch up with this vision.
The USSR's launch of Sputnik spurred the United States to create the Advanced Research Projects Agency, known as ARPA, in February 1958 to regain a technological lead.[2][3] ARPA created the Information Processing Technology Office (IPTO) to further the research of the Semi-Automatic Ground Environment (SAGE) program, which had networked country-wide radar systems together for the first time. J. C. R. Licklider was selected to head the IPTO, and saw universal networking as a potentially unifying human revolution.