Computer science

Computer science is the study of the theoretical
foundations of information and computation and how
they can be implemented in computer systems.[1][2][3] It is
a broad discipline, with many fields. For example,
computer programming involves the use of specific
programming languages to craft solutions to concrete
computational problems. Computer graphics relies on
algorithms that help generate and alter visual images
synthetically. Computability theory helps us understand
what may or may not be computed, using current
computers. On a fundamental level, computer science
enables us to communicate with a machine, allowing us to
translate our thoughts and ideas into machine language, to
give instructions that the machine can follow, and to
obtain the types of responses we desire.
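
As a small, hypothetical illustration of what it means to craft a solution to a concrete computational problem, here is a Python sketch (not part of the original article) of Euclid's algorithm for the greatest common divisor:

```python
def gcd(a: int, b: int) -> int:
    """Greatest common divisor via Euclid's algorithm."""
    while b != 0:
        # Replace (a, b) with (b, a mod b); the remainder shrinks
        # every iteration, so the loop is guaranteed to terminate.
        a, b = b, a % b
    return a

print(gcd(252, 105))  # -> 21
```

The loop body is exactly the kind of precise, machine-followable instruction the paragraph above describes: each step is unambiguous, and the sequence provably reaches an answer.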

Computer science has touched practically every aspect of modern-day life. For instance, it has led to the invention of general-purpose
computers, for tasks ranging from routine writing and computing to
specialized decision making. It has led to the development of the
Internet, search engines, e-mail, instant messaging, and e-commerce,
bringing about a revolution in our ability to access and
communicate information and to conduct financial transactions. By
enabling the development of computer graphics and sound systems,
it has led to new ways of creating slides, videos, and films. These, in
turn, have given birth to new approaches for teaching and learning.
For research in various fields, computer science has greatly
enhanced the processes of data gathering, storage, and analysis,
including the creation of computer models. By fostering the
development of computer chips, it has aided in the control of such
things as mobile phones, home appliances, security alarms, heating
and cooling systems, and space shuttles. In medicine, it has led to
the creation of new diagnostic and therapeutic approaches. For
national defense, it has led to the development of precision
weaponry.

Through the development of robots, it has enabled the
automation of industrial processes and helped in such
tasks as defusing bombs, exploring uncharted territories,
and finding disaster victims.
On the downside, knowledge of computer science can
also be misused, such as in creating computer viruses,
computer hacking, and "phishing" for private
information. These activities can lead to huge economic
losses, theft of identity and confidential information, and
breach of national security. In addition, the fruits of
computer science—particularly the Internet and its
associated forms of communication—can be used to
spread falsehoods, motivate immoral or unethical
behavior, or promote acts of terrorism and war. Such
misuse can create enormous problems for society.

History
The earliest known tool for computation was the abacus, thought
to have been invented in Babylon around 2400 B.C.E. It was
originally used by drawing lines in sand and placing pebbles along
them. In the fifth century B.C.E., the Indian grammarian Pāṇini
formulated sophisticated rules of grammar for Sanskrit. His work
became the forerunner to modern formal language theory and a
precursor to computing. Between 200 B.C.E. and 400 C.E., Jaina
mathematicians in India invented the logarithm. Much later, in the
early seventeenth century, John Napier discovered logarithms for
computational purposes, and that was followed by the invention of
various calculating tools.
None of the early computational devices were computers in the
modern sense. It took considerable advances in mathematics and
theory before the first modern computers could be designed.
Charles Babbage, called the "father of computing," described the
first programmable device—the "Analytical Engine"—in 1837, more
than a century before the first computers were built. His engine,
although never successfully constructed, was designed to be
programmed—the key feature that set it apart from all preceding
devices.

Prior to the 1920s, the term computer referred to a
human clerk who performed calculations, usually
under the direction of a physicist. Thousands of
these clerks, mostly women with a degree in calculus,
were employed in commerce, government, and
research establishments. After the 1920s, the
expression computing machine was applied to any
machine that performed the work of a human
computer—especially work that involved following a
list of mathematical instructions repetitively.
Kurt Gödel, Alonzo Church, and Alan Turing were
among the early researchers in the field that came to
be called computer science. In 1931, Gödel
introduced his "incompleteness theorem," showing
that there are limits to what can be proved and
disproved within a formal system. Later, Gödel and
others defined and described these formal systems.

In 1936, Turing and Church independently introduced the
formalization of an algorithm (a set of mathematical instructions),
with limits on what can be computed, and a "purely mechanical"
model for computing.
These topics are covered by what is now called the Church–Turing
thesis, which claims that any calculation that is possible can be
performed by an algorithm running on a mechanical calculation
device (such as an electronic computer), if sufficient time and
storage space are available.
Turing, who has been called the "father of computer science,"
also described the "Turing machine"—a theoretical machine
with an infinitely long tape and a read/write head that moves
along the tape, changing the values along the way. Such a
machine could never literally be built, since its tape is infinite,
but the model can simulate the computation of algorithms that
can be performed on modern computers.
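
The model is simple enough to simulate directly. The following minimal Python sketch (an illustration added here, not from the original article) runs a hypothetical transition table that inverts a binary string, extending the tape on demand to stand in for the infinitely long tape:

```python
# Minimal Turing machine simulator: a finite-state head reads one
# symbol at a time, writes a symbol back, and moves left or right.
def run_turing_machine(tape, transitions, state="start", blank="_"):
    cells, head = list(tape), 0
    while state != "halt":
        state, write, move = transitions[(state, cells[head])]
        cells[head] = write
        head += 1 if move == "R" else -1
        if head == len(cells):      # grow the tape to the right
            cells.append(blank)
        elif head < 0:              # grow the tape to the left
            cells.insert(0, blank)
            head = 0
    return "".join(cells)

# (state, symbol read) -> (next state, symbol to write, move)
invert = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),  # blank marks the end of input
}

print(run_turing_machine("0110_", invert))  # -> "1001__"
```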

Claude E. Shannon founded the field of information theory
with his 1948 paper "A Mathematical Theory of
Communication." In it, he applied probability theory to the
problem of how to best encode the information a sender
wants to transmit. This work is one of the theoretical
foundations for many areas of study, including data
compression and cryptography.
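
To give a flavor of the encoding problem Shannon studied, the following Python sketch (an illustrative addition, assuming symbols are drawn independently) computes the Shannon entropy of a message, a lower bound on the average number of bits per symbol that any encoding can achieve:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """H = -sum(p * log2(p)) over the symbol frequencies of the message."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Skewed symbol frequencies need fewer bits per symbol than uniform
# ones, which is the principle behind data compression.
print(round(shannon_entropy("aaaaaaab"), 3))  # 0.544
print(shannon_entropy("abcdefgh"))            # 3.0
```
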
During the 1940s, with the onset of electronic digital
equipment, the phrase computing machines gradually gave
way to just computers, referring to machines that performed
the types of calculations done by human clerks in earlier
years.
Over time, as it became clear that computers could be used
for more than just mathematical calculations, the field of
computer science broadened to study computation in
general and branched into many subfields, such as artificial
intelligence. Computer science began to be established as a
distinct academic discipline in the 1960s, with the creation of
the first computer science departments and degree
programs.[4]

In 1975 Bill Gates cofounded Micro-Soft, later known as
Microsoft Corporation, with former classmate Paul Allen.
Landing lucrative deals developing the operating systems for
the computers of that time, and employing aggressive
marketing practices, Microsoft became the largest software
company in the world. Its premier product, the Windows
operating system, holds a dominant share of the personal
computer operating system market.
One year after Gates founded Microsoft, another young man,
Steve Jobs, founded Apple Computer Co. with Steve Wozniak.
From 1976 onward, Apple led the personal computer market
with its Apple I, II, and III lines of desktop computers, until
IBM (International Business Machines Corporation) released
its IBM PC in 1981. The rivalry between Apple and Microsoft
has continued well into the twenty-first century, with Apple
possessing a relatively small portion of the computer market.
With computers getting smaller and more powerful, they
have become indispensable to modern life, and some are
even used in decision-making capacities.