Science

Science (from the Latin scientia, meaning "knowledge") is, in its broadest sense, any systematic knowledge that is capable of resulting in a correct prediction or reliable outcome. In this sense, science may refer to a highly skilled technique, technology, or practice.[1][2]

In today's more restricted sense, science refers to a system of acquiring knowledge based on scientific method, and to the organized body of knowledge gained through such research.[3][4] It is a "systematic enterprise of gathering knowledge about the world and organizing and condensing that knowledge into testable laws and theories".[5] This article focuses upon science in this more restricted sense, sometimes called experimental science, and also gives some broader historical context leading up to the modern understanding of the word "science."

From the Middle Ages to the Enlightenment, "science" had more-or-less the same sort of very broad meaning in English that "philosophy" had at that time. By the early 1800s, "natural philosophy" (which eventually evolved into what is today called "natural science") had begun to separate from "philosophy" in general. In many cases, "science" continued to stand for reliable knowledge about any topic, in the same way it is still used in the broad sense in modern terms such as library science, political science, and computer science. In the more narrow sense of "science" today, as natural philosophy became linked to an expanding set of well-defined laws (beginning with Galileo's laws, Kepler's laws, and Newton's laws for motion), it became more common to refer to natural philosophy as "natural science". Over the course of the 1800s, the word "science" became increasingly associated mainly with the disciplined study of the natural world (that is, the non-human world). This sometimes left the study of human thought and society in a linguistic limbo, which has today been resolved by classifying these areas of study as the social sciences.

Basic classifications

Scientific fields are commonly divided into two major groups: natural sciences, which study natural phenomena (including biological life), and social sciences, which study human behavior and societies. These groupings are empirical sciences, which means the knowledge must be based on observable phenomena and capable of being tested for its validity by other researchers working under the same conditions.[4] There are also related disciplines that are grouped into interdisciplinary and applied sciences, such as engineering and health science. Within these categories are specialized scientific fields that can include elements of other scientific disciplines but often possess their own terminology and body of expertise.[6]

Mathematics, which is classified as a formal science, has both similarities and differences with the natural and social sciences. It is similar to empirical sciences in that it involves an objective, careful and systematic study of an area of knowledge; it is different because of its method of verifying its knowledge, using a priori rather than empirical methods.[4] Formal science, which also includes statistics and logic, is vital to the empirical sciences. Major advances in formal science have often led to major advances in the empirical sciences. The formal sciences are essential in the formation of hypotheses, theories, and laws,[4] both in discovering and describing how things work (natural sciences) and how people think and act (social sciences).

Applied science (e.g., engineering) is the practical application of scientific knowledge.
History and etymology
Main articles: History of science and Scientific revolution
Personification of "Science" in front of the Boston Public Library

It is widely accepted that 'modern science' arose in the Europe of the 17th century (towards the end of the Renaissance), introducing a new understanding of the natural world.[7] While empirical investigations of the natural world have been described since antiquity (for example, by Aristotle and Pliny the Elder), and scientific methods have been employed since the Middle Ages (for example, by Alhazen and Roger Bacon), the dawn of modern science is generally traced back to the early modern period during what is known as the Scientific Revolution of the 16th and 17th centuries.[8]

The word "science" comes through the Old French, and is derived in turn from the Latin scientia, "knowledge", the nominal form of the verb scire, "to know". The Proto-Indo-European (PIE) root that yields scire is *skei-, meaning to "cut, separate, or discern".[9] Similarly, the Greek word for science is 'επιστήμη', deriving from the verb 'επίσταμαι', 'to know'. From the Middle Ages to the Enlightenment, science or scientia meant any systematic recorded knowledge.[10] Science therefore had the same sort of very broad meaning that philosophy had at that time. In other languages, including French, Spanish, Portuguese, and Italian, the word corresponding to science also carries this meaning.

Prior to the 1700s, the preferred term for the study of nature among English speakers was "natural philosophy", while other philosophical disciplines (e.g., logic, metaphysics, epistemology, ethics and aesthetics) were typically referred to as "moral philosophy". Today, "moral philosophy" is more-or-less synonymous with "ethics". Well into the 1700s, science and natural philosophy were not quite synonymous, but only became so later with the direct use of what would become known formally as the scientific method. By contrast, the word "science" in English was still used in the 17th century (1600s) to refer to the Aristotelian concept of knowledge which was secure enough to be used as a prescription for exactly how to accomplish a specific task. With respect to the transitional usage of the term "natural philosophy" in this period, the philosopher John Locke wrote disparagingly in 1690 that "natural philosophy is not capable of being made a science".[11]

Locke's assertion notwithstanding, by the early 1800s natural philosophy had begun to separate from philosophy, though it often retained a very broad meaning. In many cases, science continued to stand for reliable knowledge about any topic, in the same way it is still used today in the broad sense (see the introduction to this article) in modern terms such as library science, political science, and computer science. In the more narrow sense of science, as natural philosophy became linked to an expanding set of well-defined laws (beginning with Galileo's laws, Kepler's laws, and Newton's laws for motion), it became more popular to refer to natural philosophy as natural science. Over the course of the nineteenth century, moreover, there was an increased tendency to associate science with study of the natural world (that is, the non-human world). This move sometimes left the study of human thought and society (what would come to be called social science) in a linguistic limbo by the end of the century and into the next.[12]

Through the 1800s, many English speakers were increasingly differentiating science (i.e., the natural sciences) from all other forms of knowledge in a variety of ways. The now-familiar expression "scientific method," which refers to the prescriptive part of how to make discoveries in natural philosophy, was almost unused until then, but became widespread after the 1870s, though there was rarely total agreement about just what it entailed.[12] The word "scientist", meant to refer to a systematically working natural philosopher (as opposed to an intuitive or empirically minded one), was coined in 1833 by William Whewell.[13] Discussion of scientists as a special group of people who did science, even if their attributes were up for debate, grew in the last half of the 19th century.[12] Whatever people actually meant by these terms at first, they ultimately depicted science, in the narrow sense of the habitual use of the scientific method and the knowledge derived from it, as something deeply distinguished from all other realms of human endeavor.

By the twentieth century (1900s), the modern notion of science as a special kind of knowledge about the world, practiced by a distinct group and pursued through a unique method, was essentially in place. It was used to give legitimacy to a variety of fields through such titles as "scientific" medicine, engineering, advertising, or motherhood.[12] Over the 1900s, links between science and technology also grew increasingly strong. As Martin Rees explains, progress in scientific understanding and technology have been synergistic and vital to one another.[14]

Richard Feynman described science in the following way for his students: "The principle of science, the definition, almost, is the following: The test of all knowledge is experiment. Experiment is the sole judge of scientific 'truth'. But what is the source of knowledge? Where do the laws that are to be tested come from? Experiment, itself, helps to produce these laws, in the sense that it gives us hints. But also needed is imagination to create from these hints the great generalizations — to guess at the wonderful, simple, but very strange patterns beneath them all, and then to experiment to check again whether we have made the right guess." Feynman also observed, "...there is an expanding frontier of ignorance...things must be learned only to be unlearned again or, more likely, to be corrected."[15]
Scientific method
Main article: Scientific method

A scientific method seeks to explain the events of nature in a reproducible way, and to use these findings to make useful predictions. This is done partly through observation of natural phenomena, but also through experimentation that tries to simulate natural events under controlled conditions. Taken in its entirety, the scientific method allows for highly creative problem solving whilst minimizing any effects of subjective bias on the part of its users (notably confirmation bias).[16]
Basic and applied research

Although some scientific research is applied research into specific problems, a great deal of our understanding comes from the curiosity-driven undertaking of basic research. This leads to options for technological advance that were not planned or sometimes even imaginable. This point was made by Michael Faraday when, allegedly in response to the question "what is the use of basic research?" he responded "Sir, what is the use of a new-born child?".[17] For example, research into the effects of red light on the human eye's rod cells did not seem to have any practical purpose; eventually, the discovery that our night vision is not troubled by red light would lead militaries to adopt red light in the cockpits of all jet fighters.[18]
Experimentation and hypothesizing
DNA determines the genetic structure of all known life
The Bohr model of the atom, like many ideas in the history of science, was at first prompted by (and later partially disproved by) experimentation.

Based on observations of a phenomenon, scientists may generate a model. This is an attempt to describe or depict the phenomenon in terms of a logical, physical, or mathematical representation. As empirical evidence is gathered, scientists can suggest a hypothesis to explain the phenomenon. Hypotheses may be formulated using principles such as parsimony (traditionally known as "Occam's Razor") and are generally expected to seek consilience - fitting well with other accepted facts related to the phenomena. This new explanation is used to make falsifiable predictions that are testable by experiment or observation. When a hypothesis proves unsatisfactory, it is either modified or discarded. Experimentation is especially important in science to help establish causal relationships (to avoid the correlation fallacy). Operationalization also plays an important role in coordinating research within and across different fields.
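
The hypothesize-predict-test loop described above can be sketched in a few lines of code. The Python fragment below is only an illustration: the "law" being tested (a Hooke's-law spring), the simulated measurements, and the tolerance are all invented for the example rather than drawn from any particular experiment.

```python
# Illustrative sketch of the hypothesize-predict-test loop described above.
# The spring constant, noise level, and tolerance are invented for illustration.
import random

def hypothesis(extension_m, k=20.0):
    """Hypothesis: the spring obeys Hooke's law, F = k * x, with k = 20 N/m."""
    return k * extension_m

def run_experiment(extension_m):
    """Stand-in for a real measurement: a noisy 'true' spring with k = 19.6 N/m."""
    return 19.6 * extension_m + random.gauss(0, 0.2)

def test_hypothesis(trials=20, tolerance_n=1.0):
    """Compare falsifiable predictions with observations; count disagreements."""
    failures = 0
    for _ in range(trials):
        x = random.uniform(0.01, 0.5)  # extensions in metres
        if abs(hypothesis(x) - run_experiment(x)) > tolerance_n:
            failures += 1
    return failures

failed = test_hypothesis()
if failed == 0:
    print("Hypothesis survives this round of testing (unrefuted, not proven).")
else:
    print(f"Hypothesis contradicted in {failed} trial(s); modify or discard it.")
```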

Once a hypothesis has survived testing, it may become adopted into the framework of a scientific theory. This is a logically reasoned, self-consistent model or framework for describing the behavior of certain natural phenomena. A theory typically describes the behavior of much broader sets of phenomena than a hypothesis; commonly, a large number of hypotheses can be logically bound together by a single theory. Thus a theory is a hypothesis explaining various other hypotheses. In that vein, theories are formulated according to most of the same scientific principles as hypotheses.

While performing experiments, scientists may have a preference for one outcome over another, and so it is important to ensure that science as a whole can eliminate this bias.[19][20] This can be achieved by careful experimental design, transparency, and a thorough peer review process of the experimental results as well as any conclusions.[21][22] After the results of an experiment are announced or published, it is normal practice for independent researchers to double-check how the research was performed, and to follow up by performing similar experiments to determine how dependable the results might be.[23]
Certainty and science

Unlike a mathematical proof, a scientific theory is empirical, and is always open to falsification if new evidence is presented. That is, no theory is ever considered strictly certain, as science works under a fallibilistic view. Instead, science aims to make predictions that hold with high probability, bearing in mind that the most likely event is not always the one that actually happens. During the Yom Kippur War, cognitive psychologist Daniel Kahneman was asked to explain why one squadron of aircraft had returned safely, yet a second squadron on the exact same operation had lost all of its planes. Rather than conduct a study in the hope of a new hypothesis, Kahneman simply reiterated the importance of expecting some coincidences in life, explaining that absurdly rare things, by definition, occasionally happen.[24]
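
The point that absurdly rare things occasionally happen can be made quantitative: the probability that an event with per-trial probability p occurs at least once in n independent trials is 1 - (1 - p)^n. The short Python sketch below uses invented numbers purely to illustrate the arithmetic.

```python
# Illustrative arithmetic for "rare things occasionally happen": the chance that
# an event with per-trial probability p occurs at least once in n independent
# trials is 1 - (1 - p)**n. The numbers below are invented for illustration.
def prob_at_least_once(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

for p, n in [(0.001, 100), (0.001, 1_000), (0.001, 10_000)]:
    print(f"p = {p}, trials = {n}: P(at least once) = {prob_at_least_once(p, n):.3f}")
# Prints roughly 0.095, 0.632, and 1.000 -- a "one in a thousand" event becomes
# near-certain given enough opportunities for it to occur.
```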
Though the scientist believing in evolution admits uncertainty, she is probably correct

Theories very rarely result in vast changes in our understanding. According to psychologist Keith Stanovich, it may be the media's overuse of words like "breakthrough" that leads the public to imagine that science is constantly proving everything it thought was true to be false.[25] While there are such famous cases as the theory of relativity that required a complete reconceptualization, these are extreme exceptions. Knowledge in science is gained by a gradual synthesis of information from different experiments, by various researchers, across different domains of science; it is more like a climb than a leap.[26] Theories vary in the extent to which they have been tested and verified, as well as their acceptance in the scientific community. For example, heliocentric theory, the theory of evolution, and germ theory still bear the name "theory" even though, in practice, they are considered factual.[27]

Philosopher Barry Stroud adds that, although the best definition for "knowledge" is contested, being skeptical and entertaining the possibility that one is incorrect is compatible with being correct. Ironically then, the scientist adhering to proper scientific method will doubt themselves even once they possess the truth.[28]

Stanovich also asserts that science avoids searching for a "magic bullet"; it avoids the single cause fallacy. This means a scientist would not ask merely "What is the cause of...", but rather "What are the most significant causes of...". This is especially the case in the more macroscopic fields of science (e.g. psychology, cosmology).[29] Of course, research often analyzes only a few factors at once, but these are always added to the long list of factors that are most important to consider.[30] For example, knowing the details of only a person's genetics, or their history and upbringing, or the current situation may not explain a behaviour, but a deep understanding of all these variables combined can be very predictive.
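
As a toy illustration of this multi-cause point (with entirely invented data and factor names), the Python sketch below shows that an outcome driven by several factors is predicted much better by all of them combined than by any single factor alone.

```python
# Toy illustration (invented data) of the "no single cause" point above: an
# outcome driven by several factors correlates far more strongly with the
# combination of those factors than with any one factor on its own.
import random

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
genetics   = [random.gauss(0, 1) for _ in range(2000)]
upbringing = [random.gauss(0, 1) for _ in range(2000)]
situation  = [random.gauss(0, 1) for _ in range(2000)]
# The simulated behaviour depends on all three factors plus noise.
behaviour = [g + u + s + random.gauss(0, 1)
             for g, u, s in zip(genetics, upbringing, situation)]

single   = pearson_r(genetics, behaviour)  # one factor alone
combined = pearson_r([g + u + s for g, u, s in zip(genetics, upbringing, situation)],
                     behaviour)            # all factors together
print(f"correlation using genetics alone: {single:.2f}")       # roughly 0.5
print(f"correlation using all three factors: {combined:.2f}")  # roughly 0.87
```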