Where did the scientific method come from?

The scientific method


April 2004


Basics

The scientific method is an objective process used by the sciences, such as physics, to produce sound, unbiased results which, like the pieces of a puzzle, compose the scientific worldview. In addition to the natural sciences (with astronomy the oldest among them), other sciences also make use of this successful principle. Medicine, psychology, sociology, business administration, to name just a few: all these disciplines look for laws and theories, which they try to confirm on the basis of measurement data such as statistics. In this essay I will limit myself to my own area of experience, the natural sciences. In the humanities, terms such as "experiment" have to be adapted, because these sciences use different methods. Rather, procedures such as dialectics are applied there: from the opposition of thesis and antithesis, a gain in knowledge follows via synthesis.

Theory and experiment

As announced, I would like to illustrate the scientific methodology using the example of physics. Physics uses mathematics as its language and logic as its method to develop theoretical concepts. Experiments and observations of inanimate nature must support the theory, otherwise it falls. The other natural sciences, such as chemistry and biology, also follow this fundamental concept.
Accordingly, the basis of the scientific methodology is formed by the two pillars of theory and experiment. The symbiosis of these two areas of scientific work has not always existed. Aristotle (384 - 322 BC) was convinced that laws could be determined through thinking alone, i.e. through theory only. According to this philosophically shaped basic attitude, knowledge comes exclusively from the intelligible world and needs no verification through the real world. Only with Galileo Galilei (1564 - 1642), around the turn of the 16th to the 17th century, did the modern scientific method of confirming laws through observation arrive. This development led into the epoch called the Enlightenment, in which man increasingly remembered to use his own intellect. As a result of this "intellectual development step", the essential characteristics of "rational research" today are:

  • an objectification of the problem (also through the use of factual technical language),
  • a theoretical embedding (mostly through the use of mathematical language),
  • the reproducibility of the results under the same conditions.

This development step was necessary because so-called "common sense" often leads to false conclusions. The problem with "common sense" is that it is empirically shaped, i.e. it stems from our everyday experience of the world. For this reason, the great physical theories of the 20th century, relativity and quantum theory, required creative minds willing to stray from the path of "common sense". Even today, these constructs cause considerable problems of understanding and push our imagination to its limits (e.g. the curvature of a four-dimensional manifold, or the tunnel effect), even though they have been verified many times over and are recognized and highly praised in the scientific community.

Determinism

A contemporary of Galileo, René Descartes (1596 - 1650), sought, inspired by successes in mechanics and optics, to establish a closed, mechanistic world system in what became known as Cartesian determinism. At the latest since the insights of quantum theory (Heisenberg's uncertainty principle) in the 1920s and of so-called "chaos theory", more accurately called non-linear dynamics, in the 1980s, this determinism must be rejected as untenable, at least in its claim to global validity. The uncertainty principle states that the two physical quantities position and momentum (mass × velocity) cannot both be measured with arbitrary precision at the same time. In classical mechanics, bodies move on well-defined paths in phase space (a coordinate system spanned by the position and momentum axes of the system). With the quantum mechanical uncertainty, this path loses its determined character: the path is smeared out in phase space, and even phase space itself is not a continuum, but consists of phase space cells whose size is dictated by Planck's constant h.
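In compact form (a standard textbook notation, added here for illustration; the formula itself is not part of the original essay), the uncertainty relation for position x and momentum p reads:

\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}

The product of the two uncertainties can never fall below half the reduced Planck constant; correspondingly, the coarse-graining of phase space into cells amounts to \Delta x \, \Delta p \sim h per degree of freedom.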
In addition to this quantum theoretical aspect, there is a more mathematical aspect, related to the solution structure of general differential equations. Differential equations (or, more generally, integro-differential equations) determine in principle the whole of nature, because every observed process in a system can be formulated by such equations (or by a system of coupled differential equations). The oscillation of a clock pendulum in the gravitational field is described by a differential equation, as is the flow of a drop in a river, Einstein's field equations of general relativity, or the energy eigenvalue equation for the Hamilton operator of quantum mechanics. With these equations, the problem is at first only formulated, not yet solved. Mathematics provides methods to solve the differential equations. These solutions make a statement about the behavior of the system and enable forecasts about its behavior in the future. This is precisely what makes the natural sciences so valuable and interesting for industrial and everyday applications.

But nature has, to a certain extent, outsmarted us: the problem is subsumed under the term non-linearity. It is the second non-deterministic aspect, and it is deeply rooted in the mathematical formulation of many physical laws: non-linear terms mean that predictability cannot be guaranteed. One says the physical system behaves "chaotically"! Prominent examples are the double pendulum in mechanics, the weather in meteorology, population dynamics in biology, and stock prices on the stock exchange. Fortunately, not all systems are chaotic, and many can be described to a certain approximation in the "linear limit": the pendulum equation for small angular deflections is linear and deterministic. But there is also an all-clear from the side of chaos research: amazingly, the investigation of chaotic systems reveals that areas of order and chaos merge into one another and that structures (self-similarities, fractals, attractors) exist within chaos. It is a very young research area of modern physics, only a few decades old, but the emerging "laws of chaos" give cause for hope that we can come to terms with the fundamental non-predictability of non-linear systems. Descartes' determinism is therefore not generally valid, but realized only in special systems.
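How quickly predictability is lost in a chaotic system can be made tangible with a few lines of Python (a minimal sketch of my own, using the logistic map as a toy model of the population dynamics mentioned above; the essay names the example but contains no code):

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)             # reference trajectory
b = logistic_trajectory(0.2 + 1e-9)      # almost identical start value

for n in (0, 10, 20, 30, 40, 50):
    print(f"n={n:2d}   |a-b| = {abs(a[n] - b[n]):.6f}")

Two start values differing by one part in a billion are completely decorrelated after a few dozen iterations, although every single step is strictly deterministic: the initial uncertainty is amplified until no forecast remains.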

These border areas of scientific research do not tarnish the basic mood of euphoria: Galileo's initiative of a scientific methodology celebrated an unexpected triumph. Together with the beginning of modern philosophy through Descartes, it can be regarded as one of the pioneers of the Age of Enlightenment in eighteenth-century Europe, which was shaped by Protestantism and rationalism and culminated politically in the French Revolution of 1789. The result of this historical development is our current scientific worldview, which achieved resounding successes especially in the 20th century.

New laboratories

In addition to the now "classic" concept of merging theory and experiment, there are further possibilities for research in physics that are not fundamentally different, but complementary.
The older method is the thought experiment, which gained popularity, and was used extensively, through Albert Einstein (1879 - 1955). This method is distinguished by the fact that no material instruments (measuring devices) are required; instead, the experiment is carried out purely with the power of the mind. It is essential that these are not fantasies: the laws of physics remain valid and must be taken into account correctly. The advantage of this procedure is that it is immediate and inexpensive and can quickly uncover contradictions in a theory or thesis. Unfortunately, one cannot always formulate an adequate thought experiment for a given problem.
Another approach has also contributed significantly to the merging of theory and experiment and has since complemented the classic method. Accelerated by developments in computer technology since Konrad Zuse (1910 - 1995) built the first program-controlled computer, the Zuse Z3, in 1941, an intermediate area between theory and experiment has opened up: the computer simulation. Here the laws from theory, mathematical equations, are solved on the computer using numerical methods and visualized. The computer thus serves as a kind of virtual laboratory. The work of the theoretical physicist is now only rarely purely analytical, or analytical at most at the beginning of a research problem; in the end, complex equations are ported to the computer. The simulations represent, in principle, theoretical data that are compared with the experimental data of the experimental physicists. An essential aspect is the visualization of theoretical data, i.e. their representation by means of imaging techniques. It turns out that an adequate visualization technique is key to understanding and interpreting simulations!
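What such a "virtual laboratory" looks like in its simplest form can be sketched in Python (an illustrative example under my own assumptions, not taken from the essay): the non-linear pendulum equation is integrated numerically with a classical Runge-Kutta scheme and compared with the analytic small-angle solution mentioned earlier.

import math

g, L = 9.81, 1.0              # gravitational acceleration, pendulum length
omega = math.sqrt(g / L)      # angular frequency of the linearized pendulum

def acc(theta):
    """Angular acceleration of the non-linear pendulum: theta'' = -(g/L) sin(theta)."""
    return -(g / L) * math.sin(theta)

def rk4_step(theta, vel, dt):
    """One 4th-order Runge-Kutta step for the system (theta' = vel, vel' = acc)."""
    k1t, k1v = vel, acc(theta)
    k2t, k2v = vel + 0.5 * dt * k1v, acc(theta + 0.5 * dt * k1t)
    k3t, k3v = vel + 0.5 * dt * k2v, acc(theta + 0.5 * dt * k2t)
    k4t, k4v = vel + dt * k3v, acc(theta + dt * k3t)
    theta += dt * (k1t + 2 * k2t + 2 * k3t + k4t) / 6.0
    vel += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
    return theta, vel

theta0 = 0.2                  # small initial deflection in radians
theta, vel, dt = theta0, 0.0, 0.001
for step in range(1, 5001):   # simulate 5 seconds
    theta, vel = rk4_step(theta, vel, dt)
    if step % 1000 == 0:
        t = step * dt
        linear = theta0 * math.cos(omega * t)   # analytic small-angle solution
        print(f"t={t:.1f}s   numerical={theta:+.5f}   linear={linear:+.5f}")

For the small deflection chosen here, the "simulated experiment" and the linear theory agree to several digits; increasing the initial angle shows where the linear approximation, and with it simple predictability, breaks down.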
The computer turns out to be an indispensable instrument, even a measuring instrument, of modern physics: it makes possible the representation of phenomena that are difficult or impossible for humans to access. Elaborate and lengthy simulations use supercomputers (computers with many times the computing speed and memory of conventional PCs), for example for visualizing the development of regions of the universe in cosmology, or for depicting the infall of matter onto a black hole in relativistic astrophysics.
Understanding these processes also points the way for detector construction: the researcher gets a feeling for what can be observed and how it can be detected. Then the circle of theory and experiment closes again, because out of the final verification, falsification or, where appropriate, modification of theories, a complex scientific edifice is created: the scientific worldview.

The methodical catastrophe

There is, however, one fundamental problem in the scientific method: an experimental measurement setup for testing a theory cannot always be realized! Let us again take cosmology, a sub-area of astrophysics, as an example. Cosmology deals with the laws that describe the creation and development of the universe. A popular theoretical model, the Big Bang model, is, strictly speaking, not directly provable! Otherwise one would have to let the Big Bang happen in the laboratory under controlled conditions, which seems impossible. Many such problems are conceivable, and they present the experimenter with an unsolvable task.

Fortunately, a way out can be found here as well, although it ultimately bends the fundamental principle of the experiment supporting the theory. The way out of this dilemma is to look for secondary, circumstantial evidence for the theoretical concept. There are usually other observable phenomena. In the present example from cosmology, this can be the cosmic background radiation, which is very isotropic apart from slight temperature fluctuations (COBE and WMAP data). The isotropy is a strong indication that an explosion took place at one point in the past, called the Big Bang.
On the other hand, another observation window will shortly be opened for astronomers: gravitational wave astronomy. Gravitational waves are curvatures of space-time that propagate as waves at the speed of light. Catastrophic events, such as the merging of two neutron stars into a black hole, or the Big Bang itself, are promising candidates for intense gravitational waves. In Hanover, such a detector based on laser interferometers, called GEO 600, was built, but it has unfortunately not yet been able to measure gravitational waves. Cosmological models provide a precise idea of the frequency and intensity of the gravitational waves of the Big Bang (the theorists have a head start here) and can thus be directly verified or falsified in the future.

Summary

The natural sciences of the past 500 years have thus benefited enormously from the scientific method of theory and experiment. The 20th century in particular will go down in the history books as one of the most fruitful, unless the "super-meltdown of knowledge" continues in the 21st century (a possible field of activity: human genetics). In the computer age, theory and experiment have been supplemented by simulation. Both the natural phenomenon and the measuring device can be simulated artificially in the computer. This advantage certainly also harbors a risk if the person programming the software does not know what he is doing. But here, too, the scientific methodology helps: reproducibility quickly exposes possible errors or inadequacies.
There are obstacles, or limits to predictability, which arise in the micro-world from quantum theory and, more generally, from non-linear dynamics. But this departure from determinism can be accepted: on the one hand it does not bring the entire scientific edifice to collapse, and on the other hand it even opens up opportunities: "order in chaos".
The principle of scientific methodology can be pushed to its limits, namely when experiments on a theory can no longer be realized. But creative researchers have opened up alternative, indirect approaches that can also confirm theories. The prospects of piecing together the scientific worldview further and further, like a mosaic, are very promising, perhaps more so than ever. Whether it can be completed in principle is a question for philosophers rather than natural scientists. Perhaps Plato already answered this question almost 2500 years ago ...




© Andreas Müller, August 2007