Friday, 1 February 2013


Silent Spring, written by marine biologist Rachel Carson and first published in 1962, was the first book to explain in detail the harmful effects of DDT and other chemicals on the environment, including human beings. The book is widely credited with helping to launch the environmental movement, and it facilitated the ban of the pesticide DDT in the United States in 1972. It was named one of the 25 greatest science books of all time by the editors of Discover magazine and has been featured in many lists of the best nonfiction books of the twentieth century: it was ranked No. 5 in the Modern Library List of Best 20th-Century Nonfiction and No. 78 in the conservative National Review's list.
Rachel Carson started her career as a marine biologist in the U.S. Bureau of Fisheries and became a full-time nature writer in the 1950s with best-selling books on ocean life like The Sea Around Us. In 1958 she received a letter from Olga Owens Huckins that turned her attention towards the harmful effects of chemicals and subsequently paved the way for Silent Spring. The book not only created global awareness of the harmful effects of DDT and other chemicals; it created a global environmental movement. Both her concern for nature and her courage in taking on giant chemical companies and other similar groups are highly remarkable.
Contrary to popular misconception, Silent Spring is not only about DDT. Carson wrote about the damage done by all kinds of chemicals. She also suggested alternative methods to replace chemicals, but more importantly she severely criticised the human belief (arrogance?) that we can control nature, showing that we often fail to understand the consequences of our actions. Her style of writing is very lively, and she explains the facts in a way that any layman can understand. She manages to strike a perfect balance between citing relevant facts and maintaining the reader's curiosity.
She starts with the story of an imaginary village that was destroyed by the use of DDT, which is sure to shock readers and make them realise the great danger of using pesticides. She then analyses the effects of pesticides on humans, water, soil and the entire ecosystem in a detailed way. Her main objections to the use of chemicals are: 1) gradually, through mutations, insects become resistant to existing chemicals, so the chemicals become useless and ever more toxic ones need to be developed; the chemicals may also cause mutations in human genes; 2) they may end up destroying essential insects and plants, creating ecological imbalances; 3) because they are non-biodegradable and fat soluble, they enter the food chain, and using them in seemingly small quantities results in large accumulations in the bodies of humans and other living beings. She also questions the huge amounts of money wasted on them while cheaper alternatives are available. I think this point is particularly relevant in India, where the use of chemicals in agriculture has greatly increased farmers' cost of production.
She traces the increase in harmful insects to the destruction of biodiversity by human activities, from large-scale commercial agriculture to growing forests dominated by a single species. She notes that most of the species that travelled with explorers to new lands became pests in their new environments. In explaining the increased use of chemicals in agriculture, she notes that most of the chemicals used as pesticides were created during World War II for the purpose of chemical warfare. She argues that after the war ended, chemical companies rebranded them and began to sell them as 'pesticides'. But she does not reject them entirely; she calls for the sensible use of chemicals.
Her analysis of the structure of DDT and other chemicals such as organic phosphates reveals that these chemicals are harmful not only to human beings but to the entire ecosystem. She severely criticises the use of systemic insecticides. She then examines their effects on water, soil, birds and human health in detail, in separate chapters. She cites a great deal of evidence, not only events that occurred but also scientific research. Her style is so lucid and easy to follow that any layman can understand it. Surprisingly, she does not say anything new; everything was known before, but by bringing it all together, and through her way of writing, she created awareness among the masses of the hazards of pesticides. In this way her book virtually created the modern environmental movement against the use of chemicals in agriculture.
In explaining the effect of chemicals on water she argues, 'In the entire water-pollution problem, there is probably nothing more disturbing than the threat of widespread contamination of groundwater.' This is because chemicals pass through the water cycle, contaminating wells and other water sources. But, as she explains, the most dangerous consequence of chemicals mixing in water is the creation of new chemicals through chemical reactions; these can be very harmful even if the initial chemicals were not. The story of farmers in Colorado shockingly displays the dangers of underground contamination by chemicals. In explaining the effect on species in water sources, she cites several cases, such as Tule Lake and Lower Klamath, where birds, fish and other species died from the accumulation of chemicals like DDT. These were used as weed killers, but because they are not biodegradable they passed to other species through the food chain. The most dangerous thing is that even though the initial amount used was not harmful, the amounts found in fish and birds after accumulation were very high. The contamination of water sources also affected the migratory patterns of birds. In explaining these hazardous effects she fundamentally criticises the human arrogance of trying to bend nature to our wishes.
She explains the intricate relations existing between species and shows how insecticides are destroying them. Along with pests, good and important species such as earthworms are also killed. Birds that eat these insects die of poisoning. More dangerously, as the Florida case shows, pesticides like DDT and BHC destroy the nitrogen-fixing capacity of the soil. Because of their non-biodegradability, these chemicals collect in large quantities even when applied in small amounts. On top of all that, the crops grown in those fields contain these chemicals, which accumulate in the bodies of the people who consume them.
The almost universal presence of these pesticides shocks anyone. They have spread from the lands of the Eskimos in the Arctic to Antarctica, from our dining tables to the deserts. Her concern for the environment shows in her writing. She clearly demonstrates that human beings, in their egoistic attempt to bend nature to their wishes, are actually acting in total ignorance of the complex interconnections of nature, and she warns that ultimately this will lead to our own demise. It amazes me how prophetic her words were! For example, she gives examples of birds dying from the accumulation of DDT, but recently we have seen cases like Endosulfan, where people died from exposure to pesticides.
As the incidents she cites clearly demonstrate, the effect of chemicals on human beings is even more disastrous. Since these chemicals are non-biodegradable, their cumulative effects over long periods of time are hazardous. Because they are fat soluble, they are stored in the human body, but their effects are not seen until they reach a critical stage, by which time it is too late to take any action. As human beings are at the top of the food chain, significant amounts of chemicals are stored in their bodies even if only small amounts were sprayed. They adversely affect internal organs, and in some cases physically challenged children are born due to the effects of chemicals passed to them in the mother's womb through the placenta. Furthermore, they disturb the activities of individual cells by affecting the oxygen cycle and other important functions. Some chemicals are identified as carcinogenic, i.e. cancer-causing, even when taken in small quantities. But the most fearful of all the things she cites is that some chemicals are capable of causing mutations in genes and damaging chromosomes, altering the hereditary material itself. It makes a mockery of everything human beings do: man's attempt to control nature ends up destroying his own hereditary material.
Carson did not just explain the harmful effects of chemicals; she expressed a universal outlook that challenges the human belief in our understanding of nature. She clearly demonstrates our very limited understanding of the complex ecological system. She severely criticises our arrogance in trying to control nature and makes us realise that we are the ones who need to adjust. I really appreciate her courage in writing it. When the book was published, many chemical companies launched smear campaigns against her. She bravely fought them, and because her book met high standards in terms of facts and references, her critics lost their ground. Lately she has been criticised for 'causing deaths from malaria due to her call for a DDT ban'. Nothing could be further from the truth. DDT stopped being used against malaria because after a few years mosquitoes became resistant to it; this was exactly what she anticipated when explaining that insects gain resistance to chemicals through mutations in their genes.
Furthermore, she did not call for a total ban on insecticides; she called for their limited and intelligent use. She argued for understanding their implications for the ecological system before using them. She called for alternative methods such as using the natural enemies of insects, sterilisation techniques, ultrasound, and transgenic methods. Contrary to her critics, her views on transgenic crops were so advanced that research on them only started in the 1990s. Some of these methods, like transgenic crops, have caused adverse side effects (like Bt cotton), but others, like organic farming and insecticides made from plants such as neem, have proved quite useful.
Carson's book surprises by its relevance to today's problems. It is shocking that in India and other developing countries many chemicals described in her book are still used even though their harmful effects are well known; in India, the Endosulfan case is the best example. Surprisingly, she did not cite any new research on the topic, but she chronicled all the facts and presented them in such a way that anyone can understand. The book created awareness among people, and they started fighting against these chemicals. It demonstrates the importance of popular science.
Overall, I think this book must be read by anyone who is concerned about the environment. I am sure that after reading it, their perspective on human activities will change, and they will appreciate the complexity of the environment surrounding us.

World Databases

Internet and historical snapshots
Internet Archive / Wayback machine
The Internet Archive offers permanent access for researchers, historians, scholars, people with disabilities, and the general public to historical collections that exist in digital format. Founded in 1996, the Internet Archive now includes texts, audio, moving images, and software, as well as archived Web pages.
Wikipedia is the most famous cooperatively edited encyclopedia. Since every change is stored, Web pages' history can offer a detailed subject-based overview of the most important references of the past.
The Knowledge Centers
A collection of links to other resources for finding Web pages as they used to exist in the past.
Whenago provides quick access to historical information about what happened in the past on a given day.
World Digital Library
The World Digital Library (WDL) makes available on the Internet, free of charge and in multilingual format, significant primary materials from countries and cultures around the world.

Information retrieval engines
Freebase is an open, Creative Commons licensed repository of structured data of more than 12 million entities. It provides collaborative tools to link entities together and keep them updated.
Wolfram Alpha Computational Knowledge Engine
An attempt to compute whatever can be computed about anything. It aims to provide a single source that can be relied on by everyone for definitive answers to factual queries.
Text mining on the Web
Google Trends
Google Trends shows visual statistics about how often keywords have been searched on Google over time. Google Trends also shows how frequently topics have appeared in Google News stories, and in which geographic regions people have searched for them most.
Google Flu Trends
Google Flu Trends uses aggregated Google search data to estimate flu activity. Data available for download as well.
The Observatorium
The Observatorium project focuses on complex network dynamics in the Internet, proposing to monitor its evolution in real-time, with the general objective of better understanding the processes of knowledge generation and opinion dynamics.
We Feel Fine
A database of several million human feelings, harvested from blogs and social pages on the Web. Using a series of playful interfaces, the feelings can be searched and sorted across a number of demographic slices. A Web API is available as well.
The CyberEmotions project focuses on the role of collective emotions in creating, forming and breaking up e-communities. It makes available for download three datasets containing news and comments from the BBC News forum, Digg and MySpace, only for academic research and only after the submission of an application form.
Social data sharing
Linked Data
Linked Data is about using the Web to connect related data that was not previously linked, or using the Web to lower the barriers to linking data currently linked using other methods.
Dataverse Network Project
The Dataverse Network is an application to publish, share, reference, extract and analyze research data. It facilitates making data available to others and allows others' work to be replicated. Researchers and data authors get credit, publishers and distributors get credit, affiliated institutions get credit.
Data360 is an open-source, collaborative and free Web site. The site hosts a common and shared database, which any person or organization, committed to neutrality and non-partisanship (meaning let the data speak), can use for presentations and visualizations.
Swivel is a web site where people share reports of charts and numbers. It is free for public data, and charges a monthly fee to people who want to use it in private.
Many Eyes
An IBM initiative that allows users to upload their datasets and use a collection of tools to obtain meaningful visualizations from them. Each visualization is publicly stored on a dedicated page, where users can comment on, rate and tag it. Reuse of the data is possible and encouraged.
Conflict data
CSCW Data on Armed Conflict
CSCW and Uppsala Conflict Data Program (UCDP) at the Department of Peace and Conflict Research, Uppsala University, have collaborated in the production of a dataset of armed conflicts, both internal and external, in the period 1946 to the present. Currently, probably the most extensive dataset repository available, in particular for historic data.
The aim of the WarViews project is to create an easy-to-use front-end for the exploration of GIS data on conflict. It can run on a Web browser or it can be displayed using Google Earth.
The following are civil-war-specific datasets with additional empirical information:
Ethnic group location dataset
Ethnic power balances dataset
Collection of updated datasets and codebooks from the Uppsala Conflict Data Program (UCDP).
Partially contained in the PRIO dataset, ACLED (Armed Conflict Location and Events Dataset) is designed for disaggregated conflict analysis and crisis mapping. This dataset codes the location of all reported conflict events in 50 countries in the developing world. Data are currently being coded from 1997 to 2009 and the project continues to backdate conflict information for African states to the year of independence.
The Conflict Analysis Resource Center hosts several cross country conflict data sets and a few datasets of particular countries. Repositories also have datasets of political instability and conflict.
The Cross-National Time-Series Data Archive
The Cross-National Time-Series Data Archive provides annual data for a range of countries from 1815 to the present. Frequently cited, it is one of the "leading datasets on political violence", according to Robert Bates at Harvard University. It is "possibly the most widely used event dataset" according to Henrik Urdal, International Peace Research Institute, Oslo (PRIO).
Country-specific repositories: Iraq, Afghanistan
Collection of datasets of terrorist acts.
Data in economics and finance

International real-time data provider for decision makers in finance, business and government.
Maddison Data
Historical statistics about GDP and population data.
UNCTAD Statistics
The UNCTAD Handbook of Statistics on-line provides time series of economic data and development indicators, in some cases going back as far as 1950; the Commodity Price Statistics Online Database; the UNCTAD-TRAINS on the Internet (Trade Analysis and Information System) for trade control measures as well as import flows by origin for over 130 countries; the Foreign Direct Investment database (FDI).
OECD Statistics Portal
Large collection of datasets covering economics, demographics. Extractions are freely available, full access requires subscription.
Detailed statistics on the EU and candidate countries, and various statistical publications for sale.
Where's George?
Spatial tracking system for U.S. and Canadian dollars.
Spatial tracking system for Euro banknotes.

Scientific collaboration data
ISI Web of Knowledge
Comprehensive source of information in the sciences, social sciences, arts, and humanities. It encompasses several datasets, among which the following are maybe the most noteworthy:
Journal Citation Reports. It allows one to evaluate and compare journals using citation data drawn from over 7,500 scholarly and technical journals;
Web of Science. It consists of seven databases containing information gathered from thousands of scholarly journals, books, book series, reports, conferences, and more.
Google Scholar
Google Scholar is a search engine specialized in scholarly literature. It indexes different sources (articles, books, abstracts, theses, etc.) from several disciplines and sorts them according to number of citations, author and journal impact factor.
Scholarometer is a social tool to facilitate citation analysis and help evaluate the impact of an author's publications. It works as a software plug-in for the Firefox browser.
Scopus is a very large abstract and citation database of research literature. It is available only for registered users.
Living Science
Living Science is a real-time global science observatory based on submitted publications. It covers real-time (daily) submissions of publications in areas as diverse as Physics, Astronomy, Computer Science, Mathematics and Quantitative Biology. Contents are dynamically updated each day. Living Science is a powerful analysis tool to identify the magnitude and impact of scientific work worldwide.

Social sciences
ICPSR of the University of Michigan
ICPSR offers more than 500,000 digital files containing social science research data. Disciplines represented include political science, sociology, demography, economics, history, gerontology, criminal justice, public health, foreign policy, terrorism, health and medical care, early education, education, racial and ethnic minorities, psychology, law, substance abuse and mental health, and more.
UK Data Center of the University of Essex
The UK's largest collection of digital research data in the social sciences and humanities.
Berkeley's UC DATA Archive
UC DATA's data holdings are primarily in the areas of Political, Social and Health Sciences.
The Economic and Social Data Service (ESDS)
The Economic and Social Data Service (ESDS) is a national data service providing access and support for an extensive range of key economic and social data, both quantitative and qualitative, spanning many disciplines and themes. It contains a map of additional datasets from several European countries.
Wide data collections including sociological surveys, election studies, longitudinal studies, opinion polls, and census data. Among the materials are international and European data such as the European Social Survey, the Eurobarometers, and the International Social Survey Programme.
Gapminder Data
Gapminder is a popular technology and Web application for cross-visualisation of trends in time series of data. It also offers an archive of multiple datasets on diverse socio-economic indicators.
World Value Survey
The World Value Survey provides data about values and cultural changes in societies all over the world.
Urban data
Global Urban Observatory database
The Global Urban Observatory (GUO) offers policy-oriented urban indicators, statistics and other urban information.
Urban Observatory
U.S. based datasets about wealth, innovation and crime across cities.
Traffic data
The Next Generation Simulation (NGSIM) program was initiated by the United States Department of Transportation (US DOT). The program developed a core of open behavioral algorithms in support of traffic simulation, and collected high-quality primary traffic and trajectory data intended to support the research and testing of the new algorithms.
Swiss Federal Roads Office FEDRO
The Swiss Federal Roads Office offers a comprehensive overview on traffic flows in Switzerland. Data are collected by permanent automatic traffic counting stations and complemented by regular manual checking since 1961.
The aim of the International Traffic Database (ITDb) project is to provide traffic data to various groups (researchers, practitioners, public entities) in a format according to their particular needs, ranging from raw measurement data to statistical analysis. ITDb promotes a flexible traffic data provision format based on user needs and standard habits.
Clearing House for Transport Data
The Clearing House for Transport Data in the German Aerospace Center is the first point of contact for a quick overview of the available data. It is targeted at both organizations who gather transport-relevant data and those who wish to use the results of such research. The information offered includes the preparation of detailed metadata on the data sets, as well as notes on possible uses and sources.
Regiolab Delft
The regiolab-delft initiative started just after 2000 as a joint project led by TU Delft in association with the Municipality of Delft, the TRAIL research school, the Province of South Holland, the Ministry of Transport and several industrial partners. The archived dataset consists of over 6 years of 1-minute averaged speed and aggregate flow data from densely spaced inductive loops on the freeway network in the province of South Holland, as well as other data from intersection controllers, license plate detection cameras and much more.
The Research and Innovative Technology Administration (RITA) of the U.S. Department of Transportation offers several datasets about maritime, freights, airline, passengers, etc. traffic statistics.
ETH Travel Data Archive (ETHTDA)
The ETH Travel Data Archive (ETHTDA) is a virtual platform allowing end users to browse the archived travel data over the Web and enabling simple statistical analysis.
Metropolitan Travel Survey Archive
The Metropolitan Travel Survey Archive stores, preserves, and makes publicly available, via the Internet, travel surveys conducted by metropolitan areas, states and localities.
Infoblu is a private company providing real-time traffic monitoring services for Italy. All services are available for a fee.
Open maps
Google Maps
World-famous map service. It offers several additional services, such as Street View, user-uploaded content (photos, comments and ratings) and personalized overlays through service APIs.
OpenStreetMap (by UCL) is a free editable map of the whole world. OpenStreetMap allows you to view, edit and use geographical data in a collaborative way from anywhere on Earth.
Tracksource Brasil
Tracksource is a collaborative project aimed at creating and freely distributing maps of Brazil.
Logistics data
National Household Travel Survey
The National Household Travel Survey (NHTS) collects data on both long-distance and local travel by the American public. The joint survey gathers trip-related data such as mode of transportation, duration, distance and purpose of trip. It also gathers demographic, geographic, and economic data for analysis purposes. It is part of RITA.
Commodity Flow Survey
The Commodity Flow Survey (CFS) is the primary source of national and state-level data on domestic freight shipments by American establishments in mining, manufacturing, wholesale, auxiliaries, and selected retail industries. Data are provided on the types, origins and destinations, values, weights, modes of transport, distance shipped, and ton-miles of commodities shipped. It is part of RITA and it is conducted every five years (the last sampling was in 2007).
Climate data
Climate data from Julich Research Center.
Google introduces its data-driven philanthropic projects, among which two environmental satellite observatories:
the Earth Engine: for monitoring trends in world deforestation;
the Crisis Response: for monitoring the oil spill from the sunken Deepwater Horizon platform.
Reality mining
Reality Mining
Behavioral data collected from 100 mobile phones over 9 months. Includes both proximity and phone usage statistics. Two anonymized datasets available: single user (MySQL) and global (Matlab).
Other open data initiatives
Wide collection of public US datasets available for research.
Wide collection of public UK datasets available for research.
Digging Into Data
Launched by the National Science Foundation (NSF), it offers a collection of diverse data repositories.
Guardian Data Blog
Data journalism initiative that posts public interest (primarily UK relevant) datasets together with their analysis. A few collaborations with data visualization artists are present as well.
Google Public Data
Google offers several large datasets on diverse world socio-economic indicators and provides tools for easy visualization.

What is the Scientific Method?

Note: The post below is taken from
I am posting it here because I think it is very informative and should be read by everyone. If the author of the original post objects, I will be happy to remove it from my blog. I apologise for not asking the author's permission before re-posting.

The scientific method is the process by which scientists, collectively and over time, endeavor to construct an accurate (that is, reliable, consistent and non-arbitrary) representation of the world.
Recognizing that personal and cultural beliefs influence both our perceptions and our interpretations of natural phenomena, we aim through the use of standard procedures and criteria to minimize those influences when developing a theory. As a famous scientist once said, "Smart people (like smart lawyers) can come up with very good explanations for mistaken points of view." In summary, the scientific method attempts to minimize the influence of bias or prejudice in the experimenter when testing an hypothesis or a theory. The scientific method has four steps:
1. Observation and description of a phenomenon or group of phenomena.
2. Formulation of a hypothesis to explain the phenomena. In physics, the hypothesis often takes the form of a causal mechanism or a mathematical relation.
3. Use of the hypothesis to predict the existence of other phenomena, or to predict quantitatively the results of new observations.
4. Performance of experimental tests of the predictions by several independent experimenters and properly performed experiments.
If the experiments bear out the hypothesis it may come to be regarded as a theory or law of nature (more on the concepts of hypothesis, model, theory and law below). If the experiments do not bear out the hypothesis, it must be rejected or modified. What is key in the description of the scientific method just given is the predictive power (the ability to get more out of the theory than you put in; see Barrow, 1991) of the hypothesis or theory, as tested by experiment. It is often said in science that theories can never be proved, only disproved. There is always the possibility that a new observation or a new experiment will conflict with a long-standing theory.
As just stated, experimental tests may lead either to the confirmation of the hypothesis, or to the ruling out of the hypothesis. The scientific method requires that an hypothesis be ruled out or modified if its predictions are clearly and repeatedly incompatible with experimental tests. Further, no matter how elegant a theory is, its predictions must agree with experimental results if we are to believe that it is a valid description of nature. In physics, as in every experimental science, "experiment is supreme" and experimental verification of hypothetical predictions is absolutely necessary. Experiments may test the theory directly (for example, the observation of a new particle) or may test for consequences derived from the theory using mathematics and logic (the rate of a radioactive decay process requiring the existence of the new particle). Note that the necessity of experiment also implies that a theory must be testable. Theories which cannot be tested, because, for instance, they have no observable ramifications (such as, a particle whose characteristics make it unobservable), do not qualify as scientific theories.
If the predictions of a long-standing theory are found to be in disagreement with new experimental results, the theory may be discarded as a description of reality, but it may continue to be applicable within a limited range of measurable parameters. For example, the laws of classical mechanics (Newton's Laws) are valid only when the velocities of interest are much smaller than the speed of light (that is, in algebraic form, when v/c << 1). Since this is the domain of a large portion of human experience, the laws of classical mechanics are widely, usefully and correctly applied in a large range of technological and scientific problems. Yet in nature we observe a domain in which v/c is not small. The motions of objects in this domain, as well as motion in the "classical" domain, are accurately described through the equations of Einstein's theory of relativity. We believe, due to experimental tests, that relativistic theory provides a more general, and therefore more accurate, description of the principles governing our universe, than the earlier "classical" theory. Further, we find that the relativistic equations reduce to the classical equations in the limit v/c << 1. Similarly, classical physics is valid only at distances much larger than atomic scales (x >> 10^-8 m). A description which is valid at all length scales is given by the equations of quantum mechanics.
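To make this "reduction to the classical limit" concrete, here is a short worked expansion (my own illustration, not part of the original post) showing how the relativistic kinetic energy collapses to the familiar classical expression when v/c << 1:

```latex
% Relativistic kinetic energy:
T = (\gamma - 1) m c^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
% Binomial expansion for v/c \ll 1:
\gamma \approx 1 + \frac{1}{2}\frac{v^2}{c^2}
% Substituting back:
T \approx \frac{1}{2}\frac{v^2}{c^2}\, m c^2 = \frac{1}{2} m v^2
```

The correction terms that were dropped are of order (v/c)^4, which is why classical mechanics works so well for everyday velocities.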
We are all familiar with theories which had to be discarded in the face of experimental evidence. In the field of astronomy, the earth-centered description of the planetary orbits was overthrown by the Copernican system, in which the sun was placed at the center of a series of concentric, circular planetary orbits. Later, this theory was modified, as measurements of the planets' motions were found to be compatible with elliptical, not circular, orbits, and still later planetary motion was found to be derivable from Newton's laws.
Errors in experiments have several sources. First, there is error intrinsic to instruments of measurement. Because this type of error has equal probability of producing a measurement higher or lower numerically than the "true" value, it is called random error. Second, there is non-random or systematic error, due to factors which bias the result in one direction. No measurement, and therefore no experiment, can be perfectly precise. At the same time, in science we have standard ways of estimating and in some cases reducing errors. Thus it is important to determine the accuracy of a particular measurement and, when stating quantitative results, to quote the measurement error. A measurement without a quoted error is meaningless. The comparison between experiment and theory is made within the context of experimental errors. Scientists ask, how many standard deviations are the results from the theoretical prediction? Have all sources of systematic and random errors been properly estimated? This is discussed in more detail in the appendix on Error Analysis and in Statistics Lab 1.
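The "how many standard deviations" comparison above can be sketched in a few lines of code. This is a minimal illustration with hypothetical numbers (the function name and the measured values are my own, not from the original post); it combines random and systematic errors in quadrature, which is the standard practice for independent error sources:

```python
import math

def sigma_from_prediction(measured, error, predicted):
    """Number of standard deviations between a measurement and a prediction."""
    return abs(measured - predicted) / error

# Hypothetical measurement of g: 9.70 m/s^2, compared with the accepted 9.81 m/s^2.
random_err = 0.12      # instrument scatter (random error)
systematic_err = 0.09  # calibration bias estimate (systematic error)

# Independent error sources combine in quadrature.
total_err = math.sqrt(random_err**2 + systematic_err**2)

z = sigma_from_prediction(9.70, total_err, 9.81)
print(f"{z:.1f} standard deviations from prediction")
```

A deviation of well under two standard deviations, as here, would normally be read as consistent with the prediction; whether a given deviation is significant also depends on how carefully the systematic errors were estimated, which is exactly the point the text makes.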
As stated earlier, the scientific method attempts to minimize the influence of the scientist's bias on the outcome of an experiment. That is, when testing an hypothesis or a theory, the scientist may have a preference for one outcome or another, and it is important that this preference not bias the results or their interpretation. The most fundamental error is to mistake the hypothesis for an explanation of a phenomenon, without performing experimental tests. Sometimes "common sense" and "logic" tempt us into believing that no test is needed. There are numerous examples of this, dating from the Greek philosophers to the present day.
Another common mistake is to ignore or rule out data which do not support the hypothesis. Ideally, the experimenter is open to the possibility that the hypothesis is correct or incorrect. Sometimes, however, a scientist may have a strong belief that the hypothesis is true (or false), or may feel internal or external pressure to get a specific result. In that case, there may be a psychological tendency to find "something wrong", such as systematic effects, with data which do not support the scientist's expectations, while data which do agree with those expectations may not be checked as carefully. The lesson is that all data must be handled in the same way.
Another common mistake arises from the failure to estimate systematic errors (and, indeed, all errors) quantitatively. There are many examples of discoveries which were missed by experimenters whose data contained a new phenomenon, but who explained it away as a systematic background. Conversely, there are many examples of alleged "new discoveries" which later proved to be due to systematic errors not accounted for by the "discoverers."
In a field where there is active experimentation and open communication among members of the scientific community, the biases of individuals or groups may cancel out, because experimental tests are repeated by different scientists who may have different biases. In addition, different types of experimental setups have different sources of systematic errors. Over a period spanning a variety of experimental tests (usually at least several years), a consensus develops in the community as to which experimental results have stood the test of time.
In physics and other science disciplines, the words "hypothesis," "model," "theory" and "law" have different connotations in relation to the stage of acceptance or knowledge about a group of phenomena.
An hypothesis is a limited statement regarding cause and effect in specific situations; it also refers to our state of knowledge before experimental work has been performed and perhaps even before new phenomena have been predicted. To take an example from daily life, suppose you discover that your car will not start. You may say, "My car does not start because the battery is low." This is your first hypothesis. You may then check whether the lights were left on, or if the engine makes a particular sound when you turn the ignition key. You might actually check the voltage across the terminals of the battery. If you discover that the battery is not low, you might attempt another hypothesis ("The starter is broken"; "This is really not my car.")
The word model is reserved for situations when it is known that the hypothesis has at least limited validity. An often-cited example of this is the Bohr model of the atom, in which, in an analogy to the solar system, the electrons are described as moving in circular orbits around the nucleus. This is not an accurate depiction of what an atom "looks like," but the model succeeds in mathematically representing the energies (but not the correct angular momenta) of the quantum states of the electron in the simplest case, the hydrogen atom. Another example is Hooke's Law (which should be called Hooke's principle, or Hooke's model), which states that the force exerted by a mass attached to a spring is proportional to the amount the spring is stretched. We know that this principle is only valid for small amounts of stretching. The "law" fails when the spring is stretched beyond its elastic limit (it can break). This principle, however, leads to the prediction of simple harmonic motion, and, as a model of the behavior of a spring, has been versatile in an extremely broad range of applications.
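Hooke's law as a model, including its limited range of validity, can be sketched in a few lines of code. The spring constant, mass, and elastic limit below are invented for illustration:

```python
import math

def spring_force(k, x, elastic_limit):
    # Hooke's law as a model: restoring force F = -k x, valid only
    # while the displacement stays below the spring's elastic limit.
    if abs(x) > elastic_limit:
        raise ValueError("model invalid: spring stretched past its elastic limit")
    return -k * x

def period(m, k):
    # Within its range of validity, the model predicts simple harmonic
    # motion with period T = 2*pi*sqrt(m/k).
    return 2 * math.pi * math.sqrt(m / k)

# A small stretch is within the model's validity...
print(spring_force(k=50.0, x=0.02, elastic_limit=0.1))  # prints -1.0 (newtons)
# ...and the same model predicts the oscillation period.
print(f"{period(m=0.5, k=50.0):.3f} s")
```

The explicit elastic-limit check mirrors the point in the text: a model is not discarded for failing outside its domain, but it must not be applied there.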
A scientific theory or law represents an hypothesis, or a group of related hypotheses, which has been confirmed through repeated experimental tests. Theories in physics are often formulated in terms of a few concepts and equations, which are identified with "laws of nature," suggesting their universal applicability. Accepted scientific theories and laws become part of our understanding of the universe and the basis for exploring less well-understood areas of knowledge. Theories are not easily discarded; new discoveries are first assumed to fit into the existing theoretical framework. It is only when, after repeated experimental tests, the new phenomenon cannot be accommodated that scientists seriously question the theory and attempt to modify it. The validity that we attach to scientific theories as representing realities of the physical world is to be contrasted with the facile invalidation implied by the expression, "It's only a theory." For example, it is unlikely that a person will step off a tall building on the assumption that they will not fall, because "Gravity is only a theory."
Changes in scientific thought and theories occur, of course, sometimes revolutionizing our view of the world (Kuhn, 1962). Again, the key force for change is the scientific method, and its emphasis on experiment.
While the scientific method is necessary in developing scientific knowledge, it is also useful in everyday problem-solving. What do you do when your telephone doesn't work? Is the problem in the hand set, the cabling inside your house, the hookup outside, or in the workings of the phone company? The process you might go through to solve this problem could involve scientific thinking, and the results might contradict your initial expectations.
Like any good scientist, you may question the range of situations (outside of science) in which the scientific method may be applied. From what has been stated above, we determine that the scientific method works best in situations where one can isolate the phenomenon of interest, by eliminating or accounting for extraneous factors, and where one can repeatedly test the system under study after making limited, controlled changes in it.
There are, of course, circumstances when one cannot isolate the phenomenon of interest or when one cannot repeat the measurement over and over again. In such cases the results may depend in part on the history of a situation. This often occurs in social interactions between people. For example, when a lawyer makes arguments in front of a jury in court, she or he cannot try other approaches by repeating the trial over and over again in front of the same jury. In a new trial, the jury composition will be different. Even the same jury hearing a new set of arguments cannot be expected to forget what they heard before.
The scientific method is intricately associated with science, the process of human inquiry that pervades the modern era on many levels. While the method appears simple and logical in description, there is perhaps no more complex question than that of knowing how we come to know things. In this introduction, we have emphasized that the scientific method distinguishes science from other forms of explanation because of its requirement of systematic experimentation. We have also tried to point out some of the criteria and practices developed by scientists to reduce the influence of individual or social bias on scientific findings. Further investigations of the scientific method and other aspects of scientific practice may be found in the references listed below.
1. Wilson, E. Bright. An Introduction to Scientific Research (McGraw-Hill, 1952).
2. Kuhn, Thomas. The Structure of Scientific Revolutions (Univ. of Chicago Press, 1962).
3. Barrow, John. Theories of Everything (Oxford Univ. Press, 1991).