10. The Discovery of Vulcan
Vulcan was a planet that nineteenth-century scientists believed to exist somewhere between Mercury and the Sun. The mathematician Urbain Jean Joseph Le Verrier first proposed its existence after he and many other scientists were unable to explain certain peculiarities in Mercury’s orbit, most notably the slow drift of the point where the planet passes closest to the Sun. Scientists like Le Verrier argued that this had to be caused by some object, like a small planet or moon, exerting a gravitational pull on Mercury. Le Verrier named his hypothetical planet Vulcan, after the Roman god of fire. Soon, amateur astronomers around Europe, eager to be part of a scientific discovery, contacted Le Verrier and claimed to have witnessed the mysterious planet transiting across the face of the Sun. For years afterward, Vulcan sightings continued to pour in from around the globe, and when Le Verrier died in 1877, he was still regarded as having discovered a new planet in the solar system.
How it was Proven Wrong:
Without Le Verrier acting as a cheerleader for Vulcan’s existence, the planet quickly came to be doubted by many notable astronomers. The search was effectively abandoned in 1915, after Einstein’s theory of general relativity explained once and for all why Mercury orbited the Sun in such a strange fashion. But amateur stargazers continued the search, and as recently as 1970 observers were still claiming to see a strange object orbiting the Sun inside the orbit of Mercury. Amusingly, the would-be discovery’s greatest legacy today is that it inspired the name of the home planet of the character Spock from Star Trek.
9. Spontaneous Generation
Although it might seem a bit ludicrous today, for thousands of years it was believed that life regularly arose from the elements without first being formed through a seed, egg, or other traditional means of reproduction. The most influential proponent of the theory was Aristotle, who based his studies on the ideas of thinkers like Anaximander, Hippolytus, and Anaxagoras, all of whom stressed the ways in which life could spontaneously come into being from inanimate matter like slime, mud, and earth when exposed to sunlight. Aristotle drew his own ideas from observing the ways maggots would seemingly generate in dead animal carcasses, or barnacles would form on the hull of a boat. This theory that life could spring from nonliving matter managed to persist for hundreds of years after Aristotle, and was still being defended by some scientists as late as the 1700s.
How it was Proven Wrong:
It was only with the adoption of the scientific method that many of the classical theories like spontaneous generation began to be tested. Once they were, they quickly crumbled. In the 1600s, Francesco Redi showed that maggots would not appear on meat kept in a sealed container, demonstrating that they hatched from fly eggs rather than arising from the meat itself. Two centuries later, the microscope and Louis Pasteur’s famous flask experiments showed that the microbes which seemed to appear spontaneously in broth actually came from airborne microorganisms.
8. The Expanding Earth
Our modern understanding of the interior and behavior of the Earth is strongly based on plate tectonics and the concept of subduction. But before this idea was widely accepted in the late 20th century, a good number of scientists subscribed to the much more fantastical theory that the Earth was forever increasing in volume. The expanding Earth hypothesis stated that phenomena like underwater mountain ranges and continental drift could be explained by the fact that the planet was gradually growing larger. As the globe grew, proponents argued, the distances between continents would increase, and the stretching and buckling of the Earth’s crust would explain the creation of new mountains. The theory has a long and storied past, beginning with Darwin, who briefly tinkered with it before casting it aside, and Nikola Tesla, who compared the process to the expansion of a dying star.
How it was Proven Wrong:
The expanding Earth hypothesis has never been proven wrong exactly, but it has been largely supplanted by the much more sophisticated theory of plate tectonics. While the expanding Earth theory holds that all land masses were once connected, and that oceans and mountains were only created as a result of the planet’s growing volume, plate tectonics explains the same phenomena by way of rigid plates of the lithosphere that slowly drift, collide, and slide beneath one another.
7. Phlogiston Theory
First expressed by Johann Joachim Becher in 1667, phlogiston theory is the idea that all combustible objects (that is, anything that can catch fire) contain a special element called phlogiston that is released during burning, and which makes the whole process possible. In its traditional form, phlogiston was said to be without color, taste, or odor, and was only made visible when a flammable object, like a tree or a pile of leaves, caught fire. Once an object was burned and all its phlogiston released, it was said to once again exist in its true form, known as a “calx.” Beyond basic combustion, the theory also sought to explain chemical processes like the rusting of metals, and was even used as a means of understanding breathing, as pure oxygen was described as “dephlogisticated air.”
How it was Proven Wrong:
The more experiments that were performed using the phlogiston model, the more dubious it became as a theory. One of the most significant findings was that certain metals actually gained weight when burned instead of losing it, as they should have if phlogiston were being released. The idea eventually fell out of favor, and has since been replaced by the modern understanding of combustion as oxidation.
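As a rough illustration of why the weight gain was so damning (magnesium is used here purely as an example, since no specific metal is named above), the modern oxidation picture treats burning a metal as combining it with oxygen from the air:

2 Mg + O2 → 2 MgO

The resulting calx weighs more than the original metal precisely because oxygen from the air has been bound into it, whereas phlogiston theory predicted the residue should weigh less once its phlogiston had escaped.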
6. The Martian Canals
The Martian canals were a network of gullies and ravines that 19th-century scientists mistakenly believed to exist on the red planet. The canals were first “discovered” in 1877 by Italian astronomer Giovanni Schiaparelli. After other stargazers corroborated his claim, the canals became something of a phenomenon. Scientists drew detailed maps tracing their paths, and soon wild speculation began about their possible origins and use. Perhaps the most absurd theory came from Percival Lowell, a mathematician and astronomer who jumped to the bizarre conclusion that the canals were a sophisticated irrigation system developed by an unknown intelligent species. Lowell’s hypothesis was widely dismissed by other scientists, but it found a broad popular audience, and the idea managed to survive in some circles well into the 20th century.
How it was Proven Wrong:
Quite unspectacularly, the Martian canals were only proven to be a myth with the advent of better telescopes and imaging technology. It turned out that what looked like canals was in fact an optical illusion caused by streaks of dust blown across the Martian surface by heavy winds. Several scientists had proposed a similar explanation in the early 1900s, but it was only confirmed in the 1960s, when the first unmanned spacecraft made flybys of Mars and photographed its surface.
5. Luminiferous Aether
The aether, also known as the ether, was a mysterious substance that was long believed to be the medium that carried light through the universe. Philosophers as far back as the ancient Greeks had believed that light required a delivery system, a medium through which it could propagate and become visible, and this idea managed to persist all the way through to the nineteenth century. If correct, the theory would have redefined our entire understanding of physics. Most notably, if the aether were a physical substance filling all of space, even a vacuum, it would have provided a fixed backdrop against which motion through deep space could be measured and quantified. Experiments often contradicted the theory of the aether, but by the 1700s it had become so widespread that its existence was assumed to be a given. Even as the idea was being abandoned, physicist Albert Michelson still referred to luminiferous aether as “one of the grandest generalizations in modern science.”
How it was Proven Wrong:
In traditional scientific fashion, the notion of a luminiferous aether was only gradually phased out as more sophisticated theories came into play. Experiments in the diffraction and refraction of light had long rendered traditional models of the aether outdated, but it was only when Einstein’s special theory of relativity came along and completely reconfigured physics that the idea lost the last of its major adherents. The theory still exists in various forms, though, and many have argued that modern scientists simply use terms like “fields” and “fabric” in place of the more taboo term “aether.”
4. The Blank Slate Theory
One of the oldest and most controversial theories in psychology and philosophy is the theory of the blank slate, or tabula rasa, which argues that people are born with no built-in personality traits or proclivities. Proponents of the theory, which began with the work of Aristotle and was expressed by everyone from St. Thomas Aquinas to the empiricist philosopher John Locke, insisted that all mental content was the result of experience and education. For these thinkers, nothing was instinctive or inborn. The idea found its most famous expression in psychology in the ideas of Sigmund Freud, whose theories of the unconscious stressed that the elemental aspects of an individual’s personality were shaped by their earliest childhood experiences.
How it was Proven Wrong:
While there’s little doubt that a person’s experiences and learned behaviors have a huge impact on their disposition, it is also now widely accepted that genes and other traits inherited at birth, along with certain innate instincts, play a crucial role. This was established only after years of research showing that gestures like smiling and certain features of language appear throughout the world in radically different cultures. Meanwhile, studies of adopted children and of twins raised in separate families have come to similar conclusions about the ways certain traits are present from birth.
3. Phrenology
Although it is now regarded as nothing more than a pseudoscience, in its day phrenology was one of the most popular and widely studied sciences of the mind. In short, proponents of phrenology believed that individual character traits, such as intelligence, aggression, or an ear for music, could all be localized to very specific parts of the brain. According to phrenologists, the larger each one of these parts of a person’s brain was, the more likely they were to behave in a certain way. With this in mind, practitioners would often study the size and shape of subjects’ heads in order to determine what kind of personality they might have. Detailed maps of the supposed 27 different areas of the brain were created, and a person who had a particularly large bump on their skull in the area for, say, the sense of colors, would be assumed to have a proclivity for painting.
How it was Proven Wrong:
Even during the heyday of its popularity in the 1800s, phrenology was often derided by mainstream scientists as a form of quackery. But their protests were largely ignored until the 1900s, when modern scientific advances helped to show that personality traits could not be traced to specific portions of the brain, at least not in as precise a way as the proponents of phrenology often claimed. Phrenology still exists today as a fringe science, but its use in the 20th century became somewhat infamous: it was often employed as a tool to promote racism, most famously by the Nazis, as well as by Belgian colonialists in Rwanda.
2. Einstein’s Static Universe
Prior to scientists embracing the notion that the universe began with the Big Bang, it was commonly believed that the size of the universe was an unchanging constant: it had always been the size it was, and always would be. The idea held that the total volume of the universe was effectively fixed, and that the whole construct operated as a closed system. The theory found its biggest adherent in Albert Einstein (the static universe is often known as “Einstein’s Universe”), who argued in favor of it and even built it into his theory of general relativity.
How it was Proven Wrong:
The theory of a static universe was problematic from the start. First of all, a finite, unchanging universe should eventually collapse in on itself under its own gravity, a problem Einstein tried to compensate for by introducing a “cosmological constant” into his equations. The final nail in the coffin, though, was Edwin Hubble’s discovery of the relationship between red shift (the way the light of heavenly bodies shifts toward the red as they move away from us) and distance, which showed that the universe was indeed expanding. Einstein subsequently abandoned his model, and would later refer to it as the “biggest blunder” of his career. Still, like all cosmological ideas, the expanding universe is a theory, and a small group of scientists today still subscribe to the old static model.
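As a rough sketch of the relationship Hubble found (the figures below are modern illustrative values, not numbers from his original work), it is usually written as

v ≈ H0 × d

where v is the speed at which a galaxy appears to recede, d is its distance, and H0 is the Hubble constant. Taking H0 to be roughly 70 km/s per megaparsec, a galaxy 100 megaparsecs away recedes at about 7,000 km/s, and one twice as far recedes twice as fast, which is exactly the behavior an expanding universe predicts and a static one cannot accommodate.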
1. Fleischmann and Pons’s Cold Fusion
While the conditions required to create nuclear energy usually involve extreme temperatures (think of the processes that power the Sun), the theory of cold fusion states that such a reaction is possible at room temperature. It’s a deceptively simple concept, but the implications are spectacular: if a nuclear reaction could occur at room temperature, then an abundance of energy could be created without the dangerous waste that results from nuclear power plants. This groundbreaking theory briefly seemed to have become a reality in 1989, when the electrochemists Martin Fleischmann and Stanley Pons published experimental results suggesting that they had achieved cold fusion, along with the precious “excess energy” it was hoped to produce, in an experiment in which an electric current was run through heavy water using electrodes made of the metal palladium. The response to Pons and Fleischmann’s claims by the media and the scientific community was overwhelming. The experiments were hailed as a turning point in science, and it was briefly believed that cold fusion would make energy cheap, clean, and abundant.
How it was Proven Wrong:
The fervor over cold fusion died down as soon as other scientists tried to replicate the experiment. Most failed to get similar results, and after their paper was closely studied, Fleischmann and Pons were accused not only of sloppy and unethical science but also of overstating their results. For years afterward, the idea of cold fusion was synonymous with fringe science. Still, despite the stigma attached to it, many have argued that there was never anything necessarily wrong with cold fusion as a theory. In recent years, scientists have once again started to experiment with new ways of achieving a so-called “tabletop nuclear reaction,” with some even claiming to have achieved surprising success.