1745: The Leyden Jar

The invention of the Leyden jar marked a significant moment in the history of electrical engineering. The Leyden jar can be thought of as the first electrical capacitor – a device that stores and releases electrical energy.

The Invention of the Leyden Jar

During the 18th century the mysterious phenomenon of electricity was becoming a hot topic among learned men of science. Electricity could only be created and observed in the moment. One of the mysteries to be solved was whether electricity could be stored for later use, and if so, how. The invention that solved the mystery became known as the Leyden jar, named after the city of one of its early inventors.

Drawing of a Leyden Jar

The Leyden jar is typically credited to two individuals who independently came up with the same idea.  In Germany, Ewald Georg von Kleist was experimenting with electricity.  He was attempting to store electricity in a medicine bottle filled with water, with a nail inserted through a cork stopper.  He charged the jar by connecting the nail to an electrostatic generator, assuming that the glass jar would prevent the electricity from escaping.  While holding the glass jar in one hand, he accidentally touched the nail and received a significant shock, proving that electricity was indeed stored inside the jar.

Von Kleist's experiments were not well known, and around the same time another man experimenting with electricity, Pieter van Musschenbroek of Leyden, Netherlands, stumbled upon the same invention.  Musschenbroek's device was much like von Kleist's jar.  It consisted of a glass jar filled with water, with a metal rod passing through a cork that sealed the top of the jar.  When an electric charge was applied to the metal rod while the jar was held in the hand, it was found that electricity could be stored in the jar.  Unfortunately for the person holding the jar and touching the metal rod, a significant shock was received. As Musschenbroek recorded what happened when he first touched the rod:

Suddenly I received in my right hand a shock of such violence that my whole body was shaken as by a lightning stroke…. I believed that I was done for.

It didn't take long before Musschenbroek's Leyden jar was being used and improved by others.  In 1746, the following year, the English physician William Watson improved the jar's storage capacity by coating both the inside and outside with metal foil.  Also that same year, the French physicist Jean-Antoine Nollet discharged a Leyden jar in front of the French King Louis XV.  During the demonstration, Nollet arranged a circle of 180 Royal Guards, each holding hands, and passed the charge from the Leyden jar through the circle.  The shock was felt almost instantaneously by all members of the Royal Guard, to the delight of the King and his court.  Demonstrations such as this brought widespread attention to the exciting new field of electricity.

Impact and Legacy of the Leyden Jar

Prior to the invention of the Leyden jar, electricity could only be observed and experimented with at the moment it was created.  The Leyden jar changed this by allowing scientists to store electrical energy and use it when needed.  Researchers could now conduct various experiments related to electric discharge, conductivity, and other electrical phenomena.  As an added bonus, the jar was easily transported, especially compared to the electrostatic generators of the day.  The jars could also be linked together to provide additional storage capacity.  These abilities contributed to the growth of knowledge in the field of electrical science.
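In modern terms, linking jars side by side corresponds to connecting capacitors in parallel, where the individual capacitances simply add – a present-day restatement, not something the 18th-century experimenters would have expressed this way:

\[
C_{\text{total}} = C_1 + C_2 + \cdots + C_n
\]

A bank of identical jars therefore stores proportionally more charge at the same voltage, which is exactly why early experimenters ganged jars together into "batteries".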

For example, the American scientist and statesman Benjamin Franklin famously used the Leyden jar during his kite experiment in 1752.  In that experiment, Franklin flew a kite during a lightning storm in an attempt to prove that lightning was a form of electricity.  He attached a metal key to the kite string, and the key was then connected to a Leyden jar.  Contrary to some popular accounts of the experiment, lightning likely never struck the kite directly; if it had, Franklin would probably have been killed. However, he was able to observe that the Leyden jar was being charged, demonstrating the electrical nature of lightning.

The Leyden jar is also considered the first electrical capacitor, a fundamental component in modern electric circuits.  The invention of the Leyden jar laid the groundwork for the development of more sophisticated capacitors.  While Leyden jars briefly fell out of use after the invention of the battery, the basic idea behind the Leyden jar capacitor found renewed use at the end of the 19th century in modern electronic devices, albeit in a much smaller form.

Overall, the Leyden jar played a pivotal role in the early exploration and understanding of electricity.  Its impact can be seen in the subsequent development of electrical science and technology.

Continue reading more about the exciting history of science! 

1932: Discovery of the Neutron

The neutron was discovered by the British physicist Sir James Chadwick in 1932, marking a pivotal moment in the understanding of atomic structure. It was, in a way, the culmination of a series of scientific investigations of the subatomic particles of the atom that spanned several decades. The identification of the neutron provided answers to questions about the mass of the atom, ultimately leading to important developments in nuclear physics.

Developments that Led to the Discovery of the Neutron

Nuclear Structure of the Atom
Nuclear Structure of the Atom

In 1897, J. J. Thomson discovered the electron, a subatomic particle with a negative electrical charge. This discovery provided the evidence that atoms were composed of smaller particles. Two decades later, Ernest Rutherford discovered the proton, a subatomic particle with a positive electrical charge. Rutherford proposed a model of the atom with a dense, positively charged nucleus at its center, orbited by negatively charged electrons. However, this model presented a problem: the positively charged protons in the nucleus should all repel each other, causing the nucleus to burst apart. Yet this does not happen – the nucleus is obviously stable – and the reasons for this stability were not understood at the time. The existence of a neutral particle was postulated by Rutherford as early as 1920.

The Discovery of the Neutron

The discovery of the neutron was the culmination of a series of experiments carried out by several scientists in the late 1920s and early 1930s. In Germany, Walther Bothe found that beryllium exposed to alpha particles produced a new form of radiation. Bothe attempted to explain this radiation in terms of gamma rays, because it was not deflected by either electric or magnetic fields, and all known particles at the time (electrons and protons) carried a charge.

Taking this information a step further, the French husband-and-wife team of Irène Joliot-Curie (the daughter of Pierre and Marie Curie) and Frédéric Joliot reported results in January 1932 from an experiment that led to the neutron's discovery. In the same vein as Bothe, their experiment involved the bombardment of beryllium with alpha particles. They noticed that the radiation could eject protons from hydrogen-rich substances such as paraffin.  This was puzzling because gamma rays should not have enough energy to knock out protons in this manner. In other words, if this unknown radiation was indeed gamma rays, then the law of conservation of energy appeared to be violated.

Experiment by James Chadwick that led to the discovery of the neutron
(Credit: scienceready.com)

Back at the Cavendish Laboratory in Cambridge, James Chadwick, a colleague of Rutherford, quickly became interested in these results.  Chadwick and Rutherford had been working on and off for the past decade to identify the missing neutral particle suspected to reside in the atomic nucleus.  This background allowed Chadwick to move quickly.  He conducted his own series of experiments in which he bombarded light elements, such as beryllium, with alpha particles.  He observed the same radiation being emitted, undeflected by electric or magnetic fields.  However, he interpreted the results differently from the others who had conducted similar experiments. His observations led him to correctly conclude that the radiation was composed of uncharged particles. He used the laws of conservation of momentum and conservation of energy to calculate that the new particle has a mass similar to that of a proton.  He presented this as evidence of a new subatomic particle, which he named the neutron, in a paper published in 1932.  For his work he was awarded the Nobel Prize in Physics in 1935.
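A simplified sketch of the kind of kinematic argument involved, in modern notation, may help. For a head-on elastic collision between a neutral particle of mass m moving at speed v and a stationary nucleus of mass M, conservation of momentum and energy give a maximum recoil speed of

\[
u_{\max} = \frac{2mv}{m + M}.
\]

Comparing the maximum recoil speeds measured for hydrogen nuclei (M ≈ 1 atomic mass unit) and nitrogen nuclei (M ≈ 14) then gives

\[
\frac{u_{\mathrm{H}}}{u_{\mathrm{N}}} = \frac{m + 14}{m + 1},
\]

which yields m ≈ 1 – a neutral particle with roughly the mass of a proton.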

Impact of the Neutron’s Discovery

The discovery of the neutron was a key piece of the puzzle that allowed scientists to understand the binding energy of the atomic nucleus.  It had enormous impacts in both applied and theoretical physics.  

The discovery of the neutron explained the missing mass in atomic nuclei.  This in turn helped to explain the existence of isotopes – variants of an element with the same number of protons but different atomic masses, owing to different numbers of neutrons in their nuclei.  It also advanced the understanding of radioactive decay processes.

The most important impact of the discovery of the neutron was in nuclear physics.  The neutron – a particle without an electric charge – was the crucial component in the discovery and study of nuclear fission, achieved in 1938 by Otto Hahn and Lise Meitner.  Nuclear fission was quickly applied to the development of nuclear energy and weapons.  The neutron plays the key role in the chain reactions that occur in both nuclear reactors and atomic bombs.  It is probably fitting, then, that James Chadwick was placed at the head of the British team that worked on the Manhattan Project, which produced the world's first atomic bomb.

Beyond nuclear fission, the discovery of the neutron aided our understanding of the nuclear processes that power the stars through nuclear fusion.  The study of these processes has advanced our understanding of the origins and evolution of the elements.

Continue reading more about the exciting history of science!

1915: General Relativity

It has been famously stated that if Albert Einstein hadn’t published his theory of special relativity, someone else likely would have within the decade. But if Einstein hadn’t published his theory of general relativity we would still be waiting on that discovery to this day. The theory of general relativity, proposed by Einstein in 1915, is one of the most profound and revolutionary achievements in the history of science and transformed our understanding of space, time, and gravity.

Background on the Theory of General Relativity

Curvature of Spacetime, as described by General Relativity
(Credit: blackholecam.org)

For nearly two centuries, Isaac Newton's law of universal gravitation stood unquestioned and unchallenged in the field of physics.  However, around the end of the 19th century some small but troubling cracks in its armor were beginning to emerge.  Physicists noticed it was unable to explain certain phenomena, such as the orbit of Mercury.  They were also struggling to reconcile classical mechanics with new observations of the nature of light and electromagnetism.  A then relatively unknown physicist named Albert Einstein provided the solutions to these vexing problems.

Einstein's journey towards his theory of general relativity began in 1905 with the publication of his theory of special relativity.  Special relativity deals with objects moving at constant velocity; it does not address acceleration or gravity, two things Einstein would later show to be essentially equivalent once he formulated general relativity.

Between 1907 and 1915 Einstein worked on extending his theory of special relativity to include gravity.  The mathematical complexity of this task proved enormous, and Einstein had to learn various advanced mathematical techniques to complete his ideas.  He worked with mathematicians to help him master the underlying mathematics needed to formulate his theory.

Space, Time, and a Sliver of Curvature

In November 1915, Einstein presented his revolutionary ideas on general relativity to the Prussian Academy of Sciences. His theory describes how mass and energy curve the fabric of spacetime, effectively producing the force of gravity.  A major implication of general relativity is that it redefined gravity: no longer a force acting at a distance, but a geometric property of spacetime itself. The core of general relativity is a set of field equations that relate the geometry of four-dimensional spacetime to the distribution of mass and energy within it. These equations are known as the Einstein field equations.
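For reference, the field equations are usually written today in the compact form below (the cosmological constant term that Einstein added a couple of years later is omitted):

\[
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\]

The left-hand side encodes the curvature of spacetime, while the stress-energy tensor \(T_{\mu\nu}\) on the right describes the distribution of mass and energy producing that curvature.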

General relativity also incorporates the equivalence principle.  This states that the effects of gravity are locally indistinguishable from the effects of acceleration. In other words, someone in a closed space cannot determine whether they are feeling the effects of gravity or of acceleration.  A classic example is the elevator thought experiment.  Imagine you are in an elevator with no windows.  If you drop an object, it falls to the floor at an acceleration of 9.81 m/s^2 due to Earth's gravity.  However, you could also be in distant space, far from any gravitational field, accelerating at a constant rate of 9.81 m/s^2.  In the second scenario, when you drop an object it will also fall to the floor at an acceleration of 9.81 m/s^2 – the two scenarios are indistinguishable.

Testing and Acceptance

General relativity has been repeatedly verified by observation and experiment.  One of the first key confirmations came in 1919, during a solar eclipse expedition led by Sir Arthur Eddington.  The goal of the expedition was to measure the apparent shift of stars near the Sun during the eclipse.  Indeed, the observed shift matched the prediction made by general relativity.

Photograph of the solar eclipse taken on Arthur Eddington's expedition in 1919

Another significant test of general relativity was the perihelion precession of Mercury's orbit.  Mercury's orbit exhibits a small deviation in the orientation of its ellipse over time, a phenomenon known as perihelion precession.  Prior to general relativity, astronomers struggled to fully account for this advance; Newtonian mechanics could not explain the discrepancy.  However, when Einstein's equations of general relativity were applied to the problem, they yielded an additional advance of about 43 arcseconds per century, matching the observed rate of precession.  This successful prediction, along with the results of Eddington's expedition, provided crucial empirical evidence for the validity of general relativity.  General relativity also predicts other phenomena, such as gravitational redshift and time dilation, both of which have been confirmed through experiments as well.

As empirical validation rolled in, general relativity gradually gained acceptance within the scientific community.  By the latter half of the 20th century it had become fully accepted, standing alongside quantum mechanics as one of the twin pillars of modern physics.  The acceptance of general relativity can be viewed as a watershed moment in the history of science, marking the transition from a Newtonian worldview to a relativistic one.

Continue reading more about the exciting history of science!

1860s: The Electromagnetic Spectrum

The existence of a form of light other than visible light was inconceivable prior to the 19th century. However, in nature there exists a continuum of radiation, all traveling at the same speed of light, with a nearly limitless range of wavelengths and frequencies. This is the electromagnetic spectrum, and the first new form of light was unexpectedly discovered in 1800 by the British scientist William Herschel.

The Discovery of Infrared Rays

William Herschel's experiment leading to the discovery of infrared rays
(Credit: NIRS Research)

William Herschel is best known for his discovery of the planet Uranus, but his interests also extended beyond the visible universe. Herschel was fascinated by the nature of heat and light, and his work in this area led to the discovery of a completely new and unexpected form of light.

In 1800 Herschel conducted an experiment in which he directed sunlight through a glass prism to create a spectrum of visible light. He then measured the temperature of the different colors of light and noticed that the temperature increased as he moved from the violet to the red end of the spectrum. Then, in a moment of true scientific curiosity, he took his experiment one step further. He decided to measure the temperature just beyond the red light, where no sunlight was visible. This region showed the highest temperature of all, which led Herschel to conclude that some form of invisible light must be present. He named this invisible radiation infrared, from the Latin 'infra' meaning "below" – in this case, below the red in the spectrum.

The importance of this discovery can hardly be overstated. It added an entirely new dimension to our perception of the universe and to this day has had significant practical applications in a variety of fields such as astronomy, telecommunications, healthcare, and environmental science. It also suggested that there may be other forms of light yet to be discovered.

The Electromagnetic Spectrum
(Credit: www.miniphysics.com)

A Continuous Spectrum of Electric and Magnetic Fields Oscillating Together

Shortly after Herschel's discovery of infrared rays, the German chemist Johann Wilhelm Ritter discovered another form of invisible radiation. Ritter was inspired by Herschel's discovery and decided to experiment at the opposite end of the spectrum, beyond violet. Like Herschel, Ritter focused sunlight through a glass prism to create a spectrum of colors. He noticed that silver chloride, a chemical known to darken when exposed to sunlight, darkened faster when exposed to the region just beyond violet light. He had discovered what would later become known as ultraviolet rays, from the Latin 'ultra' meaning "beyond" – in this case, beyond violet in the spectrum. The discoveries of infrared and ultraviolet rays spurred the search for other types of electromagnetic radiation.

The first X-ray photograph of Anna Bertha Ludwig's hand, the wife of Wilhelm Roentgen. The bones and wedding ring are visible.

In the 1860s the British physicist James Clerk Maxwell revolutionized the world of physics with his electromagnetic theory. He developed four mathematical equations, known today as Maxwell's equations, that describe how electric and magnetic fields interact. These equations unified the previously separate fields of electricity and magnetism. His new theory suggested that light was a form of electromagnetic wave, and that visible light was just a small part of a much broader spectrum, now known as the electromagnetic spectrum. In 1867, Maxwell predicted that there should be wavelengths of light longer than the infrared rays discovered by Herschel.
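For reference, here are the four equations in their modern differential form – this compact vector notation is largely due to Oliver Heaviside, writing after Maxwell, rather than Maxwell's own original formulation:

\[
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
\]

Combining them in empty space yields wave solutions that travel at \(c = 1/\sqrt{\mu_0 \varepsilon_0}\), which matches the measured speed of light – the clue that light itself is an electromagnetic wave.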

It took another two decades, but in 1887 the German physicist Heinrich Hertz confirmed Maxwell's predictions when he generated and detected radio waves in his laboratory. Light with shorter wavelengths was discovered soon after. In 1895 Wilhelm Roentgen accidentally discovered what became known as X-rays while experimenting with cathode ray tubes. It took nearly another two decades before scientists were able to determine that these X-rays were indeed another form of light. The last part of the spectrum to be discovered was gamma rays, which the French physicist Paul Villard discovered in 1900 while studying the radiation emitted by radium.

Far Reaching Technological Impacts

The discovery of the electromagnetic spectrum has had far-reaching impacts on 20th-century technology and beyond. The development of radio and television technology was predicated on the understanding of radio waves. In similar fashion, the invention of radar during World War II was possible due to the understanding of microwaves.

From science and technology to medicine and daily life, the impacts of the electromagnetic spectrum are almost too vast and numerous to mention. But to name a few – radio and television broadcasting, mobile phones and the internet, X-rays used in healthcare, space telescopes, and satellite-based sensors are just some of the many areas in which electromagnetic radiation has had an impact on civilization.

Continue reading more about the exciting history of science!

Ernest Rutherford

Ernest Rutherford

Ernest Rutherford (1871 – 1937) was a prominent physicist from New Zealand who made significant contributions to the field of nuclear physics during the early 20th century. His greatest contribution to physics was his discovery of the atomic nucleus in 1911.

Rutherford came from a humble background. He began his education at a local school, where he excelled in mathematics and science. Following his undergraduate studies he was awarded a scholarship to further his education at the prestigious Cavendish Laboratory at the University of Cambridge in England. Under the guidance of J.J. Thomson, he conducted research on the electrical conductivity of gases. In 1898 he joined the faculty of McGill University in Canada, where he conducted groundbreaking research on radioactivity, which laid the groundwork for his future discoveries.

Rutherford’s Gold Foil Experiment

Rutherford's most famous achievement was the discovery of the atomic nucleus. In 1909 he and his team conducted the famous Gold Foil Experiment, in which they bombarded a thin sheet of gold foil with alpha particles. To the surprise of Rutherford and his team, a small number of the particles were deflected at large angles while most passed straight through. His interpretation of these results led to his proposal of a new atomic model, which challenged the "plum pudding" model proposed by his mentor J.J. Thomson.

Based on the Gold Foil Experiment, Rutherford hypothesized that atoms have a small, dense, positively charged nucleus at their center, surrounded by mostly empty space in which negatively charged electrons orbit. This model eventually gained acceptance and revolutionized the field of nuclear physics. The discovery paved the way for future advancements, such as the identification of isotopes and a deeper understanding of radioactive decay, both of which involve the atomic nucleus. His work remains a cornerstone of nuclear physics to this day.

1927: The Big Bang

What is the origin of the universe? This is the ultimate origin question, and the Big Bang is the scientific theory that proposes to answer it. According to the Big Bang theory, the observable universe began in a singularity of infinite density and temperature at a specific moment in time some 13.8 billion years ago. It then rapidly expanded into the vast and complex structure that we see today. The Big Bang theory has revolutionized our understanding of the universe by providing a framework to explain the origin and evolution of everything we see around us.

The Big Bang Expansion
(Credit: Wikimedia Commons)

Deconstructing the Problem of the Origin of the Universe

The origin of the universe has been a topic of human interest since the beginning of recorded history, and was likely discussed and debated from the time that language evolved. Only recently have we been able to make significant progress in outlining a theory of the origin of the universe grounded in observable evidence. This is mainly because, up until the last century, scientists had little observational evidence with which to account for the universe's beginnings. Powerful telescopes and other advanced instruments have since allowed scientists to learn far more about the universe's structure, composition, and history.

In the early 20th century there were two main competing theories of the origin of the universe. The first and most popular was the steady-state theory, which postulated that the universe had no beginning or end and existed in a state of perpetual equilibrium. The second was the expanding universe theory, which eventually came to be known as the Big Bang; it postulated that the universe began as an incredibly hot and dense point, a singularity, and has been expanding and cooling ever since. As you can see, these are two very different views of the universe's evolution. Eventually, key lines of evidence began to emerge that settled the debate, leaving only the Big Bang theory standing.

Key Lines of Evidence for the Big Bang Theory

The Evolution and Structure of the Universe
(Credit: European Space Agency)

There are three convincing lines of evidence typically put forward in support of the Big Bang theory.  When all three of these are taken together, they provide a compelling validation of the theory.

  1. Universal expansion of galaxies – this refers to the observation that the galaxies are all moving away from each other, with more distant galaxies receding faster (a relation sketched just after this list).  This phenomenon was first discovered by the astronomer Edwin Hubble in the 1920s, when he observed that light from distant galaxies was always shifted towards the red end of the spectrum. This line of evidence taken by itself does not prove the Big Bang; there could, for instance, be a distant center where new matter is created and ejected, pushing everything away.
  2. Cosmic microwave background radiation – Arno Penzias and Robert Wilson, while working with a radio antenna built for satellite communications at Bell Labs, accidentally discovered the cosmic microwave background (CMB).  The CMB is a form of electromagnetic radiation that fills the universe and is detectable in all directions, and it is believed to be the residual heat left over from the Big Bang. It has a mostly uniform temperature of approximately 2.7 kelvin; however, small variations in that temperature have been studied to provide important insights into the early universe, such as the distribution of matter and the formation of large-scale structure.
  3. Relative amounts of light elements in the universe – the first few minutes of the early universe provided extremely hot and dense conditions, with temperatures of billions of degrees and densities of billions of particles per cubic centimeter. The universe quickly expanded and cooled, allowing nuclear fusion to occur, and the protons and neutrons that made up the early universe began to combine to form the lightest elements in a process called Big Bang nucleosynthesis. The observed ratios of light elements such as hydrogen, helium, and lithium closely match the predictions of the Big Bang.
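The expansion in point 1 is usually summarized by Hubble's law, which relates a galaxy's recession velocity to its distance (the value of the Hubble constant given here is approximate; current measurements cluster roughly between 67 and 74 km/s per megaparsec):

\[
v = H_0\, d, \qquad H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}}
\]

The farther away a galaxy is, the faster it recedes – exactly what Hubble's redshift measurements showed.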

Along with these three key lines of evidence there are other pieces of evidence that strongly support the Big Bang theory. The large scale structure of the universe and the age of the universe are both consistent with the predictions of the theory. Taken together, these lines of evidence provide strong support for the Big Bang theory, making it the most plausible explanation for the origin and evolution of the universe.

Continue reading more about the exciting history of science!

The Baloney Detection Kit for US Political Discourse

In the era of Fake News, the Big Lie, and innumerable sources of misinformation, disinformation, and propaganda, it has become necessary for each of us to channel our inner Carl Sagan and equip ourselves with a baloney detection kit in order to separate political truths from falsehoods. We need to think about politics scientifically. Indeed, a healthy dose of skepticism is required to navigate the US political landscape, and those with a leaning toward the scientific worldview will naturally be better equipped than those with a leaning toward the religious worldview.

Donald Trump: The Big Liar

Those with the religious worldview have been conditioned from childhood to believe in "faith", resulting in the unquestioning acceptance of biological absurdities such as a virgin birth and a death and resurrection. Neither of those events ever happened, clearly. And just because a few fanatics living 2000 years ago claimed that they happened is not very good evidence. Consider this: eyewitness testimony in our modern judicial system is considered flimsy evidence at best, because we know how unreliable human memory is and how common human fallibility is. So one would think that secondhand accounts of events recorded decades after the fact would be considered suspect at best – but to the religious folks, not so much…

In any case, it's no wonder that these folks are the leading proponents of some of the most ridiculous conspiracy theories in current US political discourse. If you find a conspiracy nut, there's a good chance they are a Jesus lover. For those of us who have *not* gone too far down the fundamentalist religious rabbit hole, I've put together a toolkit of helpful tactics to help sort out truth from fiction and outright lying. It is imperative that we apply these to the US political landscape, because democracy is currently under attack.

The Baloney Detection Kit

Everyone needs a baloney detection kit. Many of these tactics have personally saved me from believing things that felt right to me, or that I wanted to believe were true, but that ultimately turned out to be false. To keep things simple, I have four rules I always follow when trying to separate fact from fiction coming from US politicians.

  1. Bayes' Theorem – The Bayesian system of thinking is: initial belief + new evidence = new and improved belief. It is a form of probabilistic thinking used under conditions of uncertainty (which basically applies to every situation you will encounter). The mathematics behind Bayes' Theorem (written out after this list) basically says that if you have an extraordinary hypothesis, it should require extraordinary evidence to convince you that it's true.
  2. Occam’s Razor – This is a rule of thumb that when trying to figure out what happened in a situation, the simplest explanation is usually right. Another way of thinking about this is the explanation that requires the *least* amount of coincidences – or special explanations – is usually the right one.
  3. Debate – Debate is a good thing. Anyone trying to discourage you from debating a topic is probably lying. Science fears no questions. The truth fears no questions. Those who fear questions also fear science and the truth.
  4. Do not become emotionally attached to an idea or ideology – Be open to the possibility that you might be wrong. Evidence should change your opinion (see Rule #1). If you decide ahead of time that what you believe must be right, then you aren't thinking scientifically – you're just a religious fundamentalist.
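Rule #1 can be written out explicitly. Bayes' theorem gives the updated probability of a hypothesis H after seeing evidence E:

\[
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
\]

An extraordinary hypothesis starts with a very small prior P(H), so only evidence that is far more likely if H is true than if it is false can drive the posterior P(H | E) anywhere near certainty – the formal version of "extraordinary claims require extraordinary evidence".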

The Improved Baloney Detection Kit

Carl Sagan, author of The Demon-Haunted World: Science as a Candle in the Dark
(Credit: National Geographic)

These four rules will serve anyone pretty well most of the time, but of course not 100% of the time. Nothing can reduce uncertainty to zero, but we can always get closer and closer to the truth. To take things to the next level, I recommend reading Carl Sagan's classic book The Demon-Haunted World: Science as a Candle in the Dark. In particular, pay close attention to Chapter 12: The Fine Art of Baloney Detection.

Carl Sagan was a remarkable person who wanted to use science to improve the human condition. May his spirit of skepticism live on.

The kit is brought out as a matter of course whenever new ideas are offered for consideration. If the new idea survives examination by the tools in our kit, we grant it warm, although tentative, acceptance. If you’re so inclined, if you don’t want to buy baloney even when it’s reassuring to do so, there are precautions that can be taken; there’s a tried-and-true, consumer-tested method.

Carl Sagan

Further reading: The Demon-Haunted World: Science as a Candle in the Dark by Carl Sagan

1960s: Plate Tectonics

Plate tectonics is a scientific theory that explains the movement of the Earth's surface and many of its most prominent geological features – everything from the deepest ocean trench to the tallest mountains on land and, as we'll soon see, in the ocean.  The uppermost layer of the Earth is called the lithosphere and is composed of large rocky plates that ride on top of a hotter, slowly flowing layer of rock called the asthenosphere.  Convection in the mantle causes the plates to glide at a rate of a few centimeters per year.  While that may not seem like much, given enough time this process has formed many of the geologic features of our planet, including the Himalayas and the separation of the South American and African continents.
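A quick back-of-the-envelope calculation, using an illustrative rate of 5 cm per year, shows how such slow motion adds up over geological time:

\[
5\ \mathrm{cm/yr} \times 10^{8}\ \mathrm{yr} = 5\times10^{8}\ \mathrm{cm} = 5{,}000\ \mathrm{km}
\]

In 100 million years – a modest interval by geological standards – a plate moving at that rate covers a distance comparable to the width of an ocean basin.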

History of Plate Tectonics

The theory of plate tectonics is a relatively new idea; however, fragments of it can be found in earlier times. In 1596 the cartographer Abraham Ortelius observed that the coastlines of Africa and South America could be fitted together like pieces of a jigsaw puzzle.  He speculated that the continents may once have been joined, but had since been ripped apart by earthquakes or a flood.  Creationists have seized on this idea to "prove" the existence of Noah's Flood.  It does provide an arresting image, as Richard Dawkins points out, of South America and Africa racing away from each other at the speed of a tidal wave. In the 1850s a noticeable correlation was found between the rock types and fossils discovered in coal deposits on the two continents, and 1872 saw the first mapping of the Mid-Atlantic Ridge.

In the early 20th century a few versions of continental drift began making an appearance, but one stood out for its influence on the development of the Earth sciences. In 1912 Alfred Wegener published two papers in which he proposed his controversial theory of continental drift.  He suggested that in the past all landmasses had been joined together in a supercontinent that he called Pangaea (meaning "all lands").  However, he did not have a geological mechanism for how the landmasses drifted apart, and the idea was met with much skepticism at first. Over a period of decades Wegener continued to amass evidence for his theory and to promote his model of continental drift. He gathered evidence from paleoclimatology to provide a further boost to the idea, noting glaciations in the distant past occurring simultaneously on continents that were not connected to each other and lay outside the polar regions. Unfortunately, Wegener died in 1930 while on an expedition in Greenland gathering more of this data. Many of his specific details have turned out to be incorrect, but his overall concept that the continents are not static and do move over time has been proven correct.

During the middle of the 20th century more evidence began to support the idea that the continents do move. Huge mountain ranges were being mapped on the ocean floor. It had been assumed by both supporters and opponents of continental drift that the sea floor was ancient and would be covered with thick layers of sediment washed off the continents. When samples were finally obtained, they showed that all of this was wrong: there was hardly any sediment, and the rocks were young, with the youngest rocks found near the ocean ridges. In 1960 the American geologist Harry Hess pieced this evidence together in his theory of sea-floor spreading. The ocean ridges were produced by molten rock rising from below. As it reached the surface the magma cooled, forming the young rocks and spreading the ocean floor in a conveyor-belt-like motion through the slow process of convection. While some parts of the Earth were creating new oceanic crust and spreading the continents apart, other parts were doing just the opposite. Along the western edge of the Pacific, for example, the thin oceanic crust was being forced beneath thicker crust and driven down into the mantle below. This explains the presence and high frequency of earthquakes and volcanoes in places like Japan.

By the beginning of 1967 the evidence for continental drift and sea-floor spreading was quite strong and ready to be assembled into a complete package. That year Dan McKenzie and his colleague Robert Parker published a paper in Nature introducing the term plate tectonics for the overall set of ideas that describes how the Earth's plates move and the geological features their motion creates. To this day many details are still being fleshed out, but this moment marked a revolution in the Earth sciences. It is relevant not only in geology but also in its synthesis with the evolution of life on Earth.

Continue reading more about the exciting history of science!

1876: The Telephone

The story of the invention of the telephone is a complex one, rife with rival claims over the invention itself, and the product of numerous discoveries and individual contributions. The Italian inventor Antonio Meucci appears to have devised an apparatus he called the teletrofono as early as the late 1840s. However, credit for the invention of the telephone is given to the Scottish-born inventor Alexander Graham Bell, who was awarded the first telephone patent in 1876.

From Telegraph to the Invention of the Telephone

The invention of the electric telegraph in the 1830s marked the beginning of telecommunications. For the first time in history, messages could be sent over enormous distances in an instant. The telegraph quickly gained popularity, and attempts to modify and improve the device were fervent. Successive attempts at improving the telegraph ultimately led to the invention of the telephone.

Alexander Graham Bell's Box Telephone, 1877-78.
(Credit: National Museums Scotland)

Both the telegraph and the telephone are wire-based electrical systems and thus similar in concept. Alexander Graham Bell was experimenting with how to send multiple messages along the same wire at one time, a technique known as multiplexing. This was the goal of several inventors at the time, including the American inventor Elisha Gray. In the early 1870s Gray was using a harmonic telegraph – a transmitter and a receiver connected to sets of metallic reeds – to investigate speech transmission. A harmonic telegraph was known to be capable of sending multiple messages at the same time over a single wire by using varying frequencies. Bell began his own experiments in 1873, also using a harmonic telegraph in his search for a multiplexing device, but he quickly hit on the idea of using it for speech transmission as well.

By 1875 Bell had shown that different frequencies could vary the strength of the electrical signal in a wire. Now all he needed was a working transmitter that could send varying electric currents and a working receiver that could reproduce those frequencies in audible form. On March 7th, 1876 Bell was awarded a patent for his telephone as an "apparatus for transmitting vocal or other sounds telegraphically". It is considered one of the most valuable patents ever awarded by the U.S. Patent Office.

Bell's telephone was composed of various elements that remained largely unchanged for many years. First, a power source such as a battery was needed to supply the electric current used by the transmitter. The transmitter converted the speaker's voice into a varying electric current, and on the other end a receiver converted the current back into an audible voice. When not in use, the receiver hung on a hook with a switch on it, known as a switch hook. The switch hook connects the telephone to the direct-current loop: when the telephone is on the hook, contact with the loop is broken; pick up the telephone and contact is restored, with current now flowing through the loop. A dialer and a ringer were also critical components. Lastly, an anti-sidetone circuit – an assemblage of transformers, resistors, and capacitors – was used to reduce various forms of noise and electrical feedback.

The Telephone: A Revolution in Telecommunications

The telephone continued to evolve and improve over time. Early telephones had to be connected directly to each other, and Bell quickly realized that every telephone needed to be connectable to every other telephone in service for the device to be practical.  By the late 1870s the first switchboard had been invented, solving the problem.  By the 1880s telephones were being assigned numbers to make operations easier, introducing the first telephone numbers.  From that point on, the innovations haven't stopped.

The 20th century saw enormous leaps in telephone technology. The dial tone was introduced in Germany in 1908, but it took until the 1920s for it to be adopted in the United States.  That same decade witnessed the first transatlantic call, in 1927, from the United States to the United Kingdom, transmitted by radio waves. Also by the 1920s, it is estimated that over ten million telephones were in service in the United States. The 1960s saw the appearance of the first Touch-Tone telephones and the launch of the world's first international communications satellite, Telstar.  Not long after, the first cellular phones began to appear. Digital telephone technology, combined with cellular phones, drastically changed the telephone in the latter part of the 20th century. Today most people have converted from using their landlines to carrying around their versatile smartphones.

Continue reading more about the exciting history of science!

1830s: The Electric Telegraph

For most of human history, information could only travel as fast as a person could run, a horse could gallop, or a ship could sail.  This persisted until only very recently.  Only around 250 years ago, during the American Revolution, it would take months to communicate from the colonies to Britain across the Atlantic.  Today that same communication can happen in an instant.  The electric telegraph was the first major breakthrough of modern telecommunications.

Early Forms of Telegraphy

A Tower with a Semaphore Signal
(Credit: Wikimedia Commons)

A telegraph is a form of long-distance communication, as spelled out in the roots of its name: tele – at a distance, and graph – to write.  Some of the earliest forms of long-distance communication include light signals, drum beats, and smoke signals.  These measures were employed with varying but mostly minimal success. In ancient China, soldiers positioned in towers along the Great Wall used smoke signals to warn of impending attackers.  Native Americans were particularly well known for using smoke signals to communicate over long distances: tribes would first agree on a communication system – such as one puff for a greeting, two puffs for danger, and so on.  However, despite some effectiveness, these communication systems were limited in their ability to convey complex messages and required favorable weather conditions.

When people think of a telegraph today, most are probably thinking of the electric telegraph.  However, the first true telegraph to be put into widespread use was the optical telegraph, invented by Claude Chappe in the late 1700s during the tumultuous time of the French Revolution.  With events happening at a frantic pace, the French government was in need of fast and reliable communication during the revolution. Claude Chappe, along with his brother Ignace, began working on the problem in the summer of 1790.  The system they developed used movable semaphore arms mounted on top of towers to display signals that corresponded to letters.  Relay towers could be positioned approximately every 20 miles, and messages could be transmitted distances of over a hundred miles at a rate of approximately two words per minute.

The Electric Telegraph Transforms Communications

The electric telegraph harnessed the new science of electricity to revolutionize long-distance communication.  Early experiments in the 18th century demonstrated the effects of electricity and showed that it could be transmitted along a wire almost instantly. Alessandro Volta then developed the battery, providing a steady source of electric current. The electric telegraph combined these new discoveries, making it the first valuable application of electricity, arriving well before the 1860s, the decade in which James Clerk Maxwell brought together all the laws of electromagnetism in his mathematical formulations.

To transmit a telegraph message electrically, there need to be two or more stations connected by a wire. The first working electric telegraph was made by the British inventor Francis Ronalds in 1816, though his invention was met with little fanfare by the British government. Over the next few decades several other inventors built improved versions of the electric telegraph with limited success. It was Samuel Morse in the 1830s and 1840s who finally brought the invention into the mainstream.  The telegraphic system Morse developed is still used sparingly today and is known to most as Morse code. Morse code encodes the 26 letters and 10 numerals as sequences of dots and dashes that can be used to communicate a written message electrically across a wire.
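As a small illustration of the encoding idea, here is a hypothetical sketch in Python. It uses only a handful of letters rather than the full Morse table, and it models only the encoding step, not Morse's actual hardware:

```python
# Minimal sketch of Morse-style encoding: map characters to dot/dash strings,
# join letters with spaces and words with " / ".
MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-",
    "A": ".-", "N": "-.",   # partial table, for illustration only
}

def encode(message: str) -> str:
    """Encode a message using the partial table above; unknown characters are skipped."""
    words = message.upper().split()
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE)
        for word in words
    )

print(encode("SOS"))      # ... --- ...
print(encode("NO SEAT"))  # -. --- / ... . .- -
```

On a real telegraph line the dots and dashes became short and long closures of the circuit, with the pauses between letters and words carried by the gaps.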

Morse Telegraph
Morse Telegraph
(Credit: Wikimedia Commons)

Morse used his new system to send his first message across a two-mile wire in Morristown, New Jersey in 1838.  The United States Congress quickly realized the benefits of Morse's system and provided the funds to set up a line from Baltimore to Washington D.C. in 1843.  On May 24, 1844 the famous first message "What hath God wrought" was transmitted across this roughly 40-mile line. Within two years the telegraph had become such an integral part of communication in the United States that the Associated Press was formed.  The telegraph grew so rapidly that by 1875 there were over 250,000 miles of telegraph wires in the United States alone and over 100,000 miles of wire undersea, linking the world together as never before.

A Short Lived Reign

The electric telegraph took only a few generations to complete its full product life cycle – an ominous sign of things to come for the telecommunications industry. The telephone, invented some 40 years after the telegraph, can be said to mark the beginning of the end for the telegraph, and radio communications only accelerated its decline. Today telecommunication technologies can rise and fall in a single generation, yet many of them owe their scientific roots to the invention of the electric telegraph.

Continue reading more about the exciting history of science!