
Jim Al-Khalili: Quantum, A Guide for the Perplexed

Nature’s Conjuring Trick

Light behaves as a wave with different wavelengths. It has all the properties of a wave: interference, diffraction and refraction.

Despite an atom being a tiny localized particle, a stream of atoms seems somehow to have conspired to behave in a way similar to a wave.

Each atom fired from the gun leaves it as a tiny “localized” particle and arrives at the second screen also as a particle, as is evident from the tiny flash of light when it arrives. But in between, as it encounters the two slits, there is something mysterious going on, akin to the behavior of a spread-out wave that gets split into two components, each emerging from a slit and interfering with the other on the other side.

Quantum mechanics does provide us with a perfectly logical account of the two-slit trick. We can explain what we see, but not why.

We usually think of a physical body as a localized object, while the notion of a wave is intimately linked to something extended and delocalized. Contrary to this common belief, quantum physics claims that both of these seemingly contradictory notions can apply to one and the same object in one and the same experiment.

Origins

Quantum theory and quantum mechanics. The former is used to refer to the state of affairs during the period 1900-1920. It wasn’t until the 1920s that the real revolution took place, and a completely new worldview (quantum mechanics) replaced Newton’s mechanics when it came to describing the underlying structure of the subatomic world.

According to Planck’s formula, the energy of the smallest bundle of light of a given frequency (a single quantum) is equal to the frequency multiplied by a certain constant. Planck’s constant of action has a symbol h.

The relation between energy and frequency is very simple. The frequency of violet light is twice that of red light. A quantum of violet light has twice as much energy as a quantum of red light.
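The relation described above can be sketched numerically; this is a minimal illustration of Planck's formula E = hf, where the red-light frequency used is only an approximate, assumed value:

```python
# Planck's relation: E = h * f. A quantum's energy is proportional to
# its frequency; the constant of proportionality is Planck's constant h.
h = 6.626e-34  # Planck's constant, in joule-seconds

def quantum_energy(frequency_hz):
    """Energy of a single quantum (photon) at the given frequency."""
    return h * frequency_hz

f_red = 4.3e14        # approximate frequency of red light, Hz (assumed value)
f_violet = 2 * f_red  # violet light has roughly twice the frequency of red

# Doubling the frequency doubles the energy of each quantum.
ratio = quantum_energy(f_violet) / quantum_energy(f_red)  # ~ 2
```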

The distribution over frequency depends on the body’s temperature. A black body must somehow dispose of all the energy it absorbs – otherwise its temperature would become infinite!

Max Planck was interested in blackbody radiation not because of the failure of Rayleigh’s formula, but in order to place Wien’s formula on a firm theoretical foundation. His statement was that energy would come in discrete lumps, or quanta. This was a radical departure from Maxwell’s electromagnetic theory, in which energy is regarded as continuous. Planck did not at first realize the implications of his revolutionary idea. He did not regard all energy as being ultimately composed of irreducible tiny lumps. He was the founding father of the quantum; he just did not know it at the time.

Planck’s hypothesis was based on two assumptions: the first was that the energy of the atoms (or oscillators) can only take on certain values, the second was that emission of radiation by a black body is associated with the energy of the atoms dropping from one value, or level, to a lower one.

Einstein’s paper on special relativity is very important. It is based on two postulates: that the laws of nature remain the same no matter how fast you are moving, and that the speed of light is a fundamental constant of nature, the maximum speed possible in the Universe.

Einstein also proposed that all light is made up of energy quanta, now known as photons.

Light is not only a wave, nor is it only composed of particles. Each single photon (a particle) is associated with a definite frequency and wavelength (wave properties).

The next step in the quantum revolution was taken by a young Danish physicist by the name of Niels Bohr. He started his work in Manchester in 1912 with Ernest Rutherford. Rutherford came up with a model of the atom.

Bohr postulated that the electron energies within atoms are themselves quantized. An electron can only drop to the next lower orbit by emitting a quantum of electromagnetic energy (a photon). Likewise, it could only jump to the next orbit out by absorbing a photon.

Bohr was also able to explain the meaning of atomic spectra: the fact that elements tend to give off light at certain precise sets of frequencies (called spectral lines), each spectrum being unique to that element.
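Bohr's level scheme for hydrogen can be turned into a short calculation. The sketch below uses the standard textbook values (13.6 eV for the ground-state energy, hc ~ 1240 eV·nm) to recover the wavelength of hydrogen's famous red spectral line:

```python
# Bohr model of hydrogen: electron energies are quantized, E_n = -13.6 eV / n^2.
# A jump from an upper level down to a lower one emits a photon carrying
# exactly the energy difference between the two levels.

def level_energy_ev(n):
    """Energy of the n-th Bohr level of hydrogen, in electronvolts."""
    return -13.6 / n**2

def photon_wavelength_nm(n_upper, n_lower):
    """Wavelength of the photon emitted in the n_upper -> n_lower jump."""
    delta_e = level_energy_ev(n_upper) - level_energy_ev(n_lower)  # eV
    return 1240.0 / delta_e  # hc is approximately 1240 eV*nm

# The n=3 -> n=2 jump produces hydrogen's red Balmer line, near 656 nm.
balmer_alpha = photon_wavelength_nm(3, 2)
```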

Probability and Chance

Isaac Newton believed that every particle in the Universe should obey simple laws of motion subject to well-defined forces. This led to the Newtonian idea of a ‘clockwork’ universe. Everything that can ever happen is simply a result of fundamental interactions between its parts. This is known as determinism. Of course in practice such determinism is impossible for all but the simplest systems.

One of the most profound changes in human thinking brought about by the quantum revolution was the notion of indeterminism – that is, the disappearance of determinism.

Fate as a scientific idea was proven to be false three-quarters of a century ago.

Down in the quantum world we find a very serious kind of unpredictability that cannot be blamed on our ignorance of the details of the system being studied, or on a practical inability to set initial conditions. Instead, it turns out to be a fundamental feature of nature itself at this level.

One of Einstein’s most famous quotes is that he did not believe ‘that God plays dice’, in the sense that he could not accept that Nature is probabilistic. He was wrong.

When we talk about an equation for a classical particle, we mean one that gives the precise location and velocity of the particle at some specified future time. Schrodinger’s equation is different.

The wavefunction is symbolized by the Greek letter psi. It is the unknown quantity in Schrodinger’s equation and can be worked out for each moment in time to describe the state of a quantum particle. By state we mean everything we can possibly know about the particle.

Much of the credit for the development of the theoretical understanding of quantum mechanics belongs to the Austrian physicist Erwin Schrodinger. Schrodinger decided to see whether de Broglie’s idea of matter waves could somehow be used to explain Bohr’s model of the atom. He proposed a new equation that describes not the way a particle moves but the way a wave evolves. This is where all the probabilistic nature of quantum mechanics comes in.

The wavefunction contains a lot of information. At any instant in time it has a value for each point in space. The wavefunction is spread out over all of space – hence the term wave. At each point in space the wavefunction is assigned two numbers. The probability of the electron being in a tiny unit volume around this point is the sum of the squares of these two numbers. The wavefunction itself is not a probability: you have to square it.
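The "two numbers squared" rule above can be sketched with a toy example. The sample wavefunction values below are purely hypothetical, chosen so the probabilities come out to one:

```python
# At each point the wavefunction has two numbers: a real and an imaginary
# part. The Born rule says the probability of finding the particle near
# that point is the sum of the squares of those two numbers.

# Toy wavefunction sampled at four points (made-up values, normalized):
psi = [complex(0.5, 0.0), complex(0.0, 0.5), complex(0.5, 0.5), complex(0.0, 0.0)]

# Square and add the two numbers at each point to get a probability.
probabilities = [z.real**2 + z.imag**2 for z in psi]

# The particle must be found somewhere, so the probabilities sum to 1.
total = sum(probabilities)
```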

If the electron is detected in a certain location then its wavefunction is instantly altered. At the moment of detection there will be zero probability of finding it anywhere else.

One of the most important consequences of the probabilistic nature of the wavefunction is the concept of indeterminacy, which states that knowledge of certain aspects of the state of a particle, such as its position at a particular moment in time, does not imply that its future position can be known with certainty. The best known example of indeterminacy is given by the uncertainty relation, first discovered by Werner Heisenberg.

Heisenberg’s uncertainty principle says we cannot know the precise position and momentum of a quantum particle at the same time. This led Niels Bohr to his Principle of Complementarity, which states that both seemingly conflicting aspects are necessary for a complete description of a quantum particle.
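The trade-off between position and momentum can be made quantitative. A minimal sketch of the standard relation Δx·Δp ≥ ħ/2:

```python
# Heisenberg's uncertainty relation: delta_x * delta_p >= hbar / 2.
# The more tightly we pin down position, the larger the minimum
# spread in momentum must become.
hbar = 1.055e-34  # reduced Planck constant, joule-seconds

def min_momentum_uncertainty(delta_x_m):
    """Smallest momentum spread allowed for a given position spread."""
    return hbar / (2 * delta_x_m)

# Confining a particle to 1 nanometre...
dp_nano = min_momentum_uncertainty(1e-9)
# ...versus 1 picometre: a thousand times tighter grip on position
# forces a thousand times larger minimum momentum spread.
dp_pico = min_momentum_uncertainty(1e-12)
```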

The period 1925-1927 saw a revolution in quantum physics. By September 1925, Heisenberg, Born and Jordan had arrived at the new quantum mechanics. Other young physicists such as Pauli and Dirac helped in clarifying many of the issues in the new theory. Dirac was the first person to show that the Schrodinger and Heisenberg theories were equivalent.

Spooky Connections

The idea of superposition is not unique to quantum mechanics but is a general property of all waves. The process of adding different waves together is known as superposition.
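Superposition in the general wave sense described above can be demonstrated by adding two sine waves point by point, a minimal numeric sketch:

```python
import math

# Superposition: add two waves point by point. Waves in phase reinforce
# each other; waves exactly out of phase cancel. This is the mechanism
# behind interference patterns.

def wave(amplitude, phase, x):
    return amplitude * math.sin(x + phase)

xs = [i * 0.1 for i in range(100)]

in_phase = [wave(1, 0, x) + wave(1, 0, x) for x in xs]            # constructive
out_of_phase = [wave(1, 0, x) + wave(1, math.pi, x) for x in xs]  # destructive

peak_constructive = max(abs(v) for v in in_phase)     # ~2: amplitudes add
peak_destructive = max(abs(v) for v in out_of_phase)  # ~0: amplitudes cancel
```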

The wave function is not the atom itself but only our description of how it behaves when we are not looking at it.

Interferometers are devices that highlight the way a single particle can travel along two paths at once and, once the paths are brought back together again, give rise to an interference pattern or some other type of signal that proves that something must have travelled along both routes.

Quantum mechanics tells us that, until we look, the atom’s wavefunction will be in a superposition of two pieces travelling along both routes at once. In principle the two paths can be very far apart, even on either side of the galaxy.

Quantum particles can be in superpositions of other states, such as spinning in two directions at once or having two or more different energies or velocities at once.

A phenomenon called nonlocality was shown beyond any doubt to exist in the quantum world, through an effect known as entanglement.

Physicists are no longer in any doubt that instantaneous communication between distant objects, or nonlocality, is a general feature of the quantum world, and can be traced back to the nature of the wavefunction itself.

In its original form, the Einstein-Podolsky-Rosen (EPR) experiment was meant to show that quantum mechanics provided an incomplete description of reality. It was part of a long-running debate between Einstein and Bohr. In 1964 John Bell proposed a way of testing who was right once and for all; the result is known as Bell’s theorem, or Bell’s inequality. If Einstein was right, the correlations between the two particles could not exceed a certain maximum. If quantum mechanics was correct, the correlations would be larger: a violation of Bell’s inequality. In 1982 a team of physicists in Paris, led by Alain Aspect, carried out an experiment that showed that quantum mechanics was correct.
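The "larger correlations" can be put into numbers with the standard CHSH form of Bell's inequality. This is a sketch of the textbook calculation, using the quantum prediction E = -cos(a - b) for the correlation of a spin-singlet pair measured at angles a and b:

```python
import math

# CHSH form of Bell's inequality: for any local (Einstein-style) theory,
# the combination S of four correlations is at most 2. Quantum mechanics,
# with singlet correlation E(a, b) = -cos(a - b), predicts S = 2*sqrt(2),
# violating the bound -- as the Aspect experiment confirmed.

def E(a, b):
    """Quantum correlation of a spin singlet for detector angles a, b."""
    return -math.cos(a - b)

# Standard angle choices that maximize the violation:
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
# S = 2*sqrt(2), roughly 2.83, exceeding the classical limit of 2.
```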

The Watchers and the Watched

Mathematically and logically, the quantum rules are unambiguous and well-defined. But it is hard to see them in practice. This is called the quantum measurement problem: how do we explain the collapse of the wavefunction?

Measuring some property of an object disturbs it. We will unavoidably disturb something like an electron when we try to look at it, but this is not the origin of the uncertainty principle; it is in addition to it.

The founding fathers of quantum mechanics came up with a set of postulates: rules that are an added extra to the quantum formalism and provide a recipe for how to translate the wavefunction into definite answers, or observables – tangible properties we can observe, such as the position or momentum or energy of an electron at a given moment.

One such postulate states that the probability of finding a particle in a certain location is obtained by adding together the squares of the two numbers that define the value of the wavefunction at that position. The other postulates are also to do with what kinds of measurement can be made and what we should expect to find when we carry out a particular type of measurement.

Schrodinger’s cat is dead and alive at the same time. Bohr and Heisenberg did not claim that the cat was really both dead and alive at the same time. They insisted instead, a view accepted by the majority of physicists ever since, that we cannot talk about the cat as even having an independent reality until we open the box to check up on it. Until we look inside the box all we can do is assign probabilities to different outcomes.

No one knows where to place the boundary between the quantum domain of wavefunctions and superpositions and the classical domain of definite outcomes when measurements are made.

One of the basic tenets of quantum mechanics is the fact that the results of measurements are defined not only by the properties of the quantum system being studied but by the very act of measurement itself.

Just because we must rely on the wavefunction prior to measurement does not imply that the object it describes is not real.

The Great Debate

The majority of practicing physicists have learned to use the theory without understanding why it works. No one interpretation of the quantum formalism has been proven to be any better than the rest, other than on aesthetic grounds or personal taste.

The Copenhagen view is not so much an interpretation of a scientific theory as an ideology or a philosophical stance.

The Copenhagen interpretation is not a single, clear-cut, unambiguously defined set of ideas but rather a common denominator for a variety of related viewpoints. It was developed through discussions that took place in the mid- and late 1920s between Niels Bohr and a group of brilliant young geniuses. The most notable of these was Werner Heisenberg.

Heisenberg’s main contribution to physics was his development of an alternative formulation of quantum mechanics to Schrodinger’s wave equation, known as matrix mechanics.

A combination of the two approaches is often necessary.

Only the result of measurement is real. Certain properties of the quantum system are only endowed with reality at the moment of measurement. The notion of collapse of the wavefunction upon measurement was first introduced by Heisenberg in 1929.

It was Louis de Broglie who came up with the first serious alternative to the Copenhagen view. He called it his principle of the double solution, implying a synthesis of the wave and particle aspects of matter which he had proposed.

The Solvay conference at Brussels in 1927 represents one of the landmark events in the development of quantum mechanics.

Hugh Everett III proposed what he called the relative state interpretation. His original idea is now known as the many-worlds interpretation.

The Subatomic World

On top of everything already mentioned there is some additional quantum weirdness: quantum tunnelling, spin and Pauli’s exclusion principle.

In a manner reminiscent of Planck’s idea of energy quantization but preceding it by a decade, the Irishman George Stoney had suggested that electricity might not be continuous but might come in tiny indivisible lumps he called electrons. It was J.J. Thomson who received the Nobel Prize for the discovery of the electron, the first elementary particle.

Ernest Rutherford was one of the most influential figures of 20th century science. Bohr worked with him.

By solving Schrodinger’s equation, physicists could explain how the electrons arranged themselves in the atom.

The quantum numbers the electrons have define how they arrange themselves in what are called quantum orbitals or energy levels.

Almost all non-technical books on quantum mechanics choose to explain the origin of quantum weirdness using the concept of spin. All electrons spin at exactly the same rate, and can never slow down or speed up. Their direction of spin is even weirder; it is a superposition of different directions at once, until we look.

All elementary particles that make up matter, such as electrons, protons, and neutrons – and their constituent quarks – are said to have a spin of one-half (measured in units of Planck’s constant) and belong to a class of particles known as fermions. Photons belong to the other class of particles, known as bosons, which have a spin equal to a whole number of units of Planck’s constant.

Heisenberg’s uncertainty principle tells us that trapping an electron within the tiny confines of a nucleus would imply knowing its position to such high accuracy that its momentum, and hence its energy, would have to be enormous. Electrons simply could not be imprisoned inside nuclei.
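This argument can be backed by an order-of-magnitude estimate. The sketch below confines an electron to a nucleus-sized region (about 1 femtometre, an assumed typical value) and uses the rough relation E ~ p·c for a fast particle; the resulting energy is around 100 MeV, vastly more than the few MeV typical of nuclear binding:

```python
# Order-of-magnitude version of the "no electrons in the nucleus" argument.
hbar = 1.055e-34   # reduced Planck constant, J*s
c = 3.0e8          # speed of light, m/s
ev = 1.602e-19     # joules per electronvolt

delta_x = 1e-15                      # ~ size of a nucleus, metres (assumed)
delta_p = hbar / (2 * delta_x)       # minimum momentum spread from uncertainty
energy_mev = delta_p * c / ev / 1e6  # rough energy (E ~ p*c), in MeV

# energy_mev comes out near 100 MeV, far beyond nuclear binding energies
# of a few MeV -- so the nucleus cannot hold an electron.
```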

The exclusion principle is due to Wolfgang Pauli. The reason elements have different chemical properties is the way their electrons occupy different quantum orbits, or shells. Particles such as electrons, protons, and neutrons, collectively referred to as fermions, obey Pauli’s exclusion principle.

The uncertainty principle is more generally a statement about our inability to assign precise values to two complementary quantities simultaneously, such as a particle’s position and its momentum, or a particle’s energy and the precise duration for which it has that energy.

Virtual particles that can be created out of pure energy are known as bosons. They are also referred to as force-carrying particles. Of the four known forces of nature, three are important inside the nucleus. The strong nuclear force pulls protons and neutrons together. The weak force is responsible for beta decay. It is the interplay between the repulsive electromagnetic force and the attractive strong nuclear force that gives rise to the stability of nuclei.

Dirac in 1927 clarified that the Heisenberg and Schrodinger theories were mathematically equivalent. He also introduced a mirror particle of the electron, called the positron. We now know that every elementary particle has an associated antimatter partner.

Quantum tunneling is also called barrier penetration.
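Barrier penetration can be sketched with the standard approximate formula for a rectangular barrier, T ~ exp(-2κL), where κ depends on how far the barrier height exceeds the particle's energy. The barrier values below are assumed, purely for illustration:

```python
import math

# Quantum tunneling (barrier penetration): for a rectangular barrier
# higher than the particle's energy, the transmission probability falls
# off roughly as T ~ exp(-2 * kappa * L), so it is extremely sensitive
# to the barrier's width L.
hbar = 1.055e-34   # reduced Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg
ev = 1.602e-19     # joules per electronvolt

def transmission(barrier_ev, energy_ev, width_m):
    """Approximate tunneling probability through a rectangular barrier."""
    kappa = math.sqrt(2 * m_e * (barrier_ev - energy_ev) * ev) / hbar
    return math.exp(-2 * kappa * width_m)

# Illustrative numbers: a 5 eV barrier faced by a 4 eV electron.
# Doubling the barrier width squares the (already small) probability.
t_thin = transmission(5.0, 4.0, 1e-10)
t_thick = transmission(5.0, 4.0, 2e-10)
```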

There are six quarks. They carry electric charge and also a property known as color charge. Today there are known to be just two species of elementary matter particles: quarks and leptons. The latter is the name for all particles that do not feel the strong nuclear force.

The Search for the Ultimate Theory

The quest for the ultimate truths is always a quest for beauty and simplicity.

The whole of classical mechanics is explainable by Newton’s laws of motion and forces, and Einstein’s theories of relativity improve on this.

The Holy Grail of physics is to find the ultimate theory of everything, an all-encompassing theory from which all naturally occurring phenomena in the Universe could be deduced and explained.

James Clerk Maxwell showed that light is made up of a combination of electric and magnetic fields oscillating at right angles to each other, travelling at a speed of 300,000 kilometers per second.

Special relativity has taught us that as the body approaches the speed of light its mass starts to grow until, at light speed, it would have an infinite mass – which is why nothing that has mass when at rest can travel at the speed of light.
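The growth of mass with speed follows the relativistic factor gamma = 1/√(1 - v²/c²). A minimal sketch showing how it blows up as v approaches c:

```python
import math

# Relativistic mass growth: m = m0 / sqrt(1 - v^2/c^2). As v approaches c
# the denominator approaches zero, so the mass (and the energy needed to
# accelerate further) grows without bound -- which is why nothing with
# rest mass can reach the speed of light.

def gamma(v_fraction_of_c):
    """Relativistic factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c**2)

g_slow = gamma(0.1)       # ~1.005: barely noticeable at everyday speeds
g_fast = gamma(0.99)      # ~7.1: mass has grown sevenfold
g_faster = gamma(0.9999)  # ~70.7: still growing, with no upper limit
```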

In 1928, Dirac published a paper with the title The Quantum Theory of the Electron, in which he proposed an alternative theory that was fully relativistic and also took into account the electron’s spin in a natural way. This theory introduced the antiparticle. A year earlier he had also combined quantum mechanics with Maxwell’s theory of light by quantizing the electromagnetic field. What emerged was the theory of quantum electrodynamics, or QED.

Physicists speak of a global symmetry when certain laws of physics remain the same when a particular change, or transformation, is applied equally everywhere.

Above a temperature of about a million billion degrees, such as would have been the conditions in the very early Universe, the electromagnetic and weak forces would become one and the same force. The field theory of the strong nuclear force that was developed is known as ‘quantum chromodynamics’ or QCD.

In 1949 Richard Feynman, Julian Schwinger and Sin-Itiro Tomonaga separately figured out a way of finessing the infinities with a mathematical trick called renormalization.

Particle physicists refer to the framework that loosely incorporates both the electroweak theory and QCD as the Standard Model.

When we reach a certain tiny distance scale (10⁻²⁸ of a millimeter) we find that all three forces (the weak and strong nuclear forces and electromagnetism) converge on the same strength. It is here that they can be regarded as a single force and a certain symmetry is restored. A theory that would unify these three forces is known as a Grand Unified Theory (GUT).

For true symmetry the three forces needed to converge simultaneously.

Supersymmetry proposes that every particle has a supersymmetric partner of the opposite character.

Einstein described gravity in terms of pure geometry. Everything in the Universe is trying to pull everything around it closer.

To combine all four forces, we would need to combine general relativity with quantum field theory. The quest for a theory of quantum gravity is on. The two camps differ in their belief about where to start: quantum theory or the theory of relativity. One common point is that we need to follow the lesson of Planck: space and time themselves must ultimately be composed of irreducible lumps.

String theory describes the force of gravity in terms of an exchange particle called the graviton. It states that all the fundamental particles are in fact tiny vibrating strings. The different frequencies at which these strings vibrate give rise to the different elementary particles. The original version of string theory was based on ideas developed in 1968 by Gabriele Veneziano. In the 1980s John Schwarz and Michael Green applied the idea of supersymmetry to Veneziano’s strings.

The heat radiated by a black hole is paid for, not by energy flowing out of the hole, but by negative energy flowing in. A negative energy state is one with less energy than a state that has zero gravitational field.

Black holes and in fact all spherical objects, create negative quantum vacuum energy in their vicinity because the curvature of spacetime due to their gravitational field disturbs the activity of the virtual particles.

Putting the Quantum to Work

The phenomenon of quantum tunneling gave us nuclear power and will hopefully lead one day to a cleaner source of unlimited energy: nuclear fusion.

The Pauli exclusion principle describes how electrons arrange themselves inside atoms such that no two electrons are allowed to occupy the same quantum state. They cannot be described by the same wavefunction but must differ in some way: in their energy, their orbital momentum or their spin direction. This principle is put to very practical use in semiconductors.
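The shell-filling this paragraph describes can be counted explicitly. Each state is labelled by four quantum numbers, and the exclusion principle forbids duplicates, which yields the familiar shell capacities 2, 8, 18:

```python
# Counting the states the exclusion principle allows in each shell:
# for principal quantum number n, the orbital number l runs 0..n-1,
# the magnetic number m runs -l..l (that is, 2l+1 values), and each
# (n, l, m) slot holds two opposite spin directions. Since no two
# electrons may share all four numbers, a shell holds 2*n^2 electrons.

def states_in_shell(n):
    """Number of distinct electron states in the n-th shell."""
    return sum(2 * (2 * l + 1) for l in range(n))

shell_capacities = [states_in_shell(n) for n in (1, 2, 3)]  # [2, 8, 18]
```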

Lasers, microchips, semiconductors, LEDs – all of them rely on quantum mechanics.

Lasers can nowadays be used to control single atoms in ways that are opening up a whole new field of quantum technology.

If we do ever succeed in making electric power cables from room-temperature superconductors, it would help towards providing cheaper electricity. The energy efficiency of electric power delivery is currently low, due in part to the energy lost as heat through electrical resistance in the transmission cables that criss-cross the land.

Nanotechnology is the relatively new field of building and utilizing micro-machines on the scale of nanometers.

Quantum tunneling is thought to be involved in a wide range of biochemical processes including photosynthesis, respiration, mutation, and protein folding.

Into the New Millennium

In order to make real use of quantum superpositions we need to entangle many quantum states together.

There has already been one successful application of entanglement. It is called quantum cryptography.

Lasers are used to carve out the tiny patterns of an integrated circuit on the surface of a silicon chip.

Moore’s law may come to an abrupt end due to an unavoidable problem with increasing miniaturization known as thermal noise. The ‘point one’ barrier is reached when transistors are just 0.1 microns across. Perhaps this can be solved by replacing silicon with gallium arsenide, which conducts electricity faster. The Rayleigh criterion says that features on the chip can be no smaller than half the wavelength of the beam.
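The half-wavelength limit mentioned above is easy to sketch. The 193 nm figure used below is an assumed example (it is a wavelength used in real deep-ultraviolet lithography), purely to illustrate the criterion:

```python
# Rayleigh-style resolution limit: features carved by a light beam can be
# no smaller than about half the beam's wavelength.

def min_feature_nm(wavelength_nm):
    """Smallest feature size patternable with light of this wavelength."""
    return wavelength_nm / 2

# A deep-ultraviolet laser at 193 nm could therefore pattern features
# no finer than roughly 96.5 nm by this criterion.
limit = min_feature_nm(193)
```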

The simplest piece of quantum information is the qubit. A classical switch can have only two positions, either 0 or 1, while a qubit can be in a superposition corresponding to both 0 and 1 at the same time.

Peter Shor and Lov Grover devised the best-known quantum algorithms: Shor’s for factoring large numbers, Grover’s for searching unsorted databases.

For a working quantum computer, we would need to entangle thousands of qubits and stave off decoherence long enough for useful computations to be carried out.
