Sunday, February 12, 2006

The Hunting of the Quark

The Fall and Rise of Quantum Field Theory
I have encountered several theories of how theory and experiment interact and how science progresses: Kuhnian paradigm shifts; Galisonian constraints and the intercalated experiment-theory-instrument structure; Pickering's ideas of resistance, "opportunism in context" and theory-experiment symbiosis; and French's emphasis on "uberty", or "the objective structural characteristics of the models concerned that contribute to their heuristic fruitfulness".1

In this paper I will examine the theories, experiments and conditions surrounding Quantum Field Theory, S-matrix Theory, the quark theories of Gell-Mann and Zweig (as well as Richard Feynman's Partons and Quantum Chromodynamics (QCD)) and the related experiments (like the scaling result at SLAC), and discuss how we can understand these events in terms of these various theories. I will show that none of these accounts entirely captures what went on in this period - a full analysis of these events would require a theory that synthesises elements of them all.

Aspects of Theories of Science

When a physicist writes the "history" of his field, he usually narrates the progress of science in an idealized fashion. Usually, the story goes something like:

1) There is an outstanding problem in some field of physics - either a theory is believed to be incomplete because of some obvious internal shortcomings, or a new experimental result leads to a crisis that must be resolved before progress can resume.

2) Everyone acknowledges the problem and the great minds of the era turn their attention to it.

3) One or more solutions appear over a short period. Quickly, the community unanimously decides which of these options is the one that might be right.

4) An experimenter takes this theory, immediately conceives a new experiment that will verify some critical aspect of the new theory and returns a year later with the astounding news that his experiment supports the theory.

5) The theory community unanimously agrees that this result is sufficient to declare the theory "correct" until the next iteration of this process.

In the introduction to "Constructing Quarks", Pickering summarizes a variety of "well-known" reasons why this view of experiment as "the supreme arbiter of theory" is misleading:2

1) Even indisputable experimental fact (by itself) still leaves theory underdetermined. It is necessary to apply judgement to sort the plausible theories from the infinite number of implausible ones that also explain the same experimental results. In this sense, the experimental results are less relevant to vindicating the decisions made by the theorists in step three (above) than the physicist/author typically implies.

2) The idea that experimental results are indisputable fact is not obviously true.

"At the heart of the scientist's version is the image of experimental apparatus as a 'closed', perfectly well understood system... They are better regarded as being performed upon "open", imperfectly well understood systems[.]"

Pickering is pointing out the familiar fact that experiments are often carried out with assumptions about the particles involved that are later shown to be false - whole experiments have to be reinterpreted in the light of new ideas to derive the "correct" result.

As an example from a slightly removed part of physics, consider the Michelson-Morley experiment. "Originally" taken as proof that the motion of the Earth through the ether was not detectable to a given accuracy, it was quickly reinterpreted as a vindication of electronic theories of matter like that of Lorentz, and then just as quickly reinterpreted again as a vindication of Einstein's "constant c" postulate, central to the Special Theory of Relativity!

The typical textbook/popular science view of science is of incremental growth from nothing to the present-day body of knowledge via a chain of individual discoveries and inventions. This view also carries the implicit (or sometimes explicit) idea that all physicists have been trying to solve the same set of problems (or their logical successors) with this ever-increasing volume of tools.

Thomas Kuhn, who had trained as a physicist at Harvard in the late 1940s, published "The Structure of Scientific Revolutions" in 1962, putting forward what became a very popular view of scientific progress that was at odds with this image of gradual, incremental progress. In Kuhn's view, science consists of periods of "normal science", where an established paradigm rules and research is generally a mopping-up operation aimed at filling in the details, interspersed with periods of crisis where contradictions and anomalies between results and the paradigm lead to a breakdown of the established order.

A paradigm is essentially a collection of beliefs, a set of agreements about how we understand scientific results and problems. Kuhn maintained that paradigms are essential to the practice of science - "no natural history can be interpreted in the absence of at least some implicit body of intertwined theoretical and methodological belief that permits selection, evaluation, and criticism."3 The ruling paradigm defines the questions that scientists ask, guiding the research efforts of scientific communities.

A new paradigm then forms, based on the new theories, to resolve the crisis. Kuhn holds that, under the new paradigm, the entire world-view of the scientists has changed. "[A] scientist's world is qualitatively transformed [and] quantitatively enriched by fundamental novelties of either fact or theory."4 Science before and after the paradigm shift is incommensurable - even basic observations cannot be directly compared.

In his Harvard lectures and in "Context and Constraints",5 Peter Galison expresses a competing view. Instead of Kuhn's global paradigm shifts, in which a new paradigm completely changes the interpretation of every experimental result, Galison divides physics into the related areas of theory, experiment and instrumentation. Galison observes that, for example, when a major shift in the theoretical community's viewpoint occurs (like the move towards string theory), the experimental community often does not immediately shift in the same way (high-energy physics has yet to attempt - because it cannot - to investigate the energies involved in string theory). Galison also observes that the instrumentation used by the experimentalists develops at its own pace, with its own breakthroughs and major shifts.

These three areas evolve separately, but not in isolation - the shifts in each are driven by (and drive) the shifts in the others. The development of the Geiger counter and related counters, the experiments conducted with them, and the theories built around those results are a good example of Galison's description.

Galison also brings the idea of constraints into the analysis of the history of science. Borrowed from the "Annales" school of historians, constraints are "the boundaries beyond which inquirers within the community find it unreasonable to pass."6 Depending on the community in question, the relevant constraints can range from the conservation laws and symmetries theorists may hope to hold onto, to differing degrees of allowable mathematical rigour - for example, many S-Matrix theorists would not venture into the terrain of Quantum Field Theory because they held formal mathematical objections to the "trick" of renormalisation7 - to the properties of relevant experimental equipment, like the materials used by a particular group. The concept of constraints also encompasses various "external" factors - economic and political limits on what science is allowed, funded or encouraged to do.

These constraints differ from subculture to subculture, leading to differing "views about the status of what objects there are, how we learn about them, and how they interact" - but these groups will have to interact with each other and the rest of the world. Here Galison borrows from anthropological linguistics - at the overlap between two different subcultures a pidgin or Creole language forms that allows the disparate groups to communicate. An example Galison has used in his lectures is the creation of very sensitive emulsions for the detection of nuclear particles in films. The physicists who wanted to record and measure the paths of nuclear particles in a photographic film-like emulsion (experimenters) had a very different world view to the chemists and technicians at Kodak who knew how to make emulsions with various properties (instrumenteers), yet in order to secure the emulsion the physicists needed, the two groups had to understand each other. Fortunately, the small overlap of common terms and ideas was sufficient, and Kodak was able to develop the required emulsion (with a closely held secret procedure).

Pickering, in "Beyond Constraint"8 argues for a slightly different interpretation of the evolution of knowledge. His personal focus is the time evolution of "practice" within a subculture, as opposed to Galison's "long duration" and distant view that looks at how subcultures interact and their paradigms shift. In Pickering's view, constraints are understood as things that are "already there" but it is the "real time" effects that appear during the course of the research that really drive what results are found and how those results are interpreted. The "resistances" appear in the course of science as projects or models are pursued. In the face of resistance, scientists make "accommodations" changing their experiments or their models to overcome the resistance. A back and forth ensues between accommodations and resistances until the scientist has a result that is satisfactory within his paradigm.

In "Constructing Quarks", Pickering talks in more detail about another part of his view of the progression of science - the feedback between theory and experiment and the formation of research traditions9. In idealized form this goes something like: experimenters find unexpected phenomena (or some other anomalous results). Some theorist (or theorists) identifies this result as a manifestation of novel phenomena central to his theory. This has a dual effect: it points the way towards new experiments that investigate other aspect of the theory and also, as the new theory now possesses a supporting result, theorists elaborate further upon it, leading to new theories.

If the second generation of experiments confirms more aspects of the new theory, then its status is elevated and further generations of experiments and theories flourish into what he refers to as research traditions. Pickering points out that there can be several competing research traditions simultaneously within one field, e.g. the S-Matrix and Quantum Field theories.

Pickering summarizes his ideas about the dynamics of research within traditions as "opportunism in context" - theorists and experimenters within a tradition look to each other to find questions that need to be answered - the theorist seeks to explain the latest results of the experiments and the experimenter seeks to investigate phenomena in which there is a theoretical interest - usually within his "tradition."

In a paper critical of this style of analysis[10], French describes this argument as springing "from the view that it is primarily the sociological and psychological, rather than the epistemic, virtues of theories that provide the driving force behind theory choice, understood ... to also include theory pursuit."[11]

French favours a theory that acknowledges "the objective structural characteristics of the models concerned that contribute to their heuristic fruitfulness, or 'esperable uberty' as Peirce would have called it, and which leads to one being pursued rather than the other."[12] Essentially, French is arguing that physicists do not necessarily pursue the theory that is most accessible to them given their cognitive skills and experience. Rather, they will tend to choose the theory that offers the greatest promise, based upon characteristics of the theory that lead to an expectation of future results as well as an accounting of the previous results for which the theory has been responsible.

Forman has argued that external social and political pressures shaped the direction in which science progressed. His two classic examples are the rejection of causality in the Weimar Republic of the 1920s, which lent itself to the adoption of the acausal, random aspects of quantum mechanics,[13] and the national security state of Cold War America, which steered research towards work directly or indirectly related to defence.

Quark (and related) Theories and Experiments

From a Kuhnian point of view, high-energy physics entered a crisis almost immediately - as early as 1951 there were already fifteen "elementary" particles, making an "untidy"14 list with no apparent explanation (this crisis affected the theorists, who could not explain the experimental results, and the experimentalists, who became unsure about what the things they were seeing really meant). Ahead of the curve, in 1949 Yang and Fermi speculated on solutions to the problem of the growing number of elementary particles by describing the mesons as nucleon-antinucleon pairs. By 1961, the crisis had worsened - there were around one hundred "elementaries" - but a large group of theorists had acknowledged the crisis and attempted to resolve it. Gell-Mann and Ne'eman proposed the "Eightfold Way", a classification scheme for the hadrons based upon the symmetries (and "broken symmetries") possessed by the interactions between the elementary particles, derived from quantum field theory. The group associated with this symmetry is known as SU(3).

Quantum field theory, immensely popular in the early 1950s as a holdover from the resounding successes of Quantum Electrodynamics, began in the eyes of physicists to develop incurable flaws as a theory of the strong and weak forces, flaws that would drive it away from the mainstream of physics for almost 20 years. Quantum field theory at that time did not appear to be amenable to forming a theory of the weak interaction, because the theories constructed were non-renormalizable. Nor could it represent the strong force - the strength of this force led to a large coupling constant, which meant that the perturbation expansion of any calculation could not be truncated after a finite number of terms while retaining any accuracy. The inability to make quantum field theory work constituted another crisis, largely confined to theory, that led to several abortive paradigm shifts before being resolved by an improvement in the structure of the field theory/renormalization paradigm - see Quantum Chromodynamics, below.

At the same time as theorists were hitting dead ends with the field theory, other physicists were developing the "S-matrix Theory" approach. Abandoning attempts to sum all the terms in the strong force expansion, they chose to limit themselves to talking about quantities that could be experimentally measured - probabilities of transitions from an incoming set of quantum states to an outgoing one. In the late 1950s, the work of field theorists like Chew and Gell-Mann showed that the S-Matrix was a relatively straightforward function of the relevant variables and amenable to various mathematical techniques. This made it possible for S-Matrix research to be conducted without reference to the quantum field theories that it had developed from. Chew proposed the bootstrap philosophy - if the S-Matrix was really an infinite set of coupled equations that determined everything about the hadrons, then it had to have a unique solution, even if it couldn't be calculated. By truncating this set of equations, approximate results could be generated.[15]

In the late 1950s and the early 1960s, many physicists, in an apparent contradiction of Pickering's theory of "opportunism in context", gave up on Quantum Field Theory as a theoretical exercise. Pickering's "opportunism in context" ideas would indicate that these physicists, who had spent years or decades developing skills in quantum field theory, would choose to continue working in this field. The evidence of this migration appears to support French's "uberty" position: the physicists willingly gave up the tradition in which their strength lay in order to pursue a tradition that must have appeared, by virtue of avoiding some of the problems of quantum field theory, to hold greater prospects.

Although The Eightfold Way had originally been developed in the context of field theory, the rise of S-Matrix theory at the expense of quantum field theory left it orphaned - rather than being an explanation, it was simply a classification scheme, albeit one with important phenomenological impact: Gell-Mann predicted mass relationships between members of the SU(3) multiplets, based on a simple model of the strong force, that successfully predicted (before their discovery) the masses of the Σ, Σ*, Ξ* and Ω⁻.
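For the decuplet members, these mass relations take the form of the "equal spacing" rule, stated here in modern notation (my reconstruction, not a quotation from Gell-Mann):

$$ M_{\Sigma^*} - M_{\Delta} \;=\; M_{\Xi^*} - M_{\Sigma^*} \;=\; M_{\Omega^-} - M_{\Xi^*} $$

which is how the mass of the then-undiscovered Ω⁻ could be quoted in advance of its detection.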

By 1964, there was virtually no argument that The Eightfold Way was the correct system for classifying the hadrons.

Had a paradigm shift occurred? It does not appear so - although The Eightfold Way had simplified the complex tables of hadrons and their properties, there was still no fundamental change in how anyone viewed the world. The hadrons were still considered elemental, and there was no globally accepted theoretical explanation of the SU(3) structure.

What happened to Quantum Field Theory?

In the 1960s and 1970s, there were two large groups doing theoretical physics: the S-Matrix theorists (successful at high energy) and those pursuing the Constituent Quark Model analysis of resonances (see below; successful at low energy). Field theory was largely unused by either of these groups - the idea of calculating the details of interaction processes went against the basic ideas of the S-Matrix, and the quantum field theorists could not perform any of their calculations on the quark model due to the strength of the required interaction. However, Quantum Field Theory did not completely disappear - people still worked on the original field theory, Quantum Electrodynamics, and a group (including Gell-Mann) worked in the small field known as "current algebra".

The Two Quark Theories

Although the broken symmetry of the Eightfold Way described the pattern of the baryons and the mesons, a more fundamental explanation was desired. In 1964, two papers appeared that proposed similar steps towards such an explanation. Murray Gell-Mann at Caltech published A Schematic Model of Baryons and Mesons16 and Zweig at CERN proposed a similar model in the preprint An SU(3) Model for Strong Interaction Symmetry and Its Breaking[17] (with the constituents named "aces" instead of quarks).

Despite the very different forms of the two quark proposals, they had several factors in common. The hadrons were described as composite particles: Zweig and Gell-Mann postulated that they were made up from three fundamental particles of baryon number 1/3 - the "up" quark (u) with charge +2/3, the "down" (d) with charge -1/3 (both with zero strangeness) and the "strange" quark (s), also with charge -1/3 but carrying strangeness -1. The mesons are then constructed from quark-antiquark pairs and the baryons from trios of quarks (see the bookkeeping sketch after the list below). As Pickering points out, these similarities between the two proposals are not at all unexpected,[18] for a pair of reasons:

1) Since the work of Yang and Fermi mentioned above and a model proposed by Sakata in 1956, high energy theorists had toyed with the idea of representing the many "elemental" particles as compounds of a smaller group of truly fundamental particles. Thus, the idea of hadrons as composite systems was quite familiar to them.

2) A familiar aspect of the group theory classification scheme intrinsic to the Eightfold Way is the "fundamental representation" of the SU(3) group, from which all the other representations could be mathematically derived. Even within the Eightfold Way, the fundamental representation corresponded to particles with the properties of the three quarks!
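To make the quantum-number bookkeeping concrete, here is a minimal sketch in Python (my own illustration, with invented names; nothing in the original papers is written this way) of how the quark assignments above combine into hadrons:

    # Combine quark quantum numbers (charge Q, baryon number B, strangeness S)
    # to recover hadron quantum numbers. Names and structure are illustrative only.
    from fractions import Fraction as F

    QUARKS = {
        "u": {"Q": F(2, 3),  "B": F(1, 3), "S": 0},
        "d": {"Q": F(-1, 3), "B": F(1, 3), "S": 0},
        "s": {"Q": F(-1, 3), "B": F(1, 3), "S": -1},
    }

    def antiquark(name):
        # Antiquarks carry the opposite of every additive quantum number.
        return {k: -v for k, v in QUARKS[name].items()}

    def combine(*constituents):
        # Additive quantum numbers of a composite are sums over its constituents.
        return {k: sum(c[k] for c in constituents) for k in ("Q", "B", "S")}

    proton = combine(QUARKS["u"], QUARKS["u"], QUARKS["d"])  # Q=+1, B=1, S=0
    pion_plus = combine(QUARKS["u"], antiquark("d"))         # Q=+1, B=0, S=0
    kaon_plus = combine(QUARKS["u"], antiquark("s"))         # Q=+1, B=0, S=+1
    print(proton, pion_plus, kaon_plus)

Baryons (three quarks) automatically come out with baryon number 1 and integer charge, and mesons (quark plus antiquark) with baryon number 0 - exactly the pattern both proposals relied on.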

Despite these common inspirations, Zweig and Gell-Mann arrived at very different formulations of the quark theory. Where did the differences come from?

Zweig: Constituent Quark Model (CQM).

Zweig explained the multiplet structure of the hadrons: all hadrons were made up of two or three "aces" (quarks) carrying the correct quantum numbers. He treated quarks as "real, physical" constituents of the hadrons, and derived many predictions.

Why didn't the theory community like CQM (Zweig was often described as a "charlatan"[19])?

Pickering maintains[20] that most of the theory community was committed to either Quantum Field Theory or the S-Matrix. From a Quantum Field Theory standpoint, the only explanation of why no quarks had been observed was to give them a huge mass. But the quarks came together to give light objects, which meant that there had to be a great deal of binding energy and thus very strong quark-quark interactions - and Quantum Field Theorists did not know how to do these calculations, as strong coupling ruined perturbation theory. Adherents of S-Matrix Theory, especially in the bootstrap formulation, believed that there were no fundamental particles.

These objections can be understood in Pickering's opportunism-in-context scenario - if the Quantum Field theorists did not already know how to deal with the strong coupling, then it would not be in their interest to investigate the CQM - or in Galison's constraint framework: constrained by the principle that there were no fundamental particles, the S-Matrix theorists could not consider the idea of fundamental quarks. Quarks were simply outside the paradigm in which they understood the world! The paradigmatic difference was such that Zweig and the S-Matrix theorists could not even construct a Creole to share any part of the CQM.

Nonetheless, Zweig's constituent quark model was slowly accepted. Despite its shortcomings in the eyes of the theoreticians, the CQM was an immensely valuable tool to a fourth group of scientists virtually unique to high-energy physics (and not present in Galison's intercalated description in "Context and Constraints") - the phenomenologists. Although closely related to theorists, the phenomenologists pursue distinct goals - rather than "the abstract elaboration of respectable theories (like quantum field theory or the analytic S-matrix) - [the phenomenologists pursue] the application of less dignified models to the analysis of data and as a guide to further experiment."[21] Instead of struggling through the pure mathematics of group theory, physicists interested in producing predictions (i.e. the experimentalists and phenomenologists) could simply use their well-practised skills in combining the quantum numbers of compound objects.

In his original papers, Zweig produced a variety of phenomenological predictions based on the constituent quark model. He explained the empirical absence of the triplet, sextet and 27-plet multiplets that seemed perfectly valid in the Eightfold Way by identifying the mesons with quark-antiquark pairs (qq̄) and the baryons with quark triplets (qqq) (the recent "pentaquark" discoveries modify, but do not damage, this result). At the same time, Zweig simplified Gell-Mann's mass formulae by assigning the strange quark a higher mass than the others (a speculation that has survived into modern versions of the standard model). This is another example of "uberty": the quark model was favoured by phenomenologists over the Eightfold Way and the other group-based considerations they had been working with for years, because they perceived it as more fruitful as a potential research program.
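In group-theoretic shorthand (standard results assumed here, not spelled out in the papers as discussed above), the identification does the work because

$$ 3 \otimes \bar{3} = 8 \oplus 1 \quad \text{(mesons)}, \qquad 3 \otimes 3 \otimes 3 = 10 \oplus 8 \oplus 8 \oplus 1 \quad \text{(baryons)} $$

so only singlets, octets and decuplets can be assembled from qq̄ and qqq; the 3, 6 and 27 representations that SU(3) symmetry alone would allow simply never arise.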

The CQM led to a more intimate relationship between the phenomenologists and the experimentalists - the resonances being investigated in this era were not as clear-cut as the earlier ones, and finding them was no longer "bump hunting"; instead, detailed examination of the decay products was required. The CQM improved the experimental world by providing specific areas to hunt, where undiscovered particles were predicted to reside. Also, by providing predictions that could be tested, the quark model created a reason for continuing the detailed examination of the low-energy regime, which drove further elaboration of the CQM.

As a "neophyte to theoretical physics[,]"[22] Zweig was constrained to use only the tools he had mastered – a familiarity with the SU(3) symmetry and the Sakata model of composite hadrons, and thus his quark proposal was framed in an "extremely crude manner[.]"[23]

Gell-Mann and Current Algebra/Quarks

Gell-Mann, in contrast, was a theorist with well-established skills, working on the current algebra approach he had founded. Current algebra was, like the Eightfold Way, a theory Gell-Mann had developed from field theory that still stood as a useful phenomenological tool once the field theory was taken away.

In his paper, Gell-Mann develops a theory based upon currents of free quark fields that allows him to recover the successful phenomenological results. Gell-Mann ends the paper: "... would help reassure us of the non-existence of real quarks"

Scaling

Late in 1967, the Stanford Linear Accelerator Centre (SLAC) uncovered an important new phenomenon, scaling, with significant implications for the development of the quark theories. Among the shakedown experiments for the various detectors installed at SLAC was a general survey of electron-proton scattering cross-sections. Electron-proton scattering was considered a useful experiment for two reasons:

1) Because the electromagnetic interaction is relatively weak, the scattering can be represented to a good approximation as the exchange of a single photon between the electron and the proton, simplifying the analysis.

2) The interaction between the electron and the photon is assumed to be perfectly described by quantum electrodynamics, so the only unknown in the experiment is the photon-proton interaction.

Thus, these experiments constituted an investigation of the detailed structure of the proton. Essentially, electrons were accelerated down the two-mile-long, 22 GeV accelerator into a target of liquid hydrogen.

The elastic electron-proton (electron + proton --> electron + proton) scattering cross-sections measured at SLAC agreed with the existing (lower energy) results - the electrons, which acted like hard objects, bouncing off each other at high angles, were only shallowly deflected by the protons. The standard explanation for this is that the protons are extended, diffuse clouds of charge, which essentially diffract the electron beam into a narrow cone. However, the high-energy inelastic scattering (electron + proton --> electron + anything) measured by the MIT-SLAC collaboration (headed by Kendall and Friedman at MIT and Taylor at SLAC) showed unexpected results. By counting all the electrons scattered by the protons as a function of angle and electron energy, the experimenters measured the cross-section for all the reactions (ep) --> (eX), where X is known as the "hadronic debris". The data analysis, which began in the spring of 1968, showed, at low energies, the expected resonance structure as various metastable hadrons were formed in the collisions. At high energies, the data showed no resonances and, unexpectedly, very large cross-sections at large angles - several orders of magnitude larger than the equivalent elastic cross-sections, and very similar to the electron-electron elastic scattering cross-sections. Here theory and phenomenology both influenced the development of an explanation of this result.

Bjorken - Theory

In 1963, during the construction of the SLAC facility, Bjorken joined the theory group there and began calculations of what the experimenters might find in the electron-proton scattering measurements. Even before the experiments had been conducted, they were influencing the development of theory and phenomenology - in a sense this is an example of "acausal symbiosis" between experiment and theory.

In 1964, Drell and Walecka at Stanford had shown that the inelastic electron-proton cross-section could be expressed in terms of two independent functions of the kinematic variables - W1 and W2 - known as the structure functions. Building on this result, Bjorken, working from current algebra, inferred that in the limit of very large momentum transfer (q2) and energy lost by the electron (ν), with the ratio of ν and q2 fixed, the quantities W1 and ν·W2 should depend only on that ratio (the "Bjorken limit"). Because the structure functions then depend only on this dimensionless ratio, rather than on the energy and momentum transfer separately, the effect became known as scaling. The SLAC cross-section data, when plotted against the scaled variable 2Mpν/q2 (where Mp is the mass of the proton), showed the predicted scaling.
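In the notation that later became standard (a reconstruction rather than a quotation from Bjorken, and with the conventional factor of the proton mass in front of W1), the scaling hypothesis reads

$$ M_p W_1(\nu, q^2) \to F_1(\omega), \qquad \nu W_2(\nu, q^2) \to F_2(\omega), \qquad \omega = \frac{2 M_p \nu}{q^2} $$

as q2 and ν grow large with ω fixed: the structure functions collapse onto functions of a single dimensionless variable rather than of ν and q2 separately.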

Feynman and Partons – "Phenomenology"

Late in 1968, Feynman visited SLAC and there developed the far more successful parton explanation of the scaling result. Because the strong force coupling constant was believed to be large, in the typical field theory perturbation expansion of the strong force the more and more complicated terms generally contribute larger and larger amounts to the result. Thus, particles that interact with the strong force, like protons, had to be represented as being surrounded by an infinitely complex cloud of virtual particles. Depending on what particles one believed to be elementary, the protons would either be surrounded by swarms of mesons or swarms of quarks. Even after visiting SLAC, in the absence of compelling evidence, Feynman chose to ignore the detailed composition of the cloud making up the proton - whether mesons or quarks. To talk about its contents in a non-committal way, Feynman had developed his theory throughout the 1960s in terms of "partons". By making simplifying assumptions about the nature of electron-parton and parton-parton interactions, Feynman was able to develop a phenomenological idea of how partons might appear in experimental data if sufficiently high energies could be reached. At SLAC, he had found the energies he required - and the parton model explained scaling in a direct fashion. Instead of exchanging a photon with the proton as a whole, the electron exchanges a photon with a single parton, which, being a point particle, interacts with the photon in essentially the same way as the electron does (i.e. by the rules of Quantum Electrodynamics), explaining the similarities between the electron-proton and electron-electron cross-sections. Feynman also showed that the structure functions, W1 and W2, were measures of the momentum distribution of the partons, so the scaling result also appeared naturally.
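The connection between the parton picture and the measured structure functions can be written compactly, again in later standard notation that I am assuming rather than quoting:

$$ \nu W_2(\nu, q^2) \to F_2(x) = \sum_i e_i^2 \, x f_i(x), \qquad x = \frac{q^2}{2 M_p \nu} = \frac{1}{\omega} $$

where f_i(x) is the probability of finding a parton of species i carrying a fraction x of the proton's momentum and e_i is its electric charge: the photon scatters incoherently off point-like constituents, and the dependence on q2 at fixed x drops out.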

Why were people more interested in Feynman’s work than Bjorken's?

According to Pickering, Feynman and Bjorken both developed explanations of scaling that included discrete point scatterers - Feynman as the starting point, Bjorken as an obvious implication of the final result. However, just as with the constituent quark model, physicists generally chose to work with partons primarily because they used techniques common to many physicists' backgrounds (despite their original, conceptual basis in quantum field theory), unlike Bjorken's complicated current algebra, which was accessible only to the few who specialized in that method.[24]

Despite their obvious similarities as theories (or "phenomenologies"?) that explained properties of the hadrons in terms of simple constituent particles, the Constituent Quark and Parton models were not inarguably equivalent. The parton model assumed that the partons were essentially free particles (i.e. they did not interact with each other), whereas the quark model required very strong inter-quark forces to explain the quarks' experimental non-appearance. Also, partons were a model of high-energy scattering effects and quarks a model designed to explain the low-energy resonances - two very different areas, necessitating two very different languages, one for the phenomenologists dealing with partons and another for those dealing with the constituent quark model. However, because most physicists made the obvious mental leap of identifying the quarks and the partons as the same things within their world pictures, it became necessary for these two groups of phenomenologists to talk to each other. Thus a trading zone formed as various people elaborated on the parton model to extract predictions of how the structure functions would depend upon the possible quantum numbers of the partons - with the hope of confirming the partons as quarks.

In the early 1970s, experiments measuring the structure functions in the Bjorken limit (requiring a broader set of data than the original measurements) gave a reasonable indication that the partons were spin-1/2 particles. Further attempts were made to determine whether the partons possessed the fractional charge characteristic of quarks. Initially, assuming that the parton cloud consisted entirely of the three quarks of Zweig's basic Constituent Quark Model, the estimates of the structure function came out roughly double the measured values. Moving closer to Feynman's original idea of an infinite cloud of partons, physicists argued that the parton cloud should contain three "valence" quarks and an infinite quark-antiquark sea. Despite some flexibility in choosing the relative magnitudes of these two components, the structure functions were still overestimated. The next step was the addition of "glue" (later "gluons") - electrically neutral particles providing the interactions that keep the hadrons together. By allowing the gluons to carry some of the total hadron momentum, but not to interact directly with the electron (due to their lack of electrical charge), Weisskopf and Kuti at MIT in 1971 were able to achieve a "fair quantitative fit"[25] to the SLAC data. However, in their effort to wrangle the data into the form they wanted, the phenomenologists were "in danger of [creating a theory that was] more elaborate than the data which it was intended to explain."[26]
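A rough way to see how adding gluons lowers the prediction is the parton-model momentum sum rule (a standard relation that I am using for illustration, not a formula taken from the fits themselves):

$$ \sum_{q,\bar{q}} \int_0^1 x\, f_q(x)\, dx \;+\; \int_0^1 x\, f_g(x)\, dx \;=\; 1 $$

so whatever fraction of the proton's momentum is assigned to the electrically neutral gluons is taken away from the charged quarks and antiquarks, and the predicted electron-scattering structure functions shrink accordingly, towards the measured values.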

However, their conviction of the quark-parton equivalence was repaid once neutrino data from the Gargamelle detector at CERN were published in 1974. Because neutrinos are electrically neutral, these results explored the "weak" quantum numbers of the partons. Previous, less efficient, neutrino detectors at CERN had shown that a scaling result held for neutrino-proton interactions. Gargamelle, a huge bubble chamber ten times larger than the previous detectors at CERN, produced a great deal of data, leading to a number of important results, the most relevant here being that:

1) the ratio of W2 for neutrino-proton and electron-proton scattering was 3.6, just what the quark-parton (fractional charge) model predicted and far from the integer-charged parton prediction (see the arithmetic sketched after this list);

2) another structure function had a value of approximately three, corresponding to the presence of three valence quarks, as required for the quarks and partons to be equivalent.
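The 3.6 is essentially the inverse of the mean-square quark charge. For a target averaged over protons and neutrons, and counting only up and down quarks (a simplification I am assuming here), the quark-parton model gives

$$ \frac{F_2^{\nu N}}{F_2^{e N}} \approx \frac{1}{\tfrac{1}{2}\left(e_u^2 + e_d^2\right)} = \frac{1}{\tfrac{1}{2}\left(\tfrac{4}{9} + \tfrac{1}{9}\right)} = \frac{18}{5} = 3.6 $$

because the neutrino couples to the quarks with full weak strength while the electron weights each quark species by the square of its electric charge; integer-charged partons would give a very different ratio.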

There are aspects of both Pickering and Galison in this. Pickering would point to the continual elaborations upon the parton model to make it fit experiments as accommodations made to the resistances (the failed estimates). Galison would instead point to the desire to make the quark numbers work in this context as a constraint the physicists had imposed upon themselves by committing to the assumption that the partons and quarks were both real and also the same thing.

Quantum Chromodynamics

Quantum Chromodynamics was half of the resurgence of quantum field theory that happened in the early 1970s. The other half was the Electroweak theory - by applying the principle of gauge invariance (invariance of the theory under local transformations of a specific symmetry group) to a field that combined the electromagnetic and weak interactions, Weinberg, Salam and Glashow had found a unified theory of the weak interaction. When 't Hooft demonstrated that this theory was renormalizable, one of the two flaws in quantum field theory vanished.[27]
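As a textbook illustration of what "local gauge invariance" means, consider the simplest (abelian) case rather than the electroweak group itself:

$$ \psi(x) \to e^{i\alpha(x)}\,\psi(x), \qquad A_\mu(x) \to A_\mu(x) - \frac{1}{e}\,\partial_\mu \alpha(x), \qquad D_\mu = \partial_\mu + i e A_\mu $$

demanding that the physics be unchanged under such position-dependent phase rotations forces the introduction of the gauge field, whose transformation makes the covariant derivative of the matter field transform just like the field itself; the electroweak and colour theories apply the same logic to larger symmetry groups.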

In this vein, theorists also hoped to develop a renormalizable gauge theory of the quark interaction,[28] with the quarks as the fundamental fields and the forces mediated by the gluons (now gauge vector fields) of the parton model. The strong force needed to act on a different set of quantum numbers from the weak force - fortunately, two equivalent formulations had been created that gave the quarks exactly the required set of extra quantum numbers: parastatistics and colour.

These two formulations appeared in the mid 1960s as responses to an apparent crisis in theory and phenomenology caused by the constituent quark model - although the quarks were spin-1/2 and thus presumably fermionic by the spin-statistics theorem, they appeared to have symmetric wavefunctions, in violation of the Pauli Exclusion Principle[29] (the Δ++, built from three up quarks with parallel spins in a symmetric ground state, is the classic case). Either the Pauli Exclusion Principle was wrong, or the spin-statistics theorem was wrong, or some other explanation was required.

In 1964, O. W. Greenberg explained the quark statistics by assigning the quarks to a class of particles whose theory he had developed - parafermions (specifically of order three), which possessed overall antisymmetry but were symmetric under the interchange of two paraquarks. Greenberg himself is a classic example of opportunism in context - his early career had revolved around the para-particles (fields with symmetries more complex than those of fermions and bosons).

When Greenberg moved to the Institute for Advanced Study to work on the SU(6) theories of quarks, he quickly encountered the statistics problem and realized that a parafermion theory would resolve the issue. His model essentially assigned each quark an additional three-valued quantum number.[30]

In 1965, Han and Nambu introduced colours - a three-triplet model in which each quark came in three types, called (because of how they combined) red, blue and green (or sometimes yellow). This method also solved the statistics problem, in a way that was (at the time) observationally the same as Greenberg's.

Nambu originally pointed out that, in his coloured quark model, the Eightfold Way could be derived from quarks with integral charge, which was something he desired: like many physicists, he held a belief at some level in integer charges, a constraint he had trouble overcoming. However, the stronger constraint that theory could not diverge drastically from phenomenology and experiment won out: when the evidence for fractionally charged quarks became strong, the theories and models of integer-charged quarks disappeared.

Pickering, once again, argues that the unfamiliarity of the parastatistics theories led physicists to adopt the colour formulation, which was structured in a more familiar fashion.[31] In this case, French points out that Greenberg's papers on parastatistics had been widely circulated and seen by such luminaries as Glashow and Nambu as early as 1962, refuting the idea that the parastatistics formulation was intrinsically unfamiliar to the scientists best positioned to take it up and develop it further.[32] Instead, French argues that the central reason, in the absence of experimental evidence in either direction, that the paraparticle model was neglected in favour of colour was one of "fertility... this model [colour] possessed a greater capacity for generating new lines of development than paraparticle theory, which was, heuristically, comparatively sterile."

Several other factors also needed to fall into place for a quantum field theory of the Strong interaction to appear, most notably a solution to the large coupling constant problem. This appeared in a seemingly surprising quarter - Solid State physics. Solid state physics, the study of collections of atoms (typically in crystal lattice structures), contained a number of problems that could be written down in a structure similar to field theory, but with the same problem as the strong force - large coupling constants. Between 1969 and 1971, Wilson, at Cornell, developed the renormalization group method of analysing these problems, which allowed one to deal with large coupling constants.

Applying this and other discoveries about field theory, in 1973 Gell-Mann, Fritzsch and Leutwyler published the renormalizable gauge theory of colour and the strong interaction - which quickly became known as Quantum Chromodynamics or QCD.

Thus, after twenty years, quantum field theory returned. I cannot argue for its return in Pickering's style, but perhaps a convincing explanation is available from Galison-style constraints: field theorists were constrained, in a positive way, by their definite identification of renormalizable field theories with reality - this narrowing of the search led them, slowly but inexorably, to the Electroweak theory and to Chromodynamics.

French's notion of uberty is also relevant here: once the Electroweak theory was shown to be renormalizable by 't Hooft, the "proven fertility" of quantum field theory was markedly increased, attracting more attention to this neglected field. Based on these, and other, results, Feynman remarked "[as for] the idea that hadrons consist of quarks…let us assume it is true".33 Here is our paradigm shift (amongst the theorists and the phenomenologists). As of the end of 1973, Feynman had decreed that physics now believed the partons and the quarks were the same thing. More importantly, they were real things that were the fundamental building blocks of the hadrons - and the final resolution of the crisis that had begun emerging in the 1950s.

Conclusion

As we have seen, Pickering's "opportunism in context" theory cannot be applied with ease to every development in physics. Some events are easily described by Pickering, like the selection of the parton model over Bjorken's current-algebra-based theory of scaling as the useful tool for further elaboration. However, casting certain events, like the flight from and subsequent return to Quantum Field Theory, in this style at all is beyond the skills of the author. Other events, like the selection of Nambu's colour theory over Greenberg's paraparticles as the description of quarks to use in the mid-to-late 1960s, can be described with some success both by Pickering and by rival theories like French's.


Galison's constraint framework forms a useful structure for analysing the progress of science, albeit with some modification to allow for resistances (which, as Pickering stresses, are different from constraints) and other factors (uberty and opportunism in context) that describe the evolution of science within the constraints surrounding it.

Regarding Galison's intercalated structure of "revolutions", the Kuhnian picture works if you look from far enough away, where the decade of flux between different paradigms in the various sub-fields is ignored. It is clear from the analysis that crises and resolutions occurred at different times in the different fields, as Galison suggests, with obvious interaction between them. In the case of high-energy physics, however, I would suggest that Galison's model is improved by separating phenomenology from the theory and experiment fields, due to its very separate evolution. The attached timeline gives a summary of how this intercalated structure appears, with the various transitions in the subfields and their interactions identified.

As can be seen, theory, experiment, instrumentation and phenomenology all develop separately, but with a great deal of interaction. For example, the development of Quantum Chromodynamics hinged upon previous theoretical results like the gauge theories (the electroweak theory) and the renormalization group, phenomenological results like the constituent quark model, and experimental results like the ADONE "colour confirmation". In turn, all of these developments rested upon earlier developments in the four fields.

Some interesting observations can be made about the relative importance of theory and phenomenology during the "particle crisis" - as the crisis progressed, theory became less and less relevant to what was going on: everywhere, given the choice between theoretical models and phenomenological ones, everyone chose to pursue the phenomenological ones. However, in order to resolve the crisis in a way fitting the meta-constraints physics subjects itself to, the explanation had to reside in theory rather than phenomenology, so the resurgence of theory, once phenomenology had found results that could help drive theory in the correct direction, is perhaps to be expected. However, this is a single (albeit compound) crisis/paradigm-shift event - in order to make sweeping statements, a larger number of crises should be examined to see whether a similar pattern emerges.

Footnotes

1 French, ‘The Esperable Uberty of Quantum Chromodynamics’ Stud. Hist. Phil. Mod. Phys. V26 p87-88

2 Andrew Pickering, Constructing Quarks: A Sociological History of Particle Physics. (Chicago: University of Chicago Press, 1984). p5-10

3 Kuhn, The Structure of Scientific Revolutions (Chicago: University of Chicago Press, 1962) p16

4 Kuhn, The Structure... p 7

5 Galison, "Context and Constraints" in Buchwald, ed. Scientific Practice, Ch 2

6 Galison, "Context and..." p14

7 Galison, Physics 121 Lectures Harvard, Cambridge 4/22/04

8 Pickering, "Beyond Constraint" in Buchwald, ed. Scientific Practice, Ch 3

9 Pickering, Constructing Quarks... p 9-10

10 French, "The Esperable...' p87-105

11 French, "The Esperable...' p92

12 French, "The Esperable...' p87-88

13 Paul Forman, "Weimar Culture, Causality, and Quantum Theory, 1918-1927: Adaptation of German Physicists and Mathematicians to a Hostile Intellectual Environment," in C. Chant and J. Fauvel (eds.), Darwin to Einstein: Historical Studies on Science and Belief, p267-302

14 Pickering, Constructing Quarks... p47

15 Pickering, Constructing Quarks... p73-75

16 Gell-Mann, "A Schematic Model of Baryons and Mesons", Physics Letters 8 p214-215

17 Zweig, "An SU(3) Model for Strong Interaction Symmetry and Its Breaking", CERN Preprint 8182/TH401 (17/1/64)

18 Pickering, Constructing Quarks... p86

19 Zweig, 1981, "Origins of the Quark Model", p458, in Proceedings of the 4th International Conference on Baryon Resonances

20 Pickering, Constructing Quarks... p107-110

21 Pickering, Constructing Quarks... p91

22 Pickering, Constructing Quarks...p89

23 Zweig, "An SU(3)..."

24 Pickering, Constructing Quarks... p138

25 Gilman, "Photoproduction and Electroproduction". Phys. Rept. 4, 95-151

26 Pickering, Constructing Quarks... p143

27 Historical Facts relating to Electroweak interaction from: Andrew Pickering, Constructing Quarks... Ch 6

28 Historical facts relating to Quantum Chromodynamics from Andrew Pickering, Constructing Quarks... Ch 7, except where otherwise noted.

29 French, 'The Esperable...' p88

30 French, 'The Esperable...' p91

31 Pickering, Constructing Quarks... p216-220

32 French, 'The Esperable...' p93-94

33 Feynman, 'Structure of the Proton'. Address given at Dansk Ingeniørforening, Copenhagen, Denmark. 10/8/1973. Reprinted in Science, 183, p608
