Harry Law
Announcing the launch of Learning From Examples.
Nidia Dias, Visualising AI by DeepMind

A very short update to direct you all to my new home at learningfromexamples.substack.com. All of my posts will be housed on Learning From Examples for the foreseeable future, where I aim to write more regularly on AI history, philosophy, governance, policy & more.

I started the project because there's a (quite natural) tendency for lots of people hearing about, working on, or interested in AI to think 'nothing like this has ever been done before!', which is a response that I sympathise with given the impressive capabilities of today’s AI systems. Often, though, we have been here before (or at least somewhere similar).*

Machine learning is the dominant paradigm underpinning modern AI. It can best be summed up in three words: Learning From Examples. That very simple idea, looking at the past to predict the future, is the motivation behind this project. I’m interested in thinking about how best to maximise the potential of AI whilst minimising its risk, focusing on questions related to policy, governance, ethics, history, and more.

*Clearly, I don’t mean to say that there have been technologies with identical or even particularly similar social, economic or political loci (especially with respect to the speculated forms that AI might take in the future). What I mean to say is that for anyone interested in ethics and governance issues, our past is replete with lessons to be learned.
This post is the second in a two-part series looking at the history of NIST. It focuses on the period from 1950 to the present and the incarnation of NIST as we know it.
In the last post, we left the National Bureau of Standards as the 1940s were drawing to a close. The Bureau, which would eventually become NIST, had pushed forward the boundaries of science and driven a programme of standardisation across American industry. It had taken on railways and radios, buildings and beacons for aeroplane landing systems. The agency had weathered recessions and wars, but now faced a different sort of test.

First, in 1950, a new Organic Act (the legislation that determined the Bureau’s role and focus) was passed. While it did not meaningfully change the Bureau’s operations or philosophy, it is worth reflecting on for what it did not include. The new act rejected a 1945 request from Bureau Director Lyman Briggs to widen the organisation’s mandate to include “the prosecution of basic research in physics, chemistry and engineering to promote the development of science, industry and commerce.” While this approach might seem in line with the organisation’s remit at first glance, the core difference is that this ‘basic research’ function would not be directly tied to any overarching goal of standardisation or testing.

The proposed directorate did not manifest, and the goals of the organisation remained unchanged. That sameness, however, had seen the Bureau drift closer to the U.S. military, with former NIST researcher Elio Passaglia estimating that in 1953 “85 percent of the work was for other agencies, most of it for the military.” A government report published in the same year by a committee headed by Mervin Kelly of Bell Laboratories argued that the Bureau’s basic standards mission had shrunk and that it was in danger of becoming an arm of the military. Following the publication of the report, over 2,000 staff (a significant portion of its employees) were transferred from the agency to the U.S. armed forces.

The second challenge was how best to support, and respond to, the rise of computing technologies. Computerisation was the watchword of the 1950s, as machines such as the University of Pennsylvania’s ENIAC, completed in 1945, proved the concept underpinning general-purpose digital computers. In the early years of the decade, the Bureau built the Standards Eastern Automatic Computer (SEAC), the first internally programmed digital computer. Weighing over 3,000 lbs (about 1,360 kg), its design was based on EDVAC, a sibling of ENIAC also developed at the University of Pennsylvania. With a speed of 1 megahertz and 6,000 bytes of storage, SEAC counted the 1957 production of the world’s first digital image amongst its achievements. Using the computer, the agency partnered with the Census Bureau to develop the Film Optical Sensing Device for Input to Computers (FOSDIC), a machine capable of reading 10 million census answers per hour. Amidst the success of SEAC, the Bureau’s scientists debunked the myth that commercial additives improve battery performance, a controversial finding that prompted high-level investigations but ultimately vindicated the agency’s testing methods.

In the 1960s, as the organisation moved from Washington D.C. to its new home in Gaithersburg, the Bureau published the Handbook of Mathematical Functions. The book, which by the 1990s was estimated to be cited once every 1.5 hours of each working day, contained over 1,000 pages of formulas, graphs, and mathematical tables written partly in response to the mass adoption of computing technologies. Away from computer science, other notable projects in the decade included an effort to improve the reliability of cholesterol tests (understood to be off by as much as 23 per cent in 1967), the construction of the NIST Center for Neutron Research, and efforts to study the elusive particles known as free radicals. It was, by the Bureau’s standards, a quiet decade.

Against the backdrop of the economic turmoil of the 1970s, which once again put pressure on the agency’s composition and budget, the Bureau turned with renewed focus to its core mission: to devise and revise measurement standards at the forefront of science and technology. It began the decade with the construction of the ‘topografiner’, an instrument for measuring surface microtopography. The tool, which could map the microscopic hills, valleys, and flat areas on an object’s surface, enabled researchers to better understand how these characteristics determine properties such as how a material interacts with light, how it feels to the touch, or how well it bonds with other materials. In keeping with a focus on building tools at the cutting edge of measurement science, the agency demonstrated a new thermometer based on the idea that the ‘static noise’ created by a resistor changes in tandem with the temperature of a particular substance. Completing the trifecta of new measurement processes was a 1972 effort that measured the speed of light at 299,792,456.2 ± 1.1 metres per second, a value one hundred times more accurate than the previous best measurement.
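The physics behind that noise thermometer is worth a brief aside. The relationship it exploits is the standard Johnson–Nyquist relation, which ties a resistor’s thermal noise voltage to its temperature. Below is a minimal sketch in Python; the resistor value and bandwidth are illustrative assumptions on my part, not NIST’s actual experimental parameters.

```python
import math

# Minimal sketch of the Johnson-Nyquist relation behind noise thermometry:
# the mean-square voltage noise across a resistor grows linearly with its
# temperature, so measuring the noise yields the temperature.
#   V_rms = sqrt(4 * k_B * T * R * bandwidth)
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI redefinition)

def noise_voltage(temperature_k: float, resistance_ohm: float, bandwidth_hz: float) -> float:
    """RMS thermal noise voltage of a resistor, in volts."""
    return math.sqrt(4 * k_B * temperature_k * resistance_ohm * bandwidth_hz)

# Illustrative values: a 10 kilo-ohm resistor at room temperature,
# measured over a 1 MHz bandwidth, produces roughly 13 microvolts of noise.
print(f"{noise_voltage(300, 10e3, 1e6):.2e} V")  # ~1.3e-05 V
```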

NIST and NASA collaborated to manufacture standard reference material (SRM) 1960, also known as “space beads.” Source: NIST

This dynamic, of science through measurement and measurement through science, continued as the agency entered the 1980s. Researchers made billions of tiny polystyrene spheres in the low-gravity environment of the Challenger space shuttle during its first flight in 1983. The spheres, known colloquially as ‘space beads’, were made available as a standard reference material for calibrating instruments used by medical, environmental, and electronics researchers. As NIST puts it: “The perfectly spherical, stable beads made for more consistent measurements of small particles like those found in medicines, cosmetics, food products, paints, cements, and pollutants.” A year later, in 1984, the organisation created a new standard for measuring electrical voltage that was more accurate, stable, and easier to use than its predecessors. The new measurement was based on the Josephson effect, whereby a voltage is created when microwave radiation is applied to two superconducting materials (materials that conduct electricity without resistance) separated by a thin insulating layer. Because the resulting voltage depends only on the frequency of the microwave radiation and well-known atomic constants, scientists who knew which frequency was being applied could calculate the voltage reliably.
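To make that relationship concrete, here is a minimal sketch of the Josephson voltage relation in Python. The 70 GHz drive frequency is an illustrative assumption rather than a documented NIST operating point.

```python
# Minimal sketch of the Josephson voltage relation: a junction driven by
# microwaves at frequency f produces discrete voltage steps V = n * f / K_J,
# where n is the step number and K_J = 2e/h is the Josephson constant.
e = 1.602176634e-19  # elementary charge in coulombs (exact in the 2019 SI)
h = 6.62607015e-34   # Planck constant in joule-seconds (exact in the 2019 SI)
K_J = 2 * e / h      # Josephson constant, roughly 483,597.8 GHz per volt

def josephson_voltage(frequency_hz: float, step: int = 1) -> float:
    """Voltage of the nth step for a given microwave drive frequency."""
    return step * frequency_hz / K_J

# A single junction driven at 70 GHz (an illustrative value) yields ~145
# microvolts, which is why practical standards chain many junctions in series.
print(f"{josephson_voltage(70e9):.3e} V")
```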

Enter NIST: 1988-present

No longer the National Bureau of Standards, the organisation became the National Institute of Standards and Technology in 1988. The federal government mandated a name change as part of a broader expansion in the organisation's capabilities in response to a need to dramatically enhance the competitiveness of U.S. industry. The move followed a period of offshoring in which, for example, technical capabilities such as those needed to produce computer chips migrated to growing markets such as Japan. As the legislation explains:
It is the purpose of this chapter to rename the National Bureau of Standards as the National Institute of Standards and Technology and to modernize and restructure that agency to augment its unique ability to enhance the competitiveness of American industry while maintaining its traditional function as lead national laboratory for providing the measurements, calibrations, and quality assurance techniques which underpin United States commerce, technological progress, improved product reliability and manufacturing processes, and public safety.
The new focus on industrial competitiveness, however, did not come at the expense of its work with the organs of government. NIST’s forerunner had responded to a call from a government agency to standardise lightbulbs in the early years of the 20th century, but by the 1990s the complexity of requests had grown by orders of magnitude. Following a communication from the National Institute of Justice (the research division of the U.S. Department of Justice), NIST produced the world’s first DNA profiling standard in 1992. The standard was used to ensure the accuracy of restriction fragment length polymorphism analysis, which essentially involves creating a unique ‘barcode’ for each individual by extracting their DNA, cutting it into fragments using special proteins, and separating the fragments in a gel under an electric field. The request and response drew into focus the gulf between the old organisation and the new: the two were similar in process but divergent in practice. The old Bureau’s work had been mediated primarily by its role as a federal agency, while the new NIST’s was shaped by a constellation of social, technical, and material factors that determined what sort of work it could undertake and what sort of work it should undertake.

Yet for every new standard like DNA profiling there was a refinement or redevelopment of an existing one. In 1993, NIST researcher Judah Levine developed the NIST Internet Time Service, which allows anyone to set their computer clock to match Coordinated Universal Time. Levine said in 2022 that the service, which in 2016 was estimated to be responding to more than 16 billion requests a day, played an important role in ensuring that computers could agree about when interactions such as message receipts or stock trades took place. A common thread throughout NIST’s story, running from its inception to the present day, is one of reaction and response. NIST was not an island, and its priorities were shaped by the politics of the moment, by requests from other federal bodies and respected groups, and by the broader technological, social and economic forces of the day.
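For a sense of how simple the service is to use, here is a minimal sketch that queries a NIST time server over the legacy DAYTIME protocol. time.nist.gov is a real NIST hostname, but the reply format shown in the comment is indicative only, and production systems should prefer NTP.

```python
import socket

# Minimal sketch: ask a NIST time server for the current time using the
# legacy DAYTIME protocol (TCP port 13). Production systems should use
# NTP (UDP port 123) instead; server availability may vary.
with socket.create_connection(("time.nist.gov", 13), timeout=10) as conn:
    reply = conn.recv(256).decode("ascii").strip()

# The reply resembles: "60310 24-01-01 12:34:56 00 0 0 123.5 UTC(NIST) *"
print(reply)
```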

In 1994, all of these factors were in play as the organisation responded to calls for better benchmarking from the machine learning community in the wake of connectionism’s revival. (Connectionism, a form of parallel processing with roots in pattern recognition, would eventually become known as deep learning when the practice of building large, multilayered networks proved to be highly effective.) This was the context in which the organisation created two databases of handwritten digits: NIST Special Database 3 and NIST Test Data 1. The former consisted of characters written by workers at the United States Census Bureau, whereas the latter was made up of characters written by high school students. By hosting a competition in which participants drawn from academia and industry could test their ability to classify the test set, NIST hoped to encourage the development of algorithms capable of generalising across unseen (and sufficiently different) data. In practice, though, because the two sets had dissimilar distributions, many of the algorithms performed poorly despite achieving error rates of less than 1 per cent on validation sets drawn from the training data. In response, at the conclusion of the competition, researchers from Bell Labs mixed the two databases to produce new training and test sets. The result was the MNIST database (where the ‘M’ stands for ‘modified’), one of the most recognisable and well-known machine learning benchmarks.

Sample images from the test set. Source: MNIST database
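MNIST remains so accessible that the whole benchmark loop fits in a few lines. Here is a minimal sketch using scikit-learn (my choice of library, not anything mandated by the benchmark), training a simple linear classifier and reporting its error rate on the conventional held-out test split.

```python
# Minimal sketch of the MNIST benchmark loop: train on one split, report
# the error rate on the held-out test split. fetch_openml downloads the
# 70,000-image 'mnist_784' dataset on first use.
from sklearn.datasets import fetch_openml
from sklearn.linear_model import LogisticRegression

X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X / 255.0  # scale pixel intensities to [0, 1]

# MNIST's conventional split: 60,000 training images, then 10,000 test images.
X_train, X_test, y_train, y_test = X[:60000], X[60000:], y[:60000], y[60000:]

clf = LogisticRegression(max_iter=200)  # a simple linear baseline
clf.fit(X_train, y_train)
print(f"test error: {1 - clf.score(X_test, y_test):.2%}")  # roughly 8%
```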

The collision of the science of measurement and the measurement of science has been a defining feature of the agency’s history. Two years later, in 1996, NIST’s Kent Irwin invented a ‘transition-edge’ sensor capable of measuring a single photon’s worth of energy. The highly sensitive sensors found use in the South Pole Telescope, which was designed for observations in the microwave, millimetre-wave, and submillimetre-wave regions of the electromagnetic spectrum. Finally, at the edge of the new millennium, the atomic clock NIST-F1 took over as the nation’s primary time and frequency standard. Over the course of its life, the clock’s accuracy improved six-fold, to the point at which it would not gain or lose a second in more than 120 million years.
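That headline figure is easier to compare against other clocks when converted into a fractional frequency uncertainty, as the quick back-of-envelope calculation below shows.

```python
# Back-of-envelope conversion of "one second in 120 million years" into a
# fractional frequency uncertainty.
seconds_per_year = 365.25 * 24 * 3600        # ~3.156e7 seconds
horizon_s = 120e6 * seconds_per_year         # ~3.79e15 seconds
print(f"fractional uncertainty ~ {1 / horizon_s:.1e}")  # ~2.6e-16
```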

This is where my short history of NIST ends. While some of the organisation’s most impressive achievements––from defining new standards for post-quantum cryptography to contributing to the overhaul of the International System of Units––have taken place during its recent years, I am omitting them from my story and ending on a slightly different note. It’s hard to say when exactly NIST became the body that we know today. Over the course of its history it has existed as a function of the social, economic, and political currents that defined America throughout the 20th century and beyond. An organisation that built influential standards in response to both polite requests and wars, its past reminds us that technology is not deterministic and that scientific practice is culturally conditioned. NIST suffered cuts when the federal government tightened its belt and was brought back down to earth when the Kelly report concluded that it was trapped in the orbit of the U.S. military. Today, its extensive mesh of programmes focusing on the development and governance of AI (summarised in part one of this series) might at first appear very different to standardising lightbulb manufacturing or ensuring compliance on the railways.

But dig a little deeper and the motivation, if not the method, remains the same. NIST shaped the path of technical, scientific and industrial development over the course of its 120-year history. But it did not do so in a vacuum. Its tale reminds us that standards can provide rallying points to organise around, yardsticks to determine the rate of progress, and guidelines that encourage scientists to prioritise certain avenues of research over others. That was as true in 1901 for lightbulbs as it is in 2023 for AI.

The first in a two-part series looking at the history of NIST. It includes a survey of the organisation’s work today and its early years as the National Bureau of Standards from 1901 to 1950. Note: NIST was founded as the National Bureau of Standards in 1901 and was renamed the Bureau of Standards in 1903. In 1934, the word “national” was once again added to its name. This post refers to the organisation as the ‘Bureau’ throughout.

Rose Pilkington, Visualising AI by DeepMind

The National Institute of Standards and Technology (NIST) produces a dizzying array of datasets, software, measures, frameworks, and standards. Part of the United States Department of Commerce, the organisation’s self-described mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology. It is, for all intents and purposes, the embodiment of Galileo’s famous maxim: “Measure what is measurable, and make measurable what is not.”

NIST’s ‘standard reference materials’, reviewed on an annual basis and published in a catalogue of over 1,200 individual entries, are designed to act as physical benchmarks for industry, government, and academia. With detailed information about chemical composition and material properties such as size and weight, the reference materials are employed to calibrate instruments, verify test results, and develop new measurement methods. They are also used to connect U.S. materials to the International System of Units, the modern form of the metric system and the world’s most widely used system of measurement.

Standard reference materials are a varied bunch. Information about wild and aquacultured shrimp––including an analysis of genetic composition to determine their place of origin––enables the U.S. Food and Drug Administration and Customs and Border Protection to assess whether imported shrimp are authentic. Bullets and cartridge cases are produced to act as reference standards for crime laboratories, helping to verify the proper functioning of optical-imaging equipment and to facilitate laboratory accreditation. And a standard cigarette provides firms producing fire-resistant furniture with a benchmark against which to determine how successful they have been in building safe products. These and hundreds of others are grouped into categories encompassing food and agriculture, nanomaterials, clinical hygiene, and engineering materials, amongst several others.

Alongside the extensive catalogue of materials sits NIST’s trove of standard reference data, which generally takes the form of information about the properties of a substance or system, measurable characteristics of a physical artefact, performance characteristics of a system, or what NIST terms ‘digital data objects’. The generation of standard reference data can be traced back to 1968, when the U.S. Congress passed the Standard Reference Data Act, which stated that “reliable standardized scientific and technical reference data are of vital importance to the progress of the Nation's science and technology.”

Since then, NIST has developed and distributed standard reference data in chemistry, engineering, fluids and condensed phases, material sciences, mathematical and computer sciences, and physics. It produces a range of software, from programs designed to automate the counting of bacteria colonies to those designed to help manufacturers assess the readiness of budding smart factories. The organisation has a GitHub presence that includes datasets containing excerpts of demographic data from U.S. households, used to assess the performance of privacy-enhancing technologies, and a repository for the Face Recognition Vendor Test, an initiative that aims to improve the accuracy, speed, and resilience of facial recognition technologies.

Facial recognition is just one area of AI that NIST is sizing up. The organisation conducts research into the core building blocks of the technology: it creates new system architectures, studies chip design, and sketches different approaches to red-teaming powerful models. Its scientists and engineers use AI to improve the measurement process itself by, for example, developing deep learning methods to improve the fidelity of nanoscale microscopy techniques. NIST’s staff work on data characterisation, practices for the documentation of datasets, and the construction and maintenance of datasets that can be used to test or train AI systems.

The organisation has conducted hundreds of evaluations across thousands of AI systems. It notes that “while these activities typically have focused on measures of accuracy and robustness, other types of AI-related measurements and evaluations under investigation include bias, interpretability, and transparency.” In January 2023, after a period of consultation, NIST released its AI Risk Management Framework, which seeks to help developers, users, and evaluators of AI systems manage harms posed by AI at the individual, organisational, and societal level.

There are few other organisations, public sector or otherwise, concerned with both AI and aquacultured shrimp. Understanding how such an institution came to be––the drivers, contingencies, and dependencies behind NIST as we know it––is a useful exercise for those grappling with the governance of AI.

Early years: 1901-1950

When does the story of NIST begin? The U.S. National Archives’ account starts in 1830 with the establishment of the Office of Standard Weights and Measures, which was tasked with––perhaps unsurprisingly––overseeing the standardisation of weights and measures in the country. According to the historian Louis A. Fischer, the genesis of the Office of Standard Weights and Measures lay in a wider effort to minimise “large discrepancies among the weights and measures in use at the different ports.”

Elio Passaglia, a former NIST researcher and author of A Unique Institution: The National Bureau of Standards 1950-1969, made the case that we ought to start in 1900 with the establishment of the General Electric Research Laboratory. Passaglia proposes that the laboratory, which was the corporate innovation hub of industrial giant General Electric and the first industrial research facility in the United States, was “the catalyst that forced the formation of the Bureau [the precursor agency to NIST].” The founding of the General Electric Research Laboratory, he argues, marked the beginning of a ten-year period in which U.S. firms (including DuPont and Westinghouse) built dedicated physical science laboratories. Perhaps equally important, though, is that following the 1893 Columbian Exposition in Chicago, the U.S. had agreed on definitions for electrical units but had no way to measure them. In this version of events, it was downward pressure from lawmakers, who needed electrical units measured, and upward pressure from powerful firms that spurred the creation of a new agency.

The most commonly provided date, however, is 1901, when the federal government established the organisation that would eventually become NIST: the National Bureau of Standards. Superseding the Office of Standard Weights and Measures, the new Bureau was tasked with the custody of existing national standards, the comparison of U.S. standards with their international counterparts, the introduction of new standards, the testing of apparatus used to measure standards, and the determination of physical constants in materials deemed important for science and manufacturing.

Led by Samuel W. Stratton, the Bureau spent its first decade taking on the difficult challenge of making progress in each of these domains from what was effectively a standing start. In 1904, it began to test lightbulbs for agencies within the government to ensure their effectiveness and, in keeping with the focus on illumination, displayed the first “neon” signs lit by electrified gases at the St. Louis World's Fair in the same year. In 1905, the organisation convened the first National Conference on Weights and Measures to write model laws, distribute uniform standards, and provide training for inspectors. The first standard reference material was issued in 1906: a standardised sample of iron designed in response to a request from the American Foundrymen’s Association.

Seattle weights and measures inspectors with confiscated fraudulent measuring devices. Source: NIST

In the second decade of the 20th century, the organisation continued its push for standardisation, built new technologies, and cultivated ties with the U.S. military. In 1913, the Bureau designed unique railcars to test the large-capacity scales used on the railways. The programme found that 80 per cent of the rail scales were off by almost 4 per cent, twenty times the tolerable margin of error of 0.2 per cent. Two years later, in 1915, the organisation published the nation’s first model electrical safety code in response to a request for help from the coal mining industry six years earlier. It was during this period that the Bureau’s researchers first turned their attention to the military, developing the radio direction finder: an antenna that determined the direction of radio transmissions, deployed by the U.S. Navy to find the positions of enemy forces during the First World War.

In 1921, Herbert Hoover became Secretary of Commerce. In response to a short but intense recession between 1920 and 1921, Hoover thought that recovery could be achieved through “the elimination of waste and increasing the efficiency of the commercial and industrial systems.” As part of this process––which some accounts have connected to Hoover’s interest in conservationism––the Bureau was directed to intensify its programme of standard setting and simplification.

The drive for efficiency collided with the Bureau’s programme of scientific research when it launched its own radio station six months before Pittsburgh’s KDKA, the first American commercial radio station, came on the air on 2 November 1920. The station’s purpose was to study the technical problems of early radio; it was followed by a 1923 effort to broadcast standard frequencies to help stations avoid interfering with each other’s signals. A year later, the Bureau developed an ‘earth-current meter’ to measure the current leaking into the ground and corroding nearby pipes (a common problem caused by the laying of over 40,000 miles of streetcar tracks). A 1928 effort saw the organisation burn down two condemned brick buildings in Washington D.C. to aid in the design of uniform fire resistance standards for buildings.

As the U.S. economy reeled from the Great Depression of the 1930s, the Bureau’s staff dropped from around 1,100 people in 1931 to fewer than 700 by mid-decade; only by 1939 did it again approach the level of the 1920s. But despite pressure on its resources, the 1930s were a decade in which the organisation consolidated its reputation as a respected practitioner of science. In 1931, for example, the Bureau built a chamber that produced highly precise amounts of radiation to enable scientists to test exposure detectors and ensure their proper functioning. According to NIST, “these comparisons harmonized international measurements of radiation and were used when drafting an X-ray safety code.” The same year, it invented the ‘visual type’ beacon for an aeroplane landing system, which enabled pilots to locate and land on a runway in poor visibility, and began a programme of building atmospheric measurement instruments known as radiosondes to improve weather forecasting. The final major contribution of the decade came in 1936, when the Bureau sponsored an expedition to Kazakhstan to observe a solar eclipse. Using a Bureau-designed and constructed 14-foot camera with a 9-inch lens, the expedition is credited with taking the first natural colour photographs of a solar eclipse.

For the Bureau, the 1940s were a decade dominated by conflict. With the nation’s attention firmly fixed on the Second World War, the organisation helped to design, create and test the radio proximity fuze and the automated guided missile known as the Bat. Perhaps its most influential contribution, however, was the 1939 appointment of Bureau Director Lyman J. Briggs as the chair of the Advisory Committee on Uranium, which ultimately became the Manhattan Project when it was taken over by the Army Corps of Engineers in 1942. In Passaglia’s account, the Bureau did important work for the atomic bomb project, focused on the purification of graphite and uranium and the development of analytical methods. But Briggs’ conduct was criticised: the nuclear physicist Isidor Isaac Rabi is quoted by John Newhouse in War and Peace in the Nuclear Age as saying that Briggs “was out of his depth…and that held things up for a year.”

After the conflict’s conclusion, the organisation returned to its focus on scientific practice by building the world’s first atomic clock in 1949. The clock, based on a microwave absorption frequency of the ammonia molecule, wasn't precise enough to serve as a timekeeping standard (though it did validate the idea). Only in 1955 did Louis Essen at the U.K.'s National Physical Laboratory build the first accurate atomic clock, which measured time through changes of state in atoms of caesium. It was in many ways fitting. The speed of technological change would see the Bureau shift its focus towards computing during the middle years of the 20th century. And amidst major interventions from the federal government to consolidate and reorganise, the breakneck pace of digitisation in the decade that followed proved that time was not to be taken for granted.

This is the first half of a two-post series about the origins of NIST. Next week, I’ll look at the period from 1950 to the present and connect a historical perspective with today’s large models.
