Toxic Substances Regulation and the Structuration of Interdependent Policy Networks


Gabriel Lévesque, McGill University

An appreciable temporal gap often exists between the discovery of a given substance's toxicity and its regulation. Even when toxic substances are regulated, it is commonplace to realize ex post that initial precautions were insufficient and that workers and communities were still exposed to significant risks. The causal processes behind this protracted back-and-forth between expert knowledge and state regulation are the core puzzle of studies on toxic substance controversies. The flagship literature in this area focuses on the role of corporations in feeding controversies, manipulating knowledge production, and fostering public ignorance about the hazards of toxic products. This literature has been criticized for neglecting the role of states in prolonging these controversies. More recent contributions have partly addressed this gap by emphasizing interactions between corporations and policymaking bodies. In this paper, I build on these recent insights and ask two often-overlooked questions about toxic controversies: (1) How are toxic substance policy networks configured? and (2) How are these policy networks shaped by key regulatory changes? I develop an approach to regulatory networks that emphasizes growing interdependence between regulating bureaucracies and regulated industries.

I test this approach using longitudinal network data on the policy trajectories of silica and lead in the United States. I build network data frames from the raw text of the Toxic Docs database, which includes millions of pages of previously classified corporate documents. I rely on spaCy's named entity recognition algorithm to extract organizations from the textual data, which I use to create substance-specific data frames. I manually code a sector (e.g., a federal bureaucracy, an industrial association) for each organization. These sectors are then grouped into communities of sectors (e.g., all bureaucracies, all types of corporate actors). I use modularity to investigate the evolving structure of toxic substance policy networks. I then model the impact of key policy changes on those networks over time using interrupted time series analyses.

Modularity analyses show that corporations and enforcing bureaucracies become increasingly intermeshed throughout policy trajectories. Time series analyses show that major policy events, like the passage of the 1970 Occupational Safety and Health Act, increase the intermeshing between corporations and bureaucracies. These results support two novel arguments in the literature on regulatory politics. First, they suggest that the relationship between industries and enforcing bureaucracies is not primarily one of interference, but rather one of interdependence. Second, they suggest that policy events act primarily as causes, not consequences, of network structure. This paper leverages the case of toxic substances to contribute to the literature on the relationship between states, corporations, and other actors in regulatory practices. Methodologically, it circumvents a potential event-based bias in comparative historical policy research, i.e., a myopic focus on visible data points that emerge from events like media scandals, new policies and regulations, or scientific publications, by measuring underlying aspects of regulatory processes that are hard to capture otherwise. This allows us to refine theories of regulatory change and the regulatory welfare state.
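To make the pipeline concrete, the sketch below illustrates one way the extraction and modularity steps could be implemented. It assumes documents arrive as plain-text strings and that a tie is operationalized as within-document co-occurrence of two organizations; the function names, the co-occurrence rule, and the sector lookup entries are illustrative assumptions, not the paper's exact implementation.

# A minimal sketch, in Python, of the extraction-to-modularity pipeline.
# Assumptions (not from the paper): documents are plain-text strings, ties
# are within-document co-occurrences of organizations, and the sector
# lookup below shows only two illustrative hand-coded entries.
import itertools

import networkx as nx
import spacy

nlp = spacy.load("en_core_web_sm")  # any spaCy model with an NER component works

def extract_orgs(text):
    """Return the set of ORG entities spaCy recognizes in one document."""
    return {ent.text.strip() for ent in nlp(text).ents if ent.label_ == "ORG"}

def build_network(documents):
    """Weight an edge by how many documents mention both organizations."""
    g = nx.Graph()
    for text in documents:
        for a, b in itertools.combinations(sorted(extract_orgs(text)), 2):
            if g.has_edge(a, b):
                g[a][b]["weight"] += 1
            else:
                g.add_edge(a, b, weight=1)
    return g

# Hand-coded organization-to-sector map (illustrative entries only).
ORG_TO_SECTOR = {
    "Occupational Safety and Health Administration": "bureaucracy",
    "Lead Industries Association": "industry_association",
}

def sector_modularity(g):
    """Modularity of the partition induced by sector communities; values
    near zero indicate sectors are intermeshed rather than segregated."""
    communities = {}
    for node in g:
        communities.setdefault(ORG_TO_SECTOR.get(node, "other"), set()).add(node)
    return nx.algorithms.community.modularity(g, communities.values(), weight="weight")

Applying sector_modularity to yearly slices of the corpus would yield the kind of time series the interrupted time series analyses operate on.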
Foremost, this paper reconceptualizes the role of time and policies in regulatory processes by emphasizing stable patterns over disruptive ones in the structuration of policy networks.
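The interrupted time series design invoked above can be parameterized as a standard segmented regression. The sketch below assumes a yearly modularity series (e.g., from sector_modularity applied to yearly document slices) and a single interruption year; the variable names and the HAC lag choice are illustrative assumptions, not the paper's exact specification.

# A minimal sketch of the interrupted time series model, assuming a yearly
# modularity series and one interruption year (e.g., 1970 for the OSH Act).
# The segmented-regression form estimates a pre-trend, a level shift at the
# interruption, and a post-interruption slope change.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_its(series: pd.Series, interruption: int):
    """series: modularity values indexed by year."""
    years = series.index.to_numpy()
    df = pd.DataFrame({
        "y": series.to_numpy(),
        "t": years - years.min(),                     # linear pre-trend
        "post": (years >= interruption).astype(int),  # level shift
        "t_post": np.where(years >= interruption, years - interruption, 0),  # slope change
    })
    # Newey-West (HAC) standard errors guard against serial correlation.
    return smf.ols("y ~ t + post + t_post", data=df).fit(cov_type="HAC", cov_kwds={"maxlags": 2})

The coefficients on post and t_post then capture whether an event like the OSH Act shifts the level or the trajectory of sector intermeshing.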
