Modelling Gene Regulation: (De)compositional and Template-based Strategies

Although the interdisciplinary nature of contemporary biological sciences has been addressed by philosophers, historians, and sociologists of science, the different ways in which engineering concepts and methods have been applied in biology have been somewhat neglected. We examine – using the mechanistic philosophy of science as an analytic springboard – the transfer of network methods from engineering to biology through the cases of two biology laboratories operating at the California Institute of Technology. The two laboratories study gene regulatory networks, but in remarkably different ways. The research strategy of the Davidson lab fits squarely into the traditional mechanist philosophy in its aim to decompose and reconstruct, in detail, gene regulatory networks of a chosen model organism. In contrast, the Elowitz lab constructs minimal models that do not attempt to represent any particular naturally evolved genetic circuits. Instead, it studies the principles of gene regulation through a template-based approach that is applicable to any kind of network, whether biological or not. We call for the mechanists to consider whether the latter approach can be accommodated by the mechanistic approach, and what kinds of modifications it would imply for the mechanistic paradigm of explanation, if it were to address modelling more generally.


Introduction
"The gestalt switch to information thinking in biology with all its paradoxes and aporias," writes Lily E. Kay, "was even more fundamental than the paradigm shift from proteins to DNA" (Kay 2000, p xv). The idea of a gestalt switch suggests that there is a grand narrative to be told of how the engineering sciences, especially in their informational mode, transformed the biological sciences – something that has not gone unnoticed by historians and philosophers of science. While the earlier discussion within the history and philosophy of science addressed the theories, models and concepts transferred, especially from engineering to biology, through analogical reasoning (e.g. Fox Keller 2000, 2002; Kay 1997, 2000; Loettgers 2007), the more recent discussion has also addressed the interdisciplinary practices of contemporary biology. Philosophers have studied, for example, the organization of research work and professional identities of researchers within multidisciplinary biology laboratories (e.g. MacLeod & Nersessian 2014; Osbeck & Nersessian 2017), as well as various kinds of integration taking place within molecular systems biology (O'Malley & Soyer 2012). We, however, call attention to the diversity of ways in which methods and concepts are transferred from one discipline to another. For instance, philosophers, historians and sociologists alike have tended to be critical towards engineering approaches within biology, without paying much attention to the heterogeneity of these approaches (e.g. Kay 2000; Schyfter et al. 2013). [This is the author's final version of an article accepted for publication in Studies in History and Philosophy of Science. The published version can be found under https://doi.org/10.1016/j.shpsa.2017.11.002]
The research on gene regulation provides a good case in which to study such heterogeneity of engineering approaches in biology, for both scientific and philosophical reasons. While scientific practice manifests strikingly different ways of modelling the structure and dynamics of gene regulatory networks, network models have also attracted major interest in the recent mechanistic philosophy of science.
What is especially interesting about this discussion is how quickly and almost inadvertently the mechanists' focus has shifted from detailed mechanistic descriptions to abstract network modelling (e.g. Bechtel & Abrahamsen 2010; Bechtel 2011; Levy & Bechtel 2013; Craver 2016). Most discussants have argued, in one way or another, that the mechanistic framework can also accommodate abstract network modelling. In other words, they propose that for the most part it could be combined with the decompositional approach, the two then supplying, ideally, different moments of the same research strategy (Green et al. 2017).
We argue that such an ecumenical approach, subsuming both (de)compositional and abstract minimal modelling strategies under the umbrella of the mechanist approach, is unlikely to succeed because of the fundamental differences between them. One can of course claim, as the mechanists have done, that the two approaches can complement each other in scientific practice – triangulation is indeed a common practice in science (ibid.). But this does not yet mean that the mechanistic approach provides adequate analytic resources to address models studying network dynamics. Another clear problem stems from the focus of the mechanistic approach on explaining the actual causal mechanisms operative in the world (Koskinen 2017). The traditional mechanistic agenda that relies on decomposition captures very well some strategies for modelling gene regulatory networks, but we challenge the reading of abstract network models in the same light. Rather than results of omission and recomposition, as suggested by Bechtel and his co-workers (e.g. Bechtel & Abrahamsen 2010; Bechtel 2011; Levy & Bechtel 2013), abstract network models are typically template-based (Humphreys 2004; Knuuttila & Loettgers 2016), resulting from the application of formal templates and methods of complex systems theory and graph theory to genetic networks. These methods are subject-independent and applicable across different disciplines.
Because of this, abstract network models do not usually aim to represent any naturally evolved networks; in the study of gene regulation their goal is more general, namely to examine various kinds of 'design principles' of biological organization.
We study these two strategies for modelling genetic networks – the (de)compositional and the template-based – through the research undertaken in the Davidson and Elowitz labs. The two biological laboratories are situated less than 300 meters away from each other on the campus of the California Institute of Technology.
Both laboratories study gene regulation, and their research programmes have emerged in dialogue with theoretical attempts to understand gene regulation through engineering principles – yet the theoretical and empirical agendas of the two laboratories are strikingly different. The research strategy of the Davidson lab provides an excellent example of an engineering-inspired approach to biology that proceeds in line with the tenets of mechanistic philosophy: experimental decomposition of developing organisms into genes and proteins, and combination of them into detailed wiring diagrams at various levels of organization. Such a mechanistic (de)compositional approach stands in stark contrast to the Elowitz lab's minimal modelling approach, which focuses on studying the basic organizational principles of 'genetic circuits' (i.e. gene regulatory networks) by making use of model templates from engineering and physics.1 In what follows, we will first briefly review the mechanistic discussion of modelling and network science (section 2). Then we will describe the two laboratories and examine their modelling strategies (sections 3.1 and 3.2). In section 4 we address the differences between the two labs in terms of their respective modelling strategies, which can be labelled as a (de)compositional mechanist approach (Davidson lab) and as a template-based minimal approach (Elowitz lab). We study these two strategies in relation to questions concerning modelling dynamics, the relationship between experimentation and modelling, and the generalizability of model results.2

1 While the Elowitz lab focuses on genetic circuits of prokaryotic, single-celled organisms, and the Davidson lab concentrates on eukaryotic, multicellular systems, the referent in both cases is largely the same: sub-cellular processes of gene regulation. Both laboratories approach gene regulation in terms of networks of interacting genes and proteins.

A mechanistic prelude to modelling networks
The mechanistic discussion on how abstract network models explain is a follow-up to the mechanists' attempt to make room for modelling within a framework that relies on detailed description of actual mechanisms. The question has been how detailed and complete models of mechanisms should be in order to count as how-actually explanations. The mechanistic philosophy of science was originally interested in giving an account of scientific explanation, especially in the biological sciences and neurosciences, in terms of entities and their activities organized into a productive whole, that is, a mechanism (e.g. Machamer, Darden & Craver 2000). While the terminology differed slightly among the mechanists – Bechtel and Abrahamsen, for example, talked instead about component parts, component operations and their organization (Bechtel & Abrahamsen 2005) – the basic idea was to trace how experimentally isolated components and their interactions perform a function or produce some behaviour.
With regard to modelling, the mechanistic conception of explanation is ontologically committed to causal mechanisms: it is the actual causal mechanisms, or their representations, that are taken to explain. Moreover, up until the recent discussion on network models, it has been supposed that the causal mechanisms delivering an explanation should be described as completely as possible, including all the relevant components and activities. Yet, such an approach has difficulties in accommodating the often abstract nature of modelling, as noticed already by Glennan (2005), who suggested that due to its realist tendencies the mechanist philosophy "has focused on the properties of mechanisms themselves and not said much about … their models or theoretical representations" (p 443).

2 Our case studies are based on the published research of the two laboratories and other written and online documents related to their work, as well as on interviews and informal discussions with the PIs of the laboratories and with synthetic biologists from other labs. Our analysis of the research carried out at the Davidson lab is largely historical, as it covers a 50-year period. Eric Davidson died in 2015, and since then the Davidson lab has been reorganized.
Since the explanatory ideal of mechanistic philosophy was that of specificity and completeness, models, in their abstract and idealized nature, were considered mechanism schemas or sketches at best. Craver, for example, granted that there could be explanatory models; yet, in order to explain, they should "account for all aspects of the phenomenon by describing how the component entities and activities are organized such that the phenomenon occurs" (Craver 2006, p 374; see also Darden 2007). What this solution tends to mask is the question of how mechanisms, whether schematic or more detailed, are supposed to be represented. As the mechanistic philosophy concentrated on the explanations provided by real mechanisms in the world, the intricacies of representation did not usually enter the discussion, except when it came to models and how-possibly mechanisms. The question of representation is pivotal, however, for understanding the hallmarks of contemporary modelling practices: knowledge transfer and interdisciplinary exchange. Models and modelling methods are frequently the primary tools whereby interdisciplinary transfer of concepts, formal tools and methods takes place – as we will show in more detail in our discussion of the Davidson and Elowitz labs.
The recent mechanistic attempt to make room for model-based representation is most evident in the work of William Bechtel and his co-authors. They have invoked two ways in which the mechanistic decompositional approach could be aligned with abstract network modelling: the first makes use of the notion of recomposition, and the second of the notion of omission. Bechtel and Abrahamsen introduced the notion of recomposition in an attempt to make room for dynamic mechanistic explanations (e.g. Bechtel 2011; Bechtel & Abrahamsen 2010). Such explanations are provided by models that study complex and cyclic phenomena resulting from various kinds of feedback loops. For the analysis of complex phenomena, the reliance of the mechanistic paradigm on "sequential execution of qualitatively characterized operations" proves insufficient (Bechtel 2011, p 533). This inability of the mechanist programme to address dynamics is due to its attention "to the ways of decomposing a mechanism into component parts and operations [rather] than to the ways of recomposing them into an appropriately organized system" (Bechtel & Abrahamsen 2010, p 322). Bechtel and Abrahamsen were soon joined by other philosophers, who also criticized, in various ways, the assumption that explanatory models would need to fully and correctly represent mechanisms in order to qualify as explanatory (e.g. Love & Nathan 2015; Baetu 2015).
The idea of recomposition does not, however, capture very well what goes on in modelling mechanisms. As Knuuttila and Loettgers (2013b) pointed out, mathematical models studying complex phenomena in terms of non-linear differential equations do not recompose experimentally decomposed components into a mechanism. The non-linear character of such systems should in itself render any straightforward combination of decomposition and recomposition questionable.
Moreover, such models are typically very skeletal, yet not easily de-idealized, and, even more importantly, they import elements from other contexts of research, such as complex systems theory, physics, chemistry and engineering. These interdisciplinary ingredients and influences remain hidden if the focus is on supposed real mechanisms and their parts. Levy and Bechtel (2013) drop the idea of recomposition, arguing instead in favour of abstraction as omission, which is supposed to take modellers from the wealth of molecular details to abstract network models. On this view, a model is arrived at by abstracting from the details of the parts and operations of a mechanism, yet "when a graph has been adequately constructed, the nodes and edges represent parts and operations in actual mechanisms" (Levy & Bechtel 2013, p 247, emphasis added; see also Craver 2016). In our view, there is something contradictory about subscribing to the idea of abstraction as omission of empirical detail while simultaneously supposing that the application of general methods from graph theory or complex systems theory would bring such a modelling process to a happy end: a veridical representation of an actual mechanism. We argue that the usage of cross-disciplinary general modelling tools, such as applying graph theory or complex systems theory to genetic networks, amounts to a modelling strategy that is decidedly different from the traditional mechanistic approach. The mechanistic strategy seeks to describe the functioning of specific real-world mechanisms. In contrast, scientists modelling complex systems with non-linear dynamics typically aim to identify organizational principles and dynamics that are independent of the specifics of any particular system. Steven Strogatz described this strategy in the following way: […] how does one characterize the wiring diagram of a food web or the Internet or the metabolic network of the bacterium Escherichia coli?
Are there any unifying principles underlying their topology? From the perspective of non-linear dynamics we would like to understand how an enormous network of interacting dynamical systems – be they neurons, power stations or lasers – will behave collectively, given their individual dynamics and coupling architectures. (Strogatz 2001, p 1) 'Unifying principle' is the key term of this approach, and techniques such as graph theory provide model templates for the search for such unifying principles. The research agenda of the Elowitz lab provides an example of this kind of strategy, in its goal to study the general design principles of biological organization at the molecular level. The work done at the Elowitz lab is basic science, and of an exploratory character. In order to apply (for practical purposes) the network designs they are studying, these circuits would need to be situated in some particular context, which involves a great deal of piecemeal experimental work. Their modelling strategy of seeking general design principles common to all living things is fundamentally different from that of the Davidson lab, which seeks to map the gene regulatory systems controlling the development of a sea urchin. We call the respective modelling strategies exemplified by the two laboratories the (de)compositional mechanistic strategy (the Davidson lab) and the template-based minimal modelling strategy (the Elowitz lab). In the next section, we study in detail the different heuristics and aims of these two strategies.

Two labs and two approaches to gene regulation
The flourishing interest in genetic networks is largely a result of the Human Genome Project, which shifted focus from the "language" or "code" of life to the organizational principles and systemic structure of life. Sequencing the genomes of Homo sapiens and many other species at the turn of the 21st century paradoxically made it clear that the key to many crucial biological properties and functions was not to be found in individual genes and sequences, but in interactive systems of biomolecules. Yet, already in the early 1960s researchers inspired by cybernetics and information theory had begun to model genetic and metabolic regulation in terms of networks. The Lac operon provided the first model system for gene regulation, theoretically represented by the operon model (Jacob & Monod 1961). It became a paradigmatic model of gene expression in bacteria that quantitatively described protein production as

The Davidson lab: the (de)compositional mechanistic account
The work of Eric Davidson, spanning from the late 1960s to the 2010s, provides a good illustration of a transition in the study of gene regulation from gene to genome, and from theory-driven research to data-intensive science utilizing a wide array of bioinformatics tools and computerized equipment. Together with Roy Britten, Davidson theoretically envisioned the first model of gene regulation for eukaryotic cells (Britten & Davidson 1969), which took the study of gene regulation in a new direction – that of studying multicellular organisms.
The central elements of the Britten-Davidson model were the quantity of DNA in repeated sequences – the model was based on the discovery of large fractions of highly repetitive DNA in the genomes of eukaryotic cells (Britten & Kohne 1968) – the frequency of their repetition, and the distribution pattern of repetitive sequences.
The model introduced the notions of producer gene, receptor gene, activator RNA, integrator gene, sensor gene, and battery of genes. When an inducer (a hormone, for example) binds to a sensor gene it causes an integrator gene to produce activator RNA that specifically binds to a receptor gene, which in turn causes transcription of a producer gene and, thus, the production of a protein. In current terminology, sensor genes control the transcription of producer genes by way of a signalling cascade.
The possibility of having several different arrangements of 'gene batteries' was also central to the model. By gene battery, Britten and Davidson referred to "the set of producer genes, which is activated when a particular sensor gene activates its set of integrator genes" (Britten and Davidson 1969, p 350). Importantly, Britten and Davidson underlined that "a particular cell state will usually require the operation of many batteries" (ibid.). In subsequent reformulations the notion of gene battery, along with other elements of their model, was revisited in the light of new data, yet the basic idea of regulation as a process in which several genes are activated together remained unchanged.

The turn towards data-intensive science
The theoretical and conceptual adjustments to the original 1969 model reflected subsequent empirical findings. Towards the mid-1990s, in the wake of the Human Genome Project, the importance of scrutinizing whole genomes for understanding gene regulation was increasingly recognized, and it also provided an investigative pathway for addressing issues of evolutionary concern, such as the origin of bilaterian body plans (Davidson, Peterson & Cameron 1995). Since given transcription factors, when expressed, always affect multiple target genes, and since the control elements of each regulatory gene respond to multiple kinds of incident regulatory factors, the core system has the form of a gene regulatory network. That is, each regulatory gene has both multiple inputs (from other regulatory genes) and multiple outputs (to other regulatory genes), so each can be conceived as a node of the network. This understanding of what a model is, and how it is arrived at and justified, differs radically from how researchers within the Elowitz lab approach the modelling of genetic networks.

The Elowitz lab: template-based minimal modelling
Michael Elowitz, one of the pioneers of synthetic biology, has a background in physics, like many other systems and synthetic biologists. Physicists entering biology laboratories is by no means a new phenomenon, but by the mid-1990s, physicists and engineers could incorporate in their research the use of novel, standardized molecular biology kits that made experimenting easier than before.5 Researchers did not have to master all the steps and details of standard methods such as gel extraction to clean up DNA, or the polymerase chain reaction (PCR) used to amplify pieces of DNA.

5 Interview, Caltech, 10 April 2014.
The challenges of the move from physics to molecular biology should nevertheless not be underestimated. One of the most obvious differences between the two is that biology lacks general laws or theories similar to those employed so successfully in physics. Furthermore, biological systems usually show a higher degree of structural and dynamic complexity than do physical systems. Systems biologist Uri Alon, who had overlapped with Elowitz in the Leibler lab, describes this difference in the introduction to his textbook on systems biology: As a physicist, I was used to studying matter that obeys precise mathematical laws. But cells are matter that dances. Structures spontaneously assemble, perform elaborate biochemical functions, and vanish effortlessly when their work is done. Molecules encode and process information virtually without effort, despite the fact that they are under strong thermal noise and embedded in a dense molecular soup. How could this be? Are there special laws of nature that apply to biological systems that can help us to understand why they are so different from nonliving matter? (Alon 2006, p 1) The quest for general principles in biology is central to systems and synthetic biology. At the same time, synthetic biologists focus precisely on the "dance" of molecules – the oscillations in protein levels – a focus that contrasts with the Davidson lab's meticulous mapping of the wiring-up of genes. In order to understand specific biological functions such as cell differentiation, cell cycles and circadian rhythms, researchers at the Elowitz lab examine how changes in protein levels function as an organizational and regulatory element. The ultimate aim is to compile a catalogue of general principles, their associated dynamics and the functions they give rise to, as a kind of ersatz for general theories.

From molecules to modules
The Elowitz lab is known for the synthetic genetic circuits that have been their main research tool. In their conceptualization of biological systems, the Elowitz lab makes use of complex systems theory and graph theory, but they also turn to engineering.
While the engineering approach was not new to biology, Elowitz took his inspiration from the programme laid out in the article entitled 'From molecular to modular biology' (Hartwell et al. 1999). Two of the authors (Hartwell and Murray) came from biology and the other two (Hopfield and Leibler) from physics. They argued for the study of general principles of gene regulatory networks: We argue here for the recognition of functional 'modules' as a critical level of biological organization. Modules are composed of many types of molecules.
They have discrete functions that arise from interactions among their components (proteins, DNA, RNA, and small molecules), but these functions cannot easily be predicted by studying the properties of the isolated components. We believe that general 'principles' – profoundly shaped by the constraints of evolution – govern the structure and function of modules. (Hartwell et al. 1999, p C47, emphasis added) The programme in question was built on engineering notions. To describe biological functions, a vocabulary containing concepts such as amplification, robustness, insulation, error correction and coincidence detection was also suggested (Hartwell et al. 1999, p C47). The challenge that researchers like Elowitz faced was how this heterogeneous mix of concepts, methods and tools from molecular biology, physics and engineering could be integrated.

Integration of concepts from biology, physics and engineering
The Elowitz lab makes use of engineering concepts to characterize and examine the structural aspects of genetic networks. In order to analyse the dynamic properties of genetic circuits, the Elowitz lab turns to physics, transferring concepts such as bistability and phase transitions (Ullner et al. 2007; García-Ojalvo 2011). Concepts such as mRNA, in turn, pertain to the biological systems studied. All these concepts become integrated in the mathematical models of genetic circuits, and materially embodied in synthetic genetic circuits. In order to gain some insight into this interdisciplinary integration process we will study the construction of the Repressilator, which is one of the first two synthetic models6 (Elowitz & Leibler 2000).7 The Repressilator is a synthetic genetic circuit, constructed to study the oscillatory behaviour within cells that is supposed to underlie cellular regulation. Concepts from physics shaped the mathematical model that was used as a kind of blueprint for the construction of the Repressilator. The researchers were interested in calculating the stable and unstable states of oscillatory systems like the Repressilator. The assumption of modularity, in turn, was critical for the whole endeavour, as it is for synthetic biology in general; this assumption is needed to attribute a biological function to a specific network architecture (Schlosser & Wagner 2004; Endy 2005). Other engineering principles included the template for the arrangement of the genes and proteins of the Repressilator. It was designed according to a ring oscillator, a circuit familiar from electrical engineering that consists of an odd number of inverters connected so that the output of the last inverter is fed back into the input of the first.
The Repressilator consists of a network of three genes. The protein associated with each gene represses its neighbouring gene, resulting in a negative feedback loop.
This negative feedback mechanism leads to oscillations in the protein levels of the three genes. The mathematical model of the changes in the levels of the three proteins provided a starting point for the construction of the Repressilator.8 The mathematical approach was beset by two kinds of challenges: on the one hand, one had to deal with complex dynamics due to the non-linear interactions in the network; on the other hand, the biochemical parameters of the system were largely unknown, and, moreover, measurements of biochemical parameters are often cumbersome and difficult, limiting their availability.

6 The other synthetic model was a genetic toggle switch introduced by James Collins and his collaborators (Gardner et al. 2000).
7 Our discussion of the Repressilator relies partially on Knuuttila and Loettgers 2013a,b.
8 The mathematical model is the well-known mass-action model used in the examination of the kinetics of chemical reactions in dynamic equilibrium.
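To make the modelling step concrete, the deterministic dynamics of such a three-gene repressor ring can be sketched with the coupled rate equations used in the Repressilator literature, in which each mRNA is repressed by the protein of the previous gene in the ring and each protein tracks its own mRNA. The parameter values below are illustrative choices in the oscillatory regime, not measured biochemical parameters (which, as noted above, were largely unknown):

```python
import numpy as np
from scipy.integrate import odeint

# Illustrative, dimensionless parameters (hypothetical values chosen to
# produce oscillations; not taken from any particular experiment).
alpha = 216.0   # maximal transcription rate
alpha0 = 0.216  # "leaky" basal transcription
beta = 5.0      # ratio of protein to mRNA decay rates
n = 2.0         # Hill coefficient (cooperativity of repression)

def repressilator(y, t):
    """Rate equations for three genes repressing each other in a ring."""
    m1, m2, m3, p1, p2, p3 = y
    # mRNA: degradation, plus transcription repressed by the previous protein.
    dm1 = -m1 + alpha / (1.0 + p3**n) + alpha0
    dm2 = -m2 + alpha / (1.0 + p1**n) + alpha0
    dm3 = -m3 + alpha / (1.0 + p2**n) + alpha0
    # Protein: relaxes towards its own mRNA level.
    dp1 = -beta * (p1 - m1)
    dp2 = -beta * (p2 - m2)
    dp3 = -beta * (p3 - m3)
    return [dm1, dm2, dm3, dp1, dp2, dp3]

t = np.linspace(0.0, 100.0, 2000)
y0 = [1.0, 0.0, 0.0, 2.0, 1.0, 0.0]  # asymmetric start to break symmetry
sol = odeint(repressilator, y0, t)    # columns: m1, m2, m3, p1, p2, p3
```

In this parameter regime the protein levels settle into sustained, phase-shifted oscillations, which is the qualitative behaviour the synthetic circuit was designed to exhibit; the point of the sketch is only to show how few ingredients the template needs.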
In making use of a negative feedback mechanism as the basis of oscillatory behaviour, Elowitz did not follow the earlier traditions (e.g. Goodwin 1963), but rather linked his work to more recent attempts to mathematically model gene regulatory networks. Elowitz stated in a private conversation that in thinking about the possible architectures of his synthetic model he had been inspired by Thomas and D'Ari's book Biological Feedback (1990). It presents a formal methodology for analysing the complex behaviours of networks, aiming to predict all possible patterns of behaviour of a system. The authors model gene regulatory networks by using elaborate combinations of feedforward and feedback mechanisms, both of which are central regulatory mechanisms in engineering, familiar from devices such as thermostats and autopilots. In biology, such mechanisms came to be treated as organizational principles (Thomas et al. 1995). In the work of Alon they have even been elevated to general design principles (Alon 2006). [Figure 2 approximately here]

The representational status of synthetic systems
In making sense of the practice of synthetic modelling it is important to note that Elowitz and other synthetic biologists do not claim that synthetic models represent any naturally evolved organisms. The specific configuration of the three genes of the Repressilator was chosen because it would lead to strong oscillations (Elowitz & Leibler 2000, p 335), and the genes were selected from entirely different contexts.
The Repressilator is best understood as a minimal model9 that nevertheless furnishes valuable insights. First, it provides a proof of principle that biological systems can realize oscillatory behaviour through various molecular feedback mechanisms. Second, by being a biological minimal model, it goes beyond the insights provided by mathematical modelling by exhibiting unexpected, and "nonintuitive", behaviours.10 The oscillations in the protein levels of the Repressilator did not turn out to be as regular as predicted by the mathematical model and computer simulations.
By replacing the deterministic simulations with stochastic ones, the researchers showed that the variations in the protein levels could be due to fluctuations in gene expression. They suspected that the reason for these fluctuations was the low number of molecules in cells. But in order to confirm this hypothesis the researchers had to go back to the study of biological systems. In in vivo experiments on different strains of E. coli, Elowitz and collaborators were able to discriminate between intrinsic and extrinsic noise, where intrinsic noise is due to fluctuations in gene expression and extrinsic noise is due to the cell environment (Swain et al. 2002).
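The contrast between deterministic and stochastic descriptions can be illustrated with a minimal sketch: an exact stochastic simulation (Gillespie's algorithm) of a single gene whose protein is produced and degraded at fixed rates. This is a generic textbook birth-death model, not the lab's actual simulation code, and the rate constants are hypothetical:

```python
import numpy as np

def gillespie_expression(k_prod=10.0, k_deg=1.0, t_max=1000.0, seed=0):
    """Exact stochastic simulation of a birth-death model of protein
    production (rate k_prod) and degradation (rate k_deg per molecule)."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, 0
    times, counts = [0.0], [0]
    while t < t_max:
        rates = np.array([k_prod, k_deg * x])  # production, degradation
        total = rates.sum()
        t += rng.exponential(1.0 / total)       # waiting time to next event
        # Choose which reaction fires, weighted by its propensity.
        if rng.random() < rates[0] / total:
            x += 1
        else:
            x -= 1
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

times, counts = gillespie_expression()
```

Because the molecule numbers are small, individual trajectories fluctuate visibly around the deterministic steady state k_prod/k_deg; it is this kind of intrinsic fluctuation, invisible in the deterministic equations, that the stochastic simulations brought into view.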
They suggested that noise plays a functional role in processes like gene expression, development, and even in evolution: Noise, far from being just a nuisance, has begun to be appreciated for its essential role in key cellular activities. Noise functions in both microbial and eukaryotic cells, in multicellular development, and in evolution. It enables coordination of gene expression across large regulons, as well as probabilistic differentiation strategies that function across cell populations. (Eldar & Elowitz 2010, p 167; see also Eldar et al. 2009)

9 Alon calls the minimal mathematical models he builds "toy models", whose goal is nevertheless "to understand the essential features of the system" (2006, p 142).
10 For the element of surprise in experimentation vis-à-vis modelling, see Morgan 2005.
As a result, the function of noise in gene regulation became a further focus of the lab.
The researchers started to study how biological systems make use of noise. One of the first projects of the Elowitz lab concerned transient cellular differentiation (e.g. Süel et al. 2006). From the perspective of the cross-disciplinary transfer of tools and concepts, it is interesting that, in trying to understand the transition of B. subtilis between the states of vegetative growth and competence, the researchers drew on their earlier experience with the excitable behaviour of lasers and artificial neural networks.

(De)compositional vs. template-based modelling strategies
In our brief discussion of the mechanistic discussion of modelling above, we noted how it has gravitated from the traditional experimental approaches of biological sciences towards abstract minimal modelling without much explicit notice. This inadvertent change of emphasis seems to imply that the mechanistic programme can accommodate abstract network modelling without significant changes to its earlier focus on detailed descriptions of actual mechanisms that constitute the locus of scientific explanation. Thus Craver (2016) claims, for example, that the explanatory power of network models is ultimately tied to representing "phenomena [as] situated, etiologically and constitutively, in the causal and constitutive structures of our complex world" (p 707). Our account of the two labs shows that while some network models strive to represent actual causal and constitutive structures (the Davidson lab), others rather target some very general designs that might underlie various kinds of biological phenomena (the Elowitz lab). The ontic commitments of the mechanistic approach are ill-suited to this kind of modelling strategy.
We term the two strategies the (de)compositional mechanistic strategy, exemplified by the Davidson lab, and the template-based minimal strategy, exemplified by the Elowitz lab. While we do not claim that these two strategies exhaust all possibilities, they nevertheless represent opposing modelling strategies along more than one dimension, and show that the attempt to reconcile aspects of them within a unitary mechanistic approach may not succeed. Our comparison also shows that engineering concepts, methods and principles can be applied to biology in myriad ways, and any evaluation of the appropriateness of such transfers should pay attention to this diversity. Furthermore, what may seem to be a transfer from engineering to biology may turn out to involve tools, concepts and methods from many other disciplines, too.
The Davidson lab's (de)compositional strategy of modelling, which relies on both a long-standing experimental tradition in biology and a big data approach, can be adequately portrayed by the earlier mechanistic accounts. We call this approach '(de)compositional' because it proceeds by experimentally decomposing gene regulatory networks and then recomposing them in detailed models; its commitment to completeness and specificity could not be clearer.
In criticizing the earlier mechanistic approaches for not paying due attention to dynamic (mathematical) modelling, Bechtel points out that those earlier approaches concentrated on "a start-to-finish sequence of qualitatively characterized operations performed by component parts" (Bechtel 2011, p 534). Biologists typically use various kinds of diagrams to accomplish this task, and, when it comes to dynamics, this is precisely what BioTapestry does, although at a vastly different scale.
BioTapestry describes the sequences of gene/protein activations during the process of development in the form of a huge multilayered electronic switchboard.
The Elowitz lab, in contrast, addresses the dynamic interactions of genes and proteins in their modelling practice. Elowitz explained the motivation behind their approach in the following way:

I was reading a lot of biology papers. And often what happens in those papers is that people do a set of experiments and they infer a set of regulatory interactions, and then often at the end of the paper is a model, which tries to put together those regulatory interactions into a kind of cartoon with a bunch of arrows saying who is regulating who. […] And I think one of the frustrations […] with those papers, was always wondering whether […] that particular circuit diagram really was sufficient to produce the kinds of behaviors that were being studied. In other words, even if you infer that there's this arrow and the other arrow, how do you know that the set of arrows is actually […] working that way in the organism. (emphasis added) 12

The network models of the Elowitz lab are built to study the complex dynamics of genetic circuits. Their template-based minimal modelling strategy is nearly the opposite of the Davidson lab's. Instead of basing the modelling effort on decompositional experimental analysis, the lab draws analogies to engineered and other physical systems, which provide a basis for transferring formal templates for model construction.

12 Interview, Caltech, 11 May 2012.
Paul Humphreys (2004) has called attention to the fact that a fairly restricted repertoire of mathematical forms and methods is used across different disciplines to model an astonishing array of phenomena. The generality and tractability of these 'computational' templates provide the rationale for using them. Knuuttila and Loettgers (2016) expand on Humphreys's insight by introducing the notion of a model template, which highlights the conceptual side of mathematical forms and methods, and their productive role in theory development. For example, the mathematical tools that enable scientists to analyse biological organization through feedback loops of various kinds involve the conceptualization of these biological phenomena through the notions of network and control, and also license comparisons to engineered control systems. This allows scientists to probe further similarities and dissimilarities between engineered artefacts and naturally evolved biological systems.
The Elowitz lab transfers templates from engineering and physics on two levels. First, the interactions of genes and proteins are studied as control systems and rendered in terms of wiring diagrams. The Elowitz and Davidson labs share this approach. Beyond this shared starting point, however, the Elowitz lab concentrates on complex phenomena that are brought about by feedback loops in the network structure. In order to study the dynamics of networks, the Elowitz lab transfers mathematical templates from complex systems theory. 13 As a result, the networks that the Elowitz lab studies are markedly different from those of the Davidson lab. First, their mathematically modelled and synthetically engineered genetic circuits are small, constructed from a limited number of genes and proteins in order to enable the researchers to study their complex interactions in a tractable way. Second, these genetic circuits do not aim to depict any naturally evolved networks. Their construction is often inspired by some simple designs from electrical engineering, and they are put together from molecular material derived from different contexts of research in view of the phenomena they are supposed to exhibit.
Through their minimal modelling practice, the Elowitz lab seeks to find some general design principles of biological organization, something akin to general theories in physics, albeit more piecemeal in character, given the nature of biological systems. The goal of these minimal models is to study what is sufficient for a certain function, broadly understood, not how any particular naturally evolved function is actually produced. Yet the modelling approach of the Elowitz lab is, in its minimal template-drivenness, animated by concretely engineered living systems. This fact cannot be overestimated: the traditional experimental methods of molecular biology analyse the tissues of dead organisms. In contrast, in almost all of their publications, the Elowitz lab includes snapshots of fluorescent blinking and pulsing bacteria, taken from time-lapse movies of growing microcolonies of those bacteria. In the work of the Elowitz lab, the features of the hypothetical design principles and concrete phenomena like noise and pulsing are studied mathematically, rendered diagrammatically, and finally analysed in action via movies of gene expression in bacterial cells.

13 Complex systems theory develops mathematical tools to study complex systems, their general properties and behaviours, and how a system's parts engender its collective behaviours.
The final contrast between the two labs can then be highlighted through the role of experimental work in their modelling strategies, and how it links to the generalizability of their results. As the Elowitz lab is targeting the general design principles of molecular control, the crucial question for them is whether such hypothetical general circuit designs could really be implemented by nature. Even though the abstract mathematical models applied from complex systems theory may be able to exhibit the observed oscillations at the molecular level, this does not yet prove that the model has succeeded in depicting what actually takes place in some specific biological organisms. Finding out whether or not the abstract design principles could be biologically realized requires situating them in a biological context. One strategy for such situating is provided by the construction of synthetic circuits from biological parts.
The Davidson lab, in turn, does not need to check whether their constructs are realized by natural circuits, since they are derived from experimentation on natural systems. The research programme of the sea urchin's genome is partly motivated, however, by general considerations. Namely, because of its phylogenetic position with respect to other organisms, the sea urchin's genome has the potential to elucidate what is ancient in the architecture and function of the mammalian genome. Here the question becomes that of which substructures of the gene regulatory networks could be reorganized and altered in evolution and which are unchanging and inflexible, and how knowledge derived from model organisms can be generalized and re-situated (Ankeny & Leonelli 2011; García-Deister 2011; Morgan 2014). In its search for generalizability, the (de)compositional approach may profit from more abstract modelling approaches. And, on the other hand, the success of synthetic biology in designing and redesigning biological parts and systems for useful ends relies critically on how well the general insights on biological circuitry can be adjusted to, and complemented with, detailed biological knowledge from specific contexts. But could such synergies lead to the assimilation of the (de)compositional mechanistic and template-based minimal strategies into one larger mechanistic whole? For now, we remain agnostic.
One problem concerns the use of templates transferred from other disciplines on the basis of their familiarity and tractability. A mechanist would need to claim that such template-based minimal models abstract by omitting details (cf. Levy & Bechtel 2013). But, as we have shown, such an answer leaves aside the complexities of representation, and the way these minimal models are actually arrived at. Moreover, many minimal models in biology, and in other sciences, study general theoretical possibilities and do not primarily seek to explain actual, causally situated mechanisms in the world. One of the motivations for building synthetic models is precisely to probe whether the network models, studied theoretically and rendered mathematically, could actually be realized in biological organisms.
The mechanistic programme has provided a strong paradigm for scientific explanation, especially suited to experimentally grounded practices within the broader realm of the life sciences. We call for the mechanists to consider more explicitly whether the mechanistic account of explanation also possesses the analytical resources to cover model-based theoretical exploration. The primary goals of explanation and modelling are often different, and so accounting for model-based explanation requires further analytic work from both mechanists and philosophers working on modelling.