IHPST Paris-CAPE Kyoto, philosophy of biology workshop

Abstracts:

  • Pierre-Alain Braillard "What has been learned about biology from an engineering perspective?"

    Engineering has made many contributions to biology throughout its history. However, these contributions have mainly consisted in developing various instruments and observational techniques, such as the electroencephalogram, the ultracentrifuge, and the electron microscope. Since the turn of the century, many voices have argued that engineering models, concepts, and analytical tools might play deeper and more theoretical roles in unraveling and explaining biological complexity. An engineering perspective can be helpful in different ways, for instance by offering new decomposition strategies and methods for analyzing regulatory circuits. It has also helped biologists to address the issue of biological robustness, both at the level of specific mechanisms and from a more general theoretical point of view. Focusing particularly on systems biology and synthetic biology, my goal is to analyze exactly what such a perspective has brought to biology in the last decade and to discuss what can be expected in the near future.

  • Philippe Huneman "Distinguishing among causal explanations in biology: Topologies and mechanisms"

    This paper argues that besides mechanistic explanations, there is a kind of explanation that relies upon "topological" properties of systems in order to derive the explanandum as a consequence, and which does not consider mechanisms or causal processes. I first investigate topological explanations in the case of ecological research on the stability of ecosystems, and then contrast them with mechanistic explanations. Next, expanding on the case of ecological stability, I consider the phenomenon of robustness at all levels of the biological hierarchy in order to show that topological explanations are indeed pervasive there. Reasons are suggested for this, in which "neutral network" explanations are singled out as a form of topological explanation that spans many levels. Finally, I focus on the relationships, in principle and in practice, between topological explanations and mechanistic explanations, emphasizing both the distinction between their research questions and their complementarities, and in particular providing examples of the reciprocal relations of dependence between these kinds of explanations.

  • Lucie Laplane "Stemness ontology and therapeutic strategies"

    This presentation will have two aims: (1) it will attempt to disentangle the various ontological perspectives on "stemness" that currently coexist in biology; (2) it will highlight the consequences of these perspectives for cancer treatments.
    Stemness (the ability to self-renew and the potency to differentiate) is the property by which stem cells are defined. For some years there has been a debate about what kind of property "stemness" really is. It is usually seen as the essential and distinctive property of stem cells. But a dissenting perspective on stemness has appeared among biologists, described as the "state ontology", according to which stemness is nothing more than a cellular state. We argue that the so-called "state ontology" is plural and contains several ontological perspectives on stemness. We see at least three of them: stemness as a disposition, stemness as a relational property, and stemness as a system property.
    We argue that disentangling these perspectives on stemness is of major empirical importance, in particular for cancer therapies. This thesis is related to the emergence of the "cancer stem cell" (CSC) theory over the past decade. According to this theory: (1) cancers are initiated, maintained and propagated exclusively by CSCs, (2) CSCs can escape classical therapies and lead to relapses, and (3) killing the CSCs would be necessary and sufficient to cure cancer. We will show that (3) is true only if stemness is the essential and distinctive property of stem cells, or if it is a disposition of stem cells, but not if stemness is a relational property or a system property. In the latter two cases, other therapeutic strategies have to be developed in order to address the problem of frequent relapses in cancers.

  • Antonine Nicoglou "Plasticity in contemporary biology: The synthesis of two conceptual traditions?"

    The first uses of the term "plastic" in the life sciences are found at the origins of embryology, particularly in the theory of epigenesis with the idea of a "plastic force" (William Harvey) or in the theory of morphogenetic fields (Hans Driesch). But the term is not confined to embryology. From the early 20th century, a new tradition emerged in the field of genetics with the notion of the "norm of reaction" and later with the notion of "phenotypic plasticity". This "genetic" approach to plasticity is used to describe the diversity of possible phenotypes for a given genotype across different environments. More recently, some biologists have used the concept of "developmental plasticity" in order to formulate a new synthesis of evolution (after the Modern Synthesis) that combines development and evolution (Scott Gilbert, Mary Jane West-Eberhard). In this presentation, I specify the links between this contemporary use of plasticity and the two traditions described above. I will show that this use of plasticity is more than just the addition or the synthesis of the two traditional uses, and that these biologists adopt what I call a "too wide conception of plasticity", which leads them to a misleading conception of synthesis.

  • Gladys Kostyrka "Using viruses to model life: A paradox?"

    If E. coli may be seen as a bacterial model of other bacteria, viruses have their viral models too. These models are often well-known and deeply studied viruses, used to represent all the other viruses in the same category or family. For instance, the phage λ is a model of the general category of "λ-like viruses", and more broadly of bacterial viruses. It seems logical to study viruses with the help of viruses, but it is more surprising to study cellular processes through viruses. And yet some viral models have also been used to understand fundamental processes of cellular life, such as gene regulation and replication (Ptashne 1986, 2004). Based on either direct analogy (cellular genes "behave like" viral genes) or indirect analogy (understanding viral activities in the cell led to the discovery of specific cellular mechanisms, similar but not directly analogous to the viral mechanisms discovered first), the use of viral models contributed not only to the construction of virology but also to the construction of molecular biology. But today, viruses may be more than molecular models: I will show how viral models are used to better understand some general evolutionary processes.

  • Gauvain Leconte "The predictive capacity of biological theories"

    A classical claim made by both biologists (Mayr 1961) and philosophers (Scriven 1959; Smart 1966) is that biology, and especially the theory of natural selection, can explain but not predict empirical facts. This asymmetry between explanation and prediction would be a consequence of the non-deductive nature of biological theories and of the randomness and variability inherent in biological phenomena. Defenders of the existence of predictions in biology (Kochanski 1973; Williams 1970, 1973, 1982) have replied that if we formulate explicitly the deductive structure underlying biological theories, this randomness is no obstacle to the formulation of predictions similar to those of statistical physics, for example.
    However, studies of novel predictions (Hitchcock and Sober 2004) have shown that predictions are not exclusively deductive inferences of future events. I will consider recent predictions in the field of genetic evolution (Orgogozo and Stern 2008; Stern 2010) and microbiological experiments (Winther 2009) to see what specific kinds of predictions exist in modern biological theories. I will then argue that if we focus on the predictive capacity of these theories rather than on their actual predictions, we can still support the claim that biological and physical theories have comparable predictive power.

  • Yuichi Amitani "What a tale of two minds can be"

    The dual process theory is, put simply, the view that there are two information-processing systems in our mind and that we often employ either or both of them in problem-solving tasks. It has been popular in cognitive and social psychology for the last decade, but this simplified formulation of the theory has problems, owing both to objections raised against it and to recent developments of the theory itself. For example, it might be hard to find a mechanical (or neurological) basis for each system, and some leading theorists admit that the mindless use of the property-pairs leads us to misidentify the kind of system to which a behavior is attributed. In this paper I shall explore the ways in which we can interpret the dual process theory so that it can meet these challenges.

  • Ryota Morimoto "Genetic drift model as rational inference"

    Genetic drift is considered one of the major evolutionary factors. It refers to chance fluctuations in gene frequency. To express genetic drift mathematically, probability is an indispensable concept. This raises a philosophical question: what is the appropriate interpretation of probability in drift models? Alex Rosenberg (1994) argues that the probabilities used in evolutionary theory, including drift models, should not be interpreted realistically because they only reflect our ignorance of details. In this presentation, I propose an alternative to Rosenberg's interpretation. First, I give a critical appraisal of his arguments and show that the probabilities reflect not merely our ignorance but also some aspects of reality. I discuss this issue using a standard drift model such as the Wright-Fisher model. Second, I compare the drift model with statistical mechanics and show that in the Wright-Fisher model we can update the probabilities rationally depending on what we know. I then suggest that the drift model can be interpreted as rational inference.
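    (As a minimal illustration, added here rather than taken from the abstract: the standard Wright-Fisher model treats drift as binomial sampling of allele counts from one generation to the next, and it is the interpretation of these transition probabilities that is at issue.)

        P(X_{t+1} = j \mid X_t = i) = \binom{2N}{j}\left(\frac{i}{2N}\right)^{j}\left(1 - \frac{i}{2N}\right)^{2N - j}

    Here X_t denotes the number of copies of the focal allele among the 2N gene copies in generation t.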

  • Senji Tanaka and Akinori Takahashi "The biostratigraphic origin of the theory of punctuated equilibria"

    Forty years have passed since Niles Eldredge and Stephen Jay Gould introduced the theory of punctuated equilibria into evolutionary biology. No idea in paleontology has sparked more debate than the notion of punctuated equilibria. The idea was based on the recognition of long-term stability and abrupt change in the fossil record. Eldredge and Gould suggested that this pattern in the fossil record could directly reflect evolution, and proposed that the allopatric theory of speciation could explain abrupt evolutionary change. Generally, this combination of pattern and process is said to be the essence of the theory of punctuated equilibria. However, the most revolutionary idea lies elsewhere. Eldredge and Gould imported (perhaps unconsciously) the basic methodology of biostratigraphy into evolutionary biology in 1972. Since the early nineteenth century, biostratigraphers have always treated their data as if species do not change much during their residence in any local section and do not grade insensibly into their closest relatives in adjacent stratigraphic horizons. Focusing on the biostratigraphic origin of the theory of punctuated equilibria, we argue that punctuated equilibria are evolutionary patterns prevalent enough for biostratigraphers to have met with their remarkable empirical success, such as the reconstruction of the geological history of the Earth.

  • Yoshinari Yoshida and Hisashi Nakao "Overlooked elements in the history of evo-devo: Studies of epigenetics in the 80s"

    When discussing the history of evolutionary developmental biology, some researchers have emphasized the importance of the technical contribution of developmental genetics (Carroll 2005; Gilbert 2003), while others have also focused on comparative embryology and morphology (Arthur 2011; Love & Raff 2003). This talk argues that they have missed an important aspect of evo-devo: it is commonly acknowledged that evo-devo includes some elements that cannot be explained in the gene-centered framework (Arthur 2011; Gilbert & Epel 2009; Hall 1998), and the origin of these elements partly dates back to "epigenetics" in the 80s.
    We focus on Pere Alberch, Brian K. Hall and Gerd B. Müller and show that they challenged the gene-centered framework of the modern synthesis by emphasizing "epigenetics", and that in order to do so they integrated methodologies and information from experimental embryology and comparative morphology (Alberch 1980; Alberch & Gale 1983, 1985; Hall 1983; Hall 1984; Müller & Streicher 1989; Müller 1990). They were also distinctive in the 80s, when other researchers were likewise rethinking the relationship between development and evolution (Arthur 1984; Bonner et al. 1982; Goodwin et al. 1983; Raff & Kaufman 1983), in that they did not focus on genetics, emphasized the revolutionary aspects of developmental studies, and included concrete experimental studies. By looking at the diversity of these attempts in the 1980s, we can also understand the diversity of present evo-devo.

  • Yuki Sugawara and Hisashi Nakao "Mechanistic stance: An epistemic norm among scientists"

    Over the last two decades, some philosophers of science have enthusiastically advocated "mechanistic philosophy" (e.g., Machamer et al. 2000; Bechtel and Abrahamsen 2005; Glennan 1996, 2002), arguing that describing or providing mechanisms is central to scientific explanation. Although some case studies have suggested that this claim is applicable to other examples (e.g., Baker 2005; Bechtel 2005), some serious problems have also been raised. First, we do not yet have a unique or adequate definition of mechanism. Second, and relatedly, some have pointed out that the concept of mechanism defined by Machamer et al. (2000) or Glennan (1996) does not correctly capture explanations by natural selection (e.g., Skipper and Millstein 2005; Havstad 2011). Finally, Moss (2012), unlike Craver (2007), argues that providing mechanisms is not a norm that scientists should conform to.
    This paper argues that mechanistic philosophy is still important for understanding scientists' "mechanistic stance": in many (though probably not all) cases, scientists actually search for and try to describe mechanism-like systems even if these are not or cannot be defined so strictly (e.g., Bechtel and Richardson 2010; Levy 2012; Matthewson and Calcott 2011), and this is one of the norms that scientists actually conform to, even if they should not. In this sense, mechanistic philosophy correctly captures this mechanistic stance of scientists.