
This work begins with a summary of modeling approaches in systems biology, although it focuses on bottom-up modeling approaches. It also examines heterogeneity and its impact on the measurement process; there I explore ways of encoding heterogeneity as part of a measurement model.

Summary

Introduction

Systems biology addresses this challenge with new measurement technologies and modeling strategies that increase the availability and use of biological data. This chapter then explores the role of data in the success and failure of systems biology efforts.

Systems Biology

Top-down Approach to Systems Biology

Top-down approaches use statistical and machine learning frameworks to explain and model patterns of abundance or activity of biological species that correspond to biological phenotypes and/or experimental treatments. The resulting models mine large data sets for insights that predict biological behavior.
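To make the top-down idea concrete, the sketch below fits a regularized classifier to a synthetic abundance matrix and ranks species by their association with a phenotype; the data, species count, and classifier choice are illustrative assumptions, not an analysis from this work.

```python
# Minimal top-down sketch: a statistical model trained on measured abundances
# to predict a phenotype. The data are synthetic and the classifier choice is
# illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_samples, n_species = 200, 50                      # e.g., cells x proteins
X = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_species))
# Assume, for illustration, that the first species drives the phenotype.
y = (X[:, 0] > np.median(X[:, 0])).astype(int)      # 0 = survival, 1 = death

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(np.log(X_train), y_train)

print("held-out accuracy:", clf.score(np.log(X_test), y_test))
# Coefficient magnitudes rank species by association with the phenotype.
top_species = np.argsort(np.abs(clf.coef_[0]))[::-1][:5]
print("most predictive species (by |coefficient|):", top_species)
```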

Bottom-up Approach to Systems Biology

The bottom-up modeling process can reveal previously unconsidered gaps in the understanding of a biological process; bottom-up modeling approaches both find and help fill gaps in the mechanistic understanding of biological processes.
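As a minimal sketch of the bottom-up style, the toy mechanism below is written directly as mass-action ODEs and integrated numerically; the species, reactions, and rate constants are hypothetical and are not taken from any model in this work.

```python
# Minimal bottom-up sketch: a hypothetical two-step mass-action mechanism
#   E + S -> E + P   (catalysis, rate constant k1)
#   P     -> 0       (degradation, rate constant k2)
# written as ODEs and integrated numerically.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 1e-2, 1e-3                     # hypothetical rate constants

def rhs(t, y):
    E, S, P = y
    v_cat = k1 * E * S                  # mass-action catalysis
    v_deg = k2 * P                      # first-order degradation
    return [0.0, -v_cat, v_cat - v_deg]

sol = solve_ivp(rhs, (0.0, 5000.0), [1.0, 100.0, 0.0],
                t_eval=np.linspace(0.0, 5000.0, 200))
print("final substrate and product levels:", sol.y[1, -1], sol.y[2, -1])
```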

Middle-Out Approach to Systems Biology

The potential of bottom-up modeling approaches to capture the subtle details of biological processes has contributed to their growing appeal. The increasing presence of bottom-up modeling in biological research and drug design points toward a future in which modeling and experimentation occur within a standardized, fully integrated collaborative workflow [71].

Challenges to Systems Biology

Complexity, multiscale organization, and heterogeneity conspire to complicate and hinder the measurement and modeling of biological systems. In addition, biological systems respond to their environment, a complex network of signals that spans larger spatial and temporal scales (i.e., biological context).

Addressing the Demand for Measured Data

  • Simpler models offset the demand for data
  • Facilitating easier access to existing data
  • Technologies that generate new quantitative data
  • Improved processing of measurements
  • Using non-quantitative data as a substitute

Arbitrary scaling, normalization, and signal timing impose assumptions about the relationship between some characteristic of the biological system (e.g., concentration) and its non-quantitative measurement (e.g., fluorescence). Incorporating non-quantitative data into a model requires a functional definition of the non-quantitative data in terms of metrics present in the model.
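The sketch below illustrates how such assumptions enter in practice: a simulated concentration is converted to a "fluorescence" trace through an assumed linear scale and offset and then min-max normalized; the functional form and the parameter values are hypothetical.

```python
# Sketch of a measurement function linking a simulated concentration x(t) to a
# normalized "fluorescence" readout. The linear form and the scale/offset
# values are modeler-imposed assumptions, not measured facts.
import numpy as np

def fluorescence_from_concentration(x, scale, offset):
    """Assumed linear readout, F = scale * x + offset, then min-max normalized
    as is commonly done to raw fluorescence traces."""
    f = scale * x + offset
    return (f - f.min()) / (f.max() - f.min())

t = np.linspace(0.0, 10.0, 100)
x = 1.0 / (1.0 + np.exp(-(t - 5.0)))        # toy sigmoidal concentration
f_obs = fluorescence_from_concentration(x, scale=2.5, offset=0.1)
# Any positive linear rescaling of x produces the same normalized trace,
# which is exactly the ambiguity the text warns about.
```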

Recommendations Moving Forward

This rigorous treatment ignores uncharacterized sources of noise that contribute to the non-quantitative character of measurements in biology (e.g., noise sources can add uncertainty to the bounds that mark the edges of ordinal categories). The discretization of continuous variables that occurs in optimal scaling also prevents easy implementation of Bayesian model calibration methods.

Summary

Introduction

Their formulation imposes discrete bounds on the mechanistic model to reflect discrete ordinal values in the data, but this approach limits their ability to integrate multiple data types or to use Bayesian methods for training and uncertainty estimation. This definition involves formulating a function that maps variables encoded in a mechanistic model to values in the data.

Results

  • Contributions and biases from different data types to mechanistic models
  • Uncertainty associated with different data types in model calibration
  • Data-driven measurement model as an indicator of model bias
  • Mechanistic insights from data-driven measurement models

In the calibration process, the measurement model is an explicit intermediate step between simulation of the mechanistic model and evaluation of the likelihood. The 95% credible region of posterior predictions (shaded region) of tBID dynamics is shown for aEARM calibrated to ordinal measurements using two fixed parameterizations of the measurement model (see Table A.3).

Discussion

The data-driven probabilistic measurement model we propose in this research was essential to this finding. We also found that the posterior predictions of our mechanistic model were sensitive to the assumptions, encoded in the measurement model, about the relationship between a measurement and the quantity it measures. Uncertainty in non-quantitative measurements drives the often unrecognized and implicit assumptions about the relationship between measurement and criterion (i.e., between data and model).

We found that incorrect ad hoc assumptions about measurement could produce patterns that suggested, with a higher degree of certainty, an incorrect prediction (Figure 2.4B). This finding suggests that ad hoc assumptions about measurements can lull practitioners into a false sense of confidence about the model and the data. Having a measurement model whose attributes are determined by the data creates an opportunity to learn new details about the relationship between a measurement and its measure(s).

In doing so, model calibration using our data-driven measurement model performed feature selection to correctly identify the most important predictor of cell death.

Conclusions

Methods

  • Extrinsic Apoptosis Reaction Model
  • Integrating aEARM Dynamics
  • Measurement Models and Likelihood Functions
  • Generating Synthetic Datasets
  • Model Calibration via Bayesian Inference
  • Model Predictions

Each ordinal constraint function is combined, using the sequential model (i.e., the product of the logistic functions), to give a probability for each ordinal category, $P\big(y_i(t) = c_j \,\big|\, x_i(t, \boldsymbol{\theta}), \alpha_i, \beta\big)$. Together, this yields a log-likelihood function (Equation 12) in which the probability of each category $c_j$ is calculated for each time point $t$ and observable $i$, and applied to the probability of the data $y$ given the model. We trained aEARM on synthetic binary data (survival vs. death) by incorporating a measurement model (i.e., a logistic model of the probability of each categorical outcome) similar to that used for the ordinal data.
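A sketch of this kind of sequential ordinal measurement model is given below: logistic constraint functions define threshold-passing probabilities whose products assign a probability to each ordinal category, and these feed a log-likelihood over time points and observables. The continuation-ratio form, the threshold/slope parameter names, and the example values are assumptions for illustration, not the exact formulation of Equation 12.

```python
# Sketch of a sequential ("continuation-ratio") ordinal measurement model:
# each logistic constraint gives the probability of exceeding one threshold,
# and their product assigns a probability to each ordinal category.
# Threshold (alpha) and slope (beta) values are illustrative placeholders.
import numpy as np

def category_probabilities(x, alphas, beta):
    """P(y = c_j | x) for ordinal categories c_0 ... c_J, J = len(alphas)."""
    s = 1.0 / (1.0 + np.exp(-beta * (x - np.asarray(alphas))))  # pass probs
    probs = []
    cum = 1.0
    for j in range(len(alphas)):
        probs.append(cum * (1.0 - s[j]))   # stop in category j
        cum *= s[j]                        # continue past threshold j
    probs.append(cum)                      # top category
    return np.array(probs)

def log_likelihood(x_traj, y_traj, alphas, beta):
    """Sum of log P(y(t) = observed category | x(t, theta))."""
    return sum(np.log(category_probabilities(x, alphas, beta)[y])
               for x, y in zip(x_traj, y_traj))

# Example: a rising trajectory scored against ordinal observations 0, 1, 2.
x_traj = [0.1, 0.5, 0.9]
y_traj = [0, 1, 2]
print(log_likelihood(x_traj, y_traj, alphas=[0.3, 0.7], beta=20.0))
```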

The calibration of aEARM to IC-RP and EC-RP fluorescence time-course data provided an optimally fitted vector of rate coefficient parameters, which served as the "ground truth" parameter vector in the synthesis of the non-quantitative data sets (Table A.5). These time courses were preprocessed to yield values of the features encoded in the nominal measurement model above. This random sample was shifted and scaled according to derived values of the model mean and variance.

Random samples of 1000 parameter vectors were drawn from the latter 50% of the resulting parameter traces and used in subsequent analyses.
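A small sketch of that post-processing step is shown below, assuming traces stored as a (chains × steps × parameters) array; the shapes and the pooling of chains are illustrative assumptions.

```python
# Sketch of post-processing MCMC parameter traces: keep the latter 50% of
# each chain (burn-in removal) and draw 1000 parameter vectors for
# subsequent predictions. Shapes are assumed:
# traces has shape (n_chains, n_steps, n_parameters).
import numpy as np

rng = np.random.default_rng(1)
traces = rng.normal(size=(4, 20000, 30))        # placeholder traces

kept = traces[:, traces.shape[1] // 2:, :]      # latter 50% of each chain
flat = kept.reshape(-1, kept.shape[-1])         # pool chains

idx = rng.choice(flat.shape[0], size=1000, replace=False)
posterior_sample = flat[idx]                    # 1000 parameter vectors
print(posterior_sample.shape)                   # (1000, 30)
```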

Introduction

In this chapter, I take preliminary steps to calibrate a model of cell death signaling to indirect and/or non-quantitative measurements of intracellular components (e.g., nominal cell fate decisions, immunoblot measurements of cellular protein content, cell viability, and cell viability rate). The biology of cell death illustrates the challenge of complexity, as multiple programmed cell death modalities engage shared signaling molecules in a web of biomolecular regulatory cross-talk [158]. The interdependence between distinct cell death modalities causes unexpected behavior and complicates investigations of cell death.

I demonstrate the potential of systems biology (despite the limitations of the data) to investigate complexity through a model of cell death signaling. This fundamental feature of biology, its multiscale organization, is revealed in our treatment of phenotype. The empirical description of phenotype precedes the phenotype model, and the phenotype model often requires further specification.

The approaches I consider apply dimensionality reduction and related methods in ways that reflect how multiscale modeling approaches represent behavior that emerges at one spatiotemporal scale and operates at another [159].

Systems Biology Addresses the Challenge of Complexity but is Limited by a Dearth of Data

Biological complexity represents a significant barrier to understanding and solving problems in biology.

While machine learning measurement models will enable adaptation to a growing range of modeling scenarios, the need for data and data-driven strategies remains. This chapter therefore highlights the challenges and opportunities of using more types of measurements to meet (but not replace) data demands. To specifically investigate the role of BID in regulating necroptosis, we encoded a mechanistic model of TNF Complex I dynamics (TNF ligation, scaffold protein recruitment, cIAP-mediated RIP1 polyubiquitination, and CYLD-mediated RIP1 de-ubiquitination), RIP1 release from Complex I, and formation of cytoplasmic Complex II.

However, recent observations suggest that untruncated BID may regulate necroptosome formation by inhibiting RIP1 accumulation. We encoded the above mechanism as a system of ODEs and added to the model one of several hypothetical interactions between BID and necroptosis signaling proteins. By examining the predictions of the calibrated models under in silico apoptotic and necroptotic conditions, we narrowed the set of hypotheses to those containing a novel stable interaction between BID, cFLIP, and caspase 8.
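The sketch below shows, schematically, how one such hypothetical interaction (untruncated BID suppressing RIP1 accumulation) could be toggled on or off in an ODE right-hand side and compared across conditions; every species, rate law, and parameter value here is invented for illustration and does not represent the dissertation's necroptosis model.

```python
# Schematic only: toggling one hypothetical interaction (untruncated BID
# inhibiting RIP1 accumulation) in an ODE right-hand side. Species, rate
# laws, and parameters are invented for illustration.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, k_on, k_off, k_inhib, include_bid_inhibition):
    RIP1, BID = y
    dRIP1 = k_on - k_off * RIP1              # baseline RIP1 turnover
    if include_bid_inhibition:
        dRIP1 -= k_inhib * BID * RIP1        # hypothetical BID-mediated loss
    dBID = 0.0                               # BID held constant here
    return [dRIP1, dBID]

t_span, y0 = (0.0, 1000.0), [0.0, 1.0]
for hypothesis in (False, True):
    sol = solve_ivp(rhs, t_span, y0, args=(0.05, 1e-3, 5e-3, hypothesis),
                    t_eval=[1000.0])
    print("BID inhibition" if hypothesis else "no interaction",
          "-> RIP1 level at t = 1000:", round(float(sol.y[0, -1]), 2))
```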

The Hessian eigenspectrum of the model containing this complex exhibited more bands ≤ 1.0 than the model lacking this complex (providing more robust support for the …).
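For context on the eigenspectrum comparison, the sketch below estimates a Hessian at a best-fit point by central finite differences and counts eigenvalues at or below 1.0; the toy objective stands in for the model's actual fitting objective.

```python
# Sketch: estimate the Hessian of an objective at a best-fit point by central
# finite differences and inspect its eigenspectrum. The objective below is a
# toy stand-in; in practice it would be the model's (negative log-)likelihood.
import numpy as np

def objective(p):
    # Toy objective with one stiff and two sloppy directions.
    return 100.0 * p[0] ** 2 + 0.4 * p[1] ** 2 + 0.001 * p[2] ** 2

def hessian_fd(f, p0, h=1e-4):
    p0 = np.asarray(p0, dtype=float)
    n = p0.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            def shifted(si, sj):
                p = p0.copy()
                p[i] += si * h
                p[j] += sj * h
                return f(p)
            # Central mixed second difference.
            H[i, j] = (shifted(+1, +1) - shifted(+1, -1)
                       - shifted(-1, +1) + shifted(-1, -1)) / (4 * h * h)
    return H

eigvals = np.linalg.eigvalsh(hessian_fd(objective, [0.0, 0.0, 0.0]))
print("eigenvalues:", np.round(eigvals, 3))
print("directions with eigenvalue <= 1.0:", int(np.sum(eigvals <= 1.0)))
```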


Generalizing the Measurement Model: Addressing a Challenge of Heterogeneity

Snap-switch dynamics are obscured, as in panel B, when the fluorescence measurement is averaged over the heterogeneous population of cells. By evaluating aEARM for multiple simulations (a sample size of K = 50) of the model parameters, we generated an in silico sample of apoptosis dynamics. As shown in Figure 3.4B, the resulting 95% credible region for predictions of the mean value of the BID truncation dynamics was tightly constrained around the ground-truth dynamics of aEARM.

The most heterogeneous population (shown in green in these figures) exhibits the most flattened dynamics (Figures 3.5B and 3.5C). The likelihood is intractable because the mechanistic model cannot be evaluated over the entire domain of the probability density function describing the heterogeneity (i.e., the domain is uncountably infinite). The size of the simulated sample set therefore determines the accuracy and reliability of the likelihood evaluation.

Accurate approximation of the likelihood function of a model of biological heterogeneity requires a sufficiently large sample of simulations of the random process or probability distribution that models the heterogeneity.
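A minimal sketch of that sampling strategy: draw K parameter values from an assumed heterogeneity distribution, simulate each "cell," and average the trajectories; the snap-switch function and the switch-time distribution are toy stand-ins for aEARM simulations.

```python
# Sketch of approximating population-level dynamics under heterogeneity:
# draw K parameter values from an assumed heterogeneity distribution,
# simulate each cell, and average. The sigmoidal "dynamics" are a toy
# stand-in for a mechanistic simulation such as aEARM.
import numpy as np

rng = np.random.default_rng(2)

def single_cell_trajectory(t, switch_time, steepness=0.05):
    """Toy snap-switch dynamics for one cell."""
    return 1.0 / (1.0 + np.exp(-steepness * (t - switch_time)))

t = np.linspace(0, 600, 121)
K = 50                                                    # simulated sample size
switch_times = rng.normal(loc=300.0, scale=60.0, size=K)  # heterogeneity

trajectories = np.array([single_cell_trajectory(t, s) for s in switch_times])
population_mean = trajectories.mean(axis=0)
# Averaging over a broad switch-time distribution flattens the snap-switch,
# as described in the text; a larger K makes the estimate more reliable.
```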

Conclusions

Researchers should therefore devise experiments that collect multiple measurements (regardless of the measurement type) if they intend to implement systems biology modeling methods.

References

  • A brief history of systems biology: "Every object that biology studies is a system of systems." François Jacob (1974).
  • Technical challenges of BioNEMS: integration of microfluidics, micro- and nanodevices, models and external control for systems biology.
  • Practical limits for reverse engineering of dynamical systems: a statistical analysis of sensitivity and parameter identifiability of systems biology models.
  • Measurement of the absolute magnitude and time course of mitochondrial membrane potential in primary and clonal pancreatic beta cells.
  • From qualitative data to quantitative models: analysis of the phage shock protein stress response in Escherichia coli.
  • A quantitative framework for assessing and discriminating large-scale patterns in systems biology (Doctoral dissertation, Massachusetts Institute of Technology).

Supplemental Figures for Chapter 2

  • Model parameters calibrated to an ordinal dataset: aEARM parameters calibrated to ordinal values of tBID and cPARP abundance at 60-s intervals; prior (blue) and posterior (orange) distributions of the log10 parameter values are shown.
  • Model parameters calibrated to an ordinal dataset: aEARM parameters calibrated to ordinal values of tBID and cPARP abundance at 180-s intervals; prior (blue) and posterior (orange) distributions of the log10 parameter values are shown.
  • Model parameters calibrated to an ordinal dataset: aEARM parameters calibrated to ordinal values of tBID and cPARP abundance at 300-s intervals; prior (blue) and posterior (orange) distributions of the log10 parameter values are shown.
  • Model parameters calibrated to an ordinal dataset: aEARM parameters calibrated to ordinal values of tBID and cPARP abundance at 1500-s intervals; prior (blue) and posterior (orange) distributions of the log10 parameter values are shown.
  • Model parameters calibrated to an ordinal dataset using Cauchy priors: aEARM parameters calibrated to ordinal values of tBID and cPARP abundance at 60-s intervals.

Necrotic Cell Death in a Nanophysiometer

List of Equations

