
Research Snapshots

December 7, 2017

Billy Bridges, Andrew Brown, Robert Lund, Chris McMahan, and Jim Peterson are among the faculty members in Mathematical Sciences with recent grant awards. These projects represent cutting-edge research in Analysis and Statistics, broadly construed. You can read more about these projects below.


Billy Bridges, Robert Lund, and Chris McMahan’s project “Modeling rice production and resistance to climate change in Indonesia” was funded by Biorealm for $21K. This project investigates Oryza sativa, or Asian rice, a staple food worldwide. To ensure food security for a growing population, there is a dire need both to identify existing high-yielding rice varieties that are resistant to climate change and to develop new ones. For example, the population of Indonesia, with an annual growth rate of 1.21%, is estimated to reach 337 million by 2050. With a consumption rate of approximately 139 kg of rice per capita per year, Indonesian production must increase to 47 million tons by 2050 to meet demand. Climate change, through rising temperatures, drought, and more frequent and/or prolonged flood events, complicates this task; it is estimated that rice yields will decline by approximately 7% for every degree Celsius increase in temperature. The goal of this project is to statistically analyze a rice yield database being compiled by the Bioinformatics and Data Science Research Center (BDSRC) at Bina Nusantara University (Jakarta, Indonesia). The analysis seeks to identify key factors (e.g., environmental, genetic, ancestral, and climatic) for rice yield.
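
As a quick back-of-the-envelope check, the demand figure quoted above follows directly from the population and consumption numbers in this paragraph:

    # Sanity check of the 2050 rice demand figure quoted above.
    population_2050 = 337e6   # projected Indonesian population
    per_capita_kg = 139       # annual per-capita rice consumption (kg)

    demand_tons = population_2050 * per_capita_kg / 1000  # kg -> metric tons
    print(f"Implied 2050 demand: {demand_tons / 1e6:.1f} million tons")
    # Prints 46.8 million tons, consistent with the ~47 million tons cited.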


Andrew Brown’s project “Simulation-Based Design of Polymer Nanocomposites for Structural Applications” with Sez Atamturktur (Civil Engineering) and Christopher Kitchens (Chemical and Biomolecular Engineering) was funded by the National Science Foundation (NSF) for $428K. Engineers and scientists routinely study systems driven by extremely complex processes that are only partially understood, but conducting physical experiments to better understand their behavior is prohibitively resource-intensive. As such, computer models are frequently used to predict the behavior of a system under a variety of conditions. In addition to conditions that are known and/or controllable in reality, most computer models take as inputs parameter values corresponding to physical constants or system characteristics. These parameter values must be specified in the code, despite their true values being unknown. Model calibration is the process of tuning such parameters to get computer models to agree with reality while quantifying the associated uncertainty. Calibrating a computer model is similar to engineering design in that the aim is to find values of certain parameters (the design parameters) so that the system outcomes most closely agree with the target data (the performance criteria). This similarity is the crux of this project. The aim of this research is to use model calibration principles to create a new paradigm for engineering material design and manufacturing, with particular application to designing low-cost, high-performance wind turbine blades.
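
To make the calibration idea concrete, here is a minimal sketch, not the investigators' actual method: a toy "computer model" with one unknown physical parameter is calibrated to noisy field data by computing a Bayesian posterior on a grid. The model form, parameter range, and noise level are all illustrative assumptions:

    import numpy as np

    # Toy "computer model": predicted response at condition x given parameter theta.
    def computer_model(x, theta):
        return theta * np.sin(x) + 0.5 * x

    # Hypothetical field data generated with "true" theta = 1.3 plus noise.
    rng = np.random.default_rng(0)
    x_obs = np.linspace(0, 3, 15)
    y_obs = computer_model(x_obs, 1.3) + rng.normal(0, 0.1, x_obs.size)

    # Grid-based Bayesian calibration: uniform prior, Gaussian likelihood.
    theta_grid = np.linspace(0.5, 2.0, 501)
    dtheta = theta_grid[1] - theta_grid[0]
    sigma = 0.1
    log_lik = np.array([
        -0.5 * np.sum((y_obs - computer_model(x_obs, t))**2) / sigma**2
        for t in theta_grid
    ])
    post = np.exp(log_lik - log_lik.max())
    post /= post.sum() * dtheta                       # normalize the density

    mean = np.sum(theta_grid * post) * dtheta
    sd = np.sqrt(np.sum((theta_grid - mean)**2 * post) * dtheta)
    print(f"posterior mean {mean:.3f}, posterior sd {sd:.3f}")

The posterior standard deviation is the quantified uncertainty about the tuned parameter; in the design analogy above, theta plays the role of a design parameter and y_obs the performance criteria.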


Traditional engineering design considers materials one at a time from a database of existing materials, choosing construction parameters to satisfy performance criteria. In this sense, the final design is constrained by the particular material used in construction. On the other hand, material design uses microscale composition and processing options to create entirely new materials that achieve a specific property enhancement, but the end use of the material might be entirely neglected. To address both of these limitations, the investigators are expressing the design problem as a model calibration problem that concurrently incorporates both macroscopic design criteria and the constraints imposed on possible designs by material properties.


Drawing on their previous work involving state-dependent computer model calibration (e.g., Brown and Atamturktur, 2018, Statistica Sinica), the investigators are taking a fully Bayesian approach to estimating the microscale material characteristics via a macroscale finite element computer model that predicts the performance. In addition to allowing direct incorporation of subject matter expert knowledge, this approach facilitates quantification of all sources of uncertainty. These sources include measurement error, the computer model itself (i.e., discrepancy between the computer model and reality due to missing physics), the use of a computationally cheap Gaussian process emulator in place of the computationally expensive finite element model, and uncertainty about the estimated parameters themselves. By explicitly incorporating this uncertainty into design decisions, the final design will be more robust to model misspecification and inexact performance predictions. Ongoing research questions include the most appropriate way to express the design criteria while accounting for cost, the effect of model discrepancy on the design, and how to extend statistical calibration methodology to accommodate simultaneous calibration of discrete and continuous parameters whose allowable values are interdependent.
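
The emulator idea can also be sketched briefly: a handful of runs of an "expensive" simulator are interpolated by a Gaussian process, whose predictive standard deviation quantifies the added emulation uncertainty. Everything here (the stand-in simulator, the squared-exponential kernel, the length-scale) is an illustrative assumption, not the project's actual finite element model:

    import numpy as np

    def expensive_simulator(x):
        # Stand-in for a costly finite element run.
        return np.sin(3 * x) + 0.3 * x

    def rbf(a, b, ls=0.3, var=1.0):
        # Squared-exponential covariance between input grids a and b.
        d = a[:, None] - b[None, :]
        return var * np.exp(-0.5 * (d / ls)**2)

    # A few "design" runs of the simulator.
    x_train = np.linspace(0, 2, 7)
    y_train = expensive_simulator(x_train)

    # GP predictive mean and variance at new inputs (zero prior mean,
    # small jitter on the diagonal for numerical stability).
    x_new = np.linspace(0, 2, 5)
    K = rbf(x_train, x_train) + 1e-8 * np.eye(x_train.size)
    K_star = rbf(x_new, x_train)
    mean = K_star @ np.linalg.solve(K, y_train)
    cov = rbf(x_new, x_new) - K_star @ np.linalg.solve(K, K_star.T)
    sd = np.sqrt(np.clip(np.diag(cov), 0, None))

    for x, m, s in zip(x_new, mean, sd):
        print(f"x={x:.2f}: emulator prediction {m:+.3f} +/- {s:.3f}")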


Robert Lund and Chris McMahan have had four projects funded by the Companion Animal Parasite Council: “2017-2018 Parasite Forecasting Agreement” for $34K, “2016-2017 Parasite Forecasting” for $71K, “Forecasting various canine vector-borne diseases within the conterminous United States” for $78K, and “CAPC 2015 Clemson University parasite forecasting” for $52K. These projects involve developing and vetting spatiotemporal statistical techniques to model and forecast future trends in several vector-borne canine diseases, e.g., ehrlichiosis, anaplasmosis, Lyme disease, and heartworm. To accomplish these goals, millions of diagnostic test results, conducted throughout the conterminous United States at the county level, have been compiled, along with putative risk factor data. This information is being used to build statistical models that forecast future trends in disease prevalence. Current work also seeks to extend these models to use canine disease spread as a sentinel for potential risk to humans in the case of Lyme disease.
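
As a highly simplified illustration of the forecasting idea (not the actual CAPC models, which are spatiotemporal and far richer), one can fit a temporal trend to simulated county-level test counts on the logit scale and extrapolate one year ahead:

    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated test data for one county: positives out of tests, by year.
    years = np.arange(2013, 2018)
    n_tests = rng.integers(500, 2000, size=years.size)
    true_prev = 0.04 + 0.005 * (years - years[0])     # slowly rising prevalence
    n_pos = rng.binomial(n_tests, true_prev)

    # Least-squares linear trend on the logit scale, then forecast 2018.
    p_hat = n_pos / n_tests
    logit = np.log(p_hat / (1 - p_hat))
    slope, intercept = np.polyfit(years, logit, 1)
    forecast = 1 / (1 + np.exp(-(slope * 2018 + intercept)))
    print(f"forecast 2018 prevalence: {forecast:.3f}")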


Chris McMahan’s project “Group testing for infectious disease detection: multiplex assays and back-end screening” with Joshua Tebbs (University of South Carolina) and Christopher Bilder (University of Nebraska-Lincoln) was funded by the National Institutes of Health (NIH) for $192K. This project studies testing individuals for infectious diseases, which is important for disease surveillance and for ensuring the safety of blood donations. When faced with questions of how to test as many individuals as possible while still operating within budget limits, public health officials are increasingly turning toward group testing (pooled testing). In these applications, individual specimens (such as blood or urine) are combined to form a single pooled specimen for testing. Individuals within negative pools are declared negative. Individuals within positive pools are retested in some predetermined algorithmic manner to determine which individuals are positive and which are negative. In low disease prevalence settings, this innovative testing process leads to fewer tests overall, which lowers costs compared to testing specimens individually. Previous research in group testing has focused largely on testing for infections, such as HIV and chlamydia, one at a time. However, motivated by the development of new technology, disease testing practices are moving toward multiplex assays that detect multiple infections at once. This project presents the first comprehensive extensions of group testing to a multiplex assay setting. The first goal is to develop new group testing strategies that allow multiplex assays to be used in sexually transmitted disease testing and blood donation screening applications, allowing laboratories to obtain the maximum possible cost savings through proper applications of group testing. The second goal is to develop new group testing strategies that increase classification accuracy, with both single and multiple infections, in these same applications. This will be done by performing directed confirmatory testing after individuals are initially classified as positive or negative. An overarching theme of this research is to acknowledge individual risk factors by incorporating them into the group testing process. In terms of biostatistical innovation, this research involves developing new classification and Bayesian modeling procedures for correlated latent-variable data.
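
The source of the cost savings can be made concrete with classical two-stage (Dorfman) group testing, the simplest such algorithm and not the specific multiplex strategies under development: a pool of k specimens uses one test if it is negative and k + 1 tests in total if it is positive, so the expected number of tests per individual is 1/k + 1 - (1 - p)^k for prevalence p:

    def dorfman_tests_per_person(p, k):
        # Expected tests per individual under two-stage pooled testing:
        # one pooled test shared by k people, plus k retests whenever
        # the pool is positive (probability 1 - (1 - p)**k).
        return 1 / k + 1 - (1 - p)**k

    p = 0.01                                 # low disease prevalence
    for k in (2, 5, 10, 20):
        e = dorfman_tests_per_person(p, k)
        print(f"pool size {k:2d}: {e:.3f} tests/person "
              f"({100 * (1 - e):.0f}% fewer than individual testing)")
    # At 1% prevalence, pools of 10 need about 0.196 tests per person,
    # roughly an 80% reduction relative to testing everyone individually.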


Jim Peterson recently received a 38-month, $380K grant from the Army Research Office to work on “Complex Models on Graph Based Topological Spaces.” The grant also supports student activity at both the undergraduate and graduate levels. The research uses graphs of computational nodes whose nodal and edge processing functions can be arbitrarily complex. The specific application domains are:


  1. autoimmune models, building on work already completed on West Nile virus infections. The nodes are immunosynapses, and the edge processing functions are based on T cell – pMHC interactions mediated by families of cytokine/chemokine signals. The immunosynapse computations are based on new models of affinity/avidity that try to understand how weak bindings can give rise to self-damage.


  2. consciousness models, building on work already completed on cognitive models. Predatory wasp – prey interactions provide key insights into the construction of anesthesia models to help prevent iZombie states in operating theaters. An iZombie state is one where a patient is aware of the operation even though appearing to be properly anesthetized. Nodal and edge computations are based on approximations of neuron processing and on ideas from homology and Betti decompositions of cytokine/chemokine signals. In addition, models of cognitive dysfunction are a logical consequence of these studies, as iZombie creation requires an alteration of a normal cognitive map. Hence, understanding how to assess iZombie states gives critical clues about other map changes.


  3. search models through topological spaces determined by graphs whose node and edge calculations are based on pseudo-fractal decompositions.


All three of these research thrusts are linked by viewing the consequences of computations as being shaped by the topology of the manifold determined by the graph and its input structure. Ideas from condensed matter physics that lead to an understanding of topological defects are an inspiration. The work also has a heavy computational component; a generic sketch of such a graph computation appears below.
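
The shared computational backbone, graphs whose node and edge processing functions can be arbitrary, can be sketched abstractly. The functions and topology below are purely illustrative placeholders, not Peterson's actual models:

    import numpy as np

    # A tiny directed graph: each node transforms its summed input, and each
    # edge transforms the signal passing along it; both can be arbitrarily complex.
    node_fns = {
        "A": lambda s: np.tanh(s),           # e.g., a saturating response
        "B": lambda s: np.maximum(s, 0.0),   # e.g., a thresholded activation
        "C": lambda s: s**2,
    }
    edge_fns = {
        ("A", "B"): lambda s: 0.8 * s,       # e.g., signal attenuation
        ("A", "C"): lambda s: s + 0.1,
        ("B", "C"): lambda s: 0.5 * s,
    }

    def evaluate(inputs, order=("A", "B", "C")):
        # Propagate signals through the graph in topological order.
        out = {}
        for node in order:
            incoming = sum(edge_fns[(u, v)](out[u])
                           for (u, v) in edge_fns if v == node and u in out)
            out[node] = node_fns[node](inputs.get(node, 0.0) + incoming)
        return out

    print(evaluate({"A": 1.0}))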