Abstracts of the talks
Undergraduate Students
- Marisa Akers and Meera Venkataraman: Evaluating radar reflectivity measurements as predictors of rainfall
- Ryan Anderson: Simultaneous Approximation of a Function and Its Derivative by Linear Splines
- Zhang Bailu and W. Spencer Leslie: When Difference Quotients are Not Enough: Calculus on a Semisimple Associative Algebra
- Lilian Cheung: The Estimating Functions Approach for the Log-ACD Model
- Chris Ehlman and Jeff Fischer: Problems on Nonlinear Modeling for Polymer-Based Composites and Nano-Composites
- Sarah Cummings and Ryan Durden: Bias in CMAQ Prediction of Ozone Concentration
- Kyle Fairchild and Michael Scruggs: Analysis of the Self Heating Phenomenon for Insulated and Conducting Systems
- Meagan Gentry and Austin John: Spatial Analysis to Predict Minimum Temperature and Initial Spring Freeze in the Southeast United States
- Candace Ghent: Computing a Network of Proteins Based on Functional Similarity
- Joseph Gibson: Factorization Properties of Congruence Monoids
- Caroline Hollingsworth and Amanda Miller: Estimating Explained Variation for an Underlying Linear Model Using Logistic Regression
- Noah Hughes: Permutations and Weyl Groups
- John Jacobson: Fibonacci and Lucas Identities by Means of Graphs
- Anna Johnson: Cell Growth in a Colon Crypt
- Kev Johnson, Daniel McElveen and Katelyn Sanders: Diffusive logistic equation with nonlinear boundary conditions and Sigma-shaped bifurcation curves
- Erik Kernfeld: Modeling Invasive Aspergillosis in Silico
- Douglas Lowell: Effect of Vitamin D Levels on Health and Wellbeing
- Erin Middlemas: Estimating Soliton Solutions to the Nonlinear Schrodinger Equation
- Chris Miles: Sensitivity Analysis of Discrete Biological Models Using Polynomial Dynamical Systems
- Alison Miller: A Comparison of Seminonparametric and Nonparametric Survival Estimation
- Steven Moffitt and Zane Troyer: Wavelet Deconvolution Techniques for Estimating Probability Density Functions
- Matthew Neal: A Colorectal Carcinogenesis Model Incorporating Insulin and Insulin-like Growth Factor 1
- Andrei Nicholson: Analysis of Noticeability and Suspiciousness in Rowe's Exposure Metric
- Thomas Parrish: Automated Tracking of Small Objects in Video Recordings
- Caitlin Ross: The Effect of Aging Structures on the Evolution of Cooperation
- Christopher R. Shill: Galois 2-adic Fields of Degree 12
- Tracy Spears Gill: An Application of Optional Unrelated-Question Randomized Response Technique
- Faith Spradlin: The Noncentral t-distribution
- David Sykes: Kleptoparasitic Interactions and Internal States
- Michael M Thomas: The Card Collector Problem
- Anna Tuck: Optional Unrelated-Question Randomized Response Models
Graduate Students
- Abraham Abebe: Interactive treatment planning in cancer radiotherapy
- Yolanda Baker: Analyzing Security Threats as Reported by United States Computer Emergency Readiness Team (US-CERT)
- Snehalatha Ballamoole: Spectral properties of Cesaro-like operators on weighted Bergman spaces
- Soumya Bhoumik: On The Automorphism Groups of Almost All Circulant (Di)graphs
- David Brandt: Gillespie's Algorithm
- Virginia Burger: Computational membrane detection in fluorescent cell images to quantify role of endosomes in signal transduction
- Yi Chen: Variance Components Estimation Using Bayesian Methods in Mixed Effect Models
- Melody Denhere: Robust Functional Logistic Regression
- Adam Eury: Host dependence of vector borne disease
- Rick Eugene Farr: On the Zeros of Derivatives of the Riemann Zeta Function on the Left Half Plane
- Leigha Felix: Proppant Effects on Oil and Gas Production
- Kelsey Gasior: A Multi-Scale Computational Model of the Epithelial-Mesenchymal Transition in Solid Tumors
- Lakshmi S Kalappattil: Uniqueness Result for Semipositone Problems on Exterior Domains
- Sweta Keshapagu: Analysis of Datasets for Network Traffic Classification
- Hyunju Kim: Exotic NURBS Geometrical Mappings in Isogeometric Analysis for Elasticity containing Singularities
- Swathi Kota: Evaluation of Rowe's Exposure Metric for Intruder Detection in Public Space
- Michelle McCullough: Birth Month Probability Scenarios
- Jonathan Merlini: Steinberg's multiplicity formula and maximal root
- Jacob Norton: Mathematical Modeling of Cardiovascular Regulation
- Archana Polisetti: Simultaneous Classification and Feature Selection for Intrusion Detection Systems
- Michael Robert: Assessing a Reduce and Replace Strategy for Reducing Dengue Transmission by Aedes aegypti
- Brandon Rupinski: Connectivity Index of Graphs
- Hwayeon Ryu: Feedback-Mediated Dynamics in a Model of Coupled Nephrons with Compliant Short Loop of Henle
- Weining Shen: A Bayesian Approach for Network Intrusion Detection Problems
- Nicholas Sizemore: Group Covers: Covering Numbers & Partition Numbers
- Andrew Snyder-Beattie: Decoupling Global Drivers of Latitudinal Species Gradients with a View Towards Astrobiology
- John Steenbergen: A Cheeger-Type Inequality on Simplicial Complexes
- Hong Tran: Modeling Chemical Reactions in Gene Networks
- Amanda Traud: What's the Queen Got to Do with It? Testing Queen Presence on Ant Social Network Structure
Interactive treatment planning in cancer radiotherapy
Abraham Abebe, UNCG, Greensboro, NC, mentored by Dr. Maya Chhetri
Abstract: Cancer is the second leading cause of death in the USA. Cancer patients are treated with radiotherapy since it has proven effective as a treatment for many cancer types. One such technique is Intensity Modulated Radiation Therapy (IMRT). The goal is to deliver a given amount of radiation (prescribed by the physician) to the tumor while limiting the amount of radiation absorbed by the healthy organs. In this talk we present an algorithm and numerical results for the treatment planning of cancer radiotherapy. The algorithm is based on the so-called moments approach, in which a given Dose Volume Histogram (DVH) is approximated with a set of constraints on the moments of the dose distribution.
Evaluating radar reflectivity measurements as predictors of rainfall
Marisa Akers and Meera Venkataraman, North Carolina State University, Raleigh, NC, mentored by Dr. Brian Reich and Gina-Maria Pomann
Abstract: To improve predictions of weather system models, it is important to have accurate measurements of precipitation at all locations. Actual amounts of rainfall have high variability across space and time, and patterns are generally unpredictable. Gauges measure rainfall, but only at specific locations. Therefore, a reliable prediction method for all locations in a given region is needed. One common method of predicting rainfall is to use measurements of reflectivity from radars. However, radar data is not directly comparable to gauge data because they measure reflectivity and actual precipitation amounts, respectively. The data analyzed contains 406 radar measurements covering about 62,000 square miles in Kansas for August 2004. We match these hourly readings to the 180 gauge stations in this region by the day and hour of measurement. Our main goal is to evaluate how radar reflectivity measurements can be used to predict precipitation. To address this goal, we examine zero-inflated regression models with precipitation as the response variable and radar reflectivity readings as a covariate.
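As a rough illustration of the two-part structure of such models (not from the talk; the data and coefficients below are synthetic stand-ins), a zero-inflated prediction can combine a logistic model for the chance of any rain with a regression for the amount when it rains:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic stand-ins for matched radar/gauge pairs (values are made up).
n = 500
reflect = rng.uniform(0, 60, n)                      # radar reflectivity (dBZ)
p_wet = 1 / (1 + np.exp(-(0.15 * reflect - 4)))      # true chance of any rain
rain = np.where(rng.random(n) < p_wet,
                np.exp(0.05 * reflect + rng.normal(0, 0.3, n)), 0.0)

X = np.column_stack([np.ones(n), reflect])
y = (rain > 0).astype(float)

# Part 1: logistic regression for P(rain > 0 | reflectivity).
def nll(beta):
    eta = X @ beta
    return np.sum(np.logaddexp(0, eta) - y * eta)

beta = minimize(nll, np.zeros(2)).x

# Part 2: ordinary least squares for log(rain) on the wet observations only.
wet = rain > 0
gamma, *_ = np.linalg.lstsq(X[wet], np.log(rain[wet]), rcond=None)

def predict_mean(r):
    """Expected rainfall: P(wet) times the conditional amount
    (lognormal bias correction omitted for simplicity)."""
    p = 1 / (1 + np.exp(-(beta[0] + beta[1] * r)))
    return p * np.exp(gamma[0] + gamma[1] * r)
```

Higher reflectivity raises both the estimated chance of rain and the conditional amount, so the combined prediction increases with reflectivity.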
Simultaneous Approximation of a Function and Its Derivative by Linear Splines
Ryan Anderson, Kennesaw State University, Kennesaw, GA, mentored by Dr. Yuliya Babenko
Abstract: Linear splines, in particular interpolating splines, are used to approximate a function given a discrete set of its values. Linear splines are widely used in many applications targeting geometric modeling of curves and surfaces, as piecewise linear functions are generally easy to work with. The concept of linear splines has been extended to bilinear (linear in each variable) and further to polylinear splines, with many results having been proved. In this talk, I will introduce the concept of spline interpolation and discuss new results on simultaneous approximation of a multivariate function (of certain smoothness) and its derivatives by linear splines, as well as present some results on the error of approximation. The work was done under the supervision of Dr. Yuliya Babenko.
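For the univariate case, the error behavior that motivates such results can be sketched numerically (an illustration only, not the talk's multivariate result): a linear spline interpolating sin(x) approximates the function to second order in the mesh size and its derivative, via the piecewise slopes, to first order:

```python
import numpy as np

f = np.sin                       # sample function to approximate
df = np.cos                      # its derivative

# Knots of the interpolating linear spline on [0, pi].
knots = np.linspace(0, np.pi, 21)
vals = f(knots)

x = np.linspace(0, np.pi, 1000)
spline = np.interp(x, knots, vals)            # piecewise-linear interpolant

# The spline's derivative is the piecewise-constant slope on each knot interval.
slopes = np.diff(vals) / np.diff(knots)
idx = np.clip(np.searchsorted(knots, x, side="right") - 1, 0, len(slopes) - 1)
spline_deriv = slopes[idx]

err_f = np.max(np.abs(spline - f(x)))         # O(h^2) for C^2 functions
err_df = np.max(np.abs(spline_deriv - df(x))) # O(h) for the derivative
```

With mesh size h = pi/20, the function error is bounded by roughly h^2/8 while the derivative error is only O(h), which is why simultaneous approximation of a function and its derivative is the harder problem.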
When Difference Quotients are Not Enough: Calculus on a Semisimple Associative Algebra
Zhang Bailu and W. Spencer Leslie, Liberty University, Lynchburg, VA, mentored by Dr. James Cook
Abstract: We study how to generalize differentiation of functions on the complex numbers to functions on a semisimple associative algebra. We show that the generalized Cauchy Riemann equations solve a certain generalization of Laplace's equation.
Analyzing Security Threats as Reported by United States Computer Emergency Readiness Team (US-CERT)
Yolanda Baker, NC A&T State University, Greensboro, NC, mentored by Dr. Rajeev Agrawal
Abstract: The 21st century has seen an enormous and almost sudden expansion in the use and types of technology. The use of cyber technology has allowed people and organizations worldwide to transact and communicate at speeds and in ways that were unimaginable twenty-five years ago. However, the cyber-age has also brought with it new ways to wage attacks, which have proven daunting to thwart. This presentation provides an overview of the number of high-impact security threats, vulnerabilities, and alerts that have been reported by the United States Computer Emergency Readiness Team (US-CERT) over the past five years. This presentation also explores the companies with the highest numbers of reports.
Spectral properties of Cesaro-like operators on weighted Bergman spaces
Snehalatha Ballamoole, Mississippi State University, MS, mentored by Dr. Len Miller
Abstract: Integral operators on function spaces have been studied from the very beginning of operator theory. We determine spectral properties of the operators C_{ν}f(z) = ∫_{0}^{z} dw on the weighted Bergman spaces of analytic functions on the disc. These operators C_{ν} are associated with semigroups of weighted composition operators introduced in the study of the classical Cesaro operator.
On The Automorphism Groups of Almost All Circulant (Di)graphs
Soumya Bhoumik, Mississippi State University, MS, mentored by Dr. Edward Dobson
Abstract: We show that almost all circulant graphs have automorphism groups as small as possible. Of the circulant (di)graphs that do not, the second author has conjectured that almost all are normal circulant (di)graphs. We show this conjecture is not true in general, but is true for circulant (di)graphs whose order lies in a "large" subset of the integers. We additionally explore the asymptotic behavior of the automorphism groups of circulant (di)graphs that are not normal, and show that no general conclusion can be obtained.
Gillespie's Algorithm
David Brandt, Kennesaw State University, Kennesaw, GA, mentored by Dr. Anda Gadidov
Abstract: Gillespie's algorithm is a stochastic method for simulating compartmental mathematical models. It has been implemented for a wide variety of chemical, physical, infectious-disease, and ecological models. The key to the method is to define the rate of each possible transition in a unidirectional compartmental model and to sum these rates to obtain the total rate parameter. Only one event is allowed per time step. Using the total rate as the parameter of an exponential distribution, a random draw determines WHEN the next event occurs. The next step is to determine WHAT the next event will be among the different possibilities. This is accomplished by drawing a random variable from the standard uniform distribution and partitioning the [0,1] interval according to the probabilities of the different possible transitions into or out of the compartments. The method is implemented here for the demographic Susceptible-Infectious-Recovered (SIR) system of differential equations for infectious-disease modeling. For large numbers of infected people and large population sizes, the stochastic outcomes converge toward the deterministic numerical solutions. However, for smaller populations and/or small initial numbers of infected people, different stochastic realizations can better simulate reality, including the extinction of the infection from the population.
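The WHEN/WHAT steps above can be sketched directly (a minimal Python sketch with illustrative parameter values, not the speaker's code):

```python
import numpy as np

def gillespie_sir(beta, gamma, S, I, R, t_max, rng):
    """One stochastic realization of the SIR model via Gillespie's algorithm."""
    N = S + I + R
    t, times, states = 0.0, [0.0], [(S, I, R)]
    while t < t_max and I > 0:
        rate_inf = beta * S * I / N      # S -> I transition rate
        rate_rec = gamma * I             # I -> R transition rate
        total = rate_inf + rate_rec
        # WHEN: exponential waiting time with the total rate as parameter.
        t += rng.exponential(1.0 / total)
        # WHAT: partition [0,1] by the relative event probabilities.
        if rng.random() < rate_inf / total:
            S, I = S - 1, I + 1
        else:
            I, R = I - 1, R + 1
        times.append(t)
        states.append((S, I, R))
    return np.array(times), np.array(states)

rng = np.random.default_rng(1)
times, states = gillespie_sir(beta=0.3, gamma=0.1, S=990, I=10, R=0,
                              t_max=200.0, rng=rng)
```

Each realization ends either when the infection goes extinct (I = 0) or when the time horizon is reached; repeating the run with different seeds produces the ensemble of stochastic trajectories described above.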
Computational membrane detection in fluorescent cell images to quantify role of endosomes in signal transduction
Virginia Burger, Joint CMU-Pitt PhD Program in Computational Biology, Pittsburgh, PA, mentored by Dr. Chakra Chennubhotla and Dr. Alexander Sorkin
Abstract: Ligand activation of epidermal growth factor receptors (EGFR) on the cell surface initiates a signaling cascade within the cell, leading to multiple processes involved in cell motility, proliferation, survival, and differentiation. It has been proposed that ligand-stimulated EGFR may trigger signaling not only from the plasma membrane but also from endosomes, which can use microtubular transport to travel long distances in the cell. The Sorkin lab has developed a system for monitoring the process of EGFR activation in space and time using HeLa cells that express a fluorescently labeled sensor of EGFR activity to track activated EGFR together with its activating ligands. This system yields three-dimensional time-series images of cells, allowing simultaneous visualization of activating ligands both binding and activating EGFRs at the cell surface, as well as within the cell after endocytosis of ligand-receptor complexes into endosomes. As the fluorescent signal captured by these images passes through entire cells, the images have a poor signal-to-noise ratio and identification of cell features is challenging. To quantify the existence of EGFR signaling from the endosomes, we must computationally segment individual cells and their membranes from the noisy images. By taking advantage of the time and space dependence between the 2D image slices, we form a statistical model which predicts the likelihood of each image pixel belonging to a plasma membrane or cell interior. Using the plasma membrane and interior masks, we then study the role of surface versus endosomal signaling for a variety of EGFR activator ligands.
Variance Components Estimation Using Bayesian Methods in Mixed Effect Models
Yi Chen, University of Georgia, Athens, GA, mentored by Dr. Chao Li and Dr. Chuck Miller
Abstract: Negative estimates of variance components, which should be non-negative by definition, are not rare in mixed effect models in practice. Statisticians have a hard time presenting a convincing explanation of such a result when they encounter this situation. Traditional methods such as REML cannot really solve the problem by constraining the result. Moreover, forcing the negative variance components to zero will sometimes produce misleading results, according to our simulations. Applying a Bayesian method to estimate variance components may avoid this problem because of the prior's setup. Bayesian analysis for mixed-effect models has been discussed over the past decades, but the literature on their variance components is sparse. We therefore ask whether a Bayesian method is applicable to our data (from the Merck manufacturing division), how it can be applied, and under what scenarios it has advantages or disadvantages compared to traditional methods. In this talk, a new Bayesian method to estimate variance components in mixed effect models using a Half-Cauchy prior will be introduced. Comparisons between the new method and traditional methods are made using both simulated datasets and practical datasets from the Merck manufacturing division.
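The phenomenon can be illustrated with a quick simulation (illustrative only, not from the talk): for a one-way random-effects model whose true between-group variance is zero, the moment-based (ANOVA) estimate (MSA - MSE)/reps comes out negative roughly half the time:

```python
import numpy as np

rng = np.random.default_rng(4)
groups, reps, trials = 6, 5, 2000
neg = 0
for _ in range(trials):
    # One-way layout with TRUE between-group variance equal to zero:
    # y_ij = mu + e_ij,  e_ij ~ N(0, 1).
    y = 10 + rng.normal(0, 1, size=(groups, reps))
    gm = y.mean(axis=1)
    msa = reps * np.sum((gm - y.mean()) ** 2) / (groups - 1)
    mse = np.sum((y - gm[:, None]) ** 2) / (groups * (reps - 1))
    # ANOVA (method-of-moments) estimator of the between-group variance.
    if (msa - mse) / reps < 0:
        neg += 1

frac_negative = neg / trials
```

Truncating these negative estimates to zero biases the estimator upward on average, which is one motivation for the Bayesian treatment described above.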
The Estimating Functions Approach for the Log-ACD Model
Lilian Cheung, University of Connecticut, Storrs-Mansfield, CT, mentored by Dr. Nalini Ravishanker
Abstract: In recent years, there has been interest in the use of statistical models to analyze durations between economic events. Two such statistical models are the Log-ACD1 and Log-ACD2 models proposed by Bauwens and Giot in 2000. The martingale estimating functions approach may be used to estimate the model parameters in the Log-ACD models (Thavaneswaran and Abraham, 1988). In addition, the combined estimating functions approach described by Liang et al. (2011) may be used to generate recursive estimates for the parameters, based on the combined estimating functions. In this talk, we will discuss these techniques with regard to the Log-ACD2 model.
Bias in CMAQ Prediction of Ozone Concentration
Sarah Cummings and Ryan Durden, North Carolina State University, Raleigh, NC, mentored by Dr. Brian Reich and Dr. Sujit Ghosh
Abstract: Global scale pollution is among the most controversial topics in society today. The decisions of scientists and policy makers rely heavily on the results of pollution research. Deterministic atmospheric chemistry models help us understand the potential impacts of policy decisions on future air pollution levels. Our goal is to generate a model that allows simulation of future air quality under different conditions and makes improvement on ozone concentration predictions. We accessed and modified a large-scale dataset containing various variables such as the actual measurements of ozone concentrations and CMAQ (Community Multiscale Air Quality) predictions of weather conditions from 82 sites. By selecting the most important variables, we generated a linear model to make predictions for ozone concentration. This would allow the EPA and other CMAQ users to more accurately predict ozone levels throughout the country. In model development, we utilized the statistical procedure of stepwise selection. Exploratory data analysis focused on both physical conditions of weather and chemical predictors for the model.
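Forward stepwise selection of the kind used here can be sketched on synthetic data (hypothetical variable names; AIC as the selection criterion is an assumption, since the abstract does not specify one):

```python
import numpy as np

def aic_ols(X, y):
    """AIC of an OLS fit (Gaussian likelihood, error variance profiled out)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + 2 * (k + 1)

def forward_stepwise(X, y, names):
    """Greedy forward selection: repeatedly add the variable that lowers AIC most."""
    n = len(y)
    chosen, current = [], np.ones((n, 1))    # start from intercept-only model
    best_aic = aic_ols(current, y)
    improved = True
    while improved:
        improved = False
        scores = [(aic_ols(np.column_stack([current, X[:, j]]), y), j)
                  for j in range(X.shape[1]) if j not in chosen]
        if scores:
            a, j = min(scores)
            if a < best_aic:
                best_aic, improved = a, True
                chosen.append(j)
                current = np.column_stack([current, X[:, j]])
    return [names[j] for j in chosen]

# Synthetic stand-in: "ozone" depends on temperature and the CMAQ prediction,
# but not on the pure-noise column.
rng = np.random.default_rng(5)
n = 300
temp, cmaq, noise = rng.normal(size=(3, n))
ozone = 2 + 1.0 * temp + 0.8 * cmaq + rng.normal(0, 0.5, n)
X = np.column_stack([temp, cmaq, noise])
selected = forward_stepwise(X, y=ozone, names=["temp", "cmaq", "noise"])
```

The genuinely predictive variables are picked up reliably, while an irrelevant column is added only when it lowers AIC by chance.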
Robust Functional Logistic Regression
Melody Denhere, Auburn University, Auburn, AL, mentored by Dr. Nedret Millor
Abstract: In this work, we discuss the estimation of the parameter function for a functional logistic regression model. We consider ways that allow for the parameter estimator to be resistant to outliers, in addition to minimizing multicollinearity and reducing the high dimensionality which is inherent with functional data. To achieve this, the functional covariates and functional parameter of the model are approximated in a finite dimensional space generated by an appropriate basis. This approach reduces the functional model to a standard multiple logistic model with highly collinear covariates and potential high dimensionality issues. The proposed estimator tackles these issues and also minimizes the effect of functional outliers. Results from a simulation study and a real world example are also presented to illustrate the performance of the proposed estimator.
Problems on Nonlinear Modeling for Polymer-Based Composites and Nano-Composites
Chris Ehlman and Jeff Fischer, Clemson University, Clemson, SC, mentored by Irina Viktorova
Abstract: Volterra's theory of hereditary phenomena has applications in modeling across many branches of mathematics, physics, and engineering. Volterra's equation of the second kind, known as Rabotnov's model, can be used to model the nonlinear relationship between stress, strain, and time exhibited by viscoelastics. Viscoelastics are specialized materials which respond to an applied stress by deforming in a manner that is both viscous, or flowing, in nature and elastic. Viscoelastics are thus known as "memory" materials and are ideal candidates for modeling with Volterra's equations. Two methods are used in this project to obtain a functional model of stress-related properties derived from experimental data sets of creep loading analysis of polymer-based materials. The first method uses a kernel of arbitrary order and a least-squares regression to create an optimized functional of stress, strain, and time. Second, the Laplace-Carson transform is applied to the data set to create a smoothed curve and a more robust functional relationship. Each method is then analyzed for robustness and accuracy with respect to the experimental data sets of polymer-based composites and nano-composites.
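A least-squares fit of this kind can be sketched as follows (synthetic data, and a simple power-law creep form as a hypothetical stand-in for the fractional-exponential kernels used in Rabotnov-type models):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical creep-compliance form: instantaneous strain plus a
# power-law (Abel) kernel term, a crude stand-in for the
# fractional-exponential kernels of the actual model.
def creep(t, eps0, c, alpha):
    return eps0 * (1 + c * t**alpha)

t = np.linspace(0.1, 100, 200)
rng = np.random.default_rng(2)
true = creep(t, 1.0, 0.4, 0.3)
data = true + rng.normal(0, 0.01, t.size)     # noisy "experimental" creep curve

# Nonlinear least squares recovers the creep parameters from the data.
params, _ = curve_fit(creep, t, data, p0=[0.5, 0.1, 0.5])
eps0, c, alpha = params
```

With noise small relative to the creep signal, the fitted parameters land close to the values used to generate the data, which is the kind of robustness check described in the abstract.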
Host dependence of vector borne disease
Adam Eury, UNCG, Greensboro, NC, mentored by Dr. Gideon Wasserberg and Dr. Clifford Smyth
Abstract: The role of the host is often ignored when modeling vector-borne diseases. This project investigated the effect of the vector's host-dependence (hereafter, vector-host coupling) on disease dynamics. Specifically, we examined how disease prevalence in host populations changes with host or vector abundance. We used an object-oriented-programming approach to simulate three vector-host coupling scenarios: un-coupled, using random movement of the vector (hypothetical); semi-coupled, where vectors seek hosts only for blood-meals (e.g., mosquitoes); and totally-coupled, where the vector requires contact with the host throughout its life-cycle (e.g., ticks). In all scenarios, prevalence decreased with host abundance as a result of a decrease in the vector-to-host ratio. In the un-coupled scenario, these relations held throughout the host abundance range. In contrast, in the totally-coupled scenario prevalence first increased and later decreased, while in the semi-coupled scenario prevalence remained constant at low host abundance and then decreased. These relations result from the vector's host-seeking behavior, which increases the connectivity of the host population at low densities and thus buffers the decrease in the vector-to-host ratio. Based on a preliminary literature analysis, the majority of papers addressing the effect of host abundance found a positive association, which is partially consistent with the novel predictions of our model.
On the Zeros of Derivatives of the Riemann Zeta Function on the Left Half Plane
Rick Eugene Farr, UNCG, Greensboro, NC, mentored by Dr. Sebastian Pauli
Abstract: Levinson and Montgomery have shown (assuming the Riemann Hypothesis) that each derivative of the Riemann zeta function has only finitely many zeros on the left half plane. Not much is known about their distribution. We present an algorithm for evaluating the derivatives of the zeta function on the left half plane and give a table of previously unknown zeros. The algorithm involves the evaluation of the derivatives of the Gamma function which we realized using polylogarithms. None of these functions are available in any computer algebra system. Our computations show an interesting behavior of the zeros of the derivatives of zeta, namely that they seem to lie on curves which are extensions of certain chains of zeros of derivatives of zeta on the right half plane.
Analysis of the Self Heating Phenomenon for Insulated and Conducting Systems
Kyle Fairchild and Michael Scruggs, Clemson University, Clemson, SC, mentored by Irina Viktorova
Abstract: In the scope of materials science, it is well understood that the mechanical behavior of a material is temperature dependent. The converse is also true, and for specific loading cases this contributes to a unique thermal failure mechanism known as “heat explosion”. The goal of this research is to improve the mathematical models for predicting heat explosion by using a specific case of the Fourier heat transfer system that focuses on the thermoviscoelastic properties of materials. This is done by using a computational analysis to solve for an internal heat parameter that determines thermal failure at a critical value. This critical value is calculated under conditions either accounting for or neglecting the heat dissipated by the material. This model is an improvement on existing models because it accounts for material-specific properties and in doing so limits the mathematical assumptions of the system. By limiting the assumptions in the conditions, the model becomes more accurate and useful with regard to material design.
Proppant Effects on Oil and Gas Production
Leigha Felix, University of Houston-Clear Lake, Houston, TX, mentored by Dr. Reinhard Laubenbacher
Abstract: There has been discussion in the oil and gas industry about whether the type of proppant used during the fracturing of wells has an impact on overall production. This study analyzes fracturing and oil and gas production data to examine whether the proppants have an effect on production and, if so, which ones. The data used in this analysis were acquired through EP Energy in Houston, TX. The proppants used in the fracturing process for this data set are three different types of sand: 100 Mesh, 30/50, and 20/40. We also looked at the combination of the three together. As a result of our analysis, 100 Mesh became an important proppant in the investigation, as it was discovered that not all oil and gas companies use this proppant. Using regression analysis and variable selection, we were able to evaluate and identify the important proppants for the yield of oil and gas production.
A Multi-Scale Computational Model of the Epithelial-Mesenchymal Transition in Solid Tumors
Kelsey Gasior, North Carolina State University, Raleigh, NC, mentored by Dr. Sudin Bhattacharya, Dr. Marlene Hauck, and Dr. Jason Haugh
Abstract: We propose a mathematical and simulation-based model to examine the relationship between intracellular signaling pathways and the cellular behavior associated with the Epithelial-Mesenchymal Transition (EMT) in solid tumors, a crucial step in the metastasis of cancerous cells. EMT is a process by which epithelial cells undergo a phenotypic change and acquire invasive and migratory properties characteristic of the mesenchymal phenotype. Several different signaling pathways can initiate this process, such as Wnt-Frizzled. Activation of this pathway stabilizes the beta-catenin protein, causing it to move to the nucleus and induce the expression of transcription factors such as Snail, Slug, and Twist. These factors in turn suppress the expression of the E-cadherin protein, which sequesters beta-catenin at the cell surface and is essential for maintenance of cell-cell adhesion and the epithelial phenotype. After the EMT process is complete, the newly formed mesenchymal cells can break through the basal membrane into the blood stream and migrate to other locations in the body. At these remote locations the cells can undergo a reverse process, the mesenchymal-epithelial transition (MET), which transforms them back into epithelial cells that form new secondary tumors. We have developed an ordinary-differential-equation based model of the intracellular signaling pathways leading to EMT, which is then embedded into a multicellular spatial agent-based model to examine how these intracellular pathways contribute to the phenotypic and behavioral changes characteristic of EMT at the tissue level. This multi-scale modeling approach is ideally suited for comparison to experimental observations regarding EMT in solid tumors.
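A toy caricature of the Snail/E-cadherin antagonism described above (illustrative equations and parameters, not the authors' model) shows how a signaling input can switch a cell from an epithelial to a mesenchymal expression state:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy mutual-antagonism ODE: a Wnt-like input W induces Snail,
# E-cadherin damps the signal (it sequesters beta-catenin), and
# Snail represses E-cadherin. All parameters are illustrative.
def emt_ode(t, y, W):
    snail, ecad = y
    d_snail = W / (1 + ecad**2) - snail      # induction damped by E-cadherin
    d_ecad = 1.0 / (1 + snail**2) - ecad     # E-cadherin repressed by Snail
    return [d_snail, d_ecad]

# Start both runs from the same epithelial state (no Snail, high E-cadherin).
lo = solve_ivp(emt_ode, (0, 50), [0.0, 1.0], args=(0.2,)).y[:, -1]
hi = solve_ivp(emt_ode, (0, 50), [0.0, 1.0], args=(5.0,)).y[:, -1]
```

With weak input the cell retains high E-cadherin (epithelial), while strong input drives Snail up and E-cadherin down (mesenchymal), mirroring the qualitative switch the full multi-scale model studies at the tissue level.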
Spatial Analysis to Predict Minimum Temperature and Initial Spring Freeze in the Southeast United States
Meagan Gentry and Austin John, North Carolina State University, Raleigh, NC, mentored by Dr. Brian Reich, Dr. Gina-Maria Pomann and Dr. Sujit Ghosh
Abstract: The goal of this paper is to provide a high-resolution map of daily predicted minimum temperatures over the Southeast US. From these predicted temperatures, we can determine the first date of spring freeze at any location on the map, which is valuable to the agricultural industry, among others. First, we introduce a spatial regression model on the daily minimum temperatures recorded by the gauge stations in West Virginia, North Carolina, and Tennessee. The spatial model not only incorporates certain climatic and geographical variables but also accounts for the spatial dependency among locations. By the same procedure, we also fit spatial models over longer time periods (weekly, monthly, and seasonal) in order to see how the models vary across time scales. We utilize simple spatial kriging to predict the daily minimum temperature at each point on the high-resolution map. The probability that the minimum temperature falls below 28 degrees Fahrenheit can then be estimated for each location over time under an assumption of normality. The map of these probabilities is useful in determining the expected first spring freeze date for different locations.
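The freeze-probability step can be sketched in a few lines (hypothetical kriging output; `pred_min` and `pred_se` are assumed values, not results from the paper):

```python
from scipy.stats import norm

# Hypothetical kriging output at one map location: predicted minimum
# temperature (deg F) and its prediction standard error (assumed values).
pred_min, pred_se = 33.0, 4.0

# Probability the true minimum falls below the 28 F freeze threshold,
# under a normal predictive distribution.
p_freeze = norm.cdf(28.0, loc=pred_min, scale=pred_se)
```

Evaluating this probability at every grid point, day by day, yields the freeze-probability map from which the expected first spring freeze date is read off.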
Computing a Network of Proteins Based on Functional Similarity
Candace Ghent, NCA&T, Greensboro, NC, mentored by Dr. Debra Goldberg
Abstract: To understand a biological system, it is not enough to understand the individual components. We must also understand how these components function together. Network models let us see patterns that help us understand complex associations amongst proteins. Nodes represent genes and/or proteins. Nodes are connected by an edge if they are associated in some way. We have used various network edge types, including protein interactions, gene regulation, gene co-expression, and similar phenotype. For this project, we want edges between proteins with similar functions. For two reasons, it is not straightforward to measure the similarity of protein function. First, proteins can have multiple functions. Second, each function is represented by a hierarchy of functions at various levels of specificity, so that some functions can be considered more similar than others. We investigated different ways to compute a functional similarity and decided to implement the Kappa statistic. We applied this measure to proteins associated with the Mediator complex to better understand variants of the complex.
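A minimal sketch of a kappa-based similarity between two hypothetical annotation vectors (the measure actually applied to hierarchical GO-style annotations is more involved):

```python
import numpy as np

def kappa(a, b):
    """Cohen's kappa between two binary annotation vectors
    (1 = protein annotated with that function term)."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                    # observed agreement
    p1, q1 = a.mean(), b.mean()
    pe = p1 * q1 + (1 - p1) * (1 - q1)      # agreement expected by chance
    return (po - pe) / (1 - pe)

# Hypothetical annotations over 8 function terms for two proteins.
prot_x = [1, 1, 0, 0, 1, 0, 0, 0]
prot_y = [1, 1, 0, 0, 0, 0, 0, 1]
```

Kappa corrects the raw agreement for what would be expected by chance, so two proteins sharing only very common annotations score lower than the raw overlap would suggest; an edge is added when the score exceeds a chosen threshold.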
Factorization Properties of Congruence Monoids
Joseph Gibson, University of Texas at San Antonio, San Antonio, TX, mentored by Dr. Vadim Ponomarenko
Abstract: Take a submonoid of the natural numbers which, when reduced modulo n, is multiplicatively closed. This submonoid is known as a Congruence Monoid (CM). Unlike the naturals, many CMs enjoy the property of non-unique factorization into irreducibles. This opens the door to the study of arithmetic invariants associated with non-unique factorization theory; most important to us will be the concept of elasticity. In particular, we give a complete characterization of when a given CM has finite elasticity. Throughout, we explore the arithmetic properties of the CM in terms of the arithmetic and algebraic properties of its generator.
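As a toy example of non-unique factorization lengths in a congruence monoid (this particular CM, the multiples of 4 together with 1, is chosen purely for illustration and is not necessarily the one studied in the talk):

```python
def in_M(x):
    """Membership in the congruence monoid M = {1} ∪ {x : x ≡ 0 (mod 4)}."""
    return x == 1 or x % 4 == 0

def is_irreducible(x):
    """x in M, x > 1, with no factorization x = d * (x // d), both factors in M \\ {1}."""
    if not in_M(x) or x == 1:
        return False
    return not any(x % d == 0 and in_M(d) and in_M(x // d)
                   for d in range(2, x))

def lengths(x):
    """The set of lengths of factorizations of x into irreducibles of M."""
    out = set()
    if is_irreducible(x):
        out.add(1)
    for d in range(2, x // 2 + 1):
        if x % d == 0 and is_irreducible(d) and in_M(x // d) and x // d > 1:
            out |= {1 + l for l in lengths(x // d)}
    return out

# 64 = 4 * 16 = 8 * 8 (length 2) and 64 = 4 * 4 * 4 (length 3),
# so its elasticity is max length / min length = 3/2.
rho_64 = max(lengths(64)) / min(lengths(64))
```

Here 4 and 8 are irreducible (neither splits into two multiples of 4), so 64 already exhibits factorizations of different lengths; the elasticity of the whole monoid is the supremum of such ratios over all elements.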
Estimating Explained Variation for an Underlying Linear Model Using Logistic Regression
Caroline Hollingsworth and Amanda Miller, James Madison University, Harrisonburg, VA, mentored by Dr. Dinesh Sharma
Abstract: The coefficient of determination, also known as the R^2 statistic, is used to assess the strength of the relationship between a response and explanatory variables in a linear regression model. It measures the proportion of variation in the response variable explained by a set of independent variables. In many real life applications, interest lies in modeling the relationship between a continuous response variable and a set of predictors. In practice, however, the continuous dependent variable of interest may not be observable and is represented by a binary proxy. In such situations, logistic regression is a popular choice. Many R^2-type statistics have been proposed to measure explained variation for logistic regression. The pseudo R^2 measure (R_L^2) stands out because of its intuitive interpretation and its independence of the proportion of successes in the sample. It, however, severely underestimates the proportion of explained variation (rho^2) in the variable underlying the binary indicator of event occurrence. In this research we present a method for estimating the explained variation for the underlying linear model using logistic regression analysis.
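The underestimation can be demonstrated on simulated data (a minimal illustration, not the authors' proposed estimator): generate a latent linear model, observe only its binary indicator, and compare the latent rho^2 with a likelihood-based pseudo R^2:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 2000
x = rng.normal(size=n)

# Underlying (unobserved) linear model; rho^2 is its explained variation.
signal = 1.5 * x
latent = signal + rng.normal(size=n)
rho2 = signal.var() / latent.var()

# Only the binary indicator is observed; fit logistic regression to it.
y = (latent > 0).astype(float)
X = np.column_stack([np.ones(n), x])

def nll(beta, X, y):
    eta = X @ beta
    return np.sum(np.logaddexp(0, eta) - y * eta)

ll_model = -minimize(nll, np.zeros(2), args=(X, y)).fun
ll_null = -minimize(nll, np.zeros(1), args=(np.ones((n, 1)), y)).fun

r2_L = 1 - ll_model / ll_null    # likelihood-based pseudo R^2 (McFadden form)
```

Even with a strong latent relationship, the pseudo R^2 computed from the binary proxy falls well short of the latent rho^2, which is the gap the proposed method aims to correct.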
Permutations and Weyl Groups
Noah Hughes, Appalachian State University, Boone, NC, mentored by Dr. Bill Cook
Abstract: Lie theory (the theory of Lie algebras and Lie groups) is important to many branches of mathematics and mathematical physics. Finite dimensional simple Lie algebras (over the complex numbers) are among the most important and best understood examples of Lie algebras. A simple Lie algebra's structure is determined by its "root system" (a collection of generalized eigenvalues associated with certain elements of the algebra). These root systems have beautiful geometric structures and are highly symmetric. The symmetry groups of these root systems are known as "Weyl groups".
In this talk we will describe the Weyl groups associated with simple algebras of type B_{n} (special orthogonal algebras so(2n+1)). In particular, we will present a set of permutations which generate the representation of the Weyl group corresponding to the so-called "minuscule" representation.
Claude Mitschi and Michael F. Singer developed a technique which constructs differential equations whose symmetry groups (i.e. differential Galois groups) are simple Lie groups, provided the corresponding Lie algebra possesses a minuscule representation whose permutation representation has a "strictly transitive set of permutation conjugacy classes". Using our generators we are able to show that such a set exists for simple Lie algebras of type B_{n} when n = 2, 3, 5, 7 (thus Mitschi and Singer's construction applies). In addition, we show that no such set exists when n = 4, 6, 8, 9, 10, 11.
Fibonacci and Lucas Identities by Means of Graphs
John Jacobson, Kennesaw State University, Kennesaw, GA, mentored by Dr. Joe DeMaio
Abstract: In 1982, Prodinger and Tichy defined the Fibonacci number of a graph G, i(G), to be the number of independent sets (including the empty set) of the graph. They did so because the Fibonacci number of the path graph P_n is the Fibonacci number F_{n+2}. Nelsen's Proofs Without Words series provides numerous visual arguments for several mathematical identities, some of which feature the Fibonacci sequence. In Proofs that Really Count, Benjamin and Quinn provide purely combinatorial proofs of several mathematical identities, some of which feature the Fibonacci sequence. This talk marries these visual and combinatorial approaches to prove Fibonacci and Lucas identities by means of graphs.
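The motivating fact, i(P_n) = F_{n+2}, can be checked by brute force. This short sketch (not from the talk) enumerates the independent sets of the path graph directly:

```python
from itertools import combinations

def independent_sets(n):
    """Count independent sets (including the empty set) of the path graph P_n,
    with vertices 0..n-1 and edges {i, i+1}."""
    count = 0
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            # A subset of a path is independent iff no two chosen vertices
            # are consecutive; combinations() yields sorted tuples.
            if all(b - a > 1 for a, b in zip(subset, subset[1:])):
                count += 1
    return count

def fibonacci(n):
    """F_0 = 0, F_1 = 1, F_2 = 1, ..."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Verify i(P_n) = F_{n+2} for small n.
for n in range(1, 10):
    assert independent_sets(n) == fibonacci(n + 2)
print("i(P_n) = F_{n+2} verified for n = 1..9")
```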
Cell Growth in a Colon Crypt
Anna Johnson, Winthrop University, Rock Hill, SC, mentored by Dr. Kristen Abernathy
Abstract: Cancer is the number two cause of death in America, according to the Centers for Disease Control and Prevention ("Deaths and Mortality," 2012). In particular, colorectal cancer is the second leading cause of cancer deaths in the United States. In this talk, we utilize the agent-based modeling software NetLogo to simulate how mutations in colon stem cells can lead to colorectal cancer. These simulations compare the behavior of healthy versus mutated cells in a colonic crypt, using bottom-up morphogenesis to describe how mutated cells spread to other crypts and lead to tumor development. We then present methods for incorporating this model as an interdisciplinary learning tool in an introductory data analysis or statistics course.
Diffusive logistic equation with nonlinear boundary conditions and Sigma-shaped bifurcation curves
Kev Johnson, Daniel McElveen and Katelyn Sanders, Auburn University Montgomery, Montgomery, AL, mentored by Dr. Jerome Goddard II
Abstract: Even though population models with diffusion have been the subject of research since the 1960s, little is known about their varied dynamics. In this talk, we study the structure of positive steady state solutions to a logistic population model with diffusion and grazing, i.e., a form of natural predation. We consider a relatively new direction: a population that satisfies a certain nonlinear boundary condition. We obtain one-dimensional results via the quadrature method and numerical computations using the software package Mathematica®.
Uniqueness Result for Semipositone Problems on Exterior Domains
Lakshmi S Kalappattil, Mississippi State University, MS, mentored by Dr. Ratnasingham Shivaji
Abstract: We study positive solutions of a nonlinear eigenvalue problem on exterior domains. In particular, we establish uniqueness results when a parameter is large. We prove our results by obtaining some a priori estimates through a careful analysis of the behavior of the solution.
Modeling Invasive Aspergillosis in Silico
Eric Kernfeld, Tufts University, Medford, MA, mentored by Dr. Reinhard Laubenbacher
Abstract: The mold Aspergillus fumigatus causes invasive infections in immunocompromised patients, killing victims in over 25% of treated cases. The main opposition to the fungus consists of epithelial cells, macrophages, and neutrophils in a staged innate immune response. There are multiple systems for fungal iron uptake and multiple immune reactions working to inhibit them; iron is a key limiting nutrient for the fungus. A hybrid, multiscale model of A. fumigatus infection was developed, with an agent-based model simulating interactions among cells and discrete models simulating iron metabolism inside the cells. Parameters were gathered from the literature on A. fumigatus. The model was validated for neutrophil-deficient and healthy patients, marking disease progression by the number of agents representing hyphae. The fungal agents spread within seven days of simulation time in a neutrophil-deficient patient model, whereas healthy patients cleared the infection. For further testing, the knockout strain ∆sidA was simulated, reproducing the finding that the sidA gene is crucial for virulence. The sidA gene is part of the A. fumigatus iron uptake system, and since the knockout was simulated by modifying the discrete model of the fungal iron system, this second validation scheme has the benefit of testing the assumptions linking the hybrid model's components. This model may help researchers generate hypotheses or design successful studies by providing a fast, inexpensive testing platform for potential experiments on invasive aspergillosis and iron dynamics.
Analysis of Datasets for Network Traffic Classification
Sweta Keshapagu, UNCG, Greensboro, NC, mentored by Dr. Shan Suthaharan
Abstract: Support Vector Machine (SVM) plays a major role in network traffic classification. SVM is a Machine Learning (ML) technique which uses labeled datasets for training and cross-validation purposes. SVM requires mathematical and statistical properties that truly represent the characteristics of the network traffic in order to perform efficiently. In this paper we present the results and findings of a study conducted using different network traffic datasets and modern SVM-based classifiers.
Exotic NURBS Geometrical Mappings in Isogeometric Analysis for Elasticity containing Singularities
Hyunju Kim, University of North Carolina at Charlotte, Charlotte, NC, mentored by Dr. Hae-Soo Oh
Abstract: NURBS (non-uniform rational B-spline) functions are tools for engineering design in CAD (computer-aided design), and the concept of Isogeometric Analysis (IGA) is to combine the two tasks of engineering design and analysis by employing NURBS as basis functions in finite element analysis. We introduce NURBS geometrical mappings based on IGA to deal with point singularities (cracks and jump boundary data) that arise in elliptic boundary value problems and elasticity. The proposed method makes it possible to independently control the radial and angular directions of the function to be approximated as far as the point singularities are concerned. We prove error estimates in Sobolev norms and demonstrate that the proposed mapping technique is highly effective for IGA of elliptic boundary value problems with singularities.
Evaluation of Rowe's Exposure Metric for Intruder Detection in Public Space
Swathi Kota, UNCG, Greensboro, NC, mentored by Dr. Shan Suthaharan
Abstract: Intruder activities in public space have increased significantly, but the technology to detect these activities is limited. Our recent studies on Rowe's Exposure Metric (REM), a technique to measure suspicious behavior, suggest its application may lead to efficient intruder detection in public space. In this research we studied the meaning of REM using statistical techniques and evaluated its suitability for intruder detection by simulation. The results and findings of this research are presented in this paper.
Effect of Vitamin D Levels on Health and Wellbeing
Douglas Lowell, Old Dominion University, Norfolk, VA, mentored by Dr. Norou Diawara
Abstract: Observational studies have shown that low vitamin D levels are associated with increased risks including all-cause mortality, diabetes, cancer, and periodontal disease. Therefore, when we performed our own statistical calculations on the NHANES III data linking 25-hydroxyvitamin D levels [25(OH)D] to mortality, we expected to replicate the previously published findings. When the participants are grouped by normalized 25(OH)D quartiles, however, all-cause mortality is actually increased in those with the highest 25(OH)D concentration. We tested the association of all-cause mortality with normalized quartiles of 25-hydroxyvitamin D levels in the NHANES III database. Vitamin D levels were collected in 13,331 nationally representative adults 20 years or older from 1988 through 1994, and the individuals were then followed through 2000. The paper presents a study of the correlation between all-cause mortality and normalized quartiles of 25(OH)D levels from participants in the NHANES III linked mortality files. Our findings help decipher the correlation structure of vitamin D data before and after the application of normalization procedures.
Birth Month Probability Scenarios
Michelle McCullough, UNCG, Greensboro, NC, mentored by Dr. Kumer Das
Abstract: This presentation is a twist on the familiar birthday problem in which we explore the probability of various scenarios involving birth months. Specifically, given that two people marry and have two children, what is the probability that the parents' birth months are different and the children's birth months match the parents', and how is this probability affected as the number of children increases?
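Assuming uniformly distributed, independent birth months, the two-child case can be computed exactly by exhaustive enumeration. Since "the children's birth months match the parents'" admits more than one reading, this illustrative sketch (not from the talk) computes two of them:

```python
from fractions import Fraction
from itertools import product

months = range(12)
total = 0
event_any = 0      # parents differ; each child is born in one of the parents' months
event_pairing = 0  # parents differ; the children's months are exactly the parents' two

# Enumerate all 12^4 equally likely (mother, father, child1, child2) month tuples.
for m1, m2, c1, c2 in product(months, repeat=4):
    total += 1
    if m1 != m2:
        if c1 in (m1, m2) and c2 in (m1, m2):
            event_any += 1
        if {c1, c2} == {m1, m2}:
            event_pairing += 1

print(Fraction(event_any, total))      # 11/432
print(Fraction(event_pairing, total))  # 11/864
```

Under either reading, the first factor is the 11/12 chance that the parents' months differ; the two readings then contribute 1/36 and 1/72 respectively for the children.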
Steinberg's multiplicity formula and maximal root
Jonathan Merlini, University of North Carolina Wilmington, Wilmington, NC, mentored by Dr. Dijana Jakelic
Abstract: Representation theory of Lie algebras is a very active research area with many interactions with other areas of mathematics as well as physics. Some of the basic building blocks of the theory are the irreducible finite-dimensional representations of finite-dimensional Lie algebras. Tensoring two such irreducible representations of a finite-dimensional Lie algebra results in a reducible representation which decomposes into a direct sum of irreducible ones. Describing this decomposition by computing the number of times an irreducible representation appears in the tensor product decomposition is one of the most important questions in combinatorial representation theory. This number is called the multiplicity of that representation in the tensor product. Several formulas can be applied to this problem, and Steinberg's multiplicity formula is one of them. Motivated by recent results concerning the computation of extension groups of finite-dimensional representations of affine Kac-Moody algebras, we focus our attention on the case in which one of the tensor factors is the adjoint representation of a given finite-dimensional simple Lie algebra (whose highest weight is the maximal root of the algebra). We find a connection between the multiplicity of the other tensor factor and the cardinality of the support of its highest weight. We prove our claim for the Lie algebra of type A_2. For the talk, I will assume an understanding of undergraduate linear algebra and will provide the necessary background.
Estimating Soliton Solutions to the Nonlinear Schrodinger Equation
Erin Middlemas, East Tennessee State University, Johnson City, TN, mentored by Dr. Jeff Knisley
Abstract: The nonlinear Schrodinger equation is a classical field equation that describes weakly nonlinear wave packets in one-dimensional physical systems. It belongs to a class of nonlinear partial differential equations that pertain to several physical and biological systems. In this project we apply a pseudospectral solution-estimation method to a modified version of the nonlinear Schrodinger equation as a means of searching for soliton solutions, where a soliton is a self-reinforcing solitary wave that maintains its shape over time. The pseudospectral method estimates solutions by utilizing the discrete Fourier transform to evaluate the spatial derivative within the nonlinear Schrodinger equation. An ODE solver is then applied to the resulting system of ordinary differential equations. We use this method to determine whether cardiac action potential states, which are perturbed solutions to the FitzHugh-Nagumo nonlinear partial differential equation, create soliton-like solutions. After finding soliton-like solutions, we use symmetry group properties of the nonlinear Schrodinger equation to explore these solutions. We also use a Lie algebra related to the symmetries to look for more solutions to our modified nonlinear Schrodinger equation.
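A minimal illustration of the pseudospectral approach, using the standard cubic NLS rather than the talk's modified equation and illustrative parameters throughout: the spatial derivative is evaluated with the discrete Fourier transform and the resulting ODE system is stepped with classical RK4, after which the known soliton u = sech(x)e^{it} should retain its shape.

```python
import numpy as np

# Pseudospectral method of lines for the focusing cubic NLS
#   i u_t + u_xx + 2|u|^2 u = 0,
# which admits the soliton u(x, t) = sech(x) e^{it}.
N, L = 256, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)       # spectral wavenumbers

def rhs(u):
    # u_t = i (u_xx + 2|u|^2 u); u_xx is evaluated via the DFT.
    u_xx = np.fft.ifft(-(k ** 2) * np.fft.fft(u))
    return 1j * (u_xx + 2 * np.abs(u) ** 2 * u)

u = 1 / np.cosh(x) + 0j                          # soliton initial profile
dt, steps = 1e-3, 500                            # integrate to t = 0.5 with RK4
for _ in range(steps):
    s1 = rhs(u)
    s2 = rhs(u + 0.5 * dt * s1)
    s3 = rhs(u + 0.5 * dt * s2)
    s4 = rhs(u + dt * s3)
    u = u + dt / 6 * (s1 + 2 * s2 + 2 * s3 + s4)

# The soliton only rotates in phase, so its modulus should still be sech(x).
error = np.max(np.abs(np.abs(u) - 1 / np.cosh(x)))
print("max modulus error:", error)
```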
Sensitivity Analysis of Discrete Biological Models Using Polynomial Dynamical Systems
Chris Miles, Lafayette College, Easton, PA, mentored by Dr. Reinhard Laubenbacher and Dr. Franziska Hinkelmann
Abstract: Discrete biological models are often intuitive to construct and able to reveal the qualitative dynamics of a complex system. Sensitivity analysis, which provides insight into the effect of perturbations and uncertainty in a system, is typically regarded as essential in the modeling process. While methods for performing sensitivity analysis of continuous models have been studied extensively, far fewer analogous methods exist for discrete models. In this presentation, a novel method is proposed for performing sensitivity analysis on discrete models, based on analogous continuous-model techniques. The method quantifies sensitivity by artificially introducing unknown parameters into the model and comparing the resulting dynamics to the original model. A mathematical framework, namely polynomial dynamical systems, is used to algebraically compute the dynamics of models with unknown parameters, a computation that might otherwise be computationally infeasible without the developed theory. An implementation of the algorithm is publicly available as a Macaulay2 package and was applied to published gene regulatory networks to provide a benchmark for the sensitivity of discrete biological models.
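For intuition, a polynomial dynamical system over F_2 is a Boolean network whose update rules are written as polynomials. This toy example (the rules are illustrative, not from any published model) computes the synchronous state space and its fixed points exhaustively:

```python
from itertools import product

# A toy polynomial dynamical system over F_2 on three variables.
def step(state):
    x1, x2, x3 = state
    return (
        (x2 * x3) % 2,             # f1 = x2 x3            (x2 AND x3)
        (x1 + x3 + x1 * x3) % 2,   # f2 = x1 + x3 + x1 x3  (x1 OR x3)
        x1,                        # f3 = x1               (copy of x1)
    )

# Exhaustively map out the synchronous state space and mark fixed points.
for s in product((0, 1), repeat=3):
    t = step(s)
    marker = "  <- fixed point" if t == s else ""
    print(s, "->", t, marker)
```

Enumerating all 2^n states is only feasible for small n; the point of the algebraic framework in the talk is precisely to avoid this exhaustive computation.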
A Comparison of Seminonparametric and Nonparametric Survival Estimation
Alison Miller, Elon University, Elon, NC, mentored by Dr. Kirsten Doehler
Abstract: Choosing between methods to estimate the survival function can be an arduous task. We examine two procedures for estimating the survival function in the presence of right-censored data. The most frequently used survival estimation method for data of this type is the Kaplan-Meier nonparametric estimator. The convenience of bypassing any assumptions about the distribution of the data and the easy implementation in statistical software are what make the Kaplan-Meier estimator so popular. We use R software to run simulations comparing the Kaplan-Meier estimator and a seminonparametric (SNP) method. The SNP survival estimation technique is based on the SNP density, defined as a squared polynomial multiplied by a given base distribution. The polynomial portion of the SNP density can be adjusted through a tuning parameter. We report results from our simulation studies and demonstrate an application to a real data set.
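The Kaplan-Meier estimator itself is a short computation: at each event time t_i the survival estimate is multiplied by (1 - d_i/n_i), where d_i is the number of events and n_i the number at risk. A minimal sketch (in Python rather than the talk's R, on a made-up right-censored dataset):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate from right-censored data.
    times: observed times; events: 1 = event, 0 = censored.
    Returns {event time: S(t)}.  At tied times, events are processed
    before censorings, per the usual convention."""
    data = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    n_at_risk = len(data)
    surv, s = {}, 1.0
    for t, d in data:
        if d == 1:
            s *= 1 - 1 / n_at_risk   # factor (1 - d_i/n_i), one event at a time
            surv[t] = s
        n_at_risk -= 1
    return surv

# Five subjects; censoring at times 3 and 5.
S = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
print(S)  # S(1) = 0.8, S(2) = 0.6, S(4) = 0.3
```

Note how the censored observation at time 3 does not drop the curve but does shrink the risk set, which is why S falls from 0.6 to 0.3 at time 4.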
Wavelet Deconvolution Techniques for Estimating Probability Density Functions
Steven Moffitt and Zane Troyer, Clemson University, Clemson, SC, mentored by Irina Viktorova
Abstract: Estimating probability density functions is a valuable but arduous task because of the infinite-dimensional nature of functions and the large data sets required. It becomes even more problematic when the observed data are not 'pure', in other words, contaminated with arbitrary noise. Integral transforms have been used in the deconvolution of noisy data in results by Dr. Taylor and Dr. Lee (Clemson University, 1989). Their research utilized the Fourier transform to create a consistent probability density function estimator for the discrete and continuous cases. Later, Taylor and Lee postulated that the relatively new wavelet transform could be used in place of the Fourier transform to provide analogous results. It was noted that the windowing component of the wavelet kernel could improve computation times and potentially approximate the probability density function more precisely. Presently, research is being conducted by a cross-disciplinary undergraduate team of the Creative Inquiry Class on the evolution of integral transforms and their applications in scientific fields. Based on the aforementioned work of Taylor and Lee, we seek to utilize the wavelet transform to estimate probability density functions. Given an arbitrary set of data points, we employ the wavelet transform to extract and remove convoluted factors (noise) while keeping the relevant (pure) data, and thus determine an appropriate probability density function. We also analyze this method in comparison with results from Taylor and Lee's Fourier transform method. Numerical results and graphs will be presented to show how probability density functions can be recovered from convoluted random data.
Analysis of Noticeability and Suspiciousness in Rowe's Exposure Metric
Andrei Nicholson, UNCG, Greensboro, NC, mentored by Dr. Shan Suthaharan
Abstract: Video surveillance is an increasingly open problem in the security field, where we would like "suspicious" behavior to be distinguished. Using Rowe's exposure metric, agents in a sensor field can be ranked for deceptiveness based on several measurements. Noticeability and suspiciousness are two of the most important metrics in measuring the exposure of a deceptive agent. Our research analyzes the effectiveness of these two metrics using statistical techniques and simulated datasets. In this paper our results and findings are discussed in detail.
A Colorectal Carcinogenesis Model Incorporating Insulin and Insulin-like Growth Factor 1
Matthew Neal, Winthrop University, Rock Hill, SC, mentored by Dr. Joseph Rusinko, Dr. Kristen Abernathy and Dr. Zach Abernathy
Abstract: Edward Giovannucci proposes that variation in insulin and insulin-like growth factor 1 (IGF-1) levels influence colonic carcinogenesis. To study these proposed effects, we develop a system of linear ordinary differential equations to model the human colon on an intracellular level, incorporating insulin and IGF-1 and their effects on mutated cell populations. In particular, we focus on the insulin-dependent and independent intracellular signaling pathways and how they influence programmed cell death and growth. We consider the dynamics of all colorectal crypts using a compartmental approach, accounting for stem cells, transit cells and differentiated cells. With this model in place, we determine how changes in insulin and IGF-1 levels affect mutated cell growth. Using Wolfram SystemModeler, we show that high levels of insulin increase the number of cells which resist apoptosis and can lead to the growth of tumors. Our model also tests parameters simulating Familial Adenomatous Polyposis (FAP), a hereditary condition in which stem cells have a mutation at birth. Simulating these conditions, we found that IGF-1 levels noticeably affect the number of mutated cells found after eighty years.
Mathematical Modeling of Cardiovascular Regulation
Jacob Norton, North Carolina State University, mentored by Dr. Mette Olufsen
Abstract: Understanding the cardiovascular regulatory apparatus is crucial for gaining insight into both the physiology of healthy individuals and the effect of pathologies on individuals. In order to maintain adequate oxygenation of all tissues, the cardiovascular system maintains blood flow and pressure at an approximately constant level. To sustain blood flow and oxygen transport, a number of control mechanisms regulate vascular resistance, compliance, pumping efficiency, and frequency. One important contributor to cardiovascular control is the baroreflex (or baroreceptor reflex), which uses mechano-sensitive baroreceptor neurons located primarily in the aortic arch and carotid sinuses. These neurons are stimulated by changes in blood pressure and contribute to short-term regulation of vascular efferents including heart rate, cardiac contractility, and vascular tone. Unfortunately, many mathematical models of baroreceptor dynamics, going back to the 1950s, are not biologically motivated. Here we propose a new biologically motivated model which reflects all known static and dynamic properties of baroreceptors, including saturation, threshold, post-excitatory depression (PED), adaptation, and rectification.
Automated Tracking of Small Objects in Video Recordings
Thomas Parrish, UNCG, Greensboro, NC, mentored by Dr. Sebastian Pauli and Dr. Matina Kalcounis-Ruppell
Abstract: The automated analysis of videos has many applications, such as video surveillance of traffic or people. It can also be used to process video recordings of animals in the wild. One of the fundamental methods of video analysis is the tracking of moving objects. Automated video tracking methods involve the processes of isolating foreground information, identifying individual foreground components, and tracking these components over time. We will investigate each process from an algorithmic perspective, covering both simple and advanced foreground isolation methods, object identification by connected-component contour tracing, and naive object tracking. As a specific example, we discuss the application of these methods to infrared video recordings of free-living mice, which we used to extract and analyze behavioral information from remotely recorded video. In this application we had to overcome two challenges: 1) the video was recorded with an infrared camera, which yields greyscale output with no absolute temperature calibration; 2) the camera was suspended 10 m above the ground, which makes the mice very small objects in the frame. From the tracking data we extracted the average speed and the total distance traveled as measures of mouse activity and found that average speed is predicted by moonlight and wind speed.
Simultaneous Classification and Feature Selection for Intrusion Detection Systems
Archana Polisetti, UNCG, Greensboro, NC, mentored by Dr. Shan Suthaharan
Abstract: We recently proposed a simultaneous classification and feature selection (SCFS) algorithm to classify both network traffic and features, where the features describe the characteristics of the network traffic. It uses statistical variance, the runs test, and statistical distribution techniques for the classification of traffic and features. However, SCFS failed to classify some features. In this research we used linear transformations to map the features into a statistical domain and performed Support Vector Machine analysis to classify them. The results and findings of this research are presented in this paper.
Assessing a Reduce and Replace Strategy for Reducing Dengue Transmission by Aedes aegypti
Michael Robert, North Carolina State University, Raleigh, NC, mentored by Dr. Alun L. Lloyd and Dr. Fred Gould
Abstract: In the last decade, a number of novel strategies for controlling the principal dengue vector, the mosquito species Aedes aegypti, have been proposed and investigated. Among these are strategies involving genetically modified mosquitoes (GMMs), which typically have one of two general goals: population replacement or population reduction. In this presentation, we propose and evaluate the potential of a GMM strategy which combines these two goals. This strategy, henceforth referred to as Reduce and Replace (R&R), aims to introduce mosquitoes with a single genetic construct composed of two transgenes: one that induces female-specific lethality and one that renders its carriers incapable of transmitting the dengue virus. Through numerical simulations of an ordinary differential equation model, we study the effect that releases of R&R mosquitoes have on a wild-type population. We compare the efficacy of strategies that involve the release of R&R mosquitoes in concert with other GMMs that carry lethal genes or only anti-pathogen genes. We find that the continued release of R&R mosquitoes alone can successfully replace a wild-type population with one that cannot transmit dengue fever if there is no fitness disadvantage associated with the transgenes involved. If there is a fitness cost associated with carrying the transgenes, continued release of R&R mosquitoes would be required to maintain a low frequency of competent vectors. We find that introducing mosquitoes with lethal genes only before introducing R&R mosquitoes does not, in general, lead to a lower frequency of competent vectors than other strategies.
Our model suggests that a release of R&R mosquitoes followed by a release of mosquitoes carrying only an anti-pathogen gene lowers the number of competent vectors more than any other single or combined strategy we consider; however, the R&R strategy on its own underperforms when compared to a combined strategy of R&R and anti-pathogen only mosquitoes. We discuss the R&R strategy as a component of integrated pest control and motivate the need for further assessment of the utility of this strategy before testing of its efficacy begins.
The Effect of Aging Structures on the Evolution of Cooperation
Caitlin Ross, UNCG, Greensboro, NC, mentored by Dr. Jan Rychtar and Dr. Olav Rueppell
Abstract: The evolution of altruistic behavior is intriguing because selfish actions often seem to benefit the individual. Game theorists often model this predicament with the Prisoner's Dilemma. Two players simultaneously decide on a strategy, and though a defector whose partner cooperates will benefit most, mutual defection yields the worst consequences for both players. When this game is used to study the evolution of cooperation, it has been shown that spatial structure favors altruistic behavior. We study how life stages affect the evolution of cooperation in a spatially structured, aging population. The spatial structure of the model allows for the evolution of cooperation in otherwise inherently selfish populations. We examine how changing the existence and length of pre-reproductive, reproductive, and post-reproductive stages of life affects the evolution of altruism. Using computer simulation, we show that in general, a proportionally long reproductive stage allows cooperators to thrive best. The existence of any non-reproductive stage suppresses cooperation, with a post-reproductive stage doing so more substantially than a pre-reproductive one. Therefore, our study suggests that cooperation evolves most readily in populations with a simple life history, at least when interactions between different life stages are symmetrical.
Connectivity Index of Graphs
Brandon Rupinski, Western Carolina University, Cullowhee, NC, mentored by Dr. Risto Atanasov
Abstract: The connectivity index (or Randić index) is a topological invariant introduced by Milan Randić in 1975, suitable for measuring the extent of branching of the carbon-atom skeleton of saturated hydrocarbons. There is a good correlation between the connectivity index and several physico-chemical properties of alkanes: boiling points, chromatographic retention times, surface areas, etc. A turning point in the mathematical examination of the connectivity index came in the second half of the 1990s. In this presentation we will discuss the connectivity index of graphs with cut vertices.
Feedback-Mediated Dynamics in a Model of Coupled Nephrons with Compliant Short Loop of Henle
Hwayeon Ryu, Duke University, Durham, NC, mentored by Dr. Anita Layton
Abstract: The nephron in a rat kidney regulates its fluid capacity, in part, by a negative feedback mechanism known as tubuloglomerular feedback (TGF), which mediates oscillations in tubular fluid pressure, flow, and NaCl concentration. However, the tubular fluid flow found in the nephrons of spontaneously hypertensive rats (SHR) can exhibit highly irregular oscillations. In this study, we developed a mathematical model of short-looped nephrons coupled through their TGF systems to investigate the extent to which internephron coupling contributes to the emergence of regular or irregular flow oscillations. Using a characteristic equation derived from the equations for a model of two coupled nephrons, we conducted a bifurcation analysis of the TGF model equations. An analysis of that characteristic equation yields a number of parameter regions, indicating the potential for different model dynamic behaviors. Numerical simulations revealed a variety of behaviors in these regions. To attain a complete understanding of the impact of parameter variability, we investigated three cases: i) two coupled nephrons with identical parameters, ii) only one nephron with varying parameters, and iii) identical TGF gains but varying delays. The model results suggest that the stability of the TGF system is reduced by internephron coupling. Based on the information provided by the characteristic equation, we also identified parameters for which the model predicts irregular tubular flow oscillations of considerable complexity, which may explain the emergence of irregular oscillations in hypertensive rats.
A Bayesian Approach for Network Intrusion Detection Problems
Weining Shen, North Carolina State University, Raleigh, NC, mentored by Dr. Shan Suthaharan
Abstract: Anomaly-based approaches for intrusion detection problems have been well studied in the past decade. Attacks are identified by comparing collected data to selected features of normal traffic and evaluating their differences through certain statistical models. We propose a Bayesian approach to select relevant features, analyze the intrusion datasets, and make predictions of attacks. An additive model was considered to study the impact of features on the probability of attack, and a computationally fast Bayesian algorithm was implemented. We apply our method to the KDD dataset and obtain better performance in terms of speed and accuracy than classical methods.
Galois 2-adic Fields of Degree 12
Christopher R. Shill, Elon University, Elon, NC, mentored by Dr. Chad Awtrey
Abstract: An important problem in computational number theory is to classify all finite extensions of the p-adic numbers by computing important invariants which define each extension. Current research has focused on computing Galois groups of these extensions up to degree 11. Consequently, for this talk we will focus on degree 12 extensions. We will begin with a brief overview of p-adic numbers and conclude by discussing a method for calculating Galois groups of Galois extensions of the 2-adic numbers.
Group Covers: Covering Numbers & Partition Numbers
Nicholas Sizemore, Western Carolina University, Cullowhee, NC, mentored by Dr. Tuval Foguel
Abstract: In this talk, we will present preliminary work investigating the relationship between group covers and partitions. Of specific interest are the covering number, defined to be the minimal number of subgroups necessary to form a cover, and its relationship to the partition number, defined as the minimal number of subgroups needed for a partition. Conjectures concerning the conditions under which these numbers are the same and/or different are presented, along with preliminary work towards verifying these conjectures.
Decoupling Global Drivers of Latitudinal Species Gradients with a View Towards Astrobiology
Andrew Snyder-Beattie, North Carolina State University, Raleigh, NC, mentored by Dr. Kevin Gross
Abstract: Species diversity is highest in the equatorial regions of our planet and declines as one moves toward the poles. Much of the ecological literature suggests one of two primary drivers behind this latitudinal gradient in species richness: the amount of sunlight or the amount of climatically consistent area. Gliese 581 is a star that provides a unique reframing of these two hypotheses. Rather than rotating like Earth, some planets of Gliese 581 are believed to be tidally locked, meaning that one side of the planet faces the star at all times. To investigate these two major ecological hypotheses and to examine hypothetical biospheres on other planets, we place classic Lotka-Volterra competition dynamics on a spherical lattice to show how both area and sunlight can influence the global distribution of species. Preliminary results suggest that area has a stronger effect than solar input when species diversity is controlled by simple resource competition. We discuss the implications of this result for global species patterns on both Earth and tidally locked planets.
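The single-patch dynamics underlying the lattice model above can be sketched with the classic two-species Lotka-Volterra competition equations. This is a minimal sketch only: the spherical-lattice coupling and any sunlight-dependent parameters are omitted, and all parameter values below are illustrative assumptions rather than those used in the study.

```python
def lv_competition(n1, n2, r1, r2, K, a12, a21, dt=0.01, steps=200_000):
    """Euler-integrate two-species Lotka-Volterra competition dynamics.

    n1, n2   : initial population densities
    r1, r2   : intrinsic growth rates
    K        : shared carrying capacity (a stand-in for local area/sunlight)
    a12, a21 : competition coefficients (effect of the other species)
    """
    for _ in range(steps):
        dn1 = r1 * n1 * (1 - (n1 + a12 * n2) / K)
        dn2 = r2 * n2 * (1 - (n2 + a21 * n1) / K)
        n1 += dt * dn1
        n2 += dt * dn2
    return n1, n2
```

With symmetric competition coefficients below 1, both species converge to the coexistence equilibrium K/(1 + a); coefficients above 1 instead produce competitive exclusion.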
An Application of Optional Unrelated-Question Randomized Response Technique
Tracy Spears Gill, UNCG, Greensboro, NC, mentored by Dr. Sat Gupta
Abstract: Subjects often provide untruthful responses when asked sensitive or incriminating questions. This response bias leads to inaccurate estimation of population parameters such as the mean. Randomized Response Technique (RRT) has been shown to decrease response bias in surveys of sensitive information (Warner, 1965; Greenberg et al., 1969). One variation of RRT, the Optional Unrelated-Question RRT model, may be used to simultaneously estimate the mean of the sensitive behavior and the proportion of subjects who would not be comfortable answering the question directly (the Sensitivity Level). This model has been shown to provide good estimates of the mean and the Sensitivity Level (Gupta et al., 2002), but it has not previously been applied to studies of human subjects. We test the model through surveys of undergraduate college students about sensitive behaviors. We ask one question requiring a quantitative response and one requiring a binary response, and we collect basic demographic information. The same survey questions are administered using three different methods. One is direct face-to-face questioning, which has a very high response rate but also a high response bias. Another, anonymous surveys, has a low response bias but a low response rate. We hypothesize that response bias will be lower with the proposed method than with direct face-to-face questioning, and that the response rate will be higher than with the anonymous survey method. The proposed method also provides an estimate of the Sensitivity Level, which the other methods do not. Knowledge of the Sensitivity Level is important because it allows researchers to assign better-trained interviewers to surveys involving highly sensitive questions.
The Noncentral t-distribution
Faith Spradlin, Kennesaw State University, Kennesaw, GA, mentored by Dr. Anda Gadidov
Abstract: As is well known, one of the common applications of Student's t-distribution is making inferences about the mean of a normal population when the sample size is relatively small. A topic of interest in hypothesis testing is the power of the test, which gives the probability of correctly rejecting the null hypothesis when it is false. Evaluating the power of the test involves the noncentral t-distributions, which have not been studied as extensively as the central t-distribution. Using simulations in SAS, we investigate properties of the noncentral t-distributions and how the power of the test depends on the value stated by the null hypothesis and the size of the sample.
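The simulation idea described above can be illustrated with a short Monte Carlo sketch (written here in Python rather than SAS). The sample size, critical value, and normal alternative below are illustrative assumptions, not the settings of the study: under the alternative, the t-statistic follows a noncentral t-distribution, and the rejection rate estimates the power.

```python
import math
import random
import statistics

def t_test_power(mu, n=10, crit=2.262, trials=20_000, seed=1):
    """Monte Carlo power of a two-sided one-sample t-test of H0: mean = 0.

    Samples are drawn from N(mu, 1); with n = 10 (df = 9) the two-sided
    5% critical value is about 2.262.  Returns the rejection rate.
    """
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        sample = [rng.gauss(mu, 1) for _ in range(n)]
        t = statistics.mean(sample) / (statistics.stdev(sample) / math.sqrt(n))
        if abs(t) > crit:
            rejections += 1
    return rejections / trials
```

At mu = 0 the rejection rate recovers the significance level (about 0.05); as mu moves away from the null value, the noncentrality parameter grows and the power rises toward 1.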
A Cheeger-Type Inequality on Simplicial Complexes
John Steenbergen, Duke University, Durham, NC, mentored by Dr. Sayan Mukherjee
Abstract: When is a space "almost" disconnected? When does a space "almost" have a hole? In this talk we introduce the mathematics of near-homology, a growing field with applications in clustering, dimension reduction, statistical ranking, and more. Our main tools in studying near-homology are isoperimetric constants and Laplacian eigenvalues. In certain cases, these are related by a Cheeger inequality, so that they give the same characterization of near-homology. In other cases, they can behave quite independently. New results on near-homology in high dimensions will be presented.
Kleptoparasitic Interactions and Internal States
David Sykes, UNCG, Greensboro, NC, mentored by Dr. Jan Rychtar
Abstract: A kleptoparasitic interaction occurs when one individual (a kleptoparasite) attempts to take resources from another. Some animals exhibit different behavior in similar interactions, and we would like to understand why they may have evolved to do so. Internal states, such as health, age, or hunger, can affect which behavioral strategies yield optimal gains. To study this effect, we have created a mathematical model that describes the outcomes of these interactions in terms of the value of the contested resources, the cost of a fight (or similar conflict), and the internal states of the individuals involved. Our model indicates that changing the degree to which internal states affect an individual's appraisal of resources changes the optimal behavior. When this degree is high, it can be optimal for individuals to forgo stealing from weaker individuals; this does not happen when the degree is low. The degree can also be set so that the constant strategy of always stealing is optimal; for most parameter settings, however, optimal behavior is not a constant strategy (i.e., a strategy of always making the same decision). In most cases, optimal behavior should adapt to changes in resource value, cost of conflict, and internal health.
The Card Collector Problem
Micheal M. Thomas, Kennesaw State University, Kennesaw, GA, mentored by Dr. Anda Gadidov
Abstract: Suppose that every time you purchase a box of cereal from a certain manufacturer, there is a card inside the box. The complete collection has m different cards, each found inside the cereal boxes with a different probability. How many purchases are required, on average, to obtain a complete collection? Our approach to the problem differs from the one in the existing literature. We also show that the minimum expected number of purchases is achieved when the cards are uniformly distributed, and we check the theoretical results through simulations.
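The expected-value question above is easy to check by direct simulation. This minimal sketch is not the authors' approach, just the standard collector setup with an assumed six-card collection and illustrative probabilities:

```python
import random

def draws_to_complete(probs, rng):
    """Purchases needed until every card type has appeared at least once."""
    seen, draws = set(), 0
    while len(seen) < len(probs):
        draws += 1
        r, cum = rng.random(), 0.0
        for i, p in enumerate(probs):
            cum += p
            if r < cum:
                seen.add(i)
                break
        else:                          # guard against floating-point round-off
            seen.add(len(probs) - 1)
    return draws

def average_draws(probs, trials=5_000, seed=0):
    """Monte Carlo estimate of the expected number of purchases."""
    rng = random.Random(seed)
    return sum(draws_to_complete(probs, rng) for _ in range(trials)) / trials
```

For m equally likely cards the expected number of purchases is m·H_m (about 14.7 for m = 6), and skewing the probabilities raises the expectation, consistent with the minimum occurring at the uniform distribution.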
Modeling Chemical Reactions in Gene Networks
Hong Tran, Virginia Commonwealth University, Richmond, VA, mentored by Dr. Kresimir Josic
Abstract: The creation of protein from DNA consists of numerous reactions, such as transcription, translation, and protein folding. At such a microscopic scale, noise can lead to significant fluctuations in protein concentrations. Furthermore, there is a time lag between a reaction event being initiated and a product being created, and this delay may change the behavior of the system. This research explores the combined effects of time delay and intrinsic noise on gene regulation. The goal of the project is to test the limits of deterministic and stochastic delay models of gene regulation. The Gillespie algorithm is used to model chemical reactions in systems without delay, and a modified Gillespie algorithm (the rejection method) is used to model reactions in systems with delay. The project also attempts to implement the tau-leap method to accelerate computation.
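For the non-delay case, the Gillespie algorithm mentioned above can be sketched for the simplest gene-expression model: a birth-death process with constant protein production and first-order degradation. The rate values are illustrative assumptions, and the delayed (rejection-method) variant, which would postpone the production update by the delay time, is not shown.

```python
import random

def gillespie_birth_death(k, gamma, t_end, seed=0):
    """Gillespie SSA for production (rate k) and degradation (rate gamma
    per molecule) of a protein.  Returns the time-averaged copy number."""
    rng = random.Random(seed)
    t, n, area = 0.0, 0, 0.0          # area accumulates the integral of n dt
    while t < t_end:
        total = k + gamma * n          # total propensity
        wait = rng.expovariate(total)  # exponential time to the next event
        if t + wait > t_end:
            area += n * (t_end - t)
            break
        area += n * wait
        t += wait
        if rng.random() < k / total:
            n += 1                     # production event
        else:
            n -= 1                     # degradation event
    return area / t_end
```

At stationarity the copy number is Poisson-distributed with mean k/gamma, so a long run with k = 10 and gamma = 1 should average close to 10 molecules while still fluctuating, which is exactly the intrinsic noise the abstract refers to.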
What's the Queen Got to Do with It? Testing the Effect of Queen Presence on Ant Social Network Structure
Amanda Traud, North Carolina State University, Raleigh, NC, mentored by Dr. Alun Lloyd and Dr. Rob Dunn
Abstract: Social media sites like Facebook have recently brought social networks into the public consciousness. Recent analyses have found that networks of people with identified leaders, such as business networks that include managers and CEOs, differ in structure from networks composed entirely of people at the same level of a hierarchy. Formica subsericea ants, like humans, have a predefined hierarchy in their interaction networks, namely a caste system of workers and queens. In the natural setting, queens are present alongside workers, and we hypothesize that ant social networks that include a queen will have a significantly different structure from those that do not. To test this hypothesis, we observed interactions in small groups of ants that include a queen (queenright) and small groups made up only of workers (queenless). We created both weighted and unweighted networks and compared queenright and queenless networks of various sizes by calculating network statistics for each size and category.
Optional Unrelated-Question Randomized Response Models
Anna Tuck, UNCG, Greensboro, NC, mentored by Dr. Sat Gupta and Dr. Mary Crowe
Abstract: Obtaining accurate information is essential in all surveys, but can be problematic when subjects face sensitive or incriminating questions. Despite assurances of anonymity, subjects often give untruthful responses, leading to serious response bias. One method of reducing this bias is the Unrelated Question Randomized Response Technique (RRT), in which a predetermined proportion of subjects are randomized to answer an innocuous unrelated question with known prevalence (Greenberg et al., 1969). Subjects are provided a higher level of anonymity because the researcher does not know which question (sensitive or innocuous) any individual answered, although the mean of the sensitive question can be estimated at the aggregate level. We propose a generalization of the Unrelated Question RRT model, to be used with a quantitative response question, which takes into account the fact that a question may be very sensitive for one person, but not at all sensitive for another (Gupta et al., 2002). Each subject is presented the option of omitting the randomization step if the question is deemed non-sensitive. We simultaneously estimate the Mean Prevalence of a sensitive behavior, as well as the Sensitivity Level of the underlying question (proportion of subjects who consider the question sensitive). We also show that both estimators are asymptotically normal and unbiased. Computer simulations are used to validate these theoretical results.
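A two-sample version of the estimation idea described above can be sketched as follows. The notation (mu_x for the sensitive mean, mu_y for the known innocuous mean, W for the Sensitivity Level) and all parameter values are assumptions for illustration, not the paper's exact formulation: each group i is given a device that selects the sensitive question with probability p_i, so the observed group mean satisfies E[Z_i] = mu_x + W(1 - p_i)(mu_y - mu_x), and two groups with different p_i give two equations in the two unknowns.

```python
import random

def simulate_group(n, p, w, mu_x, mu_y, rng):
    """Responses under an optional unrelated-question RRT model: with
    probability (1 - w) the subject deems the question non-sensitive and
    answers it directly; otherwise the device picks the sensitive question
    with probability p and the innocuous one with probability (1 - p)."""
    out = []
    for _ in range(n):
        if rng.random() < 1 - w:       # answered directly
            out.append(rng.gauss(mu_x, 1))
        elif rng.random() < p:         # device -> sensitive question
            out.append(rng.gauss(mu_x, 1))
        else:                          # device -> innocuous question
            out.append(rng.gauss(mu_y, 1))
    return out

def estimate(z1, z2, p1, p2, mu_y):
    """Method-of-moments estimates of (mu_x, W) from two group means,
    with mu_y known.  Uses E[Z_i] = mu_x + W*(1 - p_i)*(mu_y - mu_x)."""
    m1, m2 = sum(z1) / len(z1), sum(z2) / len(z2)
    a = (m1 - m2) / (p2 - p1)          # a = W * (mu_y - mu_x)
    mu_x_hat = m1 - a * (1 - p1)
    w_hat = a / (mu_y - mu_x_hat)
    return mu_x_hat, w_hat
```

With large groups the estimates land close to the true sensitive mean and Sensitivity Level, which is the behavior the asymptotic normality and unbiasedness results formalize.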