Research Applications

According to HPCwire, over 1 million researchers and engineers use MATLAB to develop technical computing applications. It is a pervasive tool in the science and engineering community, essential to scientific discovery in disciplines as varied as nanoscience, restoration ecology, microbiology and immunology, and computational fluid dynamics. MATLAB is also an important tool for manipulating data from scientific instruments, telescopes, satellites, and remote sensors in novel ways, whether to gain new insight into basic research questions or to solve time-sensitive problems confronting society today.

Example research projects with accounts on the MATLAB on the TeraGrid experimental computing resource are summarized below by research field:


Aerospace Engineering

Project: Dynamics and Control of 1000 Femtospacecraft in Low Earth Orbit
University: University of Illinois at Urbana-Champaign
Researcher: Soon-Jo Chung, Aerospace Engineering
Summary: We are developing simulation tools to accurately capture the highly nonlinear dynamics of the relative motions of 1000 tiny spacecraft in Low Earth Orbit. Because the dynamic models contain many nonlinear terms and widely varying constants, numerical integration of such a high-fidelity model is a challenge. Furthermore, the sheer number of vehicles (1000 or more) makes this a very difficult modeling and control problem. Working with the NASA Jet Propulsion Laboratory, we are studying various aspects of swarm spacecraft in the presence of disturbances such as J2 terms. Access to the MATLAB resource at Cornell should allow us to improve the accuracy of the numerical integration.

Project: Electron Trajectory Simulation in Hall-Effect Thrusters
University: University of Michigan
Researcher: Michael McDonald, Plasmadynamics and Electric Propulsion Laboratory
Summary: This project deals with kinetic modeling of electrons in the time-varying electric and magnetic fields of a Hall thruster. Millions of electrons are simulated, over tens of thousands of time steps each, to recreate a particle-based (as opposed to the usual fluid-based) map of anomalous electron transport in the thruster. The code loads a simple field mesh of electric fields, magnetic fields, and background neutral densities. Electrons are sourced from a point within the mesh, and adaptive time stepping is used to accurately integrate the electron equations of motion until the electron reaches one of the boundaries of the simulation domain or a preset timeout is reached. Output data for post-processing is recorded in the MATLAB file format.

Project: Satellite Launch Analysis
University: Cornell University
Researcher: Rajesh Bhaskaran, Space Systems Design Studio CUSat Satellite Project
Summary: CUSat is a multi-year effort to design, build, and launch an autonomous in-orbit inspection satellite system. The CUSat space vehicle consists of two functionally identical satellites that will launch together and separate in orbit. Simulations on the MATLAB on the TeraGrid resource are required to perform free-contact analysis of the dual-satellite launch.


Astronomy and Astrophysics

Project: Gravitational Wave Data Analysis: LIGO and Pulsar Timing Array Data
University: Pennsylvania State University
Researcher: Lee Finn, Gravitational Wave Astronomy Group
Summary: Precision timing of arrays of millisecond pulsars is the only means of observing the gravitational waves from supermassive black hole binaries. Pulsars currently used for this purpose are concentrated toward the galactic center, strongly biasing the sensitivity of the array. Directed searches for new millisecond pulsars suitable for use in pulsar timing arrays must balance the spatial distribution of the pulsars in the galaxy against the improvement that a pulsar located in a particular direction would give to the array's sensitivity. The first part of this project carries out the simulations that will allow the most efficient use of radio telescope time for finding timing-array pulsars that will most improve the sensitivity of the existing international pulsar timing array to gravitational waves.

The current international pulsar timing array uses the timing data from thirty pulsars, whose achievable timing precision varies by an order of magnitude. Neither the number of pulsars in the array nor the distribution of observation time used to monitor them has been optimized for greatest sensitivity to gravitational waves. Adding pulsars to the timing array will increase the array's sensitivity but requires more observing time to monitor the array; longer or more frequent observations of any particular pulsar increase the precision of its timing data. The second part of this project therefore seeks to maximize the array's sensitivity to gravitational wave sources by optimizing the number of pulsars in the array and the distribution of observing time among them. Monte Carlo simulations of pulsar timing array data (including gravitational wave sources and realistic models of pulsar timing noise) are generated and analyzed to find the sensitivity of the array as a function of the location of an additional pulsar and the distribution of observing time among the different pulsars. Several figures of merit are computed: signal-to-noise ratio, ability to localize the source, and ability to infer the radiation waveform (including gravitational wave polarization).


Bioengineering

Project: Changes in Femur Shape During Postnatal Development and Growth of C57BL/6 Mice
University: University of California, San Diego
Researcher: Ricky Harjanto, Cartilage Tissue Engineering Lab
Summary: With continued development and improvement of tissue engineering therapies for small articular lesions, increased attention is being focused on the challenge of engineering partial or whole synovial joints. Joint-scale constructs could have applications in the treatment of large areas of articular damage or in biological arthroplasty of severely degenerate joints. The native structure and function of synovial joints in the body provide design goals for joint engineering. Surface geometries, whole-joint motions, and relative motions of cartilage surfaces give rise to the spectrum of native joints. Quantification of cartilage surfaces provides constraints and boundary conditions for theoretical models, as well as design targets for the movement of bioengineered joints.

While the mature hip shape has been well studied in normal and disease states, the shape changes during development have only been quantified in terms of radiological parameters that are not sufficient to reconstruct the entire hip contour. Statistical shape modeling techniques determine the variance among landmark points in a sample of shapes and employ principal component analysis to reduce the shape information to significant modes of variation. 2D active shape modeling methods have previously been developed to analyze the shape of the proximal femur and can be utilized to track the shape of the hip during development. A large number of 4D CT and MRI digital atlases of human and animal models exist that could serve as the data set for such an analysis.

A previous attempt to analyze one of these databases is instructive, revealing a number of issues in a study of postnatal growth in a cross-section of the C57BL/6 mouse femur. Whole-body scans are liable to distort the shape of the hip and other extremities; harvesting the hip before imaging can provide a more accurate segmentation result. In addition, the complete shape of the hip cannot be fully captured in two dimensions: the whole-femur study omits analysis of shape changes of the lesser trochanter, while 3D modeling allows for a more precise description of the crests and troughs of the femur surface and the interplay of femoral head and acetabulum. With only one sample at each age point, it was also not possible to determine whether the variation in shape and growth parameters was statistically significant. By expanding both the dataset and the modeling techniques, hip growth can be mapped in a more precise and efficient manner.

Currently, we are using MATLAB to find correspondences between point clouds representing whole mouse femurs at different age points, so that each point is mapped to equivalent points as landmarks that will be used to track the shape changes. The correspondence and alignment procedures are not particularly intensive, but computing the eigenvectors (modes of variation) through singular value decomposition of the large covariance matrix of the landmarks freezes up even the fastest computer in the lab. We have tried running svd in parallel on an 8-core CPU, but there does not seem to be enough memory associated with each client. We are hesitant to reduce the number of landmarks because we are aiming for high-resolution meshes of the femur.
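
The covariance-matrix bottleneck described here can often be sidestepped: the modes of variation are the right singular vectors of the centered data matrix itself, which is far smaller than its covariance when samples are few and landmarks are many. A minimal sketch of that standard identity, with illustrative sizes and variable names rather than the lab's actual data:

    % Sketch: shape modes via economy-size SVD, without forming the
    % landmark covariance matrix. X is m samples by p stacked landmark
    % coordinates; sizes and names here are illustrative.
    X  = randn(20, 30000);            % placeholder for landmark data
    mu = mean(X, 1);                  % mean shape
    Xc = bsxfun(@minus, X, mu);       % center the data

    % svd of the m-by-p data matrix costs O(m^2 p) and never builds
    % the p-by-p covariance Xc'*Xc/(m-1).
    [~, S, V] = svd(Xc, 'econ');
    modes  = V;                             % columns = modes of variation
    lambda = diag(S).^2 / (size(X,1) - 1);  % corresponding variances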

Project: Dynamic Functional Connectivity Between Cortex and Muscles
University: University of Pittsburgh
Researcher: Sagi Perel, MotorLab, Bioengineering
Summary: The goal of the project is to use novel methods to uncover patterns of dynamic functional connectivity between cortex and muscles, paving the way for a better understanding of how the motor cortex controls movement. The novel methods we have developed rely heavily on resampling techniques; hence the use of a cluster will significantly speed up the computation. The code computes test statistics and confidence intervals by creating bootstrap samples from existing data. Currently the code computes the bootstrap samples serially; I usually run 3 instances of MATLAB in parallel, each computing a different group of bootstrap samples. This code would be very easy to port to a cluster.
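
Since each bootstrap replicate is independent, a serial loop like this maps directly onto a parfor from the Parallel Computing Toolbox. A minimal sketch, with a placeholder data set and statistic standing in for the project's own:

    % Sketch: parallel bootstrap. With an open worker pool, parfor
    % distributes the replicates; data and statistic are placeholders.
    data  = randn(1000, 1);
    nboot = 10000;
    bootstat = zeros(nboot, 1);
    parfor b = 1:nboot
        idx = randi(numel(data), numel(data), 1);  % resample with replacement
        bootstat(b) = median(data(idx));           % stand-in test statistic
    end
    ci = prctile(bootstat, [2.5 97.5]);            % 95% percentile interval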

Project: Multimodal Medical Image Registration
University: Rochester Institute of Technology
Researcher: Nathan Cahill, School of Mathematical Sciences
Summary: We are investigating and developing algorithms for the registration of 3-D medical images (e.g., CT scans, MRI, Ultrasound, PET images of the brain, PET/CT full body scans). Scripts are used to test various registration algorithms on sets of data from various medical imaging applications.

Project: Quantitative Polarized Light Microscopy of Articular Cartilage
University: University of California, San Diego
Researcher: Christopher Raub, Cartilage Tissue Engineering Lab
Summary: The objective of this project is to use a computational technique to quantify aspects of images of human articular cartilage sections taken through a polarized light microscope. These data quantify the average anisotropy and orientation of birefringent material (i.e., collagen fibrils) within the cartilage at 2-4 micron resolution. These aspects of collagen network microstructure may play an important role in cartilage biomechanical function, and this technique will identify key changes to the collagen network microstructure during aging, osteoarthritis, and cartilage repair processes. The code reads an image array consisting of 9 images, each 1024x768 pixels. For each pixel location (there are 786,432), data points from 7 of the images are fit to a sinusoid by nonlinear least squares, and the amplitude and phase shift of the curve fit are extracted. Pixel intensity values from the remaining two images are compared to determine the initial condition at 0 phase shift (the absolute orientation is either 0 or 90 degrees). Pixel colormaps are built describing collagen network anisotropy (related to the curve-fit amplitude) and orientation (related to the curve-fit phase shift) at each pixel location. The nonlinear least squares fit over 786,432 pixel locations is the most computationally expensive step; the rest of the code takes only ~1 minute to run.
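
The project fits a nonlinear sinusoid per pixel; when the modulation frequency is fixed by the optics, the same model, I(theta) = c0 + A*sin(2*theta + phi), becomes linear in its quadrature amplitudes, so all 786,432 pixels can be fit in a single linear least-squares solve. A sketch under that assumption, with illustrative angles and array names:

    % Sketch: quadrature (linear) formulation of the per-pixel sinusoid
    % fit. Angles and data are placeholders for the real image stack.
    theta = (0:6) * pi/7;                    % 7 polarizer angles (assumed)
    I     = rand(768*1024, 7);               % pixels-by-angles intensities

    % I = c0 + B*sin(2*theta) + C*cos(2*theta), linear in [c0 B C]
    G    = [ones(7,1), sin(2*theta(:)), cos(2*theta(:))];
    coef = (G \ I')';                        % one solve for all pixels

    A   = hypot(coef(:,2), coef(:,3));       % amplitude -> anisotropy map
    phi = atan2(coef(:,3), coef(:,2));       % phase -> orientation map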


Biomedical Engineering

Project: I. Bayesian Reconstruction of Parallel MRI Using Graph Cut; II. Prior Construction Project; III. Brain Connectivity Project; IV. Multimodality Imaging for Multiple Sclerosis
University: Weill Cornell Medical College
Researcher: King-Wai Chu, Radiology Research
Summary: I. This project aims to develop and refine computationally efficient Bayesian methods for MRI that have the potential to overcome fundamental limits of traditional MR imaging. Aim 1: Apply our recent technique, EPIGRAM, to fast high-resolution structural brain imaging; the image priors to be used will be empirically determined and rigorously evaluated, and the algorithm will be modified to accommodate this modality and the chosen prior(s). Aim 2: Extend the method from 2D to 2D + time data and apply it to brain perfusion and diffusion parallel reconstruction problems. Aim 3: Validation. Aim 4: Develop efficient, feasible algorithms to solve the Bayesian parallel MRI reconstruction problem.

II. Prior Construction Project - Bayesian methods have become quite popular in the solution of many inverse problems related to imaging, in particular parallel coil MR reconstruction. One a priori observation particular to imaging is that the intensities of neighboring pixels should be similar in value, except at the edge of an object. In this work, we construct a prior cost function by utilizing Reversible Jump Markov Chain Monte Carlo (RJ-MCMC) techniques to "learn" the properties of various true underlying images.

III. Brain Connectivity Project - Many brain diseases such as stroke, multiple sclerosis and traumatic brain injury result in brain damage and physical or mental disability from cell death. The location and size of the affected area greatly influence the amount and type of disability that the patient incurs. Current MRI-based diagnosis of brain injury is inadequately qualitative, involving subjective assessment of severity and prediction of impairment by the physician. Even when augmented by 3D processing tools that allow for accurate measurement of lesion or tumor volume, this approach is insufficient to characterize the effect on brain function, because functional impairment is determined by both the extent and location of damage and can be properly assessed only by looking at the connectivity of the affected region to the rest of the brain via its fiber architecture. A new computational methodology is proposed that utilizes DTI and structural MR images of the brain, together with graph theory, to methodically assign an overall brain-connectivity importance to small sections of white and gray matter regions.

IV. Multimodality Imaging for Multiple Sclerosis - The aim of this project is to develop robust image markers for the pathological processes of neurological white matter diseases in general and multiple sclerosis (MS) in particular.

Project: Increasing the Efficiency and Parallelization of the Time-Domain Optimized Near-Field Estimator Algorithm for Ultrasonic Imaging
University: University of Virginia
Researcher: Ed Hall, University of Virginia Alliance for Computational Science and Engineering
Summary: In medical ultrasound, bright off-axis targets may introduce broad image clutter, which reduces both image contrast and resolution. Recently, we developed a model-based adaptive beamforming algorithm, called the Time-Domain Optimized Near-Field Estimator (TONE), that significantly increases both image contrast and resolution by conducting a global optimization based on a model of a sparse set of hypothetical source locations. Due to the global nature of this optimization, it is necessary to model, with finely spaced samples, the entire space from which signal may be received. For typical medical ultrasound applications, this model may contain upwards of 2 million hypothetical source locations, requiring matrices of upwards of 4 trillion elements. This project involves dramatically improving the computational efficiency of a MATLAB program that implements a novel algorithm for TONE. The goal is for the algorithm to process images requiring an order of magnitude more memory in an order of magnitude less time. The TONE algorithm performs an optimization by iteratively solving sets of linear equations with matrices of up to 4 trillion elements. The algorithm has been parallelized using distributed matrices, the parallel implementation of the mldivide command, and the spmd construct from the Parallel Computing Toolbox.
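
The summary names the building blocks; a toy-scale sketch of how they compose, using the Parallel Computing Toolbox's distributed arrays (sizes are illustrative, nowhere near the 4-trillion-element production case, and the older matlabpool syntax is assumed):

    % Sketch: distributed arrays with the parallel mldivide.
    matlabpool open                     % start workers (older PCT syntax)
    A = distributed.rand(4000, 4000);   % system matrix spread over workers
    b = distributed.rand(4000, 1);      % right-hand side
    x = A \ b;                          % mldivide executes in parallel
    xg = gather(x);                     % collect the solution on the client
    matlabpool close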

Project: Large Scale HIV Alignments
University: Drexel University
Researcher: William Dampier, Biomedical Engineering, Science & Health Systems
Summary: HIV viral sequence data from patients is used to determine the functional differences between various subclasses of patients, i.e., responders vs. non-responders to therapy, drug users vs. non-drug users, etc. The multiple sequence alignment starts by computing pairwise scores between all ~10,000 sequences, then uses a progressive alignment technique to create a draft alignment, which is refined with a simulated annealing model to lower the overall alignment score.
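
The first two stages of this pipeline have direct counterparts in MATLAB's Bioinformatics Toolbox; a minimal sketch with toy sequences (the simulated-annealing refinement described above is custom code and is not shown):

    % Sketch: pairwise scores, guide tree, progressive draft alignment.
    seqs = {'MKVLATGGRK', 'MKVLSTGGRK', 'MKILATGDRK'};  % toy sequences
    d    = seqpdist(seqs);          % pairwise distance scores
    tree = seqlinkage(d);           % guide tree built from the scores
    aln  = multialign(seqs, tree);  % progressive draft alignment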

Project: Modeling of Power Wheelchair Driving
University: University of Pittsburgh
Researcher: Harshal Mahajan, NSF Quality of Life Technology Center
Summary: Several techniques are used for modeling the driving of a motorized (power) wheelchair. We want to explore using system identification techniques to build models of the wheelchair. Our code uses the MATLAB System Identification Toolbox to build models from the collected driving data.

Project: Muscle Contraction Simulation
University: University of Vermont
Researcher: Bob Devins, Vermont Advanced Computing Center
Summary: Simulating the molecular processes of muscle contraction.

Project: Spatially Heterogeneous Dynamic and Glass Transitions in Cell Monolayers
University: Harvard University
Researcher: Charles Hardin, School of Public Health
Summary: We have developed a novel technique for measuring the full stress tensor for the intercellular stress in a monolayer of migrating endothelial cells. We simultaneously obtain the velocity field and use these data to explore the collective dynamics. In preliminary studies we have observed long-range spatial and temporal correlations which appear to diverge with increasing cell density, in a way similar to glass transitions in colloidal systems. The code consists of a collection of scripts to extract the velocity field from phase contrast microscopy images, and several routines to calculate force and velocity autocorrelation functions. We also have a finite element model code, currently implemented in Fortran, which calculates the actual stress tensor.

Project: Temporal and Stoichiometric Variability in the 3-D Shape of Viruses
University: Cornell University
Researcher: Peter Doerschuk, Biomedical Engineering
Summary: Professor Doerschuk is collaborating with J.E. Johnson of The Scripps Research Institute to understand the temporal and stoichiometric variability in the 3-D shape of viruses and how these structures and their variability affect the functioning of the virus particle during its lifecycle. Information from a large number of cryo-electron microscope images of virus particles must be fused in order to determine the structure of the virus. Professor Doerschuk and his colleagues have developed sophisticated MATLAB code implementing a new statistical approach that can determine the continuous variability within each class of geometry. The MATLAB on the TeraGrid resource will allow them to solve, for the first time, large problems of fundamental interest to biological scientists, with many-fold speedups. It will also enable the researchers to focus more on their science and less on software engineering.

Project: Third Harmonic Imaging of Myelin
University: Cornell University
Researcher: Matthew Farrar, Schaffer Lab
Summary: Demonstration of a new nonlinear microscopy technique for visualizing myelin. The code is a Monte Carlo simulation of the epi-detected fraction of generated light.

Project: Ultrasound Imaging and Dynamics of Cardiac Action Potentials
University: Cornell University
Researcher: Niels Otani, Biomedical Sciences, Veterinary Medicine
Summary: The computer will be used for two purposes: the study of the dynamics of propagating electrical waves on the heart during abnormal rhythm, and the ultrasound imaging of these electrical waves. Existing and to-be-developed codes will be used to convert ultrasound images of the contracting canine heart into action-potential-induced active stresses, from which 4D images of the patterns of action potential propagation will be computed. For the dynamics studies, 2D and 3D conceptual models of the action potential dynamics associated with life-threatening tachyarrhythmias in the heart will be run. Note: the codes need to be parallelized.


Chemical Engineering

Project: Mass Transfer in Complex Porous Media
University: University of Delaware
Researcher: Harun Koku, Chemical Engineering
Summary: Brownian dynamics simulations in real-life porous media. The code repeats streaming steps for a preset number of particles and a preset time. The duration of a run depends on the number of particles, the time step size, and the number of velocities tested; memory requirements depend on the size of the geometry.


Civil Engineering

Project: Learning Solutions to Multiscale Elliptic Problems Using Gaussian Process Models
University: Cornell University
Researcher: Jim Warner, Civil and Environmental Engineering
Summary: The proposed work presents a fundamentally new approach to the solution of multiscale elliptic PDEs through the use of statistical learning. Such problems arise frequently in the computational mechanics field, for example when analyzing the mechanical behavior of composite materials or simulating fluid flow through porous media. In our work, we attempt to learn, rather than approximate, the solution to the PDE. To that end, the governing PDE and the associated boundary conditions are viewed as data sources: they are called on demand at various points in the problem domain to provide the information needed to infer the solution. The statistical model employed is the Gaussian process, a proven and powerful tool in general regression problems. An obvious advantage of the proposed probabilistic framework is that, apart from a prediction of the solution, the analysis provides probabilistic confidence metrics in the form of credible intervals. These metrics differ from the standard error measures used in traditional PDE solution schemes, representing the anticipated variability of the solution based on the information provided by the data. The approach also exhibits computational costs comparable to or less than existing multiscale methods and shows the potential to address problems where those methods fail. The computationally intensive portion of the code is the selection of optimal covariance function parameters by maximizing the log-likelihood function. This is an optimization problem in which every evaluation of the likelihood function requires the inversion of an n-by-n covariance matrix, where n is the number of data points used; as n grows, this computation takes a very long time.
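
The repeated covariance inversion inside the likelihood is usually organized around a Cholesky factorization rather than an explicit inverse; a minimal sketch of the standard Gaussian-process negative log marginal likelihood (variable names are illustrative):

    % Sketch: GP negative log marginal likelihood via Cholesky.
    % K is the n-by-n covariance from the current hyperparameters,
    % y the data vector, sigma_n the noise level. (Save as a function
    % file, e.g. gp_negloglik.m.)
    function nll = gp_negloglik(K, y, sigma_n)
        n = numel(y);
        L = chol(K + sigma_n^2 * eye(n), 'lower');  % O(n^3/3), once per eval
        alpha = L' \ (L \ y);                       % solves (K+sn^2 I)a = y
        nll = 0.5*(y'*alpha) + sum(log(diag(L))) + 0.5*n*log(2*pi);
    end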


Computational Science

Project: Large Scale Parameter Sweep Studies Using Distributed MATLAB
University: Pennsylvania State University
Researcher: Abdul Aziz, Research Computing and Cyberinfrastructure
Summary: Using distributed MATLAB to solve large scale problems (see Large Scale Parameter Sweep Studies Using Distributed MATLAB).

Project: Remote MATLAB Engine Services Test
University: Pennsylvania State University
Researcher: Jeffrey Nucciarone, Research Computing and Cyberinfrastructure
Summary: Explore the use of and interaction with a remote MATLAB resource engine to determine how best to offer a similar service to faculty, researchers, and students. Initial test codes will be financial simulations; applications will expand to exercise other toolboxes, such as Optimization and Bioinformatics.


Computer Science

Project: Automated Flight Call Classification
University: Cornell University
Researcher: Samuel Henry, Computer Science
Summary: Flight calls are a powerful estimator of the migration patterns of bird species; however, their potential has been hindered by a painstaking manual classification process. Using machine learning and signal analysis techniques, I am developing a system for automatically identifying individual species from their flight calls. This technique is much faster and arguably more accurate than manual classification, and is the first step in creating large-scale networks of recording stations that can provide a detailed understanding of the migration patterns of individual species. The computationally expensive parts are dynamic time warping (a form of dynamic programming), fast Fourier transforms, matrix inversions, and the classification algorithm itself.
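
Dynamic time warping dominates the cost here; its core is a simple dynamic-programming recurrence. A minimal, illustrative implementation for two 1-D feature sequences (not the project's production code):

    % Sketch: DTW distance between feature sequences a and b.
    % (Save as a function file, e.g. dtw_dist.m.)
    function d = dtw_dist(a, b)
        n = numel(a); m = numel(b);
        D = inf(n+1, m+1);                 % cumulative-cost table
        D(1,1) = 0;
        for i = 1:n
            for j = 1:m
                c = abs(a(i) - b(j));      % local cost
                D(i+1,j+1) = c + min([D(i,j), D(i,j+1), D(i+1,j)]);
            end
        end
        d = D(n+1, m+1);                   % optimal warped distance
    end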


Earth and Atmospheric Science

Project: Parallel MATLAB HPC Solutions for Mantle Convection and Melting
University: Cornell University
Researcher: Jason Phipps Morgan, Earth and Atmospheric Sciences
Summary: Convection in the Earth’s mantle occurs on a timescale that is too slow for us to directly observe anything more than a snapshot of the present-day flow field, and is too nonlinear for analytical solutions to provide detailed insights. The consequence is that numerical experiments of compressible and incompressible approximations to mantle convection play a key role in improving our understanding of the variable-viscosity flow processes, and the melting-linked differentiation processes, that have transformed our planet into its present state. However, mantle convection involves a branch of computational fluid dynamics, the accurate numerical solution of variable-viscosity Stokes flow, that has received relatively little attention from the modeling community. While a good Stokes solver is usually an integral part of a good Navier-Stokes solver, typically the Navier-Stokes equations are solved for flow of a fluid with uniform viscosity, or only small lateral variations in viscosity. The result is that we can’t simply use a good Navier-Stokes code to solve the apparently ‘easier’ Stokes problem with strongly varying viscosity. This issue is already central to creating more accurate numerical experiments of convection in Earth’s silicate-fluid mantle (May and Moresi, 2008; van Geenen et al., 2009; Burstedde et al., 2009). Here we are implementing improved parallel algorithms for solving mantle convection with strongly temperature-dependent viscosity, including improved preconditioners for the pressure sub-problem and improved smoothers for the multigrid-preconditioned Krylov solver used in the velocity sub-problem. We are applying these to the study of large-scale mantle convection, and to higher-resolution studies of convection and melting beneath hotspots, mid-ocean ridges, and island arcs. Our parallel-MATLAB-based HPC code, currently running on individual computers and small clusters, will be scaled up to make use of CAC's "MATLAB on the TeraGrid" system. The code runs on unstructured tetrahedral meshes and includes an unstructured mesh generator that we are extending to support adaptive mesh generation as a solution evolves.


Electrical and Computer Engineering

Project: Auditory Models for 3-D Sound
University: Ohio State University
Researcher: Ashok Krishnamurthy, Electrical & Computer Engineering
Summary: There are a number of auditory models for understanding how the auditory system processes sounds. Two common models are the AIM model from Cambridge University and the MAP model from the University of Essex. Both of these are MATLAB based models, and model the processing of sounds in a single ear. This project is aimed at adding binaural processing and extending these models to take into account known binaural effects.

Project: Digital Pathology with Cell Segmentation and Video Tracking
University: University of Houston
Researcher: Liwen Shih, Computer Engineering
Summary: Processing biopsy images and endoscope videos to locate concentrated lumps of cells that may indicate tumors.

Project: EEG MEG Whole Brain Models
University: University of Washington
Researcher: Ceon Ramon, Electrical Engineering
Summary: We will model the cortical activity of neurons with a 0.5 mm resolution using a finite element model of the whole human brain. A maximum of 19 different tissue surfaces will be identified, a 3-D finite element model of the head will be constructed, and anisotropic tissue conductivities will be used. The FEM grid generation and the solver have been implemented in MATLAB. Some 6-8 million node equations need to be solved. We have been able to solve a very small model on a PC with 8 cores and 12 GB of memory; the full model requires far more than 12 GB.
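
Systems of this size are normally solved iteratively rather than by direct factorization; a toy sketch of one standard approach, preconditioned conjugate gradients with an incomplete Cholesky factor (ichol in recent MATLAB releases; the matrix below is a small SPD stand-in for the head-model system):

    % Sketch: iterative solve of a large sparse SPD FEM-style system.
    n = 316;                             % toy grid; real model is ~6-8M eqns
    A = gallery('poisson', n);           % sparse SPD stand-in matrix
    b = rand(size(A,1), 1);
    M = ichol(A);                        % incomplete Cholesky preconditioner
    [x, flag, relres, iter] = pcg(A, b, 1e-8, 500, M, M');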

Project: EEG Signal Classification Using Particle Swarm Optimization and RBFNs
University: Rochester Institute of Technology
Researcher: Ferat Sahin, Electrical Engineering
Summary: EEG signal classification using PSO and RBFNs to determine the state of mind and discover the actions being thought. The current MATLAB code is sequential, but the research team is working on making it suitable for running on multi-core machines.

Project: Non-Equilibrium Information Theory Applied to Mobile Ad Hoc Networks Using Physics as an Engineering-Mathematic Bridge; Complexity Models for Resiliency Engineering of a Distributed Smart Grid Network
University: University of Texas at San Antonio
Researcher: Brian Kelley, Electrical and Computer Engineering
Summary: The objective of the first project is to investigate scale-free ad hoc networks whose transport capacity per node is independent of the number of network nodes. We show that by adjusting the complexity of the interconnection network and by transmitting information via controlled joint interference, we can increase the network capacity bound indicated by Gupta. This is accomplished in part by applying a new form of non-equilibrium theory derived from concepts in statistical physics. We will generate proof-of-concept engineering models of our ad hoc wireless network using “MATLAB on the TeraGrid” nodes at Cornell.

The objective of the second project is to investigate new formal theories and modeling paradigms for a distributed smart grid that is resilient against intruder-based security attacks and against random transmission-line failures or power-capacity overload events that lead to cascading, catastrophic system-wide outages. We will develop new concepts: theoretical models of resilient nationwide RF communications networks overlaying power-grid transmission-line networks, and new theories for wireless resiliency of the network against security attack events, robust adaptive and distributed control, and wireless sensor networking. The system will optimize resiliency under dynamic load-balancing constraints. The MATLAB code will run N ad hoc nodes in a distributed fashion using OFDM as the air interface model. Scale-free communication paradigms are applied based upon non-equilibrium information theory and statistical constraints on the interconnection properties. The model will involve smart grid simulations with distributed ad hoc communications based upon the ad hoc descriptions for scale-free networks. We will engineer models of our wireless system on the Cornell MATLAB cluster with an aim of 6-sigma resiliency in the face of natural and man-made failures and attacks.

Project: Random Bipartite Graphs
University: Cornell University
Researcher: Wayne Melton, Electrical and Computer Engineering
Summary: Comprehensive analysis of LP-relaxed optimization on bipartite graphs. The code generates a random bipartite graph in which each member chooses 'k' independent members of the opposite sex at random for interviews, and each interview is assigned a happiness score uniformly distributed between 1 and 100. The LP-relaxed version of the matching optimization problem is then solved.
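
A sketch of that experiment at toy scale, building the degree constraints as a sparse matrix and calling linprog from the Optimization Toolbox (sizes and scores are illustrative):

    % Sketch: random bipartite interview graph and its LP relaxation.
    n = 50; k = 5;
    I = repmat((1:n)', 1, k);            % left endpoint of each edge
    J = zeros(n, k);
    for i = 1:n
        p = randperm(n);
        J(i,:) = p(1:k);                 % k distinct interview partners
    end
    w = randi(100, n*k, 1);              % happiness scores in 1..100
    E = numel(w);

    % each member may be matched at most once on either side
    Aineq = [sparse(I(:), (1:E)', 1, n, E);
             sparse(J(:), (1:E)', 1, n, E)];
    bineq = ones(2*n, 1);

    % maximize total happiness == minimize -w'x, with 0 <= x <= 1
    x = linprog(-w, Aineq, bineq, [], [], zeros(E,1), ones(E,1));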

Project: Randomized Coloring Algorithm for Scheduling in Wireless Sensor Networks
University: Cornell University
Researcher: Roberto Pagliari, Electrical and Computer Engineering
Summary: Simulating coloring algorithms that can be used for self-organization in sensor networks requires many runs of the same code in order to gather statistics about randomized protocols for distributed systems.

Project: Sensor Network Routing
University: Colorado State University
Researcher: Dulanjalie Dhanapala, Electrical and Computing Engineering
Summary: Developing Wireless Sensor Network (WSN) routing schemes based on virtual coordinates, and mathematical modeling of routing protocols. The code focuses on generating random sensor networks, implementing routing protocols, and evaluating the protocols' performance and routability.


Environmental Engineering

Project: Development and Application of Optimization and Uncertainty Analysis Algorithms to Water Resource Problems
University: Cornell University
Researcher: Christine Shoemaker, Civil and Environmental Engineering
Summary: Professor Shoemaker has three National Science Foundation grants that relate to the development and application of optimization and uncertainty analysis algorithms to water resource problems. These applications include the remediation of contaminated groundwater and the control of phosphorus pollution from watersheds given changes in management and climate. Each of these applications is computationally intensive and will benefit from the use of parallel algorithms executed on the MATLAB on the TeraGrid cluster.


Immunology

Project: Modeling Networks of Coordinated Amino-Acid Variation in Hepatitis C Virus
University: Centers for Disease Control and Prevention
Researcher: Zoya Dimitrova, David Campo-Rendon, Elizabeth Neuhaus, Yuri Khudyakov, Division of Viral Hepatitis
Summary: MATLAB is used to study networks of coordinated amino acid variation in Hepatitis C virus (HCV), a major cause of liver disease worldwide. Mapping of coordinated variations in the viral polyprotein has revealed a small collection of amino acid sites that significantly impact HCV evolution. Knowledge of these sites and their interactions may help devise novel molecular strategies for disrupting viral functions and may also be used to find new therapeutic targets for HCV. Statistical verification of HCV coordinated mutation networks requires the generation of thousands of random amino acid alignments, a computationally intensive process that greatly benefits from parallelization. This application has run successfully across all 512 cores of the MATLAB experimental resource (see poster).
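
One common way to build such a null model, shuffling residues independently within each alignment column so that per-site composition is preserved while inter-site correlations are destroyed, parallelizes trivially; whether this matches the CDC group's exact randomization is an assumption. A toy sketch:

    % Sketch: column-shuffled random alignments, generated in parallel.
    aln = ['MKV'; 'MRV'; 'LKV'; 'MKI'];      % toy alignment (seqs x sites)
    [nseq, nsite] = size(aln);
    nperm = 1000;                            % thousands in practice
    randaln = cell(nperm, 1);
    parfor r = 1:nperm
        shuffled = aln;
        for s = 1:nsite
            shuffled(:, s) = aln(randperm(nseq), s);
        end
        randaln{r} = shuffled;               % input to the network analysis
    end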


Marine Science

Project: Array Processing of Ambient Noise for Geophysical Inversion
University: Scripps Institution of Oceanography
Researcher: James Traer, Marine Physical Laboratory
Summary: Using data recorded from arrays of seismometers (or hydrophones in the ocean), cross-correlation of the recorded ambient noise can yield information about seismic (acoustic) wave propagation paths. The presence of multiple propagation paths implies the presence of reflecting layers, so the method can be used to infer the geophysical environment without the use of active sources. This work requires processing large amounts of data so that relatively weak signals can be extracted from the background noise. The MATLAB code loads data in chunks ranging from 10 seconds to 5 minutes, Fourier transforms the data, performs some basic arithmetic, and saves a few key variables in a mat-file. To extract the required geophysical information, this must be repeated over days or weeks of continuous data.
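
A sketch of that chunked loop (the sample rate, chunk length, window, and saved variables are all placeholders; hann assumes the Signal Processing Toolbox):

    % Sketch: process continuous noise records one chunk at a time.
    fs    = 500;                         % sample rate in Hz (assumed)
    chunk = 60 * fs;                     % one-minute chunks
    raw   = randn(10 * chunk, 1);        % stand-in for a continuous record
    nchunks = floor(numel(raw) / chunk);
    for c = 1:nchunks
        seg = raw((c-1)*chunk + (1:chunk));
        S   = fft(seg .* hann(chunk));   % windowed Fourier transform
        S   = S(1:chunk/2 + 1);          % keep non-negative frequencies
        save(sprintf('chunk_%04d.mat', c), 'S');  % a few key variables
    end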

Project: Breaking Internal Wave Groups: Mixing and Momentum Fluxes
University: University of Massachusetts
Researcher: Daniel Birch, Ocean Mixing and Stirring Lab
Summary: This project is simulating breaking internal gravity wave groups using Kraig Winters' CFD code written in Fortran. MATLAB is used for post-processing. The post-processing is an embarrassingly parallel problem where each time step could be analyzed individually. The large size of the data set makes post-processing on a desktop machine unfeasible. This work is supported by TeraGrid allocation TG-OCE100007.

Project: Does Nested Network Architecture Inhibit Diffuse Coevolution in Mutualistic Networks?
University: University of North Carolina
Researcher: Stuart Borrett, Systems Ecology and Ecoinformatics Laboratory
Summary: Ecologists have hypothesized that the nested network architecture commonly found in communities linked by mutualistic interactions appears because it is evolutionarily advantageous, inhibiting diffuse coevolutionary forces. We are using an individual-based model to test the plausibility of this hypothesis in silico. We currently have one model that requires 4-36 hours of computational time on one core, but given the stochastic nature of the processes and our experimental design, we need to run it a large number of times (6000+). We are now manually parallelizing these simulations, but they could be run automatically in parallel on the cluster using the MATLAB parallel toolbox.
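
The automatic parallelization mentioned above becomes a one-line change once the model is wrapped in a function; a sketch, where run_ibm_model is a hypothetical wrapper around a single seeded run:

    % Sketch: farming independent stochastic runs out with parfor.
    nruns = 6000;
    out = zeros(nruns, 1);
    parfor r = 1:nruns
        out(r) = run_ibm_model(r);   % hypothetical wrapper; r seeds the run
    end
    summary = [mean(out), std(out)]; % aggregate across runs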

Project: Numerical Simulation of Instabilities Associated with Coastal Downwelling Fronts
University: Oregon State University
Researcher: Scott Durski, Oceanic and Atmospheric Sciences
Summary: This project (TeraGrid allocation TG-OCE090008) concerns numerical simulation of instabilities associated with coastal downwelling fronts. Different hydrodynamic instabilities develop when coastal winds produce downwelling close to shore: in the bottom boundary layer, where potential vorticity reverses sign, shoreward-propagating symmetric instabilities develop, while near the surface, baroclinic frontal instabilities form. This research is focused on analyzing the interaction of these two instability processes. I'm interested in some light use of MATLAB on Abe. I have simulations that generate large output data files; if I can take a quick look at the data in these files, I can decide whether the simulations were 'successful' without having to sftp the whole files to a local machine. This involves some simple MATLAB graphics and mexnc for importing/exporting the data, and requires using MATLAB in ~30 minute to 1 hour intervals.


Nanoscience

Project: Transparent Access to Remote MATLAB Resource by nanoHUB.org Science Gateway Users
University: Purdue University
Researcher: Gerhard Klimeck, Michael McLennan, Network for Computational Nanotechnology
Summary: nanoHUB.org is an NSF-funded science gateway that currently serves 90 interactive applications to over 6,200 simulation users who run over 300,000 simulations a year. Users set up their numerical experiments and explore the results interactively on the web site without installing any software on their local computer. The highest nanoHUB usage impact and number of users can be achieved if the time between setup and data delivery is a few minutes, if not seconds; the number of users decreases exponentially with execution time. It is therefore critical to move applications that would run for an hour or a day into the time frame of minutes. Introducing on-demand interactive computing capabilities can be a catalyst for increasing the use of Science Gateways. Communication protocols have been implemented to enable secure and authenticated data transmission between nanoHUB.org and Cornell. A tool called “NanoNet,” the first of many parallel applications on nanoHUB.org, has been converted to take advantage of the MATLAB resources. Other applications will follow, enabling the Science Gateway to serve a larger number of users.


Neurobiology and Behavior

Project: Ontogeny of Vocal Learning in Wild Parrots
University: Cornell University
Researcher: Karl Berg, Lab of Ornithology Bioacoustics Research Program
Summary: While parrots are prolific vocal learners, the developmental process of learned vocal signals has never been systematically studied in the wild. This project uses a large database of recordings collected throughout nestling development in over 100 nestlings from 25 nests of green-rumped parrotlets in Venezuela. We have identified approximately 30,000 contact calls from the nestlings and their parents that we would like to compare, which results in close to half a billion pairwise comparisons. The research code, Sound Xcorr Tool (V.2), was developed by Cornell Research Associate Kathryn Cortopassi. The code conducts spectrographic cross-correlation for every pairwise comparison of wave files in the set, followed by a Principal Coordinates analysis, conducted in MATLAB, on the correlation matrix, which allows us to map vocal signals in multivariate acoustic space.
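
A toy sketch of the two analysis stages, peak spectrographic cross-correlation followed by principal coordinates (classical multidimensional scaling via cmdscale); the waveforms, window sizes, and normalization here are illustrative stand-ins for what Sound Xcorr Tool actually does:

    % Sketch: pairwise spectrogram cross-correlation, then cmdscale.
    calls = {randn(4000,1), randn(4000,1), randn(4000,1)};  % toy waveforms
    n = numel(calls);
    specs = cellfun(@(x) abs(spectrogram(x, 256, 192)), calls, ...
                    'UniformOutput', false);
    S = eye(n);                              % similarity matrix
    for i = 1:n
        for j = i+1:n
            c = xcorr2(specs{i}, specs{j});  % 2-D cross-correlation
            S(i,j) = max(c(:)) / (norm(specs{i},'fro')*norm(specs{j},'fro'));
            S(j,i) = S(i,j);
        end
    end
    Y = cmdscale(1 - S);    % coordinates in multivariate acoustic space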


Physics

Project: Change Point Analysis of Quantum Dots
University: Swarthmore College
Researcher: Orion Sauter, Physics and Astronomy
Summary: Single semiconductor nanocrystals are of widespread interest as possible single-photon sources and single fluorophores for optoelectronic and biological applications. However, rather than fluorescing steadily, they “blink” on and off, alternating between bright (“on”) and dark (“off”) states. This intriguing phenomenon has typically been studied by determining the probability distributions of the on and off states. However, the traditional method for calculating these distributions requires choosing a time unit for the analysis, which we have found in previous work can significantly affect the probability distributions obtained, and hence the characteristic blinking timescale. Under faculty advisor Carl Grossman, we are now exploring a statistical approach which does not require such a choice. Our main function takes as input a large (1-15 million element) array representing photon arrival times, and searches for points where the system has changed state, using a statistical likelihood function. The search requires a randomly generated starting point, so we run it multiple times to ensure consistent results. These runs are independent and may be performed in parallel. The function returns parameter values associated with the probability distributions found in each run.
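
For a piecewise-constant Poisson intensity, the likelihood of a single change point has a closed form once the rates are profiled out; a sketch of that statistic scanned exhaustively (the group's search instead starts from random points, and real analyses handle multiple change points):

    % Sketch: maximum-likelihood single change point in photon arrivals.
    t = sort(rand(1e4, 1));           % placeholder arrival times on [0,T]
    T = max(t);  n = numel(t);
    logL = -inf(n-1, 1);
    for k = 1:n-1                     % candidate change after k-th photon
        tau = t(k);
        logL(k) = k*log(k/tau) + (n-k)*log((n-k)/(T - tau));
    end
    [~, khat] = max(logL);            % most likely change-point index
    tau_hat = t(khat);                % estimated switching time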

Project: Extracting and Analyzing Mosquito Motion Data
University: Cornell University
Researcher: Sarah Iams, Center for Applied Mathematics
Summary: Extracting and analyzing mosquito motion data from high-speed images of motion.

Project: Thermodynamic Properties of Frustrated Magnetic Spin-1/2 Systems
University: Grand Valley State University
Researcher: Kingshuk Majumdar, Physics
Summary: The goal of the project is to study the zero-temperature magnetic phase diagram of frustrated spin-1/2 Heisenberg antiferromagnets on different lattice structures. We plan to calculate magnetization, energy dispersion, spin stiffness, intensity, and other quantities for the system. Research enabled by MATLAB on the TeraGrid has been published in Physical Review.


Physiology and Biophysics

Project: Conductance-Based Model of Active Properties Contributing to Neural Integration
University: Weill Cornell Medical College
Researcher: Melanie Lee, Emre Aksay Lab
Summary: We are investigating the role of active dendritic properties in how a network of neurons integrates inputs over time. Temporal integration has been identified as an essential computation for motor control, working memory, and decision making. To understand the interplay between network and intrinsic cellular mechanisms in temporal integration, and to link these findings with experiments that correlate network activity with behavior, we are developing a biophysically realistic model of the fish oculomotor integrator. The model is able to integrate as the result of a careful tuning procedure that balances leakage of currents out of the membrane with network and intrinsic feedback. The tuning procedure involves reducing the conductance-based dynamics of individual neurons to a firing rate model; this permits a linear regression procedure to fit the experimentally unknown strengths of coupling between neurons. Ongoing work is focused on making the network fully bilateral, incorporating realistic morphological data, and testing the model with smooth inputs that drive the vestibuloocular reflex. The completed model will match many experimental features and may help explain common mechanisms underlying diverse processes such as motor control, working memory, and decision making.

The main MATLAB code simulates a network of N=50 recurrently connected neurons in one half of the bilateral integrator network. Each neuron consists of an integrate-and-fire somatic compartment and N conductance-based dendritic compartments. The somatic and dendritic compartments for a cell are Ohmically coupled, and linear synapses connect the cells. Currently, it takes about 200 seconds to simulate 1 second of network activity. In order to run more accurate simulations, however, somatic compartments will be converted from integrate-and-fire type to conductance-based; this will require reducing the time-step by a factor of 100, thereby increasing computing time to over 3 hours per minute of simulation time. In addition, we are working on making the network fully bilateral, thus doubling the number of compartments and increasing the number of equations to integrate by at least five-fold. We are looking to parallelize the code to perform more efficient parameter searches and believe the MATLAB on the TeraGrid would be an optimal resource on which to develop and run this code.

Project: The Dynamics of G-Protein Coupled Receptors (GPCRs) in the Cell Membrane
University: Weill Cornell Medical College
Researcher: Sayan Mondal, Harel Weinstein Lab
Summary: We are investigating the differential partitioning of GPCRs, such as the physiologically important rhodopsin and serotonin receptors, into specialized membrane nanodomains. As part of this project, we have also implemented a novel multiscale approach, using a continuum formalism to determine the membrane deformation energetics with input parameters from an atomistic model of the GPCR-membrane system. The main MATLAB code simulates the reaction-diffusion of GPCRs in a patch of cell membrane to determine their differential partitioning into membrane nanodomains as a function of membrane deformation energetics, GPCR dimerization status, and other relevant parameters.