PhD Programme in Simulation Science
What is Simulation Science?
Every facet of the physical, biological, social, economic and engineering sciences is now critically dependent on insights gained from simulations of complex systems. Simulation provides a radically new approach to Science, addressing problems beyond the scope of traditional Experimental or Theoretical Science. Simulation Science is thus central to everything from developing new materials and new technologies for energy, information processing and communication, to designing new medicines and treatments, and predicting how biological and social networks respond to different stimuli, from infections to financial crashes. The use and application of Simulation Science is critical to driving forward future innovation in the competitive environments of industry and scientific research; educating tomorrow’s scientists requires that students acquire substantial depth in these techniques. The Structured PhD Programme in Simulation Science is a new multi-institutional collaborative PhD programme involving University College Dublin, Trinity College Dublin, Queen’s University Belfast and the National University of Ireland, Galway, supported by the Irish Centre for High End Computing.
Choose from exciting doctoral projects working with world-leading research groups in:
- Molecular Simulation
- Systems Biology and Systems Medicine
- Computational Social Science
- Extreme Events and Risk
- Financial Mathematics and Computation
- Earth System Simulations
- High Performance Computing & Frontiers of Simulation Science
For detailed project descriptions see here.
Simulation Science (SimSci) Fellowships
Funded PhD Fellowships are currently available at University College Dublin on this programme. These SimSci Fellowships are fully funded for 4 years and include a stipend of €16,000 per year, an allowance for research travel and expenses, and fees for EU students.
Applications for these Fellowships are now being accepted, with the initial review process commencing on 28th January, 2011. Please follow the instructions on the SimSci application form [PhD Programme in Simulation Science Doc, PhD Programme in Simulation Science PDF].
The SimSci Fellowships are funded under the Programme for Research in Third Level Institutions (PRTLI) Cycle 5 which is co-funded by the European Regional Development Fund (ERDF). Additional funded places on the Structured PhD Programme in Simulation Science will be available through other national and international Research Projects.
The Structured PhD Programme
The Structured PhD Programme in Simulation Science develops a new paradigm for interdisciplinary postgraduate education and is aimed at students with strong backgrounds in Mathematics and Computation who want careers using and developing advanced computational modelling and simulation tools. At the heart of the graduate programme is a research experience within outstanding nationally funded research programmes that positions students at the leading edge of exploration and real-world application. The training programme will provide substantial depth in computational techniques, mathematical modelling, and data-intensive science, and give students unparalleled opportunities for Simulation Science research in their scientific, social science or engineering disciplines.
The Structured PhD Programme in Simulation Science will be a 360 ECTS, 4-year, full-time PhD in which the student acquires at least 60 ECTS credits of advanced training (Level 4) through a systematic and personalised programme of learning, and is also required to successfully complete substantial original doctoral research (Level 5). The programme comprises two stages: in Stage 1, students develop core research skills through taught modules, define a research plan and initiate original research work; in Stage 2, the focus is on original doctoral research, complemented by advanced training. Core intensive modules in Mathematical Modelling, Data Intensive Science, and High Performance Computing, coupled with specialist discipline-specific training, will rapidly bring SimSci Fellows up to speed with state-of-the-art simulation techniques. Innovation modules, taught through the UCD-TCD Innovation Academy with corporate partners and business faculty, provide critical complementary skills.
The UCD Structured PhD Programme may be understood in terms of the following structure:
For detailed descriptions of UCD Structured PhD Programmes see here.
In Stage 1 of the Structured PhD Programme in Simulation Science students will take 40 ECTS credits of taught modules, encompassing core Simulation Science and Innovation offered through the Innovation Alliance between Trinity College Dublin and University College Dublin. The modules are flexible in structure and are delivered in the manner most appropriate to the topic: lectures, labs, intensive courses, seminar series, project-based learning, etc.
There are 6 core modules, each worth 5 ECTS:
- Numerical Methods
- Mathematical Methods
- Origins of Innovation
- Either: Scientific Programming - Python, R
Or: High Performance Programming - HPC, MPI, C++
- Computational Statistics
- Opportunity Generation and Recognition
These modules are offered in conjunction with Trinity College Dublin and the Irish Centre for High End Computing.
Students will take at least 2 further optional modules according to their area of research. Sample topics include:
- Simulation Modelling and Analysis
- Natural Computing and Applications
- Mathematical Biology
- Computational Biophysics and Nanoscale Simulations
- Case Studies in Computational Science
- Stochastic Methods
- Nanomechanics - from Single Molecules to Single Cells
- Computational Quantum Chemistry
- Complexity Science and Social Simulation
- Social Networks and Simulation
- Social and Economic Networks
- Multi-Agent Systems
- Cloud Computing
- GPGPU Programming
- Natural Computing Methods in Finance
- Innovation Networks
These modules are offered in conjunction with Trinity College Dublin, Queen’s University Belfast, the National University of Ireland, Galway, and the Irish Centre for High End Computing.
Additional activities undertaken in the Structured PhD Programme in Simulation Science include specialist workshops, a regular seminar programme, hot-topic workshops, international internships and Summer Schools.
If you have any questions please contact us at email@example.com.
ICT Solutions for Innovation Policy Modelling (Principal Supervisor: Prof. Petra Ahrweiler, Business)
Context and Objective: Research will focus on the development of advanced ICT tools for innovation policy modelling, prediction of policy impacts, development of new governance models and collaborative solving of complex innovation problems.
This research will result in innovative ICT solutions (including open source solutions) that enable one or more of the following:
- Modelling new innovation policy initiatives taking into account all relevant parameters.
- Performing societal simulations to forecast potential impacts of proposed innovation policy measures.
Work Plan: The project will be hosted by the Innovation Research Unit of University College Dublin. At the Innovation Research Unit (IRU), all projects are carried out using the innovative methodology of the unit, which is a combination of empirical quantitative and qualitative research, social network analysis and agent-based modelling.
The successful candidate will:
- Review the literature on ICT solutions for innovation policy modelling
- Collect existing models and identify gaps and limitations
- Choose a case study from the Irish and/or European context for realistic modelling
- Collect data and build a dedicated database
- Build data-driven simulations, starting from the model existing in IRU's Computational Policy Lab
- Develop a class of alternative and complementary models around it
- Carry out experiments and test innovation policy scenarios using the various models
- Evaluate ICT solutions for innovation policy modelling
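The combination of agent-based modelling and policy experimentation that the work plan describes can be illustrated with a deliberately minimal sketch. This is not the IRU's actual model: the agents, the knowledge-exchange rule, and the "subsidy" policy lever below are all invented for illustration.

```python
import random

def run_model(n_agents=50, steps=100, subsidy=0.1, seed=42):
    """Toy innovation-network ABM: firms on random collaboration ties
    exchange knowledge; a policy lever (subsidy probability) adds funded
    R&D increments.  Returns the mean knowledge level at the end."""
    rng = random.Random(seed)
    knowledge = [rng.random() for _ in range(n_agents)]
    links = [(rng.randrange(n_agents), rng.randrange(n_agents))
             for _ in range(2 * n_agents)]          # random collaboration ties
    for _ in range(steps):
        for a, b in links:
            # partners converge toward the more knowledgeable one
            hi = max(knowledge[a], knowledge[b])
            knowledge[a] += 0.1 * (hi - knowledge[a])
            knowledge[b] += 0.1 * (hi - knowledge[b])
        for i in range(n_agents):
            if rng.random() < subsidy:              # policy: funded R&D
                knowledge[i] = min(1.0, knowledge[i] + 0.01)
    return sum(knowledge) / n_agents

# a simple policy experiment: compare no subsidy against a 20% subsidy rate
baseline = run_model(subsidy=0.0)
funded = run_model(subsidy=0.2)
```

Running several such scenarios against each other, and then against empirical data, is the basic loop of the experiments listed above.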
Computational Chase of Economic Returns for R&D Investment (Principal Supervisor: Prof. Petra Ahrweiler, Business)
Context and Objective: Policymakers have identified innovation as one of the most important policy targets for dealing with challenges such as the current economic and financial crisis. The role of innovation for modern economies is immediately obvious looking at income distributions and the share of knowledge-intensive industries in different world regions: the correlation is significant – high-tech regions match with high-income regions.
The extensive evidence for this correlation has long been monitored and documented in much detail by international and national institutions. However, these analyses mostly provide evidence using correlations in econometric data. They say little about causal chains and mechanisms, about the traceable line from investment to result. Empirical evidence proving a direct and immediate profitability of R&D investment is scarce.
Especially in a time of diminished public resources and difficult capital markets, this is not acceptable. The strong need to justify public and private investments produces a tendency in innovation policy, business management and public discourse to expect that current investments in R&D, higher education institutions, science-industry networks etc. will immediately produce a flow of products and processes with high commercial returns. The requirement is to see value for money, and that means money for money: if there is a considerable investment as input, there must be a considerable, beneficial output that can be directly traced to this input. This research is about developing tracing and backtracking procedures that make it possible to illuminate the relationship between R&D and economic welfare.
Work Plan: The project will be hosted by the Innovation Research Unit of University College Dublin. At the IRU, all projects are carried out using the innovative methodology of the unit, which is a combination of empirical quantitative and qualitative research, social network analysis and agent-based modelling.
The successful candidate will:
- Review the literature on the role of R&D in the economy
- Choose knowledge-intensive blockbuster products and processes as case study examples
- Carry out a market analysis for these products/processes
- Develop a methodology to decompose each product/process into its knowledge components
- Develop theoretical and computational backtracking and tracing procedures in IRU's Computational Policy Lab
- Develop a class of models able to show the relation between R&D inputs and economic returns for knowledge-intensive products and processes
Validating Simulations of Social Innovation Systems (Principal Supervisor: Prof. Petra Ahrweiler, Business)
Context and Objective: The aim of social simulation is not primarily to reproduce statistical observations of society, but rather to gain a dynamic and detailed description of a complex system in which we can observe the consequences of changing features and parameters. Such simulations serve as a laboratory to experiment with social life and to test our theories about it in a way that cannot be done empirically, for methodological reasons. In particular, data concerning innovation performance, knowledge development, etc. are often empirically unavailable, or only incompletely available. Of course, we nevertheless need to validate simulations of social systems such as innovation systems. Social simulation is social theory running on a computer: it produces data as the outcome of the processes and mechanisms implemented in the artificial social world. To test and possibly falsify the computational theory, we need validation exercises confronting the model results with real-world data. To give credibility to the model results and connect them to other (empirical) research in the same field, the relation between the artificial data created by the model and empirical longitudinal data needs to be given some attention.
This research is about this crucial relationship between data and modelling and the requirement of validation. It will progress the whole field of social simulation, and especially, the sub-part of it dealing with innovation systems.
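The confrontation of artificial model data with empirical data can be sketched in its simplest form as follows. The data below are toy stand-ins (a well-specified and a deliberately mis-specified "model"), and the two-sample Kolmogorov-Smirnov distance is just one of several possible discrepancy measures a validation framework might use.

```python
import numpy as np

def ks_distance(a, b):
    """Maximum gap between the empirical CDFs of two samples."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.abs(cdf_a - cdf_b).max()

rng = np.random.default_rng(0)
empirical = rng.lognormal(0.0, 1.0, 500)     # stand-in for observed data
good_model = rng.lognormal(0.0, 1.0, 500)    # same generating process
bad_model = rng.lognormal(1.0, 1.0, 500)     # mis-specified model output

d_good = ks_distance(empirical, good_model)
d_bad = ks_distance(empirical, bad_model)
# a smaller distance means the artificial data are statistically closer
# to the empirical data
```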
Work Plan: The project will be hosted by the Innovation Research Unit of University College Dublin. At the IRU, all projects are carried out using the innovative methodology of the unit, which is a combination of empirical quantitative and qualitative research, social network analysis and agent-based modelling.
The successful candidate will:
- Review the literature on validation of social simulations
- Develop a theoretical framework of validation requirements for simulations of innovation systems
- Test existing innovation models against these requirements, identifying shortcomings and gaps
- Develop computational “middleware” procedures to address these shortcomings and gaps
- Develop an integrated validation workbench for innovation models in IRU's Computational Policy Lab
- Demonstrate general applications for social simulation
Modelling environmental effects on coherent excitation energy transfer in photosynthetic light harvesting complexes (Principal Supervisor: Prof. David Coker, Physics)
Context and Objective: Theoretical and computational methods and models will be developed and applied to study excitation energy transfer in photosynthetic light-harvesting chromophore arrays. In these systems excitons apparently move in coherent multichromophore quantum superposition states, travelling large distances with essentially no energy loss to deposit their energy efficiently into reaction centres where transformation and storage are initiated. Experiments suggest that this lossless energy transmission is due to correlated motions of the nanostructured protein scaffolding in which the chromophores are embedded. Large-scale classical simulations of these systems will be carried out to parameterize multistate quantum system-bath models whose coherent quantum dynamics can be studied with recently developed mixed quantum-classical methods.
This new simulation methodology will be applied to study experiments in two areas: (1) In collaboration with the Toronto group of Scholes, calculations will be carried out for comparison with their experiments to explore the dependence of multichromophore network energy transfer efficiency on network geometry, environmental spectral density characteristics, chromophore electronic coupling, and other model network properties. The objective of these studies is to explore the design principles at work in natural photosynthetic light harvesting arrays. (2) The second application project will be conducted in collaboration with the Dublin group of Thampi and involves constructing reduced dimensional models of harvesting networks coupled to dye sensitized solar cell (DSSC) materials to explore how interfacing biological light harvesting antenna networks with photovoltaic materials might be optimized to design highly efficient natural harvesting - synthetic transformation - hybrid materials.
Work Plan: The PhD student will work in Dublin running large-scale molecular simulations of the FMO trimer bacterial photosynthetic light-harvesting complex in the presence of a model membrane and a slab of DSSC material. Chromophore configurations from the molecular mechanics trajectory will be used in TDDFT electronic structure calculations to compute the distributions of chromophore excitation energies and the time correlation function of the excitation-energy fluctuations of the chromophores in different parts of these nanostructured systems. These quantities define the spectral densities and site energies that characterize a reduced model of this exciton energy transfer system, which can be treated with various mixed quantum-classical dynamics techniques. The student will work for several months in Rome to incorporate this reduced model into the mixed quantum-classical approach developed by the Ciccotti group and co-workers. Various approximate quantum dynamics methods will be studied to explore the effects of inhomogeneous chromophore environments and correlated environmental fluctuations on the coherent excitation energy transfer dynamics in these systems.
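The key post-processing step here, estimating the time correlation function of the excitation-energy fluctuations (the quantity that defines the bath spectral density), can be sketched generically. The AR(1) series below is a synthetic stand-in for real TDDFT site-energy output; the estimator itself is standard.

```python
import numpy as np

def fluctuation_acf(e, nlags):
    """C(t) = <dE(0) dE(t)> of the fluctuations dE = E - <E>, via FFT."""
    de = e - e.mean()
    n = len(de)
    f = np.fft.rfft(de, 2 * n)                 # zero-pad to avoid circular wrap
    raw = np.fft.irfft(f * np.conj(f))[:nlags]
    return raw / np.arange(n, n - nlags, -1)   # unbiased lag normalisation

# correlated synthetic "excitation energy" series standing in for TDDFT data
rng = np.random.default_rng(1)
e = np.empty(4096)
e[0] = 0.0
for i in range(1, len(e)):
    e[i] = 0.9 * e[i - 1] + rng.standard_normal()

acf = fluctuation_acf(e, 50)
# acf[0] equals the variance of the fluctuations; the decay of acf sets the
# bath correlation time entering the spectral density
```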
Minor host: University of Rome (Giovanni Ciccotti)
Active motion in confinement (Principal Supervisor: Vladimir Lobaskin, Physics)
Context and Objective: Bacterial growth and movement in confined spaces is ubiquitous in nature and plays an important role in fields ranging from soil microbiology and water purification to microbial pathogenesis. The majority of bacteria in soil and bedrock live in pores of size 6 micrometres and smaller. These bacteria constitute a large portion of the Earth's biomass and are essential for the functioning of soil. Although distributions of bacteria in soil and the Earth's subsurface have been studied, it is largely unknown how bacteria grow, move, and penetrate pores of very small size. For example, when E. coli cells are close to a surface they trace out clockwise (when viewed from above the surface) circular trajectories, and are observed to stay near the surface for long periods of time, enhancing the probability of their adhesion to the substrate. Consequently, the motility of E. coli near surfaces is important in the early stages of biofilm formation and pathogenic infection. It has also been found that the swimming speed increases with distance from the boundary. Several experimental and theoretical studies have discussed the effects of restricting geometry on bacterial motility. It has been established that E. coli bacteria can swim in channels 2.0 μm and wider without appreciable slowdown, and that bacteria regularly swim in close proximity to surfaces, preferring some types of surfaces to others. It has been shown that these behaviours can be used to guide bacterial movement in microfluidic structures.
Our simulation methodology is based on a combination of the Lattice-Boltzmann (LB) method for the solvent and Langevin dynamics for the solute particles (bacteria) and surfaces; the method includes hydrodynamic interactions. The coarse-grained solutes will be modelled in the LB solvent using a dissipative-coupling or solid-surface algorithm. The second method of modelling is Langevin dynamics based on the active Brownian particle model (the standard Langevin equation for the velocity with an added thrust term). A numerical study will attempt to describe the main effects of (i) the interaction of the swimmer with a wall, (ii) the motion of a swimmer in a narrow cylindrical or planar channel, and (iii) the characteristics of the confined random walk describing the swimmer's motion.
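The second method can be sketched in a few lines. This is a minimal, illustrative version only (assumed parameters, overdamped limit of the Langevin equation rather than the full velocity equation): a constant thrust along a rotationally diffusing orientation, with the swimmer confined between two reflecting walls.

```python
import numpy as np

def simulate_abp(v0=1.0, Dr=0.5, Dt=0.01, L=5.0, dt=1e-3, steps=20000, seed=0):
    """Active Brownian particle in 2D, confined to 0 <= y <= L by
    reflecting walls; Euler-Maruyama integration."""
    rng = np.random.default_rng(seed)
    x, y, theta = L / 2, L / 2, 0.0
    traj = np.empty((steps, 2))
    for i in range(steps):
        # translational update: thrust along the orientation + thermal noise
        x += v0 * np.cos(theta) * dt + np.sqrt(2 * Dt * dt) * rng.standard_normal()
        y += v0 * np.sin(theta) * dt + np.sqrt(2 * Dt * dt) * rng.standard_normal()
        # rotational diffusion of the swimming direction
        theta += np.sqrt(2 * Dr * dt) * rng.standard_normal()
        # reflecting walls confine the swimmer to the channel
        if y < 0:
            y = -y
        if y > L:
            y = 2 * L - y
        traj[i] = (x, y)
    return traj

traj = simulate_abp()
# statistics of traj (e.g. wall residence times, mean-square displacement
# along x) characterise the confined random walk
```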
Work Plan: The PhD student will work in Dublin and Lyon on simulations of a single swimmer next to a solid wall. In the next stage, more complex geometries will be used, including a narrow planar channel or pipe. The Lyon part of the project will include modelling of an active Brownian particle via Langevin dynamics and determining the statistical characteristics of a confined random walk, as well as collaboration with the group of JP Rieu on experimental studies of bacterial motility in confinement.
Minor host: University of Lyon I, France (Anne Tanguy)
Master Equation-based QM/MM Modelling of Structure and Dynamics in Photosystem II (Principal Supervisor: Nicolae-Viorel Buchete, Physics)
Context and Objective: Photosynthesis is crucial to life on Earth, constituting the major process of O2 generation and providing at the same time an ecological mechanism for CO2 capture and carbon-based biosynthesis. A significant amount of recent research has focused on Photosystem II (PII), a large multi-component transmembrane protein complex found in green-plant chloroplasts and cyanobacteria. PII uses a 20-subunit photon-antenna system of chlorophyll pigments and quinone cofactors to harvest solar light and to achieve water oxidation through a photosynthetic mechanism that presents smaller activation barriers than chemical or electrochemical water dissociation. Recent high-resolution X-ray diffraction data have allowed detailed structural studies of PII, and of its oxygen-evolving complex (OEC) in particular (the site of water oxidation and electron production). The size of the PII complex obviously prohibits a full quantum mechanical treatment. On the other hand, a crucial limitation of classical MD simulations is their reliance on a predefined potential energy surface (PES) based on prior knowledge of the molecular structure and bonding pattern. In this work, we use the latest hybrid quantum mechanics/molecular mechanics (QM/MM) methods, which allow for chemical bond breaking during the dynamical evolution of the system in the QM region of interest.
This project will use our recently developed methods based on coarse master equations (CMEs), in conjunction with structural details revealed by previous QM/MM structural studies, to investigate and further refine our detailed understanding of the structural and kinetic coupling between the active site in the OEC vicinity and the neighbouring molecular complex. We can thus address important outstanding questions such as how the intrinsic metastable conformational states of the protein complex affect the QM region and the corresponding QM/MM free energy profiles. Our CME-based analysis also offers a systematic framework for coarser-level modelling, bridging the gap between different levels of representation (e.g., from QM to atomistic and residue- or domain-based) of the slow relaxation processes. Most importantly, this work could lead to identifying the most important steps and factors that control the efficiency of energy transduction in the complex PII system.
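As a minimal illustration of the coarse-master-equation idea (the three states and all rates below are invented, standing in for metastable conformational states identified from simulation data): state probabilities evolve as dp/dt = K p, where the columns of the rate matrix K sum to zero so that probability is conserved, and the stationary distribution is the null vector of K.

```python
import numpy as np

# illustrative 3-state rate matrix; off-diagonal K[i, j] is the rate j -> i,
# and each column sums to zero (probability conservation)
K = np.array([[-0.6,  0.3,  0.1],
              [ 0.4, -0.5,  0.2],
              [ 0.2,  0.2, -0.3]])

p = np.array([1.0, 0.0, 0.0])        # start fully in state 0
dt, steps = 0.01, 5000
for _ in range(steps):
    p = p + dt * (K @ p)             # forward-Euler propagation of dp/dt = K p

# stationary distribution: null eigenvector of K, normalised to sum to 1
w, v = np.linalg.eig(K)
pi = np.real(v[:, np.argmin(np.abs(w))])
pi = pi / pi.sum()
# after long times, p should have relaxed onto pi
```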
Work Plan: The research will focus initially on (a) modelling and testing the initial conditions for QM/MM simulations focused on the OEC-region dynamics and its coupling with the surrounding protein, (b) performing a systematic CME-based analysis to test and identify the kinetic states specific to both the QM and MM regions, and (c) addressing the challenging problem of coupling the QM dynamics to the emerging CME-based coarse-grained representation of the PII complex. Expertise in the notoriously challenging multiscale analysis of both QM/MM and coarse-grained simulation results is maintained through close collaboration with the University of Rome biophysics group.
Minor host: University of Rome (Giovanni Ciccotti)
Rogue Waves and Extreme Events in Hydrodynamic-Optical Systems (Principal Supervisor: Prof. Frédéric Dias, Mathematical Sciences)
Context and Objective: As part of a project combining mathematical modelling and experiments on nonlinear instabilities, rogue waves and extreme phenomena, a PhD student is required to carry out numerical studies at UCD, in close collaboration with the University of Besançon optics group. Several mechanisms, linear and/or nonlinear, have been proposed for the formation of rogue waves. A primary role of the School of Mathematical Sciences of University College Dublin is to study the formation of rogue waves in various models, ranging from the full water wave equations to nonlinear Schrödinger (NLSE) type equations. The advantage of NLSE models is that they arise in optics as well as in hydrodynamics.
Work Plan: It has been recognized that one of these mechanisms is the modulational (Benjamin-Feir) instability of a plane propagating wave, and it has recently been shown that this instability is also a relevant mechanism leading to the spontaneous formation of rogue waves in optical physics. One of the important goals of the project is therefore to study the modulational instability in the context of incoherent waves, possibly through a coupled set of equations for the evolution of the incoherent and coherent parts of the field. Another goal is to use a statistical approach for the description of weakly nonlinear interacting waves, including the influence of coherent structures such as solitons.
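The modulational instability is easy to exhibit numerically. The sketch below (illustrative parameters, first-order split-step Fourier integration of the focusing NLSE i·psi_t + ½·psi_xx + |psi|²·psi = 0) seeds a unit-amplitude plane wave with a weak long-wavelength perturbation inside the Benjamin-Feir instability band and watches the sideband grow while the L2 norm is conserved.

```python
import numpy as np

N, L = 256, 20 * np.pi
x = np.linspace(0, L, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
dt, steps = 1e-3, 30000

# plane wave of unit amplitude plus a weak seed at K = 0.1 (< 2, so inside
# the modulational instability band for a unit background)
psi = (1.0 + 1e-3 * np.cos(2 * np.pi * x / L)).astype(complex)
p0 = np.abs(psi - psi.mean()).max()
norm0 = np.sum(np.abs(psi)**2) * (L / N)

lin = np.exp(-0.5j * k**2 * dt)                   # exact linear (dispersion) step
for _ in range(steps):
    psi = np.fft.ifft(lin * np.fft.fft(psi))      # dispersion in Fourier space
    psi *= np.exp(1j * np.abs(psi)**2 * dt)       # pointwise nonlinear phase

norm = np.sum(np.abs(psi)**2) * (L / N)           # conserved by both substeps
growth = np.abs(psi - psi.mean()).max() / p0      # sideband amplification
```

At the final time the perturbation is still small, so the measured growth can be compared against the linear Benjamin-Feir growth rate.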
The candidate should have knowledge of the numerical integration of partial differential equations, and of either fluid mechanics or optics.
The candidate will work with mathematicians as well as physicists, and aspects of the work will involve interactions with experimental teams to adapt numerical and theoretical models to describe realistic wave propagation scenarios.
Minor host: University of Besançon (John Dudley)
Numerical code for tsunami propagation running on the GPU (Principal Supervisor: Prof. Frédéric Dias, Mathematical Sciences)
Context and Objective: As part of a project studying tsunami generation, propagation and inundation, a PhD student is required to transform an existing code running on CPU into a code running on the GPU (Graphics Processing Unit). The new code is expected to run faster and to allow the inclusion of more complex physics.
Work Plan: The existing code is based on the nonlinear shallow water equations, which are solved numerically by a finite volume solver. The code can handle all kinds of bathymetric data. The transfer of the code will be done in close collaboration with ICHEC, the Irish Centre for High-End Computing. The candidate should have knowledge in the numerical integration of partial differential equations, and in computer science. The candidate will work with mathematicians as well as physicists, and aspects of the work will involve interactions with experimental teams to adapt numerical and theoretical models to describe realistic wave propagation scenarios.
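Before porting such a solver to the GPU, it helps to have its structure in mind. The toy sketch below is not the existing code: it is a one-dimensional, flat-bottom version with illustrative parameters and a Rusanov (local Lax-Friedrichs) flux, showing the per-interface flux computation that a GPU port would typically map to one thread per interface.

```python
import numpy as np

g = 9.81  # gravitational acceleration

def flux(h, hu):
    """Physical flux of the 1D shallow water equations."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h**2])

def step(h, hu, dx, dt):
    """One finite-volume update with the Rusanov numerical flux."""
    U = np.array([h, hu])
    F = flux(h, hu)
    c = np.abs(hu / h) + np.sqrt(g * h)                 # local wave speed
    a = np.maximum(c[:-1], c[1:])                       # speed at interfaces
    Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
    Un = U.copy()
    Un[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])   # interior cells only
    return Un[0], Un[1]

N, dx, dt = 200, 0.05, 0.002
h = np.where(np.arange(N) < N // 2, 2.0, 1.0)           # dam-break initial state
hu = np.zeros(N)
mass0 = h.sum() * dx
for _ in range(300):
    h, hu = step(h, hu, dx, dt)
# while the waves stay inside the domain, total mass is conserved
```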
Numerical and Analytical Aspects of Singular Behaviour in Magnetohydrodynamics (Principal Supervisor: Miguel Bustamante, Mathematical Sciences)
Context and Objective: This project builds on our novel ideas and geometrical insight to solve open problems in fluid dynamics. One of the major open problems in Mathematics is: Do solutions of 3D and 2D ideal magnetohydrodynamics (MHD) equations remain smooth for all times or do they blow up in a finite time? We propose a novel approach that combines mathematical analysis and numerical methods. The new approach responds to the urgent need of a fresh viewpoint regarding this open problem, by effectively solving the main drawbacks of state-of-the-art numerical simulations of Euler fluid equations and MHD.
Currently, given an initial condition with finite energy, it is not known if the solution has a finite-time blowup (singularity). Analytically, only conditional results have been produced, the most important being [R.E. Caflisch, I. Klapper, G. Steele, Comm. Math. Phys. 184, 443-455 (1997)], which states that the solution has a finite-time singularity at time T if and only if the time integral of the sum of the supremum norms of the vorticity and the electric current density diverges at time T. The hypothesis of finite-time singularity says that this integral is divergent.
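In symbols (restating the criterion quoted above in standard notation, with ω = ∇ × u the vorticity and j = ∇ × B the electric current density):

```latex
% Caflisch-Klapper-Steele criterion for ideal MHD: the solution develops a
% finite-time singularity at time T if and only if
\int_0^T \left( \|\omega(\cdot,t)\|_{L^\infty} + \|j(\cdot,t)\|_{L^\infty} \right) \mathrm{d}t \;=\; \infty .
```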
We will establish a new numerical method to determine a robust validity test of the hypothesis of finite-time singularity of the MHD equations. Our new approach consists of the numerical integration of a globally regular system, found recently by the Principal Supervisor [preprint: arXiv:1007.2587 (2010)] to be bijectively related to the MHD equations. Solving numerically a globally regular system has a great competitive advantage over a direct integration of the MHD equations.
This project is embedded in a large-scale international collaboration, funded by several agencies: IRCSET (Ireland), EGIDE (France), IUTAM (International Union of Theoretical and Applied Mechanics). The Principal Supervisor is the chairman of an IUTAM international symposium to take place in UCD Dublin in July 2012 on the topic of extreme events, of direct relevance to the current project.
Work Plan: Numerical simulations will be performed at low and medium resolution on a local cluster at UCD, and at high resolution on a Blue-Gene/P supercomputer at the Irish Centre for High End Computing, using Message-Passing-Interface (MPI) pseudospectral codes.
Work Package 1. The PhD student will develop a significant piece of work: a new parallel code to integrate numerically the new regular mapped system. The new code is based on two validated codes of which the Principal Supervisor has first-hand experience. Test simulations will be produced at low and medium resolution, with an emphasis on finding the most stable algorithm to determine the non-linear damping appearing in the globally regular system. The new code will be validated by monitoring quantities related to energy and circulation, and by direct comparison of its output with the mapped output of existing validated MHD codes.
Work Package 2. The PhD student and the Principal Supervisor will run medium- to high-resolution simulations of the new regular system in order to draw conclusions on the finite-time singularity hypothesis of the MHD system, with an emphasis on the effect of initial conditions. The PhD student will improve optimisation methods which the Principal Supervisor is currently developing to generate such initial conditions.
Minor hosts: University of Warwick (Robert Kerr), ENS (Marc Brachet)
Numerical simulations of three-dimensional phenomena: phase separation, mixing, and turbulence (Principal Supervisor: Lennon Ó Náraigh, Mathematical Sciences)
Context and Objective: Phase separation, mixing, and turbulence: three disparate phenomena that share a common mathematical feature. Phase separation occurs when a homogeneous mixture of distinct elements, cooled below a critical temperature, separates into its component parts. Mixing is the process by which the distribution of a tracer dye is rendered homogeneous by stirring and diffusion. Turbulence in fluids is characterised by complicated eddy-type motion and is produced by nonlinear interactions between all lengthscales in the problem. In each of these problems, diffusion plays an important physical and mathematical role. Physically, diffusion results in a 'smearing out' of small scales, while mathematically it leads to an (apparently) well-posed set of equations to solve. Such equations are amenable to numerical simulation by spectral methods, wherein the differential operators involved in the problem are inverted in Fourier space, leading to substantial speedups in computational time. Indeed, the Principal Supervisor has already solved such problems in two dimensions, where numerical simulations are readily achieved within hours on a desktop computer.
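The Fourier-space treatment of the diffusion operator can be illustrated on the simplest possible problem. The sketch below is a toy 2D heat equation, not the Cahn-Hilliard solver itself: it applies the exact per-mode propagator, so each timestep costs only two FFTs and a pointwise multiply.

```python
import numpy as np

N, L, D, dt = 64, 2 * np.pi, 1.0, 0.01
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
kx, ky = np.meshgrid(k, k, indexing="ij")
# exact per-mode propagator for u_t = D (u_xx + u_yy) on a periodic box
decay = np.exp(-D * (kx**2 + ky**2) * dt)

x = np.linspace(0, L, N, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(X) * np.sin(Y)            # modes with |k|^2 = 2

for _ in range(10):                  # advance to t = 0.1
    u = np.real(np.fft.ifft2(decay * np.fft.fft2(u)))

# the sin(x)sin(y) field decays as exp(-2 D t), exactly, mode by mode
```

For the Cahn-Hilliard equation the same idea is used semi-implicitly: the stiff linear (hyperdiffusive) operator is treated in Fourier space while the nonlinear terms are evaluated in physical space.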
It is well known that turbulence in two dimensions differs drastically from that in three dimensions. In particular, in two dimensions there is an 'inverse cascade' of energy to larger scales, leading to large-scale, coherent vortices. The effects of dimensionality are less well understood in the case of mixing and phase separation, and the goal of the project is to simulate these phenomena in three dimensions.
Work Plan: The first objective is to scale up existing numerical codes for phase separation, mixing, and turbulence to three dimensions, and to implement parallel algorithms to reduce the simulation time. Once these codes are written and tested, they can be implemented on the available high-performance computing platforms, and some of the outstanding questions concerning phase separation and mixing in three dimensions can be tackled.
The project will be hosted by the School of Mathematical Sciences of University College Dublin, where the research will focus on (a) writing and testing the codes for three-dimensional simulation of phase separation, mixing and fluid flow; (b) performing parametric and computational topology studies of the Cahn-Hilliard equation for phase separation; (c) undertaking parametric and postprocessing studies of the advective Cahn-Hilliard equation for stirred binary liquids, paying particular attention to the effects of dimensionality on filament stretching and mixing; and (d) performing numerical simulations of forced three-dimensional turbulence to drive the advection-diffusion equation and to investigate the alignment dynamics of the tracer gradient in three dimensions.
Design and implementation of parallel algorithms for scientific computing on heterogeneous HPC platforms (Principal Supervisor: Alexey Lastovetsky, Computer Science & Informatics)
Context and Objective: Computational science is now commonly considered a third pillar of science, complementing experimentation and theory. It is concerned with constructing mathematical models and using computers to solve scientific problems. The computing platforms used by computational scientists are becoming increasingly heterogeneous, hierarchical, and complex. At the lowest level, the latest CPUs are multicore processors whose cores may be general-purpose or highly specialized. Several such processors can in turn be assembled into a symmetric multiprocessor, which may also have access to powerful specialized processors, such as Graphics Processing Units (GPUs), that are increasingly used for programmable computing. Hardware trends anticipate a further, hierarchical increase in the number of cores per chip, increasing the overall heterogeneity still more as vendors move towards extreme-scale systems of millions of processors and cores. Moreover, multicore chips will soon be fully heterogeneous, mixing general-purpose cores not only with GPUs but with other special-purpose cores (e.g. for multimedia, recognition, or networking).
Work Plan: These modern and future parallel computing platforms require new types of algorithms to fully exploit their performance potential; traditional parallel algorithms developed for homogeneous multiprocessors will not work. The proposed research project will focus on the design and implementation of such new parallel algorithms. Design and analysis of fundamental models and algorithms for these platforms will be followed by their implementation as software suitable both for autonomous use by application programmers and for easy incorporation into system and mathematical programming systems and packages.
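A basic building block of such heterogeneity-aware algorithms is data partitioning in proportion to measured device speeds; the sketch below is a deliberately simple illustration of the idea (not tied to any particular library or platform).

```python
def partition_rows(n_rows, speeds):
    """Split n_rows among devices in proportion to their measured speeds."""
    total = sum(speeds)
    counts = [int(n_rows * s / total) for s in speeds]
    # hand the rounding leftovers to the fastest devices
    leftover = n_rows - sum(counts)
    by_speed = sorted(range(len(speeds)), key=lambda i: -speeds[i])
    for i in range(leftover):
        counts[by_speed[i]] += 1
    return counts

# devices measured 4x and 2x faster than the baseline get proportionally more rows
print(partition_rows(1000, [4.0, 2.0, 1.0]))   # [572, 286, 142]
```

Real heterogeneous platforms need far more sophisticated performance models (speed as a function of problem size, communication cost, hierarchy), which is precisely what the project would investigate.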
Green scientific programming: models, algorithms and tools for development of scientific applications optimized for both performance and power consumption (Principal Supervisor: Alexey Lastovetsky, Computer Science & Informatics)
Context and Objective: The development of scientific applications, known as scientific programming, traditionally aims to optimize performance: of two applications solving the same problem, the one that solves it faster is considered the better application. We are now observing an exponential increase in the use of computing in research, development, engineering and many other fields where performance has traditionally been paramount. The scale of scientific computing is increasing in two dimensions. First, more and more scientists use high-performance computing in their research and development. Second, HPC systems are growing ever larger, approaching millions of computing devices available to a single application. This rapidly increasing scale of computing brings a correspondingly rapid increase in power consumption, which is becoming a real issue: the annual electricity bill for top computing systems could reach hundreds of millions of dollars in the very near future. Power consumption is therefore quickly becoming as important an optimization criterion for scientific applications as performance.
Work Plan: The proposed project will focus on the development of models, algorithms, methods and programming tools for building scientific applications optimized not only for performance but also for power consumption. The target platforms are modern and future HPC systems, which are becoming increasingly heterogeneous, hierarchical, and complex. The methods proposed by the project will help scientific programmers develop self-adaptable applications that automatically, at runtime, distribute computations and communications so as to optimize multiple user-specified criteria, including performance and power consumption (say, minimize the cost of solving the problem, including the cost of electricity and possibly other resources, given that the problem must be solved by tomorrow morning).
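In its simplest form, the performance/power trade-off is a constrained optimisation: minimise energy subject to a deadline. The sketch below uses entirely hypothetical measurements for one kernel to show the shape of the decision the proposed runtime methods would automate.

```python
def choose_config(configs, deadline_s):
    """Among (name, runtime_s, power_W) tuples, pick the configuration that
    minimises energy (power x time) while still meeting the deadline."""
    feasible = [(t * p, name) for name, t, p in configs if t <= deadline_s]
    if not feasible:
        raise ValueError("no configuration meets the deadline")
    return min(feasible)[1]

# hypothetical measurements for one kernel on one node
configs = [
    ("high-freq", 10.0, 200.0),   # fastest, 2000 J
    ("mid-freq",  16.0, 100.0),   # slower but only 1600 J
    ("low-freq",  30.0,  60.0),   # cheapest per second, misses tight deadlines
]
```

With a 20-second deadline the mid-frequency configuration wins on energy even though it is slower; tighten the deadline to 12 seconds and the high-frequency configuration becomes the only feasible choice.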
Reconstructing Regulatory Biological Networks from Systematic Perturbation Data (Principal Supervisors: Boris Kholodenko and Walter Kolch, Systems Biology Ireland)
Context and Objective: Advances in high throughput genomics and proteomics analyses have facilitated the acquisition of large data sets of the gene expression levels and activities of signalling proteins. However, current methodologies for analysis do not permit interpretation of the data in a manner that reveals the wiring and connections within the biological networks that control the cellular machinery. Our aim is to understand how perturbations, such as growth factors and therapeutic drugs, influence the expression of genes and how we can reconstruct such complex regulatory networks. We have developed a dynamical network reconstruction approach called Modular Response Analysis (MRA) to infer the topology and strength of network connections from experimental data on network responses to perturbations (Kholodenko et al (2002), Proc. Natl. Acad. Sci. USA, 99, 12841-12846; Sontag et al (2004), Bioinformatics, 20, 1877-1886). This method uses measurements of how systematic local perturbations of the network affect its global response in order to reconstruct the dynamic network structure.
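The core computation behind MRA can be sketched in a few lines: the local (direct) connection coefficients are obtained by inverting the matrix of measured global responses and rescaling its rows. The two-node network below is made up for illustration; real applications involve measurement noise and modular measurements, which is exactly where this project would extend the method.

```python
import numpy as np

def mra_reconstruct(R):
    """Recover local (direct) connection coefficients from the global response
    matrix R, whose column j holds the steady-state responses of all nodes to
    a perturbation targeting node j."""
    Rinv = np.linalg.inv(R)
    # normalise each row of R^-1 so that every diagonal entry becomes -1
    return -Rinv / np.diag(Rinv)[:, None]

# made-up two-node network: each node activates the other
r_true = np.array([[-1.0, 0.5],
                   [ 0.8, -1.0]])
R = -np.linalg.inv(r_true)        # the global responses such a network produces
r_est = mra_reconstruct(R)        # recovers r_true from the 'measurements' alone
```

In the noise-free case the reconstruction is exact; the statistical extensions proposed in the Work Plan are needed precisely because experimental response matrices are noisy and incomplete.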
Work Plan: The proposed project will expand MRA to incorporate statistical methods and advanced simulations of dynamically changing network topologies. We will develop novel methods and algorithms for reconstruction of network topologies from systematic perturbation data, such as drugs and growth factors, obtained by experimental measurements. We will develop novel statistical techniques to augment the dynamic reverse engineering methods based on MRA. We aim to develop a scalable approach in which an increase in network complexity does not result in a combinatorial increase in the number of perturbation experiments and/or computations. These powerful techniques will be tested in silico, and also applied to data obtained in vivo. In silico testing will include the development of chemical-kinetics models of cellular networks and the use of computer-generated responses to perturbations that correspond to realistic experimental protocols. Experimental collaborators at Systems Biology Ireland will validate the proposed reverse engineering techniques. Thus, the successful candidate will closely collaborate with experimentalists.
Dynamic Control Networks in Cancer (Principal Supervisors: Boris Kholodenko and Walter Kolch, Systems Biology Ireland)
Context and Objective: Growth factors, such as EGF (Epidermal Growth Factor), which bind receptors at the cell surface, can reprogramme the behaviour of cells by changing the expression of genes in the nucleus. This process is altered in many diseases, including cancer. Which genes are activated depends critically on the duration (transient versus sustained) and other dynamic features (such as bistable, oscillatory, or excitable behaviour) of the signalling networks that connect growth factors to changes in gene expression. However, how dynamic control mechanisms, such as feedback and feedforward loops, specify responses to growth factors and cause expression of specific genes is still unknown. The aim of this project is to elucidate how dynamic network control structures regulate growth factor induced gene expression.
Work Plan: We will approach these challenges in two ways. First, we will use engineering and mathematical approaches as well as deterministic and stochastic simulations in order to explore how multiple signalling and gene expression feedback loops with different properties and operating on different timescales shape the input-output characteristics of growth factor networks. Second, using a systems biology approach, consisting of iterative cycles of experimental data collection and model building, we will develop a mechanistic computational model that will simulate how perturbations to feedforward and feedback loops will change the dynamics of signal duration and gene expression. Using the models and a variety of mathematical techniques, the system's dynamics and responses to perturbations will be analyzed to pinpoint the fragile nodes most amenable to therapeutic interference to treat cancer. The project will combine cutting-edge simulation methods with wet-lab data. Thus, the successful candidate will be embedded in an interdisciplinary team of modellers and biologists at Systems Biology Ireland.
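The transient-versus-sustained distinction at the heart of this project can be reproduced with a deliberately minimal toy model (all parameters illustrative): a node x driven by a stimulus and inhibited by a slow negative-feedback species y shows a strong early peak that adapts back down, whereas without the feedback the response is sustained.

```python
import numpy as np

def simulate(feedback_on, S=1.0, dt=0.01, T=400.0):
    """Euler integration of a toy two-variable motif: node x is driven by
    stimulus S and inhibited by a slow negative-feedback species y.
    All parameter values are illustrative only."""
    x, y = 0.0, 0.0
    xs = np.empty(int(T/dt))
    for i in range(xs.size):
        dx = S/(1.0 + 10.0*y) - x            # activation with feedback inhibition
        dy = 0.02*(x - y) if feedback_on else 0.0   # slow feedback species
        x += dt*dx
        y += dt*dy
        xs[i] = x
    return xs
```

With the feedback on, x peaks near 0.8 and then adapts to about 0.27; with it off, x settles at the sustained level 1. Exactly this kind of qualitative difference in signal duration determines which genes are switched on.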
Virtual Organs: Simulating Health and Disease (Principal Supervisors: Boris Kholodenko and Walter Kolch, Systems Biology Ireland)
Context and Objective: Modern medicine and biomedical research is based on a detailed understanding of the development, function, maintenance and disease of our organs. While we have accumulated a vast amount of knowledge, we are still unable to model and simulate how organs become diseased and predict how they would respond to potential therapies. This project aims to simulate the physiological regeneration of organs and the aberrations that occur during cancerous growth, with a view to better understanding the rules determining health and disease. As an example we will use colon cancer, a prevalent cancer which ranks second in cancer mortality in Europe. The colon epithelium is completely renewed every 2-3 weeks. In the normal colon this renewal starts from stem cells which reside at the bottom of the colonic crypts. These stem cells divide and the daughter cells migrate up the crypt and the villus (a small tissue protrusion into the lumen of the colon) while they differentiate into mature colonic epithelial cells. When they have reached the tip of the villus they die by programmed cell death and are sequestered into the colonic lumen. Colon cancer arises when this process is disturbed through genetic mutations, which are often caused by mutagenic substances originating from foodstuffs. It requires 5-8 mutations for a colon cancer to arise. It is a stepwise development that starts from benign overgrowth (polyps) and many years later may end in aggressive cancers that spread beyond the organ boundaries. The aim of this project is to model a colonic crypt and its dynamics of dividing, migrating and dying cells, and to simulate the disease processes that occur during cancer development.
Work Plan: We will generate a dynamic model of the colonic crypt with its different cell types, considering their interactions, their possible fates and their anatomic arrangement. Once we have modelled the healthy crypt we will consider the effects of genetic mutations as they occur in colon cancer. To build these models we will use agent-based modelling, rule-based computational approaches and stochastic approaches. As many of these mutations affect growth factor signalling pathways, we will also consider them as inputs into simulations of cellular (mis)behaviour. As an outcome we envision a model that can simulate the effects of different mutations on the growth dynamics of the colonic crypt and the progression from healthy tissue to cancer.
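The balance at stake can be shown with a mean-field caricature of the agent-based model described above (all rates hypothetical): renewal from the stem cell is balanced by shedding at the villus tip, and a mutation that lets differentiated cells divide tips the balance into runaway growth.

```python
def simulate_crypt(steps, div_prob=0.0, shed_prob=0.05):
    """Mean-field caricature of crypt renewal: each step the stem cell adds one
    differentiated cell, each existing cell divides with probability div_prob
    (zero in healthy tissue) and is shed from the villus tip with probability
    shed_prob. Returns the expected cell-count history."""
    n, history = 0.0, []
    for _ in range(steps):
        n = n + 1.0 + n*div_prob - n*shed_prob
        history.append(n)
    return history
```

In the healthy case the population settles at 1/shed_prob = 20 cells; once the mutant division rate exceeds the shedding rate, growth is exponential. The full agent-based model adds what this caricature omits: spatial arrangement, migration, cell fates and cell-cell interactions.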
Designing Virtual Cancer Patients (Principal Supervisors: Boris Kholodenko and Walter Kolch, Systems Biology Ireland)
Context and Objective: Clinical trials are an essential part of developing new drugs. They are hugely expensive and may also involve risks for the participants. These factors place a high priority on the optimisation of the design of clinical trials, so that the necessary information can be gathered with minimal risk and the smallest possible number of patients. The aim of this project is to use simulation to design virtual patients for virtual clinical trials that can be conducted at a small fraction of the cost and help to optimise the design of real trials. As an example we will use colon cancer, which is a common cancer in industrialized countries and ranks second in cancer mortality in Europe. We have a wealth of existing data and knowledge on the patho-physiology of colon cancer development at different scales, including the mutation spectrum in patients, gene expression data from patients, biochemical data from cell lines, the effects of mutations in mouse models and in vitro culture, and the effects of drugs in previous clinical trials. Here we aim to integrate these data to design virtual colon cancer patients by computational simulation.
Work Plan: We will use models of the different scales and link them, so that we can rationally predict the influence of one scale on the other, e.g. the influence of a diet, a mutation or a measured gene expression pattern on the disease progression in a patient. Particular emphasis will be put on (i) understanding how environmental factors affect the initiation and progression of colon cancer; and (ii) which therapies will be most efficient against colon cancer in a given patient.
Rational biosensor design based on molecular dynamics simulations (Principal Supervisors: Walter Kolch and Marc Birtwhistle, Systems Biology Ireland and David Coker, Physics)
Context and Objective: Over the past 10 years, the number of genetically encoded, Förster resonance energy transfer (FRET)-based sensors for monitoring various biochemical activities in live cells and real time has skyrocketed. Most of these probes are unimolecular and have a general structure where a sensing unit, which is conformationally responsive to a biochemical activity of interest, is sandwiched between "blue-shifted" and "red-shifted" fluorescent proteins (FPs) capable of FRET. Thus, changes in biochemical activities of interest change the distance between the FPs, leading to detectable changes in FRET. These FRET probes open up completely new avenues for systems biology and mathematical modeling as they allow quantitative measurements of signaling events in single live cells, with spatial resolution, and with high temporal frequency. Although there are known crystal structures for most of the utilized FPs and sensing unit pieces, it is currently unknown, from a protein structure perspective, how to best design these probes to improve their sensitivity.
Work Plan: This project will primarily focus on performing molecular dynamics (MD) simulations of these probes such that we can better understand how to improve probe sensitivity by design of (i) the sensing units themselves and (ii) the linker sequences between the sensing units and the FPs. Standard MD calculations may not be able to effectively sample all the relevant conformations of these large molecular systems so enhanced sampling techniques, multiscale simulation methods, and free energy calculation methods will need to be developed and implemented to reliably study these large complex systems. At later stages, the project will focus on constructing new probes based on the results from the molecular dynamics simulations, and then testing them in live-cell microscopy experiments.
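A cartoon of the link between linker dynamics and FRET readout (not an MD simulation, and with arbitrary units and made-up parameters): overdamped Langevin dynamics of the end-to-end distance r in a harmonic well, with the standard efficiency relation E = 1/(1 + (r/R0)^6), shows how linker stiffness controls the spread of the FRET signal and hence probe sensitivity.

```python
import numpy as np

def fret_stats(k, r0=5.0, R0=5.0, n_steps=20000, dt=1e-3, seed=0):
    """Overdamped Langevin dynamics (kT = 1, friction = 1, arbitrary units) of
    a linker end-to-end distance r in a harmonic well of stiffness k, recording
    the FRET efficiency E = 1/(1 + (r/R0)^6) along the trajectory."""
    rng = np.random.default_rng(seed)
    r, es = r0, np.empty(n_steps)
    noise = np.sqrt(2.0*dt)
    for i in range(n_steps):
        r += -k*(r - r0)*dt + noise*rng.standard_normal()
        r = abs(r)                        # a distance cannot go negative
        es[i] = 1.0/(1.0 + (r/R0)**6)
    return float(es.mean()), float(es.std())
```

A stiff linker pins E near its midpoint value with small fluctuations; a floppy one produces a much broader E distribution. The real design question, which requires full MD with enhanced sampling, is how sequence changes in the sensing unit and linkers reshape this conformational ensemble.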
The impact of geophysical variability on the evolution and stability of ecosystem diversity: application to oceanic plankton (Principal Supervisor: Zoltan Neufeld, Mathematical Sciences)
Context and Objective: A striking feature of many natural ecosystems is the high diversity of species. Understanding the key mechanisms controlling biodiversity through speciation and extinctions, and the conditions for stable coexistence of many interacting and competing species, are fundamental unsolved problems in ecology. These interactions and evolutionary processes take place in a heterogeneous and variable environment shaped by geophysical processes and climatic changes. We believe that this so-far neglected factor plays an important role in determining biodiversity. Understanding the consequences of natural variability can also provide insights for the management of ecosystems influenced by anthropogenic perturbations and climate change.
Work Plan: The project will use mathematical and computational modeling approaches to investigate the role of geophysical variability and transport processes in ecosystem diversity and stability. The project is most suitable for a student with a strong background in applied mathematics and an interest in computational modeling applied to ecology, evolution and geophysics. The project will also involve collaborations with specialists in oceanography and climate dynamics.
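A minimal sketch of the central idea, with illustrative parameters: two Lotka-Volterra competitors sharing one niche. In a constant environment the fitter species excludes the other, but when the environment periodically switches which species it favours, both persist.

```python
import math

def compete(fluctuating, T=200.0, dt=0.01):
    """Two Lotka-Volterra competitors sharing one niche; the environment either
    constantly favours species x or alternates its favour every 5 time units.
    All parameter values are illustrative only."""
    x, y = 0.5, 0.5
    for i in range(int(T/dt)):
        t = i*dt
        favour_x = (not fluctuating) or math.sin(2*math.pi*t/10.0) >= 0
        rx, ry = (1.2, 0.8) if favour_x else (0.8, 1.2)
        x += dt * x * (rx - x - y)    # shared carrying capacity: full competition
        y += dt * y * (ry - x - y)
        t += dt
    return x, y
```

Here ln(x/y) drifts at exactly rx - ry, so the symmetric alternation cancels over each period and neither species can be driven out, whereas the constant environment drives y to extinction exponentially. Oceanic plankton experience exactly this kind of geophysical variability, on many interacting scales.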
Design principles of genetic regulation of metabolic networks (Principal Supervisor: Zoltan Neufeld, Mathematical Sciences)
Context and Objective: Biochemical reactions between different types of molecules (proteins, metabolites, genes etc.) form a complex network of interactions composed of metabolic and gene regulatory pathways. Although the reaction dynamics of metabolic networks can be described by a system of kinetic equations, this is of limited use since the parameters and the functional forms of reaction fluxes, and parts of the regulatory interactions, are largely unknown. New approaches have been proposed that, instead of analysing the full reaction dynamics, focus on the constraints arising from the structure of the stoichiometric matrix that describes the network of interactions between the components. Taking into account these constraints, flux balance analysis (FBA) can determine the flux distribution corresponding to an optimal state of the system (e.g. corresponding to maximal growth rate). Such approaches have been validated experimentally and can also be used to predict effects of genetic knockouts.
 Price et al. Genome-scale models of microbial cells: evaluating the consequences of constraints. Nat Rev Microbiol (2004) vol. 2 pp. 886
 Covert et al. Integrating high-throughput and computational data elucidates bacterial networks. Nature (2004) vol. 429 pp. 92-96
 Ibarra et al. Escherichia coli K-12 undergoes adaptive evolution to achieve in silico predicted optimal growth. Nature (2002) vol. 420 (6912) pp. 186-189
 Prill et al. Dynamic Properties of Network Motifs Contribute to Biological Network Organization. Plos Biol (2005) vol. 3 (11) pp. e343
 Min Lee et al. Dynamic Analysis of Integrated Signaling, Metabolic, and Regulatory Networks. PLoS Comput Biol (2008) vol. 4 (5) pp. e1000086
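The FBA computation described above is a linear program: maximise an objective flux subject to the steady-state constraint S v = 0 and flux bounds. The sketch below uses a made-up four-reaction toy network, not a real organism.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network (illustrative only): uptake v1 imports metabolite A (capacity 10);
# two pathways convert A to B with different yields (v2: A -> B, v3: A -> 2B);
# v4 drains B as "biomass".
S = np.array([[1, -1, -1,  0],    # steady-state balance for A
              [0,  1,  2, -1]])   # steady-state balance for B
bounds = [(0, 10), (0, None), (0, None), (0, None)]

# FBA: maximise biomass flux v4 subject to S v = 0 and the flux bounds
res = linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
v1, v2, v3, v4 = res.x
```

The optimum routes all uptake through the high-yield pathway (v3 = 10, v2 = 0, biomass 20). The open question this project addresses is how genetic regulation dynamically steers a real network into such optimal states, something the static linear program cannot tell us.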
Work Plan: The PhD project will aim to understand the mechanisms of dynamic regulatory processes through which optimal metabolic states are reached and maintained in biological systems. First, synthetic metabolic network models with full reaction dynamics will be used to design "genetic" regulatory schemes that, by adjusting reaction fluxes, can drive the system to the optimal metabolic state for the given network. Understanding the characteristic features of the regulatory interactions in the synthetic metabolic networks will then be used to formulate hypotheses that can be tested on real biological systems (e.g. using data available on metabolic and genetic interactions in microbial systems). At a later stage the work will be extended to study adaptation in response to changes or fluctuations in the external environment (e.g. due to variable sources of nutrients) by designing signaling interactions that can optimise the system's behavior under such conditions. Finally, robustness and evolvability of the regulation with respect to genetic mutations will also be considered.
The project will involve mathematical and computational modeling of biochemical reaction networks and analysis of biological data, and is most suitable for a student with a strong quantitative background (e.g. physics or applied mathematics) and some knowledge of and/or interest in biology.
Collective motility, chemical communication and navigation of cells (Principal Supervisor: Zoltan Neufeld, Mathematical Sciences)
Context and Objective: Collective behavior has long been observed in biological systems, such as insect colonies, bird flocks, and schools of fish. Understanding how such emergent behavior arises in large systems of interacting individuals is a fundamental problem in the biological sciences. The aim of the project is to study collective behavior arising at the level of cell populations, where the interactions are primarily mediated through diffusing biochemical signals. The project will investigate mathematical and computational models of the chemotactic search behavior of cell populations, which is particularly relevant for bacterial biofilms and for the invasion of cancer cells through collective cell migration.
 T.S. Deisboeck and I.D. Couzin, Collective behavior in cancer cell populations, BioEssays 31:190-197 (2009).
 C. Torney, Z. Neufeld, I.D. Couzin, Context-dependent interaction leads to emergent search behavior in social aggregates, Proc. Natl. Acad. Sci. USA 106:22055-22060 (2009).
 R. Mayor and C. Carmona-Fontaine, Trends in Cell Biology, 20 319-328 (2010)
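A minimal sketch of emergent collective search (parameters illustrative, in the spirit of the context-dependent interaction idea of Torney et al.): each agent senses only the local scalar concentration, not its gradient, yet the group locates the source by moving towards whichever member currently reports the highest value.

```python
import numpy as np

def swarm_search(n=30, steps=2000, seed=2):
    """Agents sense only the local scalar concentration (no gradients) and step
    towards the agent currently reporting the highest value, plus random
    exploration; the source sits at the origin. Parameters are illustrative."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(5.0, 10.0, size=(n, 2))    # start far from the source
    start = float(np.mean(np.linalg.norm(pos, axis=1)))
    for _ in range(steps):
        c = np.exp(-np.sum(pos**2, axis=1)/50.0)      # local signal strength
        drift = pos[np.argmax(c)] - pos               # head towards the 'leader'
        norm = np.linalg.norm(drift, axis=1, keepdims=True) + 1e-9
        pos += 0.1*drift/norm + 0.1*rng.standard_normal((n, 2))
    return start, float(np.mean(np.linalg.norm(pos, axis=1)))
```

No individual could climb the gradient alone; the search ability emerges from comparison across the group, which is the sense in which the aggregate computes something its members cannot.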
Work Plan: The project will also study evolutionary aspects of collective behavior and the processes leading to specialization in heterogeneous cell populations. This project is suitable for a student with strong background in applied mathematics and/or physics and interest in biology and computational modeling.
Exploring model uncertainty in doubly intractable distributions (Principal Supervisor: Nial Friel, Mathematical Sciences)
Context and Objective: There are many models in statistics for which the likelihood cannot be evaluated exactly. For example, social network models; spatial/image models; phylogenetic trees. In a Bayesian context, where the posterior distribution combines the likelihood model with a prior distribution, this leads to what is called a doubly intractable distribution. This is so-called because not only is the likelihood intractable, but the normalising constant of the posterior distribution is also intractable. Overcoming this intractability is of vital importance, since it turns out that if one could do so, this would allow probability statements to be made about the uncertainty of the model itself, in situations where a collection of competing models are available, each of which could plausibly describe the data.
Work Plan: In many of the situations described above, where it is not possible to evaluate the likelihood directly, it is often possible to simulate from the likelihood model. Simulating from the model can provide useful information about the model – this will be the starting point for this project. This project will examine challenging models in social network analysis and in image analysis and will suit someone with interests at the interface of statistics/probability/computer science.
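The key fact exploited by simulation-based methods for doubly intractable problems can be demonstrated on a model small enough to enumerate: the ratio of two intractable normalising constants is an expectation under one of the models, so draws from the model estimate a quantity we cannot compute directly. The 3x3 Ising example below is illustrative only (real applications involve grids far too large to enumerate).

```python
import itertools
import numpy as np

# A 3x3 Ising model is small enough (2^9 states) to enumerate, so the
# 'intractable' normalising constant Z(theta) can be checked exactly.
def suff_stat(x):
    g = np.array(x).reshape(3, 3)
    return int((g[:, :-1]*g[:, 1:]).sum() + (g[:-1, :]*g[1:, :]).sum())

states = list(itertools.product([-1, 1], repeat=9))
s = np.array([suff_stat(x) for x in states])
Z = lambda theta: np.exp(theta*s).sum()

# Simulating from the model at t0 estimates the ratio Z(t1)/Z(t0),
# the quantity that makes the posterior doubly intractable.
rng = np.random.default_rng(0)
t0, t1 = 0.3, 0.4
draws = rng.choice(s.size, size=20000, p=np.exp(t0*s)/Z(t0))
ratio_est = np.exp((t1 - t0)*s[draws]).mean()
ratio_true = Z(t1)/Z(t0)
```

Auxiliary-variable MCMC schemes build exactly this kind of model simulation into each posterior update so that the unknown constants cancel; extending such ideas to model choice for social network and image models is the subject of the project.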
Classification Problems from Food Authenticity Studies and Cancer Diagnosis (Principal Supervisor: Brendan Murphy, Mathematical Sciences)
Context and Objective: Classification involves assigning observations of unknown origin into known groups. For example, a classification method could be used to determine if a food sample has been altered or not by a rogue producer.
Classification and clustering methods will be developed for data that is collected sequentially over time rather than all at once. Methods based on statistical models will be used so that complex classification tasks can be accommodated with the methodology.
Work Plan: Sequential Monte Carlo and variational Bayes methods for sequential analysis of mixture models and classification will be developed. The methods will be compared and contrasted with existing sequential approaches. Applications to classification problems from food authenticity studies and cancer diagnosis will be pursued.
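A stripped-down illustration of sequential, model-based classification (a toy stand-in for the sequential Monte Carlo and variational methods named above, with made-up class labels): each labelled observation updates a conjugate Normal posterior for its class mean as it arrives, and new samples are assigned by predictive density.

```python
import math

def sequential_classifier(stream, var=1.0, prior_var=100.0):
    """Conjugate Normal-Normal updating of each class mean, processing one
    labelled observation at a time."""
    post = {}   # class label -> (posterior mean, posterior variance)
    for x, label in stream:
        m, v = post.get(label, (0.0, prior_var))
        v_new = 1.0 / (1.0/v + 1.0/var)
        m_new = v_new * (m/v + x/var)
        post[label] = (m_new, v_new)
    return post

def classify(x, post, var=1.0):
    """Assign x to the class with the highest posterior predictive density."""
    def log_pred(stats):
        m, v = stats
        pv = v + var                       # predictive variance
        return -0.5*math.log(2*math.pi*pv) - 0.5*(x - m)**2/pv
    return max(post, key=lambda label: log_pred(post[label]))
```

The methods developed in the project would replace this one-parameter-per-class toy with full mixture models whose posteriors are tracked sequentially, which is where SMC and variational approximations become necessary.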
Network Models with Endogenous Link Formation (Principal Supervisor: Jim Bergin, Economics)
Context and Objective: In recent years, there has been substantial interest in the use of network models for the study of behaviour in social and economic contexts. Some of the earliest work concerned a classic strategy problem, the prisoner's dilemma, where there is conflict between pursuit of short term selfish interests and achievement of long term benefits that arise from cooperative behaviour. Network models (for example, Nowak and May (1992) and Lloyd (1995)) show how cooperative behaviour can survive when interaction is local. In that work, the network structure (nodes and vertices) is taken as exogenous, so that each agent interacts with a given group; overlap occurs across groups. This model has proved useful, for example, in the study of cartel behaviour — the evolution of collusive behaviour in industries. A second category of network model takes the nodes as given, but allows for linkage formation and breakage. This category of model leads to the study of random graphs — a field of study initiated by Erdős and Rényi (1959) which has proved useful for the study, for example, of disease propagation in a network. New developments in techniques in random graph theory (Newman, Strogatz and Watts (2001)) permit analysis of the associated complex network dynamics. Recent work (Gai and Kapadia (2010)) uses a random network model to study contagion of economic shocks in financial networks. A random shock may lower the wealth of some individual or agency, forcing a sale of assets and the depression of associated prices. This creates the possibility of knock-on effects as the wealth of others is affected, and may propagate catastrophically through the system, leading to further disposals of assets. How exactly does this occur, and what network structures are most robust against this contagion of bankruptcy through the system?
 P. Erdős and A. Rényi (1959), “On Random Graphs”, Publ. Math. Debrecen 6: 290-297.
 P. Gai and S. Kapadia (2010), Contagion in Financial Networks, Bank of England, wp383.
 M. O. Jackson. Social and Economic Networks, Princeton University Press, 2008.
 A. Lloyd (1995), “Computing Bouts of the Prisoner’s Dilemma”, Scientific American, June 1995, Vol. 272, No. 6: 110-115.
 M. E. J. Newman, S. H. Strogatz, and D. J. Watts. (2001), “Random Graphs with Arbitrary Degree Distributions and Their Applications” Physical Review E Vol 64, 026118.
 M.A. Nowak and R. M. May (1992). “Evolutionary games and spatial chaos”, Nature 359: 826-829.
 M. Staudigl (2010), “Efficiency in Coordination Games in a Volatile Environment”, mimeo, University of Vienna.
Work Plan: The approaches described in the previous paragraphs are two extremes. In the first, the network structure is fixed, but the behaviour complex; in the second, the network structure is random, but the behaviour simple or even exogenous. A richer model would address the fact that, in general, both choices and linkages are endogenous: people choose their friends and how they interact with those friends; traders select the assets they hold and the investment strategy used. This raises the prospect of amplifying network effects. Current work by Staudigl (2010) shows how such a complex network model can be investigated analytically under specific assumptions on behaviour and linkage dynamics. That work can and should be extended analytically. Furthermore, models of this sort can be enriched further and numerically simulated. It is a natural “next step” in the study of network models and an excellent dissertation topic.
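The random-network contagion mechanism discussed above can be sketched with a Watts-style threshold caricature (not the Gai-Kapadia model itself, and with entirely illustrative parameters): banks sit on a ring with additional random interbank links, and a bank fails once the failed fraction of its counterparties exceeds a resilience threshold.

```python
import random

def cascade_size(threshold, n=200, p=0.02, seed=4):
    """Threshold contagion on a ring of n banks plus random interbank links
    (all parameters illustrative): a bank fails once the failed fraction of
    its counterparties exceeds `threshold`. Bank 0 fails first."""
    rng = random.Random(seed)
    adj = [{(i-1) % n, (i+1) % n} for i in range(n)]   # ring keeps the graph connected
    for i in range(n):
        for j in range(i+2, n):
            if rng.random() < p:
                adj[i].add(j); adj[j].add(i)
    failed, changed = {0}, True
    while changed:                                     # iterate to the fixed point
        changed = False
        for i in range(n):
            if i not in failed:
                frac = sum(nb in failed for nb in adj[i]) / len(adj[i])
                if frac > threshold:
                    failed.add(i); changed = True
    return len(failed)
```

With a high threshold the single initial failure stays contained; with a low threshold it triggers a near-global cascade. Endogenising the links themselves — letting banks rewire in response to stress — is precisely the extension the project proposes.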
The Structured PhD Programme in Simulation Science and associated SimSci Fellowships in Simulation Science are funded under the Programme for Research in Third Level Institutions (PRTLI) Cycle 5, which is co-funded by the European Regional Development Fund (ERDF).