Master of Computational Science Projects

Cellular simulation of blood flow

Blood is a complex suspension of various components in plasma, and many of its intriguing properties originate from this cellular nature. Red blood cells are the major component: they transport oxygen and determine the bulk behaviour of blood. Platelets, the second most numerous cells, form the link between transport dynamics and several vital biochemical processes such as clot formation. With the recent advancement of micro-medical devices, modelling the small-scale behaviour of blood is gaining importance. Accurate modelling of blood-flow-related phenomena on this scale requires a description of the dynamics at the level of individual cells. This, however, presents several computational challenges that can only be addressed by high-performance computing. We developed HemoCell (www.hemocell.eu), a parallel computing framework which implements validated mechanical models for red blood cells and is capable of reproducing the emergent transport characteristics of such a complex cellular system. It can handle large domain sizes and is therefore able to bridge the cell-based micro-scale and macroscopic domains. Using HemoCell as a tool, we pursue answers to numerous open questions around the human circulatory system.

Keywords : Computational Biomedicine,

Track : Computational Bio-Medicine

Contact 1 : Alfons Hoekstra A.G.Hoekstra@uva.nl

Valid Date : Jan. 25, 2019

Computational Origins Simulator

As part of a project for the Dutch Origins Center, and in parallel with the development of an experimental Origins Simulator (http://www.origins-center.nl/the-origins-simulator-a-preliminary-description), we will take initial steps to build a computational Origins Simulator. This will be partly based on previous work on autocatalytic sets (https://evolution-institute.org/article/the-origin-of-life-a-selfish-act-or-a-cooperative-effort), and on ongoing work on modeling biomineralization (https://staff.fnwi.uva.nl/j.a.kaandorp/research.html). Within this project, we will develop and run new simulation models to extend the existing research, in the context of origin of life studies.
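To make the notion of an autocatalytic set concrete, the sketch below implements a simplified version of the RAF (reflexively autocatalytic and food-generated) detection idea from the earlier work on autocatalytic sets linked above. The reaction network, food set, and catalysis assignments are hypothetical toy data; this is an illustration of the concept only, not the project's actual code.

```python
# Toy sketch: find a RAF-like subset of a reaction network (hypothetical data).
# A reaction survives only if (a) all its reactants are producible from the food
# set by the surviving reactions and (b) it is catalysed by a producible molecule.

def closure(food, reactions):
    """Molecules reachable from the food set using the given reactions."""
    produced = set(food)
    changed = True
    while changed:
        changed = False
        for reactants, products, _ in reactions:
            if set(reactants) <= produced and not set(products) <= produced:
                produced |= set(products)
                changed = True
    return produced

def raf_subset(food, reactions):
    """Iteratively remove unsupported reactions until a fixed point is reached."""
    current = list(reactions)
    while True:
        producible = closure(food, current)
        kept = [r for r in current
                if set(r[0]) <= producible and r[2] in producible]
        if len(kept) == len(current):
            return kept
        current = kept

# Hypothetical example: (reactants, products, catalyst)
reactions = [
    (("a", "b"), ("ab",), "abc"),   # catalysed by a molecule it helps create
    (("ab", "c"), ("abc",), "ab"),
    (("x", "y"), ("xy",), "zz"),    # catalyst never producible -> removed
]
food = {"a", "b", "c", "x", "y"}
print(raf_subset(food, reactions))
```

The first two reactions collectively sustain their own catalysts from the food set and are kept; the third depends on a catalyst that can never be produced and is pruned away.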

Keywords : Agent Based Model, Network-based modelling, Chemical reaction networks,

Track : Complex Systems Theory

Contact 1 : Peter Sloot p.m.a.sloot@uva.nl
Contact 2 : Wim Hordijk wim@WorldWideWanderings.net

Valid Date : June 29, 2018

What is the role of vasoconstriction in initial platelet aggregation?

Our blood vessels contain smooth muscle cells (oriented circumferentially) to regulate the diameter of the blood vessel. In this way the vessels help regulate the blood pressure in our body. Regulation of the vessel diameter is also important when a vessel is damaged: to reduce blood loss, the smooth muscle cells contract, a process called vasoconstriction. This contraction of the vessel is part of hemostasis, the process of blood clot formation. Vasoconstriction also causes an increase in shear rate that could be of value during initial platelet aggregation. Under high shear rates a protein called von Willebrand factor is important in the binding of platelets to the vessel wall and to each other. In our current study we found a cell-free layer (CFL) at the site where platelet aggregation is initiated, and we think this CFL is needed for the von Willebrand factor to uncoil and bind platelets. In this study we are therefore interested in the relationship between the degree of vessel constriction and the thickness of the CFL. In addition, we want to know whether vasoconstriction is important only in hemostasis or also in thrombus formation. During this project you will work on this topic with the cell-based model HemoCell. If you are interested, please contact Alfons Hoekstra or Britt van Rooij.
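As a rough illustration of why constriction raises the shear rate, the sketch below uses the Poiseuille-flow estimate of the wall shear rate, γ̇_w = 4Q/(πR³), to show how the wall shear rate grows as the vessel radius decreases at fixed flow rate. The numbers are purely illustrative, and the formula ignores the cellular nature of blood that HemoCell resolves.

```python
# Illustrative only: wall shear rate of Poiseuille flow, gamma_w = 4*Q/(pi*R^3),
# evaluated for a vessel that constricts to a fraction of its resting radius.
import math

Q = 1.0e-10          # volumetric flow rate [m^3/s], arbitrary illustrative value
R0 = 50e-6           # resting vessel radius [m] (50 micrometres, illustrative)

def wall_shear_rate(Q, R):
    """Wall shear rate of a Newtonian Poiseuille flow in a tube of radius R."""
    return 4.0 * Q / (math.pi * R**3)

for constriction in (0.0, 0.2, 0.4, 0.6):
    R = R0 * (1.0 - constriction)
    print(f"{constriction:.0%} constriction: "
          f"wall shear rate ~ {wall_shear_rate(Q, R):.0f} 1/s")
```

Because the shear rate scales as 1/R³ at constant flow rate, even a moderate constriction increases the wall shear rate substantially, which is why constriction may matter for shear-dependent von Willebrand factor behaviour.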

Keywords : Computational Biomedicine,

Track : Computational Bio-Medicine

Contact 1 : Alfons Hoekstra A.G.Hoekstra@uva.nl
Contact 2 : Britt van Rooij b.j.m.vanrooij@uva.nl

Valid Date : Dec. 31, 2019

Self-organized criticality in an interest-rate swap market model

The interest-rate swap market (and similar hedging activity) is a large financial derivatives market. Agents (banks, funds, others) build up risk internally, which they 'swap' with other agents so that the total risk nets to zero; the result is an agent-based, networked model. The problem is that if one agent cannot meet its obligations and defaults, the swaps it holds with other agents reverse, potentially leading to a cascade or avalanche. Initial simulation results from a previous Master thesis project reveal a self-organized critical regime in the parameter space, and even a 'super' self-organized critical regime in which system-wide crashes are inevitable. These results are documented in a thesis report, and an optimized software package is available. This project will build on that work towards a scientific publication. Remaining open questions include a mathematical mean-field approach to better understand the process and its critical regimes, as well as a comparison of the theoretical model parameters with real data or expert knowledge; additional simulations are also likely needed. The end goal is to bring the important message into the world that these types of markets are potentially very dangerous to the systemic stability of our economy. We are looking for a mathematically or physics-oriented student.
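The sketch below gives a minimal flavour of the kind of default cascade such a model studies, on a hypothetical random network of bilateral exposures. The network size, exposures, capital buffers, and loss rule are invented for illustration and do not reproduce the thesis model.

```python
# Toy default-cascade sketch on a random network of bilateral exposures.
# All parameters (network size, exposures, capital buffers) are illustrative.
import random

random.seed(1)
N = 200                                   # number of agents
neighbours = {i: random.sample([j for j in range(N) if j != i], 4)
              for i in range(N)}          # 4 random counterparties each
exposure = 1.0                            # loss passed to each counterparty on default
capital = [random.uniform(1.0, 5.0) for _ in range(N)]   # loss-absorbing buffer

def avalanche(seed_agent):
    """Size of the default cascade triggered by one initial default."""
    defaulted = {seed_agent}
    frontier = [seed_agent]
    losses = [0.0] * N
    while frontier:
        nxt = []
        for d in frontier:
            for c in neighbours[d]:
                if c in defaulted:
                    continue
                losses[c] += exposure      # the reversed swap hits the counterparty
                if losses[c] >= capital[c]:
                    defaulted.add(c)
                    nxt.append(c)
        frontier = nxt
    return len(defaulted)

sizes = [avalanche(i) for i in range(N)]
print("mean avalanche size:", sum(sizes) / N, "max:", max(sizes))
```

In a self-organized critical regime one would expect the distribution of avalanche sizes to approach a power law; characterising that distribution as model parameters vary is the kind of analysis this project would extend.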

Keywords : Agent Based Model, Economics, Risk Management, Network-based modelling, Computational Finance,

Track : Computational Finance, Complex Systems Theory

Contact 1 : Rick Quax r.quax@uva.nl

Valid Date : Jan. 25, 2019

Shear Thickening Simulations Using Model Cornstarch Particles

Shear thickening is the phenomenon in which the viscosity of a suspension increases with the shear rate. Two types of shear thickening, continuous shear thickening (CST) and discontinuous shear thickening (DST), are observed in experiments. During CST the viscosity of the suspension increases continuously with the imposed shear rate, while in DST the viscosity increases by orders of magnitude with a small increment in shear rate. A cornstarch and water suspension, commonly known as oobleck, exhibits both CST and DST: CST at lower cornstarch volume fractions and DST at higher volume fractions (~0.4). The DST volume fraction for cornstarch suspensions is significantly lower than that for spherical-particle suspensions (~0.56), which has been attributed to the non-spherical shapes of cornstarch particles. Our suspension simulation model (SuSi) can handle different particle shapes and simulate large numbers of particles immersed in any fluid medium. With the system currently implemented in SuSi, one can create almost any shape by using overlapping spheres; this technique shall be used to model the shapes of cornstarch particles. In this project, we propose to model the cornstarch particles by observing their shapes in microscopic images. The shear thickening that arises from the cornstarch particle model will then be studied using SuSi.
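The sketch below illustrates the overlapping-spheres idea: a non-spherical particle is approximated as a rigid cluster of spheres whose centres and radii follow an observed outline. The target shape (a prolate ellipsoid), sphere count, and placement rule are invented here for illustration; SuSi's own interface may look quite different.

```python
# Illustrative composite-particle builder: approximate a non-spherical (ellipsoidal)
# outline by overlapping spheres placed along its long axis. Shape parameters are
# hypothetical and unrelated to real cornstarch geometries.
import numpy as np

def ellipsoid_as_spheres(a, b, n_spheres):
    """Cover a prolate ellipsoid (semi-axes a >= b) with n overlapping spheres."""
    centres_x = np.linspace(-(a - b), a - b, n_spheres)   # keep spheres inside the tips
    spheres = []
    for x in centres_x:
        # local half-thickness of the ellipsoid at position x, used as sphere radius
        r = b * np.sqrt(max(0.0, 1.0 - (x / a) ** 2))
        spheres.append((np.array([x, 0.0, 0.0]), r))
    return spheres

for centre, radius in ellipsoid_as_spheres(a=10.0, b=4.0, n_spheres=5):
    print(f"centre = {centre}, radius = {radius:.2f}")
```

For cornstarch the same idea would be applied to outlines extracted from the microscopic images, with sphere positions and radii fitted to each measured particle shape.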

Keywords : shear thickening, microstructure, simulation, modeling,

Track : Others

Contact 1 : Alfons Hoekstra A.G.Hoekstra@uva.nl
Contact 2 : Vishnu Sivadasan v.sivadasan@uva.nl

Valid Date : Dec. 31, 2019

Holographic projector

When mentioning the word ‘hologram’ many people think of ‘Star Wars like’ technology that creates projections floating in free space. This is actually not too far off from what is theoretically possible, although in reality the projector, the projection, and the observer must be in line for viewing. In daily life, holograms are static images that can be found in some museums and exhibitions. By exploiting the wave-like nature of light, by means of interference, it is possible to create intensity distributions in free space. A holographic projector can be made with a ‘screen’ in which each ‘pixel’ emits light with a precisely set intensity and phase. The light from all these pixels results in a three-dimensional intensity distribution that represents the object to be projected.

A difficulty in generating holograms is the required dense pixel pitch. A ‘conventional’ holographic projector would require pixels of less than 200 nanometres. With larger pixels, artefacts occur due to spatial undersampling and the useful projection angle becomes smaller (only a few degrees for state-of-the-art systems). A pixel pitch of 200 nm or less is difficult, if not impossible, to achieve, especially for larger areas. Another challenge is the massive amount of computing power and data handling that would be required to control such a dense pixel matrix. A new holographic projection method has been developed that reduces ‘conventional’ undersampling artefacts regardless of pixel density. The trick is to create 'pixels' at random but known positions, resulting in a ‘pixel matrix’ that lacks (or has strongly suppressed) spatial periodicity. As a result, a holographic emitter can be built with a significantly lower sample density and less required computing power. This could bring holography within reach for many applications such as display, lithography, 3D printing, and metrology. A short description (in Dutch) can be found through the following link: https://www.nikhef.nl/nieuws/prototype-holografische-videoprojector-in-de-maak/

Our goal is to quantify the relation between hologram ‘quality’ and the density and positioning of the ‘pixels’. The quality of a hologram is defined by factors such as noise, contrast, suppression of undersampling artefacts, and resolution. One of the challenges a student can work on is to build a proper simulation model of this system. The programming language and the modelling method(s) depend, to a large extent, on the preference of the student(s). The purpose of this simulation model is to determine the requirements of a holographic projector for a given set of requirements on the holograms (for example, a contrast ratio better than 1:100). In addition, a proof-of-concept holographic emitter is being built. This set-up will be used to verify simulation results (and, of course, to project some cool holograms). If you want additional information, please contact Martin Fransen: martinfr<at>nikhef.nl
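A natural starting point for such a simulation is to treat each ‘pixel’ as a point emitter and coherently sum the complex field contributions at points in space. The sketch below does this for randomly placed emitters; the wavelength, emitter count, aperture size, and observation points are arbitrary illustrative choices, not the parameters of the Nikhef prototype.

```python
# Minimal scalar-wave sketch: intensity at observation points produced by point
# emitters with chosen amplitudes and phases (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(0)
wavelength = 500e-9                      # 500 nm, visible light
k = 2 * np.pi / wavelength               # wavenumber

# Emitters at random but known positions within a 1 mm x 1 mm aperture (z = 0).
n_emitters = 500
emitters = np.zeros((n_emitters, 3))
emitters[:, :2] = (rng.random((n_emitters, 2)) - 0.5) * 1e-3
phases = rng.uniform(0, 2 * np.pi, n_emitters)   # would be set by the hologram
amplitudes = np.ones(n_emitters)

def intensity(point):
    """Coherent sum of spherical-wave contributions at a single point."""
    r = np.linalg.norm(point - emitters, axis=1)
    field = np.sum(amplitudes / r * np.exp(1j * (k * r + phases)))
    return np.abs(field) ** 2

# Evaluate the intensity along a short line 10 cm in front of the emitter plane.
for x in np.linspace(-1e-3, 1e-3, 5):
    print(f"x = {x*1e3:+.2f} mm -> intensity = {intensity(np.array([x, 0.0, 0.1])):.3e}")
```

To project a chosen pattern one would additionally solve for the emitter phases (for example by back-propagating the desired field to the emitter positions), which is where both the computational cost and the benefit of the random, aperiodic layout come in.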

Keywords : High Performance Computing, Computational geometry, Image Processing, Monte Carlo simulation, modeling,

Track : Others

Contact 1 : Michael Lees M.H.Lees@uva.nl

Valid Date : Sept. 1, 2019

Improving extreme sea-level predictions with machine learning

The performance of current global coastal flood models is limited by, among other things, the coarse spatial and temporal resolution of the climate forcings. As a result, some processes are neither modelled correctly nor accounted for, which produces a strong bias in modelled sea levels. In order to model those processes properly, detailed spatio-temporal data is required. Machine learning algorithms constitute an interesting downscaling alternative and have been successfully applied to diverse flood risk, hydrologic and climate studies. In this thesis, the student will apply machine learning algorithms to predict extreme sea levels and, importantly, compare the results with outputs from physically-based model chains such as Delft3D. Students interested in applying machine learning to other topics are welcome to present their own project idea as well. Various alternatives for this project are also possible, such as attributing weather systems to the observed extreme sea levels, or applying machine learning to extreme water levels in riverine systems. The student should have some experience in coding and a willingness to learn to work with big datasets, machine learning algorithms, and computational models. Interested students should send their CV and motivation to Anaïs Couasnon (anais.couasnon@vu.nl), Dirk Eilander (dirk.eilander@vu.nl) and Sanne Muis (sanne.muis@vu.nl).
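To give a rough sense of the workflow, the sketch below trains a standard regressor on coarse-resolution forcing features (here synthetic placeholders for wind speed, pressure, and tide) to predict water levels, the kind of statistical model the project would compare against a physically-based chain such as Delft3D. The feature names and data are entirely synthetic; a real study would use reanalysis forcings and tide-gauge observations and evaluate specifically on the extremes.

```python
# Placeholder ML-downscaling sketch: predict water levels from coarse forcings.
# All data are synthetic; extreme-value performance is checked separately.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
n = 5000
wind = rng.gamma(2.0, 4.0, n)            # coarse-grid wind speed [m/s]
pressure = 1013 + rng.normal(0, 10, n)   # sea-level pressure [hPa]
tide = 1.5 * np.sin(np.linspace(0, 400 * np.pi, n))   # astronomical tide [m]
# Synthetic "observed" water level: surge grows with wind, drops with pressure.
level = tide + 0.02 * wind**1.5 - 0.01 * (pressure - 1013) + rng.normal(0, 0.05, n)

X = np.column_stack([wind, pressure, tide])
X_train, X_test, y_train, y_test = train_test_split(X, level, random_state=0)

model = GradientBoostingRegressor().fit(X_train, y_train)
pred = model.predict(X_test)
print("MAE on all levels:", mean_absolute_error(y_test, pred))

# For extremes, look at performance above e.g. the 95th percentile of observations.
hi = y_test > np.quantile(y_test, 0.95)
print("MAE on extremes:  ", mean_absolute_error(y_test[hi], pred[hi]))
```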

Keywords : Big Data, Machine Learning, Risk Management, Time-series analysis, modeling, climate change,

Track : Others

Contact 1 : Anais Couasnon anais.couasnon@vu.nl

Valid Date : Dec. 1, 2018

Climate: Persistent summer extremes

Persistent summer extremes are generally associated with increased societal impact. For example, the 2010 Pakistan flood was associated with anomalous precipitation that was stationary in time and space (i.e. persistent), which led to a higher accumulation of water. Persistent heat waves are a bigger threat to crop yield, human health and wildfires than summer extremes that are defined only by exceeding a certain threshold temperature. Climate extremes occur when multiple actors align to promote an anomaly. The mechanisms that drive an extreme event can differ, and here we focus on the distinction between the mechanisms behind persistent and single-event extremes. As we know, the weather behaves chaotically due to non-linear interactions: even if we knew the equations of motion exactly, minor changes in today's initial conditions would lead to a wide spread in weather predictions for the next week. Single-event extremes are often the result of non-linear interactions that keep amplifying an anomaly, which then grows into an unlucky rare weather event. Persistent climate extremes are more likely to be sustained by actors that vary more slowly in time. For example, heat waves in Europe and the eastern U.S. are more likely to occur during certain sea surface temperature patterns that are known to induce 'quasi-stationary' Rossby waves. These Rossby waves interact with the jet stream and can nudge the circulation into an anticyclonic flow, thereby promoting the occurrence of heat waves. Such actors are more persistent because the precursor (in this case the sea surface temperature pattern) varies more slowly in time. When several of these persistent actors align, the probability of persistent summer extreme events increases. Interesting open questions regarding persistent weather include:
• Do positive land-atmosphere feedbacks not only amplify a temperature anomaly, but also prolong it?
• To what extent are climate extremes dominated by low-frequency variability?
• Can we find clustering of extreme events due to low-frequency variability?
• How does heterogeneous warming, such as Arctic amplification, affect the persistence of weather?
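As a small illustration of what separates a persistent extreme from a simple threshold exceedance, the sketch below computes both the number of hot days and the longest consecutive hot spell in a synthetic daily temperature series. The data, the AR(1) memory parameter, and the 90th-percentile threshold are placeholders for real station or reanalysis data.

```python
# Toy persistence diagnostic on a synthetic daily summer temperature series.
import numpy as np

rng = np.random.default_rng(7)
# AR(1) series mimicking day-to-day temperature memory (entirely synthetic).
n_days, rho = 92, 0.7
anom = np.zeros(n_days)
for t in range(1, n_days):
    anom[t] = rho * anom[t - 1] + rng.normal(0, 1.5)
temps = 24.0 + anom                       # daily mean temperature [deg C]

threshold = np.quantile(temps, 0.90)      # "hot day" threshold
hot = temps > threshold

def longest_run(mask):
    """Length of the longest run of consecutive True values."""
    best = run = 0
    for m in mask:
        run = run + 1 if m else 0
        best = max(best, run)
    return best

print(f"hot days: {hot.sum()}, longest hot spell: {longest_run(hot)} days")
```

Two series can exceed the threshold equally often yet differ strongly in how those exceedances cluster in time; diagnostics of this kind are one way to quantify the persistence the project focuses on.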

Keywords : Big Data, Machine Learning, Time-series analysis, modeling, climate change,

Track : Others

Contact 1 : Dim Coumou d.coumou@vu.nl

Valid Date : Jan. 1, 2019

Skew modelling in XVA

Counterparty credit risk is nowadays valued using adjustments such as CVA, FVA, and KVA. These valuation adjustments are based on heavy Monte Carlo simulations of integrated scenario-generation and pricing models. One of the key risk factors that is typically not taken into account is the volatility skew of interest rates and FX. In this project we would like to develop skew models and apply them to realistic portfolios to analyse the impact.
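To fix ideas, the sketch below computes a unilateral CVA for a single FX forward by Monte Carlo: simulate the FX rate, average the positive exposure at each date, and weight it by default probability increments, CVA ≈ (1 − R) Σᵢ EE(tᵢ) ΔPD(tᵢ). The flat-volatility lognormal dynamics used here are exactly the simplification a skew model would replace; all parameters are illustrative.

```python
# Toy unilateral CVA for an FX forward under flat-volatility (no-skew) dynamics.
# CVA ~ (1 - R) * sum_i DF(t_i) * EE(t_i) * dPD(t_i); all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, T = 20_000, 50, 5.0
dt = T / n_steps
times = np.linspace(dt, T, n_steps)

S0, K, sigma, r = 1.10, 1.10, 0.10, 0.01   # spot, strike, flat vol, domestic rate
hazard, recovery = 0.02, 0.40              # counterparty hazard rate and recovery

# Simulate lognormal FX paths (a skew model would change exactly this step).
z = rng.standard_normal((n_paths, n_steps))
logS = np.log(S0) + np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
S = np.exp(logS)

# Exposure of a long FX forward maturing at T (simplified: foreign rate ignored).
value = S - K * np.exp(-r * (T - times))          # per unit notional
EE = np.maximum(value, 0.0).mean(axis=0)          # expected positive exposure

# Default probability increments from a constant hazard rate.
surv = np.exp(-hazard * np.concatenate(([0.0], times)))
dPD = surv[:-1] - surv[1:]

discount = np.exp(-r * times)
cva = (1 - recovery) * np.sum(discount * EE * dPD)
print(f"CVA per unit notional: {cva:.5f}")
```

In the project, the scenario-generation step would instead use skew-consistent interest-rate and FX models, and the impact would be measured as the change in such valuation adjustments on realistic portfolios.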

Keywords : Risk Management, Computational Finance, Monte Carlo simulation,

Track : Computational Finance

Contact 1 : Drona Kandhai b.d.kandhai@uva.nl

Valid Date : Dec. 31, 2019

Network Modelling in Finance

In the past few years, our group has been involved in several theoretical and applied research projects involving network modelling in finance. As a follow-up, we would like to further explore network reconstruction algorithms and early-warning signals.
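One standard flavour of network reconstruction is to estimate a matrix of bilateral exposures when only each institution's total interbank assets and liabilities (the row and column sums) are observed. The sketch below does this with iterative proportional fitting (the RAS algorithm) on hypothetical marginals; the group's actual reconstruction methods may differ (for example maximum-entropy or fitness-based ensemble approaches).

```python
# Reconstruct a bilateral exposure matrix from row/column totals via iterative
# proportional fitting (RAS). Marginals are hypothetical; self-exposures are forbidden.
import numpy as np

assets = np.array([30.0, 20.0, 25.0, 25.0])        # each bank's total interbank assets
liabilities = np.array([25.0, 25.0, 20.0, 30.0])   # each bank's total interbank liabilities
n = len(assets)

X = np.ones((n, n))
np.fill_diagonal(X, 0.0)                 # a bank does not lend to itself

for _ in range(200):
    X *= (assets / X.sum(axis=1))[:, None]        # rescale to match row sums
    X *= (liabilities / X.sum(axis=0))[None, :]   # rescale to match column sums

print(np.round(X, 2))
print("row sums:", np.round(X.sum(axis=1), 2))
print("col sums:", np.round(X.sum(axis=0), 2))
```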

Keywords : Big Data, Agent Based Model, Machine Learning, Information Theory, Economics, Risk Management, Network-based modelling, Computational Finance,

Track : Computational Finance

Contact 1 : Sumit Sourabh s.sourabh@uva.nl
Contact 2 : Drona Kandhai b.d.kandhai@uva.nl
Contact 3 : Ioannis Anagnostou i.anagnostou@uva.nl

Valid Date : Dec. 31, 2019

Agent Based Models in Economics and Finance

Agent-based models have been successfully applied to understand macro behaviour in financial markets as the outcome of their micro constituents and their interactions. More recently, models have been developed by the Medical University of Vienna. We would like to understand and extend these methods to model prepayment risk in retail lending.

Keywords : Agent Based Model, Economics, Network-based modelling, Computational Finance,

Track : Computational Finance

Contact 1 : Drona Kandhai b.d.kandhai@uva.nl

Valid Date : Dec. 31, 2019

Extraction of reaction coordinates with auto-encoders

Rare event transitions between stable states in high-dimensional systems are an important theme in our group. Examples of such rare event transitions are chemical reactions and phase transitions, but also aggregation, self-assembly and much more. Straightforward modelling of these events is out of the question, and advanced methods are required. The transition path sampling (TPS) method is one of those. Originally developed for two-state transitions (between A and B), TPS samples molecular dynamics (MD) trajectories between predefined stable states, reducing the simulation effort by orders of magnitude. The great advantage of path sampling is its independence from a priori order parameters. In fact, TPS was originally designed as a tool to understand unknown complex mechanisms in chemical reactions, protein folding, or nucleation phenomena. Here the reaction coordinates are not simple parameters like the number of particles, but usually complex combinations of collective variables. While committor-based algorithms enable the extraction of such reaction coordinates, non-linear reaction coordinates still pose a major challenge. Machine learning algorithms allow the construction of arbitrary non-linear reaction coordinates. Here we will use machine learning algorithms, in particular auto-encoders, to best approximate committor surfaces computed by path sampling.
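A minimal sketch of the idea, assuming PyTorch: an auto-encoder compresses collective-variable snapshots into a low-dimensional bottleneck, which can then be tested as a candidate non-linear reaction coordinate against committor values from path sampling. The architecture, data shapes, and training loop below are placeholders, not the group's production setup.

```python
# Placeholder auto-encoder for learning a 1D non-linear reaction coordinate from
# collective-variable snapshots (synthetic data; real input would come from TPS paths).
import torch
import torch.nn as nn

torch.manual_seed(0)
n_frames, n_cvs = 2000, 10
cvs = torch.randn(n_frames, n_cvs)        # stand-in for collective variables per frame

class AutoEncoder(nn.Module):
    def __init__(self, n_in, n_latent=1):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, 32), nn.Tanh(),
                                     nn.Linear(32, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 32), nn.Tanh(),
                                     nn.Linear(32, n_in))

    def forward(self, x):
        z = self.encoder(x)               # bottleneck = candidate reaction coordinate
        return self.decoder(z), z

model = AutoEncoder(n_cvs)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    recon, _ = model(cvs)
    loss = loss_fn(recon, cvs)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The learned bottleneck value per frame could then be correlated with committor
# estimates from path sampling to judge its quality as a reaction coordinate.
_, rc = model(cvs)
print("reconstruction loss:", float(loss), "| reaction-coordinate shape:", tuple(rc.shape))
```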

Keywords : Molecular Simulation,

Track : Others

Contact 1 : Peter Bolhuis p.g.bolhuis@uva.nl

Valid Date : June 1, 2019