Master of Computational Science Projects

Cellular simulation of blood flow

Blood is a complex suspension of various components in plasma, and many of its intriguing properties originate from this cellular nature. Red blood cells, the major component, transport oxygen and determine the bulk behaviour of blood. Platelets, the second most numerous cells, form the link between transport dynamics and several vital biochemical processes such as clot formation. With the recent advancement of micro-medical devices, modelling this small-scale behaviour is gaining importance. Accurate modelling of blood-flow-related phenomena on this scale requires a description of the dynamics at the level of individual cells. This, however, presents several computational challenges that can only be addressed by high-performance computing. We developed HemoCell (www.hemocell.eu), a parallel computing framework which implements validated mechanical models for red blood cells and is capable of reproducing the emergent transport characteristics of such a complex cellular system. It can handle large domain sizes and is therefore able to bridge the cell-based micro scale and the macroscopic domain. Using HemoCell as a tool, we pursue answers to numerous open questions around the human circulatory system.

Keywords : Computational Biomedicine,

Track : Computational Bio-Medicine

Contact 1 : Alfons Hoekstra A.G.Hoekstra@uva.nl

Valid Date : Jan. 25, 2019

Computational Origins Simulator

As part of a project for the Dutch Origins Center, and in parallel with the development of an experimental Origins Simulator (http://www.origins-center.nl/the-origins-simulator-a-preliminary-description), we will take initial steps to build a computational Origins Simulator. This will be partly based on previous work on autocatalytic sets (https://evolution-institute.org/article/the-origin-of-life-a-selfish-act-or-a-cooperative-effort), and on ongoing work on modeling biomineralization (https://staff.fnwi.uva.nl/j.a.kaandorp/research.html). Within this project, we will develop and run new simulation models to extend the existing research, in the context of origin of life studies.
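
To give a flavour of the autocatalytic-set work, the sketch below implements a minimal pruning loop in the spirit of the RAF (reflexively autocatalytic and food-generated) framework on a toy reaction network. The molecule names, food set, and reactions are invented for illustration; this is not the published algorithms or code. Starting from all reactions, it repeatedly removes reactions whose reactants are not producible from the food set, or that are not catalyzed by any producible molecule, until a fixed point remains.

# Toy RAF-style pruning on a hypothetical reaction network.
# A reaction is (reactants, products, catalysts); 'food' is freely available.
food = {"a", "b"}
reactions = [
    ({"a", "b"}, {"ab"}, {"abb"}),    # catalyzed by 'abb'
    ({"ab", "b"}, {"abb"}, {"ab"}),   # catalyzed by 'ab'
    ({"abb", "a"}, {"aabb"}, {"x"}),  # catalyst 'x' is never produced
]

def closure(food, rxns):
    """All molecules reachable from the food set using the given reactions."""
    produced = set(food)
    changed = True
    while changed:
        changed = False
        for reactants, products, _ in rxns:
            if reactants <= produced and not products <= produced:
                produced |= products
                changed = True
    return produced

def max_raf(food, rxns):
    """Iteratively drop reactions lacking reachable reactants or catalysts."""
    current = list(rxns)
    while True:
        producible = closure(food, current)
        kept = [r for r in current
                if r[0] <= producible and r[2] & producible]
        if len(kept) == len(current):
            return kept
        current = kept

print(max_raf(food, reactions))  # the first two reactions survive as a RAF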

Keywords : Agent Based Model, Network-based modelling, Chemical reaction networks,

Track : Complex Systems Theory

Contact 1 : Peter Sloot p.m.a.sloot@uva.nl
Contact 2 : Wim Hordijk wim@WorldWideWanderings.net

Valid Date : June 29, 2018

What is the role of vasoconstriction in initial platelet aggregation?

Our blood vessels contain smooth muscle cells, oriented in the circumferential direction, that regulate the diameter of the vessel. In this way the vessels regulate the blood pressure in our body. Regulating the diameter of a blood vessel is also important when a vessel is damaged: to reduce blood loss, the smooth muscle cells contract, a process called vasoconstriction. This contraction of the vessel is part of hemostasis, the process of blood clot formation. Vasoconstriction also increases the shear rate, which could be of value during initial platelet aggregation. Under high shear rates a protein called von Willebrand factor is important in the binding of platelets to the vessel wall and to each other. In our current study we observed a cell-free layer (CFL) at the site where platelet aggregation initiates, and we think this CFL is needed for the von Willebrand factor to uncoil and bind platelets. In this study we are therefore interested in the relationship between the amount of vessel constriction and the thickness of the CFL. In addition, we want to know whether vasoconstriction is only important in hemostasis or also in thrombus formation. During this project you will work on this topic with the cell-based model HemoCell. If you are interested, please contact Alfons Hoekstra or Britt van Rooij.
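
To see why constriction raises the shear rate, recall that for Poiseuille flow at a fixed volumetric flow rate Q the wall shear rate is 4Q/(pi R^3), i.e. it scales with the inverse cube of the vessel radius. The sketch below is a back-of-the-envelope estimate only (with illustrative values for Q and R; in a real vessel the flow rate itself changes upon constriction):

import numpy as np

# Wall shear rate of Poiseuille flow at fixed volumetric flow rate Q.
def wall_shear_rate(Q, R):
    return 4.0 * Q / (np.pi * R**3)

Q = 1e-10      # m^3/s, illustrative arteriolar flow rate
R0 = 25e-6     # m, illustrative resting radius
for constriction in [0.0, 0.25, 0.5]:
    R = R0 * (1.0 - constriction)
    print(f"{constriction:.0%} constriction: "
          f"shear rate x{wall_shear_rate(Q, R) / wall_shear_rate(Q, R0):.1f}")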

Keywords : Computational Biomedicine,

Track : Computational Bio-Medicine

Contact 1 : Alfons Hoekstra A.G.Hoekstra@uva.nl
Contact 2 : Britt van Rooij b.j.m.vanrooij@uva.nl

Valid Date : Dec. 31, 2019

Self-organized criticality in an interest-rate swap market model

The interest-rate swap market (and similar hedging activity) is a large financial derivatives market. Here, agents (banks, funds, others) build up risk internally, which they 'swap' with other agents so that the total risk nets to 'zero'. It is thus an agent-based, networked model. The problem is that if one agent cannot meet its obligations and defaults, the swaps it holds with other agents reverse, potentially leading to a cascade or avalanche. Initial simulation results from a previous Master thesis project reveal a self-organized critical regime in the parameter space, and even a 'super' self-organized critical regime where system-wide crashes are inevitable. The work is documented in a thesis report, and an optimized software package is available. This project will build on that work towards a scientific publication. Remaining open questions include a mathematical mean-field approach to better understand the process and its critical regimes, as well as a comparison of the theoretical model parameters with real data and domain knowledge. More simulations are also likely needed. The end goal is to bring the important message out into the world that these types of markets are potentially very dangerous to the systemic stability of our economy. We are looking for a mathematically or physics-oriented student.
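
A minimal flavour of the cascade mechanism (a toy model, not the thesis software): agents hold bilateral swap exposures that net to roughly zero; when an agent defaults, its counterparties absorb the positive exposure they held against it as a loss, which may push them over their capital buffer and propagate the avalanche. Self-organized criticality would then be probed through the avalanche size distribution.

import numpy as np

rng = np.random.default_rng(0)
n = 200
# Antisymmetric exposure matrix: E[i, j] is what agent j owes agent i.
E = rng.normal(size=(n, n))
E = np.triu(E, 1)
E = E - E.T
capital = rng.uniform(5, 15, size=n)   # toy capital buffers

def avalanche(seed_agent):
    """Number of defaults triggered by the default of one seed agent."""
    defaulted = {seed_agent}
    frontier = [seed_agent]
    loss = np.zeros(n)
    while frontier:
        nxt = []
        for d in frontier:
            # counterparties lose the positive exposure they held against d
            loss += np.maximum(E[:, d], 0.0)
            for j in np.where(loss > capital)[0]:
                if j not in defaulted:
                    defaulted.add(j)
                    nxt.append(j)
        frontier = nxt
    return len(defaulted)

sizes = [avalanche(i) for i in range(n)]
print("mean avalanche size:", np.mean(sizes))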

Keywords : Agent Based Model, Economics, Risk Management, Network-based modelling, Computational Finance,

Track : Computational Finance, Complex Systems Theory

Contact 1 : Rick Quax r.quax@uva.nl

Valid Date : Jan. 25, 2019

Shear Thickening Simulations Using Model Cornstarch Particles

Shear thickening is the phenomenon in which the viscosity of a suspension increases with the shear rate. Two types of shear thickening are observed in experiments: continuous shear thickening (CST) and discontinuous shear thickening (DST). During CST the viscosity of the suspension increases continuously with the imposed shear rate, while in DST the viscosity increases by orders of magnitude with a small increment in shear rate. A cornstarch-and-water suspension, commonly known as Oobleck, exhibits both: CST at lower cornstarch volume fractions, and DST at higher volume fractions (~0.4). The DST volume fraction for cornstarch suspensions is significantly lower than that for spherical-particle suspensions (~0.56), which has been attributed to the non-spherical shapes of cornstarch particles. Our suspension simulation model (SuSi) can handle different particle shapes and simulate large numbers of particles immersed in a fluid medium. Using the system currently implemented in SuSi, one can create almost any shape from overlapping spheres; this technique will be used to model the shapes of cornstarch particles. In this project, we propose to model the cornstarch particles based on their shapes in microscopic images, and then to study the resulting shear thickening using SuSi.
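
The overlapping-spheres idea can be illustrated in a few lines of Python (a generic sketch; SuSi's actual particle representation may differ): a composite particle is a rigid set of sphere centres and radii, and a point belongs to the particle if it lies inside at least one member sphere. The same test gives the particle volume by Monte Carlo, which is needed for volume fractions.

import numpy as np

rng = np.random.default_rng(1)

def cornstarch_like_particle(n_spheres=8, spread=0.6):
    """Rigid cluster of overlapping spheres approximating an irregular grain."""
    centres = rng.normal(scale=spread, size=(n_spheres, 3))
    radii = rng.uniform(0.4, 0.8, size=n_spheres)
    return centres, radii

def inside(point, centres, radii):
    """A point belongs to the particle if any member sphere contains it."""
    return bool(np.any(np.linalg.norm(centres - point, axis=1) <= radii))

centres, radii = cornstarch_like_particle()

# Monte Carlo estimate of the particle volume
box = 3.0
samples = rng.uniform(-box, box, size=(20000, 3))
hits = sum(inside(p, centres, radii) for p in samples)
print("particle volume ~", hits / len(samples) * (2 * box)**3)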

Keywords : shear thickening, microstructure, simulation, modeling,

Track : Others

Contact 1 : Alfons Hoekstra A.G.Hoekstra@uva.nl
Contact 2 : Vishnu Sivadasan v.sivadasan@uva.nl

Valid Date : Dec. 31, 2019

Holographic projector

When mentioning the word 'hologram', many people think of 'Star Wars'-like technology that creates projections floating in free space. This is actually not too far from what is theoretically possible; in reality, for viewing, the projector, the projection, and the observer must be in line. In daily life, holograms are static images that can be found in some museums and exhibitions. By making use of the wave-like nature of light, by means of interference, it is possible to create intensity distributions in free space. A holographic projector can be made with a 'screen' in which each 'pixel' emits light with precisely set intensity and phase. The light from all these pixels results in a three-dimensional intensity distribution that represents the object to be projected.

A difficulty in generating holograms is the required dense pixel pitch. A 'conventional' holographic projector would require pixels smaller than 200 nanometres. With larger pixels, artefacts occur due to spatial undersampling, and the useful projection angle shrinks (only a few degrees for state-of-the-art systems). A pixel pitch of 200 nm or less is difficult, if not impossible, to achieve, especially over larger areas. Another challenge is the massive amount of computing power and data handling required to control such a dense pixel matrix.

A new holographic projection method has been developed that reduces 'conventional' undersampling artefacts regardless of pixel density. The trick is to create 'pixels' at random but known positions, resulting in a 'pixel matrix' that lacks (or strongly suppresses) spatial periodicity. As a result, a holographic emitter can be built with a significantly lower sample density and less computing power. This could bring holography within reach of many applications such as display, lithography, 3D printing, and metrology. A short description (in Dutch) can be found through the following link: https://www.nikhef.nl/nieuws/prototype-holografische-videoprojector-in-de-maak/

Our goal is to quantify the relation between hologram 'quality' and the density and positioning of the 'pixels'. The quality of a hologram is defined by factors such as noise, contrast, suppression of undersampling artefacts, and resolution. One of the challenges a student can work on is to build a proper simulation model of this system. The programming language and modelling method(s) depend, to a large extent, on the preference of the student(s). The purpose of this simulation model is to determine the requirements on a holographic projector for a given set of requirements on the holograms (for example, a contrast ratio better than 1:100). In addition, a proof-of-concept holographic emitter is being built; this set-up will be used to verify simulation results (and, of course, to project some cool holograms). If you want additional information, please contact Martin Fransen: martinfr<at>nikhef.nl
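
The core computation behind such a simulation is scalar diffraction from a set of point emitters: the complex field at an observation point r is the superposition E(r) = sum_j A_j * exp(i*(k*|r - p_j| + phi_j)) / |r - p_j| over emitter positions p_j with phases phi_j. The sketch below (illustrative dimensions only) compares a regular and a randomly positioned 1D emitter array at a coarse pitch; the suppression of undersampling artefacts should show up as the grating lobes of the regular array being replaced by a diffuse noise floor.

import numpy as np

wavelength = 500e-9
k = 2 * np.pi / wavelength
pitch = 2e-6                       # deliberately coarse (>> 200 nm)
n_emit = 400
rng = np.random.default_rng(2)

x_regular = (np.arange(n_emit) - n_emit / 2) * pitch
x_random = rng.uniform(x_regular.min(), x_regular.max(), n_emit)

def field_on_screen(x_emit, x_screen, z=0.05, phases=None):
    """Sum spherical waves from emitters at (x_emit, 0) on a screen at distance z."""
    if phases is None:
        phases = np.zeros_like(x_emit)
    r = np.sqrt((x_screen[:, None] - x_emit[None, :])**2 + z**2)
    return np.sum(np.exp(1j * (k * r + phases[None, :])) / r, axis=1)

x_screen = np.linspace(-0.02, 0.02, 2000)
I_reg = np.abs(field_on_screen(x_regular, x_screen))**2
I_rnd = np.abs(field_on_screen(x_random, x_screen))**2
# I_reg shows strong secondary peaks (grating lobes) away from the centre;
# I_rnd trades these aliasing artefacts for a diffuse background.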

Keywords : High Performance Computing, Computational geometry, Image Processing, Monte Carlo simulation, modeling,

Track : Others

Contact 1 : Michael Lees M.H.Lees@uva.nl

Valid Date : Sept. 1, 2019

Improving extreme sea-level predictions with machine learning

The performance of current global coastal flood models is limited by, among other factors, the coarse spatial and temporal resolution of the climate forcings. As a result, some processes are modelled incorrectly or not accounted for at all, which produces a strong bias in modelled sea levels. Properly modelling those processes requires detailed spatio-temporal data. Machine learning algorithms constitute an interesting downscaling alternative: they have been successfully applied to diverse flood-risk, hydrologic, and climate studies. In this thesis, the student will apply machine learning algorithms to predict extreme sea levels and, importantly, compare these with outputs from physically-based model chains such as Delft3D. Students interested in applying machine learning to other topics are welcome to present their own project idea as well. Various alternatives for this project are also possible, such as attributing weather systems to observed extreme sea levels, or applying machine learning to extreme water levels in riverine systems. The student should have some experience in coding and a willingness to learn to work with big datasets, machine learning algorithms, and computational models. Interested students should send their CV and motivation to Anaïs Couasnon (anais.couasnon@vu.nl), Dirk Eilander (dirk.eilander@vu.nl) and Sanne Muis (sanne.muis@vu.nl).
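
A minimal flavour of the statistical approach, on synthetic data with invented predictor names (the real project would use reanalysis forcings and tide-gauge observations): train a regressor that maps coarse atmospheric predictors to observed sea levels and evaluate its skill on held-out data.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
n = 5000
u10 = rng.weibull(2.0, n) * 10           # wind speed at 10 m (synthetic)
slp = rng.normal(1013, 12, n)            # sea-level pressure (synthetic)
# Synthetic "observed" surge: quadratic wind stress + inverse barometer + noise
surge = 0.002 * u10**2 + 0.01 * (1013 - slp) + rng.normal(0, 0.05, n)

X = np.column_stack([u10, slp])
X_tr, X_te, y_tr, y_te = train_test_split(X, surge, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
print("R^2 on held-out data:", r2_score(y_te, model.predict(X_te)))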

Keywords : Big Data, Machine Learning, Risk Management, Time-series analysis, modeling, climate change,

Track : Others

Contact 1 : Anaïs Couasnon anais.couasnon@vu.nl

Valid Date : Dec. 1, 2018

Climate: Persistent summer extremes

Persistent summer extremes are generally associated with increased societal impact. For example, the 2010 Pakistan flood was associated with anomalous precipitation that was stationary in time and space (i.e. persistent), which led to a higher accumulation of water. Persistent heat waves are a bigger threat to crop yield, human health, and wildfires than summer extremes defined only by a certain threshold temperature. Climate extremes occur when multiple actors align to promote an anomaly. The mechanisms that drive an extreme event can differ, and here we focus on the distinction between the mechanisms of persistent and single-event extremes. As we know, the weather behaves chaotically due to non-linear interactions: even knowing the equations of motion exactly, minor changes in today's initial conditions can lead to a wide spread in weather predictions for next week. Single-event extremes are often the result of non-linear interactions that keep amplifying an anomaly, which then grows into an unlucky, rare weather event. Persistent climate extremes are more likely to be sustained by actors that vary more slowly in time. For example, heat waves in Europe and the eastern U.S. are more likely to occur during certain sea surface temperature patterns that are known to induce 'quasi-stationary' Rossby waves. These Rossby waves interact with the jet stream and can nudge the circulation into an anticyclonic flow, thereby promoting the occurrence of heat waves. Such actors are more persistent because their precursor (in this case the sea surface temperature) varies more slowly in time. When multiple of these persistent actors align, the probability of persistent summer extremes increases. Interesting open questions regarding persistent weather include:

• Do positive land-atmosphere feedbacks not only amplify a temperature anomaly, but also prolong it?
• To what extent are climate extremes dominated by low-frequency variability?
• Can we find clustering of extreme events due to low-frequency variability? (One way to make this concrete is sketched below.)
• How does heterogeneous warming, such as arctic amplification, affect the persistence of weather?
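
A minimal test for clustering/persistence of extremes, on synthetic data (purely illustrative; a real analysis would use observed temperature records): compare the lengths of exceedance runs in a series with the same statistic after shuffling, which destroys temporal dependence. Longer-than-chance runs indicate persistence.

import numpy as np

rng = np.random.default_rng(4)
# Synthetic daily temperature anomalies with memory: an AR(1) process
n = 10000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()

def run_lengths(series, threshold):
    """Lengths of consecutive runs above the threshold."""
    lengths, count = [], 0
    for hot in series > threshold:
        if hot:
            count += 1
        elif count:
            lengths.append(count)
            count = 0
    if count:
        lengths.append(count)
    return np.array(lengths)

thr = np.quantile(x, 0.9)
obs = run_lengths(x, thr).mean()
shuf = run_lengths(rng.permutation(x), thr).mean()
print(f"mean hot-spell length: {obs:.2f} days vs {shuf:.2f} after shuffling")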

Keywords : Big Data, Machine Learning, Time-series analysis, modeling, climate change,

Track : Others

Contact 1 : Dim Coumou d.coumou@vu.nl

Valid Date : Jan. 1, 2019

Network Modelling in Finance

In the past few years, our group has been involved in several theoretical and applied research projects involving network modelling in finance. As a follow-up, we would like to further explore network reconstruction algorithms and early warning signals.

Keywords : Big Data, Agent Based Model, Machine Learning, Information Theory, Economics, Risk Management, Network-based modelling, Computational Finance,

Track : Computational Finance

Contact 1 : Sumit Sourabh s.sourabh@uva.nl
Contact 2 : Drona Kandhai b.d.kandhai@uva.nl
Contact 3 : Ioannis Anagnostou i.anagnostou@uva.nl

Valid Date : Dec. 31, 2019

Agent Based Models in Economics and Finance

Agent-based models have been successfully applied to understand macro behaviour in financial markets as the result of their micro constituents and their interactions. More recently, such models have been developed by the Medical University of Vienna. We would like to understand and extend these methods to model prepayment risk in retail lending.

Keywords : Agent Based Model, Economics, Network-based modelling, Computational Finance,

Track : Computational Finance

Contact 1 : Drona Kandhai b.d.kandhai@uva.nl

Valid Date : Dec. 31, 2019

Extraction of reaction coordinates with auto-encoders

Rare event transitions between stable states in high-dimensional systems are an important theme in our group. Examples of such rare transitions include chemical reactions and phase transitions, but also aggregation, self-assembly, and much more. Straightforward modeling of these events is out of the question, and advanced methods are required. The transition path sampling (TPS) method is one of those. Originally developed for two-state transitions (between A and B), TPS samples molecular dynamics (MD) trajectories between predefined stable states, reducing the simulation effort by orders of magnitude. The great advantage of path sampling is its independence from a priori order parameters. In fact, TPS was originally designed as a tool to understand unknown complex mechanisms in chemical reactions, protein folding, or nucleation phenomena. Here the reaction coordinates are not simple parameters like the number of particles, but usually complex combinations of collective variables. While committor-based algorithms enable extraction of such reaction coordinates, non-linear reaction coordinates still pose a major challenge. Machine learning algorithms allow the construction of arbitrary non-linear reaction coordinates. Here we will use machine learning algorithms, in particular auto-encoders, to best approximate committor surfaces computed by path sampling.
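
A minimal sketch of the auto-encoder idea in PyTorch (illustrative only; the actual project would train on collective variables harvested from path-sampling data and relate the learned bottleneck to committor values): the one-dimensional bottleneck z is the candidate non-linear reaction coordinate.

import torch
import torch.nn as nn

class RCAutoEncoder(nn.Module):
    def __init__(self, n_cv, n_latent=1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_cv, 32), nn.Tanh(), nn.Linear(32, n_latent))
        self.decoder = nn.Sequential(
            nn.Linear(n_latent, 32), nn.Tanh(), nn.Linear(32, n_cv))

    def forward(self, x):
        z = self.encoder(x)        # z: the learned reaction coordinate
        return self.decoder(z), z

# Toy data standing in for collective variables along sampled paths
x = torch.randn(2048, 10)
model = RCAutoEncoder(n_cv=10)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    x_hat, z = model(x)
    loss = nn.functional.mse_loss(x_hat, x)
    opt.zero_grad()
    loss.backward()
    opt.step()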

Keywords : Molecular Simulation,

Track : Others

Contact 1 : Peter Bolhuis p.g.bolhuis@uva.nl

Valid Date : June 1, 2019

Predicting the effect of extended-duration mass drug administration on elimination of worm infections

About a quarter to a third of the world population carries one or more worm infections, which may cause anaemia, growth retardation, disfiguring skin disease, blindness, and even cancer. The four most common worm infections – lymphatic filariasis, onchocerciasis, soil-transmitted helminths, and schistosomiasis – are currently targeted by the World Health Organization to be controlled or eliminated by 2020 or 2025. The main control strategy is mass drug administration (MDA), which involves distributing deworming drugs to entire populations or specific target groups, such as school-age children, in endemic areas on a regular basis, typically annually or six-monthly. Successful control or elimination of worm infections requires that MDA is implemented at high coverage of the target population and that MDA rounds are repeated over an extended period of time, sometimes more than ten years. A major challenge in optimizing MDA is to reach and treat individuals who are absent from their communities during MDA due to seasonal work, family visits, etc. Individuals who are systematically absent serve as a reservoir of infection for the rest of the community, and therefore prolong the duration of MDA programmes required to achieve control or elimination. A potential solution is to extend the time window during which drugs are distributed, i.e. extended-duration MDA, in contrast to the current MDA duration of only about two weeks. However, extended-duration MDA would also involve additional cost and time for the health professionals and/or volunteers who execute the MDA programmes locally. Currently, it is not clear what the potential benefits of extended-duration MDA are in different situations, and to what extent these benefits weigh against the additional costs. The Infectious Disease Control Research Group in the Department of Public Health is specialized in mathematical modelling of the transmission and control of infectious diseases and has developed a generalised individual-based model for transmission of worm infections called WORMSIM. The MSc student will have the opportunity to use WORMSIM to generate, visualise, and analyse a large database of predicted outcomes of normal vs. extended-duration MDA in a large number of scenarios pertaining to transmission conditions and patterns in MDA coverage. To weigh the benefits of extended-duration MDA against the additional costs involved, the MSc student will perform a literature review in the fields of health economics and social sciences. This project is suitable for students with an interest in and affinity for quantitative science. The student will make heavy use of the statistical package R; experience with R, or programming experience in another language, is therefore highly valued.
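
A toy illustration of why systematic absence matters (not WORMSIM; a deliberately oversimplified individual-based sketch with invented parameter values): a fixed subgroup never attends MDA and keeps reseeding transmission, so the community-wide burden stalls even at high nominal coverage.

import numpy as np

rng = np.random.default_rng(5)
n, rounds = 10000, 15
coverage, efficacy = 0.8, 0.95
systematic = rng.random(n) < 0.05        # 5% never reached by MDA
worms = rng.poisson(10.0, n).astype(float)

for _ in range(rounds):
    treated = (rng.random(n) < coverage) & ~systematic
    worms[treated] *= (1 - efficacy)     # drug kills most worms in the treated
    # Crude transmission: new worms acquired in proportion to the mean burden
    worms += rng.poisson(0.3 * worms.mean(), n)

print("mean burden overall:", worms.mean())
print("mean burden in systematic non-attenders:", worms[systematic].mean())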

Keywords : Agent Based Model, Monte Carlo simulation, simulation, modeling, Infectious disease, Epidemiology,

Track : Computational Biology

Contact 1 : Luc Coffeng l.coffeng@erasmusmc.nl

Valid Date : June 1, 2019

Predicting the required efficacy of vaccines for control and elimination of soil-transmitted helminths

About one billion people globally are infected by one or more species of soil-transmitted helminths (STH), a group of worm species that live in the host's intestines and are transmitted through faecal contamination of the environment. STH infections are responsible for malnutrition, anaemia, growth retardation, and delayed cognitive development. Because of this, STH are currently targeted by the World Health Organization to be controlled or eliminated by 2020, where control is defined as a prevalence of moderate to heavy infection under 1%. The main control strategy is mass drug administration (MDA), which involves distributing deworming drugs to entire populations or specific target groups, such as school-age children, in endemic areas on a regular basis, typically annually or six-monthly. Successful control or elimination of STH requires that MDA is implemented at high coverage of the target population and that MDA rounds are repeated over an extended period of time, sometimes more than ten years. In the long run, sustained control or elimination can be achieved by improved access to water, sanitation, and hygiene, but this has proven very challenging and expensive to implement successfully in endemic areas. To decrease the required duration of MDA programmes and prevent the anticipated emergence of drug resistance, there are ongoing efforts to develop a vaccine against STH infections. The impact of an STH vaccine will depend on the vaccine target (e.g. the adult worm and/or incoming infective L3 larvae) and its efficacy, the context in which it is implemented (e.g. transmission conditions, history of MDA), and who is being vaccinated (e.g. children vs. the whole population). For the development of vaccine candidates, it is important to understand the minimal efficacy that a vaccine must have to impact STH transmission. This is ideally investigated by means of mathematical modelling, a powerful technique to translate the individual-level outcomes of interventions to the larger population by accounting for the interactions between individuals. The Infectious Disease Control Research Group in the Department of Public Health is specialized in mathematical modelling of the transmission and control of infectious diseases and has developed a generalised individual-based model for transmission of worm infections called WORMSIM. The MSc student will have the opportunity to use WORMSIM to generate, visualise, and analyse a large database of predicted outcomes of vaccinating various target populations with hypothetical vaccines in a large number of scenarios pertaining to transmission conditions and MDA history. Based on this database, the MSc student will correlate vaccine mechanism and efficacy with the expected added benefit of vaccination in terms of worm control and elimination, and determine the required minimal efficacy of future vaccine candidates. This project is suitable for students with an interest in and affinity for quantitative science. The student will make heavy use of the statistical package R; experience with R, or programming experience in another language, is therefore highly valued.
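
The flavour of the 'minimal required efficacy' question can be shown with a back-of-the-envelope threshold argument (a gross simplification compared to WORMSIM, which models macroparasite burdens individually): if a vaccine reduces worm establishment by a factor (1 - eps) in a vaccinated fraction v of the population, the effective reproduction number is roughly R_eff = R0 * (1 - v * eps), and transmission dies out when R_eff < 1, i.e. eps > (1 - 1/R0) / v.

# Minimal required vaccine efficacy under a well-mixed threshold argument.
def min_efficacy(R0, vacc_fraction):
    eps = (1 - 1 / R0) / vacc_fraction
    return eps if eps <= 1 else None   # None: unreachable at this coverage

for R0 in [1.5, 2.0, 3.0]:
    for v in [0.5, 0.8, 1.0]:
        eps = min_efficacy(R0, v)
        msg = f"{eps:.0%}" if eps is not None else "not achievable"
        print(f"R0={R0}, coverage={v:.0%}: minimal efficacy {msg}")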

Keywords : Agent Based Model, Monte Carlo simulation, simulation, modeling, Infectious disease, Epidemiology,

Track : Computational Biology

Contact 1 : Luc Coffeng l.coffeng@erasmusmc.nl

Valid Date : June 1, 2019

Individual-based modelling of the population dynamics of drug resistance in soil-transmitted worms in humans

About one billion people globally are infected by one or more species of soil-transmitted helminths (STH), a group of worm species that live in the host's intestines and are transmitted through faecal contamination of the environment. STH infections are responsible for malnutrition, anaemia, growth retardation, and delayed cognitive development. Because of this, STH are currently targeted by the World Health Organization to be controlled or eliminated by 2020, where control is defined as a prevalence of moderate to heavy infection under 1%. The main control strategy is mass drug administration (MDA), which involves distributing deworming drugs to entire populations or specific target groups, such as school-age children, in endemic areas on a regular basis, typically annually or six-monthly. Successful control or elimination of STH requires that MDA is implemented at high coverage of the target population and that MDA rounds are repeated over an extended period of time, sometimes more than ten years. In the long run, sustained control or elimination can be achieved by improved access to water, sanitation, and hygiene, but this has proven very challenging and expensive to implement successfully in endemic areas. MDA programmes face the real threat of emergence of drug resistance. How quickly resistance develops, and how MDA can be optimized to delay it while maintaining effective control, is ideally investigated by means of mathematical modelling, a powerful technique to translate the individual-level outcomes of interventions to the larger population by accounting for the interactions between individuals. The Infectious Disease Control Research Group in the Department of Public Health is specialized in mathematical modelling of the transmission and control of infectious diseases and has developed a generalised individual-based model for transmission of worm infections called WORMSIM. The MSc student will have the opportunity to further develop WORMSIM, and to generate, visualise, and analyse a large database of predicted outcomes for various scenarios pertaining to transmission conditions and MDA history. Based on this database, the MSc student will assess how to optimize MDA to maximize the time until relevant levels of drug resistance develop while maintaining effective control. This project is suitable for students with an interest in and affinity for quantitative science. The student will make heavy use of the statistical package R; experience with R, or programming experience in another language, is therefore highly valued. Initiatives to translate and improve the code in other, faster languages are encouraged.
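
A toy population-genetics sketch of how repeated MDA selects for resistance (not the WORMSIM mechanism; standard single-locus selection with invented survival values): each round, the resistance allele frequency p is updated by its relative survival advantage under drug pressure.

# Single-locus selection under repeated drug pressure (toy model).
def select(p, survival_resistant=0.9, survival_sensitive=0.05, coverage=0.8):
    # Average survival of each allele: only worms in treated hosts face the drug.
    w_r = coverage * survival_resistant + (1 - coverage)
    w_s = coverage * survival_sensitive + (1 - coverage)
    return p * w_r / (p * w_r + (1 - p) * w_s)

p = 1e-4   # rare resistance allele before MDA starts
for rnd in range(1, 21):
    p = select(p)
    if rnd % 5 == 0:
        print(f"after {rnd} MDA rounds: resistance frequency {p:.3f}")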

Keywords : Agent Based Model, Monte Carlo simulation, simulation, modeling, Infectious disease, Epidemiology,

Track : Computational Biology

Contact 1 : Luc Coffeng l.coffeng@erasmusmc.nl

Valid Date : June 1, 2019

Early prediction of abnormal heart rhythm using information theory

We would like to explore a potential collaboration with Hiroshi Ashikaga, MD, PhD (Johns Hopkins University School of Medicine) by applying our information theory measures to their data. The data comprise many dense time series of 20 seconds of heart rhythm of a single patient, of which the first 10 seconds are healthy and the second 10 seconds are abnormal. Our question is whether we can predict this transition by measuring a decrease of stability using information theory (the information dissipation time). Please have a look at Rick Quax's publications in Interface and Scientific Reports as background reading.
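
A sketch of the kind of measure involved, on synthetic data (illustrative; see the cited publications for the actual information dissipation time): estimate the time-lagged mutual information I(X_t; X_{t+tau}) from binned samples and track how quickly it decays with the lag tau. A shrinking decay time would signal a loss of dynamical stability.

import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(6)
# Stand-in signal with tunable memory: an AR(1) process
x = np.zeros(20000)
for i in range(1, x.size):
    x[i] = 0.9 * x[i - 1] + rng.normal()

def lagged_mi(x, lag, bins=16):
    """Mutual information (nats) between the signal and its lagged copy."""
    edges = np.histogram_bin_edges(x, bins)
    a = np.digitize(x[:-lag], edges)
    b = np.digitize(x[lag:], edges)
    return mutual_info_score(a, b)

for lag in [1, 5, 10, 20, 40]:
    print(f"lag {lag:3d}: I = {lagged_mi(x, lag):.3f} nats")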

Keywords : Fisher Information, Entropy, Information Theory,

Track : Urban Complex System

Contact 1 : Rick Quax r.quax@uva.nl

Valid Date : Oct. 16, 2018

The effect of stochastic volatility dynamics on CVA in multi-curve interest rate models

In recent years, counterparty credit risk has become increasingly important in derivatives pricing and hedging. Since the 2007 financial crisis, the pricing of derivatives not only focuses on market risk, but also takes into account the possibility that the counterparty, and indeed the bank itself, may not be able to meet their financial obligations. In such an event, one of the parties involved will in general incur a loss if the deal was not fully collateralized. This has led to the adoption of the so-called Credit Valuation Adjustment, or CVA, which is a model-dependent premium charged on top of the "classical" no-arbitrage price to account for these possible losses. In contrast to classical derivative pricing, CVA calculations are done not only at the single-trade level but also at the portfolio level. This means that CVA can depend on a large number of risk factors (FX rates, interest rates in different currencies), which leads banks to use simplified models in practice that, for instance, do not model the skew and smile of interest rate dynamics. The goal of this project is to quantify the effect of a more realistic specification of the volatility dynamics on the CVA of FX and interest rate derivatives. In a recent study, Croonenburg [1] performs a similar analysis; rather than assuming a single-curve framework, in this project the student will extend the analysis in [1] to the more general multi-curve framework [2], which accommodates basis spreads between LIBOR and OIS rates. References: [1] N. Croonenburg, The Effect of Local Volatility Interest Rate Models on the CVA of Vanilla Interest Rate Swaps, MSc thesis, October 1, 2017. [2] F. Mercurio, Interest Rate Modelling in the New Era, available at: https://pdfs.semanticscholar.org/presentation/020b/a103c56a980c6c95fb39d971258ce6f8acd1.pdf
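
For orientation, a minimal unilateral CVA computation in the standard form CVA ~ (1 - R) * sum_i EE(t_i) * PD(t_{i-1}, t_i), here for a single FX forward whose exposure is simulated under lognormal dynamics with a flat default intensity. This is exactly the kind of simplified single-curve, deterministic-volatility baseline the project would relax; the parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(7)
n_paths, n_steps, T = 50000, 40, 2.0
dt = T / n_steps
S0, K, sigma, r = 1.10, 1.10, 0.10, 0.01   # FX forward parameters (illustrative)
R, h = 0.4, 0.02                           # recovery rate, default intensity

# Simulate lognormal FX paths (risk-neutral drift r assumed for simplicity)
t = np.linspace(dt, T, n_steps)
z = rng.normal(size=(n_paths, n_steps))
logS = np.log(S0) + np.cumsum((r - 0.5 * sigma**2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1)
S = np.exp(logS)

# Mark-to-market of a forward struck at K, then discounted expected exposure
V = S - K * np.exp(-r * (T - t))
EE = np.maximum(V, 0).mean(axis=0) * np.exp(-r * t)

# Probability of default in each interval under a flat hazard rate h
pd_slice = np.exp(-h * (t - dt)) - np.exp(-h * t)
cva = (1 - R) * np.sum(EE * pd_slice)
print(f"CVA ~ {cva * 1e4:.1f} bp of notional")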

Keywords : Valuations, Risk Management, Computational Finance, Monte Carlo simulation,

Track : Computational Finance

Contact 1 : Drona Kandhai b.d.kandhai@uva.nl

Valid Date : Oct. 16, 2018

Surrogate modelling of damage stability of ships

This internship position is offered by the SARC.nl company, which develops models and software tools for ship design, fairing, and on-board loading calculations. ABSTRACT: The ship design process is governed by computation-intensive analyses. Well-known examples are CFD and FEM, but there are also more specific topics, such as damage stability analysis. At SARC, 3D models and simulation software are available for this task. However, the damage stability properties cannot be assessed at an early stage of ship design, because modelling the ship and running the computations are too time-consuming. The aim of this project is therefore to develop a surrogate model for damage stability prediction at the early design stage, by employing existing 3D computational models and machine learning methods.
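
The surrogate-modelling loop can be sketched generically (parameter names and the stability stub are invented; the real inputs and outputs would come from SARC's hull models and solvers): sample early-stage design parameters, evaluate the expensive computation, and fit a cheap regressor that replaces it with an uncertainty estimate.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(8)

def expensive_damage_stability(params):
    """Stub standing in for the full 3D damage-stability computation."""
    length, beam, draught = params
    return beam / draught + 0.01 * length + rng.normal(0, 0.02)

X = rng.uniform([80, 12, 4], [200, 32, 10], size=(60, 3))   # (L, B, T) samples
y = np.array([expensive_damage_stability(p) for p in X])

surrogate = GaussianProcessRegressor(
    kernel=Matern(length_scale=[50, 10, 3], nu=2.5),
    normalize_y=True).fit(X, y)
pred, std = surrogate.predict([[150, 25, 7]], return_std=True)
print(f"predicted stability index: {pred[0]:.2f} +/- {std[0]:.2f}")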

Keywords : Machine Learning, simulation, modeling, FEM,

Track : Others

Contact 1 : Valeria Krzhizhanovskaya V.Krzhizhanovskaya@uva.nl
Contact 2 : Herbert Koelman H.J.Koelman@sarc.nl

Valid Date : Aug. 31, 2019

Decision-support for patient after-care choices for burn wounds

The Brandwondenstichting (brandwondenstichting.nl, the Dutch burns foundation) has a large amount of longitudinal patient data, from the moment of injury throughout hospitalization and subsequent health care. In the after-care phase, patients still have to make many decisions. For instance, will the patient choose plastic surgery to reduce scar tissue, and if so, which type? This is a complex decision with many factors at play, such as physiology, but also what the patient finds important, whether or not the scar creates discomfort, the risks associated with the surgery, and what is and is not achievable. The goal of this project is to analyze the data and create an optimal decision-tree model, based on the data as well as on expert opinions. The idea is to let the patient answer questions that allow the system to find 'similar' patients, who can then be used to provide historical evidence for the patient's choice. The outcome of this tool will be discussed between the patient and a doctor. The aim is a proof of principle, which will subsequently be implemented.
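
The 'find similar patients' step is essentially nearest-neighbour retrieval on standardized features; a sketch with invented attribute names and synthetic outcomes (the real features would come from the foundation's registry):

import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)
# Hypothetical registry: [age, % body surface burned, months since injury]
registry = rng.uniform([18, 1, 1], [80, 40, 60], size=(500, 3))
outcomes = rng.random(500) < 0.6     # e.g. "satisfied after surgery" (synthetic)

scaler = StandardScaler().fit(registry)
index = NearestNeighbors(n_neighbors=20).fit(scaler.transform(registry))

new_patient = [[35, 15, 12]]
_, idx = index.kneighbors(scaler.transform(new_patient))
print(f"{outcomes[idx[0]].mean():.0%} of similar past patients "
      "report a good outcome")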

Keywords : Machine Learning, modeling,

Track : Complex Systems Theory

Contact 1 : Rick Quax r.quax@uva.nl

Valid Date : June 29, 2019

Predictive Simulation of Punctuality in Public Transport

Influencing punctuality is often a difficult task, as many agents influence this outcome. With this project we aim to move from result-driven indicators to performance-driven indicators. Punctuality is therefore seen as a complex system with dynamic agents, whose influence may be positive or negative. The focus is on understanding these factors as well as on using them for predictive capability. The predictive emphasis is to help steer the KPI decision-makers towards taking the right, pre-emptive measures to ensure the ultimate goal of public transportation: optimal punctuality. In this project, the student will look at the system dynamics behind the capability of a public transport vehicle to achieve optimal punctuality. The student will investigate two main points: (1) Detecting outliers and the influencing factors: develop a method to detect and describe outliers (see the sketch below), and create a dynamic table showing the relevant factors that play an active role in punctuality. (2) Simulation of the system dynamics: map the feedback loops and key modifiers, visualize them using simulation software (e.g. MATSim), and calculate the impact of a decision on punctuality. The goal is to produce a working predictive tool/module that can simulate punctuality starting from predefined constraints (e.g. traffic, ridership, etc.). Data sources: vehicle data from public transport companies (e.g. ridership, maintenance, planning, etc.) and publicly available data sources (e.g. weather, traffic, etc.). Contact: Mert Dekkers m.dekkers@zight.nl
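
For point (1), a simple robust-outlier baseline on synthetic delay data (illustrative; the real inputs would be the vehicle logs named above) flags arrivals whose delay falls outside the interquartile fences:

import numpy as np

rng = np.random.default_rng(10)
# Synthetic arrival delays in seconds, with a few disrupted runs mixed in
delays = np.concatenate([rng.normal(30, 20, 980), rng.normal(400, 120, 20)])

q1, q3 = np.percentile(delays, [25, 75])
iqr = q3 - q1
outliers = (delays < q1 - 1.5 * iqr) | (delays > q3 + 1.5 * iqr)
print(f"{outliers.sum()} of {delays.size} arrivals flagged as outliers")
# Next step: join flagged runs with weather/traffic factors to build the
# 'dynamic table' of influencing factors mentioned above.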

Keywords : Mobility, Smart City, Big Data, Transportation, Agent Based Model,

Track : Complex Systems Theory, Urban Complex System

Contact 1 : Valeria Krzhizhanovskaya V.Krzhizhanovskaya@uva.nl

Valid Date : Aug. 31, 2019

Care after burn wounds: shared decision making tool as part of a patient portal for burn survivors

Aim: In this project, complex medical information regarding the scar tissue of burn survivors is transformed into a shared decision making (SDM) tool that will be available to burn survivors in a patient portal. Burn survivors often face difficult decisions regarding the treatment and management of (one of) their scars, for example whether or not to treat the scar by means of (multiple rounds of) reconstructive surgery. The aim of the SDM tool is to provide burn survivors who face such a decision with information that can aid them in making a decision that suits their personal context. This can be done by 1) providing information on burn patients with similar (clinical) characteristics to the burn survivor facing the decision (i.e. wound size, type of wound, etc.), and informing them of the treatment these similar patients received and the outcomes of those treatments (e.g. scar quality and quality of life). Subsequently, 2) this information can be personalized further by selecting the subset of these similar patients who also share patient-related factors with the burn survivor, such as quality-of-life expectations, personal values, etc. The basis of the shared decision making tool is an algorithm built on registry data from the three Dutch burn care facilities, supplemented where needed with data from clinical research studies; if needed, priors may be obtained from expert opinion and previous research. This data contains clinical and etiological attributes (such as total burn surface area (TBSA) at time of admission, type of burn, location, depth of burn, scar size, number of surgeries, whether the wound was infected with bacteria, etc.) as well as patient-reported outcomes (PROMs) on quality of life, scar quality, mental health, social role functioning, and physical function. The PROMs registry has started recently; the data gathered so far may carry too little power for the algorithm at the moment, but will grow over time. Ideally, machine learning is included in the final shared decision making tool: the choices and outcomes of the burn survivors using the tool are fed back into the algorithm to optimize quality and accuracy. Deliverables: Basic SDM tool: an algorithm for the SDM tool that can perform step 1 and step 2. The algorithm needs to be accessible through an SDM tool interface. The SDM interface is created by a third party together with the Dutch Burns Foundation in order to integrate it into the patient portal. Final SDM tool: an algorithm for the SDM tool performing step 1, step 2, and machine learning, likewise integrated into the patient portal by a third party and the Dutch Burns Foundation. Planning: This project is funded by the National Healthcare Institute for a period of two years, 2018-2020. By September 2020 the SDM tool, including the patient portal, needs to be implemented in all three Dutch burn care facilities. A first version of the SDM tool is anticipated by mid-2019.

Keywords : Bioinformatics, Machine Learning, Data Engineering, Data quality, modeling,

Track : Others, Computational Bio-Medicine, Complex Systems Theory

Contact 1 : Rick Quax r.quax@uva.nl

Valid Date : Dec. 31, 2019