‘Ohh… good luck, but I would never think about doing lab experiments during a secondment!’
This was one of my colleague’s “motivational” responses when I told him about my plans for my secondment at the University of Coimbra. Well, he was being pessimistic, but he was not completely wrong. Completing lab experiments during a secondment can be tricky, mainly because of the limited time and the unfamiliar lab environment, especially if you are new to the place and cannot speak a single word of the local language, which was exactly my situation. To make it even harder, the one and only lab technician in the water lab speaks only Portuguese. But thanks to Rita, most things were already prepared for my experiments even before I came to Coimbra, so my feelings in the first few days of my secondment here were a mix of excitement and nervousness.
It has now been almost two and a half months. I am behind my original schedule, but I am more familiar with the lab facilities and the experimental setup, so progress is faster now. I hope to finish all the remaining experiments in less than a month.
I have not yet mentioned the aim of my lab experiments. I am investigating sediment wash-off from an impervious road surface using artificial rainfall. The main aim is to study the effect of (a) rainfall intensity, (b) slope of the surface, and (c) initial load on sediment wash-off from the road surface.
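For context, a common way to describe this kind of process (not necessarily the model behind my own experiments) is the first-order exponential wash-off formulation used in tools such as SWMM, in which rainfall intensity and initial load appear explicitly and surface properties such as slope are lumped into a coefficient. A minimal sketch, with purely illustrative parameter values:

```python
import math

def washed_off_load(W0: float, k: float, i: float, t: float) -> float:
    """Cumulative sediment washed off after time t:
        W(t) = W0 * (1 - exp(-k * i * t))
    W0: initial load on the surface,
    k:  wash-off coefficient (lumps surface properties such as slope),
    i:  rainfall intensity, t: time since the rain started."""
    return W0 * (1.0 - math.exp(-k * i * t))

# A higher rainfall intensity removes the same initial load faster:
print(washed_off_load(100.0, 0.05, 20.0, 1.0))  # ~63.2 (units of W0)
print(washed_off_load(100.0, 0.05, 60.0, 1.0))  # ~95.0
```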
Here are some pictures from my lab experiments:
Finally, some results. But do not worry, they have nothing to do with sediment wash-off. Thanks to the health application on my mobile, I found a good relationship between two parameters even before starting to analyse my experimental data: the number of steps I walk per day correlates well with the number of experiments I carry out per day. So I finish with a health tip: do lab experiments and stay healthy! 😉
Once I was told: only if you can successfully explain something to your grandmother can you claim to truly understand it. I think this is a very easy way to discover your own lack of knowledge.
Not long ago I tried to explain the concept of return periods in engineering to a friend. In principle this is straightforward; we define it as the average time span in which an event of a certain magnitude is exceeded. This is a very useful concept in engineering. For example, let’s say we want to design a piece of river-hydraulic infrastructure. It is meant to work under certain design criteria. However, extreme events (e.g. high river flow) can threaten its integrity or its performance.
In order to achieve an optimal design (reliable and cost-effective), the common procedure is: first, do a risk analysis and define which probability of failure you are willing to assume. Then gather data on the threat variable x; by studying long time series you can define the recurrence period for each magnitude. This value T relates to the probability of exceeding a certain event per time unit: P(X ≥ x) = 1/T. Thus, we have it! Matching the total probability of failure with the assumed risk gives the magnitude of the design event.
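As a sketch of this matching step (the function name and numbers are mine, not from any design code): if you accept a probability of failure p over a lifetime of n years, the relation P = 1 − (1 − 1/T)^n can be inverted to give the design return period T:

```python
# Illustrative sketch only: turning an assumed acceptable risk into a
# design return period. Function name and numbers are hypothetical.

def required_return_period(p_fail: float, lifetime_years: int) -> float:
    """Return period T such that the probability of at least one
    exceedance in `lifetime_years` equals `p_fail`, using
    P = 1 - (1 - 1/T)**lifetime_years."""
    annual_p = 1.0 - (1.0 - p_fail) ** (1.0 / lifetime_years)
    return 1.0 / annual_p

# Accepting a 10% chance of failure over a 50-year design life:
print(round(required_return_period(0.10, 50)))  # 475
```

The 475-year result of this particular choice (10% over 50 years) is a value familiar from seismic design practice.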
A simple concept. Until questions come.
It happens that I live below sea level. This can be accomplished by becoming a fish, building a very complex submarine structure, or moving to Delft, The Netherlands. I chose the last option for obvious reasons: the food here is slightly better than in the submarine.
The Netherlands is a very particular place, and a must-see for any hydraulic engineer.
Dutch people have been fighting against flooding for centuries. Roughly 30% of the habitable land is directly below sea level and 60% of the country’s surface is flood-prone. After a series of storm surges with catastrophic impact in the post-WWII years, the Dutch government planned a massive infrastructure development programme for flood protection. Today, the country resists the rising sea with a series of dike rings and some very impressive engineering mega-structures (e.g. https://en.wikipedia.org/wiki/Maeslantkering).
A recurrent topic with friends and visitors is: how safe is it to live here? Nothing has happened in a few generations, but how safe is the flood protection system? Well, rainfall-driven flooding is relatively common in some places. But a breach in the dike system and a total sea intrusion (in Delft) is very, very unlikely.
How unlikely? Delft is located in the area known as the Randstad, the most populated territory of The Netherlands, comprising the cities of Amsterdam, The Hague, Rotterdam and, partially, Utrecht. During the flood defence planning, for political and technical reasons, a return period of T = 10,000 years was selected for this area. The probability that during a 4-year PhD project you suffer such an undesirable event (kayaking in my living room) can then be modelled with a Poisson distribution: P(at least 1) = 1 − P(k = 0; n = 4, p = 1/T) = 0.00039992, or about a 0.04% probability, in theory. I would say very low.
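This back-of-envelope number can be reproduced in a couple of lines, here shown both with the Poisson approximation used above and with the exact binomial result, which is almost identical:

```python
import math

# Probability of at least one T=10,000-year event during a 4-year PhD.
T, n = 10_000, 4

# Poisson approximation with rate n/T:
p_poisson = 1.0 - math.exp(-n / T)

# Exact binomial with annual exceedance probability 1/T:
p_binomial = 1.0 - (1.0 - 1.0 / T) ** n

print(f"{p_poisson:.8f}")   # 0.00039992
print(f"{p_binomial:.8f}")  # 0.00039994
```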
But then someone asked: how do you calculate a 10,000-year return period event? Well, by observing data. Which data? And here is where things get complicated. Systematic monitoring data only started to be widely compiled 40-70 years ago. For most variables of interest in hydrology (e.g. rainfall, river flow), you should not expect longer continuous series than that. Still, return periods for critical infrastructure tend to be of the order of 1,000-10,000 years. So how do you do it?
The return period is usually estimated by fitting a probabilistic model. The time series used can still suffer from many additional issues, such as non-stationarity in time, censored data, etc. There are many techniques and a lot of literature addressing those problems. But the question is still in the air… How can you extract statistically sound information for a 10,000-year return period from only 40 years of data?
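As one simple illustration of this fitting-and-extrapolating step (a deliberately basic method-of-moments Gumbel fit on synthetic data, not the actual methodology behind the Dutch standards):

```python
import math
import random

# Sketch: fit a Gumbel distribution to a short synthetic "annual
# maximum" series, then extrapolate far beyond the record length.
random.seed(42)
true_mu, true_beta = 100.0, 20.0  # illustrative "true" parameters
# 40 years of synthetic annual maxima, sampled by inverting the
# Gumbel CDF: x = mu - beta * ln(-ln(u)), u ~ Uniform(0, 1):
series = [true_mu - true_beta * math.log(-math.log(random.random()))
          for _ in range(40)]

n = len(series)
mean = sum(series) / n
std = math.sqrt(sum((x - mean) ** 2 for x in series) / (n - 1))

# Method-of-moments estimates of the Gumbel parameters:
beta = std * math.sqrt(6) / math.pi
mu = mean - 0.5772 * beta  # 0.5772 ~ Euler-Mascheroni constant

def quantile(T: float) -> float:
    """Magnitude with annual exceedance probability 1/T."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

print(quantile(100))     # 100-year design magnitude
print(quantile(10_000))  # extrapolated far beyond the 40-year record
```

The point of the sketch is the last line: the 10,000-year magnitude is pure extrapolation from 40 data points, which is exactly why the alternative data sources discussed below matter.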
This constitutes a few scientific fields by itself, but here are some possibilities:
Reconstruction of past events through simulators (e.g. modelling extreme flooding).
Compilation of non-systematic data. Continuous monitoring time series are 40-70 years long; however, extra information is available on singular events. This is especially the case for floods in Western Europe, where chronological city records can be traced back to 1000 AD or earlier.
Study of geomorphological sedimentary strata and physical marks of past extreme events. This is known as palaeohydrology.
All this information does not produce exact, reliable values, but it can be used to bound and refine large return period estimates. With the use of alternative data we are capable of stretching our short time series and refining extreme event predictions. This is quite critical since, as you have seen, our whole design depends directly on this estimate.
If you want to know more about the Dutch sea defence system,  is a good summary. And if you want to read more about the use of non-systematic data in return period estimates, Benito et al. (2004) below has a few good examples.
I hope you learned something new, as I did after trying to explain this simple concept.
Benito, G., Lang, M., Barriendos, M., Llasat, M. C., Francés, F., Ouarda, T., … & Bobée, B. (2004). Use of systematic, palaeoflood and historical data for the improvement of flood risk estimation: Review of scientific methods. Natural Hazards, 31(3), 623-643.
I am glad to tell you that I had some relaxing holidays during August in the Azores. For those of you unfamiliar with this little place, the Azores are mostly known for their natural landscape, moderate climate and relaxed inhabitants. According to a ranking by the Dutch National Geographic, the Azores were considered the top place to visit in 2016 (http://www.natgeotraveler.nl/galerij/20-ultieme-bestemmingen-voor-2016/de-azoren).
Anyway, on our way back there were news reports warning that the Azores would be affected by Hurricane Gaston. Luckily for me and my family, it was going to hit on Saturday, two days after we were due to fly back to Germany (by the way, I am originally from the Azores). Regarding natural disasters, the Azores are often associated with earthquakes. The truth is that they have also been hit in the past by a few extreme weather events (see for instance Hurricanes Gordon, Helene or Alex).
Well, given that QUICS focuses on the topic of uncertainty, it seems appropriate to show you some of the forecasts issued by the US National Hurricane Center for the recent Hurricane Gaston. They show some of the practical aspects of communicating uncertainty to non-experts. Weather forecasting, as we all know, is highly uncertain, and hurricane forecasting is no exception. I have chosen two examples: the first displays uncertainty in the path, and the second uncertainty in the intensity.
Figure 1: Coastal Watches/Warnings and 5-Day Track Forecast Cone, Hurricane GASTON Advisory #035, 5:00 AM EDT Wed August 31, 2016 (See this NHC page for current map)
Figure 1 shows the 5-day track forecast issued on the 31st of August superimposed with the one issued on the 2nd of September. The black line shows the forecast track on the 31st, and the large blue area is the cone of uncertainty of the future track on the 31st. The smaller blue area is the cone of uncertainty on the 2nd. The cone of uncertainty is based on historical data: 60 to 70% of all historical data indicate that the storm centre will remain within the cone for the days following the forecast.
Figure 2: Tropical Storm Force Wind Speed Probabilities, for 5 days from 2nd of September.
The second example deals with wind intensity. Indeed, this is the main reason why hurricane forecasts are issued (see Figure 2 or a current graphic on the NHC site). Winds can reach destructive speeds and are responsible for increasing wave heights considerably (e.g. during Hurricane Alex, waves up to 14 metres were recorded on the high seas). Figure 2 shows the cumulative probability that wind speeds of at least 39 mph will occur during a 120-hour period. Maps for shorter time periods can be seen on the NHC site (click the different durations or the loop in the top right); each graphic provides the cumulative probability that wind speeds of at least 39 mph will occur during the cumulative time period at each specific point on the map. Unlike the previous map, here it is possible to compare how intensities are expected to evolve during the five-day forecast.
As I mentioned before, Azores inhabitants tend to be quite relaxed, and indeed this time it was not too bad. The hurricane weakened to a tropical storm and did not cause too much damage. A happy ending!
Since I started by telling you about our holidays in the Azores, it is fitting to finish with a picture we took. To conclude, let me just add that the Azores are also populated with plenty of black and white cows (while some will say this last sentence is useless, others may find it useful for understanding the bigger picture).
All the best!
Jorge Leandro (TU Munich and University of Coimbra)
Last week, we successfully organised the 8th edition of the Sewer Processes and Networks conference in Rotterdam, the Netherlands, on behalf of the IWA working group on Sewer Systems and Processes (SS&P WG). The conference took place on board the SS Rotterdam, an excellent place to spend a few hours on the sun deck at the end of summer in Rotterdam.
Two of the QUICS promovendi (Antonio Moreno Rodenas and Ambuj Sriwastava), one QUICS post-doc (Mathieu Lepot) and a number of QUICS supervisors (Simon Tait, Francois Clemens and Jeroen Langeveld) witnessed a conference with a lot of presentations about the real stuff: experimental work on sewer processes, either in labs or in the real world. After a decade in which modelling studies were seen as the most convenient way to quickly publish the required number of journal papers to obtain a PhD, the tide now seems to be turning towards obtaining real (new) knowledge on the most interesting part of integrated catchments: the sewer. Even though admitting that you study sewer system dynamics may not be your best bet on a first date, there are a lot of very interesting research topics related to sewer processes, such as: climate change, CFD modelling, discharge patterns of pharmaceuticals, impacts of H2S on the brains of sewer workers, spreading of antibiotic resistance via bacteria in sewer biofilms and sediments, monitoring and inspection techniques, asset management strategies, distributed temperature sensing, ground penetrating radar, aeration and H2S release at drop structures, methane formation, retrofitting urban areas with blue-green structures, gully pot cleaning and, yes, also integrated approaches and the relevance of uncertainties in rain radar images for integrated catchment studies (thanks Antonio!).
In addition to the conference, we organised a pre-conference workshop on ‘lessons from failed research’. All experienced researchers know that many research projects fail, at least partly. However, it is very hard to find well-documented results of failed research or negative results in the literature. This is due to the ‘positive research bias’ of journals, which all want to publish positive results and implicitly train their reviewers and editors to reject less favourable outcomes. The workshop showed that there is a lot to learn from failed research, and Richard Ashley and Francois Clemens have now taken the initiative to compose a review paper discussing how we can benefit from lessons from failed research. Those of you interested in sharing knowledge on this issue, please contact Richard via email@example.com.
Two things should be said upfront about the name “Happymen”: i) it is a French movement, and ii) this group aims at proposing actions and keys to reduce or suppress inequality between women and men, in both professional and personal environments. So “Be a HappyMan” also means “Be a HappyWoman”.
You should have a look at the link (only in French, sorry), since the QUICS project is a Marie Curie ITN that pays special attention to fair research and hiring processes. Some nice tips about work and equality between men and women can be found there. For non-French readers, I will try to summarise the main objectives of this group, which I recently discovered.
The main goal of this group is to ensure equality between women and men, while sharing tasks and responsibilities at and outside work. As a starting point, the group states that better working conditions are mandatory to reach this goal. In the group’s view, the “hard-working man”, i.e. one who starts early, finishes late and competes with his colleagues in order to quickly get good promotions within a company, is no longer a suitable model of an employee. By abandoning this culture, men would have more time available to take care of the family (domestic tasks, child education), i.e. to equally share the responsibilities at home. The name of the movement comes from here: men become “Happymen” by sharing more personal moments at home and seeing their children grow. As a consequence, and for the same amount of work at home, women would have more flexibility and time to work, and would then deserve the same salary, promotions and career as men. In one sentence: equality between women and men starts with a lower workload for men and ultimately ends with equal treatment of women and men at work.
Even if I’m not that old, over the last decade I have seen more and more pressure in the academic world to produce, produce, produce… For some stupid reasons, managers require indicators to rank researchers, universities, journals, etc., in order to assess the quality of the research that each of us does. And the most enraging part: most of the people I have met over those 10 years agree that this is crazy stupidity, but most of us still follow those rules.
Publish or perish. “Researchers, I paid you and I want to read your papers, and a lot of them (more and more each year).” And you, author, how many good papers are you able to get published every year? 1, 2, 5, 10, more? The same question for papers on which you deserve to be a co-author… Wait a minute before answering: I said “good”. I mean a paper you are proud of because it contains a new idea you got while solving a real-life problem, after reading the literature, and where the idea has been tested, validated and its robustness demonstrated with experiments. So, how many? Without MSc students, I am not sure I could write one per year. It is far more complicated for co-authorship, since a single contribution (during a short discussion) to a study might help a lot. PhD candidates are almost never expected to publish more than a paper per year, but if you want to stay in the academic world, you will have to publish more after your graduation. So, what to do? Work more, to publish more while keeping a “good” quality? Decrease the quality of your publications to publish more without working more? Settle deals with colleagues to be a co-author on their papers and vice versa? Ethical questions and problems are slowly but surely coming. More and more papers are available, with a supposedly decreasing quality, and we have less and less time available to read and stay up to date with the state of the art. And how many papers can you read per week (to progress in your topic)?
How can you deal with such rules, which require more and more of your time, always more than your employment contract states, often more than is socially acceptable? What is the maximum number of working hours (per week) you can accept on a regular basis? Can you be more productive if you work 60 hours a week? I do not have an answer to these questions, even for myself. But I think everyone should ask her/himself these questions and try to do the best they can while keeping a good balance between professional and personal life.
Even more than publications (and I have not discussed co-authorship regulation), some other topics (different from the ones dealt with by the Happymen group) require attention: the good (efficient) use of funding, collaborations between universities and companies, the creation of companies based on publicly funded research, the selection of conferences to attend, etc. The list of issues requiring deep reflection on ethics is endless.
Therefore, I invite you to think about all of these questions… When they pop up, it is often too late to answer in a way that is consistent with your ethics if you have not thought about them in advance.
From 16-22 May 2016 in Venice, Italy, I attended a gathering of experts and practitioners interested and active in disaster risk, organised by UR (Understanding Risk). UR is an excellent platform for collaboration, knowledge sharing and innovation in identifying and assessing disaster risk, mainly supported by the World Bank. The event was very inspiring and engaging (and free!). The five days were filled with over 50 sessions, including training events, workshops, technical sessions and plenaries. It was divided into the main conference and Focus Days; the latter allowed organisations to run their own workshops, stakeholder meetings, training sessions or other activities. The agenda for this year’s Focus Days is here: https://understandrisk.org/ur2016-focus-days/
From the list, you can see there was a diverse range of topics to choose from. I attended a session on ‘Learning InaSAFE through open data for resilience serious game!’. The presenters guided us through hands-on experience with Geonode, QGIS and InaSAFE (open-source systems useful for disaster risk assessment and management), and we did an exercise in serious games for decision making under uncertainty (my first experience of serious games). It was thoroughly enjoyable and informative.
The main conference’s agenda is here: https://understandrisk.org/ur2016-program/
I enjoyed the ‘Ignite’ session, which included 5-minute lightning presentations by technical session leads so that the audience could quickly get the gist of the forthcoming sessions in the rest of the conference. In the evening, a 5×15 session provided a platform for 5 outstanding individuals, each with 15 minutes to tell stories about ‘risk’. It was exciting to listen to all of them, but I particularly enjoyed the talk by Prof Marcus du Sautoy, Charles Simonyi Professor for the Public Understanding of Science and Professor of Mathematics at Oxford University. He spoke about his latest book, What We Cannot Know: Explorations at the Edge of Knowledge, and talked about uncertainty and chaotic systems, illustrated by a nice pendulum gadget.
A session on ‘Challenges in developing multi-hazard risk models from local to global scale’ attracted my attention. It is clear that there is an urgent need for open, transparent and credible multi-hazard risk assessment methods, models and tools. The community should put more effort into aligning risk assessment methodologies across the different hazards, whose risk metrics are currently incomparable. In addition, interactions between multiple hazards are largely neglected, resulting in some cases in strongly underestimated risk. I would like to explore this topic further in the future.
Such an exciting event is organised by the Understanding Risk Forum once every two years, and I am keen to attend the next one. If you would like to know more about UR and its events, it is useful to subscribe to its newsletter and join its LinkedIn community at https://understandrisk.org/about/
Let us play a game. You give me 10 euros. We toss a coin. If it is heads, I return you 30 euros and you win 20 euros. If it is tails, you lose and I keep your 10 euros. An excellent deal, is it not? From a statistical point of view it definitely is, because the expected net return is positive: 0.5 × 20 − 0.5 × 10 = 5 euros. I am sure you will be happy to take on this bet. Now let us raise the stakes. You give me 100 euros; we toss a coin; if it is heads you get 300 euros back, if it is tails you lose your 100 euros. Will you still play? And what if we raise the bet to 1,000 or 10,000 euros? If your name is Bill Gates you might still play, but at some point (10 billion euros?) even Bill will have to stop because he simply cannot afford the risk anymore. At that point the consequences of losing become too serious and are no longer outweighed by the benefits of winning. The option with less uncertainty (don’t play!) is preferred, even though it has a lower expected net return. This tells us that people tend to be risk-averse (https://en.wikipedia.org/wiki/Risk_aversion).
The above illustrates how important it is to quantify uncertainties and probabilities, because risk calculations can only be done if the probabilities are known. For instance, we all know that it would be wrong to design an urban drainage system on just the average rainfall. Instead, the system design should be based on the full rainfall probability distribution, because we want to make sure that the probability of a sewer overflow is small. We must be prepared for extreme events because the financial and environmental costs of an overflow are simply too high. One of the main aims of the QUICS project (https://www.sheffield.ac.uk/quics) is to quantify the uncertainties associated with urban hydrologic models and their inputs, and propagate these to model outputs. It provides crucial information for risk-averse decision making.
Whenever I suggest the coin toss game to my students (only hypothetically, of course) they usually go along, but not for very long, because they have a tight budget. I have never really been offered the game myself, but being a mathematical engineer and reasoning rationally, I think I would go as far as 2,000 or perhaps even 5,000 euros. However, I recently learnt that if it were for real, I probably would not. A few weeks ago I was at a Risk Analysis meeting of the International Union of Forest Research Organizations (http://riskanalysis-iufro.org//meetinginformation.html). Experts in risk perception present at the meeting explained to me that the human brain functions such that the ‘pain’ caused by a unit loss is felt 2.5 times more strongly than the ‘joy’ felt from a unit gain. If this were true, nobody would enter the coin toss game. Whether this also implies that nobody would ever buy a lottery ticket, I do not know. I do know that many people do buy such tickets, even though they are aware that the expected net return is negative. Intriguing, is it not?
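The two calculations in this story can be put side by side in a few lines (the 2.5 weight is the figure quoted by the risk-perception experts; the function names are mine):

```python
# Sketch of the coin toss game: win 2x the stake on heads, lose the
# stake on tails. Function names are illustrative.

def expected_return(stake: float, p_win: float = 0.5) -> float:
    """Plain expected net return of the game."""
    return p_win * 2 * stake - (1 - p_win) * stake

def loss_averse_value(stake: float, loss_weight: float = 2.5,
                      p_win: float = 0.5) -> float:
    """Same game, but a unit loss 'hurts' loss_weight times as much
    as a unit gain 'pleases'."""
    return p_win * 2 * stake - (1 - p_win) * loss_weight * stake

print(expected_return(10))    # 5.0  -> statistically a good deal
print(loss_averse_value(10))  # -2.5 -> felt as a bad deal
```

With the 2.5 weight, the perceived value is negative for any stake, which is consistent with nobody entering the game no matter how small the bet.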
We can also play the coin toss game in a slightly different way: you decide how much money you put at stake (be it 0, 1, 5, 200 or 10,000 euros). We play the game with your bet of K euros. You have a 50% chance of losing the K euros and a 50% chance of winning 2K euros. How large is your K?