On Wednesday I watched a webinar given by the American epidemiologist Marc Lipsitch. It is part of a seminar series on COVID-19 for physicists. It was an interesting and sobering watch. I was struck by a comment Prof Lipsitch made in answer to one of the questions at the end of the webinar.
Evidence for the effectiveness of mask wearing

In August, 27 customers of a branch of Starbucks in South Korea were infected by one person. But none of the mask-wearing (it’s company policy*) employees were infected. One event does not provide a whole lot of data, but it does allow us to take a stab at estimating how effective masks are. The plot above is an attempt at that.
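To get a feel for how a single event like this constrains things, here is a rough sketch of the kind of back-of-the-envelope calculation involved. This is not the calculation behind the plot: the number of exposed employees and the no-mask infection probability below are made-up numbers, purely for illustration.

```python
import numpy as np
from scipy.stats import binom

# Illustrative assumptions only: suppose n_staff mask-wearing employees were
# exposed, and that without a mask each would have been infected with
# probability p_unmasked (both numbers are invented for this sketch).
n_staff = 5
p_unmasked = 0.4

# How likely is it that zero staff are infected, as a function of how much
# a mask cuts the risk of infection?
for eff in np.linspace(0.0, 1.0, 5):
    p_zero = binom.pmf(0, n_staff, p_unmasked * (1 - eff))
    print(f"mask cuts risk by {eff:.0%}: P(no staff infected) = {p_zero:.2f}")
```

The point is simply that seeing zero infections among the masked staff is much more probable if masks cut the risk substantially, which is, roughly speaking, the comparison that lets you take a stab at mask effectiveness.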
Masks and superspreaders
What is sometimes called a 90/10 or 80/20 rule, or the Pareto principle, crops up a lot in life. If you are doing research, 90% of the best results come from 10% of your effort (sadly it is typically not possible to work out in advance which 10%!). It also seems to apply to the COVID-19 pandemic: 80% of the infections come from 20% of the infected carriers. As I talked about in the previous blog post, there are superspreading events, in which one infected person can infect tens of other people in one day. At the other end of the spectrum, many infected people do not pass the virus on to even one other person. With colleagues I am working on understanding how masks work, which leads to the question: can mask wearing reduce the number of these superspreading events?
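A quick way to see how an 80/20 split and superspreading go together is to draw the number of secondary cases from an overdispersed (negative binomial) distribution, which is the standard way such transmission is modelled. This is only a toy sketch, not our research code: the mean number of secondary cases R and the dispersion k below are illustrative guesses, not fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters only: mean number of secondary cases R, and
# dispersion k (small k means highly overdispersed transmission).
R, k = 2.5, 0.1
n_cases = 100_000

# Negative binomial with mean R and dispersion k:
# numpy parametrises it as (n, p) with n = k and p = k / (k + R).
secondary = rng.negative_binomial(k, k / (k + R), size=n_cases)

# What fraction of all transmission comes from the top 20% of spreaders?
sorted_cases = np.sort(secondary)[::-1]
top20_share = sorted_cases[: n_cases // 5].sum() / sorted_cases.sum()
print(f"share of infections from the top 20% of cases: {top20_share:.0%}")
print(f"fraction who infect nobody at all: {(secondary == 0).mean():.0%}")
```

With a small k most simulated cases infect nobody, while a small minority account for the bulk of transmission, which is exactly the superspreading picture.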
No such thing as an average infected person
I am very struck by this quote from a paper measuring the concentration of corona virus (aka SARS-CoV-2) in swabs taken from infected people:
Initial SARS-CoV-2 viral load is widely distributed ranging from 3 to 10 log copies/ml …
Jacot et al, medRxiv 2020
Note the log in the quoted sentence: the range is not from 3 to 10 (about a factor of 3), it is from 10^3 to 10^10 viruses per millilitre, a range where the top end is ten million times the bottom end. In other words, some people at some times during their COVID-19 infection have ten million times as much virus as others do. On a log scale the average is 10^6.5, roughly 3 million viruses per millilitre, but some infected people have thousands of times more, while others have thousands of times less.
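Spelling out the arithmetic from the quoted range (nothing here beyond the numbers in the quote itself):

```python
import numpy as np

low, high = 1e3, 1e10   # copies per millilitre, from the quoted range
print(f"top of range / bottom of range: {high / low:.0e}")        # 1e+07, i.e. ten million
log_mid = (np.log10(low) + np.log10(high)) / 2                     # midpoint on a log scale
print(f"log-scale midpoint: 10^{log_mid} ~ {10**log_mid:.1e} copies/ml")  # about 3 million
```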
Minimal model of corona virus exposure
Transmission of the corona virus (aka SARS-CoV-2) is very complex, which is basically why it is so poorly understood. But in true theoretical-physicist style, a minimal model has been developed by a guy called Roland Netz (a theoretical physicist in Berlin). It makes a lot of assumptions, and it is clear that there is a lot of variability, between one infected individual and another and between one situation and another, so its predictions should be taken with a large pinch of salt. But in this post I will outline this minimal model.
Google Colab means that everyone can have a league table in which their University is top
This blog post combines and builds on two earlier posts: one where I looked at using Google Colab to host Jupyter notebooks for my autumn teaching, and one where I messed around with a Jupyter notebook that can generate a university league table with almost any university at the top. I have tidied up the league-table-generating Jupyter notebook, and you should now be able to see it on Google Colab here; the spreadsheet of university data it needs is available here.
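The trick the notebook relies on is nothing deeper than choosing the metric weights after looking at the data. As a minimal sketch of the idea (with made-up metric names and scores standing in for the real spreadsheet), something like this is all it takes:

```python
import numpy as np
import pandas as pd
from itertools import product

# Made-up scores, standing in for the real spreadsheet columns.
df = pd.DataFrame(
    {"teaching": [80, 70, 95],
     "research": [60, 90, 75],
     "outcomes": [85, 75, 65]},
    index=["University A", "University B", "University C"],
)

# Try every weighting of the three metrics on a crude 0-10 grid, and record
# which university comes top under each weighting.
winners = {}
for w in product(range(11), repeat=3):
    if sum(w) == 0:
        continue
    score = df @ np.array(w)          # weighted sum of the metric columns
    winners.setdefault(score.idxmax(), []).append(w)

for uni, ws in winners.items():
    print(f"{uni} comes top under {len(ws)} of the weightings tried")
```

With enough metrics to play with, almost any university can be made to come out on top under some choice of weights, which is all the notebook really does.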
A-level grades, degree classifications and calling bullshit
Today I am reading both Calling Bullshit by Jevin West and Carl Bergstrom, and about a “growing crisis” over Scottish Higher results; presumably a similar crisis will happen with A levels when those results are released in a few days. I have got to the bit in Calling Bullshit where West and Bergstrom talk about bullshitting via statements that superficially look rigorous but in reality are pretty flaky. In this blog post I want to suggest, possibly controversially, that the distinctions at the root of the growing crisis in Scotland, between a grade A and a B in a Scottish Higher*, or a B and a C, etc., have a slight whiff of bullshit about them.
No more than the weight of a cherry
Colleagues at the University of Bristol and I are working on trying to understand how masks work. One fundamental aspect of this is that a mask, like any filter, involves a trade-off. A mask must be as permeable as possible to air, but as impermeable as possible to virus-containing droplets. Air must flow through a mask as freely as possible, but droplets should find the mask as close to impenetrable as possible. The problem is that these two design constraints directly contradict each other, and so any mask, indeed any filter, is a compromise.
Struggles with hosting Jupyter notebooks for teaching
Last year I changed my second-year computing teaching over to Jupyter Python notebooks. I think Jupyter notebooks worked well at teaching how to do useful stuff like analyse data. The notebooks were hosted on the Microsoft service Azure, which wasn’t great but basically worked. However, not only is Azure far from perfect, it is also being binned around about week 2 of semester.
Emergence in biology
The beautiful behaviour of this flock of starlings is an example of a class of phenomena variously known as emergent, collective or more-is-different behaviour. The point is that a single starling, or two starlings, cannot show this striking phenomenon; you need hundreds or thousands of starlings to see it. A liquid is a less obviously exciting example of an emergent phenomenon. One or two water molecules aren’t a liquid; you need at least about a hundred to make even a tiny water droplet.
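For anyone who wants to play with this, a toy flocking simulation (in the spirit of the classic “boids” rules) shows the same more-is-different flavour: each simulated bird follows a few simple local rules, and ordered collective motion only appears once there are many of them. This is just a sketch with arbitrary, untuned parameters, not a model of real starlings.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy flock: N "starlings" in 2D, each steering towards the average position
# and velocity of its neighbours, and away from any that get too close.
N, steps, dt = 200, 500, 0.1
radius, too_close = 1.0, 0.2          # arbitrary interaction ranges
pos = rng.uniform(0.0, 10.0, (N, 2))
vel = rng.normal(0.0, 1.0, (N, 2))

for _ in range(steps):
    diff = pos[None, :, :] - pos[:, None, :]       # displacement from bird i to bird j
    dist = np.linalg.norm(diff, axis=-1)
    neigh = (dist < radius) & (dist > 0)
    close = (dist < too_close) & (dist > 0)

    for i in range(N):
        if neigh[i].any():
            cohesion = diff[i][neigh[i]].mean(axis=0)           # move towards neighbours
            alignment = vel[neigh[i]].mean(axis=0) - vel[i]     # match their velocity
            separation = -diff[i][close[i]].sum(axis=0) if close[i].any() else 0.0
            vel[i] += 0.05 * cohesion + 0.05 * alignment + 0.5 * separation

    # keep speeds bounded and move everyone forward
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    np.clip(speed, 1e-9, None, out=speed)
    vel = np.where(speed > 3.0, vel * 3.0 / speed, vel)
    pos += vel * dt

# A crude order parameter: how aligned are the velocities across the flock?
polarisation = np.linalg.norm(vel.mean(axis=0)) / np.linalg.norm(vel, axis=1).mean()
print(f"flock polarisation (0 = random, 1 = all aligned): {polarisation:.2f}")
```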