Ig Nobel: The whimsy and the magic of science

Author: Maya Lopez (Co-President)

When the 2025 Nobel Prizes were announced last month, Cambridge's science enthusiasts and news junkies alike were buzzing with excitement, discussing the laureates, dissecting the research, and tallying college wins. However, I noticed far less talk a month earlier around the Ig Nobels. Maybe because no Cambridge members were awarded this year? Or perhaps because it's not serious enough?? … Whatever the reason, today we will take a break from all the rigidity of science and the recent serious concerns around politics contesting science. Instead, let's take a look at the whimsical research that is also… seriously science. As Nature once put it, "The Ig Nobel awards are arguably the highlight of the scientific calendar".

Are Ig Nobel Prizes a real award?

This is one of the top Google searches with the keywords "Ig Nobel prize". The answer? YES*. It is a very real award, ceremony and all, that has now been going on for 35 years. But the "*" was not a typo: it is also, yes, a parody of the all-too-famous Nobel Prize, which probably needs no explanation of its own (hence the name and the pun on "ignoble"). For those of you who are unfamiliar, the Ig Nobels have been awarded annually since 1991 by an organization called Improbable Research, with the motto: "research that makes people LAUGH, then THINK". This organization also publishes a "scientific humor magazine" (who knew that was a thing?) called the Annals of Improbable Research (AIR), so it can, in a sense, be seen as a specialist in promoting public engagement with scientific research through fun. The Ig Nobel Prizes are often presented by Nobel laureates in a ceremony held at MIT or other universities in the Boston area. Much like the "real" Nobel Prizes, there are different award disciplines: physics, chemistry, physiology/medicine, literature, economics, and peace, plus a few extra categories such as public health, engineering, biology, and interdisciplinary research. (The award categories do vary from year to year, though.) The winners are awarded a banknote worth 10 trillion Zimbabwean dollars (a currency no longer in use; roughly worth US$0.40), so it's not really about the monetary value. They also get the opportunity to give a public lecture upon receiving the award, but researchers do face the risk of being interrupted by an 8-year-old girl (or, in the case of 2025, a researcher dressed up as one) crying "Please stop: I'm bored" if the lecture dares go on for too long. The ceremony, as you can imagine by now, has a number of running jokes, and if you are interested, you can watch the whole 2025 ceremony on YouTube.

Bringing science into everyday curiosity:

So it’s a parody, yes, but the award does exist and is given to actual researchers. The quickest way to get a sense of the Ig Nobel might be to simply browse the list of research that was awarded prizes. This year, we’ve got:

  • Aviation: studying whether ingesting alcohol can impair bats' ability to fly and also their ability to echolocate (doi.org/10.1016/j.beproc.2010.02.006)
  • Biology: experiments to learn whether cows painted with zebra-like striping can avoid being bitten by flies (doi.org/10.1371/journal.pone.0223447)
  • Chemistry: experiments to test whether eating Teflon is a good way to increase food volume and hence satiety without increasing calorie content (doi.org/10.1177%2F1932296815626726; patents.google.com/patent/US9924736B2/en)
  • Engineering design: analyzing, from an engineering design perspective, how foul-smelling shoes affect the good experience of using a shoe-rack (doi.org/10.1007/978-981-16-2229-8_33)
  • Literature: persistently recording and analyzing the rate of growth of one of his fingernails over a period of 35 years (doi.org/10.1038/jid.1953.5; pmc.ncbi.nlm.nih.gov/articles/PMC2249062; doi.org/10.1001/archinte.1968.00300090069016; doi.org/10.1001/archinte.1974.00320210107015; doi.org/10.1111/j.1365-4362.1976.tb00696.x; doi.org/10.1001/archinte.1980.00330130075019)
  • Nutrition: studying the extent to which a certain kind of lizard chooses to eat certain kinds of pizza (doi.org/10.1111/aje.13100)
  • Peace: showing that drinking alcohol sometimes improves a person's ability to speak in a foreign language (doi.org/10.1177/0269881117735687)
  • Pediatrics: studying what a nursing baby experiences when the baby's mother eats garlic (pubmed.ncbi.nlm.nih.gov/1896276)
  • Physics: discoveries about the physics of pasta sauce, especially the phase transition that can lead to clumping, which can be a cause of unpleasantness (doi.org/10.1063/5.0255841)
  • Psychology: investigating what happens when you tell narcissists – or anyone else – that they are intelligent (doi.org/10.1016/j.intell.2021.101595)

I think the goal of "laugh, then think" is clearly met by all of this research. But speaking of thinking, some of these research topics made me wonder (and maybe they made you wonder too): "Why would you investigate that?" (What adult would?) or "Is this real, funded/published research?". What I want to highlight (and what may not be clear from the brief list on the Wikipedia page) is that they all have proper references attached to them. So yes, though their published titles might sound a bit more academic or "stuffy" (often not by much), they are actual peer-reviewed papers!

Are you ridiculing science?

This question on the official FAQ page caught my attention because I, as an Ig Nobel enthusiast, hadn't imagined any serious criticism of these awards. Digging a bit deeper, I found that decades ago, the UK's then-chief scientific adviser, Sir Robert May, made a formal request that "no British scientists (should) be considered for an IgNobel, for fear of harming their career prospects". (Note that the UK, alongside Japan and the USA (no wonder I'm acquainted with this prize), is a regular at these awards as a nation, winning nearly every year.) Furthermore, the article reads, "He was particularly concerned when ground-breaking research into the reasons why breakfast cereal becomes soggy (by the University of East Anglia) won a prize," essentially hinting at a concern about the public ridiculing science (as a whole?). If you think about it, such a general attitude of "it's not worth scientific investigation unless it's clearly applicable/translatable/important" is perhaps far too typical, especially towards the basic sciences.

However, I think the founder of the prize, Marc Abrahams, had the best defence of the practice of "rewarding silly science".

“Most of the great technological and scientific breakthroughs were laughed at when they first appeared. People laughed at someone staring at the mould on a piece of bread, but without that there would be no antibiotics… A lot of people are frightened of science or think it is evil, because they had a teacher when they were 12 years old who put them off. If we can get people curious and make them laugh, maybe they will pick up a book one day. We really want more people involved in science and I think the webcast will help do that.”

Slightly on a tangent, but "maths anxiety" is a recognized experience that many develop during childhood. While no research may exist on this (yet), I suspect a similar phenomenon with STEM at large. Sometimes I get comments from students taking humanities subjects (even in Cambridge!) like "wow, you're doing a real/serious degree" or "science sounds so difficult". For some people, "being put off" by science might trace back to a negative experience during their first formal introduction to science as a subject in school. In that case, bringing their interest back to science with an all-serious demeanor and stuffy topics might be quite a high barrier to cross. However, looking at the Ig award list from earlier, and how quickly the entries make you go "huh" after the laugh, I can't help but think that these funny, curious studies might be just the push needed to ignite people's curiosity and welcome them back to scientific inquiry without any pressure.

The satire (and controversy?) of the Ig Nobel

That being said, not all Ig Nobel Prizes have been awarded to quirky "research that cannot (or should not) be reproduced". The prize has also sometimes been awarded as satire. In the recent case of 2020, the Ig Nobel Prize for Medical Education was awarded to Jair Bolsonaro of Brazil, Boris Johnson of the United Kingdom, Narendra Modi of India, Andrés Manuel López Obrador of Mexico, Alexander Lukashenko of Belarus, Donald Trump of the USA, Recep Tayyip Erdogan of Turkey, Vladimir Putin of Russia, and Gurbanguly Berdimuhamedow of Turkmenistan. Now, before you start typing away your complaints and protests (or throwing paper airplanes), hear why: they were awarded for "using the Covid-19 viral pandemic to teach the world that politicians can have a more immediate effect on life and death than scientists and doctors can". I'd say that makes you think quite a bit, especially as a person in the scientific community.

If you consider these instances in isolation, perhaps there is some point to what the former chief scientific adviser was saying, and a serious researcher might not want to be associated with this prize (kinda like the Golden Raspberry Awards, I guess?). However, this was apparently not a popular opinion, at least in the UK scientific community, which pushed back against the comment at the time. To this day, we get UK awardees at the Ig Nobel Prizes.

Legacy beyond the funny and curious:

Parody and satire, yes, but in case you think this is still a long post making much ado about nothing, as it's all still in the realm of a joke, I want to present one final case of these jokes leading to "actual" science (not that they weren't real science to begin with, but…). Take Andre Geim, for instance, who shared the 2000 Ig Nobel in Physics with Michael Berry for levitating a frog – yes, a real frog – using magnets. Ten years later, he went on to win the actual Nobel Prize in Physics for his groundbreaking research on graphene. This by itself may sound like a lucky coincidence, but it is also worth mentioning that in 2022, this frog experiment was reported to be the inspiration (at least partially) behind China's lunar gravity research facility.

These are not the only examples where such "silly research" ended up having real-world impact and use. In 2006, the Ig Nobel Prize in Biology was awarded to a study showing that a species of malaria-carrying mosquito (Anopheles gambiae) is attracted equally to the smell of Limburger cheese and to human foot odor. The initial study was published in 1996, and its results suggested the strategic placement of traps baited with Limburger cheese to combat malaria epidemics in Africa. While such applications might not be immediate, I think what allows for this translation (aside from the study being oddly specific) is partly the cost-effectiveness. The more typical "scientific" solutions one might envision for disease control involve genomics, vaccines, or pharmaceuticals. While those are all state-of-the-art and highly effective (and certainly have sci-fi appeal), the cost, in both money and time, can be steep. Compared to that… cheese? I'm guessing it's more budget-friendly and easier to implement. This research, as well as this year's biology award for painting zebra-like stripes on cows to ward off biting flies, makes me re-appreciate that sometimes the viable solution might be something unexpectedly simple and close at hand. These studies show how science, even in its quirkiest forms, can point to practical and effective solutions that improve everyday lives.

Diversification of sci-comm tactics

Whether you admire the nobleness of the Ig Nobel, think it's all fun and whimsy sci-comm, or avoid it altogether as an aspiring "serious" researcher, I think it still stands as a rare gem in the diversity of what science communication can look like. In recent years, "debunking-style" science communication is seemingly (back) on the rise, as are various independent video-based science communicators (such as the guest speaker we had last week). In an age where science itself and its institutions are increasingly seen through a critical eye or outright contested, I do understand the urge to fact-bomb or even isolate myself in all the "seriousness". This is especially tempting when we know that some fruits of scientific research, like vaccines, can save lives, and we desperately want people to protect themselves. I personally don't consider myself especially witty, but I celebrate those who can masterfully blend research and humor to entice audiences and reignite their interest in science. Of course, no single sci-comm tactic is bulletproof – some, like Sir Robert, may find these things distasteful, while others simply prefer something "serious", and that's ok. But science as a community might just benefit from having such a quirky tactic up its sleeve, and diversity in science communication approaches might very well be the best shot we've got in this day and age of increasing division. Who knows, maybe some researchers will look into the efficacy of Ig Nobel prize headlines against science anxiety.

Science and extreme agendas

Author: Raf Kliber (Social Media Officer)

Original feature image art specially drawn by: TallCreepyGuy

While I work myself to boredom at a local retail store, I listen to podcasts in the background. Something to cheer me up. Among my favourites are the Nature Podcast and The Climate Denier's Playbook. But on that specific Wednesday, the episode was anything but cheering. I landed on the Nature Podcast's "Trump team removes senior NIH chiefs in shock move" episode, which gave me a bleak look into the current US administration's proceedings. The bit that shocked me the most was how closely the move clung to Project 2025's agenda. One of the moves discussed was the defunding of "gender ideology"-driven research (read: anything that includes the word trans, even though such research is useful for everyone). Furthermore, instead of such "unimportant" research, the administration wanted to conduct studies into "child mutilation" (read: trans conversion therapy) at hospitals. Eight hours later, while soaking in a mandatory after-work bath, I began pondering: what is the interplay between extreme agendas and the "fall" of science, and what could I, a STEM person, do about it? As a Polish person, my first bubbles of ideas started with fascism and the Third Reich.

Jews, fascism, and ‘directed’ science

I moved to the UK when I was twelve years old. This spared me the traditional trip to Auschwitz one takes in high school. It spared me from the walls scratched by the nails of the people trapped in gas chambers. It spared me from a place so horrible yet so pristinely preserved that visiting it is as close to time travel as one can get. About a fifth of the population of Poland was wiped out in World War II. On average, every family lost someone. Not on average, many families were gone entirely. Due to the gravity of the topic at hand, I reached out to Dr. Martin A. Ruehl, lecturer in German Intellectual History at the Faculty of Modern and Medieval Languages at the University of Cambridge, for some guidance. He also gave a talk on "What is fascism?" during the Cambridge Festival, which I recommend. Another reason is that I am, by education, a physicist, and just as physicists have their own set of rigorous habits that make their field solid, historians and philosophers have theirs.

Fascism as an idea is fuzzy, or at least has fuzzy borders. One can say definitively that after Hitler took power in Germany, the country took on a fascist ideology. It is also abundantly clear that the current UK is not a fascist regime. Trying to nail down the border between the least fascistic regime and the just-about-not-fascistic one is futile, complicated further by each regime having its own unique elements. The process of how fascism festers and develops in a country is left for others to explain, and I encourage the reader to watch this video essay by Tom Nicholas on how to spot a (potential) fascist. I will go with the conclusion of Dr Ruehl's talk: fascism is a racist, nationalistic, extreme, and violent idea that often casts the core group as being under an imagined attack from an outgroup (e.g. Jews were painted as a threat to the German state, even though they weren't). I have put off the subject matter this long to highlight two important points: fascism is a complex topic that could be studied for lifetimes, and consequently, I am not an expert. I have made my best attempt at giving it the due diligence it deserves.

Disclaimers aside, what was the state of science during Hitler's reign? Let us set the scene. The role I'd like us to play is that of a scientist at the time. Let us imagine ourselves in 1933 Germany, right at the beginning of Nazi rule. The Nazi party made it rather clear: either you, the scientist, are ready to conduct research that aligns with the party's agenda, or you're out of academia. And if you're Jewish or known to be on the left of the political spectrum (the historical pre-Nazi left, although it would still include things like early transgender care, as advocated by Magnus Hirschfeld), you don't even get that choice. Physics Today has a nice article charting the migration of selected physicists out of Nazi Germany, which I recommend having a look at. The same goes for other branches of science. The crux of the situation is that if you were studying races or ballistics, you were more than welcome to stay. Hitler did recognise that only the most modern military equipment would allow the Third Reich to wage war on everyone. Similarly, he wanted to put his ideals onto the firm foundation of "cold and logical" science, even though at times that compromised the scientific process: consider the creation of Deutsche Physik (which denied relativity) and the burning of books by the above-mentioned Magnus Hirschfeld. (As much as my past self would thoroughly disagree, trans people are a cold and logical conclusion of how messy biology can be. More so than arbitrarily dividing all of the population into two buckets.)

The adoption of the idea of Social Darwinism (that the fittest social groups survive), combined with early knowledge of what genes do (albeit well before the discovery of DNA's structure and the ability to compare genomes), created the foundation of ignorance on which "scientific racism" and eugenics were built. That being said, there was more to it than the contemporary state of not-knowing. According to the introduction of Nazi Germany and the Humanities, edited by Wolfgang Bialas and Anson Rabinbach, the creation of the hated Weimar Republic "created a deep sense of malaise and resentment among the mandarins, who, for all their differences, had in common the belief that a 'profound crisis of culture' was at hand". In other words, the loss of the war and a tense national atmosphere led to the development of such völkisch ideals well before Hitler's regime came to power. To quote further, "many retained the illusion of intellectual independence". This general sense of superiority also gave rise to books like Deutsche Physik, a work that opposed Albert Einstein's work directly.

(Note from the author: Googling "Social Darwinism" will lead you to creationist videos by Discovery Science (a YouTube channel of the Discovery Institute, a fundamentalist creationist think tank). They seem to be hooked on using the aforementioned atrocities to link Darwin, and his early understanding of evolution, to Satan, and hence to Darwin leading us away from God with his theory. It is worth mentioning that although it bears his name, Darwin played no role in coining or using the term.)

To summarise this section: the way corrupt ideals spread into science and politics in Nazi Germany arose from discontent and false hope. It was more of a fork situation: academia and politics each took up the story of national threat and superiority independently, out of the high levels of discontent originating in the Weimar era, and while the two were intertwined, I think the cross-influence mainly amplified the process, with each supporting the other in the downward spiral into things like antisemitism.

USSR, Russia, and limiting scientific cooperation

A nice cup of tea on the following day led to some more thinking about other regimes. Like a true 'Brit', I took out my teapot, and with a cup of Earl Grey in fancy Whittard porcelain in hand, I drifted off into another rabbit hole. This time, instead of west, I dug the tunnel east.
An interesting tidbit from my past concerns my primary school. The changing rooms in that place had an interesting design: if one paid enough attention, one would see a system of grooves in the floors meant to act as drainage. Why drain something from an indoor location? The changing room was meant to serve as an emergency field hospital in case of another war. The school, it turns out, is old enough to show some of the old Soviet practices in its design. For those unaware, Poland was part of the Soviet bloc up until 1991. Just 12 years before my birth, and 13 before Poland joined the EU. So let us journey to the east and see what history has to teach us.

Stalin was a dictator, just like his Austrian-German counterpart. What is slightly different is the ideology that shaped the persecution of scientists at the time – a different flavour of extremism. I could go on a rant about what the Stalinist flavour of Marxism is, but just like with fascism, there are scholars who spend their lives studying it. I am not one of them.

Nevertheless, the parallels between the corruption of science in Nazi Germany and the Stalinist USSR are rather staggering for such different ideologies. In Germany, anything considered Jewish or counter to the greatness of the Aryan race was immediately cut out, while the rest was bent towards the ruling party's view. Here it was much the same. The humanities took the largest hit to their independence, as they did in Germany. Lysenkoism slowed down genetics research in the USSR; what was promoted instead was a flavour of Lamarckism (the idea that acquired characteristics are passed on, rather than typical natural selection). This then possibly contributed to agricultural decline, creating another subject of memes for the edgy Gen Z.

This also furthered the isolation of Soviet scientists. While every now and then they would invite foreign scientists (as Feynman wrote in his letters, and let us be honest, this might have been because of his involvement in Los Alamos), the mingling of Russian scientists with the rest of the world was minimal. Did I forget to mention that geneticists were often executed for not agreeing with Lysenkoism? Science is a global endeavour for a reason: it needs far more manpower than any single country has. A country can never be a fully independent branch of science; isolation simply leads to a slow withering of progress.

To give this section a nice circular structure and bring it back home: attitudes can also persist after occupation. The Polish government made some unpopular moves in academia during the time of the PiS party. Polish academia uses a scoring system where each publication in a journal grants you points, with the points trying to quantify your contribution to a field. So, technically, a biochemistry paper would give you points in both biology and chemistry. The government started awarding more points for papers in Polish journals rather than international ones, alongside oddities like awarding political science points for publishing theology papers. This can be seen as a slight resurrection of the national-pride-in-science attitude which I despise so much (Springer Nature's journals are always going to be my favourite to skim through). A toy sketch of how such a reweighting shifts incentives follows below.
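To make the incentive shift concrete, here is a minimal sketch in Python. Every journal name, point value, and the bonus multiplier below are hypothetical, chosen purely for illustration; the real ministerial lists assign specific scores per journal.

```python
# Toy model of a publication-points system like the one described above.
# All journal names and point values here are hypothetical, used only to
# illustrate how boosting domestic journals shifts a researcher's incentives.

papers = [
    # (journal, is_domestic, base_points)
    ("International Journal of Biochemistry", False, 100),
    ("Polish Journal of Chemistry", True, 40),
    ("Polish Journal of Chemistry", True, 40),
]

def total_points(papers, domestic_bonus=1.0):
    """Sum a researcher's points, scaling domestic journals by a policy multiplier."""
    return sum(
        points * (domestic_bonus if is_domestic else 1.0)
        for _, is_domestic, points in papers
    )

print(total_points(papers))                      # even weighting: 180.0
print(total_points(papers, domestic_bonus=3.0))  # domestic boost:  340.0
```

Under the boosted regime, two modest domestic papers suddenly outscore one strong international one, which is exactly the kind of incentive shift such a policy creates.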

So what?

My Eurocentric summary of history is probably boring you to death, so let us talk about the US. Trump! The name that makes the hair on the back of my neck stand up. The similarity of what is currently happening in the USA to the regimes above really makes me think that history does indeed repeat itself.

Firstly, just like Lysenko and his anti-genetics, Trump decided to appoint RFK Jr as Secretary of HHS. A well-known opponent of vaccines is now in a position to hire and fire researchers. The MAHA ("Make America Healthy Again") report included a lot of less-than-optimal healthcare research directions. RFK really believes in a mix of terrain theory (that the terrain of your body, i.e. fitness and nutrition, plays THE most important part in your immune system) and miasma theory (covered in a previous article here, but basically a medieval theory of bad air making you sick). One could also point out that a recovering drug addict and brain-tapeworm survivor does not make for a great leader of a health agency. To play devil's advocate, though, he did start out as an environmental lawyer. Additionally, RFK supports the removal of fluoride from water and has helped to spread misinformation about vaccines in Africa. He has a very tangible body count and actively harms populations.

Secondly, there are the topics from the headlines in the first section. It is clear that the current administration's aim is not simply doing science to explore x, but rather confirming x under the guise of science. This is why 75% of the scientists who answered Nature's poll said that they are looking to move out of the USA. Additionally, according to a piece in the New York Times, experts in fascism are also moving away from the USA. This is now causing a 'brain drain' in the USA and, ironically for an administration that is anti-China, handing the scientific lead over to China. (Whether you think that is good or bad is up to you. I personally am neutral.) The administration has also already tried to block Harvard's ability to admit international students, who contribute heavily to its income stream – all in retaliation for Harvard allowing students to express their right to free speech and protest in favour of Palestine. This is slightly sneakier than executions and imprisonments. Nevertheless, in a capitalist society, it might be somewhat equivalent when the funding we all depend on goes dry.

Lastly, there is a difference I would like to point out. Regimes like the ones above often arose from a dire need for a radical leader and major changes. The current administration is exercising what I would like to call stealth authoritarianism (as coined by Spectacles here). Gone are the days of posters on every street with long-nosed depictions of minorities that eat children (although the 'they eat the dogs' moment was close enough for many). The current US president uses rather specialised and closed-off social media to reserve his opinions for his most dedicated followers rather than the general public. We live in an age where the algorithm separates us. It is becoming ever less likely to encounter an opinion we disagree with out in the wild without searching for it. Executions are no longer needed to silence critics: as long as you have a devoted fanbase, the infectiousness of the internet can create a group potent and numerous enough to win an election.

The fact that someone can be so overtly against reality, so blatantly corrupt, yet at the same time can feed a mirage to the right people to get elected is the true curse of the modern information landscape. For me personally, it is the main reason why CUSAP and similar societies are more important now than ever before.

What can you do?

Every good opinion piece should end with a call to action. I also don’t want this entire blog post to be a long way of saying “AAAA WE ARE ALL GOING TO DIE!!!”, because we most likely won’t.

  • If you are in the USA and courageous enough, join a protest. It should be easy enough to find one nearby. This is not the main recommendation, though: police brutality has already made itself visible in the past month.
  • What you can do more safely is support local lobbying. Be prepared that democracy is not as accessible as it seems. Genetically Modified Skeptic has posted about his experiences trying to vote down the requirement for schools in Texas to display the Ten Commandments in classrooms. It was not a pleasant experience, but organisation and support for lobbying individuals can go a long way, even if it means bringing them food and supplies, or sitting in to notify them when it is their turn to speak at meetings.
  • Vaccinate your family against misinformation. Emotions can run high when politics is involved, but perhaps you can connect one bit of their viewpoint to a kernel of truth that may help. My personal jab at right-wing oil enthusiasts is to connect climate change with their dislike of migration, as mass migration is a likely result of climate change. (Yes, I don't believe migration is bad, but they do. Sometimes, you have to engage one topic at a time.)
  • Join a group to lobby and promote critical thinking. Here at CUSAP we try to go beyond Cambridge; thus we welcome articles written by non-members. You can get in touch with us at https://cusap.org/action/. Youth Against Misinformation is another one. Plenty more can be found online.
  • Most importantly, do not shut up. Speak up when you see fake news. Don't get distracted by trivial problems. Call your local representatives, meet with them, email them. This goes regardless of which party they are associated with. Make sure they know that the truth is what you support. (It goes without saying: only as long as you feel safe to do so.)
  • Lastly, for my own sanity: do not be nihilistic about how little significance one action or vote has. One vote can make a lot of difference when it is surrounded by a couple thousand more singular votes.

Alpha (??) Male (???)

Author: Maya Lopez (Blog Chief Editor)

When I was video-calling my parents recently, I noticed a wildlife documentary playing in the background. It was about a pack of wolves and followed the tale of a dominant leader who got injured and left the pack, whereupon the next-oldest sister stepped up and led the pack. "Leader". I was actually impressed by the up-to-date wording; it reminded me of a story I saw a few years back on the term "alpha wolves" and how such outdated language remains ingrained in our society. But also, did the documentary just say a sister stepped up as leader? This sent me straight down the memory hole, and to some reading in between my deadlines, where I rediscovered the tale of a scientific finding that was embraced by culture, while that culture refused to evolve with the scientific updates. Given that the modern (and possibly unsustainable) rise of the "manosphere" and the loneliness epidemic, especially amongst young men (though of course not exclusive to men), are believed to be linked to the current political climate and radicalization, we'll explore where we got this "alpha male" myth so often claimed to be backed by "evolutionary science". It turns out to be an emblematic case of culture seeking the label "scientific" to affirm and add prestige to a social construct that some people desperately wanted to believe, and of how much harder such constructs are to falsify and update than actual scientific claims.

The "alpha wolves" finding and its correction status

So, where did this all begin? It is no coincidence that the alpha male is, to this day, often represented by a wolf emoji, as seen on Wikipedia. In 1970, L. David Mech, a zoologist specializing in wolves, observed that a strong, dominant wolf seemed to be leading a pack, and published his findings in a book called The Wolf: The Ecology and Behavior of an Endangered Species. I couldn't find an exact record of how many copies were sold, but it went through numerous reprints and digital releases until it was taken out of print in 2022. This is perhaps a testament to its influence over all these years in the competitive world of publishing; the book essentially popularized the term "alpha wolves", so I think it is fair to say it was extremely well received by the public. Personally, I think this level of success in science communication is in itself remarkable. Perhaps, with eco-consciousness on the rise in the 1970s (even "Earth Day" was born of a public movement then), the conversation about ecology and endangered species came at just the right time. However, the book's cultural impact (unfortunately?) reached well beyond the realm of ecology, with the term coming to characterise a specific imagery of wild, dominant, aggressive (?) masculinity over the years that followed.

The book outlines various facets of wolf research across different chapters, from their distribution in the wild to pack structure. The term "alpha" was introduced to describe an apparent leader in a pack, one that seemed to have achieved its status by dominating the others. Interestingly, the term had been used in a similar sense in a report published in Germany in 1947, so Mech's book arguably stemmed from a long lineage of academic writing that held this prevailing theory of wolf pack hierarchy. Also fascinating: the "beta wolf" in this context is the de facto #2 of the group (quite a different nuance from the modern internet slang, but I'll get back to this in a sec). But there was a big caveat to all of these studies: they were based on wolves in captivity – an artificial setting, often with unrelated individuals boxed into the same environment. So while "alpha wolves" (and the corresponding female pair) emerged in captivity, when researchers expanded their search to see whether this also applied to wolves in their natural state, things went awry.

As with many findings in the natural sciences, the alpha wolf finding was corrected and updated in a later decade. In fact, the interesting thing is that this falsification came from Prof. Mech himself (in what I'd call true scientist fashion)! Upon further investigation of wolves, he discovered in the '90s that the natural wolf hierarchy is, in fact, just a family. In this context of kinship, bloodshed and battles for the dominant position were rare. In an interview piece in The New Yorker, Kira Cassidy, an associate research scientist with a National Park Service research program in Yellowstone, sums up the current notion of the "wolf hierarchy":

“It’s not some battle to get to the top position. They’re just the oldest, or the parents. Or, in the case of same-sex siblings, it’s a matter of personality.” 

It's easy to imagine how parents, naturally older and more experienced, lead the pack, and their offspring follow their lead. Mech himself was one of the most vocal advocates of retiring the term "alpha wolves", because "it implies that they fought to get to the top of a group, when in most natural packs they just reproduced and automatically became dominant." In '99, he tried to describe these parents as merely having "alpha status", and eventually the field stopped using the term altogether. If you check the International Wolf Center webpage today, you'll see it described as "outdated terminology". Modern research also finds that in natural reserves where packs occasionally fight over territory, rather extensive packs can be observed, including aunts and uncles and multiple "breeding pairs", making the structure more flexible and less hierarchical. Furthermore, even these leader positions are essentially not about aggression but about responsibility, and submission is more of a chain-reaction mannerism than an all-hail-and-serve-the-dictator attitude. To quote Scientific American's article: "The youngest pups also submit to their older siblings, though when food is scarce, parents feed the young first, much as human parents might tend to a fragile infant."

When culture decides not to update based on new findings

So okay, alpha wolves weren't really a thing unless you split up families and smushed them into the same room, and natural leaders aren't really about aggression and bloodshed. If this tale were as famous as the concept of the alpha male, it would have been a great example of scientific falsification updating a societal norm, but that was not the case. What started the application of the concept/term "alpha" to humans is arguably NOT the wolf book mentioned earlier, but a book published in 1982 called Chimpanzee Politics: Power and Sex Among Apes, where the author implies that his observations of a chimpanzee colony could possibly be applied to human interactions. But the thing is, the term still lived mostly in ecological contexts (and was not applied to human interaction) until around the late '90s, when, on top of the wolf example described above (which supplies the pack-leadership imagery), it was also applied to other, non-social animals, particularly to refer to males' mating privileges owed to their ability to hold territory, win food, and so on.

Then who popularized this chimp/wolf term as a description of a human male? I couldn't access the actual source article, but Wikipedia mentions that the early '90s is when "alpha" came to refer to humans, specifically to "manly" men who excelled in business. But the most pivotal recorded moment in (pop?) culture is perhaps the '99 American presidential election campaign, incidentally the same year Mech denounced the alpha wolves concept. According to journalist Jesse Singal of New York magazine, the word entered the public consciousness on a mass scale that year when a Time magazine article published an opinion held by Naomi Wolf, then an advisor to presidential candidate Al Gore. The article describes Wolf as having "argued internally that Gore is a 'Beta male' who needs to take on the 'Alpha male' in the Oval Office before the public will see him as the top dog." Naomi Wolf herself, for context, was a prominent figure in the third wave of the feminist movement, with publications like The Beauty Myth in 1991. But from around 2014, journalists started to describe her reporting on ISIS beheadings, the West African Ebola epidemic, and Edward Snowden as conspiratorial and containing misinformation, and in 2021 her Twitter (… okay, "formerly-known-as-Twitter") account was suspended for posting anti-vaccine misinformation. Her Wikipedia page now includes the title: conspiracy theorist.

Singal also credits Neil Strauss's 2005 book on pickup artistry with popularizing "alpha male" and cementing its aspirational tone as a status. But I think the pattern is clear: a Frankenstein mish-mash of an outdated scientific concept (literally revived from the dead, if you consider how the term was dying out in wolf research) and a vague sense of the aspirational male figure that encapsulated the "cool" of the era entered the lexicon, carrying the prestige of a "science word" (not entirely untrue, but leaving out the many big caveats mentioned above). And once something becomes culture, it is hard to change, even though culture, if you think about it, has inherently been in constant flux throughout the history of Homo sapiens. I'm not saying all cultures are bad; certainly not: culture is collective behavior that has adapted throughout history. However, we often use "well, that's the culture" as a reason to defend practices even after we, as a society, have gained the means and the knowledge to see how wrong or even harmful some of them can be.

The correction status of alpha males (?) in other species

But wait, did you notice how this conversation about dominant status eventually became specifically about dominant "male" status? Where in the world did our social image of the alpha male even come from? Ultimately, it seems that we didn't want to dismiss the idea of the almighty dominant male. Even today, if you Google "myth of alpha male", you can find Reddit threads with comments that "acknowledge" the term is outdated and untrue for wolves, but then point to the male dominance found in great apes. Sure, male dominance CAN be a thing in great apes, like gorilla silverbacks, I guess (though note they get their own fancy title), but the implication here seems to be that "wolves don't matter, because wildlife closer to humans shows alpha males, so we human males should have an alpha nature too… errrr". But what if the underlying assumption about the domineering male in our relative species… does not hold as well as you think?

So let's go back to our assumption about our relative species, the chimps, and see if the assertion from the '80s holds true. Long story short, once again, as with the wolves, the scientific reality turned out to be more complex than the earlier rendition of it. Chimps are social creatures, like wolves and humans, and, indeed, there is often an alpha male in a group with mating privileges. But dominating other males with power and bloodshed turned out not to be the only way to achieve top status – one can also groom one's way to the top. A 2009 study found an interesting correlation between different males and the "styles" they used to achieve their status: essentially, smaller chimps with perhaps less intimidation power compensated by grooming other members more frequently and more equally. This speaks to the complex nature of alpha status too: alphas are also judged by the other members of the group – in effect, a popularity contest rather than a pure dictatorship. So while the alpha male is a thing in chimps, he does not have to be the purely aggressive type we typically imagine, and the stereotypical "beta moves" might well be his strategic winning move.

Let's also interrogate the other half of the phrase: does it even have to be alpha "males"? Our other equally close relatives, the bonobos, will tell you otherwise. They are in fact often termed a matriarchal society, as groups in the wild are frequently led by one or more experienced senior females. In such an environment, a rowdy and aggressive male who gets too excited by the presence of a fertile female can in fact get his butt kicked – or, in the extreme case, his toe bitten off by the experienced females who may guard such a young female. This was the case in a group of bonobos in the Wamba forest in the Democratic Republic of the Congo, and the male's social position in the group plummeted. While a toe-biting level of fightback is unusual, Dr. Tokuyama explains that "Being hated by females … is a big matter for male bonobos": the alpha-male attitude of unwanted and violent sexual provocation is often met with strong resistance from the females, who band together to fend off such behavior. As a Homo sapiens, I can't say this eye-for-an-eye tactic of violence is ideal, but it is interesting to see an entire species dynamic where the kind of male aggression that evokes "alphaness" is arguably seen as reckless, and met with a strong, resounding NO.

Can we finally update our alpha-male myth?

During my teenage years, I almost got the impression that the alpha/beta categorization was increasingly becoming… cringe: a hype, a target of satire that was no longer cool after oversaturation in internet lingo. But the modern narrative around the manosphere, while not mainstream (… I hope), hints otherwise. For some people, the very definition of masculinity somehow seems to be growing more aggressive, dominating, and hierarchical. While such views may always have existed to some degree, the highly visual trends now seeping into youth culture are perhaps accelerating this issue in a possibly dangerous way. Perhaps "alpha male" is too catchy, too photogenic, too trendy at this point to go out of fashion overnight (and in fact, during my research, I found it immortalized and perpetuated in courses, coaching, and AI characters!). And you know, as a story archetype (and possibly some people's… let's say "romantic type"), I can see some point – but maybe we can leave that to the realms of Wattpad's Twilight spin-offs. Still, I feel there is something inherently sad about reducing the complex social behaviors and the multidimensional personalities we can have as REAL individuals to a simple slogan and a law-of-the-jungle mindset, all with an undertone of violence and a dog-eat-dog worldview.

A simple slogan perpetuates a simple view of the world: an easy pill to swallow compared to the mentally demanding task of critically assessing social constructs. After all, we are all facing historic levels of exhaustion and work demands. However, next time a trendy catchphrase from a view "supported by (evolutionary) biology" creeps into your feed, let us ask ourselves: what complexity are we removing, and at what cost? Constantly refraining from critically reassessing the culture around us could quickly spiral into true subordination of the mind, ripe for exploitation (…thus very un-alpha, if you ask me). So let us practice our critical thinking and be wary of narratives that sound too… black-and-white. Maybe science can even help you update and stay flexible in your thinking, because hey, science is ultimately unafraid to evolve and update, and so can we.


Plague doctors were onto something?? (albeit for the wrong reason)

Author: Maya Lopez (Blog Chief Editor)

On June 13th, 1645, George Rae was appointed as the second plague doctor of Edinburgh, following the first, John Paulitious, who had died of, well, the plague. While plague was already endemic in the 17th-century UK, this outbreak was one of the worst: the 11th major outbreak in Scotland, and part of the same era of epidemics whose London counterpart became known as the Great Plague of London (so named for being the last of this scale, rather than for having a higher death toll than earlier iterations). With death tolls in the city of Edinburgh rising (they would ultimately climb into the thousands by the end of the outbreak), it was not particularly surprising that the doctors themselves would die from contracting the plague. Such (increasingly) high-risk jobs naturally saw salary raises, culminating in a whopping rate of 100 pounds Scots a month by the time Dr. Rae was appointed. Dr. Rae, however, survived his term, and so was only paid his promised salary slowly, after negotiation, over the decade following the end of the epidemic. This is not to say the city council was providing a generous pension for his civil service; rather, the council simply did not have the cash to pay him on the spot because, well, they never expected him to come out of the epidemic alive! (It is believed Dr. Rae never received his full share in the end.) Is this to say he was just a lucky soul with a super immune system? When I heard this tale of the man who once walked the narrow streets of Mary King's Close, Edinburgh, I was fascinated by his secret of survival: with the bubonic plague you had roughly a 50:50 chance of survival, and the pneumonic plague, well… was nearly always lethal with the treatment options available at the time. So, to me, this said he evaded infection itself – but how? He was actively going out of his way to inspect the sick, and those ultra-narrow, multistoreyed alley-houses are not what I would call a well-ventilated environment. His most likely secret (though of course, he may also simply have had excellent health and a strong immune system) was none other than the iconic symbol of the plague doctor: the outfit.

How they thought you could catch the plague in the 17th century

Let's take a step back into the 17th-century body of knowledge about the plague. By this point, it was already an endemic disease with centuries of repeated outbreaks, so it was not a completely foreign illness in Europe. While the Renaissance and early Enlightenment were by then slowly recovering from the knowledge lost and delayed throughout the European Middle Ages, medical knowledge was still mostly based on classical antiquity and medieval tradition, which naturally framed how people perceived the mechanism of the plague. Plague was thought to spread via miasma: an abandoned medical theory in which "poisonous air" (often foul-smelling) carries disease. This theory was deeply rooted throughout the Middle Ages and was the predominant explanation for outbreaks of various contagious diseases (like cholera, chlamydia, or the Black Death) prior to the advent of germ theory. Additionally, for plague, miasma theory was further combined (?) with astrology in 14th-century France to elaborate on the mechanism: a 1345 conjunction of "hot planets" (apparently Mars, Saturn, and Jupiter… don't ask me why) in the zodiac sign of Aquarius (a wet sign!… whatever that means) supposedly caused unnaturally hot and moist air to blow across Asia toward Europe, leading to the catastrophic Black Death. While I'm not sure such a cosmic-level mechanism was invoked for EVERY plague outbreak, the idea of something bad arriving in pestilent air was the general view of how the disease came to be, and this naturally shaped how prevention was approached.

As for how people thought the plague manifested in our bodies, the explanation was usually based on humorism. This is yet another abandoned medical system, one that originated in ancient Greece and was upheld throughout Europe and the Middle East nearly continuously for 2,000 years, until, again, cellular pathology explained things otherwise. It is a fairly complex system (and I am NOT going to explain the full details today), but essentially, the plague, like many diseases, was thought to be the bodily result of imbalances in the four humors that constituted our bodies. In particular, doctors saw the bubo formation of bubonic plague (the stereotypical pus-filled swellings, especially around the groin, armpits, and neck) as evidence of the body attempting to expel bad humors from the nearest major organs. This led to historical treatments focused on "expelling" those bad humors by bloodletting, or on diets and lifestyle coaching meant to rebalance them (like cold baths plus avoiding "hot foods" such as garlic and onions (???), apparently). Some doctors (and religious services?) were also said to provide additional services for a fee, which might include potions and pastes, but as far as I can tell, by the 17th century the more "out there" remedies like the "Vicary Method" (look it up at your own discretion, but it essentially involves somehow transferring the disease to a chicken in a rather graphic way, until either the person OR the chicken died) seem to have fallen out of popularity. However, in cases where these measures weren't enough and bodies were piling up (which unfortunately was often the case in outbreaks), the effort generally focused on prevention rather than treatment. Traditional approaches included household-level quarantine, routine searches and removal of the deceased by council-appointed services, the burning of "sweet-smelling" herbs to combat the evil scent, bans on public gatherings, and the killing of cats and dogs (and as we will learn, this last one may not have been merely horrible but may have actively worsened the situation).

How to catch a plague (according to science)

But okay, what REALLY causes the plague, and what do we know about this disease? You might have some vague idea that it has something to do with rats, which is not completely wrong, but the real mechanism is essentially that of a blood-borne, vector-transmitted disease – pathology lingo for a germ-caused illness transmitted through blood. Blood? Well, not necessarily just human blood; let me draw you a picture, as I heard it on one of my favorite podcasts. One hungry flea jumps onto a rat for a blood meal. But oh no, this rat has Yersinia pestis (the real culprit bacterium behind the whole massacre) in it! So the bacterium gets into the flea and multiplies in its tiny stomach. Within 3-9 days, this poor little flea, now hungry again but super queasy from the bacteria overflowing its tummy, will try to take another blood meal from a new rat it lands on and ends up throwing up – rat blood and bacteria – now in quantities of 11,000-24,000 Y. pestis cells. Once back in a mammal, the bacterium is in a different phase of its life cycle: it enters the lymphatic system and multiplies until the infection eventually spreads to the bloodstream, and on to the liver, spleen, and other organs. The bacterium can infect over 200 species, but its vector's (i.e. the flea's) primary hosts, like Rattus rattus (the black rat), tend to have mild resistance. This may allow for asymptomatic carriers (i.e. the immune system keeps the bacterial multiplication and symptoms at bay), and with the rats' relatively high replacement rate, natural infections seem to be less of a problem for them. (And see? This is yet another reason why we should have kept the cats around to keep the rats at bay!) However, when the infection happens in humans, the story is different.

In Homo sapiens, the disease can manifest (depending in part on what type you contract) in three ways: bubonic, septicemic, and pneumonic. In bubonic plague, following an incubation period of 1-7 days, the infection spreads to the lymph nodes, leading to the infamous buboes forming – the swellings doctors observed, as discussed earlier, which are essentially incubators full of bacteria and pus. (And yes, this is the form most people probably imagine when they picture a plague patient.) With this type, you actually had roughly a 30-60% chance of survival despite the horrendous visuals (more on this later). These patients often also experience other symptoms like fever, chills, head and body aches, vomiting, and nausea. Septicemic plague is the version where the bacteria (say, those that overflow from the swollen lymph nodes, or from a flea bite directly into the bloodstream) enter the bloodstream, resulting in sepsis. Like most sepsis, left untreated it is almost certainly lethal, with a mortality of 80-90%. At this stage, the infection (as well as the buboes themselves) can result in localized necrosis, where body tissues, usually in the extremities like fingers, feet, and the nose, die locally and turn black (hence the name "Black Death"). This is nasty enough, but the scariest variant is probably the pneumonic plague. Unlike bubonic plague, it does not form the characteristic swellings. Fundamentally, to contract either of the two earlier variants, infected blood needs to get into you, either via a flea bite or through heavy contact with buboes. Pneumonic plague, however, can also be contracted as an airborne disease. The infection takes place in the lungs, producing infectious respiratory droplets that can be transmitted directly from human to human. Furthermore, while pneumonic plague patients are said to be most infectious at the end stage of their symptoms, the incubation period is really short – around 24 hours – and without modern medical intervention (i.e. antibiotics!), the mortality is essentially 100%.
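To keep the three forms straight, here is a minimal recap sketch in Python, using only the rough figures quoted above (an illustrative summary of this article's numbers, not clinical reference data):

```python
# Recap of the three plague forms as described in this article.
# The figures are the rough ranges quoted above, not clinical reference data.

PLAGUE_FORMS = {
    "bubonic": {
        "route": "flea bite or heavy contact with buboes",
        "incubation": "1-7 days",
        "untreated_mortality": "roughly 40-70%",  # i.e. a 30-60% chance of survival
    },
    "septicemic": {
        "route": "bacteria entering the bloodstream",
        "incubation": None,  # not specified above
        "untreated_mortality": "80-90%",
    },
    "pneumonic": {
        "route": "airborne droplets, human to human",
        "incubation": "around 24 hours",
        "untreated_mortality": "essentially 100%",  # without antibiotics
    },
}

for form, info in PLAGUE_FORMS.items():
    incubation = info["incubation"] or "not given above"
    print(f"{form}: via {info['route']}; incubation {incubation}; "
          f"untreated mortality {info['untreated_mortality']}")
```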

Time to call the plague doctor in their OG hazmat suit

So let's say that you, a poor soul who has just heard this story, were sent back in time to the 17th century. You notice the early symptoms of chills and fever, and the buboes are starting to form (they even gurgled, according to some horrific accounts!). Time to call the doctor – but if they don't know the actual cause, and with no antibiotics at hand, what CAN they do for you? Besides, it's not like you need a diagnosis when it's pretty clear what you've contracted, and you have such a high chance of dying at this rate. As described in the first section, what doctors could do to effectively treat an individual was limited; hence, plague doctors were sometimes seen as near-synonymous with callers of death, because by the time they came around, there was a good chance you would be declared too far gone and left waiting to die. However, for the neighbors and for public record-keeping, it was still a useful service for you to be identified and your house marked with a white flag showing that the household had succumbed to the plague. In other words, while plague doctors were called "doctors", they functioned perhaps more like public health workers (not surprising, given that this was the pre-med-school era, and the credentials behind the beaked mask varied considerably). While you suffer with fever, you hear the lucky news that Dr. Rae may actually be able to offer a treatment (given that yours appears to be bubonic plague), aside from all the humor-restoring bloodletting: lancing the buboes. This lets the "poison" run out, after which the cleared wound is cauterized shut, sealing and disinfecting it. A high-risk treatment in itself, but you manage to survive.

But then you start to wonder: this guy literally just let the biohazard run out all over the place, so how does he manage to survive facing patient after patient? Despite all my debunking of plague treatment tactics in the previous section, this is where the plague doctors, and especially their attire, might have been on to something. Of his attire, the mask may be the most iconic piece, but it is also the one with the most uncertain historical origin. If it was worn as depicted in mid-1600s drawings, the crow-like beak extending far from the face was filled with “sweet smelling herbs” intended to fight off the “bad air”. Of course, this doesn’t quite work as they presumed, given that miasma theory was not true. A mask of this sort may have been better than no mask at all, simply as a physical filter, but honestly, the herb-based filtering system is probably not enough to filter out the bacteria in the aerosol droplets coming from pneumonic plague patients (i.e., NOT the same standard as modern respirators and clinical masks). The cane used to inspect you without direct touching may also have given Rae a social-distancing tool to “keep away people” (presumably other sick-ish people in the streets; the ethics of that is dubious, but it was tough times, I guess?). But the real deal is arguably the REST of the garment. In fact, Dr. Rae may have been pretty up to date in his PPE game: the first description that fully resembles what we think of as the plague doctor costume shows up in the writing of Charles de Lorme, physician to King Louis XIII of France, during the 1619 plague outbreak in Paris. He announced his development of a full outfit made of Moroccan goat leather from head to toe, including boots, breeches, a long coat, a hat, and gloves. The garment was infused with herbs just like the mask (because, of course, miasmas!). Whether full credit for this now-iconic costume should go to Charles de Lorme seems to be a subject of debate. However, this leathery suit did one thing right: it prevented flea bites pretty well. So long as you were extra careful when taking off this OG PPE (and didn’t breathe in droplets from pneumonic plague patients), you had pretty functional protection at hand.

A broken clock is right twice a day: nothing more, nothing less

So it just so happens that Dr. Rae, unknowingly (though he may have had plenty of faith in his sweet herbs and leather suit), was geared up to protect himself from the actual culprit behind the plague. Naturally, I found this an emblematic tale about why both the supporting facts and the logic of a theory need to be correct, which is indeed a crux of modern science and academia. This may sound obvious, but it’s an important reminder to anyone who ends up in a pseudoscientific line of knowledge (which could be any of us!): just because some specific outcome of a belief system happens to work, the supposed mechanism behind it is not automatically correct. Clearly, with germ theory having falsified miasma theory, the leather hazmat suit cannot be used as evidence that miasma theory is correct: it simply kept the fleas from biting. Conflating a partial truth with the correctness of the whole theory is perhaps a philosophical problem as well, given that it is sometimes easy, by human nature, to conflate what is happening with what ought to be happening.

But this is also a lesson for skeptics of pseudoscience: just because something was established within, or mixed into, pseudoscientific rhetoric, the individual practices, claims, and results are not automatically entirely false. And this is a moment for all of us to be honest with ourselves: have we previously dismissed a practice or idea purely because of the way it was presented? Of course, this is not to say that we should actively praise every little kernel of truth mixed into pseudoscientific rhetoric, which may inevitably end up assigned more credibility than it deserves. Heck, mixing in kernels of truth is in fact a tactic that “sciencey” writers can deliberately employ. However, if we decide everything is pseudoscientific based on the when, who, where, or context rather than the content, isn’t that attitude in the very nature of pseudoscience, where we let our preexisting notions and biases determine the lens through which we view “truth”? So instead of praising individual kernels of truth, let’s acknowledge them for what they are (that part is correct), but in the same breath be able to say: that doesn’t mean the rest is correct, for such-and-such reason, or because it hasn’t been tested. This is an intentional style of communication that does require more effort, and if done badly, it may still have the same dismissive debunking effect, which could spiral pseudoscientific believers into more pseudoscience. Therefore, let us practice this fine-resolution distinction between science and pseudoscience and use it to PIVOT conversations, so that we can invite everyone into a factual exploration of intellectual curiosity (instead of saying something like “medieval doctors had no clue about bacteria (indeed), so they did everything wrong (see the issue here?)”).

And after all, it is important to acknowledge the intentions behind some pseudoscience and outdated knowledge. It does not always come from malice, unlike some disinformation, where one could (or DOES) actually know better and which deserves to be tackled with far more fury than these plague scenarios. For example, miasma theory can in a broad sense still be seen as an attempt to conceptualize contagious disease: it was a protective survival instinct justified with the logic available at the time, and a rotting smell is probably a bad sign anyway. Humorism (which is bona fide pseudoscience by the standards of modern medicine) was also wrong and largely unscientific, but it was perhaps an attempt to reconsider nutrition and hygiene practices. So these frameworks were wrong, but people were trying to survive, and especially when modern scientific methods and tools were unavailable, I find something beautiful in humanity still managing to land on “tried ’n’ true” methods with some kernel of truth that did protect lives, alongside many missteps that cost them. It is a history of H. sapiens grappling for truth for survival. Acknowledge, and then explore further: now we know more about these pesky diseases, and we even know why some parts were wrong and why some parts were right! So keep thinking, keep asking, and keep talking, and don’t be too scared of correcting or being corrected; let us all appreciate our inner scientists and our desire to approach the truth. And of course, don’t forget to wear adequate PPE (maybe not a leather mask and suit in this day and age) when you are a bit under the weather and want to keep your friends safe. Let the fresh air in and ventilate: maybe not to clear out miasma, but to circulate the air and keep virulent particles away. And like my favorite podcast always says, “Wash your hands; Ya filthy animals!” 😉

Recommended Listen/Watch:

An amazing podcast series by two scientists, Erin and Erin. This episode is a major source of the historical and biological information in this article:

https://thispodcastwillkillyou.com/2018/02/10/episode-5-plague-part-1-the-gmoat/

Something shorter and eye-catching? This video will probably give you a deep appreciation of all the illnesses our ancestors were constantly combating, and of how lucky we are to face them rarely or not at all! (It can get visually horrific, so please watch with caution.)

https://www.youtube.com/watch?v=6WL5jy2Qa8I