
Sat Mag

Ins and outs of critical thinking


By Susantha Hewa

Critical thinking has different levels of acceptance depending on the field it inhabits. Usually, it involves doubting, reasoning, analysis, verification and inference; it is a process of “thinking carefully about a subject or idea, without allowing feelings or opinions to affect you” (Cambridge dictionary). Understood as such, surely, it is persona non grata with many time-honoured creeds. Thus, it is clear that promoting critical thinking will imperil the perpetuation of any established religion. Even Buddhists, who are warned against accepting anything on authority, forget it in order to don the ‘religious’ hat.

There wouldn’t be very much left of religion for posterity if kids were trained to think critically before they were taught the ‘core truths’ considered nonnegotiable in religion. However, such a preemptive action is impossible as religion comes to them way before they are even ready to begin potty-training.

Considering the way children in their formative years are indoctrinated, one would not expect them to retain any critical thinking at all for future use. However, the case is not so bad. Many of our attitudes are so neatly compartmentalised that our razor-sharp critical thinking in one realm never disturbs the serenity of delusion in another. Each of us is fitted with a personal computer where critical thinking, credulity, irrationality, superstition, sagacity, modernity, sophistication and simplicity are kept in different files, to be opened whenever we need them without opening the others. Nobody has to click on the critical thinking file and the credulity file simultaneously and invite needless embarrassment. This all-under-one-roof arrangement makes our shopping much easier.

This kind of selective critical thinking is not so uncommon. In fact, many of those suave bureaucrats and executives for whom profit-oriented reasoning, analysis and evaluation are part of the daily routine scarcely feel shy to barter their business sagacity for ‘road safety’ by protecting their limousines with charmed trinkets proffered by a charlatan. In other words, critical thinking and its main ingredients, such as questioning, reasoning and analysis, seem domain-specific. Even passionate advocates of reason in routine life may approach more personal matters with amazing naiveté.

Pavan K. Varma in his book Being Indian quietly laughs at the coexistence of discernment and conventionality in some educated Indians when he writes, “…but if the behaviour of the illiterate masses is understandable, the shallow modernity of the educated is less so. Those with engineering degrees wear rings on their fingers on the advice of quacks… they rely on pundits to decipher the exact time for muhurats… they demand dowry and consider women to be subordinate; they match horoscopes and propitiate evil spirits. What they don’t ever do is question…” Critical thinking doesn’t seem to be in a hurry to outpace blind adherence, be it in India or anywhere else.

Compliance, or unresisting acceptance, as opposed to skepticism, is the lynchpin of all dogma. Critical thinking is not a hoe that people use in all their farms; it is more often left idle than used, and we are not even aware of the fact. Thanks to our culturally inherited conformism, time-honoured narratives continue to prevail, repelling all objective inquiry. We band together and promote critical thinking but avoid it at times to reinforce existing attitudes. In an article titled “Critical thinking and the ‘value’ of university education”, appearing in The Island of 3rd August 2021, Prof. Harshana Rambukwella points to this duality when he makes a quip about the “absence of such a critical spirit” in some educators, who are often effusive in its praise but shun crucial aspects of a problem under discussion by labelling them ‘political’ or ‘ideological’, choosing instead to tinker with trivial issues to avoid trouble.

Critical thinking is not being “critical” in the way the latter is used in common parlance. Being “critical” of something insinuates negative connotations like censure, disapproval, sarcasm, denial and rejection. As such, “critical thinking” cannot help being unwittingly associated with some of those unwelcome implications of “critical” although the former may replace “analytical thinking” without much loss of “academic” flavour. Thus critical thinking does not necessarily result in the rejection of everything new; it may lead to acceptance, if the inquiry permits it.

Critical thinking and its cogs and wheels, i.e., questioning, reasoning and analysis, seem to “behave” differently in different systems, depending on their respective home dynamics. Let’s consider any two different systems. As we know, the more regimented a system is, the less room it allows the individual to employ his critical faculties on any given issue. Thus, in closed systems, critical thinking is little encouraged, and deviations are rarely allowed to be impartially examined; there is little leniency for the unfamiliar. On the other hand, more open systems, by creating more leeway for unbiased scrutiny, are likely to allow the individual more freedom to admit or reject any unorthodox view depending on how it squares with evidence.

As such, the overall flexibility of any system, be it science, culture, philosophy, religion or aesthetics, may influence the individual’s orientation towards impartial reflection. Let’s, for example, consider two systems of thought, science and religion, both of which claim to seek the truth, whether mundane or transcendent. The former is a veritable combat zone for nonconformist views, where critical thinking reigns supreme; the latter is pathetically less so. As a result, the same person would be more open-minded about unorthodox intrusions in science than in religion. Thus, the context in which we work seems to monitor our critical faculties according to its respective ‘rules and regulations.’ That may be why we move in different fields without this inconsistency ever disturbing us.

Familiarity breeds smugness when it comes to our lifelong convictions. They remain untouched because tinkering with them can often prove unsettling. This apathy is the enemy of reasoned scrutiny. And we often forget that some critical thinker in the past paid a heavy price for the complacency we feel right now about many of the established canons in the arts, sciences, philosophy and so on. In each instance, someone had to ask unpopular questions and earn the sobriquet “rabble-rouser” before he could bring the prevailing “authority” down a notch or two. However, history repeats itself, and every generation condemns its flock of ‘black sheep’, leaving it to future generations to give them their due, if one may say so, ‘retrospectively.’

For all those who want to avoid trouble, discretion is perhaps the better part of critical thinking.




Contagion in Ceylon: Two case studies


By Uditha Devapriya

The medical profession in Ceylon was, by the late 19th century, following patterns set down in most of the European colonies in Asia. The Civil Medical Department was set up in 1858, the Ceylon Medical College in 1870, and the Ceylon Medical Association in 1887. As with law, the practice of medicine soon came to be monopolised by the upper middle class, predominantly Burgher and Tamil.

In her extensive study of the history of the medical profession in Ceylon, Kamalika Pieris conjectures that this may have been due to the fact that unlike the Sinhala bourgeoisie, neither the Burgher nor the Tamil middle class had extensive holdings in land. In an autobiographical piece, on the other hand, Ralph Pieris observed that in colonial society, the quickest way for the middle class to guarantee their position was by ensuring their children took to professions where the duration between education and employment was brief; to give just one example, Lucian de Zilwa, divided between law and medicine, chose the latter because “I could begin to earn something the moment I received my sheepskin.”

The entrenchment of a medical elite did not contribute to a rise in health or sanitary standards in the country. The first hospitals had been built for military garrisons in Colombo, Galle, Jaffna, and Trincomalee, while the expansion of the plantation economy necessitated the establishment of hospitals to service the needs of estate labour. In both cases the aim was to serve imperial interests. Against such a backdrop, expansions of medical institutions, quarantine centres, and isolation camps were driven by bouts of epidemics which became part and parcel of the region in the 19th century. The aim of this essay is to compare and contrast responses by officials to two such epidemics: smallpox and cholera.

Historically, the Sinhalese are said to have feared smallpox the most. Robert Knox wrote that none of the charms “successful to them in other distempers” could cure the illness, while Ribeiro described it as “the most dreaded disease among the natives.”

There is reason to believe that, although smallpox is said to have been introduced by European adventurers, it existed in pre-colonial societies; Asiff Hussein writes of Hugh Neville describing “a gigantic Telambu tree amidst a sacred grove on the site of the Ruwanveli dagaba” which may have figured in an outbreak of a disease at the time of Arahat Mahinda. The first real account of the disease, according to Thein (1988), comes to us from a fourth-century Chinese alchemist called Ko Hung; it may have existed in a mild form in fifth-century India.

John Holwell was the first British official to write on the illness in the subcontinent, in 1767 in Bengal. From 1779 to 1796, there was a severe smallpox outbreak in that region, while in 1802 we read of a Swiss doctor, Jean de Carro, sending samples of a vaccine to Bombay via Baghdad. The date is significant, for it was in 1802 also that, as Ananda Meegama has noted, vaccination against smallpox commenced in Sri Lanka.

However, owing to inconsistencies in the vaccination programme, not least of which was the slipshod way medical officials despatched vaccines to poorer, far-flung regions, there were severe periodic outbreaks. In 1886, vaccination was made compulsory for the first time, but even then problems in the Medical Department, set up 28 years earlier, led to the disease spreading throughout the country. Preventing it would have meant enforcing an anti-venereal programme, which transpired 11 years later with the passing of the Quarantine and Prevention of Diseases Ordinance No 3 of 1897. By the 1930s, smallpox had ceased to be a serious threat; by 1993, it had been fully eradicated.

No less feared than smallpox was cholera, which like smallpox goes back many centuries in the country and the subcontinent. The Tamil goddess Mariamman is considered symbolic of both smallpox and cholera, while a reference in the Mahavamsa to a yakinni who, in the reign of Sirisangabo, caused those who came under her curse to have red eyes and die the moment they came into contact with a patient may indicate an outbreak as early as third-century Sri Lanka. I noted this in my SAT MAG column last week.

Historical sources tell us of a reference in ninth century Tibet to a particularly violent illness (the first signs of which included “violent purging and vomiting”), although scholars doubt the authenticity of these texts. In any case, not for nothing did colonial officials call it the Asiatic cholera, given its ancient origins in the lower Bengal where, it must be said, we come across our earliest sources for the cult of a cholera goddess in the subcontinent. On the other hand, the destruction of the local economy, and unnatural changes in agriculture forced on farmers by British authorities, had an impact on these outbreaks: with irrigation schemes destroyed and neglected, water scarcities could only result in pestilence.

There were six cholera pandemics between 1817 and 1917. All of them were aggravated by the disruptive economic and social changes unfolding in the periods in which they transpired. All six originated in Bengal, inevitably, given the genesis of the disease in tropical, depressed climates, and they diffused beyond the region to as far as Russia, the US, and Latin America. Leaving behind heavy fatalities, each was deadlier than the last.

The first case in Sri Lanka was reported in 1818 in Jaffna. However, as Ananda Meegama has noted, the disease never became endemic in the island. It turned into a serious issue only after British colonisers, having appropriated Kandyan lands under the Crown Lands Encroachment Ordinance and Waste Lands Ordinance of 1840, imported labour from South India. The latter led to a widespread contagion: case numbers jumped from 16,869 between 1841 and 1850 to 35,811 in the following decade, with fatality rates rising from 61% to 68%.

Until the passing of the Quarantine Ordinance in 1897 — by which time five of the six cholera epidemics had passed through and devastated the country — officials took haphazard measures to control immigration of South Indian labour. Estate workers, on their way to the plantations, took the Great North Road to the hill country, which brought them into contact with locals in Jaffna and Mannar; this led to several epidemics between 1840 and 1880, after which the number of cases and fatalities began to reduce.

Still, officials remained oblivious to the need to quarantine estate workers until much later, a problem compounded by the fact that many such workers, even after restrictions had been imposed, came disguised as traders, an issue attributable as much to the immigrants as it was to “the partial manner” in which officials enforced the quarantine.

Two years after the passage of the Ordinance, colonial administrators shut down the Great North Road to estate labour. Those who arrived in the country were taken to Ragama. Yet despite such measures, other problems cropped up: as Kamalika Pieris has pointed out, there was widespread hostility to Western medicine, particularly in economically backward regions which doctors never attempted to reach. Contaminated water became a serious issue even in Colombo, as did unplanned urbanisation, while quarantines led those fearing for their lives to flee to other districts, spreading the disease even more.

De Silva and Gomez (1994) have argued that one of the contributing factors to the recession of these diseases was the advance in sanitation among the local population. The first soap was imported to the island in 1850; until then, only vegetable oils had been widely used. The setting up of various Committees to probe these outbreaks would have been another factor. Despite this, though, the fact remains that while the government had the wherewithal to expand healthcare into less well-off regions, it chose, for crude reasons of economy, not to.

By the 1930s, by which point cholera had been significantly contained (in 1946, two years before independence, only two cases cropped up), the rise of a radical left movement among professionals, including doctors and lawyers, led to a more proactive approach to pandemics and contagions. The apogee of this trend, no doubt, was the malaria epidemic of the 1930s, in which the work of volunteers was done not by the English-speaking medical elite but by a group of young leftists, among them perhaps the first Western doctor to don the national dress, S. A. Wickramasinghe. But that is another story.

The writings of S. A. Meegama, Asiff Hussein, Kamalika and Ralph Pieris, Robert Pollitzer, Ariyaratne de Silva, Michael G. Gomez, and M. M. Thein were used for this article.

The writer can be reached at udakdev1@gmail.com


The Remedy. Remedy?


By Dr F E Dias

Introduction

A solution to the Covidian problem was eagerly awaited, and it emerged in a legion of vaccines. The development of vaccines against coronaviruses has been difficult owing to the vaccine-induced enhanced disease responses evident in animal studies. Antibody-dependent enhancement may be involved in the clinical observation of increased severity of symptoms associated with early high levels of SARS-CoV-2 related antibodies in patients. The gene-based vaccines dispense with the need for the wild virus and instead prompt inoculated persons to produce multi-trillions of its fusogenic spike protein, which is itself a pathogen that biodistributes and can potentially harm parts of the body far from the injection site. And yet many experimental vaccine products were hastily unleashed globally without the pivotal Phase 3 trials having been concluded or validated.

Front Runners

Israel and the Seychelles had high proportions of their populations vaccinated before other nations caught up. Let us compare sets of data that are not usually presented synchronously, as in Figures 1 & 2, to observe what happened and is happening in Israel and the Seychelles.

EU

The 450-million-strong EU, comprising 60% of Europe’s estimated 750 million inhabitants, reported administering 522.4 million doses of vaccines as of August 14th 2021, with over 75% of its citizens having received at least one dose of an experimental CoViD vaccine.

The EudraVigilance system is the EU-wide database for recording vaccine injury reports, as well as other medicine-induced injuries, corresponding to the US Vaccine Adverse Event Reporting System (VAERS). EudraVigilance data indicate, from the beginning of the vaccination campaign last year through to August 14th, over two million (2,074,410) reports of vaccine-related injuries, including 21,766 deaths across the 27 member states.

Approximately half of all reports (1,021,867) were of serious injuries, classified by the EU agency as corresponding to “a medical occurrence that results in death, is life-threatening, requires inpatient hospitalisation, results in another medically important condition, or prolongation of existing hospitalisation, results in persistent or significant disability or incapacity, or is a congenital anomaly/birth defect.” The highest rate of problematic developments among the four emergency-authorised vaccines followed the use of the Oxford-AstraZeneca product, which is reportedly linked to 947,675 injuries within the EU.

Reports of thrombotic and thromboembolic post-vaccination events among the vaccinated triggered concern regarding the post-vaccination donation of substances of human origin (SoHO), such as blood, plasma, organs and tissues, which may contain the pathogenic spike protein of recently vaccinated donors.

UK

A groundbreaking paper by the prestigious Oxford University Clinical Research Group, published on August 10th in The Lancet, found that vaccinated individuals carry more than 250 times the load of Covid-19 virus in their nostrils compared to the unvaccinated. The authors suggest that while moderating the symptoms of infection, the vaccine allows vaccinated individuals to carry unusually high viral loads without becoming ill at first, potentially transforming them into pre-symptomatic super-spreaders, and that this may contribute to the post-vaccination surges in heavily vaccinated populations globally.

Public Health England (PHE), England’s public health policy department, released a report on August 6th detailing the spread of the Delta variant of the virus, including hospitalisations and deaths where Covid-19 was a factor, between February 1st and August 2nd 2021.

It shows that 65% of hospitalisations and deaths involving Covid-19 are among those who have had at least one dose of the experimental vaccines. Though the unvaccinated category accounts for around half of overall Delta Covid-19 infections in England, the rate of death in this group is lower than among those who received vaccines.

Considering the fully vaccinated group on its own, the PHE data show that 1,355 of 47,008 identified infections were admitted to hospital, which is 2.9%, suggesting that the double-jabbed face a nearly 50% greater chance of being hospitalised if they contract CoViD-19 than those who have not been vaccinated. Further, those who contracted the virus within 21 days of their first shot showed a 0.97% hospitalisation rate, and those who tested positive more than three weeks after their first shot showed a 1.14% hospitalisation rate, indicating that the likelihood of hospitalisation is greater for the double-jabbed than for the single-jabbed.
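
As a rough check on the arithmetic: the 2.9% figure follows directly from the PHE numbers quoted above, while the unvaccinated admission rate is not given in this excerpt, so the roughly 2% figure below is only what the “nearly 50% greater” comparison implies.

\[
\frac{1\,355}{47\,008} \approx 0.029 = 2.9\%,
\qquad
\text{implied unvaccinated rate} \approx \frac{2.9\%}{1.5} \approx 1.9\%
\]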

North America

The VAERS, jointly run by the Food and Drug Administration and the Centers for Disease Control and Prevention, is the primary US government-funded system for reporting adverse vaccine reactions in the US. Not all adverse events are reported to it by physicians, and the VAERS figures are understood to be a substantial underestimate. While 328.9 million Covid vaccine doses had been administered as of July 2nd 2021, between December 14th 2020 and July 2nd 2021 a total of 438,441 adverse events were reported to VAERS, including 9,048 deaths, 22% of which occurred within 48 hours of vaccination; 37% occurred in people who became ill within 48 hours of being vaccinated.

2,678 pregnant women reported adverse events related to CoViD vaccines, including 994 reports of miscarriage or premature birth. VAERS also records the deaths of two breast-feeding babies, in March and July 2021, due to blood clots subsequent to the mothers’ receiving the vaccine. There were 4,456 reported cases of Bell’s Palsy and 398 reports of Guillain-Barré Syndrome, as well as 121,092 reports of anaphylaxis, 8,256 of blood-clotting disorders and 1,796 cases of myocarditis and pericarditis. The VAERS data showed that 22% of deaths were related to cardiac disorders.

By the end of August 2021, total deaths reported had exceeded 13,000 and reports of harm exceeded 600,000. With over 5,000 reports of myocarditis as of August 20th, Pfizer added to its product fact sheet that “post-marketing data demonstrate increased risks of myocarditis and pericarditis, particularly within seven days following the second dose. The observed risk is higher among males under 40 years of age than among females and older males. The observed risk is highest in males 12 through 17 years of age”, admitting also that potential long-term sequelae are unknown.

The Public Health Agency of Canada (PHAC) in July estimated the rate of vaccine-related blood clotting in Canadians who have received the AstraZeneca vaccine, and said there have been 27 confirmed cases to date in Canada, with five deaths among those cases.

Blood-clotting events that are reported are the larger ones, which can be detected using MRI or CT scans. However, with the RNA and viral-vector vaccines there is a new phenomenon of micro blood clots, diagnosable via D-dimer tests. These microscopic blood clots could be caused by the vaccine-generated spike proteins altering the vascular endothelium, particularly affecting its interactions with platelets in capillaries. Some parts of the body, such as the brain, spinal cord, heart and lungs, cannot regenerate when their tissues are damaged by blood clots. In the lungs in particular, this may cause pulmonary hypertension that may lead to heart failure years later.

In Brief: Fatality, Natural Immunity and Treatment

The fatality rate among those up to age 20 who actually become infected is 0.003%, which suggests that death in a road traffic accident is about five times more likely than death following a CoViD infection in that age group; the corresponding rate for those aged 20-40 is 0.02%. A study published on August 24th 2021 by Israeli researchers, looking at over 670,000 vaccinated and unvaccinated individuals, concludes “that natural immunity confers longer lasting and stronger protection against infection, symptomatic disease and hospitalisation caused by the Delta variant of SARS-CoV-2, compared to the BNT162b2 two-dose vaccine-induced immunity”. Dr Robert Malone, the virologist and immunologist credited with pioneering RNA transfection, has stated that natural immunity is twenty times more protective than the vaccines. India achieved success by encouraging anti-viral medication.

Sri Lanka

Sri Lanka’s vaccination and death data are shown in Figures 4 and 5.

Conclusion

Correlations in Sri Lanka’s data are apparent. Even if the preceding database reports and scientific theory are disregarded, and the global experience is ignored, it is not unreasonable to suppose that the relationship between daily deaths and daily vaccinations is causal, subject to the observed time lag. It is remarkable that the spike in the total number of deaths corresponds to the spike in the total number of vaccinations. Contrary to expert wisdom, among the Covid deaths in Sri Lanka the proportion of deaths of vaccinated people is steadily increasing as the vaccination campaign progresses.

There is evidence that the immediate and potential sequential harm due to these vaccines may exceed the risks associated with the disease.

Figure: Cytokine storm model of Castelli et al., Front. Immunol., 2020


Brief history of plagues and pandemics


By Uditha Devapriya

By the 14th century, trade routes between the East and West had made it easier for pandemics to spread, while conquests by the Spanish and the Portuguese in the 15th and 16th centuries would introduce several diseases to the New World. Trade and colonialism hence became, by the end of the Renaissance, the main causes of plague, which scientific advancement did little to combat, much less eliminate: a physician in the 17th century would have been as baffled or helpless as a physician in the 14th or 15th in the face of an outbreak.

No doubt rapid urbanisation and gentrification had a prominent say in the proliferation of such outbreaks, but among more relevant reasons would have been poor sanitary conditions, lack of communication and accessibility, and class stratifications which excluded the lower orders – the working class as well as peasants in the colonies – from a healthcare system that pandered to an elite minority. By 1805, the only hospitals built in Ceylon were those serving military garrisons in places like Colombo, Galle, and Trincomalee.

Among the more virulent epidemics, of course, was the notorious plague. Various studies have tried to chart the origins and the trajectory of the disease. There were two outbreaks in Rome: the Antonine Plague in 165 AD and the Justinian Plague in 541 AD. With a lack of proper inscriptional evidence, we must look at literary sources: the physician Galen for the Antonine, and Procopius and John of Ephesus for the Justinian.

Predating both these was an outbreak reported by the historian Thucydides in Athens in 430 BC, but scholars have ascertained that this was less a plague than a smallpox contagion. In any case, by 541 AD plague had become a fact of life in the region, and not only in Rome; within the next few years, it had spread to the Arabic world, where scholars, physicians, and theologians tried to diagnose it. Commentaries from this period tell us of theologians tackling a religious crisis borne out of pestilence: in the beginning, Islamic theology had laid down a prohibition against Muslims “either entering or fleeing a plague-stricken land”, and yet by the time these epidemics ravaged their land, fleeing an epidemic was reinterpreted to mean acting in line with God’s wishes: “Whichever side you let loose your camels,” Umar I, the second caliph, told Abu Ubaidah, “it would be the will of God.” As with all such religious injunctions, this changed in the light of an urgent material need: the prevention of an outbreak. We see similar modifications in other religious texts as well.

Plagues and pandemics also feature in the Bible. One frequently referred to story is that of the Philistines, having taken away the Ark of the Covenant from the Israelites, being struck by God with a disease which “smote them with emerods” (1 Samuel 5:6). J. F. D. Shrewsbury noted down three clues for the identification of the illness: that it spread from an army in the field to a civilian population, that it involved the spread of emerods in the “secret part” of the body, and that it compelled the making of “seats of skin.” The conventional wisdom for a long time had been that this was, as with 541 AD Rome, an outbreak of the plague, but Shrewsbury, on the basis of the three clues, ascertained that it was more plausibly a reference to an outbreak of haemorrhoids. On the other hand, the state of medicine being what it would have been in Philistia and Israel, lesions in the “secret part” (the anus) may have been construed as a sign of divine retribution in line with a pestilence: to a civilisation of prophets, even haemorrhoids and piles would have been comparable to a plague sent from God.

Estimates of population loss from these pandemics are notoriously difficult to determine. On the one hand, being the only sources we have as of now, literary texts record how civilians conducted their daily lives despite the pestilence; on the other, the writers of these texts resorted to occasional, if not frequent, exaggeration to emphasise the magnitude of the disease. Both Procopius and John of Ephesus are agreed on the point, for instance, that the Justinian Plague was preceded by hallucinations, which were followed by fever, languor, and, on the second or third day, bubonic swelling “in the groin or armpit, beside the ears or on the thighs.” However, there is another account, by Evagrius Scholasticus, whose record of the outbreak in his hometown Antioch was informed by personal experience of a disease he contracted as a schoolboy and to which he later lost a wife, children, grandchildren, servants and, presumably, friends. It has been pointed out that this may have injected a subjective bias into his account, but at the same time, given that Procopius and John followed a model of the plague narrative laid down by Thucydides centuries before, we can consider Evagrius’s the more original if not more accurate record, despite prejudices typical of writers of his time: for instance, his (unfounded) claim that the plague originated in Ethiopia.

Much water has flowed under the bridge in the debate over where the plague originated. A study in 2010 concluded that the bacterium Yersinia pestis evolved in, or near, China. Historical evidence marshalled for this theory points to the fact that by the time of the Justinian plague the Roman government had solidified links with China over the trade of silk. Popular historians contend that the Silk Road, and the Zheng He expeditions, may have spread the contagion through the Middle East to southern Europe, a line of thinking even the French historian Fernand Braudel subscribed to in his work on the history of the Mediterranean. However, as Ole Benedictow points out in his response to the 2010 study, “references to bubonic plague in Chinese sources are both late and sparse”, a criticism made earlier, in 1977, by John Norris, who observed that literary references to the Chinese origin of the plague were likely informed by ethnic and racial prejudices; a similar animus prevailed among the early Western chroniclers against what they perceived as the “moral laxity” of non-believers.

A more plausible thesis is that the bacterium had its origins around 5,000 or 6,000 years ago, during the Neolithic era. A study conducted two years ago (Rascovan 2019) posits an original theory: that the earliest discovered and documented case of plague, a Yersinia pestis genome 4,900 years old, was found in Sweden, “potentially contributing” to the Neolithic decline, the reasons for which “are still largely debated.” However, like the 2010 study, this too has its pitfalls, among them the lack of the sort of literary sources which, however biased they may be, we have for the Chinese genesis thesis. It is clear, nevertheless, that the plague was never at home in one specific territory, and that despite the length and breadth of the Silk Road it could not have made inroads into Europe through the Mongol steppes. To contend otherwise is not only to rebel against geography, but also to ignore pandemics whose origins were limited neither to East and Central Asia nor to the Middle East.

Such outbreaks, moreover, were not unheard of in the Indian subcontinent, even if we do not have enough evidence for when, where, and how they occurred. The cult of Mariamman in Tamil Nadu, for instance, points to cholera as well as smallpox epidemics in the region, given that she is venerated for both. “In India, a cholera-like diarrheal disease known as Visucika was prevalent from the time of the Susruta”, an Indian medical tract that contains the following passage, in which the illness referred to appears to be the plague:

Kakshabhageshu je sfota ayante mansadarunah
Antardaha jwarkara diptapapakasannivas
Saptahadwa dasahadwa pakshadwa ghnonti manavam
Tamagnirohinim vidyat asadyam sannipatatas

Or in English, “Deep, hard swellings appear in the armpit, giving rise to violent fever, like a burning fire, and a burning, swelling sensation inside. It kills the patient within seven, 10, or 15 days. It is called Agnirohini. It is due to sannipata or a deranged condition of all the three humours, vata, pitta, and kapha, and is incurable.”

The symptoms no doubt point to plague, even if we cannot immediately jump to such a conclusion. The reference to a week or 15 days is indicative of modern bubonic plague, while the burning sensation and violent fever indicate an illness that rapidly terminates in death. The Susruta Samhita, from which this reference is taken, was written in the ninth century AD. We do not have a similar tract in Sri Lanka from that time, but the Mahavamsa tells us that in the third century AD, during the reign of Sirisangabo, there was an outbreak of a disease the symptoms of which included the reddening of the eyes. Mahanama thera, no doubt attributing it to the wrath of divine entities, personified the pandemic in a yakinni called Rattakkhi (or Red Eye). Very possibly the illness was a cholera epidemic, or even the plague.

China, India, and Medieval Europe aside, the second major wave of pandemics came about a while after the Middle Ages and the Black Death, during the Renaissance, when conquerors from Spain and Portugal, having divided the world between their two countries, introduced and spread among the natives of the lands they sailed to diseases to which they themselves had become immune. Debates over the extent to which New World civilisations were destroyed and decimated by these diseases continue to rage. The first attempts to determine pre-colonial populations in the New World were made in the early part of the 20th century. The physiologist S. F. Cook published his research on the intrusions of diseases from the Old World into the Americas from 1937. In 1966, the anthropologist Henry F. Dobyns argued that most studies understated the numbers. In the 1930s, when research on the topic began, conservative estimates put the North American pre-Columbian population at one million. Dobyns upped it to 10 million and, later, 18 million; most of them, he concluded, were wiped out by the epidemics.

And it didn’t stop at that. These were followed by outbreaks of diseases associated with the “white man”, including yaws and cholera. Between 1817 and 1917, for instance, no fewer than six cholera epidemics devastated the subcontinent. Medical authorities were slow to act, even in Ceylon, for the simple reason that by the time of the British conquest, filtration theory in the colonies had deemed it prudent that health, as with education, be catered to a minority. Doctors thus did not find their way to the far-flung places suffering the most from cholera, while epidemics were fanned even more by the influx of South Indian plantation workers after the 1830s. Not until the 1930s could authorities respond properly to the pandemic; by then, the whole of the conquered world, from Asia all the way to Africa, had turned into a beleaguered and diseased patient, not unlike Europe in the 14th century.

The writer can be reached at udakdev1@gmail.com
