
Sat Mag

Around and about in Kurunegala


By Uditha Devapriya

Photographs by Manusha Lakshan and Uditha Devapriya

 

Covering 65 kilometres, the road from Colombo to Ambepussa is fairly straight. From there it turns left and right, up and down. To get to Kurunegala via Ambepussa, you have to pass Alawwa and Polgahawela. Between these regions the terrain rises, offering a fleeting glimpse of the hill country. Then the mountains recede from view, the mist settles, and the helter-skelter of urban life returns. The shops teem with life, the clock-tower looms over drivers and pedestrians, and the heat rises. From afar, the faintest outline of Ethagala catches your eye. This is your first glimpse of Kurunegala.

Ethagala (Elephant’s Rock) is a stiff climb, though we had a van at our disposal. At the very top, a fairly large statue of the Buddha looms over the region. It is perhaps the highest point of any rocky outcrop in Kurunegala. A plaque near the statue informs us that it is recent and a replica; the original lies at the Archaeological Museum in Lahore, dating back more than 1,800 years to the time of the Kushana kings. With its distinctly Hellenic touch, the statue is an enduring testament to the profound artistic renaissance that swept that region, under the Kushana kings, between the first and third centuries AD. The replica is no less majestic; in its own way, it reminds us of the close cultural links between Sri Lanka and Pakistan. Is it any coincidence that the author of that remarkable study, ‘An Enduring Friendship: Sri Lanka and Pakistan’ (Arshad Cassim, 2017), hails from this part of the country?

There was no special reason for our sudden sojourn in Kurunegala. Part of my family hails from there, but the connection was interrupted very early on. Kurunegala entranced me for other reasons: the history, the culture, the literature, and, perhaps more than any of these, the people. There were the rocks, many of them inviting onlookers to climb them even on the hottest of days. There were the temples, too many to list, and a great many unexplored. How could I resist these temptations?

Kurunegala’s importance has not been fully appreciated by scholars. It was the last of the Wayamba kingdoms after Dambadeniya and Yapahuwa, perhaps the weakest among them. Their rise coincided with the expansion of the Jaffna kingdom under the Aryacakravartis. In the 13th and 14th centuries, Sri Lanka’s irrigation civilisation was on the verge of collapse. Having stamped out particularism and unified the country under his rule, Parakramabahu I, the most resolute of the Sinhala kings, ironically ensured the deterioration of the polity after his demise. His most illustrious predecessor, Vijayabahu I, had been much more pragmatic in matters of state; for him discretion remained the better part of valour.

Under Parakramabahu I, and to a lesser extent Nissankamalla, these policies changed.

Historians describe these monarchs as ambitious, ruthless, and reckless. It is in the interests of scholarship not to pass arbitrary judgement on the past, and yet it cannot be denied that in Parakramabahu’s time the state concentrated power in itself. Not only did it stamp out any and all particularist tendencies on the part of Ruhuna, it also diverted tax revenues to the construction of agricultural works that justified its centralisation; these more or less provided the raison d’etre for its entrenchment. But in entrenching itself, the state undermined its own existence: it severely weakened any regional power it could have relied on in the event of an external invasion. Without those powers, no resort was possible.

The expansion of the Aryacakravarti dynasty in the north proved two facts: one, that the defeat of Kalinga Magha had not led to the recovery of Sinhalese power, and two, that the growth of an adversary in the north meant the kingdom had to shift elsewhere. Even before the Aryacakravartis solidified their position, it was very clear that the days of the tanks and irrigation networks in the Sinhala heartland had passed. The result was to push the kings further to the south-west. Not that their enemies to the north stopped pursuing them once they made this shift: even in Gampola, there were Tamil tax collectors at work, extracting if not forcing tributes from the land. According to an inscription at Madawala, in Harispattuwa, one collector, Ariyan of Singai Nagar or Mathandan Perumal, “cause[d] tribute to be brought from the hill country.” He collected taxes from no fewer than five villages.

The absorption of Wayamba into the Kandyan kingdom followed from its earlier position as a dependable, if weak, fortress against external invasion. In this, Dambadeniya and Yapahuwa proved their mettle better than Kurunegala, which may be why not much has been written on the latter.

Yet Kurunegala did not come into prominence only with the shift of the Sinhala heartland there. This was a place teeming with history even before that shift. The number of temples, caves, dwelling places, and ruins attests to the fact that monarchs patronised these places long before the collapse of Anuradhapura. These temples, caves, and ruins stand out perhaps more in the Vanni Hatpattuwa than elsewhere. The ruins at Toniyagala and Padigala date back to the first century BC and first century AD respectively, while the Torava Mahilava Viharaya traces its origins even earlier, to the second century BC.

Kurunegala, in fact, bore witness to some of the more peripherally important events in the history of the land. Mogallana, who rebelled against Sanghatissa in the seventh century AD, set up camp at Nikawaratiya, then known as Mahagalla; it was from there that he made his advance towards Anuradhapura. The region’s reputation for rocky outcrops came in handy as kings and chieftains turned those outcrops into formidable fortresses. Yapahuwa, for instance, was chosen as a fortress centre not by a king but by a local chieftain. It was the site of the Javanese king Candabhanu’s defeat. Climbing Yapahuwa is, of course, not as tough as one might be led to believe from this piece of historical information, but back then, an army of invaders, marching hundreds of miles from Salavata (Halawata, Chilaw) and Puttalam, may have exhausted their energies ascending its steps.

The proximity of these centres of power to the ports of Chilaw and Puttalam sealed their reputation as commercial and trading hubs. This was not really wet country, but it lay far away from the dry zones of Anuradhapura and Polonnaruwa in which the Sinhala kings had flourished. With the fragmentation of the polity into several regional powers, Kurunegala, with Dambadeniya and Yapahuwa, served as useful transit points between Kandy and the north, just as Sabaragamuwa served as a transit point between the Kandyan regions and the Maritime Provinces to the south.

As such, the cultural and religious renaissance that swept through Kandy made its presence felt in these parts too. Perhaps the most enduring tribute to the influence of Kandyan culture on Kurunegala is the Ridi Viharaya. Built in the second century BC and rebuilt, repainted, and reconstructed on the orders of Kirti Sri Rajasinghe in the 18th century AD, it attests to a revival of the arts in the kanda uda rata.

Under the British, Kurunegala gained some prominence for its agriculture and prosperity, yet it lagged behind other regions in other domains. By 1907, the North-Western Province was fifth among the provinces in size, fourth in population, and third in revenues obtained. Striking as these achievements are, what is more striking is that they were made in the absence of a proper communications network. That they were achieved at all without proper railways and roads is perhaps a testament to the region’s position as an economic hub in the time of the kings. That they went hand-in-hand with rising mortality rates, arising from epidemics and diseases, is a testament to the decline it underwent under colonial rule. It is true that coconut cultivation thrived in these parts, as did a rush for rubber that considerably improved the fortunes of elites in the early 20th century. But these achievements, if they can be called achievements at all, merely confirmed colonial biases towards particular parts of the economy.

Friendly and open, the people of Kurunegala are hospitable. There is an aura of abundance in almost every corner. Agriculture remains, for many, a part-time occupation rather than a full-time pursuit. It is difficult to escape the past here, because the past lingers everywhere: in the temples, caves, ruins, and rivers. Starting our journey in the town, we made our way across Tittawella, Wasiwewa, Panduwasnuwara, Yapahuwa, Deduru Oya, and Arankale. This is a journey one trip can never hope to complete. A land of history, Kurunegala belongs to the past. In a big way, it belongs to the present too.

The writer can be reached at udakdev1@gmail.com

 




Contagion in Ceylon: Two case studies


By Uditha Devapriya

The medical profession in Ceylon was, by the late 19th century, following patterns set down in most of the European colonies in Asia. The Civil Medical Department was set up in 1858, the Ceylon Medical College in 1870, and the Ceylon Medical Association in 1887. As with law, the practice of medicine soon came to be monopolised by the upper middle class, predominantly Burgher and Tamil.

In her extensive study of the history of the medical profession in Ceylon, Kamalika Pieris conjectures that this may have been due to the fact that unlike the Sinhala bourgeoisie, neither the Burgher nor the Tamil middle class had extensive holdings in land. In an autobiographical piece, on the other hand, Ralph Pieris observed that in colonial society, the quickest way for the middle class to guarantee their position was by ensuring their children took to professions where the duration between education and employment was brief; to give just one example, Lucian de Zilwa, divided between law and medicine, chose the latter because “I could begin to earn something the moment I received my sheepskin.”

The entrenchment of a medical elite did not contribute to a rise in health or sanitary standards in the country. The first hospitals had been built for military garrisons in Colombo, Galle, Jaffna, and Trincomalee, while the expansion of the plantation economy necessitated the establishment of hospitals to service the needs of estate labour. In both cases the aim was to serve imperial interests. Against such a backdrop, expansions of medical institutions, quarantine centres, and isolation camps were driven by the bouts of epidemics which became part and parcel of life in the region in the 19th century. The aim of this essay is to compare and contrast responses by officials to two such epidemics: smallpox and cholera.

Historically, the Sinhalese are said to have feared smallpox the most. Robert Knox wrote that none of the charms “successful to them in other distempers” could cure the illness, while Ribeiro described it as “the most dreaded disease among the natives.”

There is reason to believe that while smallpox is said to have been introduced by European adventurers, it existed in pre-colonial society; Asiff Hussein writes of Hugh Neville describing “a gigantic Telambu tree amidst a sacred grove on the site of the Ruwanveli dagaba” which may have figured in an outbreak of a disease at the time of Arahat Mahinda. The first real account of the disease, according to Thein (1988), comes to us from a fourth century Chinese alchemist called Ko Hung; it may have existed in a mild form in fifth century India.

John Holwell was the first British official to write on the illness in the subcontinent, in 1767 in Bengal. From 1779 to 1796, there was a severe smallpox outbreak in that region, while in 1802 we read of a Swiss doctor, Jean de Carro, sending samples of a vaccine to Bombay via Baghdad. The date is significant, for it was in 1802 also that, as Ananda Meegama has noted, vaccination against smallpox commenced in Sri Lanka.

However, owing to inconsistencies in the vaccination programme, not least of which was the slipshod way medical officials despatched vaccines to poorer, far-flung regions, there were severe periodic outbreaks. In 1886, vaccination was made compulsory for the first time, but even then problems in the Medical Department, set up 28 years earlier, led to the disease spreading throughout the country. To prevent it would have meant enforcing an anti-venereal programme, which transpired 11 years later with the passing of the Quarantine and Prevention of Diseases Ordinance No 3 of 1897. By the 1930s, smallpox had ceased to be a serious threat; by 1993, it had been fully eradicated.

No less feared than smallpox was cholera, which like smallpox goes back many centuries in the country and the subcontinent. The Tamil goddess Mariamman is considered symbolic of both smallpox and cholera, while a reference in the Mahavamsa to a yakinni who, in the reign of Sirisangabo, caused those who came under her curse to have red eyes and die the moment they came into contact with a patient may indicate an outbreak as early as third-century Sri Lanka. I noted this in my SAT MAG column last week.

Historical sources tell us of a reference in ninth century Tibet to a particularly violent illness (the first signs of which included “violent purging and vomiting”), although scholars doubt the authenticity of these texts. In any case, not for nothing did colonial officials call it the Asiatic cholera, given its ancient origins in the lower Bengal where, it must be said, we come across our earliest sources for the cult of a cholera goddess in the subcontinent. On the other hand, the destruction of the local economy, and unnatural changes in agriculture forced on farmers by British authorities, had an impact on these outbreaks: with irrigation schemes destroyed and neglected, water scarcities could only result in pestilence.

There were six cholera pandemics between 1817 and 1917. All of them were aggravated by disruptive economic and social changes unfolding in the periods in which they transpired. All six originated in the Bengal, inevitably, given the genesis of the disease in tropical, depressed climates, and they diffused beyond the region to as far as Russia, the US, and Latin America. Leaving behind heavy fatalities, each was deadlier than the last.

The first case in Sri Lanka was reported in 1818 in Jaffna. However, as Ananda Meegama has noted, the disease never became endemic in the island. It turned into a serious issue only after British colonisers, having appropriated Kandyan lands under the Crown Lands Encroachment Ordinance and Waste Lands Ordinance of 1840, imported labour from South India. The latter led to a widespread contagion: case numbers jumped from 16,869 between 1841 and 1850 to 35,811 in the following decade, with fatality rates rising from 61% to 68%.
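The case counts and fatality rates quoted above imply rough death tolls. A minimal sketch of that arithmetic follows; the input figures are those given in the text, while the derived totals are back-of-the-envelope estimates for illustration, not source data.

```python
# Back-of-the-envelope estimate: deaths implied by the cholera case
# counts and case-fatality rates quoted in the text. The derived
# totals are illustrative, not figures from the original sources.

def implied_deaths(cases: int, fatality_rate: float) -> int:
    """Estimate deaths as cases multiplied by the case-fatality rate."""
    return round(cases * fatality_rate)

deaths_1841_50 = implied_deaths(16_869, 0.61)  # roughly 10,290
deaths_1851_60 = implied_deaths(35_811, 0.68)  # roughly 24,351
print(deaths_1841_50, deaths_1851_60)
```

On these assumptions, the implied toll more than doubled between the two decades, which is what the rising case numbers and fatality rates together suggest.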

Until the passing of the Quarantine Ordinance in 1897 — by which time five of the six cholera epidemics had passed through and devastated the country — officials took only haphazard measures to control the immigration of South Indian labour. Estate workers, on their way to the plantations, took the Great North Road to the hill country, which brought them into contact with locals in Jaffna and Mannar; this led to several epidemics between 1840 and 1880, after which the number of cases and fatalities began to decline.

Still, officials remained oblivious to the need to quarantine estate workers until much later, a problem compounded by the fact that many such workers, even after restrictions had been imposed, came disguised as traders, an issue attributable as much to the immigrants as it was to “the partial manner” in which officials enforced the quarantine.

Two years after the passage of the Ordinance, colonial administrators shut down the Great North Road to estate labour. Those who arrived in the country were taken to Ragama. Yet despite such measures, other problems cropped up: as Kamalika Pieris has pointed out, there was widespread hostility to Western medicine, particularly in economically backward regions which doctors never attempted to reach. Contaminated water became a serious issue even in Colombo, as did unplanned urbanisation, while quarantines led those fearing for their lives to flee to other districts, spreading the disease even more.

De Silva and Gomez (1994) have argued that one of the contributing factors to the recession of these diseases was the advances made among the local population in sanitation. The first soap was imported to the island in 1850; until then, only vegetable oils had been widely used. The setting up of various committees to probe these outbreaks would have been another factor. Despite this, though, the fact remains that while the government had the wherewithal to expand into less well-off regions, it chose, for crude reasons of economy, not to.

By the 1930s, at which point cholera had been significantly contained — in 1946, two years before independence, only two cases cropped up — the rise of a radical left movement among professionals, including doctors and lawyers, led to a more proactive approach being taken to pandemics and contagions. The apogee of this trend, no doubt, was the malaria epidemic of the 1930s, in which the work of volunteers was done not by the English-speaking medical elite but by a group of young leftists, among them perhaps the first Western doctor to don the national dress, S. A. Wickramasinghe. But that’s another story.

The writings of S. A. Meegama, Asiff Hussein, Kamalika and Ralph Pieris, Robert Pollitzer, Ariyaratne de Silva, Michael G. Gomez, and M. M. Thein were used for this article.

The writer can be reached at udakdev1@gmail.com


The Remedy. Remedy?


By Dr F E Dias

Introduction

A solution to the Covidian problem was eagerly awaited – and emerged in a legion of vaccines. The development of vaccines against coronaviruses has been difficult due to the vaccine-induced enhanced disease responses evident in animal studies. Antibody-dependent enhancement may be involved in the clinical observation of increased severity of symptoms associated with early high levels of SARS-CoV-2 related antibodies in patients. The gene-based vaccines dispense with the need for the wild virus and instead get the inoculated person to produce multi-trillions of its fusogenic spike protein, which is itself a pathogen that biodistributes and can potentially harm parts of the body far from the injection site. And yet, many experimental vaccine products were hastily unleashed globally before the pivotal Phase 3 trials had been concluded or validated.

Front Runners

Israel and the Seychelles had high proportions of their populations vaccinated before other nations caught up. Let us compare sets of data that are not usually presented synchronously, such as in Figures 1 & 2, to observe what happened and is happening in Israel and Seychelles.

EU

The 450-million-strong EU, comprising 60% of Europe’s estimated 750 million inhabitants, reported administering 522.4 million doses of vaccines as of August 14th 2021, with over 75% of its citizens receiving at least one dose of an experimental CoViD vaccine.

The EudraVigilance system is the EU-wide database for recording vaccine injury reports, as well as other medicine-induced injuries, corresponding to the US Vaccine Adverse Events Reporting System (VAERS). EudraVigilance data indicate, from the beginning of the vaccination campaign last year through to August 14th, over two million (2,074,410) reports of vaccine-related injuries, including 21,766 deaths across the 27 member states.

Approximately half of all reports (1,021,867) were of serious injuries, classified by this EU agency as corresponding to “a medical occurrence that results in death, is life-threatening, requires inpatient hospitalisation, results in another medically important condition, or prolongation of existing hospitalisation, results in persistent or significant disability or incapacity, or is a congenital anomaly/birth defect.” The highest rate of problematic developments among the four emergency-authorised vaccines occurred following the use of the Oxford-AstraZeneca product, which is reportedly linked to 947,675 injuries within the EU.

Reports of thrombotic and thromboembolic post-vaccination events among the vaccinated triggered concern regarding post-vaccination donation of substances of human origin (SoHO), such as blood, plasma, organs and tissues, which may contain the pathogenic spike protein of recently vaccinated donors.

UK

A groundbreaking paper by the prestigious Oxford University Clinical Research Group, published on August 10th in The Lancet, found that vaccinated individuals carry more than 250 times the load of Covid-19 viruses in their nostrils compared to the unvaccinated. The authors suggest that while moderating the symptoms of infection, the vaccine allows vaccinated individuals to carry unusually high viral loads without becoming ill at first, potentially transforming them into pre-symptomatic super-spreaders, and that this may contribute to the post-vaccination surges in heavily vaccinated populations globally.

Public Health England (PHE), England’s public health policy department, released a report on August 6th detailing the spread of the Delta variant of the virus, including hospitalisations and deaths where Covid-19 was a factor, between February 1st and August 2nd 2021.

It shows that 65% of hospitalisations and deaths involving Covid-19 are among those who have had at least one dose of the experimental vaccines.  Though the unvaccinated category accounts for around half of overall Delta Covid-19 infections in England, the rate of death in this group is lower than among those who received vaccines.

Considering the fully vaccinated group on its own, the PHE data show that 1,355 of 47,008 identified infections led to hospital admission, which is 2.9%, suggesting that the double-jabbed face a nearly 50% greater chance of being hospitalised if they contract CoViD-19, compared with those who have not been vaccinated. Further, those who contracted the virus within 21 days of their first shot demonstrated a 0.97% hospitalisation rate, and those who tested positive after three weeks from their first shot demonstrated a 1.14% hospitalisation rate, indicating that the likelihood of hospitalisation is greater for the double-jabbed when compared with the single-jabbed.
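The 2.9% figure quoted above follows directly from the admission and infection counts reported in the text; a minimal arithmetic check, using only those numbers:

```python
# Arithmetic check of the hospitalisation rate quoted from the PHE
# report: hospital admissions as a percentage of identified
# infections. The input numbers are those given in the text.

def hospitalisation_rate(admitted: int, infections: int) -> float:
    """Return hospital admissions as a percentage of infections."""
    return round(admitted / infections * 100, 1)

rate = hospitalisation_rate(1_355, 47_008)
print(rate)  # 2.9, matching the figure cited for the fully vaccinated group
```

The same one-line calculation applies to the 0.97% and 1.14% rates cited for the single-jabbed groups, though the underlying counts for those groups are not given in the text.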

North America

The VAERS, jointly run by the Food and Drug Administration and the Centers for Disease Control and Prevention, is the primary US government-funded system for reporting adverse vaccine reactions in the US. Not all adverse events are reported into it by physicians, and the VAERS total is understood to be a substantial underestimate. While 328.9 million Covid vaccine doses had been administered as of July 2nd 2021, between December 14th 2020 and July 2nd 2021 a total of 438,441 adverse events were reported to VAERS, including 9,048 deaths, 22% of which occurred within 48 hours of vaccination; 37% occurred in people who became ill within 48 hours of being vaccinated.

2,678 pregnant women reported adverse events related to CoViD vaccines, including 994 reports of miscarriage or premature birth. VAERS also reports the deaths of two breast-feeding babies in March and July 2021 due to blood clots subsequent to the mother’s reception of the vaccine. There were 4,456 reported cases of Bell’s Palsy and 398 reports of Guillain-Barré Syndrome, as well as 121,092 reports of anaphylaxis, 8,256 of blood clotting disorders, and 1,796 cases of myocarditis and pericarditis. The VAERS data showed that 22% of deaths were related to cardiac disorders.

By the end of August 2021, the total deaths reported had exceeded 13,000 and reports of harm exceeded 600,000. With over 5,000 reports of myocarditis as of August 20th, Pfizer added to their product fact sheet that “post-marketing data demonstrate increased risks of myocarditis and pericarditis, particularly within seven days following the second dose. The observed risk is higher among males under 40 years of age than among females and older males. The observed risk is highest in males 12 through 17 years of age”, admitting also that potential long-term sequelae are unknown.

The Public Health Agency of Canada (PHAC) in July estimated the rate of vaccine-related blood clotting in Canadians who have received the AstraZeneca vaccine, and said there have been 27 confirmed cases to date in Canada, with five deaths among those cases.

The blood-clotting events that get reported are the larger ones, which can be detected using MRI or CT scans. However, with the RNA and viral vector vaccines, there is a new phenomenon of micro blood clots, diagnosable via D-dimer tests. These microscopic blood clots could be caused by the vaccine-generated spike proteins altering the vascular endothelium, particularly affecting its interactions with platelets in capillaries. Some parts of the body, like the brain, spinal cord, heart and lungs, cannot regenerate when their tissues are damaged by blood clots. Particularly in the lungs, this may cause pulmonary hypertension that may lead to heart failure years later.

In Brief: Fatality, Natural Immunity and Treatment

The fatality rate in the age group up to 20, among those who actually become infected, is 0.003%, which suggests a five times greater likelihood of death in a road traffic accident than from an infection of CoViD; the rate for those aged 20-40 is 0.02%. A study published on August 24th 2021 by Israeli researchers, looking at over 670,000 vaccinated and unvaccinated individuals, concludes “that natural immunity confers longer lasting and stronger protection against infection, symptomatic disease and hospitalisation caused by the Delta variant of SARS-CoV-2, compared to the BNT162b2 two-dose vaccine-induced immunity”. Dr Robert Malone, the virologist and immunologist credited with pioneering RNA transfection, has stated that natural immunity is twenty times more protective than the vaccines. India achieved success by encouraging anti-viral medication.

Sri Lanka

Sri Lanka’s vaccination and death data are shown in Figures 4 and 5.

Conclusion

Correlations in Sri Lanka’s data are apparent. Even if the preceding database reports and scientific theory are disregarded, and the global experience is ignored, it is not unreasonable to suppose that the relationships between daily deaths and daily vaccinations are causal, subject to the observed time lag. It is remarkable that the spike in the total number of deaths corresponds to the spike in the total number of vaccinations. It is contrary to expert wisdom that, among the Covid deaths in Sri Lanka, the proportion of deaths of vaccinated people is steadily increasing as the vaccination campaign progresses.

There is evidence that the immediate and potential sequential harm due to these vaccines may exceed the risks associated with the disease.

Cytokine storm model of Castelli et al, Front. Immunol, 2020 (above)


Brief history of plagues and pandemics


By Uditha Devapriya

By the 14th century, trade routes between the East and West had made it easier for pandemics to spread, while conquests by the Spanish and the Portuguese in the 15th and 16th centuries would introduce several diseases to the New World. Trade and colonialism hence became, by the end of the Renaissance, the main causes of plague, which scientific advancement did little to combat, much less eliminate: a physician in the 17th century would have been as baffled or helpless as a physician in the 14th or 15th in the face of an outbreak.

No doubt rapid urbanisation and gentrification had a prominent say in the proliferation of such outbreaks, but among more relevant reasons would have been poor sanitary conditions, lack of communication and accessibility, and class stratifications which excluded the lower orders – the working class as well as peasants in the colonies – from a healthcare system that pandered to an elite minority. By 1805, the only hospitals built in Ceylon were those serving military garrisons in places like Colombo, Galle, and Trincomalee.

Among the more virulent epidemics, of course, was the notorious plague. Various studies have tried to chart the origins and the trajectory of the disease. There were two outbreaks in Rome: the Antonine Plague in 165 AD and the Justinian Plague in 541 AD. With a lack of proper inscriptional evidence, we must look at literary sources: the physician Galen for the Antonine, and Procopius and John of Ephesus for the Justinian.

Predating both of these was an outbreak reported by the historian Thucydides in Athens in 430 BC, but scholars have ascertained that this was less a plague than a smallpox contagion. In any case, by 541 AD plague had become a fact of life in the region, and not only in Pagan Rome; within the next few years, it had spread to the Arabic world, where scholars, physicians, and theologians tried to diagnose it. Commentaries from this period tell us of theologians tackling a religious crisis borne out of pestilence: in the beginning, Islamic theology had laid down a prohibition against Muslims “either entering or fleeing a plague-stricken land”, and yet by the time these epidemics ravaged their lands, fleeing an epidemic had been reinterpreted as acting in line with God’s wishes: “Whichever side you let loose your camels,” the caliph Umar I told Abu Ubaidah, “it would be the will of God.” As with all such religious injunctions, this changed in the light of an urgent material need: the prevention of an outbreak. We see similar modifications in other religious texts as well.

Plagues and pandemics also feature in the Bible. One frequently cited story is that of the Philistines who, having taken the Ark of the Covenant from the Israelites, were struck by a disease from God which “smote them with emerods” (1 Samuel 5:6). J. F. D. Shrewsbury noted three clues for the identification of the illness: that it spread from an army in the field to a civilian population, that it involved the spread of emerods in the “secret part” of the body, and that it compelled the making of “seats of skin.” The conventional wisdom had long been that this was, as with Rome in 541 AD, an outbreak of the plague, but Shrewsbury, on the basis of the three clues, ascertained that it was more plausibly a reference to an outbreak of haemorrhoids. On the other hand, the state of medicine being what it would have been in Philistia and Israel, lesions in the “secret part” (the anus) may have been construed as a sign of divine retribution in line with a pestilence: to a civilisation of prophets, even haemorrhoids and piles would have been comparable to a plague sent from God.

Estimates for population loss from these pandemics are notoriously difficult to determine. On the one hand, literary texts, being the only sources we have as of now, record how civilians conducted their daily lives despite the pestilence, while on the other, the writers of these texts resorted to occasional if not infrequent exaggeration to emphasise the magnitude of the disease. Both Procopius and John of Ephesus are agreed on the point, for instance, that the Justinian Plague began with hallucinations, which gave way to fever, languor, and, on the second or third day, to bubonic swelling “in the groin or armpit, beside the ears or on the thighs.” However, there is another account, by Evagrius Scholasticus, whose record of the outbreak in his hometown of Antioch was informed by personal experience of a disease he contracted as a schoolboy and to which he later lost a wife, children, grandchildren, servants and, presumably, friends. It has been pointed out that this may have injected a subjective bias into his account, but at the same time, given that Procopius and John followed a model of the plague narrative laid down by Thucydides centuries before, we can consider Evagrius’s the more original if not more accurate record, despite prejudices typical of writers of his time: for instance, his (unfounded) claim that the plague originated in Ethiopia.

Much ink has been spilt in the debate over where the plague originated. A 2010 study concluded that the bacterium Yersinia pestis evolved in, or near, China. The historical evidence marshalled for this theory points to the fact that, by the time of the Justinian Plague, the Roman government had solidified trade links with China over silk. Popular historians contend that the Silk Road, and later the Zheng He expeditions, may have spread the contagion through the Middle East to southern Europe, a line of thinking to which even the French historian Fernand Braudel subscribed in his work on the history of the Mediterranean. However, as Ole Benedictow pointed out in his response to the 2010 study, "references to bubonic plague in Chinese sources are both late and sparse", a criticism made earlier, in 1977, by John Norris, who observed that literary references to a Chinese origin of the plague were likely informed by ethnic and racial prejudices; a similar animus prevailed among early Western chroniclers against what they perceived as the "moral laxity" of non-believers.

A more plausible thesis is that the bacterium emerged around 5,000 or 6,000 years ago, in the Neolithic era. A study conducted two years ago (Rascovan 2019) posits an original theory: that the oldest known Yersinia pestis genome, recovered from a 4,900-year-old site in Sweden, represents the first discovered and documented case of plague, "potentially contributing" to the Neolithic decline, the reasons for which "are still largely debated." Like the 2010 study, however, this one has its pitfalls, among them the lack of the sort of literary sources which, however biased they may be, we have for the Chinese genesis thesis. It is clear, nevertheless, that the plague was never at home in one specific territory, and that despite the length and breadth of the Silk Road it could not have made inroads into Europe through the Mongol steppes alone. To contend otherwise is not only to rebel against geography, but also to ignore pandemics whose origins were limited neither to East and Central Asia nor to the Middle East.

Such outbreaks, moreover, were not unheard of in the Indian subcontinent, even if we lack evidence of when, where, and how they occurred. The cult of Mariamman in Tamil Nadu, for instance, points to both cholera and smallpox epidemics in the region, since she is venerated as a guardian against both. "In India, a cholera-like diarrheal disease known as Visucika was prevalent from the time of the Susruta", an Indian medical treatise. The same treatise contains the following passage, in which the illness described seems to be the plague:

Kakshabhageshu je sfota ayante mansadarunah
Antardaha jwarkara diptapapakasannivas
Saptahadwa dasahadwa pakshadwa ghnonti manavam
Tamagnirohinim vidyat asadyam sannipatatas

Or in English, “Deep, hard swellings appear in the armpit, giving rise to violent fever, like a burning fire, and a burning, swelling sensation inside. It kills the patient within seven, 10, or 15 days. It is called Agnirohini. It is due to sannipata or a deranged condition of all the three humours, vata, pitta, and kapha, and is incurable.”

The symptoms certainly point to plague, even if we cannot immediately jump to that conclusion. The reference to death within a week to 15 days is indicative of modern bubonic plague, while the burning sensation and violent fever suggest an illness that rapidly terminates in death. The Susruta Samhita, from which this passage is taken, was written down in the ninth century AD. We have no comparable tract in Sri Lanka from that time, but the Mahavamsa tells us that in the third century AD, during the reign of Sirisangabo, there was an outbreak of a disease whose symptoms included a reddening of the eyes. Mahanama thera, no doubt attributing it to the wrath of divine entities, personified the pandemic as a yakinni called Rattakkhi (Red Eye). Very possibly the illness was a cholera epidemic, or even the plague.

China, India, and medieval Europe aside, the second major wave of pandemics came well after the Middle Ages and the Black Death, during the Renaissance, when conquerors from Spain and Portugal, having divided the world between their two countries, introduced among the natives of the lands they sailed to diseases to which they themselves had acquired immunity. Debates over the extent to which New World civilisations were destroyed and decimated by these diseases continue to rage. The first attempts to determine pre-colonial populations in the New World were made in the early 20th century; the physiologist S. F. Cook published his research on the intrusion of Old World diseases into the Americas from 1937. In 1966, the anthropologist Henry F. Dobyns argued that most studies understated the numbers. In the 1930s, when research on the topic began, conservative estimates put the pre-Columbian North American population at one million; Dobyns raised it to 10 million and, later, 18 million, most of whom, he concluded, were wiped out by the epidemics.

And it did not stop there. These were followed by outbreaks of diseases associated with the "white man", including yaws and cholera. Between 1817 and 1917, for instance, no fewer than six cholera epidemics devastated the Indian subcontinent. Medical authorities were slow to act, even in Ceylon, for the simple reason that by the time of the British conquest, filtration theory had deemed it prudent that health in the colonies, like education, be catered to a minority. Doctors thus did not find their way to the far-flung places suffering most from cholera, while the epidemics were fanned further by the influx of South Indian plantation workers after the 1830s. Not until the 1930s could the authorities respond properly; by then the whole of the conquered world, from Asia to Africa, had turned into a beleaguered and diseased patient, not unlike Europe in the 14th century.

The writer can be reached at udakdev1@gmail.com
