The Pandemic Is a Great Time to Learn How to Reduce Waste

Now is perhaps the perfect time to reevaluate our actions and roles as consumers. Owing to the pandemic, many of us have had to take pay cuts; some have lost jobs; others have lost jobs and found new ones. In general, I sense some semblance of frugality taking hold, and this, I’m inclined to believe, is a good thing.

So here are some very simple ways to reduce waste. Recycling is one of the best ways to accomplish this, but it is most effective when coupled with other conscientious practices.

Reduce and phase out the use of plastic bags

It is well known that plastic bags are extremely harmful to the environment, yet they are omnipresent and seemingly unavoidable, especially at supermarkets and stores. Carrying reusable cloth bags is a highly effective way of reducing plastic waste. What’s more, cloth bags are inexpensive, which means you can buy bags of different sizes and varying degrees of sturdiness to suit different purposes.

Reduce the use of paper napkins and tissues

Use moisture-absorbing cotton handkerchiefs instead. It might not be a good idea to use one outdoors just yet, but using a handkerchief indoors is safe so long as we ensure basic hygiene and respiratory etiquette.

A good handkerchief absorbs moisture better than paper napkins and tissues, and it does not leave wet scraps of paper on your hands or face. Remember to wash it regularly.

Change how you pack takeouts

This may not work until we’re done with the pandemic, and it may be a little tricky because you have to convince restaurants to change how they pack your takeout. If takeouts are a regular feature of your life, invest in a sturdy food thermos. Not only will this help you reduce waste, but it will also keep your takeout hot for longer.

Don’t waste food, but if you do…

Excess food almost always ends up as waste. However, with just a little effort, excess food can be composted and used to enrich the soil in your backyard. There are plenty of easy-to-follow guides online that teach you how to compost effectively. But remember: you don’t have to compost if you don’t waste food. Invest in airtight, refrigerator-friendly containers to store excess food.

Since the coronavirus outbreak, some people have been arguing that all humans should become vegetarians to prevent similar outbreaks in the future. This doesn’t strike me as a feasible proposal. First, a vegetarian diet doesn’t guarantee the prevention of outbreaks. Second, it is almost unthinkable to produce the essentials of a vegetarian diet on the massive scale necessary to feed the global population. Additionally, an abrupt shift to vegetarianism would engender adverse health outcomes for many.

Nonetheless, this argument does shed light on the fact that we waste plenty of food. If more people are to experiment with a vegetarian diet, it is essential to reduce waste. Unchecked consumption and careless handling of food are two problems we simply must address at once. By doing so, we can lessen the burden on farmers, food producers, and distributors: that is, we can narrow the supply-demand gap simply by preventing waste. Not to mention that there will be more food available for people–a step closer to ensuring distributive justice.

Buy smart

To be a conscientious consumer is to make smart choices. For instance, fountain pens are more durable than disposable ballpoints, and they also produce less waste. Think of how many non-biodegradable refills you won’t be buying if you don’t buy a ballpoint. Similarly, use durable razors rather than disposable ones. In general, prefer durable products over use-and-throw ones; doing so can reduce waste by a significant margin. Likewise, mend products that can be repaired, and, more importantly, buy products that can be mended, restored, or repaired in the first place.

The Fishing Cat, A Lovable Rascal

Plenty has been said about the pandemic; we have either become used to the new normal, or we are accepting that things as they stand represent the new normal for the foreseeable future. This month I’m taking a break from writing about my preferred topics. No posts about academic writing, interdisciplinarity, ethics, etc. This post instead focuses on an all-too-charming creature known as the fishing cat—an endearing, mischievous rascal. Animals have always cheered me up, and I hope this post has a similar effect on you.

If cartoons are to be believed, cats love nothing more than fish. Yet cats are not considered good swimmers, and some dislike water altogether. Contrary to popular opinion, however, most cats are decent swimmers. Enter the fishing cat, an exceptional swimmer; in fact, the fishing cat is so good at swimming that it hunts primarily in water. It not only hunts in water but also genuinely enjoys swimming. What’s more, fishing cats are extremely playful: they are notorious for attempting to grab ducks’ feet underwater. At the same time, fishing cats can be extremely aggressive. They are, after all, wild cats.

[Image: A pondering fishing cat. Image credit: DC Chadwick]

Some Facts about the Fishing Cat

Fishing cats are mainly found in South Asia, especially in India, Nepal, Bangladesh, and Sri Lanka. They are stealthy and difficult to spot, which is understandable given that they are primarily nocturnal. In addition to fish, these cats also eat birds, insects, and small rodents. Although they are mostly found in wetlands, some fishing cats have been spotted at high altitudes, especially in the Himalayas.

That said, fishing cats tend to thrive in most kinds of wetlands, especially near fast-moving water bodies. They have also been known to live rather comfortably in captivity, especially in zoos and well-preserved national parks in North America. These cats are in dire need of protection, and monitoring them in captivity allows us to facilitate breeding. They do, however, run the risk of losing their predatory edge in such settings; it typically requires extensive effort on the part of zoo personnel to reacquaint fishing cats with their natural instincts in water. Fishing cats in captivity have been known to develop the ability to catch fish with both their strong paws and their teeth.

The Conservation Status of Fishing Cats

The IUCN Red List of Threatened Species lists the fishing cat as Vulnerable. The drastic destruction of wetlands in South Asia has threatened the region’s fishing cat population: wetlands are converted into either agricultural plots or human settlements. Rapid urbanization has also begun to threaten South Asian fishing cats. Other factors include unregulated fishing, overfishing, and hunting.

Fishing Cats and Urbanization

Although fishing cats prefer wetlands, they have demonstrated their ability to adapt to urbanization by seeking new hunting grounds. For instance, this article details the ways in which Sri Lanka’s fishing cats have begun hunting in urban landscapes in response to the destruction of wetlands and rapid urbanization. This adaptation is all the more remarkable since fishing cats are highly reclusive creatures. Their ability to spot live, consumable fish in urban settings also speaks to their tenacity and guile.

Did You Know?

Fishing cats have webbed paws, which greatly enhance their swimming ability. They also use their short, furry tail as a rudder underwater.

In captivity, mothers sometimes reject their kittens. However, zoos that raise fishing cats typically have nurseries to take care of rejected kittens.

The Coronavirus Pandemic: What Can We Learn From This?

The coronavirus pandemic has enforced a standstill like never before. Social distancing, self-isolation, and quarantining are being practiced on an unprecedented scale. Yet many are wondering whether a lockdown could be as harmful as the disease it aims to control, or more so. Is a lockdown especially harsh on the homeless, the poor, the unemployed, and the vulnerable? It doubtless is. Shouldn’t we then enlist more services and more workers to ensure essentials and healthcare are distributed evenly? Shouldn’t we revive the economy just a little so that the most vulnerable can be safe?

These are important questions, and asking them does not mean that one thinks a lockdown is useless. On the contrary, it is only by asking these questions that we can even begin to figure out our approach to public life. These questions are especially important since no country–not even Germany, Sweden, the US, or the UK (generally considered examples of good public healthcare systems; they’re not necessarily exemplary, but they do better than most other countries)–can boast a robust epidemic management system. It must be mentioned, though, that Sweden and Germany have been outliers in terms of how they’ve handled the situation. Fortunately, both countries have also been somewhat spared by the uncontrollables, which can be extremely punishing even for states and nation-states with somewhat robust healthcare systems–as has been the case with New York and Italy respectively.

Why We Need an Epidemic/Pandemic Management System

As always, the poor and the vulnerable are bearing the brunt of poor public planning–particularly the lack of an epidemic management system (EMS). In its absence, primary healthcare workers are being forced to tackle a landslide of serious infections across countries. A well-drilled EMS would necessarily bring together medical know-how and military efficiency. The latter is especially useful when it comes to the even distribution of resources and essentials–that is, to ensuring distributive justice. An EMS might also give us more room for a lockdown: if public distribution can be sustained during epidemics or pandemics, we can afford to put economic activity on hold a little while longer.

If this sounds like wishful thinking, we need only remind ourselves that governments are being forced to think of some sections of the population as expendable. More often than not, it’s the poor and the vulnerable we don’t mind sacrificing. An EMS, on the other hand, would have allowed governments to take care of their neediest more efficiently and humanely.

Biologists, virologists, philanthropists, even filmmakers have been exhorting us not just to spend more on healthcare but also to spend on different aspects of healthcare. One can only hope that the coronavirus pandemic forces governments to constitute specialized branches for the management of similar outbreaks and to fund them adequately. To this end, governments must also spend less and less on military and defense. Space research, too, is terribly cosmetic when no nation-state can guarantee universal healthcare to its citizens.

Cutting military and defense spending is not as bad a prospect as uber-nationalists might think. If biological warfare is the only feasible form of warfare–what with the unthinkable ruthlessness of nuclear warfare–then an EMS can also serve military interests. One shouldn’t need that incentive to constitute an EMS, though; simple prudence should be enough.

Coronavirus Impact: Some (Just Some) Good News

The coronavirus pandemic has left most of us quarantined. Healthcare workers, delivery executives, sanitation workers, and a slew of public health officials are working tirelessly to ensure those in quarantine remain safe. There is a general air of despair, but there is good news too. First, the pandemic has thankfully–some say fittingly–not directly affected the health of non-human animals. Reports of strays starving in abandoned streets are heartbreaking nonetheless. Second, we are doing less direct harm to the environment now that most of us are indoors. Among other things, this mass quarantining experiment must surely have alleviated “light pollution.”

Some Facts About Light Pollution

While it was fairly straightforward to predict that the Industrial Revolution would precipitate air, water, and land pollution, it took us quite a while to realize that even light–artificial light, of course–would turn out to be a major pollutant. Light pollution is the result of the unchecked, excessive use of artificial light, and urbanization is cited as one of its main causes. It has also been found that artificial light can alter natural conditions even when used judiciously. Scientists claim that light pollution is an especially complex phenomenon, one that may have more adverse effects than presently known.

Definition

Light pollution is defined mainly in terms of the effects of artificial light on the environment. Broadly speaking, artificial light engenders two effects: (i) it invariably alters natural light levels in the environment, and (ii) it can degrade photic habitats. It is worth noting that alteration does not necessarily mean degradation in this context. Moreover, artificial light inevitably competes with starlight in the night sky; as a result, the Milky Way is no longer easily observable with the naked eye. Light pollution also adversely impacts human and non-human health, astronomical undertakings, and ecosystems. The mass quarantine has certainly led to a decrease in the amount of artificial light in the environment, and those of us with the luxury to gaze up at the night sky should seize this opportunity. We will be going back to the bustle soon enough, sadly. So make the most of this brief period of enforced quiet.

Common Sources of Light Pollution

The term “artificial light” refers to all manner of human-made light, including safety lights on a cyclist’s bib, traffic lights, indoor and outdoor lighting, and even torchlights. However, extreme light pollution is mainly caused by larger artificial lights, such as advertisement hoardings, floodlights, and lights that line the exterior of buildings. Unsurprisingly, light pollution is especially extreme in highly industrialized regions such as North America, Europe, Japan, the Middle East, and parts of North Africa. Over-illumination, or the excessive use of light, is the biggest cause of light pollution in these regions.

How Does Light Pollution Affect Human Health?

How does artificial light affect life on earth? For one, it is known to cause sleep deprivation. In fact, a phenomenon known as “light trespass”–which occurs when strong light enters one’s property from the outside–is one of the biggest causes of artificial-light-driven sleep deprivation. Light trespass can also be a consequence of over-illumination. What’s more, over-illumination depletes oil reserves, since it takes a lot of oil not just to manufacture lighting devices but also to power them. Artificial lighting, therefore, is not only a direct product of reckless consumption but also a significant driver of environmental degradation. Worse, according to estimates, 30-60 percent of the energy consumed in lighting is unneeded or excessive.

Cannibalism in the Animal Kingdom

Cannibalism is a common occurrence in the animal kingdom; as many as 1,500 species have been known to practice it. Defined as the act of consuming part or all of another individual of the same species, cannibalism is especially prevalent among aquatic organisms. Interestingly, however, it is not limited to carnivorous or omnivorous species; it is also practiced by some herbivores. For instance, scientists recently observed two hippos feeding on the carcass of another hippo in South Africa. This is particularly interesting not only because the hippo is a herbivore but also because this is only the second recorded instance of cannibalism involving hippos.

Sexual Cannibalism and Size-Structured Cannibalism

Some animals also practice sexual cannibalism, a form of cannibalism in which the male is consumed by the female before, during, or after copulation. Sexual cannibalism is particularly common among invertebrates, especially spiders, and among some spiders it tends to enhance the offspring’s chances of survival. Size-structured cannibalism illustrates just how common cannibalism is in the animal kingdom. In this type of cannibalism, older and larger individuals consume smaller and younger individuals of the same species. It is prevalent mainly in size-structured animal groups—that is, in groups organized by animals’ size, age, and level of maturity—where it can account for nearly 95% of total mortality. Sometimes, adult animals consume their own offspring: an instance of filial cannibalism, which is also a type of size-structured cannibalism.

Matriphagy

Spiders have quite the penchant for cannibalism. While, as discussed above, some female spiders consume their sexual partners, some young spiders consume their mothers. Matriphagy is the act of consuming one’s own mother, and it is quite common among invertebrates. Interestingly, both sexual cannibalism and matriphagy tend to enhance the survival rates of young spiders.

Cannibalism in the animal world is not merely a response to starvation or extreme stress. It may be a necessary process for ensuring the survival of a species: it serves to reduce unhealthy competition for survival and to eliminate weaker members of a species.

Historiography, Objectivity, and More

Recently my students and I have been discussing questions of historicity, historiography, and objectivity. Central to this exploration was our focus on the fiction-fact dichotomy. Indeed, one of the questions that emerged as we pressed on was whether we should hold fiction and fact in such stark contrast. The discussions were deeply fruitful, and this post is a very short summary of what transpired.

Writing History and How History Is Written

There is of course the oft-repeated but altogether true adage: history is written by the victors. But even this overlooks the sociopolitical clout of the dominant, of the victors. A significant part of domination involves ensuring compliance from the subordinated–forcing them, among other things, to accept fabricated versions of history. That is, the subordinated are not just forced to accept a valorized version of history; they are also typically barred from contesting it. To this end, those in power tend to rely on legal and coercive measures to stifle any resistance.

Over time, versions of history compiled for the benefit of the dominant become institutionalized and even appear in textbooks. In some instances, history aids the transformation of dangerous, violent personalities into generous, other-regarding benefactors–as is the case with Cecil Rhodes, and indeed with colonialism in general.

How, then, can these versions of history–compiled to be popular and dominant–claim to be fair and objective? Indeed, the trouble is that the socially and politically dominant largely determine the conditions of fairness and objectivity–not just in relation to historiography but also in broader contexts, such as the conditions of scientific objectivity and rationality and, by extension, the criteria for what qualifies as fact and fiction.

The Fiction-Fact Question

Central to this inquiry is the suggestion that the truths we produce are necessarily partial–even scientific truths. Admittedly, this is a contentious claim, but it allows us to think fruitfully and critically about fact and fiction.

In other words, it is impossible to produce all-encompassing truths. No matter how comprehensive an account, it will necessarily be limited and partial. The limitations are mostly imposed by our own cognitive finitude as well as by our biases, especially biases we are not aware of. This is not in fact a bleak account of what it is to produce truth and knowledge. If anything, it illustrates the importance of constant critical scrutiny, a quality without which progress–especially scientific progress–might become endangered.

It must also be noted that this argument does not suggest that it is futile to try to present a complete, comprehensive picture of a given issue or phenomenon. On the contrary, it argues that we ought to try to present as accurate an account as possible, and to do that, we must acknowledge that the truths we produce can only be partial. This means we must also call “objective” methodologies into question, which in turn means ascribing importance and legitimacy to methods and techniques that fall beyond the scope of the scientific method. Oral history is just one relevant example in this context.

Ultimately, it is essential to ask what happens to marginalized voices–both in the context of history and more generally. Why do some voices get marginalized, and some amplified? These are some of the most elementary questions we should be asking when appraising works or accounts that claim to be historical.

Rationality and Animals: Humans and Non-Humans

Are humans animals? Or are we superior–that is, are we the only species worthy of the “rational animal” title? To answer these questions, we must examine definition(s) of rationality. This post argues that the animal-human distinction, based as it is on the view that only humans are capable of rational thought, breeds complicity and speciesism, in turn enabling us to excuse, if not condone, animal cruelty and other similar acts.

Conscious thought, one of the most important characteristics of rationality, is a good starting point in this context. To think consciously is to be aware of what one is thinking–which is to say, to be able to think about something in a desired way. Admittedly, there are degrees of conscious thinking. Nonetheless, central to it is the capacity to direct one’s own thought. Some are better at this than others, but we all do it from time to time, sometimes–ironically–without even knowing we’re doing it.

Rationality: Thinking in Terms of Means and Ends

Many have also argued, quite unsuccessfully, that what really separates humans from animals is the former’s ability to create and use tools. How is this related to rationality, specifically to conscious thought? To use a tool, animals–both human and non-human–must first recognize that the task at hand cannot be accomplished, or even attempted, without the aid of a tool. What follows is typically a search for the tool, which in turn is followed by the fashioning of it. All of these processes necessarily involve conscious thinking: they require us not only to assess the problem but also to assess our grasp of it.

Much like conscious thinking, tools vary greatly in complexity. Quite simply, however, a tool is something that aids a process–a thing that allows us to complete or attempt a task with some degree of ease. Tools, moreover, do not necessarily guarantee the successful completion of a task. Nonetheless, the Google Sheets app on my phone is as much a tool as a pebble is to the proverbial thirsty bird faced with shallow water in a narrow trough. Their specific purposes differ, but broadly speaking, tools are designed to introduce ease.

One of the most enduring (albeit contested) definitions of rationality states that it has mainly to do with thinking and action in relation to the “means-end” category. That is, on this definition, rationality has mainly to do with thinking and acting so as to achieve a desired end. In other words, it is rational to carry an umbrella if one wants to avoid getting drenched. Conversely, it is irrational to go out without an umbrella when it’s raining, especially if one’s stated purpose is to avoid getting drenched. In sum, actions contrary to one’s stated ends are largely irrational.

With this limited definition of rationality in mind–coupled with the notion of conscious thinking as an essential aspect of rationality–one can see that rationality is not merely a human thing.

It’s fallacious, therefore, to claim that humans are not animals because only humans are rational. Animals–human and non-human alike–display several signs of rational, conscious thinking.

How the Raccoon Became a Truly “Urban Animal”

If you thought raccoons were omnipresent, you were right, at least partially. Raccoons are tenacious animals, and they have adapted extremely well to urbanization in North America; in fact, raccoons there are now more common in cities than in the countryside or the wild. What’s more, they eat just about anything and can live just about anywhere. Even in the wild, raccoons are equally at home in thick forests, wetlands, and grasslands. Raccoons can live almost anywhere because they are not fussy eaters: in the wild they eat fish, frogs, a variety of other aquatic animals, mice, insects, eggs, fruits, berries, and plants. To be more precise, they eat what’s available.

Raccoons, moreover, are especially adept at prying open their prey’s hiding spots; they are remarkably dexterous and possess strong front paws, which enable them to rummage efficiently. These qualities have enabled raccoons to thrive in urban areas–a nearly impossible feat, we must remember, for most other animals.

Their Original Habitat

Raccoons are known for their resilience and adaptability. They originally lived and thrived in the deciduous and mixed forests of North America, but they have gradually become adept at surviving in harsher landscapes on the continent. Today, raccoons can be found even in Europe (including Russia, a country notorious for its harsh winters) and Japan; interestingly, they were introduced to these regions by humans during the mid-20th century. Raccoons also thrive in captivity, where their life span increases rather dramatically: some captive raccoons have been known to live well into their twentieth year, whereas raccoons in the wild have a life expectancy of only three years. Their foray into urban spaces is not without problems, however. Most raccoon deaths in urban areas, statistics show, are the result of road accidents; hunting is another major cause. Humans, therefore, are responsible for a large number of raccoon deaths both in urban areas and in the wild. Nonetheless, raccoons continue to thrive: their numbers are on the rise, and the IUCN Red List puts them in the “Least Concern” category.

Some Interesting Facts

  1. Young raccoons are called “kits.”
  2. Most raccoons may not make good pets; even in captivity, raccoons tend to retain their wild instincts. Nonetheless, some raccoons have been known to be extremely loving and attached as pets.
  3. It is illegal in some states to keep raccoons as pets—mainly due to their vulnerability to rabies, their wild instincts, and the extent of care they require.

What Makes Concerns “Ethical”

The last few posts on this blog raised some ethical concerns, so now would be a good time to ask what constitutes ethical inquiry–or what it is that makes concerns ethical. Students are often told that ethics is a field unto itself–a vast, complex one. To a certain extent, this is true. However, to do ethics is, at bottom, to adopt standpoints and to explain why a preferred standpoint is ethical. Let’s unpack this claim.

First, it must be said that ethics involves both (i) the act of categorizing an action or a thing as good or bad and (ii) a careful breakdown of this process of categorization. Canonical works are deeply useful at the introductory level, but they often do not capture the ambiguities of everyday life. This is because the canon is typically diluted for student consumption, with the focus kept on ethical conclusions rather than on arguments–even though a large part of doing ethics involves the construction of arguments.

What’s good in scenario A may not necessarily be good in scenario B. For instance, it would be fallacious to say that humans should not, under any circumstance, eat animals. Such a principle cannot be applied to agrarian communities or economies that consume animals and animal products as staples. Most such communities do not raise animals solely for human consumption; in fact, without human intervention, certain species may find it difficult to survive and procreate. Cattle are a relevant example here.

Good ethical arguments avoid the “under any circumstance” clause; they refer instead to specific situations and examples. Doing ethics meaningfully involves examining when and why certain norms or ethical principles cannot be applied. Without this, ethics would lack applicability.

Insect Farming, Food Shortage, and Consumption

As of May 2018, the world population was estimated to have reached a staggering 7.6 billion. As this number is expected to keep growing, concerned observers have pointed out the dangers of food shortages. Some of the foreseeable consequences include a severe shortage of the meat products we prefer and produce now, the loss of agricultural land, and a shortage of dairy products. In other words, population growth threatens most of our conventional sources of protein. In addition, it may also increase poverty rates and complicate the distribution of food. It is therefore necessary to identify alternative sources of protein–and we may already have identified one: edible insects.

Insect Farming: Things to Consider

Insect farming is believed to be more sustainable than livestock production, which is currently one of the most environmentally harmful practices (livestock production has been found to be one of the main drivers of global warming). However, we are yet to ascertain whether insects are truly safe to eat. Insect farming may also pose unique challenges. For instance, in addition to screening for risks to humans, we must examine the potential negative effects of insect farming on animals, plants, and the environment. More importantly, we need to determine whether insect farming can adversely affect the planet’s biodiversity. After all, insects are an important part of the food chain; directly or indirectly, they affect the daily lives of amphibians, reptiles, and mammals—including humans.

On the other hand, we do know just how resource-intensive livestock production is. According to this study, almost one-third of global cereal produce is fed to animals, especially pigs and poultry raised for human consumption. Conventional meat, therefore, may not offer us a way out of the seemingly inevitable problem of food shortage. If anything, livestock production is itself part of the problem, in that it takes a lot of food to make meat.

Some Arguments for Insect Farming

Moreover, with more countries poised to experience significant economic growth, insect farming seems like our safest bet, because growth and development typically result in a higher demand for meat. Nor is insect consumption as outrageous as it may seem. Insects are already an important part of the Thai diet: locusts, crickets, larvae (of several insects), and spiders are widely consumed in the country, mostly fried or deep-fried. Thailand is also known for its innumerable cricket farms, where the insect is raised solely for human consumption. And Thailand is not the only country where insects are farmed for human consumption: Vietnam is another example, and insects are also consumed in Brazil and Cambodia. However, we are still a long way from replacing conventional meat with edible insects. Getting there requires active research to determine whether the large-scale farming of edible insects is safe for the environment. It also depends on whether people are willing to look at insects as food; the bottom line, however, is that we may have no choice.