How the Raccoon Became a Truly “Urban Animal”

If you thought raccoons were everywhere, you were right, at least partially. Raccoons are tenacious animals, and they have adapted extremely well to urbanization in North America. In fact, raccoons there are now more common in cities than in the countryside or the wild. They can live almost anywhere because they are not fussy eaters: in the wild, raccoons eat fish, frogs, a variety of other aquatic animals, mice, insects, eggs, fruits, berries, and plants. To be more precise, they eat what’s available. They are also equally at home in thick forests, wetlands, and grasslands.

Raccoons, moreover, are especially adept at prying open their prey’s hiding spots; they are remarkably dexterous and possess strong front paws, which enable them to rummage efficiently. These qualities have allowed raccoons to thrive in urban areas, a feat that remains nearly impossible for most other animals.

Their Original Habitat

Raccoons are known for their resilience and adaptability. They originally lived and thrived in the deciduous and mixed forests of North America but have gradually become adept at surviving in other, harsher landscapes on the continent. Today, raccoons can be found even in Europe (including Russia, a country notorious for its harsh winters) and Japan. Interestingly, raccoons were introduced to these regions by humans during the mid-20th century. Raccoons also thrive in captivity, where their life span increases rather dramatically: some captive raccoons have been known to live well into their twentieth year, whereas raccoons in the wild have a life expectancy of only three years. Their foray into urban spaces is not without problems, however. Most raccoon deaths in urban areas, statistics show, result from automobile collisions; hunting is another major cause. Humans, therefore, are responsible for a large number of raccoon deaths both in urban areas and in the wild. Nonetheless, raccoons continue to thrive: their numbers are on the rise, and the IUCN Red List classifies them as a species of “Least Concern.”

Some Interesting Facts

  1. Young raccoons are called “kits.”
  2. Raccoons generally do not make good pets; even in captivity, they tend to retain their wild instincts. That said, some pet raccoons have been known to be extremely loving and attached.
  3. It is also illegal in some states to keep raccoons as pets, mainly because of their vulnerability to rabies, their wild instincts, and the extent of care they require.

What Makes Concerns “Ethical”

The last few posts on this blog raised some ethical concerns. Now would be a good time to ask what constitutes ethical inquiry, or what it is that makes concerns ethical. Students are often told that ethics is a field unto itself, a vast and complex one. To a certain extent, this is true. At its core, however, doing ethics means adopting standpoints and explaining why a preferred standpoint is justified. Let’s unpack this claim.

First, it must be said that ethics involves both (i) the act of categorizing an action or a thing as good or bad and (ii) a careful breakdown of this process of categorization. Canonical works are deeply useful at the introductory level, but they often fail to capture the ambiguities of everyday life. This is because the canon is typically diluted for student consumption, so the focus remains on ethical conclusions rather than on arguments. Yet a large part of doing ethics involves the construction of arguments.

What’s good in scenario A may not necessarily be good in scenario B. For instance, it would be fallacious to say that humans should not, under any circumstance, eat animals or other humans: such an absolute principle cannot be applied to agrarian communities or economies that consume animals and animal products as staples. Most such communities do not raise animals solely for human consumption. In fact, without human intervention, certain species might find it difficult to survive and procreate; cattle are a relevant example here.

Good ethical arguments avoid the “under any circumstance” clause. They refer instead to specific situations and examples. Doing ethics meaningfully necessarily involves examining when and why certain norms or ethical principles cannot be applied; without this, ethics would lack applicability.

Insect Farming, Food Shortage, and Consumption

As of May 2018, the world population was estimated to have reached a staggering 7.6 billion. Since this number is expected to keep growing, concerned observers have pointed out the dangers of food shortage. Some of the foreseeable consequences include a severe shortage of the meat products we prefer and produce now, loss of agricultural land, and a shortage of dairy products. In other words, population explosion threatens most of our conventional sources of protein. It may also increase poverty rates and complicate the distribution of food. It is therefore necessary to identify alternative sources of protein; in fact, we may already have identified one: edible insects.

Insect Farming: Things to Consider

Insect farming is believed to be more sustainable than livestock production, which is currently one of the most environmentally harmful practices (livestock production has been found to be one of the main drivers of global warming). However, we have yet to ascertain whether insects are truly edible and safe. Insect farming may also pose unique challenges. For instance, in addition to screening for risks to humans, we must examine the potential negative effects of insect farming on animals, plants, and the environment. More importantly, we need to determine whether insect farming can adversely affect the planet’s biodiversity. After all, insects are an important part of the food chain: directly or indirectly, they affect the daily lives of amphibians, reptiles, and mammals, including humans.

At the same time, there is a strong case that insect farming is more sustainable than livestock production. According to this study, almost one-third of the global cereal produce is fed to animals, especially pigs and poultry raised for human consumption. Conventional meat, therefore, may not offer us a way out of the seemingly inevitable problem of food shortage. If anything, livestock production is itself part of the problem, in that it takes a lot of feed to make meat.
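
To see why “it takes a lot of feed to make meat,” here is a minimal back-of-the-envelope sketch in Python. The conversion figures below are illustrative assumptions (rough orders of magnitude often cited for cattle and farmed crickets), not numbers drawn from the study mentioned above.

```python
# Back-of-the-envelope comparison: kilograms of feed needed to produce
# one kilogram of edible protein. All figures are illustrative
# assumptions, not measurements from any particular study.

FEED_PER_KG_LIVE_WEIGHT = {
    "beef cattle": 8.0,  # assumed kg of feed per kg of live weight
    "crickets": 2.0,     # assumed kg of feed per kg of live weight
}

EDIBLE_PROTEIN_PER_KG_LIVE_WEIGHT = {
    "beef cattle": 0.15,  # assumed kg of edible protein per kg live weight
    "crickets": 0.20,     # assumed kg of edible protein per kg live weight
}

def feed_per_kg_protein(animal: str) -> float:
    """Estimate kg of feed required to yield one kg of edible protein."""
    return (
        FEED_PER_KG_LIVE_WEIGHT[animal]
        / EDIBLE_PROTEIN_PER_KG_LIVE_WEIGHT[animal]
    )

for animal in FEED_PER_KG_LIVE_WEIGHT:
    print(f"{animal}: ~{feed_per_kg_protein(animal):.0f} kg feed per kg protein")
```

Even under these rough assumptions, the gap is striking: crickets turn feed into protein several times more efficiently than cattle, which is the arithmetic heart of the sustainability argument.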

Some Arguments for Insect Farming

With more countries poised to experience significant economic growth, insect farming seems like our safest bet, because growth and development typically result in higher demand for meat. Moreover, insect consumption is not as outrageous as it seems. Insects are already an important part of the Thai diet: locusts, crickets, larvae (of several insects), and even spiders are widely consumed in the country, most of them fried or deep-fried. Thailand is also known for its innumerable cricket farms, where the insect is raised solely for human consumption. Nor is Thailand the only country where insects are farmed for food: neighboring Vietnam is another example, and insects are also consumed in Brazil and Cambodia. However, we are still a long way from replacing conventional meat with edible insects. Getting there requires active research to determine whether the large-scale farming of edible insects is safe for the environment. It also depends on whether people are willing to look at insects as food; the bottom line, however, is that we may have no choice.

What Is Frugal Living?

Frugal living, or frugality, is sometimes frowned upon and associated with miserliness. In essence, however, frugal living is mindful consumption. Although a personal ethic, frugality tends to have tangible impacts on the environment and the economy. Principled frugal living necessarily involves a conscious effort to reduce or avoid waste altogether. Frugality, therefore, constitutes an alternative lifestyle: a statement against overconsumption and decadence. In other words, frugality is practiced not only by those who aim to reduce their spending but also by people committed to reducing their carbon footprint. It thus involves active downscaling and, in some cases, the complete relinquishment of a certain lifestyle or preference.

Some Aspects of Frugal Living

Frugality is not only about reducing purchases. It also involves recycling and repurposing resources. Purchasing used items, especially furniture, clothing, stationery, and automobiles, is another crucial aspect of frugal living. By doing so, those who practice frugality as a principle aim to dissuade manufacturers and industries from producing excess goods and services. Nor does frugality mean complete abstinence; in fact, this is one of the biggest misconceptions about frugal living. The goal of principled frugal living is self-sufficiency, which is why frugality tends to have observable positive impacts on the environment. People also live frugally, however, simply in order to be able to afford something. This latter kind, non-principled frugality, tends to be temporary. Hence, it is important to differentiate between principled and non-principled frugality.

Frugal Living and STEM Education

How is frugal living relevant in the context of STEM education, you wonder? Simply put, STEM education will enable you to adopt a more hands-on approach to frugality. In effect, it will make you more self-sufficient. For instance, you can explore sustainable engineering, a process aimed at designing or altering devices and systems to ensure the sustainable use of resources. STEM education will also enable you to accurately measure your carbon footprint and identify how to be more efficient and resourceful. Ultimately, STEM education will give you the tools to analyze and quantify the extent of your frugality.
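
As a concrete illustration of this kind of quantification, here is a minimal Python sketch of a carbon-footprint estimator a student might build. The emission factors are illustrative assumptions (actual factors vary by region and energy mix), so treat the output as a teaching example rather than an audit.

```python
# A minimal carbon-footprint estimator. The emission factors below are
# illustrative assumptions; real values depend on the local energy mix.

EMISSION_FACTORS_KG_CO2E = {
    "electricity_kwh": 0.4,  # assumed kg CO2e per kWh of electricity
    "car_km": 0.2,           # assumed kg CO2e per km driven by car
    "bus_km": 0.1,           # assumed kg CO2e per km traveled by bus
}

def monthly_footprint_kg(usage):
    """Sum estimated kg CO2e across all recognized activities."""
    return sum(
        EMISSION_FACTORS_KG_CO2E[activity] * amount
        for activity, amount in usage.items()
        if activity in EMISSION_FACTORS_KG_CO2E
    )

# Example: one month of hypothetical household usage.
usage = {"electricity_kwh": 300, "car_km": 400, "bus_km": 100}
print(f"Estimated monthly footprint: {monthly_footprint_kg(usage):.0f} kg CO2e")
```

Swapping car kilometers for bus kilometers in `usage` makes the payoff of a frugal choice immediately visible, which is precisely the kind of analysis described above.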

What’s more, frugal living is a well-researched phenomenon, so you will find plenty of guidance and information. It has been critically examined from different disciplinary perspectives: several studies have approached frugality from an economic standpoint, and scholars have also examined the political and philosophical underpinnings of the practice. STEM researchers, for their part, constantly explore ways to ensure sustainability. To this end, they actively seek to reduce, if not arrest, the depletion of natural resources; they also aim to identify less harmful resources and to reduce the negative impact of fossil fuels.

Frugal Living and Activism

Frugal living has another dimension: when practiced earnestly, it amounts to exemplary activism. Teachers, therefore, can encourage students to undertake energy-efficient and sustainable DIY projects. Tasks can be as simple as designing a makeshift, environmentally friendly shopping bag or planning small-budget travel. You can even encourage students to visit libraries and read physical books instead of e-book or Kindle versions, and to cycle or walk to the library if it is not too far away. Alternatively, encourage them to take the bus or organize a carpool. Remember: every little step counts. Above all, be creative and challenge students to come up with their own planet-protecting ideas.

Approaching Footnotes and Endnotes: Thoughts and Strategies

Why footnotes or endnotes? Are they at all necessary, and should we read them diligently? Or can we afford to ignore them? The last two questions have no single correct answer: if one reads the main text very carefully, footnotes and endnotes can sometimes be skipped, but this depends on their purpose. Some footnotes or endnotes point readers to relevant, seminal scholarship, whereas others offer clarifications or elaborations. Sometimes, they also address anticipated criticism.

The most common complaint is that footnotes and endnotes complicate the reading experience. Therefore, it is essential to discuss how to tackle them. This post only deals with footnotes and endnotes in non-fiction and academic writing; it does not focus on how writers use them in fiction.

The Simplest Way (Since There’s No Way Out)

The simplest way involves careful reading of the main text. Once this is done (that is, once we understand what the text is about), approaching the notes should be easy enough. At least at first, it is best to keep a checklist; with practice, we can breeze through footnotes without one. The checklist can include things like:
(i) what the author affirms
(ii) what the author denies
(iii) what the author contests
(iv) what the author doubts

This checklist may well help us understand the purpose of a given note, which will in turn help us ascertain whether a note is central to the matters discussed in the main text.

As our reading skills and knowledge of a given field improve with time, we may even be able to anticipate the purpose of a footnote fairly accurately. Early on, however, reading footnotes is critical: doing so is what allows us to become more proficient in the field. In other words, by deferring the reading of footnotes, we may only be doing ourselves a disservice. There’s no way out, really.

Some Benefits of Reading Them

If you’re unable to decide on a specific topic for your term paper, footnotes and endnotes may just help you out. Sometimes they contain information about areas that are under-researched or poorly researched; more often than not, this information also includes citations to texts that may be the first forays into the field, which means you might just have found your topic.

Second, reading them allows us to learn the ropes of academic or non-fiction writing. When done well, footnotes and endnotes can be exemplary: they can show us what it takes to build good, persuasive arguments and how to structure them.

Finally, it takes a lot of reading to get to the stage where we can afford to skip them. Until then, however, the only way out is through.

“Can I Cite News Articles and Blogs in My Term Paper?”

I chanced upon a new problem recently: students’ reluctance to cite news articles and blogs in term papers. This reluctance, I found out, has partly to do with the language we (teachers, researchers, and students) employ in classrooms, especially when discussing research. Conversations about research tend to revolve mainly around the validity, authority, and legitimacy of academic sources. To ensure balance, it is therefore necessary to discuss the usefulness of news articles and blogs with students, especially those who are just being introduced to the nuances of research.

Although style guides focus extensively on how to cite these materials, such discussions typically follow instructions on how to cite the more traditional sources: journal articles, books, and individual essays in anthologies. This has led students to assume that news articles and blogs are not credible sources. To be sure, not all news articles and blogs are citation-worthy; some are just downright poor. However, this is also true of journal articles and books: some are good, some exceptional, and a good number bad. It would therefore be useful for students to know how to identify credible materials, which means encouraging them to explain (both to themselves and in their term papers) why they think a news article or a blog is credible.

But Why News Articles and Blogs?

As far as the Humanities are concerned, news articles and blogs acquire special importance: since they are usually authored by journalists and the general public respectively, they are likely to represent views on the ground (or views from “the field,” as sociologists and anthropologists say), unless we are talking about the blog of an academic or an industry expert. Fieldwork is almost non-existent at the high school level, and it would be untenable to introduce students to the practice and ethics of field research before they are introduced to research in general. In this context, the democratic nature of news articles and blogs (democratic because most of them can be read and understood by just about anybody) can speak to the importance and necessity of field research. They certainly cannot substitute for field research, but they do somewhat bridge the gap between the theoretical language of academic sources and realities on the ground.

In addition, this may allow students to better examine, apply, or test theories. This is an important skill, and it may well discourage students from blindly accepting the validity and claims of theories. What’s more, doing so may help students understand what it is to float a theory, and the importance of the “real” in theory-making. This would also be a minuscule step toward driving home the importance of being an accountable researcher or theorist.

Why the Insistence on Graphs and Figures?

Do we really need complex, detailed graphs and figures to tell us about the extent of poverty? Anybody with willing eyes can notice its extent and severity. But this is not a post about poverty; rather, it is about the salience of firsthand experience of the sociopolitical world. To be sure, graphs and figures do help us appreciate the nuances of the social issue at hand, but they are not the only means available to us: they are an attempt to make sense of the sociopolitical world, not its ultimate representation. Too often, the quantitatively minded tend to dismiss conversations and articles that employ the “I” to discuss social issues. Experience, however, implies the “I,” and there’s no way around it.