Historiography, Objectivity, and More

Recently my students and I have been discussing questions of historicity, historiography, and objectivity. Central to this exploration was our focus on the fiction-fact dichotomy. Indeed, one of the questions that emerged as we probed further was whether we should hold fiction and fact in such stark contrast. The discussions were deeply fruitful, and this post is a very short summary of what transpired.

Writing History and How History Is Written

There is of course the oft-repeated but altogether true adage: History is written by the victors. But even this understates the sociopolitical clout of the dominant, of the victors. A significant part of domination involves ensuring compliance from the subordinated, which includes forcing them to accept fabricated versions of history. That is, the subordinated are not just pressed to accept a valorized version of history; they are also typically restricted from contesting that version. To this end, those in power tend to rely on legal and coercive measures to stifle any resistance.

Over time, versions of history compiled for the benefit of the dominant become institutionalized and even appear in textbooks. In some instances, history aids the transformation of dangerous, violent personalities into generous, other-regarding benefactors–as is the case with Cecil Rhodes, and indeed with colonialism in general.

How, then, can these versions of history–compiled to be popular and dominant–claim to be fair and objective? The trouble lies in the fact that the socially and politically dominant largely determine the conditions of fairness and objectivity–not just in relation to historiography but also in broader contexts, such as the conditions of scientific objectivity and rationality and, by extension, the criteria for what qualifies as fact and fiction.

The Fiction-Fact Question

Central to this inquiry is the suggestion that the truths we produce are necessarily partial–even scientific truths. Admittedly, this is a contentious claim, but it allows us to think fruitfully and critically about fact and fiction.

In other words, it is impossible to produce all-encompassing truths. No matter how comprehensive an account, it will necessarily be limited and partial. The limitations are mostly imposed by our own cognitive finitude as well as by our biases, especially the biases we are not aware of. This is not, in fact, a bleak account of what it is to produce truth and knowledge. If anything, it illustrates the importance of constant critical scrutiny, a quality without which progress–especially scientific progress–would be endangered.

It must also be noted that this argument does not suggest that it is futile to try and present a complete, comprehensive picture of a given issue or phenomenon. On the contrary, it argues that we ought to try and present as accurate an account as possible, and to do that, we must acknowledge that the truths we produce can only be partial. Which means we must also call “objective” methodologies into question. Doing this means ascribing importance and legitimacy to methods and techniques that fall beyond the scope of the scientific method. Oral history is just one relevant example in this context.

Ultimately, it is essential to ask what happens to marginalized voices–both in the context of history and more generally. Why do some voices get marginalized and others amplified? These are some of the most elementary questions we should be asking when appraising works or accounts that claim to be historical.

Rationality and Animals: Humans and Non-Humans

Are humans animals? Or are we superior–that is, the only species worthy of the “rational animal” title? To answer these questions, we must examine definition(s) of rationality. This post argues that the animal-human distinction, based as it is on the view that only humans are capable of rational thought, breeds complicity and speciesism, in turn enabling us to excuse, if not condone, animal cruelty and other similar acts.

Conscious thought, one of the most important characteristics of rationality, is a good starting point in this context. To think consciously is to be aware of what one is thinking. Which is to say, it is also to be able to think about something in a desired way. Admittedly, there are degrees of conscious thinking. Nonetheless, central to it is the capacity to direct one’s own thought. Some are better at this than others, but we all do it from time to time, sometimes–ironically–without even knowing we’re doing it.

Rationality: Thinking in Terms of Means and Ends

Many have also argued, quite unsuccessfully, that what really separates humans from animals is the former’s ability to create and use tools. How is this related to rationality, specifically to conscious thought? To use a tool, animals–both human and non-human–must first recognize that the task at hand cannot be accomplished or even attempted without the aid of a tool. What follows is typically a search for the tool, which in turn is followed by the fashioning of it. All of these processes necessarily involve conscious thinking. They require us not only to assess the problem but also to assess our grasp of it.

Much like conscious thinking, tools also vary greatly in terms of complexity. Quite simply, however, a tool is something that aids a process, a thing that allows us to complete or attempt a task with some degree of ease. Tools, moreover, do not necessarily guarantee the successful completion of a task. Nonetheless, the Google Sheets app on my phone is as much a tool as a pebble or a stone is to that thirsty bird faced with shallow water in a narrow trough. Their specific purposes differ, but broadly speaking they tend to introduce ease. They are designed to introduce ease.

One of the most enduring (albeit contested) definitions of rationality states that it has mainly to do with thinking and action in relation to the “means-end” category. That is, rationality, this definition suggests, has mainly to do with thinking about and acting so as to achieve a desired end. In other words, it is rational to carry an umbrella if one wants to avoid getting drenched. Conversely, it is irrational to go out without an umbrella when it’s raining, especially if one’s stated purpose is to avoid getting drenched. In sum, actions contrary to one’s stated ends are largely irrational.

With this limited definition of rationality in mind–coupled with the notion of conscious thinking as an essential aspect of rationality–one can see that rationality is not merely a human thing.

It’s fallacious, therefore, to claim that humans are not animals because they are rational. Animals–human and non-human ones–display several signs of rational, conscious thinking.

How the Raccoon Became a Truly “Urban Animal”

If you thought raccoons were omnipresent, you were right, at least partially. Raccoons are tenacious animals, and they have adapted extremely well to urbanization in North America. In fact, raccoons here are now more common in cities than in the country or the wild. They can live just about anywhere because they are not fussy eaters: even in the wild, raccoons are equally at home in thick forests, wetlands, and grasslands, where they eat fish, frogs, a variety of other aquatic animals, mice, insects, eggs, fruits, berries, and plants. To be more precise, they eat what’s available.

Raccoons, moreover, are especially adept at prying open their prey’s hiding spots; they are remarkably dexterous and possess strong front paws, which enable them to rummage efficiently. These qualities have allowed raccoons to thrive in urban areas, which, we must remember, is a nearly impossible feat for most other animals.

Their Original Habitat

Raccoons are known for their resilience and adaptability. They originally lived and thrived in the deciduous and mixed forests of North America. However, they have gradually become adept at surviving in other, harsher landscapes on the continent. Today, raccoons can be found even in Europe (including Russia, a country notorious for its harsh winters) and Japan. Interestingly, raccoons were introduced to these regions by humans during the mid-20th century. Raccoons also thrive in captivity, where their life span increases rather dramatically: some captive raccoons have been known to live well into their twentieth year, whereas raccoons in the wild have a life expectancy of only about three years.

Their foray into urban spaces is not without problems, however. Most raccoon deaths in urban areas, statistics show, are the result of road or automobile injuries. Hunting is another major cause. Humans, therefore, are responsible for a large number of raccoon deaths both in urban areas and in the wild. Nonetheless, raccoons continue to thrive: their numbers are on the rise, and the IUCN Red List places them in the “Least Concern” category.

Some Interesting Facts

  1. Young raccoons are called “kits.”
  2. Raccoons generally do not make good pets; even in captivity, they tend to retain their wild instincts. Nonetheless, some pet raccoons have been known to be extremely loving and attached.
  3. It is also illegal in some states to keep raccoons as pets—mainly due to their vulnerability to rabies, their wild instincts, and the extent of care they require.

What Makes Concerns “Ethical”

The last few posts on this blog raised some ethical concerns. Now would be a good time to ask what constitutes ethical inquiry, or what it is that makes concerns ethical. Students are often told that ethics is a field unto itself–a vast, complex one. To a certain extent, this is true. However, doing ethics is, at its core, a matter of adopting standpoints and explaining why a preferred standpoint is the ethical one. Let’s unpack this claim.

First, it must be said that ethics involves both (i) the act of categorizing an action or a thing as good or bad and (ii) a careful breakdown of this process of categorization. Canonical works are deeply useful at the introductory level, but they often do not capture the ambiguities of everyday life. This is because the canon is typically diluted for student consumption, and the focus remains on ethical conclusions, not on arguments. Yet a large part of doing ethics involves the construction of arguments.

What’s good in scenario A may not necessarily be good in scenario B. For instance, it would be fallacious to say that humans should not, under any circumstance, eat animals (or, for that matter, other humans): such an absolute principle cannot be applied to agrarian communities or economies that consume animals and animal products as staples. Most such communities do not raise animals solely for human consumption. In fact, without human intervention, certain species may find it difficult to survive and procreate. Cattle are a relevant example here.

Good ethical arguments avoid the “under any circumstance” clause. They instead refer to specific situations and examples. Doing ethics meaningfully necessarily involves an examination of when and why certain norms or ethical principles cannot be applied. Without this, ethics would lack applicability.

Insect Farming, Food Shortage, and Consumption

As of May 2018, the world population was estimated to have reached a staggering 7.6 billion. Since this number is expected to keep growing, concerned observers have pointed out the dangers of food shortages. Some of the foreseeable consequences include a severe shortage of the meat products we currently prefer and produce, the loss of agricultural land, and a shortage of dairy products. In other words, the population explosion threatens most of our conventional sources of protein. It may also increase poverty rates and complicate the distribution of food. It is therefore necessary to identify alternative sources of protein. In fact, we may already have identified one: edible insects.

Insect Farming: Things to Consider

Insect farming is believed to be more sustainable than livestock production, which is currently one of the most environmentally harmful practices (livestock production has been found to be one of the main drivers of global warming). However, we have yet to ascertain whether farmed insects are really edible and safe. Insect farming may also pose unique challenges. For instance, in addition to screening for risks to humans, we must examine the potential negative effects of insect farming on animals, plants, and the environment. More importantly, we need to determine whether insect farming can adversely affect the planet’s biodiversity. After all, insects are an important part of the food chain. Directly or indirectly, they impact the daily lives of amphibians, reptiles, and mammals—including humans.

On the other hand, we do know that insect farming is more sustainable than livestock production. According to one study, almost one-third of the global cereal produce is fed to animals, especially pigs and poultry raised for human consumption. Conventional meat, therefore, may not offer us a way out of the seemingly inevitable problem of food shortage. If anything, livestock production is itself part of the problem, in that it takes a lot of food to make meat.

Some Arguments for Insect Farming

With more countries poised to experience significant economic growth, insect farming seems like our safest bet, because growth and development typically result in higher demand for meat. Moreover, insect consumption is not as outrageous as it seems. Insects are already an important part of the Thai diet: locusts, crickets, the larvae of several insects, and even spiders are widely consumed in the country, most of them fried or deep-fried. Thailand is also known for its innumerable cricket farms, where the insect is raised solely for human consumption. Nor is Thailand the only country where insects are farmed for human consumption; its Asian counterpart Vietnam is another example, and insects are also consumed in Brazil and Cambodia.

However, we are still a long way from replacing conventional meat with edible insects. Doing so requires active research to determine whether the large-scale farming or production of edible insects is safe for the environment. It also depends on whether people are willing to look at insects as food; the bottom line, however, is that we may have no choice.

What Is Frugal Living?

Frugal living—or frugality—is sometimes frowned upon and associated with miserliness. In essence, however, frugal living is mindful consumption. Although a personal ethic, frugal living tends to have tangible impacts on the environment and the economy. Principled frugal living necessarily involves a conscious effort to reduce or avoid waste altogether. Frugality, therefore, constitutes an alternative lifestyle: a statement against overconsumption and decadence. In other words, frugality is practiced not only by those who aim to reduce their spending but also by people committed to reducing their carbon footprint. It therefore also involves active downscaling and, in some cases, the complete relinquishment of certain lifestyles or preferences.

Some Aspects of Frugal Living

Frugality, moreover, is not only about reducing purchases. It also involves recycling and repurposing resources. Purchasing used items—especially furniture, clothing, stationery, and automobiles, among other things—is another crucial aspect of frugal living. By doing so, those who practice frugality as a principle aim to dissuade manufacturers and industries from producing excess goods and services. Frugality also does not mean complete abstinence; in fact, this is one of the biggest misconceptions about frugal living. The goal of principled frugal living is self-sufficiency, which is why frugality tends to have observable positive impacts on the environment. On the other hand, people also live frugally simply in order to be able to afford something. Typically, this latter kind of frugality, or non-principled frugality, tends to be temporary. Hence, it is important to differentiate between principled and non-principled frugality.

Frugal Living and STEM Education

How is frugal living relevant in the context of STEM education, you wonder? Simply put, STEM education will enable you to adopt a more hands-on approach to frugality; in effect, it will make you more self-sufficient. For instance, you can explore sustainable engineering, a field concerned with designing or altering devices and systems to ensure the sustainable use of resources. STEM education will also enable you to measure your carbon footprint more accurately and identify ways to be more efficient and resourceful. Ultimately, STEM education will give you the tools to analyze and quantify the extent of your frugality.
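To make the quantification point concrete, here is a minimal sketch, in Python, of how one might estimate and compare monthly household carbon footprints. The emission factors below are illustrative assumptions (real values vary by region, grid mix, and year), and a serious calculator would cover many more categories.

# A minimal sketch of a monthly carbon-footprint estimator.
# The emission factors are illustrative assumptions only; real factors
# vary by region, grid mix, and year.
EMISSION_FACTORS = {
    "electricity_kwh": 0.4,   # kg CO2e per kWh of grid electricity (assumed average)
    "gasoline_litre": 2.3,    # kg CO2e per litre of gasoline burned
    "natural_gas_m3": 1.9,    # kg CO2e per cubic metre of natural gas
}

def estimate_footprint(usage):
    """Return an estimated footprint in kg CO2e for the given usage amounts."""
    return sum(
        EMISSION_FACTORS[item] * amount
        for item, amount in usage.items()
        if item in EMISSION_FACTORS
    )

if __name__ == "__main__":
    # Compare an ordinary month with a more frugal one.
    before = {"electricity_kwh": 350, "gasoline_litre": 90, "natural_gas_m3": 60}
    after = {"electricity_kwh": 280, "gasoline_litre": 40, "natural_gas_m3": 55}
    saved = estimate_footprint(before) - estimate_footprint(after)
    print(f"Estimated monthly savings: {saved:.1f} kg CO2e")

Even a toy model like this makes trade-offs visible: in this hypothetical example, driving less saves far more CO2e than the modest cut in electricity use.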

What’s more, frugal living is a well-researched phenomenon, so you will find plenty of guidance and information. Frugal living has been critically examined from different disciplinary perspectives. For instance, several studies have examined frugality from an economic standpoint; scholars have also examined the political and philosophical underpinnings of the practice. For their part, STEM researchers constantly explore ways to ensure sustainability. To this end, they actively seek to find out how to reduce, if not arrest, the depletion of natural resources. They also aim to identify less harmful resources and to reduce the negative impact of fossil fuels.

Frugal Living and Activism

Frugal living has another dimension: when practiced earnestly, it amounts to exemplary activism. Teachers, therefore, can encourage students to undertake energy-efficient and sustainable DIY projects. Tasks can be as simple as designing a makeshift, environment-friendly shopping bag or planning small-budget travel. You can even encourage students to visit libraries and read printed books instead of e-books or Kindle editions. Encourage them to cycle or walk to the library, especially if it is not too far away; alternatively, they can take the bus or organize a carpool. Remember: every little step counts. Above all, be creative and challenge students to come up with their own planet-protecting ideas.

Approaching Footnotes and Endnotes: Thoughts and Strategies

Why footnotes or endnotes? Are they at all necessary, and should we read them diligently? Or can we afford to ignore them? The last two questions have no single correct answer: if one reads the main text very carefully, footnotes and endnotes can sometimes be skipped. However, this depends on the purpose of these notes. For instance, some footnotes or endnotes may point readers to relevant, seminal scholarship, whereas others may offer clarifications or elaborations. Sometimes, they may also address anticipated criticism.

The most common complaint is that footnotes and endnotes complicate the reading experience. Therefore, it is essential to discuss how to tackle them. This post only deals with footnotes and endnotes in non-fiction and academic writing; it does not focus on how writers use them in fiction.

The Simplest Way (Since There’s No Way Out)

The simplest way involves careful reading of the main text. Once this is done–that is, once we understand what the text is about–approaching the notes should be easy enough. It is best to keep a checklist, at least at first, until we can breeze through footnotes. The checklist can involve things like:
(i) what the author affirms
(ii) what the author denies
(iii) what the author contests
(iv) what the author doubts

This checklist may well help us understand the purpose of a given note, which will in turn eventually help us ascertain whether or not a note is central to matters discussed in the main text.

As our reading skills and knowledge in a given field improve with time, we may even be able to anticipate the purpose of a footnote fairly accurately. Initially, however, it is critical to engage with footnotes closely, because doing so allows us to become more proficient in a given field. In other words, by deferring the reading of footnotes, we may only be doing ourselves a disservice. There’s no way out, really.

Some Benefits of Reading Them

If you’re unable to decide on a specific topic for your term paper, footnotes and endnotes may just help you out. Sometimes, they contain information about areas that are under-researched or poorly researched; more often than not, this information also includes citations to texts that may be the first forays into the field. Which means you might just have found your topic.

Second, reading them allows us to learn the ropes of academic or non-fiction writing. If well done, footnotes and endnotes can be exemplary: they can show us what it takes to build good, persuasive arguments and how to structure them.

Finally, it takes a lot of reading to get to the stage where we can afford to skip them. Until then, however, the only way out is through.