Citations and Creative/Contentious Interpretations

Many students find it difficult to understand the need for citations. Some even consider them entirely useless. Citations and references, however, are essential aspects of academic writing. In this post, I’d like to show that citations allow writers to present creative, contentious interpretations of texts, especially their primary and secondary sources.

First, reviewers, graders, and professors require citations because they show them where students have been and what they’ve been able to gather along the way. That is, if a syllabus allows or requires you to identify relevant external sources for a paper, citations become indispensable; these papers simply cannot be written without them. Grades for these papers are likely to be based on the following factors: (i) the relevance of the material(s) cited, (ii) the novelty of the material(s), and (iii) how well you draw connections between the material(s) and the syllabus at hand. Sometimes, for tricky or difficult courses, graders may simply be looking for a relevant external source. That is, how well you understand the material will be of secondary importance; remember, though, it is not unimportant. For instance, if your prescribed reading for class is Michel Foucault’s Discipline and Punish (D and P), your external materials cannot be titles like Michel Foucault: A Primer or A Critical Introduction to Michel Foucault’s Works. To be sure, you can refer to these works to enhance your understanding of Foucault’s approach, but you cannot cite them as external sources. Here’s why.

"Citation needed"
Image Credit: futureatlas.com

Choosing a Relevant External Source

A good external source has to be in dialog with your prescribed reading, even if only implicitly. Sometimes, there may be no apparent connection between the texts, and in these cases students must be able to argue for the relevance of their chosen texts. This inevitably involves quoting from the external material. At other times, the connection may be straightforward. In both cases, though, you are looking for texts that critically address, either directly or indirectly, the concerns raised in the prescribed reading. For example, a text titled Human Sciences and Foucault’s Archaeological Approach may be a better external text than the two examples listed above. On the other hand, an example of a text not directly related to D and P would be Jean-Francois Lyotard’s The Postmodern Condition. If you choose this text, you incur the burden of proof. You must first state why you think it is relevant and then proceed to substantiate this claim. The only way to do this is by (i) quoting from both D and P and Lyotard’s text and (ii) explaining in detail why you think these texts should be in dialog.

Creative, Contentious Interpretations

At least as far as the qualitative aspect of the Humanities is concerned, there is plenty of room for students to experiment. Let’s remember some qualities of canonized works. They typically draw from seemingly unrelated texts, interpret texts in radically different ways, and extrapolate seamlessly. This is what makes reading Deleuze or Sontag exciting. At the same time, it is necessary to keep things in check. That is, you can offer a contentious interpretation, not a ridiculous one. Contentious interpretations or thesis statements are exciting and, more importantly, defensible. If they are unpacked well, readers will be able to discern connections they would otherwise not have made. Ridiculous interpretations are simply indefensible. At best, they may seem forced. Mostly, however, ridiculous interpretations mean lousy writing, and lousy writing means poor grades. So, let’s keep it tastefully contentious.


Yentl Syndrome: A Deadly Data Bias Against Women — Longreads

Yentl Syndrome is pervasive. Women’s concerns and complaints are too often dismissed as mere frustrated ramblings. When men ramble, on the other hand, it’s “musing”; their concerns are assumed to have greater depth and urgency. The term may have been coined in the field of medicine, but the bias against women is more deep-rooted and potent than we’d like to admit.

Thinking about Alternative Energy, Nuclear Energy, and Consumption

Any energy drawn from sources other than fossil fuels is considered “alternative.” An essential characteristic of alternative energy is that it does less harm to the environment. However, alternative energy is not necessarily renewable, although the two terms are often used interchangeably. For instance, nuclear energy is a form of alternative energy, but it is not renewable. The use of alternative energy is typically advocated to mitigate global warming, a well-known adverse effect of high carbon dioxide emissions. In other words, it is argued that generating alternative energy emits less carbon dioxide than burning fossil fuels.

Moreover, some forms of alternative energy can be cheaper to produce: typically, they do not require complex machinery or equipment.

Nuclear Energy Is Alternative, but Is It Really “Green”?

Nuclear energy is not only expensive to produce but also difficult and dangerous to manage. Advocates of nuclear power claim that it is both “green” and best suited to address the mounting global demand for energy. Critics, on the other hand, rightly remind us that nuclear energy is non-renewable and that nuclear plants are extremely vulnerable to natural disasters and human error. It is equally difficult to manage and dispose of nuclear waste safely, which is especially worrying since nuclear plants produce a great deal of it. In addition, the demand for nuclear energy has spawned black markets for elements such as uranium and thorium, which can be used to manufacture nuclear weapons. Unfortunately, building such a weapon is not as difficult as we would like to believe. Therefore, it is important to question whether nuclear energy should be regarded as a form of alternative energy.

How Green Is Alternative Energy?

As mentioned earlier, alternative energy is essentially green and generated from sources other than fossil fuels. Nuclear energy, on the other hand, is not green. At the same time, it is untenable to argue that alternative energy is a panacea for all our problems. For instance, electric and hybrid cars, often hailed as green vehicles, are reportedly very expensive to manufacture, and their production allegedly involves substantial carbon dioxide emissions. However, many environment-conscious communes have shown that solar energy can be both adequate and green. It remains to be seen whether solar energy can be harnessed in a green and inexpensive manner for social settings larger than communes.

The Bottom Line

It is clear that we must urgently monitor what and how much we consume. In fact, it can be argued that reckless consumption is the main cause of human-driven global warming. Even if one is disposed to consider global warming a myth, one cannot deny what may well be a causal link between excessive consumption and ozone depletion. Therefore, any discussion of alternative energy cannot be complete without careful consideration of related issues such as overpopulation, development and urbanization, deforestation, and energy conservation. These interrelated issues bring the question of consumption to the forefront. In effect, these issues challenge our collective mindset—our unabashed preference for unchecked growth and productivity. Ideally, discussions about alternative energy should urge us to want and consume less.

Technology and Higher Education: Developments and Predictions

This is a guest post submitted by Megan Nichols.

Education is changing and adapting to meet the needs of the modern, technology-driven world economy. In turn, how we go about educating students at the K-12 and university levels on using and building technology will likely influence how those technologies develop. Here’s a look at the ways in which education and technology are becoming increasingly inseparable and interdependent in the “Tech Age.”

What Qualities Must Education Foster in the Tech Age?

What kind of “pressures” does the technology age place on today’s schools? To put it another way, what kinds of adults and members of the workforce must today’s students become?

Let’s consider some of the challenges and opportunities presented by a world economy that increasingly relies on digital platforms to engage in consciousness-raising, commerce, and collaboration:

Image Credit: Mohamad Hassan
  • Comfort with Telecommunication: A global economy requires top-notch collaborative skills, as well as familiarity with digital tools that can link multiple states and continents. By some estimates, as much as 70 percent of the global workforce already works remotely at least once a week. This number is likely to increase as technology continues to decentralize workspaces. Higher education, therefore, should equip students to work with telepresence tools and other relevant technologies.
  • Knowledge of Digital Functions: Students must become “digital citizens,” which includes becoming better-versed in the ways information and disinformation are disseminated, how search engines work, and how automation might upend what we think we know about economic theory.
  • Ability to Adapt: Tomorrow’s workforce will be more mobile and more multi-disciplinary than ever, according to experts. It’s not enough that students learn—they must also “learn how to learn” in order to remain prepared and adaptable. If the technology age heralds one thing, it’s the threat and opportunity of nearly constant change.

Making sure education adapts to the requirements of the technology age requires, perhaps above all, that we instill in our K-12 and college students a true love of learning. To this end, we must make sure we’re building effective—and cost-effective—tools for the classroom that inspire curiosity and open minds; help improve retention of critical lesson materials; and prepare students for a lifetime of self-directed learning, solving problems thoughtfully, and navigating change and complexity with confidence.

Technology Reinvents Education

Education today isn’t just about preparing students for a future in technology-heavy careers. It’s also about leveraging emerging technologies to improve the learning experience—that is, to improve information retention and provide easier digital access to class materials and productivity-boosting classroom assets, among other things.

College and K-12 classrooms of the near future will most likely host the following technologies as a matter of course:

  • Online class dashboards for accessing and submitting assignments, and for enabling personalized, self-paced learning
  • Tablets and smart whiteboards to provide engaging “learning surfaces”
    Studies indicate that digital whiteboards help improve lesson clarity, as well as student motivation and engagement with the material.
  • A full-scale pivot in the near future to (i) automatic attendance software, using face recognition or biometrics, and (ii) tardy-tracking software, which specifically addresses students who find it difficult to report to class on time
    Technology should enhance teachers’ experience, too. Attendance and tardy software are likely to lessen the workload of our already-busy teachers.
  • Attention-tracking software for teachers who want to know who’s engaging with their lessons in good faith, who’s excelling and may require specialized placement, and who’s possibly struggling

Right now, major technology companies—such as Microsoft, Apple, and Google—are throwing their hats into the education “ring.” They’re developing collaborative presentation and word processing programs; building smarter and more capable portable tablets and other implements, such as smart whiteboards; and making relevant lesson materials even more accessible through digital platforms, such as podcasts, iTunes U, and Amazon’s digital textbook rentals.

Teachers can now find several resources online for learning more about these developments and how to effectively incorporate technology into the classroom. In the contexts of K-12 and higher education, technology doesn’t just raise the bar for students—it also raises the bar for educators.

Education Needs to Prepare Makers (Not Just Consumers)

Recent technological developments, especially digitization, have wrought significant changes in the economy and the job market. Therefore, we need an education system that creates “makers,” not just “consumers.”

Consider automation. The autonomous nature of contemporary manufacturing and material-handling equipment means the modern shop floor or distribution center employee won’t merely be operating a machine someone else has built. Assembly line workers and HVAC technicians are just two relevant examples. Specialists in these fields are increasingly expected to possess a solid educational background and practical knowledge of programming, digital troubleshooting, machine code, APIs, open-source platforms, and the manifold ways in which physical and digital systems interact.

The writing has been on the wall for some time now: programming courses are essential not just at the higher education level but also at the K-12 level, especially if we are to foster what has been called the “maker culture.” Steve Jobs was ahead of the curve when he pointed out, years ago, that programming should be a requirement in modern schools. Today’s makers write apps that keep our personal lives and our enterprises organized and mobile—they are good at solving problems and know how to think critically and logically.

In a world where technology plays a central role in everyday life, we’ll be hurting ourselves in the long run if we don’t urge our K-12 schools to begin teaching at least the basics of coding at an early age, when the human brain is at its most plastic. Some experts recommend children start learning the fundamentals of programming as early as age five.

Here’s another way to look at this: because STEM courses—Science, Technology, Engineering, and Mathematics—are so deeply entwined with values such as self-directed learning, confident troubleshooting, and reasoning, they provide a blueprint that may in turn allow schools to create a “whole-student” approach. In other words, students who are exposed to the sciences—including mathematics, computer science, and programming—from a young age grow up better equipped to apply these skills and think critically even beyond the classroom.

It’s a happy accident that these STEM skills are also highly valued in the job market. Moreover, machine learning specialists, data scientists, and developers not only draw high salaries but also enjoy quick career growth.

Technology Creates Opportunities in Education and Beyond

According to some estimates, 65 percent of today’s students will find themselves in careers that don’t currently exist. That alone suggests we need an education system that does two things: (i) immerses students in useful and productivity-enhancing technologies and (ii) creates new generations of students with the practical and critical thinking skills necessary to interact with especially complex digital systems.

The world is full of opportunities for people who leave school not just with a genuine love of the material they studied there but also with an appetite to continue learning beyond the classroom.

As we’ve seen here, technology does more than help us modernize the classroom and make teachers’ lives easier. It also prepares students and educators to engage with and improve technologies to drive social, civic, and economic change using digital platforms; better understand automation, robotics, and the ways in which physical and digital systems interact; and indeed much, much more.

Megan Nichols is a technical writer and science blogger. She is the editor of Schooled By Science, a blog dedicated to making scientific topics easy to understand. Keep up with her latest articles by subscribing to her blog or following her on Twitter @nicholsrmegan.

Is Anthropology Science?

Anthropology is being taught at more and more schools and colleges, yet students and parents still find it difficult to determine whether Anthropology is a science. Puzzled, they often wonder, “What, then, could it be if Anthropology is not a branch of science?” The truth is, Anthropology is a malleable discipline. There are anthropologists who rely greatly on the scientific method, and there are also those who dare to go beyond it, relying on interpretation, empathy, thick description, and other such techniques.

Image Credit: Pete unseth

What Makes Anthropology Malleable?

Anthropology is malleable because its concerns are extremely broad. For instance, sub-disciplines such as Physical Anthropology, Biological Anthropology, and Forensic Anthropology draw especially heavily from the anatomical sciences, and indeed from other natural sciences. These sub-disciplines focus almost exclusively on the physical and biological characteristics of humans. On the other hand, sub-disciplines such as Social/Cultural Anthropology, Anthropology of Religion, or Anthropology of Science focus on the sociocultural cues, norms, rules, and practices that characterize a given community, as well as the ways in which the community relates to macro cultures. In fact, one of the key aims of this form of Anthropology is to supply a comprehensive yet inclusive definition of culture. Notably, Anthropology of Science, a rather new branch, aims to critically examine the ways in which the scientific community functions—that is, how the community’s values, interests, and preferences dictate what can be considered valid, scientific knowledge. In short, the branch focuses on the social aspects that enable scientific knowledge-making.

How to Differentiate

As one can clearly see, we can differentiate the branches of Anthropology that rely heavily (if not solely) on the scientific method from those that do not by focusing on the concerns of these branches. That is, the former set of branches largely deals with tangible things (such as the human body, artefacts, etc.), whereas the latter deals with intangible things (such as culture). This is not to imply that anthropologists who focus on culture do not rely on the scientific method. In fact, some anthropologists argue that all branches of Anthropology must be based only on the scientific method. In other words, they claim that culture—and indeed other such intangible things—should also be examined only via the scientific method.

Culture, the Scientific Method, and Thick Description

Should anthropologists rely solely on the scientific method, even to study culture—that intangible, diverse thing? What other methods are there, and why are they preferred by cultural anthropologists? As mentioned above, cultural anthropologists also use thick description to present their ethnographic findings. Thick description requires context, and plenty of it. In other words, it is not enough for an anthropologist to simply record a community’s cultural practices. She must also explain what these practices mean to members of the community, what the members derive from these practices, and why the practices are significant. In effect, the anthropologist must be able to describe to an outsider the meaning a community ascribes to its cultural practices. To do so, the anthropologist must forge a bond with the community she is studying.

Cultural Anthropology today focuses on presenting people’s own views of their culture. This is a recent, yet very important, development. The absence of people’s own views results in one-sided, prejudiced ethnographies. The scientific method is undoubtedly useful. Yet, to insist on using only the scientific method is to insist on presenting only an incomplete picture—one we can call “thin description.” For instance, a survey or a purely quantitative study of people’s opinions is less informative than a study that marries these methods with thick description. That is, through thick description, the anthropologist can contextualize people’s opinions; without context and interpretation, the study would be of merely numerical significance.

This is not to discredit the scientific method—more accurately, this is not to suggest that surveys and quantitative approaches are useless to the anthropologist. To be sure, these are important, and sometimes even necessary, tools. Yet, they cannot be the only tools employed in an anthropological study.

Making the Call

So, if you are keen on Anthropology, try to identify what it is that really interests you. If you’re interested in the different ways in which people seek or make meaning, or the ways in which culture influences individuals and vice versa, Cultural Anthropology may be what you’re looking for. On the other hand, if you’re interested in the physical characteristics of the human body, the ways in which lifestyles shape these characteristics, and so on, Physical or Forensic Anthropology may be your discipline. Remember: there is no hierarchy here; whether you rely on the scientific method or active modes of interpretation, what really matters is the earnestness with which you undertake your research. Finally, to answer the parents’ question: Anthropology is not only a science but also a salient Humanities discipline. Moreover, nothing is off topic for the anthropologist. As studies in the field of Anthropology of Science show, even science is open to critical anthropological inquiry. Anthropology is one of the very few disciplines that critically study and question scientific paradigms. For this reason, it is also a revolutionary discipline.

What Are Nootropics?

If you think “cognitive enhancement” seems far-fetched, think again. For a while now, scientists and researchers have been actively working to produce “super pills” that facilitate, if not directly achieve, just that—cognitive enhancement. Therefore, it is no surprise to find supermarkets selling memory-enhancing pills and supplements over the counter. Otherwise known as “smart drugs,” nootropics are substances that may enhance memory, creativity, and motivation in healthy individuals. Commercial, mass-produced nootropics are a recent invention; they have been around for only three decades or so. Naturally, researchers have yet to fully ascertain their effects. Most cognitive enhancers, therefore, are categorized as stimulants.

don't forget to take your smart pills : iTOUCH, san francisco (2013)
Image Credit: torbakhopper (Flickr user)

Stimulants or Supplements?

In other words, until we know for certain that nootropics do what their manufacturers claim they can, they will be categorized only as stimulants, not as supplements. In fact, since nootropics are enhancers that do not address any particular medical condition, they may be categorized only as stimulants by regulatory authorities even if they are found to be effective. This, however, does not prevent manufacturers and marketers from presenting nootropics as supplements. At the same time, we must remember that the consumption of concentration-enhancing stimulants is not a new trend. To enhance cognition, students have been known to consume drugs typically prescribed for those diagnosed with Attention Deficit Hyperactivity Disorder (ADHD or ADD).

Adderall is one such drug—perhaps one of the most abused prescription stimulants among the US student community—and Ritalin is another (although not as ubiquitous as the former). Students usually take Adderall and other stimulants to boost their academic performance. Prescription stimulants are also consumed by those who need to clock long hours at work, as these stimulants tend to increase alertness, wakefulness, and energy levels.

Worryingly, these drugs can often be obtained without a prescription. What’s worse, their popularity and the demand for them have led to the proliferation of black markets in the vicinity of schools and colleges. Many FDA-approved prescription stimulants continue to be abused by otherwise healthy individuals, purportedly for their capacity to enhance cognition and alertness. Most of these drugs, we must remember, are not intended to be used as cognitive stimulants; rather, they are designed specifically for those with learning disorders or difficulties, or other conditions—be they mental, psychosomatic, or neurobiological. Cognitive enhancers, or nootropics, on the other hand, are designed specifically for healthy individuals. They do not treat or address any particular mental condition or disorder. Whether or not they can actually enhance our cognition without adverse effects, only rigorous testing will tell.

How Are Nootropics Different from Other Stimulants?

Firstly, newer nootropics contain fewer synthetic compounds than the drugs prescribed for those diagnosed with learning or neurobiological disorders. For instance, the main active ingredient in Adderall is amphetamine, which is addictive and has been found to have a number of adverse effects on long-term users. Newer nootropics, on the other hand, contain vitamins and lipids derived from common foods. Manufacturers claim that these newer nootropics also include antioxidants, omega-3s, natural vitamins, and many plant-based ingredients.

Nonetheless, as mentioned above, it remains unclear whether nootropics can enhance cognition without entailing adverse effects. It is also possible that nootropics, much like probiotics, have no effect whatsoever on those who consume them. In all fairness, probiotics have been reported to enhance digestion in healthy individuals. Still, since both nootropics and probiotics are aimed at healthy individuals, it is difficult to gauge the extent to which—if at all—these substances benefit their consumers.

Nootropics: Potential for Abuse or Addiction

We are not yet sure whether nootropics can be addictive. Theoretically, newer nootropics should be less addictive than prescription stimulants, especially since the former do not contain compounds that we know to be addictive. At the same time, however, these newer nootropics contain substances whose effects we do not yet fully understand. For instance, according to one report, some nootropics contain relatively untested ingredients, such as ginseng root and bacopa, a medicinal herb. Because they include untested ingredients, it is difficult to predict what adverse effects these newer nootropics will have. The only way to find out is through rigorous sampling and testing, which, as we know, is itself a fraught process that rightly raises ethical and humanitarian questions.

Are Nootropics Necessary?

Let’s not forget: nootropics are aimed at healthy people. They claim to enhance cognition and alertness. They are a product of a culture that feverishly values productivity. Students, for instance, abuse Adderall because the ultra-competitive academic world leaves them little choice. This is not to suggest that healthy students who consume Adderall in their pursuit of high academic performance are blameless; rather, it is to point out the salience of sociocultural factors in encouraging addictive behavior. That is, healthy students and young professionals resort to Adderall because our culture values and rewards incessant productivity very highly. In this context, we must wonder whether nootropics are at all necessary, especially since they are manufactured for healthy individuals. Don’t we already have caffeine (which, let’s not forget, is also addictive)?

Are Probiotics Really Effective?

Probiotic foods have never been more popular. In 2015, for instance, Americans reportedly spent over 36.6 billion USD on probiotics, yet the efficacy of probiotics remains hotly debated. Mostly sold as health supplements, probiotics are foods filled with live, friendly bacteria and yeasts. Advocates argue that probiotics cleanse our gut by eliminating unhealthy and potentially harmful bacteria and microorganisms. This has led to the proliferation of such foods as probiotic ice creams and yogurts. It is also argued that probiotics can counter the negative effects of antibiotics, which can eliminate good bacteria from our digestive system. In effect, when consumed properly, probiotics will enable the human body to maintain a healthy balance between good and bad bacteria, or so advocates argue.

Image Credit: Bicanski (Flickr user)

On the other hand, skeptics point out that most probiotics have not been subjected to sufficiently rigorous clinical trials. They therefore argue that claims about the benefits of probiotics are rather unfounded. Some observers even claim that probiotics have no effect whatsoever on the human body. What’s worse, very few probiotics have been tested, let alone approved, by the Food and Drug Administration (FDA).

Why There Is Very Little Evidence

Probiotics have flown under the FDA’s radar mainly because it is unclear whether they are a form of medication. While they are almost universally regarded as supplements, probiotics are hardly ever presented as medication. This is primarily because probiotics do not treat or alleviate any particular medical condition, and supplements that do not treat specific conditions are typically not subjected to the FDA’s extensive screening process.

What Are Supplemental Probiotics?

Most probiotic foods are considered supplemental because the human body is already home to a wide variety of microorganisms—mainly bacteria—that enhance digestion, reduce toxin levels, and ensure intestinal well-being. Contrary to popular belief, not all bacteria are harmful. In fact, these friendly microorganisms are an essential component of the human body. Supplemental probiotics, therefore, as the name suggests, aim to supplement and enhance the digestive functions of the friendly bacteria already present in the human body. Yet, it remains unclear exactly what aspects of digestion supplemental probiotics aid.

More interestingly, supplemental probiotics are not only natural but are also a form of live food, which makes it all the more difficult for regulatory bodies to classify them. Moreover, probiotics have different effects on different people, and without rigorous study, it is very difficult to ascertain the efficacy of supplemental probiotics. Doctors have been urged to recommend only those probiotics that have been tested and approved by the FDA. Nonetheless, most supplemental probiotics are sold over the counter, typically without a prescription. Probiotics, it is claimed, can alleviate a wide range of conditions, from digestive discomfort to eczema and other skin conditions. Probiotics are also presented as supplements that can prevent viral infections and allergies. In addition, it has been argued that probiotics can improve oral health.

The Negative Effects of Probiotics

The negative effects of probiotics have not been fully examined. Yet, they are known to cause digestive discomfort—ironically, the very condition they are meant to alleviate. Adverse effects typically include gas and/or bloating. Therefore, it is best to consult a physician before consuming probiotics. It is equally important to look for FDA approval before purchasing probiotics over the counter.