Find AMP on ISSUU at http://issuu.com/amp01
Hey guys. Thanks for hanging in there while the old AMP website was gathering cobwebs and the new AMP website was struggling to get off the ground. I’ve spent the past month or so lashing this machine together against the cold anvil of WordPress, and I think it’s finally at a point where it can be safely revealed to the very contributors whose work it was built to display. The new site was intended from the beginning to be easy to use, easy to maintain, and easy on the eyes, so expect various updates and improvements on all three fronts throughout the Summer and coming Fall semester. We want this website to be more than just an archive of issues we cut free from the zip-tied stacks in the supply closet: we want it to be a showcase of your talent, your effort, and your creativity.
If you wrote for us in the Spring 2014 semester, your hard work is now available for viewing — in a tasteful sans-serif font, no less — on computers, tablets, phones, and virtually anything else that can browse the internet.
If you wrote for us a little further back, don’t worry. We’re currently preparing to begin the process of uploading AMP’s 10-year-deep backlog to the website, one issue at a time. If you’re interested in fighting the good fight and helping us with this monumental task, we may be prepared to offer mug or t-shirt flavored compensation. (In addition to the bosom-warming feeling of a job well done.)
If you haven’t written for us yet, now is the perfect time to start! We may not publish during the Summer, but we’re always looking for contributors and ideas. We want the first issue of the Fall semester to be even better than this Spring’s May issue, and that means we need your opinions, your passions, your articles, your illustrations, and a little bit of your time to make something great.
It’s easy to blame. It’s easy to label Nazis, North Korean soldiers, or Chinese Communist officers as animals with no remorse and a penchant for merciless violence. Animals are beings without choice or deliberation; they are without emotional influence or morality. Those that kill in the Reich uniform are easily labeled as them—the perpetrators, the arbiters, the harbingers of death. Rarely is there any sympathy or understanding for the men who laughed, tears streaming down their faces, as they slaughtered women and children with automatic weapons. The fear in their hearts that they, too, would be killed by their superiors for showing sympathy or hesitation toward the “enemy” pushes them to extremes, and so on up the chain of command to the few truly sociopathic leaders. This is the terrible genius of these regimes: a few bad men (or women) create a system of fear so powerful and all-encompassing in its influence that good people become capable of crimes few would consider committing on their own.
These “cults of personality” transform leaders into Christ-like symbols under which the average person is meant to quake. These leaders enter realms of instability and strife—the Weimar Republic, China, Korea—preaching salvation and prosperity to their subjects, who eagerly absorb the rhetoric in the fervor of desperation. In the safety of public trust, these leaders then consolidate power to resemble the fascist dictatorships we are familiar with in modern history.
It’s not as if the Germans were actively looking to commit genocide in their struggle to rebound from World War I—quite the contrary. They, like most every society, were looking for the greatest good. This greatest good happened to be the genocide of a race of people at the behest of a charismatic, anti-Semitic politician. Most would say that such a wrong could never happen at the hands of good, normal people, but psychology says otherwise.
An experiment conducted by Stanley Milgram in the 1960s required participants to deliver electric shocks to other “participants” (who were part of the experiment, and were not actually shocked) at the command of the researcher for incorrect answers to a series of questions. These shocks increased on a scale from “mild pain” to “danger: severe shock” and “XXX”. Disturbingly, the participants overwhelmingly complied with the researcher’s commands, even over the pleas of the person being shocked; 65% of the participants delivered the maximum shock. The importance of this experiment lies in why, exactly, the participants complied: they saw the researcher as competent and assumed that the study was safe, especially given that it was sponsored by Yale, a trusted academic institution.
Instances supporting the findings of Milgram’s experiments have occurred around the world, including the atrocities that occurred under General Shiro Ishii of the WWII Japanese Empire. Researchers for Unit 731, a secret group formed under the Kempeitai, subjected POWs, women, and children to live vivisections, weapons testing, and biological warfare. Subjects would have organs such as the brain, liver, and stomach removed; limbs were amputated to study blood loss, sometimes being reattached to opposite parts of the body. The Japanese participants believed the experiments would lead to important developments in medical treatments for Japanese soldiers, and thus were willing to continue.
However, this research only explains so much. When Milgram’s study was conducted with a “rebellious peer” present, the compliance dropped sharply, with nearly all participants refusing to deliver the maximum shock. If the participant’s trepidation was met with similar feelings from other “participants”, they felt validated in their disobedience, and thus the command of the authority figure was rendered ineffective. Dissent, no matter how slight, places doubt in the minds of the subjects.
As such, the primary weapons of an autocratic leader are fear and the systems that maintain it. Fear-based regimes routinely make troublesome personas “disappear” at opportune moments; malcontentedness becomes a taboo topic in the interest of personal safety. Even children making innocent jokes or statements, a scenario described in Chen Ruoxi’s short story, “Chairman Mao is a Rotten Egg,” can result in grave consequences for both the child and his family. Entire communities of neighbors, families, and friends are sown with distrust for one another; not wanting to be punished as an accessory, they police one another for even the slightest of infractions.
These findings are readily observable in the context of dictatorships and cults of personality. In North Korea, Kim Jong-Il was presented as a figure above the common man, being cited in propaganda as a fashion trendsetter, the world’s best golfer, the inventor of the hamburger, and even as supernatural. This propaganda served to position Kim as both fearsome and worthy of worship from his citizens, and it has continued under his successor, Kim Jong-Un. Kim Jong-Il, like other dictators, maintained his position of power by eliminating the chances of opposition and dissent by a “rebellious peer” in his citizenry.
Dozens of similar cults of personality have existed across cultures, geographical locations, and historical influences: Benito Mussolini of Italy; Muammar Gaddafi of Libya; Joseph Stalin of Russia; Adolf Hitler of Germany; Rafael Trujillo of the Dominican Republic; Sun Yat-sen, Chiang Kai-shek, and Mao Zedong of China, all during different periods of the 20th century; even Iraqi leader Saddam Hussein had a cult following before his death.
And, indeed, these cults share unusual traits: depictions of the leader are erected throughout these countries—statues, paintings, photographs—and are often treated as reverently as one would treat a religious shrine or emblem. These images are accompanied by collections of songs, pledges, dances, and other celebrations in honor of the leader, as well as propagandistic media that often depict the leader as heroic, supernatural, or chosen by God—under Rafael Trujillo, churches in the Dominican Republic were required to post signs reading “Trujillo on Earth, God in Heaven”. In the event of opposition, powerful systems of punishment are put in place: Hitler had an army of Schutzstaffel as well as his many concentration camps; Mao had laogai (“labor reform camps”) and a multitude of covert spies.
The result is a culture of dogma and reverence so oppressive that any reasoning the populace may attempt about their leaders’ actions is shoved into the realm of taboo. Indoctrination is a powerful force, and it keeps otherwise good people in a trap of circular logic that prevents them from ever seeing the true immorality of their actions as agents of their leader. As they see it, the deaths committed by their hands are a necessary evil on the path to the greater good promised to them. Looking down the barrel of that weapon is a man who sees God—an angry and vengeful one.
President Barack Obama is certainly one of the most influential presidents America has ever seen. Not only is he the first African-American elected to the nation’s highest office, but he has also been heralded for his many domestic accomplishments, such as guiding the country out of the Great Recession, passing healthcare reform, and dramatically improving America’s global image. However, in spite of these accomplishments, it cannot be denied that there is one area where Obama has been found lacking: foreign policy. Especially in the Middle East, one of America’s key regions of interest, Obama has failed to implement an effective foreign policy during his two terms as president.
At this point you might be thinking: Hold the phone. Didn’t Obama end the War in Iraq, almost end the War in Afghanistan, and kill Osama bin Laden? And yes, it’s true that Obama has a few accomplishments when it comes to Middle East policy, but the list ends there. In addition, most of these “accomplishments” were essentially foregone conclusions. With the costs of war skyrocketing into the trillions of dollars and the economy in a major recession, it was common sense to begin de-escalating these military efforts. And the credit for bin Laden’s death should go to the military and to advances in technology which enabled troops to locate and identify the terrorist mastermind, not to Obama, whose only role was deciding when to execute the attack.
In the meantime, Obama has been ruining America’s international reputation and doing little to ensure that countries in the Middle East achieve higher levels of economic development. Consider the pullout from Iraq and Afghanistan: The U.S. left these economies in shambles and did not make a good-faith attempt to reconstruct what it had destroyed. In Iraq, most of the infrastructure programs launched by the U.S. focused on areas that were lucrative for private companies, while ignoring the true needs of the country, such as agriculture development. Iraq now ranks 160th in the world in terms of standard of living, just above Gaza, and lacks key government services such as reliable water, electricity, and waste disposal. Afghanistan has proved difficult for Obama as well, and his relationship with Afghan President Hamid Karzai has been tenuous at best. Because there is a great deal of anti-American sentiment in Afghanistan, Karzai has adopted a savvy but scary political strategy of distancing his government from the U.S. and pandering to the Taliban, a strong political force in the region. Thus, U.S. influence has greatly decreased in Afghanistan, and corruption has become a major problem. Afghanistan is considered the third most corrupt country in the world, with 25 percent of its national revenue channeled to government officials. Yet the U.S. continues to provide monetary aid to Afghanistan, in spite of the evidence that it is doing little to help the general public.
These mistakes could be made by any president, considering the challenges presented by this region, but there are some mistakes for which Obama bears distinct responsibility. While former President George W. Bush may have made disturbingly terrible decisions with regard to Middle East policy, he was at least consistent and guided by a clear vision about how the international arena should function. The same cannot be said about Obama, whose foreign policy is so scattered it can hardly be identified and whose inconsistencies in Middle East policy have resulted in scandals such as Benghazi and the Syria red line calamity. While Republicans have blown the Benghazi scandal out of proportion and attempted to conduct numerous costly investigations, it cannot be denied that Obama could have done a better job to prevent such rumor-mongering. After four Americans, including U.S. Ambassador Chris Stevens, were killed in the U.S. Consulate in Benghazi in September of 2012, the government had a surprisingly hard time gathering intelligence about the attacks. The FBI did not even arrive in Libya to secure the crime scene until more than 15 days after the attacks had occurred. In addition, different governmental departments disagreed publicly about whether the attack was simply an anti-American demonstration that had gotten out of hand, or rather a coordinated and planned terrorist attack, prompting mass confusion and misinformation in the media. While Obama may not have been able to prevent the attacks, he could have done a much better job ensuring that the government present a unified front and release relevant information to the press in a timely manner.
However, Obama’s biggest failure in Middle East policy can be found in Syria. The U.S. failed to intervene early when this civil conflict was more manageable, instead allowing it to escalate to the point where intervention would be costly, dangerous, and probably unproductive. Yet for some reason Obama still felt the need to bare America’s teeth, only to reveal in the end that they aren’t actually that sharp. With the death toll at 100,000 and the country becoming less and less stable, Obama drew a red line when he told reporters that the U.S. would intervene in Syria’s civil war if chemical or biological weapons were used. Soon after, the Assad regime did begin to use chemical weapons against the Syrian people, including children, and the administration was faced with the decision of whether to intervene militarily. In September of 2013 Obama sought congressional approval for a military strike in Syria, but Congress faltered in giving the President the go-ahead, perhaps wary after the expensive wars in Iraq and Afghanistan. Luckily for Obama, the Russian government came to the rescue by creating a chemical weapon reduction plan that would allow the U.S. to postpone military intervention in favor of a more diplomatic solution in coordination with Russia. A solution? Perhaps, but one that made America and its president appear weak, irresolute, and ineffectual.
These examples clearly show Obama has had more failures than successes when it comes to policy in the Middle East. He may have a chance to turn things around in the next two years, but I, for one, am skeptical. The question for future generations will be whether his domestic record redeems his glaring shortcomings abroad.
On March 31st, the US Army updated its regulations on how female soldiers could wear their hair, and on the following day, April 1st, most commenters on articles regarding these regulations were asking the same thing: “is this an April Fool’s joke?” Many have declared these regulations to be racist against African-American soldiers, as they bar soldiers from wearing their hair in twists or dreads. These regulations have been in place since 2005; however, this new update goes into more detail about the types of braids, twists, and dreads that are unacceptable.
Now before I delve into why these regulations are racist and discriminatory, I will try my best to explain why this issue is important. Black hair is a complicated topic that I could not begin to explain; in fact there are two different Wikipedia pages dedicated to “African-American Hair” and “Afro-textured Hair.” Afro-textured hair tends to grow either upward or outward so it doesn’t fall down against one’s shoulders as other kinds of hair normally would. Many Black women today will wear wigs/weaves, chemically straighten their hair or simply wear it natural. In recent years, there has been a movement amongst Black women to wear their hair in its natural state because the wearing of wigs or chemically straightening one’s hair is seen as conformity to the “white ideal.” However, this movement has been countered with the idea that natural hair is unprofessional because it is often big and bushy. Now the US Army is barring twists and dreads in the name of professionalism.
The regulations state that braids and cornrows can only be up to ¼ inch in diameter and that the bulk of hair on a soldier’s head cannot exceed more than 2 inches from the scalp. The regulations humorously suggest that female soldiers may wear wigs or weaves to cover their hair. Now, I’ve never been in the army, nor have I ever been involved in combat, but I have had extensions woven into my hair, and I don’t imagine that wigs and weaves are a realistic choice in combat. Black women who have natural hair (hair that is not chemically straightened) often wear their hair in braids or twists because it is often difficult to collect natural hair into a ponytail. In order to avoid facing punishment, Black women in the US Army will have to resort to wearing wigs/weaves or chemically straightening their hair. These regulations are the sort of racism that may not be born out of deliberate discrimination but rather the type of apathy and negligence that affects so many other things in the United States.
Institutional racism is something that I’ve heard described as “liberal bullshit,” which is easy to say when you are not negatively impacted by it. There are even minorities who deny its existence because they have been adopted as a model minority. Institutional racism, for those of you who don’t have tumblr accounts where it is mentioned in every social justice post, is described as “the collective failure of an organization to provide an appropriate and professional service to people because of their color, culture or ethnic origin.” No matter who may say otherwise, we continue to live in a version of the United States where the primarily white ruling class makes rules without the considerations of those in the minority. Continuously, white Americans are seen as a “default” and all others must conform to the ideals of white America. It is a white privilege in the United States that a white woman will most likely never have to question whether her natural hair is unprofessional. A commenter on a Huffington Post article about the ban tells of the first time she wore her natural hair to her corporate job. The commenter states that although there were no negative reactions, coworkers applauded her courage and her desire to “stick it to the man.”
It should not be seen as an act of rebellion if a Black woman wants to wear her hair the way it naturally grows. A Black woman should especially not have to conform to societal ideals that were not made with her in mind. It is understandable that the US Army maintains strict regulations regarding the appearance and behavior of its soldiers. However, these regulations seem to target Black women, as they are the most likely to wear their hair in this manner. Changing these regulations would be a step in the right direction, towards equality. If the US Army complies with the various petitions aimed at striking down these rules, it may seem that special considerations are being made for a specific group. However, the thing about equality is that it should reflect the circumstances of all involved. Those who made these regulations showed no consideration for the diversity of people in the US Army, and the lack of regard for afro-textured hair is incredibly apparent.
Approximately 3.2 billion years ago, a set of serendipitous events came together to allow a thermodynamic favoring of a group of molecules to grow in size and eventually divide. This moment is commonly accepted as the beginning of life; all existence since then started from that single-celled organism.
That one-celled organism, through another set of unplanned events which were both genetically and environmentally influenced, evolved into millions of more complex organisms, at the apex of which lies a uniquely intelligent species – Homo sapiens, or us.
Out of seven billion people in the world, all striving to survive the artificial ethos we have built around us, I am sad to say that in many cultural and environmental milieus, individuals are still unable to marry a person of their choosing.
I am aghast at the endless stream of news stories concerning lovers who are murdered or beheaded, often by their own loved ones, for the naïve mistake of choosing whom to marry.
Though many of you might expect this article to single out a religion or a community, it will not. The reason is that it is not one community or religion that has failed, but we as a species, collectively.
A chimpanzee, our closest relative with only a 2% aggregate difference in DNA from human beings, can easily choose a mate of its liking, sometimes even multiple partners, without contention.
While male chimpanzees may compete for females or kill rivals during the courting process, I have not heard of a couple being killed for simply electing to marry one another. Doesn’t that sound completely out of order based on the hierarchy and course of humanity’s evolution?
Take for example, the recent story of the two lovers from Afghanistan who are currently hiding for their lives after eloping and leaving their village to be with one another. Three years ago, a similar couple was executed after falling in love with each other. Honor killings, which are reactive murders enacted against people who have brought shame upon their families or cultures, are not just found in developing nations on the other side of the world, but are also found right here in North America. One only has to look at the case of Jassi Sidhu of British Columbia, killed in 2000 in an honor killing orchestrated by her family in response to her secret marriage, to realize that there are still women who do not have the complete freedom to be with a person of their choice without facing disastrous consequences.
Let’s face it, we don’t always choose who we fall in love with, and the fact that such inclinations could risk our lives or social and familial ostracism is absolutely a step backwards in evolution because, quite literally, there are apes who have been able to get it right.
I look at the sacrifice these couples have to make and once again, evolution comes to mind, because realistically no other animal knowingly makes a choice that endangers its life or position in the community. Yet we choose to destroy that which is most beautiful and powerful in us: the ability to make a choice in all matters, from the mundane to the most enchanting.
Thirty years after the discovery of the ozone hole, a scientific consensus has been reached that human-induced change is also leading to extremes in weather patterns and increasing overall temperatures in various regions. After years of denying global warming, even skeptics such as physicist Richard Muller now say global warming is real and humans are almost entirely the cause. Because stopping climate change instantaneously is difficult, one method of slowing it down is to reduce the usage of coal.
The legislative branch in particular has been uncooperative with large corporations in finding an agreement on the reduction of coal usage. One promising business venture came from American Electric Power, which proposed a plan to strip carbon dioxide from its plants’ emissions and store it in deep rock formations underground. After initially promising to cover half of the cost of research – approximately $668 million – the Senate decided against the expansion of the cap-and-trade policy. The shutdown of this policy, which subsidized technologies that reduced fuel emissions, forced AEP to end its research. Unfortunately, this illustrates only one of many cases where the legislative branch has denied support to green corporations.
Congress also expressed open disinterest in curtailing global warming when it rejected a comprehensive emissions-reduction solution put forward by the Chamber of Commerce. As business elites began to recognize the hesitation in policymaking, they transformed the Chamber into a prosecution team in the environmental trial. It waged war against the EPA’s plan to use regulatory means to control emissions by questioning scientific findings which state that greenhouse gas emissions endanger human health.
Although the legislative branch hasn’t minded playing the role of the obedient sheep, companies like Pacific Gas and Electric, PNM Resources, and Exelon are quitting the Chamber due to its extreme rhetoric and obstructionist tactics. Some businesses have formed another coalition, known as the United States Climate Action Partnership, to come up with a plan for limiting emissions. This association may follow in the footsteps of the Chamber if Congress continues to be unresponsive. This case also gives foresight into the actions corporations are likely to take when mishandled by Congress.
Although Congress has shown a lack of enthusiasm for the biosphere in the past decade, it has historically done an adequate job in passing environmental legislation. In 1970, for example, the landmark Clean Air Act was passed, which regulated air emissions and granted the EPA power to set air quality standards to counter ozone depletion and acid rain. The Clean Water Act was passed two years later, which placed a limit on the flow of raw sewage into rivers, lakes, and streams. According to EPA statistics, two-thirds of the nation’s waters are now safe for fishing and swimming, in comparison to the one-third prior to the act’s passage. The Endangered Species Act of 1973 protects wildlife by banning the killing and trading of all endangered plant and animal species. This dynamic Congress continued initiating change through the 1970s by passing the Safe Drinking Water Act and the Toxic Substances Control Act, which outlawed and controlled industrial pollutants to ensure safe drinking water and public health. However, its role has significantly diminished since 1980, no longer adapting to present environmental dangers. In order to combat global warming, Congress needs to incorporate new technology and more ambitious projects; there is little chance that industries will invest unless Congress provides far stronger financial incentives.
Like Congress, the executive branch has also accepted defeat in the fight to combat anthropogenic climate change. The Obama administration recently permitted oil drilling off the coast of Northern Alaska, the East Coast, and the eastern Gulf of Mexico. It has attempted to disguise this with vague language on drilling projects and by offering optimism about the creation of thousands of jobs, tax revenue, and independence from foreign oil.
However, further investigation proves that these projections are largely misleading due to the interchange of “potential” and “current” reserves, where the former means the amount likely to be discovered over a very long time. The victims of this approval are coastal water systems, whose number of dead zones – sea areas where water oxygen levels have dropped too low to support marine life – has doubled. These water systems support swamps, fisheries, and wetlands that have long hosted biodiversity.
Another lackadaisical response from the White House came after the British Petroleum oil spill of 2010 in the Gulf of Mexico. A methane leak caused an explosion aboard an offshore BP drilling rig, releasing 170 million gallons of pressurized oil into the ocean. Now the biggest oil spill in American history, the BP disaster caused the deaths of 11 people, ulcers and internal bleeding in many wildlife populations, an unbalanced food web, and an estimated $659 million loss in tourism and recreation. After the incident, the executive branch pursued a hands-off policy, allowing BP to make many of the critical decisions. This included approving BP’s choice to use Corexit, a dispersant known to cause significant harm to marine ecosystems, to prevent the oil from spreading.
While its work has fallen short of its campaign promises, the executive branch initially showed enthusiasm toward the environment with the new presidential administration. When elected, President Obama claimed it was the moment when the rise of the oceans began to slow and our planet began to heal. In 2009, he signed the American Recovery and Reinvestment Act, which invested in technologies to advance environmental protection. The commander-in-chief also increased government spending on environmental causes, instructed civil servants to increase the fuel efficiency of America’s cars, promised to double America’s output of renewable energy, and urged Congress to pass a cap on the country’s emissions of greenhouse gases. President Obama made it a national goal to generate 80 percent of energy from clean sources by 2035. The U.S. doubled its renewable-energy generation and became the top researcher and producer of advanced batteries for hybrid and electric cars.
However, these policies were quick to fade in the face of criticism. Some conservative economists and political groups claimed that the power of the executive branch was expanding at an uncontrollable rate. They suggested that the government should seek other effective market alternatives to solve the global crisis, such as an externalities tax on carbon. Unable to find a middle ground, the executive branch has forgotten its promise of virtue and safety.
The judicial branch is the only entity of the federal government that has used its power consistently to protect the environment. In the Sierra Club v. Morton case, the Supreme Court ruled that all an environmental group needed to assert standing in a natural resource matter was to find among its membership a single person with a particularized interest. This means that any individual who hikes, fishes, hunts, etc. in an area can make a case against a project that takes away from the aesthetic appeal of the area.
The courts have also proven in their rulings that they take environmental damage seriously. In the Woburn case, eight families whose members contracted leukemia and died due to water contaminated by W.R. Grace Co. received $9 million in a settlement.
The Supreme Court has also delegated a significant amount of power to the EPA through the Chevron USA v. NRDC case, in which it deferred to the EPA’s interpretation of a statute so long as the statute is ambiguous and the interpretation is reasonable. Massachusetts v. Environmental Protection Agency has also made it difficult for the agency to sidestep its regulatory duties. The case established that the EPA has statutory authority to regulate greenhouse gas emissions from new motor vehicles and must ground its reasons for action or inaction in the statute.
Efforts to change the EPA from watchdog into advocate for polluters have been frustrated by upper-level courts. The U.S. Court of Appeals and the U.S. Supreme Court ordered the EPA to follow the law in requiring utilities to install pollution controls when power plants were upgraded. The Supreme Court has required the EPA to start regulating greenhouse gas emissions from automobiles, and the D.C. Circuit Court has recently admonished the EPA to require reductions in mercury emissions from coal-fired power plants.
Although many issues exist outside U.S. borders, nations worldwide look to the United States for assistance or leadership by example. Not only has the United States failed to meet these expectations, it has also gained a global reputation for deliberately letting international agreements languish in the Senate. In 1998, the Rotterdam Convention took place in the Netherlands to control the international trade of highly toxic chemicals, require proper labeling, promote safe handling, and inform purchasers of restrictions. Likewise, the Stockholm Convention took place in 2001 with the aim of eliminating the production of persistent organic pollutants.
Both of these treaties, however, were held up in Congress, displaying a failure of U.S. leadership to participate in global efforts to protect human health. Seizing the opportunity, the United States exported 28 million pounds of domestically banned pesticides between 2001 and 2003.
Also in 2001, President George W. Bush refused to ratify the Kyoto Protocol, an environmental treaty aimed at reducing greenhouse gases such as carbon dioxide, because it placed no limits on developing nations. When the agreement went into effect in 2005, the United States had failed to join the 141 countries that had ratified it.
In order to create a comprehensive solution to environmental issues, regulations need to be implemented for specific industries, not just nations. Ecological habitats cannot be preserved without making larger cuts in the emission of greenhouse gases. Because the Kyoto Protocol fails to target the largest greenhouse-gas-producing industries, such as power generation, the problem will remain unresolved.
Throughout the 21st century, the United States has actively searched for loopholes in previously signed agreements. One negotiation ignored today is the U.N. Framework Convention on Climate Change of 1992, a benchmark in the movement to stabilize greenhouse gas concentrations. Ironically, despite being one of the first nations to sign the agreement, the United States has since looked for loopholes that let it pollute at desired levels. It has backed a quota system in which each country is given permits for the amount of pollution it can produce. Because these permits can be traded or borrowed against the future, the U.S. can buy cheap permits from unindustrialized nations in Africa and keep dumping pollutants.
While the role of the United States in international policy is weak today, it was once at the forefront of initiating proposals on climate change. The United States became one of 24 nations to sign the Montreal Protocol in 1987, an agreement designed to protect the ozone layer by phasing out the production of ozone-depleting substances such as chlorofluorocarbons. It also joined the 1992 U.N. Conference on Environment and Development in Rio de Janeiro, also known as the Rio Summit. There, global diplomats addressed alternative sources of energy to replace fossil fuels, the production of toxic components such as lead in gasoline, the implementation of public transportation systems, and the growing scarcity of water.
These initiatives were eventually put aside out of fear of higher energy prices, which were projected to bring an economic downturn and energy shortages.
While the United States government defends its policy choices by citing the fear of becoming less competitive in the global marketplace, many countries, including South Africa, Kenya, Brazil, and Denmark, have successfully implemented stringent environmental restrictions.
The intentions of the United States’ international environmental policy have become evident to other nations. South African environment minister Marthinus van Schalkwyk was quick to recognize that the U.S. refused to make a credible and ambitious proposal at the G8 Summit. He stated: “[South Africa] will not compromise on our demand that the U.S., as the world’s largest historical emitter, should contribute their fair share. For our part, we stand ready to take on our fair share of responsibility.” He promised that South Africa would focus on the “creation of more empowering technology, financing framework, and an equitable climate regime.”
The Kenya Climate Change Working Group works with the Kenyan government to establish nationwide regulations, but it too has expressed concern that it needs U.S. support to effect global change.
The Brazilian government has also set realistic targets to reduce deforestation by 80% by 2020, in response to projections that the Amazon rainforest could be wiped out by 2050. Even though the Brazilian government is taking an active stand against climate change, the drift of particulates from major polluting nations such as the U.S. toward the tropics poses a challenge. Brazil’s goals are realistic and can be accomplished by the proposed date only if the United States cuts its emissions to the level promised.
The effects of America’s environmental policy on the world have been devastating. Due to the power and greed within its politics, the U.S. has failed to alleviate the rapid loss of natural sustainability. Many of its policies contribute to habitat loss, climate change, excessive nutrient loads, over-exploitation, and unsustainable use. The United States’ environmental policy is also contributing to the destruction of biodiversity worldwide. These factors threaten the collapse and extinction of many of the world’s most diverse ecosystems, such as coral reefs. And because the United States has not proven strict in monitoring the illegal trade of endangered species, the pressure to destroy habitats for illegal hunting grows in developing nations.
Solving the Crisis
Solutions to environmental issues can only be implemented when the United States addresses these challenges by initiating policy that works with businesses but sets stringent requirements. Until private business is forced to address the issue by an international mandate, inventiveness in attacking the problem will be lost. The first challenge is eliminating the burning of coal, oil, and natural gas, eventually forgoing fossil fuels altogether. The international mandate should also work to conserve all parts of the environment; it should not compromise on biodiversity in order to satisfy economic superpowers.
Government can also help solve environmental difficulties by encouraging and working with big industries to promote the development of renewable energy. It needs to fund research and development and establish a prize fund with rewards large enough to attract knowledgeable scientists into environmental research fields. With elites and bureaucrats now attempting to undermine climate science, it remains to be seen whether environmental issues will be treated as emergencies or dismissed as propaganda. Until that decision is made, Earth can only wait anxiously.
Let me paint a picture for you. Torn sneakers, black-framed glasses (are they real?), a slouchy knit beanie, and an oversized flannel shirt ($3 at Goodwill). Just for good measure, we’ll toss in a cigarette. I think it is safe to say that a fair number of us have the same term come to mind at this point. It’s a hipster, folks. Striving to separate themselves from the agitating constraints of “normal” and “popular”, this group of bohemian youngsters has spawned exactly what it despises: a stereotype. The word “hip” was fashioned in the 1940s to describe enthusiasts of up-and-coming fads, but it wasn’t until sometime in 2009 that the word “hipster” became popularly known. Back then, hipsters were most closely associated with their predilection for “indie” music, and the public’s image of them has grown since. But I’m not here to tell you what a hipster is. I’m not here to discuss how they came about, their liberal political viewpoints, or their exceptional ability to create a wardrobe from pieces at the Salvation Army. No. Instead, it is important to expand upon the very thing hipsters are associated with, its emergence into pop culture, and what this means for them. If hipsters pride themselves on staying under the radar of modernized taste and Top 40 hits, then what happens when their music, their most prized possession, infiltrates the media and society? Here’s a guess: they die.
What is indie music, though? The majority of people most likely can’t say. The word “indie” comes from independent, referring primarily to independent record labels. It isn’t just an abbreviation, like “totes” for totally (my ears are bleeding already); it is its own category and autonomous being. An independent record label is one that takes its own DIY stylistic approach to producing and distributing the works of musicians, separating itself from the control and aid of major commercial record labels. While this is the most precise definition of indie, not all bands that are deemed “indie” are part of an independent record label. Indie is also a genre of music, though a much more enigmatic one. Deriving from grunge, punk, rock, and folk, indie music materialized in the U.S. in the early 1980s, describing the music produced on post-punk labels. Indie music rejected all things conventionally accepted, introducing loud electric guitar and bass-drum-heavy undertones. Then, in the 1990s, when grunge bands such as Pearl Jam and Nirvana swam their way into the mainstream, the word “alternative” (used to describe them previously) lost its cultural significance; these bands were no longer out of sight, out of mind… no longer under the radar. This is when the term indie was truly coined, referring to bands that remained faithful to their independent labels. At this time, when grunge bands were hitting the Billboard charts and Green Day was winning Grammys, the Internet proved to be a useful tool for staying in tune with the underground music scene, allowing music and style junkies to discover bands that were blurring the lines of conventionalism yet remaining ultimately unknown. This is where the pride of hipsters comes back into play. Even with alternative music reigning over the Top 40, a particular group of people still had their ways of discovering a plethora of unique, conceptual bands that you “probably haven’t heard of”. Thus, along came the hipsters.
You are now well equipped to sit in a salon (the intellectual French gathering, not where you get your hair bleached) and discuss what a hipster is, the meaning behind the word indie (you’ll be one of the minority who actually know it), and why they like it so much (it’s that pride-in-the-unpopular thing again). With any exposure to the radio at all, upon hearing the words “Top 40”, artists such as Taylor Swift, Katy Perry, Pitbull, and Ke$ha will probably come to mind. You would be correct in thinking this. You would also be correct in thinking that the hipsters you know so much about despise all of it. But wait: what about that song about the luxury of pop musicians and gold teeth and Grey Goose? That’s by Lorde. Where was she before she became the luxurious musician she was just singing about? What about that dark, apocalyptic song with the cacophonous inhale? That’s by Imagine Dragons. Where were they before their big-ass (36 inches across, to be precise) drum busted through your car speakers from 106.1 Kiis FM? Wait, wait, wait. There’s also that song about walls tumbling down and optimism. That’s by Bastille. Where were all of these people before 2013!? You got it: underground.
The recent turn in the Billboard charts is curious. Why the sudden rush of indie music into the Top 40? Does this mean our society is longing to engage in postmodern pop culture ideals and become cooler than what is already cool? Should we all start wearing grandma’s sweaters and thigh-high socks? And what on earth does this mean for those damn hipsters, now that their secret collection of strange and unknown bands has been unveiled? It’s an indievasion, everyone. The radio is under attack.
Take a prime example. The song “Radioactive” by Imagine Dragons was officially released on the EP Continued Silence on February 14, 2012, but that wasn’t even the beginning of Imagine Dragons’ musical career. The band started by playing 4 sets a week at a bar in Las Vegas before their first EP was released in 2009. They released 2 subsequent EPs before the one that featured “Radioactive”, with notable songs such as “It’s Time” (that really catchy song from the trailer of The Perks of Being a Wallflower). Finally, their highly acclaimed album Night Visions was released on September 4, 2012, though most of its songs were already featured on the band’s previous EPs. Case in point. There was roughly a 14-month gap between the time that “Radioactive” was released and the time it was first played on the radio. Within that time, not to mention the 4 years since the band’s first recording efforts, Imagine Dragons lurked elusively outside the scope of mainstream taste. However, they had somehow already amassed a body of fans, clad in black-framed glasses and cardigans, long before their arrival on the radio. Take a wild guess: hipsters. As annoying as it is when you hear one assert the spiel, “I knew this band way before anybody else did, this song isn’t even new”, they’re right. These songs came out way before you think they did. And you can gamble that there were people waiting to discover them, rocking back and forth in front of MacBook Pros, judiciously probing YouTube and 8tracks in order to claim the title of being “the first to know them”. “Radioactive” became the 3rd-best-selling song of 2013, right behind “Blurred Lines” and “Thrift Shop” (let’s not even get started on the hipsters in Macklemore’s fan base). 6 million copies of the single later, Imagine Dragons are Grammy award winners, going on 4-part tours and selling out stadiums internationally.
Just like Pearl Jam and Nirvana in the 1990s, it looks like Imagine Dragons aren’t so much under the radar anymore, either. If the term “alternative” was cast aside then, is “indie” on the same track now? As we know, indie refers to independent production and a largely unknown genre, but with so much popularity, it isn’t really so different anymore.
This, of course, isn’t the only example by any means. The song “Royals” by Lorde was released on November 22, 2012 on her EP The Love Club. It was featured again on her debut studio album Pure Heroine, released September 2013. It didn’t even gain play on the radio in the United States until fall of 2013. We have reached yet another large gap between studio release and radio play, this one just short of a year. Within that time period, indie music gurus pored over Lorde’s 5-song EP and blasted “Royals” and “The Love Club” alike from grainy, used-car speakers. That is, until “Royals” sat at number 1 on the Billboard Hot 100 for 9 straight weeks. A formerly unheard-of indie artist has reaped fame once again, so now what? The hipsters have to keep X’ing bands off their lists. And those lists will keep downsizing, as musicians such as Macklemore, Bastille, the Neighbourhood, and American Authors creep (or plow) their way onto Billboard as well.
With hipsters claiming credit for the discovery of all of these indie bands, and thereafter for the gleaming success of said bands, it becomes necessary to wonder what is even going to happen to hipsters as their exclusivity lessens. Their style is spreading (I know you see Converse all over the place too); their music is spreading (who wouldn’t want to scream the words to “Radioactive”?). The poor hipsters have to keep digging further underground to discover new music to claim as their own, and as those bands surface on the radio, they’ll have to keep disposing of them and digging deeper. The indievasion isn’t going to stop. Hipsters’ musical gold is being shared with the world. “Alternative” is dead, and the term “indie” is seemingly right there with it, as considerable achievement on the mainstream charts separates it from its original uniqueness. And if indie is dead, then so are hipsters. We might as well help them dig their way down 8 feet under and bury them. After all, where do hipsters keep their gold? Underground. (Insert laugh here).
The end of another semester is upon us once again. Summer is rapidly approaching, and with that, many students may have plans to travel during this wonderful time away from school. Some may travel within the state of Texas, while others may trek across the U.S. For those who plan to travel out of the country to places such as South America, the question of safety may be of great concern. Although this is a reasonable concern, what are the real dangers of traveling to this region of the world, and what can be done to minimize those potential dangers?
It seems to be the general consensus among average Americans that traveling to this region of the world is extremely dangerous. This image of dangerous travel only makes sense. It is not uncommon to turn on a major news network and hear horror stories. Do a Google News search of any country in South America and you will see so many stories that they will surely make any traveler wary. The major news stories on Colombia include political corruption, instability, Marxist guerillas, and an American who was arrested on suspicion of terrorist activity. The way news outlets and other forms of media portray the danger that lurks for the unsuspecting American tourist creates and feeds into this fear. But do travelers really have that much to fear?
When speaking to Americans who have ventured to South America, the answer seems to be no. A recent acquaintance, who spent the last year backpacking and living in Argentina, Chile, and a few of the bordering nations, had nothing but positive things to say about life there. He explained that growing up in the United States, we are often taught to fear anywhere south of our border, but really there is nothing in these countries to be afraid of. The passion he had when he spoke of the people, the lifestyles, the cities, and the parties would convince even the faint of heart to head down there the next day. His basic opinion on safety in these countries is to just use common sense: if you do not act like a tourist, you will not be seen as one. He did mention that street crimes, such as car break-ins and muggings, can be an issue, but once again, just be cautious of your surroundings. He also recommended having at least a basic understanding of the language. Not only does it make communication easier, but it is also a sign of respect that will go a long way.
As it is, you can be a victim of crime anywhere in the world. No matter where you go, there will always be someone willing to commit crimes.
When taking a look at crime rates, it is interesting to find that you may not be any safer in the United States than you would be in, say, Argentina. Comparing murder rates in Detroit, Michigan and Buenos Aires, Argentina yields rather shocking results. The murder rate for Detroit in 2011 was 48.2 murders per every 100,000 people, compared to a rather small number, just 6.93 per every 100,000, in Buenos Aires. This low number is also similar to that of cities in neighboring Chile and Uruguay. It is also striking that the highest murder rate reported in Argentina is only 12.79 per every 100,000, still significantly lower than Detroit’s. If crime numbers are a determining factor when it comes to traveling safely, one is much better off walking the streets of any major city in Argentina than in Detroit.
So, say you now see the light, and are convinced that traveling to beautiful South America is for you. What are some things that a backpacking college student needs to know before they begin their journey?
The first piece of solid traveling advice is probably the most simple and easiest to follow. Use common sense! Just because you are now in another country does not mean that you shouldn’t use your head. If anything, you need to use it more. Recognize that your actions can have consequences, so do not put yourself in a situation where you can easily be harmed.
For example, in recent months, the body of a tourist was found dumped on the side of a dirt road in a rural area of Colombia. When this incident was investigated, authorities found that the deceased man had traveled to the location with people he barely knew in order to take part in a tribal ritual that involved drinking yage, which causes psychedelic hallucinations. He took it once, then decided to partake a second time. This second ingestion of the toxic substance killed him.
Of course, the news outlets that reported on the incident seemed to put the blame mostly on the local people the man was spending time with. But is this really a fair assessment of the situation? Yes, the locals gave him the yage, and then they dumped his body on the side of a road out of fear of repercussions. But the blame can only truly be placed on the traveler. By putting himself in an unfamiliar situation and taking a known toxic substance, he was ultimately responsible for his own death. Occurrences like this only feed the false image that too many people accept as the truth.
The second piece of advice all travelers should be aware of is to avoid drinking local tap water and be cautious about the food you eat. Although many of these countries now meet first-world standards in major cities, following this advice is not hard to do and will save you a great deal of pain and endless time in the bathroom. Always make sure your meat is well cooked, try not to eat food sold by street vendors, and go to restaurants that seem busy, since this most likely means the food has not been sitting around for long.
There seems to be this myth that every American tourist will be robbed or kidnapped. Even though a person could realistically be the victim of theft nearly anywhere, something about being the helpless American in a foreign country who has everything taken from them stirs up images that strike fear into the heart. If the picture has not come together for you yet, this is also not true. Statistically, you are very unlikely to have this happen if you take the necessary precautions.
The final, and perhaps best, piece of advice for keeping you and your belongings safe is to never leave items such as cameras, phones, and the like sitting out when you are eating at a restaurant. When walking the streets, always carry small amounts of cash, keep important documents in the front pockets of your pants, put cameras in some form of bag or backpack, and try to dress as closely to the locals as possible. In the event that you are mugged, it is best to just give them what they want.
When it comes down to safety when traveling in this region, you play a crucial role in determining what happens.