Michael Sliwinski – Law Street
Law and Policy for Our Generation

Is the Separation of Church and State Over?
July 24, 2017 | https://legacy.lawstreetmedia.com/issues/law-and-politics/separation-church-and-state/

Do recent Supreme Court decisions mark a departure from tradition?

"First Amendment" courtesy of dcwriterdawn; License: (CC BY-ND 2.0)

Over the last several years, the separation of church and state has become a prominent part of many legal battles. From the White House to the Supreme Court, the government has started to reinterpret a legal concept that dates back to the founding of the country. But where exactly did the notion that the government and religious institutions should be distinct come from? Read on to find out more about the history of the division of church and state in the United States and whether or not that distinction is in danger of eroding.


History of Church and State

The United States was founded in part by people fleeing persecution at the hands of state-sponsored religions. However, even after crossing the Atlantic, many of these same people were still under threat of religious persecution. The Crown attempted to make the Church of England the official church of the American colonies. That effort was put to bed as part of the revolution and may have even galvanized efforts to separate religion and the government at the nation's founding. Individual states also rolled back their own efforts to establish state-sponsored religions. Part of the impetus behind this effort came from the writings of Enlightenment and Protestant Reformation thinkers, which had an important influence on the Founding Fathers as they wrote the Constitution.

Although the notion that there should be a division between church and state has been around for more than 200 years, it is not explicitly mentioned anywhere in the Constitution. The first recorded mention of the concept comes from an 1802 letter written by Thomas Jefferson to the Danbury Baptist Association in Connecticut. The idea gained traction and was first invoked by the Supreme Court in Reynolds v. United States in 1879. By 1947, it had essentially become a central part of constitutional law when it was cited as such in the Supreme Court's decision in Everson v. Board of Education.


Precedents and Court Cases

While the specific phrase "the separation of church and state" is not in the Constitution, the distinction is implicit in several aspects of the document. The first is Article VI, which requires that all government officials swear loyalty to the Constitution and prohibits religious tests for public office. The second is the Establishment Clause of the First Amendment, which prohibits the government from establishing any state-sponsored church. The last is the Free Exercise Clause, which prohibits Congress from passing laws that restrict the free exercise of any religion. These provisions were later extended to the states following the adoption of the 14th Amendment.

These constitutional provisions and others have been used in a number of prominent Supreme Court cases, aside from the two previously mentioned. In 1948, in McCollum v. Board of Education, the Establishment Clause was invoked when the court ruled that religious instruction in public schools is unconstitutional. In 1952, in Burstyn v. Wilson, the court ruled that a state government cannot censor a movie simply because it offends people's religious beliefs.

In the 1962 case Engel v. Vitale, the court ruled that school-sponsored prayer is unconstitutional. In 1968, in Epperson v. Arkansas, a state statute banning the teaching of evolution was deemed unconstitutional. Three years later, in Lemon v. Kurtzman, the court created a test to determine whether a government action violates the separation of church and state. The test has three parts and can be used to evaluate a law's constitutionality:

First, the statute must have a secular purpose; second, its principal or primary effect must be one that neither advances nor inhibits religion; finally, the statute must not foster an excessive government entanglement with religion.
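Read as a decision procedure, the test is conjunctive: a statute must clear all three prongs, and failing any single prong is fatal. The short sketch below is only a minimal illustration of that logic, not a statement of legal doctrine; the function and parameter names are hypothetical simplifications.

```python
def survives_lemon_test(secular_purpose: bool,
                        neutral_primary_effect: bool,
                        no_excessive_entanglement: bool) -> bool:
    """Return True only if a statute clears all three Lemon prongs."""
    return (secular_purpose
            and neutral_primary_effect
            and no_excessive_entanglement)

# A statute with a secular purpose and a neutral primary effect still
# fails the test if it excessively entangles government with religion.
print(survives_lemon_test(True, True, False))  # prints: False
```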

In Allegheny County v. ACLU, the court determined in 1989 that a nativity scene displayed in a public building violated the Establishment Clause. In the 1993 case Church of Lukumi Babalu Aye, Inc. v. City of Hialeah, the court ruled that the city's ban on animal sacrifice as part of religious exercise was unconstitutional. There are many other cases as well, but these notable examples show that the Court has actively defined a level of separation between church and state over the years.


Recent Cases

In several recent cases, however, the pendulum seems to be swinging back toward less separation between church and state. One example comes from 2014, when the court ruled that Hobby Lobby, a privately owned company, could refuse to provide health insurance that covers birth control to its employees on the basis of the owners' religious beliefs. The ruling created an exception to the Affordable Care Act's requirement that all employer-provided health insurance plans cover contraception.

In another ruling from June, in the case Trinity Lutheran Church v. Comer, the court weighed in on an issue that could have a major impact on the divide between church and state. This case centered on whether a private, religious school could use public funds for a secular project–namely, resurfacing its playground with rubber. While the state had initially ruled against the school because it was a religious organization, the Supreme Court ultimately ruled in its favor because it viewed the state's denial of funding to the school as discriminatory.

This ruling, in particular, is important for two reasons. First, it seemed to suggest that Blaine Amendments are unconstitutional. The original Blaine Amendment was a failed 1875 amendment to the U.S. Constitution that would have prohibited funds raised through taxes from going to religiously affiliated institutions. Although the federal effort failed, 35 states currently have their own provisions that prevent public funds from going to religious groups.

The second major potential consequence of the Trinity Lutheran case concerns the extent to which this ruling will apply to funding for other activities conducted by religious organizations. Four of the justices attempted to head off this potential problem by clarifying in a footnote that the decision only applied to playgrounds. However, since only four of the nine justices signed off on the footnote, it is technically not the opinion of the court. The ambiguity there will likely result in future legal challenges, as religious groups will seek to identify new areas where they are eligible for public funding.

The video below discusses the facts of the Trinity Lutheran case in further detail:


The Trump Administration’s View

When it comes to the separation of church and state, like many other issues, the president has so far taken a seemingly idiosyncratic approach that may contrast with some of his campaign promises. In May, he signed an executive order that weakened the Johnson Amendment–part of a law that prevents religious organizations from getting directly involved in politics. That order was actually less controversial than many expected based on Trump's campaign rhetoric, although it remains to be seen whether he will take more aggressive action in the future.

Less moderate is the viewpoint of President Trump’s Secretary of Education, Betsy DeVos. DeVos has been an avid proponent of religious charter schools and even helped finance the campaigns of politicians who supported them. One of the fears following the Trinity Lutheran decision was that it opened a path to funnel tax dollars to religious charter schools, whose curricula would still not be overseen by the government.


Conclusion

The notion of the separation of church and state has existed in the United States for hundreds of years, and in Western Civilization long before that. Although the term is not explicitly used in the Constitution, the division has been established by the courts through their interpretation of it, particularly the Establishment Clause of the First Amendment. Over the years this specific clause, as well as a few others, has been used repeatedly to strengthen the divide between church and state.

However, the interpretation of that separation seems to have shifted in recent years, as the perception of anti-religious bias has grown among many on the right–a perception the Supreme Court has reflected in its opinions. The clearest evidence comes from the recent Trinity Lutheran Church case, which not only allowed a religious school access to public funds but also opened the door for future efforts to direct public money to religious organizations.

This opening presents an unclear path forward. While it is unlikely anyone will try to overtly knock down the proverbial wall between church and state, there are indications some holes might be drilled. While the extent of the recent shift is hard to determine, it does seem likely to continue.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor's in History, as well as a 2014 graduate of the University of Georgia with a Master's in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

How Do Financial Disclosure Laws Work?
July 7, 2017 | https://legacy.lawstreetmedia.com/issues/politics/financial-disclosure-laws/

Are existing laws enough?

"Tax Day March" courtesy of Molly Adams; License: (CC BY SA)

President Trump and many of his advisers have faced criticism for not putting enough distance between their government work and their private financial interests. Ethics watchdog groups, state attorneys general, and even Democrats in Congress have resorted to legal action in efforts to address what they consider to be problematic conflicts of interest. While much about the financial interests of the president is unknown to the public, we can tell a lot from regularly mandated financial disclosures. As these disclosures become particularly relevant with the new administration getting to work, it's important to understand existing disclosure requirements and the people in charge of overseeing the conduct of our government officials. Read on to find out the story behind financial disclosures, the agencies in place to keep politicians ethical, and whether or not the current administration's actions are out of the ordinary.


A History of Financial Disclosure Requirements 

Financial disclosure requirements date back to the 1978 Ethics in Government Act, which was passed in the wake of the Watergate scandal. In an effort at improved oversight, the law created financial disclosure requirements for the president, vice president, all members of Congress (as well as candidates for those offices), federal judges and justices, and certain high-level staff throughout the federal government. High-level staff can include cabinet members, political appointees, agency heads, and others who qualify based on their income and duration of employment. These reports must be filed with the ethics agency that oversees their branch of government by May 15 each year. They must be made available to the public within 30 days of that deadline.

While lower-level government employees do not have to file public financial disclosures, many are still required to submit confidential financial disclosures. Private disclosure requirements generally apply to people whose responsibilities include government contracting and procurement, grant making, licensing, and other areas where conflicts of interest may arise in the course of their work.

So what are all these people required to report? As part of that same Ethics in Government Act of 1978, current federal employees are required to disclose detailed information about their personal financial interests and affiliations as well as some details about their direct family members. Specifically, they must disclose income, gifts, assets, liabilities, transactions, positions outside the government, various agreements, and blind trusts. If assets are held in a qualified blind trust, however, only the value of the assets needs to be reported.

In addition, the STOCK Act, passed in 2012, requires government officials to report transactions totaling more than $1,000 for securities like stocks and bonds. However, reporting is not required for mutual funds. These reports must be made within 30 days of when the official is notified of the transaction and no later than 45 days after the original date of the transaction. The rule applies to all federal officials who also make annual public financial disclosures, and for public officials at the highest level, the disclosures must be posted online. The general idea behind the STOCK Act–short for the Stop Trading on Congressional Knowledge Act–is to prevent government officials from using their unique knowledge for personal profit.
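In other words, the binding deadline is whichever comes first: 30 days after notification or 45 days after the transaction itself. Here is a minimal sketch of that date arithmetic, assuming the two windows described above; the function name and example dates are illustrative, not references to any official filing tool.

```python
from datetime import date, timedelta

def stock_act_filing_deadline(transaction: date, notified: date) -> date:
    """Filing deadline: 30 days after notification, but never more
    than 45 days after the transaction date itself."""
    return min(notified + timedelta(days=30),
               transaction + timedelta(days=45))

# A trade made June 1 but only noticed June 25 must be reported by
# July 16 (45 days after the trade), not July 25 (30 days after notice).
print(stock_act_filing_deadline(date(2017, 6, 1), date(2017, 6, 25)))
```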

The video below explains the STOCK Act in terms of how it applies to government employees:


Oversight

Government ethics offices play a crucial role in the oversight process, creating ethical codes of conduct and acting as hubs for compliance. Members of the executive branch, including the president and vice president, file their reports with the Office of Government Ethics. Members of the House of Representatives file with the Clerk of the House, and their reports are reviewed by the House Ethics Committee. Members of the Senate file with the Secretary of the Senate and the Senate Select Committee on Ethics. Lastly, those required to submit financial disclosures in the judicial branch submit them to the Judicial Conference.

The Office of Government Ethics, or OGE, oversees 130 agencies within the executive branch, including the White House. That includes about 2.7 million employees and nearly 400,000 public and private financial disclosure records. As part of this effort, the OGE makes sure executive branch programs are in compliance with ethics rules and is tasked with training the more than 5,500 ethics monitors in the executive branch. The director of OGE is appointed to a five-year term by the president. The OGE is divided into four divisions: the General Counsel & Legal Policy Division, Program Counsel Division, Compliance Division, and Internal Operations Division.

The House Committee on Ethics was founded in 1967. Originally, it was known as the Committee on Standards of Official Conduct, but its name was changed in 2011. The committee has a unique structure that is designed to give equal influence to each party: it includes five representatives from each party and has a non-partisan staff. The committee is responsible for regulating the conduct of House members and providing guidance on all ethical issues. This effort has been bolstered over the years by legislation such as the Ethics Reform Act of 1989, which barred government officials from earning money for certain activities outside their jobs. Separate groups have also formed from within the committee, including the Office of Advice and Education in 1990 and the independent Office of Congressional Ethics in 2008.

The Senate Select Committee on Ethics also traces its origins back to the 1960s. Much like the House, the Senate had a tradition of policing itself as issues arose. However, the push for an established committee reached a peak in 1964, following a high-profile resignation during an ethics scandal. This led to the creation of the Senate Select Committee on Standards and Conduct. After complaints that the committee was ineffective, it was replaced by the Senate Select Committee on Ethics in 1977.

Lastly, members of the judicial branch, namely federal judges, must adhere to the Code of Conduct for U.S. Judges adopted by the Judicial Conference of the United States. The code provides a blueprint for judges detailing how they should conduct themselves and what activities they should avoid. It prohibits judges from hearing cases in which they have private knowledge of disputed facts, a financial interest in the outcome, personal bias, or prior involvement in the case in a different capacity. However, the code does not prevent them from being active outside of their formal position, and in fact encourages judges to engage in activities that might improve the quality of the legal system. Employees of the judiciary are also expected to uphold the standards set by the conference.


Mandated Disclosure vs. Tradition

Ensuring the ethical conduct of government officials has become a particularly significant issue for critics of the Trump Administration. Specifically, many people question whether the current president and his associates have adequately distanced themselves from their private interests, as governed by the various ethics committees, to the point where decisions are insulated from conflicts of interest. To his credit, the president has submitted his required financial disclosure form, which at 92 pages is twice as long as former Republican presidential candidate Mitt Romney’s. However, President Trump has been just as steadfast in refusing to release his tax returns, a move that goes against longstanding tradition.

Namely, since 1976, every president or candidate for the position has released their tax returns. While Trump has frequently claimed that he cannot release his tax returns because they are being audited, there is no rule preventing such a disclosure. Like Trump, Richard Nixon refused to release his tax returns during the campaign; however, he did release them later in his presidency while under an audit. This distinction between disclosure forms and tax returns is important because while financial disclosure forms are useful, tax returns would reveal things that would not otherwise be available. Examples of that sort of information include effective tax rates and details about charitable giving. Without this information, there is concern over whether or not the leader of the United States has private interests that may dictate his policy decisions. One example is a proposal made by President Trump and many Republicans to eliminate the Alternative Minimum Tax. When a part of President Trump’s 2005 tax returns was leaked in March, it became public that a large portion of his tax burden that year was due to the Alternative Minimum Tax.

Tax returns released by candidates and presidents have created political problems in the past. For example, Mitt Romney’s tax returns showed that he paid a particularly low tax rate relative to his income because most of his income came from investments, which are taxed at a lower rate. Deductions from charitable contributions also lowered the amount he owed. His tax returns also showed that he had approximately $3 million stored in a Swiss bank account, a fact that the Obama campaign used against him in attack ads. And when President Obama released his tax returns during the 2008 campaign, donations to his church furthered a debate over Obama’s ties to his controversial pastor.


Conclusion

Ethics and conflicts of interest are perennial political issues. In the United States, the Watergate scandal spurred a number of reforms that enshrined certain transparency and disclosure requirements into law. Those efforts extend beyond self-policing, as they also created several branch-specific ethics agencies that set guidelines and investigate misconduct.

Adherence to ethical standards has become an established norm in Washington, D.C., going back at least to the Ethics in Government Act of 1978. Critical to maintaining that tradition is holding people at all levels of the federal government accountable. Ethics and transparency have been issues for presidents over the years and gave rise to particularly large scandals for Presidents Nixon and Clinton. And they appear to be issues once again with the current administration. While many forms of disclosure are required by law, a number of traditions have also helped ensure accountability. Given President Trump's decision to reverse a longstanding norm by refusing to release his tax returns, it's possible that Congress may seek to mandate such disclosure with future legislation.

Michael Sliwinski

After Years of Decline, Piracy May Be on the Rise Again
June 25, 2017 | https://legacy.lawstreetmedia.com/issues/world/piracy-back-rise/

Why is piracy so prevalent off the coast of Somalia?


Late April saw a major reversal in what had been a long-running trend. Piracy on the high seas–not including the latest "Pirates of the Caribbean" movie–may be back on the rise after years of decline. Although recent attacks marked the first major assaults on merchant ships in nearly five years, the location of the hijackings, near the Horn of Africa, was familiar. There has also been a rise in the number of attacks off the west coast of Africa. This all comes despite major efforts following a peak in piracy attacks from 2008 to 2011. Read on to find out why these attacks are happening again and whether this latest wave of pirate attacks is the beginning of a new trend or just an isolated spike.


A Brief History of Piracy

Trying to trace the history of piracy is similar to trying to trace the history of other crimes like theft or murder in that there really is no identifiable start date. Nevertheless, most estimates place the beginning of the practice sometime between 1400 and 1200 B.C. near the southeastern coast of present-day Turkey. The practice continued throughout the years, involving every Mediterranean empire and many important historical figures including Julius Caesar.

Piracy was a major tool used by the Vikings and later by the English, most notably when Queen Elizabeth I commissioned Francis Drake to attack Spanish ships during wartime. The United States had its first brush with pirates in the early 19th century, when Barbary pirates from North Africa attacked its shipments and demanded tribute, which ultimately led President Thomas Jefferson to send the navy to fight back. While the frequency of piracy decreased after that, it was never eliminated outright–it mostly just shifted regions, first to Southeast Asia and ultimately to what is now Somalia.


Somali Pirates

Piracy near the Horn of Africa clearly has a long history. Its recent surge has been the result of many factors, notably the region's significant population growth and failing economy, which is the legacy of various colonial governments cutting Somalia into disparate parts. Additionally, many of the pirates themselves–who are generally men between 20 and 35 years old–have few employment opportunities and view piracy as a lucrative means of employment. In fact, piracy has actually led to the development of many symbiotic industries, such as communications, mechanics, and food production. Pirate crews are often formed along clan lines, and some believe that an important reason piracy is so prevalent in Somalia is the amount of illegal fishing in Somali waters. Illegal fishing has significantly depleted the resources available and is likely part of the reason why the local economy does not offer enough opportunity to young men, forcing many to seek alternative means of making money.

The video below looks at piracy in Somalia and some of its underlying factors:

Regardless of the specific reason, piracy exploded in this region and peaked from 2008 through 2011. During this time, more than 700 merchant vessels were attacked. At one point in 2011, as many as 758 individuals were being held for ransom, and the costs to the shipping industry were estimated to be higher than $7 billion. Piracy became such an issue during this period that one high-profile incident even became the subject of the blockbuster movie "Captain Phillips." But in 2012 this trend slowed dramatically, and there were no major hijackings until earlier this year.


Efforts to Fight Piracy

Although it seemed as if piracy in the area around the Horn of Africa just vanished, the decline was actually the result of several factors. These efforts started by land (and sea) with U.S. airstrikes and efforts by Kenyan security forces that pushed Al-Shabaab (Somalia's Al-Qaeda offshoot) out of key areas, including the port of Kismayo. These actions, along with efforts by local clans irritated by the flashiness of the pirates, brought some stability back to the region.

The greatest effort, though, came from Task Force 151. As part of the U.S.-led force, NATO and the European Union sent ships to the area to protect merchant ships. This effort was joined separately by navy vessels from Russia, China, and India. The primary contribution made by these ships was deterrence; however, they also attacked coastal storage areas and captured pirates to bring in for trial. The coalition also shared vast quantities of information with merchant ships, which proved very useful.

The merchant vessel operators themselves also contributed to the reduction in piracy through several actions of their own. According to Foreign Policy, those efforts include, “cruising at higher speeds, installed barbed wire on the lower decks, built ‘citadel’ safe rooms for crews, and toyed with foam machines, high-power water jets, and deafening sonic devices.” Notably, many also employed security teams, which usually consisted of people with military experience.

While it certainly seems that there was a reduction in piracy over the last few years, thanks to a variety of efforts, this may be somewhat misleading. Although Somali pirates have generally refrained from attacking high-profile international targets since 2012, there have still been numerous attacks on smaller local fishing boats. In addition, some suspect that several attacks went unreported, suggesting the problem never really went away but rather changed forms.

Latest Developments

Regardless of what happened during that period, piracy is unquestionably an issue in 2017. For the first time in years, a major hijacking occurred off the coast of Somalia when pirates captured the Aris-13 in March. Somali pirates also hijacked an Indian commercial ship in April. Last year marked the first time since 2010 that the costs associated with piracy went up, reaching an estimated $1.7 billion. The spike has been attributed to several causes. One is declining vigilance on the part of shipping companies–the Aris-13, for example, did not have private security on board and was also cruising in dangerous waters. Aside from the shipping companies, the spike has also been attributed to famine and drought in the area, along with the continued lack of stable government and law enforcement in Somalia.

At the same time, piracy is also increasing off the coast of West Africa. Pirate attacks off West Africa nearly doubled in 2016, according to a report from Oceans Beyond Piracy, an anti-piracy NGO. Most of these attacks have occurred off the coast of Nigeria and have focused on the country's oil infrastructure. The attacks from Nigeria stem primarily from the country's criminal gangs. The tactics employed by West African pirates differ, however, from those of their Somali counterparts. While Somali pirates tend to target large ships, West African pirates kidnap crew members and then go into hiding until they receive ransom payments. Part of this has to do with the nature of the local government: Nigeria, unlike Somalia, has a functioning government and military, which makes seizing large ships more difficult. The presence of a functioning state apparatus has also made an international coalition, like the one off Somalia, less necessary.


Conclusion

Piracy is one of those concepts, similar to terrorism, where it often seems as if the international community is pursuing the incorrect, reactive approach. Namely, instead of taking a step back and asking why people engage in piracy, we try to target individual pirate leaders in the hope that defeating them will end the scourge. In other words, we treat the symptoms instead of looking at the underlying cause.

When rates of piracy went down, the international community pointed to increased vigilance and became complacent. With the threat seemingly neutralized, protection decreased and ships started employing fewer armed guards. Unsurprisingly, piracy returned, and now the community must grapple with the same problems again. If the world at large hopes to be more successful this time, it must understand the history behind this practice and, more importantly, this divided region. Above all else, greater emphasis will need to be placed on addressing the cause, or at least offering an alternative, rather than simply trying to kill a few leaders and assuming that will solve the problem.

If the U.S. and its global partners really want to end piracy, they need to establish a secure and functioning state in Somalia and address the food problem there. In West Africa, there is less to do, since a functional government is in place and pirates rarely try to seize entire boats, focusing instead on ransom payments for individuals. In that scenario, however, the government may need to address the inequality caused by mineral wealth that has left certain groups wanting. There is no one universal approach, other than working to target the reason piracy exists instead of just reacting when piracy occurs.

Michael Sliwinski

Qatar: How the Tiny Peninsula Became the Center of a Regional Proxy War
June 16, 2017 | https://legacy.lawstreetmedia.com/issues/world/qatar-center-regional-proxy-war/

How Qatar fits into the conflict between Saudi Arabia and Iran.

"Doha skyline in the morning" courtesy of Francisco Anzola; License: (CC BY 2.0)

On June 5, several Arab nations led by Saudi Arabia announced they were cutting off all relations with Qatar. Although terrorism was cited as the main rationale for the fallout, alternative explanations abound. Whatever the exact reason, this dissension in the ranks comes at a difficult time in the fight against terror, a fight in which Qatar is a maddeningly prominent player on both sides. It also creates an awkward position for the United States, which has an important base in Qatar as well as one in Bahrain–one of the nations that severed ties. Most significantly, though, this move may be just one more development in the ongoing proxy war between Saudi Arabia and Iran, whose competing interpretations of Islam are grappling for preeminence in the Muslim world. Read on to learn more about the fallout and its impact on Qatar, the United States, and the region at large.


Why the Split?

In total, nine countries announced that they would cut ties with Qatar: Saudi Arabia, Egypt, the United Arab Emirates, Bahrain, the Maldives, Yemen, Libya, Mauritius, and Mauritania. According to these countries, the split is over Qatar's support for terrorist groups and its close relationship with Iran. Specifically, these countries claimed that Qatar has either supported or protected members of ISIS, Al Qaeda, and the Muslim Brotherhood. In response, Qatar has said that these claims have "no basis in fact." Another related issue that may have sparked the fallout is a massive ransom payment that Qatar reportedly made to recover a member of the royal family. The payment is rumored to be as high as $1 billion, and Qatar's neighbors fear that the money amounts to direct funding for terrorist organizations. Finally, the decision also came shortly after the Qatar News Agency reported on comments allegedly made by the Qatari leader in support of Iran. The report prompted backlash from neighboring countries, but Qatar said that the news outlet was hacked and the report was fabricated.

There is some irony to the split, as Qatar is a Sunni-led, Sunni-majority nation, while Bahrain–one of the countries that cut ties–is actually majority Shia, the Muslim sect championed by Iran. As a result of the decision, Qatari citizens and diplomats will be required to leave many of these countries on very short notice.

The video below describes how the recent dispute unfolded:


Impact on Qatar

The Al Thani family has ruled Qatar from the mid-1800s onward. For most of that time, the country was relatively poor and undeveloped. However, with the development of the country's vast natural gas reserves beginning a little more than half a century ago, the nation was transformed, attaining the world's highest per capita income in 2007. Despite accruing vast wealth, Qatar has had issues in the past due to its support for revolutionary movements and terrorist organizations, which has previously caused rifts with many of the countries it is currently clashing with, including Saudi Arabia and Bahrain. (This support may also explain why Qatar was immune to many of the Arab Spring protests experienced by other countries in the Middle East.) At one point in 2014, those countries even recalled their ambassadors, but in that case, the differences were ultimately resolved.

In the most recent case, Qatar would benefit from a similarly quick return to good relations, for several reasons. First, because Qatari flights are banned from these countries' airspace, flight paths to and from Qatar need to be modified to take longer routes, which raises costs and could spell trouble for its airlines. Second, Qatar is a peninsula with only one land border, which it shares with Saudi Arabia. With this border closed, Qatar will have to funnel all food and other supply shipments in by air or sea. This is a particular problem for Qatar because its climate prevents most domestic food production.

In addition, this move could hamper Qatar's construction industry. Qatar was chosen to host the 2022 World Cup, but many of the materials needed to build the stadiums and other facilities pass through Saudi Arabia and will now need to be transported on less direct routes. The dispute will also have consequences for both Qataris living abroad and citizens of other Gulf nations currently living in Qatar, many of whom have been ordered to return home. The impact of these concerns was felt immediately, as Qatar's stock market dropped 7 percent the day after the announcement.

These effects would only pile on the issues Qatar has had to deal with since the price of oil plunged in 2015. The country already ran an $8 billion deficit, amounting to 5 percent of its GDP, in 2016. To combat these changes, Qatar had already implemented austerity measures such as raising utility rates, levying fines, and scrapping programs, including a proposed national health care system. If this ban is long-lasting, it could have even more deleterious effects on Qatar.


Impact on the United States

As with so many other issues, the decision to ostracize Qatar has implications for the United States as well. One potentially awkward connection between the recent fallout and the United States is a speech recently given by President Trump in Saudi Arabia. In his speech, President Trump was very critical of Iran, which many feel emboldened Saudi Arabia to act decisively against Qatar, given Qatar's unorthodox relationship with Iran.

This also has a more practical impact on the United States. Following the 1991 Gulf War, Qatar and the United States reached an agreement that brought the countries closer militarily. This commitment was confirmed in 2003 when the United States moved its forward command base from Saudi Arabia to Qatar. That base, known as Al-Udeid, is home to more than 10,000 American troops and is the site of U.S. Central Command. Despite the recent diplomatic fallout, the U.S. has reaffirmed its commitment to the fight against terrorism and has pledged to maintain its regular activity at the base. Nevertheless, the dispute puts the United States in an awkward position of being allied with both parties and having a major base in a country that has been ostracized by its neighbors.


Impact on the Middle East

As with many issues concerning the Middle East, Qatar and the countries trying to isolate it are also interwoven. While this move is meant to single out Qatar, it will also affect the entire region. This begins with regional organizations. The largest is OPEC, or the Organization of Petroleum Exporting Countries. However, cutting ties with Qatar is less of an issue within this organization given its history of internal conflict. For example, Saudi Arabia’s antagonist, Iran, is also a member and the two have been able to coexist. And at certain points in OPEC’s history, members of the organization have actually fought wars against one another. The conflict does seem to be affecting the price of oil though, as crude oil prices fell the day after the announcement. Investors cited concerns over whether OPEC members could adhere to their pledge to reduce production to drive up prices.

Qatar is also a member of the Gulf Cooperation Council, along with Saudi Arabia, the UAE, Bahrain, Oman, and Kuwait (Oman and Kuwait have maintained diplomatic relations with Qatar). While this alliance is not threatened, some members, namely Kuwait, are calling for a quick resolution to the problem. These sentiments have been echoed by other countries such as Turkey, Russia, and the United States. In fact, although Qatar is the main subject in this situation, the biggest impact in the Middle East is likely to be felt in the ongoing proxy war between Iran and Saudi Arabia.

Specifically, Iran and Saudi Arabia have been engaged in an unofficial proxy war in countries across the Middle East akin to the Cold War. The two nations have taken opposite sides in a number of conflicts such as the ones in Iraq, Syria, and Yemen. They each see themselves as representing the true nature of Islam–the Shiites in Iran and the Sunnis in Saudi Arabia. After the initial decision to cut diplomatic ties was made, Saudi Arabia cited Qatar’s support for “terrorist groups aiming to destabilize the region” as the justification. But at the same time, Qatar has also backed groups fighting against forces that are supported or tied to Iran in both Syria and Yemen.


Conclusion

As the longstanding proxy war between Iran and Saudi Arabia continues, there are a number of places where conflict has flared up. The most recent example is Qatar, which has complicated ties to both countries. While Qatar certainly seems caught in the middle of something larger than itself, it is not totally blameless. The world’s largest liquefied natural gas exporter has supported groups on both sides of the larger conflict.

The recent fallout will have implications for both the region and other prominent actors, notably the United States. Not only is the largest U.S. military base in the Middle East located in Qatar, but some also point to recent comments from the American president as a possible cause of the decision to shun Qatar. The complexities of the situation may explain why leaders from around the world are calling for a resolution as quickly as possible.

In the meantime, Qatar is caught in a bind. While it attempts to resolve this dispute, it must also remain conscious of its image, especially as it prepares to host the next World Cup in 2022. With all this in mind, and Qatar’s proximity to Saudi Arabia, this conflict may need to be resolved sooner rather than later.

Michael Sliwinski

Kashmir: A Region Divided by Three Nations
June 10, 2017 | https://legacy.lawstreetmedia.com/issues/world/kashmir-region-divided-three-nations/

Why has it been so hard to resolve the conflict in Kashmir?

"Pahalgam Valley" courtesy of KennyOMG; License: (CC BY-SA 3.0)

In mid-April, protesters in the Indian-controlled part of Kashmir clashed with Indian soldiers, leaving at least eight dead and more than 200 injured. This came in the wake of elections held in Kashmir that saw only 7 percent turnout, the lowest in 27 years. That record was quickly broken in a re-scheduled election in which only 2 percent of people voted. These are just the latest developments in the conflict over Kashmir between India and Pakistan, which has lasted decades. This conflict is compounded by a number of other issues, such as both countries’ nuclear power status and the involvement of China. Read on to find out more about the Kashmir conflict, its impact on India-Pakistan relations, and how it may eventually be resolved.


Background: A Look at Kashmir

The region of Kashmir has been disputed territory between India and Pakistan since 1947, following British rule and the partition of British India. India, which borders the region to the south, controls the south and southeastern parts called Jammu and Kashmir. Pakistan controls the northern and western parts (and since 1962, China has controlled the northeastern portion). The Indian and Pakistani zones are separated by the Line of Control.

Despite being controlled by India, which is predominantly Hindu, half of Jammu and the entirety of Kashmir are majority Muslim areas. Both religions have long roots in the region, with Hinduism dating back to the area's early history and Islam arriving in the 14th century via Muslim conquerors. The area was also intermittently ruled by Afghan warlords and Sikh princes.

The video below describes how the borders formed over time:


The Conflict

Although Hindus and Muslims had coexisted relatively peacefully for centuries, conflict quickly gripped the area following independence. The origin of the conflict can be traced back to the choice of Maharaja Hari Singh of Kashmir. At the point of independence, the Maharaja hoped to remain independent; however, an armed revolt within the region ultimately forced him to choose between joining either India or Pakistan. Despite ruling over a majority Muslim area, the Hindu Maharaja decided to side with India.

The Maharajah’s decision allowed India to justify sending troops into the region. Originally it was supposed to be a temporary move, with the ultimate goal of holding a local vote to decide who would be in charge. The conflict continued and in 1948 the United Nations got involved at India’s request. The U.N. Security Council passed a resolution calling on Pakistan to withdraw its forces from Jammu and Kashmir while allowing India to maintain a small military presence. Pakistan refused and the vote that was supposed to determine the fate of Kashmir never took place. But in 1951, elections did proceed in the Indian-controlled portions of Kashmir and Jammu.

Fighting picked up again in the 1960s and 70s, but the first conflict was between China and India in 1962. Chinese forces quickly defeated Indian troops and took control of the region they dubbed Aksai Chin. The Chinese and Indian territories were separated by the Line of Actual Control, which is distinct from the similarly named line between Indian and Pakistani Kashmir.

India and Pakistan re-engaged in heavy fighting in 1965 and 1971, following years of unrest in the region. In 1971, the Indian army decisively defeated its Pakistani antagonists. This led to the 1972 Simla Agreement, which called on both parties to solve matters peacefully and clearly designated the Line of Control. In reality, however, this did not stop the violence. The continuing conflict was carried out by insurgent groups from Pakistan, who flooded into Indian Kashmir to fight against its occupation. There was also the Kargil War of 1999, which nearly led to a nuclear conflict.


Peace Process

The peace process in Kashmir has been ongoing nearly as long as the conflict. There were ceasefires in 1948 and 1971; however, neither fully stopped the fighting, and both were largely ineffective. During the 1999 Kargil War, and again between 2001 and 2002, there were fears that renewed conflict between India and Pakistan would lead to a nuclear confrontation. Luckily, due to international intervention, primarily by the United States, the crises were averted.

More recently, progress was made in what is known as the "composite dialogue," which began in 2004. This dialogue ultimately ended with the Mumbai attacks in 2008. However, the goals accomplished during the talks, such as a ceasefire at the Line of Control and passage across it, endured.

Despite this progress, the region once more experienced a surge in violence following the 2008 attack. After a couple of years, relations began to improve, and in 2012 the president of Pakistan met with the Indian prime minister to hold the first high-level talks in nearly eight years. But hope for progress was quickly dashed when India's decision to execute both the last surviving Mumbai attacker and a Kashmiri convicted in a 2001 attempted bombing of India's parliament led to renewed violence.


Line of Control

This situation may also have been exacerbated by the construction of a border fence beginning in 2003. While the numbers suggest the fence has been successful in reducing infiltration by potential militants, it also has its drawbacks. The fence may simply be diverting them to other areas and it is expensive to maintain, as large portions have to be rebuilt after each winter.

Further controversy arose in 2015 after rumors that India planned to build a more solid wall. Specifically, Pakistani officials went to the United Nations and claimed India was planning a wall 10 meters (33 feet) high and 41 meters (135 feet) wide along the entire 197-kilometer border in an effort to make the Line of Control the permanent border in Kashmir (Pakistan does not view the Line of Control as a legitimate border). India denied the claim, and the wall never materialized.

India has also installed something known as a “laser wall” in Jammu within Kashmir and along other parts of its border with Pakistan. This technology is able to detect movement and is useful in places where the topography makes it hard to build a physical fence.

Current Situation

The current situation continues to be unstable in light of the recent disputes detailed above. This includes the election chaos from April and protests in May after a militant commander was killed by Indian security forces. There have also been repeated episodes of violence along the Line of Control, along with violence in both countries’ territories. The two sides are also quarreling over the status of an alleged Indian spy whose fate is being decided by the International Court of Justice.


The Region’s Future

Given the persistent conflict, what is the most likely outcome for this region? An article from the BBC details seven possibilities, ranging from variations of India and Pakistan taking over all or part of the region to Kashmir achieving independence. However, for any of these scenarios to take place, one side would need to give up territory, which has become unlikely amid renewed tension.

China, meanwhile, might also have a major role to play in the region's future. China, whose own claim to Kashmir already played out in a successful war against India, recently signed a $500 million deal with Pakistan. This is just part of a much larger $57 billion plan between the two countries to create a China-Pakistan Economic Corridor, itself part of China's even larger One Belt, One Road Initiative. The plan includes rail lines that would run directly through the contested territory. In response, India refused to even send an official delegation to a recent summit in Beijing.


Conclusion

The conflict over Kashmir between India and Pakistan, and to a much smaller degree China, has dragged on for decades and cost tens of thousands of lives. Both sides have legitimate claims to the region. For India, it is simply enforcing the decision of the Maharaja dating back to the 1940s. For Pakistan, it is about incorporating a majority Muslim region into a Muslim nation. Both nations also have significant issues with their adversary's position–India claims Pakistan seized the areas under its control illegally, while Pakistan states that the Maharaja's original decision was made under duress and is therefore invalid.

Regardless of the reasoning, the combined populations of India and Pakistan are more than one-fifth of the world’s total, and both countries possess nuclear weapons. Thus, it is imperative that the two sides negotiate some sort of a deal or even agree to a third option where Kashmir is independent. Reaching that agreement has proved elusive and with the involvement of other countries, like China, it may prove even more challenging. The situation in Kashmir is reminiscent of the deadlock between Israel and Palestine and unfortunately shows just as few signs of being remedied in the near future.

 

Michael Sliwinski

The ANC After Zuma: What's Next for South Africa?
May 24, 2017 | https://legacy.lawstreetmedia.com/issues/world/anc-zuma-next-south-africa/

As calls for Zuma to step down mount, what will the country's future look like?

"President Zuma" courtesy of Linh Do; License: (CC BY 2.0)

Earlier this week, South African President Jacob Zuma publicly indicated that he might endorse his ex-wife to be the next leader of his party, the African National Congress. Zuma will soon be finishing his term as the head of the party, and rumors indicate that he may even end his term as president early amid calls for him to step down. His potential exit stems from a number of controversies that have reached a fever pitch after more than a decade of his leading the party once run by Nelson Mandela. Read on to find out more about the legacy of the ANC, its current leadership, and how the myriad scandals engulfing President Zuma could affect the party going forward.


The African National Congress

The ANC, or African National Congress, now headed by Jacob Zuma and once led by the luminary Nelson Mandela, started back in 1912. Originally, the party was known as the South African Native National Congress (SANNC) and was founded with the hope of achieving equality for the majority black population of South Africa (it was renamed the ANC in 1923). Despite growing pains due to limited funds and internal squabbles, the party endured and rose to prominence in response to Apartheid, which fueled political activism.

In 1961, the party moved beyond activism and started a military wing known as Spear of the Nation, or MK. The military branch waged war against the South African Apartheid government with support from sympathetic African nations and from the Soviet Union. Apartheid finally ended in 1994, and the ANC quickly came to dominate the first few elections up through the early 2000s. But the party's grasp on power began to slip with the election of Jacob Zuma in 2009, and it slipped further with his reelection in 2014.


Nelson Mandela

One of the key figures in the rise and eventual dominance of the ANC was Nelson Mandela, who joined the party in 1944. Throughout the 1940s and 1950s, he played an instrumental role in many of the party's major programs–including the ANC Youth League, its Defiance Campaign, and the Freedom Charter Campaign–until his arrest following the 1960 Sharpeville Massacre. After his release, and his acquittal in a treason trial that had begun in the mid-1950s, he led the formation of MK and was its first Commander-in-Chief. He was arrested again in 1962 and sentenced to five years in prison for incitement and for illegally leaving the country when he traveled to Botswana. However, when police discovered his diary detailing his plans for armed conflict, he was infamously sentenced to life in prison in 1964 and sent to Robben Island.

Mandela spent the next 27 years in prison. When he was finally released in 1990, the ANC was also removed from the list of banned parties following domestic and global pressure on the Apartheid government. In 1991, he ascended to the leadership of the ANC after two separate stints as its deputy president in the 1950s and 1980s. In 1994, following the country's first fully democratic election, Mandela was elected president of South Africa unopposed. He retired from the post in 1999 and was succeeded by Thabo Mbeki, who had already assumed Mandela's role as president of the ANC in 1997.

The video below goes into more detail about Nelson Mandela's life:

While serving as president of both the ANC and the nation, Mbeki famously dismissed current South African President Jacob Zuma from his position as the country's deputy president in 2005, after Zuma was implicated in a bribery scandal. This led to a split in the party; however, Zuma would ultimately prevail–taking over the ANC in 2007 and the presidency in 2009, while essentially forcing Mbeki into retirement.


Zuma’s Many Controversies

Jacob Zuma was a decidedly different leader from Mandela, although their paths converged in several key instances. Unlike Mandela, a trained lawyer, Zuma was born into poverty to a single mother and had no formal schooling. When he was just 17, he joined the ANC's militant branch led by Mandela. He was imprisoned alongside Mandela and went into exile in Mozambique after he was released. In 1990, he returned and participated in the discussions that brought about the end of the Apartheid government. Zuma's everyman appeal and his adherence to traditional African norms made him popular. These traits proved to be the deciding factors in his rise to power and in his dispute with former President Mbeki, whom he helped force to resign in 2008.

While Zuma shared Mandela's charisma, he differed in his inability to avoid controversy. Long before he became president, he was embroiled in a bribery scandal concerning a large arms deal in the late 1990s. The case was eventually dropped by the country's National Prosecuting Authority almost 10 years later, but under dubious circumstances and just before Zuma was elected president–circumstances suspicious enough that a campaign to reopen the case continues today.

Zuma also attracted negative press when he took money from the South African government to make lavish additions to his home, although he promised to pay back the loans. The country's highest court ruled in 2016 that his actions were unconstitutional, forcing him to apologize and promise again to pay back the loans. Even his personal life has been controversial: he adheres to a Zulu tradition of polygamy and has four wives and 21 children, some of them born of extramarital affairs. He was also accused of rape, although he was ultimately acquitted.

Zuma’s Time in Office

Despite his frequent scandals, Zuma did have one notable accomplishment during his time in office. He oversaw a restructuring of the country's AIDS policies, which made HIV medication much more easily available to South Africans. This was particularly important given that South Africa has the highest number of people living with HIV in the world. It also stood in stark contrast to the policies put in place under Mbeki, who doubted the link between HIV and AIDS.

But Zuma faced even more criticism recently when he fired the country's finance minister, Pravin Gordhan, earlier this year. Gordhan's firing contributed to Standard & Poor's decision to downgrade South Africa's credit rating to junk status. The economic situation is particularly relevant because it was one of the issues Zuma had campaigned on as a way to differentiate himself from his predecessor, whom he associated with political and economic elites.

Unfortunately for Zuma, the economy has not done him many favors. While it narrowly avoided a recession last year and is projected to grow by 1 percent this year, the overall picture remains grim. Although the GDP of Africa's largest economy is growing, its unemployment rate continues to rise and its per capita income is expected to decline. The unemployment rate in South Africa reached a 13-year high of 27.1 percent in 2016.

Consistent scandals and economic hardship have brought Zuma to a breaking point. Efforts are currently underway to hold a vote of no-confidence by secret ballot. Although Zuma has managed to survive past votes of no-confidence, they have never been conducted by secret ballot, which could give members of his own party cover to vote against him. A march in support of the secret ballot also took place recently in Johannesburg. Some have suggested that Zuma may endorse his ex-wife as his successor in an attempt to secure a pardon from the next president. An endorsement could also ensure that he continues to have political influence even after he leaves his post.


What’s Next for South Africa?

Since the end of Apartheid and the beginning of democracy in South Africa, the ANC has never been out of power. However, after the party lost elections in several key metro areas for the first time last year, that streak may be coming to an end. Specifically, in the province of Gauteng, traditionally an ANC stronghold, a private survey showed a drop of more than 10 percent in the party's public support following Zuma's latest round of controversies. Although it is impossible to point to the exact cause of that drop, the survey results indicated that the recent scandals played an important role in last year's local elections.

With upcoming elections, the party must now consider something once thought impossible: the need to form a coalition government in the absence of a clear majority. Despite the seemingly endless stream of controversies surrounding Zuma, the ANC has so far refused to call on him to resign, although many have criticized his decision to fire the finance minister.

The video below looks at the current challenges facing the ANC:


Conclusion

The African National Congress came to prominence while challenging the Apartheid government in South Africa. It became the leading party in the country for the black majority and stood in opposition to the white minority ruling party. The ANC was eventually led by Nelson Mandela, a man who came to embody this struggle. Upon his release from prison and subsequent election, the ANC appeared to have unquestioned dominance in South African politics.

Nevertheless, that dominance has begun to show signs of waning. Several municipalities have already voted the ANC out of power, and the party must now learn to build coalitions–a challenge it has never really faced before but must already grapple with at the local level. Part of this can be attributed to the party achieving, at least to some degree, many of its original goals. But a much larger problem is the political capital lost by Jacob Zuma, the party's current leader and president of the country. Zuma's endless scandals and provocative nature appear to finally have worn thin with voters. The transition of power in Africa's largest economy–and one of its most politically stable countries since the end of Apartheid–bears watching. Even if the ANC retains its dominance, a changing of the guard seems to be coming sooner rather than later.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor's in History, as well as a 2014 graduate of the University of Georgia with a Master's in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The ANC After Zuma: What’s Next for South Africa? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/anc-zuma-next-south-africa/feed/ 0 60866
Safe Havens? The Story Behind Sanctuary Cities https://legacy.lawstreetmedia.com/issues/politics/story-behind-sanctuary-cities/ https://legacy.lawstreetmedia.com/issues/politics/story-behind-sanctuary-cities/#respond Fri, 19 May 2017 15:22:05 +0000 https://lawstreetmedia.com/?p=60725

What are sanctuary cities?

The post Safe Havens? The Story Behind Sanctuary Cities appeared first on Law Street.

]]>
"Washington D.C." courtesy of Mobilus In Mobili; License: (CC BY-SA 2.0)

On May 7, Texas Governor Greg Abbott signed a bill that will allow law enforcement officers in Texas to inquire about people’s immigration status during stops. It also threatens to punish officers who do not cooperate with federal immigration agents. While the signing–which took place spontaneously on a Sunday night–caught opponents by surprise, the places targeted by the law, known as sanctuary cities, have been a large part of the public immigration debate lately. What is less clear is what exactly sanctuary cities are and why there has been so much controversy surrounding them. Read on to find the answers to these questions and the outlook for so-called sanctuary cities going forward.


Sanctuary Cities

So what exactly are sanctuary cities? Although the term is frequently thrown around, there is actually no legal definition for what constitutes a sanctuary city; it is more of a concept. Much of the debate boils down to how local law enforcement cooperates with federal immigration efforts. Several cities and local governments have laws preventing local law enforcement from turning over suspects to federal authorities for deportation. Although this may seem surprising, as the law currently stands, local authorities have no legal obligation to assist federal immigration enforcement. There are currently at least five states and 633 counties with laws that in some way limit law enforcement officers from cooperating with federal immigration agents.

The video below details what sanctuary cities are and how they work:


The Political Battle

On his fifth day in office, President Donald Trump entered the fray by signing an executive order that threatened to punish any local governments that do not aid federal authorities in tracking down and detaining people who entered the country illegally. Not only did Trump's executive order threaten to punish these cities, it also made more people eligible for deportation. Namely, the order now allows anyone who has "committed acts that constitute a chargeable criminal offense or pose a risk to public safety in the judgment of an immigration officer" to be deported. Before Trump's executive order, the focus for deportation had been a crime-based removal rationale, specifically targeting those who had already been convicted of crimes.

The previously established guidance allowed local law enforcement to choose whether or not to hold someone while Immigration and Customs Enforcement (ICE) initiated deportation proceedings. If a local jail had someone targeted for deportation, federal immigration authorities would ask local law enforcement to hold that person for additional time, typically 48 hours, so that they could initiate the deportation proceedings. However, with the recent executive order, counties that limit their cooperation with federal authorities–for example, by declining federal detention requests–would need to change their policies or face a potential loss in federal funding.

Local law enforcement had the option to deny detainer requests in the first place because the Department of Homeland Security determined that holding someone without a warrant while deportation proceedings began could be a violation of the Fourth Amendment. And given additional legal issues surrounding conditions placed on federal grant funding, President Trump's executive order was frozen in April by a federal judge. Regardless of the order's fate, there is still confusion between neighboring districts and fear among law enforcement that orders like these will prevent immigrants from speaking to and assisting the police.

While sanctuary cities have taken on greater prominence under the Trump presidency, the sanctuary movement actually goes back more than 30 years, to another celebrity Republican president: Ronald Reagan. The people arriving then were from Central America, fleeing authoritarian governments that the United States supported in an effort to stop the spread of communism. In that case, the United States refused to help the refugees trying to escape violence from governments that it had helped keep in power. However, churches, colleges, and even cities responded by whisking these people across the border into safe havens.

The video below looks at the origins of the sanctuary movement:

Although targeting sanctuary cities and increasing deportation efforts have become important issues for Republicans lately, expanding immigration enforcement has historically not been unique to one party. On the contrary, Trump's predecessor President Obama, who is often touted as a staunch civil rights defender, enacted similar policies during his two terms. In fact, at one point during the Obama presidency, deportations reached an all-time high, with more than 400,000 people deported in one year. Even after policy changes that sought to refocus enforcement efforts on convicted criminals were implemented, the number of deportations remained as high as 240,000 people in Obama's last year in office. Most of the people deported by the Obama Administration were from either Mexico or Central America.

As for sanctuary cities themselves, in many ways, former President Obama actually helped fuel their rise. While the sanctuary movement had been around for decades, Obama's Secure Communities program–built off an earlier Bush-era idea, which made it mandatory for local police to share information with federal authorities–vaulted the issue into public debate. Obama did eventually end the program; however, he remained focused on immigration enforcement, as the numbers indicate, up to the end of his term. While immigration enforcement has been a priority for presidents from both parties, Obama's policies shifted the focus toward punishing convicted criminals and sharing information rather than targeting all immigrants. President Trump's recent efforts go further to increase the number of people considered priorities for deportation, and he has started directly confronting cities that limit cooperation with federal authorities.


What’s next?

Although the sanctuary movement has been around since at least the 1980s, its future is unclear. As part of the same executive order President Trump signed in January, he also threatened to cut off all federal funding to sanctuary cities. While experts doubt that Trump would be able to cut off all funding for these cities, many of the legal questions have not yet been resolved by the courts. The Trump Administration could also consider seeking injunctions against policies in sanctuary cities that go beyond not helping and actually hinder federal efforts. The following video looks at what President Trump might do to sanctuary cities that refuse to change their laws:

The Obama Administration also predated Trump's actions in threatening to withhold funds for noncompliance with federal laws. Last February, the Department of Justice, under Attorney General Loretta Lynch, agreed to transfer illegal immigrants who had completed their federal sentences into the custody of immigration officers instead of local authorities if those local authorities had shown resistance to ICE in the past. Additionally, threats to withhold federal grants from places that do not share information when requested by federal authorities came in 2016, under the Obama Administration.

These were not the only efforts to dissuade sanctuary cities either. In 2015, the House also passed a bill that would prohibit sanctuary cities from receiving certain Justice Department grants. That bill would block federal funding for immigration-related grants, like a program that reimburses cities for the costs involved in detaining deportation targets for additional time, as well as more general law enforcement funding like money from the Justice Assistance Grant program and the Community-Oriented Policing Services program. Despite these efforts at the federal level, many cities have remained defiant. In Boulder, for example, the city voted to recognize itself as a sanctuary city even though doing so would open it up to further funding threats.


Conclusion

In February, shortly after President Trump took office, federal immigration enforcement executed a number of raids across 12 states in an effort to sweep up illegal immigrants. However, these raids differed from those that took place during the Obama Administration in that they targeted a higher percentage of people who had not been convicted of crimes. Although they departed from the past administration's policy guidance, these actions were in line with the executive order issued by Trump soon after his inauguration.

The sanctuary movement, and sanctuary cities in particular, have sprung up since the 1980s in response to increased enforcement efforts. However, efforts both by the previous Obama Administration and now by President Trump have sought to undercut local governments that restrict cooperation with federal authorities. This has been done primarily through threats of reduced federal funding, including those in Trump's executive order. While the president's efforts to withhold federal funding from sanctuary cities involve several unanswered legal questions, the scope of potential funding losses could deal a significant blow to local budgets. Nevertheless, these places have for the most part continued to stand up and resist federal immigration policies that would require them to assist in deporting illegal immigrants.

With Trump's executive order on immigration enforcement and others, such as the travel ban, currently working their way through the courts, these issues are in the process of being resolved. An important question after that point is whether the parties involved will abide by the decisions the courts reach.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor's in History, as well as a 2014 graduate of the University of Georgia with a Master's in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Safe Havens? The Story Behind Sanctuary Cities appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/politics/story-behind-sanctuary-cities/feed/ 0 60725
A Right to Life, Liberty and a Basic Income?: The History of Guaranteed Basic Income https://legacy.lawstreetmedia.com/issues/business-and-economics/right-life-liberty-basic-income-story-behind-guaranteed-basic-income/ https://legacy.lawstreetmedia.com/issues/business-and-economics/right-life-liberty-basic-income-story-behind-guaranteed-basic-income/#respond Mon, 08 May 2017 13:37:18 +0000 https://lawstreetmedia.com/?p=60563

This type of welfare program is gaining popularity worldwide.

The post A Right to Life, Liberty and a Basic Income?: The History of Guaranteed Basic Income appeared first on Law Street.

]]>
Image courtesy of stanjourdan; License: (CC BY-SA 2.0)

Earlier this week, the Canadian province of Ontario announced it would be conducting a pilot program for 4,000 of its residents, guaranteeing each person a minimum income even if they did not work. While the idea of giving away "free money" may draw criticism from some, this is not a new concept. In fact, programs similar to this have been around for nearly 50 years, with the ultimate goal of eventually replacing the welfare system as we know it.

Read on to learn more about guaranteed basic income (otherwise known as universal basic income or basic income), its purpose, the history behind it, and how it might impact the future of welfare programs worldwide.


Guaranteed Basic Income?

So what is guaranteed basic income (GBI)? According to the Basic Income Earth Network (BIEN), this type of payment has five key characteristics: it is paid at regular intervals instead of all at once, it comes in a medium that lets the recipient use it any way they want (it is not a Food Stamp card, for example), it is paid on an individual basis only, it is paid without a means test, and those who receive it are not required to work.

Everything else, such as the amount of each payment or the longevity of payments, varies based on the proposal. (The Ontario test case does have an income threshold and is paid only to the 4,000 people included in the program; the rest of the principles still apply.)

The Purpose of GBI

Guaranteed basic income is not really "free money," as some may claim; it does serve a few important purposes. An article from Law Streeter Eric Essagof already does a great job of explaining GBI's use in fighting poverty. Namely, the income encourages people to keep working while ensuring that if their earnings rise, they won't automatically lose the benefits they rely on–the so-called "poverty trap." In addition, in the United States at least, it could streamline a complicated system in which someone who needs benefits has to sign up for five different programs that all fall under one welfare system.

There are other potential benefits associated with a guaranteed basic income. If people were assured of at least some income, they might be more likely to go to school for more education or training, or even take a chance and start their own business. They could also pursue passions (such as writing, for example) that are harder to take on when their time is dictated by the necessity of making money. For individual workers, a guaranteed income would also enable them to bargain more effectively with their employers and force employers to agree to concessions in order to keep their workers.


History of GBI

The Ontario GBI pilot program is certainly not the first of its kind; in fact, it is not even the first in Canada. The first such program was conducted in the province of Manitoba in the 1970s and led to societal health improvements while not discouraging work participation. The idea for a universal basic income can be traced back much further than that. In 1797, Thomas Paine, the pamphleteer famous for his work "Common Sense" in support of the American Revolution, argued that in exchange for social consensus among the people, the government should offer yearly payments to its citizens.

Since then there have been numerous debates between thinkers on all sides of the political spectrum, but generally basic income has been viewed as a positive. The accompanying video looks at the evolution of the basic income idea:

This type of program and the philosophy behind it have been embraced outside of Canada as well. The most recent effort was in Finland: earlier this year, the Finnish government selected 2,000 unemployed people at random to begin receiving a guaranteed basic income of €560 per month for two years instead of the unemployment benefits they had been receiving. The major advantage for the participants is that if they find jobs, they still get to keep their basic income, as opposed to losing unemployment benefits.

Through the Finnish trial, which is still ongoing, the government wants to see whether this type of program can help the country's ailing economy by encouraging part-time work. Other programs worldwide have also proven successful, such as one in Brazil in 2004 and another in Namibia in 2007. A similar cash transfer pilot program in India from 2011 to 2012 led to improved test scores and better health in participating villages.

Despite the success of many of these programs, there seems to be a perception that they can only be successful in poorer countries and would never work in an "affluent" country like the United States. However, even the United States has some history with guaranteed basic income. One of the earliest efforts, the Negative Income Tax Experiments, took place between 1968 and 1990 in New Jersey, Pennsylvania, Iowa, North Carolina, Indiana, Washington state, and Colorado. Although these experiments had successful outcomes, they were not politically popular and lost their momentum. Arguably the most successful U.S. experiment with guaranteed basic income is still ongoing, and can be found in Alaska.

In 1976, a permanent fund was set up in Alaska to preserve profits made by the oil industry and ensure that the wealth would benefit future populations in the state. This fund was allocated for a basic income program in 1982, and ever since then anyone living in the state for at least six months has been eligible to receive a dividend from the state. At its peak in 2008, the fund paid out more than $2,000 per resident.

The following video looks at how the program is playing out in Finland and other places:



Future of GBI

With more and more places willing to at least launch guaranteed basic income pilot programs, the future of the measure seems bright. This is especially true given the benefits it has offered so far, along with the fact that automation is increasingly making many jobs obsolete. Currently, along with Finland, there are ongoing guaranteed basic income trials in Italy and the Netherlands, with Scotland considering a trial of its own as well.

While a basic income has been advocated by some philosophers, researchers, and other individuals, overall there has not been a tremendous groundswell of support. Even where pilot programs have been launched, they are usually reserved for a few thousand people in countries with tens if not hundreds of millions of citizens. So, if this type of program has repeatedly proven so successful and could replace faulty welfare programs, why are countries not more willing to try it?

The answer starts with cost. In 2016, Swiss voters rejected a basic income for the country's citizens, and while Scotland is considering adopting such a program, the rest of the UK is generally resistant. This opposition persists even though polls show that up to 64 percent of Europeans approve of a basic income. Part of that support, however, might be attributed to how the survey questions were worded, in that they did not mention the tax increases necessary to provide that income.

Aside from cost, there are other considerations, such as the fear of automation. Although some fear this trend could lead to a dearth of jobs, economists are quick to point out that the same thesis has been advanced about past trends and has been proven wrong by new innovations that, in fact, created more jobs. Additionally, while some want to use basic income to replace existing safety nets, there is no proof yet that exchanging one for the other is actually superior. Even some of the protections basic income is supposed to offer can be turned on their head–a basic income could convince some employers they can pay lower wages. There is also the argument that basic income will lead some people to choose simply not to work. The video below looks at basic income, highlighting some pros and cons:



Conclusion

Guaranteed or universal basic income as an idea has been around for hundreds of years. As an idea put into practice, it has been around for at least half a century. Moreover, in seemingly every case, pilot programs incorporating basic income guarantees have been successful by a number of measures, from raising GDP and improving test scores to ensuring nutrition. Furthermore, these types of programs have been lauded by leaders across the political spectrum as everything from a panacea for a broken welfare system to a necessity in an increasingly automated world.

However, for all its success stories, guaranteed income has never become widespread or long-lasting. The reasons for this apparent contradiction are manifold and run the gamut from high costs to exaggerated benefits. Additionally, for every country that has adopted and embraced the idea, there are others that have rejected it.

What is basic income's outlook, then? In a world of increasing budget cuts and squeezes, a major initiative to expand the program seems unlikely, especially given the ascendance of more conservative leaders who rose to power partly on attacks on the social welfare system. Basic income is thus unlikely to be guaranteed or universal anytime soon, yet continued successful trials indicate that when conditions are more favorable, it could become the norm.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor's in History, as well as a 2014 graduate of the University of Georgia with a Master's in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post A Right to Life, Liberty and a Basic Income?: The History of Guaranteed Basic Income appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/right-life-liberty-basic-income-story-behind-guaranteed-basic-income/feed/ 0 60563
What’s Behind the Crisis in Venezuela? https://legacy.lawstreetmedia.com/issues/world/political-economic-crisis-venezuela/ https://legacy.lawstreetmedia.com/issues/world/political-economic-crisis-venezuela/#respond Fri, 28 Apr 2017 17:04:21 +0000 https://lawstreetmedia.com/?p=60385

A look at the political and economic chaos in Venezuela.

The post What’s Behind the Crisis in Venezuela? appeared first on Law Street.

]]>
"Mural" courtesy of David Hernández (aka davidhdz); License: (CC BY-SA 2.0)

For the last several years, Venezuela has been plagued by uncertainty following the death of its enigmatic former leader Hugo Chavez–a situation that has only been exacerbated by the steep drop in oil prices. Recently, the crisis in Venezuela reached such a low point that current President Nicolas Maduro accused the country's bakers of waging an unannounced war on its people through price gouging. Although the bakers of Venezuela are clearly not the major issue plaguing the country, real problems certainly exist. Read on to find out how the country's leadership and its economic decisions have brought a nation rich in natural resources into a political and economic crisis.


History of Venezuela

Venezuela's first interaction with the Western world came at the end of the 15th century, around 1498, when Christopher Columbus first landed there. Spanish colonization of what would become Venezuela began in 1521. In 1810, the country declared its independence, and in 1830, it broke away from Gran Colombia to become its own independent nation. In 1945, the country threw out its military leader in a coup and elected its first democratic government. However, that government's reign was short-lived, with the military taking back over after another coup in 1948.

Democratic government was restored in 1958 and the first peaceful transfer of power between leaders occurred in 1964. Venezuela then rode high oil prices throughout the 1970s and 80s until prices finally started to lag. This development forced then-President Carlos Perez to negotiate for relief with the International Monetary Fund–negotiations that ultimately led to riots in the streets. Following Perez’s eventual impeachment on corruption charges, Hugo Chavez was elected president in 1998.


Venezuela Under Hugo Chavez

Hugo Chavez eventually rose to the presidency following a failed coup attempt that he led in 1992. After he was captured, Chavez delivered a speech on national television that garnered him popularity among average, disaffected Venezuelans. This popularity was essential to Chavez’s eventual pardon and release from prison in 1994. It also positioned him as an anti-establishment force, which would rocket him to the presidency in 1998.

Upon his initial election, Chavez was extremely popular among Venezuelans. He used that public support to give himself extraordinary control over all three branches of government. However, many of those moves, along with his plans to imitate Cuba's style of government and his decision to antagonize the United States, alienated Venezuela from the West. These actions, coupled with his efforts to gain still more power, caused his approval ratings to sink from as high as 80 percent to a low of 30 percent.

Dissatisfaction reached the point where, in 2002, Chavez was briefly removed from office in another coup. However, he quickly returned to power and later gained greater control over the local oil industry following a large-scale strike. Using these new resources and buoyed again by high oil prices, he offered citizens lavish social programs to help ensure his reelection in 2006. In 2009, he passed constitutional reforms to remove term limits and ensure that he could continue leading the country, a move that also enabled him to crack down on dissent. Chavez's power consolidation took a back seat in 2011, when he went to Cuba for cancer treatment. He ultimately died of the disease in 2013.

Post-Chavez

Chavez was succeeded by loyalist Nicolas Maduro. Maduro, a career politician who had been appointed Venezuela's vice president in 2012, went to great lengths to further Chavez's ideals. Upon ascending to the presidency, Maduro hoped to solidify his grip on power by arresting opponents. This approach seemed to be working–especially once he garnered the support of the military–until oil prices began to fall once more.


Falling Oil Prices

In 2014, global oil prices began to plummet from a high of over $100 a barrel to below $30 a barrel. Venezuela was hit especially hard because roughly half of the government's revenue comes from the oil industry. While the country had set up a fund to save surplus revenues during the oil boom of the 1990s, the fund was drained during Chavez's reign, as he used it to fund social welfare programs and help ensure his reelection. The accompanying video explains many of the issues plaguing Venezuela during and after Chavez's regime:

Venezuela's economy became so dependent on high oil prices that lenders grew less willing to invest there, doubting the country's ability to pay them back. One example came in 2016, when the China Development Bank was one of only a few institutions willing to continue lending directly to the South American nation, and it did so with many more conditions than in years past. This has had the added effect of reducing Venezuela's influence among its neighbors, as it can no longer use its oil exports as leverage. Even if oil prices rebound, Venezuela is still likely to face serious trouble, as its state-run oil company will have so much debt that it could have trouble paying for further oil exploration.


Country in Free Fall

In a country where 95 percent of all exports are oil-related, it is clear how devastating a dramatic drop in prices can be. The drop led the government to make drastic currency interventions that sparked massive bouts of inflation and triggered supply shortages for essentials like medicine and even food. When the crisis first began developing, President Maduro denied that there was a problem at all, although he eventually issued food vouchers in an attempt to prevent people from going hungry. Nevertheless, more Venezuelans are going without food and malnutrition rates are rising. The government cannot even afford to import more food, as it is out of money.

On top of the food and humanitarian crises–which are largely a result of economic mismanagement and fluctuations in international oil markets–are unpopular political moves by President Maduro to consolidate his power. In March, Maduro loyalists on the Venezuelan high court moved to dissolve the National Assembly, whose powers were to be transferred to the court–a step that, critics argue, effectively made Maduro a dictator. While the ruling was revised days later, much about the rule of law in Venezuela remains in question, particularly given that the high court had already been ruling against the National Assembly's attempts to rein in Maduro.

These moves, and the sheer desperation experienced by many in the country, have led to mass protests. In recent weeks, thousands of people have taken to the streets of Caracas, the capital, to protest and demand new elections. However, these protests have been met with force both from police and from paramilitary groups known as colectivos that are supported by the Maduro government. The harsh crackdown has drawn international condemnation from nearby countries such as Peru and from global powers like the United States. It has also spurred calls for more mass protests across cities in Venezuela. Nevertheless, Maduro remains in power and enjoys some support from his base and from those who believe the actions of protesters are not the appropriate way to bring about change.

The video below looks at recent protests:


Conclusion

Venezuela, like many countries with a colonial legacy, has struggled to create the vibrant and dynamic economy needed to compete globally. For most of its independent history, it has been ruled by military dictatorships, with a few years of democratic governance in between–temporary civilian governments undone by a perpetual series of coups. This inability to establish a competent government, combined with the country's over-reliance on oil, created the potential for disaster.

That disaster came when oil prices bottomed out, leaving the country unable to feed its citizens or meet their basic needs. Naturally, this has led to a crisis of confidence in current President Nicolas Maduro, successor to the charismatic and extremely controversial Hugo Chavez. Chavez and Maduro both ascended to power on the notion of cleaning out the old, corrupt government institutions and installing something more responsive to real people's concerns. However, their decisions produced unsustainable social programs that plunged the country into debt as oil prices fell. Now it seems that most Venezuelans want a new government and, most of all, a new president.

How the situation with Maduro plays out is critical to the country's future. If protesters and the government can reach a political resolution and rebuild the government's rapidly decaying institutions, there is hope for a major turnaround. On the other hand, if Maduro continues to crack down on dissent and refuses to address the humanitarian crises taking over the country, Venezuela could be on a course for even more chaos. Even if a resolution emerges, the country will need to diversify its economy to reduce its reliance on oil. Given its past failures to do so, that will not prove to be an easy task.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor's in History, as well as a 2014 graduate of the University of Georgia with a Master's in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What’s Behind the Crisis in Venezuela? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/political-economic-crisis-venezuela/feed/ 0 60385
Turmoil in South Sudan, the World’s Newest Nation https://legacy.lawstreetmedia.com/issues/world/south-sudan-worlds-newest-nation/ https://legacy.lawstreetmedia.com/issues/world/south-sudan-worlds-newest-nation/#respond Mon, 24 Apr 2017 14:35:41 +0000 https://lawstreetmedia.com/?p=60069

What's behind the recent conflict in South Sudan?

The post Turmoil in South Sudan, the World’s Newest Nation appeared first on Law Street.

]]>
"South Sudan Independence Day Celebration at Diversey Harbor Grove" courtesy of Daniel X. O'Neil; License: (CC BY 2.0)

In February, the United Nations declared a famine in South Sudan and estimated that 100,000 people faced immediate risk of starvation. This was the first declared famine in six years; the last was in Somalia in 2011. While South Sudan has long been struggling, the question is how an oil-rich state–one that had finally gained independence from Sudan after years of fighting–suddenly found itself in this situation. Read on to learn more about the nation's tumultuous history, the aftermath of its independence, and where it stands today.


The History of Sudan

Sudan emerged as an extension of Egyptian society in 1500 B.C. and shared many of Egypt’s customs after the decline of ancient Egypt. Critical to the current conflict, Christianity was introduced to Sudan beginning in the 4th century, followed closely by Islam. For the next several centuries, the country fell under the sway of various Muslim or Egyptian rulers until it finally became a province of the Ottoman Empire in the 19th century. Not long after, control of Sudan passed to the British after fierce fighting between Britain and local religious leaders.

Ultimately, British machine guns and artillery won out, and Sudan was brought to heel under combined British-Egyptian rule. For approximately the first fifty years of the 20th century, the two sides continued this arrangement, with occasional conflict, as Egypt wanted to rule Egypt and Sudan as one united country. Those demands were ignored, and ultimately, with Egyptian consent, Sudan became an independent country in 1956.


The Emergence of South Sudan

In 2011, 99 percent of voters in a referendum decided that the 10 southernmost states of Sudan should break away and become South Sudan. While the final decision ended with a clean break, getting to that point was an arduous process. In fact, the referendum followed on the heels of the 2005 Comprehensive Peace Agreement, which finally ended a civil war that had lasted for several decades. In total, more than 1.5 million people died in the war and another 4 million were displaced.

The civil war started in 1955, before what would become Sudan had even gained independence, when army officers from the south of Sudan mutinied. The officers rose up out of fear that once control of Sudan passed from Egypt and Great Britain, the Muslim majority in the north–the new government–would impose Islamic law on the country and promote an Arab identity. The initial conflict ended in 1972 with the Addis Ababa Agreement, which granted the south limited autonomy. However, the government reneged on that agreement in 1983, leading to another outbreak of fighting that lasted until 2005. The specific issue was the northern government's decision to place Sudan under Sharia law. While the country was approximately 70 percent Muslim, the other 30 percent was composed of Christians and those who followed traditional indigenous religions. In addition to the religious divide, there is also an ethnic divide between Arabs in the north and black Africans in the south.

In addition to the ethnic conflict that started much of the fighting, a major issue preventing peace was how to divide the country's oil. Although the south had most of the oil reserves, the north had the pipelines and the port to the Red Sea. In the 2005 agreement, the two sides decided to divide profits equally; however, that arrangement ended in 2011 with South Sudan's independence. Furthermore, while the 2005 agreement paved the way for southern independence, it left many conflicts unresolved. The video below looks at Sudan's modern history and why it has been plagued by conflict.

The Aftermath

Following the implementation of the peace deal in 2005, South Sudan went through a six-year period of autonomy before it voted for independence in 2011. The initial decision for independence was greeted with enthusiasm due to the promise of a large supply of oil and an end to decades of fighting. However, the agreement also left key elements undecided. Notably, it failed to divide oil resources decisively and did not extinguish ethnic tensions.

The oil issue grew out of the fact that the new South Sudan had most of the oil, while Sudan had most of the transporting and refining capabilities. This issue also bled into the ethnic conflict, as some of the disparate groups were armed by Sudan in an effort to weaken South Sudan from the inside, sparking sudden clashes. These clashes, especially the one between the two largest ethnic groups, led by the president and vice president, ignited yet another outbreak of civil war, this time within South Sudan. Additionally, there remains conflict between South Sudan and Sudan in various border regions. One of the contested areas, Abyei, was not able to participate in the original 2011 referendum, leaving questions about its status in the conflict. Many of these border regions also have considerable amounts of oil.

The following video looks at South Sudan at independence and many of the issues that have plagued it since:


South Sudan’s Civil War

The civil war within South Sudan, following its independence, broke out in December 2013. At that time, the president of South Sudan, Salva Kiir, and the vice president, Riek Machar, were engaged in political infighting. Ultimately, Machar was removed from his role as vice president and fled the country.

What started as a political dispute quickly divided the country along ethnic lines. The Dinka, one of the two largest ethnic groups in South Sudan, supported the president, while the Nuer, the other major ethnic group, supported the ousted vice president. As the ethnic conflict escalated, human rights violations ranging from rape to murder were documented. Because of the violence, many farmers have been unable to tend their fields and grow their crops, which has led to the food disaster that is now considered a famine.

As many as 100,000 people are in jeopardy of starving because of this famine. In addition, another 5.5 million could face food shortages as soon as July. Making the situation even more difficult, annual inflation has risen to 425 percent, which puts food nearly out of reach for many. Aid agencies, which have been making up for most of the shortfall, face significant obstacles as the conflict escalates. More than 80 aid workers have already been killed in the conflict. The situation has gotten so bad that people in the affected areas are hiding and foraging in swamps by day and then tending to their crops, at risk of animal attack, by night while the soldiers sleep.


South Sudan Today

To counter the ongoing turmoil, the international community has tried to intervene. The United Nations Security Council has authorized over 13,000 peacekeepers to be stationed in the country and given them the power to use force to protect civilians. These efforts, though, have been continually undermined by the fluid situation on the ground, with all sides, including the government, involved in the violence. The international community has taken other steps as well, such as sanctions leveled by the United States against leaders on both sides of the conflict.

To avoid further sanctions, President Kiir agreed to a peace deal with former vice president and rebel leader Machar in August 2015 with the support of the Intergovernmental Authority on Development. As part of this agreement, Machar returned to his old position in April 2016. However, the deal quickly unraveled, with both sides violating the agreement, causing Machar to flee once again and plunging the country back into war.

With the ongoing conflict and with tens of thousands of displaced people unable to return home, the situation in South Sudan has become increasingly bleak. As of April, the South Sudanese refugee camp in Uganda, Bidi Bidi, has eclipsed Kenya's Dadaab camp as the world's largest, with over 270,000 South Sudanese living there. Moreover, the mass exodus shows no signs of ending soon. In other war-torn areas, such as Syria, outward migration has effectively slowed, but in South Sudan the number of refugees has risen dramatically. South Sudan's refugee crisis is currently the fastest growing in the world, although it is not the largest in terms of total numbers.

This refugee flow is likely only to continue, with yet another outbreak of violence between the government and the main insurgent force flaring up in mid-April. This comes in the wake of more aid workers being displaced and unable to offer desperately needed assistance to the local population.


Conclusion

South Sudan had to overcome approximately a half-century of conflict just to become a nation. In the process, more than a million people died and millions more were displaced. Upon its independence, the future looked bright for the new nation. It was home to a large supply of oil and it appeared to have finally put its destructive conflicts behind it.

However, appearances were deceiving. Conflicts erupted externally in the form of border disputes with Sudan and internally among the nation's many ethnic groups. The country's two largest ethnic groups took opposing sides in a political dispute between the president and vice president that once more plunged the nation into civil war. The consequences of this conflict have been devastating, with any hope of economic success dashed and even the provision of the most basic means of survival thrown into doubt.

Despite being the youngest nation on earth, South Sudan already finds itself at a critical crossroads. Its government is locked in an internal struggle, thousands of U.N. troops are already on the ground, and millions of its citizens sit in refugee camps ringing its borders. The country's path to success seems clear: reconcile the various ethnic groups, make lasting peace with Sudan, and let people get back to their lives. Finding a way to make these things happen, however, will be a much more difficult process.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor's in History, as well as a 2014 graduate of the University of Georgia with a Master's in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Turmoil in South Sudan, the World’s Newest Nation appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/south-sudan-worlds-newest-nation/feed/ 0 60069
The Fate of Hong Kong’s Pro-Democracy Movement https://legacy.lawstreetmedia.com/issues/world/hong-kong-pro-democracy-movement/ https://legacy.lawstreetmedia.com/issues/world/hong-kong-pro-democracy-movement/#respond Mon, 17 Apr 2017 17:59:06 +0000 https://lawstreetmedia.com/?p=60044

How did Hong Kong's pro-democracy movement start, and what's in store for the future?

The post The Fate of Hong Kong’s Pro-Democracy Movement appeared first on Law Street.

]]>
Image courtesy of Studio Incendo: License: (CC BY 2.0)

Hong Kong recently held elections to determine the next Chief Executive of the semi-autonomous region. Despite widespread pro-democracy protests in 2014, a pro-Beijing government official, Carrie Lam, was elected. Following the election, leaders of that very same pro-democracy movement were faced with threats of arrest. To fully understand these events, it is necessary to look back at Hong Kong's history as well as the history of the protest movement. Read on to find out where this movement sprang from and to learn about the current state of democracy in Hong Kong.


History of Hong Kong

Humans have lived in what is now Hong Kong for thousands of years. However, it was not until the rise of the Eastern Han Dynasty that the area was considered part of the Chinese Empire. Beginning in the 12th century, five Han Chinese clans, whose descendants still exercise influence in Hong Kong today, began to arrive. Some believe that as these groups came to the area, they pushed out some of the original inhabitants, who moved onto houseboats and formed fishing communities that still exist today.

Despite its incorporation into the Chinese Empire, Hong Kong in many respects remained largely untended. Its location and the rise in trade allowed for the entrance of foreign actors, namely the Europeans. Trade flows started with the Portuguese and continued with the Dutch, French, and finally the British. Chinese authorities made efforts to curb European influence, but these proved futile given the high demand for Chinese goods in Europe. Eager to correct a trade imbalance, the British introduced opium, which led to the emergence of a large market as well as the spread of addiction in China. In response, the Chinese Emperor tried to outlaw opium, culminating in the Opium Wars.

In 1842, following the first Opium War, China ceded Hong Kong, along with access to several ports, to Great Britain in the Treaty of Nanking. In 1898, the British were given an additional 99-year lease on the territory beyond the city as well as 235 other small islands. Over the years, the city became a haven for those fleeing both domestic upheavals and, later, the Japanese during World War II. In 1941, Japan occupied Hong Kong, causing many to leave for mainland China. Britain reestablished control in 1946.

Shortly after the war, Hong Kong underwent an economic boom. But in the following decades, the city saw social strife and riots as workers chafed at economic inequality and were influenced by policies from the mainland. In the 1970s, Hong Kong emerged as one of the "Asian Tigers," a highly developed economy in the region. In 1982, Great Britain and China began negotiations to return the city to China, culminating in the Joint Declaration of 1984. This agreement called for Hong Kong to maintain its capitalist economy and partially democratic system for 50 years after the handover. It is important to note, however, that while the agreement called for eventual universal suffrage, that specific mandate was not guaranteed, leaving it open to interpretation. The following video provides a good history of Hong Kong from the inception of British rule to the present:


Hong Kong’s Government

The Special Administrative Region, its formal designation, is governed by the Basic Law of Hong Kong. This system guarantees 50 years of autonomy for the region and a government consisting of the Chief Executive, the Executive Council, a two-tiered legislature, and an independent judiciary. The Chief Executive and the Executive Council, which is essentially the Chief Executive's cabinet, lead the government and perform many of the same functions as the Executive Branch in the United States.

The Chief Executive is elected by an election committee composed of 1,194 members. Only 70 of the members are government officials; the rest are a mix of elites from various professions. This method of election has garnered extensive criticism, and its results have sparked protests in the past. Much of that criticism is due to the heavy influence Beijing wields among the elites, as well as over candidate selection and election rules. To win the election, a candidate for Chief Executive needs a majority of the election committee's votes.

The Legislative Council is currently composed of 70 members, up from its original 60. It has been in existence since the beginning of British rule in 1843. Originally, it served as more of an advisory board, but over the years–especially following the transition from British colony to Chinese Special Administrative Region in 1997–it has taken on many of the responsibilities of a traditional Western-style legislature. Its specific duties include enacting and amending laws, creating public budgets, appointing and removing the Chief Justice and the judges of the Court of Final Appeal, and holding the power to impeach the Chief Executive. Half of its members are directly elected based on geography; the other half are chosen by functional constituencies representing professional and other sectors.

Below the legislature are the District Councils, which direct some public spending at the local level and advise the government on issues affecting people in their jurisdictions. Funding allocated to District Councils is typically used for cultural and community activities within the district.

The judiciary acts independently of the executive and legislature and uses a common law system based on the region's Basic Law. All courts fall under the ultimate authority of the Court of Final Appeal, headed by the Chief Justice, which essentially serves as Hong Kong's supreme court.


Pro-Democracy Protests

The pro-democracy sentiment in Hong Kong has existed since before it became a Special Administrative Region of China. In 1984, China and Great Britain signed an agreement to transfer Hong Kong to the Chinese after Britain's 99-year lease ended in 1997. That treaty led to the notion of "one country, two systems" for Hong Kong and China. One of the basic tenets of this arrangement was the Basic Law, which promised universal suffrage after a certain time period passed. However, the sentiment behind the treaty was quickly brought into question, long before the actual transfer, after China's tough crackdown in Tiananmen Square. The 1990s saw another brief crisis when Great Britain's last colonial governor tried to increase democratic reforms, which enraged the Chinese government. Ultimately, though, it agreed to a watered-down version of the reforms.

China's choice for the first post-British leader, combined with a proposed anti-subversion law, quickly galvanized the pro-democracy movement in Hong Kong. The anti-subversion law, which would have criminalized criticism of Beijing, led 500,000 people to march in the streets. Ultimately, the law was never enacted. Protests continued after this incident, including in 2004, when Beijing ruled against universal suffrage and direct elections for Hong Kong's Chief Executive. The following year, protesters held remembrances for the 16th anniversary of the Tiananmen Square protests; Hong Kong was the only part of China to acknowledge the anniversary.

A breakthrough was seemingly achieved in 2007 when Beijing promised to allow direct election of the Chief Executive by 2017 and the Legislative Council by 2020. Events seemed to be keeping pace in 2010 when the Democratic Party held its first talks with the mainland government since the transfer. In 2014, voters pressed the issue: in an unofficial referendum, 800,000 people, or 90 percent of participants, voted in favor of having the power to select the list of candidates up for election. China dismissed the referendum and declared it illegal. Later that year, China went further and ruled that citizens of Hong Kong would not be allowed to directly elect leaders in the 2017 election.

These decisions led to the Umbrella Movement in 2014. The movement, named for the umbrellas that protesters used to shield themselves from tear gas and rain, grew out of an earlier student movement and led to the Occupy Central protests in Hong Kong’s financial district.  These, in turn, led to police crackdowns and anti-occupy protests. This continued until the protest camps were ultimately removed in December 2014. The accompanying video summarizes the Umbrella Movement in greater detail:


The Aftermath

Following the protests, new election reforms were proposed in 2015 but were defeated by the Legislative Council. In 2016, protests started again after Beijing removed pro-democracy candidates from the Legislative Council elections; however, they were countered by pro-Beijing supporters and ultimately failed to amount to anything.

Following the most recent election, in which pro-Beijing candidate Carrie Lam was elected, at least nine protest organizers were ordered to report to the police or face arrest. This also sparked protests across the city and led to the planning of a citywide protest on July 1, Lam’s first day in office and the 20th anniversary of Hong Kong becoming a part of China.


Conclusion

Hong Kong has long served as an important port city between China and the West. It served as a toehold for several competing European nations until the British finally established a permanent colony. Britain imported large amounts of opium and resorted to force to maintain its control over the city and trade with the region. However, under British rule, Hong Kong was often isolated from Chinese politics and developed its own civic culture. Although residents of Hong Kong have never had universal suffrage–whether under the British during the colonial era or now as a Special Administrative Region of China–Hong Kong has long had a distinct economic and political system that has been at odds with the mainland.

When the British did eventually return Hong Kong to China, it was with the understanding that customs established under British rule, most notably limited democracy, would be respected. However, since the transition, democracy in Hong Kong has been challenged. The pro-democracy movement has endured in the face of many efforts by the Chinese government to maintain control and stability; perhaps the most obvious example was the Umbrella Movement. Mainland China is on the offensive again, though, with the recent arrests of Umbrella Movement leaders.

So, it will be interesting to see what the next step is. For all the talk of democracy in Hong Kong, its people have never actually elected their top executive; even when the British ruled, the governor was appointed. Furthermore, while the protests against Beijing’s interference and for direct elections have drawn massive crowds, they have also spawned counter-protests. Hong Kong remains a divided city that faces several challenges when it comes to winning democratic concessions from the mainland. While the government in Beijing has allowed some reforms in the past, it remains reluctant to allow anything that resembles universal suffrage. Much of the future depends on the actions of the Chinese government, but the pro-democracy movement will also need to coalesce around a clear vision for reform and transition.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Fate of Hong Kong’s Pro-Democracy Movement appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/hong-kong-pro-democracy-movement/feed/ 0 60044
Are Infectious Diseases on the Rise? https://legacy.lawstreetmedia.com/issues/health-science/explaining-rise-infectious-diseases/ https://legacy.lawstreetmedia.com/issues/health-science/explaining-rise-infectious-diseases/#respond Wed, 12 Apr 2017 21:08:39 +0000 https://lawstreetmedia.com/?p=59088

Why is the number of epidemics increasing?

The post Are Infectious Diseases on the Rise? appeared first on Law Street.

]]>
"Ebola Virus Virion" courtesy of CDC/Cynthia Goldsmith; License: Public Domain

In recent years, scientists have been paying close attention to a striking development: the number of infectious diseases has increased considerably, and not just by one or two new diseases each year. In fact, over the last 100 years, the number of new infectious diseases discovered each year has quadrupled and outbreaks have tripled. What explains this dramatic increase in new infectious diseases? Read on to find out the answer to this question, how scientists are working to fight diseases, and what the consequences could be if we continue along this same trajectory.


Infectious Disease on the Rise

To begin to understand the rising levels of infectious disease, it is first imperative to understand the common terminology. Four terms, in particular, are used very frequently and require clarification: outbreak, epidemic, pandemic, and endemic. An outbreak occurs when the number of cases of a specific disease in a specific community rises above what would normally be expected. An epidemic is “a widespread increase in the observed rates of disease in a given population.” A pandemic is essentially the multinational form of an epidemic, encompassing worldwide outbreaks beyond a particular population. Endemic is decidedly different from the other terms and refers to a rate of disease that is consistently higher within a given group. These definitions are particularly important for the people treating an outbreak on the ground, as they help them tune their methods to the reality of the situation. The following video gives an overview of how disease spreads:

Although that rise sounds troubling, it is not all doom and gloom. While individual outbreaks are increasing, they are affecting fewer people now than before. Additionally, only a small variety of infectious diseases are responsible for the majority of outbreaks. Furthermore, of these strains, a little over half are zoonoses–diseases that are passed from animals to humans. Even among zoonoses, only a few diseases cause most outbreaks. In other words, outbreaks are on the rise, but a shrinking set of diseases–passed from animals to humans–accounts for that rise. The question then becomes: what is leading to the rise in outbreaks?


Factors Leading to the Rise of Infectious Diseases

There are several reasons for this increase, but it starts with us and the actions we take. Many of the recent outbreaks are not new diseases, only new to us as a species. They have been incubating and circulating in areas like rainforests for tens of thousands of years. However, with human encroachment in the form of farming, mining, and housing, people are coming into contact with these diseases more often, and the results are not always good.

Other human manipulations of the environment are also leading to the rise of infectious diseases. These include seemingly benign activities such as reforestation, animal farming, and even flooding rice paddies. Sometimes it can be a combination of human activity and environmental factors, such as when milder winters resulting from global warming fail to kill off the usual number of pests. In fact, rising temperatures have the potential to be one of the greatest contributors to the continued rise of infectious diseases in the coming years, and ailments such as malaria, which prosper in warmer climates, may become much more prevalent. The video below details how global warming can increase the risk of infectious disease:

Other trends, like urbanization, may also contribute to the rise of infectious diseases. When people cluster closer together, the chances of an infection spreading quickly are much higher. This is particularly true when urbanization occurs in poorer countries without effective public health monitoring and prevention systems. Similarly, more travel between countries and regions can introduce infections to places that have never seen them before, and it can increase the likelihood that an epidemic becomes a pandemic. Even technology and modern supply chains can present a risk, as the consolidation of food processing may increase the likelihood that contamination spreads.

Resistance to antibiotics and the resulting superbugs are additional issues behind the rising number of infectious diseases. A similar problem affects viral infections for many of the same reasons, including overprescription of certain medicines and prescribing the wrong medication for a specific disease. Viruses are especially problematic because they can evolve so quickly that it is impossible to stay ahead of them. The clearest example of this is influenza, or the flu, which changes from year to year. Along with antibiotics, many sanitation systems are also proving less useful than before. In this case, the issue has more to do with the lack of upkeep in existing public health systems, which has led to outbreaks of old diseases such as cholera.


Efforts to Fight Outbreaks

Given this trend, what is being done to stem the tide? Governments actually began addressing the rise of infectious diseases several years ago. A response was prompted in 2014, following the outbreaks of MERS and bird flu. That year, the United States, along with dozens of countries and organizations, announced a plan to respond to and treat new outbreaks where they start.

Currently, efforts to fight infectious disease in the United States fall under the authority of the Centers for Disease Control, or CDC. Specifically, many of those efforts are housed in the National Center for Emerging and Zoonotic Infectious Diseases or NCEZID. NCEZID focuses on reducing both illnesses and deaths that are associated with infectious diseases. It also strives to be proactive in protecting against the spread of infectious diseases.

At the international level, there is the World Health Organization (WHO). Much like the CDC in the United States, the WHO focuses on reacting to and fighting epidemics. It acts more like a clearinghouse, encouraging individual countries to improve their own existing systems and working to integrate them internationally so that a crisis in one country can be handled just as effectively by its neighbor if it crosses international borders. When it comes to the spread of infectious disease, the WHO serves as an international monitor to identify and coordinate responses to outbreaks.


Conclusion

Foreseeing and preventing all outbreaks of infectious disease would be impossible. Just last year, for instance, several people in Russia were infected with anthrax after melting permafrost released frozen spores of the disease. While this could easily lead to discussions about global warming, it just as clearly shows that it is impossible to anticipate everything. In fact, in some cases, prevention efforts are even seen as misguided or unwanted.

Many recent efforts have focused on identifying and understanding new diseases, like those deep in the rainforest. However, such methods have also been criticized for spending scarce funding to search out new diseases when the money could instead be used to treat known maladies. Although it seems odd to criticize people for being proactive, that might be a fair critique in a world with finite resources. In fact, it might be fair to wonder why people are so concerned with infectious diseases at all.

This is because non-communicable diseases, like cancer, which cannot be spread from one person to another, kill far more people each year than infectious diseases. However, those diseases also originate within us and frequently stem from factors that we are less able to control, such as getting older. Conversely, because only a few diseases cause most outbreaks, infectious diseases can be managed and their threat reduced. Thus, counteracting the rise of infectious diseases is likely to remain a mainstay of health policy both nationally and globally.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Are Infectious Diseases on the Rise? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/health-science/explaining-rise-infectious-diseases/feed/ 0 59088
What Does it Mean for the U.S. to Put a Missile Defense System in South Korea? https://legacy.lawstreetmedia.com/issues/world/mean-u-s-put-missile-defense-system-south-korea/ https://legacy.lawstreetmedia.com/issues/world/mean-u-s-put-missile-defense-system-south-korea/#respond Sat, 01 Apr 2017 21:31:34 +0000 https://lawstreetmedia.com/?p=60000

Behind the U.S. missile defense program.

The post What Does it Mean for the U.S. to Put a Missile Defense System in South Korea? appeared first on Law Street.

]]>
Image courtesy of U.S. Missile Defense Agency; License: (CC BY 2.0)

The United States recently sent a missile defense system to South Korea in order to protect the country and deter its northern rival in the wake of repeated missile tests. However, the situation is complicated by the fact that while the United States and South Korea see the move as defensive, others in the region see it as aggressive. Specifically, China and Russia, along with North Korea, see it as an act of American belligerence meant to undermine their own deterrent capabilities. Arms races and missile defenses have a long history and their presence can often ratchet up situations as much as they calm them down. Read on to learn more about the history of missiles, missile defense, and the ramifications of these systems.


History of Missiles

Crude rockets were developed as far back as 13th-century China. They were used occasionally over the next few centuries but were not heavily utilized because their paper or wood shells often made them inaccurate, and they lacked enough power to cause major damage. This began to change in late 18th-century India, when Tipu Sultan, leader of the Kingdom of Mysore, used metal-tubed rockets against the British. The metal tubes not only increased accuracy but also increased pressure, making the rockets considerably more powerful.

Following this improvement, rockets started being used with increasing regularity. While missile testing and research advanced during World War I, modern missile technology would not be ready for a couple of decades. World War II saw an explosion of rocket use with the introduction of land-based, vehicle-based, and even human-operated rockets. Following the war, the two resulting superpowers began testing missiles with greater frequency and their respective ranges and destructive power gradually increased.

When it comes to missiles, several important distinctions can be made. The most basic is the distinction between a bomb and a missile: bombs are unguided and have no propulsion system, whereas missiles do. Two additional differences determine the type of missile. Ballistic missiles have two phases: the first is the powered, guided phase, during which the missile is propelled onto its given trajectory. Once the fuel runs out, the missile enters its second phase, where it is essentially guided by the laws of physics. Ballistic missiles are very hard to intercept.

The second type of missile is the cruise missile. Cruise missiles are essentially airplanes with explosives attached. Thanks to control and navigation features such as wings and even GPS, cruise missiles are very accurate and can be aimed at extremely small targets like doors. Due to their maneuverability, cruise missiles are even harder to intercept than ballistic missiles. Both ballistic and cruise missiles can carry nuclear warheads, although cruise missiles typically carry smaller payloads than ballistic missiles. Along with these two classifications are several others that distinguish between things like how a missile is launched, its target, and the terminology used in different countries.


The U.S. Missile Defense Program

When it comes to missile defense systems, the current landscape consists of the United States, and then everyone else. From the mid-1950s until 2000, the United States spent over $100 billion on missile defense and is, in fact, the only country to commit a significant portion of its defense spending to this specific cause. While the U.S. has spent a significant sum on missile defense, its actual commitment to the technology has waxed and waned over time.

President Dwight D. Eisenhower’s administration began the missile defense program in response to the Soviets developing nuclear missiles. The first missile defense system was deployed by President Richard Nixon as a response to a Soviet defense system and to help the U.S. position in arms treaty negotiations. Support then dropped under President Gerald Ford, who saw the system as ineffective. Nonetheless, large expenditures continued under President Jimmy Carter and then ballooned under both President Ronald Reagan and President George H.W. Bush. The first Bush Administration finally cut back the missile defense budget following the collapse of the USSR, and defense efforts were refocused on protecting against accidental launches.

However, President Bill Clinton signed the National Missile Defense Act in 1999, signaling a shift back to a focus on missile defense. President George W. Bush was a strong supporter of missile defense and increased spending on defense systems significantly. In 2002 the Bush Administration actually withdrew from the Anti-Ballistic Missile Treaty in order to advance its missile defense system. Former President Barack Obama also supported a variety of missile defense initiatives, both in the U.S. and abroad; however, he did reverse some of President Bush’s efforts to place a defense system in Europe.

The current U.S. missile defense system consists of several parts, each of which focuses on missiles at a different stage of flight. The first is the boost phase, which occurs while a missile is being propelled by an engine or fuel source. The second is the midcourse phase, which begins when a missile is done launching and starts on its course to the target. Third is the terminal phase, which begins when the missile reenters the earth’s atmosphere and continues until impact or detonation.

The five primary components of the U.S. missile defense system have different launch locations in order to intercept missiles at specific stages of flight. The ground-based system focuses on missiles in the midcourse phase. The Aegis Ballistic Missile Defense System, carried aboard Navy cruisers and destroyers, can intercept short-, medium-, and intermediate-range missiles during their midcourse phase. The Terminal High Altitude Area Defense (THAAD) component is launched from a truck to defend against short- and medium-range missiles during their midcourse and terminal phases. The Patriot Advanced Capability-3 (PAC-3) component is designed to defend against short- and medium-range missiles in their terminal phase. Finally, the space-based surveillance system, attached to three geosynchronous satellites, provides information and early warnings of missile launches.

The United States is not the only country with a missile defense system. Russia also maintains its own system based around Moscow. In addition, several other countries have their own defense systems. For example, Israel has its “Iron Dome” system in place to protect against local attacks and other systems for long-range missiles. While a few countries have some form of missile defense, a larger number have missile technology and could conceivably develop missile defense capabilities. As of 2014, 31 countries had some form of ballistic missile technology, although the capabilities of some of those countries, such as Afghanistan, are currently in doubt.


Complications of Installing Missile Defense Systems 

The THAAD deployment in South Korea is certainly not the first time the U.S. or another country has installed defense systems abroad, and the United States has already installed the same system in its territory of Guam to counter the North Korean threat.

While the placement of missile defense systems is often controversial, it is fair to wonder whether all this concern over their installation is warranted. The reason for this is two-fold. First, every existing defense system is severely limited in comparison to the offensive capabilities of many countries. Specifically, the missiles used for defense cost much more than the offensive weapons they target, so there are fewer of them. The current cost balance means that it is considerably cheaper for countries to build new missiles than it is for the United States or any country to defend against them. Current systems are also not equipped to handle a strike as large as countries like Russia or China could potentially launch given their weapons stockpiles.
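To see why the cost balance favors the attacker, consider a deliberately hypothetical pair of round numbers (actual prices vary widely by system and are not drawn from this article’s sources): if an interceptor costs $10 million and an offensive missile costs $1 million, the exchange ratio is

\[
\frac{\$10\ \text{million per interceptor}}{\$1\ \text{million per missile}} = 10,
\]

meaning an attacker can field roughly ten new missiles for the price of each interceptor built to stop them. And since defenders often fire more than one interceptor per incoming missile to raise the odds of a hit, the effective ratio is even less favorable to the defense.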

The other major issue is that defensive missile technology is not very reliable. This has been the case in the past, too–the initial U.S. missile defense system was viewed as so ineffective that it was scrapped in 1974. The problem continued through the Gulf War, when the Patriot system had considerable difficulty intercepting fairly primitive Iraqi rockets. Even the current systems have, in tests, shot down fewer than half of the missiles they have targeted since 1999. Because tests are typically done under ideal conditions, recent results have cast doubt on the effectiveness of the current system.

Despite the existing limitations of missile defense technology, these systems are still viewed as a threat by other countries. The thinking goes that they encourage the opposing side to build up their missiles to counteract the missile defense system, essentially creating an arms race. In the recent circumstances–both in Guam and now South Korea–China’s concern has focused on the radar technology included in the THAAD system, which China fears will be used to spy on it. While both the U.S. and South Korea have emphasized that the system is only there to protect against potential launches from North Korea, the Chinese have responded by placing economic sanctions on South Korea.

The accompanying video looks at the THAAD system and why China does not want it installed on the Korean Peninsula:


Missile Treaties

To counter fears of an arms race and other threats, numerous treaties have been ratified to reduce the number and types of missiles in the field. The most important treaty regarding missile defense was the Anti-Ballistic Missile Treaty signed in 1972. The purpose of this treaty was to prevent arms races by limiting defense systems that would neutralize attacks. The idea was that both sides having the ability to destroy each other would serve as a deterrent. If one side developed an effective missile defense system, the other would need to make faster or more lethal missiles, leading to a consistent buildup.

This logic was fairly effective and, along with the technical difficulty of building an effective missile defense system, kept the U.S. and the Soviet Union, and later Russia, from fielding large-scale defenses. However, in 2002, the United States withdrew from the Anti-Ballistic Missile Treaty because it wanted to develop a more robust system. But all the United States has developed so far is an unreliable and expensive system that has still left many uneasy.


Conclusion 

Missiles are an old and well-tested technology capable of delivering nuclear weapons around the globe with considerable precision. Conversely, missile defense is still relatively untested and often fails to provide what its name would literally suggest. Why then are certain parties so reassured by missile defense and others so agitated?

The answer is that every missile defense system is at the same time a missile launcher, and when a system sits close to a foreign border it makes the situation uncomfortable. It also forces the countries involved to continuously counter each other’s capabilities. This has been the case in several instances throughout history and will likely continue as long as adversaries place their missiles close to one another. While treaties have been put in place to address this issue, the most important one was nullified by U.S. withdrawal. The future, then, is likely to continue much as the present–barring one country or a group of countries offering to disarm.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What Does it Mean for the U.S. to Put a Missile Defense System in South Korea? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/mean-u-s-put-missile-defense-system-south-korea/feed/ 0 60000
Before the Ban: The History of U.S. Immigration Policy https://legacy.lawstreetmedia.com/issues/law-and-politics/ban-history-us-immigration-policy/ https://legacy.lawstreetmedia.com/issues/law-and-politics/ban-history-us-immigration-policy/#respond Fri, 24 Mar 2017 20:32:07 +0000 https://lawstreetmedia.com/?p=58547

How recent calls for immigration restrictions compare to the history of immigration policy.

The post Before the Ban: The History of U.S. Immigration Policy appeared first on Law Street.

]]>
"Statue of Liberty" courtesy of Shinya Suzuki; License: (CC BY-ND 2.0)

President Donald Trump recently issued a revised travel ban temporarily preventing people from six countries and all refugees from entering the United States. The original ban was immediately met with condemnation, protest, and legal action, leading the administration to change course. The revised version amounts to a significant scaling back relative to the original, but many of the longer-term consequences remain the same. While this is the most recent and perhaps one of the most chaotic efforts to control who comes into the United States, it is far from the first. The history of U.S. immigration policy is littered with restrictions, quotas, and preferences for certain groups. Read on to find out how President Trump’s executive action fits into the long history of American immigration policy.


History of Immigration

The United States has been a land of immigrants since long before it was a country. European migration began in the 16th century, first with the French and Spanish and later with the English, who founded their first permanent colony in Jamestown in 1607. Many of the earliest European settlers traveled either to avoid religious persecution in their native land or to seek better opportunities. There was also a dark side to this original mass migration: many white Europeans arrived as indentured servants, and even more black slaves were forcibly removed from Africa and brought to the New World.

The second wave of migrants, which came to the United States in the 19th century, was also predominantly from Western Europe. Along with English settlers came large numbers of Irish and German migrants. Approximately 4.5 million Irish made their way to the United States between 1820 and 1930, settling mostly along the coast. Meanwhile, roughly five million Germans arrived during the 19th century, often moving into the interior of the country. These groups were also joined by a large number of Chinese workers who came to the United States in search of gold. The Chinese immigrants tended to settle in the western portions of the country.

At the end of the 19th century and into the early 20th century, the demographics of immigration shifted again. During this period there was a large rise in immigration from Southern and Eastern Europe, most clearly characterized by the nearly four million Italians who entered the United States. Following this wave, however, immigration slowed dramatically due to international events such as World Wars I and II, as well as the Great Depression. After World War II, refugees from Europe and the Soviet Union flocked to the U.S., along with those from Cuba following Castro’s rise. The video below from Business Insider provides a good illustration of where immigrants came from and when they arrived in the United States:

According to numbers from Pew Research Center, the highest percentage of foreign-born people living in the United States occurred back in 1890 when nearly 15 percent of the population was foreign-born. The lowest point occurred back in 1970 when just 5 percent of the population was born outside the United States. Recent data suggests we will likely reach a new high very soon. In 2015, about 14 percent of the population was foreign-born, a percentage that is projected to increase to nearly 18 percent by 2065.


The History of Immigration Restrictions

For almost as long as people have been migrating to the United States, policymakers have enacted a variety of different pieces of legislation to restrict immigration in general and for specific groups of people. First was the Naturalization Act of 1790, which made only free white people of good moral character who had lived in the U.S. for at least two years eligible for naturalization. This residency requirement was later raised to 14 years and eventually lowered back to five, for political reasons. Another restriction was put in place in 1819, when Congress started requiring ship captains to provide a list of any foreign-born people onboard intending to immigrate.

Immigration restrictions did not really intensify until after 1850, which was the first time the U.S. Census asked what country people came from. This was followed by a dramatic increase in migration restrictions, particularly those targeting people from Asia. In 1862, the “Anti-Coolie” Act was passed with the aim of preventing Chinese immigration to California; it forced California businesses that hired Chinese workers to pay an additional tax. There was also the Naturalization Act of 1870, which made free white people and persons of “African nativity” or “African descent” eligible for naturalization but excluded Asians. Perhaps the most infamous example was the Chinese Exclusion Act, passed in 1882, which barred all Chinese immigration for 10 years. This act was extended for 10 more years by the Geary Act in 1892 and then again indefinitely in 1902.

These restrictive measures extended into the 20th century as well, starting with the 1907 Gentlemen’s Agreement with Japan, in which Japan agreed to discourage Japanese migration to the U.S. in exchange for more protections for Japanese people already in the country. There were additional restrictions at the state level as well, including in 1913, when California passed the Alien Land Law preventing Chinese and Japanese nationals from owning land. In 1917, Congress went a step further, banning immigration from many Asian countries, with the notable exceptions of Japan and the Philippines.

Another major immigration policy shift occurred in 1921 when the first of the Quota Acts was enacted. The law capped annual immigration from each country at three percent of the number of people from that country living in the United States as of the 1910 census. A similar act passed in 1924 limited the number of migrants from Eastern and Southern Europe to two percent of their 1890 population levels. The adoption of the National Origins Formula delivered the final blow, completely banning immigration from Asia while still allowing immigration from the Western Hemisphere.
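To make the quota arithmetic concrete, the 1921 formula can be written out directly. The population figure below is hypothetical, chosen only to illustrate the calculation, not an actual census count:

\[
\text{annual quota} = 0.03 \times \text{(residents from that country in the 1910 census)}, \qquad \text{e.g. } 0.03 \times 200{,}000 = 6{,}000
\]

So a country with a hypothetical 200,000 residents counted in 1910 would have been allotted 6,000 admissions per year. The 1924 act shrank quotas on both ends, cutting the rate to two percent and moving the baseline back to the 1890 census, when far fewer Southern and Eastern Europeans lived in the United States.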

It did not end with Asian immigrants, either: the Oriental Exclusion Act prohibited most immigration from Asia, extending even to the foreign-born wives and children of American citizens of Chinese ancestry. The Expatriation Act went further still, stating that an American woman who married a foreign national lost her citizenship; this was partially repealed in 1922 but still held for women marrying Asian citizens. Even the Supreme Court entered the debate over race and citizenship in United States v. Bhagat Singh Thind, ruling that a man from India, however he might be classified anthropologically, did not meet the common understanding of “white person” used in established immigration law and therefore could not become a citizen.

In addition to people from Asia, other groups were barred over the years for reasons not explicitly related to race or ethnicity. The Immigration Act of 1882, for example, put a $0.50 head tax on immigrants and barred “lunatics” and those likely to become dependent on the state. The Alien Contract Labor Law was passed in 1885 to prohibit bringing foreign contract laborers to the country, except for certain industries. In 1891, Congress made polygamists, people with certain diseases, and those convicted of specific misdemeanors ineligible for immigration as well. Political groups were also targeted: following the assassination of President William McKinley, Congress passed the Anarchist Exclusion Act in 1901, barring anarchists and political extremists.

Along with all these outright restrictions came a host of other measures–like literacy tests on citizenship applications and additional agencies set up to oversee immigration–that, while not explicitly forbidding immigration, significantly hindered it for many groups. It was not until 1965, when Congress passed the Immigration and Nationality Act, that many of the quotas and restrictions were finally eliminated.

The video below gives an overview of the immigration practices of the United States:


Immigrants in the U.S. Today

Despite the complicated history of immigration policy, the number of foreign-born people in the United States has increased dramatically since 1965. As of 2015, there were about 43.3 million foreign-born people living in the United States, which is approximately 13.5 percent of the total population. Of that amount, about 20 million are naturalized citizens, with the rest being permanent residents, people with temporary status, and people who entered the country illegally. The Pew Research Center estimates that in 2014 there was a total of 11.1 million foreign-born people in the United States who entered the country illegally.

The immigrant population rose from 9.6 million in 1970 to 43.3 million today. Over that time, the primary source of immigrants has shifted from Europe to Latin America and Asia. Specifically, in 2015 the top five countries of origin for new immigrants were India, China, Mexico, the Philippines, and Canada. The 2015 numbers generally reflect the leading countries of origin for the total foreign-born population as well, which is led by Mexico, India, China, and the Philippines.

The immigrant population in the United States skews slightly female, at a little more than 50 percent. It is also older than the general U.S. population, with a median age of 43.5 years. Demographically, nearly half of immigrants identify themselves as white, a little more than a quarter identify as Asian, and about 9 percent identify themselves as black. Ethnically, Hispanics and Latinos are the largest group of immigrants, representing about 45 percent. In terms of education, the percentage of immigrants with at least a bachelor’s degree is almost the same as the national average, at about 30 percent. And geographically, states that border Mexico or have large population centers tend to have the most immigrants, with California leading the way, followed by New York, Texas, Florida, and New Jersey.


Immigration and the Economy

From 2009 to 2011, the amount of money earned by immigrants was nearly 15 percent of all U.S. wages, even though immigrants make up only about 13 percent of the overall population. Immigrants are more likely to be of prime working age and work in higher proportions relative to their share of the population. Immigrants also own nearly one-fifth of all small businesses. Finally, nearly half of all immigrants work in white-collar jobs, and they are often overrepresented in some middle-class occupations such as nursing.

While immigrants work in disproportionately high numbers, they generally do not harm the work opportunities of most native-born Americans. While immigration’s effects on domestic workers are a hotly debated subject, many economists agree that it provides an overall economic benefit, although it could also have significant economic consequences for certain groups. In the long run, immigrants can actually be beneficial to the American job market overall. Moreover, when immigrants drive wages down, it is often because they lack the protections that American citizens have and are thus susceptible to exploitation.

Immigrants, particularly undocumented workers, often pay into programs such as Social Security, which they cannot draw from, and are actually a net positive for the national budget. A review from the Social Security Administration found that undocumented workers paid as much as $13 billion into Social Security in 2010–which came in the form of payroll taxes from immigrants using fraudulent identification–but only received about $1 billion in benefits.

Aside from economic impacts, immigrants also affect American society in other positive ways. These include introducing new or different foods and cooking styles, presenting alternative forms of spirituality, and even incorporating non-traditional medical treatments.


Conclusion

The words inscribed at the foot of the Statue of Liberty in New York read, “Give me your tired, your poor, your huddled masses yearning to breathe free, the wretched refuse of your teeming shore.” These inspiring words, originally written by the poet Emma Lazarus, perfectly encapsulate the ideals that many speak of when they refer to the United States as a nation of immigrants. However, for much of the nation’s history, the people and practices of this country have failed to live up to that ideal.

Donald Trump’s ban, while definitely not the first, is the latest in a long line of efforts to restrict immigration from certain areas and for certain groups of people. Although these restrictions are often passed under the guise of being in the best interest of America or its citizens, they can have the opposite effect, because immigrants are often willing to do many of the jobs native-born citizens will not, at lower wages. Despite the United States’ complicated history, immigrants have continuously added to and enriched American culture.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Before the Ban: The History of U.S. Immigration Policy appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/law-and-politics/ban-history-us-immigration-policy/feed/ 0 58547
Not About the Benjamins: Is the United States on the Verge of Eliminating the $100 Bill? https://legacy.lawstreetmedia.com/issues/business-and-economics/not-benjamins-united-states-verge-eliminating-100-bill/ https://legacy.lawstreetmedia.com/issues/business-and-economics/not-benjamins-united-states-verge-eliminating-100-bill/#respond Fri, 17 Mar 2017 22:23:02 +0000 https://lawstreetmedia.com/?p=58924

Will the U.S. follow India's lead and eliminate high-denomination currency?

The post Not About the Benjamins: Is the United States on the Verge of Eliminating the $100 Bill? appeared first on Law Street.

]]>
"Pile of Cash" courtesy of 401(K) 2012/401kcalculator.org; License: (CC BY-SA 2.0)

Last November, India began a campaign to eliminate large bills from its currency by removing 500- and 1,000-rupee notes from circulation. The goal was to go after criminals guilty of everything from tax evasion to drug trafficking by eliminating their means of accumulating wealth. This is not just a limited effort, however, as other countries, including the United States, are monitoring the situation in India and considering following suit. Read on to see whether the U.S. is ready to actually scrap the $100 bill, what impact doing so would have on the country, and whether the rest of the world is likely to follow suit.


India’s Move

The move by India’s Prime Minister, Narendra Modi, was aimed at recouping some of the estimated $460 billion in untaxed wealth in the country, equal to as much as 20 percent of India’s GDP. Modi was also seemingly attempting to fulfill a campaign promise to go after so-called “black money” in the economy. Despite these motivations, the move led to a massive cash shortage that ended up hurting the poor the most and caused the IMF to slash its growth forecast for India by a full percentage point. Nonetheless, despite the government’s failure to anticipate these effects–and despite the facts that some people simply hid their wealth in assets such as gold and that the 500-rupee note is not really a large bill–the move was still widely popular. The video below looks at the impact of the Indian government’s decision to eliminate the 500 and 1,000 rupee notes:


U.S. Efforts

There have also been calls in the United States to eliminate large bills, although the exact methods are unlikely to copy those used by India. The charge here is being led by Harvard economist Kenneth Rogoff, whose plan calls for eliminating any bill larger than $10 over a 15-20 year period. The goal would be similar to India’s: to target tax evaders and money launderers. Rogoff claims this would be an especially effective move on the part of the U.S. because 75 percent of $100 bills worldwide are actually held abroad, many by Mexican cartel leaders and Russian oligarchs. Rogoff believes that, since most transactions in the United States are done electronically, unlike in India, eliminating these bills would not be a major change.

While Rogoff and other Harvard economists such as Peter Sands have suggested making the change, there is still no plan to eliminate big bills; in fact, there is strong pushback against the idea. A group of government agencies that includes the Treasury, the Federal Reserve, and the Drug Enforcement Administration opposes the move for a number of reasons.

The first reason people oppose the move is cost: removing $100 bills and replacing them with twice as many $50 bills would wipe out much of the profit the government makes by printing money. Second is usage: while many people do not carry $100 bills, about 5.2 percent of the U.S. population still does, which amounts to millions of people. Lastly, although criminals may be inconvenienced by having to literally carry more bills, eliminating $100 bills would just force them to use other bills or find other means of accumulating wealth. In fact, cash shipments in smaller bills have already been seized at the border. The following video looks at whether or not the U.S. is likely to eliminate the $100 bill, and some of its potential effects:


Impact on Economy

Economically, a switch to smaller bills, or to no cash altogether, is also a mixed bag. As mentioned earlier, by eliminating larger bills the government would lose out on seigniorage: the profit made on the difference between a bill’s face value and the cost of printing it. Because higher-denomination bills generate more of this profit, the estimated cost of eliminating them would be roughly $6 billion annually, which may seem like a lot but pales in comparison to what the government spends fighting crime funded by cash and large bills.
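As a rough sketch of how seigniorage works (the 15-cent printing cost below is an illustrative stand-in, not an official Bureau of Engraving and Printing figure):

\[
\text{seigniorage per note} \approx \text{face value} - \text{printing cost}
\]

Issuing $100 of face value as a single $100 note requires one printing (roughly $0.15 under this assumption), while issuing the same $100 as ten $10 notes requires ten printings (roughly $1.50). The smaller the largest denomination, the more the government pays to put the same amount of cash into circulation.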

A potential positive economic impact of the move concerns monetary policy. During recessions, central banks lower their interest rates, which makes keeping money in savings accounts less appealing and instead encourages spending. However, there is a limit known as the “zero lower bound”: once the interest rate becomes negative, banks effectively start charging people to save their own money. At that point, rational people would withdraw their money and hold the cash until interest rates rose, which would be much harder to do with many smaller bills and impossible if there were no bills at all.
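A quick hypothetical illustrates the incentive. At an assumed interest rate of negative 0.5 percent, a saver with $10,000 in the bank would pay to keep it there:

\[
\$10{,}000 \times (-0.005) = -\$50 \text{ per year}
\]

Cash held outside the bank earns exactly 0 percent, which beats the negative rate, so withdrawing would be the rational move; scrapping large bills makes that escape route bulkier and less practical.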


A Global Movement?

Whether or not the United States decides to follow India’s lead in eliminating large bills, the movement is not dead on arrival. Nearly two decades ago, Canada eliminated the $1,000 bill from its currency to combat the very same criminal activities India is targeting. Singapore is eliminating its $10,000 bill as well. India is not even the only nation in the developing world doing away with large notes, as Venezuela recently outlawed its 100-bolivar bill with the goal of fighting crime. There is even some precedent in the United States: in 1969, the country did away with $500, $1,000, $5,000, and $10,000 bills because of lack of use, as they were almost entirely utilized for moving money between different Federal Reserve branches.

It is in Europe, however, where the most aggressive steps have already been taken. Last March, the European Central Bank announced it would discontinue the 500-euro note and stop replacing it entirely by 2018. While other large currency notes are often used by criminals, the 500-euro note had become so ubiquitous among them that it garnered the unflattering nickname “the Bin Laden,” after the al-Qaeda leader. But some critics are quick to point out, as they have in other countries, that eliminating big bills just forces illicit funds into other venues. Some also contend that this is a way to force people to spend more, since with less cash held outside the banking system it becomes easier for banks to impose negative interest rates. The accompanying video looks at the impact of eliminating the 500-euro bill on Europe:

Some countries in Europe have gone even further: in Sweden, for example, there is an unofficial yet concerted effort to do away with cash entirely. In that country, only 2 percent of national wealth is held in bills or coins and only about 20 percent of total transactions are processed in cash, so the move makes some sense. However, a switch of this magnitude and nature does not come without consequences. People who do not have access to the apps that are replacing cash and cards, such as older individuals and refugees, may find themselves unable to pay for basic necessities if the transition is made. Additionally, if all transactions are made electronically they are more susceptible to hacking and government oversight. Nevertheless, Sweden is not alone in this push, with Denmark and Norway also following suit.


Conclusion

Reducing the supply of big bills, or even eliminating cash altogether, comes down to a simple cost-benefit analysis. Having only smaller bills will force much of the money associated with the black market economy out into the open, and at the very least it will make that money harder to carry. However, as mentioned above, there are numerous alternatives to keeping illegal funds in cash.

Conversely, while making transactions increasingly with cards or electronic forms of payment may make it harder to hide crime, it also makes everyday purchases easier to track, not only by the government but also by websites and corporations. It would also make it more likely that information gets stolen by hackers or other nefarious groups, simply because there are more opportunities. This is not even factoring in the effort it would take to acquaint many people with the new forms of payment, or the effect the change could have on monetary policy.

In some places this trade-off has seemingly been deemed acceptable, but for the most part it has not caught on worldwide. Cash, even in large bills, is likely to remain king until security and privacy concerns are considered less of an issue compared to concerns over how criminals are hoarding their ill-gotten gains.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Not About the Benjamins: Is the United States on the Verge of Eliminating the $100 Bill? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/not-benjamins-united-states-verge-eliminating-100-bill/feed/ 0 58924
The Obesity Epidemic: What’s Behind One of America’s Largest Health Problems? https://legacy.lawstreetmedia.com/issues/health-science/obesity-epidemic-health-problem/ https://legacy.lawstreetmedia.com/issues/health-science/obesity-epidemic-health-problem/#respond Fri, 10 Mar 2017 15:08:02 +0000 https://lawstreetmedia.com/?p=58314

Can anything be done to reverse the trend?

The post The Obesity Epidemic: What’s Behind One of America’s Largest Health Problems? appeared first on Law Street.

]]>
"one blemish" courtesy of waferboard; License: (CC BY 2.0)

A recent study found that there has been a rise in insurance claims for obesity-linked illnesses, such as high blood pressure, type 2 diabetes, and sleep apnea. While the findings of this study–completed by the nonprofit Fair Health, a national clearinghouse for claims data–are nothing new, it is one of the first to use actual claims data. This is important because claims data show treatments actually rendered, which can help illustrate the medical costs associated with high obesity rates. Beyond the results of this specific study, though, the fact is that obesity has become a major problem in the United States for people of all ages. Read on to learn more about the American obesity epidemic, what is being done to fight it, and the outlook going forward.


Obesity in America

Obesity is a somewhat mysterious term, so it first bears clarifying. According to the Centers for Disease Control (CDC), a person is obese when his or her Body Mass Index (BMI) is 30 or above. BMI is calculated by dividing a person’s weight in kilograms by the square of his or her height in meters. A BMI of 40 or above is considered extreme or severe obesity. While BMI is a useful tool to help assess health on a basic level, it does not directly measure the amount of body fat a person has.
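As a worked example (the height and weight here are hypothetical, not drawn from CDC data), the formula and a sample calculation look like this:

\[
\text{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^2}, \qquad \frac{95\ \text{kg}}{(1.75\ \text{m})^2} \approx 31.0
\]

A person of that height and weight would fall just above the obesity threshold of 30 but well below the severe-obesity threshold of 40.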

In 2014, 36 percent of adults in the United States were obese. According to estimates from 2008, obesity cost the nation approximately $147 billion in medical costs. On a more individual level, people who are obese spend $1,429 more on medical costs per year than people of normal weight. Not only does obesity have negative physical effects, but it can also have negative mental effects and lead to depression.

From a demographic perspective, obesity tends to affect certain groups more than others. Non-Hispanic black Americans have the highest age-adjusted rate of obesity, at 48.1 percent. They are followed closely by Hispanics and non-Hispanic whites. The group with the lowest rate of obesity by far is Asian Americans, who have an average obesity rate of just 11.7 percent, which is well below the national average. Additionally, while obesity is on the rise in many demographic groups, middle-aged adults still have higher rates of obesity at 40.2 percent than both older adults and young adults.

Along similar lines, for men there is not much of a correlation between income or education level and obesity. The one exception is that black and Mexican-American men with higher incomes are more likely to be obese than lower-income men in the same groups. For women, a more widespread correlation exists: higher-income and better-educated women of all races are less likely to be obese than women with lower incomes and less education. Geographically, obesity is less prevalent in the West and Northeast of the United States, with the Southeast having the highest rates. The following video gives an overview of the facts behind the obesity epidemic:


Factors behind the Obesity Epidemic

So what causes obesity and what led to its rise? While many people may point to a simple lack of self-control to explain its prevalence, in many cases the story is much more complicated than that. One of the major issues is genetics: different people absorb, store, and process food differently, which can make some more likely to gain weight.

In the same vein, medical problems that lead to inactivity, such as arthritis, can also contribute to obesity. Similarly, certain medications taken for completely unrelated conditions, such as depression, can cause weight gain. Age and pregnancy can lead to obesity as well, with people’s metabolisms generally slowing down as they get older and some women having difficulty losing weight after giving birth.

In addition to the physical factors, there are also several environmental factors at play. These include access to a place to exercise, knowledge of healthy cooking, and even being able to afford healthy food. Quitting smoking can affect someone’s weight as well, although the health benefits of quitting generally outweigh any weight gained. Even a change in sleep patterns can lead to significant weight gain, as it can cause hormonal changes that affect how food is digested.

Sometimes there are factors completely beyond a person’s control, one example being meals at restaurants, which today are on average four times larger than they would have been back in the 1950s. Along with quantity and portion size, the cost of food also plays an important role in the rate of obesity. Since the 1970s, the cost of food as a portion of income has gone down. Nor is all food created equal: while all food has gotten relatively cheaper, unhealthy foods tend to cost even less than healthy alternatives such as vegetables. Even setting aside how healthy cheap food is, the sheer availability of food makes being obese more likely. While factors such as poor diet, family lifestyle, and inactivity can lead to obesity, they are clearly not the only causes.


Efforts to Reduce Obesity

While determining the causes of obesity has been a challenge, actually reducing it has been particularly difficult. However, that failure is not for a lack of trying. The CDC funds programs at the state and local level in an effort to reduce obesity by advocating for a combination of healthy eating habits and an active lifestyle. The CDC’s High Obesity Program provides grants to universities in areas with a high prevalence of obesity that involve a targeted approach to address the issue. Several states and cities have also implemented a range of policies to address health concerns, ranging from taxes on soda and sugary drinks to school nutrition programs.

There are many resources outside the government as well, in the form of non-governmental organizations focused on combatting obesity. A number of these organizations–like the Obesity Action Coalition or TOPS Club, Inc.–echo their government counterparts, preaching that a combination of education, healthy eating, and physical activity is necessary to combat the obesity epidemic.

The accompanying video looks at ways to fight obesity:

Nevertheless, for all the energy these organizations, government and non-government alike, are exerting, their efforts seem largely in vain. Despite major investments in research, clinical care, and the development of various programs to counteract obesity, after more than 30 years there are few signs that the fight against the epidemic is succeeding. While the overall trend has not reversed itself, some targeted efforts have managed to bring about success at the community level.


Going Forward

Obesity is a major factor in predictions that, for the first time, children growing up today may not outlive their parents. That is because obesity rates, and body weights in general, have skyrocketed over the last 40 to 50 years. From 1962 to 2006 the obesity rate among Americans grew from 13.4 percent to 35.1 percent. The average person today weighs 26 pounds more than he or she would have in the 1950s. A 2005 study found that if obesity trends continue on their current path, the life expectancy gains of the past several decades could stall or even reverse.

This troubling news concerning obesity comes at an especially bad time. With rates already increasing, government programs that target obesity prevention, in particular, could lose federal money. One of the many aspects of the Affordable Care Act was the creation of the Prevention and Public Health Fund, which provides resources to important prevention programs–including some obesity-related grants–and makes up a sizable portion of the CDC’s total budget. With Congress debating whether or not to repeal the law, such funding could be cut. In January, more than 300 public health organizations signed a letter to congressional leaders asking them not to eliminate the fund.

Investing in these public health interventions is becoming more important than ever, as estimates indicate that the obesity epidemic will continue to be a problem in the years to come. Two different studies predict that the obesity rate could rise to between 42 and 44 percent by 2030.

While obesity is often framed as an American epidemic, and the United States has one of the highest obesity rates in the world, it is not the only place feeling the burden. Roughly 30 percent of the world’s population, or 2.1 billion people, are either overweight or obese. The trend affects developed and developing countries alike, though in different ways: in developed nations, men have higher rates of obesity, whereas in developing countries women do.

Regardless of demographics, though, obesity rates are increasing all over the world much as they are in the United States. Also as in the United States, preventive measures to reduce obesity have mostly failed. Indeed, regions outside of North America and the West now have the highest rates: the Middle East and North Africa currently lead the world in adult obesity.


Conclusion

While obesity tends to affect certain groups more than others, overall obesity rates have increased significantly in the past several decades. While obesity rates have leveled off among American youth in the past 10 years, they have continued to climb for adults and remain at record highs for both. Unfortunately, many of the attempts to reverse these trends have had little success so far. This is extremely troubling as obesity has gone from a problem to an epidemic.

The impact from rising obesity rates has the potential to be disastrous. Obesity already costs the United States alone hundreds of billions of dollars annually. For nations that cannot afford this level of care, obesity could lead many people to develop obesity-related diseases and complications without any way to treat or address them. While most efforts have failed to reverse the trend, some targeted interventions have been effective. Ultimately, the problem will need to be addressed at a larger scale for rates to decline.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Obesity Epidemic: What’s Behind One of America’s Largest Health Problems? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/health-science/obesity-epidemic-health-problem/feed/ 0 58314
The Story Behind the U.S.-Russia Relationship https://legacy.lawstreetmedia.com/issues/world/story-behind-russia-us-relationship/ https://legacy.lawstreetmedia.com/issues/world/story-behind-russia-us-relationship/#respond Mon, 06 Feb 2017 17:20:18 +0000 https://lawstreetmedia.com/?p=58101

The two nations have a long and complicated history.

The post The Story Behind the U.S.-Russia Relationship appeared first on Law Street.

]]>
"His Excellency Mr. Vladimir V. Putin, President of the Russian Federation" courtesy of UNclimatechange; License: (CC BY 2.0)

On Thursday, December 29, President Barack Obama placed sanctions on Russia for its alleged hacking of several American institutions. While the sanctions themselves are not unprecedented–the U.S. had sanctioned Russia two years earlier–they point to another unfortunate episode in an increasingly contentious relationship between the two nations. This relationship has eroded as these former adversaries have clashed over Crimea, the nation of Georgia, Syria, and U.S. presidential elections, all within the past decade. While some recent developments suggest the relationship may start to improve, much of the future remains uncertain. Read on to see how the U.S.-Russia relationship has developed over the years, where exactly it stands now, and what it will look like in the future.


The United States and the Empire of Russia

The relationship between the United States and Russia, then the Russian Empire, dates back to before the U.S. government was even firmly established. During the American Revolutionary War, Russia ultimately decided to remain neutral, declining to support the British despite being a British ally at the time. When the United States and Britain next went to war, in the War of 1812, Russia was again involved, offering to serve as a mediator. While the British declined the offer, a relationship between Russia and the United States was forming.

Although the relationship had initially been positive, a degree of tension arose when the Holy Alliance–Russia, Austria, and Prussia–threatened to intercede in Central and Latin America, a perceived violation of the U.S. sphere of influence established by the Monroe Doctrine. The issue was ultimately resolved and no serious conflict resulted. Russia got back in America’s good graces when it nominally supported the United States during the Civil War, sending ships to the American East and West Coasts. Historians contend this move was actually meant to keep those ships from being blockaded or destroyed by the British and French navies in the event of a European war, and Russia never provided material support, but the presence of the Russian sailors was nevertheless received as a gesture of goodwill.

Perhaps the most significant interaction between the two prior to World War II was the purchase of Alaska, completed in 1867. Russia was keen to sell the land because it was too far away to administer effectively and because it needed money after the Crimean War. The United States began negotiating the purchase in 1859 but held off on completing it until 1867, after the Civil War. The sale price was $7.2 million, and the purchase was widely viewed as a mistake until large mineral deposits were discovered.

The United States and Russia continued their relationship into the 20th century through several important events. The first was the United States persuading Russia and several European empires to agree to an Open Door Policy in China, which ensured China’s territorial integrity. The second was the United States, under Theodore Roosevelt, mediating the Russo-Japanese War of 1904-1905.

Even when the relationship was strained, the U.S. offered substantial aid to Russia following the outbreak of World War I and later during a massive famine in 1921-1923. However, the United States, along with other Allied governments, also sent troops to undermine the new communist regime following the Bolshevik takeover and Russia’s subsequent withdrawal from World War I. When the USSR was declared in 1922, all diplomatic ties were severed.


World War II and the Cold War

The United States did eventually reestablish diplomatic ties with the USSR in 1933. During World War II, the two countries became allies, with the USSR receiving supplies from the U.S. as part of the Lend-Lease program and, later, both countries fighting the Axis Powers. These two nations, along with France, China, and the U.K., would also become the five permanent members of the U.N. Security Council following the war.

That was the high point, however, and for roughly the next 45 years the relationship grew increasingly tense during the Cold War. This was particularly true with the “Iron Curtain” descending on Eastern Europe and America introducing its policy of containment in 1947. The two sides then squared off in a stalemate, occasionally punctuated by major events like the Cuban Missile Crisis in 1962. Both countries also engaged in a heated space race, with the USSR launching the first satellite in 1957 and the United States becoming the first and only country to land a man on the moon in 1969.

Tensions normalized somewhat in the 1970s with the first talks on reducing nuclear weapons stockpiles and for cooperation in space. However, they flared once more in the 1980s with the Soviet invasion of Afghanistan. The decade closed, though, with resumed talks on disarmament. The 1990s began with a bang, or more specifically a coup in 1991. The coup failed and so did the USSR soon after, breaking into 15 countries later that year. The video below looks at the history of the Cold War:


After the Thaw

The United States greeted the fall of the Soviet Union hopefully, launching the Nunn-Lugar Cooperative Threat Reduction Program, aimed at collecting nuclear material–the infamous “loose nukes”–from the former Soviet republics. The two countries also collaborated again on the space program, culminating in the International Space Station.

Relations began to cool again after both George W. Bush and Vladimir Putin came to power, particularly when President Bush withdrew from the Anti-Ballistic Missile Treaty in 2002. This was followed by Russian opposition to the Iraq war, U.S. support for Kosovo’s independence, and an American anti-missile defense system proposed for Poland. Relations between the two countries declined precipitously following the 2008 invasion of South Ossetia and the subsequent war between Russia and Georgia.

Following this episode, the incoming Obama Administration called for a policy “reset” in 2009. Things certainly seemed promising with the New START agreement in 2010, which called for nuclear arms reductions by the United States and Russia, along with Russia agreeing to new sanctions on Iran’s nuclear program. From there, however, the situation took a turn for the worse: Russia supported Bashar al-Assad in Syria, alleged Russian spies were detained in the United States, and Russia cracked down on human rights in 2012 following Vladimir Putin’s return to the presidency. Russia also expelled USAID from the country and required foreign-funded NGOs to register as “foreign agents.”

Although both countries agreed to strengthen sanctions on North Korea following its nuclear weapons test in 2013, relations continued to deteriorate when Russia granted asylum to Edward Snowden later that year. The deterioration intensified significantly with Russia’s seizure and subsequent annexation of the Crimean Peninsula, as well as its support for separatists in Eastern Ukraine. These actions led the U.S. to place economic sanctions on Russia and to Russia’s suspension from the G8.

Most recently, the United States and Russia have continued to bicker over the Syrian conflict and Russian support for the Assad regime. The greatest spat, however, came in the wake of the recent election, when several U.S. intelligence agencies concluded that Russia had interfered in the presidential election through targeted hacking and leaking. This news caused President Obama to increase sanctions and expel 35 Russian diplomats from the country. The CIA updated its assessment to conclude that not only did Russia interfere in the election, it did so to help elect Donald Trump.


Going Forward

While Russia and the U.S. have shared a tense relationship for more than a decade, some see signs of hope in the election of Donald Trump. President Trump has seemed to encourage this view with what he has already said concerning Russia. For evidence, one need look only as far as President Obama’s recent sanctions against Russia and President Trump’s subsequent praise of Vladimir Putin’s intelligence for declining to respond in kind. The following video looks at the potential relationship with Donald Trump as president:

President Trump has indicated that he hopes to warm relations between the two countries not just with his words but also with his actions–namely, by nominating Rex Tillerson for Secretary of State. Tillerson was formerly the CEO of Exxon Mobil and has extensive business experience working with the Russian government. In fact, Tillerson was once awarded Russia’s Order of Friendship by Vladimir Putin himself. While all of Tillerson’s experience with the country comes from his work in the private sector–acting on behalf of Exxon Mobil rather than the American government–early indications suggest that Russia is pleased with his selection.

Nevertheless, the U.S.-Russia relationship is dictated by more than just the president and his cabinet, and that is where things start to get complicated. While Trump sang Putin’s praises for exercising restraint, Republican members of Congress were happy to see additional sanctions placed on Russia, which many considered overdue. For some, such as Senator John McCain, the sanctions did not go far enough, and he pledged to work for even tougher measures. Thus the jury remains out on the future of the relationship; however, the opportunity for improvement appears to be there.


Conclusion

The relationship between Russia and the United States has ebbed and flowed. At first, like many other countries in Europe, Russia treated the United States as a trading partner but not much else. However, with the dawn of the twentieth century and the ascension of the United States as one of the preeminent powers in the world, Russia began to take notice. This situation came to a head following World War II when they were the only two superpowers left standing, prompting competition for ideological control of the world.

However, following the collapse of the Soviet Union, early indications seemed to suggest the United States and Russia could finally work together and form a more collaborative relationship. Unfortunately, this was not to be the case, as the early 21st century featured more disagreement and mutual antagonism. With the rise of Vladimir Putin and his sustained grip on power, the situation has only deteriorated further. While newly elected President Trump has suggested a closer partnership, it remains to be seen if that will stand the test of his term or if Congressional Republicans will even allow it. In the meantime, the United States and Russia will continue their long, circling dance, interacting when necessary and quarreling regularly.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Story Behind the U.S.-Russia Relationship appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/story-behind-russia-us-relationship/feed/ 0 58101
What a Recent U.N. Resolution Means for the Israeli-Palestinian Peace Process https://legacy.lawstreetmedia.com/issues/world/u-n-resolution-israel-palestine/ https://legacy.lawstreetmedia.com/issues/world/u-n-resolution-israel-palestine/#respond Wed, 01 Feb 2017 17:28:49 +0000 http://lawstreetmedia.com/?p=57884

Will the Obama administration's last effort at peace make a difference?

The post What a Recent U.N. Resolution Means for the Israeli-Palestinian Peace Process appeared first on Law Street.

]]>
"israeli settlement in the middle of hebron, palestine" courtesy of Jordan Klein; License: (CC BY 2.0)

The United Nations recently passed a resolution regarding Israeli settlements in occupied Palestinian lands. The most significant takeaways from this development are that the United States allowed the U.N. resolution to be passed and the specific language included in the resolution. This is particularly true when coupled with the language used by Israeli Prime Minister Benjamin Netanyahu to criticize the United States and President Barack Obama after the resolution’s passage. Read on to find out what exactly the resolution means for Palestine, Israel, and the United States as well as the history of the conflict that led to the resolution in the first place.


The Resolution

So what does this resolution do and why has it made the leadership of Israel so upset? The U.N. resolution declares that Israeli settlements are a violation of international law and calls for an immediate end to all settlement activities. The actual determination in the resolution is nothing new; it has been the view of the international community for some time. What is new is that the Obama administration allowed it to pass without a veto, as well as the emerging context surrounding the dispute–many are now starting to doubt whether the long-sought two-state solution is still a viable option.

After the resolution was passed, Palestinian leaders indicated that they would use the resolution to support their case against Israel in international courts, a move strongly opposed by Israel. While the condemning language and the reaction of Palestinian leaders are significant, they pale in comparison to accusations leveled by Israeli Prime Minister Netanyahu against President Obama. Netanyahu has effectively accused Obama and his administration of plotting against Israel and even crafting the resolution in the first place, which the Obama Administration denies.

The resolution that did pass is not actually binding; while it may condemn Israel’s actions it cannot forcibly stop them. Additionally, President Donald Trump has already vowed to veto any resolution that would actually force Israel to cease and desist settlement activities. The video below looks at the U.N. resolution:


The History of the Conflict

The origins of the conflict between the two sides go back to the 19th century. Initially, the territory in question was part of the Ottoman Empire. However, during World War I, when it was clear that the Ottomans would lose, Britain and France drew up their own postwar plan for the region, known as the Sykes-Picot Agreement. This agreement effectively carved up Arab lands in the Middle East between France and the U.K., which went against earlier promises of an Arab state following the end of the war.

In 1917, the U.K. issued the Balfour Declaration, in which it announced its support for establishing a Jewish homeland in Palestine. British responsibility for creating a Jewish homeland was reaffirmed by the Mandate for Palestine in 1921, which gave Britain control over former Ottoman lands along the terms agreed upon in Sykes-Picot. However, neither the mandate nor the earlier Balfour Declaration mentioned anything about creating a Palestinian homeland, despite the wishes of the Palestinian population.

This resentment, coupled with earlier broken promises to create an independent Arab state and steadily increasing Jewish migration, led to persistent conflict. In the 1930s, Jewish militias helped the British put down an Arab uprising, hoping to win British backing for an independent Jewish state. Instead, Britain issued the White Paper of 1939, which limited further Jewish migration even as the Holocaust unfolded in Europe.

After World War II, Britain ended its mandate in the area, transferring the land, and the problems that went with it, to the United Nations. The U.N. then attempted a two-state partition that instead led to more fighting and eventually the first Arab-Israeli War, which ended with an armistice in 1949. Per the terms of the agreement, Israel took control of 77 percent of the original mandate; Jordan received the West Bank and East Jerusalem; and Egypt acquired the Gaza Strip. The Palestinians controlled no territory following the fighting. The war also led to a mass exodus of Palestinians and a huge refugee problem that continues to this day. The biggest flare-up of violence between Israel and its neighbors before 1967 was the joint British-French-Israeli effort to retake the Suez Canal after Egypt nationalized it in 1956.

While the roots of the general conflict clearly go back further, the modern conflict traces its most direct route to the 1967 war, known most commonly as the 6-Day War. After a series of Palestinian attacks from surrounding countries and Israeli retaliation, Syria, Jordan, Egypt, and Iraq began mobilizing their militaries. Israel then launched a surprise preemptive strike, destroying much of its adversaries’ air forces, and went on to win a decisive victory. As a result, Israel occupied the Sinai Peninsula, the Gaza Strip, the West Bank, all of Jerusalem, and the Golan Heights. In the process, hundreds of thousands more refugees were forced to leave their homes and more than a million Palestinians fell under the direct rule of Israel.

Another conflict emerged in 1973 when Egypt launched a surprise attack on Israel. The attack prompted the United States to step in and seek a diplomatic resolution. After several years, Israel and Egypt signed a peace treaty that included the return of the Sinai Peninsula to Egypt.

However, since 1967 there has been an almost unstoppable pace of Israeli settlement in the occupied territories. As of 2013, there were over 200 Israeli settlements and outposts in lands occupied since the end of the 6-Day War, namely in the West Bank and East Jerusalem. These settlements, even the outposts that the Israeli government itself considers illegal, are encouraged and supported by the government through subsidies and tax breaks on housing, education, and opening new businesses.

Apart from incentives, Israeli settlers also enjoy many other advantages over their Palestinian neighbors in the occupied territories. One is a separate legal system that greatly benefits settlers over Palestinians, who are instead governed by military law. Another is access to resources such as water, transportation, and electricity, which settlers get from the Israeli government. The settlements have led to perpetual conflict, despite numerous efforts at peace. The following video gives a good description of the roots of the conflict:


The United States, Israel, Palestine and the History of Peace Talks

Since the end of the 6-Day War, there have been several efforts aimed at achieving peace between Israel and Palestine and establishing some framework in which both peoples can have states of their own. This started with two other U.N. resolutions, 242 and 338–the former calling for Israel to withdraw from the territories it occupied, the latter putting an end to the 1973 war. Building off of the 1973 war were the Camp David Accords, which led to Israel withdrawing from the Sinai Peninsula and Egypt recognizing it as a state. But these talks did not involve the Palestinians.

The Madrid Conference in 1991 was aimed at similar goals, namely ensuring recognition of the state of Israel. Ultimately, it led to peace between Israel and Jordan, but none of the other combatants. While the Palestinians were represented at the Madrid Conference, the first deal to actually incorporate them was the Oslo Accords in 1993. In exchange for a promise to incrementally withdraw Israeli troops from Gaza and the West Bank, the Palestine Liberation Organization, or PLO, would acknowledge Israel’s right to exist. Opposition groups in Palestine and settler groups in Israel opposed the deal, which led to violence. The agreement was never fully implemented.

Probably the closest the two sides ever came to lasting peace was the second Camp David summit in 2000, when both sides offered land swaps; however, the offers were not quite enough to entice either party to agree to a peace deal. A last-ditch effort at Taba in 2001 and an Arab initiative in 2002 both failed as well. In 2003 President Bush submitted his road map to peace, becoming the first president to call for an independent Palestinian state. Another set of negotiations, the Geneva Accord of 2003, attempted to address the same problems from another direction; it, too, was unsuccessful. Two more rounds of talks, in 2007 and 2010, seemed close to reaching deals at times but both ultimately fell short.

This history led President Barack Obama to seek some positive action before his term ended. Without having to worry about reelection, he allowed the recent resolution to pass. While his actions are not unprecedented, they are still controversial. Other resolutions regarding Israeli-Palestinian relations have passed before, but this was the first to condemn settlements since 1980. Additionally, while Obama is not the first U.S. president to allow a resolution related to this conflict to pass without a veto, it was the first time he had done so in his presidency. The accompanying video looks at the peace process as it currently stands and the remaining inherent trouble:


Moving Forward

While the resolution is non-binding, it is not entirely toothless. What it does is create a template for future negotiations and potentially other resolutions that would be binding. While a January 15 International Peace Conference seemed to offer a forum to draft that kind of resolution, no such progress was made. Instead, the focus was mainly on reopening dialogue between the two parties and reiterating support for past ideas, such as a two-state solution and the return of land occupied by Israel to Palestine.

Aside from creating guidelines, the recent U.N. resolution also eliminated many of the legal arguments Israel could have used to justify settlements. The resolution may also lead to subsequent efforts to apply sanctions to Israeli goods made in the occupied territories or to bring Israel before the International Criminal Court.

President Trump has already denounced the resolution and promised to undo it. That seems improbable, however, as he would have to introduce a new resolution and, like the current one, get it through the Security Council without a single veto. Trump and the Israelis can, though, cut funding to the United Nations, which would be significant, as the U.S. supplies 22 percent of the organization’s budget. Israel has also gone after the nations that voted for the resolution, summoning their ambassadors. The U.S. ambassador, notably, was not among those targeted by Israel.

In the meantime, the settlements continue to be built and to expand further into occupied Palestinian lands. There are now 600,000 Jewish Israelis living in East Jerusalem or the West Bank, where once there were none. In other words, nearly 10 percent of the country’s Jewish population lives beyond the country’s pre-1967 borders, in territory recognized as Palestinian.

As a result, Palestinians view these settlements as an unjust seizure of land they would receive if a two-state solution ever came to fruition. The Israelis view the settlements as a necessary buffer and feel justified through scripture. They also contend that because Jordan, which once laid claim to these lands, is no longer interested in them, no sovereign power controls them. However, even Israel will not go so far as to claim the disputed lands in the West Bank as part of its own sovereign territory. Any solution will likely have to include land swaps, among other things–something Israel has shown it is not totally against, as when it abandoned its settlements in the Gaza Strip in 2005.


Conclusion

The issues in the Israeli-Palestinian peace process are not going to be resolved by one U.N. resolution. However, that was never the point as the resolution was non-binding. The idea behind the resolution was to create some type of momentum for negotiations–or possibly block the momentum of efforts that many believe run against the interests of a peace settlement. In this circumstance, the onus was put on Israel, as the international community sought to make a strong statement on settlement building.

The likelihood of reaching an actual deal depends on more than just these two countries. While the rest of the Security Council, and the world in general, have an interest, the United States has played a key role in many past peace attempts. This U.N. resolution then could signal an important step forward if all sides involved are willing to look past politics and are serious about achieving some sort of two-state solution. However, it appears unlikely that the incoming president will take the same line on Israeli settlement building, which could cause many to question the negotiation process given that most view settlements as an important obstacle to a lasting resolution.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What a Recent U.N. Resolution Means for the Israeli-Palestinian Peace Process appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/u-n-resolution-israel-palestine/feed/ 0 57884
The Unemployment Rate: What the Measure Tells Us https://legacy.lawstreetmedia.com/issues/business-and-economics/unemployment-rate-making-measure-work/ https://legacy.lawstreetmedia.com/issues/business-and-economics/unemployment-rate-making-measure-work/#respond Mon, 09 Jan 2017 14:25:04 +0000 http://lawstreetmedia.com/?p=57691

The frequently debated statistic measures something very specific.

The post The Unemployment Rate: What the Measure Tells Us appeared first on Law Street.

]]>
"Help wanted sign" courtesy of Andreas Klinke Johannsen; License: (CC BY-SA 2.0)

In November, the monthly jobs report released by the Bureau of Labor Statistics told us, among other things, that the unemployment rate dropped to 4.6 percent, the lowest it has been since 2007, before the Great Recession. While that number seems to speak glowingly of both the job market and the efforts of the Obama Administration, others contend the opposite. President-elect Donald Trump, for one, is not convinced by the statistic and claims the actual number is much higher. Who is right, are they both right, or are they both wrong? Read on to find out the backstory behind the unemployment rate and what it can tell us about the economy.


History of Unemployment and Methods to Address it

While policymakers attempted several ways of determining the unemployment rate in the early 20th century, the measure that exists today was not created until the 1940s, when the Census Bureau began administering the Current Population Survey. Estimates suggest that the unemployment rate reached an all-time high of 23.9 percent during the Great Depression. That was followed by a record low of 1.2 percent in 1944, during World War II. The lowest rate not during a time of war was 2.9 percent in 1953.

Since 1948, there have been 11 observed recessions, and presidents have used a variety of means to combat the resulting high levels of unemployment. During the second term of Harry Truman’s presidency, the first with reliable data, unemployment was very low except for a brief recession as the economy adjusted after the war. Truman left office with an unemployment rate below 3 percent. The rate rose during the following administration under President Dwight D. Eisenhower, but the Federal-Aid Highway Act of 1956, which paved the way for the Interstate Highway System, helped bring it back down.

President John F. Kennedy inherited an unemployment rate around 6 percent and was unable to do much to affect it before his assassination, despite expanding Social Security and cutting taxes. The story of Lyndon B. Johnson’s presidency was the complete opposite, with a large decrease in unemployment. This success under Johnson was the result of wartime hiring and new government projects from the War on Poverty, including Medicare and Medicaid. Following Johnson, the administrations of both Nixon and Ford saw continuously rising unemployment with the rate reaching a new post-war high of 9 percent in 1975. President Ford actually had the highest average unemployment of any president since data was officially collected, at 7.8 percent.

Jimmy Carter succeeded Ford and saw an initial decline in the unemployment rate. However, that was reversed following an oil crisis at the end of his term. The trend continued into the Reagan presidency, which saw the highest unemployment rate since the Great Depression, 10.8 percent, at the end of 1982. Nevertheless, President Ronald Reagan was ultimately able to cut that number in half by the time he left office. Overall, Reagan had the second-highest average unemployment rate, barely edging out Barack Obama. Taking the reins from Reagan was the first President Bush, who watched the unemployment rate rise steadily during his tenure.

In 1992 George H.W. Bush was replaced with President Clinton who, like Johnson thirty years earlier, saw the unemployment rate steadily decline. Under George W. Bush, the unemployment rate ticked up at the beginning of his presidency after the 9/11 attacks and a mild recession. It eventually ticked back down before starting to rise dramatically at the beginning of the Great Recession. This carried over into President Obama’s time in office, peaking at 10 percent in 2009 before steadily declining to where it now sits at 4.6 percent.


The Meaning of Unemployment

The unemployment rate is calculated in the hope of learning who does not have a job and why, to help policymakers understand the state of the economy and make informed decisions. The data for calculating unemployment come from surveys conducted by the Bureau of Labor Statistics. If a person has a job, they are counted as employed. The real complexity lies in the unemployed category, which can be measured in several different ways. If someone is looking for work but does not have a job, they are considered unemployed. However, if they are not looking for work, they are considered outside of the labor force and thus not included in unemployment figures. People living in institutions and those in the military are excluded from the survey.

The goal of the survey is to classify people age 16 and older into one of the two groups. Generally the divisions are pretty clear; however, there are a few gray areas. For example, people who are unpaid but work more than 15 hours a week for a family business are considered employed. For the jobless, the distinction turns on whether they have actively pursued a job within four weeks of the survey or would like a job but are not looking for one. Those who have looked for a job in the past four weeks and are available to work are considered unemployed. The labor force, for the purpose of the unemployment measure, consists of the employed plus the unemployed. Passive job searchers are not counted as part of the labor force.

Of those not included in the labor force, some are “marginally attached” workers–people who have looked for a job at some point in the last 12 months but not in the past four weeks. Within the marginally attached category is the subcategory of discouraged workers, who have stopped looking because they do not think they can get a job, either because they feel unqualified or for another reason related to the state of the job market. The rest of the people outside the labor force are generally there for another reason, such as attending school or taking care of a family member. In total, there are six measures of unemployment, ranging from U1 to U6. The headline figure is U3, which measures the number of people who do not have jobs and are actively looking as a percentage of the labor force.
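
To make the arithmetic concrete, here is a minimal sketch of how the U3 rate falls out of those classifications. The counts are made-up placeholders chosen so the result lands on the 4.6 percent figure cited above, not actual BLS data:

```python
# Hypothetical survey totals (illustrative placeholders, not BLS figures).
employed = 150_000_000           # worked for pay, or unpaid 15+ hours in a family business
unemployed = 7_300_000           # jobless, available, and searched in the past 4 weeks
not_in_labor_force = 95_000_000  # students, retirees, discouraged workers, etc.

# The labor force counts only the employed and active searchers;
# discouraged and other marginally attached workers fall outside it.
labor_force = employed + unemployed
u3 = 100 * unemployed / labor_force
print(f"U3 rate: {u3:.1f}%")  # U3 rate: 4.6%
```

Moving groups like discouraged workers into the numerator and denominator is, in essence, what the broader U4 through U6 measures do.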

The following video looks at exactly how the unemployment measure works:


How is Unemployment Measured?

Some assume the government uses the number of people on unemployment insurance each month, or surveys every household, to determine how many people are unemployed, but that is not actually how it works. Counting unemployment insurance would capture only people who are eligible for or have applied for benefits, so anyone outside those categories would be missed. On the other extreme, surveying every household every month, in a process similar to the census, is impractical.

Instead, the unemployment rate is measured using the Current Population Survey, which started in 1940 and was taken over by the U.S. Census Bureau in 1942. Each month, 60,000 households are eligible for the survey, organized into 2,000 geographic areas. The Census Bureau draws its sample from 800 of these areas in order to reflect the variety of people and job types across the United States. Each month, a quarter of the households in the sample are swapped out to ensure no household is interviewed for more than four consecutive months. These households are then taken out of the sample for eight months before being interviewed for another four-month period. In other words, about 75 percent of the sample remains the same from month to month and about 50 percent stays the same from year to year.
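
Those overlap figures follow directly from the rotation pattern just described: four months in, eight months out, four months back in. The toy simulation below (an illustrative sketch, not actual Census Bureau code) confirms the arithmetic:

```python
def months_in_sample(entry_month):
    # A household entering in `entry_month` is interviewed for 4 months,
    # rests for 8, then returns for 4 more (the 4-8-4 rotation design).
    return set(range(entry_month, entry_month + 4)) | \
           set(range(entry_month + 12, entry_month + 16))

def sample_at(month, cohorts=range(200)):
    # All entry cohorts that are in the sample during a given month.
    return {c for c in cohorts if month in months_in_sample(c)}

t = 100  # any month far from the simulation's edges
now = sample_at(t)
print(len(now & sample_at(t + 1)) / len(now))   # 0.75 -- month-to-month overlap
print(len(now & sample_at(t + 12)) / len(now))  # 0.5  -- year-to-year overlap
```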

Every month, the included households are contacted and asked questions to determine whether their members are employed, unemployed, or not looking for work. These interviews are done either in person or over the phone, generally during the week containing the 12th day of the month. During the first interview, demographic information is collected through a computerized database, which is then used to create a representative sample. Because this measure is derived from a survey and not a count of every person in the country, there is room for error. But the margin of error calculated by the BLS shows that 90 percent of the time the survey will yield an unemployment count within 300,000 of the result you would get by counting every single person. The figures are also seasonally adjusted to account for predictable swings in employment. In no case over the last decade has the margin of error been large enough to meaningfully skew the unemployment rate.
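
For a sense of scale, that 300,000-person bound amounts to a small fraction of a percentage point on the headline rate. A back-of-the-envelope check, assuming a labor force of roughly 160 million (a round figure in line with recent years, not a number from the report):

```python
labor_force = 160_000_000  # assumed round figure, for illustration only
error_people = 300_000     # the BLS's 90-percent confidence bound on the count
print(f"{100 * error_people / labor_force:.2f} percentage points")  # 0.19
```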

Neither the people asking nor those answering the questions actually determine what classification respondents fall under. Instead, classification happens when the answers are entered into the computerized form. Because comparable results are critical to measuring unemployment successfully, interviewers are extensively trained.


Criticism of the Unemployment Rate

While the unemployment rate has steadily declined over the last few years, and although the math behind it seems cut and dried, the measure still has its critics. These critics come from both sides of the political spectrum, from President-elect Trump to former presidential candidate Bernie Sanders. Their criticisms extend beyond the familiar complaint that discouraged workers should also be counted in the unemployment rate.

These critics point to another major flaw: the rate ignores underemployment. A person may have a job, but that does not mean they are fully employed–working a full-time job that can support them. For many such people, there simply are not enough full-time jobs available. Instead, if they are employed at all, they are often forced to cobble together multiple jobs or rely on the social safety net. Moreover, while the government does measure various forms of unemployment, only one, the U3 rate, tends to get most of the attention.


Conclusion

Since the unemployment rate is calculated from data using sophisticated sampling techniques, some might think the measure is beyond partisanship. That is not the case. While some of the criticism may be political, the unemployment rate itself deliberately excludes a large portion of people in order to measure a very specific thing. And while many criticize that definition, it is still important to measure the number of people who do not have a job and are actively looking for work.

Despite the ambiguity, purposeful or not, the unemployment rate has been one of the most consistent barometers of the health of the United States’ economy since the end of World War II. Better measures may exist or could be formulated, although the practicality of compiling more in-depth numbers from a population of more than 300 million is dubious. Candidates and activists can debate and denounce the merits of the unemployment rate, but for now we seem to be stuck with it, even if it does not take many of us into account.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Unemployment Rate: What the Measure Tells Us appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/unemployment-rate-making-measure-work/feed/ 0 57691
The Electoral College: What is it and Why Do We Still Have it? https://legacy.lawstreetmedia.com/issues/politics/electoral-college/ https://legacy.lawstreetmedia.com/issues/politics/electoral-college/#respond Tue, 27 Dec 2016 14:56:30 +0000 http://lawstreetmedia.com/?p=57641

Despite several changes, the Electoral College remains intact.

The post The Electoral College: What is it and Why Do We Still Have it? appeared first on Law Street.

]]>
"#298 i vote" courtesy of Kelley Minars; License: (CC BY-SA 2.0)

On December 19, electors gathered in state capitals to formally elect Donald J. Trump the next President of the United States. An event that typically garners little attention every four years had its time in the national spotlight this year, as many called for electors to defy the will of the voters and prevent a Trump presidency. While the effort to use the Electoral College to block Trump never panned out, there were more so-called faithless electors in 2016 than in any election in decades.

But before we can dig into the recent controversy surrounding the Electoral College, it is important to understand the system itself. Specifically, what exactly is the Electoral College, what is its purpose, and why is it the final arbiter in the election, not the popular vote? Read on to find out the answers to these questions and more.


History of the Electoral College

The history of the Electoral College goes back to the Constitutional Convention of 1787. It was at that seminal moment in American history that the Electoral College was settled on as the way to elect the President of the United States. Each state’s number of electors equals its number of senators plus its number of representatives. Today, there are 538 electors in total (one for each of the 435 representatives and 100 senators, plus the three given to Washington, D.C. by the 23rd Amendment), ranging from three in the smallest states to 55 in California. The number of electors in each state can change with every census, depending on population changes, but no state can have fewer than three electoral votes.

While each state’s elector count equals its combined number of representatives and senators, members of Congress and anyone “holding an Office of Trust or Profit under the United States” are not allowed to serve as electors. If no candidate receives a majority–270 votes–the House of Representatives decides the election. Parties in each state select the electors for their presidential candidate. In most states, this is done either through state party conventions or central committees; a few states employ a mix of other methods.

Election Day–which is held every four years on the Tuesday after the first Monday in November–is actually an intermediate step in the presidential election process. While voters cast their votes for a presidential ticket, they are actually choosing a slate of electors who, in the following month, will participate in the final election. The slate for the candidate who wins the most popular votes is elected; this is known as the winner-take-all, or general ticket, system. However, two states, Nebraska and Maine, do not follow the winner-take-all rule: there, two electoral votes go to the statewide winner and one to the winner of each congressional district, so electoral votes can be split among multiple candidates. Regardless of the methodology, once all the votes have been cast and tallied, Congress certifies the results on January 6 of the following year–2017 for the most recent election.
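
To see how winner-take-all turns ballots into electoral votes, consider the minimal sketch below. The elector counts are the actual 2016 apportionments for four states, but the candidate names and vote tallies are invented for illustration:

```python
ELECTORS = {"California": 55, "Texas": 38, "Florida": 29, "Ohio": 18}

def allocate(results):
    # results: {state: {candidate: popular votes}}.
    # Winner-take-all: the statewide plurality winner gets every elector.
    # (Maine's and Nebraska's district-based splits are ignored here.)
    totals = {}
    for state, votes in results.items():
        winner = max(votes, key=votes.get)
        totals[winner] = totals.get(winner, 0) + ELECTORS[state]
    return totals

results = {
    "California": {"A": 8_000_000, "B": 4_400_000},
    "Texas":      {"A": 3_900_000, "B": 4_700_000},
    "Florida":    {"A": 4_500_000, "B": 4_600_000},
    "Ohio":       {"A": 2_400_000, "B": 2_800_000},
}
print(allocate(results))  # {'A': 55, 'B': 85}
# A majority of all 538 electors -- at least 270 -- wins;
# otherwise the House of Representatives decides.
```

Note that in this toy example candidate A wins the four-state popular vote while candidate B collects more electors–the same arithmetic that produces the popular-vote/Electoral College splits discussed below.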

The video below gives an overview of the system and its history:


Changes in the Electoral College over time

The Electoral College system has changed little since its initial unveiling, aside from an adjustment due to the passage of the 12th Amendment in 1804. Before the 12th Amendment, electors in each state voted for two people (at least one of whom had to be from a different state than the elector) and the person with a majority of votes became the president while the runner-up became the vice president.

In the 1796 election, that system produced a president, John Adams, from the Federalist Party and a vice president, Thomas Jefferson, from the Democratic-Republican Party, because Federalist electors split their votes among multiple vice presidential picks. Then in 1800, the electors voted along party lines for both a president and a vice president, but due to the two-vote system there was a tie and the House was forced to determine the president. After the complexity of those two elections, lawmakers devised the 12th Amendment, which changed the Electoral College so that electors cast separate, distinct votes for president and vice president. That, in general, is the system used in the United States today.

The process of choosing the electors has also changed slightly from the initial procedure in many places. Originally, in several states, the state legislature would determine the electors, meaning that the public had no direct role in the presidential election process. However, that was changed as voting rights spread. In fact, since 1876, every state has used the popular vote to select electors.


Issues with the Electoral College

Naturally, for a system that has been around for 200 years, the Electoral College has dealt with its share of criticism. While electors are expected and have pledged to vote for their state’s popular vote winner, there are a few examples of electors going against the voters. In the last century, at least one example of this practice has occurred in the elections of 1948, 1956, 1960, 1968, 1972, 1976, 1988, 2000, and of course in 2016, which set a modern record. These people are commonly known as “faithless” or “unfaithful” electors. Although it has happened several times in the past, faithless electors have never actually influenced the outcome of the election. Some states have laws on the books to penalize faithless electors, although some argue that if challenged in court, such laws may be deemed unconstitutional.

Beyond faithless electors, the system has had one controversial moment that did end up deciding an election. Namely, in 1824 Andrew Jackson won the most electoral votes; however, he did not win a majority. As a result, the election was thrown back into the House of Representatives and the runner-up in the original election, John Quincy Adams, went on to be elected President of the United States. This was the first and only election where the candidate with the most electoral votes did not win the election. It was also the first time that the candidate with the highest share of the popular vote did not become president. The accompanying video looks at some of the issues with the Electoral College:


Electoral College vs the Popular Vote

A major recurring issue in American presidential elections is that the final outcome is decided by the Electoral College and not the popular vote. Generally this has not mattered, as the winner of one usually wins the other as well. There are only four instances in which the winner of the Electoral College lost the popular vote: 1876, 1888, 2000, and 2016 (in 1824, no one won a majority in the Electoral College and the House chose the president). President-elect Donald Trump’s popular-vote deficit this cycle–nearly 2.9 million votes–was five times larger than that of any other election winner in history. The results of this election, in particular, have led many to criticize the use of the Electoral College, which raises the obvious question: why does the popular vote not determine the winner?

The answer to that question starts with the first Secretary of the Treasury and George Washington’s confidant, Alexander Hamilton. In Federalist No. 68 he defended the system as a sort of compromise between an aristocracy and a democracy. While Hamilton and many of the other founders wanted a democratic nation, they also wanted an informed and level-headed electorate–qualities Hamilton did not believe the American people possessed at the time. He based this view on the downfall of classical democracies, as well as on an interest in states’ rights.

Namely, Hamilton wanted states without large populations to be accounted for and to have a say in the government. Without the Electoral College, one state with a huge population–California now, or Virginia in early U.S. history–could significantly sway the final outcome. This, in turn, would lead candidates to campaign in large states and population centers while ignoring the rest and their associated interests. Moreover, Hamilton wanted the Electoral College to ensure that a candidate could appeal to the entire country. Opponents of the current system counter that modern swing states tilt campaigns in much the same way.


Conclusion

After close elections, particularly those with a split between the popular vote and the Electoral College, many who supported the losing candidate tend to criticize the system. The most recent election featured a split that was very large by historic standards, making that sentiment even stronger. Ultimately, the Electoral College has survived since its inception over 200 years ago and is likely to survive in the future as well. While the system has had several tweaks over the years, the general framework remains intact.

The system is not perfect and simply relying on the popular vote may assuage people’s anger, at least if it benefits their favored candidate. In the meantime, there are other avenues for the disaffected, such as fighting laws that restrict access to voting or even encouraging more people to vote; in 2016 for example, only around 58 percent of eligible voters actually voted.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Electoral College: What is it and Why Do We Still Have it? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/politics/electoral-college/feed/ 0 57641
The Reality Behind Fake News https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/just-reality-behind-fake-news/ https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/just-reality-behind-fake-news/#respond Mon, 19 Dec 2016 14:15:48 +0000 http://lawstreetmedia.com/?p=57369

What can be done about fake news?

The post The Reality Behind Fake News appeared first on Law Street.

]]>
"Bush Love Letters to Condi" courtesy of F Delventhal; License: (CC BY 2.0)

In our present information environment, news comes from every direction, at every angle, all the time. With this overabundance of information, it is often hard to tell reality from fiction. That becomes especially difficult when opinion and fake news are injected into the media landscape. Fake news is far from a peripheral player, either; in fact, it is splashed across some of the most popular websites on the internet, like Facebook and Google. It may even have played a role in the outcome of the presidential election. Read on to learn more about the fake news phenomenon, its place in history, how popular websites made it mainstream, and the consequences of its rise.


The origin of “Fake News”

So what is fake news exactly? As its name suggests, fake news is literally made-up news about events that did not happen. In many cases, fake news is created by people around the world seeking to spread misinformation or to promote something and get rich doing it. One of the most egregious examples comes from a few writers in Macedonia who claim they made between $5,000 and $10,000 a month publishing fake stories. These people create extremely partisan pieces for the sole purpose of drawing the most eyeballs, because more traffic means more ad revenue.

But intentionally fabricating false stories isn’t the only way fake news spreads. It can also be the result of a person’s earnest yet inaccurate beliefs, as in one example chronicled by the New York Times. Eric Tucker, a Trump supporter, posted a picture on November 9 of what he believed were charter buses bringing in paid protesters to dispute the election. That was simply how he interpreted what he saw–and something he later determined was not true–but it did not stop his tweet from going viral. Tucker was a private citizen with a small Twitter following, yet his post was seized upon by Trump supporters and conservative websites to justify their belief in a conspiracy. The way individuals interpret an event, often without full information about what actually happened, has become increasingly important.


Facebook and Google

Two of the companies that do the most to spread (and profit from) fake news are Facebook and Google. So how are these two tech titans attacking the problem? Before that question can be answered, it is important to look at why these websites allow fake news in the first place. The issue of fake news on Facebook came to the forefront after a major incident earlier this year. In May, a member of the team that curated Facebook’s “trending news” section said the group regularly avoided featuring conservative stories. This admission created a political firestorm that led to the end of the trending news team and of news curation on the site altogether.

In its place, Facebook installed an algorithm to determine which news stories were being shared the most. However, shortly after its debut, the new section began elevating stories that were completely false. While the company still has some human overseers for the new trending section, they are told to exercise less editorial control over the articles that are featured, allowing many fake stories to slip through.
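Facebook has not published the details of that algorithm, but the failure mode is easy to illustrate with a toy sketch of engagement-based ranking. Everything below is hypothetical (the data, the field names, the velocity weighting) and is not Facebook’s actual system; the point is simply that a score built purely on shares has no term for accuracy, which is exactly the gap fake stories slip through.

    # A toy sketch of engagement-based trending, not Facebook's actual algorithm.
    # All data and scoring choices here are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Story:
        headline: str
        shares_last_hour: int
        shares_last_day: int

    def trending_score(story: Story) -> float:
        # Weight recent shares heavily so fast-spreading items surface quickly.
        return story.shares_last_hour * 10 + story.shares_last_day

    stories = [
        Story("Detailed policy analysis", shares_last_hour=200, shares_last_day=9000),
        Story("Fabricated viral claim", shares_last_hour=1500, shares_last_day=4000),
    ]

    # Nothing in the score measures accuracy, so the fabricated story ranks first.
    for story in sorted(stories, key=trending_score, reverse=True):
        print(f"{trending_score(story):>7.0f}  {story.headline}")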

While fake news on Facebook may not seem like a major issue on its face, a poll conducted by the Pew Research Center found that 44 percent of Americans get news on Facebook. In another, more recent poll, Pew found that nearly two-thirds of Americans believe fake news created confusion about basic facts. Facebook and other social media sites provide a way for articles to quickly go viral and reach a remarkably large audience. While most agree that the spread of fake news is a problem, finding an appropriate solution is not particularly easy. Facebook has been cautious in its response out of fear of censoring legitimate news outlets or once again projecting an anti-conservative bias.

How Companies Have Responded

The nature of Facebook’s business makes fake news a difficult issue to approach. At its core, Facebook relies on selling advertising against its large user base. If the site eliminated fake news, it could risk seeming biased, or alienating people and losing their engagement and the lucrative advertising revenue that comes with it.

Despite this challenge, Facebook has said that it plans to address fake news. The CEO, Mark Zuckerberg, has stated that Facebook is already working on blocking or flagging completely false articles and recently announced a partnership with third-party fact-checking sites to help accomplish that goal. Ultimately, Facebook and other companies must walk a fine line. The most blatantly false news stories may be somewhat easy to identify, but in an era of polarized politics even some facts are contested, making it hard to create a clear rule.

For Google, the approach is slightly different because its search engine is predicated on reliability–if it is just showing fake news articles it would lose the trust of its users. However, that is not to say Google has avoided fake news altogether. The most significant example of fake news on Google is a result of the way the search engine ranks results. While search results often feature articles from the company’s curated Google News section, the “Top Stories” at the top of many search results include a broader range of articles that in some cases include fake news. It is particularly confusing because when you click to “read more” articles, it takes you to the Google News section, which is editorially vetted. This stems from the fact that Google Search and Google News are viewed as separate entities by Google. This distinction really becomes problematic because Google News does not accept ad revenue whereas Google Search does. A similar issue exists on Google’s mobile platform, which features AMP stories–web pages that are optimized to load almost instantly on mobile devices–at the top of the results page. This is yet another way for fake news to sneak into the top of the results page.

Google uses an algorithm to weed out spam and fake news websites, although it is not 100 percent foolproof. In light of the recent debate, Google has promised to fight fake news by restricting fake news sites’ access to its AdSense platform, which is often their source of revenue–fake news sites make money by generating a lot of traffic and serving viewers ads, often using Google’s advertising tools. Facebook also made a similar move to prevent fake news sites from using its advertising network.

The following video looks at fake news online and what companies are doing to stop it:


The Impact of Fake News

As many realize the extent to which fake news has spread online, some wonder whether it could have impacted the outcome of the recent election, as news reports indicate that fake news tends to have a conservative bias. Although it is impossible to show the exact impact of fake news on the election–and although Mark Zuckerberg dismissed the notion that fake news was consequential in the election–widespread false information is almost certain to have some sort of impact on people. In fact, according to an article from Buzzfeed News, there was actually more engagement with the top fake election news articles on Facebook than with the top content from traditional media sources in the last three months of the campaign. But, like many factors used to explain the election results, it’s impossible to say whether or not fake news actually tipped the election one way or another.

The video below features a PBS NewsHour discussion of fake news and its potential impact.

The effect of fake news has also been felt outside of the United States. An example would be in the Philippines, where a spokesperson for the president posted graphic images to justify the country’s violent campaign against drug dealers, even though fact checkers later realized that the images were actually taken in Brazil. Fake news also spread widely in the lead up to elections in Indonesia and a fake article about the Colombian peace deal with the FARC went viral shortly before the referendum vote. The problem was so disruptive that some African nations shut down social media sites after unconfirmed security threats spread before elections.

Fake news certainly has precedent in the United States. From the late 1890s through the 1920s, a phenomenon known as Yellow Journalism reigned. During that period, competing newspapers published sensational and often false stories, each more lurid than the last, in an effort to win eyeballs. The scourge of Yellow Journalism became so bad at one point that many believe it contributed to the Spanish-American War of 1898.


Conclusion

So what is to be done about this problem? Some suggest that Google and Facebook could help create a crowd-sourced list of news stories that can be peer-reviewed. Others argue that big companies should not have the power to determine what is true. Recent efforts to reduce fake news sites’ access to the biggest advertising networks may help get rid of their financial incentives, but alternative ad networks may not follow suit.

The example of Yellow Journalism may also be a model to look at. The exaggerated and fabricated news stories at the turn of the 20th century were ultimately undone by waning public interest, court cases that protected the privacy of individuals, and a code of ethics adopted by many newspapers. But in a modern news environment centered on the internet, with the abundance of media that comes with it, it may be difficult to weed out these stories altogether.

In the meantime, identifying fake news is a case-by-case effort that requires everyone’s diligence. It demands a balancing act: separating reality from fiction, but also tolerating information you may not agree with and staying skeptical of information that confirms your existing beliefs. Efforts of this nature are already underway on the platforms where most fake news is found. Now it is up to readers to determine whether what they see is legitimate. If anything, the rise of fake news may drive people to become more critical news consumers.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

Raqqa: Behind the Effort to Retake ISIS’s Capital https://legacy.lawstreetmedia.com/issues/world/raqqa-isis-syrian-capital/ Mon, 05 Dec 2016 14:15:48 +0000

The importance of Raqqa and the obstacles to retaking it.

"Ar Raqqa - Government building" courtesy of Beshr Abdulhadi; License: (CC BY 2.0)

As Iraqi Security Forces, with the help of U.S. air support and the Kurdish Peshmerga, continue the effort to retake Mosul, a major city in Northern Iraq, a new push is underway in Syria. This push is led by the American-backed Syrian Democratic Forces (SDF) and is targeting the ISIS capital of Raqqa in Northern Syria. In addition, the United States and Turkey agreed to help develop the plan to take and then hold the city once ISIS has been defeated. Read on to find out more about the importance of Raqqa to ISIS, the SDF, the role of the United States and Turkey, and the consequences that taking back the city could have.


ISIS’s Conquest of Raqqa and Life Afterward

Raqqa was actually the first major city in Syria to be freed from regime control during the Syrian civil war. However, like many of the gains made in the wake of the Arab Spring, those in Syria were quickly co-opted by extremists. Initially, the struggle was between local activists and the al-Nusra Front, both of which were attempting to fill the void left by regime forces. After the city was taken from the Assad regime, the Free Syrian Army rebels and the extremist groups competed for political control.

While these two groups were bickering, ISIS moved in and swiftly forced out the al-Nusra Front, setting the stage for its own showdown with the rebels. The Free Syrian Army, which was actively fighting the Assad government, did little to confront ISIS as it took control of the city and began a brutal crackdown on the residents there.

As in other places under ISIS control, life in Raqqa has been extremely harsh. It started with violent executions and crucifixions in public spaces. Next, schools were closed, drinking and smoking were forbidden, and women were forced to adhere to strict dress codes or face violence. Children were also abducted and forced into ISIS’s ranks. ISIS fighters, on the other hand, particularly those from western nations, have had access to luxury goods. The accompanying video looks at life under ISIS in Syria:

Raqqa’s Importance to ISIS

Along with Mosul, Raqqa is one of just a few major cities that remain under ISIS control. Raqqa also serves as the group’s capital, making it a particularly important target for degrading the group’s ability to carry out attacks outside of the shrinking area that it controls. While Raqqa and Mosul are both very important to the group, Canadian Brigadier General David Anderson recently said, “I think that probably Raqqa matters more.”


Efforts to Retake Raqqa

The group leading the assault on Raqqa is the Syrian Democratic Forces, or SDF, in an operation dubbed “Euphrates Rage.” The SDF is a coalition of militias made up primarily of Kurdish, Sunni Arab, and Syriac Christian fighters. While the group is a hodgepodge, it is dominated by the Kurdish army in Syria (the YPG) and its all-female units (the YPJ). The coalition is also supported by American airpower, as it was in the Battle of Kobane, where ISIS was handed its first battlefield defeat.

While the Kurds are the main actors in this group, the United States also has hope that Sunni Arab militias will be able to play an important role in efforts to take ground from ISIS. The United States has selected a few of the militias to support its efforts, dubbed the Syrian Arab Coalition. The hope is that these groups can continue the fight against ISIS when the Kurds are no longer willing or when they enter territory where their presence creates political complications.

The SDF has also set up its own political party, the Democratic Syrian Assembly or DSA, which incorporates both Kurdish and Sunni Arab elements. The assembly also allows the United States to interact with the Kurds while providing a buffer between the U.S. and the PKK, the Kurdistan Workers’ Party in Turkey, which is designated as a terrorist group by the United States. The video below looks at the SDF and the gains they have made:

The United States and Turkey

Balancing the relationship between the NATO allies and the SDF fighting on the ground has been difficult. Currently, the SDF is the only legitimate force on the ground with any hope of pushing ISIS out of Raqqa. Unfortunately, the group is also closely linked with the YPG, which Turkey considers a terrorist organization.

Unsurprisingly, following on the heels of the SDF’s announcement, the Department of Defense announced an effort to forge a long-term plan that incorporates Turkey into any attempt to retake the city. The plan will cover not only the retaking of Raqqa but also holding and subsequently governing the city. The main discussion currently is over the makeup of the forces involved in the attack. The U.S. and Turkey are both pushing for more local fighters, which they hope will make for a more stable government when the city does ultimately fall. While the parties involved are working on some sort of post-ISIS solution, it is important to understand how politically and militarily difficult it will be to take and govern the city.

The following video looks at the difficult relationship between the SDF, Turkey, and the United States:


Impact

While Turkey may be the greatest concern, it is certainly not the only one when considering Raqqa after ISIS. As is the case in Mosul, the impact of ISIS losing a major city will reverberate beyond the city itself. This will be particularly true if the group loses both cities, as it will no longer hold a substantial population center. What will their next move be when they have no city-sized safe haven from which to launch attacks?

Read More: The Battle for Mosul: The Fight for ISIS’s Stronghold in Iraq

While ISIS forces are being beaten back in Mosul, they have become entrenched in smaller groups around the city, planning to survive the offensive and continue fighting as part of an insurgency. It is worth noting that in the fight for Mosul, ISIS has the luxury of retreating to Raqqa, but if Raqqa falls there is no such option.

In addition to ISIS itself, there are also the three principal actors in the effort: the Kurds, Turkey, and the United States. As mentioned previously, the United States has already announced a plan to include both the Turks and the Kurd-dominated SDF in taking and later governing the city. However, the details of this plan have not been revealed, which may be troubling to those familiar with the history of secret deals over the governance of parts of the Middle East.

Additionally, the Assad regime, the Russians, and the Iranians play an important role in the conflict. Although these groups are not involved in the planning and assault on Raqqa, so far at least, if ISIS lost the city it would change the nature of the fight in Syria. Instead of having ISIS to keep them occupied, the allied powers could then shift their focus to Assad. This could lead to any number of things, from more concerted peace talks to a full-on proxy war between the Assad regime’s supporters and the U.S. and its allies. The only certainty seems to be that if and when ISIS is pushed out of Raqqa, a power vacuum will be created and someone will have to fill it.


Conclusion

The SDF recently announced its intention to take ISIS’s capital, Raqqa, coinciding with the push to remove the group from Mosul. However, this is much easier said than done. Not only is the geography different, but the needed troops are also not as readily available. In addition, the competing political concerns in Syria may be even greater than those in Iraq.

While these competing interests play out, people in ISIS-controlled areas continue to be slaughtered. Groups like Raqqa IS Being Slaughtered Silently have regularly documented extreme examples of repression under ISIS’s rule. It is because of this reality that the United States has pledged to act; however, sorting out the political challenges has slowed those efforts.

Along with appeasing the interests of its allies, the United States must also figure out the next step in its relationship with the Assad regime and its foreign backers. The taking of ISIS’s last major stronghold offers an opportunity for greater dialogue between the two sides, but also an avenue for direct conflict if peace cannot be achieved. Even if both Mosul and Raqqa are taken from ISIS, the group’s ideology is not likely to be eliminated completely. All of those involved must figure out what the future of Syria will look like before another group steps in to take up ISIS’s mantle.

Michael Sliwinski

After Calais, Europe is Still Struggling to Deal with Refugees https://legacy.lawstreetmedia.com/issues/world/rumble-jungle-end-refugee-camp-means/ Fri, 25 Nov 2016 14:00:25 +0000

Europe's ongoing struggle to deal with the refugee crisis has fueled a political backlash.

Image courtesy of malachybrowne; License: (CC BY 2.0)

On October 25, France began dismantling the infamous migrant camp in the city of Calais nicknamed “the Jungle.” The camp was home to thousands of migrants and refugees and has been a source of division and animosity for the surrounding area. However, it is not just Calais that will be affected. In fact, the decision to shut down the camp actually speaks to larger trends both in France and in Europe at large. As the migrant crisis continues, many countries in Europe have had a hard time accommodating the influx of people.

Read on to find out more about what exactly is happening in Calais, where the refugees living there are headed next and how this all fits into the larger backdrop of national and continental politics.


Background

Migrants began settling in Calais, France as early as 1999. The camp survived several closure attempts, including one earlier this year. All the while the population grew, totaling more than 9,000 people, according to recent reports. Many people settled there on their way to the UK, as the camp is located near the Channel Tunnel between the two countries.

Read More: The “Great Wall of Calais”: The UK’s Controversial Plan to Stop Migrants

What’s Happening Now?

When authorities decided to tear down the camp, the next question was what exactly that meant for its inhabitants. Before the camp was cleared, there were thousands of people living there and at least 70 operating businesses. The plan is to move all these people and whatever they can carry with them to several sites across France. To expedite this process, the camp’s inhabitants were broken up into four groups: adult men, families, minors, and other vulnerable groups. During the removal process, conflicts and fires broke out as some were reluctant to uproot.

The video below depicts the demolition of the Jungle:


Where are the Refugees Going?

Those leaving the camp were transported by bus to more than 450 individual reception centers across the country. These centers are generally abandoned hospitals, hotels, and army barracks located in many small towns. Once there, refugees are able to apply for asylum, but if their requests are denied they will face deportation. Not everyone was forced out right away: unaccompanied children were allowed to stay in converted shipping containers as the rest of the camp was taken down. Days later, the remaining children were moved to various reception centers throughout the country. But NGOs have warned that since being resettled, many children are living in unsuitable conditions and are being forced to work.

Refugees there have already demonstrated a clear determination to stay put if possible, with the ultimate goal of making it to England. England is currently set to accept some 200 children from the camp who have proven relatives in the UK, although it has promised not to accept any more.

Breaking down the Calais camp has also reignited the debate over immigration and refugee settlement. In England, politicians have been resistant to accepting more refugees even as French President Hollande asks them to take on a greater share. British politicians have been steadfast in their refusal; some have even called for dental exams to prove that children claiming refugee status are indeed children and not adults. And many small-town residents in France have taken to the streets to protest the settlement of refugees in their communities.


Political Impact

At the forefront of the protests in France is the Front National, a nationalist political party led by Marine Le Pen. Le Pen’s party has spearheaded efforts to protest the settlement of immigrants in small towns. However, Le Pen’s party is certainly not alone. This development is emblematic of a trend across Europe, where far-right parties, which count opposition to immigration among their central tenets, are on the rise.

Read more: Right-Wing Groups in Europe: A Rising Force?

This includes countries like Greece, Hungary, and Poland where dissatisfaction with the EU and the rising number of migrants has led to far-right parties securing large portions of parliament and in some cases the governing coalition. Some of these groups, such as the FIDESZ-KDNP in Hungary, have gone even further, espousing anti-Semitic views and seeking to criminalize homosexuality.

This rise is not solely confined to the poorer eastern portions of Europe; several nations, including France, have seen a growing backlash against immigration and immigrants. For example, in Sweden, often held up as a gold standard of liberalism, the rise of the far-right Sweden Democrats, a party that strongly opposes immigration, led to the formation of a tenuous coalition government between the Social Democrats and the far-left Green Party.

In the upcoming elections in Germany, a far-right party may gain seats in parliament for the first time since World War II. Following mass reports of sexual assault last New Year’s Eve in Cologne, the Alternative for Germany Party, which has hard-line positions on immigration and strongly opposes Islam, grew in popularity. Perhaps the most significant example is in Austria, where the leader of a nationalist party has a very realistic chance of winning the presidency in the December runoff election. If successful, he would be the first far-right head of state elected in Europe since World War II. Migration also played a prominent role in the UK’s decision to leave the European Union earlier this year.


European Refugee Crisis

Much of the reaction to the refugees in Calais is actually part of a larger response to the wave of immigrants arriving in Europe in general. Europe has several demographic factors that make it an ideal destination for immigrants, namely a shrinking native population and an increasing need for caretakers as its population ages. In addition, in terms of personal safety and economic prospects, many migrants see Europe as a significant improvement relative to their home countries.

In 2015, more than a million people arrived in Europe seeking asylum. Of those, about 476,000 have applied for asylum in Germany. While Germany received the most in total, on a per capita basis Hungary, Sweden, and Austria have received more. Not coincidentally, those three have seen a notable rise in far-right parties, all with platforms seeking to dramatically curtail immigration.

In Slovakia, Macedonia, and Hungary, border walls have been erected to prevent migrants from getting through. France, Germany, Austria, and Sweden, several of the most popular destinations, have instituted border checks. Norway has gone perhaps the furthest, actually confiscating migrants’ valuables to pay for their care. Aside from these individual efforts, the EU as a whole has also worked out a deal with Turkey in which, in exchange for billions in aid and renewed consideration of that country’s EU application, Turkey will prevent more migrants from entering Europe. The following video looks at the migration crisis in Europe:


Conclusion

What tearing down the Jungle actually means is unclear at this point, particularly because it has been tried before, yet the camp remained in place for almost 20 years under a range of politicians. What is more telling is the spirit behind the most recent decision to tear down the camp. While refugees are being offered the opportunity to be resettled, many migrants may not be granted asylum and will likely face deportation. Moreover, the situation in Europe has dramatically changed as far-right political parties see their influence and popularity surge.

The refugee crisis has engulfed the continent. While many refugees were at first met with open arms, the mood has shifted and many places are now erecting barriers and denying entrance. This has coincided with a rise of far-right parties across the continent (as well as anti-immigrant and anti-refugee sentiment in the United States). Tearing down the Jungle, if it lasts this time, is as much symbolic as anything. However, the exact message being sent, whether hostile or not, remains unclear. The important thing to watch now is how those who lived in the camp are resettled and how residents react to an influx of refugees.


Resources

CNN: Calais ‘Jungle’: Demolition of Massive Migrant Camp Begins

Law Street Media: The “Great Wall of Calais”: The UK’s Controversial Plan to Stop Migrants

NBC News: France Begins Evicting 6,000 Migrants From ‘Jungle’ Near Calais

Vox: France’s ‘Jungle’ Refugee Camp is Being Dismantled and Residents may have Nowhere to go

Reuters: ‘A Lot of Controversy’ Around Resettling Calais ‘Jungle’ Refugees

Law Street Media: Right-Wing Groups in Europe: A Rising Force?

The New York Times: How Far Is Europe Swinging to the Right?

BBC News: Migrant crisis: Migration to Europe Explained in Seven Charts

BBC News: How is the Migrant Crisis Dividing EU Countries

Michael Sliwinski

Feeling Okay? The History of the Flu and Flu Vaccines https://legacy.lawstreetmedia.com/issues/health-science/story-behind-the-flu/ Sun, 20 Nov 2016 15:46:34 +0000


Image courtesy of KOMUnews; License: (CC BY 2.0)

Fall has started and along with it comes several long-anticipated events like football season, changing weather, and Thanksgiving. But there’s something else associated with this time of year that no one is looking forward to–flu season. Despite being seemingly innocuous, the flu is one of the greatest scourges in the history of mankind and is still a potent killer. It has also given rise to a billion-dollar vaccine industry bent on stopping it.

Read on to find out more about the history of the flu, the flu vaccine, and the business that it has spawned.


The History of the Flu

Human beings have been victims of the flu, or influenza, for as many as 6,000 years. While no precise date is readily available, it is believed that once humans started to domesticate animals they also started acquiring the flu from them, as many animal species carry flu strains. The name “influenza” is Italian for “influence”; it caught on in eighteenth-century Italy, where outbreaks were blamed on the influence of the cold and the stars.

Although the existence of the flu has been known for centuries, it is only within the last hundred years that its cause has been clearly identified. In 1918, a veterinarian discovered that a disease found in pigs was similar to one found in humans. In 1928, other researchers proved, again through experiments on pigs, that the mysterious killer influenza was caused by a virus. Still, it was not until 1933 that scientists finally identified the specific virus that causes influenza in humans.

The video below gives an overview of the history of the flu:


Types of Flu

Although the flu is commonly referred to as a monolithic thing, it is actually a family of related viruses. There are three major types that infect humans: A, B, and C. The A type is the one that causes major outbreaks leading to widespread deaths, and it is further divided into subtypes based on two surface glycoproteins, hemagglutinin (H) and neuraminidase (N), the antigens to which a host develops immunity; this is where designations like H1N1 come from. There is also a D type, which primarily infects cattle and is not known to harm humans.

The flu is so deadly because of its genetic makeup. Since the genetic code of the influenza virus is made of RNA and not DNA, the virus replicates very quickly and is more prone to mutations. Thus, viruses can change numerous times before a human, for example, can even build up an immunity to the original virus. This change happens through two processes. The first, called antigenic drift, occurs when mutations gradually alter the virus over time, eventually making it unrecognizable to immune systems. The second, called antigenic shift, involves a dramatic change in the composition of the virus, such as combining with an animal subtype, which is often the process that leads to pandemics.
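A deliberately crude sketch can make the difference between the two processes concrete. This is a toy model rather than real virology, and every detail below is an assumption made for illustration: a virus is reduced to a short string of antigen sites, drift substitutes one site per season, and shift swaps in a large block from an animal strain all at once.

    import random

    random.seed(1)
    SITES = 20  # hypothetical number of antigen sites on the virus surface

    def drift(virus):
        # Antigenic drift: one small random substitution per season.
        i = random.randrange(SITES)
        return virus[:i] + random.choice("ACGU") + virus[i + 1:]

    def shift(virus, animal_virus):
        # Antigenic shift: abruptly acquire a large segment from an animal strain.
        return virus[:SITES // 2] + animal_virus[SITES // 2:]

    def recognition(memory, virus):
        # Fraction of sites the immune system still recognizes from the old strain.
        return sum(a == b for a, b in zip(memory, virus)) / SITES

    human = "".join(random.choice("ACGU") for _ in range(SITES))
    memory = human  # immune memory of the original strain
    for season in range(1, 6):
        human = drift(human)  # recognition slowly erodes
        print(f"season {season}: recognition {recognition(memory, human):.0%}")

    swine = "".join(random.choice("ACGU") for _ in range(SITES))
    print(f"after shift: recognition {recognition(memory, shift(human, swine)):.0%}")

In the toy model, drift chips away at recognition a little each season, while shift collapses it in a single step, mirroring why shifts are the events associated with pandemics.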

The flu generally hits elderly people, those with asthma, pregnant women, and children the hardest. For anyone who has had the flu before, the symptoms are familiar: fever, chills, coughing, sore throat, achiness, headaches, fatigue, vomiting, and diarrhea. The virus is usually transmitted through the air via respiratory droplets, but can also move through physical contact. Some people who get the flu are asymptomatic, meaning that while they have the flu they do not experience the typical symptoms, yet they can still get others sick. The flu also triggers several related complications, including pneumonia and sinus and ear infections. It can worsen existing medical conditions, such as chronic pulmonary diseases, or cause heart inflammation.


Deadliest Strains

While the flu is perceived as commonplace and not particularly dangerous today, it is still one of the deadliest viruses in human history. Between the 16th and 18th centuries, there were a number of massive and deadly outbreaks. Since 1900 there have been four major flu pandemics. The Asian flu lasted from 1957 to 1958 and killed one to four million people. The Hong Kong flu circulated from 1968 to 1969 and killed one million people. The third was the Swine flu, or H1N1, which broke out in 2009. The greatest outbreak by far, though, was the Spanish flu that broke out in 1918, at the end of World War I. The epidemic killed as many as 50 million people worldwide, more than the war itself.

The accompanying video looks at the deadly 1918 pandemic:

Aside from these major outbreaks, the flu remains a virulent threat. Although it is hard to pinpoint exactly how many people die each year from the flu, the CDC estimates that more than 55,000 people died from influenza and pneumonia in 2015. But that is an estimate and the numbers often vary. An earlier estimate for the flu alone, by the CDC, put the yearly average somewhere between 23,000 and 33,000. The discrepancy is caused by outliers in yearly totals and different strains that respond to the flu vaccine differently.


The Flu Vaccine

If someone catches the flu, there is little that can be done for them. Infected people can take over-the-counter remedies and in certain cases can even be prescribed antiviral medications, although many strains of the virus have grown resistant to such treatments. Generally, the only way to consistently ward off the flu is to prevent it in the first place with a flu vaccine.

Developing the flu shot has been a long process and one that is still in progress. The first step was on the heels of two important discoveries–scientists managed to grow the flu virus in eggs for the first time in 1931 and were able to isolate the virus itself in 1933. While Louis Pasteur was the first to actually attempt to make a flu vaccine, it was a Soviet researcher in 1936 who developed the first prototype. While this vaccine was used in the former USSR for 50 more years, it had the drawback of using a live strain of the flu.

However, scientists quickly overcame this by using killed, “inactivated” virus in vaccines instead. In 1940 a new problem arose when a second type of the flu, influenza B, was discovered, leading to the bivalent vaccine in 1942, which targeted one A and one B strain. The next major step in the development process occurred in 2007, when the source of the virus for vaccines moved from hen eggs to cell cultures, making production and sterilization easier.

On top of the bivalent vaccine, trivalent and quadrivalent vaccines were developed, containing multiple A and B strains. Vaccines typically change each year because the virus itself mutates from season to season, often making old vaccines ineffective. Strains of the virus are actually monitored all year long, with the Northern Hemisphere monitoring what is circulating in the South and vice versa. When the prevailing strains are identified, a vaccine is tailored to them. Additional vaccines with other strains can also be created in emergencies. This system came about as a result of a WHO recommendation in 1973. Since 1999 WHO has issued two sets of vaccine recommendations each year, one for the Northern Hemisphere in February and one for the Southern Hemisphere in September.

The video below explains how the flu shot works:


The Business Side

Developing a flu vaccine and then redeveloping it each year to fight the different strains of the flu virus has been a long and arduous task. An estimated 171 to 179 million doses of the vaccine were created for the United States in 2015 alone. That amounts to a $1.61 billion industry in the United States and roughly a $4 billion one worldwide.
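Those figures imply a rough price point. A quick back-of-the-envelope calculation, using the midpoint of the reported dose range (an assumption made purely for illustration):

    # Back-of-the-envelope arithmetic from the figures cited above.
    low_doses, high_doses = 171e6, 179e6   # estimated U.S. doses produced in 2015
    us_revenue = 1.61e9                    # reported size of the U.S. market
    mid_doses = (low_doses + high_doses) / 2  # 175 million doses

    # 1.61 billion dollars over 175 million doses is roughly $9.20 per dose.
    print(f"implied revenue per dose: ${us_revenue / mid_doses:.2f}")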

With an industry this large, it is fair to ask whether the pursuit of profits has overwhelmed the pursuit of health. Roughly 44 percent of Americans received the vaccine in 2015 and the shot is considered the best way to fight the flu. But because of the difficulty of matching the vaccine to the dominant strains, it is only 50 to 60 percent effective. Furthermore, there are different types of vaccines sold depending on how many strains the shot will protect against.


Conclusion

Each year, millions of people are infected with the flu and thousands or even tens of thousands die. It took centuries to identify the virus and much of what we know about the virus was discovered in the last hundred years. Given the nature of the virus and the rate at which it mutates, vaccines often have a hard time keeping up. The international community has developed a sophisticated monitoring system to identify and track new strains of the virus to ensure that vaccines are as effective as possible. But because of the frequent changes, new vaccines must be developed each year, prompting the development of a substantial industry.


Resources

CDC: Deaths and Mortality

CDC: Seasonal Influenza, More Information

WHO: Influenza: Surveillance and Monitoring

NPR: How Many People Die From Flu Each Year? Depends How You Slice The Data

Medical Ecology: Influenza

CNN: Getting a Flu Shot? It may be Better to Wait

The History of Vaccines: Influenza

Medscape: The Evolving History of Influenza Viruses and Influenza Vaccines

CNBC: The $1.6 billion Business of the Flu

Flucelvax: History of the Flu Virus and Influenza Vaccination

Michael Sliwinski

Judging a Book by its Cover: The History of Racial Profiling in the United States https://legacy.lawstreetmedia.com/issues/law-and-politics/judging-book-cover-legacy-racial-profiling/ Fri, 18 Nov 2016 19:37:42 +0000

What exactly is racial profiling and does it work?

Image courtesy of Michael Fleshman; License: (CC BY-SA 2.0)

During the campaign, President-elect Donald Trump gave a speech in the wake of two bombings in New York and New Jersey. In it, Trump said that police should have the ability to profile suspects in order to target individuals and catch criminals faster. While people quickly debated what exactly Trump was calling for, racial profiling or criminal profiling, his comments immediately stirred debate over the questionable practice.

Read on to find out more about the history of racial profiling, how it is still used, its effectiveness, and the impact it has on individual freedom.


History of Racial Profiling

According to the ACLU, racial profiling is “the discriminatory practice by law enforcement officials of targeting individuals for suspicion of crime based on the individual’s race, ethnicity, religion, or national origin.” Racial profiling is closely associated with and only narrowly differentiated from criminal profiling which “is the reliance on a group of characteristics they believe to be associated with crime.”

In criminal profiling, the cumulative characteristics of people who have committed a crime are used to identify those who may be likely to commit the same crime. However, racial profiling assumes any member of a racial or ethnic group of people may commit a crime because of who they are.

In addition, part of racial profiling is willfully overlooking members of the majority when they commit crimes. The ACLU cites the following example:

An African American man in Maryland who, after moving into a white community, was attacked and subjected to property damage. Local police failed to respond to his repeated complaints until they arrested him for shooting his gun into the air, trying to disperse a hostile mob outside his home.

The accompanying video looks at the practice of racial profiling and what it means:

Racial profiling in the United States traces its roots all the way back to colonial times. One of the earliest examples was a registry in which free blacks were required to enroll. The registry kept track of a number of physical characteristics as well as how each person came to be free. The idea behind it was to limit the movement of free black people around the South. If they were unable to prove their status, they could even be forced into slavery. This kind of targeting reemerged during the Jim Crow era and continued throughout the Civil Rights Movement and into the present. Often when it comes to racial profiling, the discriminatory practices are not written down in a record but implied.

The closely associated criminal profiling also has a long history, dating back to the 1880s in England, when experts tried to track down the elusive Jack the Ripper. Profiling in the United States began gaining momentum in the late 1950s, with profiles contributing to the arrest of suspects in high-profile crimes. In 1974 the FBI launched its Behavioral Science Unit, utilizing profiling techniques to locate serial rapists and murderers. Over the years psychology has taken a major role in these profiles, as certain common traits are identified in many of the cases and used to pinpoint other offenders. While this approach is used more to identify specific individuals guilty of specific crimes, it also creates a template for future investigations to use. Nevertheless, these same profiles cannot be overly broad generalizations, or they risk becoming another form of racial profiling.


The Use of Profiling

In his speech, one of the points Donald Trump alluded to was Israel’s use of racial profiling and the success it has had with it. While many other Western nations have shunned the practice, Israel has readily adopted it as a means of protection. This is especially true in airports where people with Jewish last names or links to Israel are able to quickly move through security while those from other regions, particularly from predominantly Muslim regions, are often held up for hours for intense inspections.

While the United States, for the most part, does not have similar programs, there is one glaring exception that generated a lot of high-profile coverage just a few years ago: the infamous stop-and-frisk program that was a major component of the New York Police Department’s effort to fight crime. While the city claimed the program was an effective way to reduce crime, a federal judge disagreed, finding instead that it provided cover for officers to target non-white citizens in unnecessary and illegal ways.

While police officers are well within their rights to stop someone they suspect of committing, or being about to commit, a crime, they must be able to demonstrate some cause. However, in the case of stop-and-frisk, people of color were being stopped at a disproportionately high rate, which led a federal judge to deem the policy unconstitutional. In fact, 83 percent of the stops conducted by the NYPD between 2004 and 2012 were of black or Hispanic people, while those groups made up slightly more than half of the city’s population. The following video gives the details behind the stop-and-frisk ruling in New York:

Although stop-and-frisk was really the only major program that led to clear racial profiling in an attempt to fight crime, as mentioned earlier, racial profiling is often done without a directive or anything on the books. The ACLU, for example, has a long list of what it claims are incidents of racial profiling against a variety of minority groups. Over the last couple of years, there have been a number of high-profile incidents involving white police officers and non-white victims, which certainly seem to indicate racial profiling does still occur even without an explicit policy.

When it comes to criminal profiling, the practice has gotten a lot of attention in popular culture but its effectiveness has also been called into question. Part of the problem is that criminal profiling is not much more reliable than racial profiling. According to a small study done in Britain, only 2.7 percent of 184 cases showed that the practice helped lead to an arrest. The main issue was there were so many different characteristics that it was hard to create a single profile that would be used to capture criminals. This sentiment was echoed by a Secret Service report on school shooters that suggested that potential shooters would have to be identified on an individual basis because they were all so different. The most common results, unfortunately, were confirmation bias at best, and at worst simply another form of prejudicial profiling.


Evaluating the Use of Profiling

Since racial profiling targets only a select group of people, it is unsurprisingly not very effective. For the clearest evidence, one need only look at New York’s stop-and-frisk program once more. Of all the people stopped, nearly 90 percent were released with no further action and were free to go. In other words, only 6 percent of stops ended with an arrest and another 6 percent resulted in court summonses. In fact, the data indicates that stop-and-frisk likely had little relation to the number of murders and other violent crimes in New York.
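Put in raw numbers, the outcome rates cited above break down as follows. The total stop count here is a hypothetical round figure used only for illustration; the percentages are the ones reported.

    # Illustrative arithmetic using the outcome rates reported above.
    # total_stops is a hypothetical round figure, not the actual NYPD count.
    total_stops = 1_000_000
    arrests = round(total_stops * 0.06)    # 6 percent of stops ended in arrest
    summonses = round(total_stops * 0.06)  # 6 percent resulted in a court summons
    released = total_stops - arrests - summonses

    print(f"arrests:           {arrests:,}")
    print(f"court summonses:   {summonses:,}")
    # ~88 percent, the "nearly 90 percent" released with no further action
    print(f"no further action: {released:,} ({released / total_stops:.0%})")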

Other instances, such as the ones listed by the ACLU, also show how racial profiling is typically not effective. In fact, these instances of racial profiling only seem to make matters of crime worse as they encourage disaffected people to lash out in anger.

Impact

Not only is racial profiling ineffective, it can also be harmful in the long run. This is because people who are unfairly targeted by police tend to lose trust in the police force as an effective means of fighting crime. When people do not trust the police, the police are less able to do their jobs because they lack both authority and the necessary assistance from the communities they serve.

This feeling of being excessively targeted is also supported by the numbers. The clearest example, and by now one that is well-worn, is the existence of clear racial disparities in prisons. Black male children born in 2001 are approximately 5.5 times more likely than white children to be incarcerated at some point in their lives.

From a dollars-and-cents perspective, racial profiling is also costly. Retraining officers following a racial profiling incident or paying a settlement for racial profiling can cost a city or police department millions, if not tens of millions, of dollars. There are countless examples of this; one of the most egregious is in Arizona, where the actions of notoriously prejudiced Sheriff Joe Arpaio just cost his county $22 million in settlements with Latino community members.


Conclusion

Like other controversial techniques for preventing crime, racial profiling does have its defenders. They argue that it has been successful in reducing crime and point to examples like stop-and-frisk in New York and to other countries that embrace the measure such as Israel. It even has a closely related cousin, criminal profiling, which relies on somewhat related methods to help in the hunt for criminals.

However, like many of those very same controversial techniques, the numbers suggest racial profiling does not actually reduce crime. In fact, it may increase crime by lowering trust in police and diminishing officers’ effectiveness in minority communities. It also seems to fill prisons in the United States disproportionately by race while costing police departments millions in settlements and training.

Racial profiling, then, seems to be a practice that does more harm than good. However, that reality may not outweigh some people’s perception that it is effective. In either case, the practice is unlikely to be done away with entirely; in the meantime, it is likely to make the tenuous relationship between the police and many communities even worse.


Resources

CNN: Donald Trump Defends Racial Profiling in Wake of Bombings

ACLU: Racial Profiling: Definition

History News Network: The Roots of Racial Profiling

Haaretz: in Israel, Racial Profiling Doesn’t Warrant Debate or Apologies

The New York Times: Racial Discrimination in Stop-and-Frisk

American Psychological Association: Criminal Profiling: The Reality Behind the Myth

Center for Science and Law: Criminal Profiling, Present, and Future

National Institute of Justice: Race, Trust, and Police Legitimacy

Economic Policy Institute: Where Do We Go from Here? Mass Incarceration and the Struggle for Civil Rights

CNN: Racial profiling Costs Arizona County $22 million

Brennan Center for Justice: Ending New York’s Stop-and-Frisk Did Not Increase Crime

Michael Sliwinski

The Battle for Mosul: The Fight for ISIS’s Stronghold in Iraq https://legacy.lawstreetmedia.com/issues/world/battle-mosul-isis-stronghold/ Mon, 14 Nov 2016 00:35:58 +0000

What the fight to reclaim Mosul will mean for Iraq.

Image courtesy of DVIDSHUB; License: (CC BY 2.0)

On October 17, Iraqi Security Forces, with the help of the Kurdish Peshmerga, irregular Iraqi forces, U.S. special forces, and American air power, began their assault on ISIS with the hope of retaking Iraq’s second-largest city, Mosul. Not only is Mosul one of the largest and most economically important cities in Iraq, it also serves as a symbol of ISIS’s rise in the country, as well as of the Iraqi government’s inability to secure its land. The assault promises to be a long campaign but, if successful, could signal the impending end of ISIS in Iraq.

Read on to find out more about the campaign to retake Mosul, its significance in the fight against ISIS, and what it would mean for Iraq to regain the city.


History and Significance of Mosul

The city of Mosul emerged on the former site of Nineveh, an Assyrian fortress. It rose to prominence as an important link between Syria, Anatolia, and Persia. By the 8th century, it had become the major city of Northern Mesopotamia, in modern-day Iraq. Mosul reached its height in the 12th century under the Zangid Dynasty, when it was a center for metalwork and miniature painting. It was subsequently destroyed by Mongolian conquerors in the 13th century.

Mosul was slowly rebuilt and later ruled by the Ottoman Turks from the 16th to the 20th centuries. The British conquered the city during World War I and occupied the surrounding area for several years before it was incorporated into Iraq. In the Lausanne Treaty negotiations following the war, Mosul proved to be a contentious issue between the British and Turkish governments. The issue was eventually resolved by the League of Nations, which concluded that the city should be a part of Iraq, but the dispute shaped the way Turkey views the city today.

Prior to ISIS’s rise, Mosul was the capital of Iraq’s Nineveh Province in the country’s northwest. It had a population of approximately 2 million people before the invasion in 2014. Originally, Mosul was situated on the western bank of the Tigris River; however, it expanded across the river and now occupies parts of the eastern bank as well. In addition to being a regional capital, Mosul is the commercial center of Northern Iraq. Not only is it home to several major industries and oil production, it also serves as an agricultural marketplace.


Mosul Under Saddam Hussein and the Iraq War

Mosul has also been the site of significant ethnic strife. Traditionally, Mosul was a major center for ethnic Kurds; however, in the 1970s Saddam Hussein’s Baath Party initiated a resettlement plan that moved a large number of Arabs into the area to displace them. Hussein’s plan was successful, eventually producing a large Arab majority in the city. The new Arab majority responded favorably to Hussein, and eventually there were as many as 300,000 Baath Party members in Mosul. Along with displacing the Kurds through his Arabization policy, he also waged a war against them in the late 1980s and early 1990s that left another 100,000 Kurds dead.

During the initial occupation of Mosul in 2003, U.S. forces managed to establish order in the city. However, when the American force was reduced, ethnic tensions spilled over with Kurds controlling one half of the city and Arabs the other. The strife broke out as Kurds tried to reclaim what they viewed as stolen property. This led to an insurgency of former regime members culminating in the Battle for Mosul in 2004. A coalition of American and Kurdish forces managed to push back the insurgents, at which point the battle lines returned to their status quo on the east and west halves of the city.

This was not the end of the insurgency, however, as the resistance shifted from former Baath members to al-Qaeda in Iraq. In early 2008, following the U.S. surge a year earlier, another round of fighting broke out between American and insurgent forces. The city was once again cleared of insurgents and greater efforts were then put in place to engage the community and avoid another conflict.


Mosul Under ISIS

Capturing Mosul was key to ISIS’s rise in the region. ISIS derives much of its income from oil revenues and taxes, and Mosul offered both, as it is close to key oil fields and has a massive population that could be taxed. Its location was also strategically important in allowing ISIS fighters to move about freely. Lastly, by conquering the ethnically and religiously diverse city, ISIS could claim the superiority of its own ideology.

ISIS’s takeover of Mosul came swiftly, marking a significant embarrassment for the Iraqi government and military. In June of 2014, ISIS fighters headed toward Mosul with the hope of occupying certain parts of the city for a short period of time to make a statement. But instead of just making a statement, ISIS was able to take the entire city and most of the surrounding region. The Iraqi security forces left to guard the city were undermanned and outgunned, yet another result of the government infighting that had plagued the nation. In their retreat, Iraqi forces also left behind weapons and other supplies that only strengthened ISIS’s capabilities.

Life under ISIS has been harsh for the city’s residents. While it was tolerable to some at first, especially those who supported the group, conditions have deteriorated, particularly after coalition bombings increased. ISIS became increasingly unable or unwilling to provide basic services such as electricity, fresh water, sanitation, and adequate food. Additionally, ISIS quickly embarked on a city-wide crackdown, forcing residents to abide by its strict religious and moral codes or face punishment or even death. The city has slowly taken on a prison-like atmosphere, as the group has refused to let anyone leave.

The video below looks at the importance of Mosul to the Islamic State and why it is important for Iraqi forces to gain control of the city.


Taking Back Mosul

The fight to take back Mosul is expected to be especially grueling and difficult. One of the Peshmerga generals predicted it may take up to two months to actually retake the city. That long timeline might surprise outside observers who look at the lopsided number of coalition forces and see a clear advantage–coalition forces have nearly 100,000 troops while estimates suggest there are at most 7,000 ISIS troops in Mosul. The matchup is even more advantageous for coalition forces because they will have significant air support while ISIS does not.

However, the assault on Mosul has not been a secret, although the exact dates were not clear until recently. This lead-up has given ISIS ample time to set booby traps, lay IEDs, and build defensive structures like tunnel networks. The group is also employing other familiar deadly weapons, such as suicide bombers. Some even believe ISIS has mustard gas, an extremely harmful chemical agent, which it may unleash as a last resort. The group is unlikely to relinquish the position without a fierce fight, as the city is symbolic of ISIS’s strength in Iraq. After all, Mosul is where the caliphate was originally declared. Losing Mosul would be a significant blow for ISIS in Iraq.

The following video looks at the effort to take back Mosul:


Aftermath of the Battle for Mosul

What exactly happens for those involved once Mosul is liberated? The answer starts with the civilians on the ground; the United Nations, the Iraqi government, and the United States have already announced plans for humanitarian aid that will be desperately needed once ISIS has been ousted from the city. This includes basic survival goods that may need to be supplied for up to 12 months.

Building off of that, many of the people likely to flee the fighting are Sunnis. One of the major issues that helped sow the seeds for ISIS’s rise was discrimination against Sunnis by the current and former Iraqi governments. The people in charge will have to figure out how to create a more inclusive country instead of continuing to seek redress for old wrongs. The other side of that same concern is the role of the Kurds.

The Kurds make up a significant part of the force attempting to retake Mosul; however, there is an agreement in place stopping them from entering the city’s center in order to avoid political tensions. The Kurds’ power has only grown and solidified over the last two years as they have played a pivotal role in the fight to defeat ISIS, while the official Iraqi government has basically just weathered the storm. If ISIS is defeated in Mosul, as many anticipate, the Kurds may finally feel strong enough to declare an independent state of their own in the north.

Lastly, it is important to look at the battle’s significance for ISIS itself. What would losing its northern Iraq stronghold mean to the group? It would likely mean the end of the ISIS-proclaimed caliphate in Iraq and Syria, where ISIS is also losing territory. However, it does not mean the end of the group and certainly not the end of ISIS-style extremism. ISIS still has bases in other countries with weak governments and where Sunni minorities are ostracized, such as Libya and Yemen. As long as those conditions exist, ISIS is likely to thrive. And even if it is not ISIS, another group will likely emerge to replace it, much as al-Qaeda in Iraq gave rise to ISIS in the first place. The main issue, then, is the social, economic, and political exclusion of certain groups. These conditions have often been exacerbated by Iran and Saudi Arabia’s battle for influence in the Middle East, which must be addressed to curb the influence of terrorist groups in the region.


Conclusion

Even if the battle for Mosul is a success, will it be viewed as a success for everyone? The Kurds certainly look to gain with the elimination of their main rival in the North. The fall of ISIS in Mosul, combined with other gains that the Kurds have made since ISIS emerged, has them in a position to potentially seek a state of their own.

However, an independent Kurdish state may not be particularly appealing to the Sunni Arabs in Mosul, who have long battled Kurds for control of the city and have felt marginalized by the Shia-dominated government in Baghdad. Speaking of the Iraqi government, will Iraqi citizens trust a fractious government to protect them going forward when it just let them fall under the control of an extremist group?

Will this also be the end of extremist groups in the region, or will simmering Sunni discontent lay the groundwork for another group or some form of ISIS resurgence? Only time can answer these questions, but even if the battle for Mosul is successful, it may not be the last such battle.


Resources

Institute for the Study of War: The Fight for Mosul

Encyclopedia Britannica: Mosul

Business Insider: One Paragraph Explains How ISIS Managed to Seize Iraq’s Second-Largest City

CNN: Mosul Offensive: Territory Recaptured From ISIS

The Guardian: Life Under ISIS in Raqqa and Mosul: ‘We’re Living in a Giant Prison’

Reuters: As Mosul Fight Approaches, Worries About the Day After

Newsweek: The Battle Against ISIS in Mosul Could Lead to an Independent Iraqi Kurdistan

CNN: What Happens After ISIS Loses Mosul?

Human Rights Watch: Claims in Conflict: Reversing Ethnic Cleansing in Northern Iraq

ARA News: Peshmerga Official Says Kurds Won’t Enter Mosul City

Rudaw: The Importance of Mosul for ISIS

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Battle for Mosul: The Fight for ISIS’s Stronghold in Iraq appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/battle-mosul-isis-stronghold/feed/ 0 56373
Do Presidential Debates Really Matter? https://legacy.lawstreetmedia.com/issues/politics/story-behind-presidential-debates/ https://legacy.lawstreetmedia.com/issues/politics/story-behind-presidential-debates/#respond Sun, 09 Oct 2016 14:00:47 +0000 http://lawstreetmedia.com/?p=55882

How have debates shaped U.S. presidential elections?

The post Do Presidential Debates Really Matter? appeared first on Law Street.

]]>
"Republican Party debate stage" courtesy of [Gage Skidmore via Flickr]

As the election season winds down, most of the attention will turn to the remaining debates. These debates have taken on an important role in the presidential selection process, allowing viewers to see candidates pitch their visions for the country side by side. However, debates did not always play such a major role in elections and are actually a relatively new development. Nor have they always had the impact they are perceived to have nowadays, a gap that modern technology could widen even further.

Read on to find out more about the history of presidential debates in the United States, take a closer look at some of the most significant debates, and see how the process has changed over time with the influx of new technologies.


How the Debates Work

Debate rules, like the candidates themselves, change from election to election, and this year they even change from debate to debate. Nevertheless, 2016’s debates will work off the framework established by the 2012 edition and share some commonalities. Each will be 90 minutes long with no breaks. The moderator alone will decide which questions are asked and whether to extend segments, and he or she will be in charge of keeping the discussion appropriate. Some of these rules are new and others have been in place for a while; however, they all contrast starkly with the first major U.S. debate way back in 1858.


History of Debates

One of the first high-profile debates between politicians occurred back in 1858, but it wasn’t between presidential candidates. The famous Lincoln-Douglas debates shaped the Senate race in Illinois, but they were quite different from the modern style of debates we see today. These debates only came about because Lincoln had been following Douglas on the campaign trail and asking questions at a number of his stops, which eventually led the two to hold a series of formal debates. These debates were quite long and did not even feature moderators. Following that election, there were no high-profile debates for roughly 90 years, as candidates instead preferred to make individual speeches.

The first public debate between presidential candidates came in 1948, during the Republican primary. The first presidential debate between major party nominees did not come for another 12 years, in 1960. The 1948 Republican debate was also the first debate broadcast on radio; 40 to 80 million people listened in. The 1960 debates were the first to be broadcast on television. Approximately one in three Americans, or 66.4 million people, watched that first televised debate. There was another long gap between debates following that year, as the next round of presidential debates was not held until 1976. From that point on, however, debates have been held in every election cycle. The 1976 cycle also featured a vice presidential debate, a practice that has been a tradition since 1984.

According to the rating service Nielsen, the highest-rated presidential debate ever was the October 2012 matchup between President Barack Obama and Republican nominee Mitt Romney, which 46.2 million households watched. In terms of individual viewers, the Carter-Reagan debate of 1980 had the most, with 80.6 million. Since 1987, the debates have been under the direction of the Commission on Presidential Debates, a bipartisan organization tasked with setting the format and rules of each debate.

The following video gives a look at the evolution of debates over time:


Major Debates and Their Impact

Regardless of their medium and audience size, debates have now been taking place in U.S. presidential elections for nearly 70 years. In that time there have been some memorable moments, both at the presidential and vice presidential levels. Time has a list of its ten most memorable debates, although there have been many more. Often these tend to focus on politicians making embarrassing mistakes that doom their campaigns, like Rick Perry in 2012, or on one-liners like the infamous one Ronald Reagan delivered to Walter Mondale in 1984 about their respective ages.

The video below highlights some of the most memorable moments in presidential debates:

One of the most famous debates was the one between John F. Kennedy and Richard Nixon in the lead-up to the 1960 presidential contest. Coming into the debate, the candidates were locked in a close race; physically, however, they were very different, as Nixon had recently been hospitalized for an infection. Normally, this would not have played a role, but this was the first televised debate. Thus, for most of the viewers watching on television, the young, healthy-looking JFK defeated his sickly-looking opponent. This debate not only signaled the rising importance of television–radio listeners generally thought Nixon did better–but it helped usher in the short but iconic Kennedy era. The debate also had an effect on Nixon, who refused to participate in debates the next time he ran for president and again when he ran for reelection.

Do the Debates Matter?

While there have been memorable debates, some of which we still talk about today, it is fair to ask what impact they actually have on the outcome of elections. Although people involved in politics, such as pundits or political advisors, like to suggest debates have a major impact on voters in the same way party conventions can, the numbers do not really bear that out. According to two separate studies done by political scientists–the first by James Stimson and the second by Robert Erikson and Christopher Wlezien–the effects of debates on polls are negligible and often mirror whatever trend was already occurring.

It is not that the debates don’t matter; they just often have a very small effect, if one at all. Even the famous Kennedy-Nixon debate may have only led to a 3- or 4-point swing, which is within the margin of error in most polls. It is also important to note that these debates do not happen in a vacuum, so what might appear to be the effect of a debate is often just another symptom of an ongoing issue with a candidate. In addition, the candidates are traditionally similar enough, or have prepped long enough, that there is no clear winner, or the person deemed the winner varies based on the viewer’s political preferences.

What the debates are seemingly most useful for, then, is informing voters about a candidate. This is especially true in the first debate, when voters may still be learning about the candidates, and it is also true for challengers, whom the debates may favor. Indeed, despite the studies mentioned earlier, some groups still contend that debates are very important in deciding the presidency. The Pew Research Center found that in 2008, two-thirds of voters said that the debates would influence their vote.


How the Debates Have Changed

While there are some differing opinions on whether the debates have an impact on voters, one indisputable truth is that technology has influenced the debates. When Douglas and Lincoln had their famous debates they would go from town to town, giving hours-long speeches that would be covered in newspapers. When debates returned in the 20th century, the new medium was radio, which reduced the length and substance of the events. Next was television, which shortened the events even more while adding a visual element.

Unsurprisingly, presidential debates have continued to change a lot since the first debate aired on television in 1960. The last few election cycles, in particular, have brought about a number of major changes, all involving the use of the internet and social media. In 2008, for example, people were allowed to send in questions through YouTube. In 2012, questions in primary debates started coming via Facebook. This year, the debates will be streamed live on both YouTube and Twitter, along with the major networks. In addition to watching with social media, users are also able to get real-time feedback on their opinions, both through those sites and on their television screens, which feature a line showing who is perceived to be winning the debate as it happens.

The following video looks at the role of technology in today’s debates:


Conclusion

In our current age of instant–and some might say excessive–exposure, debates are the ultimate platform for presidential candidates to prove themselves to the nation or fail in about as public a way as possible. At least that is the perception. However, presidential debates are relatively young and have changed dramatically over the years as technology has evolved. Additionally, their role in determining who ultimately becomes president may be overblown. Major studies have shown that debates have little or no impact and serve more to reinforce long-standing beliefs.

But the debates serve as one of the best opportunities for the audience to get to know a candidate before the election and for the candidates to get their message out. Presidential debates have become extremely popular events, so intertwined with the pre-election fabric that they are unlikely to go away. Their usefulness, however, is up for, well, debate.


Resources

Commission on Presidential Debates: Debate History

National Parks Service: The Lincoln-Douglas Debates of 1858

Forbes: 13 Quick Facts About the History of Presidential Debates in America

Time: 10 Memorable Moments in Presidential Debate History

History: The Kennedy-Nixon Debates

Washington Monthly: Do Presidential Debates Actually Matter?

Journalist’s Resource: Presidential Debates and Their Effects: An Updated Research Roundup

Commission on Presidential Debates: Format for 2016 General Election Debates

Tech Crunch: How Technology Destroyed the Once Substantive Presidential Debate

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Do Presidential Debates Really Matter? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/politics/story-behind-presidential-debates/feed/ 0 55882
G7, G8, G20: Are the Major Economic Forums Important? https://legacy.lawstreetmedia.com/issues/world/g7-g8-g20-economic-forums-worlds-elite/ https://legacy.lawstreetmedia.com/issues/world/g7-g8-g20-economic-forums-worlds-elite/#respond Wed, 05 Oct 2016 18:57:33 +0000 http://lawstreetmedia.com/?p=55568

While often major news events, what purpose do these summits actually serve?

The post G7, G8, G20: Are the Major Economic Forums Important? appeared first on Law Street.

]]>
"G20 Summit in Brisbane Australia, 14 Nov 2014" courtesy of [GovernmentZA via Flickr]

The 2016 G20 Summit ended on September 5; however, it didn’t come to a conclusion on all of its topics of discussion. Issues that remained unresolved after the meeting include the Syrian crisis, Russian involvement in Ukraine (and Syria), the fallout from Brexit, North Korea’s missile tests, and even whether President Obama using a smaller staircase to descend from Air Force One amounted to a slight by the Chinese. While these summits are often touted as important events for economic and diplomatic cooperation, it’s worth asking what purpose they serve when so many issues consistently remain unresolved.

What are they for, why does the size of the summits change, how does a country get an invite, and is anything discussed at them actually lasting? Read on for the answers to these questions and more about the G-summits.


G20 2016

The 2016 G20 summit was held in Hangzhou, China. As the host, China took center stage, touting an economic plan focused on its new Silk Road initiative and the Asian Infrastructure Investment Bank. China also made an effort to include a record number of countries in order to bolster its mission to achieve development and inclusion, something dubbed the “Hangzhou Consensus.”

Perhaps the most notable development, though, was China and the United States’ joint ratification of the Paris climate change agreement, which happened just before the summit began. Nevertheless, the criticism of this meeting, like that of many past summits, was that promises outnumbered concrete commitments.


Background

The 2016 meeting was the most recent G20 summit but it was by no means the first. The inaugural event happened in 1999 as a meeting among finance ministers of countries with large economies. Since then, regular meetings have become an important aspect of economic cooperation. As the name implies, it includes 20 members of the international community–Argentina, Australia, Brazil, Canada, China, France, Germany, India, Indonesia, Italy, Japan, Mexico, Republic of Korea, Russia, Saudi Arabia, South Africa, Turkey, the United Kingdom, the United States, and the European Union. Along with these nations and regional blocs, several non-state actors attend the G20, including the United Nations, the International Monetary Fund, the World Bank, the World Trade Organization, the Financial Stability Board, the International Labor Organization, and the Organization for Economic Cooperation and Development (OECD).

The G7 was the first of these international summits, starting back in 1975. (The first meeting was actually only attended by six of the countries, as Canada did not attend until 1976. For this reason, some refer to the initial summit as the G6).

The G7 originated when seven countries, including the United States, joined together to address the oil embargo in 1973 that had been triggered by Western support for Israel in the Yom Kippur War. G7 is short for the Group of Seven Industrialized Nations and it included the United States, United Kingdom, Germany, France, Italy, Japan, and Canada. Since 1991, Russia regularly met with countries at the G7 summit and in 1998, it was added to form the G8.

The following video looks at the emergence of these summits and their purpose:

While the exact reason for the addition of the G20 in 1999 was not explicitly spelled out, the increasing economic significance of developing countries made them desirable partners for the international summits. Even with this addition, though, some feel that G8 members still wield greater power in the G20 than the other members. Regardless of that specific sentiment, even when the organization had just seven members, other countries in the developing world were invited to participate and exerted some influence. Russia, in particular, participated before becoming a member and had joined in on some of the dialogue starting in 1991. Even after the expansion to the G20, the original G8 members still meet at a separate event, with hosting duties rotating among the eight member countries.

The video below gives a brief explanation of the G8 and G20:


Purpose

Initially, the G20 summit, and the G8 before it, was a place for finance ministers and central bank governors to discuss economic matters, including international financial and monetary policies, international institutions, and world economic development. However, that role changed following the 2008 financial crisis when the first Leaders Summit was held in Washington, D.C. At the next summit, in 2009 in Pittsburgh, Pennsylvania, world leaders announced that the new G20 would overtake the G8 as the primary means of international economic cooperation.

Ultimately, no decisions made in the meetings are binding. The main goal of the meetings is to serve as an open forum for communication. Thus, it is not surprising that even after the meetings end, follow-up discussion continues. This is done through sherpas–representatives of the officials who attend the actual meetings. These people keep in regular communication concerning the decisions made at the most recent meeting. Ministerial meetings, which include finance ministers from member nations, also occur throughout the year.


Criticism and Controversy

Like other international events involving economic policy, the G8 and G20 have their detractors. The first protesters at a G8 Summit appeared in 1998 to denounce globalization. The dispute eventually turned deadly three years later when a protester was killed during a clash. The G20 has also been the site of protests virtually every year since its inception. This has been increasingly true following the financial crisis, as the leaders who met annually were viewed as the personification of a problem that led to global calamity.

Aside from protests surrounding the meetings, there has also been dissent within. In 2014, the original G7 members declined to attend that year’s G8 meeting in Russia to protest Russia’s actions in Ukraine. Instead, the member countries held their own G7 conference in Brussels. Since then, Russia has been expelled and the G8 has reverted to the original G7.


Impact

For all the media exposure that these summits generate, the actual impact of the G7/G8 and G20 is up for debate. For its part, the G20 is not actually an organization but a network that brings various organizations together. However, this has not stopped it from competing, on purpose or not, with other organizations such as the IMF and World Bank. Still, the G20 relies on those organizations and others to actually address the issues identified and outlined at the meetings.

The G20’s biggest impact has been through its broadened membership. Specifically, it now includes countries such as China, India, and Brazil, all of which have large economies. The hope behind this move was that the additional countries would speak for themselves, of course, but also for other countries facing similar issues. Due to the informality of these events, however, some argue that this hope has not yet been realized. Many of these countries, on the world stage for the first time, are merely learning the basics.

The G8 is also a rather informal affair and, in fact, has maintained that arrangement on purpose. In 1998, for example, Britain stopped sending its finance minister as a way to separate the G8 from the traditional ministerial meetings. The concept was further decentralized and ministers began holding separate events throughout the year. The impact of both the G8 and G20 is not necessarily in the results that they generate but in the opportunities they present for collaboration and communication.


Conclusion

The G8 and G20 serve as important forums where the world’s most powerful countries can meet to discuss pressing economic issues that affect the entire planet. The process started as many of today’s leading organizations did, with a Western focus. However, with the introduction of the G20, in which many more voices are heard, that focus has decidedly broadened.

Even after the addition of the G20, it is still unclear what these summits do and if they are even useful. Similar to criticisms of other international institutions, such as the United Nations, these summits have been viewed as generating lots of ideas while offering fewer actual benefits. Additionally, these summits have been met with significant backlash, as evidenced by the mass protests that follow them around, year after year, from country to country. This unpopularity has only increased following the 2008 financial crisis.

While these criticisms may have merit, they also lack nuance. The G8 and G20 were specifically designed to be forums, not supranational deciding bodies. As unpopular as they are now, it is easy to imagine how much more so they would be if they actually made policy. For an example, look at the E.U. and the way that many of its member states have taken issue with its top-down style of rule-making. Instead, the summits are a place for leaders and ministers to gather and discuss. Sometimes they can be used as political tools, but they can also offer a chance at dialogue that otherwise would not exist. Thus, the meetings are likely to continue because at the very least they offer prestige to the host (see China in 2016) and, at best, they create the sense of international community that leaders need to address problems.


Resources

Time: 7 Things You Might Have Missed at China’s G20 Summit

DifferenceBetween.net: Difference Between G8 and G20

G20: About G20

University of Toronto Library: What Are the G7 and G8?

Global Brief: The G20 and the Developing World

The Guardian: China’s G20 Summit Was Big on Show but Short on Substance

European Commission: G7/G8, G20

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post G7, G8, G20: Are the Major Economic Forums Important? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/g7-g8-g20-economic-forums-worlds-elite/feed/ 0 55568
Overseas and Undertaxed: How Companies Avoid Paying Taxes https://legacy.lawstreetmedia.com/issues/business-and-economics/how-companies-avoid-paying-taxes/ https://legacy.lawstreetmedia.com/issues/business-and-economics/how-companies-avoid-paying-taxes/#respond Tue, 04 Oct 2016 14:50:00 +0000 http://lawstreetmedia.com/?p=55341

Apple is just the latest example.

The post Overseas and Undertaxed: How Companies Avoid Paying Taxes appeared first on Law Street.

]]>
Image courtesy of [Olle Eriksson via Flickr]

Recently, the European Union determined that Apple must pay $14.5 billion in back taxes to Ireland. Unsurprisingly, Apple is challenging the ruling and claiming that its arrangement with Ireland was perfectly legal. What is surprising is that Ireland may actually support Apple’s appeal and claims it doesn’t want the money, a position that many in the United States share as well.

Read on to find out what this scandal has revealed about the tax policies in Ireland and the E.U., how corporations avoid paying higher taxes, and what the likely ramifications would be for Apple if it is ultimately forced to pay the taxes in Ireland and other nations around the world.


Apple, Ireland, and Unpaid Taxes

The decision against Apple was handed down by Margrethe Vestager, the head of the E.U.’s competition commission. The commission has taken it upon itself to go after corporations and countries that strike unfair tax deals, with companies such as Starbucks and Amazon also under investigation. However, this is the first major case brought against a company operating in Ireland.

Ireland, for its part, already has one of the lowest corporate tax rates in the world at 12.5 percent, something it uses as a comparative advantage to attract companies to relocate there. However, that rate is actually significantly higher than the tax rate it agreed to with Apple. In 2013, a United States Senate committee discovered a deal between Ireland and Apple that called for a 2 percent or lower tax rate for the company. Both parties have seemingly taken the “lower” option, though, as Apple was allegedly being taxed at a rate of 50 euros for every 1 million euros made, or 0.005 percent. While that number is up for dispute, it is clear that the effective tax rate that Apple faced was remarkably low.
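
For readers checking the math, the 0.005 percent figure follows directly from the numbers above (a back-of-the-envelope illustration using the figures already cited, not data from the ruling itself):

50 ÷ 1,000,000 = 0.00005, or 0.005 percent

For comparison, Ireland’s standard 12.5 percent rate applied to that same 1 million euros would come to 125,000 euros rather than 50.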

This remarkably low rate was what the E.U. took issue with. The E.U. argues that when countries like Ireland give special tax rates to certain companies they are engaging in anti-competitive behavior that unfairly punishes companies without these deals. But regardless of the ruling, the United States would not be able to tax the money Apple stashed overseas unless it is brought back into the United States.

In the video below, Margrethe Vestager explains the recent E.U. ruling:


Tax Havens

For its role, Ireland has been labeled a “tax haven.” However, countries serving as tax havens, while unscrupulous, are not necessarily breaking any laws. In fact, many corporations that take advantage of what these nations offer have turned dodging taxes into a virtual art form. Individuals can use tax havens to legally avoid paying a variety of taxes, such as inheritance, capital gains, and even regular income tax, while companies take advantage of low corporate tax rates.

While Ireland gets the most attention for its low tax rates because of the situation with Apple, that attention is deserved: a quarter of Fortune 500 companies have offices there, which they may use to pay lower taxes. There are a number of places, mainly in Europe and the Caribbean, where low tax rates, lax enforcement, or protective banking laws make for attractive spots to place wealth and profit.

Companies are able to avoid paying U.S. taxes by booking their profits in countries with particularly low tax rates. We can tell that companies do this because these countries are often small, yet the profits booked there by large multinational companies are quite big. In some cases, reported profits in certain countries actually surpass those countries’ entire GDP. The following quote from the Citizens for Tax Justice, a left-leaning think tank, sheds light on what is actually happening:

It is obviously impossible for American corporations to actually earn profits in a given country that exceed that country’s total output of goods and services. Clearly, American corporations are using various tax gimmicks to shift profits actually earned in the U.S. and other countries where they actually do business into their subsidiaries in these tiny countries. This is not surprising, given that these countries impose little or no tax on corporate profits.


Corporations Not Paying their Share?

Just as Ireland is not the only country with a low corporate tax rate and a willingness to strike a deal with American companies, Apple is not alone in moving its profits abroad in order to avoid paying the high U.S. tax rate. As of this year, American companies had $2.4 trillion in income that has not been repatriated and instead is being held outside the United States. If that money were brought back to the United States, the subsequent taxes would amount to nearly $700 billion. While Apple has over $200 billion in profits held overseas, it is hardly alone. High-profile companies such as Microsoft and Pfizer also hold substantial amounts of cash outside the United States.
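
A rough check on those figures, using only the numbers above (the implied rate is an inference, not a figure reported by any of the sources):

$700 billion ÷ $2.4 trillion ≈ 0.29, or about 29 percent

That implied rate sits below the 35 percent statutory rate discussed below, presumably because companies repatriating profits would receive credits for taxes already paid abroad.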

The main culprit for all of this is allegedly the U.S. corporate tax rate, which at 35 percent is one of the highest in the developed world and one of the highest overall. While some companies have discussed moving their money back to the United States–particularly if there were a window or a temporary tax holiday like the one in 2004–many are pursuing another course. Namely, many large companies have resorted to corporate inversions, in which an American company merges with another, often smaller, company located in a country with a low corporate tax rate and shifts its legal headquarters there.

Companies employ other, similar techniques as well. One is something known as earnings stripping, in which a U.S. subsidiary borrows from its foreign parent company and deducts the interest payments, thereby lowering its taxable income.
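
To make the mechanics concrete, here is a simplified, hypothetical illustration of earnings stripping (the figures are invented for clarity and do not describe any particular company). Suppose a U.S. subsidiary earns $100 million before interest and owes $60 million in interest on debt issued to its foreign parent:

$100 million − $60 million = $40 million in taxable U.S. income

The $60 million in interest is deducted in the United States and collected by the parent in a lower-tax country, so U.S. tax applies to only $40 million of the original $100 million.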

The following video looks at how companies relocate to reduce their tax burden:


Ramifications of E.U. Ruling

With all the ways corporations avoid paying taxes and past efforts to close loopholes, perhaps the most surprising aspect of the European Commission’s decision to make Apple pay back taxes is who is opposed to the ruling: Ireland. Shortly after the commission made its decision, Ireland announced it would appeal. Obviously, the immediate question is why a country would be willing to pass on $14.5 billion. The answer is that while Ireland would be passing up a lump sum in tax revenue, keeping Apple’s taxes low ensures the company’s continued presence there and the benefits that come with it, like jobs. If Ireland were to raise Apple’s taxes, Apple might be compelled to move somewhere else, as Ireland would be less appealing. It could also have the effect of scaring away other companies that view Ireland as a good place to do business.

This decision is complicated further because it is not made in a vacuum. Ireland has had some notable fiscal issues and was bailed out by the E.U. and IMF in 2010. But Ireland may not want to risk losing the jobs that Apple brings as well as its reputation as a business friendly country.

Another surprising group opposed to the European Commission’s ruling is a collection of U.S. lawmakers. Members of both parties are opposed, ultimately, because that money would otherwise be paid to the United States. Lawmakers feel that if Apple is forced to pay Ireland, it will complicate tax matters between the E.U. and the United States. If Apple ends up paying the $14.5 billion, it will actually be able to use that as a credit against other taxes that it owes in the United States.

While the fight seemingly boils down to who gets the money and whether Ireland can keep Apple’s headquarters, it also has the potential to raise an even larger question. Specifically, by forcing Ireland to tax at a certain rate, the E.U. is entering the murky waters of Irish sovereignty. While the E.U. is not concerned with Ireland’s low tax rate specifically–its problem is with the special tax deal offered to Apple–the ruling does amount to the E.U. intervening in a country’s internal affairs. This is a particularly awkward position given that the U.K. just voted to leave the E.U., in part, over concerns of overreach by Brussels. While talk in Ireland has certainly not approached that level yet, some people have already broached the topic with names like “Irexit.”


Conclusion

As of February, Apple was the most valuable company in the world with a market value of approximately $534 billion. It is not surprising that even the most basic effort to reduce Apple’s tax burden would become news. Of course, the news tends to snowball when that tax deal with Ireland violates European Union rules. While Apple’s actions are not necessarily illegal as much as they are a profit-driven company taking advantage of a terrific deal, the situation certainly raises eyebrows, particularly when it’s not unique to one company.

Whether or not the E.U. will allow Apple’s current tax arrangement in Ireland to persist remains to be seen, as an appeal will take some time to sort out. Either way, the ruling has shone a spotlight on a very common practice among major corporations. While Apple may have the most cash and be the most egregious case, the list of other companies doing the same thing reads like a who’s who of major corporations.

Unfortunately, the climate is such that instead of lauding the E.U. for trying to recover owed taxes and protect competition, there was an immediate backlash against the ruling. Ireland tried to refuse $14.5 billion out of fear of losing jobs and other investment, and American politicians criticized the E.U. for targeting American companies. Many questions remain in the United States. In order to really address the underlying problems, U.S. lawmakers will need to reform the corporate tax code to truly prevent American companies from engaging in international tax avoidance.


Resources

The New York Times: Apple Owes $14.5 Billion in Back Taxes to Ireland, E.U. Says

CNN Money: Ireland Doesn’t Want $14.5 Billion in Tax From Apple

The Washington Post: How U.S. Companies Are Avoiding $695 Billion in Taxes

The Motley Fool: 10 Best Tax Havens in the World

Yahoo Finance: US Taxpayers Could End Up Covering Apple’s Back Taxes in Ireland

Forbes: These Are the 10 Most Valuable Companies in the Fortune 500

Bloomberg: Pfizer-Allergan Deal May Be Imperiled by U.S. Inversion Rules

The Washington Post: How the E.U.’s Ruling on Apple Explains Why Brexit Happened

Citizens for Tax Justice: American Corporations Tell IRS the Majority of Their Offshore Profits Are in 12 Tax Havens

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Overseas and Undertaxed: How Companies Avoid Paying Taxes appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/how-companies-avoid-paying-taxes/feed/ 0 55341
What Walmart’s Purchase of Jet.com Says About the Retail Industry https://legacy.lawstreetmedia.com/issues/business-and-economics/walmart-purchase-jet-industry/ https://legacy.lawstreetmedia.com/issues/business-and-economics/walmart-purchase-jet-industry/#respond Fri, 09 Sep 2016 16:16:25 +0000 http://lawstreetmedia.com/?p=55273

Walmart's decision to purchase Jet.com illustrates a shift in retail.

The post What Walmart’s Purchase of Jet.com Says About the Retail Industry appeared first on Law Street.

]]>
"Walmart" courtesy of [Mike Mozart via Flickr]

In August, Walmart purchased the online-only retail website Jet.com for $3 billion. Before the sale, Jet.com forecast that it would lose money until at least 2020 as it attempted to establish itself before becoming profitable. That raises the question of why the world’s largest retailer, with an online presence of its own, would decide to buy a fledgling retail site that didn’t plan on making money for several years. Read on for the answer to that question and how it is shaped by the changing retail marketplace, where an online presence is becoming more important than brick-and-mortar stores and is necessary to compete against online behemoths like Amazon and Alibaba.


Walmart and Jet.com

Walmart bought Jet.com for $3 billion in cash plus $300 million worth of shares for people high up in the company. Although Jet.com CEO Marc Lore will take over the online business of both retailers, they will remain separate entities. This is partly to retain Jet.com’s potential appeal to new users, such as millennials, who might view the site differently than they would Walmart. Jet.com’s business model was originally similar to that of major competitors such as Amazon, which charges annual fees for membership while offering special deals and services. However, that plan was scrapped in October 2015, when Jet.com’s discounts became available to everyone at no additional cost.

The hope for many at Jet.com–including its CEO, who also founded another online retail startup that was purchased by Amazon–is that with Walmart’s economies of scale it can ramp up sales and turn a profit sooner than expected. The company also hopes to expand beyond the United States, where it currently operates.

The video below looks at Jet.com’s initial business model and how it sought to compete with Amazon:


The Why

While Jet.com has the potential to improve Walmart’s e-commerce presence and its overall sales, why did Walmart decide to make this move and spend $3.3 billion on a startup that is years away from making a profit? The answer is that the retail market itself is rapidly shifting. In January, Walmart announced that it would be shutting down 269 stores, including 154 in the United States. The Walmart downsizing was just a precursor to a bigger change within the retail industry. In June, another wave of major retailers, including the likes of Macy’s and J.C. Penney, announced huge layoffs of their own. In total, as many as 38,000 jobs have been lost in retail so far this year, second only to the crude oil industry.

All these layoffs point to the fact that the way Americans buy their goods is shifting from brick-and-mortar stores to an online marketplace. Indeed, last quarter the e-commerce market in the United States grew by 15 percent. Walmart is certainly part of this growth, as its online sales grew by 7 percent during that period, but that was slower than the industry as a whole and considerably slower than competitors like Amazon. The pace of Walmart’s online growth has slowed for nine consecutive quarters while its competitors continue to post large gains.

Although Walmart managed to make $14 billion in 2015 through online sales, that was only a paltry 3 percent of its total revenue. While Walmart has made efforts to improve its own online sales–such as expanding the number of products listed and allowing merchants to provide descriptions of their goods–the move to incorporate Jet.com may make sense as it tries to keep up.

While Jet.com’s revenue doesn’t come close to the amount of money made by Walmart’s e-commerce efforts, it has grown relatively quickly, with about 400,000 new customers each month. It has sustained this growth by offering lower prices than Walmart and others like Amazon, linking buyers directly with sellers rather than accumulating a massive warehouse of inventory. Jet.com’s different style of business and the demographics it caters to are likely why Walmart found the company so attractive. The following video looks at why Walmart purchased Jet.com:


Competing Against Amazon (and Alibaba)

Walmart’s overall goal is to compete with or at least challenge Amazon’s dominance in the American e-commerce market. While Walmart made more in total revenue, Amazon made far more in online sales, pulling in $107 billion (including the web-services component) to Walmart’s previously mentioned $14 billion. Not only did Amazon make more in total, but its sales also grew at a far greater rate, 31 percent to Walmart’s 7 percent last quarter. Amazon’s growth was also twice that of the industry as a whole.

Walmart has already taken shots at its online retail rival. On top of recent investments, like its purchase of Jet.com and other efforts to improve its own e-commerce presence, Walmart has also been mimicking some of Amazon’s best practices. Namely, in response to Amazon Prime’s free two-day shipping for members, Walmart announced its own Shipping Pass promotion. Shipping Pass provides Walmart shoppers with the option to buy free two-day shipping for a year at about half the price of an Amazon Prime membership. Unlike Amazon’s offering, however, this deal does not include a wide range of other benefits like video and music streaming.

One thing that Walmart is unlikely to copy is Amazon’s new strategy of opening physical stores. Seemingly running counter to the emerging conventional wisdom, Amazon recently opened a brick and mortar store in Seattle and plans to open another in San Diego. While the exact rationale behind this decision remains unknown, Amazon founder Jeff Bezos told the Wall Street Journal that the company is experimenting with a lot of new ideas to maximize its revenue. As of now, it looks like Amazon is alone in its decision to open additional physical stores.

International Competition

Although Amazon has established itself as the big fish in the U.S. e-commerce industry and in much of the rest of the world, an even greater threat may be emerging in China. There, a company known as Alibaba controls about 80 percent of the market. Walmart has already taken steps to counter this threat with 400 locations in China and it recently partnered with JD.com to expand its influence in the Chinese market. The accompanying video looks at Alibaba’s business model:

Alibaba is slightly less of a threat to Walmart now because it is primarily focused on China. Additionally, its first foray into the U.S. market failed. However, it does hold stakes in other American companies, notably Groupon, and at one point it even owned a stake in Jet.com. The company’s focus in the near term seems to be learning the nuances of the American market, which could make it a serious challenger in years to come.


Conclusion

Walmart’s purchase of a company that planned to lose money for at least the next four years is indicative of the changing reality of the retail industry. Namely, while retail has long been the domain of brick-and-mortar stores, with Walmart leading in sales, things are changing. While physical stores still account for a large proportion of all sales, retailers continue to put an emphasis on e-commerce.

While Walmart is still one of the largest, if not the outright largest, retailers in the world, it lags behind in the e-commerce sphere. When it comes to online sales, it finds itself behind the reigning heavyweight Amazon and up against other stiff competitors, such as Alibaba, in developing markets. This explains Walmart’s recent purchase of the fledgling startup Jet.com and its partnership with other e-commerce sites operating only in China. While $3 billion may seem like a lot, if it gives Walmart an edge online it could be worth much more. But most importantly, this purchase provides more evidence that the future of the retail industry is in online sales. It reflects this trend and illustrates some of the challenges that traditional companies with physical stores may face as they try to adjust.

Currently, none of these three retail titans seems to have outright control over the online market. Walmart appears to be trying to figure out how to fully tap this market as it has the traditional retail industry. Going forward, whichever company most effectively harnesses these trends will likely be the most successful. Walmart’s purchase of Jet.com may reflect its desire to succeed in a market where it has, so far, fallen behind.


Resources

Tech Crunch: Confirmed: Walmart Buys Jet.com for $3B in Cash to Fight Amazon

USA Today: List of the 154 Stores Walmart Is Closing

CNN Money: Layoffs in Aisle 4! Retailers Are Big Job Killers

Wall Street Journal: Wal-Mart to Acquire Jet.com for $3.3 Billion in Cash, Stock

The Street: Walmart’s New Alliance to Take on Alibaba and Amazon

USA Today: Walmart vs. Amazon: Walmart Offers Free Trial of 2-Day Shipping Program

Wall Street Journal: Amazon Plans More Stores, Bulked-Up Prime Services

Forbes: How Alibaba Is Working Toward Establishing Itself in the U.S.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What Walmart’s Purchase of Jet.com Says About the Retail Industry appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/walmart-purchase-jet-industry/feed/ 0 55273
What a Major Insurance Provider Leaving the Obamacare Exchanges Means https://legacy.lawstreetmedia.com/issues/health-science/insurance-company-obamacare-exchanges/ https://legacy.lawstreetmedia.com/issues/health-science/insurance-company-obamacare-exchanges/#respond Thu, 01 Sep 2016 16:07:31 +0000 http://lawstreetmedia.com/?p=55120

The president's landmark health law has some big problems.

The post What a Major Insurance Provider Leaving the Obamacare Exchanges Means appeared first on Law Street.

]]>
Image courtesy of [Wonderlane via Flickr]

In August, Aetna announced that it was dropping its participation in many of the Affordable Care Act exchanges where it had previously provided insurance, citing significant losses as its reason. While this decision may have been made for other reasons, it will have major implications for the president’s landmark legislation, commonly known as Obamacare. The problem goes beyond Aetna, which followed UnitedHealth’s decision to leave the exchanges in 2017–also pointing to significant losses as the motivation for its withdrawal. Read on to find out the full story behind the departures from the Affordable Care Act marketplaces, why companies are pulling out of them, and what implications this may have for the law going forward.


Brief History of Obamacare

Passing and then implementing the Affordable Care Act, or ACA, was no easy feat. A wide range of politicians, including the Clintons, had made efforts to reform American health care over the years, but those attempts largely failed. However, that began to change in 2009, when support for an overhaul began to swell and Democrats held the presidency and both chambers of Congress. Following the untimely death of Senator Edward Kennedy and opposition from Republicans, a complex legislative process involving a filibuster, cloture, and budget reconciliation eventually led to the Affordable Care Act’s passage in March 2010.

The health care law included a number of protections for consumers: it eliminated insurance companies’ ability to deny someone coverage because of a pre-existing condition, eliminated lifetime and annual limits on coverage, and prevented rescission. It also created an appeals process so people could challenge insurance companies’ decisions and allowed children to stay on their parents’ health plans for longer. The law attempted to cater to businesses as well by providing them a time frame to offer coverage to employees and by providing a number of tax credits. It let certain people continue with their existing coverage if purchased before a certain date and encouraged enrollment by taxing those who did not sign up. Importantly, the legislation also allowed states to expand Medicaid coverage, which contributed to a dramatic decrease in the number of people without health insurance.

The ACA was quickly challenged in the courts and, after a long fight, was largely upheld by a Supreme Court decision in 2012. While this was a major victory, it was not the last issue to plague the act. The law has had to endure a number of problems, from the online marketplace not working when it initially launched to repeated, high-profile challenges in the House and Senate. As of February, approximately 12 to 20 million people had enrolled in health insurance either through the marketplace or through coverage expansions such as the one to Medicaid.


Why Providers are Leaving

Aetna is likely leaving the public exchanges for many reasons, but particularly because it posted a $200 million loss on its individual products in the second quarter of 2016. While it is not pulling out of all the states it was operating in, it is leaving 11 of the 15. Aetna’s decision actually leaves one county in Arizona without any exchange coverage at all. Some evidence suggests that Aetna’s decision to leave the exchanges may have amounted to payback for the Justice Department’s efforts to block its merger with Humana. But Aetna argues that the decision was purely based on its recent losses on the exchanges.

The following video looks at Aetna’s pullback from the marketplace in terms of the company’s possible motives and the implications going forward:

The issue might be left at that, but Aetna is not the only company leaving the exchanges. Along with Aetna, UnitedHealthcare and Humana–the company Aetna recently tried to merge with–are both leaving exchanges. On top of these departures are those of smaller providers, including several government-funded carriers. For many of these providers, the biggest problem is demographics. Namely, the people signing up for the program are older and sicker than expected. Some people may also be taking advantage of insurers by waiting until they are sick or need medical help to sign up. These people incur higher costs, which are not yet being balanced out by new, healthier enrollees. Because of these unanticipated developments, insurers have had a hard time setting their prices and, as a result, they are losing money.

Adding insult to injury, the Affordable Care Act is dealing with more than just the loss of insurance providers. In a recent study done by the New York Federal Reserve, one out of every five businesses in that district has reported hiring fewer people because of the law. Additionally, there are now allegations that some healthcare providers are steering patients to Affordable Care Act policies instead of Medicare and Medicaid because they receive higher reimbursements. This would raise costs for insurers because sicker patients end up on the exchanges instead of government-run healthcare plans.


Implications

While it definitely sounds bad, what exactly does the departure of major providers from ACA exchanges mean for the law? For starters, it means there will be a lot less competition in many places. In fact, 36 percent of markets will now have only one provider, up from just 4 percent at the beginning of the year. In five states–Alabama, Alaska, Oklahoma, South Carolina, and Wyoming–there will be only one provider. On top of this, 55 percent of markets will have two or fewer providers, up from 33 percent at the beginning of this year.

The biggest issue here, aside from the fact that one county in Arizona may wind up with no coverage options at all, is that competition was supposed to be an important way to cut healthcare costs. Without a competitive market, insurance providers can offer lower quality service at higher prices because there is no alternative. The accompanying video looks at what the major insurance companies are doing:

The news is not all doom and gloom, however, as other carriers are expanding in certain areas, including Cigna in Chicago and a startup called Bright Health in Colorado, which was founded by former leaders of UnitedHealthcare. Additionally, not all insurers in the Affordable Care Act exchanges are losing money. In fact, many smaller insurers, which have more experience in government healthcare markets like Medicare and Medicaid, are actually thriving. They are succeeding because that experience has helped them operate leaner, government-style plans rather than the more expansive employer-sponsored plans that larger insurers like Aetna are most familiar with.

Even if these large insurers ultimately decide to pull out of the market now, that does not prevent them from reentering in the future. In fact, the opposite may be true: along with its announcement that it was leaving the government exchanges, Aetna also hinted at the possibility of a return when the market is more receptive to its practices.


Conclusion

Republicans have attempted, without success, to repeal all or parts of the Affordable Care Act as many as 60 times since it was passed. While politicians may have been unable to sweep away President Obama’s crowning achievement, the market may yet succeed. Losing another major healthcare provider such as Aetna deals a major blow to the Affordable Care Act, as it decreases competition and calls into question the viability of the entire system.

At least that is how some perceive it. To others, it is simply survival of the fittest, where the companies best equipped to do business in a government exchange are the ones thriving. While insurance giants balk over reported losses, these companies may fill in the gaps and grow their own brands further. Many, including President Obama, believe that the recent difficulties in the exchanges should revive efforts for a public option akin to Medicare or Medicaid. Either way, as competition decreases in many local markets, the system has many issues that need fixing.

The Affordable Care Act is unlikely to go away entirely. Even if insurers continue to leave Obamacare exchanges, the law will have allowed for a dramatic expansion in health care coverage. Instead of revolutionizing the way health insurance is provided to individuals, the Affordable Care Act may end up looking like a traditional entitlement program that made insurance available to more Americans. After all, only 11 million Americans get their insurance through exchanges, while around 150 million have employer-provided plans. But in order to ensure that the marketplaces are viable going forward, more people will need to enroll and insurers will need to return. And that is by no means a simple task.


Resources

The Atlantic: Why Is Aetna Leaving Most of Its Obamacare Exchanges?

CNN Money: Choices Dwindling for Obamacare Customers

MSNBC: On Groundhog Day, Republicans vote to repeal Obamacare

Business Insider: Obamacare Has Gone From the President’s Greatest Achievement to a ‘Slow-Motion Death Spiral’

CNBC: Health Providers May Be Steering People to Obamacare to Get Higher Reimbursement

The Daily Caller: Another Huge Insurance Company Is Leaving Obamacare

eHealth: History and Timeline of the Affordable Care Act (ACA)

Obamacare Facts: ObamaCare Enrollment Numbers

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What a Major Insurance Provider Leaving the Obamacare Exchanges Means appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/health-science/insurance-company-obamacare-exchanges/feed/ 0 55120
Chip Cards: Making Credit Cards Safe Again? https://legacy.lawstreetmedia.com/issues/business-and-economics/chip-credit-cards-safe/ https://legacy.lawstreetmedia.com/issues/business-and-economics/chip-credit-cards-safe/#respond Fri, 26 Aug 2016 18:36:13 +0000 http://lawstreetmedia.com/?p=55000

What's behind the switch to chip credit cards?

The post Chip Cards: Making Credit Cards Safe Again? appeared first on Law Street.

]]>
"Credit Cards" courtesy of [Sean MacEntee via Flickr]

By now, most people have become familiar with the additional hoop they have to jump through when buying things at the store. When you get to the checkout counter you do an awkward dance: do I swipe my credit card, do I put it into the chip reader, do I need my PIN, can I pay with my phone? The United States is currently in the middle of an update to its credit card infrastructure, an update that has been difficult for many consumers to navigate. What’s behind the recent changes and why have they just started now?


History and Background

The technology behind the original magnetic strip card was developed as early as the 1960s, but credit cards didn’t become a mass technology until the 1980s. Before then, their use was largely limited to business people and frequent travelers. The main issue originally holding plastic back was cost. In the United States, transaction costs were relatively low, a few cents each, because transactions were authorized over phone lines. In Europe, however, transactions were not done this way, leading to much higher costs. And using phone lines was actually a step up from the initial way credit cards were authorized: carbon-paper imprints. When that was the case, people could commit credit card fraud simply by digging discarded carbons out of dumpsters.

The video below gives a detailed account of the history of credit cards:

Because the European credit card system was so susceptible to fraud, European companies needed a more secure way to process transactions. Stakeholders got together and began to develop an alternative. This alternative, chip card technology, formally known as EMV (for Europay, MasterCard, Visa), was unveiled in 1994 and became widespread in Europe by 1998. Despite widespread usage of EMV technology in Europe and other parts of the world, it was slow to gain traction in the United States. It was only in 2015 that American companies and merchants began a concerted effort to adopt EMV.


Adoption in the United States

So why did the United States ultimately decide to switch to EMV cards in 2015 when the technology had been readily available since the 1990s? The primary answer is the recent surge in credit card fraud, starting with the massive hack at Target in 2013 in which millions of credit card numbers were stolen. This hack, along with several other high-profile incidents, revealed an uncomfortable truth: companies were trying to secure customers’ data in the 21st century with cards from the 20th. The accompanying video looks at why the United States switched to EMV and what it means for cardholders:

However, the recent change was not the result of a top-down mandate from the government. In fact, the effort was led by a private group of credit card companies including American Express, Discover, MasterCard, and Visa. The 2015 deadline was not a concrete point of no return, but one created by those same companies. While businesses did not have to meet the deadline by law, liability for card-present fraud would shift to those that did not comply with the new technology. In other words, if a company or a bank had not adopted EMV technology by that date and was the victim of fraud, then it was on the hook for the cost.
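
The liability shift can be sketched as a simple rule: for counterfeit card-present fraud, the cost lands on whichever party in the transaction failed to support chip technology. The Python function below is a hedged toy model of that rule as it is commonly described; the card networks’ actual rulebooks cover many more cases, and the function and parameter names are invented for illustration.

```python
# Toy model of the 2015 liability shift (an assumed simplification,
# not any network's actual rulebook). For counterfeit card-present
# fraud, liability lands on the party that did NOT support chip.

def fraud_liability(issuer_supports_chip: bool, merchant_supports_chip: bool) -> str:
    """Return who bears the cost of a counterfeit card-present transaction."""
    if issuer_supports_chip and not merchant_supports_chip:
        return "merchant"  # the merchant skipped the upgrade, so it pays
    if merchant_supports_chip and not issuer_supports_chip:
        return "issuer"    # the bank never issued a chip card, so it pays
    return "issuer"        # both (or neither) compliant: the pre-shift default

# A compliant bank plus a non-compliant store shifts the loss to the store:
print(fraud_liability(issuer_supports_chip=True, merchant_supports_chip=False))
```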

While retailers had until 2015 to comply, automated gas dispensers have until 2017. Likewise, ATMs also have a little extra time: MasterCard and Visa ATMs must make the shift by October 2016 and October 2017, respectively.


Advantages of Chip Enabled Credit Cards

Aside from following Europe’s lead and satisfying the requirements of credit card companies, EMV cards offer a number of advantages over traditional cards. First and foremost is security. Whereas a traditional card carries a single magnetic strip whose static data can easily be copied onto a fake card, EMV cards rely on the chip embedded in them, which creates a different transaction code for each purchase. As a result, even if someone manages to capture that transaction data, the code is unique to a specific transaction and cannot be used again for future purchases.
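
To make that mechanism concrete, here is a minimal sketch in Python. It is an analogy rather than the real EMV algorithm: actual cards derive an “application cryptogram” from issuer-held keys inside the chip’s secure hardware, and every name and value below is invented for illustration. The point is simply that a per-card secret combined with an ever-advancing transaction counter yields a code that changes with every purchase.

```python
import hashlib
import hmac

# Illustrative sketch only: real EMV cards derive an "application
# cryptogram" from issuer-held keys inside the chip's secure hardware.
# The simple HMAC below is a stand-in showing the same idea: a per-card
# secret plus an advancing counter gives every purchase a one-time code.

CARD_SECRET = b"per-card-secret-from-the-issuer"  # hypothetical key

def transaction_code(counter: int, amount_cents: int, merchant_id: str) -> str:
    """Return a one-time code bound to this specific transaction."""
    message = f"{counter}|{amount_cents}|{merchant_id}".encode()
    return hmac.new(CARD_SECRET, message, hashlib.sha256).hexdigest()[:16]

# The counter advances with every purchase, so even identical purchases
# at the same merchant produce different codes:
print(transaction_code(41, 1999, "STORE-123"))
print(transaction_code(42, 1999, "STORE-123"))  # a new, different code

# An issuer that tracks the expected counter rejects a replayed code,
# because the stale counter no longer matches; a stolen transaction
# record therefore cannot fund future purchases.
```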

EMV cards work in two ways. They can be dipped into a machine, where they have to be held for a few seconds longer than it takes to swipe a traditional card. The card can also be held up to a contactless reader; however, these devices are more expensive and less likely to be available, as the technology is only now entering the American market.

Another distinction for this type of card is the actual transaction process. Most of the models in the United States will follow the traditional swipe-or-dip-and-then-sign process. Currently, most new credit cards have both a chip and a magnetic strip in case a store’s technology has not yet been updated, but in the future, cards will likely have only chips. There is an even more advanced version that requires consumers to enter a PIN after dipping the card, but it costs more and is less likely to be seen in the United States anytime soon. In fact, this type has effectively been discouraged, as the major credit card companies that initiated the switch to EMV did not require cards to be “Chip and PIN” models.

The video below from Mountain America Credit Union looks at the chip card and some of its advantages:


Disadvantages of Chip Enabled Credit Cards

Even with the deadlines, adoption of EMV cards has been a slow process. By 2015, only 25 percent of new cards issued were EMV. By the end of 2016, a year after the deadline, that number is projected to reach just 75 percent. While part of this is due simply to technological limitations and the difficulty small banks can have when switching up their technology, there is more to the story than just that.

Namely, the switch will be very expensive. Updating the millions of traditional card readers will cost approximately $7 billion. On top of that is the cost to replace the cards already out there, estimated at another $1.4 billion. Last, there is the cost to replace ATMs and old software so that they can read the new cards, a change that may cost up to $500 million. All told, the transition approaches $9 billion.

Although the chip cards’ chief advantage over traditional cards is safety, they are far from hacker-proof. Now, instead of targeting the cards themselves, hackers can target the machines that read them. Specifically, criminals can drill into card readers, or insert devices into them, that are able to read the information stored on the cards. Using this information, thieves have been able to make counterfeit cards with magnetic strips and use them in places that do not have the new technology.

On top of security is the issue of privacy, as the new cards transmit a large quantity of data. Information, like a person’s present location, may become available if the card is hacked. The cards are also slower to process, and many of the merchants required to make the shift do not understand the technology or its benefits.

Finally, while it is not the fault of the cards themselves, experts suggest that the stiffer security of the new cards will push fraud toward card-not-present channels, such as online transactions. While these cards may improve security for in-person transactions, fraud may simply move elsewhere.


Conclusion

EMV cards have long been popular in Europe and other markets, yet Americans have been resistant. But that resistance crumbled when a series of hacks revealed how easily credit card information could be obtained. The new chip cards do offer advantages, most notably in terms of security. However, they also have a number of disadvantages and will not get rid of fraud altogether.

Not surprisingly, then, even after the adoption deadlines set by the credit card companies, many American merchants have been slow to embrace the new cards. American consumers have likewise been slow to warm to the technology because of its slower transaction times. Nevertheless, EMV technology is likely here to stay and will soon become the dominant form of credit purchasing. There will undoubtedly be a number of hiccups in the short term as the technology’s flaws are exposed. But ultimately, those same flaws are likely to be addressed and, in the end, just might make the average American’s wallet a safer place.


Resources

Iovation: The History of Credit Cards and How EMV Will Change Things

USA Today: Where is the EMV Card 10 Months Later?

CreditCards.com: 8 FAQs About EMV Credit Cards

Nerd Wallet: What Are the Downsides to EMV Technology?

Payments Source: EMV Tech in the U.S. Is Still Too Slow and Expensive

Gizmodo: How Criminals Can Easily Hack Your Chip & Pin Card

Computerworld: EMV Smartcards Offer Security Benefits Even Without PIN, Visa Says

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Chip Cards: Making Credit Cards Safe Again? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/chip-credit-cards-safe/feed/ 0 55000
How Close are we to Driverless Cars? https://legacy.lawstreetmedia.com/issues/technology/close-driverless-cars/ https://legacy.lawstreetmedia.com/issues/technology/close-driverless-cars/#respond Sun, 14 Aug 2016 21:05:06 +0000 http://lawstreetmedia.com/?p=54624

Where are we now and what's left to figure out?

The post How Close are we to Driverless Cars? appeared first on Law Street.

]]>
"Google Self-Driving Car" courtesy of [smoothgroover22 via Flickr]

Driverless cars are all the rage in the auto industry lately, as efforts have been underway for years to move away from the outmoded, human-driven gas guzzlers of the 1990s and early 2000s. The rosy image of the future of cars took a hit recently, however, when a man was killed while riding in his Tesla in autopilot mode. The fatality immediately raised concerns over safety and the need for regulation, concerns that had been relatively muted in the quest to go driverless. Read on to find out more about self-driving cars, how they are being regulated, and how close we are to a society where your car does the driving for you.


The History Behind Driverless Cars

While history has seen many driverless vehicles, from boats to rockets, driverless cars were long something seen only in science fiction. Although the idea began drawing increasing interest around World War II, the concept was still in its infancy. Some actually believed the future wasn’t in driverless cars, but in safe highways that could guide cars like trains on a track. The movement for driverless cars really began to gain momentum with artificial intelligence enthusiasts during the 1960s, who dreamed of developing driverless cars by the new millennium. Several prototypes were developed starting in the late 1970s, but the main challenge–building a vehicle that could not only drive but react to conditions in real time–remained unsolved.

The industry got a boost when the U.S. military became involved in 2004, initiating a competition to develop driverless cars for use on the battlefield. Since then, driverless vehicles have popped up all over, including on farms and in mines and warehouses. The late 2000s and early 2010s also saw self-driving cars become a more plausible option for the average commuter, particularly when Google began its foray into the industry. Along with Google, the other big name in the driverless car industry is currently Tesla Motors, while many other car companies have autonomous car projects of their own. The accompanying video looks at the history of driverless cars and their potential future:


Impact of Driverless Cars

Many carmakers are already fretting about the future of their industry. With the rise of affordable car-sharing services like Uber and Lyft, getting around without owning a car has become much easier for people who live in cities. This has led major car companies to invest in these services. Car companies are also investing in driverless cars, which could present another possible major disruption to the status quo of their industry.

Forecasts suggest that if and when driverless cars become widespread, likely sometime in the next couple of decades, families will actually own fewer vehicles. That is because a car that can operate by itself can fulfill more functions, more efficiently, than a traditional car. Many people may even forgo ownership and use car-sharing services instead.

It’s also important to consider driverless cars’ impact on the fusion of two industries. Namely, many of the largest and most well-known names in the auto and tech industry are teaming up to create the cars of tomorrow. The idea behind these collaborations is each industry is lending what it does best. In the case of the auto industry that is large scale production of vehicles that are safety and emissions compliant. For tech companies, that is developing the software, not only to allow for driverless cars but also for functions associated with computers or smartphones today.


Safety Concerns

The push toward driverless cars has continued even as the potential for danger remains high. While the accident in May that resulted in the death of the driver was the first fatality related to a self-driving car, it was certainly not the first accident. In fact, driverless cars are between two and five times as likely to get into accidents as cars with drivers, although that range depends on whether unreported accidents in regular cars are included. These numbers are also skewed by the small number of driverless cars compared to the large number of traditional cars. Nonetheless, while driverless cars are more likely to be in accidents, until the recent fatality all of those accidents had resulted only in minor injuries, because the driverless vehicles were traveling slowly at the time.

The following video details the first fatality in a driverless car:

Although the accident rate for driverless cars is higher, nearly all accidents between driverless cars and human drivers are the human’s fault. In fact, it was not until this year that the actions of a driverless car first caused a traffic accident. This raises the question: why are driverless cars getting into more accidents when they are less likely to cause them in the first place? Ironically, the problem is that driverless cars are programmed first and foremost to always obey the laws of the road. In a busy intersection or on a highway, however, humans often disregard the rules and drive as necessary. The conservative approach taken by driverless cars actually increases their accident rate, as the toy model below illustrates.
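
The numbers and threshold logic in the sketch below are assumptions invented for illustration, not any manufacturer’s actual control code. It simply shows that a policy which only accepts rulebook-sized gaps will balk in traffic where surrounding humans routinely accept smaller ones, which is how an obedient car ends up getting rear-ended.

```python
# Toy model of rule-following vs. human-style gap acceptance
# (assumed numbers, not any vendor's real logic).

LEGAL_GAP_SECONDS = 3.0   # assumed gap the rulebook demands before merging
HUMAN_GAP_SECONDS = 1.2   # assumed gap nearby human drivers will accept

def will_merge(gap_seconds: float, conservative: bool) -> bool:
    """Decide whether to take a gap under the chosen driving policy."""
    threshold = LEGAL_GAP_SECONDS if conservative else HUMAN_GAP_SECONDS
    return gap_seconds >= threshold

# In dense traffic most gaps are short, so the conservative car keeps
# waiting while the human behind it, expecting it to go, may rear-end it.
gaps = [1.4, 1.8, 2.2, 2.6]
print([will_merge(g, conservative=True) for g in gaps])   # all False
print([will_merge(g, conservative=False) for g in gaps])  # all True
```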

The inability to overcome the human element has caused the two leading companies in the driverless car industry, Google and Tesla, to pursue different approaches to the same goal. Based on observations made by Google engineers during an initial test phase–where drivers quickly came to trust the self-driving cars and stopped paying attention–Google decided to take a slower approach with the goal of creating a car that is 100 percent driverless. Tesla, conversely, embraces a notably different method. Tesla’s cars already have many autonomous features; however, the company instructs its drivers to keep their hands on the wheel while riding. Tesla argues that this approach will allow the company to collect enough data to improve its technology on a shorter timeline.

Perhaps the most significant safety issue, though, could affect cars with drivers and driverless models alike. This issue is the threat of hacking. With the amount of technology in modern cars and the many ways that they can connect to the internet, hackers are now able to take over a person’s vehicle.

The video below looks at the threat of car hacking:


Debate and Regulation

The high rate of accidents and the recent fatality have naturally intensified the debate over whether driverless cars are safe enough to traverse American roads. While autonomous features are commonly found in cars already, fully driverless cars are a work in progress. Getting driverless cars to the point where they are demonstrably safer than normal cars, however, could take hundreds of years’ worth of test driving. It becomes a question of how safe is safe enough, and whether a crash in a driverless car is worse than a crash in a normal car.

While regulators are trying to step in and set up a framework for driverless cars, they too are uncertain about how best to regulate the industry. California, home to Google and most major tech companies, is currently the epicenter of the driverless car industry. In California, it is still illegal for driverless cars to operate on public roads without a licensed driver able to take the wheel at any moment. The main problem with these regulations, at least for the carmakers, is the requirement that cars have pedals and a steering wheel–features that driverless car makers like Google want to get rid of. California was actually the second state to authorize self-driving cars for testing and public use. The first was Nevada, which has much looser regulations. Michigan has also authorized the use of driverless cars.

However, to avoid the varying state standards that have already popped up, many carmakers are anxious for national rules. Such rules may come sooner rather than later, as the Department of Transportation aimed to have a nationwide standard for regulations outlined within six months of its January announcement. In July, Transportation Secretary Anthony Foxx announced some progress on rulemaking and outlined steps going forward. However, many questions about regulation remain unanswered.


Conclusion

Driverless cars are a relatively new and exciting technology. Like any new advancement before them, there is an inherent risk involved, especially at the early stages. However, that risk has to be controlled in order to ensure driver safety.

While driverless cars excite the imagination, they still have a long way to go before they are adequately safe and regulated. This will not be an easy transition, as it means people will have to embrace giving up control at 70-plus miles per hour. It also comes at a time when everything, including cars, is vulnerable to online attacks. Nevertheless, driverless vehicles appear to be an important next step in transportation technology. Even if they suffer several growing pains along the way, a car where everyone rides shotgun is likely the car of the future.


Resources

New York Times: Self-Driving Tesla Was Involved in Fatal Crash, U.S. Says

Computer History Museum: Where to? A History of Autonomous Vehicles

Los Angeles Times: Tesla and Google are Both Driving Toward Autonomous Vehicles. Which is Taking the Better Route?

The Economist: The driverless, Car-Sharing Road Ahead

USA Today: Study: Self-driving cars have higher accident rate

Wired: Google’s Self-Driving Car Caused Its First Crash

Bloomberg Technology: Humans Are Slamming Into Driverless Cars and Exposing a Key Flaw

Los Angeles Times: Is the World Ready for Driverless Cars? Are Driverless Cars ready for the World?

Governing: When Regulating Self-Driving Cars, Who’s at the Wheel?

Wired: The FBI Warns That Car Hacking Is a Real Risk

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post How Close are we to Driverless Cars? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/technology/close-driverless-cars/feed/ 0 54624
The Coup That Wasn’t: Inside Turkey’s Failed Military Takeover https://legacy.lawstreetmedia.com/issues/world/turkeys-failed-military-takeover/ https://legacy.lawstreetmedia.com/issues/world/turkeys-failed-military-takeover/#respond Thu, 04 Aug 2016 17:11:40 +0000 http://lawstreetmedia.com/?p=54532

What's next after the chaos?

The post The Coup That Wasn’t: Inside Turkey’s Failed Military Takeover appeared first on Law Street.

]]>

On the night of Friday, July 15, while President Recep Erdogan was on vacation, members of the Turkish military attempted a coup. The effort involved members of several branches of the Turkish military and was only thwarted after the president took to social media to call on the people to rise up and protect the existing government. Although Erdogan was able to fend off the challenge to his rule, the history behind the coup attempt and Turkey’s significance, both in the fight against ISIS and in Europe’s refugee crisis, cannot be overstated.

Read on to find out more about the coup itself and what it would mean if such an attempt was successful both in Turkey and throughout the region.


The Coup in Turkey

The coup started late on a Friday night when tanks dispersed into the Turkish capital of Ankara, passage across the Bosporus Bridge was blocked, and soldiers took to Taksim Square in Istanbul claiming the elected government was illegitimate and that the military had taken over the country.

However, before the military could completely seize power, President Erdogan did an interview with CNN Turk. Speaking over FaceTime, Erdogan urged citizens to stand up to the coup and protest. This proved to be a catalyst for action, as many Turkish people took to the streets and faced down the military. By the time Erdogan landed in the early morning hours of Saturday, the coup was over and his administration was back in power. In all, nearly 300 people were killed and an additional 1,400 injured.

The video below details the failed coup:

Some History

The recent attempted coup was far from the first effort by the military to exert control over the country. Since 1960, three military coups have taken place and a fourth movement led by the military effectively forced out a sitting government in 1997. Although military coups take on the image of power-mad army officers bursting into cabinet offices, Turkey’s case is slightly different.

That is because the Turkish military has long served, at least in its own eyes, as the protector of the modern state of Turkey, which was founded by Mustafa Kemal Ataturk in 1923. As this earlier Law Street article on the history of Turkey illustrates, the military has played a crucial role in the development of the modern Turkish state. Chief among the army’s self-imposed responsibilities is keeping the country secular and free of the religious sentiment that has gripped many Middle Eastern countries to its south.

The following video looks at the history of coups in Turkey:

In the most recent coup attempt, the army officers in charge seemed to be rebelling against President Erdogan himself. Erdogan has won a series of elections, each time consolidating more power for himself while neutralizing and even arresting his opponents.

While President Erdogan himself has blamed Fethullah Gulen, his former ally who now lives in Pennsylvania, Erdogan’s opponents cite his disregard for laws and the constitution. Erdogan is now in the process of seeking Gulen’s extradition from the United States, but the U.S. government has remained relatively resistant to his request.


The Aftermath

In the aftermath of the failed coup, many outside observers worried, and some even warned President Erdogan against using it as a justification to eliminate his rivals and further consolidate his power. These fears quickly seemed to be coming to fruition with Erdogan’s crackdown to purge from the government and military anyone he suspected of involvement in the coup attempt. It started with the military, as thousands of personnel, including over a hundred generals and admirals, were detained. After that, it spread to educators, government officials, and members of the judiciary who allegedly had ties to the coup plotters as well.

The following video looks at the aftermath of the failed coup:

President Erdogan also targeted members of the media who have been critical of him in the past. Many of these arrests have come with little or no evidence of wrongdoing. Amnesty International recently reported concerns that detainees were being beaten, tortured, and even raped while in custody.

This is hardly the image of democracy triumphing over a military dictatorship that Erdogan trumpeted after the coup failed. Following the coup, Erdogan declared a state of emergency across the country, dramatically expanding the authority of the president with little oversight from the Turkish Parliament.


A Crucial Time for the West

The outcome of Turkey’s attempted coup matters so much because Turkey is a central actor in two of the biggest events currently affecting the Western world. First, there is Turkey’s role in fighting ISIS and within the larger Syrian conflict.

Turkey is currently in a particularly complicated position when it comes to Syria. While it plays a large role in facilitating U.S. airstrikes against ISIS, Turkey is fighting Kurds within its own borders. The Kurds have been central to efforts to retake territory from ISIS, and Turkey’s domestic conflict with the ethnic group has complicated its role in the larger fight. Turkey has also been supporting several rebel groups fighting the Assad regime in Syria. So far, some have criticized Turkey’s level of engagement in the fight against ISIS, as many hoped it would take on a larger role after ISIS carried out a string of bombings in multiple Turkish cities, including one at the Istanbul airport.

However, that outlook may change following the coup. Lately, Turkey has been refocusing inward, purging its own military ranks of many officers suspected in the coup. This has the side effect of reducing Turkey’s capacity to fight. So far, Turkey has been an important U.S. ally in the fight against ISIS by serving as an airbase for the United States. However, Erdogan and many Turkish officials have started to argue that the United States played a role in the recent coup attempt. If relations between the two countries begin to sour–particularly if a battle to extradite Fethullah Gulen erupts–then U.S. efforts to fight ISIS could face significant setbacks. Lastly, Turkey is home to some of NATO’s nuclear weapons, making political instability there even more concerning.

In addition to its role in the fight against ISIS, Turkey plays a crucial role in the international effort to deal with the refugee crisis. Turkey hosts the largest population of Syrian refugees in the world, with some 2.5 million living there. In a deal with Europe earlier this year, Turkey promised to do its best to keep refugees within its borders in exchange for more than $3 billion in aid as well as a promise to reconsider Turkey’s candidacy for EU membership. The deal, however, was also contingent upon Turkey improving its human rights practices, which the recent crackdown will likely call into question.


Conclusion

In the aftermath of the failed coup in Turkey, chaos reigned. First, it was very unclear who actually led the coup. While it appears to have been a coordinated effort by many in the military, no central figure ever came forward to claim responsibility, which may be another reason why it failed. Some speculate that the United States may have been behind the coup, training dissidents and allowing Gulen a safe haven to denounce Erdogan’s government. Other reports suggested Erdogan himself may have been behind the poorly planned insurrection, as it gave him cover to finally purge many of his foes from the government and military.

It remains unlikely that we will know the full story behind the coup anytime soon. What is indisputable, though, is Turkey’s significance to European Union, NATO, and U.S. operations. While the United States may not condone Erdogan’s subsequent power grab or the methods of his crackdown, he has been a strong ally for the most part. For now, it appears as though the West and Turkey will need to work together, but if instability continues or worsens, that cooperation could face serious challenges.


Resources

CNN: Turkey Coup Attempt: How a Night of Death and Mayhem Unfolded

Al-Jazeera: Timeline: A History of Turkish Coups

Law Street Media: Turkey: A Country Perpetually at a Crossroads

Politico: What Caused the Turkish Coup Attempt?

RT: Turkish Prosecutor Claims CIA, FBI Trained Coup Plotters

Al-Monitor: Was Turkey’s Coup Attempt Just an Elaborate Hoax by Erdogan?

Time: Turkey’s President Is Using the Coup Attempt to Crack Down on the Media

Reuters: Turkey Dismisses Military, Shuts Media Outlets as Crackdown Deepens

BBC: Turkey Coup Attempt: Crackdown toll passes 50,000

PRI: Turkey’s Coup Failed, but it Can Still Hurt the Fight Against ISIS

Vox: Turkey’s Failed Coup Could Have Disastrous Consequences for Europe’s Migrant Crisis

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Coup That Wasn’t: Inside Turkey’s Failed Military Takeover appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/turkeys-failed-military-takeover/feed/ 0 54532
Breaking Down Brexit: What the U.K.’s Decision Means for Itself and the World https://legacy.lawstreetmedia.com/issues/world/breaking-brexit-uks-decision-means-world/ https://legacy.lawstreetmedia.com/issues/world/breaking-brexit-uks-decision-means-world/#respond Mon, 01 Aug 2016 16:58:41 +0000 http://lawstreetmedia.com/?p=54020

What does Brexit mean going forward?

The post Breaking Down Brexit: What the U.K.’s Decision Means for Itself and the World appeared first on Law Street.

]]>
"Brexit" courtesy of [freestocks.org via Flickr]

On June 23, the United Kingdom held its long-awaited vote on whether or not to stay in the European Union. In a somewhat surprising development, the more than 30 million people across the U.K. who voted chose, by a narrow majority, to leave the European Union. In the end, Leave won with 52 percent of the vote to Remain’s 48 percent, in an election with the nation’s highest voter turnout since 1992.

While the debate over whether to leave the union generated acrimony between the two sides, it also held the potential to have a much larger impact on the world at large. Read on to find out more about the United Kingdom’s exit from the European Union, nicknamed Brexit, the immediate impact on the nation, and the possible regional and global ramifications that may still play out.


The United Kingdom and the European Union

The European Union has its origins in the European Coal and Steel Community, an agreement made between six countries, notably including France and Germany, following World War II in an effort to prevent future wars. The agreement quickly evolved into the European Economic Community in 1957, furthering ideas such as free trade and free movement, which serve as the basis of the EU today.

Britain was at first hesitant to join, seeing itself as above the union and on par with the great post-war powers such as the United States and the Soviet Union. However, following sluggish economic growth in the 1960s, Britain eventually reached out about joining. Britain finally joined in 1973, but in 1975, almost immediately after joining, the country held its first referendum on whether or not to stay in the union. In that case, the Remain vote was overwhelming.

Despite the positive referendum results, Britain’s two major political parties, Conservative and Labour, took turns decrying the EU and suggesting an exit during the 1970s and 1980s. Ultimately, though, the nation remained, with some caveats, such as not buying into the union’s single currency. Support for the union increased and remained steady within British ruling politics throughout the 1990s and early 2000s. Things turned onto their irrevocable course in 2005, when David Cameron assumed leadership of the Conservative Party.

Cameron had incorporated Euro-skeptics into his winning coalition and thus had to agree to policies that began distancing Britain from the EU. That move was combined with the rise of anti-immigration sentiment, anti-EU parties, and the EU’s own economic decline following the Great Recession. As part of his most recent election victory in 2015, Cameron promised a referendum on Britain’s EU membership, which ultimately led to Brexit.


Brexit

Clearly, the Brexit vote was a long time in the making, as Britain seemingly always had one foot out the door. The argument took two sides. Those who opposed exiting the EU believed that Britain, as a small island nation, needed to be part of a larger unit to continue to enjoy economic success and to remain secure. Conversely, those campaigning against the EU decried the perceived growing overreach from Brussels (where EU institutions are located), which they contended threatened Britain’s very sovereignty.

The Remain camp was led by then Prime Minister David Cameron, who essentially staked his reputation and political career on voters deciding to remain in the European Union. Within the U.K., Cameron was supported by most of his own Conservative Party, the opposing Labour Party, the Liberal Democrats, and the Scottish National Party. Globally his coalition was strengthened by notable world leaders including German Chancellor Angela Merkel, Chinese President Xi Jinping, and President Barack Obama. Most major businesses and prominent economists also supported staying in the union.

The opposition was headed by the United Kingdom Independence Party (UKIP), then led by Nigel Farage. Supporting the Leave campaign were members of Cameron’s own party, including Boris Johnson and Michael Gove. Those in favor of exiting the European Union were also endorsed by far-right parties across Europe, including in France, Germany, and the Netherlands. To learn more about the recent rise of right-wing, nationalist groups in Europe, check out this Law Street explainer.

To formally leave the European Union, the U.K. must invoke Article 50 of the Lisbon Treaty, which was signed in 2007. According to Article 50, the U.K. will have up to two years to negotiate the conditions of its exit with other EU members, covering everything from trade to immigration. Experts, however, contend the negotiations could take much longer. No one is entirely certain how the process will work out–the U.K. is the first country to leave the EU–and until the negotiations are complete, conditions will remain as they are currently. The video below looks at the consequences of Brexit:


The Fallout

Although no one knew for sure what exactly the impact would be if the United Kingdom voted to leave the European Union, many predicted it would be unfavorable. The speculation seemed to become a reality both economically and politically for the island nation.

While consumer spending has remained relatively flat, a number of other indicators suggest not all is well. This starts with the British pound, which quickly lost one-tenth of its value against the dollar, and the FTSE 250, a domestically focused British index, which has also lost significant value. Additionally, hiring has gone down, while unemployment may be increasing. The quagmire is further complicated by business investment, which has also been shrinking. Even the hope that a weaker pound would lead to more travel seems quelled, as inflation is rising faster than tourism.

Britain is not only struggling economically but politically as well. Following the Brexit vote, then Prime Minister David Cameron, who had wagered his career on remaining in the European Union, resigned. This move was followed by a wave of uncertainty as the main opposing party to Cameron, the Labour Party, dealt with a leadership challenge of its own and two of the major candidates for the Prime Minister position dropped out of contention.

While Theresa May ultimately assumed control of the Conservative Party, her new cabinet is a hodgepodge of those who favored remaining in the EU and those who backed Brexit, including Boris Johnson, who had recently dropped out of contention for the role of Prime Minister. And although the Conservative Party remains in flux, the Labour Party has descended into disarray, with its leader refusing to step down despite a no-confidence vote, leading to an internal struggle.


Regional Impact

Aside from what occurred in England is what has happened, and what might happen, within the United Kingdom at large. Although England and Wales both voted to leave the European Union, Scotland and Northern Ireland voted by greater majorities to stay. This might matter less if these were merely different states within one country, but they are in fact distinct countries joined in a political union.

After all, it was only in 2014 that Scotland voted narrowly to stay in the United Kingdom. It is unsurprising, then, that Scotland’s first minister has now floated the idea of holding a second referendum on Scottish independence following Brexit as a way to keep the country within the EU. Scotland is also likely to suffer more economically than the rest of Britain, as it relies on oil sales, already hampered by low prices, for a large portion of its economic output.

Along with a potential second Scottish referendum, some even want Ireland to hold a vote on unification following Brexit; however, that idea was quickly shot down by Northern Ireland’s first minister and seems much less likely. Even the tiny British territory of Gibraltar will be affected. Situated on the southern tip of Spain, Gibraltar faces the threat of greater Spanish incursion with Britain leaving the EU. The following video looks at the impact of Brexit on Northern Ireland and Scotland:

Impact on the United States

In the United States, the impact has been relatively subdued. While it remains to be seen how Brexit will affect the close relationship between the United States and Britain, as well as the European Union at large, the economy was the first to feel the brunt of the decision. Following Brexit, U.S. stocks plunged for two straight days before rebounding and actually reaching record highs a few weeks later. Since then, the effects of Brexit in the United States have been portrayed as negligible, with the Federal Reserve still planning to go ahead with at least one interest rate increase this year–something it would be unlikely to do if the economy were believed to be in real financial danger. The accompanying video looks at some of the potential ramifications of Brexit for the U.S.:


Conclusion

The United Kingdom never seemed fully committed to the European Union, and when the EU’s downsides started to outweigh its advantages in the eyes of British citizens, it was deemed time to leave. The impact of this decision has been swift, with economic consequences spanning the world. But the true extent of the damage, and even what leaving the EU will mean for the U.K., will take years to sort out.

While much of the blame for this decision rests on British politicians, they are not solely at fault. The Brexit vote was the culmination of a much larger pattern across Europe and may even have parallels in the United States. In the U.K., politicians turned to advocating nationalism and a refocusing of government policy inward rather than abroad. This was only exacerbated by the mass migration crisis gripping the continent. The decision, however, was also the product of a union stuck in a proverbial purgatory: too united in some regards and not united enough in others.

Lastly, the European Union may still face some challenges to the way in which it creates rules for member states–has the process become too top-down, with little bottom-up influence? Certainly in the case of the Brexit vote, citizens at the lowest level voted to topple the existing order and cast the futures of many parts of the world into question. While Britain’s exit may now be unavoidable, this is a good opportunity for pause both for the EU and the U.K., to consider how decisions are made and how to avoid future independence movements or bouts of fragmentation.


Resources

BBC News: The U.K.’s EU Referendum: All you need to know

European Futures: How Did We Get Here? A Brief History of Britain’s Membership of the EU

The Telegraph: Theresa May Pledges to Save the Union as Nicola Sturgeon Promises Scottish Referendum Vote to EU Nationals

The New York Times: ‘Brexit’: Explaining Britain’s Vote on European Union Membership

Law Street Media: Right-Wing Groups in Europe: A Rising Force?

The Economist: Straws in the Wind

NBC News: Brexit Fallout: Gibraltar Worries About Spain’s Next Move

The Financial Times: A Tempest Tears Through British Politics

The Week: What is Article 50 of the Lisbon Treaty?

Bloomberg: Two More Fed Officials Play Down Brexit Impact on U.S. Growth

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Breaking Down Brexit: What the U.K.’s Decision Means for Itself and the World appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/breaking-brexit-uks-decision-means-world/feed/ 0 54020
Did the Program Meant to Rescue the VA Healthcare System Make it Worse? https://legacy.lawstreetmedia.com/issues/health-science/fixing-fix-program-meant-rescue-va-system-made-worse/ https://legacy.lawstreetmedia.com/issues/health-science/fixing-fix-program-meant-rescue-va-system-made-worse/#respond Wed, 25 May 2016 20:25:42 +0000 http://lawstreetmedia.com/?p=52591

Veterans still have serious problems getting healthcare.

The post Did the Program Meant to Rescue the VA Healthcare System Make it Worse? appeared first on Law Street.

]]>
"department of veterans affairs" courtesy of [Ed Shipul via Flickr]

Two years ago, Congress created a new program, the Veteran’s Choice Program, to fix the well-publicized problems facing health services at the Department of Veterans Affairs, known as the VA. These problems ranged from poor care to waits to see a doctor so long that a person could end up dead before being told why they were dying. Two years later, the program meant to put an end to these issues is experiencing the same problems, and the system may be even worse off than it originally was.

How did the VA healthcare system get to this point? Read more to find out how the VA system was originally crafted, and the issues it faced and continues to face as those in charge search for answers.


The VA System

Since the beginning of warfare, disease–not combat itself–has been the number one killer of soldiers. With that in mind, the United States has offered benefits of some kind to veterans going all the way back to the Revolutionary War. While the system still serves veterans of wars long over, it became more codified in 1930 when President Hoover created the Veterans Administration. At the time of its inception, the system had 54 hospitals, served 4.7 million veterans, and employed 31,600 people. Over the following years, a number of other agencies were created, including the Board of Veterans Appeals in 1933, the Department of Medicine and Surgery in 1946, and the Department of Veterans Benefits in 1953. All of these departments were eventually organized under the singular umbrella of the VA, which was also made a cabinet-level department in 1989.

Over the years, the system has grown into the massive department it is today. Now, the Veterans Health Administration operates with an annual budget of $59 billion. This budget covers a lot; according to the VA, it funds “150 medical centers, nearly 1,400 community-based outpatient clinics, community living centers, Vet Centers and Domiciliaries.” The system also employs over 305,000 health care professionals. On top of this, the VA is the largest medical training system in the United States, serving the most graduate-level students and contributing greatly to continued medical research and discovery. This includes 76,000 volunteers, 118,000 trainees, and 25,000 faculty.

Overall, this massive system serves over 9 million veterans in the United States. Under VA guidelines, once enrollment is initiated, veterans undergo a means test to determine whether they are a priority and whether they can afford the co-pays. Once these steps are completed, veterans are supposed to see a doctor within 14 days if they are new patients and within 14 to 30 days if they are existing patients.


Problems with the VA

The issues plaguing the VA primarily center on wait times. This concerns one of the three branches of the VA system, namely the Veterans Health Administration. The other two primary branches deal with benefits and burials for veterans. The VA scandal involved a variety of issues, but wait times and the difficulty many veterans have in merely accessing medical care garnered most of the public’s attention.

Some veterans have had to wait longer than 125 days to see a doctor, a stark contrast to the 30 days required by the system. In facilities across the country, there have been allegations that administrators falsified records to make it appear as though patient wait times were not longer than required. It had gotten so bad that some may have even died while waiting; however, due to record-keeping issues, we don’t know exactly how many veterans with pending records were actually waiting for care when they died.

These complaints were not isolated to just one or a few places, either: locations in Phoenix; Fort Collins, Colorado; Miami; Columbia, South Carolina; and Pittsburgh, to name a few, all reported problems. There were also issues with claims, especially as more Vietnam veterans were included in disability coverage. Claims have no time limit and can be filed at any point. The primary backlog that concerns most observers is not for decisions on claim appeals, but for the initial claim decisions themselves. These issues were severe enough that the head of the VA resigned in 2014 after the extent of the scandal became known.

Two years after the initial reports broke, results are still not much better for the VA system. This year there have again been reports from states about inaccurate wait times, cost overruns, poor care, and the refusal to discipline the employees responsible for it.

The following video looks at the scandal with the VA system:


The New System and Lingering Issues

In an attempt to solve the problem, Congress created the Veteran’s Choice Program. At a cost of $10 billion, this program was supposed to put an end to the problems facing the VA system, particularly long waits to see a doctor. Under the program, eligible veterans are able to get healthcare from nearby medical centers rather than traveling to VA facilities if wait times or distance are an issue. However, instead of helping, the effort has by many measures made things worse. Wait times have actually increased under the new program, though according to the VA that is in part because so many veterans are trying to use it. In some places, veterans were never referred to the program, or the doctor they were assigned to see was too far away.

Based on the system’s structure, the patient, not the provider, had to initiate appointments. However, that wasn’t entirely clear to everyone involved, and many veterans were left waiting for calls to schedule appointments. And even in cases where veterans were able to schedule an appointment and see a doctor, the Choice Program has a long backlog of payments that prevents doctors from being paid on time. Doctors have reportedly waited 90 to 180 days, after a long claims process, simply to get paid for their services. The situation got so bad that thousands of veterans referred to the new program ended up going back to the traditional VA system because it was more efficient.

Why Isn’t it Working?

So how has the new system that was meant to address these problems only exacerbated them in many cases? The answer starts with how the program was set up in the first place. The program’s basic tenet was to give veterans care faster and closer to home: specifically, if patients had to wait more than a month for an appointment or drive over 40 miles to the nearest VA facility, they would be eligible. But the system has failed to live up to those promises, largely because of how quickly the program was created and implemented.

Namely, once Congress approved funding and the president signed the Choice Program into law, the VA was given only 90 days to implement it. This was a program that would affect millions of veterans, hundreds of thousands of medical professionals, and the families of both. The deadline was so short, in fact, that the VA quickly excluded itself from the process because it knew it would be unable to meet the requirements. This forced the agency to look to private industry. However, most private companies were also turned off by the 90-day timeline. While the VA was ultimately able to settle on two organizations, they have been scrambling to build the requisite network of healthcare professionals and still rely on the VA for referrals, leading to the delays. The system proved too complicated and difficult to use for everyone involved, from veterans to doctors and VA administrators.

The accompanying video looks at the problems with the Choice Program:


Conclusion

When the full extent of the VA scandal came to light two years ago, everybody agreed that the system was broken and needed to be fixed, fast. However, this is not the type of system that can be repaired and streamlined in just a few months. Unsurprisingly, the quick fix has turned into a disaster in need of a fix of its own. So what is the appropriate action moving forward?

Some have called for a total dismantling of the VA healthcare system as it is known today. Instead of providing care directly to veterans, the new system would simply pay for their care. However, critics are quick to denounce a system that would leave veterans to their own devices. It does seem unlikely that an organization as sprawling as the VA will be torn down completely. Consequently, more internal reforms are likely. While the situation is in dire need of a solution, new fixes should not be rushed. Lawmakers will need to create a system that works well and gives veterans the care they need, when they need it.


Resources

NPR: How Congress and The VA Left Many Veterans Without A ‘Choice’

NPR: For The VA’s Broken Health System, The Fix Needs A Fix

U.S. Department of Veterans Affairs: 10 Things to Know About the Choice Program

House Committee on Veterans’ Affairs: History and Jurisdiction

U.S. Department of Veterans Affairs: Veterans Health Administration

The Washington Post: Everything You Need to Know About the VA–and the Scandals Engulfing it

The Washington Times: VA Still Plagued by Problems Two Years After Scandal

The Military Advantage Blog: Care Commission Shocker: The Push to End VA Healthcare

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Did the Program Meant to Rescue the VA Healthcare System Make it Worse? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/health-science/fixing-fix-program-meant-rescue-va-system-made-worse/feed/ 0 52591
Trouble Below: The Problems Plaguing the Washington D.C. Metro https://legacy.lawstreetmedia.com/issues/business-and-economics/trouble-problems-plaguing-washington-d-c-s-metro/ https://legacy.lawstreetmedia.com/issues/business-and-economics/trouble-problems-plaguing-washington-d-c-s-metro/#respond Tue, 24 May 2016 20:19:23 +0000 http://lawstreetmedia.com/?p=52475

The long list of failures that got us to where we are today.

The post Trouble Below: The Problems Plaguing the Washington D.C. Metro appeared first on Law Street.

]]>
"The DC metro" courtesy of [urbanfeel via Flickr]

Transportation Secretary Anthony Foxx recently promised to close the Washington D.C. Metro system unless it complied with safety requirements. How did it get to the point where the nation’s second-busiest mass transit system is on the verge of being shuttered over safety concerns and a series of mishaps?

Read on to find out more about the history of the Washington D.C. metro system, how safety concerns have been postponed, the recent spate of issues leading up to its current situation, and the future of the Metro in this uncertain climate.


The History of the Metro

The Washington Metropolitan Area Transit Authority, known locally as the Metro, was an ambitious project dreamed of by local residents going back to the beginning of the 20th century. Residents hoped for a public transit system on par with those of other major northeastern cities. That hope finally started to come to fruition in 1959, when the first concrete plans for a rail system in the nation’s capital were drafted. After a bill allowing its construction passed Congress in 1965, work was under way.

Initially, the system was meant to serve only the capital so as not to compete with new freeways; however, it was later expanded into parts of the neighboring states of Maryland and Virginia as well. The first segments were completed in 1976, though tracks and routes were changed along the way to bring the system closer to important areas like the National Mall. Only in 2001 were all the originally planned tracks finished. Upon completion, the Metro became one of the largest public works undertakings ever built, stretching 103 miles and carrying more riders than any U.S. rail system except New York City’s. Completing it required the input of politicians, architects, construction workers, and engineers, with the end goal of creating a system that was acceptable to everyone while also being aesthetically pleasing.

One of the Metro’s important goals was the ability to expand and modernize as time went on. Aside from incorporating automated technology, the system grew to cover 117 miles served by 91 stations. In addition, the Metro operates a network of 1,500 buses, which enables it to serve the region 24 hours a day, seven days a week. The combined system now covers approximately 1,500 square miles, home to as many as 4 million residents. The Metro also recently completed the first phase of the new Silver Line, which will eventually connect the system to Dulles International Airport in Virginia; the second phase is currently under construction. But even as continued expansion and additional services are proposed, the system faces a series of long-delayed safety challenges.


Success Over Safety?

While the Metro was an unqualified success in its first few decades of operation, problems with the system later became increasingly apparent. First is the structure of its governing body: elected officials from each of its four regions control the WMATA and often have no experience running a subway.

Also not helping matters is the Metro’s long-term funding plan; specifically, it has never really had one. In most major cities, a large portion of a transit system’s funding comes from dedicated taxes, often well over 30 percent. In the case of the D.C. Metro, dedicated funding amounts to just 2 percent of its budget. The lack of dedicated funding has essentially left the leadership with the task of fundraising every year to keep the system running. Raising money became increasingly difficult after the system stopped rapidly expanding, as funds were needed simply to cover costs and maintain the tracks. While funding did continue to increase, it was also increasingly siphoned off for other aspects of the Metro system, such as the bus network.

As funding became harder to come by, the Metro’s decisions became increasingly political. Representatives on the Metro’s board fought for new stations or services in their districts to appease their constituencies. They also worked to keep the Metro open longer and later, which pleased riders but made routine maintenance harder and harder to complete. Unlike other major systems, the D.C. Metro was also built without extra tracks that allow trains to bypass maintenance work. Without them, the Metro is forced to single-track trains during maintenance and construction, which causes delays and erodes customer satisfaction.


Incidents

With all these issues plaguing the Metro, it is no surprise to critics that they eventually manifested themselves in the form of accidents. In the early years, these accidents were rare. In 1982, the first major tragedy occurred when a train derailed, killing three riders. The next fatal incident did not come until 1996, when a train slid on icy tracks and collided with another train, killing the operator. As time went on and the Metro failed to address its underlying problems, high-profile accidents became more frequent. In 2004, 20 people were injured in a collision. Shortly after, between 2005 and 2007, four workers were killed in accidents. Again in 2007, 23 more passengers were injured in a derailment. The worst accident came in 2009, however, when nine people were killed and over 80 were injured in a crash on the Metro’s Red Line.

This series of embarrassing incidents forced the Metro’s hand, and it was at last able to secure additional funding to address many of the problems haunting the system. Nevertheless, for all that was seemingly done, the true depth of the problems became clear as track issues continued to plague the rail system. An incident at the L’Enfant Plaza station in 2015 left passengers stuck on a train as it filled with smoke; one woman died and more than 80 people were taken to a hospital for smoke inhalation. This was followed by a number of other episodes, including derailments, oil spills, and more fires.

The situation was so bad, in fact, that the D.C. Metro became the only major rail service in the United States to have its oversight placed under the direct control of the Federal Transit Administration. Even after this step, though, the problems have endured, with at least eight separate incidents of smoke or fire causing evacuations or service halts since April 23.

Recent Developments

These repeated episodes seem to have been enough to force authorities to act, but the impetus for the dire threats from Secretary Foxx was not just the incidents described above. After a fire at the McPherson Square Metro stop on March 14, safety inspectors became increasingly concerned that parts of the rail system were unsafe for operation. In an unprecedented move, the entire system was shut down on March 16 in order to conduct emergency safety inspections.

The results of the inspections prompted system administrators to consider drastic action to ensure track safety, and on May 6 the Metro introduced a preliminary SafeTrack plan to complete urgent repairs. The plan, which will begin in June, calls for a series of extended repair surges and the shutdown of several stretches of track for weeks at a time. Just a day before the plan was announced, a third-rail insulator exploded, casting fresh doubt on the system. Despite the explosion, the station was not closed until hours later, and safety inspectors were not allowed access, raising concerns over the safety training of Metro workers. Unsurprisingly, incidents such as this, along with worsening service and decaying facilities, have led the Metro’s weekday ridership to decline by as much as 6 percent since the end of the 2015 fiscal year.

The following video looks at the severity of the problem:


The Future

Some see a silver lining in the SafeTrack plan; the decisive action may mark an important shift in the system’s management. A major aspect of the plan is a series of safety surges, in which portions of the system will be aggressively repaired for extended periods, meaning either single-tracking or the outright closure of entire stations. The plan also calls for rolling back operating hours: stations and their corresponding lines will close earlier, and weekend and special-event hours will be cut so that workers have more time to address the issues plaguing the system. To offset these closures and service reductions, bus service will be expanded to cover stations during temporary shutdowns.

The accompanying video gives a brief summary of the plan:

But before these plans could even be implemented, they were put on hold, in typical Metro fashion. In this case, the decision came from the Federal Transit Administration, which is calling on the Metro to begin immediate work on three sections in particular. The impetus for this directive was the recent track explosion and the Metro’s botched handling of it; while these three sections were scheduled for maintenance later in the year, they will now become priorities. While it is reassuring to see work finally under way to address the Metro’s core problems, these contradictory directives also raise further questions over who is in charge and whether there is a complete plan going forward.


Conclusion

The Metro was once the pride of Washington D.C., admired by visitors from all over the world. However, due to poor initial planning and an even worse maintenance record, that is no longer the case. The Metro now faces a potential $2 billion shortfall by 2025 due to budget cuts and lost ridership. Coupled with all the accidents, injuries, and even deaths, that leaves the Metro in a very unenviable position.

But not all is necessarily lost. Other cities such as Chicago and New York faced similar problems and resorted to measures like town hall meetings and extended track closures to address them. Going forward, the Metro can combine these methods with its own ambitious maintenance and rehabilitation plans. Ultimately, the Metro needs leadership that is willing to make unpopular decisions and then follow through on them. In the short term, customers will be unhappy about delays and closures, but a late train is still better than one that never arrives at all.


Resources

USA Today: Transportation Department Threatens D.C. Metro Shutdown if Safety Doesn’t Improve

Center for History and New Media, George Mason University: Building the Washington Metro

Metro: Washington Metropolitan Area Transit Authority

Washingtonian: The Infuriating History of How Metro Got So Bad

The Washington Post: At Least 6 Killed in Red Line Crash

The Washington Post: 5 facts about Metro’s ‘Safe Track’ Plan

Greater Greater Washington: The Feds Tell Metro to Rearrange its Maintenance Plan

Calling in Sick: The Problems with Detroit Public Schools https://legacy.lawstreetmedia.com/issues/education/problems-detroit-public-schools/ https://legacy.lawstreetmedia.com/issues/education/problems-detroit-public-schools/#respond Sat, 21 May 2016 13:00:41 +0000 http://lawstreetmedia.com/?p=52265

Public schools in Detroit and across the country face some big challenges.

"Michigan Central Station as seen from the Detroit River" courtesy of [Jeff Powers via Flickr]

On Tuesday, May 3, teachers across Detroit called in sick. Enough, in fact, that 94 of the district’s 97 schools had to close for the day. This was not the result of Zika or some other new super virus; the teachers weren’t actually sick at all, and everybody knew it. So what exactly was happening? A sick-out. In Detroit and elsewhere, teachers have resorted to this desperate tactic to protest the shabby state of schools. Read on to find out more about the sick-outs, why they are happening in Detroit and other places, and whether or not they are inspiring the changes they are meant to incite.


What’s happening in Detroit?

A sick-out is defined as exactly what it sounds like: "an organized absence from work by workers on the pretext of sickness." The sick-out that occurred in Detroit earlier this month was a two-day, district-wide protest that involved over half of the area’s 3,000 teachers. After fears heightened that the school district would not be able to pay all of its teachers for the full year, many teachers began protesting.

Specifically, there are two ways teachers can be paid: with paychecks spread out over the full calendar year, or only during the school year. Due to serious budget shortfalls, the school system is set to run out of money for teacher salaries some time in the summer. As a result, those paid year round will end up with less than those paid only during the school year itself. If the budget were in good shape, teachers on both pay schedules would receive the full amount, just paid out over different periods of time. Teachers in Detroit already held a mass sick-out in January to protest the deteriorating conditions in many Detroit public schools, which include pest infestations, mold, and damaged infrastructure. They have so far opted for sick-outs because other traditional means of protest, namely strikes, are against the law for teachers in Michigan.
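
To make the pay-schedule problem concrete, here is a minimal sketch in Python using a hypothetical salary and an assumed September-to-August pay calendar; the article gives no actual figures, so every number below is invented for illustration:

# Hypothetical illustration of the two Detroit pay schedules.
ANNUAL_SALARY = 52_000  # invented annual salary, in dollars

# School-year schedule: the full salary is paid out across the roughly
# ten months of school, so it is received in full by the end of June.
school_year_paid = ANNUAL_SALARY

# Year-round schedule: the same salary is spread over twelve months.
# If funding runs out at the end of June, only ten of the twelve
# monthly checks are ever issued.
months_funded = 10
year_round_paid = ANNUAL_SALARY * months_funded / 12

print(f"School-year schedule receives ${school_year_paid:,.0f}")
print(f"Year-round schedule receives  ${year_round_paid:,.0f}")
# The year-round teacher is shorted about $8,667 for the same work.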

The video below looks at the most recent sick-out:

A stop-gap measure has been in place since March in the form of a $48.7 million agreement passed by the legislature to keep schools operating until the end of June. With that temporary fix in place, Michigan legislators have been debating whether or not to pass an additional $700 million solution. This plan would split the system in two: a new district concerned only with educating students, while the old district would remain solely to pay off the debt. However, even if this plan makes it through the state legislature, there is no guarantee that it will work.

These budgeting issues are largely products of Detroit’s much-publicized bankruptcy back in 2013. When Detroit declared bankruptcy, city leaders estimated it had as much as $18 billion in debts that it could not pay, ranging from pensions to bond obligations. The amount of debt was ultimately reduced to $7 billion, aided by a grand bargain in which private entities agreed to donate approximately $816 million to reduce cuts to pensions and to ensure the survival of other important parts of Detroit’s culture, such as its art museum. But even with Detroit out of bankruptcy and early returns showing the city doing better, it still faces a long path to full recovery.

The following video looks at the totality of the Detroit bankruptcy:


Where else are sick-outs happening?

This was not the first time Detroit’s teachers have fought back. They held a strike in 2006 and another sick-out earlier this year, though this most recent sick-out was the largest. Detroit’s teachers are also not alone in using such tactics to protest pay and working conditions. In 2014, teachers in Colorado staged sick-outs of their own. In that case, the dispute was partly over the collective bargaining agreement, but also over a conservative school board proposal to revise history courses that board members thought reflected poorly on American history.

A closer comparison, though, may be what is happening in Chicago. While there have not been any actual sick-outs in Chicago yet, the situation certainly seems ripe for that type of action. Much as in Detroit, the governor of Illinois has called for Chicago’s school district to declare bankruptcy, which would, among other things, free the district from its obligations to many of its teachers and employees.

This situation is not new to Chicago either; teachers protested in 2012 over many of these same issues, and that standoff was only resolved through concessions from both sides. However, movements to strip state employees’ rights as a cost-cutting measure have been growing lately, as displayed by events like these as well as the anti-union legislation of Wisconsin Governor Scott Walker. Chicago has already forced teachers to take unpaid days off and has laid off employees, including some teachers, to cut costs. This is also the impetus for getting the school district to declare bankruptcy: if that happens, the state is no longer beholden to union agreements and may be able to reduce its pension obligations.

In order for the Chicago Public School System to declare bankruptcy, the city itself would have to declare bankruptcy, and under current law it cannot. In 2015, Illinois Republicans proposed a bill that would make bankruptcies legal for municipalities, but it failed to pass. While it would certainly be a major embarrassment if Chicago, the nation’s third-largest and a very affluent city, were forced to declare bankruptcy, many state leaders support the option.


Is any of this making a difference?

The battle in Detroit has drawn the usual criticisms from both sides. The teachers are critical of the government’s handling of the city’s finances, saying they just want to be paid the money owed to them and to be given acceptable conditions to teach in. Conversely, politicians have called the teachers’ actions political, claiming that they jeopardize the futures of the students they teach. While the two sides hurl accusations at each other, it is fair to ask whether any of it is actually improving the situation.

On the Wednesday following the recent sick-outs, teachers agreed to return to work after the state legislature moved forward on a $500 million measure to address the district’s fiscal issues. However, this deal must still be reconciled with a similar piece of legislation passed by the state senate before a solution can be finalized. If the two sides are unable to agree, another stop-gap measure may be used, but that would risk more sick-outs and further erode confidence in the state government.

The video below looks at the cumulative problems plaguing Detroit Public Schools:


Conclusion

Can Detroit right the ship when it comes to its schools? It is a question only compounded by the many complicated issues facing the city. Detroit Public Schools has lost over 100,000 students in roughly 13 years to charter schools, private academies, and attrition. That is a lot of lost revenue for any city, but it is especially taxing for one that just emerged from the largest municipal bankruptcy in U.S. history.

Detroit isn’t the only city with public schools in poor fiscal shape. Chicago, probably the most comparable example, may soon face many of the same issues and has already taken some drastic measures to cut costs. In light of Detroit’s bankruptcy, teachers and city officials have become increasingly concerned with how their school districts will meet long-term pension obligations and even regular teacher salaries. The same issues play important parts in the debate over whether bankruptcy is the appropriate tool for the city of Chicago and its public school system.

Detroit’s bankruptcy forced several difficult decisions, yet the city’s schools remain in a particularly difficult situation. If the city cannot find a solution beyond paying off one debt by accruing another, all while offering fewer services, this may not be the last time its teachers call in sick.


Resources

CNN: Most Detroit Schools Closed Again Due to Teacher ‘Sickouts’

Merriam-Webster: Definition of Sick-out

Detroit Free Press: DPS Sick-outs a Symptom of Lansing’s Ill Behavior

Think Progress: Everything You Need To Know About Detroit’s Bankruptcy Settlement

The Bond Buyer: Detroit, A Year Out Of Bankruptcy, Still Faces Long Road Back

In These Times: Why Chicago Won’t Go Bankrupt-And Detroit Didn’t Have To

The Guardian: Colorado Teachers Stage Mass Sick-out to Protest U.S. History Curriculum Changes

Fortune: Why Chicago’s Fight With Teachers Is the Sign of a Much Bigger Problem

Chicago Business: GOP Plan Would Allow State Takeover of CPS and Bankruptcy

What Explains Life Expectancy in the United States? https://legacy.lawstreetmedia.com/issues/health-science/gazing-crystal-ball-life-expectancy-united-states/ https://legacy.lawstreetmedia.com/issues/health-science/gazing-crystal-ball-life-expectancy-united-states/#respond Sat, 07 May 2016 13:45:41 +0000 http://lawstreetmedia.com/?p=52130

Why do some groups live longer than others?

Image courtesy of [Glenn3095 via Flickr]

Studying life expectancy allows us to understand what factors help people live longer and how changes in conditions affect people’s lives. Recent research shows how life expectancy varies among different groups of Americans, sparking some important questions about future policy decisions. While life expectancy has increased overall over the past several decades, those gains vary widely among certain groups. This research has implications for a range of issues, from public health and inequality to Social Security.

Read on to find out more about the current U.S. life expectancy, how it has changed over time, and how American lifespans compare to those of people in other countries, particularly other advanced nations.


How Long Do People Live?

The most common way to measure how long people live on average is through life expectancy, which is the amount of time in years a newborn baby is likely to live based on current conditions and health trends. The average life expectancy at birth for Americans as of 2014 was 79.68 years, meaning a child born in 2014 in the United States could be expected to live to that age. While this provides a baseline, the numbers can be further divided among a variety of demographics, which tell a more in-depth story.
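
For readers curious how such a figure is actually computed, the sketch below builds a highly simplified period life table in Python. The age bands and death rates are invented for illustration; real tables, such as the CDC’s, use single-year ages and observed mortality for the period in question:

# A simplified period life-table sketch with hypothetical mortality.
# Each entry is (age span in years, probability of dying within that span).
mortality = [
    (1, 0.006),   # infancy
    (14, 0.002),  # childhood, ages 1-14
    (25, 0.02),   # ages 15-39
    (25, 0.15),   # ages 40-64
    (20, 0.55),   # ages 65-84
    (15, 1.0),    # 85+, everyone dies by the end of the table
]

survivors = 1.0       # fraction of the birth cohort still alive
person_years = 0.0    # total years lived by the cohort

for span, q in mortality:
    deaths = survivors * q
    # Assume deaths are spread evenly across the span, so those who die
    # live half of it on average; survivors live the full span.
    person_years += (survivors - deaths) * span + deaths * span / 2
    survivors -= deaths

print(f"Life expectancy at birth: {person_years:.1f} years")  # about 76.7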

Life expectancies tend to vary among different groups, particularly when categorized by race, sex, and income levels. In the case of race, there is a wide disparity between black and white Americans. A Centers for Disease Control and Prevention report in 2009 found that the average life expectancy for black Americans was 75 years old, which was the same as it was for white Americans back in 1979. According to the CDC, the reasons for this disparity are higher rates of cancer, diabetes, homicide, heart disease, and perinatal conditions. Racial disparities are considerably larger when you further divide by education. Some of the largest gaps in life expectancy exist between white Americans with more than 16 years of education and black Americans with fewer than 12. One dynamic keeping this gap from becoming even larger is the much higher rate of suicide for whites.

Sex also plays a role in how long the average person can expect to live. In 2012, the average life expectancy in the United States was 76.4 years for men and 81.2 years for women. This gap is not uncommon, as women tend to live longer than men for a number of reasons; one is that women generally engage in less risky behavior and suffer fewer car accidents. These numbers hold even though more baby boys are born than baby girls, which is mostly the result of female embryos having slightly higher rates of miscarriage than male embryos.

A third major factor that relates to life expectancy is income. In a recent study based on data from 2001 to 2014, researchers found that the average life expectancy for the richest men in the United States is approximately 87 years, which is about 15 years longer than the poorest. To put that in a clearer context, rich men in the United States live longer than men in any other country, while poor men live, on average, the same number of years as men in countries like Sudan and Pakistan. The numbers are similar among wealthy women who have an average life expectancy of 89 years old, 10 years longer than women in the lowest income group. Although researchers have not drawn a causal line between life expectancy and income to explain what drives this gap, a clear correlation exists between the two.

The accompanying video summarizes the study’s findings:

The numbers can also be parsed further. Although the rates for the richest men and women vary little depending upon the geographical area, the same is not true for the poor. On average, low-income people live shorter lives in the middle of the country compared to those who live in rich coastal cities. The study’s authors note that most of the geographical differences may be behavior-related and potentially explained by factors like rates of smoking and obesity. They also note that in wealthy cities with high levels of education and public spending, those at the bottom of the income scale tend to live longer than their counterparts in less affluent cities.


Changes in Life Expectancy

Life expectancy in the United States has changed dramatically over time. For example, in the 1930s, when Social Security was first introduced, the average man lived to be only 58 years old and the average woman 62; ironically, the retirement age for Social Security was set at 65. Another important consequence of the gap in life expectancy between rich and poor is its effect on economic inequality and Social Security. Because wealthy people live longer, they also receive more in Social Security benefits, collecting additional payments over the course of their lives. Depending on how large the gap is, wealthy people may end up taking out a larger share of what they contributed relative to their income, which could reduce the progressivity of the Social Security program. This gap also has important consequences for the debate about raising the retirement age, which many argue is necessary to keep the program funded as baby boomers retire.

While life expectancy has changed a lot over the past several decades, it has affected different groups in distinct ways. The clearest explanation comes from the same three characteristics mentioned earlier: sex, race, and income. In this instance, sex and race tend to blend together. Traditionally, white women have lived the longest; however, a recent study found that life expectancy for white women actually went down by a month. While this group still lives longest by far, the number has shrunk slightly due to a combination of factors, including rising rates of suicide, drug overdose, and liver disease often caused by alcoholism. While white people in general suffer from these problems more than other groups, white women have been particularly susceptible. This dip in life expectancy is the first since these totals have been calculated, and it comes at a time when other health concerns, such as strokes and heart disease, are causing fewer deaths.

The video below looks at this unexpected change:

While white women saw a reduction in life expectancy, several other groups saw an increase; namely, black males and Hispanics of both sexes are expected to live longer. A third group, made up of white males and black females, saw no change in life expectancy. Aside from sex and race, income’s influence on life expectancy also changed: the richest people in America gained three years of life expectancy from 2001 to 2014, while life expectancy for the poorest Americans did not change.


The United States Compared to the Rest of the World

Reliable data on life expectancy covers a relatively short stretch of history. In the United Kingdom, the country with the farthest-reaching records, rates only go back to the 19th century. In the U.K., and virtually every other country, life expectancy was very short in the early 1800s, averaging between 30 and 40 years; in South Korea and India, it was as low as 23. However, as healthcare and science improved, especially with regard to infant mortality, life expectancy rose dramatically across the globe around the beginning of the 20th century.

This rapid improvement occurred in the United States as well, but the U.S. average of 79.68 years currently ranks just 43rd in the world. Although countries with longer life expectancies may not be as large and diverse as the United States, it is important to ask why, for such a rich country, U.S. life expectancy is relatively low, particularly compared to other developed nations.

The answer, according to the CDC, is threefold: drug overdose, gun violence, and car crashes. Injuries from these three categories account for roughly half of the life expectancy gap for men and a fifth for women, as Americans, on average, live two years fewer than people in similarly developed countries. The effects, particularly drug overdoses, have been felt most acutely among middle-aged white Americans. Another important factor contributing to America’s lower life expectancy is tobacco smoking, as many people in the United States started smoking earlier and in larger numbers than in other places.

Another major factor keeping U.S. life expectancy behind that of other developed countries is the infant mortality rate, which "compares the number of deaths of infants under one year old in a given year per 1,000 live births in the same year." The infant mortality rate for the United States is 5.87. Although that is historically low, it is less impressive compared to other countries: the United States has the 167th highest rate out of 224 countries, a far cry from most other developed nations, which average between two and four. Like life expectancy in general, infant mortality rates are also affected by factors such as race and income, with more affluent and white babies at a much lower risk of death than lower-income and black babies.
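
As a quick illustration of that definition, the counts below are hypothetical, chosen only to reproduce the 5.87 figure cited above:

# Infant mortality rate = infant deaths per 1,000 live births in a year.
infant_deaths = 23_480    # hypothetical deaths of infants under one
live_births = 4_000_000   # hypothetical live births in the same year

imr = infant_deaths / live_births * 1000
print(f"Infant mortality rate: {imr:.2f} per 1,000 live births")  # 5.87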


Conclusion

There is no conclusive way to say exactly how long a person will live, but life expectancy provides an effective measure to see how certain factors contribute to longevity. In the United States, these numbers have been broken down further to take into account the differences across a wide range of demographics. In general, the most recent data was positive, with groups either staying where they are or seeing life expectancy gains, except for a few cases. However, even these modest gains still leave the United States behind many other developed nations. The reasons for this shortcoming are manifold, ranging from high infant mortality rates to smoking tobacco. Regardless of the results, though, life expectancy can provide people with a good baseline for how long they might live and what factors contribute to longevity.


Resources

The World Bank: Life Expectancy at Birth

Infoplease: Life Expectancy for Countries, 2015

The Journal of the American Medical Association: The Association Between Income and Life Expectancy in the United States, 2001-2014

Social Security Administration: Life Expectancy for Social Security

The Washington Post: The Stunning–and Expanding–Gap in Life Expectancy Between the Rich and Poor

CNN: White Women’s Life Expectancy Shrinks a Bit

NPR: Life Expectancy Drops For White Women, Increases For Black Men

CNN: Why Americans Don’t Live as Long as Europeans

Population Reference Bureau: Smoking-Related Deaths Keep U.S. Life Expectancy Below Other Wealthy Countries

Central Intelligence Agency: World Factbook

Our World in Data: Life Expectancy

USA Today: Life Expectancy in the USA Hits a Record High

Population Education: Why Are More Baby Boys Born Than Girls

USA Today: Infant Mortality Rate Hits Record Low, Although Racial Disparities Persist

Business Insider: Huge Racial Gap in Life Expectancy

Conflict in the Caucasus Mountains: The Battle over Nagorno-Karabakh https://legacy.lawstreetmedia.com/issues/world/quiet-conflict-caucasus-mountains-azerbaijan-battle-nagorno-karabakh/ https://legacy.lawstreetmedia.com/issues/world/quiet-conflict-caucasus-mountains-azerbaijan-battle-nagorno-karabakh/#respond Tue, 03 May 2016 20:49:06 +0000 http://lawstreetmedia.com/?p=52013

What's going on in Azerbaijan?

Image courtesy of [ogannes via Flickr]

While the world focuses on international terrorism, ISIS, and North Korea’s nuclear ambitions, another long-running conflict is flaring up again. In Azerbaijan, a longstanding ceasefire has given way to renewed violence in the Nagorno-Karabakh region. Although government and separatist forces appeared to have reached a resolution, fighting quickly erupted again.

Read on to see why these two parties are fighting, what is currently happening, and what role Armenia and other outsiders have played in the struggle.


The Conflict

For centuries, Armenians, Turkish Azeris, and Persians struggled over the territory. In Azerbaijan, the struggle is centered in an area known as Nagorno-Karabakh and features a familiar conflict between Christians and Muslims: here, the Christians are ethnic-Armenian separatists and the Muslims are Azeris native to the country. The conflict was essentially settled in the 19th century, when the region was incorporated into the Russian Empire. The two groups lived side by side in a rivalry for territory that only occasionally boiled over into violence. But the USSR reversed course, placing the Armenian-majority region of Nagorno-Karabakh inside Soviet Azerbaijan and fueling conflict between the two groups. The USSR was essentially trying to reduce opposition to its own rule through divide and conquer, pitting the groups against one another.

As long as the USSR maintained control over the region, tempers between the two sides were largely held in check. But the old conflict flared up as the Soviet Union began crumbling in the late 1980s and early 1990s, when there was no longer any check on the two competing sides. The situation boiled over beginning in 1988, and in 1991 Nagorno-Karabakh declared itself an independent republic, a move that even Armenia did not acknowledge. The subsequent war between Armenia and Azerbaijan lasted until 1994, leaving 30,000 people dead and another million displaced. Since the end of that first conflict, the region has effectively been independent of Azerbaijan and receives continued support from Armenia.

The video below takes a closer look at the conflict:


The Ceasefire

The original ceasefire agreement in 1994 was brokered by Russia. The settlement left an especially bad taste in the mouths of Azeris, as it allowed the Armenians to remain in the territory they occupied, effectively letting Armenia hold parts of Azerbaijan. In the case of Nagorno-Karabakh itself, this made some sense, as 95 percent of the population is ethnically Armenian. While the 1994 ceasefire was unpopular, it held for two decades.

Things began unraveling in 2014, when the Azeris shot down an Armenian helicopter, ratcheting up tensions once more. This was followed by a series of ceasefire violations throughout 2015, culminating in this year’s violence, which has left an estimated 60 people dead. Following the most recent fighting, Azerbaijan and Armenia announced a mutual agreement for another ceasefire in early April. But almost immediately after it was announced, the violence reportedly continued, and each side accused the other of violating the agreement.


The Role of Regional Powers

The conflict between Azerbaijan and Nagorno-Karabakh is really a struggle between Armenia and Azerbaijan. However, as in many other conflicts, both within the region and elsewhere in the world, outside global and regional powers play an important role. In this case, the influence comes primarily in the form of Russian support for Armenia and Turkish support for Azerbaijan.

In light of the conflict’s resurgence, Turkey has reiterated its support for Azerbaijan. On a recent trip to the United States, Turkish President Recep Tayyip Erdogan pledged to support Azerbaijan to the end. Turkey has also signaled its support by closing its border with Armenia, which hurts the country economically and blocks its access to the Mediterranean Sea.

There is also history to consider. Turkey was the site of the mass killing of Armenians during WWI, the repercussions of which persist. For most Turks, the topic of the genocide is a non-starter; Erdogan has actually garnered support by denying that the event took place, and its international recognition remains a contentious issue. If Turkey were to suddenly change course and admit to the atrocity, it could potentially be held liable for reparations.

Read More: The Armenian Genocide: A Battle For Recognition

Russia has a familiar relationship with all the parties involved, which were at one point all part of the Soviet Union. Armenia and Russia enjoy an especially close relationship; in fact, Armenia is home to one of Russia’s largest foreign military bases. Russia was also part of the triumvirate of nations leading the Minsk Group, founded in 1992 with the express purpose of resolving the original Armenian-Azerbaijani conflict. Although Russia’s history with both sides is clear, its motives remain murky.

While Russia has long been a supporter of Armenia, it has also served as the main source of weapons in the escalating arms race, supplying both sides. Russia’s potential duplicity extends beyond selling weapons: while it currently benefits from the status quo, some speculate that it may be willing to send in peacekeeping forces to bolster its influence in the area. Aside from influence, Russia may also be motivated by the vast amount of oil in Azerbaijan. So far, however, Russia has advocated for a peaceful settlement, and theories about using the conflict as an excuse to move into Azerbaijan remain speculation at this point.


Other World Powers

Aside from Russia and Turkey, other nations may also play a role in resolving the situation, notably the United States and the European Union. Azerbaijan produces 850,000 barrels of oil a day, and if the conflict escalates to its post-Soviet levels, that production may be in danger, which could affect oil prices. While this is less of a direct concern to the United States because of its domestic oil industry, Azerbaijan is an important oil exporter to Europe and Central Asia. The United States was also a member of the Minsk Group, along with Russia and France, and like Russia, the other world powers seek a swift and peaceful resolution.

Israel also relies on Azerbaijan as its largest oil supplier, using a pipeline that runs through Turkey; in return, Azerbaijan is one of Israel’s biggest customers for weapons. Azerbaijan also provides Israel with an avenue to monitor Iran, as the two countries share a border. Adding to the animosity, Israel does not recognize the Armenian Genocide. Iran and Armenia, meanwhile, enjoy a close relationship: Iran supported the Armenians in the war with Azerbaijan back in the 1990s and hopes to build a rail project in Armenia in the future. While these two nations’ agendas may be more political than others’, neither has called for an escalation of the conflict.


Conclusion

Nagorno-Karabakh is another one of the flash points around the world that few people know about and even fewer understand. The region is the epicenter of a centuries-old conflict between Armenians and Azeris, complicated even further by religious undertones. It is also situated in an unstable region, the Caucasus Mountains area, with Russia to the north and the Middle East to the south.

The geographical location of Nagorno-Karabakh further complicates things, as it serves as a proxy both for Armenia and Azerbaijan and for larger regional powers like Russia and Turkey. A war in the region could set off a larger conflict between Armenia and Azerbaijan, which in turn could increase tension between Russia and Turkey. Furthermore, if Turkey were drawn in, its fellow NATO members would be obligated to assist it. While there is no hint of this yet, the potential for volatility remains. To make the situation even more combustible, Azerbaijan is a major oil exporter. Still, while the conflict has the potential to exacerbate tensions within the region, so far all outside powers have advocated for peace.

Unfortunately, the original dispute in Nagorno-Karabakh was never resolved and has festered for years, becoming what is known as a frozen conflict. One reason is that many regional powers have conflicting interests at play; several countries profit from related arms deals, and, so far, the coveted oil supply has not been threatened. The simmering conflict is likely to continue as it has, and Russia remains poised to play the largest role in influencing it, especially as the United States focuses on fighting ISIS in Iraq and Syria. With all the interconnected parties involved, however, the conflict could have implications well beyond the contested region in Azerbaijan.


Resources

BBC News: Nagorno-Karabakh: Azeri-Armenian Ceasefire Agreed

BBC News: Nagorno-Karabakh profile

Council on Foreign Relations: Nagorno-Karabakh Conflict

Al-Jazeera: Armenia and Azerbaijan call Nagorno-Karabakh ceasefire

U.S. News and World Report: Turkish President Recep Tayyip Erdogan has Vowed to Back Azerbaijan in the Conflict with Armenia Over the Separatist Region of Nagorno-Karabakh

Law Street Media: The Armenian Genocide: The Battle for Recognition

The Heritage Foundation: The Nagorno-Karabakh Conflict: U.S. Vigilance

OSCE: Minsk Group

Newsweek: Russia ‘Arming Armenia And Azerbaijan’ As Hostilities Increase

Voice of America: What’s Hiding Behind Russia’s Calls for Peace in Nagorno-Karabakh

Haaretz: Nagorno-Karabakh: The Conflict No One, Including Israel, Wants to Solve

It Takes a Policy: The Fight Over Paid Family Leave in the United States https://legacy.lawstreetmedia.com/issues/business-and-economics/takes-policy-fight-paid-family-leave/ https://legacy.lawstreetmedia.com/issues/business-and-economics/takes-policy-fight-paid-family-leave/#respond Fri, 15 Apr 2016 13:15:47 +0000 http://lawstreetmedia.com/?p=51812

Despite recent efforts, the United States is still an outlier.

"Family" courtesy of [mrhayata via Flickr]

The city of San Francisco recently approved a measure guaranteeing fully paid parental leave for the birth or adoption of a child for up to six weeks. With this new policy, San Francisco becomes the first city in the United States to offer 100 percent of a worker’s salary during parental leave. While this is a major step for the “City by the Bay,” the fact that San Francisco is the first to make paid leave mandatory is also a troubling sign for the rest of the country.

Read on to find out what exactly San Francisco’s law means, who is following the city’s lead, and why the United States lags so far behind other countries.


San Francisco’s Parental Leave Policy

San Francisco’s policy guarantees an employee his or her full salary during the six-week parental leave period. This actually builds on a plan that already exists at the state level: California’s policy guarantees 55 percent of an eligible worker’s salary for the same period, funded through employee contributions and administered through an insurance fund. Essentially, San Francisco’s policy promises the other 45 percent, but this time, the employer pays the cost.

Like minimum wage increases, San Francisco’s new policy will not take effect immediately. Instead, it will be phased in beginning in 2017, when companies with 50 or more employees will be required to meet the new standard. Companies with 35 or more employees will follow in July of the same year, and those with 20 or more employees in 2018. To be covered by the policy, workers must work at least eight hours a week, spend at least 40 percent of their work week in San Francisco, and have been with their employer for at least 180 days.
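
The eligibility tests and the state/employer split lend themselves to a small worked example. The Python sketch below only illustrates the rules as summarized above; the names and structure are invented, and the actual ordinance contains more detail:

# Rough encoding of the San Francisco top-up rules described in the text.
from dataclasses import dataclass

CA_STATE_SHARE = 0.55                     # California pays 55% of salary
SF_EMPLOYER_SHARE = 1.0 - CA_STATE_SHARE  # employers top up the other 45%

@dataclass
class Worker:
    hours_per_week: float
    share_of_week_in_sf: float  # fraction of the work week spent in SF
    days_employed: int
    weekly_salary: float

def covered(worker: Worker) -> bool:
    """Check the three eligibility tests the article lists."""
    return (worker.hours_per_week >= 8
            and worker.share_of_week_in_sf >= 0.40
            and worker.days_employed >= 180)

def employer_topup_per_week(worker: Worker) -> float:
    """Employer-paid portion of one week of leave, if the worker qualifies."""
    return worker.weekly_salary * SF_EMPLOYER_SHARE if covered(worker) else 0.0

w = Worker(hours_per_week=32, share_of_week_in_sf=0.6,
           days_employed=400, weekly_salary=1200.0)
print(covered(w))                  # True
print(employer_topup_per_week(w))  # 540.0, on top of the state's 660.0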

The video below provides some additional detail on San Francisco’s new policy:

Similar Programs

While San Francisco and California’s combined system is the most comprehensive, other states and cities have similar plans. Three states have long had their own policies in place: New Jersey, Rhode Island, and Washington. Like California’s, New Jersey’s policy covers six weeks, while Rhode Island’s guarantees only four.

Like California, these states, with the exception of Washington, created their policies as extensions of existing short-term disability programs, which only five states have in place. The programs are funded by withholding a small portion of employee wages from each paycheck, much like Social Security. Washington passed a paid family leave law of its own, but state legislators have yet to fund the effort, which has prevented its implementation.

In addition to these states, New York recently passed a program of its own. New York’s policy covers employees, both part-time and full-time, for up to 12 weeks. Furthermore, there will be no exemptions for small businesses, and employees will only need to be with a company for six months to be covered. The program is funded through an insurance model, which involves deducting small payments from each worker’s paycheck; in this sense, the policy resembles those of the states that built on existing short-term disability insurance programs.

New York’s paid leave plan will, like San Francisco’s, be implemented gradually: the full 12 weeks of leave and 67 percent of pay will not be guaranteed until 2021. One of the plan’s greatest strengths is its job protection component, which prevents someone from losing his or her job for taking leave. This expands on current federal law, which guarantees full-time employees job protection during 12 work weeks of unpaid leave.

The accompanying video looks at New York’s policy:

While these programs are a start, only five states have them, and even the programs that do exist leave the United States behind most of the developed world. Even Bangladesh, a country not typically associated with progressive social rights, has a mandatory 16-week policy. For stronger programs to be enacted in the United States, however, action will likely have to start at the federal level, and right now that doesn’t seem probable.


Problems with the System

There are several problems with the current state of family leave in the United States beyond the lack of paid leave in most places. They start at the federal level with the Family and Medical Leave Act (FMLA), passed back in 1993. While the FMLA guarantees up to 12 weeks of leave for childbirth or adoption, to care for an ill family member, or for an illness that prevents someone from working, the leave is unpaid. Even this unpaid leave comes with caveats: it only applies to employees who have worked at their current company for over a year, have worked more than 1,250 hours in the past 12 months, and work at a company with more than 50 employees.

Democrats in Congress have proposed a law that they hope will fill the holes left by the FMLA. Senator Kirsten Gillibrand introduced the Family and Medical Insurance Leave Act last year. This bill calls for the federal government to guarantee up to 66 percent of a worker’s income for 12 weeks in the case of serious illness or a new child. It would cover all workers regardless of how long they have been employed, the size of the business, or any of the other existing limitations. To fund this leave, the bill proposes a new payroll tax of 0.2 percent, which would cost a typical worker about $1.50, based on an estimate from the National Partnership for Women & Families.
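
As a back-of-the-envelope check on those figures, the sketch below assumes the $1.50 estimate is per week and uses a hypothetical "typical" weekly wage; both assumptions are for illustration only:

# Rough arithmetic behind the Family and Medical Insurance Leave Act figures.
PAYROLL_TAX = 0.002   # the bill's proposed 0.2 percent payroll tax
weekly_wage = 750.0   # hypothetical typical weekly wage, in dollars

weekly_cost = weekly_wage * PAYROLL_TAX
print(f"Weekly employee cost: ${weekly_cost:.2f}")  # $1.50

# The benefit side: 66 percent wage replacement for up to 12 weeks.
replacement_rate = 0.66
benefit = weekly_wage * replacement_rate * 12
print(f"Maximum 12-week benefit: ${benefit:,.2f}")  # $5,940.00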

Unfortunately, it is pretty unlikely that this bill will make its way through Congress, especially with a Republican majority in control of both houses that seems disinclined to take it up. Even a Republican alternative, proposed by House Speaker Paul Ryan, seems unlikely to gain traction, as Congressional Democrats criticize it for attacking essential workers’ rights.

The following video looks at some of the problems with the existing system:


Resistance

The United States is the only industrialized country, and one of just three countries in the world, that does not guarantee some type of paid parental leave. Attempts to change the status quo have often been met with backlash. In California, for example, the Chamber of Commerce labeled the state’s bill a potential job killer when it was passed. The National Federation of Independent Business and the Society for Human Resource Management also oppose forcing companies to offer paid leave. This sentiment has been echoed by many small businesses across the U.S., where fear of the costs is too great for the idea to garner support.

All this negativity, though, may not be well founded. A 2011 survey in California, taken six years after the state implemented its family leave program, found that 90 percent of companies felt the policy had either a neutral or positive impact on the work environment. Even more telling, this positive effect was reported at higher rates among small businesses than large ones, despite fears of overwhelming costs.


Gender Bias

Another major issue in the fight over paid parental leave is that it is seen as a women’s issue rather than an issue for both parents. Unsurprisingly, a country that does not offer paid leave to mothers does not give the option to new fathers either. This is another characteristic that sets the United States apart from the rest of the world: 47 percent of countries offer leave to fathers as well as mothers.

Giving men time off allows them to help with childcare duties and also enables women to improve their health and sustain their careers, as evidenced in countries with well-established programs like the ones in Norway and Sweden. Without paid leave for men, most of the childcare responsibility is placed on mothers, often forcing them to take more time off from work, which can make returning to the labor force even more difficult. Unpaid leave also has a significant effect on single mothers who must both care for their children and work in order to make ends meet.

Without the option to take paid leave, some women and men are forced to put off having children until a later age when they are more financially established. However, waiting longer to have children risks increasing fertility problems. While these concerns have been somewhat reduced through improved egg-freezing methods, which companies like Facebook have promised to help pay for, in-vitro fertilization is still not always effective and can be particularly expensive.


Conclusion

Efforts to implement a more comprehensive family leave system have regularly run into a litany of arguments for why it should not be done: it will hurt small businesses, it is too expensive, and so on. The issue has also become increasingly gendered, as opponents frame it as something only women need. However, none of these arguments holds water. Paid time off is important for both men and women, and most plans also seek to cover medical and family emergencies.

Yet the resistance remains, and the United States remains an outlier among developed countries. Part of this is due to an antiquated piece of federal legislation that offers time off but no pay and little else. While many state and local governments have taken it upon themselves to address the issue, such policies are far from widespread. To implement a comprehensive paid family leave program, Congress will need to act at the federal level. That will inevitably require new taxes, but a program of this nature may be necessary to ensure the United States remains competitive in the world economy.


Resources

Proskauer: San Francisco Approves City Ordinance Providing For Fully Paid Parental Leave

United States Department of Labor: Family and Medical Leave Act

New York Magazine: New York Just Created a Revolutionary New Family-Leave Policy

NPR: Is It Time to Make Medical and Family Leave Paid?

Govtrack: Summaries for the Family and Medical Insurance Leave Act

Time: Company-Paid Egg Freezing Will Be the Great Equalizer

Bustle: Paid Paternity Leave Is Essential For Gender Equality. Why Is The United States Taking So Long To Catch On?

The Atlantic: Work in the Only Industrialized Country Without Paid Maternity Leave

Tunisia: The Last Vestige of the Arab Spring? https://legacy.lawstreetmedia.com/issues/world/tunisia-last-vestige-arab-spring/ https://legacy.lawstreetmedia.com/issues/world/tunisia-last-vestige-arab-spring/#respond Thu, 14 Apr 2016 17:47:17 +0000 http://lawstreetmedia.com/?p=51724

Why is there still hope for Tunisia?

Image courtesy of [Aya Chebbi via Flickr]

On Monday, April 4, Tunisia became the first country to announce that it would reopen its consulate in Tripoli, the capital of Libya. While Tunisia did so out of concern for its citizens living in Libya, as well as for trade considerations, the move also reinforced Tunisia’s place as an outlier. Roughly five years after the Arab Spring, Tunisia is the only country involved that is still standing as a democracy, neither mired in civil war nor back under authoritarian rule. While its neighbors, especially Libya, have all but collapsed, Tunisia has remained above the fray, even as the threat of ISIS rises.

Read on to find out how the Arab Spring affected Tunisia, how the country is handling the ISIS threat, and what unique qualities have allowed it to succeed where every other nation involved has essentially failed to live up to its promise.


History of Tunisia

The land where Tunisia now sits has been settled for thousands of years, thanks to its access to both the Mediterranean and the inland Sahara region. In classical times, it was home to Carthage, the powerful empire that challenged Rome but ultimately lost. After Carthage’s defeat, the region was ruled by Rome and later by the Berbers, who converted to Islam in the 7th century following their defeat by Arab invaders. Tunisia was then ruled by a series of Muslim empires until 1881, when it was conquered by France and became a French colony.

The French maintained control over Tunisia through a mixture of repression and concession, but that was not enough to stem the Tunisian independence movement, known as Destour. In the 1930s, Habib Bourguiba, who later became Tunisia’s first president, started the Neo-Destour party to renew the independence effort. Bourguiba was imprisoned in France but was later released by the German occupiers during World War II, at which point he began advocating for the gradual independence of Tunisia. In 1956, France officially granted Tunisia complete independence with Bourguiba as the head of state.

Habib Bourguiba served as Tunisia’s president from 1957 to 1987, ruling as a one-party leader at the head of a relatively Western-style government. The first few years after independence were marred by residual conflicts with France. Other significant events occurred later in Bourguiba’s rule, such as when the Arab League and the PLO temporarily relocated their headquarters to Tunisia. In 1987, Bourguiba was replaced by his prime minister, Zine el-Abidine Ben Ali, in a nonviolent coup. Problems with corruption and human rights violations led many to become dissatisfied with Ben Ali’s rule, eventually sparking a revolution.


Tunisia and the Arab Spring

Perhaps it is fitting that Tunisia is the last country espousing the promise of the Arab Spring, since it was in Tunisia that the protests started. In December 2010, an unemployed man named Mohamed Bouazizi lit himself on fire in protest after Tunisian police stopped him from selling fruit on the street. This act sparked protests and mass unrest that eventually ended Ben Ali’s rule, forcing him to flee to Saudi Arabia. Since then, the country has held two democratic elections and enjoys relatively high levels of freedom.

However, things in Tunisia did not go off without a hitch. The first election was won by an Islamist group in 2011. By 2013, however, the group’s momentum had stalled, and a prominent opposition politician was assassinated. Instead of collapsing, the party in charge looked at its neighbor Egypt, where a nearly identical situation was unfolding, and agreed to try something different. In Tunisia, the ruling Islamist party agreed to step down while maintaining a role in the election process. This negotiation was orchestrated by four civil society groups, an effort that eventually earned them the Nobel Peace Prize.

Despite some success, the country faces several significant challenges. Tunisia has been slow to clamp down on corruption or to hold security forces accountable for their history of violence, and it is struggling to deal with the growing influence of extremism among its citizens. Still, it bears asking: how and why has Tunisia succeeded where every other country involved in the Arab Spring failed?


Preserving Democracy

One explanation for Tunisia’s success appears to be the very same Islamist party, Ennahda, which was elected after the revolution and then subsequently gave up power. Unlike many of its regional counterparts, such as the Muslim Brotherhood, Ennahda is much more secular and abhors the ideals of radical Islam. Aside from its ideology, what probably preserved democracy in Tunisia as much as anything else was its decision to give up power peacefully, one of the pillars of any democracy. By stepping aside in a volatile environment, Ennahda showed that change can be achieved through democratic means.

Despite these encouraging signs, more work needs to be done. In a recent poll, 83 percent of Tunisians said they believed the country was headed in the wrong direction. Much of this discontent is focused on slow-moving reforms, especially economic ones. Many of Tunisia’s educated young workers have had a hard time finding jobs and are losing faith in the government’s ability to solve the problem. On top of all this is the security threat from Libya and ISIS. Tunisia has kept the democratic experiment alive through voting and the peaceful transition of power, but many Tunisians are growing frustrated with the current path.

The following video looks at Tunisia in the context of the Arab Spring and how it has been successful:


The Threat of ISIS

Tunisia’s success with democracy is all the more impressive given the large influence that ISIS has in the region and within the country. This paradox is most clearly illustrated by the fact that, despite being among the most democratic and literate countries in the Islamic world, Tunisia is also the largest source of foreign fighters for ISIS. Between 6,000 and 7,000 Tunisians have left their homes to join ISIS, with another 15,000 barred from making the trip. Even more interesting, many of these fighters have come from middle-class, even affluent, backgrounds, throwing the traditional narrative of terrorist breeding grounds into question. Some locals and experts attribute ISIS’s recruiting success to a sense of disappointment with the post-Arab Spring government.

While Tunisia grapples with being simultaneously the Islamic world’s most promising democracy and its largest source of ISIS fighters, it must also look to danger from abroad. In 2015, Tunisia suffered two terrorist attacks that left 38 and 21 people dead, respectively. These attacks were conducted by ISIS and targeted tourists. As a result, the tourism industry, which makes up close to 10 percent of Tunisia’s economy, declined significantly.

Perhaps Tunisia’s greatest threat, though, comes from neighboring Libya, where many Tunisian ISIS recruits go to train before coming back to plan attacks. One such attack was beaten back in March of this year, when Tunisian security forces fought off an invasion attempt by ISIS soldiers in the town of Ben Guerdane. Although security forces successfully repelled ISIS there, the fear and likelihood of more attacks remain strong. In fact, Tunisia is now constructing a wall along its border with Libya with help from the United States and Germany.

The video below takes a closer look at the Tunisian government’s difficulty preventing its citizens from joining ISIS:


Conclusion

Five years after the start of the Arab Spring, the results do not look promising. Three of the countries involved are engaged in civil war–Yemen, Libya, and Syria–and in Bahrain, the monarchy clings to its power. Meanwhile, authoritarian rule has been restored in Egypt. By most accounts, Tunisia is the only country to experience any meaningful progress toward democratization. Its success is all the more remarkable because it is also the number one source of foreign fighters for ISIS and remains under constant threat of attack by ISIS fighters based in Libya.

But Tunisia and its Arab Spring idealism continue to endure. The nation is certainly not without difficulties, and it nearly succumbed to the same problems that doomed Egypt. How was Tunisia able to navigate this minefield when everyone else failed? A leading explanation is its peaceful transition from one government to another, despite political and social chaos.

Ultimately, though, people’s patience is not infinite, and new polls suggest that concerns over the political process, the economy, and national security may threaten the long-term success of democracy in Tunisia and of the Arab Spring in general. But if Tunisia can solve these problems, it will be a testament to a movement that believed democracy is possible in the Islamic world.


Resources

Libya Herald: Tunisia to Reopen its Tripoli Embassy and Consulate

Encyclopedia Britannica: Tunisia

History World: History of Tunisia

Time: Why the Arab Spring Has Not Led to Disaster in Tunisia

The Wall Street Journal: How Tunisia Became a Top Source of ISIS Recruits

The Telegraph: Tunisia Sees a Million Fewer Tourists After Terror Attacks

NPR: Tunisia’s Fragile Democracy Faces A Threat From Chaotic Libya

The Atlantic: Tunisia Is Still a Success

U.S. News and World Report: 5 Years After the Spring

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Tunisia: The Last Vestige of the Arab Spring? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/tunisia-last-vestige-arab-spring/feed/ 0 51724
The Future is Bleached: The Trouble With the World’s Coral Reefs https://legacy.lawstreetmedia.com/issues/energy-and-environment/future-bleached-trouble-worlds-coral-reefs/ https://legacy.lawstreetmedia.com/issues/energy-and-environment/future-bleached-trouble-worlds-coral-reefs/#respond Thu, 07 Apr 2016 18:43:19 +0000 http://lawstreetmedia.com/?p=51569

What's going on with our coral reefs?

The post The Future is Bleached: The Trouble With the World’s Coral Reefs appeared first on Law Street.

]]>
Image courtesy of [Greg Goebel via Flickr]

A recent aerial survey of Australia’s Great Barrier Reef found that as much as 95 percent of the coral making up the northern portion of the reef is bleached. While this may mean something to marine biologists and others who understand precisely what it entails, to most people, bleaching is something you do to your laundry. However, coral bleaching, especially of the Great Barrier Reef, is a huge deal: it is yet another litmus test for the planet and a further sign of the mounting ecological challenges facing the environment.

Read on to find out where the world’s coral is located, what coral bleaching is, and what the consequences are for the future of coral reefs and marine life.


Coral Basics

Coral reefs are located off the coasts of more than 100 countries worldwide. They are primarily found in the tropics, between the Tropics of Cancer and Capricorn, though coral also grows outside this region wherever warm currents flow–around Florida and off parts of Japan, for example. In total, coral covers approximately 110,000 square miles.

The first thing that must be understood about coral is that it’s not a rock or some type of inanimate object; corals are alive. And while they may look like plants, corals are actually animals. Like other animals, corals need to eat to survive because they can’t produce their own food. They get their nourishment from mutually beneficial relationships with algae, but corals also hunt prey. Using stinging tentacles, a coral paralyzes its prey and then slowly digests it.

Corals reproduce in two ways: they either produce both the egg and sperm and perform the fertilization process themselves, or they produce one of the two, release it, and rely on another colony to provide the other. Corals form when larvae attach themselves to rocks or other hard surfaces upon which they can grow. Coral reefs are made up of colonies of hard coral polyps, which secrete calcium carbonate, or limestone, to create their hard exoskeletons. Many organisms make up a coral reef, but to survive, corals form symbiotic relationships with zooxanthellae, the algae that give coral its unique colors.

The coral and the zooxanthellae essentially provide each other with added nourishment. Along with the zooxanthellae, corals need several other conditions to survive: sunlight and water that is clear, warm, clean, and salty.

Coral reefs are important because of their immense biodiversity. The reefs themselves cover only about 1 percent of the ocean floor yet are home to more than 25 percent of all marine life. To put it another way, there is more biodiversity in coral reefs than in tropical rain forests. The corals work in concert with the many species that inhabit the reefs, helping to clean the water, provide food, and even make the white sand that lines the world’s most exotic beaches. Coral also supports humans directly: as many as six million fishermen rely on the coral ecosystem for their livelihood.

The video below describes coral and its purpose in greater detail:


What’s Happening to the Coral?

The attributes that make coral reefs special are also the ones under the most significant threat. Biodiversity makes a reef more resilient than a simpler ecosystem, but even with that built-in resiliency, coral reefs are dying at an accelerating rate. The World Wildlife Fund estimates that as many as 25 percent of the world’s corals are already damaged beyond repair and another 66 percent are in danger of facing the same fate. So what exactly is happening?

Much, if not all, coral reef destruction can be attributed to human activity, the most blatant cause being rising ocean temperatures due to global warming. While coral reefs prefer warm water, they require an equilibrium to exist; water that is too warm is destructive for two reasons. First, if the water gets too hot, the zooxanthellae will either die or be expelled from the coral. When the zooxanthellae are expelled, the coral loses its color, which is where the term bleaching comes from. Second, warming water can lead to harmful algae growth, which can cloud the water. This is problematic because corals need clear water to survive: when algae prevent sunlight from reaching corals, their zooxanthellae can die, killing the coral as well. Overfishing has also contributed to algae overgrowth because it eliminates the algae’s natural predators, removing barriers to growth.

There are also more direct human actions that negatively affect coral. Dumping pollution in the water can increase nitrogen levels, leading to algae overgrowth. Sediment and silt runoff from construction along coastal regions can smother corals, blocking their access to sunlight. Destructive fishing practices, like the use of poisonous chemicals and explosives, can also have serious consequences for coral reefs. Tourism poses a problem as well, because divers and snorkelers can disturb corals and cause damage. Perhaps the most directly harmful action is the mining of coral reefs for building materials and even souvenirs.


The Effects

The impact of the destruction of coral reefs is widespread. Not only do coral reefs provide people with food and building materials, they are also used in the development of many new cancer-fighting medicines. Corals serve as a buffer for the shoreline and filter the water as well. From a more aesthetic standpoint, corals are also the center of tourism for many small countries. While the threat currently faced by coral reefs may not directly concern the United States, for smaller countries that depend on tourism as a major part of their economies, the loss of coral reefs could be an economic disaster.

The following video explains the danger coral faces:

Prevention

While the outlook for the world’s coral reefs is grim, it is not totally hopeless. Even when coral is bleached, zooxanthellae may return if conditions normalize, allowing the reefs to survive. Several efforts are underway to protect and restore coral reefs globally, including Marine Protected Areas, which regulate fishing, coastal development, and pollution, and which also raise awareness of the issues affecting coral. The Great Barrier Reef has been protected since the 1970s; however, even those efforts have had mixed results.

Along with those protecting the reefs are organizations aimed at restoring them, such as the Coral Restoration Foundation, which supports coral research and grows coral in nurseries to rebuild reefs. Efforts to restore coral reefs are underway in the United States as well. In 1998, President Bill Clinton established the Coral Reef Task Force (CRTF) to protect and restore reefs through mapping, monitoring, and research. Building on the CRTF’s work, Congress passed the Coral Reef Conservation Act in 2000, which requires the National Oceanic and Atmospheric Administration to protect coral reefs.

On a more individual level, there are a number of things the average person can do to preserve coral reefs: conserving water; reducing pollution; using organic and ecologically friendly fertilizers; disposing of trash appropriately; diving and snorkeling safely; and even planting trees to reduce runoff. People can also play a role by raising awareness and supporting businesses that promote conservation. The following video details some of the efforts to preserve coral reefs:


Conclusion

Coral reefs are the most biodiverse ecosystems on the planet, yet their very existence is increasingly threatened. Reef conditions are among the most telling measures humans have of the state of the environment. Most of the current threats to coral reefs are the products of human activity and, as a result, are in many ways preventable. Raising awareness and curbing harmful activities, such as pollution dumping, destructive fishing, and the emissions driving climate change, can prevent future damage. But a considerable amount of damage has already been done, and while many are skeptical that it can all be reversed, a significant amount may still be undone.

In light of coral reefs’ importance to marine biodiversity and maybe even modern medicine, protecting them should be a priority for coastal areas. The damage done to coral reefs is emblematic of the larger threats facing the environment. It is up to people to make a concerted effort to preserve and restore the reefs and to manage their own activities so that corals and coral reefs remain.


Resources

CBS News: Great Barrier Reef Coral Bleaching Hits “Extreme Level”

Coral Reef Alliance: Where Are Coral Reefs Located?

Florida Keys National Marine Sanctuary: Corals are Animals

Coral Reef Alliance: How Reefs Are Made

Coral Reef Alliance: How Corals Reproduce

Coral Reef Alliance: What Do Coral Reefs Need To Survive?

Coral Reef Alliance: Types of Coral Reef Formations

Coral Reef Alliance: Coral Reef Biodiversity

World Wildlife Fund: Coral Reef Threats

Ocean World: Coral Reef Destruction and Conservation

State of the Planet: Losing Our Coral Reefs

The Nature Conservancy: Coral Reefs of the Tropics

National Oceanic and Atmospheric Administration: Corals

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Future is Bleached: The Trouble With the World’s Coral Reefs appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/energy-and-environment/future-bleached-trouble-worlds-coral-reefs/feed/ 0 51569
Who are the Kurds? https://legacy.lawstreetmedia.com/issues/world/who-are-the-kurds/ https://legacy.lawstreetmedia.com/issues/world/who-are-the-kurds/#respond Tue, 05 Apr 2016 13:00:40 +0000 http://lawstreetmedia.com/?p=51479

And how did they become a major player in the fight against ISIS?

The post Who are the Kurds? appeared first on Law Street.

]]>

Most people know that the Kurds have been one of the most effective groups in the fight against ISIS. But beyond that, many in the United States know little about who the Kurds are or about the ethnic group’s history. With the United States and the many countries involved in the fight against ISIS relying on the group, it is important to take a closer look at who the Kurds are and what most of them want.

Who exactly are the Kurds and how did they become the largest ethnic group without a homeland? Read on to find out who the Kurds are, what their role in the Middle East is, and most importantly, what they are looking for.


History of the Kurds

The history of the Kurds is, in many ways, as convoluted as their present, with no exact date for when they first appeared on the world stage as an ethnic group. Some speculate they were part of an ancient group that ruled large chunks of the Middle East more than 2,500 years ago. The first widely acknowledged mention of the Kurds as a people came in the seventh century, when they converted to Islam. The Kurds often “fought for other groups that succeeded as regional powers, receiving a reputation for being fierce fighters.”

Along with their fighting prowess, the Kurds were also known for their nomadic lifestyle. According to the Kurdistan Tribune, following the dissolution of the Ottoman Empire, the Kurds–like many other groups in the region–were guaranteed a homeland by the Treaty of Sevres in 1920. But that promise was broken. After Kemal Ataturk rose to power and Turkey’s borders were formalized in the 1923 Treaty of Lausanne, the Kurds were not given a country of their own. They were left in the historically unenviable position of being an unpopular minority in an unwelcoming region, which led to a revolt by Kurdish groups and a subsequent violent crackdown by Turkish forces in the 1920s and 30s.

The Kurds and Turks have had an especially hostile relationship since those failed revolts. For years, the Turks tried to suppress the Kurds’ cultural identity by forbidding them to wear traditional clothes or teach their own language in schools. Not surprisingly, then, in 1978 a Kurdish leader named Abdullah Ocalan created an organization, the Kurdistan Workers’ Party or PKK, to fight the Turks and win a Kurdish homeland. Despite years of guerrilla warfare against Turkey, Ocalan ultimately failed and was captured by Turkish forces in 1999. Turkey considers the PKK a terrorist organization, and its campaign against the group in the country’s southeast has escalated recently.

Aside from Turkey, the Kurds also had issues in other surrounding countries where they have sizable minorities. After many years of allowing Ocalan to manage the PKK from within its borders, the Syrian government forced him from the country in 1998 after being pressured by Turkey. In Iran, the Kurds made two attempts, both with little success, to carve out an autonomous region.

Iraq rivaled Turkey in its harsh treatment of the Kurds. Throughout the 20th century, the Kurds in northern Iraq launched several revolts, all of which ended in defeat. The worst period for the Kurds, however, came after Saddam Hussein took power. Angry over their support for Iran in the Iran-Iraq War, Hussein targeted the Kurds with chemical weapons. These attacks stopped after Iraq was defeated in the first Gulf War; however, Hussein crushed another Kurdish revolt soon after.

The video below gives a look at Kurdish history:


Role in the Middle East

Today the Kurdish people live in an area at the intersecting borders of five countries: Armenia, Iran, Iraq, Syria, and Turkey. Despite not having a homeland, the Kurds are still an important group, made up of as many as 30 million people–the fourth largest ethnic group in the Middle East. So what role does such a large group, spread over a number of countries, play in the region?

Turkey

Currently, Kurds make up 15 to 20 percent of the population of Turkey. Turkey and the Kurds have a long and bloody history of animosity, much of it centered on fighting between the PKK and the Turkish state. Since the PKK took up arms in 1984, approximately 40,000 people have been killed. However, when the PKK toned down its demands in 2012, trading its goal of independence for one of autonomy, a ceasefire was finally reached. Nevertheless, all that work was undone in 2015 following a suicide bombing against the Kurds in Suruc. In response, Kurdish forces lashed out against Turkish authorities, reigniting the old feud.

Still, the PKK is not representative of all Kurds; many are entrenched in the Turkish economy, and this group forms a strong pillar of support for the ruling Justice and Development Party (AKP). A third group, the People’s Democratic Party or HDP, splits the middle between the AKP’s supporters and the militant PKK.

Iraq

The Kurds make up as large a portion of the Iraqi population as they do in Turkey–between 15 and 20 percent. As in Turkey, the Kurds in Iraq have faced years of crackdowns and repression following unsuccessful rebellion attempts. However, they achieved some limited autonomy following the First Gulf War and even greater autonomy after the second in 2003. Since the formation of the new Iraqi government, the Kurds have been constant participants in Iraqi politics. Amid the rise of ISIS and the resulting conflict, Kurdish leaders have gone beyond autonomy and called for a referendum on independence. The Kurds and the Iraqi government eventually reconciled several of their differences and started working together closely in the fight against ISIS.

Kurds in Iraq have made the most significant progress toward autonomy relative to Kurds in other countries. The 2005 Iraqi Constitution guarantees the Kurds an autonomous area, in which they have established their own government that operates within Iraq’s federal structure. The Kurds have taken advantage of this arrangement through their involvement in the Iraqi oil industry: they operate a pipeline between Iraq and Turkey under a revenue-sharing agreement with Iraq. A recent dispute over that arrangement disrupted oil transfers pending a new deal.

Syria

The Kurds make up a sizable portion of the population in Syria as well, accounting for between 7 and 10 percent before the Syrian Civil War erupted. This population was concentrated in urban centers and in the north of the country. As in Turkey and Iraq, Kurds in Syria were marginalized through government repression; the state denies citizenship to over 300,000 Kurds living there. Once the war in Syria began, however, Kurds began asserting their rights and now plan to carve out autonomous regions for themselves. They have also sought to be actively involved in the peace talks determining Syria’s fate.

The Kurds’ battle against ISIS has been particularly challenging in Syria. Several Kurdish positions were overrun by ISIS, partly because Turkey refused to let Turkish Kurds cross the border to intercede. But in October, Turkey eventually allowed some fighters to help Syrian Kurds push back ISIS with the support of U.S. airstrikes. However, the Kurds continue to encounter challenges in their relations with Turkey, notably after their recent attempt to establish an autonomous zone in Syria. While the Kurds, hoping to appease Turkey, were quick to clarify that they are not seeking independence, the message may have fallen on deaf ears: the Turks have worked to exclude the Kurds from Syria’s peace talks.

Elsewhere

Kurds make up about 10 percent of Iran’s population; in total numbers, they rank second only to those living in Turkey. Yet Iranian Kurds enjoy no autonomous region and, as in neighboring countries, are violently repressed. There is also a much smaller Kurdish community living in Armenia; unlike elsewhere, this group neither governs nor aspires to an autonomous region. The accompanying video looks at the Kurds’ role in the Middle East:


What the Kurds Want

For the world’s largest “stateless nation,” a country of their own has long been the priority for most Kurds. This has been evident since the Kurdish nationalist movement began after WWI, following the dissolution of the Ottoman Empire, and it is evident today in Kurdish efforts to achieve autonomous areas in the three countries where they have large populations: Iraq, Syria, and Turkey. The real question, then, is not what the Kurds want, but how they hope to achieve it.

But it’s important to note that the Kurds are by no means a monolithic group. While they share the same ethnicity, they are a very diverse group. In Turkey, home to the largest Kurdish population, there are three major Kurdish political groups, ranging from loyal to the state to hostile to it. There are also major divisions in Iraq, with one party controlling two of the Kurdish provinces and a different party controlling the third. The leaders of the dominant party have close relationships with Turkey and have even worked with the Turks in fighting the PKK. The Kurds in Iraq also fought a three-year civil war among themselves during the 1990s.

The Kurds are also divided at even smaller levels, with sizable differences between those in cities and those still adhering to their nomadic roots. Even in a country as small as Armenia, there are divisions between traditional Kurdish Sunni Muslims and Kurdish Christians. While many Kurds seek a homeland, for now the best they may be able to get are autonomous regions like the ones in Syria and Iraq. The following video looks at some of the different Kurdish parties at play across the Middle East:


Conclusion

It is easy to characterize the Kurds as just one more ethnic group with deep historical roots wandering the Middle East in search of a homeland, but that characterization is overly simplistic. The Kurds are not a monolithic group but a diverse set of actors, spread mostly across five countries, bound by a common heritage. Yes, many do want a homeland, but given the diversity within the group, how they achieve it, or even whether they can, varies widely.

In the seemingly never-ending conflicts in the Middle East, the Kurds are a recurring actor. Much of what is known or understood about them comes from generalizations–they are Sunni Muslims, they are searching for a country, and so on. This is broadly true, but the reality is more complicated. The Kurds’ situation is emblematic of the Middle East itself: an intricate web of interconnected groups with, at times, converging and diverging interests. While the Syrian conflict has given the Kurds an opportunity to further assert their claims, nothing in the fluid region is certain. Only time will tell whether their aspirations amount to more than dreams.


Resources

Washington Post: Who Are the Kurds?

New Historian: History of the Kurds

BBC News: Who are the Kurds?

The Atlantic: What Exactly Are ‘the Kurds’?

Reuters: Iraq Seeks Financial Agreement with Kurds Before Pumping Crude to Turkey

RT: Turkish Fighter Jets Pound PKK Targets in Northern Iraq

BBC News: Iraqi Kurdistan Profile

The New York Times: The Kurds Push for Self-Rule in Syria

TA Central: Kurds

Council on Foreign Relations: The Time of the Kurds

Editor’s Note: This article has been updated to reflect sources of information.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Who are the Kurds? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/who-are-the-kurds/feed/ 0 51479
The Costs (and Benefits) of Free Trade https://legacy.lawstreetmedia.com/issues/business-and-economics/real-costs-benefits-free-trade/ https://legacy.lawstreetmedia.com/issues/business-and-economics/real-costs-benefits-free-trade/#respond Wed, 30 Mar 2016 18:25:51 +0000 http://lawstreetmedia.com/?p=51336

How has free trade affected the United States?

The post The Costs (and Benefits) of Free Trade appeared first on Law Street.

]]>
"Sustainability poster - Fair trade" courtesy of [Kevin Dooley via Flickr]

There is not a lot that Donald Trump and Bernie Sanders agree on in their current presidential campaigns, but one thing the two do seem to share is a general disdain for free trade. The notion of free trade has joined the lexicon of despised things in the United States, right next to bank bailouts and tax breaks for the rich. The clearest evidence of this is the candidates’ desperate efforts to distance themselves, as far and as quickly as possible, from free trade agreements like NAFTA and the Trans-Pacific Partnership.

But is this much ado about nothing? Is free trade really gutting the economy and costing millions of jobs as suggested? More to the point, what does free trade mean? Read on to learn more about what free trade is and to find out if it is really as bad for Americans as some argue.


What is Free Trade?

Free trade does not mean that goods are given to other countries for free. It’s the idea that, for the sake of economic efficiency, tariffs, quotas, and trade barriers should be lowered or removed altogether, which economists argue will make goods cheaper for consumers. The process is aimed at improving efficiency by focusing on what is known as a country’s comparative advantage. Comparative advantage is the idea that a country should produce and export the goods it can make at the lowest opportunity cost relative to other countries; in other words, the goods it sacrifices the least to produce. By removing barriers to trade, countries are left to compete on their natural footing, and each can focus on the goods for which it holds a comparative advantage.
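To make the arithmetic concrete, here is a minimal sketch in Python using invented numbers (the countries, goods, and labor costs are hypothetical, not drawn from this article’s sources). It illustrates why comparative advantage turns on opportunity cost: even a country that is worse at producing everything still holds a comparative advantage in something.

# Hypothetical hours of labor needed to make one unit of each good.
# Country A is better at both goods in absolute terms, but comparative
# advantage depends on opportunity cost, not raw productivity.
hours = {
    "A": {"cloth": 1, "wine": 2},
    "B": {"cloth": 4, "wine": 3},
}

for country, cost in hours.items():
    # Opportunity cost of one unit of wine, measured in cloth forgone.
    print(country, "gives up", cost["wine"] / cost["cloth"], "cloth per wine")

# A gives up 2.0 cloth per unit of wine; B gives up only 0.75. B therefore
# holds the comparative advantage in wine and A in cloth, so both gain
# when A specializes in cloth, B in wine, and the two trade.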

Comparative advantage is essential to free trade and is generally why economists like the concept of trade in general. Without barriers to trade, countries begin to specialize in products they have a comparative advantage in producing, which helps ensure that goods are made as efficiently as possible and lowers prices for everyone. The video below clarifies further what free trade is:

Globalization and Free Trade

Globalization and free trade are often seen as synonymous, but the two are not quite the same thing. According to the World Bank, “‘Globalization’ refers to the growing interdependence of countries resulting from the increasing integration of trade, finance, people, and ideas in one global marketplace.” Put simply, it’s the increasing interconnectedness of every country and person on the planet.

Free trade, on the other hand, is a major driver in making globalization happen. By eliminating things such as tariffs and quotas, countries are encouraging exchange and, as a result, more people are coming into contact with each other and new connections are being made, further integrating the global system. Free trade, then, is just one part of the larger globalization puzzle.


History of Free Trade

Globally

While the constant battle over free trade seems like an American issue, this is certainly not the case. Earlier theorists may have touched on its concepts, but it was Adam Smith who first articulated the case for free trade in his book “The Wealth of Nations” back in 1776. David Ricardo later introduced the concept of comparative advantage in 1817. The idea of free trade was rapidly adopted by economists after that as the preferred method of economic interaction. It was also embraced by the British Empire, which, as the world’s dominant power for over a century, used its influence to spread free trade internationally. Today, there are several free trade blocs across the world, most notably the European Union and NAFTA, the North American Free Trade Agreement among Canada, Mexico, and the United States.

Domestically

Free trade, while not an American invention, does have a long history in the United States. However, for much of that history, the inclination was to resist it. In fact, from the inception of the United States, economic leaders such as Alexander Hamilton advocated protective tariffs to help the nascent nation’s industry grow rather than promoting free trade. This protectionist posture continued, with the number of goods covered and the size of tariffs fluctuating over time.

Beginning in the early 20th century, a series of events altered this narrative. In 1913, the United States government adopted the federal income tax, which became the country’s largest source of revenue, supplanting tariffs. With this new guaranteed revenue stream, the government could change tariff rates without fear of forgoing necessary income.

The second major event was the passage of the Smoot-Hawley Tariff in 1930. This tariff was unique because it united industries like agriculture and manufacturing around one policy, and because of the sheer amount of opposition it faced. The debate that followed was whether the tariff directly caused the Great Depression or merely intensified it. While common wisdom now points to the latter, the tariff reduced trade and provoked retaliatory tariffs from other nations during the worst period of economic contraction in U.S. history.

The tariff quickly became unpopular and was a major issue during the 1932 presidential campaign when Franklin Roosevelt ran on a platform opposing it. Once elected, Roosevelt made good on his promise, virtually eliminating the effects of the tariff by 1934 through a number of laws such as the Reciprocal Trade Agreements Act. Roosevelt and his advisors had their eyes on a post-war future in which free trade would be the dominant philosophy at last.

Following WWII, the United States finally adopted a free trade stance, under pressure from other nations that were desperate after the war and wanted greater access to U.S. markets. This move was codified by the creation of the General Agreement on Tariffs and Trade (GATT) in 1948, which later transformed into its current iteration, the World Trade Organization (WTO), in 1995. And although the United States did not embrace free trade for much of its history, it was already benefitting from the concept: the American market was so large that trade between states worked much like the free trade enjoyed among countries in places like Europe.


Trade Agreements

NAFTA

NAFTA, or the North American Free Trade Agreement, is an agreement between Canada, Mexico, and the United States that took effect in 1994. Unlike other free trade agreements, it did more than eliminate tariffs and quotas; it effectively synchronized the trade policies of the three nations. It was also notable because of the economic differences among the three countries. NAFTA faced criticism because it sought to create uniform trade laws, meaning countries ended up changing their laws to meet the agreement’s requirements even when the same policies had been rejected at the local level in the past.

Notably, while the Treaty Clause requires treaties to pass the Senate with a two-thirds majority, NAFTA received only a simple majority–more than 50 votes–and was still signed into law. The question at hand was whether NAFTA was a treaty or an international agreement, the latter of which would not require a two-thirds majority in the Senate. NAFTA was challenged in court on these grounds, but the case was eventually dismissed. The constitutionality of NAFTA was also challenged over its binational trading panels, which review the enforcement of U.S. trade laws and can even override such enforcement.

TPP

The TPP, or Trans-Pacific Partnership, is another free trade agreement like NAFTA but on a much larger scale: the deal includes 12 countries bordering the Pacific Ocean, notably excluding China. It has drawn many of the same criticisms and made many of the same promises. Unlike NAFTA, however, the TPP has not yet been approved by Congress and may face significant opposition given the current backlash against free trade.

Read more on the Trans-Pacific Partnership and its potential impact on intellectual property rights.

The accompanying video looks at free trade agreements and the shift in focus toward free trade following WWII:


Criticisms of Free Trade

While free trade has been lauded by economists, politicians, the media, and corporations, it has also drawn a lot of criticism. Most of these criticisms center on its effects: while free trade promises to be the rising tide that raises all boats, opponents claim it actually does the opposite. First, by reducing tariffs and other protective measures, a country is not only eliminating its own trade barriers but also removing those facing another country’s producers. If the two countries were operating on equal footing this would not be a problem; however, that is generally not the case.

A developed nation like the United States has the economies of scale to put smaller, less efficient operations out of business. This is what happened in Mexico, as large American agricultural companies began competing with small Mexican farmers, forcing them from their livelihoods and contributing to their migration to the United States. Conversely, in countries where workers’ rights and environmental regulations are less developed, those weaknesses can be exploited: companies can lower the cost of production and undercut advanced nations with stronger regulations and higher standards.

In this sense, the notion of comparative advantage is turned on its head. Instead of rewarding the best producer, it can reward the cheapest or the least conscientious. This problem alone would be bad enough, but the critique continues. During this race to the bottom, free trade has eliminated higher-paying jobs in wealthier countries and created lower-paying ones in less advanced nations. Unfortunately, the newly employed workers are not wealthy enough to buy more goods, and the now-unemployed workers in developed countries are also buying less. According to critics, instead of creating mutual benefit, free trade has brought about reductions in demand and, ultimately, in trade itself.

A major issue is that comparative advantage is supposed to move laborers from unproductive endeavors to more useful ones. But instead of seeing their efforts refocused in a more prosperous industry, workers in developed countries typically have to find jobs in different sectors of the economy. In countries like the United States, many factory workers have lost their jobs to international competition; instead of getting different factory jobs, they tend to move into the service industry, which typically pays lower wages.

There is some empirical support for these criticisms, with workers in the United States seeing a loss of manufacturing jobs since their peak in the 1970s, rising trade deficits despite free trade, and low or negative wage growth. The question, then, is why anyone would support a concept that hurts the American worker while rewarding countries with loose regulations and low wages. The answer, according to critics of free trade, lies with the people who run multinational corporations: the process naturally benefits them, as it allows companies to cut costs by paying workers less while facing fewer regulations. The following video details some of the effects of free trade:

Despite all the criticism and shortcomings, free trade is not all bad. The concept of comparative advantage increases efficiency in the global economy. Aside from that, free trade offers a number of other potential benefits, including reduced inflation, economic growth, greater innovation, increased competition, and greater fairness. Proponents of free trade also argue that turning to protectionism now won’t really solve the problem and may even be impossible. Finally, although manufacturing jobs have left the United States, many of those who gained jobs in other countries have been lifted out of extreme poverty.


Conclusion

Throughout U.S. history, Americans have grappled with whether protectionism or free trade is in their best interest. While free trade means more markets, it also means greater competition, especially from places where workers’ rights and environmental concerns are less prevalent. And it means doing away with protections that may very well have helped the nation develop and become a dominant world power.

However, trade policies, like anything else, move in waves. For the majority of the nation’s history, this wave crested with protectionism on top. In fact, it took the greatest depression and the largest war in history to finally create a global system that favored free trade. While the Bretton Woods system and deals such as NAFTA and the TPP have carried that system forward, free trade policies have never been universally accepted. In an election where voters and candidates can hardly agree on anything across party lines, the current backlash against free trade may bring people together for at least a brief moment.



Resources

WBUR: Free Trade Fact-Check: NAFTA Becomes Campaign Issue

Common Dreams: What’s The Problem With ‘Free Trade’

Foundation for Economic Education: Free Trade History and Perception

World Bank: Globalization and International Trade

CATO Institute: The Truth about Trade in History

The Fiscal Times: Free Trade vs. Protectionism: Why History Matters

The Economist: The Battle of Smoot-Hawley

BBC News: A Century of Free Trade

Public Citizen: North American Free Trade Agreement (NAFTA)

BBC News: TPP: What is it and why does it matter?

Reference for Business: Free Trade Agreements and Trading Blocs

Law Street Media: What’s Going on With The Trans-Pacific Partnership

Law Street Media: Trans-Pacific Partnership: Why is the IP Rights Chapter Receiving So Much Criticism?

Los Angeles Times: Court Rejects Challenge to Constitutionality of NAFTA

PR Newswire: Recent U.S. Supreme Court Decision Reinforces Doubts About Constitutionality of NAFTA Chapter 19 Panel System

Mercatus Center: The Benefits of Free Trade: Addressing Key Myths

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Costs (and Benefits) of Free Trade appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/real-costs-benefits-free-trade/feed/ 0 51336
The NCAA Tournament: The Method behind the Madness https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/ncaa-tournament-method-behind-madness/ https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/ncaa-tournament-method-behind-madness/#respond Tue, 22 Mar 2016 15:33:26 +0000 http://lawstreetmedia.com/?p=51068

What's behind the NCAA Tournament?

The post The NCAA Tournament: The Method behind the Madness appeared first on Law Street.

]]>
"NCAA Basketball" courtesy of [Phil Roeder via Flickr]

On Sunday, March 13, the 68 teams vying for the men’s NCAA basketball championship were selected, and play started just a few days later. While the tournament will capture the spotlight and the attention of millions for nearly a month, a number of related phenomena often go unnoticed, from massive amounts of money changing hands to the work hours lost to checking phones, computers, or simply watching the games. Keep reading to find out the impact of the NCAA basketball tournament beyond filling out brackets and the action on the court.


History

Although the current men’s tournament is portrayed as an early spring colossus, this was not always the case. The men’s tournament began in 1939–that year Oregon defeated Ohio State in the final game. Even then, the tournament would be almost unrecognizable by today’s standards, with only eight teams competing for the first 12 years. The tournament was so insignificant at the time, in fact, that another event, the National Invitation Tournament (NIT), decided the national champion.

The tournament slowly expanded to 16 teams in 1951, then to 22 teams in 1953, and the number stayed relatively static for the next two decades. In 1975, the field grew to 32, incorporating teams beyond the conference champions. The tournament stretched to 40 teams in 1979 and 48 the subsequent year. In 1983, the number increased to 53, eventually reaching the familiar 64 in 1985.

The field grew again in 2001, finally reaching 68 teams in 2010. The competition now includes four play-in games before the traditional 64-team bracket starts. Along with the tournament’s consistent growth came new methods of selecting teams and seeding them after selection.
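A quick aside on the arithmetic of these expansions: in a single-elimination format, every game eliminates exactly one team, so a field of n teams always requires n - 1 games to crown a champion. The short Python sketch below, using the field sizes from the history above, shows how each expansion changed the tournament’s total:

# Single elimination: each game eliminates one team, so crowning a
# champion from a field of n teams takes exactly n - 1 games.
def games_needed(teams):
    return teams - 1

for field in (8, 16, 32, 64, 68):
    print(field, "teams ->", games_needed(field), "games")

# 68 teams -> 67 games: the four play-in games plus the 63 games
# of the traditional 64-team bracket.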

The women’s tournament began much later, in 1982, but started on a larger scale with a 32-team field; it later expanded to its current size of 64 teams. Apart from slight differences, like the play-in games on the men’s side, the men’s and women’s tournaments operate in the same fashion: four regions are whittled down to a Final Four, whose semifinal winners meet in the national championship.


NCAA

The men’s and women’s basketball tournaments are run by an organization known as the NCAA, or National Collegiate Athletic Association, which administers and oversees college-level athletics in the United States. It began in 1906 as an organization to codify the rules of college football and other sports, adopting its current name in 1910. While it is hard to fathom today, the organization lacked real clout until 1942 and gained particular prestige in 1952, when it began regulating live football coverage. In 1973, it separated into three divisions: larger schools with larger budgets play in Divisions I and II, while smaller schools play in Division III.

Today the NCAA oversees collegiate athletics regionally and nationally, including 80 national championships spread across 20 sports in the three divisions. Along with determining rules and performing administrative tasks, the NCAA’s other function–one that draws a lot of criticism–is determining eligibility.


Moneymaking

As both the bracket and the NCAA have grown over the years, so too has profit from the basketball tournament. While the event is often seen as a time of entertainment and friendly bets, a massive amount of money changes hands. In 2014, for example, advertisers alone spent $1.13 billion to show ads on networks during the NCAA tournament, and over the 10-year period ending in 2014, revenues exceeded $7.5 billion. Much of this money goes to the NCAA through licensing deals: of the $1.13 billion in 2014, approximately $800 million went to the NCAA in licensing fees. The video below looks at the business side of the tournament:

The NCAA sends most of this money to the participating schools. The process works like this: every non-championship game a team plays in the tournament earns its conference a unit of money. That money is then paid out to the conference over a six-year period based on the number of games its teams played, and conferences typically split it among their member schools. Since the money is split among every team in a conference, some schools earn less than if they kept their respective winnings, but the arrangement provides a needed source of income for teams not consistently in contention. Even with this bonus–along with streams of income like ticket sales, boosters, and merchandise–nearly one-third of teams lose money or just break even on their basketball programs after expenses.
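As a rough sketch of how the unit system adds up, consider the following Python snippet. The per-unit dollar value is an assumption made purely for illustration; the real figure changes from year to year and is not taken from this article:

# Hypothetical payout under the NCAA "unit" system described above:
# each non-championship game played earns the team's conference one
# unit, paid out annually over six years.
UNIT_VALUE = 250_000   # assumed dollars per unit per year (illustrative)
PAYOUT_YEARS = 6

def conference_payout(games_played):
    # games_played lists the tournament games (championship excluded)
    # played by each of the conference's teams.
    units = sum(games_played)
    return units * UNIT_VALUE * PAYOUT_YEARS

# A conference whose three teams played 3, 1, and 2 games earned
# 6 units, worth $9,000,000 spread over six years at the assumed value.
print(conference_payout([3, 1, 2]))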


Gambling

Unsurprisingly, for a tournament that generates so much money and interest, gambling plays a major role, and it generally comes in two forms. The form most often associated with the tournament is the bracket pool, filled out across the country in businesses, in schools, and among friends. While this good-natured form generally involves only small wagers, it is technically illegal in most places, though when it comes to enforcement, the authorities typically don’t intervene.

The other form is high-stakes gambling. Betting on the NCAA men’s basketball tournament surpasses even the amount bet on the Super Bowl, though the tournament spans many games while the Super Bowl is a single event. According to estimates from the American Gaming Association (AGA), $9.2 billion is expected to be bet on this year’s tournament, and Americans will fill out approximately 70 million brackets. But according to the AGA, much of that betting is done illegally. In a press release, the association notes:

Of the $9.2 billion that will be wagered this year, only about $262 million will be bet legally at Nevada sports books. The total illegal sports betting market in the United States grew to $148.8 billion in 2015.

The following video looks at the gambling associated with the tournament:


Amateurism

One of the largest issues surrounding March Madness, beyond gambling and its legality, is the role of the athletes, who remain unpaid while the NCAA, basketball conferences, schools, and gamblers profit. Collegiate athletes are considered amateurs by the NCAA, which prevents them from earning money for their work; instead, they are paid in scholarships. While other students, particularly those with student debt, may see athletes’ free education as a decent deal, its value is often far below the market value of what the athletes generate.

Athletes are also unable to augment their scholarships with endorsements or money from agents or boosters. In fact, athletes are barred from having any contact with agents until they give up their amateur status, while universities are free to accept money from and even depend upon boosters. Athletes in sports like basketball and football are also barred from jumping straight to American professional leagues, leaving them few options other than going to college or playing professionally overseas. The demands of playing Division I or II sports may also detract from athletes’ educations.

The scholarships available to student athletes are also misunderstood. They are often seen as a “full ride,” but many athletes are not guaranteed a scholarship for the full four years; until 2012, schools were not even allowed to give out multi-year scholarships. This means coaches and athletic directors have the power to take away scholarships from players who are not performing, while players have little ability to seek recourse. Even for those who maintain their good standing and their scholarships, the value is often not enough to live on, which could tempt some to take money from people like agents, violating NCAA rules.

The amateur system in the NCAA is well-known and has sparked several lawsuits and legal actions. Perhaps the most prominent example is Northwestern University football players’ attempt to unionize. The issue remains far from settled, and even if these athletes eventually do get paid, a system for compensation could be particularly complicated given the nature of college sports and the money involved.

The following video provides a good look at the question of amateurism:


Lost Productivity

March Madness may be unfair to more than just the athletes; it also impacts businesses across the United States in the form of lost productivity. According to Challenger, Gray & Christmas, employers may lose as much as $1.9 billion in pay because of the tournament. This loss in productivity comes in the form of absenteeism, internet bandwidth spent streaming games, and employees becoming distracted at work.

To offset this, many companies have tried co-opting the excitement over the tournament and funneling it into tangible results. For example, one company, Headwaters MB, threw a barbecue and invited local business leaders to improve morale and develop contacts and potential clients. Not all businesses can leverage the tournament to help productivity, but many have at least acknowledged its effects on their employees’ work output.


Conclusion

March Madness is a great time for sports fans across the nation. Not only do they get inundated with almost non-stop action, but they can also form bonds over brackets and friendly rivalries. However, behind all the glory and triumph is a seedier, more practical side to the tournament. This side is concerned with the vast amount of money the tournament generates.

This money is generated by many legitimate means but also through gambling and illegal betting. Businesses also suffer losses in productivity as employees’ attention drifts from their work to the tournament. On top of all of this is the question of the athletes’ role in the moneymaking. The amateur basketball players generate massive sums but aren’t able to make money for themselves unless they end up playing professionally. While the best approach to the issue remains hotly debated, several court cases and legal actions have attempted to change the status quo.


Resources

ESPN: 2016 NCAA Tournament Schedule

History: March Madness is born

Bracketography: The Tournament Over Time

Encyclopedia Britannica: National Collegiate Athletic Association

The Economist: Amateurism in Sports

CBS Sports: Schools Can Give Out 4-year Athletic Scholarships, but Many Don’t

Law Street Media: The Battle in College Sports: Northwestern Football and Unions

Fortune: Guess how Much Money Employers Lose during March Madness

ESPN: 68-team Tournament Approved

ABC News: Into the Pool: NCAA Tourney Betting Booms

Yahoo Finance: March Madness: The $1.5b behind the NCAA tournament

IBT: March Madness 2015: Getting To The NCAA Finals Costs A Lot, But The Rewards For Most Are Slim

Go Banking Rates: Betting on March Madness and NCAA Brackets Could Get You in Legal Trouble

Challenger, Gray & Christmas Inc.: It’s March Madness: This Year’s Madness Could Cost 1.9B

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The NCAA Tournament: The Method behind the Madness appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/ncaa-tournament-method-behind-madness/feed/ 0 51068
Title IX: More Than Just Sports https://legacy.lawstreetmedia.com/issues/education/title-ix-just-sports/ https://legacy.lawstreetmedia.com/issues/education/title-ix-just-sports/#respond Tue, 08 Mar 2016 19:53:59 +0000 http://lawstreetmedia.com/?p=50804

The statute's becoming an increasingly important tool to prevent sexual assault.

The post Title IX: More Than Just Sports appeared first on Law Street.

]]>
Image courtesy of [Tzuhsun Hsu via Flickr]

Recently, several former members of the University of Tennessee Volunteers female training staff sued the university for violating their Title IX rights. While many people may have been caught up in Peyton Manning’s name appearing in the filing, others were probably confused about why Title IX was invoked at all. After all, Title IX is concerned with female athletes having the opportunity to receive scholarships for playing collegiate sports, right? Partly, but it can also be invoked in cases where a woman feels her rights have been infringed upon, notably in the context of a number of high-profile sexual assault cases at major universities. Read on to find out the full scope of the landmark statute and the role it is playing in potentially punishing universities for their actions.


What is Title IX?

Title IX is actually a section of the Education Amendments of 1972. The purpose of these amendments was to prevent discrimination on the basis of sex in all federally-funded education programs and activities. Title IX states:

No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving Federal financial assistance.

Since its inception, the law has been the basis for numerous amendments, reviews, political actions, and even Supreme Court cases. While Title IX is primarily discussed in the context of athletics, there are several other areas that the law regulates.

In regard to athletics, Title IX regulations require schools to give women the same access that they give men. Once it became law, Title IX had a measurable effect on female participation in sports. The law requires all schools to provide equitable opportunities for both men’s and women’s sports in terms of availability, resources, and scholarships. In 1972, when the law went into effect, only about 295,000 girls played sports at the high school level in the United States. By 2011, that number had risen to 3.2 million. Additionally, the number of women receiving athletic scholarships went from zero to 200,000 over the same period.

The opportunity to participate in athletic programs has significant consequences beyond access for women who want to participate. In fact, increased participation has also been associated with increased graduation rates, healthier lives, and diminished trouble with the law.

The video below gives an overview of the effects of Title IX in sports:

Criticism of Title IX

While Title IX has clearly had a significant impact on female participation in athletics and equality in education more generally, the law still has its critics. On one side are those who complain that Title IX is still not doing enough to prevent discrimination. This group’s argument began almost immediately after the law’s inception, when it was weakly enforced and nearly eliminated thanks to the 1984 Supreme Court decision in Grove City College v. Bell. Even as the law started to be more rigorously enforced after Congress passed the Civil Rights Restoration Act of 1987, women at all levels of athletics still have much lower rates of participation and receive less funding than men. Others also argue that enforcement remains weak and investigations can drag on for a long time without being fully resolved.

On the other hand, the law is also criticized by those who bemoan its effects on men’s sports. This starts with the prevailing belief that funding a women’s sport means cutting funding to men’s teams. But between 1988 and 2011, for example, over 1,000 new men’s sports teams were added by NCAA members. Additionally, many of the men’s sports that have seen spending cuts during this time were the victims of universities’ increasing focus and spending on two high-profile sports–football and men’s basketball–and not necessarily because of funding for women’s sports. The interaction between these two sports and Title IX is also frequently misunderstood. Title IX does not require schools to spend the same amount of money on men’s and women’s sports. Instead, all Title IX requires is that the “benefits and services” provided to both men and women are equal.


Preventing Assault

While most discussion of Title IX focuses on athletics, much of the public’s attention has started to shift toward the law’s role in preventing sexual assault. Indeed, protecting students against sexual assault has become one of the most important aspects of Title IX. The Supreme Court even ruled that schools may be liable if they fail to address reported incidents. According to the Department of Education, sexual violence “refers to physical sexual acts perpetrated against a person’s will or where a person is incapable of giving consent… A number of different acts fall into the category of sexual violence, including rape, sexual assault, sexual battery, sexual abuse, and sexual coercion.”

As more students speak out about the issue of sexual assault on college campuses and evidence about its prevalence mounts, the government has taken a more active role in dealing with the issue using Title IX. A large number of surveys measure sexual assault and sexual violence on college campuses, but they often come to different conclusions about the extent to which it affects undergraduates. Most cite the statistic that 1-in-5 female students are victims of sexual assault, and even that figure has its critics. Tyler Kinkade at the Huffington Post points out that while these statistics may be good talking points, the reason the issue has become so important is the large number of students calling for more attention and better procedures to deal with these incidents.

Enforcement and High-Profile Incidents

While concern and outrage over alleged sexual assaults have increased, enforcement has faced some resistance. Institutional indifference became so widespread that in 2014 the Department of Education released a list of over 100 colleges and universities under investigation for violating Title IX. The Department expounded upon this last year, releasing a “Dear Colleague” letter reminding schools what sorts of actions violate Title IX. That letter was a follow-up to a similar one sent out in 2011–itself a reminder of sexual harassment guidelines released in 2001–that gave schools instructions on how to deal with sexual assault complaints. As these steps show, the schools under investigation have been repeatedly reminded of their responsibilities, yet many high-profile cases have emerged recently.

The incident involving Peyton Manning and the University of Tennessee is a perfect example of the difficulties surrounding these types of cases. The case began all the way back in 1997 with a lawsuit against Manning and the University of Tennessee. It continued with another lawsuit against Manning in 2003, after the release of his autobiography in which he depicted one of the women involved unfavorably. The newest lawsuit that was filed earlier this year shows how long the process surrounding these cases can last. In the meantime, the woman who accused Manning had to agree to leave the school, while the university won a national championship and he was able to enjoy a long and storied career. According to the suit, instead of protecting victims, the school actually created an environment that was hostile to them.

This is certainly not the only controversial incident. Another high-profile case involved former Florida State University quarterback and current Tampa Bay Buccaneers player Jameis Winston. A female student accused Winston of rape in 2012 and later sued Florida State over its investigation of her complaint and its “deliberate indifference” throughout the process. The civil suit against the school was resolved this year when FSU paid a $950,000 settlement. The alleged victim has also filed a civil suit against Winston; he has countersued.

The following video looks at the alleged Title IX infraction at FSU:

Results and Remaining Issues

Since the Office for Civil Rights (OCR) began stepping up its expectations and enforcement of Title IX violations, the number of investigations has increased dramatically. Accusations like these and the actions of the Department of Education are not isolated incidents. As of April 2015, the Department of Education had over 100 active investigations into sexual violence-related Title IX issues. In its Dear Colleague letters, OCR instructed institutions to develop new standards for investigating complaints and to designate a Title IX coordinator to ensure that cases are handled properly.

It is important to note that in many of these investigations, including the ones surrounding Manning and Winston, no one has been found guilty in a criminal court–though criminal guilt is not necessarily the point. Regardless, the original claims were not adequately investigated, and in some cases were ignored. Proper investigations may also disprove the claims and absolve the accused. Too often, though, schools are accused of not pursuing complaints thoroughly or lack the necessary processes to properly investigate them. Due to these shortcomings, victims are often depicted negatively and a culture of hostility can result.

Unfortunately, OCR’s investigations and related civil suits often take a very long time to complete. The Department of Education has a large backlog of investigations into schools that have been accused of violating Title IX. While President Obama made a push for more funding, little more was granted, and likely not enough to offset the rise in the number of cases and the loss of approximately a third of the department’s workforce. Title IX also covers K-12 school districts along with colleges and universities, adding another layer of complexity to resolving these cases.


Conclusion

While Title IX is often seen as a law that guarantees equality in sports, it is much more than that. Athletics is only one of many areas in which the statute seeks to ensure fairness and equality. What is clearer than Title IX’s exact breadth is its impact, as it has drastically improved opportunities for women and girls in the United States. Unfortunately, what is also clear are the limitations of the legislation and the trouble that many institutions have complying with the new guidance.

One example of these limitations, and probably the most troubling, is in regard to sexual harassment. There have been repeated, high-profile incidents of workers and students complaining of sexual harassment or assault. As the growing number of OCR investigations indicates, schools have had a hard time instituting processes to adequately deal with these cases. This is exactly what Title IX was meant to prevent, yet prevention has proved elusive. The law is certainly not a panacea, but it applies to more than just sports, and with better implementation it can have a very wide-reaching effect.


Resources

Feminist Majority Foundation: Empowering Women in Sports

Title IX: History of Title IX

NCAA: How is Title IX Applied to Athletics?

The Washington Post: Title IX has Helped Encourage Many Girls to Play Sports

USA Today: Florida State Agrees to pay Winston Accuser $950,000 to Settle Suit

ESPN: Baylor Faces Accusations of Ignoring Sex Assault Victims

CNN: 23% of Women Report Sexual Assault in College, Study Finds

Huffington Post: Federal Campus Rape Investigations Near 200, And Finally Get More Funding

Department of Education: Dear Colleague Letter on Title IX Coordinators

U.S. Department of Education: U.S. Department of Education Releases List of Higher Education Institutions with Open Title IX Sexual Violence Investigations

U.S. Department of Education: Dear Colleague Sexual Violence

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Title IX: More Than Just Sports appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/education/title-ix-just-sports/feed/ 0 50804
Ransomware: Holding Our Digital Lives Hostage? https://legacy.lawstreetmedia.com/issues/technology/ransomware-holding-digital-lives-hostage/ https://legacy.lawstreetmedia.com/issues/technology/ransomware-holding-digital-lives-hostage/#respond Wed, 02 Mar 2016 21:40:34 +0000 http://lawstreetmedia.com/?p=50935

Why is ransomware so effective?

The post Ransomware: Holding Our Digital Lives Hostage? appeared first on Law Street.

]]>
"Virus" courtesy of [Yuri Samoilov via Flickr]

A hospital in Los Angeles, the Hollywood Presbyterian Medical Center, recently agreed to pay a ransom of $17,000. But the ransom wasn’t paid to free some worker held hostage or to prevent the release of a catastrophic pathogen. Instead it was handed over to hackers for the safe return of its patients’ medical files. Hackers managed to penetrate the hospital’s computers and encrypt its files, and demanded a large sum to be paid in the form of Bitcoins. While this scenario sounds far-fetched, this type of crime is actually on the rise. Read on to find out more about ransomware, bitcoins, why these types of attacks are increasing, and what can be done to stop them.


What is Ransomware?

Ransomware is a type of malware employed by hackers to stop users from accessing their own information or data. It does this in one of two ways: either the screen is locked and instructions are provided for unlocking it, or important information is encrypted and a password or key known only to the hackers is required to reopen it. While the exact origin of ransomware is hard to pin down, it appears to have started in Russia sometime around 2006, spreading globally by 2012.

By 2013, ransomware hackers were using encryption, most notoriously through a program known as CryptoLocker. Before encryption, ransomware typically blocked people from using their computers or tricked users into paying to regain access. An example of this is Reveton, which shows notifications claiming to be from a law enforcement agency, informing the user that a crime has been committed and a fine must be paid. Such malware could be uninstalled or removed with an antivirus program, though even that can be difficult. When encryption came on the scene, hackers began encrypting files, making it impossible for users to access their own information without the decryption key. Even if the ransomware itself is removed, the files remain encrypted. This is what makes encrypting ransomware both dangerous and lucrative: it can be removed yet continue to do damage.
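
To see why removing the malware does not help, here is a minimal sketch of symmetric encryption in Python, using the third-party cryptography package (installable with pip install cryptography). It is illustrative only–real ransomware like CryptoLocker reportedly used a more elaborate hybrid scheme–but the core problem is the same: without the attacker’s key, the ciphertext stays ciphertext.

    from cryptography.fernet import Fernet, InvalidToken

    attacker_key = Fernet.generate_key()  # held only on the attacker's server
    ciphertext = Fernet(attacker_key).encrypt(b"contents of patient_records.csv")

    # Deleting the malware does not decrypt anything, and guessing keys fails:
    try:
        Fernet(Fernet.generate_key()).decrypt(ciphertext)
    except InvalidToken:
        print("Wrong key: the data stays unreadable until the real key arrives.")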

In 2014, ransomware hackers also began using the Tor network to remain anonymous. Tor is a network that anonymizes traffic by routing it through a series of relay servers rather than connecting directly to the internet. Hackers began using this network to communicate with the command and control servers that store the encryption keys, which can be sent to an infected computer after a ransom is paid. Doing so makes it nearly impossible to trace an attack to an individual because the attacker’s identity is concealed throughout the process.

The accompanying video gives a quick look at what ransomware is:

Payment

Payment is also an increasingly complex part of ransomware. In the case of ransomware like Reveton, hackers often request payment through services that are difficult to trace, such as UKash, PaySafeCard, and MoneyPak. But a growing trend among these hackers has been to request the money in Bitcoins, which is how the hospital in Los Angeles paid its ransom. Bitcoin is a type of cryptocurrency that exists entirely online with no physical presence. Bitcoins are not controlled by a central bank and are based on mathematics, making the currency completely decentralized and not tied to the value of a commodity like gold or silver. Bitcoin is particularly attractive to hackers because of the anonymity it provides.


Growing Popularity of Ransomware

The threat of ransomware is also on the rise. As of January 2013, there had been 100,000 such attacks, but by the end of that year alone that number rose to nearly 600,000, according to antivirus software company Symantec. Symantec also looked at data from command and control servers used by ransomware hackers to estimate how profitable these scams really are. According to its calculations, hackers can earn around $33,600 per day, amounting to as much as $394,000 in a month. Two primary questions remain: how do hackers select targets and why are attacks increasing?
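
Estimates like these are essentially multiplication: the number of infections, times the share of victims who pay, times the ransom demanded. The sketch below shows the shape of the calculation; all three inputs are hypothetical stand-ins, not Symantec’s published data.

    # All three inputs are hypothetical, for illustration only.
    infections_per_day = 5_700  # machines checking in to one command-and-control server
    payment_rate = 0.029        # assumed share of victims who pay
    ransom_usd = 200            # assumed demand for a consumer-targeted attack

    daily_revenue = infections_per_day * payment_rate * ransom_usd
    print(f"Estimated revenue per day: ${daily_revenue:,.0f}")  # about $33,000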

To answer the first question: targets so far have generally been chosen at random, although future hackers could research a target beforehand to find the most lucrative one. That said, many victims had previously been infiltrated by viruses or spyware, suggesting that certain victims may be chosen simply because their systems are easy to penetrate. Traditionally, these random targets were individuals who paid small sums, but recently, the size of the targets and the requested ransoms have increased. Conventional wisdom on the use of ransomware is also changing as payment for these attacks has shifted more and more to Bitcoins.

Bitcoins help answer the second question–why are ransomware attacks on the rise? While Bitcoin is completely transparent when it comes to transactions, it is often very difficult to trace a Bitcoin address back to an individual, making it easy for hackers to remain anonymous. The rise of Bitcoin has given hackers a reliable and anonymous method to receive ransom payments, which likely contributes to the rise in ransomware attacks.

The video below comments on the attack in LA and the rise of such attacks:


Stopping Ransomware

So with ransomware attacks increasing, how can people avoid falling victim? There are several steps any user can take to eliminate or, at least, mitigate their exposure. First, use reputable antivirus software to help prevent and remove malicious programs. Reputation is important, as there are many fake options that may actually give your computer a virus. Similarly, it is important to make sure your computer’s firewall is strong and activated.

Even with antivirus software in place and a strong firewall, it is still paramount to be cautious. Using a pop-up blocker and being careful when opening email attachments are important ways to avoid exposure. It is also important to back up files and information regularly. If you have a backup of your files in the cloud or on an external hard drive, you will still have access to your information even after the originals are encrypted by ransomware.
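
A backup routine does not need to be elaborate to be useful. Here is a minimal sketch in Python, assuming a hypothetical external drive mounted at /mnt/backup; it copies a folder and records SHA-256 hashes so a later restore can be checked for tampering.

    import hashlib
    import shutil
    from pathlib import Path

    SRC = Path.home() / "Documents"       # folder to protect (example choice)
    DEST = Path("/mnt/backup/documents")  # hypothetical external drive

    def sha256(path: Path) -> str:
        """Hash a file in chunks so large files don't exhaust memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    DEST.mkdir(parents=True, exist_ok=True)
    manifest = []
    for src_file in SRC.rglob("*"):
        if src_file.is_file():
            target = DEST / src_file.relative_to(SRC)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src_file, target)  # copies contents plus timestamps
            manifest.append(f"{sha256(target)}  {target.relative_to(DEST)}")

    # Re-hashing the backup later and comparing against this manifest reveals
    # whether any copy has been corrupted or silently encrypted.
    (DEST / "MANIFEST.sha256").write_text("\n".join(manifest))

Just as important is keeping that drive disconnected except while the backup runs; ransomware will encrypt any volume it can reach, attached backups included.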

In the event of a ransomware attack, it is also important to get the authorities involved, including the FBI, as ransomware is generally beyond the scope of local police departments. The police themselves are not immune to attacks either: departments in both the Boston area and Maine have fallen victim and paid ransoms.

So far, the FBI has actually had some success fighting ransomware. In 2013, for example, it stopped the software platform Citadel, which was behind the Reveton-style ransomware attacks. In 2014, the FBI also disrupted a major botnet–a network of compromised computers used to spread malware–and seized control of the servers behind CryptoLocker. Yet even with these successes, in certain cases the bureau says the best way to deal with ransomware is to simply pay the ransom. While this goes against the conventional wisdom of not giving in to criminals’ demands, the encryption used is often nearly impossible to crack and the requested ransoms may be relatively small. Put simply, for some people it’s often easier to just pay up.


Conclusion

Not only is ransomware on the rise, it is becoming much harder to combat, and hackers are moving on to even more lucrative targets. It is bad enough that individuals often have to deal with ransomware, but hackers are now starting to go after essential institutions such as police departments and hospitals. Whatever the target, the reality is that ransomware is not going away anytime soon. In many respects, ransomware is not that different from other types of malware, with the exception that it offers to restore the user’s files for the right price. As is the case with other malware, ransomware shows no signs of fading. Its methods are becoming more effective and collecting payments is easier than it has ever been.

Unfortunately, potential targets and those already affected have little recourse in this battle. While the FBI has made some progress, even it suggests that paying up for relatively small amounts may be victims’ best option. An important question going forward is how to respond if hackers increasingly target important institutions. And as the profiles of these targets increase, will the ransoms increase as well?


Resources

Symantec: Ransomware: A Growing Menace

Tech Times: LA Hospital Hit By Ransomware Pays Hackers $17,000: Is It The Right Choice

Trend Micro: Ransomware

Tor Project: Tor Overview

Coin Desk: What is a Bitcoin?

Phys.org: Why Ransomware is on the rise

Norton: Beware the Rise of Ransomware

Federal Bureau of Investigation: Ransomware on the Rise

The Security Ledger: FBI’s Advice on Ransomware? Just Pay The Ransom

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Ransomware: Holding Our Digital Lives Hostage? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/technology/ransomware-holding-digital-lives-hostage/feed/ 0 50935
Stadium Deals: The High Price of Your Home Team https://legacy.lawstreetmedia.com/issues/business-and-economics/stadium-deals-high-price-home-team/ https://legacy.lawstreetmedia.com/issues/business-and-economics/stadium-deals-high-price-home-team/#respond Sun, 21 Feb 2016 17:05:33 +0000 http://lawstreetmedia.com/?p=50537

The cost has become huge.

The post Stadium Deals: The High Price of Your Home Team appeared first on Law Street.

]]>
"FirstEnergy Stadium" courtesy of [Erik Drost via Flickr]

As of 2015, each NFL team was worth at least $1.4 billion, according to Forbes. Despite these massive valuations, three teams–the Oakland Raiders, St. Louis Rams, and San Diego Chargers–hoped to leave their current cities and move to Los Angeles. Each wanted a new, higher-revenue stadium to play in, and each was eager to leave because its local government had failed to fund one. This has increasingly become an issue in the NFL, as in other major sports leagues. It has even made hosting international events like the World Cup and Olympics less appealing to cities.

Read on to find out more about the real cost of having a hometown team, why the public keeps giving in to team demands, and how this problem has spread around the globe.


The Cost

The cost to taxpayers to renovate or build new stadiums has been enormous, particularly when it comes to America’s most popular sport, professional football. Over the last 15 years, the NFL has received $12 billion in public funds. These funds are not limited to a few teams either: 29 of the 31 stadiums in the NFL (the two New York teams share a stadium) have used public financing. Public support for pro sports teams comes in the form of tax breaks, loans, and grants offered primarily by a city or county and occasionally the state.

Despite this funding, most cities and voters have little, if anything, to show for it. In fact, some reports suggest that using subsidies to pay for stadiums can negatively affect metrics like poverty rates and median income levels. Robert Baade, a researcher who studies these stadiums and their economic consequences, disputes a clear causal connection between the two, but his own research suggests that new stadiums almost always fall short of delivering the economic uptick promised when they are funded.

Even stadium-related spending on renovations tends to go toward projects such as luxury boxes or seat licenses, which are not typically accessible to the casual fan or those with less disposable income. These same fans also do not see any sort of break on the cost of tickets or concessions. In fact, the only group that really seems to make out well in all this is the owners. Not only do the owners get to keep most of the revenues from the stadium, minus a very modest rent, they also pay very little in upkeep costs.

The video below goes into detail about what goes into building these stadiums:


Pay… or Else

Since most economists agree that there are few, if any, benefits to using public funds to build these stadiums, it may be worth considering why people agree to finance them at all. Sometimes, it may simply be because the public doesn’t know about the plans. In 2013, for example, the Atlanta Braves organization agreed to move the team out of the city and into Cobb County. The move was negotiated in secrecy, and information about the deal was kept from the public because voters otherwise could have rejected the plan.

While the Braves left Atlanta to get funds from a nearby county, the city was already building a new stadium for its NFL team as well as a practice facility for a new MLS team. The move is doubly unfortunate because the Braves’ original stadium, built for the Olympics, was initially financed without any public money. Making matters worse, the stadium they are so eager to leave is not even 20 years old.

In the recent fight over the St. Louis Rams, a court voided a law requiring a public vote to approve stadium funding, only for the Rams to leave for Los Angeles anyway. But when citizens actually do get the chance to vote on how their tax dollars are spent, sports teams have mastered an invaluable tactic to keep the money flowing: the threat of relocation, or basically taking the hometown team hostage to see if the city and the taxpayers will pay up. Once again, the Rams are not the first to use this threat or to follow through with it. The Baltimore Ravens and Indianapolis Colts, two high-profile teams that have won Super Bowls in the last 10 years, both left the cities they originally played in when their demands for new stadiums were not met.

The following video gives a lighthearted look at how teams use the threat of relocating to get new stadiums:


Case Study: The St. Louis (now Los Angeles) Rams

While the Rams are clearly not the only team to move or to use the threat of relocation to leverage a city for a better stadium deal, theirs is the most recent example. The Rams originally moved from Los Angeles in 1995, lured to St. Louis with a $250 million stadium paid for exclusively with public funds. However, as the stadium aged, Rams owner Stan Kroenke, the 63rd richest person in the United States with a worth of $7.6 billion, exercised an opt-out clause that let him flee St. Louis for Los Angeles and its much bigger media market.

This decision came even after St. Louis agreed to offer the Rams an additional $158 million while the new site in Inglewood, California offers no public funds at all. This may seem foolish, but as the owner of the new site–which includes amenities beyond just a football stadium–and landlord for whichever other teams play in Los Angeles, Kroenke is likely to make back his initial investment and more. Although Kroenke ultimately opted for a venue that does not utilize public funds, the Rams managed to get a remarkable offer from St. Louis. In the negotiation process, the owner, and to an extent the local government, utilized many of the classic leveraging techniques such as removing the influence of voters, threatening relocation, and ultimately following through on a threat to move.


A Global Epidemic

While NFL stadiums are the biggest and most public culprits of the stadium financing problem, there are a number of high-profile examples around the globe. These often come in the form of stadiums built for the World Cup and the Olympics.

Less than two years on, many of the sites for the men’s World Cup in Brazil sit idle, barely used. Some of the stadiums are only now being finished while others have been put up for sale so that the government can recoup some of its investment–an estimated $3 billion spent on building and refurbishing the facilities. Brazil is also scheduled to host the Summer Olympics in 2016, a move that comes with similar problems on an even grander scale.

The abandonment of Olympic stadiums has also become a major issue for host countries. Facilities in places like Beijing, Seoul, Athens, and Montreal sit abandoned or are rarely used just years after costing the cities that built them hundreds of millions of dollars. This is especially disconcerting because it comes with the additional cost of hosting the Olympics. In what makes the NFL’s demands look like pocket change, the cost of the Olympics has averaged $3.6 billion from 1968 to 2010 and $16.2 billion after that. As bad as these costs are, the fact that they have an average cost overrun of 167 percent is even more concerning.

In fact, due to the ever-rising costs of holding the Olympics, many cities are now hesitant to host future events. In the run-up to the selection of the site for the 2022 Winter Olympics, Poland, Germany, Switzerland, Sweden, and Ukraine withdrew their bids when polls showed hosting would have been incredibly unpopular. Boston similarly withdrew its bid to host the Summer 2024 Games. Ironically, one of the best examples of a city actually repurposing an old Olympic stadium comes from Atlanta, where the recently abandoned Turner Field was created from the 1996 Olympic Stadium. The accompanying video gives a chilling look at a number of stadiums used for the Olympics and then simply abandoned:


Conclusion

Professional sports teams, like any business, are always seeking to maximize their profits, and it is unrealistic to expect them to do otherwise. Even though the public usually pays a large portion of the cost of a new stadium, it rarely pays all of it. Instead, that cost is usually shared with the owner–or, in the case of the NFL, all the owners, through a league financing pool known as the G4 Fund. Sports leagues are not the only corporations utilizing tax breaks and deals offered by local governments. Many companies take advantage of these opportunities, and there is often less of an outcry when cities, counties, or states offer huge tax breaks to lure other organizations. While these teams and businesses may have a lot of bargaining power, the decision to use public funds remains up to local governments, and in some cases, voters.

However, while there is certainly enough blame to spread around, only the owners of these teams and their partners tend to benefit from these deals. This is particularly true for NFL owners, as they benefit not only from stadiums but from media deals as well. Until recently, the NFL was also considered a non-profit organization, giving it even more tax breaks. All the public has to show for this is more debt, empty stadiums, and the knowledge that the next threat of a move could come at any time. The situation is increasingly frustrating and raises the question: what can be done?

Unfortunately, there may not be much that cities can do. In his recent budget proposal, President Obama included a plan to end tax-free bonds for teams, but that has a long way to go before it becomes law. Other suggestions have included antitrust lawsuits, but these too have gained little traction. The ultimate problem is that there are more cities than teams, meaning the teams will always have leverage of some kind and cities will always be interested in attracting a team. The real lesson in all this is that while we root for our favorite hometown teams, we should remember those teams will likely only remain in our hometowns if the price is right.


Resources

Forbes: The Business of Football

City Lab: The Never-Ending Stadium Boondoggle

The Huffington Post: “Taxpayers Have Spent A ‘Staggering’ Amount of Money On NFL Stadiums

Buffalo Rising: New Stadium Prospectus: Finance-Truth, Misconceptions, and Consequences

The Wire: Voters Don’t Want to Pay for Sports Stadiums Anymore

Curbed: How Atlanta’s Stadium-Building Madness is Nothing New

WBUR: Nearly 20 Years Later, The Legacy Of Atlanta’s Olympic Venues Is Still being Written

The New York Times: In Losing the Rams, St. Louis Wins

St. Louis Business Journal: Kroenke (Stan and Ann) are some of America’s richest

The New York Times: World Cup Stadiums Leave a Troubled Legacy in Brazil

The Wire: Turner Field Is the Latest In a Long Line of Abandoned Olympic Stadiums

Business Insider: The cost of hosting the Olympics is getting out of control

Slate: How to Stop the Stadium Wars

The Atlantic: How the NFL Fleeces Taxpayers

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Stadium Deals: The High Price of Your Home Team appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/stadium-deals-high-price-home-team/feed/ 0 50537
Why Does Peace in Syria Remain Elusive? https://legacy.lawstreetmedia.com/issues/world/peace-syria-remains-elusive/ https://legacy.lawstreetmedia.com/issues/world/peace-syria-remains-elusive/#respond Tue, 16 Feb 2016 18:51:39 +0000 http://lawstreetmedia.com/?p=50503

Where each of the major players stand.

The post Why Does Peace in Syria Remain Elusive? appeared first on Law Street.

]]>
Image courtesy of [Kurdishstruggle via Flickr]

After years of fighting that destroyed cities, set off massive waves of refugees, and killed hundreds of thousands of people in Syria, world leaders are finally coming to the table to reach a peace agreement. On February 1, leaders from around the region and the world met in Geneva, Switzerland to lay the groundwork for a deal that might end the conflict.

While even getting this far is an accomplishment, actually achieving a sustained peace is complicated by the various regional and world powers involved, each with its own agenda to satisfy. Couple that with the role of non-state actors such as ISIS and the Al-Nusra Front, and the reason peace has been so elusive becomes clearer. Read on to find out about the origins of the Syrian conflict, what each side wants, and how those involved expect to create a lasting peace.


A Brief Overview

The war in Syria marks the last gasp of the Arab Spring. Beginning in March 2011, thousands of protesters took to the streets after government forces arrested, tortured, and killed opponents of the Syrian regime. The government’s crackdown escalated the conflict, leading to the consolidation of several rebel factions that rose up in violent resistance. Since the conflict devolved into full-fledged civil war, atrocities and war crimes have been committed by both the rebels and the Syrian government led by Bashar al-Assad. The most infamous were the chemical weapons attacks in 2013, which nearly led to a direct U.S. intervention. The situation was eventually resolved when the United States, Russia, and Syria reached an agreement to dispose of the Syrian chemical weapons stockpile.

Unsurprisingly, the conflict has resulted in violence and destruction on a mass scale. As of the start of 2016, an estimated 250,000 people had been killed and 11 million others have been displaced either internally or abroad. The resulting refugee crisis has reached historic proportions, testing the limits of neighboring countries and the European Union.


Who is Involved?

Due to the long-running nature of the conflict as well as the number of people killed or displaced, many of the world’s major powers have also gotten involved. The contingent opposing Assad includes Saudi Arabia, Jordan, Turkey, Qatar, the United Kingdom, France, and the United States. The countries bolstering Assad are Iran and Russia. Along with these nations are non-state actors such as ISIS and the Al Nusra Front. With all of these groups involved, to understand how the peace process hopes to work, it is first necessary to understand what they each want.

The United States and its Allies

The clearest distinction between what the two sides hope to achieve comes in the targets of their respective airstrikes. The U.S.-led coalition has focused on targeting ISIS positions while trying not to assist Assad in any way. The coalition’s main goals are to bring the conflict to an end peacefully, ensure that Assad leaves office, and stop the flow of refugees.

So far, the West has focused almost exclusively on defeating ISIS rather than fighting the Assad regime directly. The Obama administration initially authorized a program to train rebels, but it was viewed as a disaster and shut down last October. Aside from logistical problems, one area of contention was Washington’s insistence that rebels focus on fighting ISIS over Assad, a priority many rebels did not share. In its place, the United States began to offer arms directly to the Syrian rebels.

An ideal peace agreement for the United States would involve Assad leaving power and the creation of some form of cooperative, moderate government to take his place. Such an agreement would also need to enable displaced Syrians to return home and allow the United States to focus exclusively on defeating ISIS.

Russia

Much of Russia’s interest in Syria runs counter to what the United States wants to see happen. This starts with Russia’s airstrikes, which have reportedly been targeting the opposition groups fighting Assad rather than terrorist organizations such as ISIS. Like Iran, Russia hopes to keep its client Assad in power in Syria; however, its larger aims in Syria and the greater Middle East are far-reaching and complex. For more information about Russia’s role in the Middle East and its interests there, check out this explainer.

So far, Russia has been willing to openly assert its positions even at the expense of a potential peace deal. Most recently, as countries involved in the region agreed to a version of a ceasefire, Russia embarked on an airstrike campaign to support a Syrian government attack on Aleppo, frustrating potential peace partners. For Russia, the best-case scenario would be Assad maintaining his power, allowing Russia to keep its foothold in the area and preserve the stability of one of its longstanding allies.

Saudi Arabia and Iran

Two other major players are Saudi Arabia and Iran. While the Saudis are nominally an ally of the United States, the kingdom has several important interests of its own in the conflict. Iran is similarly situated on the other side of the conflict, finding itself partially aligned with Russia. Both countries’ concerns center on their expanding proxy war, which pits them against one another on religious and geopolitical grounds. The conflict was already sectarian in nature, pitting President Assad–a member of the minority ruling Shia Alawite sect–against the majority Sunnis. Iran, another Shia country, provides billions of dollars in military and economic aid to Assad. Meanwhile, Saudi Arabia has been funneling support to the Syrian rebels. The escalating feud between Iran and Saudi Arabia has already strained the peace efforts, as Saudi Arabia’s execution of a Shia cleric prompted Iranian retaliation and rising tensions.

For Iran, it would be a major victory if Assad were able to stay in power. Not only would it mean keeping him as a client, it would also help Iran maintain influence in Lebanon. Additionally, it would serve as a victory over both Saudi Arabia and the United States. For Saudi Arabia, victory would mean Assad losing power and a new government drawn from the Sunni majority. This would give the Saudis a badly needed win in a proxy war that has so far seen Iran gain influence throughout the Gulf.

Non-State Actors

Adding fuel to the sectarian nature of this war is the presence of non-state groups such as ISIS and the Al-Qaida sponsored Nusra Front. These groups have battled each other, the other countries acting in Syria, and Assad’s forces. ISIS has proven to be the most successful and prominent group, taking and holding large chunks of territory in both Iraq and Syria. In fact, ISIS is the reason why the foreign powers are in Syria in the first place, although Russia, Iran, and likely some of the Gulf States are clearly there for other concerns as well.

The presence of ISIS and Al Nusra has severely complicated the situation in Syria. The mere presence of these groups makes any effort to arm Syrian rebels much more complicated, as countries fear that their weapons will fall into the wrong hands. Unfortunately, it is nearly impossible to distinguish who is a member of ISIS and who is just someone fighting against the regime. Aside from ISIS and Al Nusra, Iran-backed Hezbollah and the Syrian Kurdish PYD have also been involved in the fighting.

Syria

Then there’s Syria itself. The ongoing conflict has destroyed much of the country’s infrastructure and displaced massive portions of the Syrian population. Estimates indicate that repairing the damage could cost as much as $200 billion. Considering how hard it has been merely to find funds to help Syrian refugees, it appears unlikely that much money could or would be raised to rebuild an unstable country.

The best-case scenario for Syria is hard to pinpoint. Assad’s departure would certainly be in the interest of the majority Sunni population, but it could also create a massive power vacuum, furthering the rise of extremism. Perhaps, then, forming some type of coalition that incorporates both the opposition and elements of the Assad regime in order to maintain some sort of peace is the most that can be hoped for.

With all these parties involved and the constant infighting, little has been accomplished. The reality is, there is more than one war going on in Syria at the moment. To achieve peace in Syria, all these separate conflicts would need to be resolved at once, with the possible exception of the fight against ISIS.

The following video gives a sample of what may be next for Syria:


Peace for our Time?

In mid-December, the U.N. Security Council agreed to create a path that would eventually lead to peace in Syria. After years of violent conflict, peace talks finally began on February 1 in Geneva, Switzerland. The talks started with U.N. special envoy Staffan de Mistura meeting separately with government and opposition representatives. The talks are tentatively planned to last for six months. However, there is not even a preliminary understanding of how, let alone whether, Assad will give up power.

In fact, the only reason these talks are taking place now is that conditions in some places are so dire they could warrant war crimes charges. The opposition only considered participating because it was promised that major headway would be made toward addressing these most serious issues. And almost immediately after the process was initiated, it was suspended due to attacks by the Syrian government with Russia’s backing. How much ultimately comes of these talks, and whether they even occur as planned, remains to be seen. The following video gives a quick look at some of the problems plaguing the peace talks:


Conclusion

After years of fighting, millions displaced, and hundreds of thousands dead, peace talks in Syria must be a good idea, right? Unfortunately, all available evidence suggests that there is very little chance of a sustainable peace agreement on the horizon. While talks may help strengthen diplomatic ties as the conflict rages on, there appears to be very little in the way of progress to stop the violence.

The problem with this peace process is that there are too many parties at play, with very different sets of interests and strategic goals. One side wants Assad to stay; the other will not negotiate unless he is forced to leave. And that is just one of many questions at hand, as the parties have a wide range of strategic interests in the war. The problem is compounded further by the fact that the opposition to Assad is a hodge-podge of groups and no one can agree on whom to trust. In fact, the strongest opposition group in Syria is probably ISIS or the Al Nusra Front, but neither was invited to the peace conference, for obvious reasons.

While some sort of peace in Syria may be possible down the road, an agreement favorable to all those involved, especially the Syrian people, is far less likely.


Resources

International Business Times: Syria: Shaky Peace Process to Start in Geneva Amid Deadly Bombings and Sieges

BBC News: Syria: The Story of the conflict

BBC News: Syria Crisis: Where Key Countries Stand

Law Street Media: Why is Russia Getting Involved in the Middle East?

The Guardian: Future of Assad in Doubt as UN Unanimously Supports Syrian Peace Process

Euro News: Aleppo Assault Threatens Fragile Syrian Peace Process

Al Jazeera: Prominent Syrian Rebel Commander Killed in Airstrike

Al Jazeera: Saudi-Iran Crisis Throws a Wrench in Syria Peace Talks

History News Network: 6 Predictions About What will Happen in Syria

CNN: You Thought Syria Couldn’t Get Much Worse. Think Again

The New York Times: Syria Talks Are Suspended

BBC: Arming Syrian rebels: Where the US Went Wrong

 

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Why Does Peace in Syria Remain Elusive? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/peace-syria-remains-elusive/feed/ 0 50503
Iran’s Leadership: Inside the Complex Regime https://legacy.lawstreetmedia.com/issues/world/irans-leadership-bottom-top/ https://legacy.lawstreetmedia.com/issues/world/irans-leadership-bottom-top/#respond Thu, 11 Feb 2016 21:15:12 +0000 http://lawstreetmedia.com/?p=50379

Who is in charge in Iran?

The post Iran’s Leadership: Inside the Complex Regime appeared first on Law Street.

]]>
"Ayatollah Khomeini" courtesy of [David Stanley via Flickr]

Like many Western nations, Iran has a parliament and a president, but its political structure is far more opaque. From the Supreme Leader to influential religious councils, understanding Iran’s leadership presents a challenge in and of itself. This challenge has been highlighted by a number of high-profile events where it was unclear who had the final say in important Iranian policy decisions. Read on to learn how Iran’s leadership developed, how it is currently structured, and how that leadership defines itself both domestically and abroad.


The Revolution and Aftermath

The Iranian Revolution that occurred in 1979 was years in the making; its origins go back at least to 1953. During that year, the CIA helped overthrow Iran’s elected prime minister in favor of the Shah, who had Western leanings and was an opponent of Soviet-style communism. While the Shah honored his loyalty to the United States, he was less kind to his own people, frequently imprisoning and even torturing those who opposed him.

This set the stage for the revolution of 1979. The movement was led by Ayatollah Ruhollah Khomeini, who returned from Paris, where he had been living in exile. In place of the Shah’s one-party government, the Ayatollah installed his own based on Islamic teachings, placing himself as the country’s Supreme Leader. The new emphasis on strict adherence to Islam meant a rollback of the Shah’s few, more liberal reforms concerning the economy and women’s rights.

The following video details the specifics of the Iranian Revolution:

The Shah, who had come to power following World War II, ruled as the head of a constitutional monarchy with himself as the final arbiter. When he was deposed by Khomeini, the democratic institutions that had existed were kept; however, any power they had was drained. In the new system, Khomeini ruled as the unquestioned leader of his own government, which focused heavily on instilling Islamic concepts and resisting interaction with Western nations he viewed as corrupting Iran. The next sections detail the unelected and elected elements of Khomeini’s Iran and how they are structured so that the Supreme Leader’s power is virtually unchallenged.


Unelected Officials

Similar to the U.S. government, part of Iran’s government is appointed, independent of any elections. In the Iranian case, however, this aspect of the government is unquestionably the most powerful part, including many important institutions.

The Supreme Leader

As the final decision maker, the Supreme Leader has either direct or indirect control over nearly the entire government, because his primary responsibility is to maintain the continued existence of the Islamic State of Iran. To ensure this, the Ayatollah has power over all three branches of government, the military, and even the state-run media, as well as influence over virtually every other political institution, the economy, and major policy decisions. In other words, the Supreme Leader is the undisputed power in the Iranian regime.

The person who spearheaded the 1979 revolution and the first to hold this all-important office was Ayatollah Ruhollah Khomeini. Khomeini founded the state and defined his role in it by championing four key characteristics: “justice, independence, self-sufficiency, and Islamic Piety.” Khomeini also offered a religious justification for the office, believing he held the place on earth of the 12th Imam, a descendant of the Prophet Muhammed who Shiites believe went into hiding centuries ago. Khomeini died in 1989 with no appointed successor.

The man who succeeded him, and the current Supreme Leader of Iran, is Ayatollah Ali Khamenei. Khamenei has served in this position since 1989, making him the second-longest-serving current ruler in the Middle East. Khamenei was a longtime loyalist to Khomeini and also served two terms as Iran’s president before outmaneuvering rivals for the coveted position.

The Guardian Council

Next in Iran’s unelected hierarchy is the Guardian Council, arguably the most important Iranian institution aside from the Supreme Leader. The council has the final say on legislation passed by the parliament and determines which candidates are eligible to run for the parliament, the presidency, and the Assembly of Experts. It has 12 members serving six-year terms: six chosen by the Supreme Leader and six chosen by the judiciary and confirmed by parliament. The council’s power to evaluate legislation resembles that of the U.S. Supreme Court, but while the Supreme Court evaluates laws based on their adherence to the U.S. Constitution, the Guardian Council determines whether laws comply with both Iran’s constitution and Islamic law.

The Expediency Council

The Expediency Council serves as advisors to the Supreme Leader, much as the cabinet does to the president. This assembly is directly appointed by the Supreme Leader and consists of highly regarded political, social, and religious authorities. Aside from advising the Supreme Leader, this body’s main responsibility is to act as the final arbiter in disputes between the Parliament and Guardian Council. In 2005, it was also granted sweeping powers by the Supreme Leader over all branches of the government.

The Judiciary

Iran’s judiciary is a multi-tiered system of courts tasked with overseeing the enforcement of the law and settling grievances among Iranian citizens. The Supreme Leader has considerable control over the judiciary, as he appoints its head, who in turn appoints the head of the Supreme Court and the top public prosecutor. There are three main branches of the judiciary: the public courts, the revolutionary courts, and the special clerical court. While the public courts deal with criminal and civil matters, the latter two deal with everything else.

Based on the structure of the judiciary and its position beneath the Supreme Leader, many believe it is often used as a political tool to squash dissent and maintain strict control over the people of Iran. Critics also note that the trial process in Iran is often opaque and restrictive, allowing greater government influence.

The Revolutionary Guard

The Islamic Revolutionary Guard Corps (IRGC) is yet another body whose leadership, like that of the regular army, is appointed by the Supreme Leader. The group was created following the revolution to defend its key figures and fight its opponents. Unsurprisingly, it answers only to the Supreme Leader. Aside from running militia branches in every town in Iran, the Revolutionary Guard has widespread influence throughout Iranian life.

The Revolutionary Guard’s special place both within the military and within Iran itself comes from its initial purpose: serving as an armed body loyal to the revolution, in contrast to the regular army, which had been loyal to the departed Shah. Since its inception, the guard has acquired billions of dollars from activities such as shipping, construction, defense contracts, and oil production. It uses many of these assets to fund militant or extremist groups abroad, such as Hezbollah. The Revolutionary Guard is so powerful, in fact, that some American and E.U. sanctions have targeted the IRGC specifically.

The two other major components of Iran’s defense forces are the army and the Ministry of Intelligence and Security, which is essentially the Iranian CIA. All three of these groups fall under the direction of the Supreme National Security Council. While that council is nominally under the control of the president, in reality, the Supreme Leader holds most of the power.


Elected Officials

As in the United States, a portion of the Iranian government is elected by the people. Anyone over the age of 18, including women, is eligible to vote. Also as in the American system, the different branches have some checks on one another.

The President and Cabinet

The presidency in Iran shares some characteristics with the same position in the United States: the presidential term is four years, and a president can only be elected to two consecutive terms. However, while the president is, in theory, the second most powerful person in Iran behind the Supreme Leader, in reality the office’s power is drastically curtailed by unelected leaders. Not only does the president answer to the Guardian Council, which chooses who can run for the position in the first place, but the Supreme Leader also retains final authority over most major political decisions. Indeed, Iran’s president is often described as one of the few chief executives in the world with no control over his country’s military.

Parliament

Iran’s parliament has 290 members and is similar to most Western legislatures. Its membership is determined through popular elections. Once elected, members have the power to introduce and pass laws as well as summon and impeach cabinet ministers and the president. Once again, though, the parliament’s power, and even who is eligible to run for office, is determined by the Guardian Council. Unlike the U.S. Congress, the Iranian legislature is a unicameral body whose members serve four-year terms. The parliament’s sessions and its minutes are open to the public.

Assembly of Experts

The final part of Iran’s leadership that is directly elected is the Assembly of Experts. The body has 86 members, each elected to an eight-year term, and every member must be a cleric or religious leader. This group has the critical responsibility of appointing and subsequently monitoring the Supreme Leader. Its members are vetted first by the Guardian Council, the primary check on its influence. The assembly meets for only one week each year, and although it has the power to depose the Supreme Leader, it has never challenged any of his decisions since the Islamic Republic of Iran was formed. The accompanying video gives a concise explanation of how the Iranian government is organized:


Major Challenges Facing Iran’s Leadership

Domestic Dissent

Protests in Iran became particularly significant in the 20th century, as Iranian citizens frequently spoke out against the government. For the first half of the century, this dissent was aimed at the decadent dynastic government and at foreign powers exerting quasi-colonial influence. Resistance then focused on the Shah, which eventually led to the Iranian Revolution. Following the revolution, discontent reemerged in 2009, when people took to the streets to dispute then-President Ahmadinejad’s reelection. In 2011, another flare-up of protests occurred concurrently with the Arab Spring revolts in nearby countries. Much of that protest again focused on the contentious 2009 election and was led by the Green Movement.

International Relations

Political decisions in Iran are often the result of a complex process that is typically driven by the Supreme Leader. Given the nature of the Iranian government, several international concerns have significant implications for the country and how its government responds.

Possibly the most pressing concern facing Iran is its proxy war with Saudi Arabia. The two countries have effectively positioned themselves as the defenders and standard bearers of Islam, but they champion different denominations. This is especially true of the Supreme Leader, who feels it is his mission to lead Islam and who views Saudi Arabia as an obstacle to that mission. The proxy conflict threatens to turn into more direct confrontation if Iran reneges on its nuclear deal. The video below details the proxy war between Iran and Saudi Arabia:

The recent nuclear deal between Iran, the United States, and other world powers brings up another important challenge for the country. While the two governments worked together to finalize the deal, tensions remain. Aside from the history of distrust between the countries, Iran’s support for groups such as Hezbollah and Hamas–which are considered terrorist organizations by the U.S. State Department–and its anti-Israel policy remain hurdles.


Conclusion

Iran has a large and complex leadership structure, which originated in the aftermath of the 1979 revolution. On one hand are democratic institutions, such as the president and parliament, that resemble American and Western models. On the other is a series of appointed offices that wield a significant portion of the country’s political power. At the heart of this system lies the Supreme Leader, who controls many of the appointments and has final say over virtually all of the country’s affairs. The system is itself a reaction to the Shah’s previous secular regime, and it was founded upon a greater emphasis on Islamic law as well as an inherent animosity toward the United States.

Iran is a mixture of theocracy and democracy, and understanding how it is governed is critical to dealing with it effectively. As history has shown, many countries, particularly the United States, have misinterpreted or misjudged the nation’s leadership.


Resources

The New York Times: 1979: Iran’s Islamic Revolution

United States Institute of Peace: The Supreme Leader

BBC News: Guide: How Iran is Ruled

Your Middle East: Iran’s Century of Protest

GlobalSecurity.org: Pasdaran: Iran Revolutionary Guard Corps (IRGC)

The Guardian: Iran Protests See Reinvigorated Activists Take to the Streets in Thousands

Politico: The Hidden Consequences of the Oil Crash

The New York Times: U.S. and Iran Both Conflict and Converge

Encyclopedia Britannica: Mohammed Reza Shah Pahlavi

United States Institute of Peace: The Oil and Gas Industry

PBS: The Structure of Power in Iran

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Iran’s Leadership: Inside the Complex Regime appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/irans-leadership-bottom-top/feed/ 0 50379
Going Green: The Future of Renewable Energy is Getting Brighter https://legacy.lawstreetmedia.com/issues/energy-and-environment/going-green-future-renewable-energy-getting-brighter/ https://legacy.lawstreetmedia.com/issues/energy-and-environment/going-green-future-renewable-energy-getting-brighter/#respond Sat, 30 Jan 2016 14:15:34 +0000 http://lawstreetmedia.com/?p=50211

The future looks bright and green.

The post Going Green: The Future of Renewable Energy is Getting Brighter appeared first on Law Street.

]]>

Image courtesy of [Chuck Coker via Flickr]

The precipitous decline in the price of crude oil and its effect on the stock market have caught the interest of the energy community, Wall Street, and the American public. What may be lost in all this is that even while oil prices plummet, renewable energy sources are closing the gap and becoming increasingly affordable thanks to factors such as improved infrastructure and beneficial policies. In fact, renewable energy’s recent success even got a mention in President Obama’s final State of the Union address. This article looks at how renewable energy turned the corner and became a beacon of light for the future of American power.


Sources of Renewable Energy

The outlook for renewable energy has improved significantly in the United States. President Obama noted in his State of the Union address, “in fields from Iowa to Texas, wind power is now cheaper than dirtier, conventional power.” While that claim holds only in select places, not the entire nation, the fact that it has any truth behind it at all is a sign of major change.

Wind

Wind power, in particular, has seen massive growth in the past few years. Last December, installed wind capacity passed the 70-gigawatt threshold. To put this in perspective, the power generated from that much wind capacity can supply 19 million homes. The leap forward is the result of a combination of factors, including the expansion of wind infrastructure: there are now 50,000 working turbines across 40 states and Puerto Rico. It is also thanks to tax credits, namely the Renewable Energy Production Tax Credit, which was extended after a brief lapse as part of the latest budget. This growth has led to a drastic decrease in the price of wind power, which has dropped 66 percent from 2009 levels. It has also meant a large jump in wind’s share of the energy market, going from less than 1 percent in 2007 to between 4.5 and 5 percent this year. The U.S. Department of Energy predicts that wind’s production and share of the energy market will increase dramatically in the future.
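As a rough back-of-the-envelope check on that homes-served figure, the arithmetic works out under two illustrative assumptions (a capacity factor of about 33 percent and average household consumption of roughly 10,800 kilowatt-hours per year, neither of which comes from the sources cited here):

70 GW x 0.33 capacity factor x 8,760 hours/year ≈ 202,000 GWh per year

202,000 GWh/year ÷ 10,800 kWh per home per year ≈ 18.7 million homes

The same arithmetic with a lower capacity factor of roughly 25 percent approximately reproduces the 4.6 million homes attributed to solar’s 22,700 megawatts below.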

Solar

Like wind power, solar power has grown rapidly over the past few years. According to the Solar Energy Industries Association, solar energy accounted for 40 percent of all new electrical generating capacity in 2015. This growth has translated into 22,700 megawatts of capacity, or enough to power 4.6 million American homes. At the same time, the price of solar installations has dropped 73 percent since 2006, and 45 percent for residences since 2010. Much of that decrease in cost has been fueled by the Investment Tax Credit, which took effect in 2006 to help promote growth in the renewable energy industry.

Hydroelectric Power

Another major source of renewable energy is hydroelectric power. Currently, the total capacity for hydroelectric power is 79,000 megawatts, spread across 2,400 facilities throughout the United States, although the majority are located along the West Coast. From 2000 to 2010, hydroelectric power accounted for somewhere between 5.8 and 7.2 percent of the total energy produced in the U.S., and about 17 percent globally. Initially, hydroelectric power accounted for the vast majority of all renewable energy production, though that share is falling as additional renewable sources come online. Hydroelectricity is incredibly cheap and flexible, especially in comparison to solar and wind, though prices for the latter two have been decreasing in recent years.

Biomass

Wind, solar, and to a lesser extent hydroelectric, get most of the attention, but when it comes to production, biomass is the leader in terms of energy output. This may come as a surprise because exactly what counts as biomass can be somewhat confusing. Biomass has garnered such a share of the market largely through ethanol fuel, which in 2013 accounted for roughly 43 percent of all biomass energy used in the United States. In fact, that same year, biomass made up nearly half of all renewable energy used–twice as much as the second-highest source, hydroelectric power. The notion that biomass is the leader in renewables should not be particularly surprising: for most of human history, biomass, namely wood, was the primary source of energy, with the conversion to coal coming only fairly recently. But it is important to note that while most renewable energy sources are touted as environmentally friendly, biomass energy sources are not completely carbon neutral.

Geothermal

Another growing renewable power source is geothermal energy. The United States has approximately 3,000 megawatts of geothermal generating capacity, but it accounts for less than 1 percent of the total U.S. energy output. Growth in geothermal has been relatively slow in recent years, although scientists argue it could be a major source of electricity in the future.

The video below explains how the different types of renewable energy work:


Regulations

On the Federal Level

Starting at the federal level, there are a number of programs and regulations in place to monitor and encourage the growth of renewable energy. One example is the Federal Energy Management Program, which aims to reduce emissions by government vehicles and in government buildings. The Environmental Protection Agency also has a number of programs in place to reduce emissions, conserve the environment, and encourage the use of renewable energy. One is the Green Power Partnership, which provides free advice, training, and support to companies that want to make better use of renewable energy. Another is the Landfill Methane Outreach Program, which seeks to protect the environment by helping landfills reduce emissions and capture methane for use as a renewable energy source. The EPA’s AgStar program also promotes the recovery of methane, this time from animal feeding areas. Lastly, there is RE-Powering America’s Land, which encourages the development of renewable energy projects on previously contaminated sites.

State and Local

There are also a number of beneficial policies at the state level. Some of the most significant are Renewable Portfolio Standards, which require that utilities supply a certain percentage of their customers’ energy from renewable sources. These standards help encourage the growth of renewable energy and allow for a more localized approach to setting requirements.

Public Benefits Funds for Renewable Energy create a pool of money to invest in renewables. The funds are supported by a special charge on customers’ energy bills and can help encourage local renewable production. Output-Based Environmental Regulations set emissions limits per unit of energy produced; in doing so, states can encourage utilities to expand their energy portfolios to provide electricity from cleaner sources. Many states also have Interconnection Standards to help ensure that new energy sources have easy access to the electrical grid. Another particularly rewarding policy is Net Metering, which allows customers who have a renewable system in place, like rooftop solar panels, to be paid for any energy they provide back to the power grid. The combination of Interconnection Standards and Net Metering helps make it cost-effective for homeowners to adopt sources of renewable energy. Along similar lines are Feed-in Tariffs, which require electrical companies to pay a premium to individuals who provide renewable energy to the grid.
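To make the Net Metering mechanism concrete, consider a simple hypothetical monthly bill (the consumption figures and the $0.12 retail rate below are illustrative assumptions, not drawn from any particular state’s program):

900 kWh consumed − 200 kWh exported = 700 kWh billed

700 kWh x $0.12/kWh = $84, versus $108 without the export credit

Under a Feed-in Tariff, by contrast, the exported 200 kWh would be purchased separately, at a premium rate above the retail price.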


The Future of Renewables

With all the growth in renewables, it is not surprising that the future looks very green. The U.S. Energy Information Administration projects that renewable energy will be the fastest-growing energy source through 2040. Additionally, while the industry still relies on tax credits, it seems on the verge of weaning itself off of them, as growth has made some renewables competitive with, if not cheaper than, fossil fuel sources in parts of the United States. Companies are also on the verge of improving storage solutions, currently one of the major problems with solar and wind energy. The accompanying video looks at the future of renewable energy:

It is no surprise, then, that according to the Renewable Electricity Futures Study, renewable energy could provide as much as 80 percent of all U.S. electricity by 2050. In the process, this transformation would have significant benefits for the climate, economy, and public health. By switching to these forms of energy production, the United States would also see a drastic reduction in its water consumption, because large quantities of water are currently needed to cool traditional power plants.

The efforts supporting renewable energy will likely benefit from last year’s Paris Climate Conference. During the conference, nations across the world vowed to reduce emissions and invest more in renewable technology in an effort to keep the planet’s temperature from rising more than 2 degrees Celsius. The following video looks at the specifics of the Paris Climate Conference and its impact on the future of renewables:

Criticisms/Setbacks

While renewable energy sources are growing and expanding, it is still not full steam ahead–even renewables come with caveats. The main issues currently are cost and investment. While wind and solar are thriving, they are doing so with the help of generous tax breaks, and supporters have warned that growth could start to decline if those breaks do not continue. There are also many examples of failed renewable energy companies. Perhaps the most notorious is Solyndra, a solar panel manufacturer, which was given a $535 million loan by the Department of Energy but ended up defaulting. There are other examples too, including Abound, a solar company, and Fisker, an electric car company.

The many forms of renewable energy production also have their own particular challenges. Hydroelectric power, for example, is very useful, but few, if any, new projects are planned for the future. Biomass has accompanying issues as well, namely that production requires valuable land and resources to grow the corn used in ethanol, which could arguably be better used to grow crops for food. Plants burning biomass may also produce more pollution than traditional plants that burn coal or natural gas. Even solar and wind, while not nearly as hazardous to the local environment, have trouble storing any energy produced in excess of existing demand.


Conclusion

Production from renewable energy sources has seen dramatic growth in recent years, and estimates suggest that growth will only continue, if not speed up. But despite the recent success of the renewable energy industry, as oil prices remain low, money could easily move back toward traditional power sources. Yet doing so would almost certainly be bad for business: the decline in oil’s price is not the sign of a prosperous future, but a perilous one. Renewable energy sources are growing quickly and are already becoming competitive with traditional, dirty sources of electricity. Even much-publicized failures are showing signs of improvement: the Department of Energy’s loan program, which made the infamous Solyndra loan, is now turning a profit from interest.

Renewable energy certainly has hurdles to clear in terms of environmental impact, scale of production, and effective storage. The industry will also need to become less dependent on tax breaks, though signs of that are already emerging. If state and federal programs continue to support the growth of new energy sources, the United States may be able to meet its goals for renewable energy production in the coming decades.


Resources

FactCheck.org: Obama’s Wind Energy Claim

NPR: Wind Power Continues Steady Growth Across The U.S

Solar Energy Industries Association: Solar Industry Data

Center for Climate and Energy Solutions: Hydropower

The Breakthrough: Growth of Biomass Far Outstrips Growth of Solar and Wind

Geothermal Energy Association: 2015 Annual U.S. & Global Power Production Report

Energy.gov: About the Federal Energy Management Program

EPA: State and Local Climate and Energy Program

Scientific American: Strong Future Forecast for Renewable Energy

Union of Concerned Scientists: Renewable Energy Can Provide 80 Percent of U.S. Electricity by 2050

The Huffington Post: The Paris Climate Conference is Over, but the Renewable Energy Transformation Has Kicked Into High Gear

NPR: After Solyndra Loss, U.S. Energy Loan Program Turning a Profit

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Going Green: The Future of Renewable Energy is Getting Brighter appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/energy-and-environment/going-green-future-renewable-energy-getting-brighter/feed/ 0 50211
Finding El Chapo: What his Arrest Means for Mexico and the Drug Trade https://legacy.lawstreetmedia.com/issues/world/finding-el-chapo-arrest-means-mexico-drug-trade/ https://legacy.lawstreetmedia.com/issues/world/finding-el-chapo-arrest-means-mexico-drug-trade/#respond Thu, 28 Jan 2016 22:10:46 +0000 http://lawstreetmedia.com/?p=50139

Will it make a difference?

The post Finding El Chapo: What his Arrest Means for Mexico and the Drug Trade appeared first on Law Street.

]]>
Image courtesy of [Florent Lamoureux via Flickr]

Early in the morning on January 8, the notorious cartel leader Joaquin Guzman, also known as El Chapo, was captured yet again by Mexican authorities following a heated gun battle at his hideout. While Guzman’s story has a number of interesting subplots, including his multiple prison escapes and an interview with Sean Penn, it also points to something larger: the ongoing war on drugs, whose epicenter now sits in Mexico. This has not always been the state of things; South America, particularly Colombia, was once home to the heart of drug trafficking and its most infamous leader, Pablo Escobar. The recent arrest highlights how that center has moved north and, not coincidentally, much closer to the U.S. border. Read on to see how the heart of the drug trade has shifted in recent years, what impact that has had in Mexico, the role of the United States, and whether capturing El Chapo really makes any difference in the larger war on drugs.


It Started in South America, Now It’s Here

To understand the importance of capturing someone like El Chapo, or even the Mexican drug trafficking industry in general, it is necessary to travel one step backward to Colombia. The Colombian drug trade really took off in the 1970s when marijuana traffickers began trading in cocaine because of increased American demand for the drug. Trafficking cocaine was considerably more profitable than marijuana and the growth in profits caused a dramatic increase in the scale of smuggling.

The amount of money in this industry led to the formation of two incredibly powerful competing cartels, the Medellin and the Cali Cartels. The Medellin Cartel, known for its ruthlessness and use of violence, was epitomized by its leader, the notorious Pablo Escobar. The Cali cartel, on the other hand, was much more inconspicuous, reinvesting profits in legitimate businesses and using bribery instead of violence to get its way. The competition between these two groups turned violent, eventually involving the Colombian government and even the United States.

In the 1990s, these two groups were finally undone by concerted efforts between the local Colombian government and U.S. advisors that led to their leaders being either imprisoned or killed. Since their peak, these empires have fragmented, as smaller groups took control over various parts of the cocaine-producing process. While the violence in Colombia has decreased, though not disappeared altogether, the dominant player in the drug trafficking world has shifted to Mexico.


Going North

Mexico had originally been the final corridor through which Colombian cocaine passed before entering the United States. Before Mexico, cocaine had been smuggled through the Caribbean to cities like Miami, but those routes were ultimately shut down by the United States. During the peak years of operation in Colombia, Mexico was little more than a path into the United States. However, this began to change with the demise of the Cali and Medellin cartels, coupled with continued American pressure and aid packages to help the Colombian government fight the local drug trade. With the Colombian cartels fragmented and weakened, the center of the drug trade shifted north to Mexico, which served as a natural hub due to its earlier role in distributing the drugs produced in Colombia.

While the Mexican cartels came to dominate the illegal drug trade, their rise preceded the actual demise of their Colombian brethren. Much of the history of modern cartels in Mexico can be traced back to one man, Miguel Angel Felix Gallardo. Gallardo was responsible for creating and maintaining the smuggling routes between Mexico and the United States. When he was arrested, his network splintered into several parts, laying the groundwork for many of the cartel divisions that exist today. The first major successor was the AFO or Tijuana/Arellano Felix organization. However, its status was usurped by the Sinaloa Cartel under El Chapo’s control.

The Sinaloa Cartel is believed to control between 40 and 60 percent of the drug trade in Mexico, translating into annual profits of up to $3 billion, but it is only one of nine cartels that currently dominate the country. The cartels’ activities have also expanded, as they are now involved in other criminal enterprises such as kidnapping, extortion, theft, and human trafficking, as well as smuggling new drugs into the United States.

The rise of the Mexican cartels can be attributed to other factors aside from the demise of the Colombian groups. One such factor was the role of the Mexican government. During the important period of their ascendancy, the cartels were largely left alone by the Mexican government, which was controlled by the Institutional Revolutionary Party (PRI) for 71 years. When the PRI’s grip on power finally loosened, its tacit alliance with the cartels also dissolved.

The growth of the Mexican cartels may also have been the result of economic problems in the United States. Stagflation in the United States led to higher interest rates on loans that Mexico could not repay. To avert an economic crisis, several international institutions stepped in to bail Mexico out, which shifted the government’s focus from its economy to repaying debt. As a result of aggressive policies directed toward Mexican workers, and because of the deleterious effects of the NAFTA treaty, there was a dramatic loss of jobs and a shift to a more urban population.

In this new setting, few opportunities were available, making positions with drug cartels one of the only lucrative options, along with growing crops like poppy, which are used to make the drugs themselves. According to farmers interviewed by the Guardian, growing poppy is the only way for them to guarantee a “cash income.” An increase in the availability of firearms and other weapons smuggled south from the United States only added to the violence and chaos. The video below depicts the history of the Mexican drug trade:

Impact on Mexico

These endless wars for control between cartels have taken a significant toll on Mexico. Between 2007 and 2014, for example, 164,000 people were killed in America’s southern neighbor. While not all of those murders are drug-related, estimates suggest 34 to 55 percent of homicides involve the drug war, a rate that is still incredibly high.

Aside from the number of deaths, the violence has eroded the Mexican people’s trust in the government as a whole. That lack of faith may be well founded, as the weaknesses of the judicial and police forces are widely known. When the PRI was the single ruling party, it effectively served as patron to the drug cartels, with an understanding essentially worked out between the two. When the PRI lost its grip on power, this de facto alliance splintered. Without centralized consent, individuals at all levels of government, as well as in the judiciary and police, became susceptible to bribes from the various cartels. In fact, many were often presented with the choice of either going along with the cartels in exchange for money or being harmed if they resisted. The corruption and subsequent lack of trust in authorities have gotten so bad that some citizens are forming militias of their own to combat the cartels.


Role of the United States

In addition to the impact that the U.S. economy has on job opportunities, particularly since the passage of NAFTA, the United States has shaped the drug trade in two other major ways. First are the U.S. efforts to curb the supply of drugs, organized as part of the overall war on drugs. While the United States has had a variety of drug laws on the books, it was not until after the 1960s that the government took direct aim at eliminating illicit substances. In 1971, President Nixon formally launched a “war on drugs,” taking an aggressive stance and implementing measures like mandatory minimum sentencing and labeling marijuana a Schedule I drug, which made it equivalent to substances like heroin in the eyes of the law.

This emphasis on drug laws only intensified under President Reagan, whose persistence in prosecuting drug crimes led to a large increase in the prison population. During Reagan’s presidency, Congress also passed the Anti-Drug Abuse Act of 1986, which forced countries receiving U.S. aid to adhere to its drug laws or lose their assistance packages. These policies more or less continued for decades, often with more and more money set aside for enforcement. Only in recent years has President Barack Obama offered much of a change, overseeing modifications to sentencing and a shift in the federal treatment of medical marijuana.

This focus on supply extends beyond the U.S. border as well. First, in Colombia, the United States repeatedly pressured the Colombian government to fight the drug traffickers. When those efforts proved ineffective and violence mounted, the United States poured money into the country, helping to finance needed reforms in the Colombian security forces and efforts like crop eradication. In Mexico, a similar approach followed, as a series of presidents beginning in the 1980s took much more combative stances against the cartels with the approval and support of the United States. The United States helped support an overhaul of the armed forces to combat the traffickers and root out the corruption that had begun to permeate the Mexican military as a result of low wages. Successive Mexican governments even went so far as to send the military into cartel-dominated cities and engage in assaults. While Presidents Zedillo, Fox, and Calderon sent in troops and met with some immediate success, in the long term these campaigns led to mass army defections, greater awareness of the reach of the drug economy, and ultimately other cartels filling the void wherever government forces succeeded.

Since the inception of the drug war, the United States has spent an estimated $1 trillion. What the United States primarily has to show for this is a number of unintended consequences, such as the highest incarceration rate in the world. Another is one of the highest rates of HIV/AIDS of any Western nation, fueled in part by the use of dirty syringes among drug users.

The problem is that for all its efforts to eliminate supply, the United States has done much less about demand, its other contribution to the drug trade. In fact, the United States is widely regarded as the number one market in the world for illegal drugs. To address demand instead of concentrating on supply, the United States could shift more of its focus to programs that educate or offer rehabilitation to drug users, which have been shown to be effective in small-scale efforts. Certain states have begun to decriminalize or legalize marijuana, a step that should reduce the number of inmates and may also reduce levels of drug-related violence. Yet there is no single way to outright reduce the demand for drugs, and some view decriminalization as actually fueling the problem. The following video provides an overview of the resources invested in the United States’ war on drugs:


The Importance of Capturing El Chapo

Given all of the resources and effort expended, it is important to ask how much of an impact El Chapo’s arrest will actually have. Unfortunately, it looks like the answer is not much, if any at all. Even El Chapo himself weighed in on his arrest’s effect on the drug trade, telling Sean Penn in an interview, “the day I don’t exist, it’s not going to decrease in any way at all.” His point is clearly illustrated by the number of drug seizures at the border. While exact amounts fluctuate, nearly 700,000 more pounds of marijuana were seized in 2011 than in 2005, and the amounts of heroin and amphetamines seized have gone up as well.

The following video details El Chapo’s most recent capture:

His most recent arrest was actually his third; the first two times, he escaped from maximum-security prisons in dramatic fashion, which is one of the reasons U.S. authorities want Mexico to extradite him. Regardless of where he is ultimately held, since his first arrest in 1993 the drug trade has not suffered when he or any other cartel leader was captured or killed, nor has it suffered from the growth in seizures.

In fact, one of the major points of collaboration between Mexican and U.S. authorities has been the targeting, capture, or killing of cartel kingpins. While these operations have succeeded in apprehending individuals, what they really produce is further fragmentation of the drug trade. Some may argue that detaining top leaders and fragmenting the centralized drug trade is a mark of success, but the evidence suggests otherwise.


Conclusion

Aside from relocating the hub of the drug trade to Mexico, the war on drugs has had several other unintended consequences, such as high civilian death tolls, persistently high rates of HIV infection, and massive levels of incarceration, to name a few. While the United States has had some success targeting suppliers and traffickers, it has been unable to reduce demand domestically.

Those in Mexico face a similar conundrum. Not only do Mexican citizens distrust their government, but many of them have become dependent on the drug trade, so shutting it down could actually hurt their economic prospects.

While El Chapo’s most recent capture has the potential to lend the government some credibility, it still may not mean much. Even if he is prevented from escaping again or from running his old empire from jail, someone will likely take his place, because the drug trade does not rely on individuals but on demand and profits. Until those issues are addressed and Mexican citizens have legitimate alternatives to joining cartels, it does not matter how many cartel leaders are arrested; the situation will remain the same.


Resources

CNN: ‘Mission Accomplished’: Mexican President Says ‘El Chapo’ Caught

Frontline: The Colombian Cartels

Borderland Beat: The Story of Drug Trafficking in Latin America

Congressional Research Service: Mexico: Organized Crime and Drug Trafficking Organizations

Jacobin: How the Cartels Were Born

Frontline: The Staggering Death Toll of Mexico’s Drug War

Council on Foreign Relations: Mexico’s Drug War

Drug Policy Alliance: A Brief History of the Drug War

Matador Network: 10 Facts About America’s War On Drugs That Will Shock You

The Washington Post: Latin American Leaders Assail U.S. Drug ‘Market’

The Huffington Post: Why The Capture of ‘El Chapo’ Guzman Won’t Stop His Cartel

The Guardian: Mexican Farmers Turn to Opium Poppies to Meet Surge in US Heroin Demand

CIR: Drug Seizures Along the U.S.-Mexico Border

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Finding El Chapo: What his Arrest Means for Mexico and the Drug Trade appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/finding-el-chapo-arrest-means-mexico-drug-trade/feed/ 0 50139
The Evolution of Activism: From the Streets to Social Media https://legacy.lawstreetmedia.com/issues/politics/evolution-activism-streets-social-media/ https://legacy.lawstreetmedia.com/issues/politics/evolution-activism-streets-social-media/#respond Thu, 21 Jan 2016 17:37:48 +0000 http://lawstreetmedia.com/?p=49853

While the process has changed, the fundamentals have not.

The post The Evolution of Activism: From the Streets to Social Media appeared first on Law Street.

]]>
Image courtesy of [Anonymous9000 via Flickr]

Activism in some form dates back to the beginning of politics. The United States itself was founded on the back of a series of protests that incited a rebellion and created a nation. Protesting, or more generally activism, is an ancient practice that has persisted to the present day. While speaking out is nothing new, however, the platform people use has evolved from face-to-face, to written, to social media. Protests were once announced through picket lines; now they are championed through hashtags, yet the constant goal of seeking to correct an injustice has remained. Read on to see the history of protests in the United States, how they have changed, and whether they have staying power in a rapid-fire digital age.


A History of Discontent

The United States was a hotbed for activism even before its inception; multiple protests across the colonies set off the Revolutionary War and led to an American nation. Protests against the powers that be did not stop there. In fact, they continued almost immediately, starting with Shays’ Rebellion, in which farmers in Massachusetts organized and fought against the government over taxes and penalties for debt. Although the rebellion was quickly crushed, the threat it represented hastened the end of the Articles of Confederation and the creation of the Constitution.

Protests diversified as well, with a shift from farmers’ grievances to the issues of slavery and labor rights. In 1831, Nat Turner launched his infamous slave rebellion, which claimed the lives of roughly 60 white people in Southampton County, Virginia. That rebellion, along with many other events, laid the groundwork for the ultimate reckoning over slavery, the Civil War.

Even after the Civil War, race remained a contentious issue, but the battle over labor also took center stage. One of the most infamous examples was the Pullman Strike of 1894. This strike over declining wages involved a mass worker walkout that nearly crippled the nation’s rail industry. The strike ended only when President Grover Cleveland sent federal troops in to help local security forces root out the strikers.

The next century saw many of the same issues, with frequent protests over racial or labor grievances. It also saw several other groups assert their rights. One such group was women seeking suffrage. While the seminal Seneca Falls Convention had been held the century before, women still found themselves unable to vote at the beginning of the twentieth century. However, after suffragists tried a variety of tactics of varying effectiveness, highlighted most publicly by protests at the White House gates during WWI, and after women’s service during the war strengthened their case, the government granted women the right to vote in 1920.

LGBT individuals also began asserting their rights publicly, with a major turning point coming at the Stonewall Inn in 1969, where protesters clashed with police. Native American protest reemerged during this period as well. In 1973, the American Indian Movement seized the town of Wounded Knee, South Dakota, and engaged law enforcement in a 71-day standoff during which no one was allowed to come or go. The area had been the site of one of the most gruesome massacres in American history the century before, when, in 1890, U.S. troops killed between 250 and 300 people, including many women and children.

The major movement of the 20th century, though, was the fight by Black Americans to receive the rights they had been granted following the Civil War. Along with the right to vote and an end to segregation, among many other concerns, this movement was distinct in its scale and its use of nonviolent means. The civil rights effort also became tied to other concerns of the era, including the fight against poverty and protests over Vietnam. While the protests organized by Martin Luther King Jr., as well as many against the Vietnam War, preached peace, they were often met with force. One of the most infamous examples is the killing of four Kent State students by National Guard troops in 1970. The video below looks at one of the most prominent moments of activism in the Civil Rights movement:

The activism that characterized the first 200 years of American history was a ground-up affair that was often very violent. In the beginning, violence was used as a means by both sides, although even then the authorities often acted as instigators. Beginning in the 20th century, and coming into focus during Martin Luther King’s Civil Rights movement, the notion of nonviolent resistance came to the forefront. While this certainly did not end physical confrontations between protesters and those they protested against, it signaled a shift in the tactics used by protest groups. With the rise of personal computers and the internet, protest has shifted again, moving from the physical world to the virtual.


Going Virtual

Unsurprisingly, as technology has permeated the world, activism has shifted from the grassroots to the internet. Like other types of activism, the digital movement goes by a variety of names depending on the means used; perhaps the most all-encompassing is virtual activism. As the name implies, virtual activism uses a variety of digital mediums to get its message out, including the internet, cell phones, proxy servers, blogs, online petitions, and especially social media.

While this type of activism has only recently come to the forefront, it has been around for several decades. It was not until the 1990s, though, that it started gaining traction, through new platforms like MoveOn.org and protesters’ use of email to organize during the 1999 demonstrations against the WTO in Seattle. Virtual activism continued to grow during the 2000s with protests against immigration policies, terrorist groups, education cuts, and authoritarianism.

This type of activism really hit the mainstream in 2011 with the Arab Spring, when protesters used social media to coordinate demonstrations, denounce authority figures, and circumvent government influence. In more recent years, movements like Occupy Wall Street and Black Lives Matter have continued to articulate their concerns over the internet, expanding the medium as an organizing tool. The following video looks at the potential of virtual activism:


Effectiveness

For all of the internet and social media’s ability to reach unprecedented audiences and provide up-to-the-minute information, one question continues to linger: is this form of activism actually effective, or is it quickly forgotten from one day to the next? Online activism certainly has its limitations, which generally fall into two groups. First are technical limitations, such as access to the internet, computer literacy, and government censorship. An example of the last is Iran’s censorship of the internet following the 2009 post-election protests dubbed the Green Revolution.

The second type of limitation is highlighted well by another Law Street Media explainer about Hashtag Activism: can activism be effective without a physical presence? As that piece explains, the main criticism of this new age of activism is that it lacks traditional elements such as a leader and the requirement that people put themselves in literal harm’s way, so it may not carry the same weight as traditional forms of protest. This argument certainly has some substance to it, but even some of the hardest-fought gains, earned the old-fashioned way, have lost their impact over the years. Countervailing forces have rolled back many gains of successful movements like the abortion rights and voting rights efforts. Whether that is a good thing depends on your views, but the point is that traditional protests can also struggle to become or remain effective.

The accompanying video looks at how social media can play a role in activism:


Conclusion

When people look at protests or activism, everyone wants to point to the seminal moments: when someone stood up to armed police officers or stared down a tank. However, these moments are few and far between. In the meantime, a lot of suffering goes unreported, speeches go unheard, and a great amount of effort ultimately may not lead to anything. In some cases, even when circumstances appear to change, another incident shows they have not, or previous gains are repealed or reduced.

While the manner of protests may have changed, their nature has not. At the core of each is a feeling, by a person, a group, or even a nation, that an injustice simply must be corrected. This started with people in the streets, continued through television, and has now arrived in individual homes and workspaces via the internet and social media. Does this change in medium make these movements any less effective or any less righteous? Ultimately, it seems only time will tell.

Until that time, however, what is vital is maintaining a spirit of questioning, of dissenting when something is wrong. Dissent is not always bad; it often moves the conversation, opens minds, and paves the way for action. After all, it was Shays’ Rebellion that prompted Thomas Jefferson to write to his friend James Madison, “a little rebellion now and then is a good thing.”



Resources

History: Shays’ Rebellion

History Matters: The Nat Turner Rebellion

Encyclopedia of Chicago: Pullman Strike

History: Seneca Falls Convention Begins

Civil Rights: Stonewall Riots

The Atlantic: Occupy Wounded Knee

Britannica: American Civil Rights Movement

New York Times: 4 Kent State Students Killed by Troops

Reset: Digital and Online Activism

Mashable: History of Internet Activism

TeleSur: What Became of Occupy Wall Street

Think Progress: Forty-two years after Roe v. Wade, The Sad State of Abortion Rights in the United States

Early America: Jefferson Letters to Madison

History Channel: The Fight for Women’s Suffrage

Encyclopedia of the Great Plains: Wounded Knee Massacre

Law Street Media: Hashtag Activism: Is It #Effective?


Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Evolution of Activism: From the Streets to Social Media appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/politics/evolution-activism-streets-social-media/feed/ 0 49853
What’s Going on in Oregon? Domestic Terrorism in the Beaver State https://legacy.lawstreetmedia.com/issues/politics/occupy-oregon-domestic-terrorism-beaver-state/ https://legacy.lawstreetmedia.com/issues/politics/occupy-oregon-domestic-terrorism-beaver-state/#respond Fri, 15 Jan 2016 20:43:46 +0000 http://lawstreetmedia.com/?p=49954

How do the Bundys fit into the history of right-wing violence?

The post What’s Going on in Oregon? Domestic Terrorism in the Beaver State appeared first on Law Street.

]]>
Image courtesy of Gage Skidmore; License: (CC BY-SA 2.0)

As people recovered from New Year’s Eve and went back to work, attention returned to the challenges facing the United States, from Russia to the Middle East. However, while Americans continue to fret over ISIS sleeper cells, an armed, anti-government group occupied a wildlife refuge in Oregon. While the group’s specific demands remain unclear, this type of armed insurrection is nothing new in the United States. From the nation’s inception to the present, with several high-profile cases in the 1990s, anti-government rhetoric and militia-type groups have been, and remain, a major issue. This article will look at the specifics of this incident, the history of these types of groups, similar organizations, and the impact all this has on the United States.


A Wildlife Refuge under Siege

The catalyst for this most recent incident was the conviction of father-and-son ranchers, the Hammonds, on charges of arson on government land. While they claimed to be merely clearing dangerously flammable brush and invasive species, the pair was convicted of starting the 2001 fire to cover up poaching activities. Although the two men turned themselves in and ultimately received the minimum sentence for the crime, this was not enough to stem the controversy that has since ensued.

In response, a group led by Ammon Bundy, whose father led a similar standoff against the government in 2014, held a rally and protest, then seized control of a federal building on the Malheur National Wildlife Refuge. The group, now dubbing itself the Citizens for Constitutional Freedom, has remained in the buildings, which were empty when seized, since the night of January 2. While the group’s demands are still unclear, its complaints seem to center on giving people greater access to federal land and releasing the convicted ranchers.

While the group’s exact motivation also remains uncertain, what is clear is that Citizens for Constitutional Freedom is not the first of its kind. Although the group itself has had previous run-ins with the government, movements protesting federal control of land have roots that go back decades, even centuries. For much of that time, the debate has pitted those who wish to conserve areas against those who wish to exploit the land for resources. In the 1970s and 80s, the idea that the land should be controlled locally gathered steam and became what has been dubbed the Sagebrush Rebellion.

That movement’s primary complaint, and one of the complaints offered by the group in Oregon, is that the government controls too much land and is not using it appropriately. While the methods being used by the Bundy family are certainly illegal, the protesters may have a point: in total, the government owns roughly one-third of all land in the United States, and 53.1 percent of the land in Oregon. Regardless of the validity of these claims, the Oregon group’s inability to articulate its specific complaints has made dealing with it a challenge, one only exacerbated by how the group is viewed and portrayed by different people and organizations.

What Do We Call Them?

Much of the debate over this group and why its members are protesting concerns how they should be classified. More specifically, is this domestic terrorism? While many people were quick to denounce the group’s tactics as unpatriotic, there was a noticeable lack of coverage and condemnation of its methods. In fact, many argue that the media coverage of the occupation, which some have even called a peaceful protest, is unfair and biased. Critics contrast the treatment of the Citizens for Constitutional Freedom with other protests, such as the ones in Baltimore and Ferguson, which were called riots and met with armed confrontation from authorities.

So what is this group, then? Its members are clearly protesters speaking out against something they view as unfair. But the presence of weapons and their vague demands over land use rights, freeing the Hammonds, and fighting government intimidation appear to make them something more. In fact, the group’s actions seem to fall more in line with the FBI’s definition of domestic terrorism, which includes any action that is “calculated to influence or affect the conduct of government by intimidation or coercion, or to retaliate against government conduct.” The key to classifying this group thus ultimately rests with its intent. Because its actions are based on specific perceived injustices and are tied to specific demands, we can differentiate its members from mere protesters.

For context, two of America’s recent deadly shootings illustrate the importance of intent in defining an act as terrorism. In the case of Sandy Hook, Adam Lanza’s actions were not technically domestic terrorism because there was no ideological intent beyond killing, whereas the shootings by Dylann Roof at a Charleston church were an act of terrorism because the intent was racially and politically motivated. In other words, although the occupiers in Oregon have not yet used force, the threat of force remains, and when coupled with their intentions, it makes them appear to be domestic terrorists. For greater clarity, the accompanying video gives another voice to the domestic terrorism debate:


Militia Groups in the United States

Historically, militia groups have been among the primary perpetrators of domestic terrorism in the United States. Like the definition of terrorism, the definition of a militia is vague. The general consensus is that a militia is an irregular military force made up of citizens who are called upon only in the event of an emergency. Once again, the protesters in Oregon do not fit neatly within this definition; however, many of them are members of a self-styled militia movement known as the Patriot movement. That movement began back in the 1970s and was originally concerned with protecting the United States in the event of a foreign occupation. Since the fall of the Soviet Union, it has refocused its attention on standing up against perceived threats from the government, particularly the fear of the government taking away members’ guns. While the protesters, or domestic terrorists, in Oregon are the latest example of this type of group, they are by no means the only one.

In fact, the number of anti-government groups has mushroomed since 2008, coinciding with the election of President Obama. According to the Southern Poverty Law Center, the number of these groups went from 149 that year to an estimated 1,360 by 2012. Again, the extent of the threat these groups actually pose is up for debate. Some counter that their numbers and danger are overblown by organizations like the Southern Poverty Law Center, which, critics allege, compiled the figures as a means of drumming up donations. Others, however, view them as a far more serious concern. The Southern Poverty Law Center also notes that during the 1990s only 858 such groups were identified, almost 500 fewer than in 2012. Even with population growth factored in, that level of increase is concerning, especially given the number of high-profile conflicts between the government and anti-government groups during the 1990s.

History of Discontent

The 1990s were a time of numerous conflicts between the government and anti-government groups.  The government standoffs and civilian deaths at Ruby Ridge and Waco raised the specter of government repression, especially among militia-type groups. This culminated with the Oklahoma City Bombing, which left 168 people dead when an anti-government sympathizer blew up a government building. While this attack greatly reduced support for militia groups, particularly for the Patriot movement, it was certainly not the end of the violence or domestic terrorism.

In fact, the American Prospect compiled a list of bombings from 1867 to the present. The list includes attacks from anti-war groups, anarchists, foreign separatists, lone wolves, and the Boston Bombers, to name just a few. In addition to bombings, some mass shootings in the United States have also involved an element of domestic terrorism, such as the recent San Bernardino shooting.

Currently, the protesters in Oregon have stated that they will only resort to violence if forced into a confrontation by authorities. So far, the authorities have erred on the side of caution, letting the group be in an effort to wait the occupation out.

Even if the protesters in Oregon leave peacefully, the threat of right-wing militias remains. In fact, in a survey conducted by the Police Executive Research Forum last year, these militia-style groups were identified as the number one threat, ranking above even foreign terrorist organizations like Al-Qaeda. The protesters’ biggest impact may come in the form of shedding greater light on these groups. The following video gives a look at the militia movement in the United States:


Conclusion

As of right now, much of what is going on in Oregon remains unclear. Even how the group should be classified is debated: are they protesters, terrorists, a militia, or something else? About the only thing that is clear is that what they are doing is unpopular. The town has already come together and asked them to leave. The Paiute Indian Tribe, which can trace its lineage in the area back 9,000 years, believes the occupiers have no legitimate complaint and should leave. Even the Hammonds–the two men convicted of the crime that supposedly sparked the protest–have distanced themselves from the protesters.

While the debate rages over how to treat them, the specter of FBI assaults on seemingly similar groups in the 1990s lingers. Additionally, figuring out how to deal with groups like these takes on ever-increasing importance as their numbers swell and they become increasingly well-armed.

For now, it is too early to know exactly how the events in Oregon will ultimately unfold. In all likelihood the protesters will run out of steam: most will leave, and the masterminds, such as Ammon Bundy, will be held accountable. It could also go the other way if cooler heads do not prevail.


Resources

CNN: Armed Protesters Refuse to Leave Federal Building in Oregon

Al Jazeera: Double Standards Cited Amid Armed Protest in Oregon

Legal Information Institute: 18 U.S. Code § 2332b – Acts of terrorism transcending national boundaries

Merriam-Webster: Militia

USA Today: Record Number of Anti-government Militias in the USA

The American Prospect: A History of Terrorism on U.S. Soil

Time: This Is What It Takes for Mass Murder to Be ‘Terrorism’

National Geographic: Why Federal Lands Are So Wildly Controversial in the West

The Blaze: Ammon Bundy Says There’s Only One Scenario in Which Armed Protesters Would Resort to Violence Against Authorities

The New York Times: The Growing Right-Wing Terror Threat

CNN: Native Tribe Blasts Oregon Takeover

Politico: What Do the Oregon Ranchers Really Believe?

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What’s Going on in Oregon? Domestic Terrorism in the Beaver State appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/politics/occupy-oregon-domestic-terrorism-beaver-state/feed/ 0 49954
Rising Homicides in Some American Cities: What’s Actually Going on? https://legacy.lawstreetmedia.com/issues/law-and-politics/looking-behind-curtain-facts-behind-rise-homicides-american-cities/ https://legacy.lawstreetmedia.com/issues/law-and-politics/looking-behind-curtain-facts-behind-rise-homicides-american-cities/#respond Wed, 30 Dec 2015 20:06:44 +0000 http://lawstreetmedia.com/?p=49653

What's going on in our cities?

The post Rising Homicides in Some American Cities: What’s Actually Going on? appeared first on Law Street.

]]>
Image courtesy of [Ariane Middel via Flickr]

Baltimore recorded its 300th homicide of 2015 last month, marking the highest number of killings for the city since 1999. Given the decrease in the city’s population over the past several decades, the actual murder rate in 2015 may be the highest in the city’s history. While the rising number of homicides is certainly troubling for Baltimore, it is not the only U.S. city experiencing a spike in homicides. The explanations for this abrupt rise, after years of decline, range from the after-effects of much-publicized police killings to a drug epidemic to simply warmer weather. This article will examine this rise and seek to determine if it is an outlier or a sign of some new trend.


Murders on the Decline?

Before getting to whether homicides are an increasing threat or even up in 2015, the numbers have to be put into perspective. The much larger trend at play has been a large and consistent decline in violent crime, including homicides, over the past few decades.

Since 1993, the violent crime rate per 100,000 people in the United States has dropped by more than 50 percent. Additionally, while the drop was felt nationwide, it was also specifically evident in cities like New York that have historically been associated with crime, though that association may be starting to wear off. In 1990, there were 2,245 homicides in New York City. By contrast, there were 328 murders in 2014, the lowest number since 1963, when New York was a much smaller city. In other words, crime is down, way down, from twenty years ago. Two other examples are Los Angeles and Washington D.C., which have seen their murder rates drop 90 and 76 percent, respectively, since 1992.

The explanations behind these drops range far and wide. A number of factors have been suggested, including a better economy, higher incarceration rates, the death penalty, more police officers, and even the greater acceptance of abortion. None of these, however, has been definitively shown to hold water. Interestingly, one of the most scientifically supported explanations is the reduced use of lead in everyday goods, because lead exposure in children is believed to cause more violent behavior. Reduced drug and alcohol use is another factor that has been cited in the reduction.


What’s Going on This Year?

In August, the New York Times published an article noting that 35 U.S. cities have seen their murder rates rise in 2015. This includes a number of major cities such as New York, Chicago, Philadelphia, and even the nation’s capital, Washington D.C. After years of dramatic decline, what could be causing these rates to reverse course and begin to rise again?

The Devil is in the Details

While the data seems to suggest a rise in violent crime and there are several plausible-sounding theories to support it, is it actually happening? The answer to that question is both yes and no. After the New York Times published its article, Five Thirty Eight decided to take a closer look at the statistics. Using partial-year data for the nation’s 60 largest cities, it found that homicides were indeed up 20 percent from last year in 26 of the nation’s 60 largest cities and 16 percent overall. However, they were also down in 19 of the same 60 cities, including places like Boston, Las Vegas, and San Diego. In other words, the sample used in the Times article may be skewed. While certain cities’ homicide numbers are up, in most they are only up a fraction or not at all. It is also important to look at the raw numbers in addition to the percentages when there is a relatively small number of homicides to begin with. For example, Five Thirty Eight found that Seattle, Washington experienced a 20 percent increase in homicides at the end of August relative to the previous year, but that increase was the result of three additional murders–going from 15 in 2014 to 18 this year. It is also important to acknowledge that the data is preliminary and only includes part of the year. The full, definitive dataset will not be available until the FBI publishes its annual statistics next fall.

While certain cities’ homicide numbers are up, in most they are only up a fraction or not at all. It is also important to look at the raw numbers in addition to the percentages when there is a relatively small number of homicides to begin with. For example, Five Thirty Eight found that Seattle, Washington experienced a 20 percent increase in homicides at the end of August relative to the previous year, but that increase was the result of three additional murders–going from 15 in 2014 to 18 this year. It is also important to acknowledge that the data is preliminary and only includes part of the year. The full, definitive dataset will not be available until the FBI publishes its annual statistics next fall.

Thus, while the overall rise in the national rate of 16 percent–Five Thirty Eight’s finding among the largest 60 cities–is statistically significant, many cities’ individual changes are not. Statistical significance is a test to determine whether or not a change or relationship is the result of chance. It is also worth noting that in 2005 an almost identical rise of 15 percent in the national homicide rate occurred before the number regressed to the mean and continued its slow decline.
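To make the small-numbers problem concrete, here is a minimal sketch of the arithmetic using the Seattle figures cited above. The Poisson assumption and the resulting back-of-envelope test are illustrative, not Five Thirty Eight’s stated methodology:

```python
from math import comb

# Seattle's partial-year homicide counts cited above: 15 in 2014, 18 in 2015.
before, after = 15, 18
print(f"Percent change: {(after - before) / before:.0%}")  # 20%

# Rough significance check: if counts are Poisson and the true rate is
# unchanged, then conditional on the 33 total homicides, each one is
# equally likely to fall in either year -- a Binomial(33, 0.5) split.
total = before + after
p_upper = sum(comb(total, k) for k in range(after, total + 1)) / 2 ** total
print(f"Two-sided p-value: roughly {2 * p_upper:.2f}")  # ~0.73, easily chance
```

A three-murder swing, in other words, is well within ordinary year-to-year noise, which is why large percentage changes in smaller cities deserve skepticism.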

The Who, What, Where, and Why

There seem to be as many explanations for why murders may be rising in these cities as there were for the large decline in violent crime over the previous two decades. However, many of the theories behind the recent rise in homicides do not seem to stand up to scrutiny either.

One that has gained a lot of traction is a theory known as the “Ferguson Effect.” According to this theory, a major contributing factor to the spike in violence is a growing reluctance among police officers to carry out routine police work for fear of criticism. This theory is largely a response to the controversial shooting of Michael Brown in Ferguson, Missouri as well as the death of Freddie Gray while in the custody of Baltimore police. Put simply, bad guys are running free because police officers fear public damnation.

Ironically, there is a competing theory from a community perspective, arguing that police actions have made regular citizens less likely to go to the police for assistance and more willing to take matters into their own hands. In either case, the rise in violence in St. Louis and Baltimore, which some have attributed to a Ferguson Effect, actually started prior to the highly publicized incidents of police brutality, so these explanations do not seem very plausible. Attorney General Loretta Lynch also testified before Congress that there is “no data” to support that theory.

Another explanation is the vast number of guns in the United States. While the exact number of guns in civilian circulation is impossible to pinpoint, it is estimated there are as many as 357 million nationwide–approximately 40 million more guns than U.S. citizens. Once again, while having more guns around likely leads to more gun-related deaths, there were hundreds of millions of guns around prior to this year so that explanation is also not very convincing.

Others argue that an increase in gang violence, fueled by drugs, has led to increased homicides. Of the reasons given, increased gang warfare is one of the most likely explanations because it would affect only certain neighborhoods or cities and not the entire country. Some argue that cities like Chicago are experiencing an increase in gang violence and illegal guns, which may explain recent spikes in homicides, but that is unlikely to be the case for every city.

Even the economy has been blamed as part of the “routine activities theory,” which suggests that when people are better off financially they are more likely to go shopping or out to eat, and thus more likely to encounter criminals. Others argue that crime generally goes down when the economy is doing well. However, John Roman, a senior fellow at the Justice Policy Center at the Urban Institute, noted in an interview with Vox that a good economy can also lead to higher crime if improvements are not distributed equally and the needs of the underserved are not addressed.

When you look at all of the data and try to make sense of it with the competing theories, it seems likely that each city has its own explanation. We do not yet know whether or not the spike identified this summer is indicative of a trend, but if that is the case we likely need more data to determine what might be causing it.

The accompanying video looks at the increase and some of the reasons suggested for it:


Perception is Believing

Despite what the numbers say, or whether the theories much of this data is based on are viable, people ultimately make up their own minds about what is true. In a 2013 Pew Research Center survey, 56 percent of those polled believed that gun violence was higher than it was 20 years earlier; in reality, gun homicides had nearly been cut in half by 2013. This poll was conducted before the recent spate of highly publicized police killings, suggesting the number may be even higher now. It is not surprising that the notion of higher homicide rates resonates with people, even if the increase is a short-term aberration and rates remain near historic lows. The following video looks at the perception, or misperception, of crime in the United States:


Conclusion

While violent crime, including homicides, has been decreasing since the early 90s, recent evidence suggests there may be a spike in homicides this year–at least in some of the United States’ largest cities. But it remains unclear whether this is emblematic of a trend or just a brief increase of the sort that has often occurred in the past. Even with this increase, however, the rate is nowhere near the record highs of two decades ago.

In light of these findings, many questions emerge. Why is the homicide rate up this year? Are these numbers skewed by an unrepresentative sample? Is this the sign of a trend or just a temporary blip? Questions like these will not be answered for years, if they are answered at all. While it is necessary to try to understand the data in order to improve policing and crime-related public policy, it is important to take a more local look at why homicides might be going up in each city. A spike in several cities is not necessarily indicative of a national problem.


Resources

The Washington Post: Baltimore’s 300th Killing This Year: A violent Milestone in a Riot-Scarred City

NYC: News from the Blue Room

The New York Times: Murders in New York Drop to a Record Low, but Officers Aren’t Celebrating

Forbes: What’s Behind the Decline in Crime?

The New York Times: Murder Rates Rising Sharply in Many U.S. Cities

Vox: Why Murder Rates are Up in St. Louis, Baltimore and Some Other Cities

The Washington Post: There are Now More Guns Than People in the United States

Five Thirty Eight: Scare Headlines Exaggerate the U.S. Crime Wave

Stat Pac: Statistical Significance

Pew Research Center: Gun Homicide Steady After Decline in the 90s; Suicide Rates Edge Up

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Rising Homicides in Some American Cities: What’s Actually Going on? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/law-and-politics/looking-behind-curtain-facts-behind-rise-homicides-american-cities/feed/ 0 49653
The Middle Class is Shrinking: Why Does it Matter? https://legacy.lawstreetmedia.com/issues/business-and-economics/middle-class-shrinking-matters/ https://legacy.lawstreetmedia.com/issues/business-and-economics/middle-class-shrinking-matters/#respond Tue, 29 Dec 2015 21:10:16 +0000 http://lawstreetmedia.com/?p=49769

The middle class is no longer the majority of the country.

The post The Middle Class is Shrinking: Why Does it Matter? appeared first on Law Street.

]]>
Suburbia" courtesy of [Daniel Lobo via Flickr]

The middle class has been many things to many people, but more evidence indicates that it continues to get smaller. Many view the middle class as a key component to the American identity and a driver of economic expansion over the past several decades. For politicians, it is the group they campaign on helping the most in an increasingly contentious election.

But the middle class is shrinking–according to a recent Pew Research Center report, for the first time since the 1970s the middle class does not make up the majority of people in the United States. Read on to see how the all-important cohort is changing with the times, how it developed, what is rising to fill its place, and why all of this matters to the United States going forward.


What does the Middle Class Look like Today?

The first step in evaluating the middle class is to determine who actually meets that qualification. Unfortunately, there is no universally agreed-upon definition of the middle class. To categorize Americans, experts have looked at a variety of factors including demographics, income level, wealth, what people consume, and even their personal aspirations, but no single standard has emerged.

The Pew Research Center study focused on income, defining the middle class as those who earn two-thirds to two times the median income in the United States. Pew then adjusts its statistics for household size, using a three-person household as the benchmark. Those who fall in the lower-income bracket have an adjusted household income of $31,000 or less per year. Americans who make more than $188,000 annually are considered upper-income. At the edges of these groups are two more distinctions, dubbed “lower-middle income” and “upper-middle income.” The lower-middle group is defined as earning between $31,000 and $42,000 per year. On the other side, the upper-middle group’s income ranges from $126,000 to $188,000 per year. Those in between make up the middle class.
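As a back-of-envelope check, those cutoffs follow directly from Pew’s two-thirds-to-double rule. The roughly $63,000 size-adjusted median for a three-person household used below is inferred from the article’s own cutoffs rather than taken from the report:

```python
# Pew's rule: middle income = two-thirds to two times the adjusted median.
median = 63_000  # implied size-adjusted median for a three-person household

middle_floor = (2 / 3) * median  # ~$42,000, the top of "lower-middle"
middle_ceiling = 2 * median      # ~$126,000, the bottom of "upper-middle"

print(f"Middle class: ${middle_floor:,.0f} to ${middle_ceiling:,.0f}")
```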

As the demographics of the United States have changed over the past 40 years, so has the makeup of the middle class. For one, the middle class today is far more educated than ever before. The proportion of people with at least some college experience increased while the proportion with a high school diploma or less plummeted. Looking at trends over the past several decades, college-educated adults are currently much more likely to be in the upper-income tier than in the past. While college-educated adults managed to maintain their economic status, those with less education fared much worse. As Pew points out, “Among the various demographic groups examined, adults with no more than a high school diploma lost the most ground economically.”

The middle class has also gotten older, much like the country at large, with a smaller proportion of people 44 years of age and younger and a greater percentage of people 45 and older. The middle class has also diversified, with Asian, black, and Hispanic populations taking on a bigger percentage of the pie. In the same vein, foreign-born citizens’ share of the middle class has also increased. The following video looks at the difficulty in defining the middle class:


Why the Middle Matters

Not only is the definition of the middle class contested, but so is whether it even matters.

Inequality and the Middle Class

One of the most widely discussed trends in recent years is the growing wealth gap between upper-income Americans and the rest of the country. The problem is not necessarily that the rich are getting richer, but whether they are doing so disproportionately and at the expense of the economy and everyone else.

Some argue that inequality is the result of actions by the federal government, namely through decreasing tax rates for the wealthy starting during the Reagan administration. Others contend that the effects of tax decreases are not nearly enough to account for the massive disparity that exists today. While many debate the exact causes of this inequality, its effects on the middle-class are important.

Several studies show that developing a strong middle class is the ideal recipe for economic success. This is true for several reasons, which the Center for American Progress outlined in a recent report. One is that a strong middle class means more access to education and subsequently better-trained human capital. Second, a stronger middle class creates a larger and more stable market for demand, especially in relation to a small elite that can only consume so much. Third, a strong middle class is a hotbed for the next generation of innovators; job growth comes primarily from expanding small businesses, not large corporations. Finally, a powerful middle class demands the necessary political and social goods required to improve an economy–from infrastructure to fair regulations–that may be overlooked when politicians cater only to a small elite.

The Middle Class in Politics

In light of this growing inequality, it’s important to ask: does the middle class still matter? While terminology on the campaign trail may be changing, the middle class as an issue and the middle class as a group remain at the heart of American politics. Politicians from both parties have made courting the middle class essential to their electoral success–every 2016 candidate from Bernie Sanders to Rand Paul is working for the support of this elusive group. The question then is why, given that people with higher incomes are more likely to vote than those with lower incomes and the middle class is shrinking. The real reason why the middle class seemingly gets such an out-sized share of attention may have to do with how it formed and what it means to American ideals.


Origins of the American Class System

In the United States there originally existed two main groups of people, the proverbial haves and have-nots, an entrenched elite and everyone else. Beginning in the 19th century, however, this began to change as members of the lower class began to split into two separate spheres. In one sphere was the traditional manual laborer, or the working class. In the other was a new group, which would become the middle class. While manual workers moved from the farm into the factories, the burgeoning middle class worked white-collar jobs as clerks and small business owners.

In the process, these middle-class workers became better educated and skilled, which allowed them to rely more on ability and less on social networks. Thus, they were able to develop an individual identity while the working class was not. Additionally, they no longer had to rely on large social networks to achieve significant gains, as the working class did through means such as unions and strikes. Along with severing close community ties, the developing middle class also began buying homes, experienced shifts in gender norms, and managed to provide a better education for their children. This group, along with the working class, experienced a huge boom following the end of WWII. However, like any boom, it was short-lived, and beginning in the 1970s the middle class began to stagnate.


Rise of the Margins

What started as stagnation has eventually led to decline. The middle class has shrunk from 61 percent of the population in 1971 to 50 percent this year. This decline has been gradual, with no single defining moment. The people exiting the middle class, however, had to go somewhere, and coinciding with this group’s reduction is a rise in people at the margins, in the low and high-income brackets. According to the Pew Research Center study, the percentage of Americans living in the lower-income bracket rose from 16 percent in 1970 to 20 percent in 2015, while the higher-income group rose from 4 to 9 percent. While these two groups have grown at similar rates, the share of national income going to the upper-income group has increased dramatically more, from 29 percent in 1970 to 49 percent in 2015. The video below highlights the plight of the middle class:

Aside from shrinking, the middle class has lost a significant amount of its wealth. Since 2000, this group has lost 4 percent of its median income. Additionally, due in large part to the housing-related effects of the recession, the wealth of the middle class has dropped an additional 28 percent. Those among the middle class who saw the biggest losses since the 1970s were Hispanics, people with only high school diplomas, young adults, and men. However, the changes within this cohort were not bad for everyone, as the elderly, women, blacks, whites, and married couples saw their positions improve.

Social Mobility

The changes within and outside of the middle class have not been the same for everyone. The main concern, however, is not so much whether you are in the middle class now, but whether you can reach or even surpass it. In the United States there has long existed the notion that through hard work, education, and possibly luck you can move up the class ladder; this concept is known as social mobility. More exactly, social mobility is the “movement of individuals, families, or groups through a system of social hierarchy or stratification.” If a person changes positions, typically by switching jobs, but does not change their class, that is called horizontal mobility. If the changing role also leads to a change in class, either up or down, it is vertical mobility. While the belief in social mobility remains strong, how realistic is it in the United States today?

Unfortunately, a recent study paints a troubling picture. As the Atlantic notes, the study found that roughly one-half of parental income advantage is passed on to children. The impact actually increases as people get wealthier, growing to two-thirds. What this means is that someone with rich parents is much more likely to be financially successful as well. This effect is greater for men and married couples than for women and single people. Put simply, as the middle class shrinks, the chances of improving one’s economic status also decrease.
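The arithmetic of a one-half pass-through is easier to see with concrete numbers. Economists usually state this relationship as an elasticity on log incomes; the linearized model and the round-number incomes below are simplifying assumptions for illustration only:

```python
# Toy model: a child retains about half of the parents' income advantage
# (or deficit) relative to a benchmark. The 0.5 pass-through comes from
# the study cited above; the benchmark income is an invented round number.
PASS_THROUGH = 0.5
BENCHMARK = 60_000

def expected_child_income(parent_income: float) -> float:
    """Benchmark plus half of the parents' advantage over the benchmark."""
    return BENCHMARK + PASS_THROUGH * (parent_income - BENCHMARK)

for parents in (30_000, 60_000, 120_000):
    child = expected_child_income(parents)
    print(f"Parents at ${parents:,}: child expected around ${child:,.0f}")
```

In this stylized picture, a child of parents earning double the benchmark starts with a 50 percent expected head start, and an advantage of that size can compound across generations.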


Conclusion

While politicians continue to focus on helping middle-class voters, the number of people who fit that description, as well as their wealth and income, continues to shrink. Exactly what is driving this decline remains a subject of debate, but most argue that it is a combination of factors. The middle class has been a culturally and economically important group for the United States for over a century, but as it shrinks its significance may fade. Recent economic changes identified by the Pew Research Center report highlight the importance of education to the American economy and the middle class. While people with college degrees have maintained their economic status, those with less education have not fared nearly as well.

While the two major parties have debated the best course of action for fixing the middle class as it has declined since the early 1970s, it is hard to argue that the middle does not matter. Not only has it been shown to be the engine of a growing and prosperous economy, but it increasingly symbolizes the nation as a whole. Perhaps it would be prudent to address the challenges facing this all-important American bedrock before it is gone.


Resources

Primary

Pew Research Center: The American Middle Class is Losing Ground

Additional

Education Action: Social Class in the United States: A Brief History

The Atlantic: America is Even Less Socially Mobile Than Most Economists Thought

CNN Money: What is middle class, anyway?

Pacific Standard: The IMF Confirms That ‘Trickle Down’ Economics Is, Indeed, a Joke

Center for American Progress: Middle-Class Series

New York Times: Middle Class is Disappearing, at Least from Vocabulary of Possible 2016 Contenders

Al Jazeera America: Most Americans Don’t Vote in Elections

The Atlantic: 60 Years of American Economic History, Told in 1 Graph

Encyclopedia Britannica: Social Mobility

NPR: The Middle-Class Took Off 100 years Ago…Thanks to Henry Ford?

Encyclopedia Britannica: Social Class

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Middle Class is Shrinking: Why Does it Matter? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/middle-class-shrinking-matters/feed/ 0 49769
An Over-Supply of Underpriced Oil: Explaining the New Energy Crisis https://legacy.lawstreetmedia.com/issues/business-and-economics/supply-underpriced-oil-explaining-new-fuel-crisis/ https://legacy.lawstreetmedia.com/issues/business-and-economics/supply-underpriced-oil-explaining-new-fuel-crisis/#respond Fri, 18 Dec 2015 20:34:32 +0000 http://lawstreetmedia.com/?p=49506

Why is oil so cheap?

The post An Over-Supply of Underpriced Oil: Explaining the New Energy Crisis appeared first on Law Street.

]]>
Image courtesy of [alex.ch via Flickr]

The Organization of Petroleum Exporting Countries (OPEC) recently met in Vienna to discuss an official output quota. By the end of the meeting, however, the member countries did not agree on a quota and oil production remains near record levels. While this may not seem like breaking news, the group’s decision will have major ramifications far beyond its members. That is because this decision comes at a time when the price of oil is falling to lows not seen since the Great Recession. It is also coming at a time when a massive over-supply of oil exists in the market.

Read on to learn more about OPEC’s decision based on its past and future plans. Why does the group refuse to turn off the pumps when the wealth of supply seems to be hurting the bottom line?


History of OPEC

OPEC was founded in 1960 by its five original members: Iran, Iraq, Kuwait, Saudi Arabia, and Venezuela. Since then, nine members have joined the group: Qatar, Indonesia, Libya, the United Arab Emirates, Algeria, Nigeria, Ecuador, Angola, and Gabon. The organization’s stated objective is to “co-ordinate and unify petroleum policies among Member Countries, in order to secure fair and stable prices for petroleum producers,” but the group has historically faced criticism for trying to control the price of oil for political and economic reasons. OPEC’s members meet regularly to agree upon oil production quotas, which in turn influence the price of oil internationally.

While many have a negative perception of OPEC, the organization’s roots were generally good-intentioned. The group formed shortly after many oil-producing countries emerged from the breakup of colonial empires. That inception, in part, explains OPEC’s desire to set prices as a means to control and benefit from its member nations’ natural wealth.

Criticism of the group peaked in the 1970s after two high-profile events: its 1973 embargo on oil exports to the United States and the fallout from the 1979 Iranian Revolution. Oil prices eventually dropped dramatically in the 1980s, only stabilizing in the 1990s. This happened because of a variety of factors, including a burgeoning interest in the environmental impact of oil. Oil experienced another boom from the late 90s through the mid-2000s. However, prices once again fell sharply as a result of the 2008 global recession.

Following the recession, oil prices started rising, reaching a peak in 2014. Since the middle of last year, the price of oil has dropped precipitously, causing a flurry of responses from countries that are dependent on the oil industry for survival. The video below provides a detailed history of OPEC:


What is OPEC up to?

The most recent drop in oil prices brings us to where we are now. On December 7, oil prices hit their lowest levels in seven years. In fact, since peaking at $108 per barrel in June 2014, the price of oil has lost two-thirds of its value. The underlying driver behind the recent price drop is primarily an over-supply of oil. One explanation for the drop is the American shale boom, which significantly increased oil production in the United States. Another is the decision by OPEC not to cut its production but to keep it at near-record output levels.

If a good’s supply increases but demand stays the same or decreases, then its price will go down. The goal, then, is to find the equilibrium somewhere in the middle, where sellers can offer their goods at a price they feel is reasonable and that consumers are willing to pay. OPEC’s recent decision to keep production levels high has contributed to the massive drop in the global price of oil, challenging OPEC members’ ability to cover their expenses and profit from high prices.
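The mechanics here are just standard supply and demand. The sketch below uses made-up linear curves rather than estimates of the actual oil market, simply to show why an outward supply shift pushes the market-clearing price down:

```python
# Demand: Qd = a - b*P    Supply: Qs = c + d*P  (all coefficients invented)
a, b = 100.0, 1.0  # demand intercept and slope
d = 1.0            # supply slope

def equilibrium_price(c: float) -> float:
    """Solve a - b*P = c + d*P for the market-clearing price P."""
    return (a - c) / (b + d)

print(f"Baseline supply (c=20): P* = {equilibrium_price(20.0):.0f}")  # 40
print(f"Expanded supply (c=40): P* = {equilibrium_price(40.0):.0f}")  # 30
```

Holding demand fixed, every outward shift of the supply curve lowers the equilibrium price, which is exactly the situation OPEC’s near-record output has created.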

The question then is: why? The simple answer is market share and scale of production. Saudi Arabia, a major player in OPEC, is willing to take a loss on oil in the short term in an effort to disadvantage its competitors. The relatively long period of high oil prices over the past few years made new, more expensive means of getting oil profitable. This led to a rise in oil extraction methods like deep-water drilling and shale oil production (including fracking) in the United States. Shale extraction is notably difficult and expensive, but with high oil prices, companies could spend more to extract oil and still turn a profit. Now that the price of oil has fallen dramatically, such efforts are becoming too expensive, and shale oil production has gone down. If the price of oil stays low for a long period of time, this could significantly hurt the shale industry, helping OPEC countries like Saudi Arabia in the long run. This would play into the Saudis’ long-term goal of regaining their market share once the playing field has been thinned. But while a decrease in U.S. production has already started to happen, oil prices have not yet gone back up, putting oil producers in a tricky place. The accompanying video gives a look at OPEC’s actions:

In the meantime, Saudi Arabia and the rest of OPEC also have to contend with other established nations in the oil industry, namely Russia. While the Saudis have started to make their way into traditional Russian oil markets, Russia has fired back by temporarily becoming the largest supplier to Asia, an area typically dominated by OPEC.  The struggle between these two has also added to the oversupply in the market, as neither wants to concede its customers.

Further Trouble Ahead

OPEC’s strategy is decidedly risky for reasons beyond temporary loss in revenue due to lower prices. First, there’s the return of Iran to the forefront of the global oil market. Iran is currently under sanctions and its oil exports are limited to roughly 1.1 million barrels a day–about half of its peak production in 2012.  However, international sanctions on Iran are now going away in light of the Iran nuclear deal, and the country plans to produce 500,000 more barrels a day with the ultimate goal of reclaiming its market share–as Saudi Arabia and Russia are doing–no matter the cost.

Second, demand could also soften next year; some analysts think demand growth could shrink by as much as one-third. While drivers typically do more driving when oil is cheaper, the economic slowdown in Asia, particularly in China, threatens to cause an even larger over-supply of oil on the world market. But foreseeing changes in demand can be particularly difficult. Other analysts argue that the recent changes in China could lead to even greater demand for oil as the country shifts to a more consumer-driven economy.


Ramifications

OPEC

These concerns are less pressing for Saudi Arabia, OPEC’s de facto leader, which the IMF estimates can last about five years with oil prices at current levels before it needs to make significant changes to its budget. The Middle Eastern countries in the worst shape, however, are Iran and Iraq. While Iran’s refining costs are not particularly high relative to other countries, its economy suffered a significant blow from international sanctions. Its neighbor, Iraq, is in even worse shape, facing not only mounting debt but also the specter of ISIS operating in and controlling a large swath of its territory. Forgone revenue from unusually low prices could start to hurt oil-exporting countries without large cash reserves.

The consequences of low oil prices could be just as bad, if not worse, for members of OPEC outside of the Middle East. Countries such as Ecuador, Venezuela, Nigeria, and Algeria are extremely reliant on oil for government revenue, often for the majority of their budgets. Low prices have already sparked fear of unrest in areas such as Nigeria and Venezuela, which like Saudi Arabia use oil revenue to maintain social and economic stability. In Ecuador, these fears have already been realized–thousands have gone to the streets to protest government cost-cutting as a result of the falling price.

Russia

Outside of OPEC, perhaps no country is feeling the effects of the declining value of oil as much as Russia. Like many of the OPEC nations, it is very dependent on oil for income. In fact, oil and gas make up roughly two-thirds of Russian exports and half of all government revenue. With prices dropping so low, the nation has subsequently felt the effects–Russia’s economy will contract by about 3.8 percent this year and is expected to shrink further in 2016.

United States

Unlike Russia and the OPEC nations, the United States is not particularly dependent on oil production for government revenue, but the drop in prices will have some impact. If OPEC and Saudi Arabia hope to keep prices low to eliminate American competitors, evidence suggests that may be working. The number of oil rigs in the United States has fallen slightly and domestic production has decreased. In fact, for some U.S. states that rely on the oil industry for jobs and revenue, like Texas, Alaska, North Dakota, Oklahoma, and Louisiana, falling prices can pose a notable economic challenge.

However, the price plunge is certainly not all bad news for Americans. The average price of gasoline per gallon is now considerably lower than this time last year. Additionally, according to the United States Energy Information Administration, the average household is also likely to save $750 on gas this year. These savings are especially helpful for lower-income people who spend more of their income on gas and heating. Similar savings will likely occur in many European countries as well. The following video looks at some of the effects of low oil prices:


Conclusion

The members of OPEC, particularly Saudi Arabia, are taking a notable gamble with their decision to keep oil production high despite low prices. If oil exporters reduce their production they could lose market share, but if oil prices remain low they could face fiscal crises and possibly unrest. Yet the decision could pay off in the long run as more expensive forms of oil production slow down and prices go back up.

While OPEC is notably pumping too much oil, an issue that will likely become worse when Iran increases its exports, nearly all oil producing countries find themselves in a race to the bottom. Oil producing countries are already experiencing the consequences of low prices, but that will likely worsen if the status-quo continues. Meanwhile, the United States and most oil-importing Western nations stand to benefit.


Resources

CNN: OPEC is at War and it’s Sending Shockwaves Around the World

OPEC: Brief History

CNN: Oil prices dive below $37 to Lowest Level in Seven Years

Library of Economics and Liberty: Supply

Bloomberg View: Saudi Arabia’s Oil War with Russia

U.S. News and World Report: Iran to Add 500,000 Barrels of Oil Exports After Sanctions are Lifted Through Nuclear Deal

The Wall Street Journal: Global Demand Growth for Oil May Fall by a Third in 2016

CNN Money: Saudi Arabia to Run Out of Money in Less Than 5 Years

New York Times: From Venezuela to Iraq to Russia, Oil Price Drops Raise Fears of Unrest

Reuters: Russian Government Sees 2015 GDP Down 3 percent, More Optimistic Than Other Forecasts

International Business Times: Oil Price 2015 Russia Forced to Make Additional Spending Cuts, Official Says

Guardian: OPEC Bid to Kill off U.S. Shale Sends Oil Price Down to 2009 Low

New York Times: Oil Prices What’s Behind the Drop? Simple Economics

The Christian Science Monitor: Can Canada’s Oil Sands Survive Low Oil Prices?

U.S. News and World Report: Energy Stock Winners and Losers When U.S. Oil Exports Go Global

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post An Over-Supply of Underpriced Oil: Explaining the New Energy Crisis appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/supply-underpriced-oil-explaining-new-fuel-crisis/feed/ 0 49506
The Cost of Terrorism: How is ISIS Funded? https://legacy.lawstreetmedia.com/issues/world/kickstarting-terrorism-isis-funded/ https://legacy.lawstreetmedia.com/issues/world/kickstarting-terrorism-isis-funded/#respond Mon, 14 Dec 2015 17:02:47 +0000 http://lawstreetmedia.com/?p=49307

Following the money behind ISIS.

The post The Cost of Terrorism: How is ISIS Funded? appeared first on Law Street.

]]>
Image courtesy of [Pictures of Money via Flickr]

ISIS has been the focal point of public discussion for several months now and it seems like the group will not leave the spotlight anytime soon. But while we often talk about what the group is doing and how to respond, we often spend less time understanding how it is able to sustain itself. How do ISIS and other terrorist groups manage to continuously fund global operations while being attacked by several world powers?

In the case of ISIS, estimates suggest that the group’s assets equaled approximately $875 million in 2014, coming from a variety of sources that include oil production and taxes. ISIS and many other terrorist groups actually seem to resemble a sort of mix between a state government and a criminal syndicate, as their funding comes from a wide variety of sources. Read on to see where some of the money supporting ISIS, and other global terror groups, comes from.


Funding Terrorism in the Past

Traditionally, terrorism has primarily been funded through private donations. This was certainly the case for ISIS’s predecessor, al-Qaida, which received much of its funding from wealthy Saudis. Charities can be an effective vehicle for these donations because they are difficult to detect and tie to radical organizations. Many of these groups worked on legitimate causes while also funneling money to extremists, muddying the waters even further.

The money raised by these charities is then laundered through shell companies and some legitimate businesses, then transferred to a terrorist group. Another popular means of moving money is through remittances, which are popular in the Middle East. One example of remittances is the use of Hawalas, which are essentially untraceable wire transfers that allow people to send money from one country to their friends or relatives in another. According to a Treasury Department report, Hawalas are often cheaper and faster than traditional bank transactions, making them particularly appealing. Using middlemen with contacts in both countries, payments can be made without needing to transfer money for each transaction. While Hawalas are useful for many people who send money abroad for legitimate reasons, they are also well-liked by terrorist groups because they can be used in areas with little financial infrastructure and are hard to trace.
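What makes hawala transfers so hard to trace is the netting logic: no money crosses a border for any individual transaction, only offsetting IOUs between brokers that are settled later. The toy ledger below illustrates that bookkeeping; the broker names and amounts are invented:

```python
from collections import defaultdict

class HawalaLedger:
    """Toy model of hawala-style value transfer between brokers."""

    def __init__(self):
        # debts[(a, b)] = amount broker a owes broker b
        self.debts = defaultdict(float)

    def transfer(self, sender_broker: str, payout_broker: str, amount: float):
        # The sender hands cash to sender_broker; payout_broker pays the
        # recipient from its own local funds and is owed the amount.
        self.debts[(sender_broker, payout_broker)] += amount

    def net_balance(self, a: str, b: str) -> float:
        # Positive means a owes b on net; only this residual ever settles.
        return self.debts[(a, b)] - self.debts[(b, a)]

ledger = HawalaLedger()
ledger.transfer("broker_A", "broker_B", 5_000)  # remittance one way
ledger.transfer("broker_B", "broker_A", 3_000)  # unrelated transfer back

print(ledger.net_balance("broker_A", "broker_B"))  # 2000.0
```

Because only the net residual ever moves between brokers, often in cash or goods, most individual transfers leave no bank record at all.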

Efforts have been made to crack down on this type of financing–pressure has been placed both on Gulf nations, like Saudi Arabia, and financial institutions to look for any suspicious activity. While this certainly remains a viable source of income for terrorists, it has generally stopped being the number one source as governments have placed additional scrutiny on international financial transactions. Instead, ISIS and other groups have shifted to new tactics. The following video gives a look at money laundering and how terrorist groups raise funds illegally:


Help from Their Friends

While Gulf states’ support for terrorism has declined, it has certainly not been eliminated altogether. People in countries like Kuwait, Qatar, and Saudi Arabia have long been known as funders of ISIS and other extremist groups, including al-Qaida. These countries, which are in many ways American allies, argue that they are protecting Sunnis from Shiites in a larger struggle for the heart of Islam. The accompanying video looks at where and how ISIS gets private donations, including those from American allies:

While banks, especially Western banks, have measures in place to identify money laundering and terrorist funding, the same is not true for all of the Gulf states. In places like Qatar, these controls are not as stringent and are not strongly enforced. ISIS also hires fundraisers to reach out to wealthy individuals and solicit money to support its cause.

Money can also be sent to ISIS in the form of fake humanitarian aid packages. These packages are often sent to war zones under the guise of humanitarian assistance but are not actually directed to an individual or organization. These transactions tend to be very difficult to stop for a host of reasons. In addition to poorly regulated banking systems, groups and individuals who send money are often influential in their home countries. Additionally, few humanitarian organizations have direct ties in the region to ensure that the assistance makes it to the proper aid workers.


Traditional Means

Taxes, Extortion, and Robbery

To fill the gap from private donations, ISIS, like traditional states, relies heavily on taxes. The group places a tax on everything it believes to be valuable, from businesses to vehicles. ISIS also taxes non-Muslims, giving them the choice among forced conversion, paying a tax, or death. These shakedowns take place at businesses, public areas, or checkpoints, forcing people to pay or face violence and possibly death. ISIS also sends fundraisers ahead of its fighters into a town or city to demand money. It is important to note that the group only attacks and attempts to conquer areas with some sort of financial value. It rarely, for example, conquers vast tracts of desert simply to take more territory. Taxation has become an especially important source of income as the group’s other revenue streams, like oil production, have declined. In fact, taxation and extortion were ISIS’s largest sources of income in 2014, amounting to a reported $600 million in revenue.

In many ways, the taxation practiced by ISIS is a form of theft, but the group also does its fair share of outright robbery. When the group took the Iraqi cities of Mosul and Tikrit last year it seized vast quantities of money from bank vaults–estimates suggest those confiscations amounted to $1.5 billion. The group is also notorious for outright stealing possessions from people when it conquers a new territory.

Kidnapping

Another means for ISIS to offset its expenses is organized crime. Emulating its predecessor al-Qaida, ISIS has relied heavily on kidnapping for ransom. ISIS’s victims are traditionally Westerners, many of whom work for wealthy organizations. Although European countries sometimes pay ransoms, others, such as the United States, will not, though some corporations will discreetly pay ransoms for their workers.

In 2014, the U.S. Treasury estimated that ISIS made as much as $20 million from kidnapping. This money did not come only from abducting foreigners; it was also the result of the group’s willingness to kidnap citizens within its own territory if it feels it can generate a high enough payoff.

Drug Trafficking

Along with trafficking in people, ISIS, like any criminal organization, may deal in drugs. While it is unclear how much revenue the group receives from the practice, it seems likely that drugs are one more weapon in ISIS’s financial arsenal. This is another example of ISIS learning from its predecessor, al-Qaida.

Oil/Water/Food

While these are all important revenue streams for ISIS, its most valuable asset is the one it shares with its Middle Eastern neighbors: oil. Iraq has the fifth largest proven oil reserves in the world, and ISIS uses this supply to help fill its coffers. While many of the world’s nations impose sanctions on ISIS to prevent it from selling any of these supplies, the group still manages to smuggle oil for profit. Using paths developed in Iraq during the time of Saddam Hussein, the group is able to smuggle oil, cash, and other contraband to neighboring countries. In 2014, depending on the always-volatile price per barrel, ISIS was making between $1 million and $2 million a day in oil revenue.
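For a sense of scale, that revenue range implies a rough daily smuggling volume. The discounted prices below are assumptions (smuggled oil reportedly sells well under the world price), so treat this strictly as back-of-envelope arithmetic:

```python
# barrels per day = revenue per day divided by price per barrel
for revenue_per_day in (1_000_000, 2_000_000):  # range cited above
    for price_per_barrel in (25, 40):           # assumed discounted prices
        barrels = revenue_per_day / price_per_barrel
        print(f"${revenue_per_day:,}/day at ${price_per_barrel}/bbl "
              f"is about {barrels:,.0f} barrels/day")
```

Even at the higher assumed price, that is tens of thousands of barrels a day that have to move through smuggling networks undetected.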

Although much of the oil is smuggled illegally into neighboring countries, it may also be finding more legitimate routes. According to Russian sources, Turkey is allowing large shipments of oil from areas known to be under ISIS’s control. While this could very easily be a baseless accusation in the wake of Turkey shooting down a Russian fighter jet, it may be worth considering. David Cohen, the Under Secretary for Terrorism and Financial Intelligence at the U.S. Treasury, noted that Turkey, Syria, and the Kurds have all made deals, through middlemen, to acquire oil from ISIS despite openly fighting the group.

Other Means

ISIS has also utilized other creative methods to fill its reserves. One such method is looting the historical sites that it has become notorious for destroying. Another is through its well-known skills with social media. ISIS uses apps such as WhatsApp and Kik to coordinate covert money drop-offs from its supporters. Other groups such as Boko Haram have even more innovative schemes, from acting as local muscle to employing internet scams.

Ultimately, though, how much ISIS relies on any one source, and how valuable any one source is to the group, tends to fluctuate. After all, the group now makes far more from taxes than oil production, and early sources of income like robbing banks may start to dry up. So far, this strategy has been effective, as ISIS really only spends money on fighter salaries while it salvages weapons and avoids building projects because of the threat posed by airstrikes. ISIS’s strategy of thriftiness, especially regarding the social services it offers its conquered subjects, could prove more decisive than any allied bomb strike in determining its future.

The video below details how ISIS gets its money:


Conclusion

ISIS has proven to be extremely difficult to defeat by conventional means. Despite waves of airstrikes and military support for the Syrian, Kurdish, and Iraqi militaries, the group has endured and even thrived. This is a result of several factors. One is ISIS’s ability to operate a crude form of local government, taxing and extorting the people under its control. Another is its ability to draw revenue from a variety of sources, much like a criminal enterprise. Many of these methods were pioneered by al-Qaida and are now being adopted by Boko Haram as well.

However, ISIS’s ability to survive is also partly attributable to the difficulty, and the occasional unwillingness, of bordering countries to crack down on the flow of money to terrorist organizations. These countries have, in some cases, let ISIS smuggle goods across their borders and rampage unopposed, and have even somewhat directly financed its operations.

To eliminate ISIS, as with al-Qaida before it, its finances must be crippled. If a group cannot pay people to fight for it, or provide services as a government, staying in power becomes increasingly difficult. However, ISIS and like-minded groups have become particularly effective at keeping the lights on.


Resources

Council on Foreign Relations: Tracking Down Terrorist Financing

Newsweek: How does ISIS fund its Reign of Terror?

The Jerusalem Post: How does the Islamic State Fund its Activities?

Security Intelligence: Funding Terrorists the Rise of ISIS

The Daily Beast: America’s Allies are Funding ISIS

Independent: Russia Publishes “Proof” Turkey’s Erdogan is Smuggling ISIS Oil Across Border from Syria

RFI: Nigerian Intelligence Chief Calls for Untangling of Boko Haram Funding

Perspectives on Terrorism: A Financial Profile of the Terrorism of Al-Qaeda and its Affiliates

Political Violence at a Glance: ISIS, Ideology, and the Illicit Economy

New York Times: ISIS Finances Are Strong

Vox: This Detailed Look at ISIS’s Budget Shows That it’s Well-funded and Somewhat Incompetent

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Cost of Terrorism: How is ISIS Funded? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/kickstarting-terrorism-isis-funded/feed/ 0 49307
Mali: A Tale of Two Countries? https://legacy.lawstreetmedia.com/issues/world/mali-tale-two-countries/ https://legacy.lawstreetmedia.com/issues/world/mali-tale-two-countries/#respond Fri, 04 Dec 2015 15:26:32 +0000 http://lawstreetmedia.com/?p=49289

What caused the recent attack in Mali?

The post Mali: A Tale of Two Countries? appeared first on Law Street.

]]>
Image courtesy of [qiv via Flickr]

While the world mourned the terrorist attacks in Paris, another attack occurred in the African nation of Mali. Last Friday, terrorists took 170 hostages and killed 19 people at a Radisson hotel in Mali’s capital of Bamako. The hotel was allegedly targeted because it was a popular place for foreigners. The group that claimed responsibility for the attack was Al Mourabitoun, a group associated with Al-Qaeda in the Islamic Maghreb. While the attack itself was a serious concern, for Mali it follows a series of troubling incidents that have included a rebellion, a coup, and several terrorist organizations, including the one behind the recent attack, becoming increasingly aggressive, all within the last three years.


Kings in the North

The country of Mali has been populated for thousands of years by a hodge-podge of ethnic groups. It is also home to historically important places such as Timbuktu, once a regional center for trade and learning. The country became an important colony for the French during the 19th and 20th centuries. Following independence in 1960, Mali, like many other former colonies in Africa, experienced political and economic turmoil characterized by events such as coups and separatist movements.

The Tuareg

Last week’s terrorist attack was not the first violence the country has faced from internal insurgency. The Tuareg, a group that has lived in the territory that is now Mali since 500 BCE, have been fighting against the state for decades. The Tuareg people are a collection of nomadic groups spread primarily across three countries: Mali, Niger, and Algeria. It is important to note that there is significant diversity within the group.

Their society is hierarchical, with very clear roles for individuals. They were originally animists before the arrival of Islam in the region. The group determined to win independence for the Tuareg in Mali goes by the acronym MNLA, for the National Movement for the Liberation of Azawad. The video below gives a detailed look at the Tuareg and why they are rebelling:

Following independence in 1960, some Tuareg demanded their own autonomous region in the north, an area they dubbed Azawad. This first rebellion was driven by the perceived marginalization and repression of the Tuareg people by the more prosperous south. It led to a series of small-scale clashes, but the rebellion was ultimately crushed.

A second Tuareg rebellion followed in the early 1990s. This fight centered on many of the same issues. This time, however, the Tuareg were more successful in battle and won greater concessions. Some of this success can be attributed to the years some tribes had spent fighting in Libya for its then-ruler Muammar Gaddafi, becoming battle-hardened in the process. Nevertheless, the main issue of Tuareg autonomy remained unsettled and led to a third rebellion. This third rebellion, which started in 2006, consisted mostly of hit-and-run attacks by Tuareg fighters on Malian soldiers and failed peace deals.

The most recent rebellion began in 2012. In this case, battle-tested Tuareg rebels who had stolen large amounts of weapons from Libya once again attacked Malian soldiers. With their new weapons and alliances with various Islamic extremist groups, the Tuareg were able to make substantial gains this time, taking over the northern half of the country. Although these rebellions are divided into different campaigns, the Tuareg struggle has gone on so long that some observers regard it as one continuous fight with periods of increased violence.


The 2012 Coup

In 2012 Mali, like many other countries in Northern Africa and the Middle East, was also the site of a coup. But unlike many of the others, which echoed the positive ethos of the Arab Spring, the Mali coup was different. First, in Mali, the man in charge was democratically elected and nearing the end of his term without any intention of running again. Additionally, those behind the coup were members of the military. But while military coups are certainly not rare, this one started from the bottom up, with regular soldiers and low-ranking officers rebelling against the government. This is the inverse of the traditional pattern, in which higher-ranking officers, such as colonels or generals, are behind a revolt.

So why did this coup happen, and what explains its uniqueness? Much of that can be blamed on the rebellion in the north. Soldiers who had just been routed by the Tuareg were the ones who initiated the overthrow. These men were dissatisfied with how the Mali government was handling the insurrection, namely that they were ill-equipped to take on the Tuareg rebels, who were by then well supplied with weapons seized in Libya following Gaddafi’s collapse. The accompanying video gives a quick look at the coup in Mali and its immediate aftermath:

Extremism

As has been the case in other countries that have experienced coups or rebellions since the Arab Spring, the movement against the government has also included radical elements, including those from recognized terrorist groups. In Mali’s case, despite the uniqueness of the coup itself, this same principle holds true.

Following the victories against Malian government forces in 2012, a number of Islamic extremist groups got involved in the movement. This mix is made up of five main groups: Ansar Dine, the Movement for Unity and Jihad in West Africa (Mujao), al-Qaeda in the Islamic Maghreb (AQIM), the Signed-in-Blood Battalion, and the Islamic Movement for Azawad (IMA). Of these five, the main and most active groups are Ansar Dine and AQIM.

Ansar Dine is a nationalist Islamist group in Mali that hopes to impose Sharia law across the whole country. The IMA is a splinter of this group, which broke off because it opposed Ansar Dine’s violent tactics and wanted to return to talks instead.

The other major player in the region is AQIM. It also wants to impose Sharia law on Mali, as well as erase the legacy of French colonial rule. It was created out of a combination of groups in 2007 and afterward allied itself with Al-Qaeda. The Mujao and the Signed-in-Blood Battalion are both offshoots of this group. The Mujao wants to expand AQIM’s goals to all of West Africa. The Signed-in-Blood Battalion, meanwhile, formed after a falling-out between its leader and AQIM. While these groups claim to be fighting for Malian interests, the sect of Islam they represent, Saudi-sponsored Wahhabism, is in fact in conflict with that of the majority of Malians, who are Sufi Muslims. Al Mourabitoun, the group that committed the recent hotel terrorist attack, is also associated with AQIM.


Outside Influence and Peace

Role of the French

As Mali’s former colonial power, the French played a key role in the fight against the insurgency, as well as in the recent terrorist attack and its aftermath. In response to the 2012 rebellion, France sent in ground troops to launch a counteroffensive, as well as fighter jets to conduct airstrikes. These attacks, in concert with Malian efforts, were critical in helping the Malian government push back the extremists.

Additionally, during the recent attack, elite French troops, along with American counterparts, helped rescue hostages and kill the attackers. However, the future of France’s role remains up in the air as it shifts its gaze to Syria, meaning fewer resources are available to assist the government of Mali. The following video gives an in-depth look at the French efforts in Mali:

End of the Conflict?

The most recent peace agreement was reached on June 20, 2015; it called for a ceasefire and for Tuareg rebels to give up the territory they had taken and the weapons they had acquired in Libya. However, actual peace remains elusive. Rebels have continued to breach the ceasefire, preventing any further steps toward reconciling the country. Furthermore, even if Mali can get the rebels to reach a peace agreement and, more importantly, honor it, it still has to deal with the various terrorist elements in the country–the hotel attack is just one example of the potential violence wrought by some of these groups. Unlike the Tuareg, who want autonomy and greater governmental support, these groups’ motivations are often religious in nature, and their demands are therefore much harder to meet.


Conclusion

Mali is the perfect example, and victim, of events beyond its control. If Gaddafi had not been killed in Libya, it is reasonable to wonder whether the Tuareg, who had been repeatedly crushed by the government, would ever have made the gains they did. If the Tuareg had never been so successful, it is equally likely that no coup would have occurred, and thus no French involvement. Of course, if the French had never made a colony out of Mali in the first place, it might have developed along ethnic lines instead of being welded together into an unwieldy whole. However, Mali is also the victim of its own doing. If it had not tried for so long to crush and marginalize the Tuareg, it might never have felt their wrath. Additionally, if it had confronted extremists more strenuously, it might have prevented their ideology from taking root and becoming so powerful.

In the case of Mali, then, there are a lot of ifs and buts, but the reality is unmistakable. As the attack last week drove home, the country is at war with extremists entrenched within its own borders and possibly on the verge of breaking in two if it fails to honor the latest peace treaty. Only time will tell if Mali can navigate these perils.



Resources

The Guardian: Mali Attack: More Than 20 Dead After Terrorist Raid on Bamako Hotel

Encyclopedia Britannica: Mali

Global Research: The Crisis in Mali: A Historical Perspective on the Tuareg People

Al Jazeera: What do the Tuareg want?

New York Times: Soldiers Overthrow Mali Government in Setback for Democracy in Africa

International Business Times: Mali Tuareg Rebellion: the Fight for Independence of the Blue People

BBC: Mali Crisis: Key Players

Time: Mali Hotel Attack Highlights France’s Strategic Dilemma

United Nations: Path to Peace in Mali

CNN: Mali Hotel Attack

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Mali: A Tale of Two Countries? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/mali-tale-two-countries/feed/ 0 49289
What Does the FBI Do Abroad? https://legacy.lawstreetmedia.com/issues/world/fbi-going-global/ https://legacy.lawstreetmedia.com/issues/world/fbi-going-global/#respond Mon, 30 Nov 2015 22:17:54 +0000 http://lawstreetmedia.com/?p=49200

It's not just a domestic agency anymore.

The post What Does the FBI Do Abroad? appeared first on Law Street.

]]>
Image Courtesy of [Cliff via Flickr]

Caught up in the whirlwind of the recent terrorist attacks in Paris was an interesting little footnote: American FBI agents headed to France to assist in the investigation. The FBI going to Paris, or anywhere else outside the United States, may, on its face, seem like the agency overstepping its jurisdiction. When it comes to matters outside the borders of the country, most assume that the Central Intelligence Agency would be the organization involved, not the FBI, whose mandate is more domestically focused. However, as this recent example and others have shown, the FBI does, in fact, operate abroad. Read on to see what the FBI does around the globe, how its role has changed over the years, and how all this activity is perceived internationally.


The Federal Bureau of Investigation Abroad

The beginning of the FBI’s work abroad can be traced back to World War II. In 1940, as the war intensified and the prospect of the United States joining in the fight grew, President Roosevelt assigned the Federal Bureau of Investigation to handle intelligence responsibilities for the Western Hemisphere. In an era before internationally-focused agencies like the CIA existed, the FBI was given the task. This initial step centered on finding and exposing Nazi spies who were attempting to sneak into the United States from South America.

The FBI realized early on that in order to maximize its effectiveness it needed to coordinate with local authorities in other countries. Starting in Bogotá, Colombia, the FBI began assigning special agents to positions that would eventually be known as Legal Attachés or “Legats.” When WWII ended, the CIA took over much of the foreign intelligence work and the FBI shifted its international focus to training and developing working relationships abroad. Since then the program has continued to expand. According to the FBI’s Legal Attaché website:

Today, we have Legal Attaché offices—commonly known as Legats—and smaller sub-offices in 75 key cities around the globe, providing coverage for more than 200 countries, territories, and islands. Each office is established through mutual agreement with the host country and is situated in the U.S. embassy or consulate in that nation.

In addition to Legat offices in foreign countries, the FBI coordinates with similar organizations overseas like Europol. The following video looks at what the FBI does abroad with a focus, in this case, on investigation:


What the FBI Does

So what does the FBI do with all these agents and other personnel stationed abroad? A major focus of the FBI’s effort abroad is training. Among other things, the FBI’s training focuses on providing information on kidnapping, fingerprinting, and corruption. As part of this exchange, the FBI also welcomes a growing number of foreign nationals to its training facility in Quantico, Virginia.

In addition to training, the FBI assists with investigations in other countries. In the most recent example, the terrorist attacks in Paris, the FBI sent agents with particular expertise. According to the New York Times, the agents sent by the FBI have skills focused on recovering data from electronic devices. The agents will help French police recover intelligence about the attackers and relay any information relevant to U.S. security interests back to the United States. The FBI conducted a similar operation in Uganda in 2010. In that investigation, a large contingent of FBI agents was sent to the African nation to investigate the terrorist attacks there and aid in identifying potential suspects.

In order to understand the FBI’s role abroad, it is important to look at how the bureau changed in the wake of the September 11 attacks in 2001. After the attacks, the FBI moved away from its traditional role of investigating domestic crime to a new focus on counterterrorism and intelligence gathering. This transition has been widely documented and is openly accepted by the bureau itself. According to an FBI report on its counterterrorism program after 9/11,

Since the horrific attacks of September 11, 2001, the men and women of the Federal Bureau of Investigation (FBI) have implemented a comprehensive plan that fundamentally transforms the organization to enhance our ability to predict and prevent terrorism. We overhauled our counterterrorism operations, expanded our intelligence capabilities, modernized our business practices and technology, and improved coordination with our partners.

A major driver behind the FBI’s international cooperation is enabling other countries to handle terrorism within their own borders so the FBI does not have to bring a suspect back to the United States to face charges.


Abuse Abroad?

While the FBI’s methods are targeted to prevent terrorism and assist organizations abroad, the bureau has faced a lot of scrutiny for its actions in other countries. There are several examples of people, often American nationals abroad, alleging that improper techniques were used to extract information or force compliance.

One example is Gulet Mohamed, an American national who was detained in Kuwait on suspicion of being connected with the known terrorist Anwar al-Awlaki. Mohamed, who had traveled in Yemen and Somalia, was detained and aggressively interrogated. He was then placed on the no-fly list. Amir Meshal, another American national, who was fleeing Somalia for Kenya, shared a similar fate. Meshal was detained in Kenya and claims that he was tortured in order to force a confession. Another instance is the case of Raymond Azar. Azar, a Lebanese citizen, was captured, stripped, and taken from Lebanon, then flown to the United States to be charged with bribery.

Yonas Fikre is yet another example. Fikre was arrested in the UAE at the behest of the FBI. He claims that he was detained after he refused to become an FBI informant, and that he was subsequently tortured and added to the FBI’s no-fly list. Fikre is one of nine members of his mosque who, he claims, have been added to the no-fly list for refusing to become informants. In Uganda, four men from Kenya and Tanzania claim that they were illegally detained and abused by FBI agents. They say they were tortured in connection with a bombing in Kampala that killed 76 people, the crime they were suspected of committing. A spokesperson for the FBI said that all agents act within FBI guidelines and the laws of the country where they operate.

These methods have not gone unchallenged. Both Fikre and Mohamed challenged their placement on the no-fly list and the methods employed by the FBI during their detentions. Meshal also took issue with his capture and attempted to file a lawsuit against the FBI. However, the U.S. Court of Appeals for the District of Columbia Circuit denied him the opportunity because his case dealt with national security. The suspects allegedly tortured in Uganda took issue with the conduct of the FBI as well, and their allegations of abuse led to an internal investigation by the FBI. Ultimately, though, following the investigation, no charges were filed. In all cases, the FBI has maintained that its agents acted appropriately.

The video below details the specifics of the case:


Conclusion

Unbeknownst to many citizens, the FBI has had a large and increasing presence abroad since the early days of World War II. This presence generally takes the form of agents, known as legal attachés, who are stationed at American embassies all over the world. The agents are primarily concerned with acquiring information and disseminating it to local law enforcement and counterterrorism agencies in the United States. This system helps empower countries to effectively combat terrorism and domestic threats as well as further U.S. security interests abroad.

Aside from information gathering and training, FBI Legal Attachés are often called on for assistance in investigating crimes and terror threats. There are numerous examples of this, from the recent attack in Paris to earlier attacks in Africa; the FBI also has its own list of selected accomplishments. In these cases, the FBI offered expertise in analyzing a crime scene or technical skills that the local government did not have.

Nevertheless, the FBI’s efforts abroad are not universally acclaimed. In the course of its investigations, the agency has repeatedly faced criticism for abuse and for punishing non-cooperation, such as by adding suspects’ names to the no-fly list without probable cause. Despite the criticism, the FBI maintains that its agents acted properly, and internal investigations rarely find wrongdoing.

The FBI’s shift from focusing on domestic investigations to gathering counterterrorism intelligence has led many to criticize the agency. But the FBI maintains that it must change in order to be “a global organization for a global age.” The FBI has continued to grow its international presence in the form of Legal Attachés and several counterterrorism task forces after the 9/11 attacks. While some may criticize this trend, most evidence suggests that it will continue.


Resources

New York Times: F.B.I. Sending Agents to Assist in Paris Investigation

San Diego Union-Tribune: Al-Shabab Leader Threatens More Ugandan Attacks

New York Times: In 2008 Mumbai Attacks, Piles of Spy Data, but an Uncompleted Puzzle

New York Times: Detained American Says F.B.I. pressed him

CBS News: American Can’t Sue FBI Over Abuse Claims, Federal Appeals Court Says

Los Angeles Times: Lebanese Man is Target of the First Rendition Under Obama

Open Society Foundation: FBI Responds to Kampala Abuse Allegations Cited in Open Society Justice Initiative

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What Does the FBI Do Abroad? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/fbi-going-global/feed/ 0 49200
How Has Egypt Changed After the Arab Spring? https://legacy.lawstreetmedia.com/issues/world/egypt-mired-chaos/ https://legacy.lawstreetmedia.com/issues/world/egypt-mired-chaos/#respond Sat, 21 Nov 2015 22:05:11 +0000 http://lawstreetmedia.com/?p=49117

What happened to Egypt?

The post How Has Egypt Changed After the Arab Spring? appeared first on Law Street.

]]>

On Halloween night, a Russian plane leaving the Egyptian town of Sharm al-Sheikh crashed mysteriously in the Sinai Peninsula. While the conversation quickly shifted to whether or not a bomb was the cause, the crash is just one more in a series of events that depict the ongoing chaos within Egypt. The start of this chaos coincided with the Arab Spring, which upended a decades-long dictatorship only a few years ago.

Read on to see the political evolution in Egypt, beginning with the Arab Spring, through its messy post-revolution transition, to the current government under military leader Abdul Fattah al-Sisi. How have these events shaped the country, and what role do countries like the United States and groups like ISIS play in the shaping of Egypt’s recent political turmoil?


Background

The Arab Spring

Fresh on the heels of widespread protests in Tunisia, a similar uprising emerged in Egypt over the rule of Hosni Mubarak, which was characterized by oppression and poverty. After the protests grew, President Mubarak eventually offered to step down at the end of his term and appoint a vice president for the first time in his reign. However, these changes did little to placate Egyptians, who continued the protests in Tahrir Square. After continued dissent and the government’s failed attempts to violently end the protests, Mubarak ultimately resigned, leaving power in the hands of the military. The following video provides good insight into the Arab Spring and its aftermath in Egypt:

Hosni Mubarak

Egypt’s longtime ruler came to power during a time of chaos as the vice president succeeding Anwar Sadat, who was killed by Islamic extremists during a military parade. Upon ascending to the presidency, a role he would maintain for the next thirty years, Mubarak declared a state of emergency which was in effect until he stepped down in 2011. While Mubarak at points seemed untouchable, eventually even his time would come. After finally ceding power, the longtime ruler was also arrested and subsequently put on trial. Mubarak was charged with embezzlement, corruption, and complicity in the killing of protesters.

In 2012, he was convicted of complicity in the killing of protesters and sentenced to life in prison; he was granted a retrial in 2013 and acquitted in 2014. He was also convicted on the other two charges and granted retrials for those in 2013 as well, and in 2014 he was acquitted of corruption but found guilty of embezzlement. Mubarak’s final retrial will take place in January 2016.


Post Revolution

Following Mubarak’s forced resignation, power passed to a military council known as the Supreme Council of the Armed Forces. This group vowed to draft a new constitution and eventually cede power to a democratically elected government. However, during the transition period, the military cracked down on protests and dissolved the previous government. The council also began gradually taking on greater powers, including the ability to pass new laws and regulate the budget. Concurrently with the presidential election, the council dissolved the recently elected parliament, which at the time was dominated by members of the Muslim Brotherhood. Egypt eventually elected Mohamed Morsi president, setting up a power struggle between the elected government and the military.

The Muslim Brotherhood

The Muslim Brotherhood originated in 1928, combining political activism with charitable work based on Islamic principles. The brotherhood was initially banned in Egypt after trying to overthrow the government, but in the 1970s it renounced the use of violence. Instead, it sought to provide social services for Egyptians, which built up public trust and support. The group became so influential that President Mubarak banned the Brotherhood from competing in elections. However, after he left power, the Brotherhood won majorities in both Egypt’s lower and upper houses and eventually the presidency.

Mohamed Morsi

The Muslim Brotherhood’s candidate, Mohamed Morsi, won the presidency in 2012 to become the first democratically elected president in Egypt. Morsi campaigned on a promise to rule on behalf of all Egyptians, not just the Islamists who favored the Muslim Brotherhood, but critics argued that after his election he did exactly the opposite, consolidating power for himself and the Brotherhood while doing little to spur economic growth. Morsi countered that he had to take dramatic action in light of Egypt’s recent turbulence. Egyptians quickly became dissatisfied with Morsi’s rule and protests emerged. The dissenters intensified their efforts and eventually clashed with the government. After a period of large-scale uprisings, the military stepped in and ousted Morsi from power. His presidency lasted just over a year.

After being forced out of office, Morsi was charged with a number of crimes, ranging from espionage to terrorism. He was eventually convicted and sentenced to death. After several legal battles, the court reaffirmed the sentence in June.

Abdul Fattah al-Sisi

Abdul Fattah al-Sisi came to power in the elections following Morsi’s ouster, in which he ran virtually unopposed. Upon al-Sisi’s election, Egyptians thought they were getting a strong nationalist leader who would rid the country of the Brotherhood’s radical Islamism and reinvigorate the economy. Instead, al-Sisi has unleashed a crackdown on dissent, particularly the Muslim Brotherhood. Under al-Sisi’s presidency, the economy continued to falter, only staying above water thanks to support from nations like Saudi Arabia, Kuwait, and the United States. Assessments of his presidency cite human rights violations and a crackdown on free expression and dissent.

The video below shows life in Egypt under al-Sisi:


Other Actors

The United States

Egypt has long been an important country to the United States because of its large population and the presence of the Suez Canal, one of the major avenues for world trade. The importance of this relationship can be quantified by the $76 billion in aid given to Egypt since 1948, including $1.3 billion annually for Egypt’s military.

Recently, however, this relationship has taken a different direction. In light of the forced removal of Mohamed Morsi’s government in 2013, the United States has been reevaluating its relationship with Egypt. The United States began withholding certain military equipment in 2013 to express dissatisfaction with the political trend in Egypt–although military cooperation continued.

As the Congressional Research Service notes, Egypt later signed arms deals with France and Russia and after terrorist attacks in the region earlier this year, the United States resumed its shipments. However, this aid is subject to continued evaluation and beginning in 2018 it will be directed for certain missions instead of being given as a blank check to the military. Egypt’s governing issues and changing U.S. policy priorities, like a nuclear deal with Iran, have reduced Egypt’s long-standing importance as an American ally.

The accompanying video gives a good look at Egypt-U.S. relations:

ISIS

Like other parts of the Arab world, Egypt has become a home for Islamic extremists loyal to the Islamic State. In Egypt, the group is based in the Sinai, which has been loosely governed since Israel returned it to Egypt under the 1979 peace treaty. The group has been responsible for a number of attacks and has claimed responsibility for the recent plane bombing that killed 224 people. Despite several military offensives, Egypt has been unable to rid itself of the terrorist group.

In addition to ISIS affiliates, other actors are also making a play in Egypt. Russia reached a preliminary agreement to provide Egypt with $3.5 billion in arms, a deal seen as filling the gap left by the United States. France also signed a major arms deal with Egypt valued at nearly $6 billion. Saudi Arabia and Iran are also competing for Egypt’s favor in their ongoing proxy war. In fact, Saudi Arabia is one of Egypt’s largest supporters, helping keep the al-Sisi regime in control.


Conclusion

Like many other countries that experienced a change in leadership following the Arab Spring, Egypt has found itself stuck in place and may be reverting to its old ways. While the prospect for democracy in Egypt looked bright shortly after the uprising in 2011, the military has successfully managed to maintain control. Mohamed Morsi’s brief rule was quickly followed by the election of a military leader. The current president, Abdul Fattah al-Sisi, has continued the consolidation of power that led to Morsi’s ousting and will likely continue to do so, justifying it with the threat of terrorism.

While the United States may not approve of the recent governing issues in Egypt, other countries have stepped in to provide military aid to the al-Sisi government. Egypt now presents a challenge to both itself and its traditional allies. As the threat of terrorism grows in the region, a democratic Egypt is becoming less of a policy priority for the West. As a result, there is little pressure on President al-Sisi to uphold liberal principles. We’ll have to see if that conundrum holds true in the new year.


Resources

Reuters: Russian Officials Believe Sinai Plane Brought Down by Bomb

Council on Foreign Relations: Egypt’s Muslim Brotherhood

Encyclopedia Britannica: Egypt Uprising of 2011

BBC: Hosni Mubarak

Frontline: What’s Happened since Egypt’s Revolution?

BBC: Egypt’s Muslim Brotherhood

BBC: What’s Become of Egypt’s Morsi?

Biography: Mohamed Morsi

Al Jazeera: President Sisi’s very bad year

CNN: ISIS beheading an ominous sign in struggling Egypt

Reuters: Russia, Egypt seal preliminary arms deal worth $3.5 billion

Al-Araby: Saudi Arabia and Egypt friends or foes?

Congressional Research Service: Egypt Background and U.S. Relations

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post How Has Egypt Changed After the Arab Spring? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/egypt-mired-chaos/feed/ 0 49117
What A Massive Prisoner Release Means for the Criminal Justice System https://legacy.lawstreetmedia.com/issues/law-and-politics/massive-prisoner-release-means-criminal-justice-system/ https://legacy.lawstreetmedia.com/issues/law-and-politics/massive-prisoner-release-means-criminal-justice-system/#respond Sat, 14 Nov 2015 21:42:34 +0000 http://lawstreetmedia.com/?p=48986

The United States is starting to deal with its prison problem.

The post What A Massive Prisoner Release Means for the Criminal Justice System appeared first on Law Street.

]]>
Image courtesy of [Corrie Barklimore via Flickr]

In the span of four days–between October 30 and November 2–federal prisons around the country released 6,000 nonviolent prisoners. This marked the largest single prisoner release in the history of the United States. The decision was the result of the government’s growing desire to address the overcrowding within the prison system. An additional 40,000 convicts could also be released in the coming months as new, more lenient sentencing rules can be retroactively applied to them as well. Read on to see what led to the recent release and how it compares to similar releases in the past. Now that the government is starting to deal with an issue that has been building for decades, what will a continued response look like and how will the prison system change in the future?


Releasing Prisoners

In the Past

While the recent release of so many prisoners all at once has drawn a variety of reactions, including warnings of increased crime, this is not the first time that a large number of prisoners has been released. In 2011, the Supreme Court ordered the state of California to release 30,000 inmates due to overcrowding in the state’s prison system.

On top of this are the prisoners released in the normal course of a year–the federal government releases up to 55,000 prisoners each year. However, that is only a small portion of all the inmates set free; across the country, as many as 10,000 are released each week.

This Release

The recent release was a long time in the making. The final decision came about following the advice of the U.S. Sentencing Commission, which lowered the recommended sentences for people convicted of drug-related offenses. The change could then be applied retroactively, meaning prisoners who had already been convicted and were serving sentences could apply for early release. Ultimately, the decision was up to federal judges, who reviewed eligible cases and determined whether the person in question would be a threat if released back into society.

Like the process, the release itself was not as straightforward as it may seem. Of the 6,000 inmates, approximately a third were undocumented immigrants. This group will not be released to the public, but will instead be detained by Immigration and Customs Enforcement, which will begin deportation proceedings. Additionally, many of those released were already on parole or in halfway houses. On average, those being released had already served nine years of their sentences and were only getting out around 18 months earlier than expected. The video below details the recent release:


Current Issues

Overcrowding

One of the major reasons for releasing these prisoners is that the prison population is simply too big for the system to manage effectively. There are 698 prisoners for every 100,000 people in the United States, the second highest rate in the world. A 2014 estimate from the Prison Policy Initiative suggests that there are as many as 2.4 million people in U.S. prisons on any given day, including 1.36 million in state prisons. Perhaps most troubling are the findings of a Department of Justice report, which found nearly 71,000 children in residential placement facilities as of February 2010.

In order to properly put this in perspective, it is necessary to look at the U.S. prison population in an international context. As the NAACP points out, the United States has about 5 percent of the world’s population, but it has 25 percent of the world’s prison population. Not only is the United States’ prison population disproportionately large, its racial makeup is also heavily imbalanced. Although Hispanics and African Americans make up approximately 25 percent of the total population, they make up close to 60 percent of all American prisoners.
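
To see where a per-100,000 figure like the one above comes from, here is a minimal sketch of the arithmetic, assuming round 2014 numbers; the inputs are approximations chosen for illustration, not data drawn from the cited reports.

```python
# Minimal sketch: prisoners per 100,000 residents, the standard metric
# used for international comparisons. The inputs below are assumed,
# rounded 2014 figures, not values taken from the reports cited above.

def incarceration_rate(prisoners: int, population: int) -> float:
    """Return prisoners per 100,000 residents."""
    return prisoners / population * 100_000

us_prisoners = 2_220_000      # assumed round figure for U.S. prisoners
us_population = 318_000_000   # assumed round figure for U.S. population

print(f"{incarceration_rate(us_prisoners, us_population):.0f} per 100,000")
# prints "698 per 100,000", consistent with the rate cited above
```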

While simply having a massive number of prisoners does not necessarily mean that existing prisons are overcrowded, a look at how those prisoners are concentrated makes clear that overcrowding is an issue for many states. In fact, California’s mass prison release in 2011 was due specifically to overcrowding. There were so many prisoners that inmates were being packed into gymnasiums, and the situation became so bad that the Supreme Court forced the release because conditions amounted to a health crisis. California is not an isolated case: while it may be the most extreme example, as of 2014, 17 states had prison populations far above the capacity of their facilities. While overcrowding recently caused states to reconsider their justice systems, it also led to the rise of controversial for-profit private prisons.

Sentencing

Overcrowding is largely a product of the United States’ historically severe sentencing rules. The idea of being “tough on crime” swept the nation in the 1980s. Tough-on-crime policies continued through the 1990s and early 2000s, and only now is the trend starting to reverse itself. The severity of these laws varied from state to state. California had some of the toughest policies, enacting a three-strikes law in 1994 that created mandatory punishments for repeat offenders. In 2012, California voters passed Proposition 36, which amended the state’s constitution to limit the use of its three-strikes law.

These sentences are known as mandatory minimums. As the name suggests, these policies lead to mandatory sentences of a minimum length for particular crimes, removing much of the discretion that judges have in the sentencing process. According to Families Against Mandatory Minimums (FAMM), “Most mandatory minimum sentences apply to drug offenses, but Congress has enacted them for other crimes, including certain gun, pornography, and economic offenses.” A U.S. Sentencing Commission report found that 14.5 percent of all offenders in 2010 were subject to mandatory minimum penalties–a total of 10,605 prisoners.


What’s Next?

While there are some who fear that releasing so many prisoners, especially at the same time, will lead to a surge in crime, the numbers suggest otherwise. In the California mass release, only auto thefts increased after 30,000 of the state’s inmates were released. Furthermore, a Stanford University study, which involved 1,600 prisoners released when California changed its three strikes law, found a remarkably low recidivism rate. Prisoners released after the three-strikes law changed had a recidivism rate of just 1.3 percent compared to 30 percent for regularly released inmates.

Not all laws are created equal–perhaps the most infamous example is the disparity between penalties for crack cocaine offenses and those for cocaine in its powder form. Originally, the sentencing ratio was 100:1, with those sentenced for crack-related offenses facing much longer prison terms. While the ratio was reduced to 18:1 by the Fair Sentencing Act in 2010, a disparity remains. The troubling part of this issue is that most people arrested for crack-related offenses were black, while most of those arrested for powder cocaine possession were white–reinforcing the racial imbalance in American prisons.
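
To make those ratios concrete, the sketch below works through the widely reported quantity thresholds that trigger the federal five-year mandatory minimum (5 grams of crack before 2010, 28 grams after, versus 500 grams of powder); treat the exact figures as approximations rather than a restatement of the cited sources.

```python
# Sketch of the crack/powder sentencing disparity, using the widely
# reported gram thresholds for the federal five-year mandatory minimum.
# The thresholds below are assumptions offered for illustration.

POWDER_THRESHOLD_G = 500   # powder cocaine threshold, unchanged in 2010

crack_thresholds_g = {
    "pre-2010 (100:1)": 5,    # 5 g of crack treated like 500 g of powder
    "post-2010 (18:1)": 28,   # Fair Sentencing Act threshold
}

for era, crack_g in crack_thresholds_g.items():
    ratio = POWDER_THRESHOLD_G / crack_g
    print(f"{era}: {crack_g} g crack is treated like "
          f"{POWDER_THRESHOLD_G} g powder (about {ratio:.0f}:1)")
```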

Post-Release Questions 

Another major issue is the question of what former prisoners will do once they get out. A notable concern is recidivism–when a prisoner returns to prison for another crime after his or her initial release. This worry seems warranted in light of a 2005 study conducted by the Bureau of Justice Statistics (BJS), which found that 57 percent of released prisoners were re-imprisoned after one year, 68 percent by year three, and 77 percent by year five.

It should be noted that the way the Bureau of Justice Statistics records its numbers may not be the best way to understand recidivism. In a recent study, researchers found that recidivism is actually much lower than what is reported; rates found in the BJS studies likely overrepresent people who are re-arrested after being released from prison. However, even when these new findings, which emphasize that certain offenders have a higher risk of recidivism, are taken into account, the issue remains a notable problem for American prisons.

Moreover, for those who do avoid re-offending, life can be difficult once they leave prison. While there are certainly a number of programs and organizations in place, it is still hard for someone with a criminal record to find a job. In a 2008 study from the Urban Institute, only 45 percent of ex-cons had jobs eight months after leaving prison. The following video discusses what happens to prisoners if and when they can make it out of prison:


Conclusion

The recent release of so many prisoners has reignited old fears that the reintroduction of prisoners into society will lead to a wave of crime. However, the evidence from past releases calls this line of thinking into question. Too many people, especially those of color, face long prison sentences, putting significant strain on American prisons. The current system is also costing the United States an estimated $39 billion each year.

To effectively reduce the size of the American prison population, changes beyond releasing prisoners need to be made. While recent sentencing reform, which led to this prisoner release, is an important step toward reducing the American prison population, it will not solve the issue. In addition to reducing the number of prisoners, policymakers will also have to deal with helping inmates readjust to society when they are released.



Resources

Vox: The biggest prisoner release in U.S. History, explained

Time: What happened when California released 30,000 inmates?

NPR: What You Should Know About the Federal Inmate Release

Newsweek: The Unconstitutional Horrors of Prison Overcrowding

FAMM: What are Mandatory Minimums?

The Economist: America’s Prison Population

CNN: Roughly 6000 Federal Inmates to be released

ACLU: Fair Sentencing Act

National Institute of Justice: Recidivism

Business Insider: Getting a Job after prison

NAACP: Criminal Justice Fact Sheet

Washington Post: Prisons in These 17 States are Over Capacity

Huffington Post: For-Profit Prisons are Big Winners of California’s Overcrowding Crisis

Slate: Why do so Many Prisoners End up Back in Prison? A New Study Says Maybe They Don’t

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What A Massive Prisoner Release Means for the Criminal Justice System appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/law-and-politics/massive-prisoner-release-means-criminal-justice-system/feed/ 0 48986
Burkina Faso: A Troubled History and Looming Elections https://legacy.lawstreetmedia.com/issues/world/burkina-faso-monumental-change-unlikely-place/ https://legacy.lawstreetmedia.com/issues/world/burkina-faso-monumental-change-unlikely-place/#respond Mon, 02 Nov 2015 21:19:22 +0000 http://lawstreetmedia.com/?p=48634

In a country plagued by coups, are successful elections possible?

The post Burkina Faso: A Troubled History and Looming Elections appeared first on Law Street.

]]>
Image courtesy of [Dormiveglia via Flickr]

Burkina Faso, a small, landlocked country in Western Africa, is currently in the midst of a political transition that could be monumental for the region. Much like the North African nations that underwent political change during the Arab Spring, Burkina Faso is in the throes of political turmoil. In a country with a long history of military coups, mass protests recently forced Burkina Faso’s president to resign after holding power for 27 years.

While an interim government plans to hold elections at the end of the month, recent challenges have made the country’s transition extremely difficult. From a brief counter-coup to the relatively strong influence of the military, the country has a long way to go before its government is stable again. Read on to see exactly what is going on in Burkina Faso, how it all started, and where the conflict is likely to go next.


History of Burkina Faso

Burkina Faso, at the time called Upper Volta, achieved full independence from France in 1960 after a long period of colonial rule. Between its independence in 1960 and 1987, the country went through five separate military coups. The first coup occurred just six years after it gained its independence when the democratically elected Maurice Yaméogo was ousted by military leader Sangoulé Lamizana.

Upper Volta adopted a new constitution in 1970, giving Lamizana power until another coup, led by Saye Zerbo, removed him in 1980. Zerbo was quickly replaced by Major Jean-Baptiste Ouédraogo in 1982. Ouédraogo’s forces quickly splintered into two groups: conservative and radical. Thomas Sankara assumed control of the radical faction and deposed Ouédraogo to become the country’s leader. After coming to power in 1983, Sankara implemented a series of left-wing policies, and Upper Volta was renamed Burkina Faso in 1984.

Like his predecessors, Sankara’s rule was short-lived, as he was overthrown and killed in 1987. Blaise Compaoré, an aide to Sankara, led the 1987 military coup. Deviating from the previous instability, Compaoré managed to hold power in Burkina Faso for 27 years. A new constitution was put in place in 1991 and Compaoré won in a widely criticized election. He would go on to win three more elections, in 1998, 2005, and 2010.

The video below gives a brief history of Burkina Faso starting with Compaoré’s coup in 1987:


Recent Developments

Compaoré Steps Down

Given Burkina Faso’s history of coups and Compaoré’s near overthrow in 2011, it appeared likely he would step down at the end of his term in 2015. However, like many of the rulers before him, Compaoré sought to maintain his power. In October 2014, Burkina Faso’s National Assembly considered a bill to remove the term limit on the presidency, meaning that he could run for reelection the next year. This immediately led to protests, the burning of parliament, and clashes between protestors and the military, much like what happened in 2011.

Authorities eventually imposed martial law due to the violence, which included protestors taking over state-controlled media outlets and looting the president’s home. In addition to imposing martial law, the government dropped the vote to extend the term limit, yet the protests continued. In a final effort to ease tension, Compaoré dissolved his hand-picked government and promised more dialogue with the protestors. Finally, after all the other measures failed, Compaoré resigned from the presidency after 27 years in office. After his resignation, the military briefly took control before a panel appointed Michel Kafando, formerly a foreign minister and Burkina Faso’s ambassador to the United Nations, as interim president.

Leading up to the events that caused Blaise Compaoré to resign, Burkina Faso was in many ways primed for change. Despite recent economic progress and a large gold reserve, Burkina Faso was one of the poorest countries in the world. In fact, the situation became so dire in 2011 that it appeared a coup was imminent, as soldiers protested unpaid housing and food allowances. That conflict was likely only avoided because of a series of concessions offered by Compaoré. When the question of extending his term limit came up last year, Compaoré quickly ran out of options to appease protestors.

The video below details the fall of president Compaoré:

Recent Developments 

The coup, or forced resignation, of October 2014 fits into Burkina Faso’s long history of power struggles, but this time the driving force seemed to be public dissatisfaction rather than military intervention alone. However, in a unique twist, the interim government under President Michel Kafando was itself briefly overthrown in a counter-coup in September.

The brief coup was led by the Presidential Security Regiment, which remained loyal to Blaise Compaoré after his rule ended. Members of the regiment orchestrated the coup because of their support for the previous ruler and their fear that they would not be allowed to participate in the country’s upcoming elections. The coup lasted about a week before its leaders were taken into custody. They now face trial for trying to “stop the process to democracy and liberty for the people of Burkina Faso.” Pressure from the country’s military, the West African bloc, and, once again, the citizens of Burkina Faso themselves ensured that the takeover was only temporary. Elections remain scheduled for the end of November.

The accompanying video below details the end of the attempted coup:


Impact Abroad

While a controversial figure, Blaise Compaoré was also an invaluable mediator and his absence from the country may have important consequences for the region. Compaoré played a vital role in negotiations aimed at ending the violence in nearby Cote D’Ivoire and Mali. In 2013, the International Crisis Group implied that if he left power in 2015 it would be a significant loss for a strategically important point in West Africa.

Compaoré was also an important ally in the West’s fight against extremism in West Africa. Both the United States and France, Burkina Faso’s former colonial ruler, have troops stationed there. Following the protests, there were no immediate signs these troops would be removed or forced to leave. At the time of Compaoré’s resignation, it was also feared that his ouster could be a sign of things to come, a movement dubbed the “African Spring.” However, this concern never became a major issue.

Moving Forward

So what’s next for Burkina Faso? Some view the recent changes there as part of a larger movement, akin to the Arab Spring in North Africa, but possibly even larger. Zachariah Mampilly, an associate professor of Africana Studies at Vassar College, argues that the developments in Burkina Faso reflect a major trend in Africa. To Mampilly, the protests in North Africa and in places like Burkina Faso are not separate but intertwined over issues of inequality and perpetual poverty. In other words, the Arab Spring and the African Spring were not different movements, but rather one larger movement across Africa. While relatively little progress has been made, the emerging protests across the continent may be related.

On the other hand, some see the transition as far less altruistic. Immediately after Compaoré resigned, yet another military officer, Lieutenant Colonel Isaac Zida, was named Prime Minister of the interim government. The fact that a military man was once again involved raised questions over whether this was a change sparked by the people or just another coup. While many remain skeptical, others are hopeful as the country continues to prepare for elections at the end of the month.


Conclusion

After 27 years under the rule of Blaise Compaoré, Burkina Faso is undergoing a period of rapid political change. After Compaoré’s forced resignation, an interim government was appointed only to be briefly overtaken by yet another coup. While the interim government has regained its control, the country has a long way to go before stability can return. Although elections are scheduled for the end of the month, the military’s involvement in the interim government has led many to question whether it will continue to consolidate its power in the vacuum left by Compaoré.

If Burkina Faso can stave off future coups and actually hold elections, it will go a long way toward proving that it has made real strides. If and when that happens, the country must then find a way to cultivate its natural wealth while avoiding past pitfalls. If not, Burkina Faso could easily fall back into the cycle of coups that has plagued its history. If that turns out to be the case, the comparisons between what happened in Burkina Faso and the Arab Spring may, unfortunately, be quite fitting.



Resources

Encyclopedia Britannica: Burkina Faso

History World: History of Burkina Faso

Time: What You Need to Know About the Unrest in Burkina Faso

New York Times: Burkina Faso Charges General Who Led Failed Coup

World Politics Review: Compaoré’s Fall in Burkina Faso Signals Trouble for Africa’s ‘Presidents for Life’

Washington Post: Burkina Faso’s Uprising Part of an Ongoing Wave of African Protests

Al-Jazeera: Burkina Faso: Uprising or military coup?

New York Times: Violent Protests Topple Government in Burkina Faso

The Guardian: Burkina Faso Coup Leader in Custody

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Burkina Faso: A Troubled History and Looming Elections appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/burkina-faso-monumental-change-unlikely-place/feed/ 0 48634
Seeds of Hope: Inside the Doomsday Seed Vault https://legacy.lawstreetmedia.com/issues/energy-and-environment/seeds-hope-inside-doomsday-seed-vault/ https://legacy.lawstreetmedia.com/issues/energy-and-environment/seeds-hope-inside-doomsday-seed-vault/#respond Wed, 28 Oct 2015 16:16:56 +0000 http://lawstreetmedia.com/?p=48796

Why is a vault in Norway storing the world's seeds?

The post Seeds of Hope: Inside the Doomsday Seed Vault appeared first on Law Street.

]]>

October 19 marked the first time in history that the Svalbard Global Seed Vault was opened for a withdrawal. Often referred to as the “doomsday vault,” the seed vault was built to serve as a backstop against plant extinction, storing seeds for individual countries to ensure that plant diversity is not lost in a catastrophe. While weather disasters and global warming pose significant threats to the future of agriculture, the recent withdrawal was the result of the war in Syria: researchers sought additional seeds after the multi-year war significantly reduced their supply of drought-resistant wheat.

The idea of a last-resort vault holding the world’s seeds may surprise many, but the planning and implementation of the world’s seed bank have been a long and carefully thought-out process. Read on to learn about the process that created the vault and how it will be used in the future.


History

Seed Storage

The practice of protecting and storing seeds dates back as far as the start of agriculture itself. It began with farmers in the Fertile Crescent keeping a surplus of seeds, from harvest to harvest, in a variety of secured locations to ensure the survival of their crops.

The process of securing seeds continued in the modern era and the technology has advanced. Today, many seeds are kept in gene banks. According to the Crop Trust, there are currently over 1,700 such gene banks worldwide, which house a variety of different seed species for protection and research.

The groundwork for the international seed vault was laid with the International Treaty on Plant Genetic Resources for Food and Agriculture in 2001. The treaty was a product of the United Nations Food and Agriculture Organization and sought to promote international cooperation to preserve plant diversity. Because issues like climate change and water availability highlighted the threats to seed banks at the local level, researchers sought to create a universal vault that could protect diversity in light of the emerging threats.

The Vault

The seed vault was established in 2008 as a way to prevent the extinction of plant species used for crops. It earned its nickname, the “doomsday vault,” because it is designed to survive nearly any catastrophe imaginable, from earthquakes to nuclear war. The vault was built into the permafrost on an island far north of mainland Norway. While its nickname sounds like something out of a movie, the location of the vault appears to justify it.

The vault itself sits in Svalbard, Norway, on the side of a frozen mountain. In fact, the facility is so remote that it is closer to the North Pole than to mainland Norway. In addition to being remote, the facility is also heavily secured, with four locked areas between the vault and the outside world.

The site itself was also strategically designed. The vault is located in an area with low humidity and notable geological stability, which protects it from earthquakes. Additionally, it is high enough up in the mountain that there is little risk of flooding, even in the event of rising seas due to global warming. The seeds are contained in foil packages inside of boxes on shelves within the vault. The following video provides a look into the vault:


Usage

Since its opening in 2008, the vault has accumulated approximately 865,000 seed samples from seed banks all over the world. The facility has the capacity to hold up to 4.5 million samples; at roughly 500 seeds per sample, that works out to about 2.25 billion seeds in total. The seeds themselves are kept at minus 0.4 degrees Fahrenheit (minus 18 degrees Celsius), but in the event of a power outage the vault’s location in permafrost will preserve the seeds naturally for long periods.
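
As a back-of-the-envelope check on those capacity figures, here is a minimal sketch of the arithmetic; the 500-seeds-per-sample average is the commonly cited approximation, assumed here for illustration rather than taken from the vault’s own records.

```python
# Back-of-the-envelope check of the vault's stated capacity.
# The 500-seeds-per-sample average is an assumed approximation.

samples_capacity = 4_500_000   # maximum number of seed samples
seeds_per_sample = 500         # assumed average seeds per sample

total_seeds = samples_capacity * seeds_per_sample
print(f"{total_seeds:,} seeds")   # 2,250,000,000 -> about 2.25 billion
```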

While the recent withdrawal may seem like cause for concern, it is actually an example of why the vault was originally built. Researchers at the International Center for Agricultural Research in the Dry Areas (ICARDA) withdrew some of their seed deposits from the vault in order to move their research to new facilities in Lebanon. The withdrawal will help the researchers replenish their supply and continue to improve drought-resistant wheat, an essential crop for the region. Seed banks like this are able to improve and adapt crops to emerging agricultural challenges, and the vault provides a backup copy in case a seed bank’s supply is threatened.

The accompanying video details the Syrian withdrawal:

Who Runs the Vault?

The facility is jointly operated by the Norwegian government and the Global Crop Diversity Trust. Any individual, group, or country that deposits seeds must do so under nationally and internationally agreed-upon laws. The vault’s depositors, typically individual countries or seed banks, maintain full access to and control of their seeds while they are in the vault. The Nordic Gene Bank maintains public records of all the seeds deposited in the vault to help promote information sharing between depositors.


Future of Vault

While the vault serves an important purpose–holding a backup supply for crucial crops around the world–it has other benefits. Namely, in the current era of mono-cropping, many gene varieties have been lost as a few strains of crops such as corn or rice are grown in greater and greater amounts. While these strains may offer higher yields, their lack of genetic variation means a single blight could wipe them all out at once. The vault also preserves less commercially viable types of seeds, which keep the gene pool more diverse and, therefore, more resilient. Preserving this biodiversity may also be crucial to meeting growing food needs, as researchers will be able to develop more productive strains of plants to increase yields.

While research and genetic diversity are valuable pursuits, the vault was ultimately designed to protect the world’s crop supply in the event of a disaster. Arguably the greatest long-term threat is global climate change, which may change life on Earth as we know it and significantly alter global agriculture. The video below details the uses of the vault:


Conclusion

Because of its nickname, the “doomsday vault,” the Svalbard Vault is generally considered a resource of last resort. But while the first withdrawal, by ICARDA, does not indicate the end of the world, the vault is also able to support regional seed banks in their efforts to develop and improve crops. In fact, the recent withdrawal proves the facility is performing the way it should. Not only is the vault designed to safeguard seed diversity, it was also designed to serve as a depository for scientists who may call upon its stock in order to improve current crop varieties.

The reality of climate change has been overwhelmingly accepted by the scientific community. While the international community has not yet been able to address the impending challenges, scientists are working to build up the vault’s supply as those challenges mount. Not only are current strains of crops under threat from disaster, they also face risk from farmers’ own growing habits. As a small number of crops becomes ever more central to global agriculture, climate change may increase the need for diversity to meet ecological change.

Rarely do North Korea and the United States agree on any initiative, and even more rarely do these two governments receive support from large non-profits such as the Gates Foundation. The Svalbard Seed Vault is a notable exception. While people may still argue over the best way to preserve biodiversity, this endeavor marks an important point of progress in securing that goal.


Resources

Primary

Crop Trust: Svalbard Global Seed Vault

Additional

CNN: Arctic ‘Doomsday Vault’ Opens to Retrieve Vital Seeds for Syria

The New Yorker: Sowing for Apocalypse

NBC News: ‘Doomsday’ Seed Vault: The Science Behind World’s Arctic Storage Cube

Live Science: Science Behind World’s Arctic Storage Cube

Wired: That Arctic Seed Vault isn’t Just There for Doomsday

Washington Post: The ‘Doomsday’ Seed Vault That Preserves the Food of the Past and Ensures its Future

Union of Concerned Scientists: Global Warming Impacts

The Guardian: The Doomsday Vault: The Seeds That Could Save a Post-Apocalyptic World

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Seeds of Hope: Inside the Doomsday Seed Vault appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/energy-and-environment/seeds-hope-inside-doomsday-seed-vault/feed/ 0 48796
Why is Russia Getting Involved in the Middle East? https://legacy.lawstreetmedia.com/issues/world/russias-role-middle-east/ https://legacy.lawstreetmedia.com/issues/world/russias-role-middle-east/#respond Tue, 20 Oct 2015 20:05:00 +0000 http://lawstreetmedia.com/?p=48546

Russia expands its influence

The post Why is Russia Getting Involved in the Middle East? appeared first on Law Street.

]]>
Image courtesy of [Global Panorama via Flickr]

In September, Russian forces began a controversial air campaign in Syria, sharply increasing the nation’s involvement in the Middle East. While some leaders have welcomed Russia’s increased involvement, many in the West have been skeptical of President Vladimir Putin’s motives. As Syrian dictator Bashar al-Assad’s position weakens amid an ongoing civil war, Russia has stepped in and, with Iran’s help, is ensuring he stays in power.

The situation in Syria is becoming increasingly complex as the Islamic State seeks to expand its control in the midst of a civil war between Syrian rebels and the Assad regime. But Russia’s intervention in Syria is only part of an emerging trend for the country, as it seeks to exert its influence outside of its borders. Recent developments have caused many to ask why Russia is intervening and what it hopes to gain. Read on to see what Russia has been doing to grow its influence and expand its role in the Middle East.


History in the Middle East

Russia’s intervention in Syria is not the first time that the country has been involved in the Middle East. In fact, it has a long history in the region. The Soviet Union was a major arms supplier to the Arab forces that fought against Israel in the 1973 Arab-Israeli War, also known as the Yom Kippur War. The USSR later invaded Afghanistan in 1979, occupying the country for nearly 10 years. In 1990, it lost a key ally in the region when what was then South Yemen merged with the North. Growing U.S. influence further eroded the Soviet Union’s position, particularly after the success of Operation Desert Storm, a significant U.S. victory over Saddam Hussein’s Iraq. Shortly afterward, the Soviet Union collapsed and its influence in the Middle East largely receded.

The following video depicts Russia’s difficulties in Afghanistan:


Russia’s Return

Russia worked its way back into the region as an alternative arms supplier to the United States. Many Middle Eastern countries saw Russia’s more lenient stance on human rights as an appealing reason to do business with the country. This shift allowed Russia to attract many Middle Eastern buyers away from their traditional supplier, the United States, which was quick to abandon authoritarian leaders during the Arab Spring.

While the Arab Spring helped Russia increase its arms exports, the region was already an important market for Russia. Between 2006 and 2009, Russia’s largest arms buyers were in the Middle East. While the Arab Spring increased demand for weapons in the region, Russia did not immediately expand its sales to new countries. However, its traditional customers did significantly increase their demand–most notably Syria, which increased its purchases by 600 percent.

The breakthrough for Russia came later, in the aftermath of the Arab Spring, as countries that were normally loyal customers of the United States began looking to Russia. This movement started with Egypt, whose relationship with the United States soured during the Arab Spring and the subsequent overthrow of the democratically elected government of Mohamed Morsi. Seeing an opportunity, Russia secured an arms deal with Egypt. A potential deal between Russia and Saudi Arabia, arguably the United States’ closest ally in the region outside of Israel, highlights Russia’s ambitions for its weapons industry. However, the Russians also supply Iran, Saudi Arabia’s most significant regional enemy.

The video below details Russia’s displacement of the US in formerly pro-Washington areas:


Current Operations

In addition to expanding its weapons exports in the Middle East, Russia recently started conducting military strikes in Syria, making the ongoing civil war even more complicated. At the end of September, Russia began a controversial airstrike campaign that has largely helped the Assad regime by targeting Syrian rebels. These actions have affected Russia’s relationships with several key nations in the region, as well as with observers in the West.

The accompanying video provides an in-depth look at Russia’s actions in the Middle East:

Turkey

Russia’s relationship with Turkey is potentially its most complicated. Turkey relies on Russia, as well as Iran, for energy and trade, which amounted to $31 billion in 2014. The two countries’ leaders are often compared to each other, with President Erdogan’s leadership style and determination to remain in office reminding many of Putin.

However, the relationship has been strained recently with Russia’s bombings of anti-Assad rebels and its repeated violations of Turkish airspace. There is also a historical legacy hanging over the two countries dating back to the time of the Ottoman Empire, which repeatedly fought the Russian Empire.

Syria

Even before Russia’s recent intervention in Syria, the two were close allies. Their relationship has existed for years, based initially on military contracts between Russian arms dealers and Syrian buyers. It was strengthened when Russia, along with China, used its U.N. Security Council veto to block resolutions aimed at pressuring President Assad to step down. Since then, Russia has been Syria’s strongest backer outside of the Middle East. Russia also successfully negotiated the transfer and destruction of Syria’s chemical weapons in 2014, defusing a particularly controversial issue with the United States.

All of this serves as the backdrop for Russia’s recent incursion into Syria and its civil war. It started with Russia sending advisors and fighter planes but has grown to include ground troops, artillery, and ships stationed off Syria’s coast. Russia’s intervention has been particularly controversial because of the targets it has chosen to attack. While Russia initiated its air campaign with the stated intention of focusing on ISIS and fighting terrorism, many of the strikes have instead benefited the Assad regime.

Iran

Russia’s relationship with Iran is also particularly complex. Recently, Russia played an important role in securing the deal to stop Iran’s nuclear weapon program. But after the deal, Russia quickly unfroze an $800 million agreement–previously suspended during negotiations–to supply Iran with a missile defense system. Additionally, it approved an oil-for-goods deal, which allows Russia to buy up large amounts of Iranian oil in exchange for food and other goods. But oil is also an area that could create conflict between the two countries. Iran’s now-unsanctioned supply of oil, if dumped on the market, could lower the international price of crude oil even further. Lately, the falling price of oil has hurt Russia’s economy, particularly in combination with the sanctions imposed after its annexation of Crimea from Ukraine.

Since the Iranian Revolution, the two nations have been joined by their desire to keep the West at a distance. Even as sanctions are lifted on Iran, this relationship is likely to endure, allowing Iran to continue its anti-western rhetoric. Both nations are also united in strong support for the Assad regime in Syria. However, this shared sentiment flies in the face of a longer history, one in which Russia either tried to acquire Iranian territory or intervened in the country’s affairs, as it did during the Iran-Iraq War in the 1980s. More recently, Russia has continued to arm Iran’s regional enemies and has gone along with American sanctions on the nation.

Iraq

Along with its collaboration with Iran and the Assad regime in Syria, Russia also recently agreed to share information with Iraq in its fight against ISIS. Doing so has put the United States in a challenging situation: it has been skeptical of Russia’s increased presence in the region, but it has also advocated for international action against the Islamic State.

Russia also has a history of supporting Iraq, most notably in the form of funding during the Iran-Iraq War. Since the American invasion in 2003, it has worked to normalize relations with the new government, especially in order to re-secure lucrative energy contracts.


Conclusion

So why is Russia wading back into the Middle East, especially given its history in the region? For most countries, an interest in the Middle East generally relates to the wealth of oil found there, but for Russia it is more complicated than oil alone. While Russia has worked to secure energy contracts in the region, it is also one of the world’s leading producers of crude oil and is widely regarded as having the largest proven reserves of natural gas. Traditionally, the Middle East had been a major market for Russian weapons, but as the politics of the region changed, the United States took hold of the market. In the wake of the Arab Spring, Russia has been working to expand its weapons exports while also strengthening ties to regional allies like Syria and Iran. The revenue from arms sales is all the more important given the growth of sanctions from the West and the falling price of oil, a crucial source of revenue for Russia.

While a more involved position in the region may help Russia economically, whether through energy or weapons, that still does not seem to be the major impetus for its intervention in Syria. Ultimately, Russia’s growing role in the Middle East may simply be a product of its efforts to grow its influence around the world. Russia seems to be positioning itself as an effective alternative to the United States, and its recent actions best reflect that goal. This move, while viewed critically in the West, has been welcomed by some leaders in the Middle East as a counterweight to American influence. Russia’s recent involvement in Syria, combined with its important role in the Iran nuclear deal, lends it even more regional significance.



Resources

The National Post: Why is Russia expected to further increase its presence in Syria?

Washington Post: Russia’s move into Syria upends U.S. plans

BBC: Russia in the Middle East: Return of the Bear

Al-Monitor: New Russian arms deals could shake up Mideast market

New York Times: Russia’s military actions in Syria cause rift with Turkey

New York Times: For Syria, Reliant on Russia for Weapons and Food, Old Bonds Run Deep

Wall Street Journal: Removal of Chemical Weapons from Syria is completed

CNN: NATO Secretary General questions Russia’s aim in Syria

The Washington Post: Russia-Iran relationship is a marriage of opportunity

The United States Institute of Peace: Iran and Russia

Financial Times: Iraq and Russia to collaborate in fight against ISIS

World Politics Review: Russia-Iraq Relations


Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Why is Russia Getting Involved in the Middle East? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/russias-role-middle-east/feed/ 0 48546
Gentrification: What is it Doing to Our Urban Centers? https://legacy.lawstreetmedia.com/issues/business-and-economics/gentrification-transforming-urban-centers-isnt/ https://legacy.lawstreetmedia.com/issues/business-and-economics/gentrification-transforming-urban-centers-isnt/#respond Tue, 06 Oct 2015 20:46:56 +0000 http://lawstreetmedia.com/?p=48434

What's going on in our cities?

The post Gentrification: What is it Doing to Our Urban Centers? appeared first on Law Street.

]]>
"Urban Landscape - Gentrification in the East Harlem" courtesy of [Carlos Martinez via Flickr]

You’ve probably heard the term gentrification before–the process by which college-educated, higher-income individuals move into low-income parts of a city in order to live closer to cultural centers. Observers often note that this process brings new development and better government services, but they also highlight how it can displace a community’s existing residents.

While that narrative is pretty straightforward and easy to grasp, it is important to ask whether gentrification is responsible for many of America’s urban problems. Read on to see what the arguments for and against gentrification are and what studies actually say about the process. Is gentrification as bad as people make it out to be or are other developments just as problematic?


What is Gentrification?

Like any word, gentrification has a simple definition, but in practice the process tends to be much more complicated. The formal definition of gentrification is: “the process of renewal and rebuilding accompanying the influx of middle-class or affluent people into deteriorating areas that often displaces poorer residents.”

History

Prior to gentrification, there was white flight–the phrase used to describe the mass migration of whites out of inner cities and into suburbs. White flight started in the mid-20th century and continued for decades to create the many suburbs that we have today. In the wake of this migration, cities like Washington, D.C. took on a majority-minority character as minorities moved downtown and white people left for the suburbs. Gentrification is typically used to describe the reverse of this process, with affluent people, often white, trickling back into inner cities.

While dilapidated or unused properties have always been refurbished and repurposed over time, the term gentrification itself traces its roots back to 1960s London. In 1964, a British sociologist named Ruth Glass coined the term to describe what was happening in a run-down neighborhood of London. Working-class immigrants were being replaced by professional types, who wanted to be closer to the cultural centers of life. This follows the narrative of development associated with gentrification today as young, educated people seek affordable rent in new parts of a city.

According to the typical story of gentrification, this group is then followed by a second wave that is usually composed of young professional types who move in once a neighborhood becomes more established. After the second wave, the neighborhoods themselves also begin to improve aesthetically as more money pours in. New residents create a stronger tax base and increase investment incentives for companies. Infrastructure is repaired and rebuilt while new construction is started. All this new activity begins to raise the property value of everything from the corner store to the apartment complex down the street. As a result, the original low-income, typically minority residents are essentially priced out of their own communities and forced to leave for somewhere more affordable. The video below looks at several aspects of gentrification and how it is normally understood:


Who does gentrification affect?

The major criticism of gentrification is that the process boils down to affluent whites pushing poor minorities out of their own neighborhoods, in an effort to return to the inner city that their parents and grandparents abandoned years earlier. However, when you look at the evidence and research on gentrification, that narrative doesn’t always hold up.

According to several recent studies by economists and sociologists, the narrative of gentrification, as it is generally understood, is not always accurate. On average, there is little evidence to suggest that more gentrification leads to greater displacement among original residents. This is not to say that no one ends up being displaced, but generally speaking, displacement is not a significant consequence of gentrification.

In fact, for those who stay in their neighborhoods, regardless of race, gentrification can actually have positive effects. While rents do rise as property taxes increase, remaining residents also gain opportunities, like better jobs. Indeed, the whole narrative associated with gentrification is called into question, as the studies also showed that whites are not very likely to move into historically minority neighborhoods at all.

Regardless of whether gentrification is as bad as some people believe, a backlash against the perceived trend has already begun. There are examples in Brooklyn and Philadelphia, but arguably the most notorious backlash occurred against a London store selling ordinary cereal at high prices in a low-income neighborhood, which led to boycotts and protests. In a somewhat surprising turn of events, this backlash against gentrification has spawned a counter-backlash, with those accused of gentrification standing fast in the face of criticism.


What Does It All Mean?

While gentrification can affect poor communities, displacement is generally not the most significant problem; the new investment and diversity actually tend to improve a community. The real problem is that the process of gentrification may only reach certain communities, leaving others with extremely high rates of poverty.

In a study of Chicago’s poor neighborhoods, Harvard researchers found that gentrification only occurred, or continued to occur, in neighborhoods where the racial composition was at least 35 percent white. They found that the process would actually stop in places where 40 percent or more of the residents were black. In other words, affluent whites may not be forcing poor black residents out of their neighborhoods; rather, they are bypassing those neighborhoods completely. This is not to say that an influx of wealthy whites simply improves poor neighborhoods; rather, historically black neighborhoods tend to be neglected when it comes to new investment and development. Not only does this challenge the conventional perception of gentrification, it also reinforces an older and more sinister problem in the United States: segregation.

The continuation of segregation is being perpetuated not only by whites returning to the inner city, but also by black migration out of cities. Recent evidence suggests that minority populations are increasingly moving to the suburbs. While individual neighborhoods may be integrating, new suburban trends are actually increasing segregation: on the suburban and town level within metropolitan areas, racial divisions are growing. The following video gives a look at segregation in the U.S. and the problems it leads to:

While the continuation of segregation is bad enough, it has yet another negative aspect associated with it. Since gentrification and other processes of development are slow, if not completely absent, in historically poor neighborhoods, those neighborhoods remain poor and disadvantaged. For all its own potential evils, gentrification may simply expose the familiar problems of segregation and perpetual poverty that are still going unaddressed.


The Government’s Role in Gentrification

The idea of outsiders coming into an inner-city neighborhood with cash and plans for improvement is not new, and it had a name before gentrification: urban renewal. Urban renewal, unlike gentrification, was a product of government policy intended to revitalize various sections of cities. Housing reform movements began as early as 1901 but really gained momentum in the 1930s, when zoning ordinances were passed separating housing and industrial areas.

The movement was crystallized in Title I of the Housing Act of 1949, which established the urban renewal program and promised to eliminate slums, replace them with adequate housing, and invigorate local economies. The act failed, however, in one of its other main goals: addressing segregation. Developers’ decisions to build high-income housing, large development projects, and highways that physically divided cities ensured the practice would continue. This disproportionately affected minority residents, many of whom were forced to move, often to more crowded or more expensive areas.

The government took another try at housing with the Fair Housing Act of 1968, which was meant to stop segregation in neighborhoods at all levels. Additional measures were put in place over the years such as the Housing and Community Development Act of 1974, which replaced the emphasis on the demolition of decaying urban areas with rehabilitation.

As these problems persist, and with racial strife continuing to plague the United States, President Obama sought to create legislation addressing housing once again. Under his plan, announced in July, data would be compiled and then given to local authorities, who could use it to more accurately distribute Housing and Urban Development funds. These efforts are intended to end the negative outcomes associated with gentrification, including concentrated poverty and segregation. The accompanying video details Obama’s plan to address segregation:



Conclusion

Gentrification is a well-known issue in the United States, but a closer look reveals a much more complicated trend. While displacement and housing costs are significant problems for local governments, gentrification might not always be at fault. The traditional gentrification narrative says that as wealthy people move into poor urban areas, housing prices and living costs rise, displacing low-income residents. Emerging research challenges that narrative, while noting that many low-income communities still face significant challenges. And even as people start to question the traditional understanding of gentrification, backlashes against inner-city development and its perceived effects continue.

Studies show that gentrification does not cause displacement at the rates many people assume, but the research does highlight new trends in segregation. While inner-city communities are becoming more diverse, urban housing prices in general are going up. As a result, many low-income residents are moving to suburbs, which in turn face further racial division. Historic segregation and displacement from urban renewal have created areas of concentrated poverty, which have grown consistently over the past decade. This poverty also tends to disproportionately affect minorities. According to CityLab, “One in four black Americans and one in six Hispanic Americans live in high-poverty neighborhoods, compared to just one in thirteen of their white counterparts.” While most people think of the inner city when they think of poor neighborhoods, poverty and segregation are actually growing in many U.S. suburbs. Overall, the face of many American cities and towns is changing significantly.



Resources

Regional Science and Urban Economics: How Low Income Neighborhoods Change

US2010 Project: Separate and Unequal in Suburbia

Slate: The Myth of Gentrification

The Atlantic: White Flight Never Ended

City Lab: The Backlash to Gentrification and Urban Development has Inspired its Own Backlash

Harvard Gazette: A New View of Gentrification

The Hill: New Obama housing rules target segregated neighborhoods

Curbed: As ‘Gentrification’ Turns 50, Tracing its Nebulous History

Encyclopedia.com: Urban Renewal

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Gentrification: What is it Doing to Our Urban Centers? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/gentrification-transforming-urban-centers-isnt/feed/ 0 48434
Defining Japan’s Place in the World https://legacy.lawstreetmedia.com/issues/world/land-stagnant-sun-defining-japans-place-region/ https://legacy.lawstreetmedia.com/issues/world/land-stagnant-sun-defining-japans-place-region/#respond Tue, 06 Oct 2015 20:34:54 +0000 http://lawstreetmedia.com/?p=48241

With a stagnant economy, Japan loosens limits on its military.

The post Defining Japan’s Place in the World appeared first on Law Street.

]]>
Image courtesy of [James Cridland via Flickr]

After 70 years of pacifism, the Japanese parliament recently voted to allow the use of military force under specific conditions, potentially moving the country away from a longstanding policy that has guided it since the end of World War II. While the decision immediately drew an outcry of criticism from Japanese citizens, it was strongly supported by Japan’s conservative prime minister, Shinzo Abe. The new legislation will not completely do away with the country’s policy of pacifism, but it does allow for the use of military force under a set of narrow circumstances. After years of stagnant economic growth, the decision reflects the efforts of conservatives in the country’s parliament to expand Japan’s role in the region. Read on to see how Japan currently defines itself and what that will mean for its future.


Japan’s Economy

The strength of Japan’s economy has been central to defining its place in the region. Japan was the first in Asia to modernize along European lines, starting at the end of the 19th century. Adopting a Prussian-style, government-dominated economy, Japan became a powerhouse up until WWII. Following the war, while Japanese industries struggled to recover, the nation was helped by a large surplus of young educated workers and free trade.

Then, starting in the 1960s, Japan began its economic miracle, relying on exports to become the world’s second-largest economy, behind only the United States. That miracle, however, came to an end in the early 1990s, and from 1992 onward Japan’s GDP growth remained largely stagnant. Despite a brief period of growth in the 2000s, those gains were erased by the 2008 global recession.

Japan’s most recent effort to reverse this trend was the election of Shinzo Abe as prime minister in 2012. After three years and a large economic stimulus, there has been very little to show for it; despite that stimulus, the country has even slipped into brief periods of recession. The stimulus has also come with significant costs–Japan’s national debt is currently 240 percent of its GDP, the highest ratio in the world.

Demographics

Part of Japan’s economic problem is the demographics of its workforce. On average, Japanese citizens have the longest life expectancy of any people on the planet. While that is certainly good for the people of Japan, coupling it with the country’s extremely low birth rate creates a significant issue for the Japanese workforce: while life expectancy is going up, the birth rate is going down. This means there are fewer young workers to replace older retirees, so each young worker must support more retirees, along with themselves and their families. The best way to visualize it is as an inverted pyramid. The problem is only made worse by Japan’s relatively young customary retirement age of 60. The video below details the issues plaguing Japan demographically:
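To make the “inverted pyramid” concrete, here is a minimal illustrative sketch of an old-age support ratio–the number of working-age people available to support each retiree. The cohort sizes below are hypothetical round numbers chosen for clarity, not official Japanese statistics.

```python
# Illustrative old-age support ratio: working-age people per retiree.
# The cohort sizes below are hypothetical round numbers, not official
# Japanese statistics.

def support_ratio(working_age: int, retirees: int) -> float:
    """Return the number of workers available to support each retiree."""
    return working_age / retirees

# A "pyramid" population: many workers, comparatively few retirees.
print(support_ratio(working_age=60_000_000, retirees=10_000_000))  # 6.0

# An "inverted pyramid": fewer workers, more retirees. Each worker now
# carries a much larger share of the support burden.
print(support_ratio(working_age=40_000_000, retirees=25_000_000))  # 1.6
```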

Foreign Relations

China

Up until the end of the 19th century, Japan had been under the influence of China, even adopting elements of its customs and writing system. Beginning in the 1890s and continuing into WWII, however, the roles were reversed as Japan became the dominant power. Japan earned long-lasting infamy and hatred among the Chinese when its army killed and raped hundreds of thousands of people.

Japan’s current relationship with China can be characterized as contentious, particularly in light of China’s growing military and economic power. In this case, Japan is the traditionally dominant power being overtaken by an ascendant China. The following video shows the difficult relationship between the two nations:

North and South Korea

Many of the complications with Japan’s relationships with its neighbors stem from its deep history in the region. For thousands of years, there existed an exchange of ideas and customs between Korea and Japan. But in 1910, Japan annexed Korea, holding the territory as a colony until its defeat in World War II. This period involved particularly harsh rule and oppression from Japan, which is the source of strong resentment that still exists today. Despite Japan’s policy of pacifism adopted after World War II, resentment from past conflict continued to shape Japan’s relations with its neighbors.

Japan’s relationship with North Korea is also marked by wariness, much like its relationship with China, though for different reasons. Unlike China, an economic and territorial rival, North Korea’s danger lies in its instability. Couple that instability with nuclear capability and repeated missile tests near Japan, and North Korea presents a very dangerous and unpredictable potential adversary close by. Recently, Japan has been part of the six-party talks about North Korea’s nuclear program. As North Korea has proven committed to the program, Japan has employed sanctions, further distancing itself from its neighbor.

Despite their similarities, Japan and South Korea have had a strained relationship since the early 1900s. Both countries have democratic, market-based economies and share several interests in the region–both are wary of China’s growing role and both are close allies of the United States. Scholars argue that formal relations between the two governments are largely shaped by public opinion. In their book, The Japan-South Korea Identity Clash, Brad Glosserman and Scott Snyder argue:

We conclude that the threat-based and alliance-based evaluations of conditions for Japan-ROK [Republic of Korea] cooperation cannot overcome the psychological and emotional gaps in perspective on Japan-ROK relations, chasms that are reflected in public opinion in both countries. For this reason, this study has chosen to utilize public opinion data as a way of getting into the heads of the publics on both sides and more deeply understanding the nature and parameters of identity-related issues that have inhibited development of the relationship.

United States

Prior to its defeat in World War II, Japan was a staunch rival of the United States. Japan’s imperial ambitions in Southeast Asia conflicted with American interests in the region and threatened the United States’ Open Door policy in China. But after the war, Japan rebuilt on the United States’ terms and has since become one of its most important allies.

In an almost ironic twist, relations between the United States and Japan are as good as, if not better than, Japan’s relations with any of its neighbors, which is significant given the legacy of WWII. Since the end of the war and the American occupation, Japan has been a close ally–it now hosts a major U.S. military base on the island of Okinawa. Japan is also a major market for U.S. goods and an important regional partner for diplomacy.


Japan’s Military

The Cost of War

Japan’s movement away from pacifism also has the potential to affect the nation’s bottom line. Although next year’s military budget of roughly $41.7 billion represents a relatively modest increase of 2.2 percent, the added spending matters quite a bit in the context of the country’s economy, as Japan is mired in extreme debt.

Along with the rising costs of an expanded military, there are the effects on Japan’s weapons industry. Last year, the country allowed its weapons manufacturers to export military weapons for the first time; prior to 2014, companies were only allowed to sell to the Japanese military. But it remains unclear whether this move will actually benefit these manufacturers, which are usually parts of much larger corporations, because they have never had to compete for business before. While exposure to more markets may seem like a good thing, removing the protections in place may not provide many short-term benefits.

Nuclear Weapons

When we talk about a less pacifist Japan, the topic of nuclear weapons also comes up. Most of this rhetoric comes from China, Japan’s chief rival, which suggests that with Japan’s advanced nuclear energy expertise, building a weapon would be very easy. The technical part of that assertion is certainly true: most experts believe that, with their know-how and inventory of radioactive material, the Japanese could likely build a nuclear weapon in a matter of months. However, the idea that Japan would do so seems unlikely for several reasons, including a nuclear guarantee from the United States, a strict commitment to the Nuclear Non-Proliferation Treaty (NPT), and growing criticism of nuclear technology in Japan following the meltdown disaster in Fukushima. Finally, the historical significance of nuclear weapons still resonates in Japan, the only nation ever attacked with them. The following video gives a complete analysis of Japan ending its policy of pacifism:


Conclusion

Like its economy, Japan itself seems caught in a malaise that threatens to affect its role within the region. Japan’s economy remains in neutral despite the election of Shinzo Abe, who pledged to turn things around with government spending and other innovations. Diplomatically, relations with its Asian neighbors remain tense, especially with China and the Koreas.

This may explain Japan’s recent decision to move away from its 70-year-old policy of pacifism. The decision remains unpopular at home, however, and raises concerns about spending. It also threatens to further aggravate tensions with Japan’s neighbors, who still carry memories and grudges from World War II.

Japan’s role in the region seems to be like that of many countries in their respective spheres: not as big as it thinks it is or should be. Perhaps becoming a more assertive military power is a way for Japan to bolster itself, especially in the face of a rising China. It may also simply be a reaction to the arms race currently underway in Asia, itself set off by China’s rise.


Resources

CNN: Assertive Japan Poised to Abandon 70 Years of Pacifism

BBC: China & Japan: Rival Giants

Stanford: Learning from the Japanese Economy

The National Interest: The Demographic Timebomb Crippling Japan’s Economy

The Heritage Foundation: Japan Needs Real Economic Reform

Wall Street Journal: Japan Military Spending in Cross Hairs

CNN: Pacifism bill: Why Japan Won’t Build a Nuclear Weapon Quickly

The ASAN Forum: North Korea in Japan’s Strategic Thinking

Department of State: U.S. Relations with Japan

Voice of America: American History: US-Japan Relations Before World War Two

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Defining Japan’s Place in the World appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/land-stagnant-sun-defining-japans-place-region/feed/ 0 48241
What’s in Your Food?: A Look at Regulating the Food Industry https://legacy.lawstreetmedia.com/issues/health-science/you-are-what-you-eat-what-is-that-exactly/ https://legacy.lawstreetmedia.com/issues/health-science/you-are-what-you-eat-what-is-that-exactly/#respond Mon, 05 Oct 2015 01:11:34 +0000 http://lawstreetmedia.wpengine.com/?p=48045

A look at the food industry's newest regulations.

The post What’s in Your Food?: A Look at Regulating the Food Industry appeared first on Law Street.

]]>
Image courtesy of [Paul Swansen via Flickr]

A recent report from several environmental and consumer advocacy groups graded 25 of the leading fast food chains on the use of antibiotics in the meat they serve, and for anyone concerned about antibiotics in the food supply, the results are illuminating. The report yielded some surprising ratings for America’s most popular fast food restaurants, giving only two an A grade.

We often hear about the use of things like antibiotics, hormones, and other additives in our food, but not everyone knows exactly what effects they have on our health. Read on to learn more about what’s in your food and what’s being done to make sure everything we consume is healthy.


Antibiotics

When most people think about antibiotics, the first thing that comes to mind is the medicine you get for strep throat or an ear infection, so how are antibiotics important in terms of what we eat? The answer lies at the beginning of food production, when farmers raise livestock for food.

The purpose of antibiotics is to kill harmful bacteria, which is important both for humans and for animals in the food supply. The problem is that they do not necessarily kill all the bacteria. For example, an antibiotic may kill 99 out of 100 bacterial cells, but the one percent that survives is immune to the drug and goes on to reproduce. Over time, the resistant bacteria multiply until the antibiotic is no longer effective and a stronger medicine is needed. Resistant bacteria may also be transmitted to humans through the food they eat, which can lead to significant health concerns. The growing population of resistant bacteria could pose a major health risk as antibiotics become less and less effective.
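As a rough illustration of that selection process, the toy simulation below applies the 99-out-of-100 kill rate described above to a mixed population of susceptible and resistant bacteria and lets the survivors multiply between doses. The starting population sizes and the regrowth factor are assumptions chosen purely for illustration, not measured values.

```python
# Toy model of antibiotic selection pressure. Each dose kills 99% of the
# susceptible cells but none of the resistant ones; survivors of both
# kinds then multiply. Starting numbers and the regrowth factor are
# illustrative assumptions.

susceptible = 10_000_000   # drug-susceptible cells
resistant = 1              # a single resistant mutant
KILL_RATE = 0.99           # "kills 99 out of 100" susceptible cells
REGROWTH = 50              # how much survivors multiply between doses

for dose in range(1, 6):
    susceptible = susceptible * (1 - KILL_RATE) * REGROWTH
    resistant = resistant * REGROWTH
    share = resistant / (susceptible + resistant)
    print(f"after dose {dose}: resistant share = {share:.3%}")
```

Even though the resistant strain starts as a single cell in this sketch, each round of treatment shifts the balance in its favor, which is the dynamic that makes routine dosing of healthy animals worrisome.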

Farmers treat their animals with antibiotics for several reasons, but most importantly they do it because they want to keep the animals healthy. The use of antibiotics has also led to larger and heavier animals, which means more profit. The central issue with the use of antibiotics in livestock is that they are often used when they are not actually needed. While few argue that antibiotics should never be used on animals, what most people object to is the use of “sub-therapeutic” doses, which are given when an animal is not sick. Over time, these doses select for resistant bacteria, which may be transferred to humans through the meat they eat.

FDA regulations instruct farmers to use antibiotics only when an animal is sick or when there is an unusually high risk of disease, but that is not always what happens. According to the report by Friends of the Earth and the Natural Resources Defense Council mentioned above, 70-80 percent of the antibiotics used in the United States are given to animals. The report found that 20 of the top 25 fast food chains received a failing grade for their antibiotics policies.

Hormones

Like antibiotics, hormones are used to make animals bigger and stronger. But the hormones contained in the meat people eat are passed along to them as well. The FDA approves and regulates all hormones used in food production. The amounts allowed in food are determined by the FDA through research and are supposed to be well below the levels that naturally occur in the human body, thereby preventing any negative effects.

When it comes to hormones, there are a lot of gray areas in terms of their health effects. The two largest concerns associated with their use are a possible increased risk of cancer and the early onset of puberty in children. Existing studies suggest that lifetime exposure to hormones like estrogen can be linked to a greater risk of cancer, and hormones previously used in animals have been tied to cancer risk: diethylstilbestrol (DES) was used in the 1960s and was connected to a heightened risk of certain forms of cancer, and its use ended after that connection was discovered. Still, the amount of estrogen present in food is significantly lower than the levels that naturally occur in our bodies, and there is currently not sufficient evidence to draw a clear connection between growth hormone use in animals and an increased risk of cancer. Hormones given to animals are essential for growth and development, and the FDA regulates them to ensure that their presence in our food remains at safe levels. According to the FDA:

People are not at risk from eating food from animals treated with these drugs because the amount of additional hormone following drug treatment is very small compared with the amount of natural hormones that are normally found in the meat of untreated animals and that are naturally produced in the human body.

One issue that has a stronger connection to the use of hormones in the food supply is the early onset of puberty in children. On average, children have been starting puberty earlier than in the past, which some scientists have linked, at least in part, to the presence of hormones in food. While food still contains relatively low levels of these hormones, their mere presence may cause children to reach puberty earlier. Although studies have found a connection, the use of hormones, like many food-related health issues, still requires further research to clarify the link.


Pesticides

While antibiotics and hormones are designed to fatten animals, pesticides serve a similar purpose for fruits and vegetables. Pesticides do not increase the size of fruits or vegetables, but they do help ensure their survival from threats such as insects or weeds. They also kill potentially harmful organisms such as mold or fungus which can grow on foods.

Due to the widespread use of such chemicals on most foods, the EPA and similar agencies in other countries have set tolerances for the amounts allowed in food. These tolerances are set after risk assessments that examine the potential health effects of individual pesticides. Once a tolerance has been set, it is enforced by the USDA and FDA, or by a corresponding agency in another nation. Yet several questions remain about the use of pesticides and their effects on humans and the environment. Pesticides may not be used on foods until they have gone through an assessment, and they are occasionally re-evaluated to make sure the set tolerance remains appropriate. Re-evaluating pesticides is the primary way to address issues with tolerances and new information about health effects.


Preservatives

Preservatives, like other food additives, are in foods to serve a purpose beyond increasing profits, but they also come with their own risks. Generally speaking, the purpose of preservatives is to make food last longer and prevent rotting.

While preservatives help keep things fresh, they may also harm the people who ingest them. A recent study by immunologists at Georgia State University found that certain preservatives can erode a protective lining in the colon, which can lead to inflammation and even change the makeup of the bacteria located there. These changes have been linked to higher rates of inflammatory bowel disease and obesity.

Preservatives are regulated by the FDA, which has the final say on which additives are approved for consumption. While the FDA regulates the use of preservatives, most additives are never actually tested by the agency. This is because the FDA uses the GRAS labeling system for many things added to food. GRAS, an acronym for “Generally Recognized As Safe,” allows producers to add substances to food according to established practices without requiring pre-approval from the FDA. While this system expedites the rate at which new products can go to market, there have been several instances in which the FDA approved an additive only to revoke that approval later, when the additive proved to be carcinogenic. Examples include cyclamate, safrole, saccharin, and most recently, trans fats.

The video below looks at a variety of food additives and their uses:


The Future of Food

Organic Food, Buying Local, and Farmer’s Markets

According to the United States Department of Agriculture (USDA), organic food is “produced without antibiotics, hormones, pesticides, irradiation or bioengineering.” By its very definition, then, organic food has the potential to alleviate many of the alleged effects of added ingredients simply because it does not contain them. What’s more, becoming certified organic by the USDA is a long, difficult, and expensive process, so the label carries a certain amount of clout. In 2014, combined sales of organic food in the United States were up 11.3 percent from the previous year, to $39.1 billion–approximately five percent of the entire food market.

The rise in organic food sales can, at least in part, be attributed to the rise of farmers’ markets. Between 2006 and 2014, the total number of farmers’ markets in America rose 180 percent, to 8,268. Farmers are also turning to selling to local restaurants and distributors, and even directly to local schools. Aside from offering food that does not contain the additives listed above, doing so has the added benefit of reducing fuel costs and pollution.

The Food Industry

Another major change is on the part of the traditional food industry, from grocery stores to restaurants. With organic food emerging as a major trend, supermarkets have been quick to respond. Chains such as Whole Foods have been among the most successful stores, as their business model centers on selling these types of foods.

Fast food, on the other hand, is under a tremendous amount of pressure to use healthier food with fewer additives. The industry has taken some important steps, but based on evaluations like the report mentioned at the beginning of this piece, it has a long way to go. Even before that report’s release, efforts had been underway to rid menus of additives, GMOs, and the equally demonized high-fructose corn syrup. Industry leaders like Panera and Chipotle stopped using these ingredients long before the issue returned to the news.

Like the use of additives, organic foods come with their own costs and benefits. The first is price: organic food is notoriously more expensive than its non-organic counterparts, as are the prices at restaurants that serve it. Additionally, it is still unclear whether organic foods offer any more nutritional value than non-organic foods beyond the absence of additives, hormones, and pesticides. The following video details the pros and cons of organic food:


Conclusion

When people eat food that contains antibiotics, hormones, or preservatives, these additives become part of the body and may have adverse effects on health. We have recently taken important steps toward understanding what exactly goes into our food and how it affects our health, but a lot remains unknown. Buying organic food and trying to reduce the use of pesticides and certain additives are important steps toward ensuring that everything we consume is healthy. But monitoring what we eat is only part of the equation; we also need more research to determine exactly how our bodies react to the various things added to what we eat. While the FDA, USDA, and EPA regulate food production to ensure pesticides, additives, and hormones do not exceed safe levels, those regulations evolve with research. It is up to every individual to make an informed choice based on his or her own means. When choosing food, people should consider what goes into what they consume and how it is produced. As research progresses, recommendations and regulations can and will continue to change.



Resources

Primary

FDA: Steroid Hormone Implants Used for Growth in Food-Producing Animals

EPA: Pesticide Tolerances

Additional

Frontline: Is Your Meat Safe?

CNN: Report Examines Antibiotics in Meat on Fast Food Menus

Nature: Food Preservatives Linked to Obesity and Gut Disease

Organic Trade Association: U.S. Organic Industry Survey 2015

NPR: Are Farmers’ Market Sales Peaking? That Could be a Good Thing for Farmers

RxList: Antibiotic Resistance

Health: America’s Healthiest Grocery Stores

Entrepreneur: 7 Major Restaurants That are Getting Rid of Artificial Ingredients

National Institutes of Health: NIH Human Microbiome Project Defines Normal Bacterial Makeup of the Human Body

Sustainable Tables: Additives

Organic Consumers Association: Beef, Hormones Linked to Premature Onset of Puberty & Breast Cancer

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post What’s in Your Food?: A Look at Regulating the Food Industry appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/health-science/you-are-what-you-eat-what-is-that-exactly/feed/ 0 48045
America’s Role in Solving the Migrant Crisis https://legacy.lawstreetmedia.com/issues/world/americas-role-migrant-crisis/ Fri, 18 Sep 2015 18:08:42 +0000 http://lawstreetmedia.wpengine.com/?p=47888

What can the United States do?

The post America’s Role in Solving the Migrant Crisis appeared first on Law Street.

]]>
Image courtesy of [Trollman Capote via Flickr]

Hundreds of thousands of migrants are fleeing war-torn countries like Syria and Afghanistan, making a perilous journey to Europe. For those who arrive, the reality may not be much better, as the European Union, already struggling to stay together amid financial troubles, is now faced with one of the greatest migration crises in history. Meanwhile, in the United States, a nation known for its history of immigration, the question of what can be done to help is gaining attention. Read on to learn about immigration history in the United States, what the country has done so far, and what it can do in the future to assist Europe and the migrants risking their lives to get there.


History of Immigration in the United States

The United States takes pride in being a nation of immigrants, from its initial colonization through several waves of immigration in the 19th and 20th centuries. The first big wave came in the mid-19th century, consisting largely of Irish and German migrants fleeing famine and blight in their home countries. The arrival of new people was not universally welcomed: because many of the immigrants who came during this period were Catholic, anti-immigration sentiment emerged among many American Protestants who feared the rise of Catholicism in the United States. Another wave came during the late 1800s and early 1900s, composed of Southern and Eastern Europeans such as Italians and Russians. Opposition to the changing U.S. population proved lasting, shaping several policy decisions in Congress.

Laws Limiting Migration

As large waves of immigrants came to the United States, Congress enacted several laws to manage, and most notably to limit, the flow of people. The first was the Naturalization Act of 1790, which outlined who was allowed to become an American citizen: only free white people. The Naturalization Act of 1870 expanded eligibility to include people of African descent, but still excluded people from other places of origin. Many early U.S. immigration policies also attempted to restrict the flow of immigrants from Asia, most notably the Chinese Exclusion Act of 1882, which was not repealed until 1943.

The most notable change to U.S. immigration policy came in 1924, when Congress passed the Johnson-Reed Act. The act established a quota system that limited the number of immigrants of each nationality based on the levels present in the 1890 census, effectively restricting immigration from Southern and Eastern Europe and halting immigration from Asia, in an effort to prevent changes to the racial composition of the population. Finally, in 1965, Congress passed another immigration law, which did away with the formal quota system and established the visa system that the United States has today. The new law sought to focus on reuniting families and allowing skilled immigrants to live and work in the United States, and it allowed for a notable increase in immigration from Asian countries.


How to Enter the United States

Despite the various changes to U.S. immigration policy in the 1900s, becoming a citizen or even coming to the United States remains an arduous process. Currently, there are several ways one can become a citizen or live in the United States temporarily. The first is family-based; a person can come to the United States if they are a child, direct relative, or spouse of someone who is already a citizen.

Second is the visa system, which can be broken down into temporary and permanent categories. Foreign nationals can receive temporary visas for tourism, business, or education. For long-term visitors, the U.S. government can issue a green card granting permanent resident status if certain conditions are met. There is also a diversity program that encourages immigration from countries with low levels of immigration to the United States, in an effort to attract the best and brightest from around the world.

Refugees

Refugees also make up a large part of the yearly immigration total in the United States. Following WWII and the passage of the Displaced Persons Act of 1948, approximately 650,000 people were admitted from war-torn Europe. In subsequent years, additional waves of refugees settled in the United States, often escaping oppression by their home governments. The Cold War led to a notable rise in refugees to the United States, particularly after the Vietnam War.

Congress passed the Refugee Act of 1980, which standardized the definition of who is a refugee and how he or she can be resettled within the country. Every year, the president and Congress decide how many refugees the country will accept and from where they will come. Since 1975, roughly three million refugees have been permitted to settle in the United States. The number accepted annually has ranged from a high of 207,116 in 1980 to a low of 27,100 in 2002.


The Migrant Crisis

What’s Going on in Europe

Before we get to the United States’ role in the current crisis, let’s first go over what is going on in Europe and the Middle East. The immigration crisis affecting Europe is unlike anything the region has ever seen. So far, 350,000 people have migrated to the continent, dwarfing last year’s record high of 219,000 people. Many of the migrants come from Syria where a civil war has caused over four million people to flee the country. More than 2,800 have died while attempting to cross the Mediterranean this year alone.

The massive influx of migrants has created a significant problem for the European Union, which so far leaders have failed to properly address. For more information about the crisis in Europe check out Law Street’s explainer as well as the video below.

Refugees in the United States

According to the U.S. Department of State, a refugee is "someone who has fled from his or her home country and cannot return because he or she has a well-founded fear of persecution based on religion, race, nationality, political opinion or membership in a particular social group." For any refugee, the first step is to apply for refugee status with the United Nations in the country where he or she is seeking asylum. Even if a person is granted protected status, there is no guarantee that he or she will be accepted in the United States–there are as many as 15 million refugees worldwide. The idea behind admitting refugees is often to provide a temporary home until they can return to their own countries. While few refugees are admitted, even fewer are allowed to stay somewhere permanently.

Current U.S. Efforts

So far, the United States' primary contribution has been financial–America has given $4 billion in aid to combat the crisis. However, when it comes to accepting migrants, the United States has come up short. Many of those fleeing to Europe are Syrians trying to escape the civil war in their homeland; nonetheless, the quota allotted to Syrian refugees was just 1,500 until recently. On September 10, the Obama Administration called for the United States to resettle at least 10,000 Syrian refugees in the next fiscal year, which starts October 1.

Some non-profits have called for a much higher number. The United States Committee for Refugees and Immigrants believes the United States should accept as many as 100,000 migrants from Syria in addition to 70,000 to 100,000 immigrants from other countries. This has stirred debate in Congress, where some Republican members are worried that allowing more Syrian refugees could increase the threat of terrorism. The following video outlines the U.S. government’s actions so far in this crisis:

Illegal Immigration in the US

Another consideration for the United States is the number of migrants already inside the country. Right now there are approximately 11 million illegal immigrants in the United States, of whom around 50 percent are from Mexico. Illegal immigration remains a hot-button issue for the U.S. government, making its willingness to help Europe even more complicated. Factoring in the population already here, the likelihood of the United States accepting a large number of Syrian refugees is not very high.


Conclusion

The current migration crisis in Europe threatens to overwhelm the European Union, which is struggling to handle so many people. Europe’s inability to control the influx has led to a wide range of criticism. Many are now looking for the United States to step up its involvement in the crisis. So far the United States has given a significant amount of money to help alleviate some of Europe’s problems, but it has done relatively little in terms of accepting refugees. The recent announcement to accept 10,000 Syrians will certainly help, but given the number of refugees fleeing Syria and other conflict-torn countries, both the United States and Europe will need to do more.

People attempting to migrate to the United States, even refugees, face an array of requirements that make the process difficult. Couple that with fears of terrorism and the existing immigration problem facing the United States, and it seems unlikely that it will fill its historic role as the home of last resort. Whatever the United States decides to do, it and the European Union must move quickly, as pressure continues to mount.


Resources

Primary

Pew Research Center: 5 Facts About Illegal Immigration in the US

UN Refugee Agency: Syria Regional Refugee Response

Additional 

HSTRY: A History of Immigration in the USA

CNN: European Migrant Crisis

France 24: Hungary to Return Economic Migrants to Where They Came From

American Immigration Council: How the United States Immigration System Works

Refugee Council USA: History of the US Refugee Resettlement Program

US Department of State: Refugee Admissions

The Economist: Migration from Europe

New York Times: As European Crisis Grows, US Considers Taking in more Syrians

Voice of America: US Pledges to Accept More Migrants

INQUISITR: 29 countries accepting refugees from Syria and the Middle East

Center for Immigration Studies: US Immigration Population Record 41.3 million in 2013

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activites, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post America’s Role in Solving the Migrant Crisis appeared first on Law Street.

]]>
47888
Attacks, Insurgency, and a Military Takeover in Modern Day Thailand https://legacy.lawstreetmedia.com/issues/world/attacks-insurgency-military-takeover-modern-day-thailand/ https://legacy.lawstreetmedia.com/issues/world/attacks-insurgency-military-takeover-modern-day-thailand/#respond Wed, 09 Sep 2015 09:42:23 +0000 http://lawstreetmedia.wpengine.com/?p=47634

What's going on in Thailand?

The post Attacks, Insurgency, and a Military Takeover in Modern Day Thailand appeared first on Law Street.

]]>
Image courtesy of [Guillén Pérez via Flickr]

Thailand is a warm, tropical country situated south of China on the Asian mainland, home to a wide range of vegetation and animal life. From an outside perspective, it is a veritable paradise, but on the inside it is far from perfect. This reality was brought to light last May when a military coup ousted the sitting government, and again last month when a bombing at a major shrine in Bangkok shook the country.

Lately, Thailand has been turning from its traditional allies in the West toward countries that more closely mirror its new authoritarian government, like China. This shift comes at a critical time: Thailand's economy has stagnated, and many of the problems it has worked hard to overcome–such as border conflicts and its central role in the sex trade–have persisted. Read on to learn about the recent challenges facing Thailand, from their historical roots to where the country is heading now.


History of Thailand

From as early as the 7th century, Thailand has been home to a variety of peoples. Up until the 18th century, the region was ruled by several competing monarchies. Beginning in the early 19th century, the country began to modernize along European lines while maintaining its monarchy, a process that seemingly culminated in its support of the British in WWI.

Thailand’s allegiance briefly changed in WWII when it joined with the Japanese, but after Japan’s defeat Thailand returned the territory that it took from its neighbors and once again allied with the west. In fact, it served as a base for U.S. operations during the Vietnam War. Domestically, Thailand was also going through a rash of changes spurred by a string of protests, coups, and conflicts.

Thailand’s Current Government

In a military coup last May, the elected legislature was forced from power and the military took over. After ousting the government, the military junta imposed martial law, which lasted for nearly a year. Military coups are nothing new for Thailand: between 1932 and 1991 there were at least 17 of them. The video below details the most recent military takeover:

Following the coup, and even after martial law was lifted, the situation has not improved much. The military junta granted itself new, all-encompassing powers, assuming formal control of the government. The junta originally justified its takeover as a way to avoid violence between the existing government and protesters. But as the military has expanded and prolonged its rule, many are growing skeptical of its willingness to hold elections and give up control. Controversy arose recently when the junta rejected a draft constitution, a move that extends its control over the government into next year.


Current Challenges

Recent Attack

In August, a large explosion occurred at the Erawan Shrine in Bangkok, one of the most popular shrines in Thailand, killing 22 people and injuring 120.

In the aftermath of the attack, two men have been arrested, while as many as seven more suspects remain free. So far, the government does not suspect a connection between the attack and international terror organizations, but the motive behind the bombing remains unknown. While the two men in custody have been charged with illegally possessing explosive materials, authorities do not consider them the masterminds behind the bombing.

Insurgency

While the recent bombing garnered a lot of attention in the capital, most of the violence in Thailand has historically occurred in the south, where insurgents fight for territory. The insurgency has plagued the country's southernmost regions for over a decade, killing nearly 6,400 people and wounding nearly 11,000. The unrest stems from a struggle between the Malay Muslims–who demand local autonomy after attempts to integrate with the rest of Thailand failed–and the Thai government, which refuses to give up land. The rebels fear assimilation and the loss of their culture, an issue since the area was incorporated into Thailand in 1909. While the territory has been a point of contention since 1948, the recent violence did not emerge until 2004, under the government of Prime Minister Thaksin Shinawatra, who was deeply unpopular in the region. In addition to their fear of cultural assimilation, the rebels' grievances include poor access to education and high rates of poverty.

The conflict has persisted largely because the government has struggled to find rebels to negotiate with, as the movement is highly decentralized. It is also the result of the government refusing to make any concessions while attempting to downplay the issue altogether. The new military government has vowed to take a tougher stance, pledging to end the conflict within 12 months, but it, too, does not plan to make concessions to the insurgents.

The following video explains the insurgency in southern Thailand:

Sex Trafficking

Another long-standing issue for Thailand is sex trafficking; the country is one of the world's most notorious centers for it. Although exact numbers are difficult to determine, estimates indicate that there are tens of thousands of victims. A recent State Department report gave Thailand a Tier 3 rating–the lowest possible–for its response to human trafficking. Due to its location, local corruption, and the government's reluctance to intervene, Thailand has become a regional hub for sex trafficking.

Prior to the military coup, Thailand made some efforts to crack down on the sex trade. In 2008, it passed the Anti-Trafficking in Persons Act which made it illegal to traffic in any persons. Additionally in 2011 the government passed its second six-year national policy strategy aimed at eliminating trafficking. Despite these efforts as well as further collaboration with international organizations and NGOs, trafficking remains a major problem. While the military government has made some attempts to crack down on trafficking, critics argue that these efforts have done very little to address the underlying problem.


Thailand’s Place in the World

Thailand’s domestic issues are only compounded by the country’s increasingly unclear place in the world.

United States

As a result of the military coup in 2014, relations between Thailand and its traditional partner, the United States, have become strained. Evidence of this can be seen in the negative reactions from the United States and its EU allies to the military's increasing authority. Directly after the coup, the United States suspended military aid, as U.S. law requires after a coup, and ceased joint military exercises. The accompanying video depicts Thai-U.S. relations:

China

To fill the gap left by the United States, China has stepped in. This began with a deal for Thailand to buy submarines from China, an effort to keep pace with its increasingly well-armed Asian neighbors. The relationship has expanded as Chinese diplomats have visited the country, and talks of a railroad connecting the two nations are in progress. Thailand also recently deported more than a hundred Uighurs to China, a move the United Nations and several human rights groups condemned; Uighurs, a minority in China, face repression and possibly torture at home.

Thailand’s Economy

Thailand’s relationship with China is not only important as the countries grow closer militarily, but also because China is Thailand’s number one trading partner. Thailand needs China to buy its goods, for which it shows support for the regime in Beijing. So far Thailand has delivered–supporting initiatives like the Maritime Silk Road and the Asian Infrastructure Investment Bank, both of which are headed by China.

Trade is important to Thailand because, like many economies in Southeast Asia, it is heavily reliant on exports for growth; exports make up more than half of the GDP of Thailand, which has the second-largest economy in Southeast Asia behind Indonesia. The rest of its output comes from a variety of industries, including tourism, agriculture, and fishing.

Although Thailand has one of the largest economies in the region, it has not shown much growth lately. In 2014, its GDP grew by only 0.7 percent and may actually contract this year. The exact cause of Thailand’s economic woes is difficult to trace, but most believe the recent coup and the resulting uncertainty have not helped. The recent attack on the Erawan Shrine could also damage Thailand’s economy–particularly in terms of tourism, which has accounted for a significant portion of its economic growth lately. While these fears may be legitimate, the government maintains that last month’s bombing will not affect the tourism industry.

As economic uncertainty mounts, Thailand continues to align itself with countries like China and Russia in order to maintain trade relationships as its connection with the West becomes increasingly strained.

Border Disputes

Thailand’s most notable border dispute is with Cambodia, which is centered on the Prear Vihear Temple. A ruling from the International Court of Justice (ICJ) in 1962 originally awarded the area to Cambodia. However, the decision has left Thailand unsatisfied and led to a three-year conflict from 2008 to 2011. The conflict ended with a new decision by the ICJ in 2011, which gave control of the temple to Cambodia, but left much of the surrounding area undetermined. Currently, both countries are in talks to create a solution for the remaining land, but little progress has been made.

Another dispute, between Thailand and Malaysia, was largely settled decades ago. However, control over Bukit Jeli, a stretch of land on the border between the two countries, remains unsettled. Despite the lack of agreement, a conflict between the two countries is unlikely, and some believe a similar arrangement–or deliberate lack of one–could be applied to the situation with Cambodia.


Conclusion

Thailand is home to an ancient civilization and was one of the few countries never colonized by Europeans. But the recent coup and the military junta's expansion of power are beginning to reveal the many issues facing the country. Given its problems with sex trafficking, border disputes, and an insurgency in the south, Thailand's domestic troubles may also affect its position in Southeast Asia.

So what does all this mean going forward? If Thailand continues with its military-controlled government, which looks likely, it may also continue to alienate old allies such as the United States. These decisions, in turn, push Thailand further into the orbit of countries like China, which have authoritarian governments of their own. While this may not be what the people of Thailand want, the importance of China as a trade partner leaves the country few options.

All this is subject to change if the government gives up power or is deposed, and given its history, another coup in Thailand is certainly possible. However, the growing concentration of power among military leaders reduces the likelihood that power will change hands. As long as the military stays in control, the situation is likely to persist while social issues–such as unrest, border disputes, and sex trafficking–go unaddressed.


Resources

Primary

CIA World Factbook: Thailand

Additional 

Human Trafficking.org: Thailand

BBC: Bangkok’s Erawan Shrine bomb

Combating Terrorism Center: The Smoldering Thai Insurgency

DW: Why Thailand is boosting ties with Russia and China

BBC: Thailand Profile

Al Jazeera: Thailand

The Economist: The Darkened Horizon

BBC: Thai Military Government Replaces Martial Law

CNN: Bangkok Shrine Explosion Kills 22 Including Tourists

The Interpreter: Religion in the Southern Thailand Conflict

Value Walk: What do Growing Thai-China Relations Mean for the U.S.?

Tourism Thailand: the Economy of Thailand

East Asia Forum: Thai-Cambodia Relations One Year After the ICJ Judgment

DW: A look at southern Thailand’s smoldering insurgency

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activites, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Attacks, Insurgency, and a Military Takeover in Modern Day Thailand appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/attacks-insurgency-military-takeover-modern-day-thailand/feed/ 0 47634
Riding the Wave: The Tumultuous Global Stock Market https://legacy.lawstreetmedia.com/issues/business-and-economics/riding-wave-tumultuous-global-stock-market/ https://legacy.lawstreetmedia.com/issues/business-and-economics/riding-wave-tumultuous-global-stock-market/#respond Sat, 29 Aug 2015 19:40:14 +0000 http://lawstreetmedia.wpengine.com/?p=47373

What's going on with the global stock market?

The post Riding the Wave: The Tumultuous Global Stock Market appeared first on Law Street.

]]>
Image courtesy of [dflorian1980 via Flickr]

On Monday, August 24, China's main stock index, the Shanghai Composite, fell 8.5 percent in one day. This massive drop set off losses worldwide, beginning in nearby Asian markets including Japan. The worry and lack of confidence quickly spread to Europe and the United States, where it led to massive stock sell-offs and left the American market 10 percent below the record high it had reached only a few months earlier, at the beginning of summer. While the losses were initially blamed on the volatility of the Chinese market and its slowing economic growth, they also revealed issues in the other global markets. Read on to learn about the history of an up-and-down global stock market, the reasons for the recent crash, and what is expected in the future.


History of Volatility in the Global Stock Exchange

The recent stock market losses, while severe and brisk, are by no means the first in history, and certainly not the worst. Because interconnected economies–and even nations in the modern sense–are relatively new, the first major market crash dates back less than 400 years, to the Netherlands in 1637. Driven by speculation, tulip prices in the country soared, leading many citizens to invest. But prices eventually peaked and then plunged back to earth, causing many investors to lose everything.

Several more speculative bubbles grew and then burst over the ensuing years in economic powerhouses like Britain and France. The phenomenon reached a head, however, with the stock market crash of 1929, which ultimately led to the Great Depression. This crash, like the ones before it, was primarily the result of speculation, fueled by massive growth in the American economy during the 1920s.

However, when the economy began to stagnate, investors initiated a mass sell-off that caused the market to plummet. This was followed by a run on banks so severe that thousands were forced to close. The effects of this collapse spread beyond the borders of the United States to Europe, due to both regions' reliance on the gold standard. The Great Depression would play a major role in the lead-up to WWII, and it was not until after the war that the global economy recovered.

Despite the devastation of the 1929 collapse, major market meltdowns continued to take place. In 1987 the U.S. market lost over a fifth of its value in a single day, known as Black Monday. In 1995, Barings Bank, the oldest bank in England, was forced to close due to losses from speculative trading. Meanwhile in Japan, following a thirty-year growth spurt, the market collapsed beginning in 1989, leaving the country in a state of malaise ever since.

The two most recent crashes both originated in the United States. The first came at the turn of the millennium: the dot-com bubble. The bubble was built on the belief that the internet was ushering in a new type of economy, one not subject to the same problems as the past. This led to a number of unwise investments in companies mired in debt or lacking real value. The crash began in 2000 and continued into 2002, costing the NASDAQ 80 percent of its value and leading to a recession.

The most recent crash began in 2008. From its pre-recession peak until the market bottomed out 18 months later in 2009, the Dow lost more than 50 percent of its value. This collapse was triggered by sub-prime mortgages, but it spread to other industries, such as the auto industry, and was prolonged by connected problems globally, including the debt crisis in Europe. The economy was only saved, and confidence only tentatively restored, through massive bailouts. The video below explains the 2008 crisis and the root of many of the stock market crashes:


Reasons for the Recent Crash

Like other crashes before it, the current crash is the result of a number of factors which have combined to cause speculation and panic on a global scale.

China

At the center of the most recent stock market crash is China, which had been dealing with a declining market since at least June of this year. After the market peaked on June 12, the Chinese government stepped in to prop up prices as a bubble–created in part by Chinese citizens investing money they did not have–began to deflate. While the government tried a variety of stop-gap measures, these appear to have had little effect. Compounding the problem was China's slowing growth; many of those who invested did so based on the prolonged growth of China's economy over the last 20 years.

Additionally, confidence in China from the outside also appears to be faltering, the result of several recent events. The most glaring is the government's inability to handle the current stock crisis: even after intervening–devaluing the currency and cutting interest rates to make borrowing cheaper–the market has continued to fall. Other events, such as the fiasco surrounding the chemical explosions in Tianjin and China's dubiously reported economic figures, have also caused foreign investors to lose confidence.

Commodities

Another area directly impacted by China's recent crash is the commodities market. Commodities are goods such as oil, gold, and copper, and many emerging markets, such as Brazil and Turkey, have relied on selling them to build up their economies. However, with China losing vast amounts of wealth daily in its stock market, it can no longer buy as many commodities as in the past. The result is lower demand, which means reduced commodity prices and subsequent losses in the emerging markets reliant on them.

United States

Another area feeling a market correction–a loss of 10 percent or more–was the United States. Along with the news of China's falling market came fear of interest rate hikes in September, which would make borrowing money more expensive. While the United States is not the economic engine it once was, nor the borrower of last resort, it is still the world's largest economy, and any sudden crash it experiences would reverberate worldwide with even greater force than China's.

Other Countries

Aside from the United States and emerging markets like Brazil, other places around the world also felt the crunch from China's continued market slide. These included Europe, whose combined market had its worst losses since at least 2011, as well as countries closer to China, such as Japan and Australia, each of which saw sharp losses in the immediate wake of China's fall. The accompanying video provides a thorough overview of the recent Chinese stock market crash:



After the Drop

So, with all the recent fluctuations in the stock exchange, it bears asking: what is next for the world's markets? The answer is seemingly more of the same. In the U.S., the Dow plummeted 588 points on Monday, then another 204 points on Tuesday. On Wednesday and Thursday, however, the market rallied, gaining over 1,000 points in two days–meaning that, for the week, the market is actually up. In fact, the surge on Wednesday and Thursday marks the largest two-day gain in the history of the American stock market.

Around the world, other markets were also experiencing a rebound on Thursday. In Europe and Japan, the stock market rose following dramatic losses earlier in the week. Even in China, the market rose more than five percent, ending a week of losses. In fact, even with all the recent losses, China’s market is still up 43 percent from a year ago.

However, even with markets quickly rebounding, China's stock market crash cannot simply be dismissed. The collapse has shaken the faith of those who viewed China as the number one growth engine of the future. Furthermore, if that faith proves misplaced, there is really no one to take China's place. Emerging markets such as Brazil are overly dependent on commodities, Japan is still stuck in stagnation, and Europe, as China's largest trading partner, is too interconnected, especially as it still recovers from the 2008 crisis.

This leaves the U.S. as the world's steadying force. While U.S. markets rebounded on news that GDP grew 3.7 percent in the second quarter, up from the original estimate of 2.3 percent, and that jobless claims continued to fall, that status remains shaky.

Certainly, everything is not perfect in the American economy either. Following the recent market correction, and due to the tumultuous world economy, the Federal Reserve has said it will probably not raise interest rates after all. This means that money can still be borrowed cheaply; however, it also reveals fears of weakness in the U.S. and global economies. This weakness is especially troubling because, unlike in past downturns when interest rates could be slashed, that option is no longer available. The following video looks at the future of the economy:


Conclusion

There is a saying that goes, “those who don’t learn from history are doomed to repeat it.” The history of the global stock market can offer many examples that attest to the validity of this sentiment. Throughout its history, the market has repeatedly surged and crashed, like waves against a beach. The recent case of China is just one more example of this situation. Luckily in this case though, the losses seem temporary and appear to offer no long-term threat to the global economy.

Nevertheless, the danger remains, due to the persistence of rampant speculation, which falsely builds up the value of a market. When a market then faces stagnation or a correction, investors panic and begin selling off their shares or running on banks for cash. This cycle has repeated itself time and time again and shows no sign of stopping, despite the numerous examples of market failures and warning signs. The most recent crash again offers the opportunity to learn and to stop repeating the mistakes that have plagued people and nations for as long as markets have existed.


Resources

Vox: The Global Stock Market Crash, Explained

The Economist: The Causes and Consequences of China’s Market Crash

Reuters: Markets Rebound from China Slump, Strong U.S. Data Helps

The Bubble Bubble: Historic Stock Market Crashes, Bubbles & Financial Crises

History: The Great Depression

About News: Stock Market Crash of 2008

International Business Times: China Stock Market Crash Explained in 90 Seconds

The Wall Street Journal: China to Flood Economy with Cash as Global Markets Lose Faith

USA Today: Stock Leaps

The Guardian: China’s “Black Monday” Sends Markets Reeling Across the Globe

CNN Money: Dow sets a 2-day Record, Finishes up 369 Points


Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activites, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Riding the Wave: The Tumultuous Global Stock Market appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/riding-wave-tumultuous-global-stock-market/feed/ 0 47373
Making Reality Virtual: The Rising Tide of Virtual Reality https://legacy.lawstreetmedia.com/issues/technology/making-reality-virtual-rising-tide-virtual-reality/ https://legacy.lawstreetmedia.com/issues/technology/making-reality-virtual-rising-tide-virtual-reality/#respond Sun, 16 Aug 2015 18:18:32 +0000 http://lawstreetmedia.wpengine.com/?p=46929

What's going on with virtual reality?

The post Making Reality Virtual: The Rising Tide of Virtual Reality appeared first on Law Street.

]]>
Image courtesy of [Nan Palmero via Flickr]

Phones, MP3 players, tablets, even watches have been heralded as the next big thing in technology over the past decade or so. Now, however, that title belongs to virtual reality. Companies ranging from Facebook to Google to Microsoft have developed or are developing headsets that finally allow users to realize the virtual reality experience. While the industry is still in its infancy, a recent surge in funding and attention suggests virtual reality breakthroughs in a number of aspects of everyday life, ranging from video games to movies. Read on to learn about the next big thing, starting with its origins, how it actually works, what is currently available on the virtual reality market, and what is on the horizon.


History of Virtual Reality

The concept of virtual reality extends all the way back to the 1860s, when artists created 3-D circular pictures. However, like many other inventions, the progression of virtual reality was neither linear nor the result of the work of just one person. The rise of modern virtual reality can be traced to the 1950s. In 1957 a man named Morton Heilig invented something known as the Sensorama, which combined 3-D pictures with wind, sound, and various smells.

This vision was shared by a technician named Douglas Engelbart, who saw the future potential of digital displays. His vision began to be realized as part of a larger effort by the United States government to design a new radar system whose readings would be easier for humans to understand and interpret.

This work started bearing fruit in the 1960s, when another scientist, Ivan Sutherland, developed a headset for designing cars and mechanical parts on a computer. Another leap forward came in the form of flight simulators, which became increasingly popular in the 1970s because they offered a safer alternative to training in an actual aircraft. Initially, simulators were little more than stitched-together films; as the technology advanced, they began offering video simulations that placed the trainee in a virtual cockpit.

VR also made it to the movies, as evidenced by the increasing incorporation of, and even reliance on, visual effects in Hollywood blockbusters. In fact, the first movie to depict the notion of virtual reality was the original Tron, released in 1982. VR was utilized in the video game industry as well, in arcades and early headsets like Sega's model from 1993. VR was also developed to assist with various rehabilitation exercises, notably helping those dealing with PTSD. The video below takes an in-depth look at the history of virtual reality:



How does virtual reality work?

What exactly is virtual reality? At its most basic, it can be thought of as a computer-generated three-dimensional world. This world can be experienced as the real world is: through sights, smells, and sounds. The basic parameters of generating virtual reality are consistent–objects have to be life-sized, and there has to be a tracking mechanism so that when the viewer's perspective changes, what he or she sees changes as well.

A key component of this is a concept known as immersion, which includes the depth and breadth of information–how a user interacts with the virtual environment and how effectively that environment is presented to him or her. There is also the concept of latency: the lag between when a user changes perspective and when the display reflects the new perspective. If the delay is too long, the immersive experience can be ruined. One important consideration is that while many recent models require or offer some form of headset, a headset is not required to experience VR.
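Latency budgets are easiest to see with a little arithmetic. The sketch below is illustrative only–the refresh rate, pipeline stages, and the 20 ms comfort target are assumptions (the target is a commonly cited rule of thumb, not a figure from this article or any specific headset):

```python
# Illustrative motion-to-photon latency budget for a hypothetical headset.
# All numbers are assumptions, not specs of any product named above.

def motion_to_photon_ms(refresh_hz: float, tracking_ms: float,
                        render_frames: float, display_ms: float) -> float:
    """Estimate the lag between a head movement and the updated image."""
    frame_ms = 1000.0 / refresh_hz  # time to scan out one frame
    return tracking_ms + render_frames * frame_ms + display_ms

COMFORT_TARGET_MS = 20.0  # commonly cited rule of thumb for comfortable VR

# Hypothetical 90 Hz headset: 1 ms of tracking, one frame of rendering,
# and 4 ms of display switching.
total = motion_to_photon_ms(refresh_hz=90.0, tracking_ms=1.0,
                            render_frames=1.0, display_ms=4.0)
verdict = "within" if total <= COMFORT_TARGET_MS else "over"
print(f"~{total:.1f} ms motion-to-photon, {verdict} the {COMFORT_TARGET_MS:.0f} ms target")
```

Under these assumed numbers the total comes to about 16 ms; if rendering slips to two frames, the same arithmetic puts it near 27 ms, the kind of delay the paragraph above warns can ruin immersion.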



The Virtual Reality Industry

Presently

The number and quality of VR options on the market already vary widely. On the simplistic side is Google Cardboard, literally a cardboard box and a pair of lenses, which, with a smartphone placed inside, can give the illusion of virtual reality. At the other end of the spectrum are headsets like those produced by Oculus. Oculus was initially funded through the crowd-sourcing site Kickstarter, until it was bought by Facebook for $2 billion.

The Rift, however, is not alone on the high end of virtual reality. Several rival headsets are also in production, including the Samsung Gear, HTC Vive, and Sony Morpheus. Perhaps the most intriguing rival is Microsoft's HoloLens. Unlike the other headsets, which strictly focus on generating a virtual reality, the Microsoft headset is capable of augmented reality–combining elements of the virtual and real worlds.

Regardless of which headset a user prefers, there are already a number of uses for virtual reality. Aside from those mentioned earlier, such as gaming, treating PTSD, and training pilots, VR is also becoming valuable in sports. In football training, players can replay past game situations in hopes of learning to read similar situations better in the future.

VR is helping dentists train as well, by offering an environment where they can learn without causing any real or feared damage to their patients. Virtual reality is also being employed in everything from public speaking training to helping people rehabilitate from strokes.

On the Horizon

So, what is next for this technology? As with the advent of 3D movies, one of the first fields affected will likely be the film industry. In July of this year, Oculus signed a deal with Felix and Paul Studios to produce VR videos. Other deals are also in place for companies like Samsung and Google, which are using their own VR devices to provide customers with virtual experiences.

Along with movies, another area increasingly incorporating VR is the gaming industry. In fact, Facebook's Oculus Rift headset was originally developed for video games. A number of competitors, including Sony and Microsoft, are planning to use their own headsets along with their video game systems.

Additionally, it has been suggested that VR can play more of a role in everyday pursuits. For example, imagine a courtroom setting where jurors could potentially put on one of these headsets and be transported to a crime scene so they could more clearly appreciate the facts of a case.

While Facebook and its competitors see VR as a medium accessed through a headset, that may not be its final form. In fact, according to a study by Siemens Research, people completing tasks guided by one of the headsets were actually less successful than those simply following a paper manual. The article suggests an alternative to the headset in general: improving camera and display technology for a more immersive experience.

For any of this to be accomplished, though, the major challenges of VR must still be confronted. First is solving issues with tracking–a major problem with early versions of VR was that they could not accurately respond to a user. Second, environments must be developed that are complex enough to grab a user's attention but also match what the user perceives, meaning he or she should hear wind when seeing a tornado. The following video looks at virtual reality and its future:



Conclusion

VR has certainly made leaps and bounds over the decades, evolving from 3D images to headsets that can increasingly mimic natural surroundings. Nevertheless, for all the progress made, there is still much farther to go. Most of the tech heavyweights have put considerable resources behind this technology, but there are still few early returns.

Additionally, even when these headsets start hitting the market in earnest, either later this year or early next, their actual availability may still be limited by a number of factors, such as cost. The Oculus headset, for example, is projected to run as much as $1,500. Additionally, precisely how VR will be used remains a mystery. While there is talk of it infiltrating all corners of modern life, the initial efforts seem limited to video games, videos, and enthusiasts. Even the idea of a headset is not set in stone, as traditional displays present strong alternatives. Virtual reality may one day be the be-all and end-all of technology. For now, though, all most people can do is sit back, relax, and imagine a world where VR reigns supreme.



Resources

How Stuff Works: How Virtual Reality Works

The Week: 6 Innovative Uses for Virtual Reality

Tom’s Hardware: The Past, Present and Future of VR and AR

University of Illinois: Virtual Reality History

CNET: Google Cardboard Reviewed

Wareable: The Best VR Headsets

Read Write: Virtual Reality Films Could Put the Whole Industry in the Spotlight

Game Rant: Virtual Reality in Gaming

Forbes: No More Headgames



Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activites, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Making Reality Virtual: The Rising Tide of Virtual Reality appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/technology/making-reality-virtual-rising-tide-virtual-reality/feed/ 0 46929
Lions and Tigers and Bears: Inside the Exotic Animal Trade https://legacy.lawstreetmedia.com/issues/energy-and-environment/lions-tigers-bears-inside-exotic-animal-trade/ https://legacy.lawstreetmedia.com/issues/energy-and-environment/lions-tigers-bears-inside-exotic-animal-trade/#respond Sun, 16 Aug 2015 16:58:24 +0000 http://lawstreetmedia.wpengine.com/?p=46612

Learn about the exotic animal trade--from hunting to breeding and trafficking.

The post Lions and Tigers and Bears: Inside the Exotic Animal Trade appeared first on Law Street.

]]>
"BDF Wildlife Training Area" courtesy of [US Army Africa via Flickr]

Much controversy has arisen over Cecil the lion, who was killed, skinned, and beheaded by a dentist from Minnesota looking for a trophy. The slaying of a protected and revered animal drew criticism from national and social media and forced the man responsible into hiding. However, this is just one example of an exotic animal killed for its skin, head, or body parts. These unique species are often smuggled or illegally traded, transported in grueling conditions, and forced into unnatural environments. Read on to learn about the exotic animal trade–from hunting to breeding and trafficking.


Live Animal Trading

Traded animals come from a variety of sources, ranging from capture in the wild to sale as surplus from zoos or zoo-like institutions. These animals are then sold over the internet or through live auctions. Both marketplaces are poorly regulated, although auctions fall under the Animal Welfare Act and any state regulations that might also apply.

In addition to acquiring these animals through smuggling, Americans began breeding their own exotic animals in the 1960s and 70s. Due to the high profitability of selling these animals, breeding has become a successful industry. Much of the value comes from restaurants seeking alternative types of meat, or from demand for hides. High import taxes, which have reduced poaching, also make breeding a profitable option for American sellers, who can benefit from decreased competition and expenses.

Wherever animals are bought and sold, they often face poor treatment from their new owners, even if unintentional. For companies, this is frequently a matter of economics–it is usually cheaper to let an animal die or go untreated than to pay for a veterinarian. Private individuals, meanwhile, often buy animals, particularly exotic species, without knowing how to care for them. Unsurprisingly, this can lead to high rates of abnormal behavior, sickness, and even death among animals living in new environments.

Perhaps the most gruesome and public incident occurred in 2011 in Zanesville, Ohio, when a caretaker of large exotic animals released many of them before killing himself. Most of the animals, including endangered species such as lions, tigers, bears, and wolves, were subsequently killed by authorities. Local officials, unfamiliar with how to trap these exotic species, resorted to killing them in order to protect the public.

Transporting animals illegally also presents significant challenges. Because smugglers ship animals in ways designed to evade authorities, many of the animals die in transit. While exact figures are not entirely clear and are often exaggerated to drum up action, laundering animals through methods like those of a man who strapped lizards to his chest while passing through LAX can lead to high mortality rates. The following video gives an inside look into the exotic animal trade:


Fur and Pelt Trade

In the Wild

Various animals and plants are hunted for a number of reasons, and continued hunting has left many of them at or near endangered levels. Perhaps the most affected are large mammals, as well as sharks and whales. While these animals can be hunted for meat, more often they are hunted for sport, either for the head or for other body parts.

While it is illegal to hunt these animals without a permit, a hunter willing to pay enough can hunt virtually any animal on or approaching the endangered species list; the idea is that the money spent killing them can then be used to protect healthier members of the same species. This includes lions and even black rhinos, one of the most endangered species in the world. In fact, the right to kill one such rhino last year–an animal past his effective breeding age–generated $350,000 for Namibia to use for its conservation efforts.

In Captivity

As an alternative to the wild, exotic animals are also bred on farms, which often exist for the express purpose of producing furs. The historical value of animal furs cannot be overstated, as the fur trade drove European expansion into modern-day North America and Russia. Currently, most of the furs used for clothing come from animals raised on fur farms. The United States is home to many such operations; however, China is quickly coming to dominate more and more of the fur farm market. The following video provides some details on the lives of animals raised on fur farms:

Aside from being raised on farms for fur, exotic animals are also bred on ranches. For half a century, exotic animals have been raised on ranches in Texas for the express purpose of being hunted. In the view of those running such ranches, this lets hunters satisfy their desire to hunt these animals in a controlled setting and thus reduces the need to kill them in the wild. The video below looks at the ranches in Texas:

Method to the Madness

The reasons people kill these exotic animals include religious ceremonies, food, clothing, and trophies. The most common underlying motive, though, seems to be profit. Governments allow the killing of most animals for the right price and with the proper licensing. Walter Palmer, the dentist who shot Cecil the lion, reportedly claimed to have paid more than $50,000 and obtained the proper permits to kill the animal. The return on this investment–whether a trophy, skin, or head–can more than make up the cost of the hunt, especially if the hunt is orchestrated illegally.

Tigers, for example, can fetch as much as $15,000 for their skins. In fact, the profitability of killing and selling animals is so great that it has become the sphere of organized crime. These professional criminals employ every means from helicopters to night-vision goggles to body armor to kill their prey.


Regulating the Animal and Fur Trade

According to the Michigan State Animal Legal and Historical Center, while few states require anything more than permits for hunting, the United States has several federal laws that seek to prevent the killing of animals for fur, including the Lacey Act, the Marine Mammal Protection Act, the Fur Seal Act, and the Endangered Species Act. Additionally, the Fur Products Labeling Act requires precise labeling of exactly what type of fur is used in clothing. There is also the Dog and Cat Fur Protection Act, which prevents using fur from those animals in clothing. However, these laws are ultimately aimed at stopping the trade in furs from wild animals, not those bred on fur farms.

The U.S. government sees fur farms like most other farms, leaving them under the authority of the Department of Agriculture. While the United States has laws against animal cruelty, there are very few other protections in place. However, some other countries have much stricter fur regulations. In fact, the UK, Austria, and Croatia ban fur farms entirely.


Conclusion

Killing and illegally transporting exotic animals is a problem, but one that raises the question: how much more can be done? More laws and agencies can be created with the goal of stopping this industry, but as with drugs or gambling, at some point additional restrictions stop having an effect. Existing laws have been somewhat successful, particularly those that channel the money paid for the right to kill animals back into protecting them, a fact even acknowledged by the World Wildlife Fund. Furthermore, like the laws governing those industries, these laws fail to get at the root of the problem: the demand for these animals and their skins. No matter how many poachers or hunters get arrested, there will always be someone else to take their place, especially when the work offers a lucrative job for a person in need.

There are several alternatives to hunting and poaching these animals in the wild, like fur farms and ranches. However, these also draw a great deal of criticism, putting hunters in a difficult position: they face criticism for hunting animals in the wild and for hunting them on ranches whose sole purpose is hunting.

Either way, while major news outlets and social media platforms crackle with indignation over the death of one lion, it appears people at times lose track of more important issues. While high-profile events like the killing of Cecil the lion gain a lot of attention and may even spark public outcry, focusing on individual cases may cause people to lose sight of the issues with the animal trade as a whole. The industry is likely to continue as long as people have a desire to hunt, the price is right, and society lacks a proper understanding of how these animals are treated.



Resources

Born Free USA: The Dirty Side of the Exotic Animal Trade

Michigan State University Animal Legal & Historical Center: Fur Production and Fur Laws

Chicagoist: Minnesota Trophy Hunter Accused of Killing Beloved Lion

Discover News: Trophy Hunting Is There Any Benefit to Conservation?

USA Today: Fury Over Cecil the Lion also Sparks Race Controversy

PETA: Inside the Exotic Animal Trade

The Center for Consumer Freedom: FBI Anti-terror Unit Investigated PETA

USA Today: Years later Effects of Exotic-Animal Tragedy Still Felt

Dr. Steve Best: Top Five Animals That Face Extinction Satisfying Human Greed

CBS News: Can Hunting Endangered Animals Save the Species?

National Wildlife Federation: Overexploitation

List Verse: 8 Endangered Species Still Hunted

Priceonomics: The Exotic Animal Trade

UC Small Farm Program: Exotic Livestock

Humane Society: Dangerous Exotic Pets

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activites, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Lions and Tigers and Bears: Inside the Exotic Animal Trade appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/energy-and-environment/lions-tigers-bears-inside-exotic-animal-trade/feed/ 0 46612
A Crack in the Great Wall: Chinese Stock Market Takes a Tumble https://legacy.lawstreetmedia.com/issues/business-and-economics/crack-great-wall-chinese-stock-market-takes-tumble/ https://legacy.lawstreetmedia.com/issues/business-and-economics/crack-great-wall-chinese-stock-market-takes-tumble/#respond Wed, 05 Aug 2015 20:48:15 +0000 http://lawstreetmedia.wpengine.com/?p=45274

How will it affect China and the world economy?

The post A Crack in the Great Wall: Chinese Stock Market Takes a Tumble appeared first on Law Street.

]]>
Image courtesy of [Aaron Goodman via Flickr]

Recent shifts in the Chinese stock market make America’s subprime mortgage fiasco look run-of-the-mill. From January 2014 to June of this year the market rose 150 percent, but since mid-June, the Shanghai Composite Index has lost more than 30 percent of its value. In fact, the recent slide was only slowed through direct intervention from the Chinese government. Following the Chinese market plunge, a number of opinions regarding what the crash means have been expressed. The debate ranges from fears of a full-scale, 1920s-era depression mixed with a housing bubble, to a simple market correction. Read on to learn about China’s market, what it actually means, as well as what impacts it has had on China and the world economy.


The Chinese Stock Market

History

The Chinese stock market, known as the Shanghai Stock Exchange (SSE), was founded in 1990. The market is overseen by the China Securities Regulatory Commission, essentially a Chinese version of the U.S. Securities and Exchange Commission. Since its founding, the SSE has risen above competing exchanges to become the dominant market in China. Its purpose, so far, has been to raise capital for companies, particularly those in the infrastructure and tech fields. The SSE's current goal is to transform Shanghai into a leading financial center on par with New York and London.

Volatility

While the SSE’s recent crash has garnered headlines, it followed a huge upswing, meaning that despite the recent plunge in stock prices, the market is actually up this year. To accurately discuss the volatility of the market it must be divided into two parts: a rise and fall.

First was the rise: from the beginning of 2014 to June 2015, the market's value rose by 150 percent. In the first five-and-a-half months of this year alone, it rose by nearly 50 percent. The rise led to the increased importance of China's financial industry at the expense of traditional manufacturing powerhouses. This was all part of the government's plan, which aimed to transition to a more financially driven economy as growth rates slowed and eventually fell below the double-digit levels of the glory days. However, China's plan for its stock market has taken a significant hit.

The rise, of course, was followed by a fall, and that fall has been dramatic. In a single two-week span, the value of the market fell by 25 percent. To put it another way, in just two days the market lost 11 percent of its value, which in the United States would translate to a roughly 2,000-point drop in the Dow. These losses can be attributed to investors in highly leveraged positions–that is, investors who had accumulated much more debt than the equity they held–who were forced to sell when margin calls began. A margin call happens when a broker demands that an investor who used margin to pay for an investment put up more money or collateral to cover a potential loss. In other words, people were buying shares on the SSE in an effort to get rich quick; when the market began collapsing, many were forced to sell to cover margin calls, which led to plummeting stock prices and marketwide panic. The following video explains the fall:
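To make the margin mechanics concrete, here is a minimal sketch with hypothetical numbers (none drawn from the SSE itself): a leveraged investor's equity shrinks faster than the share price falls, and a broker's maintenance requirement eventually forces a sale.

```python
# Hypothetical margin-call arithmetic; the prices, loan, and maintenance
# requirement below are invented for illustration.

def equity_ratio(shares: int, price: float, loan: float) -> float:
    """Investor equity as a fraction of the position's market value."""
    market_value = shares * price
    return (market_value - loan) / market_value

MAINTENANCE = 0.30  # assumed broker requirement: 30 percent equity

# Buy 1,000 shares at 100 yuan each, borrowing half the purchase price.
shares, loan = 1_000, 50_000.0

for price in (100.0, 80.0, 65.0):
    ratio = equity_ratio(shares, price, loan)
    status = "margin call -> forced selling" if ratio < MAINTENANCE else "ok"
    print(f"price {price:5.1f} yuan -> equity {ratio:.0%} ({status})")
```

In this toy example, a 35 percent price drop cuts the investor's equity by more than half and trips the threshold; when many leveraged accounts hit it at once, forced sales push prices down further and trigger still more margin calls, the feedback loop behind the two-week, 25 percent slide described above.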


A Deeper Meaning?

Government Response

So what does the recent crash mean for China? The Chinese government did not wait to find out. Following the collapse, the government loosened regulations on margin buying, halted new IPOs, and encouraged several other efforts aimed at propping up share prices. Additionally, it halted trading in most stocks and put a six-month moratorium on selling in place for all large investors. Finally, it threatened individuals and groups known as short sellers–parties that make money when a stock price declines.

The results of all these efforts have been less than encouraging. Following the temporary ban on selling, the market fell even further once trading resumed, as investors regained the ability to sell when an investment looks bad, as much of the Shanghai market does right now.

Should we be worried about the collapse?

In the best case scenario, the collapse was all just the result of panicked investors with little experience; they saw the value of their investments grow rapidly and were anxious to cash out before anything bad happened. On the other hand, it is possible that the market was experiencing a bubble. A bubble occurs when an asset is overvalued because of continuous speculative investing, or because of expectations of future growth, rather than actual results–think housing or tech in the United States.

Unfortunately, if the latter is true, things could get worse before they get better. Because the Chinese government prevented the bubble from completely bursting, it could essentially be lingering there, waiting to burst when the regulations ease up. Even if this is not the case, the perception of a bubble could lower stock prices and companies’ desire to invest. Corporations that invested, including foreign ones, are also barred from selling right now, and this episode may make them less likely to invest in China going forward.

Currently, the final outcome is still unclear and opinions remain divided. Some banks, such as Bank of America Merrill Lynch and Credit Suisse, see China as a systemic bubble that could burst when the government removes its support. On the other hand, Goldman Sachs sees the recent plunge as a market correction, needed to reduce overvalued stock prices and push out the wrong type of investors. The video below details the crash and the government’s reaction:


The Impact

Because China has the world’s second-largest economy, a stock market crash there is likely to have an effect that reverberates around the world. So, what exactly does the crash mean?

China

It may be too soon to understand how the recent plunge will affect China’s economy in the long run. While the market has lost nearly $3.5 trillion in value since mid-June, it remains up for the year thanks to the massive upswing early on. It is also difficult to tell whether the recent volatility will continue or if the market will start to settle down.

What the crash shows most clearly, though, is the oversized role of the government in the economy and the unclear nature of its actions. The government’s response, which involved a significant amount of intervention from regulators, may discourage future investment. That response and the apparent lack of regulatory coordination have many worrying that the Chinese government will run into further challenges as it attempts to balance stability with a more market-driven system.

Worldwide

The international impact of the Chinese market collapse has been less noticeable than the effect on China itself. Since China remains relatively isolated from the global financial system, the losses have had little impact on other markets. In fact, the stock market crash in China had considerably less influence on the world economy than tiny Greece–simply because Greece was more plugged in.

The real significance, if anything, will come in the future. If China’s economy takes a nose dive it could mean less investment coming from the country as well as fewer opportunities to invest in its markets. Additionally, efforts to further incorporate China into the world system may be scuttled. The video below discusses the ramifications of the Chinese stock market crash:


Conclusion

Seeing the Chinese stock market lose 25 to 30 percent of its value in about a month is very unsettling, especially with the recent Greek crisis and the lingering memory of the United States’ 2008 meltdown. But it is important to note that despite all the panic, the Shanghai index remains positive this year.

The real impact of the crash falls primarily on China itself. For the average investor, the collapse could have wiped out a lifetime’s worth of income, and it may be the first sign of a lingering bubble in the market. For China’s general population, the crash revealed, much as in developed nations, a growing gap between haves and have-nots. For the have-nots, the fiasco may also slow promised social reforms, which could further exacerbate the wealth gap.

Ultimately for China, the crash presents yet another crossroads. The stock market was supposed to be the avenue for future growth as the country’s manufacturing sector fizzles, much as manufacturing did earlier in the Western model. But the crash raises doubt. If China ever truly wants to be a global financial leader, or at least a regional one, it will have to learn to manage volatility without excessive intervention and control from the government.


Resources

Primary

Shanghai Stock Exchange: Brief Introduction

Additional

Business Insider: Goldman Sachs on China’s Stock Market Collapse

Fortune: China’s Wild Stock Market Ride in One Chart

Bloomberg View: China’s Tamed Stock Market Might Bite its Economy

The New York Times: Cooling of China’s Stock Market Dents Major Driver of Economic Growth

Business Insider: Here’s a Simple Explanation of Why Chinese Stock Markets are in Free Fall Right Now

Slate: China’s Stock Market is Falling Again

Business Insider: China Pays a Price to Avert Stock Market Crash

The Washington Times: No Worries about Impact of China Stock Market Crash on U.S. Economy Yet

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.


American Muslims: A Vibrant History, Misplaced Hatred https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/american-muslims-vibrant-history-misplaced-hatred/ https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/american-muslims-vibrant-history-misplaced-hatred/#respond Sun, 02 Aug 2015 17:58:35 +0000 http://lawstreetmedia.wpengine.com/?p=45748

Muslim Americans are a vibrant part of our culture, so why are they discriminated against?

The post American Muslims: A Vibrant History, Misplaced Hatred appeared first on Law Street.


The United States is a patchwork of cultures so diverse that large groups can often go unnoticed or unidentified–that is, until a tragedy brings one of them to the forefront. American Muslims in particular have repeatedly been branded as terrorists, most recently after a terrible shooting by a Muslim man at military facilities in Chattanooga, Tennessee. This viewpoint is unfair and uninformed. Far from being a secret insurgency, Muslims in the United States are one of the country’s oldest and most representative groups. Read on to learn more about Muslims in the United States, starting with the group’s history, moving to a profile of the modern American Muslim, and ending with how the group is portrayed in American culture.


History of Islam in the U.S.

Muslims have a long history in the United States, perhaps going back to a time before Europeans even settled the area. According to some legends, Muslim Moors who had been expelled from Spain during the Christian Reconquista may have explored the Caribbean and what is now America. It has even been speculated that Columbus, on his travels to the New World, cited a book written by Muslims who had made a similar voyage in the 12th century. There are also reports of Muslim guides used by the Spanish in their conquests beginning in the 16th century.

The first major migration of Muslims unquestionably came through the slave trade. In fact, as many as ten to 15 percent of the enslaved Africans brought to America are believed to have been Muslim. These slaves were often forced to convert to Christianity or at least hide their beliefs; however, some communities were able to hold onto their faith into the 20th century.

Following this wave was another from the Middle East, lasting from the late 1800s to the early 1900s. This group settled in the modern-day American Midwest, where jobs were readily available, particularly in the automobile industry. A third wave came in the 1950s and 60s from throughout Asia, when the United States relaxed its strict immigration policies. Islam in the United States was also invigorated by Black Americans who sought to restore their original faith, beginning with the Great Migration and continuing to this day. The first mosque in the United States was built in Cedar Rapids, Iowa, and today there are more than one thousand mosques nationwide. Some of the most notable Muslims of the era were and are major historical figures, such as Malcolm X and Muhammad Ali.


The Modern American Muslim

Like the varied waves in which they immigrated to the United States, the current population of Muslims in the U.S., numbering anywhere from five to 12 million people, is an ethnic hodgepodge of American Blacks, Africans, Asians, Hispanics, Europeans, and converts. Along with a diversity that mirrors the U.S. population as a whole, Muslims in America are quintessentially American in a number of other ways. The share of Muslims with college and graduate degrees, for example, almost exactly matches the national average. The same holds from the share of Muslims making more than $100,000 a year down to those making less than $30,000 annually.

Muslims do differ slightly from the broader population in some ways. First, unlike America as a whole, Muslims skew young: over 75 percent of the Muslim population in the United States is 49 or younger. The population also skews slightly male, at approximately 54 percent. As for geography, most live near urban centers such as New York and other coastal areas. The Muslim population is also greater near university towns, as many are graduate students or faculty. Thus, while Muslim populations may go unrecognized, that may be in part because of how similar they are to the general population of which they are a part. The video below looks in-depth at Muslims in the U.S.


Popular Perception of Muslims in American Culture

Clearly then, American Muslims have a rich historical place in America and blend in quite well with the population, too. However, while by almost any metric Muslims mirror the American mainstream, the perception of Muslims remains disproportionately hostile. According to a recent poll, only 27 percent of Americans had a favorable view of Muslims. Additionally, nearly half of the respondents thought that Muslims’ decisions would be overly influenced by their religion and that profiling people of the Islamic faith was justified.

These are not just the views of anonymous individuals. In the wake of the recent shootings, for example, the prominent Reverend Franklin Graham, son of Billy Graham, called for an end to Muslim immigration to the United States. Graham is not alone in his vitriol. The FBI, it was revealed, also seems ill-disposed toward American Muslims: it taught its counterterrorism agents that American Muslims are potential terrorist sympathizers, that the prophet Muhammad was a cult leader, and that Islamic charitable giving is actually a covert effort to fund terrorist activities. The FBI was not the only policing agency in on the act; the NYPD also ran a notorious anti-terrorism program that targeted Muslims. Since 9/11, members of the NYPD have infiltrated mosques, spied on attendees, and even enticed informants to trick other Muslims into making seditious statements on recordings.

In fact, the American media in general is guilty of mischaracterizing Muslims. In a study of media coverage from 2001 to 2008, a North Carolina professor found overwhelming evidence of a media bias against Muslims, including a disproportionate focus on groups who denounced Islam. The accompanying video details this bias through the lens of a triple murder of Muslims in North Carolina earlier this year.

Misplaced Hate

When one looks at the numbers, this anger and hate are clearly misplaced. For example, from 1980 to 2005, 94 percent of terrorist attacks committed on U.S. soil were carried out by non-Muslims. In fact, in 2013 more people were killed inadvertently by guns fired by toddlers than by Muslims.


Conclusion

There is an old saying that people fear the unknown. When it comes to the Muslim population in the United States, unknown might not be an adequate description. Perhaps the best example is that most Americans equate Arabs and Islam, even though most Arabs in the U.S. are not Muslims. Yet this void of knowledge has not remained unfilled; on the contrary, a concerted effort has been made to twist and often distort popular opinion into believing all Muslims are terrorists or, at the very least, sympathetic to the cause of terror groups.

The numbers show nothing could be further from the truth. Far from being a homogeneous group of troublesome people, Muslims, like America itself, are a diverse collection of peoples. Furthermore, these people encapsulate the average American identity in virtually every way.

Muslims, like so many groups before them, are often not treated equally in American society for a number of reasons, ranging from media influence to that all-important fear of the unknown. However, closer examination reveals that in the United States, Muslims are most representative of one thing: the patchwork nature of the country itself.


Resources

Embassy of the United States: Muslims in America

PBS: Islam in America

Reuters: American Opinion of Arabs, Muslims is Getting Worse

Wired: FBI Teaches Agents

Atlantic: Horrifying Effects of NYPD Ethnic Profiling on American Muslims

Think Progress: Study; Anti-Islam Messages Dominate Media Coverage

 Daily Beast: Are All Terrorists Muslims? It’s Not Even Close

Viral Buzz: 30 Hollywood Muslims

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.


Turkey: A Country Perpetually at a Crossroads https://legacy.lawstreetmedia.com/issues/world/turkey-country-perpetually-crossr/ https://legacy.lawstreetmedia.com/issues/world/turkey-country-perpetually-crossr/#respond Sat, 01 Aug 2015 13:00:51 +0000 http://lawstreetmedia.wpengine.com/?p=46120

Turkey is no stranger to conflict; what's up next for the country?

The post Turkey: A Country Perpetually at a Crossroads appeared first on Law Street.

Image courtesy of [Quinn Dombrowski via Flickr]

The nation of Turkey sits at a crossroads. Stretching from Europe to Asia, the country serves as the major path between the two continents and has done so in one form or another for centuries. The nation is also proverbially stuck between two competing forces. While it has advanced much further economically, politically, and diplomatically than many of its Middle Eastern neighbors, recent setbacks have shown just how far this process has yet to go. On top of this is the continued threat of ISIS and homegrown groups, which recently reached such a fevered pitch that Turkey has called on its NATO allies for assistance. Read on for a look at this critical juncture for Turkey, examining its past, politics, economy, and security situation.


History

Present-day Turkey, known in ancient times as Anatolia, was part of many of the world’s strongest and longest-lasting empires, including the Roman and Byzantine empires. Beginning in the 11th century, however, it was invaded by a number of Turkic tribes from the Asian Steppe. These groups spent the next 200 years warring with each other, as well as with the Byzantines. Through this fighting the region gradually came to have an overwhelmingly Turkic population, the origin of the nation’s present name, Turkey.

Out of that chaos rose the Ottomans, who slowly expanded the borders of the burgeoning Anatolian state and moved to finally crush the Byzantine Empire to the west. Finally, in 1453, after nearly 100 years of campaigns, the Ottomans conquered the Byzantine capital, Constantinople. The city was renamed Istanbul and rechristened as the capital of the Ottoman Empire. Following this rapid rise, the Ottomans spent the next 300 years building and consolidating their empire.

The tide of history began to work against them, though, in the 18th century, as the empire was pinned in under threat from all sides: the Austrians to the west, the Russians to the north, and the Persians to the east and south. This led to a gradual decline of the Ottoman Empire, which was soon dubbed “the sick man of Europe.” Ironically, the empire survived only through the efforts of European nations, which were anxious to maintain it as part of the balance of power.

This weakness was exacerbated by defeat in the Crimean War and the independence of a number of regions under Turkish rule in the late 1800s. It also spurred a reform movement, which culminated in the Young Turks taking control of the country in a largely bloodless revolution in 1908 and, ultimately, in Ataturk establishing a modern European-style republic in 1923.

Aside from imitating European democracy, Ataturk also modernized Turkey in other ways, including through farming, education, and even the Turkish language. His most lasting objective, though, and the most divisive to the present day, was to make Turkey a secular nation. While its inhabitants are still overwhelmingly Muslim, the country is modeled after nations in which church and state are separate. The video below provides an in-depth look at Turkey’s history.


Turkey and the EU

When Ataturk imitated European life, he dreamed of one day integrating Turkey into the continent, or at least making it strong enough to challenge Europe. This has translated over the years into a desire by Turks to join the European Union; in fact, Turkey has been a candidate for membership since 1999. The partnership would be a natural one for a number of reasons, including Turkey’s growing economy as well as its existing partnerships within NATO and the G20.

Nevertheless, after more than 15 years, Turkey remains on the outside looking in. Despite its strong economy and its largest city, Istanbul, which straddles Europe and Asia, the Turks have not been able to convince the EU that the country is worthy of membership. This is due to a number of reasons that extend beyond Turkey’s Muslim population, which Turkey alleges is the main obstacle.

To start, while Turkey is wealthy, that wealth is concentrated at the top. Thus, while Turkey’s economy overall is growing and there are extremely wealthy people, the majority live in poverty. This could be problematic for the EU because it would bring a population nearly as large as Germany’s into the fold, one that might need extensive government help. This is even more of a concern in the wake of the repeated bailouts of Greece, a much smaller country than Turkey both economically and in population. Additionally, Turkey brings further baggage through its problems with the Kurds, the contentious issue of who rules Cyprus, and its democracy, which looks increasingly less representative and more like a dictatorship.

Perhaps most importantly, in light of Greece’s recent troubles, is the slowing Turkish economy. After the IMF bailout it received in 2001, Turkey’s economy boomed, with growth averaging between five and ten percent annually. In the last few years, however, that growth has stalled, slowing to an average closer to three percent.

This is a result of slowing innovation and a retreat from the deregulation that helped the economy climb out of its earlier hole. At the center of much of this is former Prime Minister Recep Erdogan. Erdogan has been criticized for intervening too much in the nation’s economy, particularly concerning its central bank, which he claims acts as if it answers to a foreign authority–behavior he says has reduced confidence in the economy and the government.


The Turkish Government

As for its government, since the founding of the modern state by Ataturk, Turkey has made a concerted effort, unlike its neighbors, to remain secular and to avoid domination by Islamists. Historically, this effort has been enforced by the military, which has initiated a number of coups to preserve the country’s secular order. The following video depicts the military’s role in the government.

However, these efforts are under threat of being rendered moot, thanks to Turkey’s most powerful politician since Ataturk: Recep Erdogan. Erdogan built his political career piece by piece, rising from professional soccer all the way to the position of Prime Minister. After serving the maximum allowable 12 years, he became the first directly elected president in the country’s history.

While Erdogan has been immensely popular during his rule, many view him as a threat to Turkey’s secular identity. This is due to many factors, including his religious upbringing, the laws he has attempted to pass restricting certain freedoms in deference to Islamic doctrine, and his political leanings. The fear is growing because Erdogan is now attempting to alter the constitution in a way that grants the president far-reaching powers, which would be a massive shift for a position that until now has been mostly ceremonial. Erdogan also has a combative foreign affairs record, having alienated a once-close ally in Israel and failed to live up to promises made to Kurds living in the south. The accompanying video details Erdogan’s political career.


Security

The Kurds represent just one of the two major threats to Turkey emanating from its south. The other is ISIS, whose power base in Syria and Iraq touches the nation’s southern border and threatens to spill over it.

The Kurds

Turkey’s longest-term enemy lies within its own borders: the Kurdish insurgency led by the Kurdistan Workers’ Party (PKK). Since Turkey outlawed the party in 1984, 40,000 people have been killed in the resulting conflict. Erdogan negotiated a tentative peace with the group; however, when the Kurds asked for assistance in fighting ISIS, they were met with indifference.

The issue came to a head again recently when Kurdish members of the PKK ambushed and killed two Turkish police officers. This has led Turkish officials to state that they see no difference between groups such as the PKK and ISIS, in that both are viewed as terrorists. That stance explains Turkey’s recent air strikes against Kurdish positions in Iraq and Syria, which effectively end the ceasefire. The video below discusses the issues between the Kurds and Turkey.

ISIS

For the most part, Turkey has succeeded in avoiding conflict with the barbaric terrorist group, but recent signs suggest this may be ending. Following a recent attack by militants, and in light of the nearly two million refugees flooding into Turkey from Iraq and Syria, the country is no longer able to sit on the sidelines.

On July 26, Turkey, as a member of NATO, called for a meeting under Article 4 of the treaty organization’s charter–only the fifth time since the organization’s inception that such a meeting had been called. The Turks proposed a buffer zone in which no militants would be allowed to operate within 68 miles of their border. In return for the assistance, the Turks would also give greater access to U.S. troops and aircraft fighting ISIS.

While the coalition fighting ISIS has long desired a foothold in Turkey for targeting the group, any agreement would come with strings attached. Not only would it mean condoning attacks by the Turks on the PKK, it would also mean condoning many of the other undemocratic actions taking place within the country.


Conclusion

Turkey is literally a land at a crossroads between Europe and Asia, Christianity and Islam. For nearly a century, the country has maintained this tenuous position by adhering to the principles of the founder of the modern Turkish state, Ataturk. He called for a secularist nation and when the country strayed from this path, it was and has been repeatedly corrected through military intervention.

Secularism was made easier following the turn of the millennium, as Turkey’s economy hummed, its relations with the Kurds improved, and a path to joining the EU looked open. However, life has a way of presenting obstacles, and Turkey has begun to encounter several, ranging from flat-lining growth and a power-hungry leader to its continued assault on minorities within its borders and beyond. It is this intersection that now presents Turkey with its most difficult decisions yet, and the choices it makes could very well change its direction.


Resources

BBC News: IS Conflict NATO to discuss Turkey-Syria border crisis

History World: History of Turkey

All About Istanbul: Ataturk and the Modernization of Turkey

European Union: EU Relations With Turkey

Debating Europe: Arguments For and Against Turkey’s EU Membership

Telegraph: How Turkey’s Economy Went From Flying to Flagging and Could Get Worse

Reuters: Turkey’s Erdogan Says New Constitution a Priority After Elections

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.


Falling Fertility: The Impact of Declining Birth Rates https://legacy.lawstreetmedia.com/issues/health-science/falling-fertility-impact-declining-birth-rates/ https://legacy.lawstreetmedia.com/issues/health-science/falling-fertility-impact-declining-birth-rates/#respond Sun, 12 Jul 2015 14:20:10 +0000 http://lawstreetmedia.wpengine.com/?p=44474

Here's why millennials should start reproducing.

The post Falling Fertility: The Impact of Declining Birth Rates appeared first on Law Street.

Image courtesy of [Joshua Rappeneker via Flickr]

A report recently released by the National Center for Health Statistics revealed some good news for the nation’s population growth for the first time since the beginning of the Great Recession. According to the study, both the number of babies being born and the fertility rate increased by one percent–the first birth rate increase since 2007.

Growing birth rates are very important for a nation for a number of reasons, and the recent surge in births also points to several positive indicators, such as greater financial security. Read on to learn what the fuss over the birth rate is about, what it means for America, and what we need to consider about population growth moving forward.


A History of Fertility

To start, it is important to understand the terminology associated with birth rates. This begins with the Total Fertility Rate (TFR), which measures the average number of children a woman will have over the course of her life. Another term to understand is the replacement level: the TFR required to maintain zero population growth, basically the number of births needed to exactly replace the adults who are having babies. The United States’ replacement level is roughly 2.1 births per woman. It is higher than two because it factors in the unfortunate truth that some people will die before they reach adulthood and can reproduce.
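As a rough back-of-the-envelope illustration of where that 2.1 figure comes from, consider the sketch below; the inputs are stylized approximations, not official statistics.

```python
# Why the replacement level is ~2.1 rather than 2.0 (stylized figures).
sex_ratio_at_birth = 1.05        # roughly 105 boys are born per 100 girls
survival_to_childbearing = 0.98  # share of girls reaching childbearing age (wealthy country)

# Each woman must, on average, be replaced by one daughter who survives to
# have children of her own; scale up for the boys born alongside her and
# for early mortality.
replacement_tfr = (1 + sex_ratio_at_birth) / survival_to_childbearing
print(round(replacement_tfr, 2))  # -> 2.09, roughly the 2.1 cited above
```

In countries with higher childhood mortality, the survival term shrinks and the replacement level rises well above 2.1.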

Around the turn of the 20th century, the fertility rate in the United States was over three. The rate then dropped precipitously during the Great Depression, but that trend somewhat reversed leading into World War II, and the TFR stayed above the replacement level until the 1970s, when it dipped slightly during the oil crisis. When the economy began to rebound, so did the nation’s TFR, which continued at or above replacement levels until the Great Recession. The video below shows the correlation between birth rates and the economy:

Based on this data, it seems obvious that the number of births in the U.S. is tied to the success of the economy. But a more nuanced explanation is also required. For example, even when birth rates cratered during the Great Depression, they still remained higher than the birth rates following the uptick of the 1980s and 1990s. Essentially, the economy plays an important role, but current culture does too.


Falling Birth Rates

The recent low birth rates in the U.S. are not unique. In fact, low birth rates might be the clearest predictor of a country’s level of development. In several European countries and in Japan, the birth rate is 1.5 or lower–well below the replacement level. So why is this happening, and what do shrinking birth rates mean? First, let’s analyze the “why.” One of the common threads between these countries, besides a higher standard of living, is the improved status of women.

More Access to Education

One of the major factors dictating how many children a woman is likely to have is her level of education. It has been repeatedly shown that women who are exposed earlier to education and continue with their studies have lower birth rates. Conversely, in nations where access to education is more limited, such as Eritrea in the Horn of Africa, birth rates are much higher: 4.6 in Eritrea’s case. This stands in stark contrast to highly educated Japan’s birth rate of 1.4.

Having Children Later

While the recent data did indicate that American women had more children last year, it also revealed something else interesting. Most of the women having those children were older, in the age groups of 30-34, 35-39, and even 40-44. On the other hand, women in the age bracket spanning 20-24 saw their number of births continue to decline.

Although having children later is not the taboo it once was, having a baby after the age of 40 still carries the added weight of potential health risks. But because older mothers are often healthier, more prepared financially and mentally, and generally better educated, many of these concerns can be offset.

Immigration

Following the “why” is the “what”: what do shrinking birth rates mean? One of the most important effects is on immigration. In countries with low birth rates like Japan or Germany, immigration is needed simply to maintain the current size of the population. Additionally, since many of these countries have or will soon have disproportionately older populations, immigration can provide a needed youth infusion.

Who is Going to Pay for All This?

There is a lot riding on how these countries either maintain or increase their populations. Specifically, developed nations need people to pay for the wide-reaching entitlement programs enjoyed by older citizens as they retire, such as Social Security in the U.S. It becomes harder to support these programs when the number of people paying into them shrinks, even as the number of people depending on them continues to increase. With this in mind, countries with low birth rates either need to accept immigrants or find new ways to boost their populations naturally.
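The arithmetic behind that squeeze can be expressed as a simple support ratio–workers paying into a pay-as-you-go system for every retiree drawing from it. Here is a minimal sketch; all of the numbers are hypothetical and chosen only to show the mechanics.

```python
# Support ratio: workers paying into a pay-as-you-go system per retiree
# drawing from it (all numbers hypothetical).
def support_ratio(workers: float, retirees: float) -> float:
    return workers / retirees

benefit_per_retiree = 15_000  # hypothetical annual benefit per retiree

for workers, retirees in [(100, 25), (80, 40)]:
    ratio = support_ratio(workers, retirees)
    cost_per_worker = benefit_per_retiree / ratio
    print(f"{ratio:.1f} workers per retiree -> ${cost_per_worker:,.0f} per worker per year")
```

At four workers per retiree, each worker covers $3,750 a year; at two workers per retiree, the burden doubles to $7,500. That is why shrinking birth rates put pay-as-you-go programs under so much pressure.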


The Future of Babies

How are countries trying to up their birth rates?

While these countries grapple with declining birth rates and the need for more immigration, to quote the Dylan Thomas poem, they “do not go gentle into that good night.” Indeed, many of these countries already have plans in place to reverse their falling birth rates.

In Japan, nearly $30 million has been designated for encouraging young, single people to meet and eventually get married. In Russia, the government directly gives couples up to $12,500 for having a second child or adopting. Many European countries, meanwhile, focus on benefits after the baby is born, providing generous maternity leave or child care. The United States is actually an outlier on this front, as it does not offer guaranteed maternity leave and provides only minor tax breaks to women having children. This may explain in part why older, more financially secure women are increasingly the ones having children in the U.S.

Assistance from Technology

Technology is also playing a greater role in the number of births each year. In the United States, for example, approximately 2,000 more babies were born in 2013 than in 2012 through in vitro fertilization, a medical procedure in which an egg is fertilized outside the body. There are various kinds of fertility treatments; combined, they accounted for 1.5 percent of all births in the U.S. in 2013.

Private companies are also stepping into the proverbial baby-making arena. Both Apple and Facebook are offering programs that will pay to have their female workers’ eggs frozen. The idea is that once these women are at a point in their careers where a family seems more accessible or appropriate, they can then use their frozen eggs to have children.

This approach has received mixed reviews, however. While some laud the efforts as providing an avenue through which women can pursue both a career and a family, others see it differently. To the second group, this approach does nothing to address the reasons women feel compelled to choose in the first place. Additionally, the health risks associated with having children later in life are not diminished simply because the eggs were frozen.


Conclusion

Moderating population size is a tricky task. At times it has been predicted that a growing population is unsustainable and that we are headed for disaster. Now, conversely, the concern is about not having enough children and the dangers that presents to society.

In the United States, the birth rate has been falling more or less steadily for the last one hundred years and now hovers right around replacement levels. While there seems to be optimism following last year’s uptick in births, it follows on the heels of several years of consecutive declines that put the U.S., like many of its developed contemporaries, below replacement levels. It is still unclear if the recent increase is sustainable.

Perhaps what the declining birth rates have revealed most clearly is the changing role of women and the continued changes necessary for women to have children and still be able to pursue a professional career. The importance of this issue can be showcased by both national and private efforts to address it. Still, as of right now, the issue remains unresolved and a remedy is unclear. Even with birth rates recently back on the rise, in the future the concerns may change from the number of mouths to feed to whether or not we even have enough people to help feed them.


Resources

Primary

United Nations: Total Fertility Rate

World Bank: Fertility Rate, Total (births per woman)

Additional

Time: Rising Birth Rates a Good Sign for the Economy

Deseret News National: Can Government Incentives Reverse Falling Birth Rates?

Population Reference Bureau: World Population Data Sheet 2012

Earth Policy Institute: Education Leads to Lower Fertility and Increased Prosperity

Time: Women Keep Having Kids Later and Later

Yale Global Online: The Choice: More Immigrants or Less Citizens?

Genius: Do Not Go Gentle Into That Good Night

Health Day: In Vitro births Continue to Rise in the U.S.

CNN: Egg-Freezing a Better Deal for Companies Than for Women

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.


America’s Focus on Guns by the Numbers https://legacy.lawstreetmedia.com/issues/law-and-politics/americas-focus-guns-numbers/ https://legacy.lawstreetmedia.com/issues/law-and-politics/americas-focus-guns-numbers/#respond Wed, 08 Jul 2015 13:00:36 +0000 http://lawstreetmedia.wpengine.com/?p=43951

Even though crime remains low across the country, more Americans are turning to gun ownership.

The post America’s Focus on Guns by the Numbers appeared first on Law Street.

Image courtesy of [Peretz Partensky via Flickr]

The recent shooting at the Emanuel A.M.E. Church in Charleston, South Carolina opened up a number of old wounds for the country and reinvigorated several dormant concerns that seem to linger in the American consciousness. Chief among these are racism and America’s lack of gun control laws. While many were quick to put the blame in this case on a twisted, racist individual, others saw it as one more entry in the litany of side effects of a culture that enthusiastically embraces guns without any real checks. Read on to learn more about gun control in this country, the role of groups such as the National Rifle Association, and the impact guns have on the lives of everyday Americans.


History of Gun Control

What does the Second Amendment actually mean?

Any and all issues concerning guns in the United States start with the Second Amendment. While people associate the amendment with protecting their right to own firearms, the exact wording is as follows: “A well-regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.” The amendment was originally designed as a check against the federal government, in essence to protect the states from being overwhelmed by its standing army.

According to former Supreme Court Justice John Paul Stevens, the amendment has over the years been misinterpreted and manipulated for political gain. Originally it was designed so that people could bear arms as part of a militia, in order to protect against the federal government; people would own weapons as a function of their status within a militia. In fact, this was the way the law was interpreted for most of American history. But beginning in 2008, in a controversial Supreme Court decision regarding handguns, the amendment was read to protect gun ownership in general, rather than ownership tied to that purpose. On top of this, the types of weapons protected also expanded. In a famous 1939 case cited by Stevens, sawed-off shotguns were ruled illegal because they bore no relation to the militia purpose originally understood as the law’s rationale. However, as recent rulings and legislative efforts have shown, these same rules no longer apply.

Failed Efforts at Reform

While gun control advocates are seemingly losing the battle over gun ownership in the U.S., this has not always been the case. On the contrary, the opposite held true for much of America’s history. The first effort at regulation came in 1934: following the high number of deaths resulting from the use of automatic weapons by prohibition-era gangsters, the federal government passed the National Firearms Act, which made automatic weapons too expensive for the average person to afford and restricted their importation.

The Gun Control Act was passed in 1968, in the aftermath of the high-profile killings of Martin Luther King Jr. and Robert Kennedy. This legislation created the Bureau of Alcohol, Tobacco and Firearms (ATF). ATF was tasked with regulating the sale of guns and the weapons themselves.

The tide began to turn against gun control advocates, however, with the passage of the 1986 Firearm Owners’ Protection Act, which limited the ATF’s ability to crack down on gun owners and dealers. The gun control side had one last hurrah with the 1993 Brady Act, which mandated background checks, and the 1994 federal assault weapons ban, which outlawed the sale of assault weapons. The ban nevertheless had a built-in sunset provision of ten years, and when it came up for reauthorization in 2004 it was not renewed.

Along with the recent court decisions supporting gun ownership rights, the country’s representatives also seem opposed to regulating the weaponry. This became clear in the wake of the Sandy Hook massacre, when both new legislation and efforts to expand existing background check requirements failed to gain traction, even in the shadow of the killing of 20 elementary school children. Click here to view a video explainer on the history of gun control in the United States.


Guns in America

An Abundance of Firearms

Despite all the discussion over protecting gun owners’ rights, only a minority of the population actually owns any guns at all. While exact figures do not exist, according to a 2013 survey by the Pew Research Center, only about 37 percent of Americans own firearms. Yet while less than half of the U.S. owns a gun, there are an estimated 270 to 310 million firearms in circulation among the civilian population–roughly one for every man, woman, and child. To put this into context, although the U.S. accounts for only about five percent of the world’s population, it is home to between 35 and 50 percent of the world’s civilian firearms. While the overwhelming majority of these are owned by law-abiding citizens, the sheer volume of available weapons has led to a serious problem with gun violence in the United States. The following video depicts the level of gun ownership, gun fatalities, and attempts at gun control.

Gun Deaths by the Numbers

While those who favor protecting gun rights over gun control cite protection as a main reason, it has to be asked: are guns making the U.S. any safer? Going strictly by the numbers and in comparison to other industrialized nations, the answer is a resounding no. On an average day in the U.S., 88 people die in gun-related incidents. That extrapolates to a yearly total of roughly 32,251 deaths, the approximate figure for 2011 according to the CDC.

These rates dwarf those of the Western European countries to which the United States is often compared in other metrics. In 2010, for example, the U.S. had a homicide rate 6.6 times higher than that of Portugal, which had the highest rate in Western Europe. To put it another way, that same year the U.S. had a higher homicide rate per capita than Pakistan, a country plagued by terrorism, and was only slightly behind other notoriously violent locales such as the Democratic Republic of the Congo and Iraq. Perhaps the most chilling comparison comes from the 2013 numbers, which show major American cities with homicide rates similar to those of notoriously violent countries such as El Salvador, Honduras, and Mexico. While not all gun-related deaths in the U.S. are homicides, the fact that those figures are also among the highest in the world is telling in itself.
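Those cross-country comparisons rest on per-capita rates rather than raw counts. Here is a minimal sketch of the conversion; the population figure is approximate and used only for illustration.

```python
# Converting raw death counts into the per-100,000 rate used for
# cross-country comparisons (population figure is approximate).
def rate_per_100k(deaths: int, population: int) -> float:
    return deaths / population * 100_000

gun_deaths_2011 = 88 * 365            # ~32,120, consistent with the CDC's ~32,251
us_population_2011 = 311_000_000      # approximate U.S. population in 2011
print(round(rate_per_100k(gun_deaths_2011, us_population_2011), 1))  # ~10.3 per 100,000
```

Note that this is the rate for all gun deaths, including suicides and accidents; homicide-only rates, the basis of the country comparisons above, are lower.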

The level of gun violence is so high in the United States that Surgeon General Vivek Murthy argued prior to being appointed to the position in 2014 that it is a public health crisis.

In defense of guns, some proponents compare them to automobile fatalities and suggest that no one ever considers banning cars. This comparison may soon be losing traction, however. Not even taking into account factors such as cars being used for longer time periods and much more frequently than firearms, overall vehicle fatalities are declining. In fact, while gun deaths continue to rise, projections for automobile deaths continue to fall and it is widely speculated that gun-related fatalities will soon eclipse those from automobiles.


Opinions of Guns

With all this in mind, what is the perception of gun control and gun ownership in this country today? According to a recent Pew Research poll, for the first time since polling began in the early 1990s, more people view protecting gun rights as more important than controlling gun ownership. The main motivation behind this shift is a perceived threat and a belief that crime is rising. However, crime rates have remained consistently low since their precipitous fall began in the early 1990s.

Nonetheless, the main reason those polled owned guns was protection. This is in stark contrast to just 16 years ago, when the main reason given by respondents was hunting. These numbers can be broken down further: white people, men, and those who identify as Republican are more in favor of protecting gun ownership rights and more likely to believe that guns are a means of protection that makes a home safer.

The fact that support for gun ownership is rising while crime rates remain low presents a paradox. The takeaway from these polls, then, is that people are either being misinformed about or are misinterpreting the issues relating to gun ownership.

The NRA

The National Rifle Association (NRA) has a major impact on the perception of firearms in the United States. In 2014, for example, the NRA made $984,152 in political contributions, spent $3.36 million on lobbying, and spent another $28.2 million on outside spending. Nevertheless, while this may seem like a lot, the organization ranked just 315th in contributions, 150th in lobbying, and 10th in outside spending among all groups.

Thus, the NRA seemingly has far more clout than its spending alone would warrant. Where, then, does its power come from? The answer lies in the rating system the NRA maintains for candidates. The system assigns a letter grade, similar to one from elementary school, based on how a candidate votes on bills related to guns. An A grade indicates a candidate’s strong adherence to individual gun ownership and conservative values.

Watch the video below for more information about the NRA.


Conclusion

The United States is a heavily weaponized country–in fact, the most heavily weaponized in the world. This extends from its military, which is by far the best funded, to its police forces, which increasingly resemble the military in terms of equipment. It also pervades the towns, communities, and neighborhoods, as regular Americans are armed like no other people on the globe.

This is the result of years of lobbying by pro-gun groups, namely the NRA, and decisions by the government and courts to protect gun ownership. Consequently, the widespread availability of these weapons has also led to extremely high numbers of gun-related deaths, and homicide rates that on average rival those of some of the most dangerous countries in the world.

While these facts have caused some to take pause, they have not led to any real change in regulating these weapons, whether this takes the form of outlawing guns in general or requiring more thorough background checks for the mentally ill. The numbers on this issue are unquestionable. The debate, however, on how to handle this issue is still wide open to a variety of corrective actions.

Regardless though, the recent events in Charleston showed that whether it is guns themselves or those wielding the weapons, they have contributed to immense suffering and loss in this country. Whether protecting the right to own these weapons supersedes these individual tragedies is where the debate now begins.


Resources

Atlantic: America’s Top Killing Machines

Economist: Why Gun Control is Doomed

Washington Post: The Five Extra Words That Can Fix the Second Amendment

Breitbart: Gun Control

Pew Research Center: A Minority of Americans Own Guns, But Just How Many is Unclear

Humanosphere: Visualizing Gun Deaths

National Journal: Senate Confirms Gun Control Advocate as Surgeon General

Pew Research Center: Despite Lower Crime Rates, Support For Gun Rights Increases

Pew Research Center: Why Own a Gun? Protection is Now Top Reason

Open Secrets: National Rifle Association

GQ: How the NRA’s Grading System Keeps Congress on Lockdown

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.


A Resurgent Taliban Complicates Life in Afghanistan https://legacy.lawstreetmedia.com/issues/world/resurgent-taliban-complicates-life-afghanistan/ https://legacy.lawstreetmedia.com/issues/world/resurgent-taliban-complicates-life-afghanistan/#respond Thu, 18 Jun 2015 18:32:57 +0000 http://lawstreetmedia.wpengine.com/?p=43405

What role will the Taliban play in Afghanistan's future?

The post A Resurgent Taliban Complicates Life in Afghanistan appeared first on Law Street.


Starting in late April 2015, the Taliban launched its annual spring offensive in Afghanistan. Since that time, the government has fought back with a counteroffensive of its own, which has continued throughout May and into June. After more than a decade of major American military intervention, the Taliban remains active and strong within Afghanistan and neighboring regions. Read on to learn about the group’s origins, the impact of the American war, and the Taliban’s role in Afghanistan’s future.


The Origins of the Taliban

As the oft-told story goes, the Taliban emerged from the many competing groups among the Mujahideen who fought the Soviets in Afghanistan from the late 1970s through the 1980s. The group and many others that would make up the Mujahideen were supplied, equipped, and financed in part by large contributions from the United States and Pakistan, which shares close tribal ties with the Taliban.

The group came to prominence beginning in 1994, in the scramble for control that followed the ouster of Soviet forces. The Taliban, a predominantly Pashtun group, began taking over large swaths of territory, motivated by a strict interpretation of Sharia law and Sunni Islam. In 1995 they captured their first province, Herat, bordering Iran. By 1998 they had conquered 90 percent of the country and were effectively in charge. The video below details the origins of the Taliban.

Help From Abroad

While the Taliban enjoyed a seemingly meteoric rise from obscure Mujahideen group to the rulers of an entire country, it was not without substantial help–inadvertent or overt–from outside sources. This assistance begins with the United States.

As touched on briefly, the U.S. initially started supporting the Taliban’s precursors and similar groups in the 1980s in an effort to defeat the Soviets in Afghanistan. This assistance was far from covert: several Mujahideen members actually visited the White House and met with then-President Ronald Reagan. The relationship continued openly until as late as 1997, when members of the Taliban came to Texas to discuss building an oil pipeline in Afghanistan with an American oil company–even as the Taliban had been suspected of harboring Osama Bin Laden since as early as 1996.

Even after the war in Afghanistan started and dragged on, the U.S. was still allegedly funding the Taliban inadvertently: up to a billion dollars a year earmarked for the Afghan government was believed to be funneled directly to the Taliban.

While the United States has funded the Taliban directly and indirectly, Saudi Arabia has been more deliberate. The Taliban themselves are widely suspected of emerging from religious seminaries paid for by the Saudis, which cultivated the ideals of strict Sunni Islam. And Saudi support has not stopped there.

Along with other Gulf countries, including the United Arab Emirates and Kuwait, Saudi Arabia remains the largest source of funding for terrorist groups, including the Taliban. These funds are not usually given out directly; instead, they are channeled through front organizations that may, for example, request support to build more schools. The Taliban and other groups also raise money from these countries through kidnappings and extortion.

However, the Taliban’s strongest supporter is likely Pakistan, which shares the closest kinship bonds with members of the Taliban. The Pashtun are a tribal people whose lands straddle the border between northwestern Pakistan and southern and eastern Afghanistan. Many of the Taliban’s early members were also educated in Pakistani religious schools known as madrassas.

Pakistan’s relationship with the Taliban did not end there. Like the U.S., Pakistan funded the Mujahideen in their efforts against the Soviets in the 1980s; unlike the Americans, however, the Pakistanis stayed involved after the war, as Pakistan’s Inter-Services Intelligence agency (ISI) continued to train members of the Taliban throughout the 1990s, up until the American invasion in 2001.

In 2007, after being driven out of Afghanistan, the Taliban set up an organization in Waziristan, Pakistan, and proclaimed an Islamic state there. From this base the Taliban, still supported by elements of Pakistan’s ISI, has launched numerous attacks, assassinations, and kidnappings into Afghanistan.


The U.S. War in Afghanistan

The Taliban rose to power in the aftermath of one superpower’s failed war in Afghanistan, but that did not prevent the other from going after them. Following the terrorist attacks of 9/11, then-President George W. Bush gave the Taliban an ultimatum: hand over Al-Qaeda and Osama Bin Laden or be attacked. The Taliban refused, and U.S. forces were in the country in less than a month. Less than two months after that, the Taliban was defeated and pushed out of Afghanistan. Despite this victory, both Bin Laden and the leader of the Taliban, Mullah Omar, were able to escape to Pakistan.

Following the overthrow of the Taliban, the focus of the U.S. and its allies shifted to nation-building and keeping the remnants of the Taliban at bay. The Taliban, however, would not be so quickly dismissed and began a resurgence starting in 2005. The group traded its old tactics of facing the U.S. in conventional battles for guerrilla tactics–particularly suicide bombings–which had been effective in Iraq. The group also turned to the opium trade for funding; Afghanistan would eventually reach the point of supplying 90 percent of the world’s opium.

The renewed and increased violence led to another major policy shift: the surge, a large additional deployment of U.S. troops to Afghanistan. The newly appointed general Stanley McChrystal requested the troop increase out of fear that, at existing levels, the war might be lost outright. Following this, in 2010, Afghan President Hamid Karzai began to publicly float the idea of meeting with Taliban leaders for the first time. While the U.S. initially condemned his overture, by the following year, in the aftermath of the killing of Osama Bin Laden, the Obama Administration announced that it was open to talks.

Along with attempts at negotiating with the Taliban, the U.S. and its allies also began shifting greater responsibility and power to their Afghan counterparts. The U.S. and NATO also planned to pull out all troops by the end of 2014. However, following continued violence, uncertain safety situations, and attacks on NATO troops by allied Afghan soldiers, NATO agreed to keep as many as 13,000 soldiers in the country as part of a new bilateral security agreement signed by Afghan President Ashraf Ghani. The war officially concluded in 2014, making it the longest war in American history.  The video below details the latest war in Afghanistan.



The Future of the Taliban in Afghanistan

So what is the Taliban’s position today? While as of 2014 they maintained direct control of only four of the 373 districts in the country, their reach is much greater. For example, in a 2013 assessment by Afghan security forces, 40 percent of the country was considered to be at a raised or high danger level. Furthermore, while Pakistan has paid lip service, the Taliban still have a strong base in the neighboring country. The group has also benefited from record poppy harvests and other illegal financing operations such as mining.

Partners in Power?

Negotiations of varying degrees have been attempted since as early as 2010. President Ashraf Ghani seems especially eager to bring the Taliban to the table: his first two official visits were to Pakistan, where the Taliban is strong, and to China, which has sponsored such talks. The two sides finally met in May, and while nothing was agreed upon, the meeting itself was a step in a positive direction. However, more meaningful progress may require removing all foreign fighters from Afghanistan, as the Taliban has demanded. The video below presents the Afghan president's desire to talk with the Taliban.

The question now is how likely the Taliban is to actually come to the negotiating table in a meaningful way. The Taliban currently hold an entrenched position and are reaping the windfall from record opium sales. It is very possible that the group will simply wait out the withdrawal of all foreign combat troops and then reignite the conflict with a government that has repeatedly proven unequal to the task.


Conclusion

You reap what you sow. This old saying essentially means that your actions will have consequences, whether good or bad. The United States used the Mujahideen in its fight against the Soviets in the 1980s and then left them to their own devices for much of the next two decades; 9/11 revealed what can happen as a result of that benign neglect.

While the attacks were not orchestrated by Afghanistan itself, they were planned by the leader of Al Qaeda, Osama Bin Laden, whom the Taliban allowed to live in Afghanistan and who helped the group gain more territory in the country.

Since that fateful day, the U.S., its allies, and many ordinary Afghans have grappled with the consequences of those earlier decisions. This process has now seemingly come full circle, as the U.S. and its regional partners are advocating for talks with the Taliban and suggesting a role for the group in the government. The Taliban, for their part, seem hesitant to commit and more likely to wait out the complete withdrawal of foreign forces before striking again at what they view as a weak government.


Resources

BBC: Who Are the Taliban?

Nazareth College: The History of the Taliban

Global Research: Grisly Peshawar Slaughter-Who Created the Taliban? Who Still Funds Them?

Guardian: WikiLeaks Cables Portray Saudi Arabia as a Cash Machine for Terrorists

Shave Magazine: Pakistan and Taliban: It’s Complicated

Council on Foreign Relations: U.S. War in Afghanistan

Brookings Institution: Blood and Hope in Afghanistan

Council on Foreign Relations: The Taliban in Afghanistan

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post A Resurgent Taliban Complicates Life in Afghanistan appeared first on Law Street.

FIFA Scandal is No Surprise if You’ve Been Paying Attention https://legacy.lawstreetmedia.com/issues/world/fifa-scandal-sheds-light-organizations-leaders-goals/ https://legacy.lawstreetmedia.com/issues/world/fifa-scandal-sheds-light-organizations-leaders-goals/#respond Fri, 12 Jun 2015 20:14:49 +0000 http://lawstreetmedia.wpengine.com/?p=42916

Are you a fan of the world's most popular sport? Then the FIFA scandal doesn't surprise you.

The post FIFA Scandal is No Surprise if You’ve Been Paying Attention appeared first on Law Street.

]]>
Image courtesy of [Mariya Butd via Flickr]

Votes swinging based on bribes, secret deals made in backrooms, corruption at the highest levels. No, this is not about the next presidential election–not yet anyway. Instead, this is how the last few World Cups have allegedly been awarded. To many jaded sports fans familiar with the International Olympic Committee or the NCAA, this is not surprising. Even for the naïve, allegations of corruption in FIFA are not startling. What was unexpected, though, was that the powerful people at FIFA would actually be caught. With the recent arrests, the narrative has shifted from whether the tree is rotten to how far up the rot goes. Read on to learn about the scandal rocking FIFA and what it means for the future of the World Cup and its decision-makers.


FIFA

To understand the FIFA scandal, it is first necessary to understand the organization itself and its longtime leader, Sepp Blatter.

What is FIFA?

FIFA–the Fédération Internationale de Football Association–was founded in May 1904 by the national football associations of seven countries. The organization continued to grow, but remained entirely European until South Africa joined in 1909, with the United States following in 1912. FIFA went through hard times during WWI and nearly fell apart altogether; however, it endured and began expanding anew.

In 1930, FIFA staged its first World Cup, an event it had been building up to ever since soccer was first played at the Olympics in 1908. In the ensuing years, the organization and its membership grew while also dealing with problems such as the difficulty of travel, which caused many of the best teams to sit out the first few World Cups. By the 1970s, FIFA had truly emerged on the world stage, incorporating members from Europe and South America in growing numbers as well as many new members from former colonial holdings. Under the much-maligned supervision of Sepp Blatter, FIFA has grown into a powerful global entity with 209 members worldwide, divided into six regional confederations, and with unquestioned clout.

Who is Sepp Blatter?

Sepp Blatter first became part of FIFA in 1975, after leaving his job at a Swiss watchmaker. He spent the next 40 years serving in a variety of roles, most notably as secretary general for 17 years and then as president of the organization from 1998 onward. Under his leadership, FIFA's crowning tournament has been played on two new continents, Asia and Africa, and has become a multi-billion-dollar event.

Despite his role in dramatically growing the game's presence worldwide, Blatter is known as much for controversy as for expansion. He has made numerous inappropriate comments over the years and has been repeatedly accused of corruption in the court of public opinion. The awarding of the 2018 World Cup to Russia and the 2022 contest to Qatar seemed to be the ultimate examples of his duplicity.

Still, even with this reputation and after the recent arrests of senior FIFA members, Blatter was able to avoid indictment and was actually elected to a fifth term as FIFA president. However, following persistent criticism of him and of FIFA as a whole, Blatter finally relented and resigned his post in 2015. Nonetheless, he will remain in his position until a new election takes place either later this year or early next, meaning the reign of Sepp Blatter at FIFA is not over just yet.

A History of Bribery, Corruption, and Kickbacks

While allegations of corruption and bribery have long haunted Blatter and, by extension, FIFA, they have had little or no effect on the all-important bottom line. In the last four years alone, FIFA has generated $6 billion in revenue; how that money is used, however, has come under growing scrutiny. Although much of it was earmarked for soccer development worldwide, it was instead used to further FIFA leaders' own ambitions.

Acting on the longstanding rumor and speculation concerning FIFA's backroom dealings, the U.S. Justice Department indicted nine of the organization's leaders in a bribery scheme amounting to $150 million. The arrests were part of a larger joint raid conducted with Swiss authorities that also saw five corporate executives arrested and charged with racketeering, conspiracy, and corruption. The British are also considering filing their own charges. The video below explains the FIFA scandal and arrests in detail.


Picking a World Cup

The World Cup is easily the most popular sporting event in the world. In 2010, for example, 200 million people tuned in for the draw, or group selection process–not even an actual game. For comparison's sake, the number of people who watched the Super Bowl in 2015, a record for the event, maxed out at approximately 121 million.

How the Process Works

Until 2002, every World Cup was played in Europe, North America, or South America. That finally changed when Japan and South Korea co-hosted the event, which also led to a major change in how the host country is selected. In 2006, FIFA instituted a system in which the tournament would rotate among its six regional confederations.

While this system was scrapped in 2007, a similar rule was put into place that same year: all countries in a particular regional confederation are ineligible to host either of the two World Cups that follow one hosted by a member of their confederation. In other words, if the U.S. hosted the 2018 World Cup, other countries in its region, such as Mexico, would not be eligible to host again until 2030 at the earliest.

The voting process itself is the responsibility of the executive committee, which is made up of 24 people. These include the president and vice president of FIFA, as well as seven other vice presidents–six representing the continental soccer confederations and one from the home nations of the United Kingdom. To clarify, there are actually only six continental confederations (Antarctica is left out in the cold), thus the need for the seventh. Lastly, there are 15 members elected from the 209 member countries, each serving a four-year term.

These members decide who gets the right to host the World Cup. Each country interested in hosting gives a televised presentation before the committee. Once all the prospective hosts have presented their cases, the executive committee votes by secret ballot, in successive rounds, until a winner is declared. In the case of a tie, it is up to the president of FIFA to cast the deciding vote.
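For readers who want the mechanics made concrete, here is a toy sketch in Python of such a repeated secret ballot. It assumes the commonly reported elimination rule, in which the bid with the fewest votes is dropped after each round; the committee size, bid names, and preferences are all invented for illustration and are not drawn from FIFA's actual rules.

    from collections import Counter

    def exhaustive_ballot(preferences):
        # preferences: one ranked list of bids per committee member,
        # most preferred first. Bids with zero votes never appear in a
        # tally, which is fine for this toy example.
        remaining = set(preferences[0])
        while len(remaining) > 1:
            # Each member votes for their highest-ranked bid still in the race.
            tally = Counter(next(b for b in prefs if b in remaining)
                            for prefs in preferences)
            winner, top_votes = tally.most_common(1)[0]
            if top_votes > len(preferences) // 2:
                return winner                            # outright majority wins
            remaining.remove(min(tally, key=tally.get))  # drop the weakest bid
        return remaining.pop()  # in a real tie, FIFA's president decides

    # A deliberately lopsided 24-member committee: everyone ranks "Qatar" first.
    members = 24 * [["Qatar", "USA", "Australia"]]
    print(exhaustive_ballot(members))  # -> Qatar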

Corruption at Every Turn

As can be expected from a process of this nature, corruption is rampant. Of the many accusations, members selling votes is the most common. In the most recent World Cup bid process, actual evidence of this phenomenon emerged when two undercover British journalists were approached and offered votes in exchange for bribes. The prevalence of corruption, however, should not be a surprise; the way FIFA is constructed practically lends itself to it.

While not every country votes on who will host the World Cup, each has a say in another important way: every member votes for the organization's president. This system can encourage small countries, which are more dependent on FIFA stipends, to sell their votes in exchange for more support, because the amount of support each country receives has nothing to do with its size. Thus, a massive country like China can receive less money from FIFA than a small country such as Bermuda.

In addition, aside from money, small countries can expect other benefits for supporting certain people or countries' bids. These can come in the form of recognition: along with a poorly defined system for allocating funds, FIFA also has an unclear definition of what makes a nation. For example, Gibraltar–a small territory governed by the U.K. but claimed by Spain–nearly won recognition as its own footballing nation despite having a population of only 29,000. The following video highlights the most recent FIFA presidential election.

Trouble With the Machine

The controversial decisions to award Russia the World Cup in 2018 and Qatar the event in 2022 are hardly the first incidents involving the selection of a host country. In 2002, when Japan and South Korea co-hosted the event, problems with the travel required between venues led the organizers never again to hold a multi-country event.

The controversy only ratcheted up for the next World Cup, in 2006, when allegations of bribery surfaced after Germany won an upset bid for the tournament over the supposedly favored South Africa. Details have recently emerged of what this bribery specifically entailed; in this case, it far exceeded the norm. Germany is alleged to have temporarily lifted an arms embargo on Saudi Arabia and shipped the country weapons in exchange for its vote. It is also accused of using the lure of investment from German companies, such as Volkswagen, to get Thailand and South Korea to support its candidacy as well.

Controversy continued when the tournament moved to Africa, with South Africa finally succeeding in its bid to host the 2010 World Cup. According to a recent report, Morocco actually received more votes, but through a series of bribes South Africa was declared the winner. At the center of this scandal was former FIFA Vice President Jack Warner, who reportedly took bribes from both countries for the votes he controlled; he may also have taken money from Egypt, which was bidding for the tournament that year as well.

Like a perpetual storm cloud, problems followed the World Cup when it arrived in the soccer mecca of Brazil. The issues extended far beyond bribery and affected society as a whole. Just a few of the major problems included the forced eviction of thousands of poor residents, social unrest, police brutality, unfinished infrastructure projects, unused stadiums, worker deaths, and lasting social inequality that was actually exacerbated by the tournament.

Russia and Qatar

All these issues bring us back around to the next two proposed hosts for the World Cup: Russia and Qatar. Russia was awarded the tournament despite continued human rights abuses as well as its flagrant invasions of Ukraine and Georgia. Additionally, like Brazil before it, while Russia has agreed to host the lavish tournament, people at home will feel the cost. Russia plans to spend at least $20 billion–a new record–despite the ruble losing half its value in the last year and U.S.-led sanctions taking their toll on the Russian economy.

Then there is Qatar, whose selection to hold the 2022 tournament was so preposterous that it played a huge role in authorities finally stepping in to clean up FIFA's corruption. Qatar plans to spend $220 billion on the tournament, a sum that makes the record-breaking Russian figure look minuscule. In an effort to avoid average temperatures of 106 degrees, the World Cup in Qatar will be moved to winter. On a human level, most of the work is being done by migrants laboring in slave-like conditions and dying in droves. This does not even take into account the country's laws against things such as drinking alcohol or homosexuality. The following video explains many of the problems surrounding the World Cup in Qatar.

Against this backdrop, and with the scandal still simmering, it comes as little surprise that bidding for the 2026 tournament has been put on hold. Additionally, despite FIFA's insistence that there is no legal ground on which to strip Russia and Qatar of their hosting duties for 2018 and 2022, many are eager to explore that option as well.


Conclusion

The FIFA scandal far exceeds the traditional borders of sport. The organization is so powerful that it has the ability, directly or indirectly, to boost an unpopular leader and even legitimize states. It also enjoys sponsorships from some of the world's most powerful corporations and governs the most popular sport globally. With this in mind, the recent arrests of FIFA's top leaders were surprising only in that they actually happened. These men and this organization had been essentially untouchable for decades.

Thus, while U.S. and Swiss authorities indict leaders and promise further action, it is hard to believe any of it will stick. Even the resignation of Sepp Blatter, who enjoyed the ardent support of Vladimir Putin, comes with a caveat: Blatter was elected in a landslide right before his resignation and was allowed to leave on his own terms instead of in handcuffs, as many feel should have been the case.

While its leaders fall like dominoes, FIFA will likely survive this scandal as it survived two world wars, membership disputes, and a host of other problems along the way. The real question in the wake of this scandal is whether any of these arrests, indictments, or resignations will make this seminal organization less corrupt and more honest. Based on the system in place and its recent elections, the answer looks like no.


Resources

Top End Sports: Host Country Selection

MLS Soccer: What is FIFA, Who is Sepp Blatter, and What is All the Fuss About?

Goal: World Cup Bidding Process Explained

FIFA: History of FIFA

Time: These Are the Five Facts That Explain the FIFA Scandal

FiveThirtyEight: How FIFA’s Structure Lends Itself to Corruption

Reuters: Germany Sold Arms to Saudi Arabia to Secure Its Vote for 2006 World Cup

Sports Illustrated: Morocco Beat South Africa in Vote For 2010 World Cup

World.Mic: Seven Big Problems the World Cup Left Behind in Brazil

LA Times: So Many Things Wrong With Qatar World Cup 2022

CNN: FIFA to Suspend Bidding For 2026 World Cup Amid Corruption Scandal

BBC: Vladimir Putin Expresses Support for Blatter

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post FIFA Scandal is No Surprise if You’ve Been Paying Attention appeared first on Law Street.

Expiration of Patriot Act Reignites Security v. Privacy Debate in America https://legacy.lawstreetmedia.com/issues/law-and-politics/expiration-patriot-act-reignites-security-v-privacy-debate-america/ https://legacy.lawstreetmedia.com/issues/law-and-politics/expiration-patriot-act-reignites-security-v-privacy-debate-america/#respond Sat, 06 Jun 2015 19:29:31 +0000 http://lawstreetmedia.wpengine.com/?p=42396

The Patriot Act expired but a near-identical bill passed. How do Americans feel?

The post Expiration of Patriot Act Reignites Security v. Privacy Debate in America appeared first on Law Street.


Portions of a law known as the Patriot Act were allowed to expire on May 31, 2015. The Patriot Act is one of the most controversial laws in U.S. history, originating in a time of fear and later sitting at the heart of leaks by Edward Snowden that revealed a massive NSA effort to gather Americans' information. Read on to learn more about what exactly the Patriot Act is, where it came from, and the outlook for its provisions.


The Patriot Act

The Patriot Act is one of the most divisive laws passed in recent history; however, like many other boogeymen, the actual details of what it entails are unclear to much of the American public. So what exactly is the Patriot Act?

What is the Patriot Act?

The USA Patriot Act, or the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act, was passed in October 2001. The act dramatically expanded the ability of the United States government to conduct surveillance and investigate citizens without their knowledge.

Unlike similar preceding pieces of legislation, this act lacked the familiar protections that preserve civil liberties, and its statutes were hard to define and limit. This was due to the speed at which the law passed through Congress and was signed by President George W. Bush. The bill was passed quickly amid widespread fearmongering immediately following the 9/11 terrorist attack on U.S. soil, including on the part of Attorney General John Ashcroft, who warned that any delay could result in another devastating attack. Watch the video below for more details on the Patriot Act.

Illegality

In the first legal challenge to the act, despite its being in place since 2001, a three-judge panel ruled the NSA's bulk data collection under the law illegal. The panel, however, did not say the law itself was unconstitutional; instead, it found that the federal government's mass data collection had gone beyond what the act's creators envisioned when they signed it into law. This is an important distinction: it effectively says that lawmakers have the power to create such an all-encompassing law, but that the Patriot Act was no such law.

Expired Provisions

While the recent ruling may have influenced the decision to let parts of the law expire, the Patriot Act was actually created with built-in sunset provisions designed to lapse unless extended by Congress. Thus, after much deliberation, key components of the act were allowed to expire. One such component was the so-called Lone Wolf provision, which allowed the U.S. intelligence community to monitor individuals even if they had no known terrorist affiliation. This clause was supposedly never used, and it applied only to non-citizens.

Another major provision allowed to expire was the roving wiretap. As the name implies, it allowed investigators to maintain taps on any of a person's devices, not just a single phone.

Probably the most well-known provision allowed to expire was Section 215. This section was the grounds the NSA used to collect data on a large number of Americans without their express permission, even when they were not suspected of terrorism or any other crime. It had also been used by agencies such as the CIA and FBI to track the financial records of suspected terrorists and criminals. The video below highlights which provisions of the Patriot Act expired and what that means, specifically in relation to Section 215.


Its Future and Its Successors

While these unpopular parts of the Patriot Act were allowed to expire, a similar successor was quickly passed. Known as the Freedom Act, this new law allows for greater transparency and puts the onus for compiling phone records on the companies themselves. The Freedom Act also requires disclosure of how often data collection is requested and allows more opinions from the secretive Foreign Intelligence Surveillance Court to be made public.

In the aftermath of the expiration of parts of the Patriot Act and the passage of the Freedom Act, opinions quickly poured in. While supporters of the Patriot Act claim its expiration has degraded U.S. security, many others view the two bills as essentially the same. For this latter group, the new Freedom Act does little more than privatize the collection of people's data while offering only the vaguest gestures at greater transparency. Under the Patriot Act, the NSA compiled the data; now the telecom companies themselves will store it, and whenever the NSA or FBI wants to use it, they will need to get a warrant from the Foreign Intelligence Surveillance Court.

Aside from changing who collects the data, the new law really does not do much. The new collection method may actually cost more, due both to private-sector inefficiencies and to the money the government will pay the companies for their efforts. It also protects these same companies, such as AT&T, from lawsuits. This means that, regardless of opinion, Americans may now be paying more money to spy on themselves. The video below explains the specifics of the new Freedom Act, even suggesting that it might lead to more widespread surveillance.

When the Freedom Act successfully passed through Congress, despite repeated efforts by Senate Majority Leader Mitch McConnell to block it, President Barack Obama immediately signed the legislation. While he and others in the government and business community lauded the new act and its potential for greater oversight and transparency, whether that potential is realized remains to be seen.


The Origins of the Act

Most people associate the Patriot Act with the events of 9/11. In actuality, many of the ideas contained in the act had been debated for years but had never won the necessary support. The Patriot Act was the result of a compromise over another proposed bill known as the Anti-terrorism Act. Nevertheless, while the events of 9/11 did not spawn the ideas behind the Patriot Act, they did serve as the catalyst that convinced lawmakers a law of this type was at last needed to prevent further attacks.

However, how lawmakers came to this decision and how the bill was passed have only added fuel to the fire for those who find the act controversial and even illegal. The law was originally introduced to Congress by Ashcroft, who gave legislators a week to pass it or risk the consequences of another attack. Members of both houses attempted to make changes to the law, but most were scrapped in order to meet the deadline. Certainly no one wanted to be responsible for another terrorist attack against the United States due to idleness.

While the law itself has generated controversy, extending its provisions has also led to extended debates. In 2009, when it was first up for review, certain provisions were set to expire, which led to a lengthy debate and even a delayed vote. In the end, though, President Obama reauthorized the act in 2010 for one more year.

The president had another opportunity the following year, in 2011, to refuse to reauthorize the act or at least to add amendments. One such amendment, suggested by Senator Patrick Leahy (D-VT), called for government oversight and transparency in how the act was used by the FBI. Leahy had also led the charge for more oversight measures in the original act. Despite these attempts, the amendment was ignored once more and President Obama reauthorized the act yet again.


Public Sentiment

An act this controversial and requiring so much support would seem a likely candidate for the legislative trash heap; however, even following the disclosures made by Edward Snowden about the NSA monitoring civilians' phones, the law is still far from unpopular. In fact, the opposite is true. In light of the act's expiration, CNN polled people across party lines to gather their opinions. According to that poll, 61 percent of respondents felt the law should have been renewed.

Additionally, while lawmakers in Washington did not agree that the Patriot Act should be renewed as originally constructed, they did agree that something similar was still needed to support America's anti-terrorism efforts. In an odd coupling, Democrats and Tea Party Republicans united to defeat the expiring Patriot Act and then pass its successor, the Freedom Act. While the Freedom Act passed overwhelmingly, small groups on both sides held out. On one side were old-guard Republicans, such as McConnell and Senator John McCain (R-Ariz.), who felt the Patriot Act should have been renewed as it was. Conversely, some legislators, such as Senator Rand Paul (R-Ky.), wanted it scrapped altogether; Paul and his like-minded supporters viewed the whole program as an example of government overreach.

The public and Congress therefore still view the Patriot Act and its successor as necessary and vital to national security, even after the Snowden revelations showed that such security comes at a cost to everyone's privacy.


Conclusion

The Patriot Act is an extremely controversial law, passed during a time of public terror in the wake of the greatest attack on the United States in the nation’s history. The law itself gave the American intelligence community widespread powers to spy on and investigate its own citizens, without discretion and often without reason.

After much public outcry, the most contested parts of the law were allowed to die off; however, its successor, the Freedom Act, grants nearly the same all-encompassing powers to the intelligence community while merely shifting the effort of compiling data onto communications companies. All this despite revelations that data compiled through the Patriot Act did not aid in any major terrorism investigation.



Sources

Primary

Electronic Privacy Information Center: USA Patriot Act

Additional

USA Today: Here’s what happens now that the Patriot Act provisions expired

Reuters: USA Freedom Act vs. Expired Patriot Act Provisions: How Do the Spy Laws Differ?

Daily Tech: Despite Support From Senator Sanders, Senator Paul Loses USA Freedom Act Fight

Politifact: Revise the Patriot Act to Increase Oversight on Government Surveillance

CNN: Six in Ten Back Renewal of NSA Data Collection

Law Street Media: NSA’s Surveillance of Americans’ Phone Conversations Ruled Illegal

NPR: NSA’s Bulk Collection of American’s Phone Data is illegal, appeals court rules

Business Insider: Obama’s Signature on the Freedom Act Reverses Security Policy That Has Been in Place Since 9/11

CNN: NSA Surveillance Bill Passes After Weeks-Long Showdown

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Expiration of Patriot Act Reignites Security v. Privacy Debate in America appeared first on Law Street.

FARC: Preventing Peace in Colombia? https://legacy.lawstreetmedia.com/issues/world/within-grasp-peace-colombia-remains-elusive/ https://legacy.lawstreetmedia.com/issues/world/within-grasp-peace-colombia-remains-elusive/#respond Sun, 31 May 2015 14:22:18 +0000 http://lawstreetmedia.wpengine.com/?p=41830

Is there a way to combat the infamous terrorist group?

The post FARC: Preventing Peace in Colombia? appeared first on Law Street.

Image courtesy of [Camilo Rueda López via Flickr]


In Colombia, a violent conflict has been raging for more than 50 years, pitting the nation's government against a rebel–and arguably terrorist–group known as FARC. With the struggle surpassing the half-century mark, both sides have been willing to return to the negotiating table to give peace yet another chance. However, the process is once again under threat of unraveling due to a familiar cycle of FARC ambushes and government reprisals. Read on to learn about the conflict's history, previous efforts at peace, the other groups and issues in play, and whether this round of negotiations is likely to actually bring peace to the embattled nation.


History of the Conflict

The earliest origins of the Revolutionary Armed Forces of Colombia (FARC) can be traced all the way back to the late 1940s and 1950s. During this time, the two major political parties in Colombia–the Liberals and the Conservatives–were locked in a brutal civil war for control of the country, a conflict that left more than 300,000 people dead. Following the end of the war, the Conservatives took control of the government and barred many of their Liberal opponents from participating, in effect marginalizing them. Several groups emerged in reaction. One of these was led by Manuel Marulanda, who founded FARC's precursor, the Marquetalia Republic, and later became FARC's first leader as well. This group began arming and organizing in the mountains.

In 1964, the government launched an attack on this organization, quickly routing it. However, Marulanda escaped along with several followers. Before fleeing, he and other group leaders agreed to the Agrarian Program of the Guerrillas, essentially FARC's constitution, which created roles within the group and a common strategy. This was followed up every two to four years by congresses at which the group's members would meet to discuss new policies.

A seminal moment came at one such conference in 1982, when members resolved to solidify their control of the mountainous regions and to begin expanding their influence into the cities, with the ultimate goal of taking the capital. In the ensuing years, particularly from the 1990s to the early 2000s, the group carried out several high-profile and ostentatious attacks on police, soldiers, villagers, rival groups, and even a presidential inauguration.

Ideology and Perception

FARC was founded on a Marxist-Leninist ideology. The group claims to represent the lower class of the country, namely the poor and farmers, who it feels are being oppressed by the government. It also opposes the opening up of the country to foreign interests, particularly American corporations, which it views as imperialistic.

However, while FARC espouses a high-minded ideology, its actions are less than noble. In fact, the group has been designated a terrorist organization by a number of countries, including the U.S. This is due in part to the group's tactics, which run the whole violent gamut from murder to bombings. From its base in Colombia, and with a presence in Ecuador, Panama, Brazil, Venezuela, and Peru, the group usually launches attacks within its own territory, along with outside raids to obtain supplies. These attacks generally target military personnel and foreigners; however, a large number of civilians have been caught in the crossfire.

FARC’s Power Base

While its commitment to Marxism-Leninism is dubious, so too are the ways the group generates its funding. Much of FARC's wealth has come from practices such as kidnapping, extortion, and cocaine trafficking. In fact, estimates of the amount of money FARC has earned from the cocaine trade range from $220 million to $3.5 billion. The group has also recently added the ignominious business of illegal gold mining to this grim list.

Additionally, the group has allegedly received support from like-minded and sympathetic governments in Venezuela and Ecuador. According to documents the Colombian military claims to have seized in a 2008 raid against FARC in Ecuador, then-Venezuelan President Hugo Chavez had given the group $300 million. While Chavez denied the allegations, these countries have been open to FARC in response to U.S. support for the Colombian government.


So Close Yet So Far Away

The current peace negotiations between FARC and the Colombian government are not the first attempts at a détente–far from it. Attempts to broker peace between the two sides have gone on almost as long as the fighting. The first effort came in 1984 when, as part of the Uribe Accords, FARC agreed to stop the kidnappings and the government pushed the group to channel its energy into legitimate political movements. Although the ceasefire ultimately did not hold and the group kept fighting, it did lead to the Patriotic Union and the Colombian Communist Party–legitimate political parties associated with FARC, similar to Hamas in Palestine or Sinn Fein in Ireland.

These talks failed, however, because FARC's political candidates, though successful, were repeatedly murdered by right-wing paramilitary groups. Additionally, no ceasefire demands were ever made of the government.

The next major effort came in 1998, when then-President Andres Pastrana once again agreed to provide FARC with a demilitarized zone from which to operate. The move backfired, as FARC capitalized on the government's temporary weakness to regroup militarily, launch attacks, grow coca, and kidnap officials. Nevertheless, this lack of commitment on FARC's part may have ushered in its own decline, as it provoked a backlash in which the citizenry called for a tougher stand against the rebels–one carried out by then-incoming President Álvaro Uribe.

The latest round of peace negotiations began secretly in Cuba. When preliminary agreements were reached, another round was proposed for Norway, with final negotiations returning to Cuba. As part of this process, several reforms were passed aimed at compensating victims and pardoning FARC members. The discussions themselves centered on six points: rural development and land policy; the political participation of FARC; ending the conflict and reintegrating FARC members into civilian life; ending the cultivation of illicit crops and drug trafficking; reparations for victims; and, lastly, the implementation of these agreed-upon measures once the negotiations had concluded.

So far, tentative agreements have been reached on three of the issues, although nothing is final until the whole package has been implemented. These three points were land redistribution, a role in civil society for demobilized FARC members, and an end to the illicit growing of crops used for the drug trade. The negotiations were temporarily put on hold when FARC kidnapped three high-ranking military officers; they were eventually released and talks resumed.

The discussions have once more come under pressure, with renewed attacks on soldiers as well as a raid that killed one of the FARC negotiators. The government has also lifted its ceasefire, and airstrikes against rebel positions have begun once more. Despite this pressure, though, talks have continued in Cuba.



Other Players in Colombia

Along with FARC, there are several other groups at work in Colombia. They can be divided into left-wing groups, of which FARC is one, and right-wing groups, which generally oppose FARC at every level.

Leftists

While FARC may be the largest and most infamous group in Colombia, it is far from the only one. Another leftist group in the country is the ELN, or National Liberation Army. The ELN formed at the same time as FARC; however, its membership was composed chiefly of students, Catholics, and intellectuals focused on replicating a Castro-style revolution. While the groups have similar beginnings and ideologies, the ELN operates more in urban areas, as opposed to the rural areas FARC dominates. Despite these similarities, the groups have clashed directly. The ELN, like FARC, is listed as a terrorist organization by the U.S. State Department.

Both of these groups have operated under the larger Simon Bolivar Guerrilla Coordinating Group, an umbrella organization for left-wing groups. Along with FARC and the ELN, M-19 and the EPL also worked under this designation. M-19, or the April 19th Movement, was FARC's attempt at an urban organization; however, it ultimately broke away and became independent. The EPL was another aggressive communist group, one eventually weakened by FARC attacks.

Leftist groups such as these both help and hurt FARC's position. By existing and making similar demands, they show the government the reality of problems such as poverty, which can be more easily dismissed when raised by only one group with a controversial past. On the other hand, they can be harmful by splintering FARC's potential support base and directly undermining the group when internal conflicts between them create more violence.

Right-Wing Paramilitaries

While leftist groups formed in reaction to the conservative government, right-wing groups formed in response to organizations such as FARC. These started out small in the 1960s as local self-defense groups authorized by Colombia's Congress.

Eventually they consolidated into the AUC, or United Self-Defense Forces of Colombia. This was essentially a holding-company paramilitary group, created and funded by rich farmers and narcotics traffickers to protect themselves and their interests from FARC and like-minded organizations. The group was very strong, with membership ranging from 8,000 to 20,000. Additionally, while connections between these groups and the government were never admitted, it has been widely speculated that the administration often looked the other way or even funded their operations. Although the AUC was disbanded in 2006 and its leaders pardoned, many of its former members are suspected of continuing to operate in the drug trade and other criminal enterprises. The group was also considered a terrorist organization by the U.S. government until 2014.

These right-wing paramilitary groups have a dual effect on FARC. On one hand, the violence they perpetrate against the group, often at the behest of the government or powerful individuals, can further justify FARC's cause. On the other, they are actively trying to destroy FARC and have seen a certain degree of success. In either case, they have ratcheted up the violence and created a culture of fear and mistrust. They also make FARC less likely to come to the peace table, because they are seen as the secret hand of the government.


Economics and Legacy

Decline of FARC

While securing peace with FARC is still an important goal, its importance has diminished over time. This is partly due to economics: in 2012, a free trade agreement between Colombia and the U.S. went into effect, bringing the allies that much closer. Additionally, Colombia's economy has continued to grow despite the fighting, averaging nearly 6 percent growth a year. In individual terms, this has meant average income per person rising from $5,800 in 2000 to $10,000 in 2011.

The effect of this is twofold. For a government weary of fighting and eager to shine on the global stage, defeating or at least containing FARC would allow it to focus more on improving its economy and the well-being of its citizens. Additionally, an improved standard of living would seem to undermine FARC's very reason for existing, as the group was supposedly founded to protect and stand up for underrepresented groups, namely the peasantry.

FARC itself also appears to be declining. In 2001, the group was believed to have as many as 16,000 members; that number has since dwindled to as few as 6,000 to 8,000. This has been the result of an intensified campaign by former President Uribe, whose father was murdered by FARC in a kidnapping attempt. Aside from decreased membership, FARC's leadership has also been hit hard: its founder died of a heart attack in 2008, one of his top commanders was killed in the raid in Ecuador, and other leaders have been killed as well. Desertion has become a problem too, as some fighters who were attracted by noble ideals have become jaded by the drug trafficking and perpetual violence.

Legacy

So what legacy does FARC leave behind? In terms of numbers, over 220,000 people have been killed in the fighting between the group and the government since its inception. Additionally, much of the popular support FARC once enjoyed has eroded, as people have grown exhausted with the conflict and simply want a better life. Most of the territory once held by FARC has also been lost to the government's intensified military efforts. Thus, FARC's strength and importance have been greatly reduced. Still, an agreement between the group and the government would be a major step in rebuilding the war-torn nation.


Conclusion

The most recent round of talks between Colombia and FARC offers a glimmer of hope. But that hope can be realized only if both sides stop committing the same perpetual violent acts that spawned this conflict in the first place. If the last few months' actions are any indication, this is not going to happen.

This presents a challenge to both sides. FARC is a diminished organization that faces enemies on all sides and has few friends. The government, meanwhile, clearly wants to capitalize on economic growth and turn the page on its history of drug violence and terrorist insurgencies. Both of these goals can be accomplished, but the two sides have to come to terms and end a destructive conflict that has lasted for more than 50 years.


Resources

Primary

Congressional Research Service: Peace Talks in Colombia

Additional

Stanford University: Revolutionary Armed Forces of Colombia-People’s Army

Council on Foreign Relations: FARC, ELN: Colombia’s Left Wing Guerillas

BBC News: Colombian FARC Negotiator Killed in Bombing Raid

Institute of the Americas: Colombia Pushes Back Cartels, Terrorists to Become Economic Powerhouse

Al Jazeera: Profile: The FARC

Ploughshares: Colombia

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post FARC: Preventing Peace in Colombia? appeared first on Law Street.

Fatal Amtrak Derailment Casts Light on a Forgotten American Industry https://legacy.lawstreetmedia.com/issues/business-and-economics/amtrak-derailment-casts-light-forgotten-industry/ https://legacy.lawstreetmedia.com/issues/business-and-economics/amtrak-derailment-casts-light-forgotten-industry/#respond Fri, 22 May 2015 20:37:13 +0000 http://lawstreetmedia.wpengine.com/?p=40238

Is it still safe to travel by train in the United States?

The post Fatal Amtrak Derailment Casts Light on a Forgotten American Industry appeared first on Law Street.

Image courtesy of [John H. Gray via Flickr]

Recently, an Amtrak train traveling the busy Northeast Corridor route near Philadelphia derailed at high speed, killing eight people and injuring more than 200. While experts weigh in on the speed of the train, the state of the engineer, and whether the locomotive was struck by a foreign object, many other people are now concerned about a different matter: the safety of trains in the United States. Read on to learn about the development of the train industry in the U.S., the rules and regulations that trains must follow, and considerations moving forward in light of the recent, horrific Amtrak crash.


Locomotives: The American Backstory

The first charter for a railroad in North America was granted to John Stevens in 1815. The same man also tested the first steam locomotive in the United States roughly a decade later. A railroad boom began in 1840, stemming from the Northeast. However, this initial expansion was beset by unregulated practices and differing track gauges, which kept the lines from unifying, while individual owners of regional charters fought over territory. This chaos led to dangerous conditions for passengers and cargo traveling by rail.

As track mileage continued to expand rapidly, the rail industry achieved one of its greatest moments with the completion of the transcontinental railway at Promontory Summit, Utah, in 1869. Starting in the 1880s and continuing through the 1920s, rail companies enjoyed their greatest success. This was due in part to owners finally agreeing to standardize track gauges and to the development of a number of safety features that also improved efficiency. Ultimately, 1916 served as the peak year, with rail mileage in the United States reaching an all-time high of 254,000 miles.

This expansion came to a halt in the 1930s, however, with the rise of the automobile, and the industry continued to stagnate throughout the 1940s and 50s following WWII. In the 1960s, train companies began merging or going bankrupt as passenger and freight numbers dwindled. In 1971, Amtrak, a government-supported system dealing primarily with passenger traffic, was created. Even with government support, however, the nationwide train network would likely have collapsed without a move toward deregulation in 1980, which allowed the remaining companies to negotiate better rates and drop unprofitable routes.

This renaissance has continued into the present, as companies have merged into larger and larger entities. Freight has also returned to rail in large numbers, so much so that it is actually in danger of overwhelming the current system. Passenger travel has also increased, as people seek to avoid traffic and relax during commutes. The following video gives an overview of the history and development of railroads in the U.S.:


Trains By the Numbers

People

In the 2014 fiscal year, 30.9 million people rode Amtrak trains. During that same year, 11.6 million passengers rode the Northeast Corridor route, where the recent accident occurred–a 3.3 percent increase from the year before. In fact, this route is so popular that it accounts for 77 percent of combined rail and air travel between Washington, D.C., and New York. These numbers would likely be even higher, except that Amtrak suffers from outdated infrastructure and has its efficiency hampered by freight trains using the same rails.

Freight

So exactly how much freight do these trains move each year in the U.S.? In 2010, the last year with complete data, approximately 1.7 billion tons of freight were transported by rail. This accounted for 16 percent of the total freight shipped within the U.S. and was worth $427 billion. The industry is dominated by seven major carriers that employ 175,000 people. These numbers are expected to grow in the future, but growth is currently stagnant due to old infrastructure and insufficient investment.

Incidents

With all these people and goods being moved by rail, the next question is: how likely are accidents like the one outside of Philadelphia? The answer is extremely unlikely. In fact, a person is 17 times more likely to be involved in a car accident than a train accident. While some of this can be explained by the obvious fact that people travel in cars more than in trains, the accident rate itself is also lower: there are just 0.43 accidents per billion passenger miles for train travel, versus 7.3 for cars.
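That 17-times figure follows directly from the two per-mile rates. As a quick back-of-the-envelope check (an illustrative calculation, not a number from the article's sources): 7.3 / 0.43 ≈ 17.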

In addition, when rail accidents do occur, they usually do not involve passengers, as most rail traffic is freight. Thus, while 1,241 derailments occurred last year, there were few injuries. Furthermore, while the number of derailments may seem fairly high, it is less than half the number seen just thirty years ago. Most of the derailments that do still occur are a result of track conditions like the ones being blamed in the recent high-profile crash. Experts worry that these are a result of underfunding, especially when it comes to Amtrak. This is the case even with ridership growing on the Northeast Corridor route, because Amtrak must spread its revenue across all its routes, many of which do not make a profit.


Rules and Regulations

Benefits of Deregulation

As touched on earlier, the railroad industry was deregulated in the 1980s. The industry wasn't doing well and needed to become more flexible in order to survive. Thanks to two separate acts passed by Congress, in 1976 and 1980 respectively, a collapse was avoided. Both acts gave railroad companies greater flexibility to negotiate rates, change routes, and merge to stave off insolvency. While fears of monopolies grew, the acts were also designed to lower the cost of entry into rail travel and transport, which was supposed to prevent any one company from dominating the industry. Since these acts went into effect, the rail industry has enjoyed a strong comeback. Additionally, deregulating the industry may have improved infrastructure as well, as the large companies that have come to dominate rail traffic can afford to reinvest in improving safety and the technology that guides their trains.

Thus, while the technology and knowledge exist to improve safety and prevent accidents like the one in Pennsylvania, everything ultimately comes back to money. In 2008, Congress passed a bill requiring trains to implement a system known as positive train control, which uses a number of technologies to sense how fast a train is moving and slow it down if necessary. However, while this system was in place on the tracks going in the opposite direction, it was not yet installed on the northbound tracks toward Philadelphia. The accompanying video explains some of the safety measures in place, particularly positive train control:

There are other measures in place to alert the engineer and slow down the train as well. If the engineer does not alter the train's trajectory in any way for a certain amount of time, bells go off in the cab as an alert. The train is also supposed to slow itself down, but it was unclear whether these safeguards were triggered before the crash.
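For readers who think in code, the basic logic of these two safeguards is easy to sketch. The snippet below is a conceptual illustration only, not actual positive train control software; the function, thresholds, and units are all invented for the example.

    # Conceptual sketch of the safeguards described above: an alerter that
    # sounds if the engineer makes no control input for too long, and an
    # overspeed check that applies the brakes automatically. Real systems
    # are far more complex; every name and number here is invented.

    ALERTER_TIMEOUT_S = 25  # seconds of inactivity before the alarm sounds
    BRAKE_GRACE_S = 10      # seconds after the alarm before automatic braking

    def check_safeguards(speed_mph, speed_limit_mph, seconds_since_input):
        """Return the action the on-board system should take this cycle."""
        if speed_mph > speed_limit_mph:
            # Overspeed: slow the train without waiting for the engineer.
            return "apply_brakes"
        if seconds_since_input > ALERTER_TIMEOUT_S + BRAKE_GRACE_S:
            # The alarm went unanswered: stop the train.
            return "apply_brakes"
        if seconds_since_input > ALERTER_TIMEOUT_S:
            # No recent control input: sound the cabin alarm first.
            return "sound_alarm"
        return "none"

    # A train doing 100 mph in a 50 mph zone triggers braking at once.
    print(check_safeguards(100, 50, seconds_since_input=5))  # -> apply_brakes
    print(check_safeguards(45, 50, seconds_since_input=30))  # -> sound_alarm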

Money Troubles

However, in its 2008 legislation, Congress required Amtrak to bid for the communications channels needed to send and receive the system's signals. For an already cash-strapped system that was also facing major budget cuts, this was a deadly requirement. Implementing the technology costs $52,000 per mile, and it must work across a variety of trains that use different equipment. Nonetheless, despite all these challenges, Amtrak was actually one of the leaders and was one of the few on pace to complete the required installation by 2018.

Following the crash, however, Congress rejected a bill backed by President Obama that called for $1 billion in additional funding for Amtrak. This funding is clearly needed, not just because of the crash but also because of how Amtrak compares to foreign train systems. In the U.K., for example, this type of braking technology has been in place for nearly ten years; the same is true in other more train-centric countries such as France and Japan.

Shipping Oil

Along with recent concerns over rail safety in general, there are longstanding worries over trains carrying oil. Due to the nation's energy boom, trains are increasingly being relied upon to transport oil. In 2012, for example, trains shipped more than 40 times the amount of oil they did just four years earlier, an amount that doubled again in 2013. The video below documents the rise and danger of shipping oil by rail:

This increased traffic has also led to an increase in the number of accidents. In 2014, there were 141 incidents termed "unintentional releases" of oil. The year before, while there were fewer individual incidents, even more oil was spilled–about 1.4 million gallons. For some perspective, that amount was more than all the oil that had ever been spilled in train transport up to that point. These spills and accidents can lead to massive explosions, deaths, and contaminated ecosystems. The growing threat is so troubling that some people are calling for more pipelines to be built, rehashing the Keystone Pipeline debate.

Thus, while all trains face tighter rules and regulations, those carrying oil and gas face the most stringent changes. Under new rules outlined at the beginning of May, more durable tank cars are now required for transporting fuel in the event of a crash, and trains are required to go no faster than 30 miles an hour unless they have electronic brakes. This action was part of a joint announcement with Canada, and it came in the wake of a number of crashes involving fuel shipments, including four this year alone. It has also rekindled the argument over lack of funding and overly tight timelines.


Conclusion

The crash of an Amtrak train along the nation's busiest passenger rail route has raised fears over train safety. These fears are compounded by a rail industry facing budget cuts and relying on outdated technology. Still, the rates of crashes and derailments remain low, especially relative to other modes of transportation such as cars. Nevertheless, more investment and infrastructure improvements must be made in order to prevent accidents like the one outside of Philadelphia from repeating themselves. Greater efforts must also continue to be made to regulate the use of trains to move massive quantities of oil, which has proven very dangerous. The rail system is the unsung and often forgotten cog of the American transportation system, but it only takes one accident to put concerns over its safety back on America's mind.


Resources

American-Rails: Railroad History: An Overview of the Past

The New York Times: Amtrak Says Shortfalls and Rules Delayed its Safety System

Washington Post: Trains Are Carrying, and Spilling, a Record Amount of Oil

CNN: Amtrak Installs Speed Controls at Fatal Crash Site

Amtrak: News Release

Guardian: Amtrak’s Northeast Corridor

Center for American Progress: Getting America’s Freight Back on the Move

Vox: 4 Facts Everyone Should Know about Train Accidents

Hofstra University: Rail Deregulation in the United States

Guardian: Amtrak: How America Lags Behind the Rest of the Developed World on Train Safety

Wall Street Journal: U.S. Lays Down Stricter Railcar Rules

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Fatal Amtrak Derailment Casts Light on a Forgotten American Industry appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/business-and-economics/amtrak-derailment-casts-light-forgotten-industry/feed/ 0 40238
The Globalization of Cinema: What’s Next? https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/globalization-cinema-whats-next/ https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/globalization-cinema-whats-next/#comments Wed, 20 May 2015 20:51:55 +0000 http://lawstreetmedia.wpengine.com/?p=38995

Can movies transcend borders?

The post The Globalization of Cinema: What’s Next? appeared first on Law Street.

]]>
Image courtesy of [Shinya Suzuki via Flickr]

“Avengers: Age of Ultron,” the latest hit in the Avengers franchise, debuted in theaters recently and made more than $200 million in a single weekend. The surprising part, however, is that it earned that $200 million outside the U.S., before the movie even opened stateside. The increasing globalization of the film industry is abundantly clear. But the changes in the film industry aren’t just reflected in the exports of American movies to foreign audiences. Many nations are also expanding into the industry. Read on to learn about the globalization of the film industry and its worldwide ramifications.


The American Film Industry: Changes From Sea to Shining Sea

While Hollywood is facing greater competition from abroad in almost every aspect of the film industry, it is still the dominant player globally. In 2014, for example, the top ten most profitable movies were all made in the United States.

Hollywood has had to adjust to a changing customer base. Nearly 60 percent of the box office haul taken in by these big productions came from abroad, meaning the success of the Hollywood movie industry is now driven more by foreign markets than domestic ones. In fact, the number two market for Hollywood films, China, is predicted to surpass the American market by 2020.

In response to this changing environment, Hollywood is increasingly relying on big-budget blockbusters. These movies have been particularly marketable because of their simple plot lines, which often avoid nuanced or culturally specific stories that might get lost in translation. Additionally, Hollywood often adds extra scenes to movies released in other countries, sometimes featuring actors from those countries, in order to make them more relatable. This has meant making changes to movies, too. For example, in the remake of “Red Dawn,” the nationality of the invading soldiers was changed from Chinese to North Korean in order to avoid alienating the Chinese movie audience.


Foreign Film Industries: The Veterans

Although Hollywood, as a result of globalization, is facing stiffer competition abroad, a traditional foreign film industry has long existed. The center of this industry is located in Europe.

European Film Industry

While every country in Europe makes movies, five countries in particular make up 80 percent of the market: France, Germany, the United Kingdom, Italy, and Spain. The industry itself is also massive in scope, including 75,000 companies and 370,000 workers across Europe.

In addition to the number of people involved, Europe is also home to some of the most prestigious events in cinema. Perhaps the most famous is the Cannes Film Festival in France. This event has taken place nearly every year since 1946, with filmmakers from all walks of life competing for the coveted Palme d’Or prize for the best film in the competition.

Despite the success of the film industry in Europe, it has struggled to deal with foreign competition, particularly from Hollywood. As of 2013, American films dominated 70 percent of the European film market, in stark contrast to the much smaller 26 percent coming directly from European sources.

But as Hollywood has made efforts to keep its industry relevant, so has Europe. One of the most prominent attempts has been the LUX competition. Seeking to address one of the most glaring problems in Europe’s film industry, the challenge of distributing and dubbing movies in all the languages spoken across the continent, the competition subtitles its films in 24 different languages so that they are accessible to a wide audience.

Film Industries Down Under

Australia and New Zealand also have prominent film industries. While Australia is currently losing out on some projects because its tax credits are not competitive enough, there is a strong tradition already in place. For example, both “Star Wars: Revenge of the Sith” and the “Matrix” trilogy were filmed there.

The New Zealand film industry is strong and thriving, the result of two forces. First, home-grown production of films such as “The Piano,” which won three Oscars in 1993, has helped promote the industry. Second, there has been a rise of recognizable talent coming out of the country, including director Peter Jackson. Like Australia, New Zealand has also been the location of major Hollywood productions, such as “Avatar,” “King Kong,” and “The Last Samurai,” to name just a few.


Rising Stars

Other countries are continuing to create voices of their own through national film industries. Three of the most successful countries in creating major movie industries of their own have been India, Nigeria, and South Korea.

India

Although Hollywood is the most profitable film industry worldwide, India’s is the most productive based on its sheer number of films. India’s film production is so prodigious that it has earned a nickname of its own: Bollywood, a blend of “Bombay” (now Mumbai) and “Hollywood.” In fact, India’s industry is so expansive that the Bollywood moniker really only applies to Mumbai; other regions and cities have film industries of their own that have spawned similar nicknames, such as “Kollywood” and “Sandalwood.”

While the Indian film industry has been a compelling force for more than 100 years, it has seen a huge jump in growth recently. From 2004 to 2013, gross receipts tripled, and revenue is estimated to reach $4.5 billion next year. With those kinds of numbers, India’s film industry promises to continue its upward trajectory in money and influence.

Nigeria

The Nigerian film industry also produces more films per year than Hollywood and bears the similar nickname “Nollywood.” Nigeria’s films are often lower-budget productions that are released directly to DVD and frequently not even filmed in a studio. Nonetheless, the Nigerian film industry is influential enough regionally that neighboring countries fear a Nigerianization effect on their own cultures.

The Nigerian film industry is so popular that the World Bank believes that with the proper management it could create a million more jobs in a country with high unemployment levels. The film industry in Nigeria already employs a million people, making it the second-largest employer in the country behind the agricultural sector. Still, for Nigeria to be on the same level as Hollywood or Bollywood, many issues would have to be addressed, in particular the high rate of film pirating. The video below explores Nollywood and its impact on Hollywood.

South Korea

South Korea also has a strong film industry, although it doesn’t have a catchy nickname. While it does not generate the volume of films of Bollywood or Nollywood, it does have the advantage of being the go-to destination for entertainment for much of Asia, particularly China and Japan. South Korea’s movies resonate both domestically and regionally because they often play on historical conflicts that affected the region as a whole. The film industry there also received a boost when a law was passed stating that at least 40 percent of films shown in South Korea had to be produced there, forcing local companies to step up and fill the void.


What does film industry globalization mean?

Money

One of the most obvious implications of globalization is financing. Several major Hollywood studios, including Disney, have bankrolled films in Bollywood in an attempt to harness the massive potential audience there. Financing is a two-way street, however: when Hollywood struggled for funds following the 2008 recession, it received loans and financing from Indian sources.

Culture

Another implication is cultural. In many countries, the government has imposed quotas or tariffs on foreign films to limit their dominance domestically. These laws are aimed specifically at American movies. One motivation for these rules is the competition American films provide: in basically every domestic market worldwide, Hollywood movies hold a larger share than the domestic industry. Second, movies are seen as cultural pillars, so leaders are interested in preserving, and even promoting, their own culture over that of a foreign entity like Hollywood.

Like financing, cultural considerations also have a return effect on Hollywood. In order to attract more foreign viewers, Hollywood movies have simplified story lines and included more actors from different locales. In effect, Hollywood has had to become more diverse and open in order to appeal worldwide. This may actually dilute any would-be American cultural dominance, as these movies incorporate more global cultures in order to stay competitive.

Globalization is a give and take. There has been a long-standing fear of globalization leading to Americanization; however, as the film industry has shown, for American filmmakers or any others to be competitive globally, their themes and characters must be global, too. Additionally, the influx of Hollywood movies has encouraged many countries to build up domestic audiences and industries that had previously been neglected.


Conclusion

Hollywood has long dealt with issues, ideas, and events that stretch the world over, and it is now dealing with competition as diverse and far-reaching as the topics of the movies it produces. The Hollywood film industry has remained the dominant player by leveraging foreign markets. Globally, this has meant the incorporation of more films and actors from traditional markets such as Europe, as well as the rise of movies and stars from non-traditional markets. Thus, the globalization of the film industry has meant many things to many different people, but what it has meant for everyone involved, from production to consumption, is greater access and opportunity. Hopefully, the global film industry will continue along this path.


Resources

Arts.Mic: Three Countries With Booming Movie Industries That Are Not the U.S.

BBC: How the Global Box Office is Changing Hollywood

Vanity Fair: Avengers Age of Ultron is Already a Huge, Hulking Hit at the Box Office

Business Insider: The Highest Grossing Movies of 2014

Grantland: All the World’s a Stage

Law Without Borders: The Intersection of Hollywood and the Indian Film Industry

Los Angeles Daily News: Why TV, Film Production is Running Away From Hollywood

European Parliament Think Tank: An Overview of Europe’s Film Industry

BBC: Australia Film Industry Hurt by Strong Currency

International Journal of Cultural Policy: Cultural Globalization and the Dominance of the American Film Industry

UN: Nigeria’s Film Industry a Potential Gold Mine

Festival De Cannes: History of the Festival

100% Pure New Zealand: History of New Zealand Screen Industry

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Globalization of Cinema: What’s Next? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/entertainment-and-culture/globalization-cinema-whats-next/feed/ 3 38995
Nuclear Energy: Worth the Risk? https://legacy.lawstreetmedia.com/issues/energy-and-environment/nuclear-age-revisted/ https://legacy.lawstreetmedia.com/issues/energy-and-environment/nuclear-age-revisted/#respond Fri, 15 May 2015 16:12:55 +0000 http://lawstreetmedia.wpengine.com/?p=39748

It's sustainable, but also risky.

The post Nuclear Energy: Worth the Risk? appeared first on Law Street.

]]>
Image courtesy of [IAEA Imagebank via Flickr]

On Saturday, May 9, a transformer fire broke out in New York. While this was a seemingly innocuous event, there was more to the incident than just a fire: it broke out at the site of a nuclear power plant located only about 35 miles from Manhattan. While the fire never spread to the nuclear power plant itself and there was no immediate threat of a nuclear meltdown, the potential danger was concerning. Yet these risks are just part of the balancing act required to harness nuclear power for energy. Read on to learn about the development of nuclear energy, its risks, and its rewards.


History of Nuclear Power

Developing the Technology

The first notions of atoms can be traced all the way back to the ancient Greeks, who philosophized about tiny, unseen elements that combine to form the world around us. But the real work on nuclear energy essentially started in the early years of the 20th century. In the late 1930s, German scientists, following the previous example set by Italian physicist Enrico Fermi, bombarded uranium with neutrons, causing it to split. The experiment and subsequent efforts revealed that during the fission process some mass is converted into energy.
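For readers curious about the physics, the conversion follows Einstein’s mass-energy relation; the figures below are standard textbook values, not numbers from the original article:

$$E = mc^{2}$$

Each fission of a uranium-235 nucleus releases roughly 200 MeV, corresponding to only about 0.1 percent of the nucleus’s mass being converted into energy, yet even that tiny fraction dwarfs the energy released by any chemical reaction by a factor of millions.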

In 1942, Fermi took the next step by achieving a self-sustaining chain reaction underneath the University of Chicago’s athletic stadium, effectively ushering in the nuclear age. During WWII the field mainly focused on harnessing the power of the fission reaction in a weapon. Following the war, however, focus returned to producing energy from the reaction under the Atomic Energy Commission, created by Congress in 1946. The first reactor to produce electricity was in Idaho on December 20, 1951. The first nuclear power plant to create power for public use in the United States opened in Shippingport, Pennsylvania, in 1957.

How do nuclear power plants work?

There are two types of nuclear power plants, and they work in different ways to generate power. In a pressurized water reactor, water is pressurized but not allowed to boil. The heated water is then piped through a steam generator, where it turns a separate supply of water into steam that drives the generators. In this type of reactor, the water creating the steam and the water in the reactor never mix.

The other type is known as a boiling water reactor. As the name implies, in this case the water is allowed to boil, turning into steam from the heat of fission. The steam, as in a pressurized reactor, turns the generators, which create electricity. In both systems, the water can be reused once the steam has condensed back into liquid form.

The Nuclear Power Industry

Following the opening of the plant in Pennsylvania, the industry continued to grow rapidly throughout the 1960s as corporations across the U.S. saw the possibility of a power source that was cheaper, safer, and more environmentally friendly than traditional sources such as coal. However, this trend began to reverse in the 1970s and 80s as popular opinion of nuclear power turned negative and many of its strong selling points became areas of concern.

Nevertheless, as of January 2015, 31 countries were operating 439 nuclear power plants worldwide, although the number of operating plants can fluctuate slightly based on different definitions of the term “operable.” The United States has the most plants at 99, almost twice as many as the next country, France, which has 58. The plants themselves are located predominantly in what are commonly considered the more developed countries. One of the major explanations for this pattern is the high cost of building a nuclear power plant. Another major factor in the peaceful use of nuclear power is the Nuclear Non-Proliferation Treaty, or NPT.

Nuclear Non-Proliferation Treaty

The Nuclear Non-Proliferation Treaty (NPT) has been one of the bedrocks for peacefully spreading, and at times deliberately slowing, nuclear power worldwide. The first step can probably be traced back to a speech given by President Dwight Eisenhower. This speech, known as the “Atoms for Peace” speech, provided a blueprint for effectively managing nuclear proliferation following WWII. It also paved the way for spreading nuclear technology in a positive way.

While many of the suggested measures from Eisenhower’s speech were not taken, the International Atomic Energy Agency was born out of his ideas. The agency offered countries nuclear knowledge in exchange for their agreement to safeguards and arms limits. While it worked in some cases, it could not halt the military side of nuclear research. It did, however, help give rise to the NPT.

The NPT divided countries into the proverbial nuclear weapon haves and have-nots. Its bargain was essentially the same for all: in return for allowing inspections, countries were given technical know-how. While many criticisms have been leveled against the NPT, it did work to prevent the spread of nuclear weapons while helping some nations gain nuclear power as an energy source.


A Series of Unfortunate Events

Despite all the efforts made to safeguard nuclear energy, there are still many concerns over the safety of nuclear power plants. This danger has manifested itself several times over the course of the nuclear power age, both in the U.S. and abroad.

The worst nuclear power plant disaster in history was at Chernobyl, Ukraine, which was then part of the Soviet Union. During the disaster, 50 people were killed at the plant and as many as a million more were exposed to the radiation. The radioactive fallout released into the air was 400 times greater than that released by the bomb dropped on Hiroshima in 1945.

Domestically, the worst nuclear energy disaster was the Three Mile Island incident in 1979. During the crisis on an island in Pennsylvania, a full nuclear meltdown was narrowly avoided and no one was killed. Nonetheless, the stigma created from the ordeal was a key contributing factor to the decline of new nuclear plants in the U.S. during the 1970s and 80s.

The most recent disaster came in 2011 in Fukushima, Japan, where a massive earthquake, followed by a tsunami, damaged the nuclear reactors. This led to a nuclear meltdown, and as many as 1,000 people died trying to evacuate the area.

These are just three examples, but there are more, both in the U.S. and abroad. While nuclear energy has been lauded for its sustainability and limited impact on the environment, the threat of a meltdown is a major consideration in any decision to expand the technology going forward.


The Future of Nuclear Energy

With last week’s fire at a nuclear facility rekindling fears over the dangers of nuclear technology, what exactly is the future of nuclear energy both domestically and abroad? In answering that, two aspects need to be considered, namely nuclear waste and security.

Waste

Although nuclear energy is often touted as a clean alternative to other energy sources, such as coal and natural gas, it has waste issues of its own. In the U.S. alone, approximately 2,000 metric tons of high-level radioactive waste are generated each year. Troublingly, there is no permanent repository for this nuclear waste, so it remains stored on site, potentially vulnerable to attack and leakage.

The waste issue continues further down the supply chain as well. The mining of uranium, which occurs mostly outside of the U.S. and therefore partly nullifies any argument about energy independence, is itself a harrowing process. A number of chemicals used to mine uranium poison both the surrounding environment and the workers involved in the extraction.

Security

Along with waste is the issue of security. It has already been shown that the security of a nuclear power plant can be jeopardized by human error and natural disasters. However, since 9/11 there have also been fears of a terrorist attack on a nuclear facility. While nuclear plants are supposedly protected by measures designed by the Nuclear Regulatory Commission (NRC), there are acknowledged vulnerabilities.

Air and sea attacks could be problematic, as could multiple coordinated attacks on a facility at once. Spent nuclear rods are particularly vulnerable because they sit outside of controlled nuclear reactors. While the NRC has made strides in some of these categories, especially with regard to potential air strikes, concerns remain that it still falls short in others, such as potential land and sea assaults. Furthermore, force-on-force tests, staged attacks on nuclear plants, showed that at least 5 percent of plants were still not adequately protected even after post-9/11 security changes.

These fears are joined by other worries stemming from the Soviet collapse of the early 90s, centered on what are termed “loose nukes”: unaccounted-for nuclear weapons from the Soviet Union. Similar concerns may also arise as civil wars continue in countries such as Iraq and Syria, which at one time were known to be pursuing nuclear weapons.

Staying the course?

Coupled with waste and security concerns are cost considerations. Nuclear power plants are very expensive to maintain and suffer a financing failure rate of over 50 percent, meaning taxpayers are often required to bail them out. In light of all these considerations, and with more truly sustainable energy sources available, it would seem the days of nuclear energy are numbered.

That assumption is wrong, however: 70 new plants are already under construction, with 400 more proposed worldwide. While many of these will never leave the drawing board, the rise in construction and planning of new nuclear plants points to nuclear power’s proven track record in at least one regard: battling CO2 emissions and producing power on a scale that currently far exceeds any other renewable option.

This option is particularly attractive to countries, such as China, whose state-run governments can commit to long-term investments and are desperate to move beyond major polluters like coal-fired power plants. Meanwhile, in Western democracies, while some construction is planned, many countries are working toward phasing out nuclear power altogether. Germany is leading the pack, having pledged to be completely nuclear-free by 2022. The following video explores the future of nuclear power:

Conclusion

Nuclear energy seems to be the ultimate compromise. While it is cleaner than coal or gas plants, it still produces radioactive waste that has no long-term storage location and takes thousands of years to decay. Conversely, it has a proven track record, and while new nuclear facilities may cost more to build than any other energy source, the energy they produce far outpaces many alternatives. Thus, the world, with its ever-growing energy demands, is left to maintain the delicate balance. We are still in the nuclear age, although how long we’ll stay here remains uncertain.


Resources

Primary

Department of Energy: The History of Nuclear Energy

Additional

United States History: International Atomic Energy Agency

Physicians for Social Responsibility: Dirty, Dangerous and Expensive: The Truth About Nuclear Power

CNN: After Explosion at Nuclear Plant, Concerns of Environmental Damage

Duke Energy: How Do Nuclear Plants Work?

European Nuclear Society: Nuclear Power Plants Worldwide

Arms Control Association: Arms Control Today

Foreign Policy: Think Again: Nuclear Proliferation

CNBC: 11 Nuclear Meltdowns and Disasters

World Nuclear Association: Fukushima Accident

Union of Concerned Scientists: Nuclear Plant Security

BBC News: Nuclear Power: Energy for the Future or Relic of the Past?

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Nuclear Energy: Worth the Risk? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/energy-and-environment/nuclear-age-revisted/feed/ 0 39748
How Do Nations Respond When Disaster Strikes? https://legacy.lawstreetmedia.com/issues/world/disaster-strikes-nations-respond/ https://legacy.lawstreetmedia.com/issues/world/disaster-strikes-nations-respond/#respond Sun, 10 May 2015 18:34:20 +0000 http://lawstreetmedia.wpengine.com/?p=39240

The recent earthquake in Nepal sheds light on disaster preparedness around the globe.

The post How Do Nations Respond When Disaster Strikes? appeared first on Law Street.

]]>

The devastating 7.8-magnitude earthquake that recently struck Nepal caused untold damage to buildings and killed thousands of people, with many more missing. Following the devastation, the usual influx of aid began, as did finger pointing over who was to blame. However, what this catastrophe has revealed most clearly is the disparate ways in which countries respond to disasters. Read on to learn about the response to the Nepalese earthquake and the various global responses to disasters.


Responding to a Disaster

Emergency Management

Disasters, natural and man-made, have been around since the beginning of time. However, the response to these disasters has not always been the same, and methods have varied as widely as the civilizations that have suffered them.

In the United States, for example, we have FEMA (the Federal Emergency Management Agency). FEMA was founded in 1979, when five separate agencies that dealt with disasters were consolidated into one. Although it is perhaps best known now for its poor handling of Hurricane Katrina in New Orleans, it has served as the point agency for every natural disaster the United States has dealt with since its inception.

Emergency Management Cycle

While the methods for emergency management vary, one commonly accepted tool is the emergency management cycle. The cycle’s origins go back to the 1930s, when phases were first used to describe the ideal response to a disaster. The cycle gained its central place in the emergency management lexicon in 1979, when FEMA was created by President Jimmy Carter following recommendations from the National Governors Association, and versions have since spread to other nations. The cycle is generally broken into three or four parts, although newer variations can include more steps. Usually the four steps are mitigation, preparedness, response, and recovery. You can see an example here.

The first two phases, mitigation and preparedness, occur before the potential disaster strikes. In these phases a country plans for a potential disaster through steps such as developing evacuation plans, raising awareness, and improving current infrastructure.

Once the disaster strikes, the response phase of the cycle begins. During this time, emergency management workers attempt to rescue people, provide basic services, and prevent any further damage. The final phase is recovery. In this stage, once the disaster has passed, authorities go to work returning basic services to full operational capability. Additionally, infrastructure and other institutions damaged during the devastation are rebuilt.

While these distinctions seem clear, the steps often overlap and blur together, further complicating the process. In addition, it’s important to remember that these steps apply equally to both man-made and natural disasters. However, maintaining an appropriate balance of preparedness for the two types is important; otherwise one can be neglected in favor of the other. A chilling example is the United States’ focus on defending against terrorism, which left other shortcomings unnoticed. Critics claim this led to an underfunding of the levee system in New Orleans, which ultimately failed during Hurricane Katrina with devastating results.

The Finger Pointers and the 20/20 Crowd

Unfortunately, not every country has such a system or even a plan in place, including Nepal. These programs are very expensive; FEMA’s requested budget for 2015, for example, was $10.4 billion. To put that into context, Nepal’s entire GDP for 2013, the most recent year available from the World Bank, was only $19.3 billion. While no one expects Nepal to have an agency or program on the scale of FEMA, given its smaller population and the wealth gap between it and the U.S., concerns have arisen in the wake of this disaster that the nation was unprepared.
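Putting those two figures side by side, using simple arithmetic on the numbers above, makes the gap concrete:

$$\frac{\$10.4 \text{ billion}}{\$19.3 \text{ billion}} \approx 0.54$$

In other words, FEMA’s single-year budget request amounts to more than half of Nepal’s entire annual economic output.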

These considerations did not stop criticism from pouring in about the failure of the Nepalese government. The criticisms have come from several high-profile sources, including numerous relief agencies, namely the United Nations. They range from insufficient infrastructure to the difficulty aid groups have had delivering supplies to those who need them. Despite the disaster, many protective tariffs remain in place, making it difficult to distribute goods. There are also concerns over widespread corruption and the reported looting of supply convoys by authorities who want to distribute the aid along ethnic lines.

These criticisms should not be entirely surprising given Nepal’s governmental history. The country only began recovering in 2006 from a civil war that had lasted ten years. That conflict pitted the newly established democratic government against Maoist insurgents. Since the end of the civil war, a succession of ineffectual governments has been unable to create any sort of unified front. For example, in January 2015, the current government was unable to agree on changes to its constitution because of political infighting. The video below depicts many of the issues facing Nepal’s relief efforts:


International Community

When countries such as Nepal suffer a horrendous disaster, the international community usually steps up to aid them. While aid can be separated into many different branches, the two clearest categories are financial assistance and direct intervention.

Financial Assistance

While not every country has an emergency response team to spare for a disaster zone, many can offer another valuable commodity: money. As of April 28, $60 million in financial assistance had already been pledged to the earthquake-ravaged area. This type of giving is not surprising, especially following natural disasters such as earthquakes. In fact, two other examples, the deadly 2013 typhoon in the Philippines and the 2008 cyclone in Myanmar (Burma), illustrate that in circumstances such as these it is not uncommon for the aid a country receives to quintuple from one year to the next.

Although this is good news for Nepal, it may not be enough. While financial pledges can be easily won in the immediate aftermath of a disaster, the ability to continue to elicit them tends to fade as the story falls from the headlines. Costs to repair the damage in Nepal have been estimated at as much as $5 billion. This is an especially difficult undertaking for a nation like Nepal, whose GDP, as previously mentioned, is only around $20 billion a year, with a significant portion of that coming from now-lost tourist revenue.
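Again working only from the article’s own estimates, the scale of the rebuilding burden becomes clear:

$$\frac{\$5 \text{ billion}}{\$20 \text{ billion}} = 25\%$$

A repair bill equal to a quarter of annual GDP is a burden few governments, rich or poor, could absorb without sustained outside help.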

Beyond immediate giving, a working paper on the political economy of disaster preparedness by Charles Cohen and Eric Werker of the Harvard Business School raises further considerations. While money is useful in dealing with a disaster, giving away large sums reduces a government’s incentive to be adequately prepared in the first place. According to the study, rich countries as well as poor would be better off if more aid were provided for preparedness than for response; it’s smarter to be proactive than reactive.

Concerns also abound over dishonest governments stealing aid money. In some cases, leaders want to reward their constituents first in order to stay in their good graces. Thus, it is also imperative in these situations to keep the aid distribution system as decentralized as possible. The video below provides some dos and don’ts for helping after a disaster:

Physical Intervention

Another means of assisting an ailing nation is direct assistance from countries and private organizations. In the case of Nepal, this aid can be divided into three sub-categories. First, countries such as Japan and Australia sent experts and aid teams to help with recovery. Second, relief organizations such as the Red Cross provided money and experts, essentially serving as microcosms of the nations they represent. Lastly, corporations such as Coca-Cola and Kellogg provided bottled water and food to sustain survivors whose access to basic goods may have vanished in the wake of the disaster.

Like financial assistance, direct intervention can also have drawbacks. An example comes from the 2010 Haitian earthquake, when relief efforts were hampered and stagnated due to inefficient infrastructure. The United States took full control of the response efforts, at one point legally taking possession of the main airport in the capital, Port-au-Prince. However, subsequent American prioritizing of its own relief planes over other nations’ led to an international row that threatened to divert focus from the main crisis at hand. The accompanying video depicts the controversy:

The Wealth of Nations

Additionally, a nation’s acceptance of aid, whether financial assistance or direct intervention, can be influenced by its existing wealth. For example, while Nepal is basically dependent on other countries for assistance, richer nations that are less beholden may refuse aid when it is offered. A prime example is the United States, which politely declined nearly $1 billion in aid from allies following Hurricane Katrina in 2005. While part of this was due to government inefficiency in distributing assistance, most offers were simply declined out of hand.

The U.S. declined most of the aid because, while it was adept at distributing aid to other countries, it was less skilled at dispensing aid within its own borders. Thus, rather than accept more aid that would often spoil or remain unclaimed, it declined many offers. While this kind of stagnation is criticized in other countries as the result of underdeveloped agencies, in the U.S. it was accepted because the U.S. is perceived as a more capable nation due to its relative wealth.


Conclusion

Although countries such as Nepal and Haiti may serve as examples of how not to handle a disaster, there is no telling how any nation will respond once it actually experiences one. The prime example here is the United States. Even with its large bureaucracy dedicated to disaster relief and readiness, and an equally large budget, the U.S. has repeatedly been accused of being unprepared.

There are numerous examples of these failings; perhaps the two most glaring in recent memory are Hurricane Katrina and Hurricane Sandy. Hurricane Katrina essentially wiped out one of the most historic cities in the U.S., New Orleans, while killing over 1,000 people and causing over $135 billion in damage. Hurricane Sandy saw a lower fatality count, approximately 100 dead, but major parts of eastern states such as New York and New Jersey were affected to the tune of $50 billion in damages.

Disasters, whether man-made or natural, can strike anywhere, anytime. While some nations, through financial means or previous experience, are more prepared than others, ultimately no nation is ever ready for something as deadly as Nepal’s earthquake or a massive hurricane. This is a global issue, and one that has no easy answer.


Resources

Primary

FEMA: The Four Phases of Emergency Management

World Bank: Nepal

Ottawa County Sheriffs’ Office: Four Phases of Emergency Management

Central Intelligence Agency: World Factbook Nepal

Additional

Time: These are the Five Facts That Explain Nepal’s Devastating Earthquake

Brookings: Counter-Terrorism and Emergency Management: Keeping a Proper Balance

MNMK: Disaster Management – A Theoretical Approach

VOA: Nepal Officials Slammed Over Aid Response

Fierce Homeland Security: 2015 Budget Request

Harvard Business School: The Political Economy of Natural Disasters

CNN Money: Nepal Earthquake Donations: Who’s Sending What

Vanderbilt Center for Transportation Research: The Phases of Emergency Management

Guardian: US Accused of Annexing Airport as Squabbling Hinders Aid Effort in Haiti

Washington Post: Most Katrina Aid from Overseas went Unclaimed

The Data Center: Fact for Features Katrina Impact

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post How Do Nations Respond When Disaster Strikes? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/disaster-strikes-nations-respond/feed/ 0 39240
The Armenian Genocide: A Battle For Recognition https://legacy.lawstreetmedia.com/issues/world/armenian-genocide-battle-recognition/ https://legacy.lawstreetmedia.com/issues/world/armenian-genocide-battle-recognition/#respond Sat, 02 May 2015 15:00:22 +0000 http://lawstreetmedia.wpengine.com/?p=38949

Why won't Turkey or the US recognize the Armenian genocide?

The post The Armenian Genocide: A Battle For Recognition appeared first on Law Street.

]]>
Image courtesy of [Rita Willaert via Flickr]

This week marks the one hundredth anniversary of the Armenian genocide, which took place in the Ottoman Empire beginning in April 1915. A lot has changed in 100 years: the Ottoman Empire no longer exists, having been replaced by modern-day Turkey, and the Armenians now have a country of their own, bordering Turkey to the east. Yet the atrocities committed against the Armenians remain a contentious point of debate, as Turkey refuses to recognize the genocide or even acknowledge that it happened. Turkey has also pressured its allies to ignore the events as well. Read on to learn about the Armenian genocide, Turkey’s position on the events, and the recognition, or lack thereof, by other countries.


History of the Armenian Genocide

Defining Genocide 

In Articles II and III of the 1948 Convention on the Prevention and Punishment of the Crime of Genocide, genocide is defined as acts committed with “intent to destroy, in whole or in part, a national, ethnical, racial or religious group, as such.” However, there has been significant debate over whether what happened to the Armenians constitutes a genocide. On the global stage, opinions vary widely. For example, Pope Francis recently declared it the century’s first genocide, while United Nations Secretary-General Ban Ki-moon has stopped short of doing the same. For the purposes of this article, it will be referred to as the Armenian genocide, with the recognition that such a classification is disputed.

The Armenians

The Armenians lived in the region of modern-day Turkey for thousands of years. While they briefly had their own kingdom, they were usually a part of a larger empire, including the Ottoman Empire from the 1500s until its collapse following WWI. The Armenians were treated as second-class citizens in the empire due to their Christian religious beliefs, as the Ottomans were Muslim.

While the Armenian genocide was the worst and most well-publicized massacre of the Armenian people, it was not the only one. In the late 1800s there had been another massacre at the hands of the Ottoman Turks, in which hundreds of thousands of people were killed, a large number given the small overall population. There were also other intermittent acts of butchery committed against the Armenian population by the Turks over the years.

April 1915

The Armenian genocide began in April 1915, during WWI. It lasted into the 1920s, and overall as many as 1.5 million Armenians were forcibly deported or killed. Along with the gruesome murders, children were kidnapped from their families and sent to live with Ottoman parents, and women were raped and forced to become part of harems for Ottoman rulers.

These attacks were prompted by a few different facets of the Ottoman-Armenian relationship. Since the late 1800s, Armenians had protested Ottoman rule, demanding more rights and greater autonomy. During WWI it was widely believed that the Armenians would support the Russians in hopes of achieving independence, a concern the Ottomans saw validated when Armenians organized volunteer battalions to fight alongside the Russians against them.

These atrocities against the Armenians were carried out by the ruling power of the Ottoman Empire at the time, the Young Turks, who had themselves come to power through a coup against the empire’s old ruler. The video below gives greater detail on the massacre.


Reflecting on History

A mass killing of Armenians happened; there is almost no disagreement about that. But even today, it is still illegal to say so in Turkey. In fact, anyone caught talking or writing about the event risks arrest. So why have so many other nations been so slow to acknowledge events that happened almost a century ago?

Turkey

The Turks have many ways to explain the mass deaths of the Armenian population during WWI, mostly attributing them to the grim realities of war. Why has Turkey persisted so long in presenting that description of events? The answer appears to be twofold.

First, Turkey has denied the genocide for so long that the denial has almost become part of the national consciousness. In fact, the idea of an Armenian genocide almost seems bewildering to the Turkish people. In a recent statement regarding an EU parliamentary vote on whether to recognize the actions of the Ottoman Empire as genocide, Turkish President Recep Tayyip Erdogan weighed in, seeming perplexed that the EU had even raised the issue. According to him,

I don’t know right now what sort of decision they will make … but I barely understand why we, as the nation, as well as print and visual media, stand in defense. I personally don’t bother about a defense because we don’t carry a stain or a shadow like genocide.

Turkey also faces potential costs in admitting guilt. Experts suggest that if Turkey were to admit to committing genocide, it might have to compensate victims or their families, as happened in the aftermath of the Holocaust once it was recognized. With these factors in place, it becomes clearer why Turkey would be hesitant to admit guilt, especially when the admission would gain the Turks nothing except perhaps some goodwill in the international community. The accompanying video reiterates why Turkey refuses to acknowledge the genocide.

Denying the genocide has also been a political strategy for some in Turkey. President Erdogan is a huge roadblock to acknowledging the genocide; his comments denying it have helped him gain popularity. Given the increased criticism he has faced for his governing style, and for changes he has attempted to make to Turkey’s government to keep himself in power, any political points he can score probably look appealing.

Within Turkey, some groups have recognized the genocide. Kurds, who make up about 20 percent of the country’s population, have recognized the events to a large extent. While Kurds commemorated the anniversary and use the word genocide to describe the events, they have been accused of falling somewhat short: despite Kurdish units having carried out some of the Armenian murders, Kurdish citizens, like the Turks, are hesitant to accept any responsibility. In this case, they feel justified in their denial because it was the Ottoman Empire, not their own nation, conducting the massacres. Nevertheless, many Kurds feel a responsibility to reconcile with the Armenians because they, too, are an oppressed people.

The U.S. and Other Allies

While Turkey’s motives for denying the Armenian genocide seem relatively clear, the motives of its allies are less so. Many countries already recognize the genocide, including Canada, France, Germany, and Russia. Other countries, such as the U.K. and Israel, do not.

The United States, as a whole, also hasn’t recognized the genocide. While 40 states, the House of Representatives, and several presidents have confirmed Turkish actions against the Armenians to be genocide, the nation itself has not. The reason for American refusal, like Turkey’s own, mostly lies in self-interest.

When other countries, such as France and Austria, have recognized the genocide, Turkey has withdrawn its ambassador or ended military alliances with them. While France and Austria can get by fine without Turkish military assistance, it is a little more difficult for the U.S., which uses Turkey as a critical strategic point for interactions with nations in the Middle East.

Additionally, there has been a significant lobby on Turkey’s behalf within the U.S. government against recognizing the genocide. Preventing the U.S. from recognizing Turkey’s culpability reduces the pressure Turkey is under internationally. The video below shows then-Senator Barack Obama discussing the Armenian genocide seven years ago, an issue he promised to address but still has not.



Conclusion

The man who came up with the word genocide, Raphael Lemkin, coined the term to describe the Nazis’ atrocities against the Jews. However, he had also been influenced by the Turkish actions against the Armenians during WWI and the Armenians’ subsequent efforts to track down and murder the leaders responsible. To him there was no difference between the two scenarios: in each case an entire people and way of life were targeted for extermination. Turkey and its allies, including the United States, have nevertheless consistently failed to see the similarities. As long as the current barriers to recognition remain in place, that will probably continue to be the norm.


Resources

The New York Times: Armenian Genocide of 1915

Prevent Genocide International: The Crime of Genocide Defined in International Law

Times of Israel: UN Chief Won’t Call 1915 Slaughter of Armenians “Genocide”

Guardian: Turkey Cannot Accept Armenian Genocide Label, says Erdogan

CNN: ISIS-Kurdish Fight Stirs Trouble in Turkey

Ynet News: Erdogan Turkey’s King of Controversy

Al Monitor: Kurds Pay Respect to Armenians

History: Armenian Genocide

Los Angeles Times: Why Armenia Genocide Recognition Remains a Tough Sell

Blaze: The 1915 Armenian Genocide: Why it is Still Being Denied by Turkey (and the US?)

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Armenian Genocide: A Battle For Recognition appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/armenian-genocide-battle-recognition/feed/ 0 38949
Illegal Immigration in Europe: Latest Shipwreck Sheds Light on Trend https://legacy.lawstreetmedia.com/issues/world/illegal-immigration-europe-latest-shipwreck-sheds-light-trend/ https://legacy.lawstreetmedia.com/issues/world/illegal-immigration-europe-latest-shipwreck-sheds-light-trend/#respond Sun, 26 Apr 2015 14:30:18 +0000 http://lawstreetmedia.wpengine.com/?p=38652

Why are so many migrants going to Europe?

The post Illegal Immigration in Europe: Latest Shipwreck Sheds Light on Trend appeared first on Law Street.

]]>
Image courtesy of [SarahTz via Flickr]

Like the United States, many European nations increasingly face an illegal immigration problem. As the sinking of a boat carrying migrants last week showed, this problem is also very deadly. But what is driving these migrants to risk everything and head for Europe? Read on to learn about the immigrants coming into Europe, the groups facilitating that process, and the issues Europe must contend with in light of the influx of illegal immigration.


The Sinking and Legacy

On April 19, 2015, a boat on its way to Italy carrying illegal immigrants from places as far and wide as Eritrea and Bangladesh capsized off the coast of Libya. The overcrowded boat overturned after ramming a Portuguese cargo ship, the King Jacob. A full count of the deceased is still unknown.

A Recurring Problem

While the recent wreck was a tragedy, it certainly was not the first boat filled with illegal migrants headed for Europe to sink, and it will likely not be the last. In fact, such incidents have happened frequently and speak to a much larger trend. In 2014, for example, as many as 218,000 migrants were estimated to have crossed the Mediterranean from Africa to Europe. This year, 35,000 are already suspected of having crossed from Northern Africa into Europe.

Those who have made the crossing must be considered the lucky ones, as attempted crossings lead to a substantial number of deaths at sea. Last year, 3,500 people were believed to have perished during the attempt. That number sits at around 1,600 this year, with the most recent sinking taken into account. Unfortunately, these numbers are only likely to increase. Since October 2013, there have been at least four other occasions, prior to this incident, on which a boat carrying at least 300 migrants sank.

Human Trafficking

These trips tend to be organized by human traffickers, predominantly Libyan bandits, militiamen, and tribesmen. There are two main routes these smugglers take to move their human cargo through Africa and into Europe: the eastern route stretches as far as Somalia, while the western one reaches Senegal. Regardless of the starting point, migrants are funneled to Libya, where they are launched from either Benghazi or Tripoli in overcrowded and rickety boats toward the coast of Italy.

Unfortunately, traffickers’ tactics have recently begun to change, making them even more nefarious and harder to prevent. Many traffickers have begun abandoning their ships en route to Europe, literally leaving the vessels without steering of any kind. The smugglers obtain a large cargo ship, then during the trip advise their migrant-manned crews to call for help while they abandon the ship. The reason the smugglers do this is twofold: first, they are paid up front, so it does not matter to them whether the migrants actually make it to Europe; second, by abandoning the boat they reduce their own chances of being arrested and can then smuggle more people and profit further. This practice has extended the smuggling season from spring and summer to all year round, but has made the crossing even more dangerous.

The industry has become especially appealing for traffickers in the last few years as traditional sources of income have disappeared as a result of government upheaval. Additionally, those doing the actual trafficking in many cases are would-be migrants themselves, which makes stopping the practice extremely difficult. The video below briefly explains the harrowing journey from Libya to Europe and all its difficulties.


Why do migrants cross the Mediterranean?

With all these dangers in mind, why do migrants risk crossing the Mediterranean? The answer varies for each individual, yet some recurring themes present themselves, many of them similar to the reasons people attempt to migrate to the United States. First, many of the migrants are escaping danger back home. This varies from country to country as well; for example, there has been an increase in migrants from Syria due to the civil war there.

Along with danger, another major impetus is economic. Most of the migrants attempting the journey are young men looking for opportunities. Their goals naturally vary, but the promise of success and the ability to send earnings back to their families are common desires.

While Europe has become a popular destination, it was not always the top choice for migrants. In the past, migrants had also attempted to go to places such as Israel and Saudi Arabia; however, with Israel increasing security and Saudi Arabia engaged in a military conflict in Yemen, these routes have dried up. Whichever route the migrants take, they risk abuse ranging from robbery to rape and murder. In response to these dangers and the increasing deterioration of Libya, some migrants have tried crossing through Morocco instead, a much more difficult route.


Impact on Europe

When migrants successfully make the journey to Europe, the onus shifts from their handlers to European authorities. Since many migrants arrive in Europe without identification of any kind, it can be much more difficult to send them back. This, in effect, makes the migrants asylum seekers, who are then held in refugee camps. Once in these camps, migrants may continue onward through Europe, where travel restrictions have been reduced as part of the European Union’s open borders.

Migrants are sometimes also allowed to move throughout Europe simply because of the cost of supporting them. Italy, the destination for many migrants, was spending as much as $12 million a month on its search and rescue efforts in the Mediterranean. Another popular hub, Greece, spent $63 million in 2013 fighting illegal immigration. The problem these and other southern European countries face is that, while they are part of the EU, the costs of their efforts have been almost entirely their own to bear. These costs can be especially painful considering that the same countries serving as initial destinations for migrants are also the ones currently dealing with recessions. The video below highlights the issues each EU country deals with regarding immigration.

Countries such as Italy and Greece foot the majority of these bills because of the EU’s immigration laws. Under the Dublin Regulation, a migrant must be processed as an asylum seeker upon entering a country and, once processed there, becomes that particular nation’s responsibility. The following video shows the routes immigrants take into Europe and reiterates how asylum status is achieved.

The design of this system naturally leads to problems, chief among them accusations by richer northern European countries that their southern neighbors are letting migrants pass north in an effort to reduce their own costs. In response to these allegations, and as a result of bearing what it perceived to be an unfair burden, Italy cancelled its search and rescue mission last year. In its place the EU created the Triton mission, a similar program focused on rescuing migrants. Moreover, as part of a proposed ten-point plan responding to the most recent sinking, the mission is slated to increase in size. Another aspect of that plan is a program to return refugees to their home countries. Nonetheless, even if the EU goes forward with its goal of expanding the Triton mission, it will still be smaller than the one Italy disbanded last year.


Conclusion

Despite being described by several sources as modern-day slavery, the practice of illegally ferrying migrants from Africa and elsewhere to Europe is unlikely to stop or even slow down any time soon. It is sustained by factors unlikely to change in the immediate future: relatively high standards of living in the EU, crises in the Middle East and Africa, EU laws regarding migrants, and lucrative trafficking operations. But if Europe wants to fix its broken immigration system and prevent future tragedies on the scale of last week’s ship sinking, it must do more than simply increase patrols.


Resources

ABC News: Libya Migrant Boat Sinking

Wall Street Journal: Rich Smuggling Trade Fuels Deadly Migration Across Mediterranean

BBC News: Mediterranean Migrants: Hundreds Feared Dead After Boat Capsizes

Atlantic: Human Traffickers Are Abandoning Ships Full of Migrants

CNN: Eating Toothpaste, Avoiding Gangs: Why Migrants Head to the Mediterranean

Human Events: Illegal Immigration: Is Europe Losing Control of Its Borders?

Economist: Europe’s Huddled Masses

EUbusiness: Commission Proposes Ten-Point Migrant Crisis Plan

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

A Castle Made of Sand? The Iranian Nuclear Deal Moves Forward https://legacy.lawstreetmedia.com/issues/world/castle-made-sand-iranian-nuclear-deal-moves-forward/ Sun, 19 Apr 2015 17:11:01 +0000

After extensive negotiations, an Iranian Nuclear Deal has been made. Will it end up being successful?


The United States and Iran, along with a number of other world powers, reached a tentative deal on April 2, 2015, that would prevent Iran from developing nuclear weapons. The deal required a tremendous amount of time and work to come together, and with so many moving parts it is not surprising that reactions around the world have varied. Regardless, if finalized, the deal will have wide-reaching ramifications both regionally and across the globe. Read on to learn about the current agreement, its impact, and what could happen if it falls through.


The Deal

So what exactly is this “deal” to which Iran, the U.S., and the other nations agreed?

Iran’s Requirements

To begin, Iran will reduce its number of centrifuges and lessen its stockpile of low-enriched uranium. Excesses of both will be handed over to the International Atomic Energy Agency (IAEA) for safe storage. Iran will also stop enriching uranium at its Fordow facility and will not build any new enrichment facilities. Only one plant, Natanz, will continue to enrich uranium, although in lesser amounts. Additionally, Iran will halt research on uranium enrichment concerning spent fuel rods and will either postpone or reduce research on general uranium enrichment and on advanced types of centrifuges. By following through with these commitments, Iran will abide by its obligations as a member of the Nuclear Non-Proliferation Treaty (NPT). In addition, Iran will open itself completely to IAEA inspections. The overarching goal is to lengthen the timeline of Iran’s ability to build a nuclear weapon from a few months to at least a year.

U.S. and E.U. Requirements

On the other side of the deal are the U.S. and the E.U. These parties will begin lifting sanctions on Iran once its compliance with the framework agreement has been verified. These sanctions include a number of restrictions that have hurt the Iranian economy. Specifically, the E.U. sanctions include trade restrictions on uranium-related equipment, asset freezes, a ban on transactions with Iranian financial institutions, and a ban on Iranian energy products. The U.S. has been levying sanctions on Iran since 1979; these include most of those imposed by the E.U. as well as sanctions on nearly all types of trade with Iran, other than aid-related equipment.

Only the sanctions levied in relation to Iran’s nuclear weapons program will be lifted; other sanctions, such as those imposed for human rights violations, will remain in place. Additionally, if Iran violates the terms of the agreement, the original sanctions can go back into effect. The following video explains in detail what the Iranians agreed to and what the U.S. and other world powers are offering in return.


Roadblocks to the Deal

While a framework is in place and the Obama Administration has hailed it as progress, several potential challenges could still derail the agreement before it is finalized in June. Each side must contend with at least one formidable roadblock to the deal’s success.

In the U.S., Congress still isn’t quite on board. For the U.S. to lift sanctions, President Obama needs Congress to approve the deal; however, after constant fighting with Congress, the president has been reluctant to leave it in lawmakers’ hands. Nevertheless, thanks to an agreement reached on April 14, 2015, Congress will now get to vote on a finalized deal if one is reached by June 30, 2015. While this may appear to be yet another defeat for the president and a dark omen for the nuclear agreement, the compromise ensures that Congress will have a say.

Another potential roadblock is Israel. While the country does not have any direct say in whether the deal happens, it is not without influence. As Prime Minister Benjamin Netanyahu’s recent visit to the U.S. shows, he has Congress’ ear and could prove an effective lobbyist.

On the Iranian side, dissent has emerged from arguably the most powerful voice in the entire country: Ayatollah Ali Khamenei, the Supreme Leader. In a recent speech he called for sanctions to be lifted immediately upon finalization of the deal, meaning Iran would not have to prove its sincerity first. Khamenei is an unquestioned power in Iran, so this could be a big problem. The video below reiterates the obstacles to finalizing an Iranian nuclear deal.


Impact of the Agreement

The impact of a successful Iran-U.S. deal would be monumental on national, regional, and global levels.

National Importance

Perhaps no party will reap the benefits of this deal as much as Iran itself. With a deal in place, the economic pain inflicted by the sanctions will be softened, and Iran will have the opportunity to improve its economy dramatically. When the sanctions are lifted, Iran can collect a $100 billion windfall in oil profits that have been frozen abroad. Additionally, Iran can follow through on a number of oil pipeline projects it had planned but was unable to complete under the sanctions. Lastly, with U.S. cooperation, Iran will be able to develop its large oil and natural gas reserves more efficiently with American technology.

Regional Importance

While Iran stands to gain the most, there will also be changes for the region as a whole. In agreeing to this deal, Iran did not agree to limit its actions in the ongoing conflicts in Lebanon and Syria, or in its proxy war in Yemen, which is especially important as part of the larger feud between Iran and Saudi Arabia. Saudi Arabia, Iran’s ideological and religious rival, has competed with it for leadership of the Middle East for years, and the two have engaged indirectly in a number of conflicts for the hearts and minds of the region. While the nuclear deal likely eliminates a potential nuclear arms race between the two sides, it does nothing to prevent Iran from continuing to vie for control of the region.

Israel shares a similar fear of Iran’s growing influence. Iran is a chief supporter of Hezbollah, a group based in Lebanon that strongly opposes Israel. Additionally, Israel, while undeclared, is a well-known nuclear power, and these weapons provide it with the ultimate deterrent against larger countries like Iran. Israel therefore fears the nuclear deal because it believes the agreement will further empower Iran.

Global Importance

Lastly, there is the impact of the deal on the global community, beginning with the United States. Many experts expect a huge increase in the world oil supply once the sanctions are lifted. American corporations will benefit not only from cheaper prices, but also from access to developing Iranian energy supplies.

The deal could also help countries such as India, which would benefit from cheap energy as well as increased access to development projects in Iran. China is yet another country that can use a new source of cheap oil, but by agreeing to a deal with the U.S., Iran may have taken itself out of the orbit of a sympathetic China. In a similar vein, Russia, whose economy lives and dies with energy prices, does not need another competitor likely to bring the price of oil down even further. The video below explains further the implications of the Iran nuclear deal.

Thus the Iran deal means something different to each party at every level of foreign affairs, but the consensus is that it matters to all sides.


Conclusion

On paper the Iran nuclear deal is a win for most parties. The problem is that the deal is not actually on paper yet; only a framework has been reached. While getting even this far can seem like a monumental step given the history involved, that same history has the potential to undo everything achieved so far. Whether or not all sides ultimately get on board with this deal remains to be seen.


Resources

Business Insider: Here’s the Text of the Iran Nuclear Framework

Al Jazeera: Why Saudi Arabia and Israel Oppose the Iran Nuclear Deal

Reuters: Kerry Says He Stands by Presentation of Iran Nuclear Deal

The New York Times: Obama Yields, Allowing Congress Say on Iran Nuclear Deal

BBC News: Iran Nuclear Crisis: What Are the Sanctions?

Cato Institute: Remaining Obstacles to the Iran Nuclear Deal

Daily Star: Region to Feel the Effects of Iran Nuclear Deal

The New York Times: Israeli Response to Iran Nuclear Deal Could Have Broader Implications

Quora: What Could Be an Impact on a Global Level of Iran’s Nuclear Deal?

BBC News: Iran-U.S. Relations

Atlantic: What Are the Alternatives to Obama’s Nuclear Deal with Iran

Michael Sliwinski

Garissa Massacre: Al Shabab’s Role in Kenya https://legacy.lawstreetmedia.com/issues/world/massacre-kenya-meets-eye/ Sat, 11 Apr 2015 13:30:05 +0000

Who was responsible for the horrible Garissa Massacre?

Image courtesy of [Kevin Walsh via Flickr]

On Thursday, April 2, a number of gunmen burst into a college in Garissa, Kenya. The attackers separated the sleeping students into groups and then executed 147 people, leaving a trail of carnage for the world to see. Carried out by Al Shabab, the attack targeting Kenyan Christians was another in a series of clashes between the group and Kenyan citizens. Read on to learn about the roots of the conflict, what Al Shabab is, why the attack occurred, and considerations for the Kenyan people moving forward.


A Brief Look at Kenyan History

Prior to contact with outside groups, modern-day Kenya was home to several different indigenous tribes. That way of life began to change with the incursions first of Arabs and later of Christian Europeans. These groups brought two different religions, Islam and Christianity, which would create lasting divisions and serve as a root cause of friction in the present day.

Islam

Islam reached Kenya first as a result of trade with Arab merchants, but it also spread from the Omani Sultanate, whose power emanated from Zanzibar, an island off the Kenyan coast. Not surprisingly, many Kenyans along the coast became Muslims. Presently about 11 percent of Kenya’s population is Muslim, and the Muslim population remains centered along the coast and in the north, along the border with Somalia.

Christianity

Christianity arrived much later, in the nineteenth century, as British colonization took hold. From the first land grabs to full colonial rule, the Christian faith accompanied every expansion of the British presence in Kenya. Along with British officials, missionaries also worked to spread the faith throughout the country. Despite these efforts, Christianity remained second to Kenya’s traditional beliefs, and for many of those who accepted it, the religion took the form of a blend of traditional practices and the Christian faith. However, following independence, the new ruling elite adopted Christianity, making it the de facto religion of the nation. Today, approximately 82 percent of Kenya’s population identifies as some form of Christian. The accompanying video explains the settling of Kenya, the arrivals of Arab and European colonists, and Kenya’s path to independence:


Kenya and Somalia

While the situation within Kenya is complex, matters are further complicated by its neighbors, especially Somalia to the north. The attack in Garissa came after continued Kenyan intervention in Somalia dating back to 2011, an incursion originally triggered by raids into Kenya by the terror group Al Shabab.

Al Shabab

The group claiming responsibility for the attack in Garissa is a Somalia-based Islamist extremist group known as Al Shabab, which means “the youth” in Arabic. The Al Qaeda-linked group began as the youth movement of the Union of Islamic Courts, which controlled Mogadishu, the capital of Somalia, until it was ousted by Ethiopian forces in 2006. While the group has lost control over major areas, including Mogadishu and the port city of Kismayo, it still maintains a grip over a large swath of territory within Somalia, despite the continued efforts of African Union troops. Within the territory under its control, Al Shabab practices an extremist form of Islam.

The attack on Garissa carried out by Al Shabab was unquestionably grisly; however, it was not the first. Rather, it was one in a long series of escalating assaults against Kenya. Prior to the attack on the university, one of the worst terrorist attacks in Kenya was also courtesy of Al Shabab: an assault on a shopping mall in Nairobi, the capital, that left 68 people dead. There have been many other gun and grenade attacks carried out by the group. One of the chilling hallmarks of these attacks is Al Shabab forcing people to correctly recite specific passages of the Koran in order to separate Christians from Muslims. Although Kenya’s recent invasion of Somalia was meant to confront the group, these attacks precipitated that invasion, which raises the question: why is Al Shabab targeting Kenya? The video below explains what Al Shabab is and its goals in Kenya:

Kenyan Intervention

Al Shabab seemingly initiated the mass killings in part because of Kenya’s invasion of Somalia, as well as how Kenya treats its Muslim minority. Kenya began a direct military intervention in Somalia in 2011, alongside fellow African nations, to root out Al Shabab, whose kidnappings and killings Kenya claimed hurt the country economically. Because there is a long history of Al Shabab raids into Kenya, however, many experts attributed the invasion to Kenya’s increased militarization, courtesy of growing military assistance packages from the United States. Additionally, Kenya had previously trained and armed a militia group to serve as a buffer between itself and Al Shabab in the northern border region.

Aside from direct military conflict with Al Shabab, another reason for the attack is how Kenya treats its own Muslim population. Muslims make up around 11 percent of Kenya’s population and are based mostly in the northern and coastal regions bordering Somalia. This area has historically been marginalized, resulting in a lack of services, jobs, and representation in government. It has also borne the brunt of reprisals by Kenyan armed forces for attacks on Kenyan territory. In 1984, for example, over 1,000 people were murdered by Kenyan troops in Wagalla, in the predominantly Muslim north, in an attempt to end clan conflict.


Current Situation in Kenya

So what’s next for Kenya following this massacre? On April 6, just four days after the deadly attack on the university in Garissa, Kenya launched airstrikes on suspected Al Shabab militants. While officials say the strikes were already planned and were not a direct result of the Garissa carnage, the timing is questionable. However, some are questioning what exactly Kenya hopes to achieve with the strikes, other than killing a few insurgents. As Al Shabab is already reeling from attacks in Somalia, critics worry that it would appear wiser to try to better incorporate the Muslim population in Kenya and thus eliminate the recruiting ground for the terrorist group there. As Hunter S. Thompson immortally once said, “kill the body and the head will die.”

Nevertheless, whatever path Kenya takes, the attacks by Al Shabab appear to point to a larger conflict in the area–the overall struggle taking place in central and northern Africa and the Middle East between states and extremist groups. These efforts were spearheaded historically by Al Qaeda, but more recently by ISIS in Iraq and Syria and Boko Haram in Nigeria. The question going forward, then, is what links these groups may have with Al Shabab.

Al Shabab has already begun working, at least in minor ways, with Boko Haram. In fact, the two groups have communicated since 2011 about bombing plans and other tactics. Even the situations in the two countries are similar–Nigeria is also plagued by an Islamic extremist group representing a northern region populated by Muslims who feel oppressed and marginalized by the existing government. Continued and increased cooperation between the terror groups has many worried about even worse attacks than the Garissa massacre if underlying problems within Kenya are not addressed and Al Shabab is not successfully countered.


Conclusion

Kenya currently faces a difficult road, but not necessarily a unique one. It is now embroiled in a seemingly endless conflict with a prominent non-state actor, Al Shabab, and it may need to unite its own people more closely, not just launch airstrikes. This sentiment seemed to be shared by Kenyan President Uhuru Kenyatta in an address to the nation on Easter Sunday, in which he called for national unity and defended Islam as a religion of peace. The issue now is whether Kenya can abide by Kenyatta’s words and unite to defeat the terror that has infiltrated it.



Resources

Foreign Relations: Why Kenya Invaded Somalia

CNN: 147 Dead, Islamist Gunmen Killed After Attack at Kenyan College

Index Mundi: Kenya’s Demographic Profile

BBC News: Who are Somalia’s Al Shabab?

Al Jazeera: Why Al Shabab has Gained a Foothold in Kenya

CNN: Kenya Airstrikes on Al Shabaab Targets Unrelated to Garissa Attacks, Source Says

Good Reads: Quotes

DW: Islamist Terror Groups in Africa and the Middle East

Horseed Media: Somalia Al Shabab Leaders in Squabble over Joining IS

NBC News: Missing Nigeria School Girls

Think Progress: Deadly University Attack Hangs Over Kenya’s Easter Sunday

Danish Institute for International Studies: Political Islam in Kenya

Michael Sliwinski

Asian Infrastructure Investment Bank: Threat to the Financial System? https://legacy.lawstreetmedia.com/issues/world/asian-infrastructure-investment-bank-threat-financial-system-know/ Sat, 04 Apr 2015 13:30:02 +0000

Will the Asian Infrastructure Investment Bank (AIIB) change the global financial system for good?

Image courtesy of [Steve Parker via Flickr]

Despite China’s strong and consistent economic growth, there have been two areas that are clearly understood to be American-dominated spheres–military and finance. While America still holds a large lead over other countries in terms of military power–at least based on money spent–that other sphere of power may be waning. Although China has long been dismissed as lacking in infrastructure and innovation, that belief is likely about to change. With the formation of the Asian Infrastructure Investment Bank, China is throwing itself into the financial arena. Read on to learn about China’s latest push for superpower status that has the potential to change the global financial system that has been in place since WWII, and casts into question the future of who controls the world’s purse strings.


History of the Current System

The modern financial system dates to 1944, when, while WWII still raged, representatives of the Allied powers met to decide the future of global finance. The result was the Bretton Woods Agreement, named for the town in New Hampshire where the meeting was held.

Bretton Woods Agreement

This agreement essentially pegged global currencies to the U.S. dollar. Countries were required to maintain a fixed exchange rate with the dollar, selling dollars to buy up their own currency if its value fell too low and printing more money if its value rose too high. It was a basic application of supply and demand, carried out with physical currency.

This, in effect, made the United States the preeminent global economic power. The system also relied on the relationship between U.S. dollars and gold, because the dollar itself was tied to a gold standard. However, the Bretton Woods system came crashing down in 1971 when the U.S. experienced stagflation–a simultaneous recession and inflation–and was forced to abandon the gold standard. In an unforeseen result, rising demand for the dollar had made it more valuable even though its value was pegged to a certain amount of gold. The resulting disparity led to a gold shortage and the need to scrap the existing system. Despite the end of the Bretton Woods system, two of its guarantor agencies, the International Monetary Fund (IMF) and the World Bank, survived and continue to this day.

The IMF

The IMF was created as part of the Bretton Woods agreement. Its original purpose was to help countries adjust their balance of payments with regard to the dollar, which served as the reserve currency. Once the gold standard was abandoned, the IMF offered members a variety of floating-currency options, excluding pegging a currency’s value to gold. The 1970s also saw the introduction of the Structural Adjustment Facilities, loans made to countries out of a trust fund administered by the IMF. The IMF has been instrumental in guiding a number of countries, particularly developing ones, through a series of crises, including the oil shocks of the 1970s and the financial crisis of 2008.

World Bank

The World Bank was originally known as the International Bank for Reconstruction and Development when it was created as part of the Bretton Woods Agreement. Initially, the bank was meant to help with reconstruction in Europe, with its first loan going to France in 1947. Over time, and following the collapse of the Bretton Woods system, it shifted its focus to fighting poverty. The World Bank’s footprint has also expanded from a single office in Washington, D.C., to offices all over the world, and it is now made up of five different development institutions. Like the IMF, it has tackled issues as they have arisen over the decades, such as social and environmental challenges.

Criticisms of the IMF and World Bank

Although the IMF and World Bank have survived for more than 70 years, they have faced extremely harsh criticism. Most criticisms of the IMF boil down to the conditions under which it grants loans: many believe the IMF intervenes too deeply in a country’s affairs by forcing it to meet arduous standards before a loan is given. There is no one-size-fits-all way for countries to operate, and the parameters the IMF sets are sometimes seen as more detrimental to a country than its existing financial troubles. There are also accusations of supporting corrupt regimes and a lack of transparency.

The World Bank faces several of these same criticisms and more. On top of ignoring individual local situations, the World Bank has been accused of enforcing a de facto Washington Consensus along with the IMF. In other words, by controlling the money, the World Bank and IMF can force countries to do what Washington wants. Additionally, the two institutions have been accused of helping large corporations at the expense of poor and developing nations. In particular, the debt associated with the loans has left many recipients mired in a perpetual state of indebtedness and therefore beholden to the IMF and World Bank structure.

The video below offers a detailed explanation of Bretton Woods, the IMF, World Bank, and the criticisms they face.

 


The Asian Infrastructure Investment Bank

Given the existing state of global finance, it comes as little surprise that China and other nations that disagree with many American policies would seek to create their own lender of last resort. That is what China, India, and a number of smaller countries now intend to do through the creation of the Asian Infrastructure Investment Bank, or AIIB. Although the details of the bank are still murky, it will essentially be a clone of the World Bank.

Aside from differing with the U.S. over policy, China and other nations are also upset over representation within the World Bank and IMF. The way the system is currently set up, an American is traditionally in charge at the World Bank and a European at the IMF.

The video below explains what the AIIB is, what it means for the U.S., and how it will impact the existing system.

With Friends Like These

While it is not shocking that a rising country like China desires its own system, free of the constraints placed upon it by the United States and its allies, the identity of several countries quick to sign up for the AIIB has been surprising. These nations include a number of traditional American allies, among them Germany, France, the U.K., and South Korea. While it is still unclear what these countries hope to gain from membership, the fact that they would willingly flout American criticism and join with China is certainly a diplomatic blow.

Progress on the AIIB

Whereas China’s new bank appears as a smack in the face to the U.S., there is still much to be decided. First of all, there was already an Asian Development Bank, so if anything the AIIB seems to be replacing that more than the World Bank or IMF. Additionally and most importantly, the AIIB has not actually been created yet, so all these defections and statements are just plans, not concrete actions. Furthermore, while countries were upset at and critical of the IMF and World Bank as being puppets of U.S. interest, this new Asian Infrastructure Investment Bank is seemingly being designed specifically to make China its unquestioned leader. Thus it bears watching how long countries want to suffer under China’s yoke and if the grass really is much greener.

There are other projects in the works as well. The U.S. has a new trade proposal of its own for the Asian Pacific that would also aid in the development of infrastructure. The following video shows how the IMF and other groups plan to work with the AIIB in the future financial environment.


Conclusion

America’s position as the global hegemon seems increasingly to be challenged in every facet from sports to entertainment to now finance. For roughly 70 years America has been the guarantor of the world’s economy; however, that is beginning to change as revealed by its inability to prevent the financial crisis in 2008 and through tests from other countries such as China. The U.S. therefore, may have to adjust to its new position in a world, where it wields less control and enjoys less prestige. The only lingering question then is not if this degradation of power will occur, but how will the U.S. respond to it?


Resources

Primary

International Monetary Fund: History

World Bank: History

Additional

About News: Bretton Woods System and 1944 Agreement

Vox: How a Chinese Infrastructure Bank Turned into a Diplomatic Disaster for the United States

Economics Help: Criticism of the IMF

Globalization 101: Why the World Bank is So Controversial

Financial Times: Superpowers Circle Each Other in Contest to Control Asia’s Future

Michael Sliwinski

The Forgotten WMDs: Chemical Weapons https://legacy.lawstreetmedia.com/issues/law-and-politics/forgotten-wmds-chemical-weapons/ Sun, 29 Mar 2015 18:30:30 +0000

Have our efforts to ban chemical weapons gone anywhere?


In discussions of international politics, we hear a lot of talk about nuclear weapons, but another deadly type of weapon often goes overlooked. Chemical weapons have proven their deadliness on the battlefield and have been deployed with greater frequency in contemporary times. Nevertheless, just two and a half years since President Obama made his famous “Red Line” speech against the use of chemical weapons in Syria, the issue has drifted from the public consciousness. While public interest has waned, these weapons are still being used on battlefields across the globe, even as efforts are made to eliminate them for good. Read on to learn about chemical warfare, the legal framework governing chemical weapons, and how successful efforts to eliminate them have been.


History of Chemical Warfare

While chemical weapons in rudimentary forms have been used for millennia, only relatively recently were they harnessed in the modern sense. Chemical weapons made their debut on the stage of WWI, when toxic gases such as chlorine and mustard gas were released from canisters on the battlefield. The results were devastating for two reasons: chemical weapons were responsible for over a million casualties on the battlefield, and they left a strong impression on the public’s consciousness. The video below explains the use of chemical warfare, particularly in WWI.

Nevertheless, the use of these weapons continued through the interwar years, particularly in places such as Russia and Africa, and was ramped up again in WWII. In the Far East, the Japanese used a variety of chemical agents in their attempted conquest of China. Meanwhile, in the European theater, chemical weapons were used by a number of parties, most notoriously by the Nazis in their death camps.

Even after WWII chemical weapons continued to be used. In one of the most glaring instances, the United States used defoliants such as Agent Orange in Vietnam. The Americans were not alone: the Soviets later employed chemical weapons in Afghanistan, and Iraq used the deadly agents in its war against Iran as well as against its own Kurdish citizens.

Additionally, the use of chemical weapons by individuals and terrorist groups has become a concern. The most prominent example came in Japan in 1995, when the Aum Shinrikyo cult released the nerve agent sarin on the Tokyo subway. Chemical weapons were also used by terrorists in Iraq and Afghanistan during the American occupations, and even ISIS has deployed them in its battles against Iraqi and Kurdish soldiers.

The most recent high-profile and controversial use occurred in Syria in 2013. In late March of that year, the use of chemical weapons was reported. While both the Syrian military and the rebels denied using the weapons, each blaming the other side, the use of chemicals had crossed what President Obama called a “red line.”

While the episode in Syria was just one in a long line of chemical weapons attacks, it aroused concern over whether the existing framework to prevent the creation and use of chemical weapons was adequate. So, what is that framework?


Legality of Chemical Weapons

The horror of chemical weapon usage in WWI left a lasting image in the minds of many. Thus in 1925, the first international agreement aimed at restricting chemical weapons was adopted: the Geneva Protocol, which prohibited the use of chemical weapons in warfare. However, the treaty proved inadequate in several ways, as it allowed the continued production of chemical weapons. It also gave countries the right to use chemical weapons against non-signatories and in retaliation if the weapons were used against them.

The Chemical Weapons Convention

Although seemingly inadequate, the Protocol nonetheless proved to be the only protection against chemical weapons for the next 65 years. Finally, in 1992, the Chemical Weapons Convention (CWC) was adopted. It was opened for signature in 1993 and entered into force in 1997. Unlike the Geneva Protocol, the CWC has a much clearer and all-encompassing goal: eliminating an entire category of weapons of mass destruction.

Specifically, the treaty calls for the prohibition of the “development, production, acquisition, stockpiling, retention, transfer or use of chemical weapons by states parties.” The chemicals themselves are divided into three schedules, a structure that may sound familiar to anyone who knows the U.S. drug classification regime. Signatories are responsible for enforcing these protocols within their own countries. Along with stopping the production of chemical weapons, states are required to destroy existing stockpiles and production facilities. Lastly, states are obligated to create a verification system for chemicals and must open themselves to snap inspections by other members. The video below details which chemicals are banned and what the CWC requires of its members.


Chemical Weapons Prohibition Regime: Success or Failure?

So is the CWC a success or a failure? Different metrics tell different stories.

Arguments for Success 

Membership in the treaty certainly casts a positive glow: 190 countries have joined, with only five–Israel, Egypt, North Korea, Angola, and South Sudan–not yet ratifying the treaty. Furthermore, real progress has been made in implementing a number of the treaty’s goals. As of 2007, 100 percent of declared chemical weapons sites had been “deactivated,” 90 percent of which had either been destroyed or converted to peaceful use. Additionally, 25 to 30 percent of stockpiles had been destroyed and 2,800 inspections had been carried out. The map below indicates countries’ signing status: light green indicates that the country signed and ratified the CWC, dark green indicates that the CWC was acceded to or succeeded to, yellow countries have signed but not ratified the CWC, and red countries are not signatories.

Image courtesy of Wikimedia

Arguments for Failure

Conversely, while those metrics point to success, a number of others tell the opposite story. The world failed to meet the 2012 deadline originally set by the treaty for completely eliminating all chemical weapons globally. The two main culprits were also two of the main catalysts behind the treaty in the first place: Russia and the United States. These two countries possess the largest stockpiles of chemical weapons, so their compliance carries significant weight. The video below shows the failures of the U.S., Russia, and other nations to uphold the treaty’s protocols.

Along with the failure to disarm is the question of favoritism. While the U.S. has been critical of other countries’ efforts to disarm, it has not pressured its close ally Israel to ratify the treaty, let alone destroy its widely suspected stockpile.

Other issues exist as well. Several countries, despite having ratified the treaty, have not set up the policing mechanisms required by the treaty to give it any actual power. The inspection process itself has been described as unfair and inadequate: because production is shifting from large factories to smaller facilities, it is difficult to inspect and punish individual labs for producing illegal compounds. Furthermore, a number of non-lethal agents used by police–such as tear gas–are not covered by the CWC and can be harmful. Lastly, while the treaty covers states, it does nothing to prevent groups such as ISIS or Al Qaeda from using these weapons.


Conclusion

As of June 2014, Syria had completed the process of giving up or destroying all of its declared chemical weapons. This was seen as a major coup, as most observers expected Syria to sandbag, especially after it missed several earlier deadlines. Although Syria declared its chemical weapons, it is still suspected of keeping secret caches. Additionally, after the first acknowledged use–the event that crossed the Red Line and led to the agreement among Russia, the U.S., and Syria–there were several more suspected incidents of chemical weapons use in Syria.

This points to the problem with the Chemical Weapons Convention. As with the Non-Proliferation Treaty for nuclear weapons, there is no governing body that can punish a country for violating it. Joining the treaty is voluntary, and there is no punishment for not joining, or even for joining and then quitting. Moreover, most of the countries that did join never had chemical weapons to begin with, so signing a treaty prohibiting them made little difference. The bottom line is that when it comes to chemical weapons, much as with nuclear or biological weapons, the onus is on the individual country to comply.


Resources

Primary

United Nations Office for Disarmament Affairs: Chemical Weapons Convention

Additional 

Fact Check.org: Obama’s Blurry Red Line

OPCW: Brief History of Chemical Weapons Use

Johnston Archive: Summary of Historical Attacks Using Chemical or Biological Weapons

American Society of International Law: The Chemical Weapons Convention After 10 Years

Arms Control Association: Chemical Weapons Convention Signatories and States-Parties

Washington Times: U.S. and Russia are Slow to Destroy Their Own Chemical Weapons Amid Syria Smackdown

Think Progress: Nobody Thought Syria Would Give Up Its Chemical Weapons. It Just Did

Military.com: U.S. to Destroy Its Largest Remaining Chemical Weapons Cache

Michael Sliwinski

The U.S. Government: A House Divided on Foreign Policy https://legacy.lawstreetmedia.com/issues/politics/us-government-house-divided-foreign-policy/ Sat, 21 Mar 2015 13:00:27 +0000

The Iran letter and Netanyahu's Congressional invitation is nothing new. Check out the history of foreign policy dissension.

Image courtesy of [Ted Eytan via Flickr]

In 1858, then-Senate candidate Abraham Lincoln made one of his most famous speeches, referencing the Bible in stating, “a house divided against itself cannot stand.” At that time, of course, Lincoln was referring to the schism that divided the nation: should we be a free country or a slave-owning one? While the slavery question has been answered, the idea of a divided nation has continued and seemingly grown as time has passed. The problem now is not any single issue, but the conduct of the various branches of government. In short, what effect does public disagreement over foreign policy have on the United States’ ability to present a unified front when trying to implement a cohesive strategy?


History of Disagreement

With the two most recent high-profile episodes of dissension in the federal government–the Senate Republicans’ letter to Iran and the House Republicans’ invitation to Israeli Prime Minister Benjamin Netanyahu to speak to Congress without executive consent–it may appear as though these events were particularly egregious; however, disagreement between members of the government is certainly nothing new. For that matter, this level of disagreement is not even extraordinary: at various times throughout the nation’s history, members or former members of the government have engaged in literal duels in which one of the parties was killed. Of course those are the extreme examples of disagreement, but they are part of our history nonetheless.

The 1980s seemed an especially popular time to publicly undermine the president and his foreign policy, as evidenced by two specific events. In 1983, Senator Ted Kennedy allegedly conspired in secret with the then-leader of the USSR to help defeat Ronald Reagan and win the presidency for himself. Just a year later, in 1984, Democrats wrote a letter to the leader of the Sandinistas in Nicaragua that criticized the president and forgave the rebel regime’s many atrocities.

Another episode occurred in 1990, when former President Jimmy Carter wrote to the members of the United Nations Security Council denouncing President Bush’s efforts to authorize the Gulf War. In 2002, several Democratic senators went to Iraq on a trip financed by the late Iraqi dictator Saddam Hussein and actively campaigned on behalf of his government, an effort aimed at undermining support for President George W. Bush’s Iraq War. The most recent example came in 2007, when newly elected Speaker of the House Nancy Pelosi traveled to Syria and met with President Bashar Assad. Even before he made war on his own people, Assad had made enemies of the Bush Administration by supporting insurgents in Iraq.

This is the context in which Congress’ most recent acts of defiance should be considered. When Speaker John Boehner invited Netanyahu to speak to Congress without the consent of the president, he knew perfectly well that Netanyahu would urge the use of force to prevent a nuclear-armed Iran: the complete opposite of the strategy pursued by the Obama Administration, which has centered on negotiation and give-and-take. The video below explains why this invitation was so controversial.

The second recent act of dissent also relates to the nuclear deal with Iran. In this case, 47 senators signed a letter to Iran stating that any agreement between President Obama and the Ayatollah would be considered an executive agreement only, subject to being overturned when a new president is elected. The video below explores the ramifications of the letter.

Taken alone, these efforts by Republicans appear outrageous, even treasonous. But they are actually just two more in a long series of moves by both parties to undermine each other. The main difference this time is that it was Republicans undermining a Democratic president.


Roles in Foreign policy for Each Branch of the Government

The three branches of the government–judicial, legislative, and executive–each play a role in determining foreign policy. While the courts are instrumental in determining the constitutionality, and therefore legality, of agreements, the legislative and executive branches are the real driving forces behind U.S. foreign policy. So what are their roles?

Executive

On paper, the president’s foreign policy powers seem limited. According to the Constitution, he is limited to his role as Commander in Chief of the armed forces and to nominating and appointing officials. However, the president has several unofficial powers that are more encompassing. First is the executive agreement, which allows the president to make an accord without the consent of Congress. This is what Obama did, for example, on immigration in fall 2014, and it is the situation to which Republicans referred in their letter to Iran.

This power is perhaps the most important, as it lets the president pursue his agenda without needing the support of a Congress that is often hostile to his ambitions. Along the same track, the president has the ability to set the foreign policy agenda, and by doing so to make it the agenda for the entire nation.

The executive branch also controls the means to carry out foreign policy through its various agencies. Of particular importance are the Department of State, which handles foreign affairs, and the Department of Defense, which is in charge of military operations. The intelligence community is also a key cog in this branch of government.

Legislative

The role of this branch has traditionally been threefold: advising the president, approving or disapproving the president’s foreign policy agreements, and confirming appointments to the State Department. Recently these powers have come under challenge, as Obama himself conducted military actions in Libya without first getting war powers consent from Congress.

Like everything else, the roles taken on by the branches with regard to foreign policy have expanded far beyond those originally outlined in the Constitution. Nevertheless, because the president, as mentioned previously, serves as both the face of policy and its catalyst, it is generally assumed that he will take the lead in these matters. However, a certain gray area remains as to who has the right to do what. That boundary was supposed to be defined through legislation, namely the Logan Act; however, perpetually changing circumstances, such as the role of the media, have continued to blur the lines of acceptable conduct.


What Happens Next

So what is to be done about these quarrelsome representatives and senators? When Pelosi made her infamous trip to visit Assad in 2007, the Bush Administration was extremely angry and reacted accordingly, deeming her actions criminal and possibly treasonous. If this rhetoric sounds familiar, that is because the same phrases are now being hurled at the members of Congress who invited Netanyahu to speak and who condemned Iran with their signatures.

The Logan Act

The real issue here is who is conducting foreign policy, as opposed to who is supposed to be conducting it according to the Logan Act. The act was passed in 1799 in response to its namesake’s attempt to single-handedly end the Quasi-War with France through a solo journey to the country. The basic outline of the act is that no unauthorized person is allowed to negotiate with a foreign government on behalf of the United States during a dispute. Thus, while in theory the act resolved who was qualified to represent U.S. foreign policy, the video below explains that is far from what actually occurred.

Along with the damning words being thrown about, critics of the Republican actions have also called for prosecution under this relatively obscure law; however, no such indictments are likely to take place, as no one has ever been prosecuted under it, not even the man for whom it was named. In addition, the language itself is unclear; for example, wouldn’t members of Congress be considered authorized persons? These threats of prosecution, along with the strong language being thrown about, hide another important factor in this whole mess: the role of the media.


Media’s Role

In the tumult following the Iran letter, a somewhat important piece of evidence has been overlooked. While the senators, including Majority Leader Mitch McConnell, did sign a letter, the letter was not actually sent anywhere. In fact, after getting 46 other senators to sign it, Senator Tom Cotton simply posted it to his own website and social media accounts. Similarly with the Netanyahu speech: while it is odd for a foreign leader to address Congress without the approval of the president, the significance of the episode owes as much to the stage on which it was broadcast as to its peculiarity.

There is a long history of government officials undermining the White House’s foreign policy. In 2015, however, there are so many avenues to openly and publicly express dissent that such episodes become bigger deals than ever. Information is now so accessible that when someone posts something to social media, anyone in the world can see it, a far cry from something broadcast on network news 20 years ago.


Conclusion

In 1951, President Truman removed General MacArthur from command in the Korean War. While MacArthur was one of the most renowned war heroes of WWII, his threats to invade China and expand the war undermined Truman’s efforts to negotiate an end to the conflict. Truman was able to dismiss MacArthur, but the same is not true in the current cases of one branch of government undermining another. Unlike MacArthur, who was a general beholden to the president, these representatives and senators are beholden to the people and cannot be so easily removed. Nor should they be, not only because the precedent for this type of disagreement has been set, but also because the president should not have the ability to dismiss everyone who disagrees with him. People voicing their opinions, after all, is the whole idea behind representative government.

While the recent Republican actions can certainly be called ill-advised at the least, the question of their illegality is much less clear. The Iranians, for their part, took the letter about as well as could be expected, acknowledging its obvious political nature.


Resources

Washington Examiner: 5 Times Democrats Undermined Republican Presidents With Foreign Governments

Foreign Policy Association: How Foreign Policy is Made

Politico: John Boehner’s Bibi Invite Sets Up Showdown With White House

Intercept: The Parties Role Reversal on Interfering With the Commander-in-Chief’s Foreign Policy

Politico: Iran, Tom Cotton and the Bizarre History of the Logan Act

National Review: The Cotton Letter Was Not Sent Anywhere, Especially Not to Iran

LA Times: Netanyahu’s Speech to Congress Has Politics Written All Over it

The New York Times: Iranian Officials Ask Kerry about Republicans’ Letter

CNN: Did 47 Republican Senators Break the Law in Plain Sight?

Michael Sliwinski

ISIS and the Terrorist Social Network https://legacy.lawstreetmedia.com/issues/technology/isis-terrorist-social-network/ Sun, 15 Mar 2015 15:37:15 +0000

How ISIS uses social media to gain supporters, spread its message, and solicit money.

Image courtesy of [Andreas Eldh via Flickr]

The Islamic State of Iraq and Syria (ISIS) is well known for its brutality and fighting prowess. However, to create a caliphate and establish its own vision of Islam, ISIS leaders have done more than win battles and intimidate enemies. Taking a page from the Arab Spring, the group has adopted a very modern approach to attracting its followers and spreading its message. Read on to learn about ISIS’ use of social media and the results of its campaigns.

Read more: Understanding ISIS’ Radical Apocalyptic Vision

ISIS and Social Media

The use of media by terrorist groups is not new; even Al Qaeda, in some ways ISIS’ precursor, relied on it. Under ISIS, however, a transformation has begun. It started slowly: when ISIS was first on the rise, it engaged mostly in simple, private media communications among its own members or dissidents. But with the fall of Mosul in June 2014, the group finally had its stage and was ready to broadcast to a world audience. Far from the grainy videos of Osama Bin Laden wandering the mountains, ISIS began live-tweeting its actions and posting statuses on Facebook. On Twitter especially, the group has succeeded in delivering its message by commandeering popular hashtags.

ISIS fighters have also taken selfies next to victims or in occupied areas in attempts to show how great life is under the aspiring caliphate. ISIS has even engaged in unsolicited product placement, flashing images of Nutella and Call of Duty in videos and other media. Perhaps most importantly to its Western audience, it has attracted an English-speaking membership that can communicate directly with the English-speaking world. Perhaps no better example exists than the man known as “Jihadi John.” Born Mohammed Emwazi, he graduated with a degree in computer science from the University of Westminster in England, yet in 2013 he left Britain for Syria. Emwazi is by now a familiar figure, having been involved in several high-profile executions of non-Muslims.

ISIS has even utilized less popular forms of social media. For example, it has used PalTalk, a video chatroom where radical clerics have convened to praise ISIS and its leadership. The group created an Android app called Fajer Al Bashayer (Dawn of the Good Omens) that provides users with up-to-the-minute updates on ISIS’ movements. The app also includes software that appropriates downloaders’ Twitter accounts and uses them to further propagate the group’s ideology. ISIS even has its own magazine, Dabiq, which combines graphic insights into violence perpetrated by the group with interviews of its members, resembling a sort of gossip magazine. The video below details how ISIS has been using social media to its advantage.


Influence of the Arab Spring

How did ISIS end up turning to social media to further its cause? Well, it may have taken some inspiration from the Arab Spring. In 2011, one of the catalysts that fueled the Arab Spring movement was the use of social media to coordinate gatherings and denounce authoritarian regimes. While social media had been employed for similar causes before, the scope in this case was revolutionary and transformative.

Various Middle Eastern leaders took notice and began to censor social media access they deemed dangerous. This may have had the negative consequence of chasing off progressive voices who lost faith in social media as a means of communication. But it also gave groups like ISIS ideas about powerful ways to attract members and money. The accompanying video explains the way social media has been used from the Arab Spring to ISIS.


Have ISIS’ social media campaigns been successful?

How successful has the group been at attracting new fighters and inflows of capital? The results can be broken into two categories: those who have pledged direct support to the group, and the potential lone wolves it has inspired at home in Western nations.

Direct Supporters of ISIS

The first group includes people who have actually moved to ISIS-controlled areas. Many of them, particularly from the West, are drawn by the notion of a Muslim paradise. Often they feel out of place in Western culture. Many are young and eager to find a place where they can be accepted.

The message seems to be finding fertile ground, too, as thousands of Westerners, including teenagers, have already gone to the Middle East to fight for ISIS. Evidence of this startling trend can be found all over the West. In late 2014, three sisters from Colorado were stopped in Germany as they tried to fly to ISIS-controlled territory. More recently, the news has focused on three teenage British girls who are believed to have left their homes to join ISIS.

While ISIS is sinister in every way, its recruitment of girls and young women is especially so. Preying upon feelings of alienation and offering acceptance, ISIS has lured many women from Western nations to its cause. While many of these girls may dream of aiding a movement and finding a soulmate, they often experience something much worse. Their fates can include rape, forced marriages, and even enslavement at the hands of their alleged liberators.

How exactly is ISIS seducing these women and its other alienated recruits? The answer to that question comes in two parts. First, ISIS tries to attract attention and create a bigger name for itself. The end goal here is to project its strength and its ability to stand up to entrenched powers such as the United States. This strategy can speak particularly to people who feel victimized by the dominant cultures in the West.

Second, the group has made a series of videos depicting how great life is under ISIS. These highlight the group’s charity work, its efforts at establishing what it considers a proper Muslim state, and choreographed scenes of violence meant to appeal to viewers. ISIS also has responders who directly engage Westerners who feel an inclination to join. These responders act as recruiters, echoing the themes of the videos: the greatness of life under ISIS and the satisfaction women and others can find living in its sphere.

The group is also getting financial support online. ISIS has used Twitter to solicit donations along with recruits, despite the best efforts of the U.S. government.

Lastly, the group has been able to garner support and allegiance from like-minded terrorist organizations through social media. Recently, the infamous Nigerian terrorist group Boko Haram pledged its support for ISIS and has even begun adopting some of its tactics for publicity and recruitment.

Read More: Boko Haram: How Can Nigeria Stop the Terror?

Lone Wolves

Along with calling for would-be jihadis to come join the cause in Iraq and Syria or to provide donations, ISIS has also employed another tactic. It has used social media campaigns recorded in French and subtitled in English to encourage radical action in Western countries. Instead of encouraging dissatisfied men and women in these areas to come join the war in the Middle East, it calls for them to make war against their own governments at home. In this regard, there also seem to be some examples of success on ISIS’ part. The most notorious so far are the attack on the magazine Charlie Hebdo in Paris and the killing of hostages a few days later in a kosher deli. These, along with subsequent attacks on police officers patrolling the city, have been attributed to ISIS-inspired terrorists, although exact motives remain uncertain. The video below depicts ISIS’ efforts to arouse lone wolves in the West.


Fighting Back

While ISIS has shown a mastery of modern-day social networks, Western forces are fighting back. The United States has already launched a major social media offensive, dedicating a contingent of manpower and materials to fighting ISIS propaganda online. The British have taken a similar approach, adopting the American model for their own program. Both countries are also pressuring companies such as Twitter and YouTube to clean up their sites and rid them of ISIS propaganda.

It is far from clear how effective these efforts have been. Many experts caution against ridding the web entirely of ISIS and its supporters, as their posts can be valuable sources of information on the group. Additionally, while the U.S. and British governments are launching their own offensives against ISIS, many people remain skeptical about how effective government-run social media can be. Lastly, there are strategic concerns to consider. While ISIS comes off as repulsive to most people, denying the group the opportunity to speak could create a mystique around it, which could actually aid its recruiting.


Conclusion

ISIS’ use and mastery of social media is intriguing. The fact that it uses sites such as Twitter or Facebook seems almost unbelievable, and stands in direct contrast to common assumptions about the backward nature of terrorist organizations. Additionally, the responses by the United States and its allies clearly show that the nature of warfare has rapidly changed in the social media age.

Despite the seemingly harmless means by which it communicates and disseminates its messages, ISIS remains a ruthless terrorist organization. It is also clear, however, that it is successful both on the battlefield and on the internet. The next challenge for the West is countering ISIS’ message while pushing back in Iraq and Syria. Unfortunately, the military part will likely be the easier path, even as the debate over putting boots on the ground proves divisive. There’s a new battle being fought, but this time, it’s on our computers.


Resources

Primary

Anti-Defamation League: Hashtag Terror

Additional

Independent: Mohammed Emwazi

CNN: What is ISIS’ Appeal to Young People?

CBS News: ISIS Message Resonating With Young People From U.S., West

U.S. News & World Report: ISIS Ability to Recruit Women Baffles West, Strengthens Cause

Hill: ISIS Rakes in Donations on Twitter

Newsmax: Tell ISIS Aligned Groups They Are Targets

Fox News: “What Are You Waiting For?”

Daily Beast: Can the West Beat ISIS on the Web?

Daily Beast: ISIS is Using Social Media to Reach You, Its New Audience

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post ISIS and the Terrorist Social Network appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/technology/isis-terrorist-social-network/feed/ 0 35826
Russia’s Aggressive Foreign Policy https://legacy.lawstreetmedia.com/issues/world/russias-aggressive-foreign-policy/ https://legacy.lawstreetmedia.com/issues/world/russias-aggressive-foreign-policy/#respond Sat, 07 Mar 2015 15:00:45 +0000 http://lawstreetmedia.wpengine.com/?p=35570

Putin's aggressive foreign policy is making a splash. Will it work?

The post Russia’s Aggressive Foreign Policy appeared first on Law Street.

]]>
Image courtesy of [Jennifer Boyer via Flickr]

Winston Churchill famously said that “Russia is a riddle wrapped in a mystery inside an enigma.” While the quote may be well worn, it is still surprisingly appropriate when discussing Russia today. Just a few years ago, old Cold War rivals Russia and the United States seemed to finally bond over their shared struggles against terrorism and to be on the path to real cooperation. But then, Russia changed course. Instead of trying to ingratiate itself into the international community, Russia took some steps that can be labeled as aggressive. Aside from a long-brewing conflict with Chechnya, it fought a war against the Republic of Georgia and is now slowly devouring Ukrainian territory. Those moves left many wondering: why did Russia feel the need to make such a drastic change in its global political relations? Read on to learn about Russia’s origins, historical political relationships, and foreign policy.


Russian History

Rise and Imperial History

While the area today known as Russia had been populated by steppe nomads for thousands of years, eastern European Slavs moved into the area only around 1,500 years ago. Vikings also sailed into modern-day Russia and established rule in Kiev in the late ninth century. Early Russians adopted many of the practices of the Byzantine Empire, including the Orthodox religion. Following the fall of Constantinople, Russian leaders declared Moscow its successor. Russia’s rulers adopted the title of tsar, derived from Caesar.

Russia continued to grow, but this growth was nearly undone when the Mongols conquered Russia in the thirteenth century, burning Kiev and sacking Moscow along the way. The Mongols then held sway over Russia for the next 200 years until the end of the fifteenth century when Russian rulers finally were strong enough to throw off the Mongol yoke.

Following this emancipation, Russia’s rulers continued expanding, reaching the shores of the Pacific in 1649 under the Romanov dynasty. Russia also attempted to gain further footholds in Europe, mainly by acquiring seaports on the Baltic to the north and the Mediterranean to the south. As it did so, Russia came into greater contact with Europe and participated in a number of wars, including the defeat of Napoleon. Contact with Europe also forced Russia to confront its many backward policies. In the early twentieth century, revolutionaries inspired by communism began to gain traction. During World War I, the Romanov family was overthrown and the Soviet Union was established.

Soviet Union

The Soviet Union was the successor to Romanov rule in Russia, but not without a fight: it was established after the victory of the Bolshevik Red Army in the Russian Civil War. Following their ascent to power, the Soviets enacted a series of purges and five-year plans that left the country weak and starving heading into WWII. The Soviets initially signed a non-aggression pact with the Nazis in exchange for control over several eastern European countries and a partition of Poland; however, the pact was broken in 1941, when Germany invaded the Soviet Union. Nevertheless, the Soviets were able to withstand the attack, push back the Nazis, and establish themselves after the war as one of two superpowers alongside the United States.

Following the war, the USSR and U.S. engaged in a protracted Cold War. Both sides competed with each other in arms and space races. While they never engaged directly in wars, their proxies faced off against one another several times during this period. Following the Cuban Missile Crisis, cooler heads began to prevail, the rhetoric surrounding nuclear war was reduced, and several arms control treaties were signed. Beginning in the 1980s, the USSR started to liberalize as its economy and empire began to crumble. Finally, in 1991, the USSR dissolved into a number of independent countries, with Russia as its leading member.

Post-USSR

Following the collapse of the Soviet Union, Russia was in disarray. As the country struggled with the shift from communism to free-market capitalism, inflation soared. The Russian economy, under the leadership of Boris Yeltsin, barely avoided total collapse, reaching the point of needing to import food to stave off starvation. Following the resignation of Yeltsin and the rise of Putin, the country began to stabilize and its foreign policy began to take its present shape. The following video gives a brief summary of modern Russian history.


Current Foreign Policy

Russia’s current foreign policy can be summed up in one word: aggressive. The reason for this shift toward conquest, oppression, and authoritarianism can be traced to two things. First is the desire of many Russians to return to the prestige of the Soviet Union. Second is the man leading both that change and the nation itself, Vladimir Putin. The video below looks at Russia’s current foreign policy.

Vladimir Putin

The man who bears responsibility for many of Russia’s decisions since the fall of the USSR is its longtime leader, President Vladimir Putin. Putin was born in Leningrad during the height of the Soviet Union’s power; however, he came of age professionally just as the empire was disintegrating. Even after the USSR collapsed around him, Putin was determined to restore Russia to its status as a global power. Below is an excerpt from a speech Putin gave when he was a candidate for prime minister in 1999:

Russia has been a great power for centuries, and remains so. It has always had and still has legitimate zones of interest abroad in both the former Soviet lands and elsewhere. We should not drop our guard in this respect, neither should we allow our opinion to be ignored.

Since becoming prime minister and subsequently president following Yeltsin’s resignation, Putin has done everything in his power to live up to these words. His first order of business was finally crushing the independent state of Chechnya. Chechnya, a small area in the North Caucasus region of southwestern Russia, had actually defeated the Russian army in the 1990s and formed a short-lived nation of its own.

After reestablishing Russia’s military strength, Putin moved to curb the power of the oligarchs who had become fabulously wealthy by taking control of state-owned industries following the fall of the USSR. He arrested and silenced critics, such as the fallen oligarch Mikhail Khodorkovsky. This policy has only continued as Putin’s stranglehold on power has intensified. Along with leading the country since his ascent in 2000 as either president or prime minister, he has engaged in further military actions, including dispatching soldiers to crush Georgian troops and annexing Crimea. Recently, Russian troops have also been implicated in separatist movements in Eastern Ukraine. The video below discusses Putin’s life.

Foray into Ukraine

While outsiders may view Russia’s recent intervention in Ukrainian affairs as aggressive, the majority of Russian citizens hold the opposite opinion, for several reasons. First, to many Russians, Ukraine is part of their historical empire, and thus it is only natural that it be restored to Russia.

The conflict in Ukraine started when Russian-backed Ukrainian President Viktor Yanukovych was ousted following his unpopular decision to remain aligned with Russia instead of integrating with the European Union. In response, Russian troops invaded and occupied Crimea, which eventually voted in a referendum to become part of Russia. Since the annexation of Crimea, Russia has continued supporting ethnic Russian separatists in Eastern Ukraine, where they are the majority. This has aroused great controversy because, despite several ceasefires, Russia has continued to provide separatists with weapons and possibly soldiers.

Many Russians also believe the entire uprising in Ukraine is the result of Western actions. A common argument is that Russia has actually intervened to protect Russian speakers, just as many Western countries do for other minority groups. However, the opinions of everyday Russians are heavily influenced by the Russian media, which is largely run by the state and thus broadcasts the state’s message.

Russia’s next course of action remains up in the air. Economically, it would seem obvious that Russia has to stop being so aggressive and work toward appeasing its Western creditors and consumers. Economic sanctions placed on Russia following its actions in Ukraine are beginning to be felt; their main effects have been to deny Russia credit and access to markets. Nonetheless, as yet another breached ceasefire implies, Russia doesn’t seem content to return Eastern Ukraine–and certainly not Crimea–to the original status quo.

Other Foreign Policy Concerns for Russia 

Along with sanctions, an even greater problem suggests Russia should curtail its recent aggressive maneuvering: falling oil prices. At the beginning of the year, the price of oil dropped below $50 a barrel. This is devastating to a Russian economy that depends on oil as its main export.

From an economic standpoint, this has been disastrous for the ruble, which dropped 17.5 percent against the dollar in just the first two weeks of 2015. The economy in general is hurting as well; it is projected to contract by three to five percent this year. What this means for people on the street is also troubling: lower crude prices weaken the ruble, which in turn means higher prices for other goods, particularly foodstuffs.

All of these economic woes have negatively impacted another grand Putin endeavor, the Eurasian Union. As the name implies, it is an economic union made up of Russia, Belarus, Kazakhstan, Armenia, and Kyrgyzstan that is supposed to rival the EU. However, with oil revenues falling in Russia and currencies declining at home, all of the members are already discovering the side effects of allying with a troubled Russia. The member countries are also wary of sovereignty violations by Russia, similar to the ones that have already occurred in Georgia, Crimea, and now Eastern Ukraine.

It seems unlikely that Russia will stop pursuing such an aggressive approach, however. As a de facto dictator, Putin must keep his people happy enough that they will not revolt. In this regard he seems to have been very successful. In December 2014 he was named Russia’s Man of the Year for the fifteenth time in a row. Putin’s popularity has in fact hovered at around 70 percent for his entire time in office, spiking even higher during the invasion of Georgia and following the annexation of Crimea. It actually works to Putin’s benefit to maintain a strong appearance in the face of alleged Western aggression. While people in the West may question the authenticity of these ratings, any Western politician would love to have the same kind of popularity.

Putin has also increased spending on the military. Even with the economy in crisis, military spending actually increased this year, rising to $50 billion. The effect of this spending has been evident in increased navy patrols, air maneuvers, improved equipment, and greater overall activity. It also included the purchase of dozens of new state-of-the-art nuclear weapons to replace obsolete models from the Cold War.

So, Russia’s policies are working, at least in part. While they have proven very costly to the average Russian and the economy overall, they have not dissuaded Putin from his desire to restore Russian prestige. Frankly, this should not be surprising, given his high approval ratings and the West’s resistance to anything more than soft-power tactics. The real question going forward is how much further Russia will go down this path. Will it stop with Eastern Ukraine, or go further and risk overstretching? At some point the West will likely draw a line in the sand, and if Russia crosses it, what will be next for Russia and the international community whose rules it refuses to abide by?


Resources

BBC News: Vladimir Putin

History World: History of Russia

The New York Times: Why Russians Back Putin on Ukraine

Business Insider: How Do We Know Russia Economic Crisis Has Officially Arrived?

Foreign Policy: Putin’s Eurasian Dream is Over Before it Began

Atlantic: Putin’s Popularity Much Stronger Than the Ruble

PBS: What Has Been the Effect of Western Sanctions on Russia?

U.S. News & World Report: Putin Defends Actions in Ukraine.

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Russia’s Aggressive Foreign Policy appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/russias-aggressive-foreign-policy/feed/ 0 35570
The Philippines: A U.S. Ally Grapples with Terrorism https://legacy.lawstreetmedia.com/issues/world/philippines-u-s-ally-grapples-terrorism/ https://legacy.lawstreetmedia.com/issues/world/philippines-u-s-ally-grapples-terrorism/#respond Fri, 27 Feb 2015 21:19:45 +0000 http://lawstreetmedia.wpengine.com/?p=35118

The United States and the Philippines are working together to fight terrorism.

The post The Philippines: A U.S. Ally Grapples with Terrorism appeared first on Law Street.

]]>
Image courtesy of [DVIDSHUB via Flickr]

Terrorism is a global problem and has been an especially challenging issue for the Philippines. A nation with a long and complicated history with the United States, the Philippines plays an important role on the global stage. Read on to learn about the history of the Philippines, its relationship with the U.S., and the struggles it faces today.


History of the Philippines

The settlement of the island nation began as early as 30,000 years ago and continued with waves of Malay immigrants and Chinese merchants. Islam was brought to the area around 1500, and as Islam spread, Christianity was also introduced.

Christianity was brought to the Philippines by the Spanish, who then spent the next two centuries conquering the nation and establishing colonial rule. Spanish rule was ultimately challenged, and the Spanish were temporarily defeated by the British in the 1760s. While the Philippines was eventually returned to Spain, the mindset had changed, and rebellions against colonial rule became more prevalent, especially among the ostracized Muslim communities. As a result, Spain slowly allowed the nation greater freedom, eventually permitting free trade and a form of quasi-independence.

Despite increased freedom, resistance and nationalism continued to grow, led by native Filipino members of the clergy. This led to a series of revolts that Spain was able to put down until it went to war with the United States in 1898. The Spanish were defeated by the U.S. and subsequently relinquished control of the Philippines to the United States. The video below explores the history of the nation.


Relationship With the United States

Philippines: An American Colony

While some in the Philippines saw the Americans as liberators and fought alongside them against the Spanish, this viewpoint soon changed. Although the Filipinos quickly attempted to assert their own independence and even elected a president, the Americans snuffed out any efforts toward immediate independence. This led to years of fighting between the two countries.

Americans eventually became the de facto new colonizers of the Philippines, with Filipinos supposedly being brought along the path toward independent self-government. The final path toward independence did begin in 1934 with the creation of the Commonwealth of the Philippines. Soon after, the Philippines saw the election of its first president, Manuel Quezon, and the approval of its constitution. This time these actions were also sanctioned by the United States. The American plan was to allow for a ten-year transition period before proclaiming the Philippines an independent nation; however, this was all quickly undone when the Japanese captured the Philippines during WWII. The nation was eventually freed from Japanese rule in 1945 and during the following year, 1946, finally gained its independence.

Philippines: After Independence

Although technically independent, the Philippines was still highly dependent on the U.S. for trade, and there were still numerous American military bases on the islands. These bases and other forms of American intervention would occasionally crop up as major issues for Filipinos for the rest of the century. There were also concerns over American support for President Marcos, a strongman who effectively ruled the country as a dictator for over 20 years.

A particular low point in the relationship came in 1991, when the U.S. was forced to abandon its military bases in the Philippines after the government refused to renew the leases. However, the threat of a rising China and the events of 9/11 caused the Philippines to again seek a closer partnership with the U.S.

In 1999, the two sides signed a Visiting Forces Agreement under which the two countries could engage in joint military exercises as long as no American bases were established and the U.S. maintained a non-combatant role.

Following 9/11, a rotating Joint-Operations Task Force numbering approximately 600 soldiers was also created in the Philippines. Its purpose was to help the country fight Islamist extremist groups. While several such task forces were created worldwide to fight terrorism following 9/11, the Philippines, as a long-standing American ally, was an area of grave concern. Not only was there already an established Islamic insurgency in the south, but there were concerns over two terrorist groups, Abu Sayyaf and Jemaah Islamiyah, which operate in the Philippines and have ties to other international terror organizations, including al-Qaeda.

Yet another agreement was signed in 2002, which permitted the U.S. to use the Philippines as a resupply center. The Philippines is a useful ally for the U.S. to have, especially when it comes to a sometimes contentious American relationship with China.

In addition, the U.S. and the Philippines have signed the Enhanced Defense Cooperation Agreement, which allows greater access by U.S. personnel to Filipino military bases, the construction of new U.S. facilities, and the positioning of defensive equipment. In 2014, while military cooperation was still ongoing, it was announced that the Joint-Operations Task Force would be dissolved because progress had been made. The video below documents U.S. efforts in the Philippines.


What issues are the Philippines facing now?

While many of the recent collaborative agreements between the U.S. and the Philippines have been part of the United States’ overall involvement in Asia, the relationship between the two sides truly regained strength after 9/11. As terrorism became a main foreign policy concern for the U.S., it looked abroad to combat a wide variety of terrorist organizations, leading to its efforts in the Philippines.

In addition, the Philippines struggles with militant groups that make it difficult to successfully run the country. The current President of the Philippines is Benigno Aquino III, who was elected in 2010. He has had to deal with many issues, including Filipino-American relations and the push against the terrorist and militant groups in the nation.


Terrorism

There are three prominent terrorist groups in the Philippines according to the U.S. State Department. These three are the Abu Sayyaf Group, the New People’s Army, and Jemaah Islamiyah. The Abu Sayyaf Group and Jemaah Islamiyah are both Islamist groups.

Abu Sayyaf Group

The Abu Sayyaf Group, or ASG, is a splinter group of the Moro National Liberation Front. While smaller than the others, it has been the most aggressive. Its list of transgressions is long and includes such nefarious acts as murder, kidnapping, extortion, and robbery; it is mostly funded through those robberies. It operates primarily out of the southern islands of the Philippines, which are home to the largest share of the country’s Muslim minority.

Jemaah Islamiyah

The other Islamic extremist group is Jemaah Islamiyah. Unlike the ASG, Jemaah Islamiyah is based in Indonesia but operates in the Philippines. The group engages in many of the same criminal enterprises as the ASG, particularly bomb-making. Both groups have ties to al-Qaeda, which has provided logistical support, particularly to Jemaah Islamiyah.

New People’s Army 

The third group is a bit of a throwback to an earlier era. The New People’s Army, or NPA, is the armed wing of the Communist Party of the Philippines, founded with the goal of overthrowing the Filipino government. Unsurprisingly, the group was founded in 1969, during the height of the Cold War. The group mainly targets public officials and U.S. personnel, as it is highly critical of the U.S. presence on the islands. The NPA receives most of its funding locally or from expatriates in other countries. While the group’s aims differ from those of the Islamist groups, its members still often train alongside them.

Other Actors

Two other organizations, the Alex Boncayao Brigade and the Pentagon Gang, were formerly listed on the State Department’s list of terrorist organizations. However, their capacity has been reduced to the point where they are no longer considered terror groups. The following video gives a detailed explanation of terrorism in the Philippines.


Militant Groups

Moro National Liberation Front 

Along with the terrorist groups operating in the Philippines are two very prominent militant groups. The first is the Moro National Liberation Front, or MNLF. “Moro,” which comes from the word “Moor,” is the Spanish name for Muslims in the Philippines. Founded in the 1970s, this group has waged a guerrilla campaign against the Filipino government, which it believes has marginalized Muslims in the southern area of Mindanao. In 1996 the two sides reached an agreement under which Mindanao achieved semi-autonomy from the government in Manila. Following the agreement and a failed uprising, the MNLF’s status has declined.

Moro Islamic Liberation Front

The second group is the similarly named Moro Islamic Liberation Front, or MILF. The overlap extends beyond the name, as the MILF is actually a splinter group formed from the MNLF. Also founded in the 1970s, this organization employs many of the same tactics as the MNLF. The MILF reached its own peace agreement with the government in 2001; however, whereas the MNLF declined following its treaty with the government, the MILF–the larger of the two–continued fighting in hopes of creating an independent Islamic nation in the south.

As fighting continued for the next decade, both sides were also working to reach some kind of a peace agreement, which they finally did in 2014.


Current Outlook

With peace made between the main insurgent threat and the Filipino government, it is fair to ask whether the efforts of both the Filipino government and the U.S. have succeeded. While the terror groups have not been completely eliminated and probably never will be, their capabilities have been reduced to the point that the U.S. feels comfortable dissolving its anti-terrorism unit there. Thus, while it may not be the best-case scenario, it does provide a type of closure in the war on terror that is better for both sides than more fighting. This type of agreement might also prove to be the standard going forward for other nations afflicted by terrorism.

There are of course many other issues that the Filipinos will have to address in the coming years. As the continued U.S. presence suggests, the Philippines may be a central point of action if relations between China and the U.S. deteriorate past the point of no return. Although this seems far from certain, potential flashpoint disagreements still exist between China and its neighbors, many of whom are U.S. allies, including the Philippines.

Other issues also exist, such as extreme poverty. The gravity of this problem was on display following the devastation of Typhoon Haiyan, which killed over eight thousand people. The storm also destroyed large swaths of desperately needed farmland and displaced as many as four million people, forcing them to seek help from outside sources. Many people there were already living on around a dollar a day and scavenging just to get adequate food supplies.

Domestic violence has also been on the rise in the nation. While more cases were naturally expected to be reported following the passage of the Anti-Violence Against Women and Their Children Act in 2004, the results are unsettling. According to one report by the Women and Children Protection Center, the rate of violence rose over 150 percent from 2004 to 2011. While these numbers are unnerving, it is still suspected that incidents are underreported, as abuse is seen as a private matter.

These are only some examples of existing issues, and while they are certainly not exclusively Filipino problems, they do point to areas of future concern. Also, while an agreement is in place, something more concrete will likely need to be worked out between the ruling government in Manila and its autonomous regions. Whether that means full independence or greater inclusion of the Muslim minority, the status quo does not appear likely to hold forever, as history suggests.


Resources

Primary

Council of Foreign Relations: Terrorism Havens: Philippines

Additional

Anti-Defamation League: The Philippines and Terrorism

Nations Online: History of the Philippines

Foreign Policy: Old Frenemies

War on the Rocks: End of An Era in the Philippines

Global Security: Moro Islamic Liberation Front

Huffington Post: Is This What Terror War Success Looks Like?

Reuters: Typhoon Haiyan

IRIN: Philippines Steep Rise in Gender Based Violence

International Business Times: China-Philippines Territorial Dispute

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post The Philippines: A U.S. Ally Grapples with Terrorism appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/philippines-u-s-ally-grapples-terrorism/feed/ 0 35118
Myanmar or Burma: Conflict in a Country With Two Names https://legacy.lawstreetmedia.com/issues/world/myanamar-burma-conflict-country-two-names/ https://legacy.lawstreetmedia.com/issues/world/myanamar-burma-conflict-country-two-names/#comments Sat, 21 Feb 2015 15:00:37 +0000 http://lawstreetmedia.wpengine.com/?p=34688

The latest on the protracted conflict in Myanmar/Burma.

The post Myanmar or Burma: Conflict in a Country With Two Names appeared first on Law Street.

]]>

In a country where even the name is a contentious issue, Burma, or Myanmar as it has more recently been known, is home once again to a brewing conflict. There are many competing sides in this struggle, including the much-criticized government, the victimized Muslim population, and Buddhist monks advocating a nationalist message. The violence has already led to mass displacement, reprisal attacks, and questions about Myanmar and its democratic reforms. A shaky government is gearing up for the 2015 elections, but the outcome of this conflict is anything but certain. Read on to learn about the history of Myanmar, the current conflict, and the prognosis.


What is Myanmar?

Myanmar, or Burma as it was known prior to 1989, is a country located in Southeast Asia. Its population of approximately 55 million is composed primarily of Burmese Buddhists, but sizable minority groups reside there as well. For most of its history, Myanmar was home to independent Burmese kingdoms, until it was conquered by the British and ruled as a colony from the nineteenth century into the twentieth.

In 1937, Myanmar was finally separated from the British colony of India, and in 1948 gained full independence; however, independence did not exactly open the country to opportunity. Instead, for the next 60 or so years it was ruled by one military dictatorship after another. Finally, in 2011, a new quasi-civilian government was elected under the leadership of a man named Thein Sein and long-awaited reforms began. These reforms included releasing long-held political prisoners, agreeing to peace with minority groups, and opening up the press and the rest of society. The video below provides a description of the country’s road from independence to the present.

While the country still struggles with violence and implementing reforms, its name also remains something of a quagmire. Although the government officially changed the name from Burma to Myanmar following a brutal crackdown in 1989, the world has been slow to accept it. The United States, for example, while hopeful about Myanmar’s democratic aspirations, still officially uses the name Burma so as not to appear to sanction human rights violations there; however, during a historic visit in 2012, President Obama did refer to the nation as Myanmar instead of Burma. For the remainder of this article, I will use Myanmar to avoid confusion.

A History of Conflict

Long before the British invaded, Myanmar was home to extensive conflict. Much of it was centered in a region called Rakhine, inhabited by the Rakhine people. The region has been repeatedly invaded by multiple forces. One of those forces was the minority Rohingya Muslim population, who clashed with the Buddhist Rakhine people. Rakhine has also been invaded, mostly for political and historical reasons, by the Buddhist Burmese, from whom the Rakhine are ethnically distinct. Rakhine is a powder keg of a region that has consistently been a center of conflict in Myanmar.

There have also consistently been tensions between the Rohingya Muslims and the Buddhists (both Burmese and Rakhine). Some of it may stem from WWII, when Rohingya Muslims remained loyal to the British, while the Buddhists supported Japan in hopes of achieving independence. This was compounded after WWII when the Rohingya attempted to rise up and carve out an area of autonomy for themselves and were initially successful; however, over time they were defeated and have since been politically oppressed.


The Current Conflict

Rohingya Muslims

The Rohingya are a Muslim minority group in western Myanmar. They are victims of an official government policy that some have called ethnic cleansing; their people are segregated into isolated camps and villages where basic services are not available. The situation is bad enough that the United Nations considers the Rohingya some of the world’s most persecuted people. In a recent national census the group was not even counted among the country’s citizens; in fact, a 1982 law forbids the Rohingya from becoming citizens. The group is discriminated against both because of its religion and because of its traditionally darker complexion.

The Rohingya originated in what was once the region of Bengal, now part of Bangladesh. While the Rakhine are the overall majority in the region, the Rohingya are actually the majority in the areas bordering Bangladesh. It is still unclear exactly how the group came to the region, with some saying they have lived there for hundreds of years and others claiming they are relative newcomers from just the past century. Either way, the Rohingya are viewed with great hostility in Myanmar, and this dispute also excludes them from indigenous status under the country’s constitution.

Rakhine Buddhists

The other group is the Rakhine Buddhist nationalists. Somewhat surprisingly, many of those involved in the conflict are also Buddhist monks. This movement has become known as the 969 movement.

The number 969 is a reference to Buddha and some of his teachings. The figurehead of this movement is Ashin Wirathu, who has risen to fame by delivering fierce speeches that include unsubstantiated claims about Muslims, calls to boycott Muslim businesses, and demands for laws that prevent interfaith marriage.

Nevertheless, while the Rakhine are the majority in the region, they are yet another minority within Myanmar as a whole. Unlike the Rohingya, who are generally viewed as newcomers, the Rakhine are a much older group there. In fact, they once had their own empire in what is now Bangladesh and Myanmar, before they were invaded by the Burmese. To the Rakhine and much of the rest of the population, the Rohingya are therefore an illegal immigrant group and are treated as such.

The Flashpoint

The current conflict was set off by the rape and murder of a Buddhist woman by a Muslim man in May 2012. This led to a wave of violence perpetrated primarily against the Rohingya by Buddhist nationalists. A second wave of attacks took place that October. These differed from the first in two ways: first, they were much more coordinated; second, they were directed at Muslims in general and not just the Rohingya.

Following the attacks, the government took two steps. First, it created an interfaith commission to report on exactly what had led to the violence. While portions of the report were valuable, other parts, such as a call for Rohingya family planning, cast doubt on its goals. Second, as the focus of the attacks grew from the Rohingya to Muslims in general, the government made an effort to protect Muslim populations by sending in police; however, far from being useful, police sometimes stood by or even joined Buddhist nationalist crowds in violence against Muslims. The government also sent the army into some areas, and it has proven more effective because it has fewer Rakhine Buddhists among its ranks.

This conflict has led to terrible living conditions for the Rohingya, with many forced to flee into Bangladesh, where they live in refugee camps. It has also led to a mass exodus as Muslim citizens of Myanmar seek safety in other countries, and to several reprisal bombings and attacks on Buddhist holy sites and Myanmar government offices by Muslims claiming to act on behalf of the Rohingya. Domestically, it has increased scrutiny of a government seemingly unable to stop the violence. It has also opened the door to leadership for the popular opposition figure Aung San Suu Kyi. The video below explains the conflict in depth.


The Future For Myanmar

Going forward, three actors are likely to have the biggest impact, not only on this conflict but on the country as a whole: the Myanmar government, the politician Aung San Suu Kyi, and the Buddhist monk Ashin Wirathu.

Myanmar Government

Aside from the Rakhine debacle, the government has several other things to worry about. According to the international watchdog organization Human Rights Watch, the government has been backsliding on many of the reforms promised in 2011, namely granting freedom to the media. On top of this is the increasingly evident control still held by the military, which threatens to turn the widely anticipated elections later this year into a farce that does nothing to change the status quo.

Unfortunately for the people of Myanmar, the status quo is not that pretty either. The economy is one of the least advanced in the world, and it is plagued by sanctions from the U.S. and E.U. These conditions are only exacerbated by the fighting and the perception of ethnic cleansing, which deter new investment and global support.

Aung San Suu Kyi

Aside from the government’s inability to maintain control, it is also feeling heat from Aung San Suu Kyi. Suu Kyi has been arrested a number of times for denouncing the military regimes that have ruled Myanmar over the last 30 years. Her efforts even garnered her a Nobel Peace Prize in 1991.

Despite her record of denouncing injustice, however, she has declined to speak out in defense of the Rohingya. Many speculate that this is a political calculation: not only might she aspire to the presidency, but some of her strongest backers are the same Buddhist monks attacking Muslims. Furthermore, it is unclear how much her voice could really alter things in Rakhine, especially considering the government’s recent refusal to allow her to run for the presidency. Still, it seems that someone who champions civil rights would stick up for a targeted minority, even if it was unpopular, for the sake of the country as a whole. Regardless of her relationship with the government, she is widely viewed as a strong and respected voice in Myanmar. The video below chronicles Suu Kyi and her viewpoints.

Ashin Wirathu

At the heart of the Buddhist nationalist rhetoric is an embattled monk. Practically unknown before, Wirathu began to make a name for himself in 2001, during an earlier uprising against Muslims, as part of the 969 group. His actions earned him a 25-year prison sentence, but he was released as a political prisoner in 2010.

While he does not enjoy universal support, he has a large following because of his strong nationalist message and his denunciation of the Rohingya Muslims, who are not liked by any segment of the population. Wirathu has also increased his audience by broadcasting on YouTube.

What gives him the most clout, though, is inaction: his fellow monks and the government have refused to discipline him. This has led some to believe he is preaching a message with which the government implicitly agrees. Among the few groups to oppose his teachings are certain women’s groups, which feel he gives the country a bad image and is attempting to infringe on their rights by restricting whom they can marry.

The following video details Wirathu and what he is preaching.

While these three actors are not the only ones at play in Myanmar, they are at the heart of the current conflict. They are also the three agents most able to effect change, for good or for ill, across the country itself.


Conclusion

The conflict in Myanmar threatens not just the Rohingya but all minority groups, especially in the wake of the nationalist sermons preached by Buddhist monks. Unfortunately, not much is likely to be done about the situation. Although elections loom, the most promising candidate, Aung San Suu Kyi, is barred from participating. Unchecked and unresolved violence is likely to simmer and burst out again; however, if the government can make real inroads toward reform and put on a legitimate election, then the opportunity to rewrite Myanmar’s story still exists.


Resources

Primary

World Factbook: Myanmar

Additional

NPR: In Buddhist-Majority Myanmar, Muslim Minority Gets Pushed to the Margins

Washington Post: Why it’s Such a Big Deal That Obama Said ‘Myanmar’ Rather Than Burma

Al Jazeera America: Myanmar’s Buddhist Terrorism Problem

BBC: Why is There Communal Violence in Myanmar?

Human Rights Watch: Burma: Rights Heading in Wrong Direction

BBC: Myanmar Profile

BBC News: Ashin Wirathu: Myanmar and Its Vitriolic Monk

International Crisis Group: The Dark Side of Transition: Violence Against Muslims in Myanmar

Quartz: Aung San Suu Kyi Has Gone Silent on a Major Human-Rights Crisis in Myanmar

Guardian: Burma Rules Out Lifting Ban on Aung San Suu Kyi Presidency Before Election

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Myanmar or Burma: Conflict in a Country With Two Names appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/myanamar-burma-conflict-country-two-names/feed/ 1 34688
India: A Superpower on the Rise? https://legacy.lawstreetmedia.com/issues/world/india-superpower-rise/ https://legacy.lawstreetmedia.com/issues/world/india-superpower-rise/#respond Sat, 14 Feb 2015 13:30:26 +0000 http://lawstreetmedia.wpengine.com/?p=34193

India may be a superpower on the rise, but the nation still faces many challenges.

The post India: A Superpower on the Rise? appeared first on Law Street.

]]>
Image courtesy of [Global Panorama via Flickr]

India has long been an important nation on the international stage; its massive population and rapidly growing economy have the potential to propel it forward even further. While there have been ebbs and flows–the recent recession strongly impacted the sub-continent–things may be looking up. There’s a new Prime Minister and India is on the rise yet again. Read on to learn about India’s growth, the relationships it has with other nations, and the challenges that the country will face in coming years.


A Look Into the Past

Like China and Mesopotamia, India is often considered one of the birthplaces of civilization. The first civilization in India was founded over five thousand years ago. Since then, India has seen the rise and fall of countless empires, invading forces, and ideas. Buddhism and Hinduism were founded in India, and Islam, when it reached the area in the eighth century, came to exert a powerful influence as well.

The story of modern India, however, picks up at the beginning of the eighteenth century, when the declining Mughal Empire was conquered piecemeal by the British East India Company. The British outcompeted their French rivals and bit by bit took over the sub-continent. Yet British rule was not to last either, with a large-scale mutiny in the middle of the nineteenth century hinting at the rise of Indian nationalism.

After years of protest featuring leaders such as Mahatma Gandhi, this nationalism came to fruition when India finally achieved independence in 1947. Independence, however, did not come about smoothly. The same year India became independent, it broke into two separate nations, Hindu India and Muslim Pakistan. As many as 12.5 million people migrated to one country or the other depending on their religion, and up to one million people died in the ensuing chaos.


Rise of Modern India

After the end of colonial rule, India initially adopted a planned economic approach. The idea was to increase consumer savings, which would then lead to greater investment in the economy and growth. The plan was to create a prosperous India that was financed by its own economy and not beholden to outside forces.

The plan had some success, but growth remained limited, averaging four percent annually in the 1950s, 60s, and 70s. The plan was also plagued by unbridled population growth and inequality. The proverbial corner was turned in the late 80s and early 90s, when the economy was finally opened up. Growth shot up to over 6.5 percent annually, while the service sector in particular began to take off.

Move to a Market Economy

The key to India’s economic turn-around came when it moved from a series of five-year plans, part of a planned economy adopted from its then-ally the Soviet Union, to a market economy similar to those of Western nations. Originally, India adopted a socialist model as the means to improve its economy, meaning most industry, licensing, and investment infrastructure was controlled by the government. The idea was to build strong home-grown industries in India and, in the process, prevent the inequality notorious in capitalist societies from spreading there.

The planned economy proved ineffective, mainly due to low growth rates and the failure to generate high savings rates. In fact the state, far from building up savings, began running higher and higher deficits as its programs faltered. Spurred by this ineffectiveness and by a rise in the price of oil during the first Gulf War, which nearly caused the country to default, India made a change. The government did a complete 180, reducing state control and planning, liberalizing trade and investment, and reducing the deficit.

Following the success of the 1990s, and with continued reforms, the Indian economy continued to hum along in the first decade of the 2000s, averaging greater than six percent growth annually. Rapid growth stalled, however, as it did in much of the rest of the world, following the Great Recession.

The reason India was hit so hard was a failure to further liberalize policy concerning labor, energy, land reform, and infrastructure improvement. In many ways the issues were the same ones that had plagued India during its planned economy, despite the reforms of the previous two decades. First, labor laws were still very restrictive, making it hard for people to move around in search of jobs. Second, India’s infrastructure was not developed enough for its manufacturers to easily export their products. Third, the country was still plagued by shortages of essential goods, such as energy. All of this was compounded by the government’s vain effort to prop up the country’s currency, the rupee, which led not only to a higher deficit but also to inflation, which eats away at people’s savings and makes them poorer. The result was growth rates closer to four or five percent during the recession.

After the Recession

Nevertheless, India’s economy has rebounded in the last two years, and in 2014 it outpaced China’s for the first time. This was due to several improvements. First, both the manufacturing and financial sectors improved dramatically. In addition, new Prime Minister Modi and other political leaders have worked diligently to reduce debt. Lastly, the drop in the price of oil has dramatically helped India, as much of its trade deficit was due to importing oil to fuel its growing needs.

While India has seemingly regained its status as a rapidly growing emerging market, this comes with caveats. First, the growth figures showing it outpacing China had to be recalculated due to errors, so many economists treat them with skepticism. Second, according to a New York Times study from 2011-2012, 30 percent of Indians still live in extreme poverty, which translates to approximately 363 million people, more than the entire population of the United States. Thus, although India may recoup its status as a major up-and-coming economy, there is still room for improvement. The following video gives an outlook on the impact reforms could have on India’s economy.


India’s Friends and Enemies

Pakistan

When discussing international concerns for India, the discussion always starts with Pakistan. The two nations were founded at the same time, when British rule in India ended; however, the division of the two countries was plagued by extreme violence and a persistently strong feeling of animosity. The situation has in no way been improved by the three wars and the ongoing proxy war waged over Kashmir. The conflict in Kashmir stems from the separation of India and Pakistan.

At the time of independence, there were 562 princely states that were independent of both countries and could choose which one to join. Both countries were therefore eager to recruit these principalities, and Kashmir was one of the most coveted. Pakistan seemed to have the upper hand, as 70 percent of Kashmir’s population was Muslim; however, its ruler at the time was Hindu, and India claimed the territory on that basis and occupies it to this day. Aside from direct conflicts there, Pakistan has also waged a guerrilla campaign to free the territory from India and incorporate it into the Muslim state of Pakistan.

On top of all that, both countries possess nuclear weapons and flaunt their capabilities, as in the tit-for-tat nuclear tests of 1998. The video below provides a summary of the two nations’ conflict.

Nonetheless, hopes for a thaw rose when Prime Minister Modi was elected last year, as one of his campaign promises was to improve relations between the two countries. Lately, however, Modi’s speeches have been full of aggressive rhetoric, as when he accused Pakistan of “waging a proxy war” in Kashmir, and the Pakistani military continues to support anti-India terror groups, so change has yet to come. Modi has also canceled several meetings with Pakistani officials, including a potential rendezvous at the United Nations.

China

India’s other major neighbor in Asia is China. As with Pakistan, India fought a brief war with China, in 1962, and has since maintained a relatively tense Himalayan border with the country. The relationship has steadily improved in other areas, though, as the two countries have signed a number of trade agreements. It was tested in 2013 by a Chinese incursion into Indian territory; however, no serious harm appears to have come of it.

The lack of consternation may be rooted in how the countries view each other. In India, China is seen both as a chief rival and as an economic model to emulate. China, which is stronger militarily and economically, does not regard India as much of a rival.

United States

Like its relationships with Pakistan and China, India’s relationship with the U.S. is complicated. The countries originally shared strong ties, with the U.S. aiding India during the conflict with China. Relations were strained by America’s decision to side with Pakistan in its 1971 war with India, and tensions were further exacerbated by an arms treaty between India and the USSR and by India’s first nuclear test in the 1970s.

Relations seemed to improve in the 1990s as India opened up its economy and moved toward a free market. Ties weakened again in 1998, when India once more tested nuclear weapons, drawing condemnation and sanctions from the U.S. The sanctions were quickly lifted, though, and the two nations grew close once more over a shared commitment to combating terrorism. The two sides have continued to grow closer since, signing everything from trade to weapons agreements. In 2013 the arrest of an Indian diplomat for visa fraud caused major waves, but the two sides seem to have overcome this hiccup as well, following the president’s recent trip to India, where he reaffirmed the U.S. commitment to the friendship.

The relationship with the U.S. seems likely to keep improving despite numerous setbacks, many of which concerned nuclear policy issues that now appear settled. While the U.S. may want to use India as a counterweight to a rising China, the two sides also value each other as trade partners. The relationship is further strengthened as the U.S. distances itself from Pakistan.


Domestic Concerns for India

While India navigates the dangerous game of international politics, it has internal issues to consider as well. First and foremost is the status of women. Although seemingly no country can boast total equality between men and women, the situation is especially bad in India. Some women may enjoy lucrative lifestyles, but the oppression of women in education, marriage, and the economy is virtually systemic. A grisly example was the fatal 2012 gang rape of a woman by six men in Delhi. While four of the men were eventually sentenced to death, the crime highlighted a culture in which women are often blamed for rape and the courts are slow to act.

Women, of course, are not the only group institutionally marginalized in India. Under the centuries-old caste system, people are born into a particular caste or class, often associated with a specific profession, and can expect to rise no further. While some efforts to downplay caste origins in employment have succeeded, caste still plays a large role in social interactions and romantic relationships.

The persistence of discrimination against both women and people of lower castes speaks to the broader issue of inequality in the country. According to a report from the United Nations Economic and Social Commission for Asia and the Pacific (UNESCAP), income inequality in India actually increased from the 1990s to the late 2000s.

India’s population is the world’s second largest, at more than 1.2 billion people. With birth rates still outpacing death rates, that number will keep climbing until it is expected to plateau around 2050; along the way, India’s population is projected to surpass China’s as the world’s largest in 2025. All these additional people will need food, housing, and jobs that the country is already hard pressed to provide at current levels. The accompanying video highlights the issue of poverty in India.

Domestically, though incremental changes have been made, the sweeping reforms necessary to fix many of India’s societal ills seem unlikely. As the infamous Delhi rape trial showed, courts can be forced into action when thrust into the spotlight, but they have been very slow to protect women. The same holds for the institutionalized marginalization of a large portion of society, which has lasted so long that it is unlikely to simply disappear now. Couple these issues with a continuing population explosion, and the poverty that haunts India is likely to persist, particularly with inequality rising and wealth consolidating in the hands of elites, much as in Western nations.


Conclusion

After struggling in the years following independence, India has enjoyed strong recent growth. While that growth was threatened by the Great Recession, India pulled through and even outpaced China, if the numbers are to be believed. Going forward, Asia’s other potential superpower has many issues to deal with. Internationally, serious problems remain in the relationship between India and Pakistan. India’s relationship with China, Asia’s established rising superpower, is also in question as India moves closer to the United States, a fellow democracy, while China seemingly drifts closer to autocratic Russia.

Domestically it is more of the same, with concern over the economy dominating, though other issues persist as well, namely an entrenched caste system and the low status of women. While India has come very far, there is still a long way to go. It remains possible for India to realize its superpower potential and one day rival China as Asia’s premier power, but reforms and improvements will likely be required along the way.


Resources

Primary 

Indian Embassy: U.S.-India Relations

Additional

Forbes: India Growth Now Beats China

Diplomat: India and Pakistan: A Debilitating Relationship

National Interest: China and India: The End of Cold Peace?

Council on Foreign Relations: Timeline U.S.-India Relations

Centre for Economic Policy Research: India’s Growth in the 2000s: Four Facts

Economist: How India Got Its Funk

BBC News: India Growth Figures Baffle Economists

The New York Times: Setting a High Bar for Poverty in India

Asia Society: India-Pakistan Relations: A 50-Year History

Saarthak: Women’s Situation in India

World Post: India Gang Rape Case: Four Men Sentenced to Death

Economist: Why Caste Still Matters in India

Financial Express: Income Inequality: Poor-Rich Gap Growing in India, Asia-Pacific

International Business Times: Partition of India and Pakistan: The Rape of Women on an Epic, Historic Scale

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post India: A Superpower on the Rise? appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/india-superpower-rise/feed/ 0 34193
Saudi Arabia: Succession in the Chaos https://legacy.lawstreetmedia.com/issues/world/saudi-arabia-succession-in-the-chaos/ https://legacy.lawstreetmedia.com/issues/world/saudi-arabia-succession-in-the-chaos/#comments Sun, 08 Feb 2015 13:30:14 +0000 http://lawstreetmedia.wpengine.com/?p=33782

There's a new monarch in Saudi Arabia, but what new challenges will he face?

The post Saudi Arabia: Succession in the Chaos appeared first on Law Street.

]]>

A few weeks ago, Saudi Arabia’s monarch, King Abdullah, died at the age of 90, at which point he was the world’s oldest reigning monarch. While his country’s place on the world stage changed dramatically over the course of his lifetime, his death leaves many questions unanswered. Read on to learn about the Saudi monarchy and the problems plaguing the new ruler.


The Al-Saud Family

The territory of modern-day Saudi Arabia has been settled in some form for approximately 20,000 years. Following the advent of agriculture, the region became a key trading corridor between the ascending civilizations of the Nile Valley and Mesopotamia.

The area’s first era of prestige, however, came hand in hand with the founding of Islam. Two cities in present-day Saudi Arabia, Mecca and Medina, were the birthplaces of the faith. They remained vital, attracting thousands of pilgrims as the Muslim world expanded from North Africa to China.

The first developments toward modern Saudi Arabia came in the eighteenth century, when Shaikh Muhammad bin Abdul Wahhab and Muhammad bin Saud formed an agreement pledging a return to the original teachings of Islam, an alliance that culminated in the first Saudi state. The state prospered and quickly covered much of what is now Saudi Arabia. That prosperity, however, drew the attention of the Ottoman Empire, which crushed the aspiring nation in the early nineteenth century. A second Saudi state was established soon after but met a similar fate. This time the family patriarch, Abdul Rahman bin Faisal Al-Saud, was forced into exile in the Empty Quarter, a desert region in the east, before finally fleeing to modern-day Kuwait.

Abdul Rahman’s son, Abdulaziz, began to reverse the family’s fortunes when, in 1902, he led a daring raid on the current capital of Riyadh and with a small force took over the city. Abdulaziz gradually reestablished control over the whole territory; two of his most symbolic conquests were of Mecca and Medina, in 1924 and 1925, respectively. Finally, in 1932, the modern nation of Saudi Arabia was established by its first monarch, the same Abdulaziz Al-Saud.


The Road to Succession

King Abdulaziz wanted one of his sons to succeed him on the throne, and he had approximately 45 sons to choose from. It is no surprise, then, that every ruler of Saudi Arabia since Abdulaziz’s death has been one of his many sons. The trend has continued: the recently deceased King Abdullah was succeeded by one of his brothers, Crown Prince Salman, and next in line after Salman is another brother, Crown Prince Muqrin.

While all of Abdulaziz’s successors so far have been his sons, that is likely to end soon. Crown Prince Muqrin is the youngest of Abdulaziz’s sons, but youngest is a relative term, as he is in his sixties. If he ever ascends to the throne, Muqrin is likely to be the last son to do so; the ruler after him, assuming Muqrin outlives his brothers and half-brothers, will be one of Abdulaziz’s many grandsons.

While this proverbial changing of the guard has the potential to cause trouble, the line of succession has never been an issue since Abdulaziz’s death; power has continued to pass down the line of brothers. In fact, the only change to the succession formula has been the creation of the deputy crown prince position, formerly occupied by Prince Muqrin, which was put in place precisely because all of Saudi Arabia’s leaders are so old.

The smoothness of the succession can be attributed partly to this familiar formula and partly to the Allegiance Council, created by King Abdullah in 2006. The council, made up of Abdullah’s brothers and nephews, is responsible for choosing the next monarch. While the sons of Abdulaziz still reign, the council has a small pool to choose from; once the next generation rises to prominence, however, its decisions could become much more politically difficult. For now, the council has followed the traditional track, declaring Salman, the oldest living son of Abdulaziz, the new king and Prince Muqrin his successor. The video below summarizes the succession process.


Challenges for the New King

Oil Prices

While the succession to Saudi Arabia’s throne seems clear, the challenges facing King Salman are anything but. The first and most obvious problem plaguing Saudi Arabia is how to handle plummeting oil prices. In November, contrary to conventional wisdom, OPEC, which is dominated by Saudi Arabia, decided not to cut production even as prices were already dropping dramatically.

The Saudis’ willingness to flood the market with cheap oil is geared more toward the long run. By driving prices so low, the Saudis can put many of their competitors, such as upstart fracking operations, out of business, because those producers’ cost of accessing the oil is higher than the price it sells for.

Saudi Arabia may not only be forcing the price of oil down to eliminate competition; political factors are also at work. There is speculation that Saudi Arabia has been working behind the scenes with Russia, a country that cannot afford low oil prices, offering to cut production, which would push prices back up. In return, the Saudis would most likely want Russia to rescind its support for the Assad regime in Syria.

Regardless, as the landscape of the global oil market shifts, the Saudis’ role in it will continue to evolve. How King Salman handles the oil market is certainly something to watch.

ISIS

ISIS, or the Islamic State in Iraq and Syria, is a terrorist organization that has carved out a large swath of territory for itself in those two countries, and its ultimate goal is to establish a new caliphate. ISIS’ ambitions pose several problems for Saudi Arabia.

First, the areas under its control lie near the eastern regions of Saudi Arabia, where a large number of Shi’ites reside in the predominantly Sunni nation and where Saudi oil production is centered. The Saudis are wary of ISIS rhetoric stirring discontent in the Saudi Shi’ite community, especially if it affects oil production.

Second, ISIS’ would-be caliphate would have to encompass the two holiest places in Islam, Mecca and Medina. Both lie inside Saudi Arabia, meaning ISIS would have to invade the nation at some point if it hopes to rule either site.

Not surprisingly, then, Saudi Arabia has joined the U.S.-led coalition that has pounded ISIS with constant airstrikes; unlike most other Muslim countries, however, Saudi Arabia has gone even further, attacking ISIS in Syria and allowing the U.S. to train Syrian insurgents within Saudi borders.

Saudi Arabia’s Neighbors

Saudi Arabia’s efforts in Syria are calculated not only to undermine ISIS but also to inflict damage on a proxy state of its chief rival. Saudi Arabia has poured large amounts of resources into the fight in Syria in the hopes of deposing Assad, who is viewed as a client of Iran. The proxy war between the two powers, however, extends far beyond Syria.

The recent coup in Yemen, on Saudi Arabia’s southwestern border, was led by a group known as the Houthis, which is also purportedly under Iranian influence. The interference of both Saudi Arabia and Iran in the affairs of their neighbors has produced a proxy war of sorts between the two powers.

While neither side can yet claim victory, Saudi Arabia finds itself geographically encircled. Normally this would not be too serious, as Saudi Arabia has traditionally enjoyed the support of the U.S., the strongest military power in the world.

Recently, though, the strength of this relationship has come into question as the U.S. has opened talks with Iran over its nuclear program. The Saudis may fear that the talks will lead to a closer relationship between the two, but if Iran were instead to go nuclear, that too would have major consequences for the region and the proxy conflict. It is widely assumed that if Iran does go nuclear, Saudi Arabia will quickly follow suit, acquiring weapons from Pakistan, whose program it originally helped fund. King Salman must prepare for that possibility.

Internal Struggles

Lastly, the new monarch of Saudi Arabia must consider what is going on inside the kingdom itself. Although the government did very well in preventing the mass protests that plagued other nations during the Arab Spring, it cannot simply throw money at all of its problems. The list of potential problems is extensive, including human rights violations, xenophobia, and discrimination against women and non-Muslims. While these problems have yet to flare up, the potential certainly exists for them to do so.

Domestically, the situation in Saudi Arabia is unlikely to change dramatically. While the late King Abdullah made some minor reforms, the established order, one in which women are second-class citizens and wealth is concentrated among the few, remains virtually unaltered. For this to change anytime soon, strong external pressure would probably be required to alter the country’s way of thinking.


Conclusion

Following the death of King Abdullah, many experts speculated that Saudi Arabia could face a succession crisis; as of right now, however, the succession seems to be about the only thing that will not present problems.

While the succession is clearly laid out, Saudi Arabia faces a number of other concerns: falling oil prices, ISIS, its proxy war with Iran, and potential unrest among its own people, all further exacerbated by the U.S.’s waning commitment. Choosing a new king was relatively easy; maintaining the kingdom of Saudi Arabia may prove much more difficult.


Resources

Primary

Embassy of Saudi Arabia: History of Saudi Arabia

Additional

BBC: Saudi Arabia: Why Succession Could Become a Princely Tussle

Al Jazeera: The Question of Succession in Saudi Arabia

Daily Star: For Saudi Arabia Problems Abound All Around

Economist: Why the Oil Price is Falling

Business Insider: The Saudis Floated the Idea of Higher Oil Prices to Get Russia to Stop Supporting Assad in Syria

Huffington Post: Saudi Succession Raises Questions For ISIS Fight

Washington Institute: Nuclear Kingdom: Saudi Arabia’s Atomic Ambitions

Middle East Monitor: Saudis Most Likely to Join ISIS, 10% of Group’s Fighters Are Women

Al Jazeera: Saudi Arabia, Iran and the ‘Great Game’ in Yemen

Guardian: Iranian President Says Nuclear Deal With the West is Getting Closer

Michael Sliwinski
Michael Sliwinski (@MoneyMike4289) is a 2011 graduate of Ohio University in Athens with a Bachelor’s in History, as well as a 2014 graduate of the University of Georgia with a Master’s in International Policy. In his free time he enjoys writing, reading, and outdoor activities, particularly basketball. Contact Michael at staff@LawStreetMedia.com.

The post Saudi Arabia: Succession in the Chaos appeared first on Law Street.

]]>
https://legacy.lawstreetmedia.com/issues/world/saudi-arabia-succession-in-the-chaos/feed/ 1 33782