Relax. Brexit Won’t Cause International Order to Collapse

A. Trevor Thrall

In 1989, the political scientist Francis Fukuyama famously declared the end of history. As the end of the Cold War approached, the liberal international order appeared to have won the debate about how societies should be governed. In Fukuyama’s words: “What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history, but the end of history as such: that is, the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.”

Today, nearly three decades later, commentators are tripping over each other to proclaim the collapse of the liberal world order in the wake of the British vote to withdraw from the European Union. To many it appears we have now reached the end of the end of history, implying we are headed toward something far worse.

Recent developments, however, though certainly sobering, do not represent either the collapse of the liberal world order or a harbinger of worse things to come. The doom and gloom crowd makes three mistakes in its assessment.

To resist the homogenizing influence of globalization and supranational organizations is not itself an anti-liberal act.

We Have Only Ourselves to Blame

The first mistake is the insistence on a false nostalgia. The liberal world order was never as monolithic, well entrenched, or evenly distributed throughout the world as many suggest. As Stephen Walt has recently noted, many advocates of the end of history underappreciated the power of nationalism and sectarianism, unleashed in the wake of the Cold War, to provide alternative visions of society and governance.

When Fukuyama wrote “The End of History,” only 41 percent of the world’s nations were electoral democracies. Today the figure is around 60 percent. Globalization of the economy has likewise grown consistently since the end of history began. The World Trade Organization reports that between 1980 and 2011 exports from developing economies increased to 47 percent of the world market, while world trade grew twice as fast as world production. Resistance to globalization and democracy, even violent resistance, certainly exists. Nonetheless, from these two critical vantage points it takes considerable imagination to believe that the world order is collapsing.

The second mistake the doomsayers make is to ignore the self-inflicted nature of many of the wounds now making the news. On the domestic front, elites in Western nations appear to have gotten too far out in front of their publics. In the United States, much like in Britain, a majority of the public is questioning the current strategy of engagement (especially military) with the rest of the world. For those struggling to make ends meet, in particular, the benefits of globalization and nation-building abroad are cold comfort.

Without greater efforts to ensure economic policies benefit all of their citizens, Western politicians should not be surprised when they lose referendums and elections. But to be clear, the current upheaval has nothing to do with the collapse of the liberal world order and a great deal to do with entitled political elites forgetting that not everyone has been benefitting from it.

In foreign policy, the events of September 11 and Islamist-inspired terrorism, the Arab uprisings, and surging Russian and Chinese assertiveness may all reflect powerful anti-liberal forces in some respects, but they also reflect the unintended consequences of American and European foreign policy. After the Cold War, extended Western military intervention in the Gulf and support for oppressive governments helped both to spur jihadist groups like al-Qaeda and later the Islamic State and to give rise to the Arab Spring, leading eventually to the current refugee crisis evoked so powerfully during the Brexit debate. Meanwhile NATO expansion and Obama’s “pivot” to Asia have provoked predictable Russian and Chinese reactions. None of these self-inflicted wounds were inevitable, nor does their explanation require a theory of collapsing world order.

Finally, there seems to be a conflation of the potential collapse of the European Union with the collapse of the liberal world order. On one level the EU represents the ultimate liberal dream: a supranational organization able to spread peace, democracy, prosperity, and human rights. But for many the EU represents the opposite of liberal ideals. It replaces local rule with distant bureaucrats. It threatens the ability of individuals and communities to determine their own unique economic, social, and cultural answers to modern problems.

To resist the homogenizing influence of globalization and supranational organizations is not itself an anti-liberal act. At the end of the day, the EU is just one of many possible mechanisms for nations to work together. The liberal world order survived the coming and going of the Soviet Union and the Cold War. It will survive the coming and going of the European Union.

A. Trevor Thrall is a senior fellow at the Cato Institute.


How This Election Turned Me into a Libertarian

Ilya Shapiro

This election has turned me into a libertarian. Yes, given that I work at the Cato Institute, that statement seems either confusing or trite, but hear me out.

It’s not that my political views have changed; I wasn’t a secret socialist or paleo-conservative fifth-columnist in the heart of the libertarian mother ship. While I don’t agree with all my colleagues on everything, no two libertarians are in complete accord anyway (and are more likely to be found arguing about whose libertarianism is purer). (For the record, I fight the hypothetical and consider myself a classical liberal, so anarcho-capitalists and liberaltarians may commence criticism.)

Nor is it that I’m now a capital-L Libertarian, offering a full-throated endorsement of Gary Johnson. I mean, of the declared candidates, of course I’d go for one who’s fit for office. But a lot could happen between today and November 8. Clinton or Trump, or both, may not end up on their respective parties’ ballot lines, or an independent could enter whom I like more. Anyway, none of this means I’m throwing my lot in with the Libertarian Party itself.

No, turning libertarian has little to do with either ideology or partisanship. Instead, it’s an attitudinal shift.

Even though I’ve previously been pragmatic about allying with politicians I disagree with, both major parties’ candidates are so bad I’m voting Libertarian with a clean conscience.

I Care about What Politics Does

I’d always been a pretty political guy: I’ve enjoyed following the strategery, debating tactics, arguing historical counter-factuals, and memorizing statistics. It’s like sports, except at the end you’re left with more than just entertainment—which is scary when you realize that the winners of this “game” get, instead of trophies, power to control other people.

A lot of libertarians aren’t like that. Not that my fellow travelers in the liberty movement are unique in that way; most Americans aren’t political animals. For good reason: as George Mason University law professor (and Cato adjunct scholar) Ilya Somin has detailed in his excellent and often counterintuitive book “Democracy and Political Ignorance,” it makes no rational sense to learn political intricacies when your vote is insignificant. Indeed, one measure of a country’s health and stability is how little its citizens feel a need to engage with politics. People are busy with jobs, kids, hobbies, and other much more important concerns.

Of course, self-identified libertarians are very much into small-p politics—honing ideological consistency, identifying the best policies, criticizing government—but many simply think getting “into the muck” of capital-P Politics is a waste of time, especially when both major parties have strong statist aspects. This is probably most true for the staunchest non-interventionists.

There’s nothing necessarily wrong with that perspective, but I’ve never been that way. I care a lot about political outcomes and have figured that the best way I can advance them, especially given my skill set and particular interest in legal policy and judicial nominations, is to work within the system rather than ignore it.

Voting With a Clean Conscience

Professionally I build unconventional coalitions, engaging whichever politicians and interest groups can help on any given issue. For example, I’ve joined dozens of organizations on Supreme Court amicus briefs and regularly meet with a range of politicians. But politics is different from the policy world in that you’re more often choosing the lesser of two evils, working against a candidate’s opponent more than for the candidate himself. That often involves supporting candidates who don’t score very high on libertarian purity tests, like George W. Bush, John McCain, and Mitt Romney, but whose party professes to care about and be influenced by classical-liberal ideas and whose executive and judicial appointments I would prefer.

Granted, I only became a citizen two years ago, so this will be the first presidential election where I can actually vote. (My first non-presidential vote, in 2014, was to legalize marijuana in D.C.—not that Johnson needs to make it his leading issue—after which I promptly moved to Virginia.) But I consider voting to be my least important political activity, which is a good thing given how unpalatable the suitors are for my first time.

No, this year, when both the Republicans and Democrats are poised to nominate the most godawful presidential candidates imaginable, count me out of conventional politics. I’ll instead be with the too-cool-for-school black-leather-jacket crowd that decrees “a pox on both your houses” before retiring to its absinthe snifters and e-cigars.

So far, I’ve found being an attitudinal libertarian to be cathartic. It’s a better way of dealing with this political season’s frustrations than arguing with your conscience about whether “Crooked Hillary” or “Fraudulent Donald” would be least unacceptable.

Ilya Shapiro is a senior contributor to the Federalist. He is a fellow in Constitutional Studies at the Cato Institute and Editor-in-Chief of the Cato Supreme Court Review.


The Democrats’ New National Security Strategy

A. Trevor Thrall

This week the New Democrat Coalition, a 52-member group of centrist Democrats in the House of Representatives, unveiled its principles for national security strategy. Like the similar effort from House Speaker Paul Ryan three weeks ago, this working document is a political maneuver and carries no legal or policymaking weight. Thirteen of the coalition’s members sit on the House Armed Services Committee, however, so the document tells us something about the fault lines in future congressional national security debates and about Hillary Clinton’s allies should she win the White House, as seems increasingly likely.

A close reading of the two-page document, unfortunately, suggests that the Democrats have learned only partial lessons from the long slog against terrorism in the Middle East. The strategy memo strongly suggests that Democrats are ready to double down on the Obama administration’s approach to national security and foreign policy. Given the general agreement among observers that Hillary Clinton is more hawkish than President Obama, we have good reason to doubt that House Democrats will present any resistance to an extended run of the failed foreign policies in place today.

American intervention and meddling have fueled chaos and instability in the Middle East and elsewhere, meanwhile feeding the threat of terrorism against Americans.

On paper, the New Democrats appear to have figured out that “not every problem can be solved with a bomb or a tank,” calling for a strategy to “eliminate terrorist threats without reckless interventions.” In light of President Obama’s own admission that the Libyan intervention was the worst mistake of his presidency, this seems like the least one should have learned from the past 15 years of American intervention in the Middle East. By almost every measure — financial, lives lost, terrorist attacks — the United States is worse off for having intervened so heavily in Afghanistan and Iraq, not to mention the persistent drone attacks in Yemen, Pakistan, Somalia, Libya and wherever. Any sign that Democrats are rethinking the “shoot first, plan later” approach to foreign policy is very welcome.

The rest of the memo, however, combines vague platitudes with a series of bullet points outlining an unrepentant vision of American foreign policy overextension. The strategy’s heavy emphasis on counterterrorism makes it clear that the New Democrats are unwilling to challenge the overheated rhetoric about the terrorist threat. In addition to calling on the United States to “defeat organizations like the Islamic State and Al Qaeda,” the New Democrats also call for aggressive development assistance to “prevent chaos and instability upon which terrorist organizations prey.” Even if the New Democrats have decided that the military interventions themselves were mistakes, after 15 years of trying (and failing) to help Afghanistan get on its feet and almost as long in Iraq, Congress should have learned that democracy, civil society and a robust economy are not things the United States can deliver. Efforts to do so, with or without military intervention, are doomed to failure.

Just as troubling are the calls for more aggressive efforts to confront terrorism on the home front. Acts of domestic terrorism, like those in San Bernardino and Orlando, are tragic, but the worst thing the United States can do in their wake is rush to implement “solutions” that do more harm than good. The New Democrats want to “deny domestic terrorists the weapons and resources they use to perpetrate atrocities” and to increase funding for federal, state and local homeland security and law enforcement counterterrorism programs. The history of homeland security since 9/11 tells us, however, that expanding such efforts will lead to more wasted money and greater restrictions on civil liberties without any increase in actual security.

Further, the New Democrats call for increased engagement to confront a wide range of conflicts in Europe and the Middle East, none of which directly threaten American national security. The strategy calls for a recommitment to NATO “in the face of Russian aggression,” to “strengthen our relationships with Israel, Egypt, and our Middle East partners” in order to “take a more active role in leading diplomatic efforts to form new alliances in the region,” and to “expand engagement in the Asia-Pacific” to “preserve U.S. influence…” And in case they missed anyone, the final bullet point calls for the United States to “stand by commitments to U.S. alliances, increased engagement with countries and people whenever possible, and continue to share the burden of security arrangements.” Such a stance is a foolproof recipe for entanglement and intervention.

If all this sounds familiar it should. The New Democrats are simply echoing the bipartisan foreign policy consensus of the past two administrations. The United States, in this view, must engage deeply both militarily and diplomatically in order to keep the world from falling into chaos. In truth, however, the real lesson of the past 15 years is just the opposite: American intervention and meddling have fueled chaos and instability in the Middle East and elsewhere, meanwhile feeding the threat of terrorism against Americans. The United States would be better off pulling back from many of its current commitments and adopting a greater degree of humility about what American power can truly achieve.

A. Trevor Thrall is a senior fellow at the Cato Institute and an associate professor at George Mason University in the School of Policy, Government, and International Affairs.


Confirmation Chaos and Constitutional Corruption

Ilya Shapiro

Within hours of hearing the news of Justice Antonin Scalia’s passing, Senate Majority Leader Mitch McConnell announced that his caucus would not be holding any hearings or votes on a replacement nominee until after the election. “Let the people decide” became the rallying cry of the Republican majority, and all of the party’s members on the Senate Judiciary Committee signed a letter pledging fidelity to the #NoHearingsNoVotes plan.

When President Obama announced the nomination of Judge Merrick Garland a month later, nothing really changed: this wasn’t about the nominee’s qualifications, but an argument from the political principle that the gaping hole left by a jurisprudential giant shouldn’t be filled until the voters in a polarized nation — who reelected Obama in 2012 but then handed the Senate to the GOP in 2014 — could have their say.

This seemed like unprecedented obstructionism, though historically plenty of judicial nominees have never gotten hearings or votes, and the last time that a Senate confirmed a nomination made by a president of the opposing party to a high-court vacancy arising during a presidential election year was in 1888. Indeed, under recent Republican presidents, Democratic senators ranging from Joe Biden to Chuck Schumer to Harry Reid announced that they wouldn’t consider any new nominees until after the election.

That’s literally their prerogative: Just as the Senate can decline to take up a bill passed by the House, or a treaty signed by the president, it can surely decide how to exercise its constitutional power of “advice and consent” on judicial nominations. This is purely a political matter, with the Senate staking out how it wants to exercise its power and the voters being the ultimate judges, as it were, of that tactic. Indeed, if the Senate decided not to confirm any nominee to any position, it could do so — and likely pay a high political price unless the president were so compromised as to lack any popular legitimacy whatsoever.

Why the Push to Fill the Vacancy?

Why has it come to this? Why all the focus on one office, however high it might be? Sure, it’s an election year, but that doesn’t mean that governance grinds to a halt. If Secretary of State John Kerry died or resigned, it would certainly be a big deal — with Republicans grilling his would-be successor on President Obama’s foreign-policy record — but there’s no doubt that the slot would be filled if someone with generally appropriate credentials were nominated. Even a vacancy in the vice-presidency wouldn’t last unduly long, though Republicans would jockey to extract concessions for not having Speaker Paul Ryan be President Obama’s designated successor (even if for mere months).

But of course executive appointments expire at the end of the presidential term, while judicial appointments long outlast any president. To take an extreme example, an important ruling on donor-list disclosures was made this past April by a district judge appointed by Lyndon Johnson. Justice Scalia himself served nearly 30 years, giving President Reagan’s legal-policy agenda a bridge well into the 21st century. And let’s not forget that the Scalia-less Supreme Court stands starkly split 4-4 on so many controversial issues: campaign-finance law, the Second Amendment, religious liberty, executive and regulatory power, to name just a few. In this already bizarre 2016 election, legal pundits have finally gotten their wish that judicial nominations are firmly among the top campaign issues.

If we want to have the rule of law, we need judges to interpret the Constitution faithfully and strike down laws when government is exceeding its authority.

Moreover, this year marks the 25th anniversary of the bitter confirmation hearings of Justice Clarence Thomas. HBO aired a reenactment called “Confirmation,” which was itself controversial, reopening old political wounds with its portrayal of what Thomas referred to as a “high-tech lynching.” Justice Thomas received the narrowest Supreme Court confirmation in more than a century, 52-48 — and this less than four years after the failed nomination that ushered in the poisonous modern era of confirmation battles, that of Judge Robert Bork in 1987.

Senate Democrats had warned that nominating Bork would provoke a fight unlike any President Reagan had faced over judges — after Scalia’s unanimous confirmation the previous year. And so, the very day that Reagan nevertheless announced this pick, Ted Kennedy went to the floor of the Senate to denounce “Robert Bork’s America,” which is a place “in which women would be forced into back-alley abortions, blacks would sit at segregated lunch counters, rogue police could break down citizens’ doors in midnight raids, schoolchildren could not be taught about evolution, writers and artists could be censored at the whim of the Government, and the doors of the Federal courts would be shut on the fingers of millions of citizens.” It went downhill from there, as the irascible Bork — with an irascible beard — refused to adopt the now well-worn strategy of talking a lot without saying anything. A few years later, Ruth Bader Ginsburg would refine that tactic into a “pincer movement,” refusing to comment on specific fact patterns because they might come before the Court, and then refusing to discuss general principles because “a judge could deal in specifics only.”

History of Confirming Justices

Confirmation processes weren’t always like this. The Senate didn’t even hold public hearings on Supreme Court nominations until 1916 — and that innovation was driven by the unusual circumstances of (1) the resignation of a justice (Charles Evans Hughes) to run against a sitting president (Woodrow Wilson) and (2) the first Jewish nominee (Louis Brandeis). It wouldn’t be until 1939, with the (also-Jewish) Felix Frankfurter, that a judicial nominee actually testified at his own hearing. In 1962, the part of Byron White’s hearing where the nominee himself testified lasted less than 15 minutes and consisted of a handful of questions, mostly about the Heisman-runner-up’s football-playing days.

What’s changed? Is it TV and social media, the 24-hour news cycle and the viral video? Is it that legal issues have become more ideologically divisive? No, it isn’t that there’s been a perversion of the confirmation process, increasingly demagogic political rhetoric, or even the use of filibusters. Those are symptoms of the underlying problem, a relatively new development but one that’s part and parcel of a much larger problem: constitutional corruption.

As government has grown, so have the laws and regulations over which the Court has power. All of a sudden, judges are declaring what Congress can do with its great powers, what kind of law the executive branch can write into the Federal Register, and what kinds of new rights will be recognized. As we’ve gone down the wrong jurisprudential track since the New Deal, the judiciary now has the opportunity to change the direction of public policy more than it ever did. So of course judicial nominations and confirmations are going to be more fraught with partisan considerations.

This wasn’t always a problem — in the sense that partisanship didn’t really mean that much other than rewarding your cronies. It’s a modern phenomenon for our two political parties to be so ideologically polarized, and therefore for judges nominated by presidents from different parties to have notably different views on constitutional interpretation.

Under the Founders’ Constitution, under which the country lived for its first 150 years, the Supreme Court hardly ever had to strike down a law. If you read the Congressional Record of the 18th and 19th centuries, Congress debated whether legislation was constitutional much more than whether it was a good idea. Debates focused on whether something was genuinely for the general welfare or whether it only served, for example, the state of Georgia. “Do we have the power to do this?” was the central issue with any aspect of public policy.

In 1887, Grover Cleveland vetoed an appropriation of $10,000 for seeds to drought-stricken Texas farmers because he could find no constitutional warrant for such action. In 1907, in the case of Kansas v. Colorado, the Supreme Court said that “the proposition that there are legislative powers affecting the nation as a whole although not expressed in the specific grant of powers is in direct conflict with the doctrine that this is a government of enumerated powers.”

The Changing Role of Judges

We also had a stable system of unenumerated rights that went beyond those listed in the Bill of Rights to those retained by the people per the Ninth Amendment. The Tenth Amendment was similarly a restatement of the whole structure: the idea is that we have a government of delegated and enumerated — and therefore limited — powers.

Judges play much larger roles today. The idea that the General Welfare Clause lets the government regulate essentially any issue as long as the legislation fits someone’s conception of what’s good — meaning, that you get a majority in Congress — emerged in the Progressive Era and was codified during the New Deal. After 1937’s so-called “switch in time that saved nine” — when the Supreme Court began approving grandiose legislation of the sort it had previously rejected — no federal legislation would be struck down as exceeding Congress’s enumerated powers until 1995. The New Deal Court is the one that politicized the Constitution, and therefore too the confirmation process, by laying the foundation for judicial mischief of every stripe — be it letting laws sail through that should be struck down or striking down laws that should be upheld.

This is not about the tired old debate about “activism” versus “restraint.” So long as we accept that judicial review is constitutional and appropriate in the first place — how a judiciary is supposed to ensure that the government stays within its limited powers without it is beyond me — then we should only be concerned that a court “get it right,” regardless of whether that correct interpretation leads to the challenged law being upheld or overturned. For that matter, an honest court watcher shouldn’t care whether one party wins or another. To paraphrase John Roberts at his confirmation hearings, the “little guy” should win when he’s in the right, and the big corporation should win when it’s in the right. The dividing line, then, is not between judicial activism and judicial restraint (passivism?), but between legitimate and vigorous judicial engagement and illegitimate judicial imperialism.

In that light, the recent confirmation battles — whether you look at Bork, Thomas, the filibustering of George W. Bush’s lower-court nominees, or the scrutiny of Sonia Sotomayor’s “wise Latina” comment — are all a logical response to political incentives. When judges act as super-legislators, senators, the media, and the public want to scrutinize their ideology and treat them as if they’re confirming lifetime super-politicians — and rightfully so.

Judges as Super-legislators

Sure, we can tinker around the edges of the appointment process with bipartisan commissions, set terms, or fixed retirement ages — or we could impose scheduling requirements for when hearings and votes must occur after a nomination — but all that is rearranging the deck chairs on the Titanic. And the Titanic is not the judicial-nominations process, but rather the ship of government. The fundamental problem is the politicization not of the process but of the product, of the role of government, which began with the Progressive Era politically and was institutionalized during the New Deal.

Justice Scalia described this phenomenon in his dissent from the 1992 abortion ruling in Planned Parenthood v. Casey:

[T]he American people love democracy and the American people are not fools. As long as this Court thought (and the people thought) that we Justices were doing essentially lawyers’ work up here — reading texts and discerning our society’s traditional understanding of that text — the public pretty much left us alone. Text and traditions are facts to study, not convictions to demonstrate about. But if in reality our process of constitutional adjudication consists primarily of making value judgments; if we can ignore a long and clear tradition clarifying an ambiguous text … then a free and intelligent people’s attitude towards us can be expected to be (ought to be) quite different. The people know that their value judgments are quite as good as those taught in any law school — maybe better.

Enforcing the Founding Document

Ultimately judicial power is not a means to an end, be that liberal, conservative or anything else, but instead an enforcement mechanism for the strictures of the founding document. We have a republic, with a constitutional structure intended just as much to curtail the excesses of democracy as it was to empower its exercise. In a country ruled by law and not men, the proper response to an unpopular legal decision is not to call out the justices at a State of the Union address but to change the law or amend the Constitution.

Any other method leads to a sort of judicial abdication and the loss of those very rights and liberties that can only be vindicated through the judicial process — which by definition is counter-majoritarian. Or it could lead to government by black-robed philosopher kings. Even if that’s what you want, why would you hire nine lawyers for the job?!

So if we want to have the rule of law, we need judges to interpret the Constitution faithfully and strike down laws when government is exceeding its authority. Depoliticizing the judiciary is a laudable goal, but that’ll happen only when judges go back to judging rather than merely ratifying the excesses of the other branches while allowing infinite intrusions into economic liberties and property rights. Until that time, it’s absolutely appropriate to question judicial philosophies and theories of constitutional interpretation — and to vote accordingly.

Regardless of what happens to the Garland nomination or who’s president come January 2017, the battle for control of the third branch of government will continue — as will the attention paid to the resulting confirmation battles.

Ilya Shapiro is a senior fellow in constitutional studies at the Cato Institute and editor-in-chief of the Cato Supreme Court Review.


ACLU Fights Dems’ Dishonest War on Due Process

Nat Hentoff and Nick Hentoff

What a difference two months makes. In April, we offered harsh criticism of the national ACLU’s opposition to important due process criminal justice reforms pending in Congress. At the time, we described the ACLU as “a diminished shadow of its former self” and argued that “(t)he ACLU is now led by cafeteria civil libertarians who choose the liberties they deem worthy of protection based on a narrow ideological agenda.”

This week, the ACLU redeemed itself with a courageous stand against legislation that would have expanded the Obama administration’s reliance on secret watch lists to deny Americans their constitutional rights.

In the wake of the Orlando massacre, Senator Dianne Feinstein introduced legislation in the Senate that would authorize the attorney general to block a gun sale for anyone suspected of terrorism. Feinstein’s bill would also have authorized the attorney general to add anyone to a watch list who had been investigated for terrorism within the past five years, even if that person had been completely exonerated of any involvement in terrorism.

The ACLU sent a letter to senators urging them to vote no because of “the use of vague and overbroad criteria and the lack of adequate due process.” The ACLU’s letter argued that the “regulation of firearms and individual gun ownership or use must be consistent with civil liberties principles, such as due process, equal protection, freedom from unlawful searches and privacy.”

After the Feinstein bill was defeated in the Senate, the House Democrats sought to force Speaker Paul Ryan to bring a nearly identical companion bill up for a vote. The bill was first introduced by Sen. Frank Lautenberg and Rep. Peter King in 2007. According to King, Speaker Pelosi refused to bring the bill up for a vote when the Democrats controlled the House. Nearly 10 years later, congressional Democrats demanded that the GOP allow a vote on the very same legislation.

House Democrats took a page out of Donald Trump’s playbook by using ad hominem attacks, fear-mongering and deliberate disinformation in a cynical, shortsighted attempt to score political points at the expense of Americans’ constitutional rights. Their indecorous protest turned the House of Representatives into an infantile reality TV show. As legal expert Alan Dershowitz put it during an appearance on CNN, the House Democrats behaved like “a bunch of buffoons.”

Dershowitz was being kind.

Democrats and their allies in the powerful lobbying group the Center for American Progress repeatedly accused opponents of wanting to arm terrorists if they opposed either the Senate or House No Fly List legislation. On Twitter, Sen. Elizabeth Warren said, “the Senate GOP have decided to sell weapons to ISIS.” Rep. Jerry Nadler tweeted that the bill’s critics’ due process arguments are “a red herring,” and accused them of using “due process as an excuse to support mass murder.”

While the House Democrats were making fools of themselves on the floor of Congress, lawyers for the ACLU had already appeared before a U.S. district court in a lawsuit that resulted in rulings that the No Fly List is unconstitutional. The ACLU filed the lawsuit in June 2010 on behalf of 10 U.S. citizens and permanent residents — four of whom are U.S. military veterans — challenging their placement on, and inability to get off, the list.

“(T)he standards for inclusion on the No Fly List are unconstitutionally vague, and innocent people are blacklisted without a fair process to correct government error,” the ACLU’s National Security Project Director Hina Shamsi wrote in a commentary last December on the organization’s website. “Our lawsuit seeks a meaningful opportunity for our clients to challenge their placement on the No Fly List because it is so error-prone and the consequences for their lives have been devastating.”

Shamsi noted that the U.S. district court, in two separate rulings, had already held: 1) “that constitutional rights are at stake when the government stigmatizes Americans as suspected terrorists and bans them from international travel”; and 2) “that the government’s refusal to provide any notice or a hearing violates the Constitution.”

The national ACLU must have faced tremendous pressure to remain silent in the face of the congressional Democrats’ ideological onslaught on due process. But the national ACLU stood firm on principle, even if it meant aligning itself with the NRA and the House GOP.

“We disagree with Speaker Ryan on many things,” Shamsi wrote last year. “But he’s right that people in this country have due process rights. We want to see them protected.”

And so should the Democrats in Congress, if they have an ounce of integrity and any respect for the Constitution.

Nat Hentoff is a nationally renowned authority on the First Amendment and the Bill of Rights. He is a member of the Reporters Committee for Freedom of the Press, and the Cato Institute, where he is a senior fellow. Nick Hentoff is a criminal defense and civil liberties attorney in New York.

Share |

Venezuela vs. Ecuador (Chavismo vs. Chavismo Dollarized)

Steve H. Hanke

With the arrival of President Hugo Chávez in 1999, Venezuela embraced Chavismo, a form of Andean socialism. In 2013, Chávez met the Grim Reaper and Nicolás Maduro assumed Chávez’ mantle.

Chavismo has not been confined to Venezuela, however. A form of it has been adopted by Rafael Correa — a leftist economist who became president of a dollarized Ecuador in 2007.

Even though the broad outlines of their economic models are the same, the economic performances of Venezuela and Ecuador stand in stark contrast with one another.

One metric that can be used to compare the two Latin American countries is the misery index. For any country, a misery index score is simply the sum of the unemployment, inflation, and bank lending rates, minus the percentage change in real GDP per capita. A higher misery index score reflects higher levels of “misery.” Using data from the Economist Intelligence Unit, I determined that Venezuela held the ignominious top spot — the world’s most miserable country — at the end of 2015, with a score of 214.9. Ecuador, on the other hand, had a misery index score of 18.9, which placed it in a slightly better position than the median Latin American country. Why the big difference?
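The index as defined above is straightforward to compute. Here is a minimal sketch in Python, with illustrative placeholder inputs rather than the Economist Intelligence Unit data used in this article:

```python
def misery_index(unemployment, inflation, lending_rate, real_gdp_pc_growth):
    """Sum of the unemployment, inflation, and bank lending rates (all in
    percent), minus the percentage change in real GDP per capita."""
    return unemployment + inflation + lending_rate - real_gdp_pc_growth

# Illustrative round numbers (not EIU figures): note how a shrinking
# economy (negative per-capita growth) raises the score.
print(misery_index(7.0, 180.0, 20.0, -8.0))  # -> 215.0
```

Because a contraction enters with a minus sign, negative growth adds to the score — which is why a country with triple-digit inflation and a collapsing economy, like Venezuela, posts such extreme readings.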

It can be explained by the U.S. dollar. Ecuador uses the greenback as its official currency — like Panama and El Salvador — and Venezuela uses the battered bolívar. Prior to 2000, Ecuador was like Venezuela; it used its own currency, the sucre. But, Ecuador (like Venezuela) was incapable of imposing the rule of law and discipline in its monetary and fiscal spheres.

The Banco Central del Ecuador was established in 1927, with a sucre-U.S. dollar exchange rate of 5. Until the 1980s, the central bank periodically devalued the sucre against the dollar, violating the rule of law. In 1982, the central bank began to exercise its devaluation option with abandon. From 1982 until 2000, the sucre was devalued against the dollar every year. The sucre traded at 6,825 per dollar at the end of 1998, and by the end of 1999, the sucre-dollar rate was 20,243. During the first week of January 2000, the sucre rate soared to 28,000 per dollar. In the case of Ecuador, the inability of the government to abide by the rule of law is, in part, a consequence of traditions and moral beliefs.

Ecuadorian politics have traditionally been dominated by elites (interest groups) that are uninhibited in their predatory and parochial demands on the state. With the lack of virtually any moral inhibitions, special interest legislation has been the order of the day. For example, during the rout of the sucre in 1999, laws were passed allowing bankers to make loans to themselves. In addition, state guarantees for bank deposits were introduced. These proved to be a deadly cocktail, one that allowed for massive looting of the banking system’s deposit base. This, as well as the collapsing sucre, enraged most Ecuadorians.

With the rule of law (and the sucre) in shambles, President Jamil Mahuad announced on January 9, 2000 that Ecuador would abandon the sucre and officially dollarize the economy. The positive confidence shock was immediate. On January 11th — even before a dollarization law had been enacted — the central bank lowered the rediscount rate from 200 percent a year to 20 percent. On February 29th, the Congress passed the so-called Ley Trolebús, which contained dollarization provisions. It became law on March 13th, and after a transition period in which the dollar replaced the sucre, Ecuador became the world’s most populous dollarized country.

I had a front-row seat in Ecuador’s dollarization drama — both as a participant in the dollarization debates that preceded the sucre’s collapse and also during the replacement of the sucre and the greenback’s implementation phase, when I was an advisor to Carlos Julio Emanuel, the Minister of Finance and Economy. As for Venezuela, I had another front-row seat, as President Rafael Caldera’s adviser in 1995-96, prior to the arrival of Chávez. To put discipline into Venezuela’s monetary and fiscal spheres, I recommended an orthodox currency board — one that would have made the bolívar a clone of the U.S. dollar.

Caldera came close to adopting my recommendations. But, in the end, he failed to do so. The elites and special interest groups, as well as a variety of leftists, were opposed to any reform that would introduce the rule of law and impose monetary and fiscal discipline. The failure to adopt the rule of law has been catastrophic.

Let’s look at oil production. Venezuela has the largest proven oil reserves in the world — even greater than Saudi Arabia’s. But, the oil output of Venezuela’s state-owned oil company, PDVSA, is only 80 percent of what it was in 1999 (see the accompanying chart). In contrast, Ecuador’s oil output has jumped in the post-dollarization period and is now over 40 percent higher than in 1999.

[Chart: Oil production in Venezuela (PDVSA) and Ecuador, 1999 to present]

Venezuela’s inflation record under Chávez was dismal, and under Maduro it has been catastrophic. For the past three years, Venezuela’s inflation rate has held the world’s top spot. It reached an annual rate (year-over-year) of almost 800 percent in the summer of 2015 and is 145 percent at present, still the world’s highest rate (see the chart below). In contrast, Ecuador’s annual inflation during the last ten years — dollarized years — has averaged 5.2 percent.

[Chart: Annual inflation rates in Venezuela and Ecuador]

The most telling contrast between Venezuela’s Chavismo and Ecuador’s Chavismo Dollarized can be seen in the accompanying chart of real GDP in U.S. dollars. We begin in 1999, the year Chávez came to power in Venezuela.

[Chart: Real GDP in U.S. dollars, Venezuela vs. Ecuador, 1999 onward]

The comparative exercise requires us to calculate the real GDP (absent inflation) and do so in U.S. dollar terms for both Venezuela and Ecuador. Since Ecuador is dollarized, there is no exchange-rate conversion to worry about. GDP is measured in terms of dollars. Ecuadorians are paid in dollars. Since 1999, Ecuador’s real GDP in dollar terms has almost doubled.

To obtain a comparable real GDP for Venezuela is somewhat more complicated. We begin with Venezuela’s real GDP, which is measured in terms of bolívars. This bolívar metric must be converted into U.S. dollars at the black market (read: free market) exchange rate. This calculation shows that, since the arrival of Chávez in 1999, Venezuela’s real GDP in dollar terms has vanished. The country has been destroyed by Chavismo.

So, where is Venezuela going? According to the International Monetary Fund’s (IMF) forecast, inflation will be 720 percent by the end of the year. We can reverse engineer the IMF’s inflation forecast to determine the bolívar-greenback exchange rate implied by that forecast. When we conduct that exercise, we calculate that the VEF/USD rate moves from today’s black market (read: free market) rate of 1,079 to 6,218 by year’s end. So, the IMF is forecasting that the bolívar will shed 83 percent of its current value against the greenback by New Year’s Day, 2017. The following chart shows the dramatic plunge anticipated by the IMF.
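The 83 percent figure follows from simple exchange-rate arithmetic, using only the two rates quoted above (the current black-market rate and the IMF-implied year-end rate). A minimal sketch in Python:

```python
# Implied loss of the bolivar's dollar value when the VEF/USD rate
# moves from 1,079 to 6,218: depreciation = 1 - (old rate / new rate).
old_rate = 1079.0   # VEF per USD, current black-market rate
new_rate = 6218.0   # VEF per USD, implied by the IMF inflation forecast

depreciation_pct = (1 - old_rate / new_rate) * 100
print(round(depreciation_pct, 1))  # -> 82.6, i.e. roughly 83 percent
```

Note that a currency sheds value in proportion to the ratio of the rates, not their difference: the rate rising nearly sixfold translates into the bolívar retaining only about a sixth (17 percent) of its dollar value.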

[Chart: Black-market VEF/USD exchange rate and the IMF-implied year-end level]

Venezuela is clearly in a death spiral. The only way out is to officially dump the bolívar and replace it with the greenback.

Steve H. Hanke is a professor of Applied Economics at The Johns Hopkins University in Baltimore and a Senior Fellow at the Cato Institute in Washington, D.C.

Share |

No, the Polls Aren’t Biased. Clinton Really Is Leading Trump

Emily Ekins

The conservative blogosphere is lighting up again with accusations of polling bias against Republican presidential candidate Donald Trump in his race against Democratic opponent Hillary Clinton. However, Trump supporters should resist the temptation to assume that unfavorable results must be biased results. Clinton really is leading Trump, and by nearly 6 percentage points.

If people don’t like poll results, they may find their interests better served by taking action to persuade other voters to their point of view, rather than cry bias.

The blogospherian argument goes something like this: Clinton is leading Trump by 5 to 7 points in certain polls because the pollsters oversampled or over-weighted Democrats by about 5 to 7 points. If the polls are “corrected” to include fewer Democrats then the race is actually tied, they say.

For instance, one blogger argues that a recent CBS News poll inflated the number of Democrats in its sample, which comprised 28 percent Republicans and 35 percent Democrats. Citing one pollster’s calculation, she contends that party identification in the United States is closer to parity — 28 percent Republicans and 29 percent Democrats — rather than a seven-point Democratic advantage. She reasons that erasing the partisan gap would erase Clinton’s six-point lead over Trump.

For Trump supporters, this is a tempting narrative to believe. But it simply isn’t so. The fact is that there are just more Democrats than Republicans, and this has largely been the case since at least the New Deal. That obviously doesn’t mean Democrats always win, but it’s unwise to assume a pollster is biased because its sample included more Democrats than Republicans.

HuffPollster, which aggregates hundreds of polls across nearly 100 different pollsters, finds that, averaging across 2016 polls, 34.8 percent of Americans identify as Democrats and 28.7 percent identify as Republicans — roughly a six-point Democratic advantage. This is very similar to the CBS poll’s partisan composition. Democrats maintain this advantage among likely voters as well: HuffPollster finds that, on average, 38 percent of likely voters are Democrats and 32.9 percent are Republicans — a 5.1-point Democratic advantage.

Furthermore, not a single major poll conducted since the beginning of June has found Trump ahead. Since May, Trump has led in only three of the 24 polls tracked at RealClearPolitics (RCP). As of today, the RCP polling average shows Clinton with a 5.8-percentage-point lead over Trump. Thus, Clinton’s lead is not due to one biased poll; polls from Rasmussen (Clinton +5), CBS (+6), Monmouth (+8), CNN (+5), and Fox (+3) all find Clinton leading Trump.

These accusations of polling bias are strikingly similar to what happened in 2012 with “UnSkewedPolls.com,” when some claimed the polls were unfairly sampling too many Democrats and thus giving an unmerited advantage to Barack Obama over Mitt Romney. But at the end of the day, the polls were largely correct and Obama won handily.

It’s easy to give in to the temptation to assume that polls with unfavorable results must be biased. In some elections, polls have been considerably off. But by and large, it’s usually safer to assume that results averaged across multiple polls from a variety of polling organizations are pointed in the right direction. Thus, if people don’t like poll results, they may find their interests better served by taking action to persuade other voters to their point of view, rather than cry bias.

Emily Ekins is a research fellow at the Cato Institute. Her research focuses primarily on American politics, public opinion, political psychology, and social movements, with an emphasis in survey and quantitative methods.

Share |

Obama’s Executive Abuses Harm Chances for Immigration Reform

Ilya Shapiro

The Supreme Court’s non-ruling that left in place the lower courts’ injunction against President Barack Obama’s executive actions on immigration should’ve come as no surprise to anyone who followed this case or read the oral argument transcript. The Department of Homeland Security claimed unprecedented discretion not just regarding enforcement priorities — the twenty-six-state plaintiffs didn’t challenge that — but, as U.S. District Judge Andrew Hanen wrote back in February 2015, also to pursue the “affirmative action” of granting benefits to a large class of illegal immigrants. The president himself had boasted that he “took an action to change the law,” contradicting his earlier protestations that he’s not a king and undermining the government’s argument that this was all mere policy guidance. That we came one vote from ratifying this royal lawmaking — not Merrick Garland’s; he wouldn’t have been confirmed in time to hear the case and it’s disingenuous for the president to claim otherwise — speaks volumes to how ends-justify-the-means the White House’s supporters are.

But regardless of what anyone thinks about the four-four denouement, the administration’s maneuvers represent an unfortunate setback for those who seek lasting immigration reform.

That may seem counterintuitive. Isn’t it better to do something, to at least get relief for four million people and worry about a larger fix when the political winds are more favorable?

President Obama picked a political fight that mired immigration reform in uncertain and ultimately fatal litigation instead of pushing for real change.

The answer is “no” for two big reasons. First, the “executive discretion” at issue, even if it hadn’t been stopped by the judiciary, could’ve been reversed by any future executive. While it would’ve been hard to claw back whatever tangible benefits were extended in the meantime — or the money states would’ve been forced to spend on driver’s licenses and other benefits — there was never any guarantee that the granted residence and work permits would be renewed.

In other words, the people who would’ve been eligible for temporary status under Deferred Action for Parents of Americans and Lawful Permanent Residents (DAPA) may have moved “out of the shadows” but they’d still have been under a legal cloud. Perhaps that’s why an underwhelming number of those eligible for the earlier Deferred Action for Childhood Arrivals (DACA, the expansion of which is also now blocked) have registered for that program: why make it easier for the government to deport you in the future?

Second, and more significantly, President Obama short-circuited any chance at a legislative solution — for a Hillary Clinton administration as well. There’s a reason that we’ve all heard how the president has “poisoned the well.” By resorting to executive actions — right after the GOP won a midterm election running against just that style of governance — Obama ensured that Congress will never see anyone associated with him as an honest broker.

While it’s true that it’s difficult for a president to get any significant legislation through a Congress run by the other party, Obama didn’t even try to do anything on immigration when Democrats controlled both houses. (Indeed, the well was poisoned then, when he rammed through Obamacare and Dodd-Frank.) And there are certainly reforms that would have gained majorities had Obama not acted as he did, such as expanding high-tech visas and employment-based green cards. Even a larger reform that would’ve given legal status to those here illegally was possible, turning mainly on the scope of a guest-worker program and restricting the “pathway to citizenship.”

In short, President Obama picked a political fight that mired immigration reform in uncertain and ultimately fatal litigation instead of pushing for real change. Reformers are now worse off than they were two years ago, for having lost time and opportunity.

It’s all so unfortunate, because everybody knows that our immigration system is a mess, quite possibly the worst part of the federal government.

That’s quite a statement, I know — particularly coming from someone at the Cato Institute, and especially from a constitutional lawyer who’s spent significant time and energy battling Obamacare. But it’s true: far from merely advancing bad policy, our current immigration system lacks any coherent policy that it purports to implement. It’s a compilation of various half-baked “reforms” going back decades, a schizophrenic set of laws and regulations.

And the solution is straightforward: Expand the ways to be here legally, then crack down on those who ignore them. If you commit a crime, or go too long without a job, you lose your visa. But give people a chance to earn an honest living. As long as we screen for criminal records, terrorism, and public health, America should stand for the idea of letting people in who seek a better life, in an orderly way: a funnel, not a necessarily leaky wall.

I say this as an immigrant myself, who could be expected to be least sympathetic to those who came here illegally. After all, I navigated the bureaucratic morass — finally became a citizen two years ago after living here my entire adult life — so why shouldn’t everyone? Why should we “amnesty” people, even legislatively, who didn’t play by the rules I painstakingly followed?

That answer is also simple: I care about my new country and about giving people the opportunity I got only through some rather fortuitous twists of fate. And I also care about the rule of law, one of the reasons people want to come here in the first place. That’s why it saddens me that President Obama’s executive actions are so beyond the constitutional pale that Saturday Night Live parodies them as violating the Schoolhouse Rock instruction on “how a bill becomes a law.”

The fact that DACA and DAPA have stymied chances for legislative reform is doubly sad because our immigration laws themselves undermine the rule of law. If you brainstormed a process for how foreigners enter the country, how long they can stay, and what they can do while here, it would be hard to come up with something worse than our current hodge-podge of often contradictory rules. This immigration non-policy serves nobody’s interest — not big business or small, not the rich or the poor, not the economy or national security, and certainly not the average taxpayer — except perhaps bureaucrats and immigration lawyers.

The rule of law means changing the laws we now have rather than paying lip service to the idea that we should spend a trillion dollars enforcing them. Creating a line for people to get into — skilled and unskilled — isn’t “amnesty” but “parole.” That’s why President Reagan’s 1986 reform failed: not because we didn’t combine an amnesty with border enforcement, but because we didn’t follow the parole we granted with a workable line (or funnel) for future immigrants.

Alas, this administration has never been willing to spend political capital on immigration reform — unlike President George W. Bush, who came very close to attaining that elusive goal. Given how little trust President Obama now inspires, and the culpability he therefore shares for giving rise to “burn it all down” populisms of the left and right, it’s clear that this challenge falls to a successor. I had long thought that there would be a Nixon-to-China moment, with a President Cruz or Walker (or whomever conservatives trust) working with Congress to finally pass comprehensive reform. That moment may now have to wait at least four years.

In the meantime, immigration activists did themselves no favors by claiming that an obvious violation of both administrative and immigration law helps those whom they purport to represent. United States v. Texas was not a case about the merits of immigration reform or whether we as a nation accept immigrants. That President Obama couldn’t resist making the demagogic claim that it was is indicative of the seriousness with which he didn’t take the Constitution’s separation of powers — to the detriment of law and policy alike.

Ilya Shapiro is a senior fellow in constitutional studies at the Cato Institute. He filed briefs supporting the challenge to DAPA on behalf of Cato and others who generally support immigration reform. Like most immigrants, he does a job native-born Americans won’t: defending the Constitution.

Share |

Come on, Trump, Debate Gary Johnson

Gene Healy

In some ways, it was typical Donald Trump: He belittled the other candidates (“are these people stiffs or what?”), blasted US foreign policy leaders (“a bunch of weak sisters”), and then bragged that he “built a lot of great wealth.”

But then, surprisingly, he made a valid point: “It’s disgraceful” that third-party candidates are systematically excluded from the nationally televised presidential debates.

When Trump was a political outsider, he wanted the debate stage opened up to alternative viewpoints; now that he’s a member of the club, he wants it kept more exclusive than the Mar-a-Lago.

“I am not surprised that the two-party political establishment wants to keep the American people from having a third choice,” Trump said. “It’s amazing that they can get away with it.”

That was January 2000. The celebrity real estate magnate was flirting with a presidential run on the Reform Party ticket. Sixteen years later he’s the presumptive Republican nominee, and the mere hint of “a spoiler indie candidate” drives him into a spluttering rage on Twitter. Sad!

When Trump was a political outsider, he wanted the debate stage opened up to alternative viewpoints; now that he’s a member of the club, he wants it kept more exclusive than the Mar-a-Lago.

The leading third-party alternative in the 2016 race is Gary Johnson, the former New Mexico governor who has clinched the Libertarian Party nomination. Johnson has reached as high as 12 percent in the few polls that have bothered to mention his name, and he’s likely to be on the ballot in all 50 states.

The Donald of 2000, with his expert’s eye for a fraudulent scheme, was absolutely right that the system is rigged to exclude third-party alternatives.

The fix has been in since at least 1988, when the major-party front group known as the Commission on Presidential Debates engineered a hostile takeover of the debates, shoving aside the independent group that had hosted them from 1976-84, and ensuring that the two-party “duopoly” would control the process in order to preserve politics as usual.

This year the CPD will do everything in its power to make sure the national debate audience never gets to hear Johnson.

As Trump said, “it’s amazing that they can get away with it.” Understanding how they managed the feat requires a look back at the corrupt bargain that gave rise to the CPD and enabled it to hijack the presidential debates.

From 1976 through 1984, an independent group, the League of Women Voters, hosted the debates, and repeatedly rebuffed major-party demands for safe, stage-managed affairs.

In 1980, when President Jimmy Carter refused to appear with independent candidate John Anderson, the league threatened to hold the debate with Anderson, Ronald Reagan, and an empty chair. (They eventually went forward sans the chair — and Carter.) During the 1984 race, after the Reagan and Mondale teams rejected 68 proposed debate panelists, the league backed them down with a press conference calling out the campaigns for “totally abusing the process.”

That sort of behavior struck major-party moguls as entirely too “pushy.” They wanted a more compliant sponsor, and if you want something done right, sometimes you’ve got to do it yourself. Thus, in 1984 the chairs of the Republican and Democratic national committees hatched a plan to sideline the league and take over the debates: “The two major political parties should do everything in their power to strengthen their own position,” explained then-RNC head Frank Fahrenkopf. Three years later, the parties announced the formation of the Commission on Presidential Debates, co-chaired by … the heads of the RNC and DNC.

In 1988, the league refused to go along with a restrictive “Memorandum of Understanding” that set the terms of the Bush-Dukakis debates, warning that it “would perpetrate a fraud on the American voter.” The CPD stepped in as official sponsor, and the takeover was complete.

Negotiated between the campaigns every four years and rubber-stamped by the CPD, the candidates’ Memoranda of Understanding read like Hollywood stars’ contract riders.

The 2012 Obama-Romney MOU is typical; at 21 pages, it covers minutiae like the specific placement of the podiums: “equally canted toward the center of the stage” at an angle to be approved by the campaigns.

But the real problem is what the MOUs restrict. “In general, direct candidate-to-candidate questioning has been banned,” reports an Annenberg white paper on debate reform, and there are to be no “challenges for additional debates.” The moderators are prohibited from asking the candidates for “a show of hands” or other similar calls for response, and in town hall debates, follow-up questions are prohibited.

Even the camera crew is under tight restrictions: “No TV cut-aways to any candidate who is not responding to a question.” Perhaps the Trump campaign can add a proviso ensuring that the cameras never linger on the candidates’ hands.

The CPD would, no doubt, be willing to oblige. As Scott Reed, Bob Dole’s campaign manager in 1996, explained: “The commission does what you tell them to do,” including barring the debate forum door to any candidate who might spoil the party for the red and blue teams.

In the 1992 cycle, Texas billionaire Ross Perot had been included in all three debates at the insistence of the George H.W. Bush campaign, which wrongly expected he’d tip the race to Bush. Perot shot up from 7 percent in pre-debate polls to nearly 19 percent of the popular vote on Election Day.

But in 1996, both the Clinton and Dole campaigns wanted Perot kept off the stage, and the CPD complied, even though three-quarters of eligible voters wanted him included. The parties got their way, and managed to duck the blame for it as well: “We were able to hide behind the commission,” said Reed.

To make future exclusions look less arbitrary, in 2000 the CPD adopted a numerical standard: Eligible candidates would need to show at least 15 percent support in independent national polls in the runup to the debates. To have such “a high criteria for a party that’s a legitimate party” that will be on the ballot “in all 50 states [is] very unfair,” Trump complained at the time. Indeed, that requirement kept the Reform Party’s eventual nominee, Pat Buchanan, and the Green Party’s Ralph Nader off the stage that year, and would have barred Anderson in 1980, Perot in ‘92, and nearly every third-party candidate in American history.

The 15 percent rule will keep Gary Johnson out too, unless he’s able to better his current standing in the polls. But as Perot showed in 1992, sometimes admission to the debates is a prerequisite for cracking that barrier. Another Catch-22 for Johnson is that, thus far, most national polling organizations aren’t asking about him — and the CPD picks the pollsters that count.

So what’s Trump so worried about? The Donald likes to posture as a fearless outsider — he even wrote a book called Time to Get Tough (chapter one: “Get Tough”). Lately, though, he gets jumpy whenever the Libertarian candidate comes up in an interview: “I don’t want to mention the name; we want to give them as little publicity as possible,” Trump said on Fox News the other day.

Luckily for the Donald, he’s an establishment insider now, and the Commission on Presidential Debates is on his side, ready to protect insiders from competition. Membership has its privileges.

Gene Healy is a vice president at the Cato Institute and author of The Cult of the Presidency.

Share |

Britain’s Democracy Is a Sham

Marian L. Tupy

European countries have joined the European integration process for different reasons. Germany wanted to expiate its World War II guilt, France wanted to enhance its global influence, Poland wanted protection from Russian expansionism and Romania wanted a less corrupt government. Great Britain wanted easier access to a free trade area called the European Economic Community. It was membership of that community that the British people approved in a 1975 referendum. No more, no less.

On Thursday, the British people will decide if they wish to remain in an organization that only faintly resembles the former European Economic Community. Over time, a humble free trade area evolved into a supranational entity that at least superficially resembles a federal state. The European Union has its own flag, anthem, currency, president (five of them, actually) and a diplomatic service. It is only natural that the British electorate should be given an opportunity to reflect on the changes that have taken place over the last 42 years.

The people of Europe are sick and tired of being ignored, and none more so than the British.

Before joining the European Economic Community, Britain was a sovereign and democratic polity. Its governing institutions stretched back a thousand years and were the envy of the world. The island gave us representative democracy, the rule of law, the abolition of slavery, the English language and the Industrial Revolution. It saved Europe from Louis XIV and Napoleon during the French ascendancy, and from Wilhelm II and Hitler during the German ascendancy. As such, it must surely count, along with ancient Greece, among the most consequential of nations.

But Britain lost some of its greatness. The country was exhausted from fighting two world wars. It lost confidence after its imperial possessions gained independence. Most seriously, Britain was suffering from socialist rot. Its centrally planned wartime economy was never fully liberalized, with food rationing persisting into the 1950s. In the meantime, West Germany, which had been obliterated by Allied bombing during World War II but was revived by Ludwig Erhard's free market reforms, powered ahead of Britain in terms of standard of living.

And so, Britain threw in its lot with the nascent EU. That proved to be a bit of a Faustian bargain. In exchange for access to the common market, Britain had to accept an external tariff and, over time, a deluge of regulations from power-hungry Brussels. The former makes imports more expensive in Britain, while the latter makes British exports less competitive globally. Most importantly, the British people, who struggled for their political rights for centuries (even beheading a king in the process), lost much of their political freedom.

As the European integration process deepened, ever more so-called competences were ceded by the EU member states to Brussels. Today, the EU has a say in almost everything, from agricultural production and labor regulations to the strength of European showers and the electricity consumption of European vacuum cleaners.

A defining feature of democracy is the ability of the electorate to choose and replace the government through free and fair elections. The choice, however, needs to be a meaningful one. What is the point of being able to choose between two or more candidates, if none of them can effect specific policy changes? That question is at the core of the upcoming referendum on British exit from the EU.

Truth be told, democracy in Europe is a bit of a sham. People still cast their votes for their favorite candidates, but voters know that those candidates are powerless to change decisions made by the unelected, unknown and unaccountable bureaucracy in Brussels.

The EU, it is vital to understand, is undemocratic not by accident, but by design. Politicians in Brussels know that there is no public support for so-called deeper integration. Jean-Claude Juncker, the current president of the EU Commission, summed up the decision-making behind the introduction of the single currency thusly: “We decide on something, leave it lying around and wait and see what happens. If no one kicks up a fuss, because most people don’t understand what has been decided, we continue step by step until there is no turning back.”

The people of Europe are sick and tired of being ignored, and none more so than the British. On Thursday, the British people will be able to choose whether to regain full sovereignty, or remain in the EU. Should they choose the former, other countries will be sure to follow.

Marian L. Tupy is a senior policy analyst at the Cato Institute’s Center for Global Liberty and Prosperity.