I Want My Binky
The Lizard: Episode 4
Collage by Anthony Freda - Text by Mike Armstrong
Public Banks Can Create 50 Wall Streets
From Massachusetts to Oregon, Colorado to Illinois and Wisconsin, and Ohio to California, citizens throughout the country voted overwhelmingly last week for their legislators to pass a constitutional amendment overturning the U.S. Supreme Court’s Citizens United v. Federal Election Commission ruling, declaring that only human beings – not corporations – are entitled to constitutional rights, and affirming that money is not speech and that campaign spending can be regulated.
Residents in over 150 cities and towns had the opportunity to vote on measures calling for an end to the doctrines of corporate constitutional rights and money as free speech, and in every single one the vote was supportive, often by an overwhelming margin. This encouraging development was a response to activist Federal Courts, which created the concept of corporations as persons, imbuing them with the same constitutional rights as human beings. Corporations were not bestowed this distinction by any act of Congress. The people have never voted to grant such rights. Corporate personhood is solely a construct of the Courts, which, in so doing, have exceeded their Constitutional authority and jurisdiction.
As we witness citizens pushing back on the idea that corporations have constitutional rights, it’s important to recognize that concentration of power — and money — has had a hand in the establishment of corporate personhood. Wall Street banks certainly have bought tremendous influence in Congress. In a recent op-ed published in the Washington Post, Craig Shirley, the president of Shirley & Banister Public Affairs, an Alexandria-based lobbying and communications firm, writes:
…If we rightly fear all concentrations of power, then the first order of business must be to break up the five big banks. The rationale is simple: Since the banks used illicit means via lobbyists and government to acquire such power, then government can be used to undo their ill-gotten authority.
Wall Street is too fearsome and corrupt for anyone’s good. We should find a way to create 50 Wall Streets so that money can stay in the states, and corruption can be kept to a minimum and law enforcement to a maximum. In the era of the Internet — which empowers the individual — can there be any doubt that scrutiny of local Wall Streets would keep bankers and brokers on their toes?
There is no better way to do this than to create 50 state-owned banks and scores of county and municipal-owned banks throughout the country, effectively moving the center of gravity from Wall Street to Main Street. By putting the issuance and control of hundreds of billions of dollars of credit in the hands of public servants, we effectively neuter Wall Street, at least when it comes to control of public money. Public scrutiny, called for by Craig Shirley, will undoubtedly be a natural element of the oversight and governance of a public bank. After all, public banks are public institutions managed by professional bankers, who are public servants. And let’s be very clear: public banks simply need to create loan programs, not be involved in making lending decisions. Loan origination decisions can be made locally by community bankers, an important control point that the Bank of North Dakota so effectively models.
This approach would serve our communities as we seek to reverse the artificial economic scarcity imposed on us by the existing private banking system. Affordable loans to fund the new economy: a simple idea whose roots are local, not centralized on Wall Street or in Congress. The Public Banking Institute envisions public banks as the organizing entity for the development of this market. Public banks can provide debt capital to businesses and, by issuing corporate bonds, serve as the investment vehicle for people who wish to invest a portion of their retirement funds locally.
Local investing happens to be the dominant theme in our upcoming annual conference, which we’ll soon be announcing. This year’s conference will be co-hosted with the Dominican University MBA Program in San Rafael, California. Its theme, “Think Global, Invest Local,” will serve as the context for exploring how public banking can be used to solve the mortgage fraud crisis in association with the use of Eminent Domain, build sustainable and local food systems, provide for local investment opportunities, and address how a U.S. postal savings bank can better serve the large unbanked population we have in both rural and urban areas of the USA. It promises to be a significant conference that will provide opportunities to explore alternative models for creating the investments we make in our communities.
Meanwhile, we keep building the Public Banking Coalition and forming chapters throughout the USA. You can help by either making a donation or by emailing and volunteering your assistance in other ways. Please get involved — we’re doing important work and need your help.
Collage by Anthony Freda - Text by Noam Chomsky
THE INFORMATION PROBLEM: THE PROSPEROUS FEW AND THE RESTLESS MANY
Part of the reason why the few remain prosperous and the many remain restless is our broken system of information. As the following excerpt illustrates, when information surfaces that points out fundamental flaws in our system — information which, if discussed meaningfully, might help rectify the plight of the restless — the truth is omitted and business-as-usual continues unimpeded, because six prosperous companies control 90% of the information in the US. Why some are prosperous and others restless is a question of how power flows and is structured. Noam Chomsky’s book, “Understanding Power,” is a collection of talks he gave, together with the question-and-answer sessions that followed.
What would have to happen for people to be able to do more of the real work of society—like supporting each other and educating children—instead of just spending our whole lives working at lame jobs for corporations?
Actually, a lot of countries tend to emphasize those things, even today— we don’t have to look very far for models. For example, take Western Europe: those are societies not very different from ours, they have the same corporate-run economy, the same sort of limited political system, but they just happen to pursue somewhat different social policies, for various historical reasons. So Germany has a kind of social contract we don’t have—one of the biggest unions there just won a 35-hour work-week, for example. In the Netherlands, poverty among the elderly has gone down to flat zero, and among children it’s 4 percent, almost nothing. In Sweden, mothers and fathers both get substantial parental leave to take care of their children, like a year or something—because taking care of children is considered something that has value in that society, unlike in the United States, where the leadership elements hate families. I mean, [politicians] may talk about supporting “family values,” but they actually want families destroyed—because families are not rational from the point of view of profit-making.
So even within the range of existing societies set up almost exactly like ours, there are plenty of other social policies you could have—and I think our system could tolerate those things too, it really just depends if there’s enough pressure to achieve them.
Actually, you might want to take a look at an interesting volume published recently by U.N.I.C.E.F. [the United Nations Children’s Fund], about treatment of children in the rich countries—it’s yet to be reviewed in the New York Times, or anywhere else in the United States, but it’s really quite revealing. It was written by a very good American economist named Sylvia Ann Hewlett, and she identifies two basic patterns of treatment, a “Continental-European/Japanese” model and an “Anglo-American” model—which just are radically different. Her conclusion is, the Continental-European/Japanese pattern has improved the status of children and families; the Anglo-American pattern has been what she calls “a war” against children and families. And that’s particularly been true in the last twenty years, because the so-called “conservatives” who took over in the 1980s, aside from their love of torture and misery abroad, also happen to be passionately opposed to family values and the rights of children, and have carried out social policies which have destroyed them.
Well, that’s just the wrong story for the New York Times—so that study never gets reviewed. Instead what the Times editors devote the cover-story of their Book Review to is another extremely deep problem the United States is facing—in case you aren’t aware of it, you’d really better read this. We’re facing the problem that “bad genes” are taking over the United States—and part of the proof of that is that scores on S.A.T.s and I.Q. tests have been steadily declining in recent years, children just aren’t doing as well as they used to.
Well, somebody who’s really unsophisticated might think that the problem could have something to do with social policies that have driven 40 percent of the children in New York City below the poverty line, for example—but that issue never arises for the New York Times. Instead the problem is bad genes. The problem is that blacks, who evolved in Africa, evolved in kind of a hostile climate, so therefore they evolved in such a way that black mothers don’t nurture their children—and also they breed a lot, they all breed like rabbits. And the effect is, the gene pool in the United States is being contaminated, and now it’s starting to show up in standardized test scores. This is real hard science.
The Times’s review starts off by saying, well, maybe the facts in these books aren’t quite right, but nonetheless, one thing is clear: these are serious issues, and any democratic society which ignores them does so “at its peril.” On the other hand, a society doesn’t ignore “at its peril” social policies that are depriving 40 percent of the children in New York City of the minimal material conditions which would offer them any hope of ever escaping the misery, destitution and violence that surround them, and which have driven them down to levels of malnutrition, disease and suffering where you can predict perfectly well what their scores are going to be on the “I.Q.” tests you give them—none of that you even mention.
In fact, according to the last statistics I saw about this, 30 million people in the United States are suffering hunger. 30 million is a lot of people, you know, and that means plenty of children. In the 1980s, hunger declined in general throughout the entire world, with two exceptions: sub-Saharan Africa and the United States—the poorest part of the world and the richest part of the world, there hunger increased. And as a matter of fact, between 1985 and 1990, hunger in the United States increased by 50 percent— it took a couple years for the Reagan “reforms” to start taking hold, but by 1985 they were beginning to have their effects. And there is just overwhelming evidence, in case it’s not obvious from common sense, what the effects of this kind of deprivation are on children—physically, emotionally, and mentally. For one thing, it’s well known that neural development simply is reduced by low levels of nutrition, and lack of nurturance in general. So when kids suffer malnutrition, it has permanent effects on them, it has a permanent effect on their health and lives and minds—they never get over it.
And the growing hunger here isn’t just among children—it’s also been increasing among the elderly, to name one group. So as the Wall Street Journal recently pointed out in a front-page story, hunger is “surging” among the elderly: about five million older Americans, about 16 percent of the population over 60, are going hungry, they’re malnourished, many of them are literally starving to death. Now, in the United States we don’t have starvation the way they do in Haiti or Nicaragua or something—but the deprivation is still very real. In many places it’s probably worse than it is in Cuba, say, under the embargo.
And it’s not just hunger: it turns out that contact time between parents and children has declined by about 40 percent in the United States since the 1960s—that means that on average, parents and children have to spend about 10 or 12 hours less time together a week. Alright, the effects of that also are obvious: it means television as supervision, latch-key kids, more violence by children and against children, drug abuse—it’s all perfectly predictable. And this is mostly the result of the fact that today, both parents in a family have to put in 50- or 60-hour work-weeks, with no child-support system around to help them (unlike in other countries), just to make ends meet. And remember, this is in the 1990s, a period when, as Fortune magazine just pointed out, corporate profits are at a record high, and the percentage of corporate income going into payrolls is near a record low— that’s the context in which all of this has been happening.
Well, none of these things are discussed in the New York Times Book Review article either. They are discussed in the U.N.I.C.E.F. book I mentioned, but the Times chose not to review that one.
So to return to your question, you ask: what would have to happen for us to get social policies different from all of these? I don’t think there’s any reason why the “Anglo-American model” Hewlett identifies has to continue— and be extended. These aren’t laws of nature, after all; they’re social-policy decisions—they can be made differently. There’s a lot of space for changing these things, even in a society with the same corporate control as ours.
But why not ask another question. Why not ask why absolutist organizations have any right to exist in the first place? I mean, why should a corporation—technically a fascist organization of enormous power—have any right to tell you what kind of work you’re going to do? Why is that any better than having a king tell you what kind of work you’re going to do? People fought against that and overthrew it, and we can fight against it again and overthrow it.
There’s plenty of challenging, gratifying, interesting, productive work around for people to do, and there are plenty of people who want to do it— they simply aren’t being allowed that opportunity under the current economic system. Of course, there’s also plenty of junky work that has to get done too—but in a reasonable society, that work would just be distributed equally among everybody capable of doing it. If you can’t get robots to do it, fine, then you just distribute it equally.
Okay, I think that’s the kind of model we have to try to work towards now—and frankly, I don’t see any reason why that’s an impossible goal.
By Jonathan Schell - Excerpt from The Unconquerable World: Power, Non-Violence and the Will of the People
Like the starting gun of a race that no one knew he was to run, 9/11 set the pack of nations off in a single direction — toward the trenches. Although the attack was unaccompanied by any claim of authorship or statement of political goals, the evidence almost immediately pointed to al-Qaeda, the radical Islamist, terrorist network, which, though stateless, was headquartered in Afghanistan and enjoyed the protection of its fundamentalist Islamic government. In a tape that was soon shown around the world, the group’s leader, Osama bin Laden, was seen at dinner with his confederates in Afghanistan, rejoicing in the slaughter.
Historically, nations have responded to terrorist threats and attacks with a combination of police action and political negotiation, while military action has played only a minor role. Voices were raised in the United States calling for a global cooperative effort of this kind to combat al-Qaeda. President Bush opted instead for a policy that the United States alone among nations could have conceivably undertaken: global military action not only against al-Qaeda but against any regime in the world that supported international terrorism.
The president announced to Congress that he would “make no distinction between the terrorists who commit these acts and those who harbor them.” By calling the campaign a “war,” the administration summoned into action the immense, technically revolutionized, post-Cold War American military machine, which had lacked any clear enemy for over a decade. And by identifying the target as generic “terrorism,” rather than as al-Qaeda or any other group or list of groups, the administration licensed military operations anywhere in the world.
In the ensuing months, the Bush administration continued to expand the aims and means of the war. The overthrow of governments — “regime change” — was established as a means for advancing the new policies. The president divided regimes into two categories — those “with us” and those “against us.” Vice President Cheney estimated that al-Qaeda was active in 60 countries. The first regime to be targeted was of course al-Qaeda’s host, the government of Afghanistan, which was overthrown in a remarkably swift military operation conducted almost entirely from the air and without American casualties.
Next, the administration proclaimed an additional war goal — preventing the proliferation of weapons of mass destruction. In his State of the Union speech in January 2002, the president announced that “the United States of America will not permit the world’s most dangerous regimes to threaten us with the world’s most destructive weapons.” He went on to name as an “axis of evil” Iraq, Iran, and North Korea — three regimes seeking to build or already possessing weapons of mass destruction. To stop them, he stated, the Cold War policy of deterrence would not be enough — “preemptive” military action would be required, and preemption, the administration soon specified, could include the use of nuclear weapons.
Beginning in the summer of 2002, the government intensified its preparations for a war to overthrow the regime of Saddam Hussein in Iraq, and in the fall, the president demanded and received a resolution from the Security Council of the United Nations requiring Iraq to accept the return of U.N. inspectors to search for weapons of mass destruction or facilities for building them. Lists of other candidates for “regime change” began to surface in the press.
Leaving cooperative action behind
In this way, the war on terror grew to encompass the most important geopolitical issue facing the world: the disposition of nuclear weapons in the second nuclear age. The Clinton administration had already answered the question regarding American possession of nuclear weapons: even in the absence of the Soviet Union, the United States planned to hold on to its nuclear arsenal indefinitely.
In 2002, the Bush administration gave an answer to the question regarding nonproliferation, which throughout the nuclear age had been dealt with exclusively by diplomacy and negotiation, or, on occasion, economic sanctions. The new answer was force. Nuclear disarmament was to be achieved by war and threats of war, starting with Iraq. One complementary element of the new policy, embraced long before September 11th, was the decision to build a national missile defense system to protect the United States against nuclear attack by “rogue nations.” But the fundamental element was a policy of preemptive war, or “offensive deterrence.”
This momentous shift in nuclear policy called, in addition, for programs to build new nuclear weapons and new delivery vehicles; confirmed new missions for nuclear weapons — retaliation for chemical or biological attacks, attacking hardened bunkers unreachable by other weapons — in the post-Cold War world; and listed seven countries (Russia, China, North Korea, Iraq, Iran, Libya, and Syria) for which contingency plans for nuclear attack should be considered. To achieve all these aims, nuclear and conventional, the president asked for an increase in military spending of $48 billion — a sum greater than the total military spending of any other nation.
The sharp turn toward force as the mainstay of the policies of the United States was accompanied by a turn away from treaties and other forms of cooperation. Even before September 11th, the trend had been clear. Now it accelerated. The Bush administration either refused to ratify or withdrew from most of the principal new international treaties of the post-Cold War era. In the nuclear arena alone, the administration refused to submit to the Senate for ratification the Comprehensive Test Ban Treaty, which would have added a ban on underground tests to the existing bans on testing in the air; withdrew from the A.B.M. Treaty, which had severely limited Russian and American deployment of antinuclear defensive systems; and jettisoned the START negotiations as the framework for nuclear reductions with Russia — replacing them with the Strategic Offensive Reduction Agreement, a three-page document requiring two-thirds of the strategic weapons of both sides to be removed from their delivery vehicles, but then stored rather than dismantled.
In addition, the Bush administration withdrew from the Kyoto Protocol of the United Nations Framework Convention on Climate Change, which had become the world’s principal forum for making decisions about reducing emissions that cause global warming; refused to ratify the Rome treaty establishing an international criminal court; and declined to agree to an important protocol for inspection and enforcement of a U.N. convention banning biological weapons.
The consequences of this revolution in American policy rippled through the world, where it found ready imitators. On December 12th, the Indian Parliament was attacked by terrorists whom India linked to Pakistan. Promptly, nuclear-armed India, citing the American policy of attacking not only terrorists but any state that harbored them, moved half a million men to the border of nuclear-armed Pakistan, which responded in kind, producing the first full-scale nuclear crisis of the twenty-first century.
In South Asia, nuclearization did not produce the cautionary effects that the theorists of deterrence expected. High Indian officials openly threatened Pakistan with annihilation. Rajnath Singh, the minister for the state of Uttar Pradesh, declared, “If Pakistan doesn’t change its ways, there will be no sign of Pakistan left,” and when India’s army chief, General S. Padmanabhan, was asked how India would respond if attacked with a nuclear weapon, he answered that “the perpetrator of that particular outrage shall be punished so severely that their continuation thereafter in any form of fray will be doubtful.” In Pakistan, the dictator General Pervez Musharraf stated that, in the event of an Indian conventional invasion of Pakistan, “as a last resort, the atom bomb is also possible.” In March 2002, Israel, citing the same American precedent and calling for U.S. support for its policy on this basis, responded to Palestinian suicide bombings by launching its own “war on terrorism” — a full-scale attack on the Palestinian Authority on the West Bank.
The revolution in American policy had been precipitated by September 11th, but went far beyond any war on terror. It remained to give the policy comprehensive doctrinal expression, which came in an official document, “The National Security Strategy of the United States of America,” issued in September 2002. In the world, it stated, only one economic and political system remained “viable”: the American one of liberal democracy and free enterprise. The United States would henceforth promote and defend this system by the unilateral use of force — preemptively, if necessary. The United States, the president said, “has, and intends to keep, military strengths beyond challenge, thereby making the destabilizing arms races of other eras pointless, and limiting rivalries to trade and other pursuits of peace.”
In other words, the United States reserved the entire field of military force to itself, restricting other nations to humbler pursuits. In the words of the “National Security Strategy,” “Our forces will be strong enough to dissuade potential adversaries from pursuing a military build-up in hopes of surpassing, or equaling, the power of the United States.” If the United States was displeased with a regime, it reserved the right to overthrow it — to carry out “regime change.” “In the world we have entered,” President Bush has said, “the only path to safety is the path of action. And this nation will act.”
A policy of unchallengeable military domination over the earth, accompanied by a unilateral right to overthrow other governments by military force, is an imperial, an Augustan policy. It marks a decisive choice of force and coercion over cooperation and consent as the mainstay of the American response to the disorders of the time. If wars of self-determination and other kinds of local and regional mayhem multiply and run out of control; if the wealthy and powerful use globalization to systematize and exacerbate exploitation of the poor and powerless; if the poor and the powerless react with terrorism and other forms of violence; if the nuclear powers insist on holding on to and threatening to use their chosen weapons of mass destruction; if more nations then develop nuclear or biological or chemical arsenals in response and threaten to use them; if these weapons one day fall, as seems likely, into the hands of terrorists; and if the United States continues to pursue an Augustan policy, then the stage will be set for catastrophe.
Each of these possibilities represents a path of least resistance. Local and regional conflicts have been the way of the world since history began. The spread of nuclear- as well as biological- and chemical-weapon know-how is an automatic function of technical progress, and the spread of nuclear arsenals is a self-feeding process of action and reaction. Continued possession of nuclear weapons by those who already have them is the path of inertia, of deep sleep. The imperial temptation for the United States is the path of arrogance and ignorance.
At the intersection of these tendencies is a Niagara higher and more violent than the one that a heartbroken Henry James lived to witness in 1914. It is of course impossible to predict how and where history might again go over the precipice. It could be nuclear war in South Asia, bringing the deaths of tens of millions of people. It could be the annihilation of one or several cities in the United States in a nuclear terrorist attack, or the loss of millions in a smallpox attack. It could be a war spinning out of control in the Middle East, leading by that route to the use of weapons of mass destruction. It could be war in Korea, or between the United States and China over Taiwan. It could even be — hard as it is to imagine now — intentional or semi-intentional nuclear war between Russia and the United States in some future crisis that we cannot foresee but cannot rule out, either. Or it could be — is even likely to be — some chain of events we are at present incapable of imagining.
After September 11th, people rightly said that the world had changed forever. Before that event, who could have predicted the galloping transformation of the politics of the United States and the world, the escalating regional crises, the vistas of perpetual war? Yet the use of just one nuclear weapon could exceed the damage of September 11th ten-thousandfold.
Would the global economy plunge into outright depression? Would the people of the world flee their menaced cities? Would anyone know who the attacker was? Would someone retaliate — perhaps on a greater scale? Would the staggering shock bring the world to its senses? Would the world at that moment of unparalleled panic and horror react more wisely and constructively than it has been able to do in a time of peace, in comparative calm, or would it fall victim to an incalculable cycle of fear, confusion, hatred, hostility, and violence, both between nations and within them, as it did after 1914 — but this time, in all likelihood, far more swiftly and with incomparably direr consequences? In the face of these questions, predictive powers dim. But attempts at prophecy are in any case the wrong response. Decisions are required.
A new 1914?
The escalation of violence around the world has been so rapid since September 11, 2001, that this day may appear already to have been the August 1914 of the twenty-first century. The parallels are striking. In 2001 as in 1914 a period of political liberalization, economic globalization, and peace (at least in the privileged zones of the planet) was summarily ended by a violent explosion. The fundamental decision now, as it was then, is between force and peaceful means as the path to safety, and the world seems to have made a decision for force.
Again, observers have been compelled, as Henry James was in 1914, to recognize that the immediate past has been a time of illusion — a time when the world was heading toward a precipice but did not know it, or did not care to know it. Again, an unpredictable chain of violent events has been set in motion — some today have even said that a “third world war” is upon us.
And yet, since history does not repeat itself, the analogy between 1914 and 2001, like all measurements of the present with yardsticks from the past, is useful only for querying events, not for predicting them. There are equally important differences between the two moments, some of them obvious, others less so. In 1914, the great powers’ preparations for war were complete. The arms were piled high, the troops massed, the war plans mapped out in detail, the mobilization schedules fixed, the treaties of alliance signed and sealed. Even before the first shot was fired, the whole of the long war to come lay waiting in the file cabinets of the chanceries of Europe, needing only the right incident to spring to life. And when that incident came and the armies were hurled across the borders, no power on earth, including the governments involved, could call them back until the war had run its full bloody course.
Our moment, by contrast, is one of exceptional unpredictability and fluidity. No inexorable timetables or web of alliances among great powers threaten to drag everyone together into a new abyss. The unexpected — new crises, abrupt developments, sudden opportunities — is the order of the day. The strength of the forces that attacked on September 11th is unclear, and appears likely to wax or wane in response to events. The Bush administration has announced a series of wars that it may decide to fight, but there will be points of decision at every step along the way.
Developments in the field can quickly alter political opinion at home. The proliferation of weapons of mass destruction can inhibit as well as provoke war. Elections can bring new people to power. Other countries are watching and waiting, uncertain where and how to bring the weight of their influence to bear. The effect of a series of wars, if such occur, on global economic integration is unknown, and huge uncertainties shadow the economic scene.
As shocking as September 11th was, it was not a decisive catastrophe, but rather a warning. No irrevocable decision has in fact been made. The scope for choice remains unusually large, and the new cycle of violence can still be broken or reversed, and new policies adopted. Seen narrowly, September 11th posed the specific question of how the United States and the civilized world should deal with a global terrorist network ready to commit any crime within its power. That question requires all the urgent attention and action that it is receiving.
At the same time, I submit, we should be asking what the larger and more fundamental decisions for policy may be. If we take this broader approach, the profound changes that have occurred in the character of violence, politics, and power over the last century will command our attention. In 1918 and 1945, a decision in favor of coercive power clearly meant in practice choosing the old war system, and a decision in favor of cooperative power meant choosing to create ex nihilo a Wilsonian system of global collective security based on international law. Today, neither of these alternatives is open to us.
We Won’t Forget Wisconsin - by David Swanson
A new film called Wisconsin Rising is screening around the country, the subject, of course, being the activism surrounding the mass occupation of the Wisconsin Capitol in 2011. I recommend attending a planned screening or setting up a new one, and discussing the film collectively upon its conclusion. For all the flaws in Wisconsin’s activism in 2011 and since, other states haven’t even come close — most have a great deal to learn.
The film tells the story of one state where, long ago, many workers’ rights originated or found early support, and where, many years later, threats to workers’ rights, wages, and benefits — and to what those workers produce, including education in public schools — were aggressively initiated by the state’s right-wing governor, Scott Walker.
The joy and inspiration created by the public resistance to that threat were intense. The occupation, the singing, the marching, the creative props and protests, the donations for pizza from around the world, the parades, the rallies, the concerts, the firefighters and police officers spared in the legislation but choosing to join with the rest of the public anyway, the growing crowds, the growing awareness of the power of nonviolent action, the legislators bringing their desks out onto the grass to meet with constituents in the cold snow or fleeing the state to deny the governor a quorum, Fox News propaganda showing a violent rally supposedly in Wisconsin but with palm trees in the background, the Wisconsinites hauling plastic palm trees to the capitol, the high school students joining the occupation on behalf of their teachers, Governor Walker unable to step outdoors without protest — all of this energy and activity is accurately conveyed in Wisconsin Rising. For over three weeks, Wisconsin’s capitol was occupied, and the reminders of it are still frequently visible there.
The Wisconsin legislature rammed through its horrendous legislation despite the public opposition. The film does not hide that awful defeat. But the same would have happened had there been no opposition. The question is whether the opposition did any good and whether it could conceivably have succeeded had wiser decisions been made — and whether power was tapped that could be enlarged still further. I think the answer to all of these questions is yes.
In the film we see people withdrawing their money from a bank that funds candidates like Walker. That can and should continue.
We see a choice made to withdraw energy from protests and demonstrations and nonviolent resistance and camps and marches and a general strike, in order to put all of that energy into recall elections. The lessons of all of those labor songs sung at all of those rallies are not followed. Instead, an effort is made to pretend that the system works and that slightly better personalities in positions of corrupt power will solve everything. Massive popular energy went into a contest where it could not compete with massive money.
What might have happened instead? Energy could have stayed with the occupation, drawing inspiration from and giving inspiration to activism around the United States and the world. I remember Michael Moore pointing out at the Wisconsin occupation that 400 people in the United States had as much money as half the country, and pundits compelled to admit that it was true. An education campaign about the division and concentration of wealth would have been time better spent. Creative means of keeping working people’s wealth with working people, rather than handing it over to Wall Street, would have been a wiser use of that euphoric enthusiasm.
An effort might also have been made to build even wider state-level solidarity by recognizing the state of Wisconsin, like the other 49 U.S. states, as a victim of a federal budget gone off the deep end of plutocratic plunder and militarism. The federal government does not support education or any other human need, at home or abroad, in remotely the way that it could if it curtailed spending on war preparations, giveaways to corporations and billionaires, or both. What if Wisconsin were to convert from weapons to peaceful industries, tax major federal tax evaders at the state level instead, and call for a Constitutional Convention to recriminalize bribery? What if the money Wisconsinites dump into elections went into setting up and supporting independent media outlets in Wisconsin instead?
What if three enjoyable, energizing, inspiring weeks of effort weren’t seen as a record-long action, but as the opening preview of much longer struggles? What if the pressure were to build back up, and a different direction were chosen this time — the direction of nonviolent resistance rather than naive compliance? Wisconsin, at least, has done its warm-ups. Most states are still in the locker room.
David Swanson wants you to declare peace at http://WorldBeyondWar.org. His new book is War No More: The Case for Abolition. He blogs at http://davidswanson.org and http://warisacrime.org and works for http://rootsaction.org. He hosts Talk Nation Radio. Follow him on Twitter: @davidcnswanson and Facebook.