History
truth is freedom
A people without the knowledge of their past history, origin and culture is like a tree without roots.
Marcus Garvey
March 2023
what used to be
history you should know
america's greatest traitors of the 20th century
a traitor confesses!!!
A former Texas lawmaker, compelled by the news of President Jimmy Carter entering hospice care, has come forward to reveal that there was, in fact, a secret GOP effort in 1980 to prevent Iran from releasing more than 50 American hostages until after that year’s presidential election.
Former Texas Lieutenant Governor Ben Barnes, 85, has told the New York Times that in the summer of 1980, he accompanied his mentor and one-time business partner, former Texas governor John Connally, on a trip to the Middle East during which Connally asked Arab leaders to communicate to Iranian officials that they should not release the hostages before Election Day because, if they waited, Ronald Reagan would offer them a better deal. (Connally, at that point a former Democrat, is also known for being the other person seriously wounded during the assassination of President John F. Kennedy in 1963.)

The infamous hostage crisis, which began on November 4, 1979, and lasted 444 days, was an open political wound for Carter, who went on to lose his reelection bid to Reagan. After Iran released the hostages on Reagan’s Inauguration Day, there were immediate suspicions among Democrats that the Reagan team had somehow sabotaged the Carter administration’s efforts in order to deny Carter a late-campaign political win — which Reagan advisers famously dubbed a potential “October surprise.” The most prominent theories, like the one put forward by former Carter national security aide Gary Sick in the early ’90s, alleged that William Casey, Reagan’s campaign chairman who went on to become the director of the CIA, had orchestrated the sabotage and made a deal with Iran, but as the Times notes, subsequent congressional investigations never turned up proof.

Barnes, a Democrat who was once the speaker of the Texas House of Representatives, told the Times that there was no doubt in his mind that the purpose of Connally’s trip was to get a message to Iran. He also said that they communicated with the Reagan team during the trip, and that after they returned to the U.S., Connally — who wanted to be Reagan’s Secretary of State — briefed Casey on the trip:

Mr. Barnes said he was certain the point of Mr. Connally’s trip was to get a message to the Iranians to hold the hostages until after the election. “I’ll go to my grave believing that it was the purpose of the trip,” he said. “It wasn’t freelancing because Casey was so interested in hearing as soon as we got back to the United States.” Mr. Casey, he added, wanted to know whether “they were going to hold the hostages.” …

They traveled to the region on a Gulfstream jet owned by Superior Oil. Only when they sat down with the first Arab leader did Mr. Barnes learn what Mr. Connally was up to, he said. Mr. Connally said, “‘Look, Ronald Reagan’s going to be elected president and you need to get the word to Iran that they’re going to make a better deal with Reagan than they are Carter,’” Mr. Barnes recalled. “He said, ‘It would be very smart for you to pass the word to the Iranians to wait until after this general election is over.’ And boy, I tell you, I’m sitting there and I heard it and so now it dawns on me, I realize why we’re there.”

“History needs to know that this happened,” Barnes explained to the Times. “I think it’s so significant and I guess knowing that the end is near for President Carter put it on my mind more and more and more. I just feel like we’ve got to get it down some way.” Barnes said he had kept the details of the trip secret because he didn’t “want to look like Benedict Arnold to the Democratic Party.”

Casey and Connally are no longer alive to comment on Barnes’s bombshell, but the Times confirmed Barnes accompanied Connally on a July 1980 trip to six countries in the Middle East, and spoke with four still-living people with whom Barnes had previously shared the story. The Times report stresses that there is no evidence Reagan was aware of the effort, or that Casey directed it, but Barnes’s admission nonetheless provides compelling evidence that Reagan operatives — or at the very least, Connally — did in fact conspire against Carter and U.S. foreign policy for political gain, and may have prevented an earlier release for the hostages.
An angry mob broke into a jail looking for a Black man—then freed him
He called himself Jerry. He was a skilled cabinetmaker in Syracuse, N.Y., before he got a better-paying job making wooden barrels. He was a light-skinned Black man with reddish hair in his early forties, and as far as anyone knew, he didn’t have any family.
But in the eyes of the law, his name was William Henry, and he was another man’s property. On Oct. 1, 1851, the struggle against slavery in the United States centered on this man’s body, and his forceful liberation became a community holiday, “Jerry Rescue Day,” marked with poetry, song and fundraising.

Since 1843, Jerry’s life had been marked by escape. First he fled his enslavement in Missouri. He may have also narrowly avoided recapture in Chicago and Milwaukee, according to one account. During the winter of 1849-1850, he arrived in Syracuse, a city well known for its strong antislavery bent. Even with the high number of White and Black abolitionist leaders and supporters living there, Jerry was still met with at least some racism from co-workers, who saw him as competition. He also had a few run-ins with the law, getting arrested for theft and assault. It isn’t clear how much truth there was to the charges; in any case, he was always soon released.

In late 1850, Congress passed the Fugitive Slave Act, making escape from slavery a federal matter and requiring assistance from local officials in any state, including ones where slavery was illegal. Daniel Webster, a Northern politician who supported the law, predicted a confrontation over its enforcement would happen in Syracuse, according to historian Angela F. Murphy, who wrote a book about the rescue. “He gives this really thundering speech about how the Fugitive Slave Law would be enforced, even in Syracuse,” Murphy told The Washington Post. “He said even at the next national antislavery convention” — set for October in Syracuse — “it’s going to be enforced.”

As September gave way to October, the city was packed, not only with hundreds of abolitionists there for the convention but also with thousands of farmers and their families in town for the county fair. Jerry was working through his lunch break when local police and federal marshals came to detain him. At first, he didn’t resist, probably figuring it would go like his other arrests. Then they arrived at a federal commissioner’s office, and he recognized a White neighbor of his former enslaver. Jerry had been sold in absentia, and the new owner had sent the neighbor up to collect his property.

By this point, a lot of Northern cities had “vigilance committees” — multiracial groups that kept an eye out for slave catchers. One of these committee members spotted Jerry on the way to the office and ran to the church where the convention was being held. Soon, church bells across the city were ringing to alert the whole town. As a crowd gathered outside the office, prominent abolitionists like Gerrit Smith, Rev. Samuel J. May and Rev. Jermain Wesley Loguen — himself a fugitive slave — along with a handful of lawyers pushed their way inside to aid Jerry at a hearing. There wasn’t much they could have done, legally speaking, and most likely everyone knew it.

Before the hearing could even get going, members of the vigilance committee made a first attempt to free Jerry, taking advantage of the chaotic and crowded room to push him outside. He ran down the street, still handcuffed. Authorities caught up to him, roughed him up and tried to take him back to the hearing. A fight broke out between police and the crowd, both sides pulling on Jerry’s body until his clothes were torn off. Eventually, police dragged him, bloodied, back into a cell, where they added leg irons. The sight of the brutality “actually turn[ed] some people into supporters of the move to rescue him,” Murphy said.
Many White residents at the time opposed slavery but preferred a gradual, legal approach rather than an immediate emancipation that almost by definition required violence, or at least the threat of it.

Jerry began to scream. He shouted. He begged for the crowd outside to help him. He was “in a perfect rage, a fury of passion,” May, the abolitionist and a Unitarian minister, recalled later. May was allowed in the cell to calm Jerry, which didn’t work until May made it clear another attempt to free him was in the works.

The hearing resumed at 5:30 p.m. Jerry’s attorneys began raising objections to anything they could to slow it down. Outside, the sun was low in the sky, and the crowd had grown to thousands. Rocks began flying through the windows. After a rock flew past his head, the commissioner adjourned the hearing until the next morning.

Still, the crowd did not disperse; it grew. Some arrived with weapons, others picked up an ax or iron rod from a nearby hardware store with an abolitionist owner. A battering ram appeared. At 8:30 p.m., someone shouted, “Now!” They smashed windows, rammed the doors and pulled bricks right out of the building’s walls. The marshals inside got off a shot or two, hitting no one, before basically giving up. No one was killed, though one marshal suffered a broken arm when he jumped out of a second-story window. Another, hiding inside the cell with the prisoner, opened the door and pushed Jerry out.

The rescuers carried Jerry to a waiting carriage, which rushed him out of town to a safe house, where his chains were removed. Soon he was on the Underground Railroad to Canada, and safety.

Though it hasn’t been a feature of too many history textbooks, the “Jerry Rescue” was national news at the time. In general, Syracuse residents were happy about it, jokingly asking, “Where’s Jerry?” as they passed one another on the street. More than a dozen organizers were eventually indicted, including Loguen, who fled to Canada. He denied the charges and even said he would return to stand trial if authorities would promise not to send him back into slavery. “Jerry Rescue Day” became a feather in abolitionist Syracuse’s cap — residents had defied the Fugitive Slave Act and won! — and the city still memorializes the incident with a statue.

This mob, which broke into a jail to liberate rather than lynch, was not unique. Harriet Tubman herself helped storm a jail to free Charles Nalle near Troy, N.Y., in 1860. In 1854 in Milwaukee, abolitionists stormed a jail and freed Joshua Glover, a formerly enslaved man who had been living in nearby Racine for years. And in Boston that same year, thousands rioted after a failed attempt to free a young man named Anthony Burns. His forced return to Virginia solidified opposition to slavery for many Bostonians, including Ralph Waldo Emerson and Henry David Thoreau. “We went to bed one night old-fashioned, conservative, Compromise Union Whigs and waked up stark mad Abolitionists,” one observer wrote. (Burns was later sold to abolitionists and freed.)

Usually, the violence of the Civil War is said to have begun on April 12, 1861, with shots fired at Fort Sumter in South Carolina. But perhaps it really started with these battles in the North, where the fight for a man’s freedom could not have been more literal.

american democracy!!!
Reversing Roe v. Wade goes against the will of the people. A recent Quinnipiac poll shows that a clear majority support the Supreme Court ruling ensuring a patient’s access to abortion care. That, of course, won’t stop abortion opponents from ruling by minority; it’s exactly what the so-called “pro-lifers” want.
Rule by minority has increasingly become the Republicans’ modus operandi; gerrymandering, voter suppression, and congressional loopholes show they are not shy about staying in power by any means necessary. Now we’re seeing what’s possible when a man like Donald Trump embraces it as the leader of the party. Trump has not hesitated to embrace white nationalists and give racists power—just look at Steve Bannon, Stephen Miller, and Jeff Sessions—which is exactly why it’s prime time for Roe v. Wade to come up on the chopping block. It’s no coincidence that the biggest national threat to abortion rights since Roe is happening under such a racist government.

Have you ever wondered why the “pro-life” movement is so … white? Or perhaps you’ve noticed that they seem incapable of not being racist whenever they pretend to care about Black people to further their extreme agenda. You’re not alone. It turns out that the pro-life movement has been very good about hiding its racist origins. That’s not just because white people tend to be uncomfortable and avoidant when talking about race. It’s because it also exposes the true goal of the movement, which makes their initially confusing hypocrisy incredibly clear.

Fundamentalist Christians and [the KKK] are pretty close, fighting for God and country. Someday we may all be in the trenches together in the fight against the slaughter of unborn children. — John Burt, 1994 New York Times interview

Abortion restrictions have always been political—and about race

During much of the 19th century, abortion was unregulated and business was booming. The industry was doing so well that one famous provider, Madame Restell, invested in one of New York City’s first luxury apartment buildings with her husband. The white, middle-class women who could afford abortions were gaining more control over their bodies and thus having fewer children. This was all happening while the United States was also getting more Catholic and Jewish immigrants. The fears of white women increasingly turning away from doing their “duty” to bear children, coupled with xenophobia, compelled powerful white men to spring into action.

Under the guise of wanting to require a medical license to perform abortions, the American Medical Association (AMA) ran a successful campaign to ban abortion care and put the decision to make exceptions completely in their hands. How did they succeed? They appealed to the racist little hearts of Anglo-Saxon politicians. Back then, “pro-life” racism wasn’t as subtle. The authors of “Abortion, Race, and Gender in Nineteenth-Century America” in the American Sociological Review wrote that “physicians argued that middle-class, Anglo-Saxon married women were those obtaining abortions, and that their use of abortion to curtail childbearing threatened the Anglo-Saxon race.”

Take this excerpt from a book by Dr. Augustus K. Gardner from 1870, for example:

Infanticide is no new crime. Savages have existed in all times, and abortions and destruction of children at and subsequent to birth have been practiced among all barbarous nations of antiquity … The savages of past ages were not better than the women who commit such infamous murders to-day, to avoid the cares, the expense or the duty of nursing and tending a child.

Here we see how framing abortion as murder came from racist propaganda. Dr. Gardner talked about barbaric peoples—Indians, Greeks, and Chinese, for example—that supposedly partook in infanticide.
He uses this in an attempt to shame women from seeking abortions, calling them no better than these “savages.” Political anti-abortion rhetoric began with this message: abortion is for other people. Non-white people.

Yet even back then, there was no consensus among conservatives or Christians about abortion’s morality. However, the disproportionate amount of power that rich white men had in the country—as doctors and politicians—allowed this minority to execute its will on the people (sound familiar?). While the 19th-century racists succeeded in getting a nationwide abortion ban, that pesky desire from women for autonomy kept rearing its head. It’s almost as if when you keep oppressing people, they will eventually want more rights—no matter how hard you try!

No wonder they hated Margaret Sanger, the founder of Planned Parenthood. She published a feminist magazine in 1914 that advocated for reproductive freedom—exactly what racist white men didn’t want embraced by women. The smears and attacks against her continue today as conservatives try to paint her as the racist. The truth is that she was a proponent of eugenics, but was staunchly against its use for racist means. At the Jewish Woman’s Archive, Open Society Institute fellow Ellen Chasler explains: “She distinguished between individual applications of eugenic principles and cultural ones and spoke out against immigration prohibitions that promoted ethnic or racial stereotypes with a biological rationale. She saw birth control as an instrument of social justice, not of social control.”

In fact, Sanger worked with activists of color like W.E.B. Du Bois and Japanese feminist Shizue Kato—people conservatives today would undoubtedly disparage. Dr. Martin Luther King even once said, in “Family Planning—A Special and Urgent Concern,” that “there is a striking kinship between our [civil rights] movement and Margaret Sanger’s early efforts.” While there’s no excuse for Sanger’s support of the eugenics movement, it does show that the facts have been distorted by a white racist movement that undoubtedly has people who would agree with her eugenic statements today. Even in Sanger’s time, white supremacists still couldn’t agree on whether to support birth control or not. Some saw it as a possible means to keep “undesirables” from reproducing, while others feared that Anglo-Saxon white women would embrace it too much and significantly lower their birth rate.

How racism brought Republicans and white evangelicals together

Evangelical leaders tried to influence Carter to seek a constitutional amendment to overturn Roe v. Wade. He refused, so they looked to the other party. The Republican Party’s sexism dovetailed nicely with racist anti-abortion policies, and support for such an amendment was made part of the party’s platform. And thus, the GOP adopted the language proposed by the power-hungry white evangelicals and officially became the “pro-life” party with candidate Ronald Reagan as the leader.

Reagan was a good bet. He had name recognition with his acting career before entering politics. And, like Falwell and friends, Reagan lamented the advancement of civil rights for Black people. Reagan had no problem catering to racists, pushing the “welfare queen” myth and calling the Voting Rights Act “humiliating to the South.” Oh, and he was endorsed by the KKK—twice. At first glance, Reagan seemed to be the least likely ally for the anti-choice movement.
When he was California’s governor, he signed the least restrictive abortion access bill in the country. Carter had a documented history of being anti-abortion, both in his personal and political life. However, it was Carter’s refusal to bend to the political will of the powerful white evangelical men that was seen as the biggest liability. Reagan’s landslide win solidified the religious right’s political strength. Falwell, Dobson, and Weyrich had succeeded in making their racist political goals viable enough to get millions to vote for their preferred candidate who’d get rid of abortion and keep the brown and Black people from taking over. Since then, the political power of white evangelicals in the United States has only gotten stronger.
...Mentioning my research to others repeatedly provoked questions about Africa, not America. They obviously assumed that a scholar working on the slave trade must be working on the trade that brought millions of Africans to the Western Hemisphere via the terrifying Atlantic Ocean crossing known as the Middle Passage.
They did not appear to know that by the time slavery ended in 1865, more than 1 million enslaved people had been forcibly moved across state lines in their own country, or that hundreds of thousands more had been bought and sold within individual states.
Americans continue to misunderstand how slavery worked and how vast was its reach – even as the histories of race and slavery are central to ongoing public conversations.
Indifference to suffering
Enslaved people were bought and sold within the boundaries of what is now the United States dating back to the Colonial era. But the domestic slave trade accelerated dramatically in the decades after 1808.
That year, Congress outlawed the importation of enslaved people from overseas, and it did so at a moment when demand for enslaved laborers was booming in expanding cotton and sugar plantation regions of the lower South.
Growing numbers of professional slave traders stepped forward to satisfy that demand. They purchased enslaved people primarily in upper South states like Maryland and Virginia, where a declining tobacco economy left many slaveholders with a surplus of laborers. Traders then forced those enslaved people to migrate hundreds of miles over land and by ship, selling them in Alabama, Mississippi, Louisiana and other states where traders hoped to turn a profit.
The domestic slave trade was a brutal and violent business. Enslaved people lived in constant fear that they or their loved ones would be sold.
William Anderson, who was enslaved in Virginia, remembered seeing “hundreds of slaves pass by for the Southern market, chained and handcuffed together.” Years after he fled the South, Anderson wrote of “wives taken from husbands and husbands from wives, never to see each other again – small and large children separated from their parents,” and he never forgot the sounds of their sorrow. “O, I have seen them and heard them howl like dogs or wolves,” he recalled, “when being under the painful obligation of parting to meet no more.”
Slave traders were largely indifferent to the suffering they caused. Asked in the 1830s whether he broke up slave families in the course of his operations, one trader admitted that he did so “very often,” because “his business is to purchase, and he must take such as are in the market.”
‘So wicked’
Domestic slave traders initially worked mostly out of taverns and hotels. Over time, an increasing number of them established offices, showrooms and prisons where they held enslaved people whom they intended to sell.
By the 1830s, the domestic slave trade was ubiquitous in the slave states. Newspaper advertisements blared “Cash for Negroes.” Storefront signs announced that “dealers in slaves” were inside. At ports and along roads, travelers reported seeing scores of enslaved people in chains.
Meanwhile, the money the trade generated and the credit that financed it circulated throughout the country and across the Atlantic, as even European banks and merchants looked to share in the gains.
The more visible the trade became, the more antislavery activists made it a core of their appeals. When abolitionist editor Benjamin Lundy, for example, asked white Americans in the 1820s how long they could look at the slave trade and “permit so disgraceful, so inhuman, and so wicked a practice to continue in our country, which has been emphatically termed THE HOME OF THE FREE,” he was one among a rising chorus.
But abolitionists made little headway. The domestic slave trade ended only when slavery ended in 1865.
Propaganda obscures history
Vital to the American economy, important to American politics and central to the experience of enslaved people, the domestic slave trade was an atrocity carried out on a massive scale. As British traveler Joseph Sturge noted, by the 1840s, the entire slaveholding portion of the United States could be characterized by division “into the ‘slave-breeding’ and ‘slave-consuming’ States.”
Yet popular historical knowledge of the domestic trade remains hazy, thanks largely to purposeful forgetting and to a propaganda campaign that began before the Civil War and continued long past its conclusion.
White Southerners made denial about the slave trade an important tenet in their defense of slavery. They claimed that slave sales were rare, that they detested the slave trade and that traders were outcasts disdained by respectable people.
Kentucky minister Nathan Lewis Rice’s assertion in 1845 that “the slave-trader is looked upon by decent men in the slave-holding States with disgust” was such a common sentiment that even white Northerners sometimes parroted it. Nehemiah Adams, for example, a Massachusetts resident who visited the South in 1854, came away from his time in the region believing that “Negro traders are the abhorrence of all flesh.”
Such claims were almost entirely lies. But downplaying the slave trade became a standard element of the racist mythology embedded in the defense of the Confederacy known as the Lost Cause, whose purveyors minimized slavery’s significance as they discounted its role in bringing about the Civil War.
And while the Confederacy may have lost on the battlefield, its supporters arguably triumphed in the cultural struggle to define the war and its meaning. Well into the 20th century, significant numbers of white Americans throughout the country accepted and embraced the notion that slavery had been relatively benign.
As they did so, the devastations of the domestic slave trade became buried beneath comforting fantasies of moonlight and magnolias evoked by movies like “Gone With the Wind.”
Recent years have seen monuments to the Confederacy coming down in cities and towns across the country. But the struggle over how Americans remember and talk about slavery, now perhaps more heated and controversial than ever, arguably remains stuck in terms that are legacies of the Lost Cause.
Slavery still conjures images of Southern farms and plantations. But the institution was grounded in the sales of nearly 2 million human beings in the domestic slave trade, the profits from which nurtured the economy of the entire country.
Until that history makes its way more deeply into our popular memory, it will be impossible to come to terms with slavery and its significance for the American past and present.
Historian: Republican culture war fight driven by need to hide a basic fact about American history
History News Network
July 19, 2021
Critical Race Theory (CRT) has become a lightning rod for conservative ire at any discussion of racism, anti-racism, or the non-white history of America. Across the country, bills in Republican-controlled legislatures have attempted to prevent the teaching of CRT, even though most of those against CRT struggle to define the term. CRT actually began as a legal theory which held simply that systemic racism was consciously created, and therefore, must be consciously dismantled. History reveals that the foundation of America, and of systemic racism, happened at the same time and from the same set of consciously created laws.
Around the 20th of August, 1619, the White Lion, an English ship sailing under a Dutch flag, docked off Old Point Comfort (near present-day Hampton), in the British colony of Virginia, to barter approximately 20 Africans for much needed food and supplies. The facts of the White Lion's arrival in Virginia, and her human cargo, are generally not in dispute. Whether those first Africans arriving in America were taken by colonists as slaves or as indentured servants is still debated. But by the end of the 17th century, a system of chattel slavery was in place in colonial America. How America got from uncertainty about the status of Africans, to certainty that they were slaves, is a transition that highlights the origins of systemic racism.
Three arguments have been put forth about whether the first Africans arriving in the colonies were treated as indentured servants or as slaves. One says that European racism predisposed American colonists to treat these Africans as slaves. Anthony and Isabella, for example, two Africans aboard the White Lion, were acquired by Captain William Tucker and listed at the bottom of his 1624/25 muster (census) entry, just above his real property, but below white indentured servants and Native Americans.
A second argument counters that racism was not, at first, the decisive factor but that the availability of free labor was. "Before the invention of the Negro or the white man or the words and concepts to describe them," historian Lerone Bennett wrote, "the Colonial population consisted largely of a great mass of white and black [and native] bondsmen, who occupied roughly the same economic category and were treated with equal contempt by the lords of the plantations and legislatures."
In this view, slavery was not born of racism, but racism was born of slavery. Early colonial laws had no provisions distinguishing African from European servants, until those laws began to change toward the middle of the 17th century, when Africans became subject to more brutal treatment than any other group. Proponents of this second argument point to cases like Elizabeth Key in 1656, or Phillip Corven in 1675, Black servants who sued in different court cases against their white masters for keeping them past the end of their indentures. Both Key and Corven won. If slavery was the law, Key and Corven would have had no standing in court much less any hope of prevailing.
Still, a third group stakes out slightly different ground, separating Africans into two groups: the first generation that arrived before the middle of the 17th century, and those that arrived after. For the first generations of Africans, English and Dutch colonists had the concept of indefinite, but not inheritable, bondage. For those who came after, colonists applied the concept of lifetime, inheritable bondage. Here, the 1640 case of John Punch, a Black man caught with two other white servants attempting to run away, is often cited. As punishment, all the men received thirty lashes, but the white servants had only one year added to their indentures, while John Punch was ordered to serve his master "for the time of his natural life." For this reason, many consider John Punch the first real slave in America. Or was he the last Black indentured servant?
Clearly these cases show the ambiguity, or "loopholes," of the system separating servitude from slavery in early America. What is also clear is that one by one these loopholes were closed through conscious intent of colonial legislatures. In this reduction of ambiguity over the status of Africans, the closure of loopholes between servitude and slavery, are the roots of systemic racism.
Maryland enacted a first-of-its-kind law in 1664, specifically tying being Black to being a slave. "[A]ll Negroes or other slaves already within the Province And all Negroes and other slaves to be hereafter imported into the Province shall serve Durante Vita." Durante Vita is a Latin phrase meaning for the duration of one's life.
Another loophole concerned the status of children. Colonial American law was initially derived from English common law, where the status of a child (whether bound or free) followed the status of the father. But adherence to English common law posed problems in colonial America, as revealed in the 1630 case of Hugh Davis, a white man sentenced to whipping "for abusing himself to the dishonor of God and shame of Christians, by defiling his body in lying with a negro..." Whipping proved no deterrent for such interracial unions between a free European and a bound African. If English common law was followed, then the child of such a liaison would be free. So, in the years following Davis' whipping, the legislatures in Maryland and Virginia enacted statutes providing that the status of the child, whether slave or free, followed that of the mother.
But closing this loophole assumes that only the sexual exploits of European men needed containing. The famous and well-documented case of the Irish milkmaid Molly Welsh, who worked off her indenture in Maryland, shows the reverse actually happened as well. Welsh purchased a slave named Banna Ka, whom she eventually freed, then married. They had a girl named Mary, who was free. Mary married a runaway slave named Thomas, and they had a boy named Benjamin, who was also free. And Benjamin Banneker, a clockmaker, astronomer, mathematician, and surveyor, became an important figure in African American history, having authored a letter to Thomas Jefferson lamenting that the lofty ideals of liberty and equality contained in the nation's founding documents were not extended to all citizens regardless of color.
Closing the religious exemption was another way in which colonial legislatures sought to separate Blacks from whites, and force slavery only on people of African descent. One of the reasons Elizabeth Key prevailed in court was that she asserted she could not be held in slavery as a Christian. In fact, there was a widespread belief in early America that Christians holding other Christians in slavery went against core biblical teachings.
Most first-generation Africans in colonial America came from the Angola-Congo region of West Africa, first taken there by the Portuguese. Christianity was well known and practiced by Africans in these regions as early as the 15th century. So, many Africans destined for slavery, or indentured servitude in America, were already baptized, or were christened by priests aboard Portuguese slave trading vessels.
Colonial legislatures got busy. Maryland updated the 1664 law, cited above, with a 1671 statute that specifically carved out a religious exception for people of African descent. Regardless of whether they had become Christian, or received the sacrament of baptism, they would "hereafter be adjudged, reputed, deemed, and taken to be and remain in servitude and bondage" forever. Acts like this led to a tortured, convoluted American Christianity, developed to support slavery, and this legacy of racism within American Christianity continues to this day.
Apprehension of runaway servants and slaves was still another area in which colonial legislatures targeted people of color for differential, oppressive treatment. While granting masters the right to send a posse after runaways, a 1672 Virginia statute called "An act for the apprehension and suppression of runawayes, negroes and slaves," granted immunity to any white person who killed or wounded a runaway person of color while in pursuit of them. It read:
"Be it enacted by the governour, councell and burgesses of this grand assembly, and by the authority thereof, that if any negroe, molatto, Indian slave, or servant for life, runaway and shalbe persued by warrant or hue and crye, it shall and may be lawfull for any person who shall endeavour to take them, upon the resistance of such negroe, mollatto, Indian slave, or servant for life, to kill or wound him or them soe resisting."
Acts like this became the basis for slave patrols, and for the police forces that arose from them. Today, we still deal with the consequences of "qualified immunity," stemming from ideas like these enacted in 1672, which shield police from prosecution in cases of violence and brutality, especially against people of color.
Protection of southern rights even found its way into the Constitution. The Second Amendment protects the right of militias (a polite term for "slave patrols") to organize and bear arms. The Fugitive Slave Clause (never repealed) guaranteed southern slaveholders that their slaves apprehended in the North would be returned. Even the Interstate Commerce Clause gave Southerners traveling North with their slaves assurance that those slaves would not automatically become free by setting foot in states that outlawed slavery.
Though enacted centuries ago, the laws cited above are representative of the many laws that came to define American jurisprudence and have at their core the repression and oppression of Black Americans and other people of color. This is why Chief Justice Roger B. Taney, writing for the U.S. Supreme Court in 1857, handed down a 7-2 decision in the Dred Scott case, with the words that Blacks had "no rights which the white man was bound to respect." This is why critical race theory states that systemic racism was consciously created, as these laws and their enforcement show it was.
But this is also why Republican legislators and their supporters lump anything and everything having to do with diversity, equity, and inclusion into the box of critical race theory, then try to keep it out of schools and public institutions. They're afraid of Americans being told the truth: that the foundation of America, and of systemic racism, happened at the same time and from the same consciously created laws. In this way, these individuals are actually living proof of the validity of critical race theory, because they seek to consciously enact laws today which perpetuate the racial inequality established by laws enacted hundreds of years ago.
White Supremacy Was on Trial at Washington and Lee University. It Won.
BY BRANDON HASBROUCK - SLATE
JUNE 07, 2021, 5:06 PM
After the end of the Civil War, Robert E. Lee, the general who commanded the army of the Confederacy, was never tried, convicted, or sentenced for any crimes—not treason, not murder, not torture. Instead, he became president of Washington College, where he attracted students molded in his image, inspired by his lost cause, and motivated to maintain racial hierarchy. Under Lee’s leadership, his students would, among other things, form a KKK chapter and harass and assault Black school children. The board of trustees of Washington College honored this legacy when it decided to rename its institution Washington and Lee University.
Lee is the embodiment of white supremacy—he lived a life, as I previously argued, committed to racial subjugation and terror. He fought to enslave Black people—so the Confederate States of America could continue to profit on Black labor and Black pain while creating an antidemocratic state founded upon white supremacy. For this reason many stakeholders asked the current board of trustees of Washington and Lee University, where I am an assistant professor of law at the law school, to remove Lee as a namesake. After significant and critical national attention, Lee was finally put on trial at the place where his body is buried. Not guilty, the board of trustees announced on Friday. The vote was not even close—a supermajority of trustees (22 out of 28 trustees or 78 percent) voted to retain Lee as a namesake. That vote, however, did more. It signaled that Washington and Lee University will continue to shine as a beacon of racism, hate, and privilege.
In response to the board’s decision, the university’s president released a statement. He declared that Lee’s name does not define the university or its stakeholders; rather “we define it.” But we cannot engage in historical revisionism to redefine Lee’s name, nor should we. The board announced its commitment to “repudiating racism, racial injustice, and Confederate nostalgia.” But we cannot hope to make consequential change until we accept the truth of what Lee’s name means.
The jury at Washington and Lee harkens back to Jim Crow juries—white, male, privileged, and rigged. The jury, composed of 28 trustee members, was mostly white (25 trustees) and mostly male (23 trustees). Many of the witnesses supporting Lee were white, as were many of the big donors who threatened to withhold contributions if Lee’s name was removed. The outcome was never in doubt.
White supremacy has been put on trial before throughout our history. The outcomes in those trials were also predictable. The “Indian Removal Act” ensured white officials could never be found guilty of stealing Native land and committing genocide on the Trail of Tears. White insurrectionists in Wilmington, North Carolina, murdered Black residents, destroyed Black-owned businesses, and then ousted Wilmington’s anti-segregation, pro-equal rights government to insulate themselves against accountability. The United States Supreme Court sanctioned Japanese internment during World War II. An all-white, all-male jury found Emmett Till’s murderers not guilty after 67 minutes of deliberation. Los Angeles cops were acquitted of bludgeoning Rodney King after the jury watched the tape more than 30 times.
When our racial ghosts are on trial, we know the outcome. When truth and justice are on trial, we know the outcome. When our country is asked to reject a revisionist tissue of historical lies, we know the outcome. White supremacy wins. White supremacy remains adaptable, persistent, violent, and nearly undefeated.
It inspires an insurrection. It introduces 389 restrictive voting rights bills in 48 states over the past six months. It forbids schools from giving a true accounting of our history—a history of racial violence, from the Trail of Tears, to Black Wall Street, to extrajudicial killings including those of Emmett Till, George Floyd, and Breonna Taylor. It allows Washington and Lee University to keep Lee as a namesake because it is safer to benefit from white supremacy than to summon the courage to even appear antiracist.
Historical revisionism shelters white supremacy. It entrenches white supremacy. It emboldens white supremacy. We need truth and reconciliation in America. We must face our past head on and acknowledge it for what it was: oppression and racial terror fueled by white supremacy. Only then can we start to reimagine our democratic institutions as more—more just, more fair, more equal. Only then will we build the capacity, the resolve, and the collective will to find white supremacy guilty.
America's History of Slavery Began Long Before Jamestown
The arrival of the first captives to the Jamestown Colony, in 1619, is often seen as the beginning of slavery in America—but enslaved Africans arrived in North America as early as the 1500s.
CRYSTAL PONTI - HISTORY.COM
UPDATED: AUG 26, 2019 | ORIGINAL: AUG 14, 2019
In late August 1619, the White Lion, an English privateer commanded by John Jope, sailed into Point Comfort and dropped anchor in the James River. Virginia colonist John Rolfe documented the arrival of the ship and “20 and odd” Africans on board. His journal entry is immortalized in textbooks, with 1619 often used as a reference point for teaching the origins of slavery in America. But the history, it seems, is far more complicated than a single date.
It is believed the first Africans brought to the colony of Virginia, 400 years ago this month, were Kimbundu-speaking peoples from the kingdom of Ndongo, located in part of present-day Angola. Slave traders forced the captives to march several hundred miles to the coast to board the San Juan Bautista, one of at least 36 transatlantic Portuguese and Spanish slave ships.
The ship embarked with about 350 Africans on board, but hunger and disease took a swift toll. En route, about 150 captives died. Then, when the San Juan Bautista approached what is now Veracruz, Mexico, in the summer of 1619, it encountered two ships, the White Lion and another English privateer, the Treasurer. The crews stormed the vulnerable slave ship and seized 50 to 60 of the remaining Africans. Afterward, the pair sailed for Virginia.
As noted by Rolfe, when the White Lion arrived in present-day Hampton, Virginia, the Africans were offloaded and “bought for victuals.” Governor Sir George Yeardley and head merchant Abraham Piersey acquired the majority of the captives, most of whom were kept in Jamestown, America’s first permanent English settlement.
The arrival of these “20 and odd” Africans to England’s mainland American colonies in 1619 is now a focal point in history curricula. The date and their story have become symbolic of slavery’s roots, despite captive Africans likely being present in the Americas in the 1400s and as early as 1526 in the region that would become the United States.
Some experts, including Michael Guasco, a professor at Davidson College and author of Slaves and Englishmen: Human Bondage in the Early Modern Atlantic World, caution about placing too much emphasis on the year 1619.
“To ignore what had been happening with relative frequency in the broader Atlantic world over the preceding 100 years or so understates the real brutality of the ongoing slave trade, of which the 1619 group were undoubtedly a part, and minimizes the significant African presence in the Atlantic world to that point,” Guasco explains. “People of African descent have been ‘here’ longer than the English colonies.”
Africans had a notable presence in the Americas before colonization
Prior to 1619, hundreds of thousands of Africans, both free and enslaved, aided the establishment and survival of colonies in the Americas and the New World. They also fought against European oppression, and, in some instances, hindered the systematic spread of colonization.
Christopher Columbus likely transported the first Africans to the Americas in the late 1490s on his expeditions to Hispaniola, now part of the Dominican Republic. Their exact status, whether free or enslaved, remains disputed. But the timeline fits with what we know of the origins of the slave trade.
European trade of enslaved Africans began in the 1400s. “The first example we have of Africans being taken against their will and put on board European ships would take the story back to 1441,” says Guasco, when the Portuguese captured 12 Africans in Cabo Branco—modern-day Mauritania in north Africa—and brought them to Portugal as enslaved people.
In the region that would become the United States, there were no enslaved Africans before the Spanish occupation of Florida in the early 16th century, according to Linda Heywood and John Thornton, professors at Boston University and co-authors of Central Africans, Atlantic Creoles and the Foundation of the Americas, 1585-1660.
“There were significant numbers who were brought in as early as 1526,” says Heywood. That year, some of these enslaved Africans became part of a Spanish expedition to establish an outpost in what is now South Carolina. They rebelled, preventing the Spanish from founding the colony.
The uprising didn’t stop the inflow of enslaved Africans to Spanish Florida. “We don’t know how many followed, but there was certainly a slave population around St. Augustine," says Heywood.
Africans also played a role in England's early colonization efforts. Enslaved Africans may have been on board Sir Francis Drake’s fleet when he arrived at Roanoke Island in 1586 and failed to establish the first permanent English settlement in America. He and his cousin, John Hawkins, made three voyages to Guinea and Sierra Leone and enslaved between 1,200 and 1,400 Africans.
Although not part of present-day America, Africans from the West Indies were also present in the English colony of Bermuda in 1616, where they provided expert knowledge of tobacco cultivation to the Virginia Company.
Focusing on the English colonies omits the global nature of slavery
From an Anglo-American perspective, 1619 is considered the beginning of slavery, just like Jamestown and Plymouth symbolize the beginnings of "America" from an English-speaking point of view. But divorcing the idea of North America's first enslaved people from the overall context of slavery in the Americas, especially when the U.S. was not formed for another 157 years, is not historically accurate.
“We would do well to remember that much of what played out in places like Virginia were the result of things that had already happened in Mexico, Central America, the Caribbean, Peru, Brazil and elsewhere,” says Guasco.
“The English took note of their fellow Europeans’ role in enslavement and the slave trade,” says Mark Summers, a public historian at Jamestown Rediscovery. In the context of the broader Atlantic world, the colony and institution of slavery developed from a chain of events involving multiple actors.
Still, U.S. school curricula tend to ignore much of what happened in the Atlantic prior to the Jamestown settlement and also the colonial projects of other countries that became part of America, such as Dutch New York, Swedish Delaware and French-Spanish Louisiana and Florida. “There is both an Anglo-centrism and east coast bias to much of traditional American history,” says Summers.
While Heywood and Thornton acknowledge that 1619 remains a key date for slavery in America, they also argue that focusing too much on the first enslaved people at Jamestown can distort our understanding of history. “It does so by failing to understand that the development of slavery was a gradual process, and that laws other than English laws applied,” says Thornton.
In 1619, slavery, as codified by law, did not yet exist in Virginia or elsewhere in places that would later become the United States.
But any question about the status of Black people in the colonies—free, enslaved or indentured servants—was made clear with the passage of the Virginia Slave Codes of 1705, a series of laws that stripped away legal rights and legalized the barbaric and dehumanizing nature of slavery.
As Guasco puts it, “The Spanish, Portuguese and English were co-conspirators in what we would now consider a crime against humanity.”
There was a time reparations were actually paid out – just not to formerly enslaved people
THE CONVERSATION
February 26, 2021 8.26am EST
The cost of slavery and its legacy of systemic racism to generations of Black Americans has been clear over the past year – seen in both the racial disparities of the pandemic and widespread protests over police brutality.
Yet whenever calls for reparations are made – as they are again now – opponents counter that it would be unfair to saddle a debt on those not personally responsible. In the words of then-Senate Majority Leader Mitch McConnell, speaking on Juneteenth – the day Black Americans celebrate as marking emancipation – in 2019, “I don’t think reparations for something that happened 150 years ago for whom none of us currently living are responsible is a good idea.”
As a professor of public policy who has studied reparations, I acknowledge that the figures involved are large – I conservatively estimate the losses from unpaid wages and lost inheritances to Black descendants of the enslaved at around US$20 trillion in 2021 dollars.
But what often gets forgotten by those who oppose reparations is that payouts for slavery have been made before – numerous times, in fact. And few at the time complained that it was unfair to saddle generations of people with a debt for which they were not personally responsible.
There is an important caveat in these cases of reparations though: The payments went to former slave owners and their descendants, not the enslaved or their legal heirs.
Extorting Haiti
A prominent example is the so-called “Haitian Independence Debt” that saddled revolutionary Haiti with reparation payments to former slave owners in France.
Haiti declared independence from France in 1804, but the former colonial power refused to acknowledge the fact for another 20 years. Then in 1825, King Charles X decreed that he would recognize independence, but at a cost. The price tag would be 150 million francs – more than 10 years of the Haitian government’s entire revenue. The money, the French said, was needed to compensate former slave owners for the loss of what was deemed their property.
By 1883, Haiti had paid off some 90 million francs in reparations. But to finance such huge payments, Haiti had to borrow 166 million francs from the French banks Ternaux Grandolpe et Cie and Lafitte Rothschild Lapanonze. Loan interest and fees added to the overall sum owed to France.
The payments ran for a total of 122 years from 1825 to 1947, with the money going to more than 7,900 former slave owners and their descendants in France. By the time the payments ended, none of the originally enslaved or enslavers were still alive.
British ‘reparations’
French slave owners weren’t the only ones to receive payment for lost revenue; their British counterparts did too – but this time from their own government.
The British government paid reparations totaling £20 million (equivalent to some £300 billion in 2018) to slave owners when it abolished slavery in 1833. Banking magnates Nathan Mayer Rothschild and his brother-in-law Moses Montefiore arranged for a loan to the government of £15 million to cover the vast sum – which represented almost half of the U.K. government’s annual expenditure.
The U.K. serviced those loans for 182 years from 1833 to 2015. The authors of the British reparations program saddled many generations of British people with a reparations debt for which they were not personally responsible.
Paying for freedom
In the United States, reparations to slave owners in Washington, D.C., were paid at the height of the Civil War. On April 16, 1862, President Abraham Lincoln signed the “Act for the Release of certain Persons held to Service or Labor within the District of Columbia” into law.
It gave former slave owners $300 per enslaved person set free. More than 3,100 enslaved people saw their freedom paid for in this way, for a total cost in excess of $930,000 – almost $25 million in today’s money.
In contrast, the formerly enslaved received nothing if they decided to stay in the United States. The act provided for an emigration incentive of $100 – around $2,683 in 2021 dollars – if the formerly enslaved agreed to permanently leave the United States.
US Capitol protesters, egged on by Trump, are part of a long history of white supremacists hearing politicians’ words as encouragement
the conversation
January 7, 2021 1.02pm EST
...During Reconstruction, the post-Civil War period of forming interracial governments and reintegrating former Confederate states into the Union, white city and state leaders in the South tacitly encouraged violence against black voters by state militias and groups like the Ku Klux Klan. They did it in a way that allowed those leaders to look innocent of any crimes.
Those groups used that chaos to end federal power in their states and reestablish white-dominated Southern state governments.
Today, white supremacists hope the political chaos they contribute to will lead to race war and the creation of their own white nation.
Reconstruction violence
Moments of changing social and political power in U.S. history have led to clashes – often armed – between white supremacists and interracial alliances over voting rights.
That history includes the period following the Civil War, when white supremacist organizations saw the postwar rule over Southern states of Radical Republicans and the federal government as illegitimate. They wanted to return to the prewar status quo of slavery by another name and white supremacist rule.
As a historian of protests and Reconstruction, I study how those paramilitary groups or self-proclaimed “regulators” consequently spread fear and terror among black and white Republican voters with the support of the anti-black Democratic Party in Southern states.
They targeted elections and vowed to “carry the election peaceably if we can, forcibly if we must.”
Still, many courageous black and white voters fought back by forming political organizations, daring to vote and assembling their own armed guards to protect themselves.
‘Gentlemen of property and standing’
Then, as today, white supremacists received encouraging signals from powerful leaders.
In the 19th century, “gentlemen of property and standing” often led or indirectly supported anti-abolition mobs, slave patrols, lynch mobs or Klan attacks.
Federal investigators in Kentucky in 1867 found that “many men of wealth and position” rode with the armed groups. One witness in the federal investigation testified that “many of the most respectable men in the county belong in the ‘Lynch’ party.” Future South Carolina Governor and U.S. Senator “Pitchfork” Ben Tillman reflected on his participation in the Hamburg massacre of 1876, arguing that “the leading men” of the area wanted to teach black voters a lesson by “having the whites demonstrate their superiority by killing as many as was justifiable.” At least six black men were killed in the Hamburg attack on the black South Carolina militia by the Red Shirts, a white rifle club.
White supremacists knew that they would not face consequences for their violence.
An agent of the federal Freedmen’s Bureau – set up by Congress in 1865 to help former slaves and poor whites in the South – stated that the “desperadoes” received encouragement and were “screened from the hands of justice by citizens of boasted connections.”
President Ulysses S. Grant condemned the Hamburg massacre, arguing that some claimed “the right to kill negroes and Republicans without fear of punishment and without loss of caste or reputation.”
Facing community pressure, and without the presence of the U.S. Army to enforce laws, local sheriffs and judges refused or were unable to enforce federal laws.
Witnesses were often afraid to challenge local leaders for fear of attack. The “reign of terror” was so complete that “men dare not report outrages and appear as witnesses.”
When the U.S. District Court in Kentucky brought charges against two men for lynching in 1871, prosecutors could not find witnesses willing to testify against the accused. The Frankfort Commonwealth newspaper wrote, “He would be hung by a [mob] inside of twenty-four hours, and the dominant sentiment … would say ‘served him right.’”
State militias
As Southern states threw off federal military occupation and elected their own white-dominated governments, they no longer had to rely solely on white terror organizations to enforce their agenda.
Instead, these self-described “redeemers” formed state-funded militias that served similar functions of intimidation and voter suppression with the support of prominent citizens.
At political rallies and elections throughout the South, official Democratic militias paraded through towns and monitored polling stations to threaten black and white Republican voters, proclaiming that “this is our country and we intend to protect it or die.”
In 1870 the Louisville Commercial newspaper argued, “We have, then, a militia for the State of Kentucky composed of members of one political party, and designed solely to operate against members of another political party. These militia are armed with State guns, are equipped from the State arsenal, and to a man are the enemies of the national government.”
By driving away Republican voters and claiming electoral victory, these Democratic leaders gained power through state-supported militia violence.
White militias and paramilitary groups also confiscated guns from black citizens who tried to protect themselves, claiming “We did not think they had a right to have guns.”
White terror groups and their allies in law enforcement were especially hostile to politically active black Union veterans who returned home with their military weapons. Local sheriffs confiscated weapons and armed bands raided homes to destroy their guns.
Guerrilla race war
During Reconstruction, paramilitary groups and official Democratic militias found support from county sheriffs up to state governors who encouraged violence while maintaining their own innocence.
Today, white supremacists appear to interpret politicians’ remarks as support for their cause of a new civil war to create a white-dominated government.
These groups thrive on recent protests against stay-at-home orders, especially the ones featuring protesters with guns, creating an intimidating spectacle for those who support local and state government authority.
Beyond “dog whistle” politics, as in the past, these statements – and the actions encouraged by them – can lead to real violence and hate crimes against any who threaten supremacists’ concept of a white nation.
The complicated legacy of the Pilgrims is finally coming to light 400 years after they landed in Plymouth
the conversation
September 4, 2020 8.21am EDT
The 400th anniversary of the Pilgrims’ voyage to Plymouth will be celebrated on both sides of the Atlantic with a “remembrance ceremony” with state and local officials and a museum exhibit in Plymouth, England. IBM has even outfitted a replica of the Mayflower with an AI navigating system that will allow the ship to trace the course of the original journey without any humans on board.
Yet as a scholar of early 17th-century New England, I’ve always been puzzled by the glory heaped on the Pilgrims and their settlement in Plymouth.
Native Americans had met Europeans in scores of places before 1620, so yet another encounter was hardly unique. Relative to other settlements, the colony attracted few migrants. And it lasted only 70 years.
So why does it have such a prominent place in the story of America? And why, until recently, did the more troubling aspects to Plymouth and its founding document, the Mayflower Compact, go ignored?
Prophets and profits
The establishment of Plymouth did not occur in a vacuum.
The Pilgrims’ decision to go to North America – and their deep attachment to their faith – was an outcome of the intense religious conflict roiling Europe after the Protestant Reformation. Shortly before the travelers’ arrival, the Wampanoag residents of Patuxet – the area in and around modern day Plymouth – had suffered a devastating, three-year epidemic, possibly caused by leptospirosis, a bacterial disease that can lead to meningitis, respiratory distress and liver failure.
It was during these two crises that the histories of western Europe and Indigenous North America collided on the shores of Massachusetts Bay.
Despite a number of advantages, including less competition for local resources because of the epidemic, Plymouth attracted far fewer English migrants than Virginia, which was settled in 1607, and Massachusetts, which was established in 1630.
The Pilgrims, as they told their story, traveled so they could practice their religion free from persecution. But other English joined them, including some migrants seeking profits instead of heeding prophets. Unfortunately for those hoping to earn a quick buck, the colony never became an economic dynamo.
A shaky compact
Plymouth nonetheless went on to attain a prominent place in the history of America, primarily due to two phenomena: It was the alleged site of the first Thanksgiving, and its founders drafted the Mayflower Compact, a 200-word document written and signed by 41 men on the ship.
Generations of American students have learned that the Compact was a stepping stone towards self-government, the defining feature of American constitutional democracy.
But did Plymouth really inspire democracy? After all, self-governing communities existed across Indigenous New England long before European migrants arrived. And a year earlier, in 1619, English colonists in Virginia had created the House of Burgesses to advance self-rule in North America for subjects of King James I.
So American self-government, however one defines it, was not born in Plymouth.
The Mayflower Compact nonetheless contained lofty ideals. The plan signed by many of the Mayflower’s male passengers demanded that colonists “Covenant & Combine ourselves into a Civil body politic, for our better ordering, & preservation.” They promised to work together to write “laws, ordinances, Acts, constitutions.” The signers pledged to work for the “advancement of the Christian faith.”
Yet as the years after 1620 bore out, the migrants did not adhere to such principles when dealing with their Wampanoag and other Algonquian-speaking neighbors. Gov. William Bradford, who began writing his history of Plymouth in 1630, wrote about the Pilgrims arriving in “a hideous and desolate wilderness, full of wild beasts and wild men” even though Patuxet looked more like settled European farmland. The Pilgrims exiled an English lawyer named Thomas Morton, in part because he believed that Indigenous people and colonists could peacefully coexist. And in 1637, Plymouth’s authorities joined a bloody campaign against the Pequots, which led to the massacre of Indigenous people on the banks of the Mystic River, followed by the sale of prisoners into slavery.
The Compact was even used by loyalists to the British crown to argue against independence. Thomas Hutchinson, the last royal governor of Massachusetts, pointed to the Pilgrims as proof that colonists should not rebel, highlighting the passage that defined the signers as “loyal subjects” of the English king.
History told by the victors
After the American Revolution, politicians and historians, especially those descended from Pilgrims and Puritans, were keen to trace the origins of the United States back to Plymouth.
In the process, they glossed over the Pilgrims’ complicated legacy.
In 1802, the future President John Quincy Adams spoke at Plymouth about the unique genius of the colony’s founders and their governing contract. He announced that the Pilgrims would arrive at the biblical day of judgment “in the whiteness of innocence” for having shown “kindness and equity toward the savages.”
In the mid-19th century, the historian George Bancroft claimed that it was in “the cabin of the Mayflower” where “humanity recovered its rights, and instituted government on the basis of ‘equal laws’ for ‘the general good.’”
Nineteenth-century anniversary celebrations focused on the colonists, their written Compact, and their contribution to what became the United States. In 1870, on the 250th anniversary, celebrants struck a commemorative coin: one side featured an open Bible, the other a group of Pilgrims praying on the shoreline.
Missing, not surprisingly, were the Wampanoags.
A more nuanced view of the past
By 1970, the cultural tide had turned. Representatives of the Wampanoag nation walked out of Plymouth’s public celebration of Thanksgiving that year to announce that the fourth Thursday in November should instead be known as the National Day of Mourning. To these protesters, 1620 represented violent conquest and dispossession, the twinned legacies of exclusion.
The organizers of an international group called “Plymouth 400” have stressed that they want to tell a “historically accurate and culturally inclusive history.” They’ve promoted both the General Society of Mayflower Descendants and an exhibit featuring 400 years of Wampanoag History. Unlike earlier generations of celebrants, the organizers have acknowledged the continued presence of Native residents.
Prior celebrations of Plymouth’s founding focused on the Pilgrims’ role in the creation of the United States. By doing so, these commemorations sustained an exclusionary narrative for over two centuries.
Perhaps this year a different story will take hold, replacing ancestor worship with a more clear-eyed view of the past.
FOR THE REAL HISTORY, READ: "WHITE TRASH: THE 400-YEAR UNTOLD HISTORY OF CLASS IN AMERICA" BY NANCY ISENBERG
The Troubling History — and Unfinished Work — of the Suffragists
With millions of people disenfranchised along racist lines, this is no time for uncomplicated commemoration.
Natasha Lennard - the intercept
August 26 2020, 3:00 a.m.
THE CENTENARY OF the 19th Amendment’s ratification, which granted female citizens the right to vote in 1920, will be marked with monuments. On August 26, 100 years to the day since women’s suffrage went into legal effect, a statue will be unveiled in Central Park depicting three prominent suffragists: Susan B. Anthony, Sojourner Truth, and Elizabeth Cady Stanton. The monument will be the first in the park to feature nonfictional women. Meanwhile, President Donald Trump, a putrid misogynist and statue fetishist, has thrown his support behind a bill to create a monument to suffragists in Washington, D.C.
In one sense, at a moment of reckoning around which statues and monuments get to occupy public space, stone edifications of women who struggled for suffrage are a welcome and overdue addition to a terrain too often held by desecration-worthy slavers and colonizers. Both the New York and D.C. sculptures feature Black suffragists, like Truth, alongside white women, like Anthony and Stanton, who have been accorded disproportionate credit compared to their nonwhite counterparts. Yet, the suffragist monuments, like so many statues, obfuscate and reduce a complex and troubling history plagued by racism.
It is a problem with the practice of monumentalization more broadly, that statues suggest an idea of finished and settled history — indeed, narratives set in stone. This is no time to consider the fight for universal suffrage as a closed history, worthy of uncomplicated commemoration. Trump’s ever-growing efforts toward election sabotage are just the tip of the iceberg in a nation that systematically disenfranchises millions along indisputably racist lines. It’s no surprise that Trump would support efforts, however well-intentioned, to confine the struggle for voting rights to commemorable history.
As Melissa Gira Grant wrote in a recent essay, “The women who remain locked out of the franchise are the fractured legacy of a fractured movement.”
Over 6 million people in the U.S. are currently disenfranchised due to laws relating to current or former felony convictions; that’s 1 in 40 adults. Given the vicious racism of our carceral system, the racial bent of disenfranchisement is profound: One in 13 Black Americans of voting age is disenfranchised, according to a 2016 study by the Sentencing Project. This disenfranchisement is further compounded by voter intimidation and laws requiring voter ID (which disproportionately shut out poor Black, Indigenous, and other people of color, as well as trans individuals). The problem is not specific to women in these communities, but we are nonetheless talking about millions of women denied the vote, a century after their right to it was constitutionally affirmed.
One could argue that the centenary commemorations are a reminder that the historic struggle for universal suffrage goes on: The promise of the 19th Amendment remains worthy and yet to be realized. This would be American mythmaking par excellence; the sort that lionizes the universalist claims of the nation’s founding documents, but forgets, for example, that the Declaration of Independence described Indigenous people as “merciless Indian savages.” A state built on stolen land, by the labor of people owned as property, cannot at the same time be a nation founded on the principles of universal rights. It is high time we reject narratives that claim the struggle for equality and justice as the fulfillment of the great American promise. Equally, for many prominent white, middle-class suffragists, the promise of the 19th Amendment was never intended toward universal suffrage. “Women’s suffrage,” for them, meant middle-class white women’s suffrage by design. To a major extent, this includes Anthony and Stanton, whose tome on the movement, “The History of Women’s Suffrage,” all but erases the work of Black women like Truth, even though the three women did correspond and, at times, worked in concert.
Yet Stanton and Anthony were unambiguous in their wish for an “educated” female franchise — an ostensibly race-neutral framework that effectively excluded most poor, Black women. Stanton in particular turned to explicit racist epithets in her argument that the women’s vote should come before that of Black men. As Brent Staples noted two years ago in the New York Times, Stanton “embraced fairness in the abstract while publicly enunciating bigoted views of African-American men, whom she characterized as ‘Sambos’ and incipient rapists in the period just after the war.” The decision to sit Stanton and Anthony next to Truth, a formerly enslaved abolitionist, reflects the fact that the three had worked together but ignores their split over Stanton and Anthony’s opposition to the 15th Amendment. “I will cut off this right arm of mine before I will ever work or demand the ballot for the Negro and not the woman,” Anthony famously proclaimed. The planned D.C. monument will also feature, among other renowned suffragists, Ida B. Wells, a journalist and anti-lynching campaigner, who refused to march at the back of a racially segregated march for suffrage in 1913.
The racism that pervaded the suffrage movement cannot be excused by virtue of it taking place in a different era. Abolitionist suffragists like Truth, Wells, and Frances Ellen Watkins Harper were in the struggle at the time and emphatic that the battle for liberation must be intersectional. It was a choice, and a shameful one, for white suffragists to align with white supremacy, both explicitly and tacitly; the 53 percent of white women voters who cast their ballot for Trump in 2016 took part in that same indefensible legacy. Any commemorations of the 19th Amendment’s centenary must reckon with this unbroken history of white women’s racism. It is of note that original plans for the Central Park monument featured only statues of Anthony and Stanton, alongside a scroll containing quotations from more than 20 other suffragists, including Truth. After some high-profile criticism, Truth was added as a figure. Unfeatured, though, are the fraught antagonisms between those women’s political visions.
“You can’t ask one statue to meet all the desires of the people who have waited so long for recognition,” said Pam Elam, the president of Monumental Women, the fund partly behind the statue, in defense of its original design. She’s right. But what women engaged in antiracist, anti-patriarchal struggle, in the legacy of Truth and Wells, also know is that waiting for recognition from a white supremacist state is a losing strategy indeed. In this way, the thousands of women who have taken to the streets this summer to fight for Black lives have honored the memory of abolitionist suffragists more than any statue could.
Presidents have a long history of condescension, indifference and outright racism towards Black Americans
the conversation
August 26, 2020 8.23am EDT
The fury over racial injustice that erupted in the wake of George Floyd’s killing has forced Americans to confront their history. That’s unfamiliar territory for most Americans, whose historical knowledge amounts to a vague blend of fact and myth that was only half-learned in high school and is only half-remembered now.
If their historical knowledge is lacking, Americans are not any better informed about the role of presidential leadership – and lack of leadership – on racial issues. They may have heard that five of the first seven presidents owned slaves, and they know – or think they do – that Abraham Lincoln “freed the slaves.”
But even those tidbits of fact are incomplete. Several other presidents, including Ulysses Grant, owned slaves. And Lincoln, whose Emancipation Proclamation was more symbolic than practically effective, hated slavery but never considered Blacks equal to whites.
An honest assessment of American presidential leadership on race reveals a handful of courageous actions but an abundance of racist behavior, even by those remembered as equal rights supporters.
Our book, “Presidents and Black America: A Documentary History,” examines the record of the first 44 presidents on racial issues and explores their relationships with African Americans. What emerges is a portrait of chief executives who were often blatantly racist and commonly subordinated concerns for racial justice to their own political advantage.
Here are a few examples:
• Rutherford Hayes, president from 1877-1881, claimed to be a friend of African Americans’ rights. At his inauguration, he said “a true self-government” must be “a government which guards the interests of both races carefully and equally.” But he cut a shady deal to win the presidency in the 1876 election, whose result was as hotly disputed as the 2000 Bush-Gore contest. In that deal, he agreed to withdraw federal troops from Southern states where they’d been protecting Blacks from the Ku Klux Klan and white supremacist depredations. Over the next two decades, Southern whites drove virtually all Black elected officials from office, often by fraud and sometimes at gunpoint, and about 1,500 Southern Blacks were lynched.
• William McKinley, president from 1897-1901, delivered an inaugural address extolling equal rights and declared, “Lynchings must not be tolerated.” However, he remained silent when white supremacists in Wilmington, North Carolina, staged an 1898 coup that ousted all Black elected officials and killed at least 60 Blacks. His lack of response to lynchings prompted a Black-owned newspaper to observe, “The Negroes of this country turn with impatience, disappointment and disgust from Mr. McKinley’s fence-straddling and shilly-shallying discussion of lynch law.”
• Theodore Roosevelt, president from 1901-1909, believed in white superiority while simultaneously advocating educational opportunity regardless of race. In a letter to a friend, he wrote, “Now as to the Negroes! I entirely agree with you that as a race and in the mass they are altogether inferior to the whites.”
• Woodrow Wilson, president from 1913-1921, promised fair treatment for African Americans in his 1912 campaign. But once elected, he defended his Southern Cabinet members who segregated workers in federal departments that hadn’t been segregated, writing, “It is as far as possible from being a movement against the negroes. I sincerely believe it to be in their interest.” Black Democrat Robert Wood of New York unsuccessfully urged Wilson to reverse the segregation policy: “We resent it, not at all because we are particularly anxious to eat in the same room or use the same soap and towels that white people use, but because we see in the separation in the races in the matter of soup and soap the beginning of a movement to deprive the colored man entirely of soup and soap, to eliminate him wholly from the Civil Service.” In a testy White House exchange, Wilson chastised William Monroe Trotter and other Black leaders, asserting that, “Segregation is not a humiliation but a benefit, and ought to be so regarded by you gentlemen. If your organization goes out and tells the colored people of the country that it is a humiliation, they will so regard it … The only harm that will come will be if you cause them to think it is a humiliation.”
• Franklin Roosevelt, president from 1933-1945, was widely admired among African Americans. While his New Deal programs did not benefit Blacks and whites equally, Blacks did receive benefits. But FDR’s actions were always guided by his need to appease Southern segregationists in Congress to pass his other agenda items. And his attitude could be condescending, as when he met with Black leaders about integrating the military. He advised a gradual approach, particularly with the Navy: “We are training a certain number of musicians on board ship. The ship’s band. There’s no reason why we shouldn’t have a colored band on some of these ships, because they’re darn good at it.”
Political calculation has always been at work in presidential dealings with African Americans, from George Washington to Donald Trump.
But often, those dealings also reflected condescension, indifference, racial bias and outright racism in chief executives who took a solemn oath to serve all American citizens equally.
A horrifying chapter from U.S. history: “Wilmington’s Lie” details white supremacist attack on African Americans in 1898
by Steve Pfarrer | VALLEY ADVOCATE
Apr 30, 2020
The last few months have brought national attention to Tulsa, Oklahoma, as plans were unveiled to begin digging there in April in suspected mass graves — locations where corpses may have been dumped almost a century ago, after white mobs attacked and burned a black district of the city, leaving as many as 300 African Americans dead (and perhaps 10 to 20 whites).
What’s known as the 1921 Tulsa Race Massacre is generally considered the single worst incident of racial violence in U.S. history. The attack on the Greenwood section of Tulsa, home to a prosperous African American community in the segregated city, left over 35 blocks burned to the ground and over 10,000 people — most of them black — homeless.
But in his new book, Pulitzer Prize winning author David Zucchino re-examines a white attack on another black community — in Wilmington, North Carolina — that in many ways was far more insidious than the spontaneous outburst of violence in Tulsa.
In “Wilmington’s Lie: The Murderous Coup of 1898 and The Rise of White Supremacy,” Zucchino chronicles an openly orchestrated effort by leading white politicians, newspaper publishers, and citizens in North Carolina — almost all Democrats in those days — to take control of the state and of towns like Wilmington that were led by Republicans, who in turn drew the vast majority of their support from black voters.
In a race-baiting campaign that went on for months, white supremacists declared they would regain the upper hand “by the ballot or bullet or both” and eliminate blacks from public office, especially in Wilmington — and then they did just that, in the process killing at least 60 African Americans, burning homes and businesses and driving hundreds of other blacks from the city forever.
As Zucchino’s searing narrative details, the attack on Wilmington’s black community ushered in decades of white dominance in North Carolina, as voting rights for blacks were stripped away. Other Southern states also used the Wilmington coup as a model for quashing the African American vote and restoring racism as official policy.
And well into the 20th century, the planned violence was portrayed in newspapers and history books as a “race riot,” in which unruly and criminal blacks, ultimately deemed unfit to vote or hold public office, had to be confronted by liberty-loving whites to restore “freedom” and keep the peace.
Zucchino, previously a reporter and foreign correspondent for The Philadelphia Inquirer, The Los Angeles Times and other newspapers, won a Pulitzer Prize for his dispatches from apartheid South Africa. He brings a journalist’s immediacy to “Wilmington’s Lie” while also laying out the historical background for the 1898 terror, as well as the incomplete efforts in the city today to come to terms with this grim past.
His book looks at how, with the advent of Reconstruction after the U.S. Civil War, blacks made some gains in the South, winning the right to vote as well as some elections to state and local offices (and in a few cases to the U.S. House of Representatives). Many of these political gains were rolled back over the next couple of decades, however, as former Confederate leaders took control of the levers of government.
Yet more and more blacks were registered to vote, and a solid black working class and middle class began to emerge in Wilmington. By 1880, Zucchino writes, Wilmington, a shipbuilding center along the Cape Fear River in southeastern North Carolina, had the highest proportion of black residents — close to 60 percent — of any Southern city. African Americans also served alongside whites in a number of positions in local government, including on the police force. Blacks and whites in some cases were neighbors — a situation some whites could not abide.
A violent coup
By 1898, Democrats controlled much of the South, but North Carolina had a Republican governor, Daniel Russell, elected with strong black support, especially from Wilmington. Democrats became determined to oust him and to put an end to black voting itself.
“[T]here is one thing the Democrat Party has never done and never will do — and that is to set the negro up TO RULE OVER WHITE MEN,” wrote Furnifold Simmons, chairman of the state Democratic Party and a leader of the white supremacist campaign. “[N]egro rule is a curse to both races.”
As Zucchino outlines, white newspapers and Democratic leaders pounded on this theme for months. At a huge rally of whites in Wilmington, one speaker thundered that blacks and the “handful of white cowards” who led them would be defeated “if we have to choke the Cape Fear with carcasses!” Gun shops reported record sales to whites but refused to sell weapons to blacks.
In August, Democrats found another way to stoke white rage. Alexander Lightfoot Manly, the editor of a black newspaper in Wilmington, The Daily Record, responded to calls for lynching black men accused of raping white women with an editorial that said white men raped black women with impunity. The piece also suggested some white women loved or even lusted after black men, upending “the core white conviction that any sex act between a black man and a white woman could only be rape,” Zucchino writes.
“Don’t think ever that your women will remain pure while you are debauching ours,” Manly wrote. “You sow the seed — the harvest will come in due time.”
Though some whites immediately called for destroying The Daily Record office and lynching Manly, Simmons and other white supremacist leaders urged restraint, saying payback would come at the ballot box. So on election day, Nov. 8, some 2,000 heavily armed white men, collectively known as “Red Shirts,” were essentially deputized as militia and flooded Wilmington. Federal troops, some just back from fighting in the Spanish-American War, were also on hand to help keep “order.”
Not surprisingly, black voters barely showed up at the polls, and the Democrats swept almost all state and local offices in North Carolina. Actual violence had been fairly minimal, but Red Shirts and other whites in Wilmington, Zucchino writes, were still “eager for a confrontation.” They provoked one on Nov. 10 when, responding to rumors of gatherings of hostile blacks, they began shooting African Americans throughout the city, causing hundreds of others to flee to swampland outside of town. Vigilantes also torched The Daily Record office, but Manly, the editor, had already fled.
Zucchino’s account makes for harrowing but can’t-turn-away reading; he basically describes an armed overthrow of a lawfully elected government and open murder of innocent people. Yet this violent coup was celebrated by white supremacists not just in North Carolina but throughout the South (“Old North State Redeemed From Negro Rule At Last” read a headline in the Atlanta Constitution) and even in other parts of the country.
U.S. President William McKinley, whom black leaders had implored to intervene in Wilmington, did nothing, according to Zucchino, and none of the white killers was ever prosecuted. The Republican governor, Daniel Russell, was too frightened to intercede. Afterwards, efforts to disenfranchise African Americans in the state through other means — poll taxes, literacy tests, voter-roll purges — reduced registered black voters from 126,000 in 1896 to 6,100 in 1906.
By contrast, a number of leaders of the Wilmington coup rode their fame straight to prominent positions in state and federal government, including Simmons, the Democratic Party chairman; he became a five-term U.S. senator. The African-American presence in Wilmington was permanently reduced, notes Zucchino. Today, just 18 percent of residents are black.
“Wilmington’s Lie” is one of the most disturbing and frightening books I’ve ever read. It should be required reading for whites who scoff at the idea of white privilege.
KKK founded
history.com
In Pulaski, Tennessee, a group of Confederate veterans convenes to form a secret society that they christen the “Ku Klux Klan.” The KKK rapidly grew from a secret social fraternity to a paramilitary force bent on reversing the federal government’s progressive Reconstruction-era activities in the South, especially policies that elevated the rights of the local African American population.
The name of the Ku Klux Klan was derived from the Greek word kyklos, meaning “circle,” and the Scottish-Gaelic word “clan,” which was probably chosen for the sake of alliteration. Under a platform of philosophized white racial superiority, the group employed violence as a means of pushing back Reconstruction and its enfranchisement of African Americans. Former Confederate General Nathan Bedford Forrest was the KKK’s first grand wizard; in 1869, he unsuccessfully tried to disband it after he grew critical of the Klan’s excessive violence.
Most prominent in counties where the races were relatively equal in number, the KKK engaged in terrorist raids against African Americans and white Republicans at night, employing intimidation, destruction of property, assault, and murder to achieve its aims and influence upcoming elections. In a few Southern states, Republicans organized militia units to break up the Klan. In 1871, the Ku Klux Act passed Congress, authorizing President Ulysses S. Grant to use military force to suppress the KKK. The Ku Klux Act resulted in nine South Carolina counties being placed under martial law and thousands of arrests. In 1882, the U.S. Supreme Court declared the Ku Klux Act unconstitutional, but by that time Reconstruction had ended and the KKK receded for the time being.
The 20th century witnessed two revivals of the KKK: one in response to immigration in the 1910s and ’20s, and another in response to the African American civil rights movement of the 1950s and ’60s. Various chapters of the KKK still exist in the 21st century. White supremacist violence, in general, is again on the rise in America. Several high profile events, including the 2015 Charleston church shooting; the 2017 "Unite the Right" rally in Charlottesville, Virginia; the 2018 Pittsburgh synagogue shooting; and the 2019 shooting in an El Paso, Texas Walmart were all fueled by white supremacy and racism.
Ku Klux Klan
Did you know? At its peak in the 1920s, Klan membership exceeded 4 million people nationwide.
HISTORY.COM EDITORS
Feb 21, 2020
Founded in 1865, the Ku Klux Klan (KKK) extended into almost every southern state by 1870 and became a vehicle for white southern resistance to the Republican Party’s Reconstruction-era policies aimed at establishing political and economic equality for black Americans. Its members waged an underground campaign of intimidation and violence directed at white and black Republican leaders. Though Congress passed legislation designed to curb Klan terrorism, the organization saw its primary goal–the reestablishment of white supremacy–fulfilled through Democratic victories in state legislatures across the South in the 1870s. After a period of decline, white Protestant nativist groups revived the Klan in the early 20th century, burning crosses and staging rallies, parades and marches denouncing immigrants, Catholics, Jews, African Americans and organized labor. The civil rights movement of the 1960s also saw a surge of Ku Klux Klan activity, including bombings of black schools and churches and violence against black and white activists in the South.
Founding of the Ku Klux Klan
A group including many former Confederate veterans founded the first branch of the Ku Klux Klan as a social club in Pulaski, Tennessee, in 1865. The first two words of the organization’s name supposedly derived from the Greek word “kyklos,” meaning circle. In the summer of 1867, local branches of the Klan met in a general organizing convention and established what they called an “Invisible Empire of the South.” Leading Confederate general Nathan Bedford Forrest was chosen as the first leader, or “grand wizard,” of the Klan; he presided over a hierarchy of grand dragons, grand titans and grand cyclopses.
The organization of the Ku Klux Klan coincided with the beginning of the second phase of post-Civil War Reconstruction, put into place by the more radical members of the Republican Party in Congress. After rejecting President Andrew Johnson’s relatively lenient Reconstruction policies, in place from 1865 to 1866, Congress passed the Reconstruction Act over the presidential veto. Under its provisions, the South was divided into five military districts, and each state was required to approve the 14th Amendment, which granted “equal protection” of the Constitution to former slaves and enacted universal male suffrage.
Ku Klux Klan Violence in the South
From 1867 onward, African-American participation in public life in the South became one of the most radical aspects of Reconstruction, as black people won election to southern state governments and even to the U.S. Congress. For its part, the Ku Klux Klan dedicated itself to an underground campaign of violence against Republican leaders and voters (both black and white) in an effort to reverse the policies of Radical Reconstruction and restore white supremacy in the South. They were joined in this struggle by similar organizations such as the Knights of the White Camelia (launched in Louisiana in 1867) and the White Brotherhood. At least 10 percent of the black legislators elected during the 1867-1868 constitutional conventions became victims of violence during Reconstruction, including seven who were killed. White Republicans (derided as “carpetbaggers” and “scalawags”) and black institutions such as schools and churches—symbols of black autonomy—were also targets for Klan attacks.
By 1870, the Ku Klux Klan had branches in nearly every southern state. Even at its height, the Klan did not boast a well-organized structure or clear leadership. Local Klan members–often wearing masks and dressed in the organization’s signature long white robes and hoods–usually carried out their attacks at night, acting on their own but in support of the common goals of defeating Radical Reconstruction and restoring white supremacy in the South. Klan activity flourished particularly in the regions of the South where black people were a minority or a small majority of the population, and was relatively limited in others. Among the most notorious zones of Klan activity was South Carolina, where in January 1871, 500 masked men attacked the Union County jail and lynched eight black prisoners.
The Ku Klux Klan and the End of Reconstruction
Though Democratic leaders would later attribute Ku Klux Klan violence to poorer southern white people, the organization’s membership crossed class lines, from small farmers and laborers to planters, lawyers, merchants, physicians and ministers. In the regions where most Klan activity took place, local law enforcement officials either belonged to the Klan or declined to take action against it, and even those who arrested accused Klansmen found it difficult to find witnesses willing to testify against them. Other leading white citizens in the South declined to speak out against the group’s actions, giving them tacit approval. After 1870, Republican state governments in the South turned to Congress for help, resulting in the passage of three Enforcement Acts, the strongest of which was the Ku Klux Klan Act of 1871.
For the first time, the Ku Klux Klan Act designated certain crimes committed by individuals as federal offenses, including conspiracies to deprive citizens of the right to hold office, serve on juries and enjoy the equal protection of the law. The act authorized the president to suspend the writ of habeas corpus and arrest accused individuals without charge, and to send federal forces to suppress Klan violence. This expansion of federal authority–which Ulysses S. Grant promptly used in 1871 to crush Klan activity in South Carolina and other areas of the South–outraged Democrats and even alarmed many Republicans. From the early 1870s onward, white supremacy gradually reasserted its hold on the South as support for Reconstruction waned; by the end of 1876, the entire South was under Democratic control once again.
Revival of the Ku Klux Klan
In 1915, white Protestant nativists organized a revival of the Ku Klux Klan near Atlanta, Georgia, inspired by their romantic view of the Old South as well as Thomas Dixon’s 1905 book “The Clansman” and D.W. Griffith’s 1915 film “Birth of a Nation.” This second generation of the Klan was not only anti-black but also took a stand against Roman Catholics, Jews, foreigners and organized labor. It was fueled by growing hostility to the surge in immigration that America experienced in the early 20th century along with fears of communist revolution akin to the Bolshevik triumph in Russia in 1917. The organization took as its symbol a burning cross and held rallies, parades and marches around the country. At its peak in the 1920s, Klan membership exceeded 4 million people nationwide.
The Great Depression in the 1930s depleted the Klan’s membership ranks, and the organization temporarily disbanded in 1944. The civil rights movement of the 1960s saw a surge of local Klan activity across the South, including the bombings, beatings and shootings of black and white activists. These actions, carried out in secret but apparently the work of local Klansmen, outraged the nation and helped win support for the civil rights cause. In 1965, President Lyndon Johnson delivered a speech publicly condemning the Klan and announcing the arrest of four Klansmen in connection with the murder of a white female civil rights worker in Alabama. The cases of Klan-related violence became more isolated in the decades to come, though fragmented groups became aligned with neo-Nazi or other right-wing extremist organizations from the 1970s onward. In the early 1990s, the Klan was estimated to have between 6,000 and 10,000 active members, mostly in the Deep South.
A brief history of the "Lost Cause": Why this toxic myth still appeals to so many white Americans
Racist myth-making conquered American history — and white people's minds — for far too long. Time to face the facts
BOB CESCA - salon
JUNE 16, 2020 1:30PM (UTC)
...Origins of the Lost Cause
In many ways, the Civil War was the prototype for 20th century-style warfare. The military technology that was developed immediately before and during the war vastly outpaced the archaic Napoleonic tactics used during the first several years of Civil War battles. The new rifled musket was capable of firing conical Minié ball rounds faster, farther and more accurately than the old spherical rounds, yet massed armies continued to march in long lines of battle, shoulder-to-shoulder, within close range of the other side, causing a bloodbath of unprecedented magnitude.
Photography, another relatively new technology at the time, would deliver the images of mangled casualties to the public for the first time, leaving no doubt as to the mind-blowing devastation of war. Likewise, nightmarishly awful trench warfare emerged in 1864 — a "dress rehearsal" for World War I, as historian Shelby Foote once described it — adding to the ugliness and carnage. The Victorian "picnic" at Bull Run in 1861 would quickly evolve into the apocalyptic trench combat of Spotsylvania Courthouse and Petersburg three years later.
After the war, while the task of reunifying the nation began to take shape, few observers and participants forgot about the grisly horror show that had occurred. (Contemporary historians suggest that around 750,000 men died in the war, a larger number than was understood at the time — and by far the largest body count of any war in American history.) Someone would have to pay for the carnage, Northerners commonly believed. From there, several schools of thought emerged about how best to handle reincorporating the former Confederate states back into the Union. Radical Reconstructionists wanted to punish the South, executing the perpetrators of secession and redefining the Southern way of life so that secession could never happen again. Others wanted a more moderate, or more conciliatory approach, including Abraham Lincoln and his ham-fisted (not to mention overtly racist) successor, Andrew Johnson.
While Northern politicians and Union generals engaged in shepherding the policies of Reconstruction, authors, journalists and special interest groups sympathetic to the South began work on the reunification of hearts and minds: This was what would eventually emerge as the Lost Cause, a term first coined by Southern author Edward Pollard in 1866. In other words, revisionist historians began to address the task of reunifying white people of the North and white people of the South following so much brutality, with a clear motivation to exonerate Southern whites.
The myths of the Lost Cause
The central thrust of the Lost Cause was to reframe the animators of secession — Southern landowners and politicians, along with the insurgents who formed the Confederate military — as having fought for the more "noble" cause of Southern states' rights. The goal was to erase slavery as the obvious and express intention of secession, even though the preservation of slavery is clearly enumerated in the Confederate constitution.
When Donald Trump defended the names of U.S. military bases named for rebel generals, he borrowed directly from the Lost Cause mythology: "These Monumental and very Powerful Bases have become part of a Great American Heritage, and a history of Winning, Victory, and Freedom." The Lost Cause was all about rebranding traitors and racists as having fought bravely for ideals like "heritage," "freedom" and "nobility."
Again, it's entirely counterfactual, transforming greedy villains who were responsible for the subjugation of African Americans and the deaths of hundreds of thousands, into kinder, gentler souls who were only interested in defending their cultural heritage and the absolutist interpretation of the 10th Amendment. It's not exactly a shock to learn that Trump and other Republican leaders subscribe to this "cultural heritage" fiction.
As bad as all that sounds, the subsequent myths of the Lost Cause are far more sinister and inexcusable.
One of the most dominant prongs of the Lost Cause was the characterization of Blacks as a common enemy of both northern and southern whites. Mythologists believed that if white people were fighting Black people, then white people wouldn't fight each other again. The goal of smearing African Americans as the enemy of white America involved the whole-cloth fabrication of cultural myths about African Americans, emerging at the dawn of the 20th century and beyond. Architects of the mythology felt that Black people didn't possess a cultural identity and therefore identities could be entirely invented for them by white supremacists.
Prime movers of the Lost Cause taught, therefore, that slaves actually liked being slaves, and were treated better than some whites. Likewise, the myth of Black Confederates, fighting willingly alongside their owners, emerged from similar sources. (In reality, while thousands of Black men accompanied their masters into the Confederate army, they were "camp slaves," not soldiers. There is no reliable evidence that any Black people, free or enslaved, voluntarily fought for the rebel cause.)
Publications and, later, films would portray Black men as unpredictable thieves or as lazy and shiftless "takers," as well as wanton rapists and subjugators of white people.
D.W. Griffith's 1915 silent classic "Birth of a Nation" is the best known cinematic example of Lost Cause myth-making, though other silent films of the early 20th century were arguably more insulting, with titles and plots too horrendous to publish here.
The white protagonist of "Birth of a Nation," fictional Confederate veteran Ben Cameron, invents the Ku Klux Klan to take back his southern heritage. Cameron's KKK is portrayed as an avenging army of swashbuckling heroes who swarm to the rescue of a white woman being surrounded in her cabin by a platoon of lascivious Black soldiers. Naturally, these soldiers are played by white actors in blackface who behave in offensively stereotypical ways.
Black Union soldiers, meanwhile, are shown suppressing and intimidating white voters during Reconstruction. In one scene, several bayonet-wielding Black men disenfranchise white voters at a polling place. Black politicians, including the Silas Lynch character, are unanimously elected to the state legislature via the intimidation of white citizens at the hands of Black troops. The all-Black legislature then goes on to pass laws that strip white people of their right to vote. The politicians, meanwhile, ogle and harass white women in the street, but only when they aren't getting drunk and eating chicken legs.
Given the pernicious vilification of Blacks during the late 19th and early 20th centuries, it's no surprise that in the United States between 1882 and 1968, white people lynched more than 4,400 African-Americans, in large part based on racial resentments and prejudices driven by the fiction of the Lost Cause.
Similarly, the epidemic of police violence against Blacks also has its roots in the Lost Cause.
In addition to the perpetuation of racist stereotypes, these myths were heavily borrowed to justify Jim Crow laws, which were specifically designed to oppress Southern Blacks. In Douglas Blackmon's groundbreaking 2009 book, "Slavery by Another Name," the author documents the symbiosis between Jim Crow laws, law enforcement and "neo-slavery" that lasted well into the 1940s and beyond. Blackmon detailed how nonsense laws against things like "vagrancy" were used to supply backwoods plantations and mines with slave labor. In the Jim Crow South, cops would arrest Black men for, in one example, not carrying proof of employment, then hustle them through kangaroo courts and eventually disappear them into a new and supposedly legal form of slavery in which many African-Americans were worked to death. The practice survived until Franklin D. Roosevelt ordered the FBI to shut it down at the outset of World War II, yet forms of slave labor continue to exist within the modern prison-industrial complex today.
Blackmon's stories of "vagrancy" arrests and the like also call to mind the atrocious "papers, please" policy enacted by Arizona's SB 1070 law in 2010. (It was partially, but not entirely, struck down by the Supreme Court two years later.)
The Lost Cause in the modern era
The modern Republican "Southern strategy" has been all about exploiting Lost Cause myths to scare white people into voting for GOP candidates. The Reagan-era notion of "welfare queens" played up the "lazy and shiftless" stereotypes of the Lost Cause. The "makers and takers" slogan is a less overt iteration of the same thing.
The so-called "war on drugs" turned out to be just another excuse to lock up African Americans. Blacks arrested for possessing crack cocaine, for example, ended up serving longer prison sentences than whites arrested for possessing the same quantity of powder cocaine.
In 1988, Republican political strategist Lee Atwater, along with George H.W. Bush's media consultant, future Fox News founder Roger Ailes, devised the infamous Willie Horton commercial in order to scare white people into voting against Michael Dukakis. Two years later, the late Sen. Jesse Helms of North Carolina rolled out his famous "white hands" commercial, which cautioned white people that affirmative action would allow black people to take their jobs.
The Rev. Jeremiah Wright's "God damn America" video was exploited by Fox News and far-right media to scare white people into voting against Barack Obama, who had just about every Lost Cause trope catapulted at him throughout his two terms.
Fox News celebrities like Bill O'Reilly have routinely employed racist myths to attack the Obamas. O'Reilly once defended "the white power structure that controls America." He also said about Michelle Obama, "I don't want to go on a lynching party against Michelle Obama unless there's evidence, hard facts, that say this is how the woman really feels."
Social media memes of Barack Obama dressed as a witch doctor or the Obamas as monkeys or the Obama-era White House lawn littered with watermelons were all pure turn-of-the-century Lost Cause stereotypes.
All told, the Lost Cause has been one of the most successful disinformation campaigns in world history. Its themes continue to be intrinsic to the white misperception of post-Civil War racial history, including Trump's "heritage" defense of military base names, his defense of Charlottesville white supremacists, and his fetish for law enforcement violence. Likewise, his routine attacks against African-American journalists (e.g., Yamiche Alcindor of PBS and Don Lemon of CNN), athletes (e.g., former NFL quarterback Colin Kaepernick) and lawmakers (e.g., "Low IQ" Rep. Maxine Waters) invariably echo the stereotypes of the Lost Cause.
It's no wonder Trump is a proud student of its fiction. The Lost Cause has been so completely absorbed by the confirmation bias of white racists that its lies have become inextricably bound to conventional wisdom, printed and distributed as legitimate history for way too long. This is why it's been so difficult to shake loose, and it's why there's such a powerful movement now against police violence and the continued lionizing of Confederate insurgents. It's taken more than a century to finally begin to pull down some of the literal monuments to the Lost Cause, as well as to successfully achieve bans against the Confederate battle flag.
We're making progress now, but how many African Americans and other people of color have been stripped of their constitutional rights along the way? How many have suffered and died as a consequence of these fictitious justifications for American racism, especially for our history of secession and slavery? The white supremacist mythmakers believed they were keeping the (white) peace after four gruesome years of war, but all they were doing was rationalizing more death — not to mention injustice — at the hands of racist vigilante groups, cops, politicians, corporations and scores of white supremacist followers, all brainwashed by these 155-year-old lies passed off as "history" and "heritage."
Why the Confederate Flag Flew During World War II
As white, southern troops raised the battle flag, they showed that they were fighting for change abroad—but the status quo at home.
Matthew Delmont - the atlantic
Professor of history at Dartmouth College
3/14/2020
In July 1944, one month after the Allies stormed the beaches of Normandy, the 79th Infantry Division drove Nazi troops out of the French town La Haye-du-Puits. A young officer from Chattanooga, Tennessee, reached into his rucksack and pulled out a flag that his grandfather had carried during the Civil War. He fashioned a makeshift flagpole and hoisted it up, so that the battle-worn Confederate flag could fly over the liberated village.
The U.S. Navy and Marine Corps recently decided to ban the Confederate flag from military installations, and the Army is considering renaming 10 bases named after Confederate generals. But if you want to understand how the U.S. military came to embrace the Confederate flag in the first place, the answers lie in World War II.
When white southern troops went overseas during the war, some of them carried Confederate flags with them. As American forces took over Pacific Islands or European towns, the troops would sometimes raise the Confederate flag alongside or instead of the U.S. flag to celebrate their victory. The Baltimore Evening Sun described this as a “recurring phenomenon which has been observed in areas as widely separated as the Southwest Pacific, Italy and France.”
A major from Richmond, Virginia, raised the Confederate flag over a house after the U.S. Fifth Army captured the Italian town of Rifreddo. He told Stars and Stripes, the official military newspaper, that he’d brought a cache of flags with him and that he had already hung the Confederate flag in Naples, Rome, and Leghorn. “This is one war we’re gonna win,” he said.
In the Pacific, Marine Colonel William O. Brice of South Carolina dubbed himself the “commander of Confederate forces” in the Solomon Islands and flew the Confederate flag on the islands’ base. The Charlotte Observer praised Brice and other white marines, soldiers, and sailors for being “descendants of men who wore the gray [who] have not forgotten in the turmoil of battle, their reverence for those heroes of the [1860s].”
When the Allies secured military victory over Germany, a tank officer carried the Confederate flag into Berlin. As the USS Mississippi steamed into Tokyo Bay after Japan’s surrender, it was flying the Confederate flag.
After the war, a white sergeant from Kentucky wrote home to ask his mother to send a Confederate flag to display in a French school. “I believe we will influence the teaching of the War Between the States,” he wrote. Two former Army pilots returned from overseas and formed a “Confederate Air Force” for white southern pilots in New Bern, North Carolina.
The white troops who raised the Confederate flag during World War II argued that they were honoring the military service of their forefathers. “In its day, this flag stood for much and waved over the heads of the same type of men that made America great,” the Charlotte Observer argued. “Deep in the hearts of all Americans, the Confederacy now is merely a part of ‘One nation indivisible.’”
Not all Americans agreed. When Army Lieutenant General Simon Buckner Jr., himself the son of a Confederate general, saw a Marine unit flying the flag at the battle of Okinawa, he ordered it removed. “Americans from all over are involved in this battle,” he said.
For black Americans especially, the Confederate flag was a symbol of decades of racism, hate, and white supremacy. They fought against it being displayed before, during, and after the war. Before Pearl Harbor, for example, the Baltimore Afro-American successfully protested a plan to use the flag as the insignia of Army quartermasters stationed in Virginia at the base named for Confederate General Robert E. Lee.
The embrace of the Confederate flag by white troops, politicians, and civilians made it clear to black Americans that many of their fellow citizens understood the goals of the Second World War in very different terms. As black Americans fought a Double Victory campaign over fascism abroad and racism at home, most white Americans understood the war only to be about defeating the Nazis and Japanese military, a “single V” abroad and the status quo at home. Edward Moe, a federal investigator who surveyed racial attitudes during the war, found that many white people believed that World War II was about preserving things “as they have been in America.” “White folks would rather lose the war than give up the luxury of race prejudice,” NAACP Secretary Roy Wilkins quipped.
While white officers and enlisted men had no difficulty displaying the Confederate flag at home or overseas, Senator Millard Tydings, a Maryland Democrat, wanted to ensure they could do so officially. In 1943, he introduced a bill to allow Army units to carry Confederate battle streamers. “The sons of those who fought on the southern side in the Civil War ... at least should have the right to carry these streamers as a matter of maintaining military morale,” he argued. The Chicago Defender, a leading black newspaper, struck back immediately, calling the bill a “master stroke of hypocrisy” that proposed to have “American troops carrying the banner under which bitter war was waged for the perpetuation of slavery, into a so-called fight for democracy.” Among Tydings’s opponents, the Defender reported, there was talk of amending the bill to call for German Americans “to enter battle under the swastika, right next to the old Confederacy’s Stars and Bars.”
Tydings’s bill was eventually signed by President Harry Truman in March 1948, which opened the door for the official display of Confederate symbols in the U.S. military. The policy was implemented just four months before southern segregationists formed the States’ Rights Democratic Party, the “Dixiecrats,” and nominated South Carolina Governor Strom Thurmond for the 1948 presidential election. “The Southerners want the State right to continue to deny Negro citizens the right to vote,” the black journalist P. L. Prattis remarked.
The Dixiecrats displayed the Confederate flag prominently at campaign events, which sent sales of Confederate flags skyrocketing nationally. “The Confederacy fought to destroy the United States … how in heaven’s name can those who profess loyalty to the United States of America be loyal to the Confederacy?” asked E. Washington Rhodes, publisher of Philadelphia’s largest black newspaper. “Thousands of men suffered and died to make the stars and stripes supreme in the U.S. There is but one American flag. We are either Americans or something else.”
As the Tydings bill and the Dixiecrats led a surge in the popularity of the Confederate flag, Truman signed Executive Order 9981, committing the government to desegregating the military. The committee Truman organized after the war to study civil rights concluded that discrimination in the military was unacceptable: “Any discrimination which, while imposing an obligation, prevents members of minority groups from rendering full military service in defense of their country is for them a peculiarly humiliating badge of inferiority.” While many white military officers and enlisted men resisted the order, by the end of the Korean War in 1953, the military was almost fully integrated. Black activists fought for this policy for more than a decade, and it was one of the first major victories of the modern civil-rights era.
In the decades after World War II, the U.S. military became one of the most racially diverse institutions in the country and offered social mobility to generations of black Americans. At the same time, the military allowed the display of the Confederate flag and related racist symbols, which have no place in our military.
More than seven decades after the Confederate flag became intertwined with the U.S. military, it is well past time that these ties are severed.
racism thru silence!!!
FDR stayed silent too: When police brutality, mob violence and racial injustice collided during WWII
The NAACP pleaded with Roosevelt to address racial injustice during the violent summer of 1943
DAN C. GOLDBERG - salon
JUNE 1, 2020 9:00PM (UTC)
Excerpted from "The Golden Thirteen: How Black Men Won the Right to Wear Navy Gold" by Dan C. Goldberg. Copyright 2020. Excerpted with permission by Beacon Press.
It all looks so familiar. The images broadcast around the world of buildings in flames and of protesters confronting police clad in riot gear may be in response to recent events, but they are also part of a uniquely American long-running series, the extension of riots and protests that date back generations.
Nearly 80 years ago, at the height of World War II, police brutality, mob violence and the pervasive belief among African Americans that the government did not value their lives led to racial tensions so fierce that riots spread across the country, threatening war production and national security at a time when Americans were told that only a united force could defeat fascism.
Then, as now, critics of the president felt he was indifferent to their suffering and more concerned about alienating his base than about using the bully pulpit to bring the country together.
The theme that ties these protests together across generations is not one incident — but rather one incident that happens again and again.
During World War II, local police murdered black soldiers in Columbia, South Carolina, and Little Rock, Arkansas. Little more than a month after Pearl Harbor, military police in Alexandria, Louisiana, attempting to arrest a black man, triggered a race riot, during which 28 black men were shot. Nearly 3,000 black men and women were detained in the city's "Little Harlem" section. The city's entire supply of tear gas was used on black soldiers, almost all of whom were from the North, principally New York, Pennsylvania, and Illinois.
If an African American soldier or sailor didn't personally experience the bigotry, he could read about it in the black press. A "U.S. Army uniform to a colored man makes him about as free as a man in the Georgia chain gang," one soldier told the Baltimore Afro-American. "If this is Uncle Sam's Army then treat us like soldiers not animals or else Uncle Sam might find a new axis to fight."
Just a few weeks after Alexandria, the black press screamed the story of Cleo Wright, a 30-year-old cotton mill worker in Sikeston, Missouri, who allegedly assaulted a white woman and stabbed the arresting police officer. Wright was shot during the scuffle but survived. On Sunday morning, a mob of more than 300 grabbed Wright from his jail cell, tied him to the back of a maroon-colored Ford, and dragged his near-naked body through the black section of town. They stopped to force his wife to view his bloodied body. Then they doused him with gasoline and set him on fire in front of two black churches, where the pews were filled with men, women, and children who had come for services.
Truman Gibson, assistant to William Hastie, the civilian aide to the Secretary of War, commented that so many black men were bludgeoned to death in the South that it would only be a "slight exaggeration to say more black Americans were murdered by White Americans during World War II than were killed by Germans."
The following year, deadly rioting broke out in Beaumont, Texas, and in Detroit, Michigan, where, in early June, more than 25,000 white workers went on strike after the Packard Motor plant promoted three black men to work on the assembly line beside white men. One striker shouted, "I'd rather see Hitler and Hirohito win the war than work beside a n***er on the assembly line."
On June 20, a 90-plus degree day, nearly 100,000 men, women, and children went to Belle Isle, a municipal park on an island in the Detroit River, seeking relief from the sweltering heat.
The first interracial fights began around 10 p.m. Soon groups of white men and black men were fighting on the lawn adjacent to the naval armory. White sailors joined the fracas, fueling the hostilities.
Shortly after midnight, at a bustling nightclub in the heart of the black community, a well-dressed black man carrying a briefcase stopped the music, took the microphone, and said he had an important announcement to make. There was fighting between the races on Belle Isle; three black people had already been killed, and a black woman and her baby had been thrown over the Belle Isle Bridge and into the river. He urged everyone in the club—nearly a thousand people—to go home and get their guns. Now was the time to fight. In the white community, someone said a black man had slit a white sailor's throat and raped his girlfriend.
Neither story was true, but both were believed.
By 2 a.m., hospitals were reporting that they were receiving one new patient every minute. Twenty-five African Americans and nine whites were killed, and more than 750 were injured before the riot, the worst of the era, ended.
Letters poured into the White House demanding federal action, and Walter White, head of the NAACP, begged the president to intervene, to marshal the nation as he had done so many times before when a national crisis threatened to overwhelm the republic.
"No lesser voice than yours can arouse public opinion sufficiently against these deliberately provoked attacks, which are designed to hamper war production, destroy or weaken morale, and to deny minorities, negroes in particular, the opportunity to participate in the war effort on the same basis as other Americans," White wrote. "We are certain that unless you act these outbreaks will increase in number and violence."
But the White House made no move, paralyzed by fear of making the situation worse. For every concerned voice that demanded the President intervene to stop Jim Crowism and call for racial equality, there was an equally concerned voice saying it was the very push for racial equality that was causing all these riots, and that Eleanor Roosevelt, in her never-ending quest to promote black men in the factories and the fields, in the Army and the Navy, was responsible for the national discord.
"It is my belief Mrs. Roosevelt and Mayor [Edward] Jeffries of Detroit are somewhat guilty of the race riots here due to their coddling of negros," John Lang, who owned a bookstore in Detroit, wrote in a letter to FDR. "It is about time you began thinking about the men who built this country."
The Jackson, Mississippi, Daily News declared the Detroit riots were "blood upon your hands, Mrs. Roosevelt" and said she had "been . . . proclaiming and practicing social equality. In Detroit, a city noted for the growing impudence and insolence of its Negro population, an attempt was made to put your preachments into practice."
Inside the White House, the thought of devoting a Fireside Chat to the subject of race riots was deemed "unwise" by the president's counselors. At most, Attorney General Francis Biddle argued, the president "might consider discussing it the next time you talk about the overall domestic situation as one of the problems to be considered."
Roosevelt thought even that too much, and when he gave a Fireside Chat on July 28, one month after the Detroit riots, he devoted not one word to race.
In August, another large riot began—this time in New York City—when Margie Polite, a 35-year-old black woman, was arrested by Patrolman James Collins for disorderly conduct outside the Braddock Hotel on 126th Street in Harlem. Robert Bandy, a black soldier on leave, intervened. He and Collins scuffled, and at some point Bandy allegedly took hold of Collins's nightstick and struck him with it. Bandy tried to run, and Collins shot him in the left shoulder.
The incident was like a spark to kindling on a hot, sweaty night in the city, the kind where the air is thick and humid, and tempers rise to meet the mercury.
Men and women sitting on their fire escapes seeking relief from the stifling heat climbed down the ladders and formed a mob. They lived in those overstuffed, sweltering tenements because of the color of their skin, because the city wouldn't let them leave the ghetto. They were packed into apartments like animals, and now that they were ready to die so that the best ideals of their country might live, their countrymen beat and slaughtered them like animals.
The Harlem Hellfighters, the black men who made up the 369th Infantry Regiment, had been sending letters home from Camp Stewart in Georgia in which they told friends and relatives, often in graphic detail, of the gratuitous insults and violence they endured. Harlem's black press reported on how soldiers were beaten and sometimes lynched in camps across the South. Residents knew of the riots in Detroit and Beaumont. They knew that airplane factories on Long Island, even though desperate for workmen and -women, would not "degrade" their assembly lines with African Americans.
It took 8,000 New York State guards and 6,600 city police officers to quell the violence. In all, 500 people were arrested — all black, 100 of them women. One week later, when the New York Times examined the causes of the riot, it declared that no one should be surprised: "The principal cause of unrest in Harlem and other Negro communities has been [the] complaint of discrimination and Jim Crow treatment of Negroes in the armed forces."
How one man fought South Carolina Democrats to end whites-only primaries – and why that matters now
February 29, 2020
By The Conversation - raw story
A rusting chain-link fence represents a “color line” for the dead in Columbia, South Carolina. In Randolph Cemetery, separated by the barrier from the well-manicured lawn of the neighboring white graveyard, lie the remains of George A. Elmore.
A black business owner and civil rights activist, Elmore is little remembered despite his achievement. But a granite monument at his grave attests to the “unmatched courage, perseverance and personal sacrifice” that saw him take on the South Carolina Democratic Party of the 1940s over its whites-only primaries – and win.
Nearly 75 years after Elmore’s battle, the 2020 Democratic presidential candidates made fervent appeals to African American voters in South Carolina ahead of the primary being held on Feb. 29. For some of the all-white front-runners in the race, it could be a make-or-break moment – a failure to win over sufficient black support would be a major setback, potentially campaign-ending.
It is a far cry from the South Carolina of August 1946, when Elmore, a fair-skinned, straight-haired manager of a neighborhood five-and-dime store, consulted with local civil rights leaders and agreed to try once again to register to vote.
It followed blatant attempts by white Democratic Party officials to deprive African American citizens of their constitutional rights; the officials would move voter registration books from store to store and hide them the moment a black voter entered.
When a clerk mistakenly allowed Elmore to register – thinking he was white, contemporary sources suggest – NAACP activists had a plaintiff to challenge the last whites-only primary in the nation.
‘Let the chips fall’
Excluding black voters at the ballot had already been ruled unconstitutional by the U.S. Supreme Court in 1944’s Smith v. Allwright decision. But in defiance, the South Carolina General Assembly simply redefined the state’s Democratic Party as a private club not subject to laws regulating primaries. Gov. Olin D. Johnston declared: “White supremacy will be maintained in our primaries. Let the chips fall where they may.”
Elmore’s name was promptly purged from the rolls and a cadre of prominent civil rights activists arranged for the NAACP to plead his case.
Columbia civil rights attorney Harold Boulware filed the federal lawsuit. In June 1947, Thurgood Marshall and Robert Carter – like Boulware, graduates of the Howard University School of Law – argued Elmore’s case as a class lawsuit covering all African Americans in the state of voting age. The trial inspired a packed gallery of African American observers, including a young Matthew J. Perry Jr., a future federal district judge, who commented: “Marshall and Carter were hitting it where it should be hit.”
In July, an unlikely ally, Charleston blueblood Judge J. Waties Waring, agreed, ruling that African Americans must be permitted to enroll. “It is time for South Carolina to rejoin the Union,” he concluded. “It is time … to adopt the American way of conducting elections.”
The state Democratic Party again defied the ruling, requiring voters to sign an oath supporting segregation. Judge Waring issued a permanent injunction in 1948 to open the voting rolls: “To say that these rules conform or even pretend to conform to the law as laid down in the case of Elmore v. Rice is an absurdity.”
In that year’s state primary, more than 30,000 African Americans, including George Elmore and his wife Laura, voted. Elmore remarked, “In the words of our other champion, Joe Louis, all I can say is ‘I’m glad I won.’”
His photos of the long line of voters in his community’s precinct are now in the archives of the University of South Carolina where I teach history.
In the years that followed, voter education and registration programs by civil rights organizations transformed the Democratic Party in the state, both in terms of the makeup of its membership and the policies it pursued. The move sparked the departure of many white Democrats to the Republican Party, including the segregationist Sen. Strom Thurmond.
Thurmond’s defection in 1964 legitimized the move for other white Democrats and hard-core segregationists who aligned themselves with an increasingly conservative Republican Party. Not surprisingly, some of the key architects of Richard Nixon’s invidious Southern strategy, which sought to weaken the Democratic Party in the South through the use of dog-whistle politics on racial issues, came from South Carolina.
As this year’s presidential candidates focus on South Carolina, it is clear that the racial makeup of the state’s electorate is vastly different than that in Iowa or New Hampshire, two of the states where the popularity of candidates has already been tested. But Democrats should view the South Carolina primary as more than a shift from voting in small, mostly white states. They should see the state as representative of the party’s strategic core, a strong African American constituency with diverse interests and perspectives.
African Americans in South Carolina have been fighting and winning legal and political battles for voting rights and electoral power since Reconstruction and as Democrats since the 1940s.
A personal price
After Elmore’s victory in 1947, state NAACP President James M. Hinton gave a somber, prophetic warning: “White men want office, and they want the vote of our people. We will be sought after, but we must be extremely careful who we vote for. … We must have a choice between those who have fought us and those who are our friends.”
George Elmore and his family paid a price for challenging the entrenched power of the white Democratic Party in 1946. In an interview with the University of South Carolina’s Center for Civil Rights History and Research, which I lead, his 81-year-old son Cresswell Elmore recalled the retaliation the family experienced. Ku Klux Klan terrorists burned a cross in their yard and threatened their family. Laura Elmore suffered a nervous breakdown and went into a mental hospital. State agents raided Elmore’s liquor store, claiming the liquor he had bought from the standard wholesaler was illegal, and broke the bottles. Soda bottling companies and other vendors refused to send products on credit. Banks called in loans on their home and other property. Forced into bankruptcy, the family moved from house to house and the disruption scattered Cresswell and his siblings. When Elmore died in 1959 at the age of 53, only scant attention was paid to his passing.
The monument at his grave was unveiled in 1981, at a ceremony attended by civil rights veterans including his original attorney, Harold Boulware.
As the Democratic Party and presidential candidates appeal to African American voters, they would do well to remember the remarkable fight Elmore and others waged against the forces of bigotry and injustice. These historical struggles illuminate both the gains made over many generations and the ongoing battle against inequities and voter suppression tactics that persist to this day in South Carolina and across the nation.
How Prohibition changed the way Americans drink, 100 years ago
January 15, 2020
By The Conversation - raw story
On Jan. 17, 1920, one hundred years ago, America officially went dry.
Prohibition, embodied in the U.S. Constitution’s 18th Amendment, banned the sale, manufacture and transportation of alcohol. Yet it remained legal to drink, and alcohol was widely available throughout Prohibition, which ended in 1933.
I am reminded of how easy it was to drink during Prohibition every time I go to the hotel in New Hampshire that hosted the Bretton Woods Conference, which created the modern international monetary system after World War II. The hotel, now known as the Omni Mount Washington Resort, boasts a basement speakeasy called The Cave that served illegal liquor during Prohibition.
The last time I was in The Cave I began wondering, given how prevalent Prohibition-era speakeasies appear to have been, what effect banning alcohol had on consumption rates.
Moreover, are we drinking any more today than we did before Prohibition?
Consumption begins to drop
The Prohibition movement began in the early 1800s based on noble ideas such as boosting savings, reducing domestic violence and improving family life.
At the time, alcohol usage was soaring in the U.S. Some estimates by alcohol opponents put consumption at three times what it is today. Activists thought that prohibiting its sale would curb excess drinking. Their efforts were very effective.
But while Prohibition is often portrayed as a sharp change that happened with one last national call for drinks just before the stroke of midnight, thousands of towns throughout the country had gone dry well before that. More bans took effect during World War I in an effort to save grain.
So to consider the impact of Prohibition on drinking habits, it’s a good idea to start in the years leading up to it. And given that beer, wine and spirits all have different alcohol content, we’ll use the number of “standard” drinks a person consumes to make our comparison. A standard drink contains about 14 grams of pure alcohol. This is the amount of alcohol in a 12-ounce beer, a five-ounce glass of wine or a 1.5-ounce shot of hard liquor.
From 1900 until 1915 – five years before the 18th Amendment passed – the average adult drank about 2.5 gallons of pure alcohol a year, which is about 13 standard drinks per week. Consumption fell sharply by 1916, with the average falling to two gallons a year, or 10 drinks a week.
The Prohibition movement and the local dry laws that preceded it appeared to already be having an impact.
Drinking rebounds
Tracking consumption gets a bit trickier after 1920.
Prohibition meant the federal government no longer had a way to measure how much alcohol people were consuming. So to get around the missing information, researchers have used data on arrests for drunkenness, deaths caused by cirrhosis of the liver, deaths by alcoholism and how many patients were admitted to hospitals for alcoholic psychosis. Put together, the numbers suggest alcohol consumption dropped sharply in 1920, falling to about one-third of what people drank before Prohibition.
Starting in 1921, however, alcohol consumption rebounded quickly and soon reached about two-thirds of pre-Prohibition levels. One likely reason is that the U.S. experienced a severe recession in 1920 and 1921. When the economy recovered in 1922 to start the Roaring ’20s, people were more able to afford illegal liquor.
In the decades after Prohibition ended on Dec. 5, 1933, with the repeal of the 18th Amendment, consumption remained relatively subdued. But by the 1960s and ‘70s, Americans were swilling just as much alcohol as in the early 1900s.
Today Americans drink on average about 2.3 gallons of pure alcohol a year, which is about 12 standard drinks a week, about the same amount they drank before Prohibition.
Prohibition’s legacy
The era of Prohibition left many legacies.
One result is Americans’ preference for pale, bland beers. Drinking also moved from public spaces like saloons into the home.
More negatively, some claim it created organized crime as violence soared and mobsters enriched themselves. It also meant states and the federal government, which relied heavily on excise taxes from liquor sales to fund their budgets, turned to income taxes to help make up the gap. And ultimately it did not result in a significant or lasting drop in alcohol consumption.
For these reasons, many people believe it was a failure, which should give pause to policymakers and others pushing for a ban on smoking or vaping.
And even the person most responsible for drafting Prohibition legislation, U.S. Rep. Andrew Volstead, was no teetotaler himself, suggesting that even those who push such bans can’t abide by them.
So, as an economist, I believe if you want to stop people from doing something injurious to their health, raising the price works better than a ban. That’s how the U.S. cut the share of adult smokers from 40% in the 1970s to 16% by 2018.
The 100th anniversary of Prohibition reminds us that bans rarely work.
The truth about Columbus Day, explained
Christopher Columbus is a historical figure celebrated as a mythical hero in spite of his genocidal and racist past
MATTHEW ROZSA - salon
OCTOBER 14, 2019 9:00AM (UTC)
There are many good reasons as to why Columbus Day is such a controversial holiday. Like Andrew Jackson, Christopher Columbus is a historical figure who is celebrated as a mythical hero in the U.S. in spite of his genocidal, racist and pro-slavery legacy. As a result, a movement exists to replace the national holiday known as Columbus Day with Indigenous People’s Day.
Here is the truth about Columbus Day, explained:
1. Christopher Columbus enslaved the Taínos he encountered in the present-day Bahamas.
When Columbus “discovered” the American continents in 1492 — millions lived there long before Europeans learned of their existence — he encountered a civilization of people known as the Taínos. By his own description, they were curious and friendly, eager to help the new group of people who had landed on their shores. Over time, Columbus enslaved and exploited them, thereby establishing a precedent wherein Europeans would come to the American continents, exploit natives and steal their land. His actions also laid the foundations for the Europeans to introduce African slavery to the American continents, and Columbus is known to have had an African slave with him on his so-called voyages of discovery.
2. Columbus was also a tyrant, generally speaking.
After becoming governor and viceroy of the Indies, Columbus let the power go to his head, becoming a brutal autocrat who was eventually loathed by his own followers. When one man was caught stealing corn, Columbus responded by having his nose and ears cut off before selling him into slavery. When a woman claimed that Columbus was of lowly birth, his brother Bartolomé cut out her tongue, stripped her naked and had her paraded around the colony on the back of a mule. And these are just two examples of many. Eventually, the Spanish monarchs realized that Columbus had become power mad and ordered him and his brothers to return to Spain. He never regained his power, although his freedom was eventually restored.
3. Columbus Day was established to honor Italian Americans — but they deserve better.
When Italians first began immigrating to the U.S. in large numbers in the 19th Century, they faced severe discrimination and — quite often — violence. As a result, President Benjamin Harrison, who was an under-appreciated liberal, designated the 400th anniversary of Columbus’ arrival as “a general holiday,” describing the Genoan as a “pioneer of progress and enlightenment.” It eventually became a national holiday in 1937 as the result of intense lobbying by the Catholic fraternal organization known as the Knights of Columbus. Yet there is no reason why a genocidal tyrant should be a symbol of Italian American pride. I say this as a Jewish American who, though proud of his heritage, would never want a monster like Confederate Secretary of State Judah P. Benjamin to be the symbol of my people’s contribution to the U.S.
4. Indigenous Peoples’ Day makes more sense as a holiday.
Like Columbus Day, Indigenous Peoples’ Day is celebrated on the second Monday of October and exists to honor the millions of indigenous people who inhabited the Western Hemisphere prior to Columbus’ arrival. It was first officially observed in South Dakota in 1989, with the California city of Berkeley deciding to adopt it in 1992 on the 500th anniversary of Columbus landing in the Bahamas. As University of Louisville professor Frank Kelderman recently told the Louisville Courier Journal, “Indigenous Peoples' Day is an attempt to re-shift the focus of the history of conquest in the Americas, to the presence, culture and variety of indigenous lives. I think it speaks to a renewed interest in the traditions and rights of indigenous people in the general culture.”
5. Indigenous people are more interesting than Columbus, anyway.
When we speak of “indigenous people” or “Native Americans,” we frequently do so as if they were a monolith. However, this is a fallacy, analogous to referring to “Europeans” or “Asians” as a specific group. There are numerous European and Asian cultures, and few would argue that there are not meaningful differences between Russians and Spaniards or Italians and Norwegians, between Han Chinese and Tamils or Koreans and the Javanese in Indonesia. Similarly, there are massive differences between the Taínos who Columbus encountered and the Wampanoag encountered by the English Pilgrims who founded Plymouth Colony, or between the Aztec Empire destroyed by Spanish conquistador Hernán Cortés and the Inca Empire which once stretched from modern-day Colombia and Peru to Chile and Argentina. Learning about the great historical contributions of this diverse network of cultures is far more interesting than reading about another egomaniacal despot.
RELATED: Columbus statues vandalized on US holiday named for him
PROVIDENCE, R.I. (AP) — Several Christopher Columbus statues were vandalized with red paint and messages against the 15th century Italian navigator Monday when the U.S. holiday named for one of the fi ... (Associated Press)
Aaron Burr, vice-president who killed Hamilton, had children of color
Philadelphia ceremony honors John Pierre Burr, prominent member of black society now recognised as son of founding father
Amanda Holpuch
The Guardian
Last modified on Sat 24 Aug 2019 17.15 EDT
John Pierre Burr, one of two children the former vice-president Aaron Burr is said to have fathered with a servant from India, was officially memorialized as a descendant of the founding father at a ceremony in Philadelphia on Saturday.
The elder Burr was the vice-president to Thomas Jefferson between 1801 and 1805 but is perhaps best known for killing Alexander Hamilton in a duel, an act which made him the villain in a hit Broadway musical.
The younger Burr, who lived from 1792 to 1864, was a prominent member of black society in Philadelphia. Rumored to be the vice-president’s son for several years, he was officially recognized by the Aaron Burr Association in 2018.
At Eden Cemetery in Philadelphia on Saturday, the not-for-profit association unveiled a headstone for John Pierre in a ceremony which featured a procession of men in tricorn hats, carrying flags.
The headstone identifies John Pierre as the son of Aaron Burr and reads: “Champion of justice and freedom, conductor on the Underground railroad.”
A descendant of John Pierre Burr, Sherri Burr, spoke at the ceremony, which came about largely because of her own work to determine whether Aaron Burr was John Pierre’s father.
Burr, an emeritus professor of law at the University of New Mexico and the third vice-president of the Aaron Burr Association, told the gathered crowd: “From henceforth I hope John Pierre Burr is never again referred to as ‘the natural son’ or ‘the illegitimate son’, but is simply referred to as ‘the son’,” the Washington Post reported.
Along with other evidence Burr found, a DNA test showed she was related to Stuart Johnson, another Burr descendant.
At the Aaron Burr Association’s annual meeting last year, members voted unanimously to recognize that Aaron Burr had two children – the other was Louisa Charlotte – by Mary Eugenie Beauharnais Emmons, who was from Kolkata, India, and was a servant in the Burr home.
The association was founded in 1946 and knew of rumors about John Pierre for more than a decade. In 2005, a black woman named Louella Burr Mitchell Allen came forward, claiming she had traced her lineage to John Pierre.
It was the work of Sherri Burr which swayed members of the association, a group of roughly 75 Burr descendants and history fans, to formally recognize the lineage. Historians told the Post the evidence Burr had found was convincing.
Sherri Burr is working on a book, Aaron Burr’s Family of Color. A historical fiction book about Burr’s “secret wife”, by Susan Holloway Scott, is due for release next month.
Close ties between the founding fathers of the United States and people of color, including the people they enslaved, have become a more prominent thread in public history.
In 2018, Jefferson’s home at Monticello launched an exhibit about Sally Hemings, an enslaved woman who had Jefferson’s children. The relationship was an open secret while Jefferson was alive but for two centuries it was largely avoided at historical sites and in school textbooks.
In 2017, the first comprehensive history of George Washington’s runaway slave, Ona Judge, was published by the historian Erica Armstrong Dunbar. Washington’s home in Virginia, Mount Vernon, hosted an exhibit about Judge.
The Aaron Burr Association timed its headstone installation to coincide with the 400th anniversary of enslaved Africans being brought to the US.
Point Comfort: where slavery in America began 400 years ago
In 1619, a ship with 20 captives landed at Virginia, ushering in the era of slavery in what would become the United States
David Smith in Washington
The Guardian
Wed 14 Aug 2019 00.00 EDT
The blue waters of the Chesapeake lap against the shore. Sunbathers lounge in deckchairs as black children and white children run and play on the beach. And close by stands a magnificent oak tree, its trunk stretching three great arms and canopies of leaves high into the tranquil sky.
Over half a millennium, the Algernoune Oak has witnessed war and peace and the fall of empires, but never a day like the one in late August 1619. It was here that the White Lion, a 160-ton English privateer ship, landed at what was then known as Point Comfort. On board were more than 20 captives seized from the Kingdom of Ndongo in Angola and transported across the Atlantic. This dislocated, unwilling, violated group were the first enslaved Africans to set foot in English North America – ushering in the era of slavery in what would become the United States.
This site, now Fort Monroe in Hampton, southern Virginia, will host a weekend of 400th anniversary commemorations on 23-25 August, culminating in a symbolic release of butterflies and nationwide ringing of bells. Americans of all races will reflect on a historical pivot point that illuminates pain and suffering but also resilience and reinvention. Some see an opportunity for a national reckoning and debate on reparations.
For a people robbed of an origins story, it is also an invitation to go in search of roots – the African in African American.
“Once I learned that I was from there it changed something in me,” said Terry E Brown, 50, who has traced his ancestry to Cameroon and enslaved people in Virginia and North Carolina. “I have a fire in me to just learn about why and who I am. There’s something deep down and spiritual about it and I want to connect to it. I’m American, and I believe in this structure that we have, but I’m emotionally and spiritually tied to Africa now that I know where I came from.”
By the early 17th century the transatlantic slave trade – the biggest forced migration of people in world history – was already well under way in the Caribbean and Latin America. In 1619 it came to the English colony of Virginia. The San Juan Bautista, a Spanish ship transporting enslaved Africans, was bound for Mexico when it was attacked by the White Lion and another privateer, the Treasurer, and forced to surrender its African prisoners.
The White Lion continued on to land at Point Comfort. John Rolfe, a colonist, reported that its cargo was “not anything but 20 and odd Negroes, which the Governor and Cape Merchant bought for victualls”. They were given names by Portuguese missionaries: Antony, Isabela, William, Angela, Anthony, Frances, Margaret, Anthony, John, Edward, Anthony and others, according to research by the Hampton History Museum.
The captain of the White Lion, John Jope, traded the captives to Virginians in return for food and supplies. They were taken into servitude in nearby homes and plantations, their skills as farmers and artisans critical in the daily struggle to survive. Slavery in America was born.
Yet it all requires a leap of imagination in the serenity of today’s 565-acre Fort Monroe national monument, run by the National Park Service, or in the low-key city of Hampton, home to Nasa’s Langley Research Center.
Brown, the first black superintendent at Fort Monroe, said: “The early colonists are trying to survive and they’re not doing it. They’re resorting to cannibalism because they just can’t figure this thing out. When the Africans show up, the game changes a little bit because they knew how to cultivate rice, sugar and cotton, all those things were perfect for this environment and for what they were trying to do.”
It would be another century until the formation of the United States. By 1725, some 42,200 enslaved Africans had been transported to the Chesapeake; by 1775, the total was 127,200. Thomas Jefferson, the author of the declaration of independence, which contains the words “all men are created equal”, was a Virginia slave owner and, by 1860, the US was home to about 3.9 million enslaved African Americans.
The events of 1619 are at once both remote and immediate in a state where white nationalists caused deadly violence in Charlottesville two years ago and in a nation where their enabler occupies the White House.
Brown reflected: “African Americans make up about 13% of the population and our young black men account for about 49% of America’s murders. People who look like me, about 41% of them are sitting in a jail cell. Now I can easily blame that on one thing but I can easily tie it to the very beginning of this country. It’s so easy to treat other people like they’re less than human if you don’t know them. So what I’m hoping this 400th will do is raise the awareness level.
“We’re not going to change people’s behaviour overnight but maybe if you sit back and think, ‘man, 400 years’, they were enslaved for 246 years so they lived under the most oppressive conditions imaginable but they managed to reinvent themselves …They created new music and new art forms and new families. It’s one of the greatest stories and it’s amazing that they survived it.”
Last month, Donald Trump travelled to nearby Jamestown to celebrate the 400th anniversary of the first representative legislative assembly. The US president made reference to the first enslaved Africans’ arrival in Virginia, “the beginning of a barbaric trade in human lives”, but there are currently no plans for him to attend the commemoration at Fort Monroe.
Gaylene Kanoyton, the president of the Hampton branch of the National Association for the Advancement of Colored People (NAACP), said: “He’s not welcome because of everything that we’re commemorating, the arrival of slavery. He’s for white supremacy, he’s for nationalism, he’s for everything that we are against.”
Built by enslaved labour, the fort has a multilayered history, full of contradiction and paradox, like America itself. It witnessed the beginning of slavery but also the end: early in the civil war, three enslaved men seeking freedom escaped to Fort Monroe and were deemed by the commander as “contraband of war”, spurring thousands to seek sanctuary behind Union lines and ultimately a shift in government policy towards emancipation.
There are other threads from past to present. Among the Africans who arrived on the White Lion were Anthony and Isabela who, in 1624 or 1625, had a son, William, who was baptised. In a census they are identified as “Antoney Negro: Isabell Negro: and William theire Child Baptised.” They were living on the property of Captain William Tucker, so are now known by this surname, and William is often described as the first African child born in English North America.
A local family in Hampton believe they are his direct descendants. Walter Jones, 63, whose mother is the oldest living Tucker, said: “We traced as far as we could and then we had word-of-mouth records. We heard this years and years ago and so a lot of us have been through family history and we just never realized how significant it was. From what we’re able to dig up, everything still points to that.”
Jones and his relatives maintain a two-acre cemetery in the historic African American neighborhood of Aberdeen Gardens in Hampton, where many of their ancestors are buried. A simple grey monument is inscribed with the words: “Tucker’s cemetery. First black family. 1619.” A short distance away, a headstone says, “African American female. Approx age 60. Discovered July 2017.” Dozens of white crosses dot patches of grass and soil representing unmarked graves.
Can Jones, a retired software engineer, forgive the enslavers? “The way we were raised and the way I was raised is that we forgive all for some of the things that were done because it wasn’t just them. It was going on everywhere so it was unfortunate and in some cases Africans were also involved in some of the slave trade.
“There’s more discord to not being recognised as being such a vital part of our history and our nation’s history here and what was contributed. We didn’t come here by choice but we chose to excel and to build a country which wasn’t our own. So sometimes I think not having that type of recognition makes you a little bitter. If it hasn’t come by now, when will it? And now that it’s 400 years coming up, how many people truly will even recognise that?”
The Tuckers are not alone. The anniversary coincides with a boom in online and TV genealogy. Donnie Tuck, the mayor of Hampton, a majority African American city, took a DNA test earlier this year and found lineage in Nigeria and other countries.
“Now we look at progress and, with so many documentaries and programs where you’re exploring what slaves went through and the civil war and the period afterwards, I think there’s a whole new emphasis and we have more resources available to us. There’s a real hunger among African Americans to try and know our roots and our experience, our journey here to America and even that whole journey for the last 400 years.”
Some have taken the curiosity further and travelled to Africa. Last month, the congressman James Clyburn was part of a congressional delegation to Ghana, led by the House speaker, Nancy Pelosi, that visited Cape Coast and Elmina castles to observe the 400th anniversary. It was his second trip to the “door of no return”. “All I remember the first time I went there was walking through that door and looking out at the ocean and the impact that was,” he said in a phone interview.
Clyburn believes that America has still not fully confronted the issue of slavery. “It’s an issue that’s been avoided in this country as much as possible. If it were an ongoing process I think that we would be much further down the road on that. We continue to treat this whole issue with what I like to call benign neglect. We tend to feel that if we ignore it, pretend it didn’t happen, then it didn’t happen or if we don’t need to do anything with it then we won’t.”[...]
Over half a millennium, the Algernoune Oak has witnessed war and peace and the fall of empires, but never a day like the one in late August 1619. It was here that the White Lion, a 160-ton English privateer ship, landed at what was then known as Point Comfort. On board were more than 20 captives seized from the Kingdom of Ndongo in Angola and transported across the Atlantic. This dislocated, unwilling, violated group were the first enslaved Africans to set foot in English North America – ushering in the era of slavery in what would become the United States.
This site, now Fort Monroe in Hampton, southern Virginia, will host a weekend of 400th anniversary commemorations on 23-25 August, culminating in a symbolic release of butterflies and nationwide ringing of bells. Americans of all races will reflect on a historical pivot point that illuminates pain and suffering but also resilience and reinvention. Some see an opportunity for a national reckoning and debate on reparations.
For a people robbed of an origins story, it is also an invitation to go in search of roots – the African in African American.
“Once I learned that I was from there it changed something in me,” said Terry E Brown, 50, who has traced his ancestry to Cameroon and enslaved people in Virginia and North Carolina. “I have a fire in me to just learn about why and who I am. There’s something deep down and spiritual about it and I want to connect to it. I’m American, and I believe in this structure that we have, but I’m emotionally and spiritually tied to Africa now that I know where I came from.”
By the early 17th century the transatlantic slave trade – the biggest forced migration of people in world history – was already well under way in the Caribbean and Latin America. In 1619 it came to the English colony of Virginia. The San Juan Bautista, a Spanish ship transporting enslaved Africans, was bound for Mexico when it was attacked by the White Lion and another privateer, the Treasurer, and forced to surrender its African prisoners.
The White Lion continued on to land at Point Comfort. John Rolfe, a colonist, reported that its cargo was “not anything but 20 and odd Negroes, which the Governor and Cape Merchant bought for victualls”. They were given names by Portuguese missionaries: Antony, Isabela, William, Angela, Anthony, Frances, Margaret, Anthony, John, Edward, Anthony and others, according to research by the Hampton History Museum.
The captain of the White Lion, John Jope, traded the captives to Virginians in return for food and supplies. They were taken into servitude in nearby homes and plantations, their skills as farmers and artisans critical in the daily struggle to survive. Slavery in America was born.
Yet it all requires a leap of imagination in the serenity of today’s 565-acre Fort Monroe national monument, run by the National Park Service, or in the low-key city of Hampton, home to Nasa’s Langley Research Center.
Brown, the first black superintendent at Fort Monroe, said: “The early colonists are trying to survive and they’re not doing it. They’re resorting to cannibalism because they just can’t figure this thing out. When the Africans show up, the game changes a little bit because they knew how to cultivate rice, sugar and cotton, all those things were perfect for this environment and for what they were trying to do.”
It would be another century until the formation of the United States. By 1725, some 42,200 enslaved Africans had been transported to the Chesapeake; by 1775, the total was 127,200. Thomas Jefferson, the author of the declaration of independence, which contains the words “all men are created equal”, was a Virginia slave owner and, by 1860, the US was home to about 3.9 million enslaved African Americans.
The events of 1619 are at once both remote and immediate in a state where white nationalists caused deadly violence in Charlottesville two years ago and in a nation where their enabler occupies the White House.
Brown reflected: “African Americans make up about 13% of the population and our young black men account for about 49% of America’s murders. People who look like me, about 41% of them are sitting in a jail cell. Now I can easily blame that on one thing but I can easily tie it to the very beginning of this country. It’s so easy to treat other people like they’re less than human if you don’t know them. So what I’m hoping this 400th will do is raise the awareness level.
“We’re not going to change people’s behaviour overnight but maybe if you sit back and think, ‘man, 400 years’, they were enslaved for 246 years so they lived under the most oppressive conditions imaginable but they managed to reinvent themselves …They created new music and new art forms and new families. It’s one of the greatest stories and it’s amazing that they survived it.”
Last month, Donald Trump travelled to nearby Jamestown to celebrate the 400th anniversary of the first representative legislative assembly. The US president made reference to the first enslaved Africans’ arrival in Virginia, “the beginning of a barbaric trade in human lives”, but there are currently no plans for him to attend the commemoration at Fort Monroe.
Gaylene Kanoyton, the president of the Hampton branch of the National Association for the Advancement of Colored People (NAACP), said: “He’s not welcome because of everything that we’re commemorating, the arrival of slavery. He’s for white supremacy, he’s for nationalism, he’s for everything that we are against.”
Built by enslaved labour, the fort has a multilayered history, full of contradiction and paradox, like America itself. It witnessed the beginning of slavery but also the end: early in the civil war, three enslaved men seeking freedom escaped to Fort Monroe and were deemed by the commander as “contraband of war”, spurring thousands to seek sanctuary behind Union lines and ultimately a shift in government policy towards emancipation.
There are other threads from past to present. Among the Africans who arrived on the White Lion were Anthony and Isabela who, in 1624 or 1625, had a son, William, who was baptised. In a census they are identified as “Antoney Negro: Isabell Negro: and William theire Child Baptised.” They were living on the property of Captain William Tucker, so are now known by this surname, and William is often described as the first African child born in English North America.
A local family in Hampton believe they are his direct descendants. Walter Jones, 63, whose mother is the oldest living Tucker, said: “We traced as far as we could and then we had word-of-mouth records. We heard this years and years ago and so a lot of us have been through family history and we just never realized how significant it was. From what we’re able to dig up, everything still points to that.”
Jones and his relatives maintain a two-acre cemetery in the historic African American neighborhood of Aberdeen Gardens in Hampton, where many of their ancestors are buried. A simple grey monument is inscribed with the words: “Tucker’s cemetery. First black family. 1619.” A short distance away, a headstone says, “African American female. Approx age 60. Discovered July 2017.” Dozens of white crosses dot patches of grass and soil representing unmarked graves.
Can Jones, a retired software engineer, forgive the enslavers? “The way we were raised and the way I was raised is that we forgive all for some of the things that were done because it wasn’t just them. It was going on everywhere so it was unfortunate and in some cases Africans were also involved in some of the slave trade.
“There’s more discord to not being recognised as being such a vital part of our history and our nation’s history here and what was contributed. We didn’t come here by choice but we chose to excel and to build a country which wasn’t our own. So sometimes I think not having that type of recognition makes you a little bitter. If it hasn’t come by now, when will it? And now that it’s 400 years coming up, how many people truly will even recognise that?”
The Tuckers are not alone. The anniversary coincides with a boom in online and TV genealogy. Donnie Tuck, the mayor of Hampton, a majority African American city, took a DNA test earlier this year and found lineage in Nigeria and other countries.
“Now we look at progress and, with so many documentaries and programs where you’re exploring what slaves went through and the civil war and the period afterwards, I think there’s a whole new emphasis and we have more resources available to us. There’s a real hunger among African Americans to try and know our roots and our experience, our journey here to America and even that whole journey for the last 400 years.”
Some have taken the curiosity further and travelled to Africa. Last month, the congressman James Clyburn was part of a congressional delegation to Ghana, led by the House speaker, Nancy Pelosi, that visited Cape Coast and Elmina castles to observe the 400th anniversary. It was his second trip to the “door of no return”. “All I remember the first time I went there was walking through that door and looking out at the ocean and the impact that was,” he said in a phone interview.
Clyburn believes that America has still not fully confronted the issue of slavery. “It’s an issue that’s been avoided in this country as much as possible. If it were an ongoing process I think that we would be much further down the road on that. We continue to treat this whole issue with what I like to call benign neglect. We tend to feel that if we ignore it, pretend it didn’t happen, then it didn’t happen or if we don’t need to do anything with it then we won’t.”[...]
How the 2nd Amendment was ratified to preserve slavery
Thom Hartmann, Independent Media Institute - raw story
17 APR 2019 AT 01:51 ET
The real reason the Second Amendment was ratified, and why it says “State” instead of “Country” (the Framers knew the difference – see the 10th Amendment), was to preserve the slave patrol militias in the southern states, which was necessary to get Virginia’s vote.
Founders Patrick Henry, George Mason, and James Madison were totally clear on that . . . and we all should be too.
In the beginning, there were the militias. In the South, they were also called the “slave patrols,” and they were regulated by the states.
In Georgia, for example, a generation before the American Revolution, laws were passed in 1755 and 1757 that required all plantation owners or their male white employees to be members of the Georgia Militia, and for those armed militia members to make monthly inspections of the quarters of all slaves in the state. The law defined which counties had which armed militias and even required armed militia members to keep a keen eye out for slaves who may be planning uprisings.
As Dr. Carl T. Bogus wrote for the University of California Law Review in 1998, “The Georgia statutes required patrols, under the direction of commissioned militia officers, to examine every plantation each month and authorized them to search ‘all Negro Houses for offensive Weapons and Ammunition’ and to apprehend and give twenty lashes to any slave found outside plantation grounds.”
It’s the answer to the question raised by the character played by Leonardo DiCaprio in Django Unchained when he asks, “Why don’t they just rise up and kill the whites?” If the movie were real, it would have been a purely rhetorical question, because every southerner of the era knew the simple answer: Well regulated militias kept the slaves in chains.
Sally E. Hadden, in her book Slave Patrols: Law and Violence in Virginia and the Carolinas, notes that, “Although eligibility for the Militia seemed all-encompassing, not every middle-aged white male Virginian or Carolinian became a slave patroller.” There were exemptions so “men in critical professions” like judges, legislators and students could stay at their work. Generally, though, she documents how most southern men between ages 18 and 45 – including physicians and ministers – had to serve on slave patrol in the militia at one time or another in their lives.
And slave rebellions were keeping the slave patrols busy.
By the time the Constitution was ratified, hundreds of substantial slave uprisings had occurred across the South. Blacks outnumbered whites in large areas, and the state militias were used both to prevent and to put down slave uprisings. As Dr. Bogus points out, slavery can only exist in the context of a police state, and the enforcement of that police state was the explicit job of the militias.
If the anti-slavery folks in the North had figured out a way to disband – or even move out of the state – those southern militias, the police state of the South would collapse. And, similarly, if the North were to invite into military service the slaves of the South, then they could be emancipated, which would collapse the institution of slavery, and the southern economic and social systems, altogether.
These two possibilities worried southerners like James Monroe, George Mason (who owned over 300 slaves) and the southern Christian evangelical, Patrick Henry (who opposed slavery on principle, but also opposed freeing slaves).
Their main concern was that Article 1, Section 8 of the newly-proposed Constitution, which gave the federal government the power to raise and supervise a militia, could also allow that federal militia to subsume their state militias and change them from slavery-enforcing institutions into something that could even, one day, free the slaves.
This was not an imagined threat. Famously, 12 years earlier, during the lead-up to the Revolutionary War, Lord Dunmore offered freedom to slaves who could escape and join his forces. “Liberty to Slaves” was stitched onto their jacket pocket flaps. During the War, British General Henry Clinton extended the practice in 1779. And numerous freed slaves served in General Washington’s army.
Thus, southern legislators and plantation owners lived not just in fear of their own slaves rebelling, but also in fear that their slaves could be emancipated through military service.
At the ratifying convention in Virginia in 1788, Henry laid it out:
“Let me here call your attention to that part [Article 1, Section 8 of the proposed Constitution] which gives the Congress power to provide for organizing, arming, and disciplining the militia, and for governing such part of them as may be employed in the service of the United States. . . .
“By this, sir, you see that their control over our last and best defence is unlimited. If they neglect or refuse to discipline or arm our militia, they will be useless: the states can do neither . . . this power being exclusively given to Congress. The power of appointing officers over men not disciplined or armed is ridiculous; so that this pretended little remains of power left to the states may, at the pleasure of Congress, be rendered nugatory.”
George Mason expressed a similar fear:
“The militia may be here destroyed by that method which has been practised in other parts of the world before; that is, by rendering them useless, by disarming them. Under various pretences, Congress may neglect to provide for arming and disciplining the militia; and the state governments cannot do it, for Congress has an exclusive right to arm them [under this proposed Constitution] . . . “
Henry then bluntly laid it out:
“If the country be invaded, a state may go to war, but cannot suppress [slave] insurrections [under this new Constitution]. If there should happen an insurrection of slaves, the country cannot be said to be invaded. They cannot, therefore, suppress it without the interposition of Congress . . . . Congress, and Congress only [under this new Constitution], can call forth the militia.”
And why was that such a concern for Patrick Henry?
“In this state,” he said, “there are two hundred and thirty-six thousand blacks, and there are many in several other states. But there are few or none in the Northern States. . . . May Congress not say, that every black man must fight? Did we not see a little of this last war? We were not so hard pushed as to make emancipation general; but acts of Assembly passed that every slave who would go to the army should be free.”
Patrick Henry was also convinced that the power over the various state militias given the federal government in the new Constitution could be used to strip the slave states of their slave-patrol militias. He knew the majority attitude in the North opposed slavery, and he worried they’d use the Constitution to free the South’s slaves (a process then called “Manumission”).
The abolitionists would, he was certain, use that power (and, ironically, this is pretty much what Abraham Lincoln ended up doing):
“[T]hey will search that paper [the Constitution], and see if they have power of manumission,” said Henry. “And have they not, sir? Have they not power to provide for the general defence and welfare? May they not think that these call for the abolition of slavery? May they not pronounce all slaves free, and will they not be warranted by that power?
“This is no ambiguous implication or logical deduction. The paper speaks to the point: they have the power in clear, unequivocal terms, and will clearly and certainly exercise it.”
He added: “This is a local matter, and I can see no propriety in subjecting it to Congress.”
James Madison, the “Father of the Constitution” and a slaveholder himself, basically called Patrick Henry paranoid.
“I was struck with surprise,” Madison said, “when I heard him express himself alarmed with respect to the emancipation of slaves. . . . There is no power to warrant it, in that paper [the Constitution]. If there be, I know it not.”
But the southern fears wouldn’t go away.
Patrick Henry even argued that southerners’ “property” (slaves) would be lost under the new Constitution, and the resulting slave uprising would be less than peaceful or tranquil:
“In this situation,” Henry said to Madison, “I see a great deal of the property of the people of Virginia in jeopardy, and their peace and tranquility gone.”
So Madison, who had (at Jefferson’s insistence) already begun to prepare proposed amendments to the Constitution, changed his first draft of one that addressed the militia issue to make sure it was unambiguous that the southern states could maintain their slave patrol militias.
His first draft for what became the Second Amendment had said: “The right of the people to keep and bear arms shall not be infringed; a well armed, and well regulated militia being the best security of a free country [emphasis mine]: but no person religiously scrupulous of bearing arms, shall be compelled to render military service in person.”
But Henry, Mason and others wanted southern states to preserve their slave-patrol militias independent of the federal government. So Madison changed the word “country” to the word “state,” and redrafted the Second Amendment into today’s form:
“A well regulated Militia, being necessary to the security of a free State [emphasis mine], the right of the people to keep and bear Arms, shall not be infringed.”
Little did Madison realize that one day in the future weapons-manufacturing corporations, newly defined as “persons” by a Supreme Court some have called dysfunctional, would use his slave patrol militia amendment to protect their “right” to manufacture and sell assault weapons used to murder schoolchildren.
The U.S. Government Turned Away Thousands of Jewish Refugees, Fearing That They Were Nazi Spies
In a long tradition of “persecuting the refugee,” the State Department and FDR claimed that Jewish immigrants could threaten national security
smithsonian.com
In the summer of 1942, the SS Drottningholm set sail carrying hundreds of desperate Jewish refugees, en route to New York City from Sweden. Among them was Herbert Karl Friedrich Bahr, a 28-year-old from Germany, who was also seeking entry to the United States. When he arrived, he told the same story as his fellow passengers: As a victim of persecution, he wanted asylum from Nazi violence.
But during a meticulous interview process that involved five separate government agencies, Bahr's story began to unravel. Days later, the FBI accused Bahr of being a Nazi spy. They said the Gestapo had given him $7,000 to steal American industrial secrets—and that he'd posed as a refugee in order to sneak into the country unnoticed. His case was rushed to trial, and the prosecution called for the death penalty.
What Bahr didn’t know, or perhaps didn’t mind, was that his story would be used as an excuse to deny visas to thousands of Jews fleeing the horrors of the Nazi regime.
World War II prompted the largest displacement of human beings the world has ever seen—although today's refugee crisis is starting to approach its unprecedented scale. But even with millions of European Jews displaced from their homes, the United States had a poor track record offering asylum. Most notoriously, in June 1939, the German ocean liner St. Louis and its 937 passengers, almost all Jewish, were turned away from the port of Miami, forcing the ship to return to Europe; more than a quarter died in the Holocaust.
Government officials from the State Department to the FBI to President Franklin Roosevelt himself argued that refugees posed a serious threat to national security. Yet today, historians believe that Bahr's case was practically unique—and the concern about refugee spies was blown far out of proportion.
In the court of public opinion, the story of a spy disguised as a refugee was too scandalous to resist. America was months into the largest war the world had ever seen, and in February 1942, Roosevelt had ordered the internment of tens of thousands of Japanese-Americans. Every day the headlines announced new Nazi conquests.
Bahr was “scholarly” and “broad-shouldered,” a man Newsweek called “the latest fish in the spy net.” Bahr was definitely not a refugee; he had been born in Germany, but immigrated to the U.S. in his teens and became a naturalized citizen. He returned to Germany in 1938 as an engineering exchange student in Hanover, where he was contacted by the Gestapo.
At his preliminary hearing, the Associated Press reported that Bahr was “nattily clad in gray and smiling pleasantly.” By the time his trial began, he had little reason to smile; in a hefty 37-page statement, he admitted to attending spy school in Germany. His defense was that he'd planned to reveal everything to the U.S. government. But he said he'd stalled because he was afraid. “Everywhere, no matter where, there are German agents,” he claimed.
Comments like these only fed widespread fears of a supposed “fifth column” of spies and saboteurs that had infiltrated America. U.S. Attorney General Francis Biddle said in 1942 that “every precaution must be taken...to prevent enemy agents slipping across our borders. We already have had experience with them and we know them to be well trained and clever.” The FBI, meanwhile, released propaganda films that bragged about German spies who had been caught. “We have guarded the secrets, given the Army and Navy its striking force in the field,” one film said.
These suspicions were not only directed at ethnic Germans. “All foreigners became suspect. Jews were not considered immune,” says Richard Breitman, a scholar of Jewish history.
The American ambassador to France, William Bullitt, made the unsubstantiated statement that France fell in 1940 partly because of a vast network of spying refugees. “More than one-half the spies captured doing actual military spy work against the French Army were refugees from Germany,” he said. “Do you believe there are no Nazi and Communist agents of this sort in America?”
These kinds of anxieties weren't new, says Philip Orchard, a historian of international refugee policy. When religious persecution in the 17th century led to the flight of thousands of French Huguenots—the first group ever referred to as “refugees”—European nations worried that accepting them would lead to war with France. Later, asylum seekers themselves became objects of suspicion. “With the rise of anarchism at the turn of the 20th century, there were unfounded fears that anarchists would pose as refugees to enter countries to engage in violence,” Orchard says.
These suspicions seeped into American immigration policy. In late 1938, American consulates were flooded with 125,000 applicants for visas, many coming from Germany and the annexed territories of Austria. But national quotas for German and Austrian immigrants had been set firmly at 27,000.
Read more: SMITHSONIAN.COM
the party of traitors in 1933 are still traitors today!!!
What the GOP learned when the wealthy tried to overthrow FDR and install a fascist dictator
by DailyKos - alternet
March 24, 2019
In 1933, a group of very wealthy bankers on Wall Street were horrified by FDR. He gave government jobs to many of the unemployed, ended the gold standard, and was planning to enact a slew of government programs that would constitute the “New Deal”. Workers would be given the right to unionize, infrastructure projects would be established, and he pitched a pension program for workers called Social Security. For that, he was called a “socialist.” Up to that point, however, his gravest sin was pressing the wealthy to pay their fair share. The Revenue Act, which imposed a “wealth tax” on those at the top, increased the tax rate to 75%.
Roosevelt was accused of being a “traitor to his class”.
Contrary to what the GOP says today, Roosevelt wasn’t trying to conduct some great social experiment. He was trying to pull our nation out of the Great Depression. Unemployment was between 80 and 90 percent in several major cities, and those few who were working were being exploited with zero labor protections. FDR took action. He was hated by the rich, and he didn’t care:
The forces of ‘organized money’ are unanimous in their hate for me – and I welcome their hatred.
The bankers felt that FDR was going after their money, but they went far beyond just trying to sway the election. Some decided that this whole democracy thing just wasn’t working, and they had to take matters into their own hands. This was a few years before the extent of the horror inflicted by the Nazis had surfaced, and fascism had several believers within the American right.
The bankers plotted a coup against FDR, which would later be called the Wall Street Putsch. The conspirators included a bond salesman named Gerald MacGuire; Bill Doyle, the commander of the Massachusetts American Legion; and investment banker Prescott Bush. Yes, that Bush—father of George H.W. and grandfather of W.
They tried to recruit a general named Smedley Butler to their cause. They arranged several meetings and were quite blunt in what they wanted:
The conspirators would provide the financial backing and recruit an army of 500,000 soldiers, which Butler was to lead.
The pretext for the coup would be that FDR’s health was failing. FDR would remain in a ceremonial position, in which, as MacGuire allegedly described, “The President will go around and christen babies and dedicate bridges and kiss children.”
The real power of the government would be held in the hands of a Secretary of General Affairs, who would be in effect a dictator: “somebody to take over the details of the office — take them off the President’s shoulders. […] A sort of a super secretary.”
Unfortunately for the conspirators, the general didn’t bite. In fact, he exposed them at a Congressional hearing as traitors. Initially, Congress and the press treated this whole thing as a joke. However, with Butler’s testimony, along with a reporter who was at one of the meetings, Congress opened an investigation. They found out it was indeed true, but decided not to do anything about it because the plot seemed so far-fetched. No one was prosecuted.
Sadly, this wouldn’t be the last time a wealthy person got away with treason.
The wealthy plutocrats determined they had a problem. As much as the American people were suffering, they believed in democracy. They believed in America, and had contempt for dictatorship. No coup could have worked back then even if Butler had agreed to march on Washington. People wouldn’t have stood for it. So the strategy changed: the kind of Ayn-Randian changes that they wanted could never happen in a democracy. The majority doesn’t want welfare for the rich, denial of healthcare, repeal of the estate tax, or destruction of the social safety net. So democracy itself had to go.
For years, the plutocrats have spent a fortune on campaigns, rightwing media, and astro-turfing groups to cultivate a bizarre authoritarian ideology. I’ve written about the phenomenon of conservative submission before, but I argue that it didn’t happen overnight. It has taken a while, but now a large segment of our population has been utterly convinced that they need to align their beliefs with whatever benefits those at the top. Simultaneously, through constant bombardment from rightwing media, they are fed a steady diet of fear and paranoia about other people. They are told repeatedly that they need a strong leader to save and protect them. These are the ingredients for authoritarianism. For the rich, it doesn’t really matter who the person is, as long as he takes control and gives them what they want.
Has it worked?
According to a study by the University of Massachusetts, the inclination to support authoritarianism is now the most statistically significant trait that identifies someone as a modern, Trump-supporting republican.
Now, when Trump threatens to send the Feds against people on TV who satirize him, or threatens to use the military, police, or “biker gangs” against those who speak out against him, these people cheer.
His supporters openly call for dictatorship. They don’t even bother to hide it anymore.
After World War II, what people hated the most was a fascist dictator. It is ironic that this is exactly what the modern GOP is now demanding. This time, there won’t need to be any ridiculous military coup attempt, but rather just a couple million manipulated morons willing to surrender their rights.
This Is How the Age of Plastics Began
In 1939, the future arrived at the World’s Fair in New York with the slogan, “The World of Tomorrow.”
DAVID A. TAYLOR - mother jones
MARCH 3, 2019 6:00 AM
In the closing months of World War II, Americans talked nonstop about how and when the war would end, and about how life was about to change. Germany would fall soon, people agreed on that. Opinions varied on how much longer the war in the Pacific would go on.
Amid the geopolitical turmoil, a small number of people and newspapers chattered about the dawn of another new age. A subtle shift was about to change the fabric of people’s lives: cork was about to lose its dominance as a cornerstone of consumer manufacturing to a little-known synthetic substance called plastic.
In 1939, the future arrived at the World’s Fair in New York with the slogan, “The World of Tomorrow.” The fairground in Queens attracted 44 million people over two seasons, and two contenders laid claim to being the most modern industrial material: cork and plastic.
For decades, cork had been rising as the most flexible of materials; plastic was just an intriguing possibility. The manifold forms of cork products were featured everywhere, from an international Paris Exhibition to the fair in Queens, where the material was embedded in the Ford Motors roadway of the future.
Meanwhile, plastic made a promising debut, with visitors getting their first glimpse of nylon, Plexiglas, and Lucite. Souvenirs included colorful plastic (phenolic resin) pencil sharpeners molded in the form of the fair’s emblematic, obelisk-shaped Trylon building. Visitors also picked up celluloid badges and pen knives, and a Remington electric razor made of Bakelite, along with plastic ashtrays, pens, and coasters.
In the months after the fair, as US entry into the war became inevitable, the government grew concerned by American dependence on cork, which was obtained entirely from forests in Europe. The United States imported nearly half of the world’s production.
People in their 50s today remember when a bottle cap included a cork sliver insert to seal it. But in 1940, cork was in far more than bottle caps. It was the go-to industrial sealant used in car windshield glazing, insulation, refrigerated containers, engine gaskets, and airplanes. In defense, cork was crucial to tanks, trucks, bomber planes, and weapon systems. As the vulnerability of the supply of this all-purpose material became clear with the Nazi blockade of the Atlantic, the government put cork under “allotment,” or restricted use prioritized for defense. Information about cork supplies became subject to censorship.
In October 1941, the Commerce Department released a hefty report detailing the situation titled “Cork Goes to War.” Besides outlining the growing industrial use of cork, the report highlighted Hitler’s efforts to scoop up Europe’s cork harvests and the need for a systemic American response.
Part of that response was an intense research and development machine that ramped up the nascent synthetic industry to fill gaps in defense pipelines. Some were synthetics first developed by America’s enemies: chemists at Armstrong Cork, an industry leader, crafted new products using materials research from Germany. Many synthetics were developed during the mad scramble to replace organic items that the blockade made expensive. To pay for the research and offset rising materials costs, Armstrong trimmed employees’ use of items like carbon paper and paper clips; the company’s accountants noted 95,000 clips used per month in 1944, a 40 percent decline since the war’s start.
In 1944, a book titled “Plastic Horizons,” by B.H. Weil and Victor Anhorn, documented the promise of plastic. A chapter titled “Plastics in a World at War” opens with an acknowledgment of the blood toll of war. But then the authors trace how war bends science to its needs for new items, both deadly and life-saving: physicists turn to aircraft detection, chemists to explosives. “Nylon for stockings has become nylon for parachutes. Rubber for tires has almost vanished, and desperate measures are required to replace it with man-made elastics.” That section concludes, “Plastics have indeed gone to war.”
In one dramatic example, the authors describe how plastics came to neutralize Germany’s secret weapon: a magnetic mine designed to be laid on the ocean floor and detonated by the magnetic field surrounding any vessel that passed over it. To counteract that, Allied scientists created plastic-coated electric cables that wrapped around the ships’ hulls and “degaussed” them, rendering the mines ineffective. Thanks, polyvinyl chloride!
The book got a glowing review in the New York Times, which noted that America was experiencing a chemical revolution.
Early plastics, as the book explained, covered a wide range of natural or semi-synthetics like celluloid and synthetic resins that could be molded with heat and pressure.
After the war, chronic shortages of common materials like rubber, cork, linseed oil, and paints forced chemists to scramble for substitutes, further speeding the embrace of plastics. Profitable bottling innovations included the LDPE squeeze bottle introduced by Monsanto in 1945, which paved the way for plastic bottles for soaps and shampoos, and the “Crowntainer,” a seamless metal cone-topped beer can.
There was also a shortage of tinplate for metal caps, and industry scrambled to find substitutes. Giles Cooke, the in-house chemist at one manufacturing leader, Crown Cork & Seal, was dabbling in research on synthetic resins for container sealants through the 1940s. In beverage bottling, cork’s quality remained unmatched. You could taste the difference between a cork-sealed bottle and one sealed with plastic. Recognizing that it would take decades to replace cork as a sealant, Cooke and his colleagues hedged their bets with patents on both silicone film container liners and rubber hydrochloride.
In the end, “Plastic Horizons” undersold its subject. Its closing chapter hardly seems to anticipate the ubiquity of plastics we see today, or their formidable waste problem. “In the future, plastics will supplement rather than supplant such traditional structural materials as metals, wood, and glass,” the authors wrote.
“There may be no Plastics Age, but that should discourage no one; applications will multiply with the years,” they continued. “Plastics are indeed versatile materials, and industry, with the help of science, will continue to add to their number and to improve their properties. Justifiable optimism is the order of the day, and the return of peace will enable the plastics industry to fulfill its promise of things to come.”
By 1946, the transition to plastics had reached a new threshold. That year, New York hosted a National Plastics Exposition, where for the first time, a range of strong, new materials and consumer products headed for American homes were on display. One observer noted, “the public are certainly steamed up on plastics.”
The World of Tomorrow indeed.
How Islam spread through the Christian world via the bedroom
by Aeon - alternet
February 2, 2019
There are few transformations in world history more profound than the conversion of the peoples of the Middle East to Islam. Starting in the early Middle Ages, the process stretched across centuries and was influenced by factors as varied as conquest, diplomacy, conviction, self-interest and coercion. There is one factor, however, that is largely forgotten but which played a fundamental role in the emergence of a distinctively Islamic society: mixed unions between Muslims and non-Muslims.
For much of the early Islamic period, the mingling of Muslims and non-Muslims was largely predicated on a basic imbalance of power: Muslims formed an elite ruling minority, which tended to exploit the resources of the conquered peoples – reproductive and otherwise – to grow in size and put down roots within local populations. Seen in this light, forced conversion was far less a factor in long-term religious change than practices such as intermarriage and concubinage.
The rules governing religiously mixed families crystallised fairly early, at least on the Muslim side. The Quran allows Muslim men to marry up to four women, including ‘People of the Book’, that is, Jews and Christians. Muslim women, however, were not permitted to marry non-Muslim men and, judging from the historical evidence, this prohibition seems to have stuck. Underlying the injunction was the understanding that marriage was a form of female enslavement: if a woman was bound to her husband as a slave is to her master, she could not be subordinate to an infidel.
Outside of marriage, the conquests of the seventh and eighth centuries saw massive numbers of slaves captured across North Africa, the Middle East and Central Asia. Female slaves of non-Muslim origin, at least, were often pressed into the sexual service of their Muslim masters, and many of these relationships produced children.
Since Muslim men were free to keep as many slaves as they wished, sex with Jewish and Christian women was considered licit, while sex with Zoroastrians and others outside the ‘People of the Book’ was technically forbidden. After all, they were regarded as pagans, lacking a valid divine scripture that was equivalent to the Torah or the Gospel. But since so many slaves in the early period came from these ‘forbidden’ communities, Muslim jurists developed convenient workarounds. Some writers of the ninth century, for example, argued that Zoroastrian women could be induced or even forced to convert, and thus become available for sex.
Whether issued via marriage or slavery, the children of religiously mixed unions were automatically considered Muslims. Sometimes Jewish or Christian men converted after already having started families: if their conversions occurred before their children attained the age of legal majority – seven or 10, depending on the school of Islamic law – they had to follow their fathers’ faith. If the conversions occurred after, the children were free to choose. Even as fathers and children changed religion, mothers could continue as Jews and Christians, as was their right under Sharia law.
Mixed marriage and concubinage allowed Muslims – who constituted a tiny percentage of the population at the start of Islamic history – to quickly integrate with their subjects, legitimising their rule over newly conquered territories, and helping them grow in number. It also ensured that non-Muslim religions would quickly disappear from family trees. Indeed, given the rules governing the religious identity of children, mixed kinship groups probably lasted no longer than a generation or two. It was precisely this prospect of disappearing that prompted non-Muslim leaders – Jewish rabbis, Christian bishops and Zoroastrian priests – to inveigh against mixed marriage and codify laws aimed at discouraging it. Because Muslims were members of the elite, who enjoyed greater access to economic resources than non-Muslims, their fertility rates were probably higher.
Of course, theory and reality did not always line up, and religiously mixed families sometimes flouted the rules set by jurists. One of the richest bodies of evidence for such families are the biographies of Christian martyrs from the early Islamic period, a little-known group who constitute the subject of my book, Christian Martyrs under Islam (2018). Many of these martyrs were executed for crimes such as apostasy and blasphemy, and not a small number of them came from religiously mixed unions.
A good example is Bacchus, a martyr killed in Palestine in 786 – about 150 years after the death of the Prophet Muhammad. Bacchus, whose biography was recorded in Greek, was born into a Christian family, but his father at some point converted to Islam, thereby changing his children’s status, too. This greatly distressed Bacchus’s mother, who prayed for her husband’s return, and in the meantime, seems to have exposed her Muslim children to Christian practices. Eventually, the father died, freeing Bacchus to become a Christian. He was then baptised and tonsured as a monk, enraging certain Muslim relatives who had him arrested and killed.
Similar examples come from Córdoba, the capital of Islamic Spain, where a group of 48 Christians were martyred between 850 and 859, and commemorated in a corpus of Latin texts. Several of the Córdoba martyrs were born into religiously mixed families, but with an interesting twist: a number of them lived publicly as Muslims but practised Christianity in secret. In most instances, this seems to have been done without the knowledge of their Muslim fathers, but in one unique case of two sisters, it allegedly occurred with the father’s consent. The idea that one would have a public legal identity as a Muslim but a private spiritual identity as a Christian produced a unique subculture of ‘crypto-Christianity’ in Córdoba. This seems to have spanned generations, fuelled by the tendency of some ‘crypto-Christians’ to seek out and marry others like them.
In the modern Middle East, intermarriage has become uncommon. One reason for this is the long-term success of Islamisation, such that there are simply fewer Jews and Christians around to marry. Another reason is that those Jewish and Christian communities that do exist today have survived partly by living in homogeneous environments without Muslims, or by establishing communal norms that strongly penalise marrying out. In contrast to today’s world, where the frontiers between communities can be sealed, the medieval Middle East was a world of surprisingly porous borders, especially when it came to the bedroom.
Trump, Benjamin Franklin and the long history of calling immigrants ‘snakes’
by History News Network - ALTERNET
January 28, 2019
In the midst of a national (non-)dialogue about immigration, one major sticking point has been the belief promoted by Donald Trump that immigrants crossing the southern border are criminals, slinking northward like reptiles to spread their venom.
Speaking at the Conservative Political Action Conference in February 2018, Trump read a poem that he had frequently recited during his campaign to discuss immigration. The poem (first written as a song in the 1960s) tells the story of a woman who rescues a freezing snake but is bitten after she revives it. The last stanza is a dialogue between the woman and the snake: “I saved you, cried the woman, and you’ve bitten me. Heavens why?/ You know your bite is poisonous, and now I’m going to die./ Oh, shut up, silly woman said the reptile with a grin./ You knew damn well I was a snake before you took me in.”
Trump’s unsupported allegations that immigrants are “animals, not people” may find a popular reception among many Americans because the association between immigrants, criminality, and reptility goes back to a period well before the founding of the nation. The sticking point in the national dialogue might be removed if citizens had more information about the cultural origins of the belief that immigrants are felons—the legal equivalent of rattlesnakes.
After the establishment of the British colonies in America in the early seventeenth century, it became a common practice in England for authorities to round up “beggars, Gypsies, prostitutes, the poor, the orphaned,” and other “lewd and dangerous persons” and ship them to the colonies. To pay for their passage, the ship’s captain sold the immigrants into servitude upon arrival. Dennis Todd, author of Defoe’s America, estimates that about 130,000 British immigrants were brought to the Chesapeake Bay region between 1670 and 1718, far outnumbering the property-owning colonists (Todd, 7). Of this number, about half were indentured servants. Many of the remainder were felons who were granted transportation to America as an alternative to capital punishment. After passage of the Transportation Act of 1718, a large class of felonies was made punishable by transportation, rather than death. Between 1718 and 1775, some 40,000 convicts were transported to America (Todd, 8).
In the years prior to the importation of enslaved people from Africa, indentured servants, some of whom came voluntarily, were the principal source of labor in America. They were bound to work for a limited period of time (generally four to seven years), after which they were to be freed. They were often paid “freedom dues” of food, clothing, tools, and land upon completing their terms of service. After 1718, indentured servants were gradually supplanted by enslaved Africans and by transported felons. The felons were often sentenced to fourteen years of labor and did not enjoy the rights accorded to indentured servants. The longer terms of service and the lower social status of transported felons, together with a tighter market for tobacco, meant that felons after 1718 were much less likely to obtain the rewards of transportation that they might once have expected.
Benjamin Franklin, a patriot committed to ideals of human liberty, decried the policies of the British government that sent ships loaded with convicts to America. In 1749, the Assembly in Pennsylvania passed a bill forbidding the importation of convicts, but the measure was rejected by the British Parliament on the grounds “That such Laws are against the Publick Utility, as they tend to prevent the Improvement and Well Peopling of the Colonies” (Franklin, 358). In a satirical letter worthy of Jonathan Swift, Franklin expressed the gratitude of the colonies to “our Mother Country for the Welfare of her Children,” and proposed a fair return for the shipments of convicts. His proposal was that, in the Spring, the colonists should round up thousands of the “venomous Reptiles we call Rattle-Snakes, Felons-convict from the Beginning of the World,” and transport them to Britain, where they may be released in St. James’s Park, in the pleasure gardens about London, “but particularly in the Gardens of the Prime Ministers, the Lords of Trade and Members of Parliament; for to them we are most particularly obliged” (Franklin, 360).
Franklin, of course, did not consider all immigrants to be comparable to rattlesnakes, but many colonists made no distinction between the two groups. One pamphleteer, whom Franklin quoted in his letter to the Pennsylvania Gazette, exclaimed “In what can Britain show a more Sovereign Contempt for us, than by emptying their Jails into our Settlements; unless they would likewise empty their Jakes on our Tables?” (Franklin, 358). Eventually, says Todd, “all servants, free or criminal, came to be seen as socially inferior and unfit” (Todd, 144). Franklin was not brazen enough to propose a wall to keep out both servants and slaves, but in some ways his proposal to send rattlesnakes to Britain went further. His letter was really a cry to Americans to stand up for their own sovereignty (to which they did not yet have a claim, though Franklin thought they should exert it). Presumably, Franklin thought that sovereignty for America would lead to a more humanitarian policy on immigration.
America should be able to control its borders and determine the composition of its citizenry without recourse to rhetoric that demeans both immigrants and the office of the Presidency. The starting point, rather than sticking point, may be the moment when we stop regarding immigrants as rattlesnakes.
RECOMMENDED READING: WHITE TRASH - THE 400-YEAR UNTOLD HISTORY OF CLASS IN AMERICA BY NANCY ISENBERG
THE BORDER PATROL HAS BEEN A CULT OF BRUTALITY SINCE 1924
Greg Grandin - the intercept
January 12, 2019, 6:00 a.m.
SINCE ITS FOUNDING in the early 20th century, the U.S. Border Patrol has operated with near-complete impunity, arguably serving as the most politicized and abusive branch of federal law enforcement — even more so than the FBI during J. Edgar Hoover’s directorship.
The 1924 Immigration Act tapped into a xenophobia with deep roots in U.S. history. The law effectively eliminated immigration from Asia and sharply reduced arrivals from southern and eastern Europe. Most countries were now subject to a set quota system, with the highest numbers assigned to western Europe. As a result, new arrivals to the United States were mostly white Protestants. Nativists were largely happy with this new arrangement, but not with the fact that Mexico, due to the influence of U.S. business interests that wanted to maintain access to low-wage workers, remained exempt from the quota system. “Texas needs these Mexican immigrants,” said the state’s Chamber of Commerce.
Having lost the national debate when it came to restricting Mexicans, white supremacists — fearing that the country’s open-border policy with Mexico was hastening the “mongrelization” of the United States — took control of the U.S. Border Patrol, also established in 1924, and turned it into a frontline instrument of race vigilantism. As the historian Kelly Lytle Hernández has shown, the patrol’s first recruits were white men one or two generations removed from farm life. Some had a military or county sheriff background, while others transferred from border-town police departments or the Texas Rangers — all agencies with their own long tradition of unaccountable brutality. Their politics stood in opposition to the big borderland farmers and ranchers. They didn’t think that Texas — or Arizona, New Mexico, and California — needed Mexican migrants.
Earlier, in the mid-1800s, the Mexican-American War had unleashed a broad, generalized racism against Mexicans throughout the nation. That racism slowly concentrated along an ever-more focused line: the border. While the 1924 immigration law spared Mexico a quota, a series of secondary laws — including one that made it a crime to enter the country outside official ports of entry — gave border and customs agents on-the-spot discretion to decide who could enter the country legally. They had the power to turn what had been a routine daily or seasonal event — crossing the border to go to work — into a ritual of abuse. Hygienic inspections became more widespread and even more degrading. Migrants had their heads shaved, and they were subjected to an increasingly arbitrary set of requirements and the discretion of patrollers, including literacy tests and entrance fees.
The patrol wasn’t a large agency at first — just a few hundred men during its early years — and its reach along a 2,000-mile line was limited. But over the years, its reported brutality grew as the number of agents it deployed increased. Border agents beat, shot, and hanged migrants with regularity. Two patrollers, former Texas Rangers, tied the feet of one migrant and dragged him in and out of a river until he confessed to having entered the country illegally. Other patrollers were members of the resurgent Ku Klux Klan, active in border towns from Texas to California. “Practically every other member” of El Paso’s National Guard “was in the Klan,” one military officer recalled, and many had joined the Border Patrol upon its establishment.
For more than a decade, the Border Patrol operated under the authority of the Department of Labor, which in the early years of the Great Depression, before the election of Franklin D. Roosevelt and his appointment of Frances Perkins as secretary of labor, was a major driver pushing deportation. Perkins, even before she entered FDR’s cabinet, had already criticized Border Patrol brutality. In office, she tried to limit the abuses of immigration officials as much as she could, curtailing warrantless arrests, allowing detained migrants phone calls, and working to extend the protections the New Deal offered citizens to migrant workers, including an effort to make abusive migrant labor contracts more equitable.
Reform was short-lived. The White House, bowing to pressure from agriculturalists, placed the Border Patrol, and migration policy more broadly, under the authority of the Department of Justice. More laws further criminalizing migration reinforced the Border Patrol’s power. For example, the end of the Bracero guest-worker program, along with the 1965 Hart-Celler Act, which for the first time assigned quotas to Mexico and other countries in the Western Hemisphere, now meant that thousands of seasonal Mexican workers were officially “illegal.”
Exporting Paramilitary Policing
At the same time, experience gained in migrant interdiction began to be exported internationally. The Border Patrol is often thought of, even by critics of its brutality, as a sleepy backwater federal agency, far removed from the Cold War’s ideological frontlines. But the Patrol played a role in expanding the radius of Washington’s national security doctrine — the tutoring of allied security forces in counterinsurgency tactics — and accelerating the tempo of paramilitary action.
The career of John P. Longan, who worked as an Oklahoma sheriff before joining the Border Patrol, is illustrative. Following stints in New Mexico and Texas, Longan was tapped to help run Operation Wetback, a mass deportation drive focused mostly on California that, as the Los Angeles Times put it, transformed the patrol into an “army” committed to an “all-out war to hurl tens of thousands of Mexican wetbacks back into Mexico.” Modern armies need a modern intelligence service, and Longan, operating out of an unmarked location in an old Alameda Navy installation, updated the Patrol’s ability to gather and analyze information — including information extracted during interrogations — and then act on that information quickly. A few years later, Longan transferred to the State Department’s Public Safety Program, doing tours in a number of third-world hotspots, including Venezuela, Thailand, the Dominican Republic, and Guatemala. According to Stuart Schrader, in his forthcoming “Badges Without Borders: How Global Counterinsurgency Transformed American Policing,” Longan was one of a number of Border Patrol agents recruited to train foreign police through CIA-linked “public safety” programs, since they were likely to speak Spanish. And having worked the southwestern borderlands, these patrollers-turned-covert operators were familiar with societies built around peonage-like labor relations; they seamlessly extended the kind of free-range immunity they enjoyed at home to poorer, oligarch-ruled nations like Guatemala.
In Guatemala, Longan used intelligence techniques similar to the ones he developed in Operation Wetback to train local police and military officers, creating an “action unit” that could gather information — also mostly from interrogations, many of them including torture — and act on that information in a rapid manner. Within the first three months of 1966, “Operación Limpieza,” or Operation Clean-up, as Longan called his project, conducted over 80 raids and scores of extrajudicial assassinations, including the murder, during one four-day period in early March, of over 30 political activists (I describe Longan’s time in Guatemala in detail here). Likewise, through the early 1970s, the U.S. trained Latin American security forces, the majority from countries run by military governments, at the Border Patrol Academy in Los Fresnos, Texas, where, according to the Los Angeles Times, “CIA instructors” trained them “in the design, manufacture, and potential use of bombs and incendiary devices.”
In This Place, You Have No Rights
Starting in the 1970s, investigative journalists began to report on Border Patrol abuse. Such exposés were damning, but largely ignored. John Crewdson, for instance, won a Pulitzer in 1980 for a series of articles published in the New York Times, including one titled “Border Sweeps of Illegal Aliens Leave Scores of Children in Jails,” yet his 1983 book based on the series, “The Tarnished Door,” is out of print. Crewdson’s reporting on the Border Patrol and the immigration system deserves a revival, for it provides an important back-history to the horrors we are witnessing today.
Patrollers, he reported, regularly engaged in beatings, murder, torture, and rape, including the rape of girls as young as 12. Some patrollers ran their own in-house “outlaw” vigilante groups. Others maintained ties with groups like the Klan. Border Patrol agents also used the children of migrants, either as bait or as a pressure tactic to force confessions. When coming upon a family, agents usually tried to apprehend the youngest member first, with the idea that relatives would give themselves up so as not to be separated. “It may sound cruel,” one patroller said, but it often worked.
Separating migrant families was not official government policy in the years Crewdson was reporting on abuses. But left to their own devices, Border Patrol agents regularly took children from parents, threatening that they would be separated “forever” unless one of them confessed that they had entered the country illegally. Mothers especially, an agent said, “would always break.” Once a confession was extracted, children might be placed in foster care or left to languish in federal jails. Others were released into Mexico alone, far from their homes — forced to survive, according to public defenders, by “garbage-can scrounging, living on rooftops and whatever.” Ten-year-old Sylvia Alvarado, separated from her grandmother as they crossed into Texas, was kept in a small cinderblock cell for more than three months. In California, 13-year-old Julia Pérez, threatened with being arrested and denied food, broke down and told her interrogator that she was Mexican, even though she was a U.S. citizen. The Border Patrol released Pérez into Mexico with no money or way to contact her U.S. family. Such cruelties weren’t one-offs, but part of a pattern, encouraged and committed by officers up the chain of command. The violence was both gratuitous and systemic, including “stress” techniques later associated with the war in Iraq.
The practice, for instance, as recently reported, of placing migrants in extremely cold rooms — called hieleras, or “ice boxes” — goes back decades, at least to the early 1980s, with Crewdson writing that it was a common procedure. Agents reminded captives that they were subject to their will: “In this place, you have no rights.”
Some migrants, being sent back to Mexico, were handcuffed to cars and made to run alongside them to the border. Patrollers pushed “illegals off cliffs,” a patrol agent told Crewdson, “so it would look like an accident.” Officers in the patrol’s parent agency, the Immigration and Naturalization Service, traded young Mexican women they caught at the border to the Los Angeles Rams in exchange for season tickets, and supplied Mexican prostitutes to U.S. congressmen and judges, paying for them out of funds the service used to compensate informants. Agents also worked closely with Texas agriculturalists, delivering workers to their ranches (including to one owned by Lyndon B. Johnson when he was in the White House), then raiding the ranches just before payday and deporting the workers. “The ranchers got their crops harvested for free, the INS men got fishing and hunting privileges on the ranches, and the Mexicans got nothing,” Crewdson reported.
Subsequent reporting confirms that the violence Crewdson documented continued down the years, largely unabated. The remoteness of much of the border region and the harshness of its terrain, the work that straddled the line between foreign and domestic power, and the fact that many of the patrollers were themselves veterans of foreign wars (or hailed from regions with fraught racial relations, including the borderlands themselves) all contributed to a “fortress mentality,” as one officer put it. Patrollers easily imagined their isolated substations to be frontier forts in hostile territory, holding off barbarians. They wielded awesome power over desperate people with little effective recourse. Based on information provided by local migrant advocacy groups, Human Rights Watch wrote in 1993 that in one such substation, in Harlingen, Texas, “physical abuse is often coupled with due process abuses meant to terrorize victims of brutality.” Most captured migrants, beaten or threatened with a beating, signed “voluntary departure agreements” and were “quickly repatriated.”
Between 1982 and 1990, Mexico City sent at least 24 protests to the U.S. State Department on behalf of Mexicans injured or murdered by Border Patrol agents. Just as soldiers use racial epithets for the people they are fighting overseas, Border Patrol agents have a word for their adversaries: “tonks.” It’s “the sound,” one patroller told a journalist, “a flashlight makes when you hit someone over the head.” In neighborhoods filled with undocumented residents, the Patrol operated with the latitude of an occupying army. “Mind your own fucking business, lady, and go back into your house,” one patroller ordered a resident in Stockton, California, who came out on her balcony to see him “kicking a Mexican male who was handcuffed and lying facedown on the ground.”
It wasn’t just the federal Border Patrol that engaged in such sadism, but local law enforcement as well. In 1980, a Texas lawyer affiliated with the United Farm Workers obtained videos of 72 interrogations of migrants that took place over the course of the previous seven years, recorded by the police department in McAllen, Texas. The images were disturbing: Police took turns beating one handcuffed Mexican man, bashing his head on the concrete floor, punching, kicking, and cursing as he pleaded for mercy. The tapes were made for enjoyment, as a kind of bonding ritual that would later be associated with the abuse committed against Iraqi prisoners in Abu Ghraib: As the officers gathered “night after night,” they drank beer and watched “playbacks” of their interrogation sessions. It was, said one of the men involved, a way of initiating new recruits into the cult of border brutalism.
There have been contradictory judicial rulings, but historically, agent power has been limited by no constitutional clause. There are few places patrollers can’t search, no property belonging to migrants they can’t seize. And there is hardly anybody they can’t kill, provided that the victims are poor Mexican or Central American migrants. Between 1985 and 1990, federal agents shot 40 migrants around San Diego alone, killing 22 of them. On April 18, 1986, for instance, patroller Edward Cole was beating 14-year-old Eduardo Carrillo Estrada on the U.S. side of the border’s chain-link fence, when he stopped and shot Eduardo’s younger brother, Humberto, in the back. Humberto was standing on the other side of the fence on Mexican soil. A court ruled that Cole, who had previous incidents of shooting through the fence at Mexicans, had reason to fear for his life from Humberto and used justifiable force.
Such abuses persisted through the 1990s and 2000s. In 1993, the House Subcommittee on International Law, Immigration, and Refugees held hearings on Border Patrol abuse, and its transcript is a catalogue of horrors. One former guard, Tony Hefner, at the INS detention center in Port Isabel, Texas, reported that “a young Salvadoran girl” was forced to “perform personal duties, like dancing the Lambada, for INS officials.” (In 2011, Hefner published a memoir with more accusations of sexual abuse by, as Hefner writes, the INS “brass”). Roberto Martinez, who worked with the San Diego-based U.S.-Mexico Border Program for the American Friends Service Committee, testified that “human and civil rights violations” by the Border Patrol “run the gamut of abuses imaginable” — from rape to murder. Agents regularly seized “original birth certificates and green cards” from Latino citizens, “leaving the victim with the financial burden of having to go through a lengthy process of applying for a new document.” “Rapes and sexual abuse in INS detention centers around the United States,” Martinez said, “seem to be escalating throughout the border region.”
Brutality continued as Washington further militarized both the border and broader immigration policy — first after the 1993 signing of the North American Free Trade Agreement, and then years later with the creation of Immigration and Customs Enforcement and the establishment of the Department of Homeland Security after the 9/11 attacks. Since 2003, Border Patrol agents have killed at least 97 people, including six children. Few agents have been prosecuted. Last year, a 19-year-old Guatemalan Maya woman, Claudia Patricia Gómez Gonzáles, was killed, shot in the head by a still-unnamed Texas Border Patrol agent shortly after she entered the United States. According to a recent report by the American Civil Liberties Union, young girls apprehended by the Patrol have been physically abused and threatened with rape, while unaccompanied children have experienced “physical and psychological abuse, unsanitary and inhumane living conditions, isolation from family members, extended period of detention, and denial of access to legal medical service.”
The viciousness we are witnessing today at the border, directed at children and adults, has a long history, a fact that should in no way mitigate the extraordinary cruelty of Donald Trump. But it does suggest that if the U.S. is to climb out of the moral abyss it has fallen into, it has to think well beyond Trump’s malice. It needs a historical reckoning with the true cause of the border crisis: the long, brutal history of border enforcement itself.
Why Thomas Jefferson’s vision of American Islam matters today
The Conversation - raw story
04 JAN 2019 AT 07:01 ET
The new Congress includes its first two Muslim women members. One of them, Rashida Tlaib of Michigan, considered getting sworn in privately using a copy of the Qur’an from the library of one of America’s Founding Fathers.
She told CNN this shows “Islam has been part of American history for a long time.” I explored this little-known history in my book “Thomas Jefferson’s Qur’an: Islam and the Founders”.
Islam, an American religion
Muslims arrived in North America as early as the 17th century, eventually composing 15 to 30 percent of the enslaved West African population of British America. (Muslims from the Middle East did not begin to immigrate here as free citizens until the late 19th century.) Even key American Founding Fathers demonstrated a marked interest in the faith and its practitioners, most notably Thomas Jefferson.
As a 22-year-old law student in Williamsburg, Virginia, Jefferson bought a Qur’an – 11 years before drafting the Declaration of Independence.
The purchase is symbolic of a longer historical connection between the American and Islamic worlds, and of the nation’s early, robust embrace of religious pluralism.
Although Jefferson did not leave any notes on his immediate reaction to the Qur’an, he did criticize Islam as “stifling free enquiry” in his early political debates in Virginia, a charge he also leveled against Catholicism. He thought both religions fused religion and the state at a time he wished to separate them in his commonwealth.
Despite his criticism of Islam, Jefferson supported the rights of its adherents. Evidence exists that Jefferson had been thinking privately about Muslim inclusion in his new country since 1776. A few months after penning the Declaration of Independence, he returned to Virginia to draft legislation about religion for his native state, writing in his private notes a paraphrase of the English philosopher John Locke’s 1689 “Letter on Toleration”:
“[he] says neither Pagan nor Mahometan [Muslim] nor Jew ought to be excluded from the civil rights of the commonwealth because of his religion.”
The precedents Jefferson copied from Locke echo strongly in his Virginia Statute for Religious Freedom, which proclaims:
“(O)ur civil rights have no dependence on our religious opinions.”
The statute, drafted in 1777 and enacted in 1786, inspired the Constitution’s “no religious test” clause and the First Amendment.
Jefferson’s pluralistic vision
Was Jefferson thinking about Muslims when he drafted his famed Virginia legislation?
Indeed, we find evidence for this in the Founding Father’s 1821 autobiography, where he happily recorded that a final attempt to add the words “Jesus Christ” to the preamble of his legislation failed. And this failure led Jefferson to affirm that he had intended the application of the Statute to be “universal.”
By this he meant that religious liberty and political equality would not be exclusively Christian. For Jefferson asserted in his autobiography that his original legislative intent had been “to comprehend, within the mantle of its protection, the Jew and the Gentile, the Christian and Mahometan [Muslim], the Hindoo, and Infidel of every denomination.”
By defining Muslims as future citizens in the 18th century, in conjunction with a resident Jewish minority, Jefferson expanded his “universal” legislative scope to include every one of every faith.
Ideas about the nation’s religiously plural character were tested also in Jefferson’s presidential foreign policy with the Islamic powers of North Africa. President Jefferson welcomed the first Muslim ambassador, who hailed from Tunis, to the White House in 1805. Because it was Ramadan, the president moved the state dinner from 3:30 p.m. to be “precisely at sunset,” a recognition of the Tunisian ambassador’s religious beliefs, if not quite America’s first official celebration of Ramadan.
Muslims once again provide a litmus test for the civil rights of all U.S. believers. Today, Muslims are fellow citizens and members of Congress, and their legal rights represent an American founding ideal still besieged by fear-mongering and by precedents at odds with the best of our ideals of universal religious freedom.
‘The sunrise city’: Florida community reconciles with history of 1920s race riot
Politicians and activists of Ocoee found recognition for victims of the ‘single bloodiest day’ in modern US political history
Richard Luscombe in Miami
the guardian
Thu 3 Jan 2019 06.00 EST
It has been almost a century since Gladys Franks Bell’s father fled an election day race riot in Florida, clutching his little brothers and sisters and wading through swamps and woodland to safety while the Ku Klux Klan razed the family’s home town of Ocoee.
By the end of the night his uncle July Perry was dead, lynched by a white mob and left hanging from a lamp-post next to a sign reading: “This is what we do to niggers who vote.” The murderous rampage, meanwhile, continued unchecked, claiming dozens of other black lives, according to many accounts, while hundreds of survivors were run out of what then became an all-white town for decades.
Until recently, one of the most shameful episodes of the deep south’s racist past looked destined to be forgotten forever.
But now, thanks to the efforts of local politicians, activists and the Alabama-based Equal Justice Initiative, there is permanent recognition for the victims and their legacy, and an official expression of “regret and horror” from the city of Ocoee, near Orlando.
“It’s been so long, I never thought I’d live to see an acknowledgment that this even happened,” said Bell, who lives in the neighbouring city of Apopka.
“It stayed with me over the years, what my daddy shared with us when we were children, when we used to go into Ocoee and he showed us everything that used to be our property, and told us about his life there and everything that went on.
“Some of it makes you laugh, some makes you cry, and other parts make you downright mad. But they are the facts. It does bring all the memories back.”
A giant step towards healing came in November 2018, when the city of Ocoee – where the census returns between the time of the massacre and 1980 recorded only white residents – adopted a proclamation steeped in symbolism. Ocoee was no longer a so-called sundown city, named for an era when the safety of any black resident could not be guaranteed after dark, the proclamation read. It was henceforth to be “the sunrise city, with the bright light of harmony, justice and prosperity shining upon all our citizens”.
The ball was set rolling toward reconciliation at the start of the decade, when the civil rights historian Paul Ortiz, associate professor of history at the University of Florida, published an essay looking into what he called “the single bloodiest day in modern American political history”.
Ortiz chronicled the events in Ocoee surrounding the presidential election of 2 November 1920. Perry and his friend Mose Norman, two prosperous black businessmen, had tried to register African Americans to vote, in the face of fierce opposition from city leaders, and when Norman attempted to vote himself he was turned away.
Events degenerated quickly after he returned with a shotgun and was beaten and chased off by a mob who had gathered at the polling station. They raced to Perry’s home, where they thought Norman was hiding, and radioed for reinforcements. Klan members from Orlando and Tampa rushed to the scene where Perry, fearful for his family’s safety, fired at the crowd with a shotgun, killing two men.
The mob then overran the house, wounding Perry and pursuing his fleeing family through nearby woods, and expanded their rampage to Ocoee’s northern quarter, burning dozens of homes and two churches, killing an unknown number of people, perhaps as many as 50, according to Ortiz.
Perry’s fate was sealed when he was pulled by Klan members from the county jail in Orlando, shot and strung up. In the following hours, the rioting spread to Ocoee’s southern districts, where hundreds of black residents were forced to leave permanently, with no compensation for their lost property.
The renewed interest in Ocoee’s grim history sparked a new push for reconciliation, bolstered this April when the election day riot was incorporated into the National Memorial for Peace and Justice in Montgomery, Alabama, a memorial dedicated to the victims of racial terror, among them the more than 4,400 black people lynched in the south between 1877 and 1950.
And in May, Ocoee voters elected George Oliver as the city’s first African American commissioner; he joined William Maxwell, the longtime chair of the city’s human relations diversity board, as a driving force for the adoption of the proclamation.
“It’s not so much righting a wrong as an opportunity to look at ourselves, each person as an individual,” Oliver said. “You’ve got to understand where July Perry and Mose Norman were coming from. They dared to prosper in an era of white privilege, dared to leave their home in North Carolina to seek out prosperity … one generation away from slavery.”
“That part became their undoing. They wanted to live the American dream. ”
For Bell, the healing process began decades ago when her father Richard, as a teenager, carried his siblings to safety and helped them build their new life in Plymouth, Florida, 10 miles north of Ocoee, memories she records in her book Visions Through My Father’s Eyes.
“He went through all of that, he never shared any hatred against any white person and he taught us to do the same,” she said. “He’d tell us all about it and we just knew of it not holding any grudges. That’s just the type of man my father was.”
How George H.W. Bush Rode a Fake National Security Scandal to the Top of the CIA
James Risen - the intercept
December 8 2018, 6:00 a.m.
ON DECEMBER 15, 1975, a Senate committee opened hearings on whether George H.W. Bush should be confirmed as director of the Central Intelligence Agency.
It wasn’t going to be a slam dunk.
The Democrats had a huge majority in the Senate, and many were still angry over Bush’s role as a partisan apologist for former President Richard Nixon, who had resigned the year before as a result of the Watergate scandal. What’s more, in the wake of disclosures in the press of pervasive domestic spying by the CIA, the Senate had launched its first aggressive investigation into alleged abuses by the U.S. intelligence community.
Beginning in January 1975, the Church Committee, named for its chair, Idaho Democratic Sen. Frank Church, unearthed one scandal after another at the CIA, the FBI, and the National Security Agency. Long-hidden covert programs, including a series of plots to kill foreign leaders like Cuba’s Fidel Castro and the Congo’s Patrice Lumumba, had been exposed, rocking the CIA. By late 1975, the agency’s public standing was at a low ebb, and the CIA and White House officials in the administration of President Gerald Ford were increasingly worried about the political impact of the disclosures.
For Bush, the CIA job was a major opportunity at a time when his political career was in flux. Until then, his greatest accomplishment in the Republican Party had been to win a House seat in Texas that had always been held by a Democrat. But he had lost a subsequent Senate bid in 1970 and had been bouncing around Republican establishment circles ever since. He had the ignominy of serving as chair of the Republican National Committee during Watergate, which forced him to make repeated public excuses for Nixon.
Bush had also served as United Nations ambassador under Nixon and as head of the U.S. Liaison Office in China under Ford, and now the Washington rumor mill was reporting that Bush, the loyal soldier, was under consideration for a major political prize — to be Ford’s vice presidential running mate in 1976. If he didn’t get the vice president’s slot in 1976, it seemed likely that he might run for the presidency on his own later.
But first he had to get confirmed to the CIA post.
For the Ford White House and the CIA, Bush’s confirmation hearings set the stage for an all-out battle with congressional leaders. At a critical moment, the Ford administration, its allies in Congress, and the intelligence community collaborated to gin up outrage over a fake national security scandal that ultimately helped pull Bush across the finish line. That polarizing strategy has provided a winning model for Republican efforts to discredit and distract ever since, all the way down to Donald Trump, Devin Nunes, and the attempted sliming of the FBI and special counsel Robert Mueller’s Trump-Russia investigation.
The story of how Bush became CIA director is brilliantly told in “A Season of Inquiry Revisited” by Loch K. Johnson, a renowned historian of intelligence at the University of Georgia and former Church Committee staffer.
To get confirmed, Bush had to run a gauntlet through the Senate, where Democrats held 60 seats thanks to a post-Watergate Democratic landslide in the 1974 midterms. If he got the nod, he would be the first partisan political figure ever to run the CIA. Until then, the agency had been led by gray-flannel establishment figures from Wall Street, former senior military officers, or longtime agency professionals.
Standing directly in Bush’s way was Church, who had emerged as the spokesperson and public face of congressional efforts to probe and reform the intelligence community. Church immediately opposed Bush’s nomination, which he saw as an effort by Ford to install a partisan hack at the CIA who would do the bidding of the White House just as Congress was seeking to curb the agency’s abuses. Church viewed the Bush nomination as a direct White House attack on his committee’s investigation.
“We need a CIA that can resist all the partisan pressures which can be brought to bear by various groups inside and outside the government — especially pressures from the White House itself,” Church said in a speech on the Senate floor. “This is why the appointment of Ambassador George Bush is so ill-advised. It is one thing to choose an individual who may have had political experience, and quite another to choose someone whose principal political role has been that of chairman of the Republican National Committee. There is no need to eliminate from consideration an individual simply because he or she may have held public office. But the line must be drawn somewhere, and a man of Mr. Bush’s prolonged involvement in partisan activities at the highest party level surely passes over that line.”
At his confirmation hearing, Bush did little to allay Church’s concerns. Instead, he warned that “we must not see the CIA dismantled,” an obvious attack on the Senate’s investigative efforts.
AS THE HOLIDAYS approached, Bush’s confirmation hung in limbo. Then, on December 23, 1975 — eight days after his confirmation hearing — Richard Welch, the CIA’s station chief in Greece, was returning home from a Christmas party at the U.S. ambassador’s residence in Athens when he was assassinated.
Welch had been a relatively easy target for a local militant group known as 17 November. He had been living in the same house used by several previous CIA station chiefs and had been publicly identified in publications in Greece. The group later claimed that its members had been watching him for months.
But the CIA and the Ford White House quickly saw Welch’s murder as a political windfall. At a time when the CIA was under assault from Congress and Bush’s nomination was in peril in the Senate, there was now a dead CIA hero to mourn.
Ford, waiving restrictions, announced that Welch could be buried at Arlington National Cemetery. The plane carrying his body back home in early January “circled Andrews Air Force Base for three quarters of an hour in order to land live during the Today Show,” according to Johnson’s book.
The CIA and the White House began to exploit Welch’s death to discredit Church and his committee’s work. William Colby, the outgoing CIA director, lashed out at Congress, blaming Welch’s killing on the “sensational and hysterical way the CIA investigations had been handled and trumpeted around the world,” Johnson writes.
There was not a shred of evidence that anything the Church Committee had done had led to Welch’s murder. But the truth didn’t matter to the CIA and the Ford White House, and the campaign to discredit Church and his committee’s investigation worked. After Welch’s murder, public support for the Church Committee waned.
The changed climate proved helpful to Bush. On January 27, 1976, South Carolina Sen. Strom Thurmond argued for his confirmation by claiming that the public was more concerned by disclosures that “are tearing down the CIA” than by the “selection of this highly competent man to repair the damage of this over-exposure,” according to Johnson’s book. Later that day, Bush was confirmed by a vote of 64-27.
Bush only lasted a year as CIA director. Ford — who ended up choosing Bob Dole as his running mate — was defeated by Jimmy Carter in the 1976 election. Bush tried to convince Carter to keep him on as CIA director, but Carter’s vice president was Walter Mondale, who had been a leading member of the Church Committee and had already won a commitment from Carter to try to implement many of the committee’s recommendations for reforming the intelligence community.
So Bush ran for president instead. He lost in the primaries to Ronald Reagan, then rode Reagan’s coattails as his running mate in the 1980 election.
Bush’s political career owes much to the misuse of Welch’s murder. Above all, it helped start a Republican tradition of generating fake national security scandals to discredit Democrats and win political battles. In the wake of Bush’s death, many in the mainstream press and political elite have pinned him to a bygone era of civility, when partisanship was held in check out of concern for some greater good. But playing dirty didn’t start yesterday. There is a straight line from Welch to pre-war intelligence on Iraq’s weapons of mass destruction, Benghazi, and Nunes’s farcical midnight search for evidence that Trump was wiretapped.
Let’s Talk About George H.W. Bush’s Role in the Iran-Contra Scandal
Arun Gupta - the intercept
December 7 2018, 10:06 a.m.
THE EFFUSIVE PRAISE being heaped on former President George H.W. Bush — “a calm and vital statesman” who exuded “decency, moderation, compromise” — risks burying his skeletons with him. One of the most notable skeletons that has gotten scant attention in recent days is his role in the Iran-Contra scandal.
As CIA director in the mid-1970s and as Ronald Reagan’s vice president, Bush helped forge a world of strongmen, wars, cartels, and refugees that continues today. In particular, he was deeply involved in the events that became known as the Iran-Contra scandal, a series of illegal operations that began with a secret effort to arm Contra fighters in Nicaragua in the hopes of toppling the leftist Sandinista government; this effort became connected to drug trafficking, trading weapons for hostages with Iran, and banking scandals.
In 1987, Arthur Liman, chief counsel for the Senate Select Committee on Secret Military Assistance to Iran and the Nicaraguan Opposition, described it as a “secret government-within-a-government … with its own army, air force, diplomatic agents, intelligence operatives and appropriations capacity.” Independent counsel Lawrence Walsh, tasked with investigating Iran-Contra, concluded that the White House cover-up “possibly forestalled timely impeachment proceedings against President Reagan and other officials.” Bush was a central figure in this.
Bush’s spy history is murky. According to Russ Baker, author of “Family of Secrets,” a history of the Bush family, in the late 1950s, Bush allegedly allowed the CIA to use an offshore oil rig he owned near Cuba as a staging ground for anti-Castro Cubans to raid their homeland. In 1967, Bush visited Vietnam as a freshman member of Congress, and Baker claims that Bush was accompanied by his business partner, a CIA agent, to investigate the Phoenix Program, the CIA torture and assassination operation that killed more than 20,000 Vietnamese by 1971.
These pieces come together when Bush served as CIA director from January 1976 to January 1977. During his tenure, he met his future national security adviser, Donald Gregg, who was involved in operations linked to the Phoenix Program as a former CIA station chief in Saigon. There, Gregg fought alongside Cuban exile and CIA agent Felix Rodriguez, who helped track down and kill Cuban revolutionary Che Guevara.
Bush was at the CIA during the height of Operation Condor, an international “kidnap-torture-murder apparatus” run by six Latin American dictatorships and coordinated by Washington. In an Operation Condor plot carried out in October 1976, Chilean secret police assassinated former Chilean diplomat Orlando Letelier and American Ronni Moffitt with a car bomb in Washington, D.C.
Bush misled an FBI investigation about Chile’s responsibility. Also as spy chief, Bush met his Panamanian counterpart, Manuel Noriega, already suspected at the time of drug trafficking. (As president, Bush ordered the invasion of Panama in 1989 to remove Noriega, by then the country’s ruler, from power.)
As vice president, Bush became an architect of the “secret government” that came into being for the Iran-Contra operations. Official investigations of Iran-Contra are limited to the period after October 1984, when Congress banned military and intelligence services from providing direct or indirect support to the Contras. But Gary Webb’s exposé on CIA and Contra links to cocaine smuggling, “Dark Alliance,” dates the covert U.S. support for the Contras to 1981. Cobbled together from remnants of Nicaragua’s defeated National Guard, the Contras were notorious for torture, assassination, and other atrocities. The Phoenix-Condor link reached Central America, as the CIA recruited veterans of Argentina’s Dirty War to train the Contras, who ignited a decadelong war that killed an estimated 50,000 Nicaraguans.
Rolling Stone dates Bush’s involvement in the Contra war to 1982, when he reportedly conspired with CIA chief William Casey in an operation they code-named “Black Eagle.” Working under Bush, Donald Gregg managed finances and operations for the Contras, according to Rolling Stone. Rodriguez handled arms flights to Central America and negotiated with military commanders there. Historian Douglas Valentine has claimed that in 1981, Bush authorized these veterans of the Phoenix Program to initiate a “Pink Plan” terror war against Central American insurgents.
Black Eagle masked its operation by relying on the Israeli Mossad to acquire and ship weapons to Central America, employing Panamanian airfields and companies as fronts, according to the Rolling Stone story. But the planes, once emptied of their arms cargo in Central America, were repurposed by Noriega and the Medellín cartel to ship drugs back to the United States. The CIA allegedly struck a deal with the Medellín cartel’s primary contact, Barry Seal. In return for Seal hauling weapons to the Contras, the CIA protected him as his operations smuggled an estimated $3 billion to $5 billion in drugs into the United States.
The White House also leaned on Gulf State monarchies to cough up more than $40 million for the Contras, violating the 1984 congressional ban known as the Boland Amendment. In 1985, Lt. Col. Oliver North coordinated with Israel to ship more than 2,000 anti-tank missiles to Iran through Israel in exchange for Iran’s assistance in freeing American hostages held in the region — and the profits were used to fund the Contras.
The maneuver, which violated the Arms Export Control Act, was extraordinarily cynical. Iran was mired in a brutal war with Iraq, which was backed by Bush and other senior Reagan administration officials beginning in 1982. Through the BNL bank that would later collapse in scandal, Iraq received more than $4 billion of U.S. Department of Agriculture credits. Most of that money reportedly went to buy weaponry even as Iraq waged chemical warfare against Iran and its own Kurdish citizens.
Both the Contra weapons shipments and the arms-for-hostages deals were exposed in 1986.
Much is still not known about Iran-Contra because of document shredding, deceit, and cover-ups by Reagan-era officials. Congress handcuffed its own inquiry by failing to subpoena Oval Office recordings or to call knowledgeable witnesses. Robert Parry, an Associated Press reporter who uncovered the arms-for-drugs trade years before Webb, criticized the media for failing to dig into the story and for succumbing to White House pressure and perception management.
On Christmas Eve 1992, then-President Bush decapitated Walsh’s investigation. Bush pardoned six figures, including Secretary of Defense Caspar Weinberger, whose trial was about to begin and at which Bush was likely to be called to testify. Walsh was livid. Saying “the Iran-Contra cover-up … has now been completed,” he called Bush a “president who has such a contempt for honesty [and] arrogant disregard for the rule of law.” Bush’s pardons are newly relevant because Bush consulted his attorney general at the time, William Barr, who reportedly did not oppose the pardons. Barr has just been named by President Donald Trump as his nominee for attorney general, where he may once again confront the issue of presidential pardons for senior government officials caught in an illegal conspiracy.
Bush’s role in the Iran-Contra scandal shows that his legacy is far darker than what is being reported amid his death and funeral. The truth is that he coddled dictators and death squads, undermined democratic institutions, and trashed the Constitution. He created the conditions that helped give rise to Donald Trump.
How Hanukkah came to America
The Conversation - raw story
03 DEC 2018 AT 06:21 ET
...Hanukkah’s back story
The word “Hanukkah” means dedication. It commemorates the rededicating of the ancient Temple in Jerusalem in 165 B.C. when Jews – led by a band of brothers called the Maccabees – tossed out statues of Hellenic gods that had been placed there by King Antiochus IV when he conquered Judea. Antiochus aimed to plant Hellenic culture throughout his kingdom, and that included worshipping its gods.
Legend has it that during the dedication, as people prepared to light the Temple’s large oil lamps to signify the presence of God, only a tiny bit of holy oil could be found. Yet, that little bit of oil remained alight for eight days until more could be prepared. Thus, on each of Hanukkah’s eight evenings, Jews light candles, adding one more as the festival progresses.
Hanukkah’s American story
Today, America is home to almost 7 million Jews. But Jews did not always find it easy to be Jewish in America. Until the late 19th century, America’s Jewish population was very small, reaching only about 250,000 by 1880. The basic goods of Jewish religious life – such as kosher meat and candles, Torah scrolls, and Jewish calendars – were often hard to find.
In those early days, major Jewish religious events took special planning and effort, and minor festivals like Hanukkah often slipped by unnoticed.
My own study of American Jewish history has recently focused on Hanukkah’s development.
It began with a simple holiday hymn written in 1840 by Penina Moise, a Jewish Sunday school teacher in Charleston, South Carolina. Her evangelical Christian neighbors worked hard to bring the local Jews into the Christian fold. They urged Jews to agree that only by becoming Christian could they attain God’s love and ultimately reach Heaven.
Moise, a famed poet, saw the holiday celebrating dedication to Judaism as an occasion to inspire Jewish dedication despite Christian challenges. Her congregation, Beth Elohim, publicized the hymn by including it in their hymnbook.
This English-language hymn expressed a feeling common to many American Jews living as a tiny minority. “Great Arbiter of human fate whose glory ne’er decays,” Moise began the hymn, “To Thee alone we dedicate the song and soul of praise.”
It became a favorite among American Jews and could be heard in congregations around the country for another century.
Shortly after the Civil War, Cincinnati Rabbi Max Lilienthal learned about special Christmas events for children held in some local churches. To adapt them for children in his own congregation, he created a Hanukkah assembly where the holiday’s story was told, blessings and hymns were sung, candles were lighted and sweets were distributed to the children.
His friend, Rabbi Isaac M. Wise, created a similar event for his own congregation. Wise and Lilienthal edited national Jewish magazines where they publicized these innovative Hanukkah assemblies, encouraging other congregations to establish their own.
Lilienthal and Wise also aimed to reform Judaism, streamlining it and emphasizing the rabbi’s role as teacher. Because they felt their changes would help Judaism survive in the modern age, they called themselves “Modern Maccabees.” Through their efforts, special Hanukkah events for children became standard in American synagogues.
20th-century expansion
By 1900, industrial America produced the abundance of goods exchanged each Dec. 25. Christmas’ domestic celebrations and gifts to children provided a shared religious experience to American Christians otherwise separated by denominational divisions. As a home celebration, it sidestepped the theological and institutional loyalties voiced in churches.
For the 2.3 million Jewish immigrants who entered the U.S. between 1881 and 1924, providing their children with gifts in December proved they were becoming American and obtaining a better life.
But by giving those gifts at Hanukkah, instead of adopting Christmas, they also expressed their own ideals of American religious freedom, as well as their own dedication to Judaism.
After World War II, many Jews relocated from urban centers. Suburban Jewish children often comprised small minorities in public schools and found themselves coerced to participate in Christmas assemblies. Teachers, administrators and peers often pressured them to sing Christian hymns and assert statements of Christian faith.
From the 1950s through the 1980s, as Jewish parents argued for their children’s right to freedom from religious coercion, they also embellished Hanukkah. Suburban synagogues expanded their Hanukkah programming.
As I detail in my book, Jewish families embellished domestic Hanukkah celebrations with decorations, nightly gifts and holiday parties to enhance Hanukkah’s impact. In suburbia, Hanukkah’s theme of dedication to Judaism shone with special meaning. Rabbinical associations, national Jewish clubs and advertisers of Hanukkah goods carried the ideas for expanded Hanukkah festivities nationwide.
In the 21st century, Hanukkah accomplishes many tasks. Amid Christmas, it reminds Jews of Jewish dedication. Its domestic celebration enhances Jewish family life. In its similarity to Christmas domestic gift-giving, Hanukkah makes Judaism attractive to children and – according to my college students – relatable to Jews’ Christian neighbors. In many interfaith families, this shared festivity furthers domestic tranquility.
In America, this minor festival has attained major significance.
Coard: Know Thanksgiving instead of celebrating it
Michael Coard - philly tribune
11/16/18
I wouldn’t go so far as to say the white man is the devil. But I will say, as I have always said, “A devil is what a devil does.” And, historically speaking, the white man has done a whole lotta devilment, especially here in the land called America.
Since white folks, and sadly Black folks too, in this country will celebrate Thanksgiving next week, I’ll use this week’s Freedom’s Journal column to expose some irrefutable proof of that racist devilment.
Let’s begin at the beginning, which was white invasion resulting in Red genocide. Howard N. Simpson, M.D. in Invisible Armies: The Impact of Disease on American History writes, “The Europeans were able to conquer America not because of their military genius or their religious motivation or their ambition or [even] their greed. They conquered it by waging... biological warfare.” And J. Leitch Wright Jr. in The Only Land They Knew notes, “In 1623, the British indulged in the first use of chemical warfare in the colonies when negotiating a treaty with tribes, headed by Chief Chiskiac, near the Potomac River. The British offered a toast symbolizing ‘eternal friendship,’ whereupon the chief, his family, advisors, and two hundred followers dropped dead of poison.”
And in a 1763 letter to a colleague, Sir Jeffrey Amherst, a high-ranking British military officer, not only suggested using vicious wild dogs to hunt down Red men, women, and children- which was brutally done- but also suggested using diseased blankets on Red men, women, and children when he wrote, “Could it not be contrived to send Small Pox among those disaffected tribes of Indians? We must on this occasion use every stratagem in our power to reduce them.” And that was satanically done.
However, you might say those aren’t examples of Thanksgiving. You might also say Thanksgiving was invented by Europeans as an expression of unity and appreciation between the two races. But you’d be wrong, dead wrong- as dead as the murdered so-called Indians.
Thanksgiving, as an American holiday, is a celebration of racist genocide. But don’t take my word for it. Listen to what Wamsutta (also known as Frank B. James), the official representative of the Wampanoag Nation, wrote in 1970 in response to an invitation from the Massachusetts Department of Commerce for his “tribe” to participate in the 350th anniversary of the Pilgrims’ landing:
“This is a time of celebration for you- celebrating an anniversary of a beginning for the white man in America.... It is with a heavy heart that I look back upon what happened to my people. Even before the Pilgrims landed [here], it was a common practice for explorers to capture Indians, take them to Europe, and sell them as slaves.... The Pilgrims had hardly explored the shores of Cape Cod for four days before they had robbed the graves of my ancestors and stolen their corn and beans.... Massasoit, the great Sachem of the Wampanoag, knew these facts. Yet he and his people welcomed and befriended the settlers.... This action by Massasoit was perhaps our biggest mistake. We, the Wampanoag, welcomed you, the white man, with open arms, little knowing that it was the beginning of the end, that before 50 years were to pass, the Wampanoag would no longer be a free people....
History gives us facts and there were atrocities. There were broken promises and most of these centered around land ownership.... Never before had we had to deal with fences and stone walls. But the white man needed to prove his worth by the amount of land that he owned. Only ten years later, when the Puritans came, they treated the Wampanoag with even less kindness in converting the souls of the so-called ‘savages....’ [And the Indians who rejected the Puritans’ Christianity were] pressed between stone slabs and [also] hanged.... And... down through the years, there is record after record of Indian lands taken and... reservations set up....
Although time has drained our culture and our language is almost extinct, we the Wampanoags still walk the lands of Massachusetts. [And] our spirit refuses to die.... We still have the spirit. We still have the unique culture. We still have the will and, most important of all, the determination to remain as Indians.
We are determined, and our presence here this evening is living testimony that this is only the beginning of the American Indian... to regain the position in this country that is rightfully ours.”
But Brother Wamsutta’s September 10, 1970 speech was never heard publicly at the anniversary event because Massachusetts’ white government officials banned him from reading it aloud after they had requested and received a copy of it beforehand.
Here are three facts you must know about white folks’ Thanksgiving so you won’t make the mistake of celebrating and thereby whitewashing the horrific physical and biological slaughter of our brave Red sisters and brothers.
1. The Red nations (and there were five hundred of them on this land they called Turtle Island) were inhabited by people accurately and generally known as the Onkwehonwe, whose ancestors had been in the so-called New World for approximately 14,000 years. White Thanksgiving was founded thousands of years later, in 1621, in Plymouth, Massachusetts, by Pilgrims seeking to promote European religious traditions a year after they arrived from England.
2. As further explained by Professor James W. Loewen in Lies My Teacher Told Me, “The Pilgrims did not introduce the tradition.... Indians had observed autumnal harvest celebrations for centuries. Although George Washington... [in 1789 did issue a proclamation setting aside November 26] as a national day of thanksgiving, our modern celebration dates back only to 1863. During the Civil War, when the Union needed all the patriotism that such an observance might muster, Abraham Lincoln proclaimed Thanksgiving a national holiday.” By the way, as Francis Russell Stoddard notes in The Truth About the Pilgrims, the term “Pilgrims” wasn’t even used until the 1870s.
3. Shortly after the Pilgrims (and later the Puritans) arrived in/invaded this land and throughout the history of the United States, most notably following Congressional passage of Senate Bill 102 signed by President Andrew Jackson in 1830 and known as the Indian Removal Act- resulting in the gruesome “Trail of Tears”- Red people by the millions decreased in number as the genocidal terrorism, biological warfare, torture, rape, murder, land theft, and colonization increased.
Despite the hellish tradition of white Thanksgiving, I’m certainly not suggesting that Black folks not chill out on November 22 by hanging out, socializing, eating, and drinking with your family. In fact, you should do all that because it’s important for families, especially Black families, to come together as often as possible. Furthermore, that chilling out could also include watching professional football (unless you’re still boycotting like me). But when you watch the Washington game on that day, don’t use the racist slur by calling that team the “Redskins” unless you call their Dallas opponents (and all other NFL teams) the “Crackers.”
Think about it.
A threat to democracy: Republicans' war on minority voters
The Republican party has harassed, obstructed, frustrated and purged American citizens from having a say in their own democracy for more than 150 years
Carol Anderson
the guardian
Wed 31 Oct 2018 06.00 EDT
It was a mystery worthy of crime novelist Raymond Chandler. On 8 November 2016, African Americans did not show up. It was like a day of absence. African Americans had virtually boycotted the election because they “simply saw no affirmative reason to vote for Hillary”, as one reporter explained, before adding, with a hint of an old refrain, that “some saw her as corrupt”. As proof of blacks’ coolness toward her, journalists pointed to the much greater turnout for Obama in 2008 and 2012.
It is true that, nationwide, black voter turnout had dropped by 7% overall. Moreover, less than half of Hispanic and Asian American voters came to the polls.
This was, without question, a sea change. The tide of African American, Hispanic and Asian voters that had previously carried Barack Obama into the White House and kept him there had now visibly ebbed. Journalist Ari Berman called it the most underreported story of the 2016 campaign. But it’s more than that.
The disappearing minority voter is the campaign’s most misunderstood story.
Minority voters did not just refuse to show up; Republican legislatures and governors systematically blocked African Americans, Hispanics and Asian Americans from the polls. Pushed by both the impending demographic collapse of the Republican party, whose overwhelmingly white constituency is becoming a smaller share of the electorate, and the GOP’s extremist inability to craft policies that speak to an increasingly diverse nation, the Republicans opted to disfranchise rather than reform. The GOP enacted a range of undemocratic and desperate measures to block the access of African American, Latino and other minority voters to the ballot box.
Using a series of voter suppression tactics, the GOP harassed, obstructed, frustrated and purged American citizens from having a say in their own democracy.
The devices the Republicans used are variations on a theme going back more than 150 years. They target the socioeconomic characteristics of a people (poverty, lack of mobility, illiteracy, etc) and then soak the new laws in “racially neutral” justifications – such as “administrative efficiency” or “fiscal responsibility” – to cover the discriminatory intent. Republican lawmakers then act aggrieved, shocked and wounded that anyone would question their stated purpose for excluding millions of American citizens from the ballot.
The millions of votes and voters that disappeared behind a firewall of hate and partisan politics was a long time in the making. The decisions to purposely disenfranchise African Americans, in particular, can be best understood by going back to the close of the civil war.
After Reconstruction, the plan was to take years of state-sponsored “trickery and fraud” and transform those schemes into laws that would keep blacks away from the voting booth, disfranchise as many as possible, and, most important, ensure that no African American would ever assume real political power again.
The last point resonated. Reconstruction had brought a number of blacks into government. And despite their helping to craft “the laws relative to finance, the building of penal and charitable institutions, and, greatest of all, the establishment of the public school system”, the myth of incompetent, disastrous “black rule” dominated. Or, as one newspaper editor summarized it: “No negro is fit to make laws for white people.”
Of course, the white lawmakers couldn’t be that blatant about their plans to disfranchise; there was, after all, that pesky constitution to contend with, not to mention the 15th amendment covering the right to vote with its language barring discrimination “on account of race”. Undaunted, they devised ways to meet the letter of the law while doing an absolute slash-and-burn through its spirit.
That became most apparent in 1890 when the Magnolia State passed the Mississippi Plan, a dizzying array of poll taxes, literacy tests, understanding clauses, newfangled voter registration rules, and “good character” clauses – all intentionally racially discriminatory but dressed up in the genteel garb of bringing “integrity” to the voting booth. This feigned legal innocence was legislative evil genius.
As the historian C Vann Woodward concluded, “The restrictions imposed by these devices [in the Mississippi Plan] were enormously effective in decimating the Negro vote.” Indeed, by 1940, shortly before the United States entered the war against the Nazis, only 3% of age-eligible blacks were registered to vote in the south.
Senator Theodore Bilbo, one of the most virulent racists to grace the halls of Congress, boasted of the chicanery (of the Mississippi Plan) nearly half a century later. “What keeps ’em [blacks] from voting is section 244 of the [Mississippi] Constitution of 1890 … It says that a man to register must be able to read and explain the Constitution or explain the Constitution when read to him.”
While the Civil Rights Movement and the subsequent Voting Rights Act of 1965 seemed to disrupt and overturn disfranchisement, the forces of voter suppression refused to rest. The election of Barack Obama to the presidency in 2008 and 2012 sent tremors through the right wing in American politics. They seized their opportunity in 2013 after the US supreme court gutted the Voting Rights Act and doubled down on some vestiges of the Jim Crow era, such as felony disfranchisement.
In 2016, one in 13 African Americans had lost their right to vote because of a felony conviction – compared with one in 56 non-black voters. The felony disfranchisement rate in the United States has grown by 500% since 1980. In America, mass incarceration equals mass felony disfranchisement. With the launch of the war on drugs, millions of African Americans were swept into the criminal justice system, many never to exercise their voting rights again.
Generally, the incarcerated cannot vote, but once they have served their time, which sometimes includes parole or probation, there is a process – often arcane and opaque – that allows for the restoration of voting rights. Overall, 6.1 million Americans have lost their voting rights. Currently, because of the byzantine rules, “approximately 2.6 million individuals who have completed their sentences remain disenfranchised due to restrictive state laws”, according to The Sentencing Project.
The majority are in Florida. The Sunshine State is actually an electorally dark place for 1.7 million citizens because “Florida is the national champion of voter disenfranchisement”, according to the Florida Center for Investigative Reporting. The state leads the way in racializing felony disfranchisement as well. “Nearly one-third of those who have lost the right to vote for life in Florida are black, although African Americans make up just 16% of the state’s population,” according to Conor Friedersdorf’s reporting for the Atlantic.
Florida is one of only four states, along with Kentucky, Iowa and Virginia, that “permanently” disfranchise felons.
The term “permanent” means that there is no automatic restoration of voting rights. Instead, there is a process to plead for dispensation, which usually requires petitioning all the way up to the governor after a specified waiting period. Republican Governor Rick Scott has made that task doubly difficult. The Florida Office of Executive Clemency, which he leads, meets only four times a year and has more than 10,000 applications waiting to be heard. An ex-offender cannot even apply to have his or her voting rights restored until 14 years after all the sentencing requirements have been met. The process is daunting enough as it is, but Scott has slowed it down considerably.
His predecessor, a moderate Republican turned Democrat, “restored rights to 155,315 ex-offenders” over a four-year span. Since 2011, however, Scott has approved only 2,340 cases.
Republicans in Georgia have brought their own distinct twist to voter suppression. Secretary of State Brian Kemp has developed a pattern of going after and intimidating organizations that register minorities to vote. In 2012, when the Asian American Legal Advocacy Center (AALAC) realized that a number of its clients, who were newly naturalized citizens, were not on the voter rolls although they had been registered, its staff made an inquiry with the secretary of state’s office.
After waiting and waiting and still receiving no response, AALAC issued an open letter expressing concern that the early voting period would close before they had an answer. Two days later, in a show of raw intimidation, Kemp launched an investigation questioning the methods the organization had used to register new voters. One of the group’s attorneys was “aghast … ‘I’m not going to lie: I was shocked, I was scared.’” AALAC remained under this ominous cloud for more than two years before Kemp’s office finally concluded there was no wrongdoing.
Kemp then went after the New Georgia Project when in 2014 the organization decided to whittle away at the bloc of 700,000 unregistered African American voters in the state and, in its initial run, registered nearly 130,000 mostly minority voters. Kemp didn’t applaud and see democracy in action. Instead, he exclaimed in a TV interview, “We’re just not going to put up with fraud.”
Later, when talking with a group of fellow Republicans behind closed doors, he didn’t claim “fraud”. It was something much baser. “Democrats are working hard … registering all these minority voters that are out there and others that are sitting on the sidelines,” he warned. “If they can do that, they can win these elections in November.” Not surprisingly, within two months of that discussion, he “announced his criminal investigation into the New Georgia Project”. And, just as before, Kemp’s hunt for fraud dragged on and on with aspersions and allegations filling the airwaves and print media while no evidence of a crime could be found.
Vote suppression has become far too commonplace. In 2017, “99 bills to limit access to the ballot have been introduced in 31 states … and more states have enacted new voting restrictions in 2017 than in 2016 and 2015 combined”, according to Ari Berman.
Yet, while there are far too many states that are eager to reduce “one person, one vote” to a meaningless phrase, others, such as Oregon, are determined to “make voting convenient” and “registration simple” because these “policies are good for civic engagement and voter participation”. In 2015, Oregon pioneered automatic voter registration (AVR). Under AVR, Oregon added 68,583 new voters in just six months. By the end of July 2016, the state’s “torrid pace” had swelled the rolls by 222,197 new voters.
California took one look at its neighbor to the north and is “hard on Oregon’s heels”. Secretary of State Alex Padilla, dissatisfied with his own state’s abysmal 42% voter turnout rate, had been scouring the nation looking for best practices. “We want to serve as a contrast to what we see happening in other states, where they are making it more difficult to register or actually cast a ballot,” he said. California, thus, adopted and then adapted Oregon’s AVR program to include preregistration of 16- and 17-year-olds who are then automatically registered to vote when they turn 18.
These state initiatives to remove the barriers to the ballot box, including the use of mail-in ballots – which has had tremendous success in Colorado – are beginning to ricochet around the nation.
To date, ten states have implemented AVR and “15 states have introduced automatic voter registration proposals in 2018”.
Democrats in Congress have also pushed for legislation to enact a federal AVR program, because the United States consistently ranks toward the bottom of developed democracies in terms of voter turnout. In July 2016, Senators Patrick Leahy (D-VT), Dick Durbin (D-IL), and Amy Klobuchar (D-MN) co-sponsored legislation that would take AVR nationwide. Leahy remarked, “There is no reason why every eligible citizen cannot have the option of automatic registration when they visit the DMV, sign up for healthcare or sign up for classes in college.”
No reason at all, except not one Republican in Congress has stepped up to support the bill.
Thus, when thirty-one states are vying to develop new and more ruthless ways to disfranchise their population, and when the others are searching desperately for ways to bring millions of citizens into the electorate, we have created a nation where democracy is simultaneously atrophying and growing – depending solely on where one lives. History makes clear, however, that this is simply not sustainable. It wasn’t sustainable in the antebellum era. It wasn’t sustainable when the poll tax and literacy test gave disproportionate power in Congress to Southern Democrats. And it’s certainly not sustainable now. Or, as Abraham Lincoln soberly observed, “I believe this government cannot endure, permanently half slave and half free.”
Here are 7 things the United Daughters of the Confederacy might not want you to know about them
Kali Holloway, Independent Media Institute - raw story
07 OCT 2018 AT 01:34 ET
It’s helpful, in the midst of any conversation about this country’s Confederate monuments, to understand who put these things up, which also offers a clue as to why. In large part, the answer to the first question is the United Daughters of the Confederacy, a white Southern women’s “heritage” group founded in 1894. Starting 30 years after the Civil War, as historian Karen Cox notes in her 2003 book “Dixie’s Daughters,” “UDC members aspired to transform military defeat into a political and cultural victory, where states’ rights and white supremacy remained intact.” In other words, when the Civil War gave them lemons, the UDC made lemonade. Horribly bitter, super racist lemonade.
Though the UDC didn’t invent the Lost Cause ideology, they were deeply involved in spreading the myth, which simultaneously contends the Confederacy wasn’t fighting to keep black people enslaved while also suggesting slavery was pretty good for everyone involved. Lost Causers — plenty of whom exist today, their sheer numbers a reflection of the UDC’s effectiveness — argue that Confederate monuments are just innocent statues; that taking them down erases history; that we cannot retroactively apply today’s ideas about the morality of slavery to the past. The response to those ridiculous cop-outs is that Confederate monuments honor and glorify people who fought to maintain black chattel slavery; that they were erected for the explicit purpose of obfuscating history; and that the immorality of slavery was always understood by the enslaved. Excuses, excuses: get better at them.
“In their earliest days, the United Daughters of the Confederacy definitely did some good work on behalf of veterans and in their communities,” says Heidi Christensen, former president of the Seattle, Washington, chapter of the UDC, who left the organization in 2012. “But it’s also true that since the UDC was founded in 1894, it has maintained a covert connection with the Ku Klux Klan. In fact, in many ways, the group was the de facto women’s auxiliary of the KKK at the turn of the century. It’s a connection the group downplays now, but evidence of it is easily discoverable — you don’t even have to look very hard to find it.”
In 2017, after the white nationalist Unite the Right rally in Charlottesville, UDC President Patricia M. Bryson posted an open letter claiming the UDC’s members “have spent 123 years honoring [Confederate soldiers] by various activities in the fields of education, history and charity, promoting patriotism and good citizenship,” and that members, “like our statues, have stayed quietly in the background, never engaging in public controversy.” But that isn’t true, not by a stretch. The UDC’s monuments, books, education and political agenda have always spoken loudly—in absolutely deafening shouts — on issues from anti-black racism to the historical memory of the Civil War across the South. Today, a shameful number of Americans don’t think slavery was the primary cause of the Civil War—even though the seceding states literally spelled this out in document form — in part because of the UDC’s campaign of misinformation. The most minor of gains made by blacks during the Reconstruction were obliterated nearly as soon as they were obtained, and the UDC backed that disenfranchisement full stop. Even the current UDC has mostly steadfastly refused — with rare exceptions — to take down Confederate monuments. They know the power of those symbols, both politically and socially, and they aren’t giving an inch, if they can help it.
The UDC have had a huge impact on this country, and to pretend they’ve stood “quietly in the background” would be laughable if it weren’t so insulting. The UDC both trained and became the white women of 1950s massive resistance, who author Elizabeth Gillespie McRae writes did “the daily work on multiple levels . . . needed to sustain racial segregation and to shape resistance to racial equality.” They set a precedent for a huge swath of today’s white women voters whose main political agenda is white supremacy — women who in a 2017 Alabama Senate race backed the alleged pedophile who wistfully longed for slavery and supported the presidency of a man who brags about grabbing women’s genitals when he’s not shouting his racism from the rafters. They have contributed to the construction of a “white womanhood” that has historically been and currently remains incredibly problematic, rendering “white feminism” eternally suspect. With their impact considered, and signs of their handiwork all over society — even carved indelibly into mountain sides — it seems worth understanding a few things about the UDC both then and now. Here are seven things you should know about the United Daughters of the Confederacy.
1. They published a very pro-KKK book. For children.
In 1914, the in-house historian of the UDC Mississippi chapter, Laura Martin Rose, published “The Ku Klux Klan, or Invisible Empire.” It’s essentially a love letter to the original Klan for its handiwork in the field of domestic terror in the years following the Civil War, when blacks achieved a modicum of political power.
---
2. Actually, they published at least two very pro-KKK books. . .
. . .and probably many more. Another UDC ode to the KKK was written by Annie Cooper Burton, then-president of the Los Angeles chapter of the UDC, and published in 1916. Titled “The Ku Klux Klan,” much like Rose’s aforementioned book, it argues that the Klan has gotten a bad rap just because they terrorized and intimidated black people, not infrequently assaulting and raping black women, murdering black citizens, and burning down black townships. For these reasons, she suggests, the UDC should do even more to show reverence to the Klan:
“Every clubhouse of the United Daughters of the Confederacy should have a memorial tablet dedicated to the Ku Klux Klan; that would be a monument not to one man, but to five hundred and fifty thousand men, to whom all Southerners owe a debt of gratitude.”
By “all Southerners,” Burton clearly means “only white people,” which is also what she means whenever she uses the word “people.”
3. They built a monument to the KKK.
The UDC was busiest during the 1910s and 1920s, two decades during which the group erected hundreds of Confederate monuments that made tangible the racial terror of Jim Crow. This, apparently, the group still considered insufficient to convey their message of white power and to reassert the threat of white violence. So in 1926, the UDC put up a monument to the KKK. In a piece for Facing South, writer Greg Huffman describes a record of the memorial in the UDC’s own 1941 book, “North Carolina’s Confederate Monuments and Memorials.”
4. Their most intense efforts focused on the “education” of white children.
Historian Karen Cox, author of 2003’s “Dixie’s Daughters,” has written that the UDC’s biggest goal was to indoctrinate white Southern children in the Lost Cause, thus creating “living monuments.”
---
5. They’re big fans of black chattel slavery from way back.
The UDC were perhaps the most efficient agents making the ahistorical Lost Cause myth go viral. They did this through a number of methods, the most visually apparent being the 700 monuments exalting people who fought for black chattel slavery that still stand. But also, in the rare cases the UDC has “honored” black people with statuary and monuments, it has been in the form of “loyal slave” markers — an actual subgenre of Confederate monuments — which perpetuate the image of content enslaved blacks and benevolent white enslavers.
---
6. They get tax breaks that help keep their workings financially solvent.
The UDC is a nonprofit. That means it’s a tax-exempt organization. A recent article about the UDC by AP reporter Allen Breed notes that the annual budget of Virginia, where the UDC is headquartered, “awards the state [division of the] UDC tens of thousands of dollars for the maintenance of Confederate graves — more than $1.6 million since 1996.”
7. They continue to exert political and social influence.
For the most part, the UDC has publicly kept pretty mum on the subject of Confederate monument removal, which has led some to conclude that the group is largely inactive, and even obsolete. Their numbers have dwindled since their heyday, but they remain tenacious about keeping Confederate monuments standing, thus continuing their cultural and political influence.
The UDC does this mostly through lawsuits. (The number of Confederate markers on courthouses has always shown the group’s keen interest in the power of the legal system.) When the San Antonio City Council voted in the weeks after the racist violence in Charlottesville to remove a Confederate monument from public property, the UDC filed suit against city officials. The Shreveport, Louisiana, chapter of the UDC has announced it will appeal a federal judge’s 2017 dismissal of the group’s lawsuit to keep up a Confederate monument at a local courthouse. The UDC threatened legal action against officials in Franklin, Tennessee, when the city announced plans — not to take down a UDC monument to the Confederacy, but to add markers recognizing African-American historical figures to the park, which the UDC claims it owns. The city of Franklin, with pretty much no other option, responded by filing a lawsuit against the UDC.
And then there’s the case of the UDC vs. Vanderbilt University, in which the group’s Tennessee division filed suit after school administrators announced plans to remove the word “Confederate” from one of its dorms. A state appeals court ruled Vanderbilt could only implement the plan if it repaid $50,000 the UDC had contributed to the building’s construction in 1933 — adjusted to 2016 dollars. Vanderbilt opted to pay $1.2 million to the UDC rather than keep “Confederate” in the dorm name, which it raised from anonymous donors who contributed to a fund explicitly dedicated to the cause.
Though the UDC didn’t invent the Lost Cause ideology, they were deeply involved in spreading the myth, which simultaneously contends the Confederacy wasn’t fighting to keep black people enslaved while also suggesting slavery was pretty good for everyone involved. Lost Causers — plenty of whom exist today, their sheer numbers a reflection of the UDC’s effectiveness — argue that Confederate monuments are just innocent statues; that taking them down erases history; that we cannot retroactively apply today’s ideas about the morality of slavery to the past. The response to those ridiculous cop-outs is that Confederate monuments honor and glorify people who fought to maintain black chattel slavery; that they were erected for the explicit purpose of obfuscating history; and that the immorality of slavery was always understood by the enslaved. Excuses, excuses: get better at them.
“In their earliest days, the United Daughters of the Confederacy definitely did some good work on behalf of veterans and in their communities,” says Heidi Christensen, former president of the Seattle, Washington, chapter of the UDC, who left the organization in 2012. “But it’s also true that since the UDC was founded in 1894, it has maintained a covert connection with the Ku Klux Klan. In fact, in many ways, the group was the de facto women’s auxiliary of the KKK at the turn of the century. It’s a connection the group downplays now, but evidence of it is easily discoverable — you don’t even have to look very hard to find it.”
In 2017, after the white nationalist Unite the Right rally in Charlottesville, UDC President Patricia M. Bryson posted an open letter claiming the UDC’s members “have spent 123 years honoring [Confederate soldiers] by various activities in the fields of education, history and charity, promoting patriotism and good citizenship,” and that members, “like our statues, have stayed quietly in the background, never engaging in public controversy.” But that isn’t true, not by a stretch. The UDC’s monuments, books, education and political agenda have always spoken loudly—in absolutely deafening shouts — on issues from anti-black racism to the historical memory of the Civil War across the South. Today, a shameful number of Americans don’t think slavery was the primary cause of the Civil War—even though the seceding states literally spelled this out in document form — in part because of the UDC’s campaign of misinformation. The most minor of gains made by blacks during the Reconstruction were obliterated nearly as soon as they were obtained, and the UDC backed that disenfranchisement full stop. Even the current UDC has mostly steadfastly refused — with rare exceptions — to take down Confederate monuments. They know the power of those symbols, both politically and socially, and they aren’t giving an inch, if they can help it.
The UDC have had a huge impact on this country, and to pretend they’ve stood “quietly in the background” would be laughable if it weren’t so insulting. The UDC both trained and became the white women of 1950s massive resistance, who author Elizabeth Gillespie McRae writes did “the daily work on multiple levels . . . needed to sustain racial segregation and to shape resistance to racial equality.” They set a precedent for a huge swath of today’s white women voters whose main political agenda is white supremacy — women who in a 2017 Alabama Senate race backed the alleged pedophile who wistfully longed for slavery and supported the presidency of a man who brags about grabbing women’s genitals when he’s not shouting his racism from the rafters. They have contributed to the construction of a “white womanhood” that has historically been and currently remains incredibly problematic, rendering “white feminism” eternally suspect. With their impact considered, and signs of their handiwork all over society — even carved indelibly into mountain sides — it seems worth understanding a few things about the UDC both then and now. Here are seven things you should know about the United Daughters of the Confederacy.
1. They published a very pro-KKK book. For children.
In 1914, the in-house historian of the UDC Mississippi chapter, Laura Martin Rose, published “The Ku Klux Klan, or Invisible Empire.” It’s essentially a love letter to the original Klan for its handiwork in the field of domestic terror in the years following the Civil War, when blacks achieved a modicum of political power.
2. Actually, they published at least two very pro-KKK books. . .
. . .and probably many more. Another UDC ode to the KKK was written by Annie Cooper Burton, then-president of the Los Angeles chapter of the UDC, and published in 1916. Titled “The Ku Klux Klan,” much like Rose’s aforementioned book, it argues that the Klan has gotten a bad rap just because they terrorized and intimidated black people, not infrequently assaulting and raping black women, murdering black citizens, and burning down black townships. For these reasons, she suggests, the UDC should do even more to show reverence to the Klan:
“Every clubhouse of the United Daughters of the Confederacy should have a memorial tablet dedicated to the Ku Klux Klan; that would be a monument not to one man, but to five hundred and fifty thousand men, to whom all Southerners owe a debt of gratitude.”
By “all Southerners,” Burton clearly means “only white people,” which is also what she means whenever she uses the word “people.”
3. They built a monument to the KKK.
The UDC was busiest during the 1910s and 1920s, two decades during which the group erected hundreds of Confederate monuments that made tangible the racial terror of Jim Crow. This, apparently, the group still considered insufficient to convey their message of white power and to reassert the threat of white violence. So in 1926, the UDC put up a monument to the KKK. In a piece for Facing South, writer Greg Huffman describes a record of the memorial in the UDC’s own 1941 book “North Carolina’s Confederate Monuments and Memorials.”
4. Their most intense efforts focused on the “education” of white children.
Historian Karen Cox, author of 2003’s “Dixie’s Daughters,” has written that the UDC’s biggest goal was to indoctrinate white Southern children in the Lost Cause, thus creating “living monuments.”
5. They’re big fans of black chattel slavery from way back.
The UDC were perhaps the most efficient agents making the ahistorical Lost Cause myth go viral. They did this through a number of methods, the most visually apparent being the 700 monuments exalting people who fought for black chattel slavery that still stand. But also, in the rare cases the UDC has “honored” black people with statuary and monuments, it has been in the form of “loyal slave” markers — an actual subgenre of Confederate monuments — which perpetuate the image of content enslaved blacks and benevolent white enslavers.
6. They get tax breaks that help keep their workings financially solvent.
The UDC is a nonprofit. That means it’s a tax-exempt organization. That recent article about the UDC by AP reporter Allen Breed notes that the annual budget of Virginia, where the UDC is headquartered, “awards the state [division of the] UDC tens of thousands of dollars for the maintenance of Confederate graves — more than $1.6 million since 1996.”
7. They continue to exert political and social influence.
For the most part, the UDC has publicly kept pretty mum on the subject of Confederate monument removal, which has led some to conclude that the group is largely inactive, and even obsolete. Their numbers have dwindled since their heyday, but they remain tenacious about keeping Confederate monuments standing, thus continuing their cultural and political influence.
The UDC does this mostly through lawsuits. (The number of Confederate markers on courthouses has always shown the group’s keen interest in the power of the legal system.) When the San Antonio City Council voted in the weeks after the racist violence in Charlottesville to remove a Confederate monument from public property, the UDC filed suit against city officials. The Shreveport, Louisiana, chapter of the UDC has announced it will appeal a federal judge’s 2017 dismissal of the group’s lawsuit to keep up a Confederate monument at a local courthouse. The UDC threatened legal action against officials in Franklin, Tennessee, when city officials announced plans — not to take down a UDC monument to the Confederacy, but to add markers recognizing African-American historical figures to the park, which the UDC claims it owns. The city of Franklin, with pretty much no other option, responded by filing a lawsuit against the UDC.
And then there’s the case of the UDC vs. Vanderbilt University, in which the group’s Tennessee division filed suit after school administrators announced plans to remove the word “Confederate” from one of its dorms. A state appeals court ruled Vanderbilt could only implement the plan if it repaid the $50,000 the UDC had contributed to the building’s construction in 1933, adjusted to 2016 dollars. Vanderbilt opted to pay the UDC $1.2 million, raised from anonymous donors who contributed to a fund explicitly dedicated to the cause, rather than keep “Confederate” in the dorm name.
'Not just in the US': amateur historian highlights Canada's forgotten racism
Archives shed light on incidents of racial discrimination and the country’s civil rights pioneers
Leyland Cecco in Toronto
the guardian
Thu 20 Sep 2018 04.00 EDT
An amateur historian in Canada has highlighted a forgotten story of racial injustice, and one of the country’s earliest segregation lawsuits, in hopes of bringing recognition to civil rights pioneers.
In 1914, Charles Daniels bought a pair of tickets to see King Lear at Calgary’s Sherman Grand Theatre, but when he attempted to take his orchestra-level seat, he was told by ushers to move up to the balcony level, where other black patrons were seated.
Theatre staff told Daniels that his presence made the white patrons uncomfortable. Daniels protested, refused offers of a refund, and left.
“The fact that this happened in 1914, in Calgary, Alberta, blew my mind. It broke the whole narrative that these kind of things only happen in the United States,” said Bashir Mohamed, a civil servant who has been scouring the provincial archives in Edmonton for the last two years, and wrote about Daniels’s case in an essay for the Sprawl, a Calgary journalism site.
Daniels’s story has re-emerged amid a belated recognition across Canada of past injustices that have been largely absent from the national conversation on race.
At the time, the incident at the theatre was widely covered in local papers, with one running the headline: “CALGARY ‘NIGGER’ KICKS UP FUSS — Wants to Attend Theatre With ‘White Folks’ But Management Says No.”
Daniels retained a lawyer, sued the theatre over its policy of segregation – and won the case.
He was awarded $1,000, worth more than $17,000 USD today. During the trial he had said: “I think the humiliation is worth that amount.”
Mohamed said he had started researching the history of black Canadians after reading an online commentator claiming that Canada does not have the same history of racial discrimination as the US. “When I was going through school, I never learned about this black history. But I always assumed there was something there,” he said.
Since then he has documented Canada’s racist place-names, the extensive presence of the Ku Klux Klan in western Canada, the effects of segregation and the fights of early – but largely forgotten – civil rights activists.
“I think it’s very important work… because there isn’t as much written about people of African descent within Alberta,” Jennifer Kelly, a University of Alberta education professor, told the CBC.
Daniels’s story has been told before, over the years and in different publications. But few Canadians are aware of the pioneering theatregoer, whose successful fight against discrimination predated the broader civil rights movement by decades.
“We have photos of Martin Luther King being arrested. We have mugshots. We have photos of Rosa Parks sitting on a bus. We have photos of her mugshot too,” he said.
But in Canada, the victims of racism were often seen as secondary to the story, he said: in Daniels’ case, newspaper reporters spoke with theatre management, but never Daniels. No images of Daniels were ever published.
“We don’t see photos of the black-only sections of the theatre. We don’t see photos of black patients being denied care, even though we know all those things happened,” said Mohamed. “Because these photos don’t exist, it’s hard to see these things as real that affected real people.”
Other civil rights activists are only recently gaining mainstream recognition in Canada. Viola Desmond, a black woman who refused to leave a whites-only area of a movie theatre in 1946, is to become the first Canadian-born woman to appear on the country’s $10 bill.
Amid a broader debate about Canada’s colonial heritage, some cities have removed statues of now-controversial historical figures, such as John A Macdonald, the country’s first prime minister and a notorious racist.
“There’s been this resurgence across Canada and the United States of people reclaiming their history,” said Mohamed, pointing out that Edmonton’s Oliver neighbourhood is named after Frank Oliver, the former federal minister who championed policies barring black immigration to Canada and successfully lobbied for the forced removal of First Nations from their treaty lands.
“It’s led to really critical discussions of who we celebrate. And who we’ve forgotten.”
Teachers Were the Real Heroes of School Desegregation
Often overlooked in histories of school desegregation are the teachers.
By Michael Gengler / History News Network - alternet
August 27, 2018, 9:05 AM GMT
“There was not a manual, and there was not anything other than let’s try this, and with the overriding principle that these young people should not have to pay too big a price both in terms of their academic learning, in terms of their safety, by going through this process, because they didn’t volunteer for it either. And we’re all in it, in that sense. And that was the beauty of it, I mean, there were so many beautiful moments, but a lot of ugly stuff.”– Shelton Boyles, a former English teacher, Gainesville, Florida
Often overlooked in histories of school desegregation are the teachers. Most were products of segregated colleges. Nothing in their training prepared them to teach integrated classes. After the lawyers and the courts had their say, the teachers had to make desegregation work. Failure was not an option. We have public schools in the South today because of their courage and tenacity.
School desegregation in the South proceeded in two stages. The Supreme Court in Brown v. Board of Education (two opinions, 1954 and 1955), decreed that each segregated school district must change to a “racially nondiscriminatory school system.” The lower courts quickly decided that test was satisfied if black students could choose to transfer from their historically black schools to white schools. The South’s dual system of white and black schools remained in place.
I graduated from the white Gainesville High School in Florida in 1962, two years before the first blacks entered white schools there. My research of the Gainesville experience with desegregation has informed this article.
During the “freedom of choice” era of desegregation, only a minority of black students chose to transfer to white schools. They were expected to conform to the white schools’ expectations. Many of those students were middle class and college bound. Curricula, clubs, student government, and other activities in the white schools did not change.
The NAACP Legal Defense Fund in 1968 persuaded the Supreme Court that the South’s dual school system, even with freedom of choice, was not a “racially nondiscriminatory school system.” The black schools would have to be integrated or closed. In Florida, virtually all of the black high schools were closed. In Gainesville, the entire student body of Lincoln High School struck for 11 days in 1969 to protest its closing.
With full desegregation, teachers had to engage disaffected students of both races. African American teachers were expected not only to continue their normal instructional roles but also to help black students accommodate to desegregation. Frequently, that role included calming student disturbances. White teachers had to begin working with their new black students from the point at which segregation had left those students academically.
Unencumbered by reforms such as No Child Left Behind, teachers in the newly integrated classrooms could innovate. At Gainesville High, two white teachers and one black teacher developed a three-hour block course called “Man and His Environment.” The course satisfied English, science and social studies requirements. The class included a representative group of students of both races and differing academic abilities. In other classes, many teachers took time out for students to tell something of their personal lives. Coaches fielded integrated sports teams. Principals found ways to get students of both races to work together on projects.
Masked by segregation, reading deficits were a major problem. Reading skills were needed not just for English classes but for all other subjects, even math. At the high school level, teachers were not expected to teach reading. However, in 1970, ten Gainesville High teachers enrolled in a junior college course for problem readers, to learn how to address their students’ reading issues. Curricula had to be revised to accommodate varying reading skills. Every teacher and every school were affected by student reading deficits.
Principal and football coach were the two most important positions in the high schools. In Gainesville, two new high schools opened in 1970. At the east side school, the last principal of Lincoln High School, a black, became principal and a white became football coach. On the more affluent, white west side, the new school’s principal was white and Lincoln’s last football coach took over the athletic program. At Lincoln, he fed his players Kentucky Fried Chicken before games. Speaking to a friend about his posting to the west side, he commented, “These kids, they ain’t going to be able to play football off of salads.” But he molded a successful program. A white player says the teams played all the harder for him, to protect him from the occasional racist sideline taunts.
Although the schools increased the numbers of deans who enforced discipline, at times the teachers had to be first responders. One white GHS English teacher recalls breaking up a fight in class between a white boy and a black boy. “I got right down on the floor in my little dress and high heels and I rolled around with them. I got them apart.”
Public education in Florida had been tarnished by the antics of Republican Governor Claude Kirk. Recruiting black teachers, already difficult, was made more so because of Kirk. The school district sent out a recruiting team of a white teacher and a black teacher to visit out-of-town colleges. When passing through Taylor County, Florida, the teacher who was not driving would get down on the floor of the car to avoid potentially hostile attention.
After the 1969-70 and 1970-71 school years, each of the principals of Gainesville High resigned. For 1971-72, 30-year-old Dan Boyd took over. The school board told him that if he resigned, there would be no other position for him in the county school system. Although without special training in racial matters, Boyd remained principal for 24 years. He did not lead from his desk. He worked out with the athletes in the weight room, traveled with the football team, and was visible on campus. A female student, remembering Boyd’s tight jeans, says all the girls had crushes on him.
Despite serious outbreaks of violence in 1970, the Gainesville schools slowly but steadily regained educational equilibrium. However, student engagement continues to be a key problem facing public schools. We can still learn from both white and black educators who taught through the desegregation years.
The Colonial Roots of Gun Culture
The origins of the U.S. gun obsession lie in the violent dispossession of Native Americans.
BY ROXANNE DUNBAR-ORTIZ - in these times
In the summer of 1970, while I was living and organizing in New Orleans with a women’s study-action group, we became caught up in a current of repression and paranoia. After a week of heavy police surveillance, we began receiving telephone calls from a man claiming to be a member of the Ku Klux Klan. The man threatened to burn down our building, and, of course, we didn’t trust the police, so we did not report it. Instead, we decided to arm ourselves. We saw it as a practical step, not a political act, something we needed for self-defense in order to continue working, not at all embracing armed struggle. In reality, once armed, our mindsets changed to match the new reality.
Gun-love can be akin to non-chemical addictions like gambling or hoarding, either of which can have devastating effects, but murder, suicide, accidental death and mass shootings result only from guns. While nearly anything may be used to kill, only the gun is created for the specific purpose of killing a living creature. The sheer numbers of guns in circulation, and the loosening of regulations on handguns especially, facilitate deadly spur-of-the-moment reflex acts.
Seventy-four percent of gun owners in the United States are male, and 82 percent of gun owners are white. The top reason U.S. Americans give for owning a gun is for protection. What are the majority of white men so afraid of?
Instead of dismissing the Second Amendment as antiquated and irrelevant, or as not actually meaning what it says, understanding the original purpose of the Second Amendment is key to understanding gun culture, and possibly the key to a new consciousness about the continuing effects of settler-colonialism and white nationalism.
One argument that runs through historical accounts of the thinking behind the Second Amendment idealizes Anglo settler-farmers as fiercely independent, rightly fearful of Big Brother government, and insistent on settlers’ right to overthrow oppressive regimes. But what colonists considered oppressive was any restriction put on them in regard to obtaining land. In the instances of Bacon’s Rebellion in 1676, the War of Independence itself, and many cases in between, the settlers’ complaint was the refusal of the British colonial authorities to allow them to seize Native land peripheral to the colonies.
Taking land by force was not an accidental or spontaneous project or the work of a few rogue characters. Male colonial settlers had long formed militias for the purpose of raiding and razing Indigenous communities and seizing their lands and resources, and the Native communities fought back. Virginia, the first colony, forbade any man to travel unless he was “well armed.” In 1658, the colony ordered every settler home to have a functioning firearm, and later even provided government loans for those who could not afford to buy a weapon.
These types of laws stayed on the books of the earliest colonies and were created in new colonies as they were founded. The Second Amendment, ratified in 1791, enshrined these rights and obligations as constitutional law: “A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.” The Second Amendment thus reflects this dependence on individual armed men to take and retain land.
The continuing significance of that “freedom” specified in the Bill of Rights reveals the settler-colonialist cultural roots of the United States that appear even in the present as a sacred right. Settler-militias and armed households were institutionalized for the destruction and control of Native peoples, communities and nations. With the expansion of plantation agriculture, by the late 1600s they were also used as “slave patrols,” forming the basis of the U.S. police culture after emancipation.
That is the way of settler-colonialism, and that is the way of the gun—to kill off and control enemies. Violence perpetrated by armed settlers, even genocide, was not absent in the other British settler-colonies—Australia, Canada and New Zealand—but the people of those polities never declared the gun a God-given right. And the people of the other Anglo settler-colonies did not have economies, governments and social orders based on the enslavement of other human beings. The United States is indeed “exceptional.”
German-American history, largely erased, has lessons for our anti-immigrant era
Germans are the largest ancestry group in the U.S., but their identity has been largely disappeared. Here’s why
ERIKA SCHELBY - salon
JULY 27, 2018 10:00PM (UTC)
"The howl of the cave man.” This is how a 1918 Los Angeles Times article described the music of Brahms and Bach. A year earlier the U.S. had declared war against Germany and waded into the tragedy of the First World War. The propaganda machine was in full swing. Germans were brutes—close cousins of the barbaric Huns—and detesting all things German became a badge of patriotic pride. The time was also ripe for the contributions of German-Americans to be scrubbed from the history books. Figures like Alexander von Humboldt, Carl Blümner and Heinrich Balduin Möllhausen, whose contributions to the U.S. are vast, were eclipsed by caricatures of the brutish German lusting for American blood. German-Americans learned to keep a low profile and the collective demonization induced a historical amnesia from which we have yet to awaken.
Today, German-Americans are the largest ancestry group in the U.S., with some 50 million citizens, but their history and their identity have been largely disappeared. This may seem irrelevant, but the history of how it happened tells us a great deal about our present and how ethnic groups can come in and out of favor depending on the geopolitics of the day. As Art Silverman and Robert Siegel noted in an “All Things Considered” segment titled “During World War I, U.S. Propaganda Erased German Culture”: “today … what happened a century ago has special relevance … World War I inspired an outbreak of nativism and xenophobia that targeted German immigrants, Americans of German descent, and even the German language.”
I grew up in Germany and was educated in the post-World War II model, which, for obvious reasons, stressed a respect for pluralism and cultivated a global view of politics and culture. As a result I’ve always been sensitive to the ways in which propaganda shapes our opinions of different communities. The daily scandals of how Latin American immigrants, many of them fleeing horrors that the U.S. had a hand in producing, are brutalized at our borders; and the harassment and attacks on Muslims come out of and occur alongside a propaganda campaign to dehumanize these groups as criminal, devious and irrational. The ethnicities change, but the message stays the same.
Unfortunately, this model has proven reliable in the effort to tar an entire population, and I’ve spent a good deal of time studying how it was applied to Germans and German-Americans. It begins with a well-crafted propaganda campaign initiated only days after the U.S. declared war on Germany. On April 13, 1917, President Wilson formed the Committee on Public Information (CPI), which recruited 150,000 academics, business people, and talent from the media and arts to promote the war to voters. Up to 20,000 newspaper pieces per week were printed based on “voluntary censorship” and CPI handouts. That was probably the birth of “embedded.” One of the CPI’s primary goals was to shape public opinion of the new enemies—Germans. Professor Vernon Kellogg, a member of the intelligentsia, who served this effort, eloquently expressed himself in a CPI publication, writing, “Will it be any wonder, if, after the war, the people of the world, when they recognize any human being as German, will shrink aside so that they may not touch him as he passes, or stoop for stones to drive him from his path?”
Hollywood did its part by producing films like “The Kaiser, the Beast of Berlin” and “Wolves of Kultur,” which cemented the idea of Germans as a public menace in the minds of movie-goers. Super-patriotic volunteer groups joined in, whipping up a tidal wave of war hysteria, hunting for imaginary spies or saboteurs who did not exist, trampling civil rights along the way. Books were burned, citizens were tarred and feathered, German names of places, streets, foods, and objects were erased. Bach went from being a seminal composer to a villain whose work exemplified the monstrous German aesthetic. Teaching the German language was outlawed in more than a dozen states.
In that same year an anti-German mob demanded that the Pabst Theater in Milwaukee (once called the “German Athens”) cancel a scheduled performance of “William Tell” by Friedrich Schiller. They underscored their demand by placing a machine gun in front of the theater. When the theater offered to show a comedy instead, the crowd threatened to “break up the Hun show.” The mob got its way and nothing showed on the night “William Tell” was due to run.
It is interesting to note that Schiller drew his inspiration from the American and French Revolutions. “William Tell” was the same play W.E.B. Du Bois read in German at Fisk University before he studied in Berlin, where his “first awakening to social reform began.”
On April 5, 1918, almost exactly a year after Wilson inaugurated the CPI, Robert Prager, a German miner living in Illinois, was lynched. Not only had he committed the sin of being a German immigrant, but he was suspected of being a socialist. Ironically, one of the reasons Prager’s application for membership in the United Mine Workers Union was declined was precisely because he was German. Eleven men were arrested, tried, and ultimately acquitted of the crime. On April 11, 1918, The Washington Post editorialized, “The more one ponders Senator Overman’s estimate of 400,000 German spies the harder it is to grow righteously indignant over the Illinois lynching.” Lee Slater Overman was a senator from North Carolina who chaired a committee zealously dedicated to rooting out real or perceived spying and other treasonous activities on the part of Germans and others.
If the propaganda served the war effort, it also served the economic interests of the elite. Not unlike today, discontent among workers was profound. It ran so deep that the United States Commission on Industrial Relations, an arm of Congress, admitted: “The workers of the nation, through compulsory and oppressive methods, legal and illegal, are denied the full product of their toil … Citizens numbering millions smart under a sense of injustice and oppression. The extent and depth of industrial unrest can hardly be exaggerated.”
When miners in Bisbee, Arizona, went on strike in June 1917, Walter Douglas, president of Phelps Dodge Corporation, a mining company impacted by the strike, declared, “I believe the government will be able to show that there is German influence behind this movement.”
In a stroke of propagandistic genius, cartoonist H.T. Webster managed to conflate the Kaiser, the top German aristocrat, with the Industrial Workers of the World. His cartoon, which ran in the New York Globe in 1917, shows the acronym IWW printed over the Kaiser’s face.
By the time the war was over ordinary German-Americans had learned to lay low, speak English exclusively, and release any attachment to their cultural heritage. Perhaps the best measure of the success of the propaganda effort against Germans is how quickly and effectively they learned to de-identify as German. We may be tempted to mistake this for assimilation, but history shows the hand of coercion was placed firmly around the German-American community.
So, why care about this? After all, it’s not as if any American is being attacked or oppressed today based on their German heritage. For me, as a German-American and a student of history, the anti-German propaganda effort is no mere historical footnote or anomalous descent into xenophobia. It is a template—hardly the first, but a vivid one no less—for what we are seeing today. When I look at the faces of immigrant children forcibly separated from their families or hear anti-Muslim sentiment, I know that the demonization of communities does not happen accidentally and that an intellectual as well as legal and political infrastructure is required for them to become victims in the larger game of geopolitics. The material loss for those affected is incalculable and the loss of cultural heritage for them and everyone else is often a quiet casualty of the war waged against them. History gives us the vantage point to see how and why enemies are created. It also gives us an unmistakable warning that we would be wise to heed.
Here are some of the ways that Mexicans made America great
History News Network - raw story
18 JUL 2018 AT 10:25 ET
Mexicans have contributed to making the United States in pivotal and enduring ways. In 1776, more of the territory of the current United States was under Spanish sovereignty than in the 13 colonies that rejected British rule. Florida, the Gulf coast to New Orleans, the Mississippi to St. Louis, and the lands from Texas through New Mexico and California all lived under Spanish rule, setting Hispanic-Mexican legacies. Millions of pesos minted in Mexico City, the American center of global finance, funded the war for U.S. independence, leading the new nation to adopt the peso (renamed the dollar) as its currency.
The U.S. repaid the debt by claiming Spanish/Mexican lands—buying vast Louisiana territories (via France) in 1803; gaining Florida by treaty in 1819; sending settlers into Texas (many undocumented) to expand cotton and slavery in the 1820s; enabling Texas secession in 1836; provoking war in 1846 to incorporate Texas’s cotton and slave economy—and California’s gold fields, too. The U.S. took in land and peoples long Spanish and recently Mexican—often mixing European, indigenous, and African ancestries. The 1848 Treaty of Guadalupe Hidalgo recognized those who remained in the U.S. as citizens. And the U.S. incorporated the dynamic mining-grazing-irrigation economy that had marked Spanish North America for centuries and would long define the U.S. West.
Debates over slavery and freedom in lands taken from Mexico led to the U.S. Civil War while Mexicans locked in shrunken territories fought over liberal reforms and then faced a French occupation—all in the 1860s. With Union victory, the U.S. drove to continental hegemony. Simultaneously, Mexican liberals led by Benito Juárez consolidated power and welcomed U.S. capital. U.S. investors built Mexican railroads, developed mines, and promoted export industries—including petroleum. The U.S. and Mexican economies merged; U.S. capital and technology shaped Mexico while Mexican workers built the U.S. west. The economies were so integrated that a U.S. downturn, the panic of 1907, was pivotal to setting off Mexico’s 1910 revolution, a sociopolitical conflagration that focused Mexicans while the U.S. joined World War I.
Afterwards, the U.S. roared in the 20s while Mexicans faced reconstruction. The U.S. blocked immigration from Europe, and still welcomed Mexicans to cross a little-patrolled border to build dams and irrigation systems, cities and farms across the west. When depression hit in 1929 (it began in New York, spread across the U.S., and was exported to Mexico), Mexicans became expendable. Denied relief, they got one-way tickets to the border, forcing thousands south—including children born as U.S. citizens.
Mexico absorbed the refugees thanks to new industries and land distributions—reforms culminating in the nationalization of the oil industry in 1938. U.S. corporations screamed foul and FDR arranged a settlement (access to Mexican oil mattered as World War II loomed). When war came, the U.S. needed more than oil. It needed cloth and copper, livestock and leather–and workers, too. Remembering the expulsions of the early 30s, many resisted going north. So the governments negotiated a labor program, recruiting braceros in Mexico, paying for travel, promising decent wages and treatment. 500,000 Mexican citizens fought in the U.S. military. Sent to deadly fronts, they suffered high casualty rates.
To support the war, Mexican exporters accepted promises of post-war payment. With peace, accumulated credits allowed Mexico to import machinery for national development. But when credits ran out, the U.S. subsidized the reconstruction of Europe and Japan, leaving Mexico to compete for scarce and expensive bank credit. Life came in cycles of boom and bust, debt crises and devaluations. Meanwhile, U.S. pharmaceutical sellers delivered the antibiotics that had saved soldiers in World War II to families across Mexico. Children lived—and Mexico’s population soared: from 20 million in 1940, to 50 million by 1970, 100 million in 2000. To feed growing numbers, Mexico turned to U.S. funding and scientists to pioneer a “green revolution.” Harvests of wheat and maize rose to feed growing cities. Reliance on machinery and chemical fertilizers, pesticides, and herbicides, however, cut rural employment. National industries also adopted labor saving ways, keeping employment scarce everywhere. So people trekked north, some to labor seasonally in a bracero program that lasted to 1964; others to settle families in once Mexican regions like Texas and California and places north and east.
Documentation and legality were uncertain. U.S. employers’ readiness to hire Mexicans for low wages was not. People kept coming. U.S. financing, corporations, and models of production shaped lives across the border; Mexican workers labored everywhere, too. With integrated economies, the nations faced linked challenges. In the 1980s the U.S. suffered from “stagflation” while Mexico faced a collapse called the “lost decade.” In 1986, Republican President Ronald Reagan authorized a path to legality for thousands of Mexicans in the U.S. tied to sanctions on employers aimed to end new arrivals. Legal status kept workers here; failed sanctions enabled employers to keep hiring Mexicans—who kept coming. They provided cheap and insecure workers to U.S. producers—subsidizing profits in times of challenge.
The 1980s also saw the demise of the Soviet Union, the end of the Cold War, and the presumed triumph of capitalism. What would that mean for people in Mexico and the U.S.? Reagan corroded union rights, leading to declining incomes, disappearing pensions, and enduring insecurities among U.S. workers. President Carlos Salinas of Mexico’s dominant PRI attacked union power—and in 1992 ended rural Mexicans’ right to land. A transnational political consensus saw the erosion of popular rights as key to post-Cold War times.
Salinas proposed NAFTA to Reagan’s Republican successor, George H.W. Bush. The goal was to liberate capital and goods to move freely across borders—while holding people within nations. U.S. business would profit; Mexicans would continue to labor as a reservoir of low wage workers—at home. The treaty was ratified in Mexico by Salinas and the PRI, in the U.S. by Democratic President Clinton and an allied Congress.
As NAFTA took effect in 1994, Mexico faced the Zapatista rising in the south, then a financial collapse—before NAFTA could bring investment and jobs. With recovery, the Clinton-era hi-tech boom saw production flow to China. Mexico gained where transport costs mattered—as in auto assembly. But old textiles and new electronics went to Asia. Mexico returned to growth in the late 1990s, with jobs still scarce for a population nearing 100 million. Meanwhile, much of Mexican agriculture collapsed. NAFTA ended tariffs on goods crossing borders, but the U.S. continued to subsidize corporate farmers—internal payments enabling agribusiness to sell below cost. NAFTA left Mexican producers to face U.S.-subsidized staples. Mexican growers could not compete, and migration to the U.S. accelerated.
NAFTA brought new concentrations of wealth and power across North America. In Mexico, cities grew as a powerful few and favored middle sectors prospered; millions more struggled with informality and marginality. The vacuum created by agricultural collapse and urban marginality made space for a dynamic violent drug economy. Historically, cocaine was an Andean specialty, heroin an Asian product. But as the U.S. pressed against drug economies elsewhere, Mexicans—some enticed by profit; many searching for sustenance—turned to supply U.S. consumers.
U.S. politicians and ideologues blame Mexico for the “drug problem”—a noisy “supply side” understanding that is historically untenable. U.S. demand drives the drug economy. The U.S. has done nothing effective to curtail consumption—or to limit the flow of weapons to drug cartels in Mexico. Laying blame helps block any national discussion of the underlying social insecurities brought by globalization—deindustrialization, scarce employment, low wages, lowered benefits, vanishing pensions—insecurities that close observers know fuel drug dependency. Drug consumption in the U.S. has expanded as migration from Mexico now slows (mostly due to slowing population growth)—a conversation steadfastly avoided.
People across North America struggle with shared challenges—common insecurities spread by globalizing capitalism. Too many U.S. politicians see advantages in polarization, blaming Mexicans for all that ails life north of the border. Better that we work to understand our inseparable histories. Then we might work toward a prosperity shared by diverse peoples facing common challenges in an integrated North America.
Coard: “Operation Wetback”: America's Worst Mass Deportation
Michael Coard - philly tribune
7/13/18
It was 64 years ago, on July 15, 1954, that the U.S. Border Patrol began its widespread and notoriously racist deportation program — officially called “Operation Wetback” by the Dwight Eisenhower administration — by expelling nearly 1.5 million Mexicans through despicable tactics that included, for example, demanding birth certificate identification from all so-called “Mexican-looking” people via apartheid-style stop-and-frisk harassment.
I should mention that this “wetback” racial slur originated in the 1920s to describe Mexicans who swam the Rio Grande to reach America.
Donald Trump, born in 1946, was a budding 8-year-old racist in 1954. And by the time he reached 69 years old 61 years later, he was a full-fledged racist Republican presidential candidate who, in 2015, said Mexican immigrants are “bringing in crime. They’re rapists.” In 2014, he described Mexicans as “our enemies.”
As president, he declared that during his first 100 days in office, he would deport up to three million undocumented immigrants, meaning the Brown and Black ones. Not the white ones. As he stated earlier this year, “Why do we want all these people from Africa here? They’re s***hole countries. .... We should have more [white] people from Norway.”
Also as president, he initiated and implemented in 2018 a policy that has never — I repeat, never — before been initiated and implemented to separate thousands of immigrant, especially Mexican, infants and other children from parents fleeing imminent violence and/or dire poverty in their native land.
Let’s get back to 1954’s “Operation Wetback,” which is a perfect example of what Trump means by his “Make America Great Again” nonsense. By the way, when he and his 63 million racist supporters tell Mexicans to go back to their own country, those so-called immigrants should just plop down in Texas and explain that they and their ancestors are natives of that land — at least until America in 1845 stole the northeastern province of Mexico and declared it the 28th state and then began the Mexican-American War a year later and claimed victory in 1848.
And while our Mexican comrades are at it, they should mention that the American states of Arizona, California, Colorado, New Mexico and Utah are also part of their ancestral homeland. As a result, maybe Trump should go back to Germany and his stooges back to their various European countries.
As indicated by Erin Blakemore of The History Channel, “[This] short-lived operation used military-style tactics to remove Mexican immigrants — some of them American citizens — from the United States. Though millions of Mexicans had legally entered the country through joint immigration programs in the first half of the 20th century, Operation Wetback was designed to send them back to Mexico. ... During [this operation] ..., tens of thousands of immigrants were shoved into buses, boats and planes and sent to often-unfamiliar parts of Mexico, where they struggled to rebuild their lives. In Chicago, three planes a week were filled with immigrants and flown to Mexico. In Texas, 25 percent of all of the immigrants deported were crammed onto boats later compared to slave ships, while others died of sunstroke, disease and other causes while in custody.”
Columbia University professor Dr. Mae Ngai similarly likened the type of boats used to “eighteenth century slave ships.”
UCLA professor Kelly Lytle Hernandez pointed out that the operation was “lawless ..., arbitrary ... [and] based on a lot of xenophobia ... and ... resulted in sizable large-scale violations of people’s rights, including the forced deportation of U.S. citizens.”
There were so many immigrant kidnappings, as I describe them, that The Conversation (which is a global network of newsrooms founded by British-Australian journalist Andrew Jaspan) reported that the Border Patrol was “converting public parks into concentration camps to detain at least 1,000 people at a time.”
These types of mass deportations actually began much earlier. During the 1930s, as uncovered by historian Francisco Balderrama, “The United States deported over one million Mexicans ..., 60 percent of whom were U.S. citizens of Mexican descent.”
Yesterday’s 1954 immigration policy oozed from the same sewer that today’s 2018 immigration policy oozes: the racism sewer. It was a political response to white brainless KKK-type voters who illogically claimed that the backbreaking and labor-intensive farming jobs they didn’t want were being taken by Brown people. Wait ... What?!
I should note that in 2015, then-candidate Trump endorsed Operation Wetback in a campaign speech wherein he proclaimed, “I like Ike ... [who] moved a million-and-a-half illegal immigrants out of this country. ... Moved them way south. They never came back.”
I should also note that the Trump-like cretin who served as Southwest Border Patrol Chief during that operation was Harlon Carter. He’s the guy who had been convicted at age 17 of hunting down and murdering Mexican-American Ramon Casiano in Laredo, Texas in 1931 and later, years after his conviction was overturned on a (racist) technicality, became chief executive officer of the National Rifle Association. Anyone surprised? Nope.
You’ve just read about the old problem and the new problem. But what about the solution? It’s very simple: Abolish not only I.C.E. (Immigration and Customs Enforcement) but also any federal policy that does not allow people fleeing violence, persecution, or deadly poverty the right to due process proceedings which include court-appointed lawyers and a “preponderance of the evidence” burden placed on the government to disprove such claim of violence, persecution, or deadly poverty.
Oh, and by the way, when is any member of Congress gonna ask the indigenous Red people here what they think about the so-called (and misnamed) New World’s immigration policy since 1492 and America’s immigration policy since 1776? Red land reparations anyone?
A SHORT HISTORY OF AMERICANS PRAISING THEIR OWN “GENEROSITY,” FROM THE GENOCIDE OF NATIVE AMERICANS TO TRUMP’S CHILD SNATCHERS
Jon Schwarz - the intercept
July 12 2018, 8:14 a.m.
EARLIER THIS WEEK, Health and Human Services Secretary Alex Azar celebrated the Trump administration for its treatment of immigrant children it has separated from their parents. “We have nothing to hide about how we operate these facilities,” said Azar on CNN. “It is one of the great acts of American generosity and charity, what we are doing for these unaccompanied kids.”
This magnanimous claim raises an obvious question: What are the other great acts of American generosity?
Of course, we know that the U.S. treatment of Native Americans has been extraordinarily generous. They were literally asking for it since before there was an America: The seal of the Massachusetts Bay Colony for most of the 17th century was an Indian saying, “Come over and help us.”
So as President Andrew Jackson explained to Congress in 1829 when making the case for the Indian Removal Act, “Toward the aborigines of the country no one can indulge a more friendly feeling than myself. … Rightly considered, the policy of the General Government toward the red man is not only liberal, but generous.” The passage of the act the next year generously allowed the Cherokee to experience the Trail of Tears.
Sixty years later, in “The Winning of the West,” future President Teddy Roosevelt remained impressed by this example of American generosity. “In [our] treaties,” he wrote, “we have been more than just to the Indians; we have been abundantly generous. … No other conquering and colonizing nation has ever treated the original savage owners of the soil with such generosity as has the United States.”
Slavery, too, was an act of generosity. As Thomas Roderick Dew, who went on to become president of William & Mary College, put it in the famed 1832 treatise, “The Pro-Slavery Argument,” slaveowners were among the “most generous” Americans. Moreover, a slaveholder’s son, precisely because he witnessed his father enslaving others, “acquires a greater generosity and elevation of soul, and embraces for the sphere of his generous actions a much wider field.”
The Vietnam War was another high point in U.S. generosity. David Lawrence, then editor of U.S. News & World Report, proclaimed in 1966 that “what the United States is doing in Vietnam is the most significant example of philanthropy extended by one people to another that we have witnessed in our times.”
More recently, we generously helped three prisoners at Guantánamo Bay kill themselves. “The manipulative detainees,” wrote Michelle Malkin soon afterward, “reportedly used the generous civil liberties protections we gave them to plot their suicide pact.”
So we clearly have a long history about which to volubly praise ourselves. But we should be modest enough to realize that we have never matched the moral heights of the most generous man in history: Adolf Hitler.
Just before the German invasion of Poland in 1939, the British ambassador to Germany wrote home to explain how frustrated Hitler was that he was not receiving credit for his generosity:
Herr Hitler replied that he would be willing to negotiate, if there was a Polish Government which was prepared to be reasonable. … He expatiated on misdoings of the Poles, referred to his generous offer of March last, said that it could not be repeated.
We shouldn’t feel too bad about the Poles’ ingratitude, however. Hitler had an advantage, because he was leading the Germans, who are naturally generous to a fault. Joseph Goebbels explained this in a generous 1941 article, titled “The Jews Are Guilty!”:
If we Germans have a fateful flaw in our national character, it is forgetfulness. This failing speaks well of our human decency and generosity, but not always for our political wisdom or intelligence. We think everyone else is as good natured as we are.
At this point, all we can do is pray that someday we will find it in our hearts to be as generous as 1940s-era Germans. Overall, we’re still a long way from that achievement, but certainly there seem to be some pioneers among us right now who are getting close.
american corporations have never been PATRIOTIC!!!
American supporters of the European Fascists
A number of prominent and wealthy American businessmen helped to support fascist regimes in Europe from the 1920s through the 1940s. These people helped to support Francisco Franco during the Spanish Civil War, which began in 1936, as well as Benito Mussolini and Adolf Hitler.
Some of the primary and more famous Americans and companies that were involved with the fascist regimes of Europe are: William Randolph Hearst, Joseph Kennedy (JFK's father), Charles Lindbergh, John Rockefeller, Andrew Mellon (head of Alcoa, banker, and Secretary of Treasury), DuPont, General Motors, Standard Oil (now Exxon), Ford, ITT, Allen Dulles (later head of the CIA), Prescott Bush, National City Bank, and General Electric.
It should be noted that businessmen from many countries, including England and Australia, also worked with the fascist regimes of Europe prior to WWII. The fascist governments were involved in a high level of construction, production, and international business.
I.G. Farben, a German company, was the largest chemical manufacturing enterprise in the world during the early part of the 20th century. As such the company had many holdings in a variety of countries, including America. The American holdings of I.G. Farben included Bayer Co., General Aniline Works, Agfa Ansco, and Winthrop Chemical Company.
I.G. Farben was critical in the development of the German economy and war machine leading up to WWII. During this time I.G. Farben's international holdings along with its international business contracts with companies like Standard Oil, DuPont, Alcoa, and Dow Chemical were crucial in supplying the Nazi regime with the materials needed for war as well as financial support.
The Spanish Civil War was the precursor to World War II. Fascist Francisco Franco was aided by Hitler and Mussolini during that war. At the same time, GM, Ford, DuPont, and Standard Oil were working with Franco and supplying the fascist powers of Europe, even as many Americans were protesting the events in Europe and the involvement of American companies in aiding the fascist powers. A group of American volunteers known as the Abraham Lincoln Brigade went to Spain to fight against Franco in defense of the Spanish Republic. The group was made up primarily of members of American leftist organizations, such as socialist and communist parties.
The success of the fascists in Spain was an important first step in the building of fascist power in Europe and the stepping-stone for the Italian and German powers.
The support of American corporations, and lack of American intervention by the government, was crucial in the success of this first step.
American banks and businesses continued to support the fascist regimes of Europe legally up until the day Germany declared war on America, when the activities were stopped under the Trading with the Enemy Act. Despite this, some companies and individuals still maintained business relationships with the Third Reich. Ford and GM supplied European fascists with trucks and equipment and invested money in I.G. Farben plants. Standard Oil supplied the fascists with fuel. US Steel and Alcoa supplied them with critically needed metals. American banks gave them billions of dollars' worth of loans.
The following is excerpted from a report printed by the United States Senate Committee on the Judiciary in 1974:
The activities of General Motors, Ford and Chrysler prior to and during World War II...are instructive. At that time, these three firms dominated motor vehicle production in both the United States and Germany. Due to its mass production capabilities, automobile manufacturing is one of the most crucial industries with respect to national defense. As a result, these firms retained the economic and political power to affect the shape of governmental relations both within and between these nations in a manner which maximized corporate global profits. In short, they were private governments unaccountable to the citizens of any country yet possessing tremendous influence over the course of war and peace in the world. The substantial contribution of these firms to the American war effort in terms of tanks, aircraft components, and other military equipment is widely acknowledged. Less well known are the simultaneous contributions of their foreign subsidiaries to the Axis Powers. In sum, they maximized profits by supplying both sides with the materiel needed to conduct the war.
During the 1920's and 1930's, the Big Three automakers undertook an extensive program of multinational expansion...By the mid-1930's, these three American companies owned automotive subsidiaries throughout Europe and the Far East; many of their largest facilities were located in the politically sensitive nations of Germany, Poland, Rumania, Austria, Hungary, Latvia, and Japan...Due to their concentrated economic power over motor vehicle production in both Allied and Axis territories, the Big Three inevitably became major factors in the preparations and progress of the war. In Germany, for example, General Motors and Ford became an integral part of the Nazi war efforts. GM's plants in Germany built thousands of bomber and jet fighter propulsion systems for the Luftwaffe at the same time that its American plants produced aircraft engines for the U.S. Army Air Corps....
Ford was also active in Nazi Germany's prewar preparations. In 1938, for instance, it opened a truck assembly plant in Berlin whose "real purpose," according to U.S. Army Intelligence, was producing "troop transport-type" vehicles for the Wehrmacht. That year Ford's chief executive received the Nazi German Eagle (first class)....
The outbreak of war in September 1939 resulted inevitably in the full conversion by GM and Ford of their Axis plants to the production of military aircraft and trucks.... On the ground, GM and Ford subsidiaries built nearly 90 percent of the armored "mule" 3-ton half-trucks and more than 70 percent of the Reich's medium and heavy-duty trucks. These vehicles, according to American intelligence reports, served as "the backbone of the German Army transportation system."....
After the cessation of hostilities, GM and Ford demanded reparations from the U.S. Government for wartime damages sustained by their Axis facilities as a result of Allied bombing... Ford received a little less than $1 million, primarily as a result of damages sustained by its military truck complex at Cologne...
Due to their multinational dominance of motor vehicle production, GM and Ford became principal suppliers for the forces of fascism as well as for the forces of democracy. It may, of course, be argued that participating in both sides of an international conflict, like the common corporate practice of investing in both political parties before an election, is an appropriate corporate activity. Had the Nazis won, General Motors and Ford would have appeared impeccably Nazi; as Hitler lost, these companies were able to re-emerge impeccably American. In either case, the viability of these corporations and the interests of their respective stockholders would have been preserved.
Behind the Criminal Immigration Law: Eugenics and White Supremacy
The history of the statute that can make it a felony to illegally enter the country involves some dark corners of U.S. history.
by Ian MacDougall - pro publica
June 19, 8:15 p.m. EDT
...The federal law they say they are enforcing makes it a crime for foreign citizens to cross (or attempt to cross) the border into the U.S. anywhere other than an official port of entry. A first offense is a misdemeanor; a second unlawful entry is a felony.
The law’s ancestry dates back to World War I. Until that point, U.S. immigration laws had tended to be all or nothing: either no limits at all — or blanket bans for certain groups, such as the Chinese Exclusion Act. Others were free to enter provided they weren’t “lunatics,” polygamists, prostitutes, “suffering from a loathsome or a dangerous contagious disease,” and so on.
The result was floods of immigrants: Between 1901 and 1910, for example, close to 9 million came to the U.S. As that happened, anti-immigrant attitudes mounted, with mass influxes from parts of Europe associated in the popular imagination with a litany of social problems, like urban poverty and squalor.
In May 1918, after the U.S. had entered World War I, Congress passed a statute called the Passport Act that gave the president the power to restrict the comings and goings of foreign citizens during wartime. A few months later, however, the war ended — and with it, the restrictions on border crossings.
Federal officials saw potential in the criminal provisions of the Passport Act — a maximum 20-year sentence — as a tool for deterring immigration. So prosecutors ignored the expiration of the law and continued to indict migrants under the Passport Act for unlawful entry into the U.S.
Anti-immigration sentiment continued to climb and the rhetoric of the era has resonance today. One anti-immigration group at the time claimed that immigrants tended to be “vicious and criminal” — the “bootleggers, gangsters, and racketeers of large cities.” The war, Columbia University historian Mae Ngai has written, “raised nationalism and anti-foreign sentiment to a high pitch.”
In response, Congress began clamping down. With the Immigration Act of 1924, it capped the flow at about 165,000 people a year, a small fraction of previous levels. The statute’s quotas severely curtailed migration from southern and eastern Europe. Another 1924 law — the Oriental Exclusion Act — banned most immigration from Asia. At the same time, Congress made it easier to deport non-citizens for immigration violations.
In 1925, a federal appeals court put a halt to the practice of indicting migrants under the Passport Act outside wartime. But immigration officials liked what they’d seen, and by 1927, they were working on a replacement.
Two men spearheaded the effort that would lead Congress to criminalize unlawful entry into the United States. They were motivated by eugenics and white supremacy.
The first was James Davis, who was Secretary of Labor from 1921 to 1930. A Republican originally appointed by President Warren Harding, Davis was himself an immigrant from Wales who went by “Puddler Jim,” a reference to his job as a youthful worker in the steel mills of western Pennsylvania. At the time, the Department of Labor oversaw immigration, and Davis had grown disturbed by what he’d seen.
Davis was a committed eugenicist, and he believed principles of eugenics should guide immigration policy, according to The Bully Pulpit and the Melting Pot by the historian Hans Vought. It was necessary to draw a distinction, Davis had written in 1923, between “bad stock and good stock, weak blood and strong blood, sound heredity and sickly human stuff.”
In November 1927, Davis proposed a set of immigration reforms in the pages of The New York Times. Among his goals: “the definite lessening and possibly, in time, the complete checking of the degenerate and the bearer of degenerates.” One “phase of the immigration problem,” Davis wrote, was the “surreptitious entry of aliens” into the United States in numbers that “cannot even be approximately estimated.”
Deportation alone wasn’t enough to deter illegal immigration, Davis wrote. There was nothing disincentivizing the migrant from turning around and trying again. “Endeavoring to stop this law violation” by deportation only, he wrote, “is like trying to prevent burglary with a penalty no severer than opening the front door of the burglarized residence, should the burglar be found within, escorting him to it, and saying ‘You have no right here; see that you don’t come in again.’”
An immigrant who enters the country unlawfully, he concluded, “should be treated as a law violator and punished effectively.”
To bring his vision to fruition, Davis teamed up with a senator from South Carolina. Coleman Livingston Blease, a Democrat, was “a proud and unreconstructed white supremacist,” UCLA history professor Kelly Lytle Hernández wrote in her 2017 book City of Inmates.
Migrants from Mexico were one group whose numbers the increasingly powerful nativist elements in Congress hadn’t managed to restrict. Mexican workers were key to the booming economy of the southwest. Regional employers, particularly in the agricultural sector, had successfully lobbied Congress to block any bill that would choke off their primary source of inexpensive labor. As a result, migration from Mexico soared, with many Mexicans making illegal border crossings to avoid the cost and inconvenience of customs stations.
Blease saw in Davis’s proposal for criminal penalties a way to advance his vision of a white America, and he believed it would bridge the gap between the nativists clamoring for quotas and southwestern congressmen resisting them. Large-scale farmers didn’t mind criminal penalties, Hernández writes, so long as the law was enforced once the harvest was over.
The legislation wasn’t without its opponents, as the UCLA law professor Ingrid Eagly documented in a 2010 study of immigration prosecutions. Groups like the American Civil Liberties Union opposed the bill. The ACLU felt it was unfair and unlikely to deter migration. An immigrant “may be quite ignorant of this law before he starts on his journey,” the group told Congress.
Despite the ACLU’s objections, a Republican-controlled Congress passed Davis and Blease’s bill in 1929. A Republican president, Herbert Hoover, signed it into law.
The law made it a crime to enter the United States unlawfully and, in so doing, “created the criminalization of the border,” Eagly said.
The statute was swiftly put to use. Between July 1929 and June 1930, according to a Department of Labor report, prosecutors brought more than 6,000 unlawful entry cases. “It is believed that it will prove an effective deterrent,” the report’s author wrote. (In his recent memo, Sessions made similar claims about the Trump administration’s zero-tolerance policy.)
But the law didn’t reduce migration. By 1933, the Labor Department concluded that its rosy outlook had been wrong. The 1929 law “does not seem to have the deterrent effect expected,” noted a Labor Department report published that year.
It blamed budget limitations and judges wary of meting out serious sentences if a defendant was going to be deported anyway.
In the 1930s, the Great Depression achieved what prosecutions and deportations had not. Immigration plunged as the labor market in the United States dried up. Prosecutions for unlawful entry dropped to about 5,000 a year, according to a 2012 examination of the law by Doug Keller in the Loyola University Chicago Law Journal.
A shortage of labor during World War II prompted the U.S. to reverse course and encourage migration of temporary workers from Mexico through what it called the Bracero program. (The word refers to manual laborers in Spanish.)
Despite the earlier lessons, federal prosecutors began to focus their attention on bringing unlawful entry cases against Mexican migrants to deter workers from going around the Bracero program. By 1951, there were 15,000 illegal entry and re-entry prosecutions a year.
At the same time, Congress was working to overhaul American immigration law. The effort was spearheaded by two Democrats: Sen. Patrick McCarran and Rep. Francis Walter. Both were staunch anti-Communists who saw immigration — particularly from Eastern Europe and Asia — as posing a risk that Soviet or Maoist agents would infiltrate the country.
Their law is best known for preserving a quota system that meant about 85 percent of immigration visas annually went to people from northern and western Europe. But it also made a crucial change in the unlawful entry law.
In a counterintuitive move, Congress decided to reduce the penalties for unlawful entry — to a maximum of six months in prison. (It also added a felony provision for any additional illegal entry convictions.)
The change wasn’t driven by compassion or a shift away from criminalizing unlawful immigration. Rather, it anticipated the creation of federal magistrate courts that would handle the cases, according to Eagly, the UCLA law professor. A defendant facing a misdemeanor charge punishable by six months or less generally doesn’t have a right to a grand jury indictment or a jury trial. Once Congress established federal magistrate courts, prosecutors could bring criminal charges against far larger numbers of defendants.
A Democratic-controlled Congress passed the law in 1952, but it was vetoed by President Harry Truman. His veto message decried “carrying over into this year of 1952 the isolationist limitations of our 1924 law.” Congress was unmoved and overrode his veto. (In this sense, Trump is correct that Democrats bear some responsibility for the unlawful entry law that underlies his administration’s new immigration policy.)
The unlawful entry statute has remained largely unchanged since 1952. In 1968, however, Congress finally passed a law establishing federal magistrate courts, allowing for a major expansion of charges under the unlawful entry law. Without the need to go through the grand jury process or deal with potential jury trials, immigration prosecutions — almost all for unlawful entry — shot up, Eagly found in her 2010 study: from 2,536 cases nationwide in 1968 to 17,858 in 1974.
The trend culminated in programs like Operation Streamline during the George W. Bush administration, in which magistrate judges along the border took simultaneous mass guilty pleas for unlawful entry. (An appeals court ended the practice in 2009.)
The use of the law hasn’t been a partisan matter. The number of such cases spiked to nearly 50,000 in the last year of the Bush administration, and it stayed in that range for most of the Obama administration, according to federal government data maintained by the Transactional Records Access Clearinghouse at Syracuse University. By 2016, the number had fallen to about 35,000 — still higher than all but the last year of the Bush administration.
But the number of unlawful entry cases fell, the TRAC data shows, during Trump’s first year in office, to 27,000. (It had begun to rise again in recent months, however, even before Sessions announced the administration’s “zero-tolerance” policy.)
Convictions for immigration crimes now account for more than half of all federal criminal convictions.
The law’s ancestry dates back to World War I. Till that point, U.S. immigration laws had tended to be all or nothing: either no limits at all — or blanket bans for certain groups, such as the Chinese Exclusion Act. Others were free to enter provided they weren’t “lunatics,” polygamists, prostitutes, “suffering from a loathsome or a dangerous contagious disease,” or so on.
The result was floods of immigrants: Between 1901 and 1910, for example, close to 9 million came to the U.S. As that happened, anti-immigrant attitudes mounted, with mass influxes from parts of Europe associated in the popular imagination with a litany of social problems, like urban poverty and squalor.
In May 1918, after the U.S. had entered World War I, Congress passed a statute called the Passport Act that gave the president the power to restrict the comings and goings of foreign citizens during wartime. A few months later, however, the war ended — and with it, the restrictions on border crossings.
Federal officials saw potential in the criminal provisions of the Passport Act — a maximum 20-year sentence — as a tool for deterring immigration. So prosecutors ignored the expiration of the law and continued to indict migrants under the Passport Act for unlawful entry into the U.S.
Anti-immigration sentiment continued to climb and the rhetoric of the era has resonance today. One anti-immigration group at the time claimed that immigrants tended to be “vicious and criminal” — the “bootleggers, gangsters, and racketeers of large cities.” The war, Columbia University historian Mae Ngai has written, “raised nationalism and anti-foreign sentiment to a high pitch.”
In response, Congress began clamping down. With the Immigration Act of 1924, it capped the flow at about 165,000 people a year, a small fraction of previous levels. The statute’s quotas severely curtailed migration from southern and eastern Europe. Another 1924 law — the Oriental Exclusion Act — banned most immigration from Asia. At the same time, Congress made it easier to deport non-citizens for immigration violations.
In 1925, a federal appeals court put a halt to the practice of indicting migrants under the Passport Act outside wartime. But immigration officials liked what they’d seen, and by 1927, they were working on a replacement.
Two men spearheaded the effort that would lead Congress to criminalize unlawful entry into the United States. They were motivated by eugenics and white supremacy.
The first was James Davis, who was Secretary of Labor from 1921 to 1930. A Republican originally appointed by President Warren Harding, Davis was himself an immigrant from Wales who went by “Puddler Jim,” a reference to his job as a youthful worker in the steel mills of western Pennsylvania. At the time, the Department of Labor oversaw immigration, and Davis had grown disturbed by what he’d seen.
Davis was a committed eugenicist, and he believed principles of eugenics should guide immigration policy, according to The Bully Pulpit and the Melting Pot by the historian Hans Vought. It was necessary to draw a distinction, Davis had written in 1923, between “bad stock and good stock, weak blood and strong blood, sound heredity and sickly human stuff.”
In November 1927, Davis proposed a set of immigration reforms in the pages of The New York Times. Among his goals: “the definite lessening and possibly, in time, the complete checking of the degenerate and the bearer of degenerates.” One “phase of the immigration problem,” Davis wrote, was the “surreptitious entry of aliens” into the United States in numbers that “cannot even be approximately estimated.”
Deportation alone wasn’t enough to deter illegal immigration, Davis wrote. There was nothing disincentivizing the migrant from turning around and trying again. “Endeavoring to stop this law violation” by deportation only, he wrote, “is like trying to prevent burglary with a penalty no severer than opening the front door of the burglarized residence, should the burglar be found within, escorting him to it, and saying ‘You have no right here; see that you don’t come in again.’”
An immigrant who enters the country unlawfully, he concluded, “should be treated as a law violator and punished effectively.”
To bring his vision to fruition, Davis teamed up with a senator from South Carolina. Coleman Livingston Blease, a Democrat, was “a proud and unreconstructed white supremacist,” UCLA history professor Kelly Lytle Hernández wrote in her 2017 book City of Inmates.
Migrants from Mexico were one group whose numbers the increasingly powerful nativist elements in Congress hadn’t managed to restrict. Mexican workers were key to the booming economy of the southwest. Regional employers, particularly in the agricultural sector, had successfully lobbied Congress to block any bill that would choke off their primary source of inexpensive labor. As a result, migration from Mexico soared, with many Mexicans making illegal border crossings to avoid the cost and inconvenience of customs stations.
Blease saw in Davis’s proposal for criminal penalties a way to advance his vision of a white America, and he believed it would bridge the gap between the nativists clamoring for quotas and southwestern congressmen resisting them. Large-scale farmers didn’t mind criminal penalties, Hernández writes, so long as the law was enforced once the harvest was over.
The legislation wasn’t without its opponents, as the UCLA law professor Ingrid Eagly documented in a 2010 study of immigration prosecutions. Groups like the American Civil Liberties Union opposed the bill. The ACLU felt it was unfair and unlikely to deter migration. An immigrant “may be quite ignorant of this law before he starts on his journey,” the group told Congress.
Despite the ACLU’s objections, a Republican-controlled Congress passed Davis and Blease’s bill in 1929. A Republican president, Herbert Hoover, signed it into law.
The law made it a crime to enter the United States unlawfully and, in so doing, “created the criminalization of the border,” Eagly said.
The statute was swiftly put to use. Between July 1929 and June 1930, according to a Department of Labor report, prosecutors brought more than 6,000 unlawful entry cases. “It is believed that it will prove an effective deterrent,” the report’s author wrote. (In his recent memo, Sessions made similar claims about the Trump administration’s zero-tolerance policy.)
But the law didn’t reduce migration. By 1933, the Labor Department concluded that its rosy outlook had been wrong. The 1929 law “does not seem to have the deterrent effect expected,” noted a Labor Department report published that year.
It blamed budget limitations and judges wary of meting out serious sentences if a defendant was going to be deported anyway.
In the 1930s, the Great Depression achieved what prosecutions and deportations had not. Immigration plunged as the labor market in the United States dried up. Prosecutions for unlawful entry dropped to about 5,000 a year, according to a 2012 examination of the law by Doug Keller in the Loyola University Chicago Law Journal.
A shortage of labor during World War II prompted the U.S. to reverse course and encourage migration of temporary workers from Mexico through what it called the Bracero program. (The word refers to manual laborers in Spanish.)
Despite the earlier lessons, federal prosecutors began to focus their attention on bringing unlawful entry cases against Mexican migrants to deter workers from going around the Bracero program. By 1951, there were 15,000 illegal entry and re-entry prosecutions a year.
At the same time, Congress was working to overhaul American immigration law. The effort was spearheaded by two Democrats: Sen. Patrick McCarran and Rep. Francis Walter. Both were staunch anti-Communists who saw immigration — particularly from Eastern Europe and Asia — as posing a risk that Soviet or Maoist agents would infiltrate the country.
Their law is best known for preserving a quota system that meant about 85 percent of immigration visas annually went to people from northern and western Europe. But it also made a crucial change in the unlawful entry law.
In a counterintuitive move, Congress decided to reduce the penalties for unlawful entry — to a maximum of six months in prison. (It also added a felony provision for any additional illegal entry convictions.)
The change wasn’t driven by compassion or a shift away from criminalizing unlawful immigration. Rather, it anticipated the creation of federal magistrate courts that would handle the cases, according to Eagly, the UCLA law professor. A defendant facing a misdemeanor charge punishable by six months or less generally doesn’t have a right to a grand jury indictment or a jury trial. Once Congress established federal magistrate courts, prosecutors could bring criminal charges against far larger numbers of defendants.
A Democratic-controlled Congress passed the law in 1952, but it was vetoed by President Harry Truman. His veto message decried “carrying over into this year of 1952 the isolationist limitations of our 1924 law.” Congress was unmoved and overrode his veto. (In this sense, Trump is correct that Democrats bear some responsibility for the unlawful entry law that underlies his administration’s new immigration policy.)
The unlawful entry statute has remained largely unchanged since 1952. In 1968, however, Congress finally passed a law establishing federal magistrate courts, allowing for a major expansion of charges under the unlawful entry law. Without the need to go through the grand jury process or deal with potential jury trials, immigration prosecutions — almost all for unlawful entry — shot up, Eagly found in her 2010 study: from 2,536 cases nationwide in 1968 to 17,858 in 1974.
The trend culminated in programs like Operation Streamline during the George W. Bush administration, in which magistrate judges along the border took simultaneous mass guilty pleas for unlawful entry. (An appeals court ended the practice in 2009.)
The use of the law hasn’t been a partisan matter. The number of such cases spiked to nearly 50,000 in the last year of the Bush administration, and it stayed in that range for most of the Obama administration, according to federal government data maintained by the Transactional Records Access Clearinghouse at Syracuse University. By 2016, the number had fallen to about 35,000 — still higher than all but the last year of the Bush administration.
But the number of unlawful entry cases fell, the TRAC data shows, during Trump’s first year in office, to 27,000. (It had begun to rise again in recent months, however, even before Sessions announced the administration’s “zero-tolerance” policy.)
Convictions for immigration crimes now account for more than half of all federal criminal convictions.
Hawaii’s fight against Trump’s Muslim travel ban has long roots of resistance
As SCOTUS prepares to rule on Trump v. Hawaii, a reminder that Hawaii stood up for Japanese Americans in WWII
RAHNA REIKO RIZZUTO - salon
JUNE 16, 2018 5:00PM (UTC)
In 1945, my great uncle died for his country, one of 400,000 American soldiers who gave their lives during World War II. He was a member of the most decorated unit in the history of American warfare, one of the “little brown soldiers” that saved the Lost Battalion from Texas in the Vosges Mountains. Four casualties for every man saved. My great uncle, Robert Ozaki, shows up in written accounts of that battle, leading a bayonet charge when his lieutenant disappeared and was thought captured. He arrived at the hospital in Colorado with shrapnel in his back, and our family story has it that he kept shaking down his thermometer so that the doctors could attend to other soldiers.
Your hero? He served fiercely and with honor, and died in that hospital: a recipient of a Silver Star and a Purple Heart.
Your enemy? His own government branded him class 4-C, an “enemy alien,” and would not let his family attend his funeral.
Unless you are in the Marvel Comic universe, it’s hard to be both. But my Silver Star great uncle was Japanese-American. The decades leading up to the war were a time of virulent hatred for the Japanese, with terms like “inscrutable,” “repulsive” and “the yellow peril” thrown around freely. Racism was codified and supported by the president, Congress, the courts and local government, and urged on in headlines in the media. Robert Ozaki would have remained a “menace,” if it were not for Hawaii. And this month, as we await the ruling of the Supreme Court on Hawaii’s challenge to Donald Trump’s travel ban, it is worth remembering that this is not the first time that Hawaii stood up to the overt racist policies of the U.S. government and said no.
In 1942, after Pearl Harbor was attacked, President Roosevelt issued Executive Order 9066, giving the military broad powers to detain anyone on the west coast of the mainland determined to be a threat. Although it did not mention ethnicity, it was designed to target people of Japanese ancestry and it did. Nearly 120,000 Japanese-American citizens and their immigrant parents were rounded up, stripped of their citizenship, labeled enemy aliens, and imprisoned behind barbed wire in incarceration camps across the country. My infant mother, her parents, aunts, uncles and grandparents were with them.
In what world are infants and old ladies a threat? In a world where anti-Japanese (and “Asiatic”) sentiment already had a long and ugly history. After first importing Japanese workers as cheap labor, the United States had completely banned immigration from Japan. Laws had already been passed to ensure that the Japanese could never become citizens or own property. But their children were American citizens, and as they began owning farms and businesses, hysteria grew. The propaganda machine (the fake news of the 1940s) taught Americans that “Japs” were snakes, beasts, who would marry your daughters, rape the world and steal your stuff; they were to be slapped, smacked, banished and exterminated. American citizenship, hard work, community service and a clean record did not help those individuals then, just as law-abiding immigrants are not safe now. The messages were violent and they were everywhere.
It was a racism rooted in greed: Within a few decades, Japanese-American farms on the West Coast were seven times more profitable than the average. Japanese Americans controlled two-thirds of the Los Angeles flower market, and were projected to produce 40 percent of the produce needed for the war effort. In giving them a week to dispose of everything they owned, and holding them prisoner in camps where they could not make enough money to pay the taxes on any properties they still owned, the evacuation effectively stripped them of everything.
The mission that was accomplished by Roosevelt’s Executive Order was not safety for America. Despite the excuse of national security, there was not one single case of espionage during the war. The result was the successful cleansing of the West Coast of all persons of Japanese ancestry, and the transfer of between $150 million and $400 million of assets back into Caucasian hands.
In the territory of Hawaii, however, events spun out differently, with history-making results. There, martial law was also declared, with similar exclusion orders. However, the commanding general, Lt. Gen. Delos Emmons, refused to evacuate the Japanese Americans, who made up 37 percent of the population and a significant portion of the economy. Emmons flipped the script, arguing that it was better for the overall economy to leave them free. He refuted the rumors, false claims of espionage and the violently anti-Japanese sentiment that was fueling calls for exclusion. Instead, he chose to do something radical: to treat the Japanese Americans as lawful, loyal citizens, and trust them. He even gave them back their guns.
After Pearl Harbor was bombed, Japanese Americans serving in the Hawaii Territorial Guard were discharged at first, but petitioned to continue to serve. Emmons eventually placed them into a lone battalion, the 100th, or One-Puka-Puka. Some 10,000 Japanese American men living in Hawaii volunteered to enlist. Their fierce dedication altered the face of the war for the Japanese Americans. Pressured to find a home for the battalion, the U.S. government began to reconsider their status. The War Department asked for volunteers from behind barbed wire, and eventually began drafting men out of the camps to create the all-Japanese American 442nd Regimental Combat Team that would include the 100th Battalion. The 442nd proved that they were not snakes by earning more than 18,000 military awards among 14,000 soldiers, including 9,486 Purple Hearts, 560 Silver Stars and 21 Medals of Honor.
Today, 75 years later, racism is still rampant, and still a smokescreen for greed. All the horrifying treatment of humans that is playing out in our daily newsfeeds — within our own country and at our borders — is based on the same triggers, and the same arguments. Today’s monsters are still people of color, immigrants, people who don’t speak our language. They are still born from our worries about our safety and our fears that there is a lack of jobs and money and that there is not enough for us. As we twist ourselves in knots to erase or justify our actions (turning off body cameras, claiming to be protecting child refugees while we build new for-profit prisons for their parents), it is worth remembering that our safety does not come from threatening the safety of others. Quite the opposite. Our fears imprison us all. Racism is taught; it is deliberate. And until we can see through the lie that we are each other’s enemies, we cannot follow the money and the power to understand who our teachers are.
In 18 long months, the Trump administration has distinguished itself by its many racist and discriminatory policies and executive orders. The actions of its agencies are routinely being challenged in court. Though racism is hardly new in our country (the Japanese American incarceration being just one small example), it is clearly blossoming, thanks to the propaganda that is, once again, infusing the media and every branch of government, and coming from the top. The argument that this revised travel ban is not racist is bogus. It is worth remembering that Executive Order 9066 did not mention ethnicity or race but was to apply to “any or all persons.” Both were justified on grounds of national defense. Just as Roosevelt’s order was a tool for racism, this administration’s actions and words make it clear that the travel ban will be another tool in our growing arsenal against people who are “other,” who we are being told are threatening our safety and our stuff.
In the 1940s, the Supreme Court rejected the first three challenges to the incarceration, before finally ruling that the government had no legal right to imprison a loyal citizen. The damage was already done. In 1988, when Ronald Reagan signed the Civil Liberties Act apologizing for the incarceration, he repeatedly mentioned the bravery of the Japanese American soldiers as proof that the incarceration was a “mistake” and one “based solely on race.”
These reversals may not have been possible if Hawaii had buckled under and followed a different path. Our safety will not be gained in lawsuits. Justice may not be supreme. We must all find a way to question the propaganda and the policies that have been designed to separate us and to see each other as human. If we need assurance that our enemy may indeed be our hero, the all-Japanese American 442nd Regimental Combat Team is a potent reminder that beneath the different skin and eyes of “the other” may beat 9,486 Purple Hearts.
America’s segregated shores: beaches' long history as a racial battleground
For decades officials imposed regulations to restrict African Americans’ use of public beaches – and the fight for equal access is far from over
Andrew W Kahrl
the guardian
Tue 12 Jun 2018 06.00 EDT
Summer has arrived, which, for many Americans, means day camps for children, afternoons lounging by a pool, and weekend trips to the beach. But for many others, summer brings new burdens, frustrations and fears: the end of free or reduced meals for children at schools, the added cost of childcare and the search – often in vain – for safe, affordable and accessible places to play and cool off on a hot day.
Summers have long been America’s most segregated season. Nowhere is this more evident than along the nation’s beaches and coasts, one of the chief destinations for vacationers and pleasure seekers, and a perennial site of racial conflict and violence. The infamous 1919 Chicago race riot, which lasted seven days and claimed 38 lives, began on the shores of Lake Michigan, when white youth gang members stoned to death a black teenager named Eugene Williams after he had accidentally drifted across a color line in the water. In its aftermath, African Americans learned to avoid the city’s lakefront. As a child, black Chicagoan Dempsey Travis remembers, “I was never permitted to learn to swim. For six years, we lived within two blocks of the lake, but that did not change [my parents’] attitude. To Dad and Mama, the blue lake always had a tinge of red from the blood of that young black boy.”
In the decades that followed, local governments across the US enacted a host of policies and practices designed to segregate places of outdoor leisure by race and effectively exclude people of color from public beaches. In the south, those methods were quite explicit. Coastal cities such as Norfolk, Virginia, Charleston, South Carolina, and Miami, Florida, prohibited African Americans from stepping foot on any of their public beaches, and for years ignored blacks’ demands for public beaches of their own. Whites’ indifference to the health and humanity of black communities often had deadly consequences. Throughout the Jim Crow era, shockingly high numbers of black youth drowned each summer while playing in dangerous, and unsupervised, bodies of water. When white officials did respond to black demands for beaches and parks of their own, they invariably selected remote, polluted, often hazardous, locations. In Washington DC, officials designated Buzzard’s Point, a former dumping ground located downstream from a sewage plant, as an appropriate location for the city’s “colored” bathing beach. In New Orleans, it was a remote site on Lake Pontchartrain, 14 miles from downtown, surrounded on both sides by fishing camps that dumped raw sewage into the lake. One health official described the waters offshore as “grossly contaminated” and wholly unfit for bathing.
In the north, whites employed more subtle, but no less effective, methods of segregation. Predominantly white suburbs and towns in the north-east, for example, designated their public beaches for residents only, or charged exorbitant access fees for non-residents, or barred non-residents from parking near the shore, all designed to keep minority populations in neighboring cities out. City officials, meanwhile, failed to provide black neighborhoods with safe and decent places of public recreation and deliberately made beaches and pools frequented by middle-class whites inaccessible to the poor and people of color.
Here, too, whites’ determined efforts to keep black people out of their pools and off their beaches cost black children their lives. On a hot summer day in June 1968, teenagers Howard Martin and Lemark Hicks left their families’ low-income public housing units in Port Chester, New York, and went in search of a place to cool off. Hours later, scuba divers pulled their lifeless bodies from the depths of the Byram river, where the two African American boys had drowned while swimming, unsupervised, in the river’s dangerous currents. While the boys screamed for help, less than a mile away lifeguards kept a watchful eye over children playing in the surf at Byram Beach, one of three public beaches in the neighboring town of Greenwich, Connecticut. But despite its close proximity, these beaches, and the safety they afforded bathers, were not an option for Martin, Hicks, and all other black children living in Port Chester. They were for Greenwich residents only.
Such senseless tragedies fueled black unrest and played no small role in sparking urban uprisings during the long, hot summers of the 1960s. In 1968, public housing residents in Hartford, Connecticut, staged a series of protests following the drowning deaths of several children along a dangerous section of a river that snaked through their housing project. City officials had repeatedly ignored parents’ demands to fence off the area or, better yet, provide the neighborhood with a public swimming pool, and instead scolded parents for not keeping a better watch over their children. “This is what causes riots,” protest leader Barbara Henderson said in response. The Kerner commission concurred. In its 1968 report on the “riots” that had engulfed urban black America in previous summers, it listed “poor recreation facilities and programs” as the fifth most intense grievance of black populations in riot-torn cities, just behind policing practices, unemployment and underemployment, housing and education. In response, cities hastily built above-ground swimming pools and placed sprinklers on fire hydrants in black neighborhoods.
But aside from these modest gestures, little was done to address the underlying causes of summertime segregation and recreational inequality. In recent decades, fiscally distressed cities have slashed funding for outdoor recreation programs for disadvantaged children, and closed or sold off public parks, beaches and swimming pools in poorer neighborhoods, while affluent communities continue to employ the same tactics for keeping “undesirables” out of their parks and off their shores. Earlier this spring, officials in Westport, Connecticut, dramatically increased parking fees and slashed the number of passes sold to non-residents at its public beach. The move came after locals complained about the growing numbers of outsiders there the previous summer. In the exclusive community of Palos Verdes Estates, California, a gang of wealthy local whites (known as the “Lunada Bay Boys”) has been waging a decades-long campaign of terror against non-residents, especially African Americans, who seek access to the town’s public beach. Local residents have subjected visitors to beatings and assaults, racist epithets, sexual harassment, dog attacks, death threats, property destruction and vandalism, all with the tacit approval of local law enforcement. Officials in this and other affluent beachfront communities in Los Angeles, meanwhile, have for years thwarted attempts by the city’s regional transit authority to offer direct bus routes from black and brown inner-city neighborhoods to the beach. As a result, it is common to find black children living in Los Angeles who have never even seen the Pacific Ocean, much less spent a day on its shores.
As with schools and neighborhoods, the persistence of racial separatism in places of play didn’t just happen by chance. Nor does it, as some might claim, simply reflect people’s personal preferences. It is the result of public policies and private actions that, by design, aimed to segregate bodies of water by race and allow whites to claim the most desirable outdoor spaces for themselves. Many of these policies and practices remain in effect today. Undoing them is critical to making public space in America truly public, and to ensuring that all Americans enjoy the basic human right to leisure and recreation.
Colin Kaepernick Is Righter Than You Know: The National Anthem Is a Celebration of Slavery
Jon Schwarz - the intercept
August 28, 2016, 12:08 p.m.
Before a preseason game on Friday, San Francisco 49ers quarterback Colin Kaepernick refused to stand for the playing of “The Star-Spangled Banner.” When he explained why, he only spoke about the present: “I am not going to stand up to show pride in a flag for a country that oppresses black people and people of color. … There are bodies in the street and people getting paid leave and getting away with murder.”
Twitter then went predictably nuts, with at least one 49ers fan burning Kaepernick’s jersey.
Almost no one seems to be aware that even if the U.S. were a perfect country today, it would be bizarre to expect African-American players to stand for “The Star-Spangled Banner.” Why? Because it literally celebrates the murder of African-Americans.
Few people know this because we only ever sing the first verse. But read the end of the third verse and you’ll see why “The Star-Spangled Banner” is not just a musical atrocity, it’s an intellectual and moral one, too:
No refuge could save the hireling and slave
From the terror of flight or the gloom of the grave,
And the star-spangled banner in triumph doth wave
O’er the land of the free and the home of the brave.
“The Star-Spangled Banner,” Americans hazily remember, was written by Francis Scott Key about the Battle of Fort McHenry in Baltimore during the War of 1812. But we don’t ever talk about how the War of 1812 was a war of aggression that began with an attempt by the U.S. to grab Canada from the British Empire.
However, we’d wildly overestimated the strength of the U.S. military. By the time of the Battle of Fort McHenry in 1814, the British had counterattacked and overrun Washington, D.C., setting fire to the White House.
And one of the key tactics behind the British military’s success was its active recruitment of American slaves. As a detailed 2014 article in Harper’s explains, the orders given to the Royal Navy’s Admiral Sir George Cockburn read:
Let the landings you make be more for the protection of the desertion of the Black Population than with a view to any other advantage. … The great point to be attained is the cordial Support of the Black population. With them properly armed & backed with 20,000 British Troops, Mr. Madison will be hurled from his throne.
Whole families found their way to the ships of the British, who accepted everyone and pledged no one would be given back to their “owners.” Adult men were trained to create a regiment called the Colonial Marines, who participated in many of the most important battles, including the August 1814 raid on Washington.
Then on the night of September 13, 1814, the British bombarded Fort McHenry. Key, seeing the fort’s flag the next morning, was inspired to write the lyrics for “The Star-Spangled Banner.”
So when Key penned “No refuge could save the hireling and slave / From the terror of flight or the gloom of the grave,” he was taking great satisfaction in the death of slaves who’d freed themselves. His perspective may have been affected by the fact he owned several slaves himself.
With that in mind, think again about the next two lines: “And the star-spangled banner in triumph doth wave / O’er the land of the free and the home of the brave.”
The reality is that there were human beings fighting for freedom with incredible bravery during the War of 1812. However, “The Star-Spangled Banner” glorifies America’s “triumph” over them — and then turns that reality completely upside down, transforming their killers into the courageous freedom fighters.
After the U.S. and the British signed a peace treaty at the end of 1814, the U.S. government demanded the return of American “property,” which by that point numbered about 6,000 people. The British refused. Most of the 6,000 eventually settled in Canada, with some going to Trinidad, where their descendants are still known as “Merikins.”
Furthermore, if those leading the backlash against Kaepernick need more inspiration, they can get it from Francis Scott Key’s later life.
By 1833, Key was a district attorney for Washington, D.C. As described in a book called Snowstorm in August by former Washington Post reporter Jefferson Morley, the police were notorious thieves, frequently stealing free blacks’ possessions with impunity. One night, one of the constables tried to attack a woman who escaped and ran away — until she fell off a bridge across the Potomac and drowned.
“There is neither mercy nor justice for colored people in this district,” an abolitionist paper wrote. “No fuss or stir was made about it. She was got out of the river, and was buried, and there the matter ended.”
Key was furious and indicted the newspaper for intending “to injure, oppress, aggrieve & vilify the good name, fame, credit & reputation of the Magistrates & constables of Washington County.”
You can decide for yourself whether there’s some connection between what happened 200 years ago and what Colin Kaepernick is angry about today. Maybe it’s all ancient, meaningless history. Or maybe it’s not, and Kaepernick is right, and we really need a new national anthem.
The white supremacist roots of the Republicans’ so-called ‘right-to-work’ laws
Tamara Draut, Salon - raw story
07 JUN 2018 AT 12:57 ET
The U.S. Supreme Court will soon hand down its decision in Janus v. AFSCME Council 31, which challenges the ability of public sector unions to collect “fair share” fees from workers who are covered by a negotiated union contract but don’t want to join the union. While the case may seem technocratic, its argument is one thread of a well-worn tapestry by conservatives: attacking union rights to thwart working-class solidarity, especially across racial and ethnic lines.
At the heart of the case are what are deceptively known as “right-to-work” laws, which were conceived with the sole intention of maintaining racial wage hierarchies in the Jim Crow South as part of a larger conservative backlash to the success of union organizing in the years immediately following the passage of the National Labor Relations Act (also known as the Wagner Act) in 1935. We have to go that far back because what Congress did in that year, as long ago as it seems, greatly constrains working-class power today.
The Wagner Act essentially legalized the rights of employees to organize in unions and developed the process of union certification through a new agency created under the law, the National Labor Relations Board. It’s hard to overstate the radical nature of this law at the time it was enacted — and how surprising it was that the law was upheld by the Supreme Court. It seemed destined to be overturned, given the Court’s longstanding opposition to government involvement in the economy.
Indeed, most of the big corporations at the time — DuPont, General Motors and Republic Steel — ignored the law under this assumption, carrying on their normal business of fighting union attempts by firing activists, hiring spies, and stocking up on guns and tear gas. They funded the legal challenge to the National Labor Relations Act and a major public relations effort to smear the law in the court of public opinion. But in a 1937 decision in National Labor Relations Board v. Jones & Laughlin Steel Corporation, the Supreme Court declared the Wagner Act constitutional by sustaining Congress’s power to regulate employers under the commerce clause.
Within a year-and-a-half, 3 million new workers voted to be represented by a union in the Congress of Industrial Organizations (CIO). As America entered World War II and demand for machinery, ammunitions and aircraft soared, another 5 million workers voted for a union in just three years, including many women and African-Americans, who had gained new protections under federal contracts related to the war effort.
By the end of the 1940s, nearly one-third of American workers were unionized, winning contracts for better wages, job security and benefits. With unprecedented gains in the North and Midwest, the CIO set its sights on organizing the Jim Crow South. Termed Operation Dixie, the CIO aimed to organize one million Southern white and black workers, provoking the ire of Southern segregationists who rightly worried that working-class solidarity between blacks and whites would uproot the political power structure of the South.
It turned out that the Wagner Act did exactly what it was supposed to do — unleash a great spurt of workplace democracy — and segregationists and business leaders were less than pleased. So Republicans and Southern Democrats in Congress drafted a bill that would amend the Wagner Act by gutting many of the hard-fought labor rights it guaranteed. The bill, passed over President Truman’s veto, is known as Taft-Hartley for its co-sponsors, Senator Robert Taft and Representative Fred Hartley.
Taft-Hartley provided so-called free-speech rights to employers during an NLRB election, giving companies ample time and leeway to spread false and anti-union information to their workers. Employers could now hold mandatory meetings with workers to detail the perils of welcoming a union into the workplace, intimating that their jobs, or indeed the entire factory, might up and disappear. The Taft-Hartley amendments also outlawed industry-wide strikes, secondary boycotts, and sympathy strikes and gave the president more expansive authority to obtain injunctions against strikes if they jeopardized national interests.
Its most ideological mandate was to require all union officers to sign an affidavit saying that they were not members of the Communist Party. At the time, some of the most effective organizers in the labor movement were Communists. After Taft-Hartley was passed, some unions collapsed and others were purged from the CIO or left rather than signing the pledge. As intended, this provision of the law neutered the most radical and effective elements in the labor movement and washed the labor movement free of its most ardent supporters of women’s and civil rights.
Taft-Hartley also allowed states to pass “right-to-work” laws, which gave workers, even in a unionized workplace, the right to refuse to pay fair-share fees. Under these laws, workers can free-ride — enjoy the benefits of representation without having to pay for it — which makes the establishment and sustenance of a new union a much riskier proposition. In states without right-to-work laws, all workers who benefit from a union contract must pay a fair-share fee, even if they decline union membership. This fee basically represents the worker’s portion of the costs of the union’s providing collective bargaining and other benefits. This is a smaller fee than the overall union dues, which cover broader costs, such as political funding and lobbying.
That loss of revenue makes organizing in right-to-work states much more financially precarious and is a major reason that union density in those states is so much lower. In the aftermath of Taft-Hartley, a number of states quickly passed so-called “right-to-work” laws. Ten states, mostly in the South, passed them immediately in 1947, followed by another half dozen or so in the early 1950s.
It’s a technocratic policy with profound consequences, dreamed up by a rather infamous segregationist.
The development and promotion of “right-to-work” laws is largely credited to Vance Muse, a Texas oil industry lobbyist and known white supremacist who warned that without such legislation to impede union organizing, “white women and white men will be forced into organizations with black African apes whom they will have to call ‘brother’ or lose their jobs.” When it came to squashing working-class solidarity, big business and segregationists forged common cause — an alliance in the history books with significant impacts on workers’ rights today.
Right-to-work laws and other anti-union efforts are aimed at consolidating economic and political power for businesses and capital by preventing any whiff of working-class solidarity. Through union dues, the labor movement can amass significant resources to engage in voter turnout, agenda setting and issue advocacy, all on behalf of ordinary Americans. It is that amassing of political power — in addition to the fairer distribution of profits — that is so threatening to conservatives and corporate America. After all, big labor has been responsible for advances in our day-to-day lives that still make conservatives livid: Medicare, Medicaid and, yes, Obamacare too; unemployment insurance; Social Security; the 40-hour workweek; pensions (what’s left of them, anyway); the minimum wage. These are just the greatest hits; many other humane advances in our lives owe their existence to labor unions.
Big business saw the weakening of union rights as the first step in a campaign to bring down the entire New Deal order, and wasn’t shy about saying so. At the end of the war, Alfred Sloan, CEO of General Motors, spoke honestly about his disdain for the New Deal, saying, “It took 14 years to rid this country of Prohibition. It is going to take a good while to rid the country of the New Deal, but sooner or later the ax falls and we get a change.”
Power in America might be thought of as a scale with two sides: labor on the left and capital on the right. When one side loses political clout, the other side gains it. Today the right side of the scale overpowers the left. And that was no accident.
If the court sides with the plaintiff in Janus, it would essentially nationalize “right-to-work” laws in the public sector. All of us will pay the price — in the form of lower wages, job insecurity, miserly benefits and soaring inequality — so America’s plutocrats can tighten the political and economic vise that leaves most of us struggling.
These same plutocrats will point the finger for our struggles at new immigrants, black people and the poor, in a well-worn narrative that allows some politicians and their corporate donors to tilt the rules in their favor while the rest of us fight among ourselves for the leftovers. But we can change. We can come together across racial and ethnic lines to elect new leaders who will represent all of us, not just the wealthy few. We can start by refusing to fall for the divide-and-conquer politics at issue in the Janus case.
just like trump!!!
(NYT) Evidence Shows That Nixon Betrayed The U.S. In Order To Become President
the intellectualist
5/28/18
Evidence shows that President Richard Nixon colluded with the South Vietnamese for the purpose of winning the 1968 election over his Democratic opponent, Hubert Humphrey. By doing so, Mr. Nixon betrayed the United States for his own personal ambitions.
At the time, President Lyndon B. Johnson was negotiating a peace settlement with the North Vietnamese.
Nixon, through a trusted intermediary, contacted the South Vietnamese and promised them a better deal if they refused to work with Johnson. By 1968, 30,000 Americans had already died in America’s war in Vietnam.
The South Vietnamese, apparently believing Nixon’s promises, chose not to cooperate in American peace negotiations, damning the process.
Nearly 60,000 Americans were killed in Vietnam by the time the U.S. fled in 1975.
Following his resignation, Nixon denied harming peace talks between the Johnson Administration and North Vietnam; however, evidence discovered following Mr. Nixon’s death eviscerates these claims of innocence.
According to the New York Times:
“Now we know Nixon lied. A newfound cache of notes left by H. R. Haldeman, his closest aide, shows that Nixon directed his campaign’s efforts to scuttle the peace talks, which he feared could give his opponent, Vice President Hubert H. Humphrey, an edge in the 1968 election. On Oct. 22, 1968, he ordered Haldeman to “monkey wrench” the initiative.
Haldeman’s notes return us to the dark side. Amid the reappraisals, we must now weigh apparently criminal behavior that, given the human lives at stake and the decade of carnage that followed in Southeast Asia, may be more reprehensible than anything Nixon did in Watergate.”
In a conversation with the Republican Senator Everett Dirksen, the minority leader, Johnson lashed out at Nixon. “I’m reading their hand, Everett,” Johnson told his old friend. “This is treason.”
“I know,” Dirksen said mournfully.
Johnson’s closest aides urged him to unmask Nixon’s actions. But on a Nov. 4 conference call, they concluded that they could not go public because, among other factors, they lacked the “absolute proof,” as Defense Secretary Clark Clifford put it, of Nixon’s direct involvement.”
The Retired General Who Stopped a Wall Street Coup
General Smedley Butler blew the whistle on a millionaire-led effort to oust FDR and the New Deal.
By Jim Hightower - other words
May 23, 2018
Many Americans would be shocked to learn that political coups are part of our country’s history. Consider the Wall Street Putsch of 1933.
Never heard of it? It was a corporate conspiracy to oust Franklin D. Roosevelt, who had just been elected president.
With the Great Depression raging and millions of families financially devastated, FDR had launched several economic recovery programs to help people get back on their feet. To pay for this crucial effort, he had the audacity to raise taxes on the wealthy, and this enraged a group of Wall Street multimillionaires.
Wailing that their “liberty” to grab as much wealth as possible was being shackled, they accused the president of mounting a “class war.” To pull off their coup, they plotted to enlist a private military force made up of destitute World War I vets who were upset at not receiving promised federal bonus payments.
One of the multimillionaires’ lackeys reached out to a well-respected advocate for veterans: Retired Marine general Smedley Darlington Butler. They wanted him to lead 500,000 veterans in a march on Washington to force FDR from the White House.
They chose the wrong general. Butler was a patriot and lifelong soldier for democracy, who, in his later years, became a famous critic of corporate war profiteering.
Butler was repulsed by the hubris and treachery of these Wall Street aristocrats. He reached out to a reporter, and together they gathered proof to take to Congress. A special congressional committee investigated and found Butler’s story “alarmingly true,” leading to public hearings, with Butler giving detailed testimony.
By exposing the traitors, this courageous patriot nipped their coup in the bud. But their sense of entitlement reveals that we must be aware of the concentrated wealth of the imperious rich, for it poses an ever-present danger to majority rule.
The Myth of the Roosevelt “Trustbusters”
Teddy and FDR weren't the anti-corporate crusaders that they're portrayed as by populists today.
By Robert D. Atkinson and Michael Lind - the new republic
May 4, 2018
In the aftermath of the Great Recession, amid growing concerns about income inequality and wage stagnation, politicians and pundits on the left and right have blamed the problems of twenty-first-century America on a familiar populist scapegoat: big business. The solution, they say, can be found in the nation’s past—in particular, the reign of two twentieth-century presidents.
In the early 1900s, the narrative goes, Theodore Roosevelt waged war on corporate concentration as a crusading “trustbuster.” A generation later, during the Great Depression, his cousin Franklin D. Roosevelt stood up for small banks against Wall Street’s big bullies. The Roosevelts saved America from plutocracy and created a golden age for the middle class. Thus, many argue, we need a new generation of trustbusters to save us from the robber barons of tech and banking.
It makes for a compelling case. But it’s based on a false history.
Teddy Roosevelt was far from the business-bashing “trustbuster” of popular memory. The Republican president distinguished between “good” and “bad” trusts, telling Congress in 1905, “I am in no sense hostile to corporations. This is an age of combination, and any effort to prevent combination will not only be useless, but in the end, vicious…”
It is true that his administration brought 44 antitrust actions against corporations and business combinations, including the Northern Securities railroad company and the “beef trust” in meatpacking, which were ultimately broken up by the Supreme Court. But Roosevelt had profound doubts about antitrust, observing that “a succession of lawsuits is hopeless from the standpoint of working out a permanently satisfactory solution” to the problems posed by big business. Indeed, he wanted to replace antitrust policy with federal regulation of firms by a powerful Bureau of Corporations, whose decisions would be shielded from judicial review.
His Republican successor in the White House, William Howard Taft, initiated twice as many antitrust lawsuits in four years as Roosevelt had done in his seven and a half years in office. Privately, Roosevelt raged when the Supreme Court ordered the break-up of Standard Oil, in an antitrust lawsuit begun under his administration and completed under Taft: “I do not see what good can come from dissolving the Standard Oil Company into 40 separate companies, all of which will still remain really under the same control. What we should have is a much stricter government supervision of these great companies, but accompanying this supervision should be a recognition of the fact that great combinations have come to stay and we must do them scrupulous justice just as we exact scrupulous justice from them.”
Anger at Taft was one of the factors that motivated Roosevelt to run for president again in 1912 as the candidate of the Progressive Party. The party’s platform reflected his view that big business overall was a positive force, but needed federal regulation: “The corporation is an essential part of modern business. The concentration of modern business, in some degree, is both inevitable and necessary for national and international business efficiency.” The remedy for abuse was not mindlessly breaking up big firms, but preventing specific abuses by means of a strong national regulation of interstate corporations.
Like Roosevelt, FDR is falsely remembered as an enemy of big business. When running for office in 1932, the Democrat mocked the populists who supported antitrust: “The cry was raised against the great corporations. Theodore Roosevelt, the first great Republican Progressive, fought a Presidential campaign on the issue of ‘trust busting’ and talked freely about malefactors of great wealth. If the government had a policy it was rather to turn the clock back, to destroy the large combinations and to return to the time when every man owned his individual small business. This was impossible.” FDR agreed with his cousin that the answer was regulation, not breaking up big corporations: “Nor today should we abandon the principle of strong economic units called corporations, merely because their power is susceptible of easy abuse.”
In his first term, FDR attempted to restructure the U.S. economy under the National Industrial Recovery Act (NIRA), a system of industry-wide minimum wages and labor codes, which small businesses claimed gave an unfair advantage to big firms. In his second term, after the Supreme Court struck down the NIRA in 1935, Roosevelt briefly fell under the influence of Robert Jackson, Thurman Arnold, and other champions of an aggressive approach to antitrust in the Justice Department. But when World War II broke out, such an approach became an impediment to enlisting major industrial firms for war production, and FDR sidelined the antitrust advocates.
Surely FDR wanted to “break up big banks,” though, given his support of the Glass-Steagall Act of 1933? That’s a myth, too.
FDR and Senator Carter Glass of Virginia shared the goal of separating commercial and investment banking, ending what FDR called “speculation with other people’s money.” But they were also hostile to what American populists loved—the fragmented system of small, unstable local “unit banks” protected from competition with big eastern banks by laws against interstate branch banking. To prop up local banks, Representative Henry B. Steagall of Alabama pushed an old populist idea: federal deposit insurance. Shortly before his election in 1932, FDR explained why he opposed the policy in a letter to the New York Sun: “It would lead to laxity in bank management and carelessness on the part of both banker and depositor. I believe that it would be an impossible drain on the Federal Treasury to make good any such guarantee. For a number of reasons of sound government finance, such a plan would be quite dangerous.”
FDR was so opposed that he threatened to veto the bank reform legislation if it included deposit insurance. In the end, in order to enact other reforms he favored, he reluctantly signed the Glass-Steagall bill. If FDR had prevailed, there would be no Federal Deposit Insurance Corporation (FDIC).
Today, the growth and consolidation of multinational corporations presents American democracy with genuine policy challenges. But the answer need not come from bogus history; real history will suffice. Teddy Roosevelt argued that “big trusts” must be “taught that they are under the rule of law,” yet added that “breaking up all big corporations, whether they have behaved well or ill,” is “an extremely insufficient and fragmentary measure.”
And FDR said: “Nor today should we abandon the principle of strong economic units called corporations, merely because their power is susceptible of easy abuse.” The answer to the problems caused by corporate concentration, the Roosevelts agreed, is prudent government oversight and using antitrust laws to police abuses—not to break up every big company simply because it’s big.
Bigotry stopped Americans from intervening before the Holocaust. Not much has changed
By JAMES GROSSMAN - la times
APR 29, 2018 | 4:05 AM
A ruthless dictator unleashes terror on his own citizens. Those fleeing elicit sympathy — but encounter obstacles to entering the United States. Americans learn of mass killings, but their moral revulsion doesn’t easily turn into policy or military intervention. One thing remains consistent: America doesn’t want refugees, at least not of this ilk; those people aren’t welcome here.
Historians like me are wary of the adage that “history repeats itself.” But comparisons and analogies help us learn from the past, showing us how context matters and conventional wisdom deceives. To most Americans in 1945, “those people” meant “European Jews.” Today, they are Syrians, Congolese, Hondurans.
No visitor to the new exhibition “Americans and the Holocaust” at the U.S. Holocaust Memorial Museum in Washington, D.C., will walk away with conventional wisdom about World War II intact. In the 1930s, anti-Semitism rested comfortably within American ideologies of race, but this context, not widely acknowledged at the time, has now virtually disappeared from mainstream collective memory. Instead, America’s pre-Pearl Harbor isolationism is viewed as a mistaken but understandable disinclination to intervene in another European war, further tempered by the suggestion that Americans had only slight knowledge of Nazi depravity.
Museum visitors enter the new exhibit’s galleries in 1933 and walk through 12 years without the benefit of 80 years of hindsight. They see what Americans knew about events in Nazi Germany as they learned it. Public opinion (as documented by polls) and U.S. policy are revealed within that context.
It is a sobering journey. Americans knew that something was dreadfully wrong in Germany. As early as 1932, and even more in 1933, popular magazines including Cosmopolitan, Time and Newsweek included major stories on the persecution of Jews in Germany and on Nazi governance. Hitler and Goebbels appeared on covers of Time in 1933, with Goebbels accompanied by a clear message: “Say it in your dreams — THE JEWS ARE TO BLAME.”
An imaginative crowdsourcing effort carried out by the museum uncovered no fewer than 15,000 U.S. newspaper articles documenting persecution published between 1933 and 1945. Newsreels told the same story.
Commentators who have the benefit of hindsight have criticized President Franklin D. Roosevelt for his refusal to intervene. In 1933, the U.S. ambassador to Germany recorded in his diary Roosevelt’s instructions: “The German authorities are treating the Jews shamefully and the Jews in this country are greatly excited. But this is also not a governmental affair.” It comes across as cold-hearted in retrospect, but Roosevelt understood his fellow Americans; they would not march to war — or even expend substantial public resources — to save Jews.
If this feels in any way familiar, consider what comes next. Even when 94% of polled Americans claimed to “disapprove of the Nazi treatment of Jews in Germany,” 71% of them opposed permitting any more than a trickle of German Jews to enter the United States — two weeks after Kristallnacht. Two-thirds of Americans opposed admitting refugee children in 1939.
America kept its doors closed to the people for whom they professed sympathy. This sentiment, shaped by racism, was nothing new, nor was it confined to immigrants. One need only cross the Mall to the National Museum of African American History and Culture to be reminded that in the 1850s white Northerners were as repulsed by the suggestion that emancipation would result in black migration northward as they were by the cruelty of slavery.
Anti-Semitism would remain central to American foreign policy even as the nation stared down Nazi Germany. The United States entered the war in Europe, of course, but Roosevelt was shrewd enough to cast the move as fighting fascism on behalf of democracy. The war was about preserving American values, not saving European Jews.
At war’s end, Americans encountered graphic, overwhelming evidence of what they had been hearing about regularly since the first news reports about the death camps in 1942. Films, photographs, articles and official documents laid out the horrific details of ghettos, concentration camps and gas chambers. Aside from the Jewish media, however, few of these accounts named the victims as Jews.
Terrible people those Nazis, those fascists. The survivors of their terror, however, the “displaced persons,” still could not be trusted to be our neighbors even if there was an orderly bureaucracy of refugee screening, documented here by a wall of letters and official forms.
The ring of familiarity impels us to ask chilling questions about our current moment.
The Americans who helped Hitler
History News Network - raw story
26 MAR 2018 AT 11:25 ET
Why were so many “great” Americans tarred with a pro-Nazi brush? Henry Ford. Connecticut banker and senator Prescott Bush, father and grandfather of the Bush presidents. Charles Lindbergh. Even the first, albeit short-lived, America First Committee (1940-1941), with its origins at Yale University, allowed itself to become infiltrated by dangerous agents of the Third Reich in America.
Granted, at the outset, there was some considerable sympathy for Hitler in Europe and America. The real enemy, Communism, had swept away the Russian Empire, and was making headway in Europe. Spain had gone communist. Fascism was seen as an antidote to the hammer and sickle. But Hitler’s personal interest in infiltrating America, as early as 1925, was purely economic. Germany needed foreign exchange to stay afloat. It was drowning in hyperinflation. A loaf of bread cost a trillion Reichsmarks. Wheelbarrows and muscles were needed to transport the cash to the bakery. Yesterday’s marks became tomorrow’s kites or paper toys for children—that’s how quickly they were devalued. But few Americans recognized that America’s bankers were behind the bankrupt currency. The names Morgan, Bush, Chase, Union Banking Corporation, First National City are just some that spring to mind.
Germany needed foreign exchange more than anything else to get back on its feet. From 1933, Hitler cunningly lassoed other Americans to help him in his task. Where else could he turn for money coupled with an unwillingness to stop his European expansion plan? For a failed artist, art seemed as good a place to start as any. Alfred H. Barr, Founding Director of the Museum of Modern Art, knew he was buying art taken from German museums that Hitler deemed to be “entartet,” or “degenerate.” Barr befriended one of Hitler’s art dealers, Karl Buchholz, and was a close personal friend of the Hamburg-born art aficionado Curt Valentin from the time Valentin’s feet hit terra firma in 1937 New York. So Hitler got his foreign exchange, initially from looted museum art, later from desperate, mostly Jewish families, and Barr filled his new museum under the guise of “saving modern art.”
Barr was far from alone. American banks, like Chase National, were involved in a scheme to bring dollars to Germany in something called the Rückwanderer Mark Scheme. And why not? They had to do something to stop the rot on their poor Mark investments of the 1920s. Literally meaning “returning home,” the Rückwanderer Mark was designed to allow Germans living in the U.S.A. who wanted to return to Germany—on a temporary or permanent basis—to buy Rückwanderer Marks at an advantageous exchange rate. The Reichsbank allowed any returnees to Germany to exchange half of their dollars at the favorable RM 4.10 rate, even though the real exchange rate was only RM 2.48.
How could a bankrupt country afford such largesse? The surplus was paid from blocked accounts and assets once owned by refugees fleeing Germany, mostly Jews. The refugees lost an additional 25 percent minimum through a mechanism called a “flight tax,” which was often as elastic as a rubber band. The elasticity stemmed from the official practice of restricting refugees to one small suitcase to take with them and valuing any nonmonetary assets for two or three cents (pfennigs) on the Reichsmark. “The German government,” the FBI noted, “thereby netted a profit in dollars of nearly 90 percent.”
American companies trading Rückwanderer Marks needed to pay wholesalers, among which were a host of companies, including American Express, the Hamburg-Amerika Line, and the Swiss import-export firm Interkommerz in America, run by Henri Guisan, son of the commander-in-chief of the Swiss army. Jean Guisan, a close family relation, got the idea to introduce the seductive American, Florence Lacaze Gould (wife of the youngest son of American robber baron Jay Gould and the subject of my biography), to act as their “clean skin” banker in Monaco and France. The man who vetted Mrs. Gould was August T. Gausebeck, a German national working in New York since 1933. Gausebeck had the backing of the wealthiest supporters of Hitler, including Fritz Thyssen, Prescott Bush’s main banking client. Gausebeck’s New York company, Robert C. Mayer & Co., and his German-inspired investment company called the New York Overseas Corporation, were the primary vehicles complicit in the theft of millions from Jews fleeing Germany. They should have received an acknowledgement somewhere for helping Hitler and Göring to build the Luftwaffe.
But have a heart. Uncle Sam did get around to stopping them. In 1943. The Neutrality Act in the United States prohibited loans and gifts to belligerent nations. J. Edgar Hoover, FBI director, was told in October 1939: “Representatives approach investors and indicate to them that Germany will undoubtedly win the war… and that marks will undoubtedly increase many times in value.” Hoover was onto the scam like any mollusk clinging to a juicy rock. What attracted Hoover’s attention was Gausebeck, that German resident alien who was secretly funding the anti-Semitic campaign of Father Charles Coughlin on the radio. Coughlin? Just a minute….
Surely the Canadian-American Catholic priest who took to the airwaves since June 1930 could not have been in direct Nazi pay? Wrong. The National Archives are littered with documents proving that the priest was on the take. And why not? The America he broadcast to in 1930 was bust, just like Germany after 1918. Investors in the stock market were looking at profits down some 45.9 percent in the leading two hundred industrial companies. Steel production was down 60 percent; automobile production a staggering 60 percent. Farmers selling wheat in the autumn of 1930 were getting half of what they had been offered in 1929. Office workers, if they still had jobs, watched the breadlines form and wondered if the lines might not be for banks about to close. (By December, there were 328 banks closing each month.)
So when Father Charles E. Coughlin took to the airwaves on Station WJR Detroit with his richly mellow, reassuring voice, his ingratiating charm all but begged his listeners to wrap their arms around his own brand of “Fireside Chat.” Cloaking his fascist message in words of the times, Father Coughlin had discovered his pulpit. His listeners were, like the Germans in 1918, angry. Really angry at bankers. They feared the Communists more than the Fascists, and like other demagogues, Coughlin built his Church of the Little Flower on the wretchedness of others. By early 1933, it was estimated that Coughlin had an audience of ten million people in the U.S.A., and only a handful of critics. But that December, CBS refused to renew his contract unless his sermons were submitted to censorship prior to his broadcasts. Why?
Incensed Americans—Jews, Protestants, Catholics and others—ran to Father Coughlin’s defense. No one else stood up for America’s poor. They became members of his People’s Lobby, partly-funding his programs on another station, increasing his hook-up from twenty-nine to thirty-five stations. Enter August T. Gausebeck, Göring’s banker in America. If Coughlin took off the gloves and plainly said what he meant, Gausebeck would fund any shortfalls the good father might experience—in five to ten dollar untraceable donations. So Coughlin, freed from tedious financial burdens, spoke out against the C.I.O. and organized labor; against the League of Nations, swaying millions to vote as he saw it. Coughlin threw his considerable weight behind Franklin D. Roosevelt, and in the good father’s opinion, brought about Roosevelt’s first presidential victory. Angered that Roosevelt did not recognize his contribution, he turned on the president-elect. Coughlin publicly called Roosevelt a “liar.” That was his first big mistake.
He also spoke out vociferously against the “money lenders”—meaning Jews—and adopted the platform of a man eager to install the first American Reichstag. Coughlin leaned further to the right, republishing the disreputable forgery The Protocols of Zion in his magazine Social Justice and attacked American unionism as having its headquarters in Moscow. (Much of the commentary in Social Justice regarding Jews was taken verbatim from the speeches of Joseph Goebbels, literally line by line.)
Roosevelt was in the pocket of the “money lenders,” Coughlin endlessly jeered. Cheered at the German-American Bund meeting at Madison Square Garden in 1939, Coughlin and his platoons of Christian Front followers were revealed as nothing more than criminal thugs, out to terrify the neighborhoods they lived in. As hundreds were arrested for their violence and pointed racial hatred, claiming Coughlin as their spiritual father, the radio priest ran hot and cold in reply, depending on his audience.
By the time Pearl Harbor came, Coughlin had three-quarters of the United States clamoring for his scalp and demanding to lock up his lunatic fringe. Like Hitler, his little empire lasted a scant twelve years. The Catholic Church had cut him loose, clearly recognizing Nazi ideals, Nazi methods and the un-Christian message Coughlin preached. Yet his fascist worldview remains a danger today. Preaching hatred is not freedom of expression. It is dangerous, deadly propaganda—intent on destroying our souls through fear. We would all do well to learn the lessons of history, and understand how forces that use our democracy against us work.
Granted, at the outset, there was some considerable sympathy for Hitler in Europe and America. The real enemy, Communism, had swept away the Russian Empire, and was making headway in Europe. Spain had gone communist. Fascism was seen as an antidote to the hammer and sickle. But Hitler’s personal interest in infiltrating America, as early as 1925, was purely economic. Germany needed foreign exchange to stay afloat. It was drowning in hyperinflation. A loaf of bread cost a trillion Reichsmarks. Wheelbarrows and muscles were needed to transport the cash to the bakery. Yesterday’s marks became tomorrow’s kites or paper toys for children—that’s how quickly they were devalued. But few Americans recognized that America’s bankers were behind the bankrupt currency. The names Morgan, Bush, Chase, Union Banking Corporation, First National City are just some that spring to mind.
Germany needed foreign exchange more than anything else to get back on its feet. From 1933, Hitler cunningly lassoed other Americans to help him in his task. Where else could he turn for money coupled with an unwillingness to stop his European expansion plan? As a failed artist, art seemed as good a place to start as any. Alfred H. Barr, Founding Director of the Museum of Modern Art, knew he was buying art taken from German museums that Hitler deemed to be “Entartarte” or “degenerate.” Barr befriended one of Hitler’s art dealers, Karl Buchholz, and was a close personal friend of the Hamburg-born art aficionado Curt Valentin from the time Valentin’s feet hit terra firma in 1937 New York. So Hitler got his foreign exchange, initially from looted museum art, later from desperate, mostly Jewish families, and Barr filled his new museum under the guise of “saving modern art.”
Barr was far from alone. American banks, like Chase National, were involved in a scheme to bring dollars to Germany in something called the Rückwanderer Mark Scheme. And why not? They had to do something to stop the rot on their poor Mark investments of the 1920s. Literally meaning “returning home,” the Rückwanderer Mark was designed to allow Germans living in the U.S.A. who wanted to return to Germany—on a temporary or permanent basis—to buy Rückwanderer Marks at an advantageous exchange rate. The Reichsbank allowed any returnees to Germany to exchange half of their dollars at the favorable RM 4.10 rate, even though the real exchange rate was only RM 2.48.
How could a bankrupt country afford such largesse? The surplus was paid from blocked accounts and assets once owned by refugees fleeing Germany, mostly Jews. The refugees lost an additional 25 percent minimum through a mechanism called a “flight tax,” which was often as elastic as a rubber band. The elasticity stemmed from the official practice of restricting refugees to one small suitcase to take with them and valuing any nonmonetary assets for two or three cents (pfennigs) on the Reichsmark. “The German government,” the FBI noted, “thereby netted a profit in dollars of nearly 90 percent.”
American Companies trading Rückwanderers needed to pay wholesalers, among which were a host of companies, including American Express, the Hamburg-Amerika Line, and the Swiss import-export firm Interkommerz in America, run by Henri Guisan, son of the commander-in-chief of the Swiss army. Jean Guisan, a close family relation, got the idea to introduce the seductive American, Florence Lacaze Gould (wife of the youngest son of American robber baron Jay Gould and the subject of my biography), to act as their “clean skin” banker in Monaco and France. The man who vetted Mrs. Gould was August T. Gausebeck, a German national working in New York since 1933. Gausebeck had the backing of the wealthiest supporters of Hitler including Fritz Thyssen, Prescott Bush’s main banking client. Gausebeck’s New York company, Robert C. Mayer & Co., and his German-inspired investment company called the New York Overseas Corporation, were the primary vehicles complicit in the theft of millions from Jews fleeing Germany. They should have received an acknowledgement somewhere for helping Hitler and Göring to build the Luftwaffe.
But have a heart. Uncle Sam did get around to stopping them. In 1943. The Neutrality Act in the United States prohibited loans and gifts to belligerent nations. J. Edgar Hoover, FBI director, was told in October 1939: “Representatives approach investors and indicate to them that Germany will undoubtedly win the war… and that marks will undoubtedly increase many times in value.” Hoover was onto the scam like any mollusk clinging to a juicy rock. What attracted Hoover’s attention was Gausebeck, that German resident alien who was secretly funding the anti-Semitic campaign of Father Charles Coughlin on the radio. Coughlin? Just a minute….
Surely the Canadian-American Catholic priest who had taken to the airwaves in June 1930 could not have been in direct Nazi pay? Wrong. The National Archives are littered with documents proving that the priest was on the take. And why not? The America he broadcast to in 1930 was bust, just like Germany after 1918. Investors were looking at profits down some 45.9 percent across the leading two hundred industrial companies. Steel production was down 60 percent, and automobile production had fallen by the same staggering margin. Farmers selling wheat in the autumn of 1930 were getting half of what they had been offered in 1929. Office workers, if they still had jobs, watched the breadlines form and wondered whether the next lines would be outside banks about to close. (By December, 328 banks were closing each month.)
So when Father Charles E. Coughlin took to the airwaves on Station WJR Detroit, his richly mellow, reassuring voice and ingratiating charm all but begged listeners to wrap their arms around the radio and settle in for his own brand of “Fireside Chat.” Cloaking his fascist message in the words of the times, Father Coughlin had discovered his pulpit. His listeners were, like the Germans in 1918, angry. Really angry at bankers. They feared the Communists more than the Fascists, and like other demagogues, Coughlin built his Shrine of the Little Flower on the wretchedness of others. By early 1933, it was estimated that Coughlin had an audience of ten million people in the U.S.A. and only a handful of critics. But that December, CBS refused to renew his contract unless his sermons were submitted to censorship before broadcast. Why?
Incensed Americans—Jews, Protestants, Catholics and others—ran to Father Coughlin’s defense. No one else, it seemed, stood up for America’s poor. They became members of his People’s Lobby, partly funding his programs on another station and increasing his hook-up from twenty-nine to thirty-five stations. Enter August T. Gausebeck, Göring’s banker in America. If Coughlin took off the gloves and plainly said what he meant, Gausebeck would cover any shortfalls the good father might experience—in untraceable five- and ten-dollar donations. So Coughlin, freed from tedious financial burdens, spoke out against the C.I.O. and organized labor and against the League of Nations, swaying millions to vote as he directed. Coughlin threw his considerable weight behind Franklin D. Roosevelt and, in the good father’s opinion, brought about Roosevelt’s first presidential victory. Angered that Roosevelt did not recognize his contribution, he turned on the president. Coughlin publicly called Roosevelt a “liar.” That was his first big mistake.
He also spoke out vociferously against the “money lenders”—meaning Jews—and adopted the platform of a man eager to install the first American Reichstag. Coughlin leaned further to the right, republishing the disreputable forgery The Protocols of the Elders of Zion in his magazine Social Justice and attacking American unionism as having its headquarters in Moscow. (Much of the commentary on Jews in Social Justice was lifted line by line from the speeches of Joseph Goebbels.)
Roosevelt was in the pocket of the “money lenders,” Coughlin endlessly jeered. Cheered by name at the German-American Bund rally at Madison Square Garden in 1939, Coughlin and his platoons of Christian Front followers were revealed as nothing more than criminal thugs, out to terrify the very neighborhoods they lived in. As hundreds were arrested for violence and pointed racial hatred, claiming Coughlin as their spiritual father, the radio priest blew hot and cold in reply, depending on his audience.
By the time Pearl Harbor came, Coughlin had three-quarters of the United States clamoring for his scalp and demanding that his lunatic fringe be locked up. Like Hitler’s, his little empire lasted a scant twelve years. The Catholic Church had cut him loose, clearly recognizing the Nazi ideals, the Nazi methods and the un-Christian message Coughlin preached. Yet his fascist worldview remains a danger today. Preaching hatred is not freedom of expression. It is dangerous, deadly propaganda, intent on destroying our souls through fear. We would all do well to learn the lessons of history and understand how the forces that use our democracy against us work.
Forced sterilization programs in California once harmed thousands – particularly Latinas
The Conversation - raw story
22 MAR 2018 AT 17:43 ET
In 1942, 18-year-old Iris Lopez, a Mexican-American woman, started working at the Calship Yards in Los Angeles. Working on the home front building Victory Ships not only added to the war effort, but allowed Iris to support her family.
Iris’ participation in the World War II effort made her part of a celebrated time in U.S. history, when economic opportunities opened up for women and youth of color.
However, before joining the shipyards, Iris was entangled in another lesser-known history. At the age of 16, Iris was committed to a California institution and sterilized.
Iris wasn’t alone. In the first half of the 20th century, approximately 60,000 people were sterilized under U.S. eugenics programs. Eugenic laws in 32 states empowered government officials in public health, social work and state institutions to render people they deemed “unfit” infertile.
California led the nation in this effort at social engineering. Between the early 1920s and the 1950s, Iris and approximately 20,000 other people – one-third of the national total – were sterilized in California state institutions for the mentally ill and disabled.
To better understand the nation’s most aggressive eugenic sterilization program, our research team tracked sterilization requests of over 20,000 people. We wanted to know about the role patients’ race played in sterilization decisions. What made young women like Iris a target? How and why was she cast as “unfit”?
Racial biases affected Iris’ life and the lives of thousands of others. Their experiences serve as an important historical backdrop to ongoing issues in the U.S. today.
‘Race science’ and sterilization
Eugenics was seen as a “science” in the early 20th century, and its ideas remained popular into the midcentury. Advocating for the “science of better breeding,” eugenicists endorsed sterilizing people considered unfit to reproduce.
Under California’s eugenic law, first passed in 1909, anyone committed to a state institution could be sterilized. Many of those committed were sent by a court order. Others were committed by family members who wouldn’t or couldn’t care for them. Once a patient was admitted, medical superintendents held the legal power to recommend and authorize the operation.
Eugenics policies were shaped by entrenched hierarchies of race, class, gender and ability. Working-class youth, especially youth of color, were targeted for commitment and sterilization during the peak years.
Eugenic thinking was also used to support racist policies like anti-miscegenation laws and the Immigration Act of 1924. Anti-Mexican sentiment in particular was spurred by theories that Mexican immigrants and Mexican-Americans were at a “lower racial level.” Contemporary politicians and state officials often described Mexicans as inherently less intelligent, immoral, “hyperfertile” and criminally inclined.
These stereotypes appeared in reports written by state authorities. Mexicans and their descendants were described as “immigrants of an undesirable type.” If their existence in the U.S. was undesirable, then so was their reproduction.
Targeting Latinos and Latinas
In a study published March 22, we looked at the California program’s disproportionately high impact on the Latino population, primarily women and men from Mexico.
Previous research examined racial bias in California’s sterilization program. But the extent of anti-Latino bias hadn’t been formally quantified. Latinas like Iris were certainly targeted for sterilization, but to what extent?
We used sterilization forms found by historian Alexandra Minna Stern to build a data set on over 20,000 people recommended for sterilization in California between 1919 and 1953. The racial categories used to classify Californians of Mexican origin were in flux during this time period, so we used Spanish surname criteria as a proxy. In 1950, 88 percent of Californians with a Spanish surname were of Mexican descent.
We compared patients recommended for sterilization to the patient population of each institution, which we reconstructed with data from census forms. We then measured sterilization rates between Latino and non-Latino patients, adjusting for age. (Both Latino patients and people recommended for sterilization tended to be younger.)
Latino men were 23 percent more likely to be sterilized than non-Latino men. The difference was even greater among women, with Latinas sterilized at 59 percent higher rates than non-Latinas.
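For readers curious about the mechanics of such a comparison, here is a minimal sketch in Python of one standard way to compute an age-adjusted rate ratio (direct standardization). The counts below are invented purely for illustration; they are not the study’s data, and the study’s own adjustment method may differ.

# Illustrative sketch only: synthetic counts, not the study's data.
# Direct age standardization: weight each age group's sterilization rate
# by a common standard population, then compare the two groups.

from collections import namedtuple

Stratum = namedtuple("Stratum", ["age_group", "sterilized", "patients"])

# Hypothetical counts per age group for one institution.
latino = [
    Stratum("under 20", 120, 800),
    Stratum("20 to 29", 90, 1000),
    Stratum("30 and over", 30, 700),
]
non_latino = [
    Stratum("under 20", 200, 1800),
    Stratum("20 to 29", 210, 3000),
    Stratum("30 and over", 90, 2500),
]

def standardized_rate(group, weights):
    """Weighted average of per-age-group rates (direct standardization)."""
    total_weight = sum(weights.values())
    return sum(
        (s.sterilized / s.patients) * weights[s.age_group] for s in group
    ) / total_weight

# Use the combined patient population as the standard weights,
# so both groups are compared against the same age structure.
weights = {
    l.age_group: l.patients + n.patients for l, n in zip(latino, non_latino)
}

latino_rate = standardized_rate(latino, weights)
non_latino_rate = standardized_rate(non_latino, weights)
print(f"Age-adjusted rate ratio: {latino_rate / non_latino_rate:.2f}")

A ratio above 1.0 in a sketch like this would mean the first group was sterilized at a higher rate than the second even after accounting for differences in age structure, which is the kind of disparity the figures above describe.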
In the first half of the twentieth century, approximately 20,000 people – many of them Latino – were forcibly sterilized in California.
In their records, doctors repeatedly cast young Latino men as biologically prone to crime, while young Latinas like Iris were described as “sex delinquents.” Their sterilizations were described as necessary to protect the state from increased crime, poverty and racial degeneracy.
Lasting impact
The legacy of these infringements on reproductive rights is still visible today.
Recent incidents in Tennessee, California and Oklahoma echo this past. In each case, people in contact with the criminal justice system – often people of color – were sterilized under coercive pressure from the state.
Contemporary justifications for this practice rely on core tenets of eugenics. Proponents argued that preventing some people from reproducing would help solve larger social issues like poverty. The doctor who sterilized incarcerated women in California without proper consent stated that doing so would save the state money in future welfare costs for “unwanted children.”
The eugenics era also echoes in the broader cultural and political landscape of the U.S. today. Latina women’s reproduction is repeatedly portrayed as a threat to the nation. Latina immigrants in particular are seen as hyperfertile. Their children are sometimes derogatorily referred to as “anchor babies” and described as a burden on the nation.
Reproductive justice
This history – and other histories of sterilization abuse of black, Native, Mexican immigrant and Puerto Rican women – informs the modern reproductive justice movement.
This movement, as defined by the advocacy group SisterSong Women of Color Reproductive Justice Collective, is committed to “the human right to maintain personal bodily autonomy, have children, not have children and parent the children we have in safe and sustainable communities.”
As the fight for contemporary reproductive justice continues, it’s important to acknowledge the wrongs of the past. The nonprofit California Latinas for Reproductive Justice has co-sponsored a forthcoming bill that offers financial redress to living survivors of California’s eugenic sterilization program. “As reproductive justice advocates, we recognize the insidious impact state-sponsored policies have on the dignity and rights of poor women of color who are often stripped of their ability to form the families they want,” CLRJ Executive Director Laura Jiménez said in a statement.
This bill was introduced on Feb. 15 by Sen. Nancy Skinner, along with Assemblymember Monique Limón and Sen. Jim Beall.
If the bill passes, California will follow in the footsteps of North Carolina and Virginia, which began sterilization redress programs in 2013 and 2015, respectively.
In the words of Jiménez, “This bill is a step in the right direction in remedying the violence inflicted on these survivors.” In our view, financial compensation will never make up for the violation of survivors’ fundamental human rights. But it is an opportunity to reaffirm the dignity and self-determination of all people.